
US20160110791A1 - Method, computer program product, and system for providing a sensor-based environment - Google Patents


Info

Publication number
US20160110791A1
Authority
US
United States
Prior art keywords
person
item
field
view
customer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/590,240
Inventor
Dean Frederick Herring
Monsak Jason Chirakansakcharoen
Ankit Singh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Global Commerce Solutions Holdings Corp
Original Assignee
Toshiba Global Commerce Solutions Holdings Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Global Commerce Solutions Holdings Corp filed Critical Toshiba Global Commerce Solutions Holdings Corp
Priority to US14/590,240
Assigned to TOSHIBA GLOBAL COMMERCE SOLUTIONS HOLDINGS CORPORATION reassignment TOSHIBA GLOBAL COMMERCE SOLUTIONS HOLDINGS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HERRING, DEAN FREDERICK, CHIRAKANSAKCHAROEN, MONSAK JASON, SINGH, Ankit
Publication of US20160110791A1
Legal status: Abandoned

Classifications

    • G07G 1/0045 Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G 1/0054 Checkout procedures with a code reader, with control of supplementary check-parameters, e.g. weight or number of articles
    • G07G 1/0063 Checkout procedures with a code reader, with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
    • G07G 1/0072 Checkout procedures with a code reader, with means for detecting the weight of the article of which the code is read, for the verification of the registration
    • G07G 3/003 Anti-theft control
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01G 19/40 Weighing apparatus or methods with provisions for indicating, recording, or computing price or other quantities dependent on the weight
    • G01G 19/4144 Weighing apparatus using electronic computing means only, for controlling weight of goods in commercial establishments, e.g. supermarket, P.O.S. systems
    • G01S 19/13 Receivers (satellite radio beacon positioning systems, e.g. GPS, GLONASS or GALILEO)
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G06F 3/012 Head tracking input arrangements
    • G06F 3/013 Eye tracking input arrangements
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G06F 18/24 Classification techniques
    • G06F 2218/12 Classification; Matching
    • G06K 9/00624
    • G06Q 10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q 20/18 Payment architectures involving self-service terminals [SST], vending machines, kiosks or multimedia terminals
    • G06Q 20/204 Point-of-sale [POS] network systems comprising interface for record bearing medium or carrier for electronic funds transfer or payment credit
    • G06Q 20/208 Input by product or record sensing, e.g. weighing or scanner processing
    • G06Q 20/209 Specified transaction journal output feature, e.g. printed receipt or voice output
    • G06Q 20/322 Aspects of commerce using mobile devices [M-devices]
    • G06Q 30/016 After-sales (customer relationship services)
    • G06Q 30/0224 Discounts or incentives, e.g. coupons or rebates, based on user history
    • G06Q 30/0235 Discounts or incentives, e.g. coupons or rebates, constrained by time limit or expiration date
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0269 Targeted advertisements based on user profile or attribute
    • G06Q 30/0609 Buyer or seller confidence or verification
    • G06Q 30/0623 Item investigation
    • G06Q 30/0625 Directed, with specific intent or strategy
    • G06Q 30/0629 Directed, with specific intent or strategy, for generating comparisons
    • G06Q 30/0631 Item recommendations
    • G06Q 30/0633 Lists, e.g. purchase orders, compilation or processing
    • G06Q 30/0635 Processing of requisition or of purchase orders
    • G06Q 30/0639 Item locations
    • G06T 7/0069
    • G06T 7/2093
    • G06T 7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G06T 2207/30232 Surveillance
    • G06V 20/68 Food, e.g. fruit or vegetables
    • G06V 20/80 Recognising image objects characterised by unique random patterns
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G06V 40/172 Classification, e.g. identification
    • G06V 40/179 Metadata assisted face recognition
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Definitions

  • The present disclosure relates to a sensor-based environment and, more specifically, to providing an adaptive personal experience within such an environment using a determined field of view of the person.
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 4 illustrates a system of influencing shopping experience based on a customer field of view, according to one embodiment.
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment.
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment.
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment.
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment.
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment.
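The content-selection flow of FIG. 9B can be sketched in outline. The following Python sketch is purely illustrative: the signal names, weights, and threshold are assumptions made for the example, not values from the disclosure.

```python
# Hypothetical sketch of a FIG. 9B-style flow: combine observed signals into a
# customer interest score, then decide what content (if any) to present.
# Signal names and weights are illustrative assumptions only.

def interest_score(gaze_seconds: float, handled: bool, past_purchases: int) -> float:
    """Combine observed signals into a single interest score in [0, 1]."""
    score = 0.0
    score += min(gaze_seconds / 10.0, 1.0) * 0.5   # sustained gaze, capped
    score += 0.3 if handled else 0.0               # customer picked the item up
    score += min(past_purchases / 5.0, 1.0) * 0.2  # prior purchase history
    return score

def select_content(score: float, threshold: float = 0.6) -> str:
    """Choose what (if anything) to show on nearby digital signage."""
    if score >= threshold:
        return "targeted_promotion"
    if score >= threshold / 2:
        return "item_information"
    return "none"
```

In a real deployment the weights would presumably be tuned per environment; the point of the sketch is only the score-then-threshold structure.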
  • Aspects of the current disclosure relate to an integrated environment capable of providing a personalized, automated, and adaptive experience for a person within the environment.
  • A number of different sensor devices may be employed within the environment and networked with various computing devices such as point-of-sale (POS) terminals, digital signage, servers, and mobile or handheld computing devices to provide a seamless integration of mobile technologies and e-commerce into a traditional experience.
  • A retailer or other provider may determine a person's field of view, compile personal behaviors, and determine personal preferences. This data may then be used to provide timely, tailored recommendations to the person in real time in order to more effectively influence their experience. While generally discussed in the context of a shopping environment, such as a retail store or other commercial environment, the techniques disclosed herein may be applied to other environments (some non-limiting examples include libraries, museums, classrooms, hospitals, etc.) to provide an adaptive experience for the persons included therein.
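As one illustration of the field-of-view determination mentioned above, a system might test whether an item lies within a cone around the person's gaze direction. The function below is a hedged 2-D sketch; the names and the 60-degree half-angle are assumptions for the example, not values from the disclosure.

```python
# Illustrative geometry: is an item within a person's field of view, given an
# assumed head position and gaze direction (e.g. from a worn device or
# overhead cameras)? All names and the default half-angle are hypothetical.
import math

def in_field_of_view(head, gaze, item, half_angle_deg=60.0):
    """Return True if `item` falls within the cone around the gaze vector."""
    to_item = (item[0] - head[0], item[1] - head[1])
    dist = math.hypot(*to_item)
    if dist == 0:
        return True  # item coincides with the head position
    gmag = math.hypot(*gaze)
    cos_angle = (to_item[0] * gaze[0] + to_item[1] * gaze[1]) / (dist * gmag)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= half_angle_deg
```

Items passing this test could then be matched against inventory data to identify what the person is looking at.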
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment.
  • The environment 100 includes a plurality of terminals 105, a plurality of servers 110₁, 110₂ coupled to a network 115, one or more sensors 120 of different types, one or more user devices 140, and one or more other devices 150.
  • The environment 100 may be integrated into the layout of a retail store, market, or other commercial environment that is known or hereinafter developed.
  • Terminals 105 generally include any structure that is capable of receiving input from customers and/or producing output to customers within the environment 100 .
  • The terminals 105 may include computing systems, portions of computing systems, or devices controllable by computing systems.
  • A terminal may include a computing device that is communicatively coupled with a visual display and audio speaker(s), as well as with one or more input devices.
  • A terminal may include a visual display and associated driver hardware, while a computing device coupled to the terminal and providing data for display is disposed separately from the terminal.
  • Terminals 105 may be implemented as standalone devices, such as a kiosk disposed on the store floor or a monolithic device disposed on a shelf or platform.
  • Terminals 105 may be integrated partially or wholly with other components of the environment 100, such as input or output devices included with shelving or other structural components in the environment (e.g., components used for product display or storage). In some embodiments, terminals 105 may be modular and easily attachable to and detachable from elements of the environment 100, such as the structural components.
  • Terminals 105 may be distributed throughout the environment 100 and may enhance various phases of the shopping experience for customers.
  • Terminals 105 may include digital signage 108 disposed throughout the environment, such as in or near aisles, endcaps, displays, and/or shelving.
  • A customer may view and/or interact with the digital signage 108 as he or she moves through the store environment.
  • The digital signage may be included in a static display or may be movable, such as digital signage included within a customer's shopping cart or basket.
  • Terminals 105 may also include POS terminals 106 that provide a checkout functionality, allowing the customer to complete his/her shopping transaction (e.g., make payment for selected items).
  • Terminals 105 may provide an integrated functionality.
  • For example, the terminals may function in one mode as digital signage and, when engaged by a customer, function as a POS terminal.
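The dual-mode terminal described above can be sketched as a small state machine; the class and method names below are hypothetical, not from the disclosure.

```python
# Minimal sketch of a dual-mode terminal: it behaves as digital signage until
# a customer engages it, then acts as a POS terminal until the transaction
# ends. Names are illustrative assumptions.

class DualModeTerminal:
    def __init__(self):
        self.mode = "signage"

    def engage(self):
        """A customer interaction (touch, scan, etc.) switches to checkout mode."""
        self.mode = "pos"

    def finish_transaction(self):
        """Completing (or abandoning) checkout returns the terminal to signage."""
        self.mode = "signage"

    def render(self) -> str:
        """What the terminal currently displays."""
        return "advertising content" if self.mode == "signage" else "checkout UI"
```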
  • Servers 110₁, 110₂ generally include processors, memory, and communications capabilities, and may perform various computing tasks to support the commercial operation of the environment 100.
  • Servers 110₁, 110₂ communicate using various wired and/or wireless communications methods with terminals 105, sensors 120, and other networked devices such as user devices 140 and other devices 150.
  • Servers 110₁, 110₂ generally execute computer program code in which input data is received from networked devices, the input data is processed and/or stored by the servers, and output data is provided to networked devices for operation of the environment 100.
  • Sensors 120 may include various sensor devices, such as video sensors 125 , audio sensors 130 , and other sensors 135 .
  • The other sensors 135 generally include any sensors capable of providing meaningful information about customer interactions with the environment, e.g., location sensors, weight sensors, and so forth.
  • Sensors 120 may be deployed throughout the environment 100 in fixed and/or in movable locations. For example, sensors 120 may be statically included in walls, floors, ceilings, displays, or other devices, or may be included in shopping carts or baskets capable of being transported around the environment.
  • sensors 120 may include adjustable position sensor devices, such as motorized cameras attached to a rail, wire, or frame.
  • sensors 120 may be included on one or more unmanned vehicles, such as unmanned ground vehicles (UGVs) or unmanned aerial vehicles (UAVs or “drones”). Sensors 120 may also include sensor devices that are included in user devices 140 or other devices 150 (which in some cases may include body-worn or carried devices). User devices 140 and other devices 150 may include passive or actively-powered devices capable of communicating with at least one of the networked devices of environment 100 .
  • a passive device, which may be worn or carried, may communicate using near-field communication (NFC), for example.
  • Active devices may include mobile computing devices, such as smartphones or tablets, or wearable devices such as a Google Glass™ interactive eyepiece (Glass is a trademark of Google Inc.).
  • the designation user devices 140 generally denotes ownership or possession of the devices by customers, while other devices 150 denotes ownership or possession by the retailer or other administrator of the environment 100 . In some cases, other devices 150 may be carried by employees and used in the course of their employment. User devices 140 and other devices 150 may execute applications or other program code that generally enables various features provided by the servers and/or other networked computing devices.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment.
  • the system 200 corresponds to the environment 100 described above.
  • System 200 includes one or more processors 205 , memory 210 , and input/output 250 , which are interconnected using one or more connections 240 .
  • system 200 may be included in a single computing device, and the connection 240 may be a common bus.
  • system 200 is distributed and includes a plurality of discrete computing devices that are connected through wired or wireless networking.
  • Processors 205 may include any processing element suitable for performing functions described herein, and may include single or multiple core processors, as well as combinations thereof.
  • Processors 205 may be included in a single computing device, or may represent an aggregation of processing elements included across a number of networked devices such as user devices 140 , terminals 105 , etc.
  • Memory 210 may include a variety of computer-readable media selected for their size, relative performance, or other capabilities: volatile and/or non-volatile media, removable and/or non-removable media, etc.
  • Memory 210 may include cache, random access memory (RAM), storage, etc.
  • Storage included as part of memory 210 may typically provide a non-volatile memory for the networked computing devices (e.g., servers 110 1 , 110 2 ), and may include one or more different storage elements such as Flash memory, a hard disk drive, a solid state drive, an optical storage device, and/or a magnetic storage device.
  • Memory 210 may be included in a single computing device or may represent an aggregation of memory included in networked devices.
  • Memory 210 may include a plurality of modules 211 for performing various functions described herein.
  • the modules 211 include program code that is executable by one or more of the processors 205 .
  • modules 211 include user identification 212 , item identification 214 , advertising 216 , recommendations 218 , virtual cart 220 , assistance 222 , security 224 , power management 226 , gaming 228 , audit 230 , loyalty program 232 , and inventory 234 .
  • the modules 211 may also interact to perform certain functions. For example, a loyalty program module 232 during operation may make calls to user identification module 212 , item identification module 214 , advertising module 216 , and so forth.
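The interaction among modules 211 described above might be sketched as follows. This is a hypothetical illustration only; the class names, method signatures, and point values are assumptions, not part of the specification.

```python
# Hypothetical sketch of modules 211 cooperating: a loyalty-program module
# delegating to user-identification and item-identification modules.

class UserIdentification:
    def identify(self, sensor_frame):
        # Stub: a real module might match the frame against customer images 238.
        return sensor_frame.get("customer_id")

class ItemIdentification:
    def identify(self, sensor_frame):
        # Stub: a real module might recognize items via image processing.
        return sensor_frame.get("visible_items", [])

class LoyaltyProgram:
    def __init__(self, user_id_module, item_id_module):
        self.user_id = user_id_module
        self.item_id = item_id_module
        self.points = {}

    def award_points(self, sensor_frame):
        customer = self.user_id.identify(sensor_frame)
        items = self.item_id.identify(sensor_frame)
        earned = 10 * len(items)  # assumed rate: 10 points per item
        self.points[customer] = self.points.get(customer, 0) + earned
        return self.points[customer]

loyalty = LoyaltyProgram(UserIdentification(), ItemIdentification())
total = loyalty.award_points({"customer_id": "c1", "visible_items": ["milk", "bread"]})
print(total)  # 20
```

The composition mirrors the module-call pattern described above: the loyalty module does not duplicate identification logic, but calls into the dedicated modules.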
  • Memory 210 may also include customer profiles 236 and customer images 238 , which may be accessed and/or modified by various of the modules 211 .
  • the customer profiles 236 and customer images 238 may be stored on the servers 110 1 , 110 2 or on a separate database.
  • I/O 250 may include a number of different devices interfacing with various computing devices and with the shopping environment.
  • I/O 250 includes sensors 120 , which have been described above.
  • I/O 250 may further include input devices 252 and output devices 254 that are included to enhance the shopping experience for customers.
  • terminals 105 , user devices 140 , and other devices 150 may include visual displays and/or audio speakers (examples of the output devices 254 ), and various input devices 252 (such as cameras, keyboards or keypads, touchscreens, buttons, inertial sensors, etc.).
  • I/O 250 may further include wired or wireless connections to an external network 256 using I/O adapter circuitry.
  • Network 256 may include one or more networks of various types, including a local area or local access network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet).
  • various networked computing devices of the system 200 are interconnected using a LAN, and one or more computing devices (e.g., servers 110 1 , 110 2 , user devices 140 ) include connections to the Internet.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment.
  • the environment 300 includes a plurality of sensor modules 302 disposed in the ceiling 301 of the store.
  • the sensor modules 302 may each include one or more types of sensors, such as video sensors (e.g., cameras), audio sensors (e.g., microphones), and so forth.
  • Sensor modules 302 may also include actuating devices for providing a desired sensor orientation.
  • Sensor modules or individual sensors may generally be disposed at any suitable location within the environment 300 . Some non-limiting examples of alternative locations include below, within, or above the floor 330 , within other structural components of the environment 300 such as a shelving unit 303 or walls, and so forth.
  • sensors may be disposed on, within, or near product display areas such as shelving unit 303 .
  • the sensors may also be oriented toward an expected location of a customer interaction with items, to provide better data about a customer's interaction, such as determining a customer's field of view.
  • Environment 300 also includes a number of kiosks (or terminals) 305 .
  • kiosks 305 may be configured for performing customer checkout and/or other shopping functions.
  • Each kiosk 305 may include computing devices or portions of computing systems, and may include various I/O devices, such as visual displays, audio speakers, cameras, microphones, etc. for interacting with the customer.
  • a customer 340 may have a mobile computing device, such as smartphone 345 , that communicatively couples with the kiosk 305 for completing the checkout transaction.
  • the mobile computing device may execute a store application that is connected to the networked computing systems (e.g., through servers 110 1 , 110 2 ), or may be directly connected to kiosk 305 through wireless networks within the environment (e.g., over Wi-Fi or Bluetooth). In one embodiment, the mobile computing device may couple to the kiosk 305 when brought within range, e.g., using Bluetooth or NFC.
  • Environment 300 also includes one or more shelving units 303 having shelves 310 that support various store items 315 .
  • multiple shelving units 303 may be disposed in a particular arrangement in the environment 300 , and the space between adjacent shelving units may form aisles through which customers may travel.
  • the shelving unit 303 may include visual sensors or other sensor devices or I/O devices. The sensors or devices may couple to a customer's smartphone 345 and/or other networked computing devices (including servers) within the environment 300 .
  • the front portions 320 of shelves 310 may include video sensors oriented outward from the shelving unit 303 to capture customer interactions with items 315 on the shelving unit 303 , and the data from the video sensors may be provided to back-end servers for storage and/or analysis.
  • portions of the shelving unit 303 (such as the front portions 320 of shelves 310 ) may include indicator lights or other visual display devices or audio output devices that are used to communicate with a customer.
  • FIG. 4 illustrates a system of influencing shopping experience based on a customer field of view, according to one embodiment.
  • System 400 may be used in coordination with the various shopping environments described herein. Generally, system 400 may share at least portions of several components with the shopping environment system 200 , such as processors 205 , memory 210 , and I/O 250 . System 400 may also utilize one or more of the modules 211 to provide various aspects of the system's functionality, such as item identification module 214 , advertising module 216 , recommendation module 218 , and so on.
  • I/O 250 includes output devices 254 and sensors 120 .
  • Output devices 254 include one or more devices for presenting information to customers, such as audio output devices 440 and visual output devices 445 .
  • the audio output devices 440 may include conventional audio speakers having any suitable form factor (e.g., in a stereo, headphones, etc.), as well as devices using alternative methods of producing sound to a customer, such as bone conduction transducers in a worn device.
  • Visual output devices 445 may include visual displays and various visual indicators such as light emitting diodes (LEDs).
  • Other output devices 450 may provide information to customers through tactile feedback (e.g., haptic devices) or other sensory stimuli.
  • Sensors 120 may include visual sensors 455 , such as worn or carried sensors 460 and distributed sensors 465 that are disposed throughout the shopping environment.
  • Other sensors 470 may be included that are suitable for collecting information about a customer and his/her interactions within the shopping environment. Examples of other sensors 470 may include infrared (IR) sensors, thermal sensors, weight sensors, capacitive sensors, magnetic sensors, sonar sensors, radar sensors, lidar sensors, and so forth.
  • the visual sensors 455 may be used to capture one or more images 415 of the customer and/or the shopping environment, which may include views from various perspectives (e.g., a customer-worn visual sensor, static or movable visual sensors at various locations in the environment).
  • the images 415 may be stored in memory 210 , and may be individually or collectively processed to determine information about customers in the environment and their respective interactions with items in the environment.
  • Memory 210 includes one or more programs 435 that collectively receive data about customers and the shopping environment, process the received data, and transmit information to customers in order to influence the customers' shopping experiences.
  • Programs 435 may include program code to determine a customer's field of view at a given time, including which items are included in the field of view. In one embodiment, the customer's field of view may be determined directly.
  • a body-worn device may include a visual sensor (i.e., a worn visual sensor 460 ) that, when the device is worn, gives the visual sensor an orientation that is similar to the orientation of the customer's head or eyes (e.g., a forward-looking camera). Images captured from the worn visual sensor may generally reflect the customer's field of view.
  • the customer's field of view may be estimated (or determined indirectly) using other sensor measurements.
  • the customer's field of view may be estimated by determining the orientation of one or both of the customer's eyes. Eye orientation may be determined using worn visual sensors 460 (e.g., an inward-facing camera on a head-worn device) and/or distributed visual sensors 465 (e.g., capturing images of the customer's face and image processing to determine an eye orientation).
  • the customer's field of view may be estimated by determining the position and/or orientation of the customer's head and/or body using various visual sensor measurements.
  • the customer's field of view may be represented in any suitable data format, such as an image or as coordinate data (e.g., Cartesian, polar, spherical coordinates).
  • a single visual sensor 455 may be used to determine a customer's field of view
  • several embodiments employ a combination of a plurality of visual sensors 455 to determine a field of view. These embodiments may be preferred as providing additional data to support a more accurate estimate of the field of view.
  • the plurality of visual sensors used to determine a field of view may include visual sensors selected from different categories (worn sensors 460 , distributed sensors 465 ) to provide additional robustness to the collected data.
  • Programs 435 may also include program code to identify one or more items included within the customer's field of view.
  • the identification process may be performed directly or estimated.
  • One example of direct identification is performing image processing on images collected from a worn, forward-looking camera to visually identify one or more items. Estimating items within a customer's field of view may require combining sensor data with known information about the shopping environment, such as the items in the environment (item data 420 ) and their relative arrangement or layout within the environment (location data 425 ).
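The indirect estimation described above could be sketched as follows: intersect an estimated view cone with the store's item location data (location data 425). The flat 2-D overhead model, the half-angle value, and the data formats are all illustrative assumptions.

```python
import math

# Sketch: estimate which items fall inside a customer's field of view by
# combining an eye position/orientation estimate with item location data.
# The 2-D overhead model and 30-degree half-angle are assumptions.

def items_in_view(eye_xy, gaze_deg, item_locations, half_angle_deg=30.0):
    """Return item IDs whose location lies within the angular view cone."""
    visible = []
    for item_id, (x, y) in item_locations.items():
        bearing = math.degrees(math.atan2(y - eye_xy[1], x - eye_xy[0]))
        # Smallest signed difference between the item bearing and gaze direction.
        diff = (bearing - gaze_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= half_angle_deg:
            visible.append(item_id)
    return visible

layout = {"cereal": (5.0, 0.5), "soda": (5.0, 5.0), "soap": (-5.0, 0.0)}
print(items_in_view(eye_xy=(0.0, 0.0), gaze_deg=0.0, item_locations=layout))  # → ['cereal']
```

A production system would use a 3-D model and per-shelf item coordinates, but the core operation, testing each known item location against the estimated view cone, is the same.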
  • Programs 435 may also include program code to present information to customers based on the identified one or more items.
  • the information may be used to influence the customer's shopping experience.
  • the information presented to the customer may include information about the identified items (e.g., nutritional data, pricing, a customer's purchase history of the items, etc.), information encouraging the purchase of identified items (e.g., bringing particular items to the customer's attention, touting the items' features, offering discounts or other promotions on the items, etc.), and information encouraging the purchase of alternatives to the identified items (e.g., highlighting differences of the items, offering discounts, etc.).
  • programs 435 may access and analyze various additional data in memory 210 that is related to the customer and perhaps other customers of the shopping environment.
  • programs 435 may analyze shopping data 430 collected from previous shopping experiences for the particular customer and/or for other customers or groups of customers.
  • shopping data 430 may include customer views and the items included therein, progressions of customer views (showing a customer's interest over time), selection or purchase history for items, and so forth. While shopping data 430 may be compiled and used to generate information to present to customers and influence their shopping experiences in real-time, the shopping data 430 may also be used by the retailer or administrator of the shopping environment to modify the layout of the environment. The shopping data 430 may help the retailer identify trends in customer shopping, and to optimize placement of items within the environment to improve customer sales.
  • System 400 may also present information to customers based on their personal preferences 405 .
  • the preferences 405 may generally be stored in a corresponding customer profile 236 , and may reflect preferences that are explicitly specified by a customer, or may be determined based on the customer's historical shopping behavior (e.g., included in shopping data 430 ). For example, a customer may have an allergy to a particular ingredient, and the customer may enter this allergy information in preferences 405 , e.g., using a mobile phone app for the retailer. Accordingly, the system 400 when determining which information to present to the customer may present information that highlights items within the customer's field of view that include the ingredient, and may further suggest alternative items that do not include the ingredient.
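The allergy example above can be sketched as a simple filter over item data 420. The item records, field names, and category-based alternative lookup are assumptions made for illustration.

```python
# Sketch: flag in-view items containing an allergen listed in a customer's
# preferences 405, and suggest in-category alternatives without it.
# Item records and field names are illustrative assumptions.

ITEM_DATA = {
    "granola_a": {"category": "cereal", "ingredients": {"oats", "peanuts"}},
    "granola_b": {"category": "cereal", "ingredients": {"oats", "honey"}},
    "cookies":   {"category": "snacks", "ingredients": {"wheat", "peanuts"}},
}

def allergy_advice(in_view_items, allergen, item_data=ITEM_DATA):
    flagged = [i for i in in_view_items if allergen in item_data[i]["ingredients"]]
    categories = {item_data[i]["category"] for i in flagged}
    alternatives = [
        i for i, d in item_data.items()
        if d["category"] in categories and allergen not in d["ingredients"]
    ]
    return flagged, alternatives

flagged, alts = allergy_advice(["granola_a", "cookies"], "peanuts")
print(flagged)  # ['granola_a', 'cookies']
print(alts)     # ['granola_b']
```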
  • a customer's shopping history may also be used to determine customer preferences 405 .
  • the customer's determined fields of view and purchase history from shopping data 430 may be processed to deduce which items, combinations of items, and/or aspects of items are preferred by the customer.
  • preferred aspects might include preferred brands, costs, quantities, sizes, ingredients, nutritional properties (e.g., calorie content, fat, sugar, vitamins, minerals, etc.), and so forth.
  • the customer may specify a preference for low-fat foods, and the system may determine recommended items based on the items included in the customer's field of view and the customer's preferences. This may include suggesting a particular item within the field of view for purchase (or alternatively, an item located outside the field of view) and/or advising the customer about the item's properties vis-à-vis the customer's preferences (e.g., reporting fat content).
  • a customer's preferences may be included as a logical combination of a plurality of these aspects (e.g., a customer prefers Brand X items to Brand Y, so long as the cost of the Brand X item is no more than 150% of the Brand Y item).
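The Brand X / Brand Y rule above is a logical combination of a brand aspect and a cost aspect, which could be expressed as:

```python
# Sketch of the compound preference from the example: prefer Brand X only
# while its price is no more than 150% of the comparable Brand Y price.
# The ratio limit is taken directly from the example; prices are hypothetical.

def prefer_brand_x(price_x, price_y, ratio_limit=1.5):
    return price_x <= ratio_limit * price_y

print(prefer_brand_x(4.50, 3.20))  # True  (4.50 <= 4.80)
print(prefer_brand_x(5.00, 3.20))  # False (5.00 >  4.80)
```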
  • other customers' shopping data may also be used to deduce a particular customer's preferences.
  • the preferences may be dynamically updated to identify whether deduced preferences are accurate or not. The dynamic updating may be caused by the customer's explicit indication and/or by the customer's shopping patterns following the deduced preference. For example, the system 400 may deduce that a customer has a categorical preference for Brand X items over similar Brand Y items.
  • the customer's historical shopping data indicated that the customer looked at a Brand X item (e.g., field of view data) before selecting and purchasing a similar Brand Y item (e.g., field of view data and/or purchase history data).
  • in response, the system may adapt or entirely remove the deduced (and determined to be inaccurate) preference.
  • system 400 may present information to customers that is also based on other programs 410 .
  • programs 410 may include fitness, nutrition, or health goals, money management goals, etc.
  • the programs 410 themselves may be integrated into a store application and accessible by the customer's mobile computing device or wearable device.
  • the store application may interface with applications from other providers to determine the customer's goals and present appropriate information during the shopping experience.
  • the system 400 could include a nutrition-oriented program, and may make suggestions for more nutritious items to a customer who is looking at junk food items (e.g., candy).
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment.
  • Portions of wearable computing device 500 may be head-worn or worn on other portions of the body.
  • the wearable computing device 500 includes a housing 505 that includes several structural components.
  • the housing 505 may be used as a structural frame, supporting other portions while itself being supported by a customer's head when worn.
  • Other structural components may include nose pieces 520 , ear piece 515 , and enclosure 512 .
  • Enclosure 512 may include a computing device 525 , which includes video I/O components 530 and audio I/O components 540 .
  • the enclosure formed by the earpiece 515 may also house components, such as a battery 545 for powering the computing device 525 .
  • computing device 525 also includes wireless networking components for communicating with other computing devices.
  • the video I/O components 530 may include a forward-looking camera 513 that provides an estimate of the wearer's field of view based on their head orientation, and a transparent prism 514 that is used to project light onto the wearer's retina, displaying information to the wearer over the wearer's natural field of view.
  • Other video I/O components may include an inward-looking camera that is configured to capture the eye orientation of the wearer, a conventional display device (e.g., an LCD), and so forth.
  • Audio I/O components 540 may include one or more microphones and/or audio output devices, such as speakers or bone-conducting transducers.
  • FIG. 5B illustrates the wearable computing device 500 as worn.
  • scene 550 includes the wearer's natural view 560 .
  • a portion of the area of the natural view 560 may be used for displaying an overlay 565 (e.g., using the prism 514 ) that provides additional information to the wearer.
  • the display of the overlay 565 may be adequately transparent to permit the wearer to continue to observe their natural view through the overlay area.
  • the overlay 565 includes a map view of the wearer's current location (e.g., the wearer is at W. 34th Street).
  • information may be selected and visually presented in text and/or graphics to a customer wearing such a device to influence his/her shopping experience.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment.
  • a shelving unit 603 is depicted having a plurality of shelves 610 that each support and display a number of items 612 that are available for selection and purchase by a customer.
  • Also depicted are a customer's field of view 615 and an area 605 outside the customer field of view.
  • the customer's field of view 615 may be represented by an image captured from a forward-looking camera. While shown as generally rectangular, the customer field of view 615 may have any suitable alternative shape and size.
  • the customer's actual vision may encompass a significantly larger area, but determining the field of view for purposes of the shopping environment may include applying a threshold or weighting scheme that emphasizes areas that are closer to the center of a customer's vision.
  • data provided by various visual sensors and/or other sensors may be used to make these determinations.
  • the field of view 615 may include a plurality of fully included items 620 , as well as a plurality of partially included items 625 .
  • certain embodiments may categorically include or exclude the partially included items 625 .
  • An alternative embodiment may rely on image processing to determine whether a partially included item 625 should be identified as included. For example, if the processing cannot recognize the particular item with a certain degree of confidence, the item may be excluded.
  • partially included items 625 may be included, and the amount of item inclusion (e.g., the percentage of surface area of the item included) may be used to determine a customer focus or calculate a customer interest score, which are discussed further below.
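The amount-of-inclusion measurement above could be sketched as a rectangle-overlap computation in image coordinates. The bounding-box representation and coordinate values are assumptions for illustration.

```python
# Sketch: compute the fraction of an item's bounding box (in image
# coordinates) that overlaps the field-of-view rectangle, which could then
# weight the item's contribution to a customer interest score.

def inclusion_fraction(item_box, view_box):
    """Boxes are (x1, y1, x2, y2); returns overlap area / item area."""
    ix1, iy1, ix2, iy2 = item_box
    vx1, vy1, vx2, vy2 = view_box
    ox = max(0.0, min(ix2, vx2) - max(ix1, vx1))
    oy = max(0.0, min(iy2, vy2) - max(iy1, vy1))
    item_area = (ix2 - ix1) * (iy2 - iy1)
    return (ox * oy) / item_area if item_area else 0.0

# An item half inside the view contributes half of a full inclusion.
frac = inclusion_fraction(item_box=(8, 0, 12, 2), view_box=(0, 0, 10, 10))
print(frac)  # 0.5
```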
  • Items that are fully or partially included in the customer field of view 615 may be recognized by performing image processing techniques on images captured by various visual sensors. For example, images that include the items may be compared against stock item images stored in a database or server. To aid image processing, items may also include markers or distinctive symbols, some of which may include item identification data such as barcodes or quick response (QR) codes. Of course, other processing techniques may be employed to recognize a particular item, such as textual recognition, determining the item's similarity to adjacent items, and so forth.
  • Scene 650 of FIG. 6B depicts a customer 660 in a shopping environment.
  • the customer 660 is standing in an aisle 655 adjacent to a shelving unit 603 , which has a plurality of shelves 610 .
  • Visual sensors may capture one or more images of scene 650 from various spatial perspectives, and the images may be used to determine the customer's field of view. Specifically, various aspects of the scene that are captured in the images may be used to estimate the customer's field of view.
  • the relative position and/or orientation of portions of the customer's body may be determined.
  • the position and orientation of the customer's eyes 680 may be determined.
  • eye position within the environment may be determined in Cartesian coordinates (i.e., determining x, y, and z-direction values) and eye orientation may be represented by an angle α defined relative to a reference direction or plane (such as horizontal or an x-y plane corresponding to a particular value of z).
  • other portions of the customer's body may (also) be used to determine the field of view, such as the position and orientation of the customer's head 665 , or of one or both shoulders 670 .
  • the customer's interaction with the shelving unit 603 by extending her arm 675 may also be captured in one or more images, and the direction of the extended arm may be used to determine her field of view.
  • embodiments may use combinations of various aspects of the scene to determine the customer's field of view.
  • the combinations may be weighted; for example, data showing a customer 660 reaching out her arm 675 towards a specific item may be weighted more heavily to determine her field of view than the orientation of her shoulders.
  • the weights may be dynamically updated based on the customer's shopping behaviors following an estimate of the customer's field of view. For example, if a customer reached for (or selected) an item that was not included in the determined field of view, the system may adjust the relative weighting in order to accurately capture the customer's field of view.
  • This adjustment may include determining correlation values between particular captured aspects of the scene to the selected item; for example, the customer's head may be partly turned towards the selected item, but their eye orientation may generally be more closely tied to the selected item.
  • the correlation values may be more useful where one or more aspects of the scene cannot be determined (e.g., the system may be unable to determine eye orientation for a customer wearing sunglasses, non-optimal visual sensor positioning, etc.).
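The weighted combination and dynamic update described above might be sketched as follows. The cue names, weights, and learning rate are assumptions; a real system would fuse richer pose data than single angles.

```python
# Sketch: fuse several body-orientation cues (each an estimated direction in
# degrees) into one field-of-view direction using weights, then nudge the
# weights when a selected item fell outside the estimate.

def fuse_direction(cues, weights):
    """cues/weights: dicts keyed by cue name; returns a weighted mean angle."""
    total = sum(weights[k] for k in cues)
    return sum(cues[k] * weights[k] for k in cues) / total

def adjust_weights(weights, cues, actual_deg, rate=0.1):
    """Shift weight toward the cue that pointed closest to the selected item."""
    errors = {k: abs(cues[k] - actual_deg) for k in cues}
    best = min(errors, key=errors.get)
    adjusted = dict(weights)
    adjusted[best] += rate
    norm = sum(adjusted.values())
    return {k: v / norm for k, v in adjusted.items()}

cues = {"arm": 10.0, "eyes": 12.0, "shoulders": 40.0}
weights = {"arm": 0.5, "eyes": 0.3, "shoulders": 0.2}
estimate = fuse_direction(cues, weights)
print(round(estimate, 1))  # 16.6
# Suppose the customer then selected an item at 11 degrees:
weights = adjust_weights(weights, cues, actual_deg=11.0)
```

Note that the arm cue, weighted most heavily per the example above, pulls the fused estimate well away from the outlying shoulder cue.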
  • Scene 685 of FIG. 6C illustrates an overhead view of several customers 660 in a shopping environment.
  • the view of scene 685 may be represented by an image captured from a ceiling-mounted camera, or from a drone.
  • the orientation of customer 660 A may be estimated using the relative position of his/her shoulders 670 . As shown, a line connecting the two shoulders may be compared to a reference direction or plane (e.g., parallel to the length of shelving unit 603 A) and represented by an angle β.
  • the orientation of customer 660 B may be estimated using the orientation of his/her head 665 , comparing a direction of the customer's head to a reference direction or plane, which may be represented by an angle γ.
  • Images may also capture a customer 660 C interacting with the shelving unit 603 B, and the position and/or orientation of the customer's arm 675 may be used to determine the customer's field of view.
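The shoulder-line estimate for customer 660A could be sketched as below: the facing direction is taken as perpendicular to the line joining the shoulders, measured against a reference direction. The coordinate frame and the which-side-is-front convention are assumptions.

```python
import math

# Sketch: estimate a customer's facing direction from an overhead image as
# the perpendicular to the shoulder line, relative to the x-axis (standing in
# for the length of a shelving unit). Shoulder coordinates are hypothetical.

def facing_angle(left_shoulder, right_shoulder):
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    shoulder_line_deg = math.degrees(math.atan2(dy, dx))
    return (shoulder_line_deg - 90.0) % 360.0  # perpendicular to the line

# Shoulders parallel to the x-axis: the customer faces the -y direction
# (under the assumed left/right convention).
print(facing_angle((0.0, 0.0), (1.0, 0.0)))  # 270.0
```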
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment. While the computing systems may determine information to present to a customer based on all of the items identified within a determined field of view, in some embodiments it may be advantageous to make a further identification of one or more items that the customer is specifically focused on. Presenting information to the customer that is based on the customer-focused items may generally provide a more relevant and more persuasive information to influence the customer's shopping experience.
  • the determined field of view 705 of a customer may include different groups of items at different times, and the progression of the customer's field of view over time may help determine which item(s) within the field of view are specifically being focused on by the customer.
  • the customer's focus on a particular item may indicate that the item is merely attracting the customer's attention, or that the customer is deciding whether or not to purchase the item. Either way, understanding the object of a customer's focus may help retailers or suppliers to improve packaging and placement of items or to influence the customer's shopping experience in real-time.
  • View 1 illustrates a field of view 705 that includes items 715 A, 715 B, and a portion of 715 C.
  • items 715 A and 715 B may be included in a customer focus determination due to the items' full inclusion within the field of view 705 .
  • item 715 C may be excluded from a customer focus for being only partially included in the field of view 705 .
  • all three items may be included by virtue of being at least partially included in the field of view 705 .
  • a customer focus on particular items may be a time-based determination.
  • if the field of view 705 includes the items for at least a preset amount of time, the computing system may determine that the customer is focused on items 715 A-C.
  • in some embodiments, the particular item(s) must continuously remain in the field of view 705 during the preset amount of time (e.g., remain across several samples of the field of view during this time).
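The dwell-time rule above might be sketched as a set intersection over sampled fields of view. The sampling period and threshold values are illustrative assumptions.

```python
# Sketch: an item is "focused" only if it appears in every sampled field of
# view across a preset dwell time. Sample period and threshold are assumed.

def focused_items(view_samples, sample_period_s, dwell_threshold_s):
    """view_samples: list of sets of item IDs, one set per sample."""
    needed = int(dwell_threshold_s / sample_period_s)
    if needed == 0 or len(view_samples) < needed:
        return set()
    recent = view_samples[-needed:]
    return set.intersection(*recent)  # items present in every recent sample

samples = [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}]
print(sorted(focused_items(samples, sample_period_s=1.0, dwell_threshold_s=3.0)))  # → ['a']
```

Only item "a" survives the intersection, mirroring the continuous-presence requirement: items that drop out of any sample during the dwell window are not considered focused.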
  • View 2 illustrates a field of view 705 that includes item 715 B, and portions of items 715 A and 715 C. View 2 could represent the same field of view as View 1 at a later time.
  • the customer's focus may be determined to include item 715 B but not 715 A.
  • the system could determine that 715 B is the lone customer-focused item based on its relatively central position within the field of view 705 , or perhaps based on the changes to the field of view from View 1 (i.e., a shift away from previously fully included item 715 A).
  • item 715 B must also remain within the field of view 705 for the predetermined amount of time to be considered as a customer-focused item.
  • the customer-focused items may still include item 715 A and/or item 715 C.
  • View 3 illustrates a field of view 705 that includes item 715 B, and portions of items 715 A and 715 C. View 3 differs from View 2 in that the field of view 705 is “closer” to the shelving unit 703 and items 715 in View 3 than in View 2 . View 3 could represent the same field of view as View 2 at a later time, e.g., as the customer moves towards the shelving unit 703 .
  • the system could determine that 715 B is a customer-focused item based on its relatively central position within the field of view 705 and/or based on the changes to the field of view from View 2 (i.e., item 715 B occupies an increased percentage of the field of view 705 ).
  • View 4 also illustrates a field of view 705 that includes item 715 B, and portions of items 715 A and 715 C. View 4 could represent the same field of view as View 3 at a later time.
  • the customer has selected item 715 B and is holding the item in his/her hand 725 .
  • the system could determine that 715 B is a customer-focused item based on its relatively central position within the field of view 705 , based on the changes to the field of view from View 3 (e.g., item 715 B occupies an increased percentage of the field of view 705 ), and/or based on the presence of a portion 730 of the customer's hand 725 included in the field of view 705 .
  • a customer interest score may be determined for items included within the customer's field of view.
  • the customer interest score is generally time-based, and the score may be used to interject desired information at the right moment in order to influence a customer's shopping decisions (e.g., whether to buy a particular item) in real-time.
  • a customer interest score is calculated and updated concurrent with determining a customer focus, and the customer interest score for an item exceeding a threshold value may be used to determine that the customer has focused on the item.
  • the customer interest score may also be adjusted based on various other aspects shown in Views 1-4, such as position within the field of view (e.g., a central position may mean more interest), percentage occupied within the field of view (e.g., a larger percentage may mean more interest), customer selection and/or manipulation of the item, and so forth.
  • the score adjustments may reflect a customer's particular shopping behaviors (e.g., a customer is determined to be 95% likely to pick up and manipulate a customer-focused item within their field of view, so the mere fact of picking up an item might not indicate an increased level of customer interest in the item).
  • the customer interest score may generally decrease over time as it becomes more likely that the customer will not purchase the particular item.
  • the system may provide information about the item at desired times (corresponding to customer interest scores). This may include providing information about the items or alternatives, and/or offering promotional pricing on the item (or an alternative item).
  • the customer shopping history may include data regarding the number of promotions already offered to the customer, the length of time (or customer interest score) before offering the promotion, the customer utilization rate of the previous promotions, etc.
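A time-based interest score of the kind described above, which rises on observed interactions and decreases over time as a purchase becomes less likely, might look like the following sketch. The exponential decay model, event names, and bump weights are illustrative design assumptions, not taken from the disclosure.

```python
import math

class InterestScore:
    """Time-decaying customer interest score for a single item."""

    def __init__(self, decay_per_second=0.05):
        self.decay = decay_per_second
        self.score = 0.0
        self.last_t = 0.0

    def _age(self, t):
        # Decay the score toward zero as time passes without new interest.
        self.score *= math.exp(-self.decay * (t - self.last_t))
        self.last_t = t

    def observe(self, t, event, weight=1.0):
        """Record an interaction at time t (seconds). The per-customer
        weight can discount events that are routine for that shopper."""
        self._age(t)
        bumps = {"in_view": 0.5, "central": 1.0, "picked_up": 2.0}
        self.score += bumps.get(event, 0.0) * weight

    def value(self, t):
        self._age(t)
        return self.score
```

A threshold on `value(t)` could then serve both purposes discussed above: declaring the item customer-focused, and choosing when to present a promotion.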
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment.
  • the presentation of information may be based upon some or all of the items that are identified as being within the customer's field of view.
  • FIG. 8A illustrates a single item 805 within the customer's field of view 705; this could also represent an example in which one item is identified from several items as being a customer-focused item.
  • similar techniques may be applied to a field of view or a customer focus that includes more than one item.
  • the computing system may present information to the customer about a particular item, based on the item 805.
  • the presented information may relate to the same item 805 and/or to one or more different items.
  • the different items may include other items located within the customer's field of view 705, and may include items (such as item 810) that are not included within the field of view 705.
  • the information may be determined and timely presented so as to influence the customer to purchase the particular item(s).
  • FIG. 8B illustrates several example presentations of information to a customer.
  • the presentations may be delivered in real-time to the customer using any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic). While the presentations 815, 820, 825 are shown as textual presentations for simplicity, it is possible for the presentations to be spoken or presented in any suitable alternative manner.
  • a presentation may be displayed by a wearable computing device of the customer, such as a visual overlay on a display of the wearable computing device as discussed above with respect to FIGS. 5A and 5B.
  • a presentation may be transmitted as a message to the customer's mobile computing device, causing text or graphics to be displayed on the mobile computing device and causing the device to vibrate.
  • Presentation 815 indicates a customer preference for a second item 810 relative to the currently viewed or focused item 805.
  • Presentation 815 generally suggests that the customer may also prefer the second item 810.
  • the customer preference may be determined based on compiled previous shopping data for other customers.
  • the shopping data may include separate statistics for the two items, for example, based on historical purchase data for each item.
  • the shopping data may also include compiled information that directly interrelates the two items. For example, in previous shopping experiences, other customers also focused on item 805 (e.g., based on determined fields of view) but ultimately declined to purchase item 805 in favor of item 810.
  • Presentation 820 indicates that a promotion is available for the second item 810.
  • Presentation 820 may reflect a sales promotion or prices already in effect (e.g., a reporting function), or the promotion may be dynamically generated to encourage the customer to purchase item 810.
  • manufacturers or stores may wish to promote item 810 over item 805, and the promotion may be presented to the customer in real-time.
  • the promotion, as well as the timing of its presentation to the customer, may be determined based on a determined level of customer interest in an item.
  • Presentation 825 indicates that item 805 would have a negative impact on the customer's goals.
  • the presentation 825 may be based on customer profile data such as customer preferences and/or programs. Examples of programs may include fitness, nutrition, or health goals, money management goals, etc.
  • one embodiment may present to the customer the information upon which the negative impact determination was made. If the system determines that item 805 is incompatible with specified goals or preferences, or that better alternative items exist consistent with the customer's goals/preferences, the system may recommend one or more alternative items for presentation to the customer.
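The goal-compatibility check described above could be as simple as comparing item attributes against per-goal limits stored in the customer profile. The profile schema and attribute keys below are hypothetical examples, not taken from the disclosure.

```python
def goal_conflicts(item, profile):
    """Return the names of customer goal limits that the item exceeds."""
    return [goal for goal, limit in profile.get("limits", {}).items()
            if item.get(goal, 0) > limit]

def suggest_alternatives(candidates, profile):
    # Recommend only candidate items that satisfy every customer limit.
    return [c for c in candidates if not goal_conflicts(c, profile)]
```

For a customer with a nutrition goal (say, a sugar limit), a focused item that exceeds the limit would trigger a presentation like 825, and any compliant candidates would be recommended instead.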
  • FIG. 8C illustrates several example presentations of information to a customer.
  • the presentations may be delivered in real-time to the customer using any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic).
  • the presentations may be output using a customer's worn or carried computing device, or using output hardware (e.g., displays, speakers, etc.) that is disposed within the shopping environment.
  • LEDs or other visual display elements may be included in the shelving units, such as display elements 835a-b and 836c-d.
  • a single item 805 is included within a customer's field of view 705, and item 810 is located outside the field of view.
  • similar techniques may be applied to a field of view or a customer focus that includes more than one item.
  • the system may display various indicators that highlight the item 805 to the user. For example, if the field of view 705 represents the customer view through a head-worn computing device such as wearable computing device 500, the system may display colors, shapes, etc. that visually enhance the item 805, or may reduce the visibility of other items in the field of view (e.g., gray them out).
  • the system may also display location information for the item 805, for example, by illuminating one or more of the display elements 835a-b nearest to the item 805.
  • the system may output directional information from the current field of view 705 to the item.
  • the system may display an arrow 830 overlay indicating the relative direction from the field of view 705 or item 805 to the target item 810.
  • the system may also provide directional information using one or more of the display elements 835, 836.
  • the display elements 835, 836 may be illuminated in sequence from left to right to guide the customer's eye toward item 810.
  • the various types of information discussed here may additionally or alternately be provided to the customer by non-visual means, such as audio or haptic outputs.
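The arrow 830 overlay reduces to a simple geometric computation: take the offset from the current field-of-view center to the target item and quantize it to an arrow glyph. The coordinate convention and eight-direction quantization below are illustrative assumptions.

```python
import math

def arrow_direction(fov_center, target):
    """Pick an overlay arrow direction from the field-of-view center
    toward a target item, both given as (x, y) store coordinates."""
    dx = target[0] - fov_center[0]
    dy = target[1] - fov_center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Quantize the bearing to the nearest of eight arrow directions.
    labels = ["right", "up-right", "up", "up-left",
              "left", "down-left", "down", "down-right"]
    return labels[int((angle + 22.5) % 360 // 45)]
```

The same bearing could instead drive the sequenced illumination of shelf display elements 835, 836 toward the target item.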
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment.
  • Method 900 may generally be used in conjunction with the various environments and systems discussed above.
  • Method 900 begins at block 905, where a field of view is determined for the customer. Determining a field of view may include a “direct” determination, such as capturing an image from a body-worn, forward-looking camera (i.e., the image strongly corresponds to the customer's eye gaze). Alternatively (or in addition to the direct determination), the field of view may be determined “indirectly” by capturing images of the shopping environment and performing image processing to determine various aspects. The aspects may include the determined position and orientation of one or more of the customer's body, head, shoulders, eyes, arms, hands, etc. The images may be captured by visual sensors that are worn or disposed throughout the shopping environment.
  • identifying items may include performing image processing on a captured image that reflects the customer's field of view (e.g., from a forward-looking camera).
  • identifying items may include referencing coordinate data determined for the customer's field of view against a known store layout that includes item location information. In such an embodiment, image processing may not be necessary to identify the one or more first items.
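The "indirect" identification path just described, which references field-of-view coordinates against a known store layout instead of running image processing, might be sketched as a wedge test: an item is in view if it lies within the angular width and range of the customer's determined position and heading. All names and the layout format are illustrative assumptions.

```python
import math

def items_in_view(position, heading_deg, fov_deg, max_range, layout):
    """Return ids of items inside the customer's field-of-view wedge.

    layout: dict of item_id -> (x, y) location in store coordinates.
    """
    visible = []
    for item_id, (x, y) in layout.items():
        dx, dy = x - position[0], y - position[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > max_range:
            continue
        bearing = math.degrees(math.atan2(dy, dx))
        # Smallest signed angle between the heading and the item bearing.
        off = (bearing - heading_deg + 180) % 360 - 180
        if abs(off) <= fov_deg / 2:
            visible.append(item_id)
    return visible
```

No captured image is processed here; the customer's position and orientation (e.g., from overhead sensors) are enough once item locations are mapped.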
  • At block 925, at least one second item is selected for presentation to the customer.
  • the selection is based on the identified one or more first items.
  • at least one of the second item(s) may be included in the identified first items.
  • each of the second item(s) may be different than the identified first items.
  • the selection may be based on decisional logic for recommending a particular item for sale, based on historical shopping data for the customer (and/or other customers) and/or customer preferences or programs.
  • information related to the at least one second item is presented to the customer.
  • Presenting information may be done in any suitable manner, such as displaying textual or graphical information in a visual overlay, displaying directional information to the customer, or outputting sound or other feedback.
  • the information may be related to the properties of the second item (e.g., price, nutrition information, etc.) or to historical shopping data, promotions, or other customer-specific preferences or programs.
  • the timing of the presentation may be determined based on customer interest scores or other timing considerations that are collectively intended to influence a customer's shopping purchases.
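The blocks of method 900 can be wired together as a short pipeline: determine the field of view, identify the first items within it, select at least one second item, and present related information. Every helper below is a stub standing in for the subsystems discussed above; the names and data shapes are illustrative only.

```python
def determine_field_of_view(sensor_frame):
    # Block 905: e.g., visible store regions derived from a worn camera
    # or from environment sensors (hypothetical frame format).
    return sensor_frame["visible_regions"]

def identify_first_items(regions, layout):
    # Match the field of view against a known store layout, so no image
    # processing is needed once item locations are mapped.
    return [item for item, region in layout.items() if region in regions]

def select_second_items(first_items, shopping_history):
    # Block 925: decisional logic; here, recommend the alternative that
    # historical data links to each viewed item, if one exists.
    seconds = [shopping_history[i] for i in first_items if i in shopping_history]
    return seconds or first_items

def present(items):
    # Final block: render textual presentations (could equally be audio
    # or haptic output).
    return [f"You might also like item {i}" for i in items]

def method_900(sensor_frame, layout, shopping_history):
    regions = determine_field_of_view(sensor_frame)
    first = identify_first_items(regions, layout)
    second = select_second_items(first, shopping_history)
    return present(second)
```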
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment.
  • Method 940 may generally be used in conjunction with the various shopping environments and systems discussed above. In one embodiment, method 940 is performed as part of performing method 900 .
  • Method 940 begins at block 945, where a customer focus is determined on at least one item selected from one or more items that are included within the customer's field of view. The customer focus may be time-based, such as requiring the item to remain within the field of view for a minimum amount of time.
  • a customer interest score is determined for the at least one customer-focused item.
  • the customer interest score may also be time-based, and may be influenced by various characteristics of the interaction of the customer with the item, which may be observed through the determined customer field of view.
  • the customer interest score may be determined concurrently with determining the customer focus on an item.
  • the customer-focused item is determined based on the customer interest score for that item reaching a predetermined threshold value.
  • the customer-focused item is determined as the item whose score exceeds the scores of the other items within the field of view.
  • the content of the information to present to the customer is determined based on the determined customer interest score. Determining the content may include determining whether to present information for the customer-focused item or for an alternate item, whether or not to present a promotion for the item or the alternate item, determining an amount of the promotion, and so forth.
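The content decision at the end of method 940 can be expressed as a mapping from the customer interest score to a presentation choice: plain information, a promotion on the focused item, or a promotion steering toward an alternate item. The thresholds and promotion amounts below are invented for illustration.

```python
def choose_content(score, focused_item, alternative=None):
    """Map a customer interest score to presentation content.

    Returns (kind, item, promotion_fraction); thresholds are
    illustrative assumptions, not from the disclosure.
    """
    if score >= 5.0:
        # Strong interest: product information alone is likely enough.
        return ("info", focused_item, None)
    if score >= 2.0:
        # Moderate interest: a small promotion may tip the decision.
        return ("promotion", focused_item, 0.10)
    if alternative is not None:
        # Waning interest: try steering toward an alternative item.
        return ("promotion", alternative, 0.20)
    return ("none", None, None)
```

The customer shopping history (e.g., prior promotion utilization) could further adjust the thresholds or amounts per customer.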
  • aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.”
  • the present disclosure may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Embodiments of the disclosure may be provided to end users through a cloud computing infrastructure.
  • Cloud computing generally refers to the provision of scalable computing resources as a service over a network.
  • Cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.
  • cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
  • cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g., an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user).
  • a user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. For example, applications (e.g., a retail store app for a mobile computing device) or related data (e.g., compiled shopping data) may reside in the cloud.

Abstract

Method, computer program product, and system to influence a person within an environment having a plurality of items for selection. The method includes capturing, using a first visual sensor disposed within the environment, field of view information for the person, performing analysis on the field of view information using a computing device, and identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person. The method further includes selecting, based on the identified one or more items, at least one second item of the plurality of items for presentation to the person; and presenting information related to the at least one second item to the person.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims benefit of U.S. provisional patent application Ser. No. 62/064,323, filed Oct. 15, 2014, entitled “Integrated Shopping Environment,” which is herein incorporated by reference.
  • BACKGROUND
  • The present disclosure relates to a sensor-based environment, and more specifically, to providing an adaptive personal experience within such an environment using a determined field of view of the person.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 4 illustrates a system of influencing shopping experience based on a customer field of view, according to one embodiment.
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment.
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment.
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment.
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment.
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment.
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized on other embodiments without specific recitation. The illustrations referred to here should not be understood as being drawn to scale unless specifically noted. Also, the drawings are often simplified and details or components omitted for clarity of presentation and explanation. The drawings and discussion serve to explain principles discussed below, where like designations denote like elements.
  • DETAILED DESCRIPTION
  • Aspects of the current disclosure relate to an integrated environment capable of providing a personalized, automated, and adaptive experience for a person within the environment. A number of different sensor devices may be employed within the environment, and networked with various computing devices such as point-of-sale (POS) terminals, digital signage, servers, and mobile or handheld computing devices to provide a seamless integration of mobile technologies and e-commerce into a traditional experience.
  • Using one or more visual sensors within the environment, a retailer or other provider may determine a person's field of view, and may compile personal behaviors and determine personal preferences. This data may then be used to provide timely, tailored recommendations in real-time to the person in order to more effectively influence their experience. While generally discussed within the context of a shopping environment, such as a retail store or other commercial environment, it is contemplated that the techniques disclosed herein may be applied to other environments (some non-limiting examples include libraries, museums, classrooms, hospitals, etc.) to provide an adaptive experience for persons included therein.
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment. As shown, environment 100 includes a plurality of terminals 105, a plurality of servers 110₁, 110₂ coupled to a network 115, one or more sensors 120 of different types, one or more user devices 140, and one or more other devices 150. In some embodiments, the environment 100 may be integrated into the layout of a retail store, market, or other commercial environment that is known or hereinafter developed.
  • Terminals 105 generally include any structure that is capable of receiving input from customers and/or producing output to customers within the environment 100. The terminals 105 may include computing systems, portions of computing systems, or devices controllable by computing systems. In one example, a terminal may include a computing device that is communicatively coupled with a visual display and audio speaker(s), as well as being communicatively coupled with one or more input devices. In another example, a terminal may include a visual display and associated driver hardware, but a computing device coupled to the terminal and providing data for display is disposed separately from the terminal. In some embodiments, terminals 105 may be implemented as standalone devices, such as a kiosk disposed on the store floor or monolithic device disposed on a shelf or platform. In some embodiments, terminals 105 may be integrated partially or wholly with other components of the environment 100, such as input or output devices included with shelving or other structural components in the environment (e.g., components used for product display or storage). In some embodiments, terminals 105 may be modular and may be easily attachable and detachable to elements of the environment 100, such as the structural components.
  • Generally, terminals 105 may be distributed throughout the environment 100 and may enhance various phases of the shopping experience for customers. For example, terminals 105 may include digital signage 108 disposed throughout the environment, such as included in or near aisles, endcaps, displays, and/or shelving in the environment. A customer may view and/or interact with the digital signage 108 as he/she moves through the store environment. The digital signage may be included in a static display or may be movable, such as including digital signage within a customer's shopping cart or basket. Terminals 105 may also include POS terminals 106 that provide a checkout functionality, allowing the customer to complete his/her shopping transaction (e.g., make payment for selected items). In some embodiments, terminals 105 may provide an integrated functionality. For example, the terminals may function in one mode as digital signage, and when engaged by a customer, the terminals function as a POS terminal.
  • Servers 110₁, 110₂ generally include processors, memory, and communications capabilities, and may perform various computing tasks to support the commercial operation of the environment 100. Servers 110₁, 110₂ communicate using various wired and/or wireless communications methods with terminals 105, sensors 120, and with other networked devices such as user devices 140 and other devices 150. Servers 110₁, 110₂ generally execute computer program code in which input data is received from networked devices, the input data is processed and/or stored by the servers, and output data is provided to networked devices for operation of the environment 100.
  • Sensors 120 may include various sensor devices, such as video sensors 125, audio sensors 130, and other sensors 135. The other sensors 135 generally include any sensors that are capable of providing meaningful information about customer interactions with the environment, e.g., location sensors, weight sensors, and so forth. Sensors 120 may be deployed throughout the environment 100 in fixed and/or in movable locations. For example, sensors 120 may be statically included in walls, floors, ceilings, displays, or other devices, or may be included in shopping carts or baskets capable of being transported around the environment. In one embodiment, sensors 120 may include adjustable position sensor devices, such as motorized cameras attached to a rail, wire, or frame. In one embodiment, sensors 120 may be included on one or more unmanned vehicles, such as unmanned ground vehicles (UGVs) or unmanned aerial vehicles (UAVs or “drones”). Sensors 120 may also include sensor devices that are included in user devices 140 or other devices 150 (which in some cases may include body-worn or carried devices). User devices 140 and other devices 150 may include passive or actively-powered devices capable of communicating with at least one of the networked devices of environment 100. One example of a passive device (which may be worn or carried) is a near-field communication (NFC) tag. Active devices may include mobile computing devices, such as smartphones or tablets, or wearable devices such as a Google Glass™ interactive eyepiece (Glass is a trademark of Google Inc.). The user devices 140 generally denote ownership or possession of the devices by customers, while the other devices 150 denote ownership or possession by the retailer or other administrator of the environment 100. In some cases, other devices 150 may be carried by employees and used in the course of their employment. User devices 140 and other devices 150 may execute applications or other program code that generally enables various features provided by the servers and/or other networked computing devices.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment. Generally, the system 200 corresponds to the environment 100 described above. System 200 includes one or more processors 205, memory 210, and input/output 250, which are interconnected using one or more connections 240. In one embodiment, system 200 may be included in a singular computing device, and the connection 240 may be a common bus. In other embodiments, system 200 is distributed and includes a plurality of discrete computing devices that are connected through wired or wireless networking. Processors 205 may include any processing element suitable for performing functions described herein, and may include single or multiple core processors, as well as combinations thereof. Processors 205 may be included in a single computing device, or may represent an aggregation of processing elements included across a number of networked devices such as user devices 140, POS terminals 105, etc.
  • Memory 210 may include a variety of computer-readable media selected for their size, relative performance, or other capabilities: volatile and/or non-volatile media, removable and/or non-removable media, etc. Memory 210 may include cache, random access memory (RAM), storage, etc. Storage included as part of memory 210 may typically provide a non-volatile memory for the networked computing devices (e.g., servers 110 1, 110 2), and may include one or more different storage elements such as Flash memory, a hard disk drive, a solid state drive, an optical storage device, and/or a magnetic storage device. Memory 210 may be included in a single computing device or may represent an aggregation of memory included in networked devices. Memory 210 may include a plurality of modules 211 for performing various functions described herein. The modules 211 include program code that is executable by one or more of the processors 205. As shown, modules 211 include user identification 212, item identification 214, advertising 216, recommendations 218, virtual cart 220, assistance 222, security 224, power management 226, gaming 228, audit 230, loyalty program 232, and inventory 234. The modules 211 may also interact to perform certain functions. For example, a loyalty program module 232 during operation may make calls to user identification module 212, item identification module 214, advertising module 216, and so forth. The person of ordinary skill will recognize that the modules provided here are merely non-exclusive examples; different functions and/or groupings of functions may be included as desired to suitably operate the shopping environment. Memory 210 may also include customer profiles 236 and customer images 238, which may be accessed and/or modified by various of the modules 211. In one embodiment, the customer profiles 236 and customer images 238 may be stored on the servers 110 1, 110 2 or on a separate database.
  • Input/output (I/O) 250 may include a number of different devices interfacing with various computing devices and with the shopping environment. I/O 250 includes sensors 120, which have been described above. I/O 250 may further include input devices 252 and output devices 254 that are included to enhance the shopping experience for customers. In one embodiment, terminals 105, user devices 140, and other devices 150 may include visual displays and/or audio speakers (examples of the output devices 254), and various input devices 252 (such as cameras, keyboards or keypads, touchscreens, buttons, inertial sensors, etc.). I/O 250 may further include wired or wireless connections to an external network 256 using I/O adapter circuitry. Network 256 may include one or more networks of various types, including a local area or local access network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet). In one embodiment, various networked computing devices of the system 200 are interconnected using a LAN, and one or more computing devices (e.g., servers 110 1, 110 2, user devices 140) include connections to the Internet.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment. The environment 300 includes a plurality of sensor modules 302 disposed in the ceiling 301 of the store. The sensor modules 302 may each include one or more types of sensors, such as video sensors (e.g., cameras), audio sensors (e.g., microphones), and so forth. Sensor modules 302 may also include actuating devices for providing a desired sensor orientation. Sensor modules or individual sensors may generally be disposed at any suitable location within the environment 300. Some non-limiting examples of alternative locations include below, within, or above the floor 330, within other structural components of the environment 300 such as a shelving unit 303 or walls, and so forth. In some embodiments, sensors may be disposed on, within, or near product display areas such as shelving unit 303. The sensors may also be oriented toward an expected location of a customer interaction with items, to provide better data about a customer's interaction, such as determining a customer's field of view.
  • Environment 300 also includes a number of kiosks (or terminals) 305. Generally, kiosks 305 may be configured for performing customer checkout and/or other shopping functions. Each kiosk 305 may include computing devices or portions of computing systems, and may include various I/O devices, such as visual displays, audio speakers, cameras, microphones, etc. for interacting with the customer. In some embodiments, a customer 340 may have a mobile computing device, such as smartphone 345, that communicatively couples with the kiosk 305 for completing the checkout transaction. In one embodiment, the mobile computing device may execute a store application that is connected to the networked computing systems (e.g., through servers 110 1, 110 2), or may be directly connected to kiosk 305 through wireless networks within the environment (e.g., over Wi-Fi or Bluetooth). In one embodiment, the mobile computing device may couple to the kiosk 305 when brought within range, e.g., using Bluetooth or NFC.
  • Environment 300 also includes one or more shelving units 303 having shelves 310 that support various store items 315. Though not shown, multiple shelving units 303 may be disposed in a particular arrangement in the environment 300, and the space between adjacent shelving units may form aisles through which customers may travel. In some embodiments, the shelving unit 303 may include visual sensors or other sensor devices or I/O devices. The sensors or devices may couple to a customer's smartphone 345 and/or other networked computing devices (including servers) within the environment 300. For example, the front portions 320 of shelves 310 may include video sensors oriented outward from the shelving unit 303 to capture customer interactions with items 315 on the shelving unit 303, and the data from the video sensors may be provided to back-end servers for storage and/or analysis. In some embodiments, portions of the shelving unit 303 (such as the front portions 320 of shelves 310) may include indicator lights or other visual display devices or audio output devices that are used to communicate with a customer.
  • FIG. 4 illustrates a system of influencing shopping experience based on a customer field of view, according to one embodiment. System 400 may be used in coordination with the various shopping environments described herein. Generally, system 400 may share at least portions of several components with the shopping environment system 200, such as processors 205, memory 210, and I/O 250. System 400 may also utilize one or more of the modules 211 to provide various aspects of the system's functionality, such as item identification module 214, advertising module 216, recommendation module 218, and so on.
  • I/O 250 includes output devices 254 and sensors 120. Output devices 254 include one or more devices for presenting information to customers and include audio output devices 440 and visual output devices 445. The audio output devices 440 may include conventional audio speakers having any suitable form factor (e.g., in a stereo, headphones, etc.), as well as devices using alternative methods of producing sound to a customer, such as bone conduction transducers in a worn device. Visual output devices 445 may include visual displays and various visual indicators such as light emitting diodes (LEDs). Other output devices 450 may provide information to customers through tactile feedback (e.g., haptic devices) or other sensory stimuli. Sensors 120 may include visual sensors 455, which may include carried or worn sensors 460 as well as distributed sensors 465 that are disposed throughout the shopping environment. Other sensors 470 may be included that are suitable for collecting information about a customer and his/her interactions within the shopping environment. Examples of other sensors 470 may include infrared (IR) sensors, thermal sensors, weight sensors, capacitive sensors, magnetic sensors, sonar sensors, radar sensors, lidar sensors, and so forth.
  • The visual sensors 455 may be used to capture one or more images 415 of the customer and/or the shopping environment, which may include views from various perspectives (e.g., a customer-worn visual sensor, static or movable visual sensors at various locations in the environment). The images 415 may be stored in memory 210, and may be individually or collectively processed to determine information about customers in the environment and their respective interactions with items in the environment.
  • Memory 210 includes one or more programs 435 that collectively receive data about customers and the shopping environment, process the received data, and transmit information to customers in order to influence the customers' shopping experiences. Programs 435 may include program code to determine a customer's field of view at a given time, including which items are included in the field of view. In one embodiment, the customer's field of view may be determined directly. For example, a body-worn device may include a visual sensor (i.e., a worn visual sensor 460) that, when the device is worn, gives the visual sensor an orientation that is similar to the orientation of the customer's head or eyes (e.g., a forward-looking camera). Images captured from the worn visual sensor may generally reflect the customer's field of view.
  • In some embodiments, the customer's field of view may be estimated (or determined indirectly) using other sensor measurements. In one embodiment, the customer's field of view may be estimated by determining the orientation of one or both of the customer's eyes. Eye orientation may be determined using worn visual sensors 460 (e.g., an inward-facing camera on a head-worn device) and/or distributed visual sensors 465 (e.g., capturing images of the customer's face and image processing to determine an eye orientation). In other embodiments, the customer's field of view may be estimated by determining the position and/or orientation of the customer's head and/or body using various visual sensor measurements. The customer's field of view may be represented in any suitable data format, such as an image or as coordinate data (e.g., Cartesian, polar, spherical coordinates).
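The indirect estimate described above can be sketched as a simple geometric test: given an eye position and a gaze angle measured against a reference direction, decide whether a point in the environment falls within the estimated field of view. This is an illustrative two-dimensional sketch only; the function name, coordinate convention, and angular half-width are assumptions, not part of the disclosed system.

```python
import math

def in_estimated_fov(eye_pos, gaze_angle_deg, point, half_angle_deg=30.0):
    """Return True if `point` falls inside a field of view estimated from
    the eye position and a gaze angle measured against the x-axis.
    2-D overhead sketch; all names and the 30-degree half-angle are
    illustrative assumptions."""
    dx = point[0] - eye_pos[0]
    dy = point[1] - eye_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed difference between the gaze direction and the
    # bearing from the eye to the point, normalized to (-180, 180].
    diff = (bearing - gaze_angle_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle_deg
```

In practice the same test would be run in three dimensions against the eye's (x, y, z) position, but the angular comparison is the same idea.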
  • While it is possible that a single visual sensor 455 may be used to determine a customer's field of view, several embodiments employ a combination of a plurality of visual sensors 455 to determine a field of view. These embodiments may be preferred as providing additional data to support a more accurate estimate of the field of view. Additionally, the plurality of visual sensors used to determine a field of view may include visual sensors selected from different categories (worn sensors 460, distributed sensors 465) to provide additional robustness to the collected data.
  • Programs 435 may also include program code to identify one or more items included within the customer's field of view. Items may be identified directly, or their presence within the field of view may be estimated. One example of direct identification is performing image processing on images collected from a worn, forward-looking camera to visually identify one or more items. Estimating items within a customer's field of view may require combining sensor data with known information about the shopping environment, such as the items in the environment (item data 420) and their relative arrangement or layout within the environment (location data 425).
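The estimation path, combining a determined field of view with item location data, amounts to a spatial lookup against the store layout. A minimal sketch follows; the rectangle representation of the field of view and the flat dictionary of item coordinates are illustrative assumptions about how item data 420 and location data 425 might be stored.

```python
def items_in_view(fov_rect, item_locations):
    """Return the items whose recorded shelf locations fall inside an
    estimated field of view.

    fov_rect: (x_min, y_min, x_max, y_max) in store coordinates.
    item_locations: mapping of item ID -> (x, y) shelf coordinates
    (a hypothetical stand-in for location data 425)."""
    x0, y0, x1, y1 = fov_rect
    return [item for item, (x, y) in sorted(item_locations.items())
            if x0 <= x <= x1 and y0 <= y <= y1]
```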
  • Programs 435 may also include program code to present information to customers based on the identified one or more items. In some embodiments, the information may be used to influence the customer's shopping experience. The information presented to the customer may include information about the identified items (e.g., nutritional data, pricing, a customer's purchase history of the items, etc.), information encouraging the purchase of identified items (e.g., bringing particular items to the customer's attention, touting the items' features, offering discounts or other promotions on the items, etc.), and information encouraging the purchase of alternatives to the identified items (e.g., highlighting differences of the items, offering discounts, etc.).
  • To present relevant and persuasive information in real-time to a particular customer, programs 435 may access and analyze various additional data in memory 210 that is related to the customer and perhaps other customers of the shopping environment. In one embodiment, programs 435 may analyze shopping data 430 collected from previous shopping experiences for the particular customer and/or for other customers or groups of customers. For example, shopping data 430 may include customer views and the items included therein, progressions of customer views (showing a customer's interest over time), selection or purchase history for items, and so forth. While shopping data 430 may be compiled and used to generate information to present to customers and influence their shopping experiences in real-time, the shopping data 430 may also be used by the retailer or administrator of the shopping environment to modify the layout of the environment. The shopping data 430 may help the retailer identify trends in customer shopping, and to optimize placement of items within the environment to improve customer sales.
  • System 400 may also present information to customers based on their personal preferences 405. The preferences 405 may generally be stored in a corresponding customer profile 236, and may reflect preferences that are explicitly specified by a customer, or may be determined based on the customer's historical shopping behavior (e.g., included in shopping data 430). For example, a customer may have an allergy to a particular ingredient, and the customer may enter this allergy information in preferences 405, e.g., using a mobile phone app for the retailer. Accordingly, the system 400 when determining which information to present to the customer may present information that highlights items within the customer's field of view that include the ingredient, and may further suggest alternative items that do not include the ingredient.
  • A customer's shopping history may also be used to determine customer preferences 405. For example, the customer's determined fields of view and purchase history from shopping data 430 may be processed to deduce which items, combinations of items, and/or aspects of items are preferred by the customer. Preferred aspects might include preferred brands, costs, quantities, sizes, ingredients, nutritional properties (e.g., calorie content, fat, sugar, vitamins, minerals, etc.), and so forth. As one illustration, the customer may specify a preference for low-fat foods, and the system may determine recommended items based on the items included in the customer's field of view and the customer's preferences. This may include suggesting a particular item within the field of view for purchase (or alternatively, an item located outside the field of view) and/or advising the customer about the item's properties vis-à-vis the customer's preferences (e.g., reporting fat content).
  • Of course, a customer's preferences may be included as a logical combination of a plurality of these aspects (e.g., a customer prefers Brand X items to Brand Y, so long as the cost of the Brand X item is no more than 150% of the Brand Y item). In some embodiments, other customers' shopping data may also be used to deduce a particular customer's preferences. Of course, the preferences may be dynamically updated to identify whether deduced preferences are accurate or not. The dynamic updating may be caused by the customer's explicit indication and/or by the customer's shopping patterns following the deduced preference. For example, the system 400 may deduce that a customer has a categorical preference for Brand X items over similar Brand Y items. However, the customer's historical shopping data indicated that the customer looked at a Brand X item (e.g., field of view data) before selecting and purchasing a similar Brand Y item (e.g., field of view data and/or purchase history data). The system in response may adapt or may entirely remove the deduced (and determined inaccurate) preference.
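The logical combination given as an example above (prefer a Brand X item over Brand Y so long as the Brand X price is no more than 150% of the Brand Y price) can be written as a small predicate. The tuple representation and function name are illustrative assumptions.

```python
def preferred_item(brand_x, brand_y, ratio_cap=1.5):
    """Apply a deduced preference rule: prefer the Brand X item only
    while its price is no more than `ratio_cap` times the Brand Y
    price; otherwise fall back to Brand Y.

    brand_x, brand_y: (name, price) tuples (hypothetical item records)."""
    return brand_x if brand_x[1] <= ratio_cap * brand_y[1] else brand_y
```

A deduced rule like this would be the thing the system adapts or removes when later field-of-view and purchase data contradict it.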
  • In some embodiments, system 400 may present information to customers that is also based on other programs 410. Examples of programs 410 may include fitness, nutrition, or health goals, money management goals, etc. In some embodiments, the programs 410 themselves may be integrated into a store application and accessible by the customer's mobile computing device or wearable device. In other embodiments, the store application may interface with applications from other providers to determine the customer's goals and present appropriate information during the shopping experience. For example, the system 400 could include a nutrition-oriented program, and may make suggestions for more nutritious items to a customer who is looking at junk food items (e.g., candy).
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment. Portions of wearable computing device 500 may be head-worn or worn on other portions of the body. The wearable computing device 500 includes a housing 505 comprising several structural components. The band portion of the housing 505 may be used as a structural frame, supporting other portions while itself being supported by a customer's head when worn. Other structural components may include nose pieces 520, ear piece 515, and enclosure 512. Enclosure 512 may include a computing device 525, which includes video I/O components 530 and audio I/O components 540. The enclosure formed by the earpiece 515 may also house components, such as a battery 545 for powering the computing device 525. Although not shown, computing device 525 also includes wireless networking components for communicating with other computing devices. The video I/O components 530 may include a forward-looking camera 513 that provides an estimate of the wearer's field of view based on the wearer's head orientation, and a transparent prism 514 that is used to project light onto the wearer's retina, displaying information to the wearer over the wearer's natural field of view. Other video I/O components may include an inward-looking camera that is configured to capture the eye orientation of the wearer, a conventional display device (e.g., an LCD), and so forth. Audio I/O components 540 may include one or more microphones and/or audio output devices, such as speakers or bone-conducting transducers.
  • FIG. 5B illustrates the wearable computing device 500 as worn. As shown, scene 550 includes the wearer's natural view 560. A portion of the area of the natural view 560 may be used for displaying an overlay 565 (e.g., using the prism 514) that provides additional information to the wearer. The display of the overlay 565 may be sufficiently transparent to permit the wearer to continue to observe their natural view through the overlay area. As shown, the overlay 565 includes a map view of the wearer's current location (e.g., the wearer is at W 34th street). In some embodiments, information may be selected and visually presented in text and/or graphics to a customer wearing such a device to influence his/her shopping experience.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment. In scene 600, a shelving unit 603 is depicted having a plurality of shelves 610 that each support and display a number of items 612 that are available for selection and purchase by a customer.
  • Within the scene 600 are defined a customer's field of view 615 and an area 605 outside the customer field of view. In one embodiment, the customer's field of view 615 may be represented by an image captured from a forward-looking camera. While shown as generally rectangular, the customer field of view 615 may have any suitable alternative shape and size. For example, the customer's actual vision may encompass a significantly larger area, but determining the field of view for purposes of the shopping environment may include applying a threshold or weighting scheme that emphasizes areas that are closer to the center of a customer's vision. Of course, data provided by various visual sensors and/or other sensors may be used to make these determinations.
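A weighting scheme of the kind described, emphasizing areas closer to the center of a customer's vision, might be sketched as a radial falloff. The Gaussian form and the sigma value are assumptions chosen purely for illustration; the disclosed system leaves the weighting scheme open.

```python
import math

def center_weight(point, center, sigma=50.0):
    """Weight an image location by its distance from the center of the
    customer's vision: 1.0 at the center, falling off with a Gaussian
    profile. `sigma` (in pixels) is an illustrative assumption."""
    d2 = (point[0] - center[0]) ** 2 + (point[1] - center[1]) ** 2
    return math.exp(-d2 / (2.0 * sigma * sigma))
```

Applying a threshold to such weights is one way to carve a bounded field of view 615 out of the customer's much larger actual vision.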
  • The field of view 615 may include a plurality of fully included items 620, as well as a plurality of partially included items 625. When determining which items to identify as “included” in the field of view, certain embodiments may categorically include or exclude the partially included items 625. An alternative embodiment may rely on image processing to determine whether a partially included item 625 should be identified as included. For example, if the processing cannot recognize the particular item with a certain degree of confidence, the item may be excluded. In another alternative embodiment, partially included items 625 may be included, and the amount of item inclusion (e.g., the percentage of surface area of the item included) may be used to determine a customer focus or calculate a customer interest score, which are discussed further below.
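The "amount of item inclusion" mentioned above, i.e., the fraction of an item's visible area that falls within the field of view, reduces to a rectangle-intersection computation when both the item and the field of view are approximated by bounding boxes. This sketch assumes axis-aligned boxes; the function name is illustrative.

```python
def inclusion_fraction(item_rect, fov_rect):
    """Fraction of the item's bounding box lying inside the field of
    view. Both rects are (x_min, y_min, x_max, y_max); returns a value
    in [0, 1]."""
    ix0 = max(item_rect[0], fov_rect[0])
    iy0 = max(item_rect[1], fov_rect[1])
    ix1 = min(item_rect[2], fov_rect[2])
    iy1 = min(item_rect[3], fov_rect[3])
    overlap = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    item_area = (item_rect[2] - item_rect[0]) * (item_rect[3] - item_rect[1])
    return overlap / item_area if item_area else 0.0
```

A categorical include/exclude rule for partially included items 625 is then just a threshold on this fraction (e.g., include when the fraction exceeds 0.5).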
  • Items that are fully or partially included in the customer field of view 615 may be recognized by performing image processing techniques on images captured by various visual sensors. For example, images that include the items may be compared against stock item images stored in a database or server. To aid image processing, items may also include markers or distinctive symbols, some of which may include item identification data such as barcodes or quick response (QR) codes. Of course, other processing techniques may be employed to recognize a particular item, such as textual recognition, determining the item's similarity to adjacent items, and so forth.
  • Scene 650 of FIG. 6B depicts a customer 660 in a shopping environment. The customer 660 is standing in an aisle 655 adjacent to a shelving unit 603, which has a plurality of shelves 610. Visual sensors may capture one or more images of scene 650 from various spatial perspectives, and the images may be used to determine the customer's field of view. Specifically, various aspects of the scene that are captured in the images may be used to estimate the customer's field of view.
  • In one embodiment, the relative position and/or orientation of portions of the customer's body may be determined. In one embodiment, the position and orientation of the customer's eyes 680 may be determined. For example, eye position within the environment may be determined in Cartesian coordinates (i.e., determining x, y, and z-direction values) and eye orientation may be represented by an angle α defined relative to a reference direction or plane (such as horizontal or an x-y plane corresponding to a particular value of z). In other embodiments, other portions of the customer's body may (also) be used to determine the field of view, such as the position and orientation of the customer's head 665, or of one or both shoulders 670. In other embodiments, the customer's interaction with the shelving unit 603 by extending her arm 675 may also be captured in one or more images, and the direction of the extended arm may be used to determine her field of view.
  • Of course, embodiments may use combinations of various aspects of the scene to determine the customer's field of view. In some embodiments, the combinations may be weighted; for example, data showing a customer 660 reaching out her arm 675 towards a specific item may be weighted more heavily to determine her field of view than the orientation of her shoulders. In some embodiments, the weights may be dynamically updated based on the customer's shopping behaviors following an estimate of the customer's field of view. For example, if a customer reached for (or selected) an item that was not included in the determined field of view, the system may adjust the relative weighting in order to accurately capture the customer's field of view. This adjustment may include determining correlation values between particular captured aspects of the scene and the selected item; for example, the customer's head may be partly turned towards the selected item, but their eye orientation may generally be more closely tied to the selected item. In some embodiments, the correlation values may be more useful where one or more aspects of the scene cannot be determined (e.g., where the system cannot determine eye orientation due to the customer wearing sunglasses, non-optimal visual sensor positioning, etc.).
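The weighted combination of scene cues described above can be sketched as a confidence-weighted average over whichever cues are available, which also covers the case where a cue (such as eye orientation hidden by sunglasses) is missing. The cue names, weight values, and angle representation are illustrative assumptions.

```python
def combine_cues(cues, weights):
    """Fuse several gaze-direction estimates into one.

    cues: mapping of cue name -> estimated gaze angle in degrees
          (e.g., {"eyes": ..., "head": ..., "shoulders": ...});
          missing cues are simply absent from the mapping.
    weights: cue name -> relative confidence weight.
    Returns the weighted-average angle, or None if no usable cue."""
    usable = {k: v for k, v in cues.items() if k in weights}
    if not usable:
        return None
    total = sum(weights[k] for k in usable)
    return sum(weights[k] * v for k, v in usable.items()) / total
```

Dynamic weight updating would then adjust the `weights` mapping after each case where a selected item fell outside the estimated field of view.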
  • Scene 685 of FIG. 6C illustrates an overhead view of several customers 660 in a shopping environment. In one embodiment, the view of scene 685 may be represented by an image captured from a ceiling mounted camera, or from a drone.
  • Certain additional aspects depicted in scene 685 and captured in images may be used to estimate a customer's field of view. In one example, the orientation of customer 660A may be estimated using the relative position of his/her shoulders 670. As shown, a line connecting the two shoulders may be compared to a reference direction or plane (e.g., parallel to the length of shelving unit 603A) and represented by an angle β. In another example, the orientation of customer 660B may be estimated using the orientation of his/her head 665, comparing a direction of the customer's head to a reference direction or plane, which may be represented by an angle γ. Images may also capture a customer 660C interacting with the shelving unit 603B, and the position and/or orientation of the customer's arm 675 may be used to determine the customer's field of view.
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment. While the computing systems may determine information to present to a customer based on all of the items identified within a determined field of view, in some embodiments it may be advantageous to make a further identification of one or more items that the customer is specifically focused on. Presenting information to the customer that is based on the customer-focused items may generally provide more relevant and more persuasive information to influence the customer's shopping experience.
  • Items 715A-D are included on a shelf 710 of a shelving unit 703. The determined field of view 705 of a customer may include different groups of items at different times, and the progression of the customer's field of view over time may help determine which item(s) within the field of view are specifically being focused on by the customer. Generally, the customer's focus on a particular item may indicate that the item is merely attracting the customer's attention, or that the customer is deciding whether or not to purchase the item. Either way, understanding the object of a customer's focus may help retailers or suppliers to improve packaging and placement of items or to influence the customer's shopping experience in real-time.
  • View 1 illustrates a field of view 705 that includes items 715A, 715B, and a portion of 715C. In one embodiment, items 715A and 715B may be included in a customer focus determination due to the items' full inclusion within the field of view 705. Conversely, item 715C may be excluded from a customer focus for being only partially included in the field of view 705. In an alternative embodiment, all three items may be included by virtue of being at least partially included in the field of view 705. In some embodiments, a customer focus on particular items may be a time-based determination. For example, if the customer's field of view 705 remained relatively steady during a preset amount of time (e.g., 5 or 10 seconds), such that both items 715A, 715B remained within the field of view during this time, the computing system may determine that the customer is focused on items 715A and 715B. In some embodiments, the particular item(s) must continuously remain in the field of view 705 during the preset amount of time (e.g., remain across several samples of the field of view during this time).
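The time-based focus determination, requiring an item to remain continuously within the field of view across several samples, can be sketched as a set intersection over a sliding window of field-of-view samples. The sample-set representation and `min_samples` parameter are illustrative assumptions.

```python
def focused_items(fov_samples, min_samples):
    """Determine which items the customer is focused on.

    fov_samples: chronological list of sets of item IDs identified in
    successive field-of-view samples. An item counts as 'focused' only
    if it appears continuously in the most recent `min_samples`
    samples (a stand-in for the preset dwell time, e.g., 5-10 s)."""
    if len(fov_samples) < min_samples:
        return set()
    window = fov_samples[-min_samples:]
    return set.intersection(*window)
```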
  • View 2 illustrates a field of view 705 that includes item 715B, and portions of items 715A and 715C. View 2 could represent the same field of view as View 1 at a later time. In some embodiments, the customer's focus may be determined to include item 715B but not 715A. For example, the system could determine that 715B is the lone customer-focused item based on its relatively central position within the field of view 705, or perhaps based on the changes to the field of view from View 1 (i.e., a shift away from previously fully included item 715A). In one embodiment, item 715B must also remain within the field of view 705 for the predetermined amount of time to be considered as a customer-focused item. Of course, other methods are possible to identify one or more relatively prominent items within a field of view to determine a customer's focus, such as determining the percentage of the item that is included within the field of view, determining the percentage of the field of view occupied by the item, and so forth. In alternative embodiments, the customer-focused items may still include item 715A and/or item 715C.
  • View 3 illustrates a field of view 705 that includes item 715B, and portions of items 715A and 715C. View 3 differs from View 2 in that the field of view 705 is “closer” to the shelving unit 703 and items 715 in View 3 than in View 2. View 3 could represent the same field of view as View 2 at a later time, e.g., as the customer moves towards the shelving unit 703. Here, the system could determine that 715B is a customer-focused item based on its relatively central position within the field of view 705 and/or based on the changes to the field of view from View 2 (i.e., item 715B occupies an increased percentage of the field of view 705).
  • View 4 also illustrates a field of view 705 that includes item 715B, and portions of items 715A and 715C. View 4 could represent the same field of view as View 3 at a later time. In View 4, the customer has selected item 715B and is holding the item in his/her hand 725. Here, the system could determine that 715B is a customer-focused item based on its relatively central position within the field of view 705, based on the changes to the field of view from View 3 (e.g., item 715B occupies an increased percentage of the field of view 705), and/or based on the presence of a portion 730 of the customer's hand 725 included in the field of view 705.
  • In some embodiments, a customer interest score may be determined for items included within the customer's field of view. The customer interest score is generally time-based, and the score may be used to timely interject desired information in order to influence a customer's shopping decisions (e.g., whether to buy a particular item) in real-time. In one embodiment, a customer interest score is calculated and updated concurrent with determining a customer focus, and the customer interest score for an item exceeding a threshold value may be used to determine that the customer has focused on the item. The customer interest score may also be adjusted based on various other aspects shown in the Views 1-4, such as position within the field of view (e.g., a central position may mean more interest), percentage occupied within the field of view (e.g., a larger percentage may mean more interest), customer selection and/or manipulation of the item, and so forth. Of course, the calculation of a customer's interest score may reflect that customer's particular shopping behaviors (e.g., a customer is determined to be 95% likely to pick up and manipulate a customer-focused item within their field of view, so the mere fact of picking up an item might not indicate an increased level of customer interest in the item).
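One way the time-based component and the view-based adjustments could be combined is sketched below. The weights, the function name, and the `pickup_baseline` behavior term are all hypothetical; the disclosure does not prescribe a particular formula:

```python
# Hypothetical per-interval update of a customer interest score,
# combining time in view with the other signals described above.
# All weights are assumed, illustrative values.
def update_interest_score(score, dt_s, visible_fraction, is_central,
                          picked_up, pickup_baseline=0.0):
    """Advance an interest score by one observation interval of dt_s seconds.

    pickup_baseline: prior probability that this customer picks up any
    focused item; for a habitual handler (e.g., the 95%-likely customer
    above), a pickup adds little new information about interest.
    """
    score += dt_s * visible_fraction           # time in view, weighted by prominence
    if is_central:
        score += 0.5 * dt_s                    # central position suggests more interest
    if picked_up:
        score += 2.0 * (1.0 - pickup_baseline)  # discounted by shopping behavior
    return score
```

For a customer with `pickup_baseline=0.95`, picking up an item contributes only a small bonus, consistent with the behavior-aware calculation described above.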
  • After determining that a customer has focused on a particular item, the customer interest score may generally decrease over time as it becomes more likely that the customer will not purchase the particular item. To maintain or improve customer interest, and to thereby increase the likelihood that the customer will purchase the item, the system may provide information about the item at desired times (corresponding to customer interest scores). This may include providing information about the items or alternatives, and/or offering promotional pricing on the item (or an alternative item). In one embodiment, to provide more effective promotions (e.g., more customer-tailored and more timely), and to prevent customer abuse of a time-based promotional pricing scheme, the customer shopping history may include data regarding the number of promotions already offered to the customer, the length of time (or customer interest score) before offering the promotion, the customer utilization rate of the previous promotions, etc.
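The decay-then-promote behavior could, for instance, be modeled with an exponential half-life on the interest score, with the promotion offered when the score falls to a trigger level and the customer's promotion history has not exceeded a cap. The half-life, trigger fraction, and cap below are assumed values, not taken from the disclosure:

```python
import math

# Hypothetical sketch: after the customer focuses on an item, interest
# decays over time; a promotion is offered once the score falls to a
# trigger fraction of its peak, subject to a per-customer promotion cap
# (one simple guard against abuse of time-based promotional pricing).
def should_offer_promotion(peak_score, seconds_since_focus,
                           half_life_s=30.0, trigger=0.5,
                           promos_offered=0, max_promos=3):
    decayed = peak_score * math.exp(
        -math.log(2.0) * seconds_since_focus / half_life_s)
    return decayed <= trigger * peak_score and promos_offered < max_promos
```

With a 30-second half-life, a promotion fires roughly half a minute after focus begins to wane, but never for a customer who has already received the maximum number of offers.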
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment. As discussed above, the presentation of information may be based upon some or all of the items that are identified as being within the customer's field of view. For ease of description, FIG. 8A illustrates a single item 805 within the customer's field of view 705; this could also represent an example in which one item is identified from several items as being a customer-focused item. However, similar techniques may be applied to a field of view or a customer focus that includes more than one item. In some embodiments, the computing system may present information to the customer about a particular item, based on the item 805. The presented information may relate to the same item 805 and/or to one or more different items. The different items may include other items located within the customer's field of view 705, and may include items (such as item 810) that are not included within the field of view 705. In some embodiments, the information may be determined and timely presented so as to influence the customer to purchase the particular item(s).
  • FIG. 8B illustrates several example presentations of information to a customer. The presentations may be delivered in real-time to the customer in any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic). While the presentations 815, 820, 825 are shown as textual presentations for simplicity, it is possible for the presentations to be spoken or presented in any suitable alternative manner. In one example, a presentation may be displayed by a wearable computing device of the customer, such as a visual overlay on a display of the wearable computing device as discussed above with respect to FIGS. 5A and 5B. In another example, a presentation may be transmitted as a message to the customer's mobile computing device, causing text or graphics to be displayed on the mobile computing device and causing the device to vibrate.
  • Presentation 815 indicates a customer preference for a second item 810 over the currently viewed or focused item 805. Presentation 815 generally suggests that the customer may also prefer the second item 810. The customer preference may be determined based on compiled previous shopping data for other customers. The shopping data may include separate statistics for the two items, for example, based on historical purchase data for each item. The shopping data may also include compiled information that directly interrelates the two items. For example, in previous shopping experiences, other customers also focused on item 805 (e.g., based on determined fields of view) but ultimately declined to purchase item 805 in favor of item 810.
  • Presentation 820 indicates that a promotion is available for the second item 810. Presentation 820 may reflect a sales promotion or prices already in effect (e.g., a reporting function), or the promotion may be dynamically generated to encourage the customer to purchase item 810. For example, manufacturers or stores may wish to promote item 810 over item 805, and the promotion may be presented to the customer in real-time. As discussed above, in some embodiments, the promotion, as well as the timing of its presentation to the customer, may be determined based on a determined level of customer interest in an item.
  • Presentation 825 indicates that item 805 would have a negative impact on the customer's goals. The presentation 825 may be based on customer profile data such as customer preferences and/or programs. Examples of programs may include fitness, nutrition, or health goals, money management goals, etc. Although not shown here, one embodiment may present to the customer the information upon which the negative impact determination was made. If the system determines that item 805 is incompatible with specified goals or preferences, or that better alternative items exist consistent with the customer's goals/preferences, the system may recommend one or more alternative items for presentation to the customer.
  • FIG. 8C illustrates several example presentations of information to a customer. Like the presentations discussed above, the presentations may be delivered in real-time to the customer in any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic). The presentations may be output using a customer's worn or carried computing device, or using output hardware (e.g., displays, speakers, etc.) that is disposed within the shopping environment. For example, LEDs or other visual display elements may be included in the shelving units, such as display elements 835 a-b and 836 c-d.
  • As shown, a single item 805 is included within a customer's field of view 705, and item 810 is located outside the field of view. Of course, similar techniques may be applied to a field of view or a customer focus that includes more than one item. In cases where the item 805 (already included in the field of view 705) is recommended for presentation to the customer, the system may display various indicators that highlight the item 805 to the user. For example, if the field of view 705 represents the customer view through a head-worn computing device such as wearable computing device 500, the system may display colors, shapes, etc. that visually enhance the item 805, or may reduce the visibility of other items in the field of view (e.g., gray out). The system may also display location information of the item 805, for example, by illuminating one or more of the display elements 835 a-b nearest to the item 805.
  • In cases where an item is recommended for presentation that is not disposed within the field of view 705 (e.g., item 810), the system may output directional information from the current field of view 705 to the item. For example, the system may display an arrow 830 overlay indicating the relative direction from the field of view 705 or item 805 to the target item 810. The system may also provide directional information using one or more of the display elements 835, 836. For example, the display elements 835, 836 may be illuminated in sequence from left to right to guide the customer's eye toward item 810. Of course, the various types of information discussed here (e.g., highlighting, location, and directional) may additionally or alternately be provided by non-visual means to the customer, such as audio or haptic outputs.
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment. Method 900 may generally be used in conjunction with the various environments and systems discussed above. Method 900 begins at block 905, where a field of view is determined for the customer. Determining a field of view may include a “direct” determination, such as capturing an image from a body-worn, forward-looking camera (i.e., the image strongly corresponds to the customer's eye gaze). Alternatively (or in addition to the direct determination), the field of view may be determined “indirectly” by capturing images of the shopping environment and performing image processing to determine various aspects. The aspects may include the determined position and orientation of one or more of the customer's body, head, shoulders, eyes, arms, hands, etc. The images may be captured by visual sensors that are worn or disposed throughout the shopping environment.
  • At block 915, one or more first items that are included within the customer's field of view are identified. In one embodiment, identifying items may include performing image processing on a captured image that reflects the customer's field of view (e.g., from a forward-looking camera). In another embodiment, identifying items may include referencing coordinate data determined for the customer's field of view against a known store layout that includes item location information. In such an embodiment, image processing may not be necessary to identify the one or more first items.
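The "indirect" identification at block 915 — referencing field-of-view coordinate data against a known store layout rather than performing image processing — might look like the following sketch. The layout coordinates, view-cone angle, and range are assumed values introduced only for illustration:

```python
import math

# Hypothetical store layout: item ID -> (x, y) floor coordinates.
# The coordinates below are invented for illustration.
STORE_LAYOUT = {
    "715A": (2.0, 5.0),
    "715B": (3.0, 5.0),
    "715C": (4.0, 5.0),
}

def items_in_view(position, heading_deg, fov_deg=60.0, max_range=4.0,
                  layout=STORE_LAYOUT):
    """Return IDs of items whose location falls inside the view cone.

    position: customer (x, y); heading_deg: direction of gaze in degrees,
    as might be derived from head/body orientation at block 905.
    """
    visible = []
    for item_id, (ix, iy) in layout.items():
        dx, dy = ix - position[0], iy - position[1]
        if math.hypot(dx, dy) > max_range:
            continue                            # too far away to be in view
        bearing = math.degrees(math.atan2(dy, dx))
        # Signed angular offset from the gaze direction, wrapped to ±180°.
        off_axis = (bearing - heading_deg + 180.0) % 360.0 - 180.0
        if abs(off_axis) <= fov_deg / 2.0:
            visible.append(item_id)
    return sorted(visible)
```

A wider assumed cone captures the whole shelf section; narrowing it to 30° isolates the item directly ahead, which is one way the field of view of Views 1-4 could be resolved to item 715B without any image processing.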
  • At block 925, at least one second item is selected for presentation to the customer. The selection is based on the identified one or more first items. In one embodiment, at least one of the second item(s) may be included in the identified first items. In other embodiments, each of the second item(s) may be different than the identified first items. The selection may be based on decisional logic for recommending a particular item for sale, based on historical shopping data for the customer (and/or other customers) and/or customer preferences or programs.
  • At block 935, information related to the at least one second item is presented to the customer. Presenting information may be done in any suitable manner, such as displaying textual or graphical information in a visual overlay, displaying directional information to the customer, or outputting sound or other feedback. The information may be related to the properties of the second item (e.g., price, nutrition information, etc.) or to historical shopping data, promotions, or other customer-specific preferences or programs. The timing of the presentation may be determined based on customer interest scores or other timing considerations that are collectively intended to influence a customer's shopping purchases.
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment. Method 940 may generally be used in conjunction with the various shopping environments and systems discussed above. In one embodiment, method 940 is performed as part of performing method 900. Method 940 begins at block 945, where a customer focus is determined on at least one item selected from one or more items that are included within the customer's field of view. The customer focus may be time-based, such as requiring the item to remain within the field of view for a minimum amount of time.
  • At block 955, a customer interest score is determined for the at least one customer-focused item. The customer interest score may also be time-based, and may be influenced by various characteristics of the interaction of the customer with the item, which may be observed through the determined customer field of view. In one embodiment, the customer interest score may be determined concurrently with determining the customer focus on an item. In one embodiment, the customer-focused item is determined based on the customer interest score for that item reaching a predetermined threshold value. In another embodiment, the customer-focused item is determined as the item whose score exceeds the scores of the other items within the field of view.
  • At block 965, the content of the information to present to the customer is determined based on the determined customer interest score. Determining the content may include determining whether to present information for the customer-focused item or for an alternate item, whether or not to present a promotion for the item or the alternate item, determining an amount of the promotion, and so forth.
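The mapping at block 965 from interest score to content could be as simple as a threshold table: high interest needs no incentive, waning interest warrants a promotion, and low interest may justify redirecting the customer to an alternate item. The thresholds, discount amounts, and function name below are purely illustrative assumptions:

```python
# Hypothetical content-selection logic for block 965. Returns a
# (content_kind, discount_fraction) pair; all numbers are assumed.
def select_content(interest_score, has_alternate=False):
    if interest_score >= 8.0:
        return ("item_info", 0.0)            # highly engaged: information only
    if interest_score >= 4.0:
        return ("item_promotion", 0.10)      # waning interest: modest discount
    if has_alternate:
        return ("alternate_item_promotion", 0.15)  # steer to an alternate item
    return ("item_promotion", 0.20)          # low interest: deeper discount
```

In a fuller implementation, the promotion amount might also consult the customer's promotion history, as discussed earlier, before committing to a discount.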
  • The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
  • In the preceding, reference is made to embodiments presented in this disclosure. However, the scope of the present disclosure is not limited to specific described embodiments. Instead, any combination of the following features and elements, whether related to different embodiments or not, is contemplated to implement and practice contemplated embodiments. Furthermore, although embodiments disclosed herein may achieve advantages over other possible solutions or over the prior art, whether or not a particular advantage is achieved by a given embodiment is not limiting of the scope of the present disclosure. Thus, the following aspects, features, embodiments and advantages are merely illustrative and are not considered elements or limitations of the appended claims except where explicitly recited in a claim(s). Likewise, reference to “the invention” shall not be construed as a generalization of any inventive subject matter disclosed herein and shall not be considered to be an element or limitation of the appended claims except where explicitly recited in a claim(s).
  • Aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.”
  • The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
  • Embodiments of the disclosure may be provided to end users through a cloud computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
  • Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g., an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user). A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. In context of the present disclosure, a user may access applications (e.g., a retail store app for a mobile computing device) or related data (e.g., compiled shopping data) available in the cloud. Doing so allows a user to access this information from any computing system attached to a network connected to the cloud (e.g., the Internet).
  • While the foregoing is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (20)

What is claimed is:
1. A computer-implemented method to influence a person within an environment having a plurality of items for selection, the method comprising:
capturing, using a first visual sensor disposed within the environment, field of view information for the person;
performing analysis on the field of view information using a computing device;
identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person;
selecting, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
presenting information related to the at least one second item to the person.
2. The computer-implemented method of claim 1, wherein capturing field of view information for the person includes using the first visual sensor to determine at least one of an eye orientation, a head orientation, and a body orientation of the person.
3. The computer-implemented method of claim 1, wherein performing analysis on the field of view information includes comparing the field of view information with location data for the plurality of items of the environment.
4. The computer-implemented method of claim 1, wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using at least a second visual sensor disposed within the environment.
5. The computer-implemented method of claim 1, further comprising:
determining a focus of the person on at least one item selected from the one or more first items included within the person's field of view, wherein determining a focus is based on inclusion of the at least one selected item within the person's field of view for at least a predetermined first length of time.
6. The computer-implemented method of claim 5, further comprising:
calculating an interest score of the person for the at least one selected item based on a length of time of inclusion within the person's field of view.
7. The computer-implemented method of claim 6, wherein presenting information related to the at least one second item includes determining content of the information based on the interest score.
8. The computer-implemented method of claim 1, wherein selecting the at least one second item is based on data compiled from one or more previous experiences of one or more persons.
9. The computer-implemented method of claim 1, wherein selecting the at least one second item is based on personal preferences included in a personal profile associated with the person.
10. The computer-implemented method of claim 1, wherein the at least one second item is not included within the person's field of view, and wherein presenting information related to the at least one second item includes indicating a direction to a location of the at least one second item within the environment.
11. A computer program product to influence a person within an environment having a plurality of items for selection, the computer program product comprising:
a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code executable by one or more computer processors to:
capture, using a first visual sensor disposed within the environment, field of view information for the person;
perform analysis on the field of view information using a computing device;
identify one or more first items of the plurality of items that are included within the field of view of the person;
select, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
present information related to the at least one second item to the person.
12. The computer program product of claim 11, wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using at least a second visual sensor disposed within the environment.
13. The computer program product of claim 11, wherein the computer-readable program code is further executable to:
determine a focus of the person on at least one item selected from the one or more first items included within the person's field of view, wherein determining a focus is based on inclusion of the at least one selected item within the person's field of view for at least a predetermined first length of time.
14. The computer program product of claim 13, wherein the computer-readable program code is further executable to:
calculate an interest score of the person for the at least one selected item based on a length of time of inclusion within the person's field of view.
15. The computer program product of claim 14, wherein presenting information related to the at least one second item includes determining content of the information based on the interest score.
16. A system to influence a person within an environment having a plurality of items for selection, the system comprising:
one or more computer processors;
a first visual sensor disposed within the environment and communicatively coupled with the one or more computer processors; and
a memory containing a program which, when executed by the one or more computer processors, performs an operation comprising:
capturing, using a first visual sensor disposed within the environment, field of view information for the person;
performing analysis on the field of view information using a computing device;
identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person;
selecting, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
presenting information related to the at least one second item to the person.
17. The system of claim 16, wherein the first visual sensor is included in a body-worn computing device of the person.
18. The system of claim 16, further comprising a plurality of visual sensors distributed throughout the environment, and wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using the plurality of visual sensors.
19. The system of claim 16, further comprising an output device communicatively coupled with the one or more computer processors, wherein the output device is configured to present the information related to the at least one second item to the person using at least one of audio and visual output.
20. The system of claim 19, wherein the output device is one of a handheld computing device and a body-worn computing device of the person.
US14/590,240 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment Abandoned US20160110791A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/590,240 US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462064323P 2014-10-15 2014-10-15
US14/590,240 US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment

Publications (1)

Publication Number Publication Date
US20160110791A1 true US20160110791A1 (en) 2016-04-21

Family

ID=55748790

Family Applications (19)

Application Number Title Priority Date Filing Date
US14/590,240 Abandoned US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment
US14/644,888 Abandoned US20160110700A1 (en) 2014-10-15 2015-03-11 Transaction audit suggestions based on customer shopping behavior
US14/659,128 Active 2037-01-24 US10482724B2 (en) 2014-10-15 2015-03-16 Method, computer program product, and system for providing a sensor-based environment
US14/659,169 Active 2035-05-24 US9679327B2 (en) 2014-10-15 2015-03-16 Visual checkout with center of mass security check
US14/663,190 Active 2036-11-26 US10417878B2 (en) 2014-10-15 2015-03-19 Method, computer program product, and system for providing a sensor-based environment
US14/673,390 Active US9424601B2 (en) 2014-10-15 2015-03-30 Method, computer program product, and system for providing a sensor-based environment
US14/675,206 Abandoned US20160110751A1 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/674,776 Active 2037-06-21 US10176677B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/674,845 Active US9786000B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/675,161 Active 2035-06-27 US9842363B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for producing combined image information to provide extended vision
US14/675,025 Active 2036-01-16 US10776844B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/674,922 Abandoned US20160110793A1 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/883,178 Active 2037-02-02 US10157413B2 (en) 2014-10-15 2015-10-14 Method of using, apparatus, product, and system for a no touch point-of-sale self-checkout
US14/883,146 Active 2038-06-13 US10810648B2 (en) 2014-10-15 2015-10-14 Method, product, and system for unmanned vehicles in retail environments
US14/883,198 Active 2038-07-15 US11127061B2 (en) 2014-10-15 2015-10-14 Method, product, and system for identifying items for transactions
US15/837,507 Active 2035-05-07 US10593163B2 (en) 2014-10-15 2017-12-11 Method, computer program product, and system for producing combined image information to provide extended vision
US16/201,194 Active 2035-11-01 US10825068B2 (en) 2014-10-15 2018-11-27 Method of using apparatus, product, and system for a no touch point-of-sale self-checkout
US16/241,610 Active US10672051B2 (en) 2014-10-15 2019-01-07 Method, computer program product, and system for providing a sensor-based environment
US17/010,456 Active 2036-06-17 US11514497B2 (en) 2014-10-15 2020-09-02 Method of using, apparatus, product, and system for a no touch point-of-sale self-checkout

Country Status (1)

Country Link
US (19) US20160110791A1 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170116657A1 (en) * 2015-10-26 2017-04-27 Sk Planet Co., Ltd. Payment information providing system using wearable device and payment information providing method using the same
US20170206571A1 (en) * 2016-01-14 2017-07-20 Adobe Systems Incorporated Generating leads using internet of things devices at brick-and-mortar stores
US20180322514A1 (en) * 2017-05-08 2018-11-08 Walmart Apollo, Llc Uniquely identifiable customer traffic systems and methods
US20190213534A1 (en) * 2018-01-10 2019-07-11 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US20190318372A1 (en) * 2018-04-13 2019-10-17 Shopper Scientist Llc Shopping time allocated to product exposure in a shopping environment
US10572843B2 (en) * 2014-02-14 2020-02-25 Bby Solutions, Inc. Wireless customer and labor management optimization in retail settings
US10778906B2 (en) * 2017-05-10 2020-09-15 Grabango Co. Series-configured camera array for efficient deployment
US10810595B2 (en) 2017-09-13 2020-10-20 Walmart Apollo, Llc Systems and methods for real-time data processing, monitoring, and alerting
US10861086B2 (en) 2016-05-09 2020-12-08 Grabango Co. Computer vision system and method for automatic checkout
US10963704B2 (en) 2017-10-16 2021-03-30 Grabango Co. Multiple-factor verification for vision-based systems
US11095470B2 (en) 2016-07-09 2021-08-17 Grabango Co. Remote state following devices
US11132737B2 (en) 2017-02-10 2021-09-28 Grabango Co. Dynamic customer checkout experience within an automated shopping environment
US11176684B2 (en) * 2019-02-18 2021-11-16 Acer Incorporated Customer behavior analyzing method and customer behavior analyzing system
US11226688B1 (en) 2017-09-14 2022-01-18 Grabango Co. System and method for human gesture processing from video input
US11282077B2 (en) 2017-08-21 2022-03-22 Walmart Apollo, Llc Data comparison efficiency for real-time data processing, monitoring, and alerting
US11288650B2 (en) 2017-06-21 2022-03-29 Grabango Co. Linking computer vision interactions with a computer kiosk
US11288648B2 (en) 2018-10-29 2022-03-29 Grabango Co. Commerce automation for a fueling station
EP3834130A4 (en) * 2018-08-07 2022-09-14 Lynxight Ltd. Drowning detection enhanced by swimmer analytics
US11481805B2 (en) 2018-01-03 2022-10-25 Grabango Co. Marketing and couponing in a retail environment using computer vision
US11507933B2 (en) 2019-03-01 2022-11-22 Grabango Co. Cashier interface for linking customers to virtual data
US12079771B2 (en) 2018-01-10 2024-09-03 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products

Families Citing this family (191)

Publication number Priority date Publication date Assignee Title
US9139363B2 (en) 2013-03-15 2015-09-22 John Lert Automated system for transporting payloads
US10366445B2 (en) * 2013-10-17 2019-07-30 Mashgin Inc. Automated object recognition kiosk for retail checkouts
US11551287B2 (en) * 2013-10-17 2023-01-10 Mashgin Inc. Automated object recognition kiosk for retail checkouts
US11138581B2 (en) * 2014-01-10 2021-10-05 Elo Touch Solutions, Inc. Multi-mode point-of-sale device
US10129507B2 (en) * 2014-07-15 2018-11-13 Toshiba Global Commerce Solutions Holdings Corporation System and method for self-checkout using product images
US10991049B1 (en) 2014-09-23 2021-04-27 United Services Automobile Association (Usaa) Systems and methods for acquiring insurance related informatics
US20160110791A1 (en) * 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
US9330474B1 (en) 2014-12-23 2016-05-03 Ricoh Co., Ltd. Distinguishing between stock keeping units using a physical dimension of a region depicted in an image
US9342900B1 (en) * 2014-12-23 2016-05-17 Ricoh Co., Ltd. Distinguishing between stock keeping units using marker based methodology
US20160259344A1 (en) 2015-03-06 2016-09-08 Wal-Mart Stores, Inc. Shopping facility assistance systems, devices, and methods to facilitate responding to a user's request for product pricing information
US20180099846A1 (en) 2015-03-06 2018-04-12 Wal-Mart Stores, Inc. Method and apparatus for transporting a plurality of stacked motorized transport units
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
US12084824B2 (en) 2015-03-06 2024-09-10 Walmart Apollo, Llc Shopping facility assistance systems, devices and methods
IN2015CH01602A (en) * 2015-03-28 2015-04-24 Wipro Ltd
US9990644B2 (en) 2015-05-13 2018-06-05 Shelfbucks, Inc. System and methods for determining location of pop displays with wireless beacons using known wireless beacon locations
CA2930166A1 (en) * 2015-05-19 2016-11-19 Wal-Mart Stores, Inc. Systems and methods for displaying checkout lane information
US10552933B1 (en) 2015-05-20 2020-02-04 Digimarc Corporation Image processing methods and arrangements useful in automated store shelf inspections
US9697560B2 (en) * 2015-05-21 2017-07-04 Encompass Technologies Llp Product palletizing system
US10489863B1 (en) 2015-05-27 2019-11-26 United Services Automobile Association (Usaa) Roof inspection systems and methods
US20160350776A1 (en) * 2015-05-29 2016-12-01 Wal-Mart Stores, Inc. Geolocation analytics
CA2988122A1 (en) 2015-06-02 2016-12-08 Alert Corporation Storage and retrieval system
US11142398B2 (en) 2015-06-02 2021-10-12 Alert Innovation Inc. Order fulfillment system
US11203486B2 (en) 2015-06-02 2021-12-21 Alert Innovation Inc. Order fulfillment system
US20160358145A1 (en) * 2015-06-05 2016-12-08 Yummy Foods, Llc Systems and methods for frictionless self-checkout merchandise purchasing
US9571738B2 (en) * 2015-06-23 2017-02-14 Toshiba Tec Kabushiki Kaisha Image processing apparatus
US9911290B1 (en) * 2015-07-25 2018-03-06 Gary M. Zalewski Wireless coded communication (WCC) devices for tracking retail interactions with goods and association to user accounts
US10318976B2 (en) * 2015-07-28 2019-06-11 Walmart Apollo, Llc Methods for determining measurement data of an item
US20170090195A1 (en) * 2015-09-25 2017-03-30 Intel Corporation Selective object filtering devices, systems and methods
WO2017085771A1 (en) * 2015-11-16 2017-05-26 富士通株式会社 Payment assistance system, payment assistance program, and payment assistance method
US10891574B2 (en) * 2015-11-17 2021-01-12 Target Brands, Inc. Planogram resetting using augmented reality in a retail environment
US10565577B2 (en) * 2015-12-16 2020-02-18 Samsung Electronics Co., Ltd. Guided positional tracking
US9875548B2 (en) * 2015-12-18 2018-01-23 Ricoh Co., Ltd. Candidate list generation
US10041827B2 (en) * 2015-12-21 2018-08-07 Ncr Corporation Image guided scale calibration
US10607081B2 (en) * 2016-01-06 2020-03-31 Orcam Technologies Ltd. Collaboration facilitator for wearable devices
US10650368B2 (en) * 2016-01-15 2020-05-12 Ncr Corporation Pick list optimization method
US10366144B2 (en) 2016-04-01 2019-07-30 Ebay Inc. Analyzing and linking a set of images by identifying objects in each image to determine a primary image and a secondary image
CA2961938A1 (en) 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
CN109641658A (en) * 2016-05-18 2019-04-16 沃尔玛阿波罗有限责任公司 Device and method for showing content using the vehicles are transported
US10331964B2 (en) * 2016-05-23 2019-06-25 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Trunk inventory detector
US10083358B1 (en) * 2016-07-26 2018-09-25 Videomining Corporation Association of unique person to point-of-sale transaction data
US20180032990A1 (en) * 2016-07-28 2018-02-01 Ncr Corporation Item location detection on scales
US20190164200A1 (en) * 2016-07-28 2019-05-30 Essilor International Eyeglasses-matching tool
US10445791B2 (en) * 2016-09-08 2019-10-15 Walmart Apollo, Llc Systems and methods for autonomous assistance and routing
US10438164B1 (en) 2016-09-27 2019-10-08 Amazon Technologies, Inc. Merging events in interactive data processing systems
US10769581B1 (en) * 2016-09-28 2020-09-08 Amazon Technologies, Inc. Overhanging item background subtraction
US11188947B2 (en) 2016-10-05 2021-11-30 Abl Ip Holding, Llc Analyzing movement of data collectors/gateways associated with retail displays
WO2018081782A1 (en) * 2016-10-31 2018-05-03 Caliburger Cayman Devices and systems for remote monitoring of restaurants
AU2017362508A1 (en) * 2016-11-17 2019-06-20 Walmart Apollo, Llc Automated-service retail system and method
US20180150793A1 (en) 2016-11-29 2018-05-31 Alert Innovation Inc. Automated retail supply chain and inventory management system
US10685386B2 (en) * 2016-11-30 2020-06-16 Bank Of America Corporation Virtual assessments using augmented reality user devices
FR3059967B1 (en) * 2016-12-12 2019-01-25 Poma METHOD AND INSTALLATION FOR TRANSPORTING TRUCK VEHICLES BY A CABLE
WO2018123433A1 (en) 2016-12-28 2018-07-05 パナソニックIpマネジメント株式会社 Tool system
JP2018108633A (en) * 2016-12-28 2018-07-12 パナソニックIpマネジメント株式会社 Tool system
JP2020504066A (en) 2017-01-10 2020-02-06 アラート イノヴェイション インコーポレイテッド Automatic store with exchangeable automatic mobile robot
US11403610B2 (en) * 2017-01-13 2022-08-02 Sensormatic Electronics, LLC Systems and methods for inventory monitoring
JP6873711B2 (en) 2017-01-16 2021-05-19 東芝テック株式会社 Product recognition device
KR102520627B1 (en) * 2017-02-01 2023-04-12 삼성전자주식회사 Apparatus and method and for recommending products
JP7478320B2 (en) 2017-02-24 2024-05-07 ウォルマート アポロ リミテッド ライアビリティ カンパニー Inventory control system and method
WO2018165093A1 (en) * 2017-03-07 2018-09-13 Walmart Apollo, Llc Unmanned vehicle in shopping environment
US11494729B1 (en) * 2017-03-27 2022-11-08 Amazon Technologies, Inc. Identifying user-item interactions in an automated facility
US11087271B1 (en) 2017-03-27 2021-08-10 Amazon Technologies, Inc. Identifying user-item interactions in an automated facility
US11238401B1 (en) 2017-03-27 2022-02-01 Amazon Technologies, Inc. Identifying user-item interactions in an automated facility
SG10201703096XA (en) * 2017-04-13 2018-11-29 Mastercard Asia Pacific Pte Ltd An airborne apparatus and transaction method
US11270348B2 (en) 2017-05-19 2022-03-08 Abl Ip Holding, Llc Systems and methods for tracking products transported in shipping containers
US11138901B1 (en) 2017-06-28 2021-10-05 Amazon Technologies, Inc. Item recognition and analysis
JP7034615B2 (en) * 2017-07-07 2022-03-14 東芝テック株式会社 Checkout equipment and programs
US10540390B1 (en) * 2017-08-07 2020-01-21 Amazon Technologies, Inc. Image-based item identification
US11200692B2 (en) 2017-08-07 2021-12-14 Standard Cognition, Corp Systems and methods to check-in shoppers in a cashier-less store
US10650545B2 (en) 2017-08-07 2020-05-12 Standard Cognition, Corp. Systems and methods to check-in shoppers in a cashier-less store
US11250376B2 (en) 2017-08-07 2022-02-15 Standard Cognition, Corp Product correlation analysis using deep learning
US11232687B2 (en) 2017-08-07 2022-01-25 Standard Cognition, Corp Deep learning-based shopper statuses in a cashier-less store
US10474988B2 (en) 2017-08-07 2019-11-12 Standard Cognition, Corp. Predicting inventory events using foreground/background processing
US10853965B2 (en) * 2017-08-07 2020-12-01 Standard Cognition, Corp Directional impression analysis using deep learning
US10474991B2 (en) 2017-08-07 2019-11-12 Standard Cognition, Corp. Deep learning-based store realograms
FR3070086B1 (en) * 2017-08-08 2019-08-30 Safran Identity & Security FRAUD DETECTION FOR ACCESS CONTROL BY FACIAL RECOGNITION
JP6856220B2 (en) 2017-08-09 2021-04-07 株式会社DSi Weighing systems, electronic scales and markers for electronic scales
CN109409175B (en) * 2017-08-16 2024-02-27 图灵通诺(北京)科技有限公司 Settlement method, device and system
JP6903524B2 (en) * 2017-09-01 2021-07-14 東芝テック株式会社 Weighing device
CN108446604A (en) * 2017-09-27 2018-08-24 缤果可为(北京)科技有限公司 Intelligent price tag displaying device and its application
US10691931B2 (en) * 2017-10-04 2020-06-23 Toshiba Global Commerce Solutions Sensor-based environment for providing image analysis to determine behavior
CN109753980A (en) * 2017-11-03 2019-05-14 虹软科技股份有限公司 A kind of method and apparatus for detection
US20190156270A1 (en) 2017-11-18 2019-05-23 Walmart Apollo, Llc Distributed Sensor System and Method for Inventory Management and Predictive Replenishment
CN107944960A (en) * 2017-11-27 2018-04-20 深圳码隆科技有限公司 A kind of self-service method and apparatus
US11232511B1 (en) * 2017-11-28 2022-01-25 A9.Com, Inc. Computer vision based tracking of item utilization
US20190180301A1 (en) * 2017-12-09 2019-06-13 Walmart Apollo, Llc System for capturing item demand transference
US10956726B1 (en) * 2017-12-12 2021-03-23 Amazon Technologies, Inc. Obfuscating portions of video data
UA127479U (en) * 2017-12-18 2018-08-10 Юрій Юрійович Голузинець AUTOMATED SYSTEM OF IDENTIFICATION AND PERSONALIZED COMMUNICATION WITH CONSUMERS OF GOODS AND SERVICES
CN110108341B (en) * 2018-02-01 2022-12-02 北京京东乾石科技有限公司 Automatic weighing method and system for unmanned aerial vehicle
FR3077261A1 (en) * 2018-02-01 2019-08-02 Eddy Gouraud LARGE SURFACE TROLLEY INTEGRATING ARTIFICIAL INTELLIGENCE WITH VISUAL OBJECT RECOGNITION
EP3750032A4 (en) * 2018-02-06 2021-11-17 Wal-Mart Apollo, LLC Customized augmented reality item filtering system
US20200193502A1 (en) * 2018-02-07 2020-06-18 Samuel Smith Recommendation engine for clothing selection and wardrobe management
CN108389316B (en) * 2018-03-02 2021-07-13 北京京东尚科信息技术有限公司 Automatic vending method, apparatus and computer-readable storage medium
USD848530S1 (en) 2018-03-14 2019-05-14 Tambria Wagner Sign
US11074616B2 (en) * 2018-03-15 2021-07-27 International Business Machines Corporation Predictive media content delivery
JP7173518B2 (en) * 2018-03-19 2022-11-16 日本電気株式会社 Information processing system, information processing method and program
JP7200487B2 (en) * 2018-03-19 2023-01-10 日本電気株式会社 Settlement system, settlement method and program
US11455499B2 (en) * 2018-03-21 2022-09-27 Toshiba Global Commerce Solutions Holdings Corporation Method, system, and computer program product for image segmentation in a sensor-based environment
WO2019181033A1 (en) * 2018-03-22 2019-09-26 日本電気株式会社 Registration system, registration method, and program
CN108520605A (en) * 2018-03-23 2018-09-11 阿里巴巴集团控股有限公司 A kind of self-help shopping air control method and system
JP6775541B2 (en) 2018-04-03 2020-10-28 株式会社Subaru Position measurement method and position measurement system
WO2019213418A1 (en) * 2018-05-02 2019-11-07 Walmart Apollo, Llc Systems and methods for transactions at a shopping cart
CN108765061A (en) * 2018-05-02 2018-11-06 开源物联网(广州)有限公司 Enterprise and user's integral intelligent service system
US20190347635A1 (en) * 2018-05-10 2019-11-14 Adobe Inc. Configuring a physical environment based on electronically detected interactions
CN108806074B (en) * 2018-06-05 2021-08-03 腾讯科技(深圳)有限公司 Shopping information generation method and device and storage medium
US11308530B2 (en) * 2018-06-20 2022-04-19 International Business Machines Corporation Automated personalized customer service utilizing lighting
US11367041B2 (en) * 2018-06-25 2022-06-21 Robert Bosch Gmbh Occupancy sensing system for custodial services management
CN112639862A (en) * 2018-06-29 2021-04-09 鬼屋技术有限责任公司 System, apparatus and method for item location, inventory creation, routing, imaging and detection
CN108875690A (en) * 2018-06-29 2018-11-23 百度在线网络技术(北京)有限公司 Unmanned Retail commodity identifying system
US10769587B2 (en) 2018-07-02 2020-09-08 Walmart Apollo, Llc Systems and methods of storing and retrieving retail store product inventory
CN108845289B (en) * 2018-07-03 2021-08-03 京东方科技集团股份有限公司 Positioning method and system for shopping cart and shopping cart
CN109102361A (en) * 2018-07-24 2018-12-28 湖南餐智科技有限公司 A kind of article order confirmation method and system based on intelligent electronic-scale
US20200039676A1 (en) * 2018-08-02 2020-02-06 The Recon Group LLP System and methods for automatic labeling of articles of arbitrary shape, size and orientation
CN110857879A (en) * 2018-08-23 2020-03-03 中国石油天然气股份有限公司 Gas weighing device
US11488400B2 (en) * 2018-09-27 2022-11-01 Ncr Corporation Context-aided machine vision item differentiation
WO2020079651A1 (en) 2018-10-17 2020-04-23 Supersmart Ltd. Imaging used to reconcile cart weight discrepancy
US10977671B2 (en) * 2018-10-19 2021-04-13 Punchh Inc. Item level 3D localization and imaging using radio frequency waves
US10769713B1 (en) * 2018-10-25 2020-09-08 Eleana Townsend Electronic shopping cart
US10607116B1 (en) * 2018-10-30 2020-03-31 Eyezon Ltd Automatically tagging images to create labeled dataset for training supervised machine learning models
US11176597B2 (en) * 2018-10-30 2021-11-16 Ncr Corporation Associating shoppers together
DK3651091T3 (en) * 2018-11-07 2023-07-31 Wuerth Int Ag WAREHOUSE ADMINISTRATION SYSTEM WITH POSITION DETERMINATION OF STOCK ITEMS AND ASSIGNED STORAGE AREAS
US10891586B1 (en) 2018-11-23 2021-01-12 Smart Supervision System LLC Systems and methods of detecting, identifying and classifying objects positioned on a surface
US11222225B2 (en) * 2018-11-29 2022-01-11 International Business Machines Corporation Image recognition combined with personal assistants for item recovery
US11880877B2 (en) 2018-12-07 2024-01-23 Ghost House Technology, Llc System for imaging and detection
WO2020117343A1 (en) * 2018-12-07 2020-06-11 Ghost House Technology, Llc System, apparatus and method of item location, list creation, routing, imaging and detection
CN111325492B (en) 2018-12-14 2023-04-28 阿里巴巴集团控股有限公司 Supply chain optimization method and system
US11373402B2 (en) * 2018-12-20 2022-06-28 Google Llc Systems, devices, and methods for assisting human-to-human interactions
US11017641B2 (en) * 2018-12-21 2021-05-25 Sbot Technologies Inc. Visual recognition and sensor fusion weight detection system and method
WO2020176439A1 (en) * 2019-02-25 2020-09-03 Walmart Apollo, Llc Systems and methods of product recognition through multi-model image processing
US11436657B2 (en) * 2019-03-01 2022-09-06 Shopify Inc. Self-healing recommendation engine
US11120459B2 (en) * 2019-03-01 2021-09-14 International Business Machines Corporation Product placement optimization using blind-spot analysis in retail environments
US11558539B2 (en) 2019-03-13 2023-01-17 Smart Supervision System LLC Systems and methods of detecting and identifying an object
US11501346B2 (en) 2019-03-26 2022-11-15 Toshiba Global Commerce Solutions Holdings Corporation System and method for facilitating seamless commerce
JP7373729B2 (en) * 2019-03-29 2023-11-06 パナソニックIpマネジメント株式会社 Settlement payment device and unmanned store system
JP7320747B2 (en) * 2019-03-29 2023-08-04 パナソニックIpマネジメント株式会社 Settlement payment device and unmanned store system
US10540700B1 (en) * 2019-04-11 2020-01-21 RoboSystems, Inc. Personal shopping assistant
US11232575B2 (en) 2019-04-18 2022-01-25 Standard Cognition, Corp Systems and methods for deep learning-based subject persistence
US11462083B2 (en) * 2019-06-25 2022-10-04 Ncr Corporation Display with integrated cameras
US11868956B1 (en) * 2019-06-27 2024-01-09 Amazon Technologies, Inc. Session-based analysis of events
CN110363624A (en) * 2019-07-05 2019-10-22 深圳市浩科电子有限公司 A kind of semi-artificial shop selling system
KR20210017087A (en) * 2019-08-06 2021-02-17 삼성전자주식회사 Method for recognizing voice and an electronic device supporting the same
CN112466035B (en) * 2019-09-06 2022-08-12 图灵通诺(北京)科技有限公司 Commodity identification method, device and system based on vision and gravity sensing
US11393253B1 (en) 2019-09-16 2022-07-19 Amazon Technologies, Inc. Using sensor data to determine activity
JP7381268B2 (en) * 2019-09-19 2023-11-15 東芝テック株式会社 transaction processing system
US11869032B2 (en) * 2019-10-01 2024-01-09 Medixin Inc. Computer system and method for offering coupons
US11282059B1 (en) * 2019-10-03 2022-03-22 Inmar Clearing, Inc. Food item system including virtual cart and related methods
US20210103966A1 (en) * 2019-10-04 2021-04-08 Lujean J. CUMMINGS Retail drone
CN110715870B (en) * 2019-10-21 2020-12-01 梅州粤顺科技有限公司 Cargo weight data cheating detection system
CN110708565B (en) * 2019-10-22 2022-08-19 广州虎牙科技有限公司 Live broadcast interaction method and device, server and machine-readable storage medium
US10607080B1 (en) * 2019-10-25 2020-03-31 7-Eleven, Inc. Feedback and training for a machine learning algorithm configured to determine customer purchases during a shopping session at a physical store
US11386411B2 (en) * 2019-10-31 2022-07-12 Toshiba Global Commerce Solutions Holdings Corporation System and method for operating a point-of-sale (POS) system in a retail environment
US11480437B2 (en) * 2019-11-21 2022-10-25 International Business Machines Corporation Transportation system used by individuals having a visual impairment utilizing 5G communications
WO2021113810A1 (en) * 2019-12-06 2021-06-10 Mashgin, Inc. System and method for identifying items
CN111127750A (en) * 2019-12-24 2020-05-08 西安科技大学 Commodity displacement identification method based on gravity sensor data
CN111145430A (en) * 2019-12-27 2020-05-12 北京每日优鲜电子商务有限公司 Method and device for detecting commodity placing state and computer storage medium
CN111157087A (en) * 2020-01-17 2020-05-15 广东乐心医疗电子股份有限公司 Weighing method, weighing apparatus and storage medium
JP7338706B2 (en) * 2020-01-23 2023-09-05 日本電気株式会社 Processing device, processing method and program
US20230027388A1 (en) * 2020-01-23 2023-01-26 Nec Corporation Price information determination apparatus, price information determination method, and non-transitory computer readable medium
WO2021150976A1 (en) 2020-01-24 2021-07-29 Synchrony Bank Systems and methods for machine vision based object recognition
US11501386B2 (en) * 2020-02-04 2022-11-15 Kpn Innovations, Llc. Methods and systems for physiologically informed account metrics utilizing artificial intelligence
CN113269296A (en) * 2020-02-14 2021-08-17 艾利丹尼森零售信息服务公司 Counting machine and method for counting articles
US11727224B1 (en) * 2020-03-18 2023-08-15 Amazon Technologies, Inc. Determining RFID-tag locations in a facility
US11232798B2 (en) 2020-05-21 2022-01-25 Bank Of America Corporation Audio analysis system for automatic language proficiency assessment
US11404051B2 (en) 2020-05-21 2022-08-02 Bank Of America Corporation Textual analysis system for automatic language proficiency assessment
US11823129B2 (en) * 2020-05-28 2023-11-21 Zebra Technologies Corporation Item collection guidance system
US11463259B2 (en) * 2020-06-02 2022-10-04 Harpreet Sachdeva System and method for managing trust and wearable device for use therewith
CN111798457B (en) * 2020-06-10 2021-04-06 上海众言网络科技有限公司 Image visual weight determining method and device and image evaluation method
US11303853B2 (en) 2020-06-26 2022-04-12 Standard Cognition, Corp. Systems and methods for automated design of camera placement and cameras arrangements for autonomous checkout
US11361468B2 (en) 2020-06-26 2022-06-14 Standard Cognition, Corp. Systems and methods for automated recalibration of sensors for autonomous checkout
AU2021305159A1 (en) * 2020-07-07 2023-02-09 Omni Consumer Products, Llc Systems and methods for integrating physical and virtual purchasing
US20220038423A1 (en) * 2020-07-28 2022-02-03 Twistlock, Ltd. System and method for application traffic and runtime behavior learning and enforcement
US11619539B2 (en) * 2020-07-31 2023-04-04 Zebra Technologies Corporation Inadvertent subsequent scan prevention for symbology reader with weighing platter
US20230306406A1 (en) * 2020-08-05 2023-09-28 Maya Labs, Inc. Contactless payment kiosk system and method
US11468496B2 (en) 2020-08-07 2022-10-11 International Business Machines Corporation Smart contact lenses based shopping
US11829990B2 (en) * 2020-09-11 2023-11-28 Sensormatic Electronics, LLC Method and system for self-checkout in a retail environment
JP2022050964A (en) * 2020-09-18 2022-03-31 トヨタ自動車株式会社 Information processing device, information processing method, and program
US11682008B2 (en) * 2020-09-28 2023-06-20 Vadim Nikolaevich ALEKSANDROV Method of authenticating a customer, method of carrying out a payment transaction and payment system implementing the specified methods
US20220101290A1 (en) * 2020-09-29 2022-03-31 Zebra Technologies Corporation Ppe verification system at pos
WO2022072337A1 (en) * 2020-09-30 2022-04-07 United States Postal Service System and method for improving item scan rates in distribution network
CN112325780B (en) * 2020-10-29 2022-01-25 青岛聚好联科技有限公司 Distance measuring and calculating method and device based on community monitoring
US11823214B2 (en) * 2020-12-20 2023-11-21 Maplebear, Inc. Classifying fraud instances in completed orders
FR3118816A1 (en) * 2021-01-11 2022-07-15 daniel GIUDICE Scan Pay and AI self-check via Smartphone
US11748730B2 (en) * 2021-02-25 2023-09-05 Zebra Technologies Corporation Camera enhanced off-platter detection system
US20220284600A1 (en) * 2021-03-05 2022-09-08 Inokyo, Inc. User identification in store environments
US20220335510A1 (en) * 2021-04-20 2022-10-20 Walmart Apollo, Llc Systems and methods for personalized shopping
US11816997B2 (en) 2021-04-29 2023-11-14 Ge Aviation Systems Llc Demand driven crowdsourcing for UAV sensor
US11798063B2 (en) * 2021-06-17 2023-10-24 Toshiba Global Commerce Solutions Holdings Corporation Methods of assigning products from a shared shopping list to participating shoppers using shopper characteristics and product parameters and related systems
US20220415523A1 (en) * 2021-06-24 2022-12-29 International Business Machines Corporation Contact tracing utilzing device signatures captured during transactions
US11842376B2 (en) * 2021-06-25 2023-12-12 Toshiba Global Commerce Solutions Holdings Corporation Method, medium, and system for data lookup based on correlation of user interaction information
US20230100437A1 (en) * 2021-09-29 2023-03-30 Toshiba Global Commerce Solutions Holdings Corporation Image recall system
US11928662B2 (en) * 2021-09-30 2024-03-12 Toshiba Global Commerce Solutions Holdings Corporation End user training for computer vision system
US20230385890A1 (en) * 2022-05-25 2023-11-30 The Toronto-Dominion Bank Distributed authentication in ambient commerce

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080228577A1 (en) * 2005-08-04 2008-09-18 Koninklijke Philips Electronics, N.V. Apparatus For Monitoring a Person Having an Interest to an Object, and Method Thereof
US20110143779A1 (en) * 2009-12-11 2011-06-16 Think Tek, Inc. Providing City Services using Mobile Devices and a Sensor Network
US20130054377A1 (en) * 2011-08-30 2013-02-28 Nils Oliver Krahnstoever Person tracking and interactive advertising
US20130060843A1 (en) * 2010-03-29 2013-03-07 Rakuten, Inc. Server apparatus, information providing method, information providing program, recording medium recording the information providing program, and information providing system
US20130085345A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Personal Audio/Visual System Providing Allergy Awareness
US20130290107A1 (en) * 2012-04-27 2013-10-31 Soma S. Santhiveeran Behavior based bundling
US20140372211A1 (en) * 2013-06-14 2014-12-18 International Business Machines Corporation Real-time advertisement based on common point of attraction of different viewers

Family Cites Families (255)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4327819A (en) 1980-08-01 1982-05-04 Coutta John M Object detection system for a shopping cart
JPS60134128U (en) * 1984-02-16 1985-09-06 株式会社 石田衡器製作所 Weighing device
US4964053A (en) * 1988-04-22 1990-10-16 Checkrobot, Inc. Self-checkout of produce items
US5235509A (en) * 1989-06-28 1993-08-10 Management Information Support, Inc. Customer self-ordering system using information displayed on a screen
US5425140A (en) * 1992-03-20 1995-06-13 International Business Machines Corporation Method and apparatus for providing conditional cascading in a computer system graphical user interface
US5485006A (en) 1994-01-28 1996-01-16 S.T.O.P. International (Brighton) Inc. Product detection system for shopping carts
DE69419139D1 (en) 1993-02-05 1999-07-22 S T O P International Brighton MONITORING DEVICE FOR CONTROLLING SHOPPING CART
US5497314A (en) * 1994-03-07 1996-03-05 Novak; Jeffrey M. Automated apparatus and method for object recognition at checkout counters
JP3325897B2 (en) * 1994-05-13 2002-09-17 株式会社イシダ Combination weighing device
US5666157A (en) 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system
EP0727760A3 (en) * 1995-02-17 1997-01-29 Ibm Produce size recognition system
JP3276547B2 (en) 1995-12-01 2002-04-22 シャープ株式会社 Image recognition method
US6092725A (en) 1997-01-24 2000-07-25 Symbol Technologies, Inc. Statistical sampling security methodology for self-scanning checkout system
US5825002A (en) 1996-09-05 1998-10-20 Symbol Technologies, Inc. Device and method for secure data updates in a self-checkout system
US5973699A (en) * 1996-09-19 1999-10-26 Platinum Technology Ip, Inc. System and method for increasing the performance for real-time rendering of three-dimensional polygonal data
US6292827B1 (en) * 1997-06-20 2001-09-18 Shore Technologies (1999) Inc. Information transfer systems and method with dynamic distribution of data, control and management of information
US7207477B1 (en) * 2004-03-08 2007-04-24 Diebold, Incorporated Wireless transfer of account data and signature from hand-held device to electronic check generator
US7686213B1 (en) * 1998-04-17 2010-03-30 Diebold Self-Service Systems Division Of Diebold, Incorporated Cash withdrawal from ATM via videophone
US8561889B2 (en) * 1998-04-17 2013-10-22 Diebold Self-Service Systems Division Of Diebold, Incorporated Banking terminal that operates to cause financial transfers responsive to data bearing records
US5910769A (en) 1998-05-27 1999-06-08 Geisler; Edwin Shopping cart scanning system
US6513015B2 (en) * 1998-09-25 2003-01-28 Fujitsu Limited System and method for customer recognition using wireless identification and visual data transmission
US6296186B1 (en) * 1998-11-19 2001-10-02 Ncr Corporation Produce recognition system including a produce shape collector
US6268882B1 (en) * 1998-12-31 2001-07-31 Elbex Video Ltd. Dome shaped camera with simplified construction and positioning
AUPQ212499A0 (en) 1999-08-10 1999-09-02 Ajax Cooke Pty Ltd Item recognition method and apparatus
US6250671B1 (en) * 1999-08-16 2001-06-26 Cts Corporation Vehicle occupant position detector and airbag control system
US8391851B2 (en) 1999-11-03 2013-03-05 Digimarc Corporation Gestural techniques with wireless mobile phone devices
AU762164B2 (en) * 1999-11-16 2003-06-19 Icepat Ag Method and system for ordering products
US6726094B1 (en) * 2000-01-19 2004-04-27 Ncr Corporation Method and apparatus for multiple format image capture for use in retail transactions
US6685000B2 (en) 2000-05-19 2004-02-03 Kabushiki Kaisha Nippon Conlux Coin discrimination method and device
GB2368928A (en) 2000-07-21 2002-05-15 Dennis Stephen Livingstone Computer system for a kitchen
US20020027164A1 (en) 2000-09-07 2002-03-07 Mault James R. Portable computing apparatus particularly useful in a weight management program
US6412694B1 (en) * 2000-09-20 2002-07-02 Ncr Corporation Produce recognition system and method including weighted rankings
CA2426681A1 (en) 2000-10-26 2002-06-20 Healthetech, Inc. Body supported activity and condition monitor
US7845554B2 (en) * 2000-10-30 2010-12-07 Fujitsu Frontech North America, Inc. Self-checkout method and apparatus
US7540424B2 (en) * 2000-11-24 2009-06-02 Metrologic Instruments, Inc. Compact bar code symbol reading system employing a complex of coplanar illumination and imaging stations for omni-directional imaging of objects within a 3D imaging volume
US7640512B1 (en) * 2000-12-22 2009-12-29 Automated Logic Corporation Updating objects contained within a webpage
US20020079367A1 (en) * 2000-12-27 2002-06-27 Montani John J. Method and apparatus for operating a self-service checkout terminal to access a customer account
US7933797B2 (en) 2001-05-15 2011-04-26 Shopper Scientist, Llc Purchase selection behavior analysis system and method
US6601762B2 (en) * 2001-06-15 2003-08-05 Koninklijke Philips Electronics N.V. Point-of-sale (POS) voice authentication transaction system
US20030018897A1 (en) * 2001-07-20 2003-01-23 Psc Scanning, Inc. Video identification verification system and method for a self-checkout system
US20030036985A1 (en) * 2001-08-15 2003-02-20 Soderholm Mark J. Product locating system for use in a store or other facility
WO2003017045A2 (en) * 2001-08-16 2003-02-27 Trans World New York Llc User-personalized media sampling, recommendation and purchasing system using real-time inventory database
US20030039379A1 (en) 2001-08-23 2003-02-27 Koninklijke Philips Electronics N.V. Method and apparatus for automatically assessing interest in a displayed product
US7389918B2 (en) * 2001-10-23 2008-06-24 Ncr Corporation Automatic electronic article surveillance for self-checkout
CN1299505C (en) * 2001-12-13 2007-02-07 皇家飞利浦电子股份有限公司 Recommending media content on a media system
US6991066B2 (en) * 2002-02-01 2006-01-31 International Business Machines Corporation Customized self-checkout system
US7194327B2 (en) * 2002-07-12 2007-03-20 Peter Ar-Fu Lam Body profile coding method and apparatus useful for assisting users to select wearing apparel
US20140254896A1 (en) 2011-07-18 2014-09-11 Tiger T G Zhou Unmanned drone, robot system for delivering mail, goods, humanoid security, crisis negotiation, mobile payments, smart humanoid mailbox and wearable personal exoskeleton heavy load flying machine
US7050078B2 (en) 2002-12-19 2006-05-23 Accenture Global Services Gmbh Arbitrary object tracking augmented reality applications
JP3992629B2 (en) * 2003-02-17 2007-10-17 株式会社ソニー・コンピュータエンタテインメント Image generation system, image generation apparatus, and image generation method
US7653574B2 (en) * 2003-12-30 2010-01-26 Trans World Entertainment Corporation Systems and methods for the selection and purchase of digital assets
US7180014B2 (en) * 2003-03-20 2007-02-20 Boris Farber Method and equipment for automated tracking and identification of nonuniform items
US7465342B2 (en) 2003-04-07 2008-12-16 Silverbrook Research Pty Ltd Method of minimizing absorption of visible light in ink compositions comprising infrared metal-dithiolene dyes
US7406331B2 (en) * 2003-06-17 2008-07-29 Sony Ericsson Mobile Communications Ab Use of multi-function switches for camera zoom functionality on a mobile phone
US6926202B2 (en) * 2003-07-22 2005-08-09 International Business Machines Corporation System and method of deterring theft of consumers using portable personal shopping solutions in a retail environment
US20050097064A1 (en) * 2003-11-04 2005-05-05 Werden Todd C. Method and apparatus to determine product weight and calculate price using a camera
US7853477B2 (en) 2003-12-30 2010-12-14 O'shea Michael D RF-based electronic system and method for automatic cross-marketing promotional offers and check-outs
US7246745B2 (en) * 2004-02-27 2007-07-24 Evolution Robotics Retail, Inc. Method of merchandising for checkout lanes
US7100824B2 (en) 2004-02-27 2006-09-05 Evolution Robotics, Inc. System and methods for merchandise checkout
US7337960B2 (en) 2004-02-27 2008-03-04 Evolution Robotics, Inc. Systems and methods for merchandise automatic checkout
EP1759304A2 (en) 2004-06-01 2007-03-07 L-3 Communications Corporation Method and system for wide area security monitoring, sensor management and situational awareness
US7631808B2 (en) 2004-06-21 2009-12-15 Stoplift, Inc. Method and apparatus for detecting suspicious activity using video analysis
US7516888B1 (en) 2004-06-21 2009-04-14 Stoplift, Inc. Method and apparatus for auditing transaction activity in retail and other environments using visual recognition
US8448858B1 (en) * 2004-06-21 2013-05-28 Stoplift, Inc. Method and apparatus for detecting suspicious activity using video analysis from alternative camera viewpoint
US20050283402A1 (en) * 2004-06-22 2005-12-22 Ncr Corporation System and method of facilitating remote interventions in a self-checkout system
US20060018516A1 (en) * 2004-07-22 2006-01-26 Masoud Osama T Monitoring activity using video information
US7219838B2 (en) 2004-08-10 2007-05-22 Howell Data Systems System and method for notifying a cashier of the presence of an item in an obscured area of a shopping cart
US7168618B2 (en) 2004-08-12 2007-01-30 International Business Machines Corporation Retail store method and system
US7229015B2 (en) 2004-12-28 2007-06-12 International Business Machines Corporation Self-checkout system
US7646887B2 (en) 2005-01-04 2010-01-12 Evolution Robotics Retail, Inc. Optical flow for object recognition
JP4284448B2 (en) 2005-01-28 2009-06-24 富士フイルム株式会社 Image processing apparatus and method
US8040361B2 (en) 2005-04-11 2011-10-18 Systems Technology, Inc. Systems and methods for combining virtual and real-time physical environments
US8046375B2 (en) * 2005-06-16 2011-10-25 Lycos, Inc. Geo targeted commerce
US7660747B2 (en) 2005-06-28 2010-02-09 Media Cart Holdings, Inc. Media enabled shopping cart system with point of sale identification and method
DE102005036572A1 (en) 2005-08-01 2007-02-08 Scheidt & Bachmann Gmbh A method of automatically determining the number of people and / or objects in a gate
US8639543B2 (en) 2005-11-01 2014-01-28 International Business Machines Corporation Methods, systems, and media to improve employee productivity using radio frequency identification
US7640193B2 (en) * 2005-12-09 2009-12-29 Google Inc. Distributed electronic commerce system with centralized virtual shopping carts
JP4607797B2 (en) 2006-03-06 2011-01-05 株式会社東芝 Behavior discrimination device, method and program
US7930204B1 (en) 2006-07-25 2011-04-19 Videomining Corporation Method and system for narrowcasting based on automatic analysis of customer behavior in a retail store
WO2008013846A2 (en) 2006-07-26 2008-01-31 Sensormatic Electronics Corporation Mobile readpoint system and method for reading electronic tags
US7697551B2 (en) * 2006-09-01 2010-04-13 Nuance Communications, Inc. System for instant message to telephone speech and back
WO2008031163A1 (en) 2006-09-13 2008-03-20 Eatingsafe Pty Ltd. On-line ingredient register
US7987111B1 (en) 2006-10-30 2011-07-26 Videomining Corporation Method and system for characterizing physical retail spaces by determining the demographic composition of people in the physical retail spaces utilizing video image analysis
US7533799B2 (en) 2006-12-14 2009-05-19 Ncr Corporation Weight scale fault detection
US9269244B2 (en) 2007-03-06 2016-02-23 Verint Systems Inc. Event detection based on video metadata
US8146811B2 (en) 2007-03-12 2012-04-03 Stoplift, Inc. Cart inspection for suspicious items
US20080228549A1 (en) * 2007-03-14 2008-09-18 Harrison Michael J Performance evaluation systems and methods
US8965042B2 (en) 2007-03-20 2015-02-24 International Business Machines Corporation System and method for the measurement of retail display effectiveness
US7762458B2 (en) 2007-03-25 2010-07-27 Media Cart Holdings, Inc. Media enabled shopping system user interface
US7679522B2 (en) 2007-03-26 2010-03-16 Media Cart Holdings, Inc. Media enhanced shopping systems with electronic queuing
US20080294514A1 (en) * 2007-05-23 2008-11-27 Calman Matthew A System and method for remote deposit capture and customer information gathering
US8794524B2 (en) * 2007-05-31 2014-08-05 Toshiba Global Commerce Solutions Holdings Corporation Smart scanning system
CN101802860A (en) * 2007-07-09 2010-08-11 维蒂公开股份有限公司 Mobile device marketing and advertising platforms, methods, and systems
US7672876B2 (en) 2007-07-13 2010-03-02 Sunrise R&D Holdings, Llc System for shopping in a store
US8876001B2 (en) 2007-08-07 2014-11-04 Ncr Corporation Methods and apparatus for image recognition in checkout verification
US20090039165A1 (en) 2007-08-08 2009-02-12 Ncr Corporation Methods and Apparatus for a Bar Code Scanner Providing Video Surveillance
US7909248B1 (en) 2007-08-17 2011-03-22 Evolution Robotics Retail, Inc. Self checkout with visual recognition
JP4413957B2 (en) 2007-08-24 2010-02-10 株式会社東芝 Moving object detection device and autonomous moving object
US7949568B2 (en) 2007-08-31 2011-05-24 Accenture Global Services Limited Determination of product display parameters based on image processing
US8189855B2 (en) * 2007-08-31 2012-05-29 Accenture Global Services Limited Planogram extraction based on image processing
JP5080196B2 (en) 2007-10-09 2012-11-21 任天堂株式会社 Program, information processing apparatus, information processing system, and information processing method
US8456293B1 (en) 2007-10-22 2013-06-04 Alarm.Com Incorporated Providing electronic content based on sensor data
US8781908B2 (en) * 2007-11-08 2014-07-15 Walmart Stores, Inc. Method and apparatus for automated shopper checkout using radio frequency identification technology
US20110131005A1 (en) 2007-12-18 2011-06-02 Hiromu Ueshima Mobile recording apparatus, body movement measuring apparatus, information processing apparatus, movement pattern determining apparatus, activity amount calculating apparatus, recording method, body movement measuring method, information processing method, movement pattern determining method, activity amount calculating method
US20090160975A1 (en) * 2007-12-19 2009-06-25 Ncr Corporation Methods and Apparatus for Improved Image Processing to Provide Retroactive Image Focusing and Improved Depth of Field in Retail Imaging Systems
FR2927442B1 (en) 2008-02-12 2013-06-14 Cliris METHOD FOR DETERMINING A LOCAL TRANSFORMATION RATE OF AN OBJECT OF INTEREST
US8746557B2 (en) * 2008-02-26 2014-06-10 Toshiba Global Commerce Solutions Holding Corporation Secure self-checkout
JP5521276B2 (en) * 2008-03-13 2014-06-11 富士通株式会社 Authentication apparatus, authentication method, and authentication program
WO2009117450A1 (en) 2008-03-18 2009-09-24 Invism, Inc. Enhanced immersive soundscapes production
US8419433B2 (en) 2008-04-15 2013-04-16 International Business Machines Corporation Monitoring recipe preparation using interactive cooking device
US8229158B2 (en) * 2008-04-29 2012-07-24 International Business Machines Corporation Method, system, and program product for determining a state of a shopping receptacle
US7448542B1 (en) * 2008-05-05 2008-11-11 International Business Machines Corporation Method for detecting a non-scan at a retail checkout station
US7965184B1 (en) 2008-06-16 2011-06-21 Bank Of America Corporation Cash handling facility management
US8219438B1 (en) * 2008-06-30 2012-07-10 Videomining Corporation Method and system for measuring shopper response to products based on behavior and facial expression
US8126195B2 (en) * 2008-07-01 2012-02-28 International Business Machines Corporation Graphical retail item identification with point-of-sale terminals
WO2010019852A2 (en) * 2008-08-15 2010-02-18 Mohammed Hashim-Waris Supply chain management systems and methods
CN101653662A (en) 2008-08-21 2010-02-24 鸿富锦精密工业(深圳)有限公司 Robot
US8448859B2 (en) 2008-09-05 2013-05-28 Datalogic ADC, Inc. System and method for preventing cashier and customer fraud at retail checkout
US20100063862A1 (en) * 2008-09-08 2010-03-11 Thompson Ronald L Media delivery system and system including a media delivery system and a building automation system
US8818875B2 (en) * 2008-09-23 2014-08-26 Toshiba Global Commerce Solutions Holdings Corporation Point of sale system with item image capture and deferred invoicing capability
US8194985B2 (en) * 2008-10-02 2012-06-05 International Business Machines Corporation Product identification using image analysis and user interaction
US8493408B2 (en) 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
FR2938774A1 (en) 2008-11-27 2010-05-28 Parrot DEVICE FOR CONTROLLING A DRONE
US8289162B2 (en) 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
US8571298B2 (en) 2008-12-23 2013-10-29 Datalogic ADC, Inc. Method and apparatus for identifying and tallying objects
US8494909B2 (en) 2009-02-09 2013-07-23 Datalogic ADC, Inc. Automatic learning in a merchandise checkout system with visual recognition
US8494215B2 (en) 2009-03-05 2013-07-23 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
US20100262461A1 (en) * 2009-04-14 2010-10-14 Mypoints.Com Inc. System and Method for Web-Based Consumer-to-Business Referral
CN102447836A (en) * 2009-06-16 2012-05-09 英特尔公司 Camera applications in a handheld device
US10296937B2 (en) * 2009-06-29 2019-05-21 Excalibur Ip, Llc Operating a sensor recording marketplace
GB0913990D0 (en) * 2009-08-11 2009-09-16 Connelly Sean R Trolley
US20110060641A1 (en) 2009-09-04 2011-03-10 Bank Of America Customer benefit offers at kiosks and self-service devices
US8452868B2 (en) 2009-09-21 2013-05-28 Checkpoint Systems, Inc. Retail product tracking system, method, and apparatus
US8538820B1 (en) 2009-10-26 2013-09-17 Stoplift, Inc. Method and apparatus for web-enabled random-access review of point of sale transactional video
US8332255B2 (en) * 2009-11-09 2012-12-11 Palo Alto Research Center Incorporated Sensor-integrated mirror for determining consumer shopping behavior
US8320633B2 (en) * 2009-11-27 2012-11-27 Ncr Corporation System and method for identifying produce
US8479975B2 (en) * 2010-02-11 2013-07-09 Cimbal Inc. System and method for using machine-readable indicia to provide additional information and offers to potential customers
US8964298B2 (en) * 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US20110231331A1 (en) * 2010-03-19 2011-09-22 International Business Machines Corporation Providing An Enhanced Shopping Experience
EP2372627A3 (en) * 2010-04-01 2011-10-12 Richard E. Rowe Providing city services using mobile devices and a sensor network
US20110279446A1 (en) 2010-05-16 2011-11-17 Nokia Corporation Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
US9053473B2 (en) 2010-05-28 2015-06-09 Ncr Corporation Techniques for assisted self checkout
SE535853C2 (en) 2010-07-08 2013-01-15 Itab Scanflow Ab checkout counter
US8488881B2 (en) 2010-07-27 2013-07-16 International Business Machines Corporation Object segmentation at a self-checkout
US9326116B2 (en) * 2010-08-24 2016-04-26 Rhonda Enterprises, Llc Systems and methods for suggesting a pause position within electronic text
US8941723B2 (en) 2010-08-26 2015-01-27 Blast Motion Inc. Portable wireless mobile device motion capture and analysis system and method
US8615105B1 (en) * 2010-08-31 2013-12-24 The Boeing Company Object tracking system
US20120072936A1 (en) * 2010-09-20 2012-03-22 Microsoft Corporation Automatic Customized Advertisement Generation System
US9171442B2 (en) * 2010-11-19 2015-10-27 Tyco Fire & Security Gmbh Item identification using video recognition to supplement bar code or RFID information
US8418919B1 (en) * 2011-01-04 2013-04-16 Intellectual Ventures Fund 79 Llc Apparatus and method for mobile checkout
WO2012106815A1 (en) * 2011-02-11 2012-08-16 4D Retail Technology Corp. System and method for virtual shopping display
US8317086B2 (en) * 2011-02-16 2012-11-27 International Business Machines Corporation Communication of transaction data within a self-checkout environment
US20120233003A1 (en) * 2011-03-08 2012-09-13 Bank Of America Corporation Providing retail shopping assistance
US20120239504A1 (en) * 2011-03-15 2012-09-20 Microsoft Corporation Virtual Shopping Assistance
JP5780791B2 (en) * 2011-03-23 2015-09-16 オリンパス株式会社 Cell tracking method
US20120271715A1 (en) 2011-03-25 2012-10-25 Morton Timothy B System and method for the automatic delivery of advertising content to a consumer based on the consumer's indication of interest in an item or service available in a retail environment
DE102011016663A1 (en) * 2011-04-05 2012-10-11 How To Organize (H2O) Gmbh Device and method for identifying instruments
US20120290288A1 (en) * 2011-05-09 2012-11-15 Xerox Corporation Parsing of text using linguistic and non-linguistic list properties
US20120320214A1 (en) 2011-06-06 2012-12-20 Malay Kundu Notification system and methods for use in retail environments
US8698874B2 (en) * 2011-06-10 2014-04-15 Microsoft Corporation Techniques for multiple video source stitching in a conference room
KR101822655B1 (en) * 2011-06-21 2018-01-29 삼성전자주식회사 Object recognition method using camera and camera system for the same
EP2538298A1 (en) 2011-06-22 2012-12-26 Sensefly Sàrl Method for acquiring images from arbitrary perspectives with UAVs equipped with fixed imagers
US20130030915A1 (en) * 2011-06-23 2013-01-31 Qualcomm Incorporated Apparatus and method for enhanced in-store shopping services using mobile device
US8544729B2 (en) * 2011-06-24 2013-10-01 American Express Travel Related Services Company, Inc. Systems and methods for gesture-based interaction with computer systems
US8851372B2 (en) 2011-07-18 2014-10-07 Tiger T G Zhou Wearable personal digital device with changeable bendable battery and expandable display used as standalone electronic payment card
US9754312B2 (en) 2011-06-30 2017-09-05 Ncr Corporation Techniques for personalizing self checkouts
US20130024265A1 (en) * 2011-07-22 2013-01-24 Marc Lotzof Programmable Customer Loyalty and Discount Card
US9251679B2 (en) 2011-08-16 2016-02-02 Tamperseal Ab Method and a system for monitoring the handling of an object
US20130046648A1 (en) * 2011-08-17 2013-02-21 Bank Of America Corporation Shopping list system and process
US20130054367A1 (en) * 2011-08-22 2013-02-28 Bank Of America Corporation Mobile door buster offer transmission based on historical transaction data
US20130054395A1 (en) * 2011-08-25 2013-02-28 Michael Cyr Methods and systems for self-service checkout
US20130222371A1 (en) 2011-08-26 2013-08-29 Reincloud Corporation Enhancing a sensory perception in a field of view of a real-time source within a display screen through augmented reality
US9367770B2 (en) 2011-08-30 2016-06-14 Digimarc Corporation Methods and arrangements for identifying objects
US9033238B2 (en) 2011-08-30 2015-05-19 Digimarc Corporation Methods and arrangements for sensing identification information from objects
US8560357B2 (en) * 2011-08-31 2013-10-15 International Business Machines Corporation Retail model optimization through video data capture and analytics
US10204366B2 (en) * 2011-09-29 2019-02-12 Electronic Commodities Exchange Apparatus, article of manufacture and methods for customized design of a jewelry item
US8498903B2 (en) * 2011-09-29 2013-07-30 Ncr Corporation System and method for performing a security check at a checkout terminal
WO2013071150A1 (en) 2011-11-11 2013-05-16 Bar Code Specialties, Inc. (Dba Bcs Solutions) Robotic inventory systems
JP2013109539A (en) 2011-11-21 2013-06-06 Hitachi Consumer Electronics Co Ltd Product purchase device and product purchase method
US20140304107A1 (en) 2012-12-03 2014-10-09 CLARKE William McALLISTER Webrooming with rfid-scanning robots
US9747480B2 (en) 2011-12-05 2017-08-29 Adasa Inc. RFID and robots for multichannel shopping
US10223710B2 (en) * 2013-01-04 2019-03-05 Visa International Service Association Wearable intelligent vision device apparatuses, methods and systems
US20130185155A1 (en) * 2012-01-12 2013-07-18 Big Red Pen, Inc. Systems and methods for providing contributions from third parties to lower a cost of a transaction for a purchaser
US20130254044A1 (en) 2012-01-13 2013-09-26 Peter Terry Catoe Self-checkout guidance systems and methods
US9202105B1 (en) * 2012-01-13 2015-12-01 Amazon Technologies, Inc. Image analysis for user authentication
JP5579202B2 (en) 2012-01-16 2014-08-27 東芝テック株式会社 Information processing apparatus, store system, and program
US9247211B2 (en) 2012-01-17 2016-01-26 Avigilon Fortress Corporation System and method for video content analysis using depth sensing
US9522791B2 (en) * 2012-02-10 2016-12-20 Carnegie Mellon University, A Pennsylvania Non-Profit Corporation System and method of material handling using one or more imaging devices on the transferring vehicle to control the material distribution into the storage portion of the receiving vehicle
WO2013134865A1 (en) 2012-03-16 2013-09-19 In Situ Media Corporation System and method for scanning, tracking and collating customer shopping selections
JP5785123B2 (en) * 2012-03-16 2015-09-24 株式会社イシダ Combination weighing device
US20130254114A1 (en) 2012-03-23 2013-09-26 Ncr Corporation Network-based self-checkout
TW201339903A (en) 2012-03-26 2013-10-01 Hon Hai Prec Ind Co Ltd System and method for remotely controlling AUV
US20130256395A1 (en) 2012-03-29 2013-10-03 Symbol Technologies, Inc. System for and method of expediting self-checkout at point-of-sale stations
US9892438B1 (en) * 2012-05-03 2018-02-13 Stoplift, Inc. Notification system and methods for use in retail environments
US9384668B2 (en) 2012-05-09 2016-07-05 Singularity University Transportation using network of unmanned aerial vehicles
US20130311260A1 (en) * 2012-05-17 2013-11-21 Luvocracy Inc. Reward Structures
US10387817B2 (en) 2012-05-17 2019-08-20 Catalina Marketing Corporation System and method of initiating in-trip audits in a self-checkout system
US20140006165A1 (en) * 2012-06-28 2014-01-02 Bank Of America Corporation Systems and methods for presenting offers during an in-store shopping experience
US20140006128A1 (en) * 2012-06-28 2014-01-02 Bank Of America Corporation Systems and methods for presenting offers during a shopping experience
US8805014B2 (en) 2012-07-11 2014-08-12 Ncr Corporation Produce color data correction method and an apparatus therefor
US8919653B2 (en) 2012-07-19 2014-12-30 Datalogic ADC, Inc. Exception handling in automated data reading systems
US9135789B2 (en) 2012-07-31 2015-09-15 Ncr Corporation Method and apparatus for reducing recognition times in an image-based product recognition system
US9171382B2 (en) * 2012-08-06 2015-10-27 Cloudparc, Inc. Tracking speeding violations and controlling use of parking spaces using cameras
US8856034B2 (en) * 2012-08-16 2014-10-07 International Business Machines Corporation Intelligent point of sale system
US20140207600A1 (en) * 2012-08-24 2014-07-24 Daniel Ezell System and method for collection and management of items
WO2014033354A1 (en) 2012-08-30 2014-03-06 Nokia Corporation A method and apparatus for updating a field of view in a user interface
US20140098185A1 (en) 2012-10-09 2014-04-10 Shahram Davari Interactive user selected video/audio views by real time stitching and selective delivery of multiple video/audio sources
US9396622B2 (en) 2012-11-02 2016-07-19 Tyco Fire & Security Gmbh Electronic article surveillance tagged item validation prior to deactivation
AU2013204965B2 (en) 2012-11-12 2016-07-28 C2 Systems Limited A system, method, computer program and data signal for the registration, monitoring and control of machines and devices
WO2014106260A1 (en) 2012-12-31 2014-07-03 Fujikura Composite America, Inc. Electronic scale
JP5314199B1 (en) * 2013-01-29 2013-10-16 パナソニック株式会社 Customer segment analysis apparatus, customer segment analysis system, and customer segment analysis method
US20140214623A1 (en) * 2013-01-30 2014-07-31 Wal-Mart Stores, Inc. In-store customer scan process including product automated ingredient warning
US10438228B2 (en) * 2013-01-30 2019-10-08 Walmart Apollo, Llc Systems and methods for price matching and comparison
US9076157B2 (en) 2013-01-30 2015-07-07 Wal-Mart Stores, Inc. Camera time out feature for customer product scanning device
US20140222596A1 (en) 2013-02-05 2014-08-07 Nithin Vidya Prakash S System and method for cardless financial transaction using facial biomertics
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US10127588B2 (en) * 2013-02-28 2018-11-13 Ncr Corporation Methods and apparatus for providing customer assistance
US20140274307A1 (en) * 2013-03-13 2014-09-18 Brainz SAS System and method for providing virtual world reward in response to the user accepting and/or responding to an advertisement for a real world product received in the virtual world
US9330413B2 (en) * 2013-03-14 2016-05-03 Sears Brands, L.L.C. Checkout and/or ordering systems and methods
US9177224B1 (en) * 2013-03-14 2015-11-03 Amazon Technologies, Inc. Object recognition and tracking
US8818572B1 (en) 2013-03-15 2014-08-26 State Farm Mutual Automobile Insurance Company System and method for controlling a remote aerial device for up-close inspection
US8989922B2 (en) 2013-03-15 2015-03-24 Azure Sky Group, LLC. Modular drone and methods for use
US9033227B2 (en) 2013-05-20 2015-05-19 Ncr Corporation Methods and systems for performing security weight checks at checkouts
US20140365334A1 (en) 2013-06-07 2014-12-11 Bby Solutions, Inc. Retail customer service interaction system and method
US20140365333A1 (en) * 2013-06-07 2014-12-11 Bby Solutions, Inc. Retail store customer natural-gesture interaction with animated 3d images using sensor array
US20140365336A1 (en) 2013-06-07 2014-12-11 Bby Solutions, Inc. Virtual interactive product display with mobile device interaction
US20140367466A1 (en) * 2013-06-12 2014-12-18 Motorola Solutions, Inc. Checkout kiosk
US9338440B2 (en) * 2013-06-17 2016-05-10 Microsoft Technology Licensing, Llc User interface for three-dimensional modeling
US10176456B2 (en) 2013-06-26 2019-01-08 Amazon Technologies, Inc. Transitioning items from a materials handling facility
US10268983B2 (en) 2013-06-26 2019-04-23 Amazon Technologies, Inc. Detecting item interaction and movement
US9127891B2 (en) * 2013-07-10 2015-09-08 Honeywell International, Inc. Furnace visualization
US20150025969A1 (en) * 2013-07-18 2015-01-22 Fetch Rewards, LLC Multisystem Interface for Roaming Self-Checkout
US20150025929A1 (en) * 2013-07-18 2015-01-22 Wal-Mart Stores, Inc. System and method for providing assistance
US9473747B2 (en) * 2013-07-25 2016-10-18 Ncr Corporation Whole store scanner
US20150039388A1 (en) * 2013-07-30 2015-02-05 Arun Rajaraman System and method for determining consumer profiles for targeted marketplace activities
KR20150018037A (en) * 2013-08-08 2015-02-23 주식회사 케이티 System for monitoring and method for monitoring using the same
US20150100433A1 (en) * 2013-10-04 2015-04-09 Retailigence Corporation Online Reservation System For Local Pickup Of Products Across Multiple Retailers
WO2015061008A1 (en) * 2013-10-26 2015-04-30 Amazon Technologies, Inc. Unmanned aerial vehicle delivery system
US20150134413A1 (en) * 2013-10-31 2015-05-14 International Business Machines Corporation Forecasting for retail customers
WO2015084687A1 (en) * 2013-12-02 2015-06-11 Wal-Mart Stores, Inc. System and method for placing an order using a local device
US9122958B1 (en) * 2014-02-14 2015-09-01 Social Sweepster, LLC Object recognition or detection based on verification tests
US9244280B1 (en) * 2014-03-25 2016-01-26 Rockwell Collins, Inc. Near eye display system and method for display enhancement or redundancy
US10078136B2 (en) * 2014-03-25 2018-09-18 Amazon Technologies, Inc. Sense and avoid for automated mobile vehicles
US9779395B2 (en) * 2014-05-13 2017-10-03 Wal-Mart Stores, Inc. Systems and methods for identifying transaction capabilities of cashier
US20150379118A1 (en) * 2014-06-27 2015-12-31 United Video Properties, Inc. Methods and systems for generating playlists based on activities being performed by a user
US10062099B2 (en) * 2014-07-25 2018-08-28 Hewlett Packard Enterprise Development Lp Product identification based on location associated with image of product
CA3235272A1 (en) * 2014-07-25 2016-01-28 Gatekeeper Systems, Inc. Monitoring usage or status of cart retrievers
US11042887B2 (en) * 2014-08-29 2021-06-22 Shopper Scientist Llc Product exposure analysis in a shopping environment
US10366447B2 (en) * 2014-08-30 2019-07-30 Ebay Inc. Providing a virtual shopping environment for an item
WO2016057610A1 (en) * 2014-10-07 2016-04-14 Wal-Mart Stores, Inc. Apparatus and method of scanning products and interfacing with a customer's personal mobile device
US20160110791A1 (en) 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
JP6302849B2 (en) * 2015-01-23 2018-03-28 東芝テック株式会社 Article recognition apparatus, sales data processing apparatus, and control program
EP3374947A4 (en) * 2015-11-09 2019-03-27 Simbe Robotics, Inc. Method for tracking stock level within a store
US9827683B1 (en) * 2016-07-28 2017-11-28 X Development Llc Collaborative inventory monitoring
MX2017011354A (en) * 2016-09-07 2018-09-21 Walmart Apollo Llc In-store audio systems, devices, and methods.

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080228577A1 (en) * 2005-08-04 2008-09-18 Koninklijke Philips Electronics, N.V. Apparatus For Monitoring a Person Having an Interest to an Object, and Method Thereof
US20110143779A1 (en) * 2009-12-11 2011-06-16 Think Tek, Inc. Providing City Services using Mobile Devices and a Sensor Network
US20130060843A1 (en) * 2010-03-29 2013-03-07 Rakuten, Inc. Server apparatus, information providing method, information providing program, recording medium recording the information providing program, and information providing system
US20130054377A1 (en) * 2011-08-30 2013-02-28 Nils Oliver Krahnstoever Person tracking and interactive advertising
US20130085345A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Personal Audio/Visual System Providing Allergy Awareness
US20130290107A1 (en) * 2012-04-27 2013-10-31 Soma S. Santhiveeran Behavior based bundling
US20140372211A1 (en) * 2013-06-14 2014-12-18 International Business Machines Corporation Real-time advertisement based on common point of attraction of different viewers

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10572843B2 (en) * 2014-02-14 2020-02-25 Bby Solutions, Inc. Wireless customer and labor management optimization in retail settings
US11288606B2 (en) 2014-02-14 2022-03-29 Bby Solutions, Inc. Wireless customer and labor management optimization in retail settings
US20170116657A1 (en) * 2015-10-26 2017-04-27 Sk Planet Co., Ltd. Payment information providing system using wearable device and payment information providing method using the same
US11816717B2 (en) * 2015-10-26 2023-11-14 Sk Planet Co., Ltd. Wearable device for providing payment information
US20210133844A1 (en) * 2015-10-26 2021-05-06 Sk Planet Co., Ltd. Payment information providing system using wearable device and payment information providing method using the same
US10922733B2 (en) * 2015-10-26 2021-02-16 Sk Planet Co., Ltd. Payment information providing system using wearable device and payment information providing method using the same
US20170206571A1 (en) * 2016-01-14 2017-07-20 Adobe Systems Incorporated Generating leads using internet of things devices at brick-and-mortar stores
US11113734B2 (en) * 2016-01-14 2021-09-07 Adobe Inc. Generating leads using Internet of Things devices at brick-and-mortar stores
US11216868B2 (en) 2016-05-09 2022-01-04 Grabango Co. Computer vision system and method for automatic checkout
US10861086B2 (en) 2016-05-09 2020-12-08 Grabango Co. Computer vision system and method for automatic checkout
US11295552B2 (en) 2016-07-09 2022-04-05 Grabango Co. Mobile user interface extraction
US11095470B2 (en) 2016-07-09 2021-08-17 Grabango Co. Remote state following devices
US11302116B2 (en) 2016-07-09 2022-04-12 Grabango Co. Device interface extraction
US11132737B2 (en) 2017-02-10 2021-09-28 Grabango Co. Dynamic customer checkout experience within an automated shopping environment
WO2018208671A1 (en) * 2017-05-08 2018-11-15 Walmart Apollo, Llc Uniquely identifiable customer traffic systems and methods
US20180322514A1 (en) * 2017-05-08 2018-11-08 Walmart Apollo, Llc Uniquely identifiable customer traffic systems and methods
US10778906B2 (en) * 2017-05-10 2020-09-15 Grabango Co. Series-configured camera array for efficient deployment
US11805327B2 (en) 2017-05-10 2023-10-31 Grabango Co. Serially connected camera rail
US11288650B2 (en) 2017-06-21 2022-03-29 Grabango Co. Linking computer vision interactions with a computer kiosk
US11282077B2 (en) 2017-08-21 2022-03-22 Walmart Apollo, Llc Data comparison efficiency for real-time data processing, monitoring, and alerting
US10810595B2 (en) 2017-09-13 2020-10-20 Walmart Apollo, Llc Systems and methods for real-time data processing, monitoring, and alerting
US11226688B1 (en) 2017-09-14 2022-01-18 Grabango Co. System and method for human gesture processing from video input
US11501537B2 (en) 2017-10-16 2022-11-15 Grabango Co. Multiple-factor verification for vision-based systems
US10963704B2 (en) 2017-10-16 2021-03-30 Grabango Co. Multiple-factor verification for vision-based systems
US11481805B2 (en) 2018-01-03 2022-10-25 Grabango Co. Marketing and couponing in a retail environment using computer vision
US10628660B2 (en) * 2018-01-10 2020-04-21 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US20190213534A1 (en) * 2018-01-10 2019-07-11 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US12079771B2 (en) 2018-01-10 2024-09-03 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US20190318372A1 (en) * 2018-04-13 2019-10-17 Shopper Scientist Llc Shopping time allocated to product exposure in a shopping environment
US11164197B2 (en) * 2018-04-13 2021-11-02 Shopper Scientist Llc Shopping time allocated to product exposure in a shopping environment
EP3834130A4 (en) * 2018-08-07 2022-09-14 Lynxight Ltd. Drowning detection enhanced by swimmer analytics
US11288648B2 (en) 2018-10-29 2022-03-29 Grabango Co. Commerce automation for a fueling station
US11176684B2 (en) * 2019-02-18 2021-11-16 Acer Incorporated Customer behavior analyzing method and customer behavior analyzing system
US11507933B2 (en) 2019-03-01 2022-11-22 Grabango Co. Cashier interface for linking customers to virtual data

Also Published As

Publication number Publication date
US9842363B2 (en) 2017-12-12
US10810648B2 (en) 2020-10-20
US10593163B2 (en) 2020-03-17
US20160110793A1 (en) 2016-04-21
US20160110622A1 (en) 2016-04-21
US20160110772A1 (en) 2016-04-21
US20190139375A1 (en) 2019-05-09
US10157413B2 (en) 2018-12-18
US11514497B2 (en) 2022-11-29
US20160110902A1 (en) 2016-04-21
US20160110799A1 (en) 2016-04-21
US20160110797A1 (en) 2016-04-21
US10776844B2 (en) 2020-09-15
US20200402130A1 (en) 2020-12-24
US9424601B2 (en) 2016-08-23
US20180108074A1 (en) 2018-04-19
US9786000B2 (en) 2017-10-10
US10825068B2 (en) 2020-11-03
US10482724B2 (en) 2019-11-19
US20160110751A1 (en) 2016-04-21
US20160110700A1 (en) 2016-04-21
US20160110702A1 (en) 2016-04-21
US20160110760A1 (en) 2016-04-21
US20190096198A1 (en) 2019-03-28
US9679327B2 (en) 2017-06-13
US10417878B2 (en) 2019-09-17
US20160110786A1 (en) 2016-04-21
US20160110703A1 (en) 2016-04-21
US10176677B2 (en) 2019-01-08
US20160109281A1 (en) 2016-04-21
US20160110701A1 (en) 2016-04-21
US11127061B2 (en) 2021-09-21
US10672051B2 (en) 2020-06-02

Similar Documents

Publication Publication Date Title
US20160110791A1 (en) Method, computer program product, and system for providing a sensor-based environment
JP6264380B2 (en) Sales promotion system, sales promotion method, sales promotion program, and shelf system
JP2018518778A (en) Augmented reality devices, systems and methods for purchase
CA3034834C (en) Sensor-enabled prioritization of processing task requests in an environment
CA3034823C (en) Image segmentation in a sensor-based environment
US20200250736A1 (en) Systems, method and apparatus for frictionless shopping
WO2016051182A1 (en) System and method for monitoring display unit compliance
US11107091B2 (en) Gesture based in-store product feedback system
US10691931B2 (en) Sensor-based environment for providing image analysis to determine behavior
US10311497B2 (en) Server, analysis method and computer program product for analyzing recognition information and combination information
US20240013287A1 (en) Real time visual feedback for augmented reality map routing and item selection
WO2016051183A1 (en) System and method for monitoring display unit compliance
JP2016218822A (en) Marketing information use device, marketing information use method and program
WO2023079846A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA GLOBAL COMMERCE SOLUTIONS HOLDINGS CORPORATION

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERRING, DEAN FREDERICK;CHIRAKANSAKCHAROEN, MONSAK JASON;SINGH, ANKIT;SIGNING DATES FROM 20141223 TO 20141224;REEL/FRAME:034644/0068

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION