
WO2010141637A1 - System and method for learning user genres and styles and matching products to user preferences - Google Patents

System and method for learning user genres and styles and matching products to user preferences

Info

Publication number
WO2010141637A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
fashion
genre
fashion product
product content
Prior art date
Application number
PCT/US2010/037139
Other languages
French (fr)
Inventor
Tianli Yu
Orhan Camoglu
Luca Bertelli
Jacquie Marie Phillips
Muralidharan Venkatasubramanian
Diem Vu
Munjal Shah
Salih Burak Gokturk
Original Assignee
Like.Com
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Like.Com filed Critical Like.Com
Priority to KR1020127000140A priority Critical patent/KR20120085707A/en
Priority to JP2012514104A priority patent/JP2012529122A/en
Priority to CA2764056A priority patent/CA2764056A1/en
Priority to EP10784043.1A priority patent/EP2438509A4/en
Priority to AU2010256641A priority patent/AU2010256641A1/en
Publication of WO2010141637A1 publication Critical patent/WO2010141637A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F7/00Methods or arrangements for processing data by operating upon the order or content of the data handled
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • While off-line learning is composed of well-established techniques that have been thoroughly dissected, on-line algorithms have received considerable attention in the last decade, with applications ranging from learning complex background and appearance models to object detection and classification, and modeling and predicting user behavior.
  • On-line learning can become the only viable solution in applications where the training data is never available in batch, but is instead gathered concurrently with the decision/classification process; hence the need to design an adaptive learning technique.
  • off-line or batch paradigms need to be retrained once new/unseen data is presented.
  • FIG. 1 illustrates a system that uses visual information to identify genre and fashion style preferences of a user, according to one or more embodiments.
  • FIG. 2 illustrates a method for predicting a preference of a user to a particular genre, according to one or more embodiments.
  • FIG. 3A depicts an example of a panel that can be generated to present a set of visual aids to the user in order to prompt the user into providing a response, under an embodiment.
  • FIG. 3B shows a panel that enables the user to select size information for various types of fashion products, such as shoes, tops, bottoms, and dresses.
  • FIG. 3C illustrates a panel that enables a user to specify or indicate the user's preference for characteristics such as patterns, color, and shape.
  • FIG. 4 describes a method for programmatically predicting the genre or style of a product, under an embodiment.
  • FIG. 5 illustrates a method for matching a product to a customer preference, according to one or more embodiments.
  • FIG. 6 illustrates a result panel for communicating the programmatically determined fashion genre preferences of the user, according to an embodiment.
  • FIG. 7 illustrates a method for determining descriptive classifications and categories of fashion products provided by fashion product content items, under one or more embodiments.
  • FIG. 8 illustrates a system that makes fashion product recommendations to users using product class/category determinations and user activity information, according to an embodiment.
  • embodiments described herein provide a computer implemented method or system in which a user's genre preference to style or fashion can be determined programmatically.
  • embodiments enable programmatic classification and categorization of fashion products using image, text and metadata associated with a corresponding fashion product content item.
  • some embodiments enable a service or system to make programmatically determined recommendations relating to fashion products, based on information determined about the user's genre preferences and/or the determined genre or style of a fashion product represented by a content item.
  • embodiments described herein include a computer-implemented method for determining user preferences for fashion products.
  • a fashion preference of a user is determined based on a user's interaction with a plurality of fashion product content items that individually depict a corresponding fashion product.
  • a recommendation is made to a user of a fashion product based at least in part on the fashion preference of the user.
  • a fashion product content item is analyzed to determine a set of features of a fashion product depicted in the fashion product content item.
  • the fashion product is associated with a pre-defined descriptive category for each of a plurality of descriptive classifications, based on a quantitative analysis of the determined set of features.
  • the product content item and its pre-defined descriptive category for each of the plurality of descriptive classifications are used to determine or predict a user preference.
  • one or more processors are structured to analyze individual fashion product content items representing a catalog of fashion products to determine, for each fashion product content item, a set of features of a fashion product depicted in that fashion product content item.
  • Each fashion product represented by one of the fashion product content items is assigned to a pre-defined descriptive category for one or more corresponding descriptive classifications. The assignment is based on a quantitative analysis of the determined set of features.
  • One or more fashion product content items are detected which are deemed to be of interested to the user.
  • a fashion preference of the user is determined using the pre-defined descriptive category for each of the plurality of descriptive classifications of the one or more fashion product content items that are deemed of interest to the user.
  • Embodiments described herein include systems and methods for (i) learning a user's or customer's preferences in clothing styles, fashion and genres, (ii) predicting genres of different clothing products and fashion accessories, and/or (iii) using (a) known shopping parameters of a user (e.g. the user's size information, price preferences, hate or love for certain styles, patterns and colors) and/or (b) predicted genres and styles for each individual user, to propose the best matching products and accessories to customers.
  • known shopping parameters of a user e.g. the user's size information, price preferences, hate or love for certain styles, patterns and colors
  • a fashion product includes, for example, clothing, accessories and apparel. Specific examples include blouses, shirts, dresses, shoes, socks, pants and bottoms, belts, jewelry (e.g. watches, earrings, necklaces), ties, hats, jackets and coats.
  • jewelry e.g. watches, earrings, necklaces
  • a fashion product content item corresponds to a document or file that includes visual, textual and/or metadata information about a particular product.
  • the fashion product content items are generally available as part of an online catalog or e-commerce search engine. Typical aspects of such content items include (i) one or more images of a product, (ii) textual information about the product, including information about available sizes and variations to the product, (iii) pricing information, and/or (iv) links or data elements that facilitate the viewer of the content item in purchasing the depicted fashion product.
  • Some embodiments recognize that computational complexity and latency of all these on-line learning techniques remain an open problem and can become critical in time constrained applications such as real-time object tracking or the on-line shopping scenario that is described in this paper.
  • a large number of high dimensional feature vectors enforces strict requirements on the number of operations allowed in order to meet the stringent time requirements.
  • Some embodiments described herein include computer- implemented techniques for learning user preferences from a user's interaction with an on-line interface (e.g. one provided at a shopping website). By predicting what the user likes, a better search ranking algorithm can be designed, which in turn results in a better experience perceived by the user.
  • an on-line interface e.g. one provided at a shopping website.
  • embodiments combine heterogeneous cues coming from visual and text features and, in particular, provide a compact yet discriminative representation of the user's preferences that traditional features are not able to achieve.
  • embodiments implement a learning stage which can process relatively large feature vectors in less than a few milliseconds to avoid compromising the overall user experience.
  • programmatic means through execution of code, programming or other logic.
  • a programmatic action may be performed with software, firmware or hardware, and generally without user-intervention, albeit not necessarily automatically, as the action may be manually triggered.
  • One or more embodiments described herein may be implemented using programmatic elements, often referred to as modules or components, although other names may be used. Such programmatic elements may include a program, a subroutine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions.
  • a module or component can exist on a hardware component independently of other modules/components or a module/component can be a shared element or process of other modules/components, programs or machines.
  • a module or component may reside on one machine, such as on a client or on a server, or a module/component may be distributed amongst multiple machines, such as on multiple clients or server machines.
  • Any system described may be implemented in whole or in part on a server, or as part of a network service.
  • a system such as described herein may be implemented on a local computer or terminal, in whole or in part.
  • Embodiments described herein generally require the use of computers, including processing and memory resources.
  • systems described herein may be implemented on a server or network service.
  • Such servers may connect and be used by users over networks such as the Internet, or by a combination of networks, such as cellular networks and the Internet.
  • networks such as the Internet
  • one or more embodiments described herein may be implemented locally, in whole or in part, on computing machines such as desktops, cellular phones, personal digital assistants or laptop computers.
  • memory, processing and network resources may all be used in connection with the establishment, use or performance of any embodiment described herein (including with the performance of any method or with the implementation of any system).
  • one or more embodiments described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer- readable medium.
  • Machines shown in figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing embodiments of the invention can be carried and/or executed.
  • the numerous machines shown with embodiments of the invention include processor(s) and various forms of memory for holding data and instructions.
  • Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers.
  • Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on many cell phones and personal digital assistants (PDAs)), and magnetic memory.
  • Computers, terminals, and network-enabled devices (e.g. mobile devices such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums.
  • FIG. 1 illustrates a system that uses visual information to identify genre and fashion style preferences of a user, according to one or more embodiments.
  • a system such as described in FIG. 1 presents preselected images of fashion products to individuals in an attempt to determine likes, dislikes, preferences and other user feedback for ascertaining the user's style or genre preference.
  • conventional techniques for estimating a shopper's (e.g. user or customer) style or genre preference typically involve asking the individual about genres/styles that best describe their personal preference for style and genre.
  • the conventional approach is problematic: among other reasons, words are not sufficiently precise to capture fashion preferences and statements. Additionally, users do not always know what their preferences are.
  • embodiments described herein and with FIG. 1 include a system that programmatically learns user fashion style and genre preferences using visual aids or pictures.
  • the system 100 may be provided in a variety of computing environments, including in a client-server architecture.
  • system 100 may be implemented on one or more servers (or other computing machines) to provide a service such as described by one or more embodiments detailed herein.
  • system 100 may be implemented on a website, such as an e-commerce site, search engine or shopping portal.
  • System 100 may rely on genre definitions that are defined by experts or operators.
  • fashion genres include (and are not limited to) 'looks' that are of the following genres: chic, street, Boho, urban/hip-hop, and conservative.
  • experts may select clothing and clothing ensembles that are representative of the various categories (the number of which is set by design or choice).
  • representative clothing and clothing ensembles form ground truth data, or points of comparison, in determining (i) genre preferences of the user, and (ii) predicting the genre of another item of clothing or apparel.
  • a system 100 depicts images of clothing and clothing ensembles in a worn state.
  • images of people including celebrity images wearing different genres of clothes and accessories can be shown to the user.
  • the user is enabled to respond to individual images to specify whether the depicted clothing is of a style or type that is in the user's preference.
  • the system can learn from user choices made on images, rather than on text descriptions or on user's self-reporting of preferences.
  • system 100 includes a user-interface 110, a user database 120, a genre score component 130, a genre determinator 134, a visual aid component 140, and a product database 150.
  • a user of system 100 may correspond to a shopper or a customer of fashion products.
  • system 100 is implemented on an online medium.
  • system 100 can be implemented as part of an e-commerce site, shopping portal, or other web-based or networked environment in which individuals are given the opportunity to view (and potentially purchase) fashion products.
  • the interface 110 may correspond to, for example, a webpage, or interactive feature provided on a webpage.
  • the user of system 100 is associated with the profile in user database 120.
  • the user may have an account with an operator of a service that provides system 100.
  • the user may be known by cookie/computer information, by account/login, or for a solitary online session with a provider (e.g. e-commerce site) of system 100.
  • a provider e.g. e-commerce site
  • the user may interact with the interface 110 and provide parameters 112 relating to fashion products that the user can wear.
  • the parameters that the user may specify include, for example, the user's clothing size, preferred price range for fashion and clothing items, and preferred brand names.
  • the user may also volunteer information about visual characteristics of clothing and apparel that the user likes or dislikes.
  • the product database 150 retains information from fashion product content items.
  • a product database 150 may store information about fashion products depicted in the product content items. Such information may be programmatically determined from image, text and metadata analysis of fashion product content items, as provided by retailers, manufacturers and other suppliers of fashion products. The information that is programmatically determined about depicted fashion products is associated in database 150 with corresponding product content items, such as electronic catalog pages and sections.
  • the visual aid component 140 is configured to present images, or visual aids, to the user, from which responses are elicited to make the genre/style preference determinations.
  • Visual aid component 140 communicates visuals 152 of fashion products to the user via the interface 110.
  • the visuals 152 depict fashion products, or ensembles of fashion products, in a worn state (e.g. as worn by a celebrity or model, on a mannequin, or computer generated onto an image of a person).
  • FIG. 2 and FIG. 3A illustrate examples of how the visuals 152 can be structured for presentation to the user.
  • the user can provide input through the interface 110 that indicates (i) the user's like or dislike of a particular fashion product or ensemble; (ii) the user's preference of one fashion product over another; and/or (iii) a rating or feedback that indicates the level of the user's like or dislike for the fashion product.
  • the visual aid component 140 presents a set of visuals 152 that prompt the user to enter a response indicating the user's visual preference for the fashion genre depicted by that visual. Still further, as described with an embodiment of FIG. 2 or FIG. 3A, the visuals may be presented to the user in a quiz or game fashion. In the quiz or game fashion, the user is shown panels that individually depict competing fashion products of different genres. The user can respond to each panel by indicating their preference, or like/dislike, of one fashion product over the other in the panel.
  • the genre score component 130 records and determines a genre score from the user's input.
  • the genre score component 130 may record the responses the user has made to the visuals 152 presented, in order to score individual classifications of fashion genre.
  • the genre score component 130, in combination with the visual aid component 140, can record and score the user's responses to subcategories of fashion genre.
  • Numerous techniques may be employed to ascertain the fashion genre preferences of the user.
  • the set of visuals 152 is predesigned to depict a number of images of fashion products for each identified genre. The user simply responds with preference or like/dislike input when viewing images of the fashion products in order to indicate his preference for one fashion genre over another.
  • the genre score component 130 maintains a genre score 133 that is indicative of the user's genre preference, for genres represented by this set of visuals 152.
  • the genre score 133 can be recorded in the user database 120, in association with the profile from the user.
  • the genre determinator 134 determines one or more preferred genres and/or subcategories (e.g. primary, secondary, and tertiary genres) of the user based at least in part on the score 133.
  • the genre determinator 134 and/or score 133 may also influence the visuals 152 outputted for the user by the component 140, in that intelligence may be used by way of probabilistic assumptions that those users who have a certain genre preference are likely to have a particular like or dislike of another genre. For example, the user with business genre preference may be deemed unlikely to also like street genre clothing.
  • one or more embodiments provide that the fashion products identified in the product database 150 are tagged with genre descriptors 151.
  • the descriptors include programmatically determined genre descriptors, which can be determined by a product genre predictor component (PGPC) 154.
  • PGPC 154 analyzes the product content items in order to obtain information that can be used to determine the genre(s) of the fashion product depicted in the content item.
  • system 100 can be used to determine genre preferences of the user, as well as to predict the genre classifications and categories of fashion products.
  • the genre descriptors 151 determined from the PGPC may include sub-genres or genre categories, including secondary and tertiary genre determinations. For example, many fashion products may share more than one genre.
  • FIG. 4 illustrates a method for predicting genre(s) of fashion products using fashion product content items, according to some embodiments.
  • system 100 also includes a product recommendation engine 170.
  • product recommendation engine 170 recommends a fashion product to the user, based on (i) user information that identifies genre/style preferences and parameters for fashion products that the user may purchase, and (ii) fashion product information.
  • User information 172 is provided by user database 120.
  • user information 172 includes genre preferences as outputted by the genre score component 130 (genre score 133) and/or genre preference information 137.
  • the fashion product information 174 is retrieved from the product database 150.
  • the fashion product information 174 includes programmatically predicted genre classifications and/or subcategories, associated with individual products.
  • the fashion product information 174 may also include information retrieved from the fashion product content item, as well as tags (e.g. metadata) provided by a supplier of the fashion product content item or the underlying fashion product.
  • tag e.g. metadata
  • the recommendation engine 170 is able to recommend individual fashion products from, for example, products identified in the product database 150.
  • the recommended products 176 may be communicated to the user via the interface 110.
  • system 100 is able to show its confidence in predicting user genres and style.
  • system 100 includes an interface in which users are able to also record known parameters, such as the user's clothing size, price preference, and/or their like/dislike for certain styles, patterns and colors. This information is used while matching products to user preferences.
  • the overall system allows for multiple hierarchies of genre prediction: a primary or top-level genre predicting broad genre or style matches; a secondary or second-level genre predicting multiple fine-grained genres and styles; a tertiary or third-level genre predicting multiple domain-specific styles; and so on.
  • FIG. 2 illustrates a method for predicting a preference of a user to a particular genre, according to one or more embodiments. More specifically, a method such as described determines, for a particular user, the user's primary, secondary and tertiary genres of preference. A method such as described may be implemented using a system such as described with FIG. 1. Accordingly, reference may be made to elements and numerals of FIG. 1 in order to describe suitable elements and components for performing a step or sub-step being described.
  • a set of images is shown to a user (210).
  • visual aid generator 140 selects and displays individual images of the set to the user via user-interface 110.
  • the set of images can be pre-selected to be from a diverse range of genres.
  • some or all of the genres are determined using manual definitions and selections.
  • the set of images may be sorted into different genres using manual input to classify each image in a particular genre.
  • some or all of the images in the set are programmatically determined to be associated with a genre. For example, programmatic methods may be used to identify similarity between items of clothing, and the similarity comparisons may be used to associate clothing with a particular genre.
  • the user is prompted to respond by providing an input (via interface 110) that indicates whether the user liked or disliked the image.
  • the user's responses are recorded (220).
  • the input is prompted from the user as part of a game in which the user can participate with input that states whether the user considered an individual image from the set as hot-or-not ("Hot-or-not game").
  • genre determinator 134 determines a user's preference to genre.
  • the genre determinator 134 uses an algorithm to determine the user's genre preferences (e.g. primary, secondary and tertiary). In one embodiment, an algorithm is used as follows:
  • Q_r = [q_{r1}, q_{r2}, ..., q_{rn}] denotes the vector of the user's genre probabilities.
  • the algorithm will update the user's genre probabilities. The update can be performed as follows: Of the two genres that are presented to the user, the one picked by the user is updated using
  • the algorithm terminates the test and returns the best genre to the user.
  • the algorithm picks two genre images to be shown to the user in the next round.
  • the algorithm can be generalized to present k (k > 2) images to the user.
  • the algorithm can also be generalized to determine t (t > 1) genres.
  • the criterion for stop can be modified to check the top t probabilities.
  • the strategy to select the next set of images should pick images from both the top t genres and the rest of the genres.
  • the user's responses indicating likes or dislikes are used to determine the primary, the secondary and the tertiary genres of preference for the user (240).
  • the primary, the secondary and the tertiary genres of preference can be determined at the same time. One way to implement this is to sequentially predict the primary, secondary and tertiary genres.
  • an approximation algorithm can be used. If all the images used for primary genre prediction are also tagged with secondary and tertiary genres, then the images that the user selected during the primary genre prediction can be used to build multiple histograms: one for secondary genres and multiple (one per domain) for tertiary genres. The top genres in these histograms can be used to predict secondary and tertiary genres.
  • To offer a good user experience, some embodiments provide progress feedback to indicate the amount of progress the user has made towards the computer-learning of his genres of preference. In one embodiment, a progress bar can be shown to the user to indicate the progress of the genre prediction. The distance between the threshold and the current best genre probability, max_i q_i, can be used as the progress indicator.
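  • As an illustration of the quiz flow described above, the following Python sketch maintains the genre probability vector Q_r, updates it from the user's pairwise picks, and stops once the best genre clears a threshold. The multiplicative update, the boost factor, the threshold and the pair-selection heuristic are illustrative assumptions; the exact update rule is not reproduced in this extract.

```python
import random

def run_genre_quiz(genres, pick_winner, boost=1.5, threshold=0.5, max_rounds=30):
    """Hot-or-not style quiz: pick_winner(a, b) returns the genre the user picked."""
    q = {g: 1.0 / len(genres) for g in genres}   # Q_r: uniform prior over genres

    for _ in range(max_rounds):
        best = max(q, key=q.get)
        if q[best] >= threshold:                 # stop criterion on the top probability
            break

        # Show an image from the current best genre against one from the rest,
        # so leading candidates keep being tested against fresh alternatives.
        challenger = random.choice([g for g in genres if g != best])
        winner = pick_winner(best, challenger)   # user's like/dislike response

        q[winner] *= boost                       # reward the picked genre
        total = sum(q.values())
        q = {g: p / total for g, p in q.items()} # renormalize the probabilities

    return max(q, key=q.get), q
```

  In practice, pick_winner would be a callback wired to the interface 110, and the gap between the threshold and max_i q_i can drive the progress bar mentioned above.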
  • FIG. 3A depicts an example of a panel that can be generated to present the visual aids 152 (FIG. 1) to the user in order to prompt the user into providing a response, under an embodiment.
  • panel 310 is presented through the interface 110 (see FIG. 1).
  • panel 310 may be formatted as a webpage.
  • the panel 310 comprises a pair of images 312, 314 that each depict clothing (as worn by a celebrity or model) of a particular genre. The user can select one image over the other to indicate his preference of a particular genre depicted by that image (as compared to the genre depicted in the other image).
  • the user's selection of one image over another is the input that indicates the user's preference of one genre over another.
  • the visual aid component 140 presents another panel comprising another pair of images (depicting clothing of different genres) to the user in order to solicit a similar selection from the user.
  • the comparison game between image pairs can continue for a number of rounds, with a user selection in each round providing information as to the user's like/dislikes of the various genres defined with system 100.
  • FIG. 3B shows a panel 330 that enables the user to select size information for various types of fashion products, such as shoes, tops, bottoms, and dresses.
  • FIG. 3C illustrates a panel 350 that enables a user to specify or indicate the user's preference for characteristics such as patterns, color, and shape.
  • the characteristics that the user can specify preferences for may be specific to a particular type of fashion product.
  • the shape preferences of the user may be presented as being specific to the category of fashion products for shoes, or more specifically women's shoes.
  • an online commerce environment (such as implemented by a system of FIG. 1) implements a recommendation engine to recommend additional clothing, apparel, or accessories. Such recommendations may be made to, for example, provide a fashion ensemble or matching set of clothing/apparel.
  • one or more embodiments provide that at least some available products for a commerce medium are programmatically analyzed in order to predict the individual product's genre and style.
  • FIG. 4 describes a method for programmatically predicting the genre or style of a product, under an embodiment.
  • Product genre prediction combines several different feature types, such as metadata features (based on textual description) and visual features (based on visual vocabularies computed from several thousand images).
  • programmatic feature extraction can utilize different forms of features (410).
  • the feature extraction includes metadata feature extraction (414) and visual feature extraction (418).
  • in metadata feature extraction, metadata features are identified and represented as a vector, where each word or word pair that appears in one of the metadata fields (such as title, description, brand, price, etc.) represents one dimension in the vector.
  • Visual features can be determined using image analysis, and represented as vectors.
  • the vector can represent one global feature computed over the whole image, or one based on visual vocabulary computed over thousands of images.
  • These visual features include color, shape, and/or texture.
  • a final feature vector can be computed by combining the metadata and visual vectors, for example, by concatenating metadata feature and visual features one after another to form a single big feature vector V.
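  • A minimal sketch of building the combined vector V, assuming a bag-of-words representation for the metadata fields and a dense visual descriptor; the vectorizer and the example visual features are placeholders, not the patent's implementation.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

def build_feature_vector(metadata_text, visual_features, vectorizer):
    """Concatenate metadata and visual features into one feature vector V."""
    # Metadata: each word or word pair from title, description, brand, etc.
    # becomes one dimension (unigrams and bigrams here).
    meta_vec = vectorizer.transform([metadata_text]).toarray().ravel()
    # Visual: e.g. color histogram, shape and texture descriptors.
    return np.concatenate([meta_vec, np.asarray(visual_features, dtype=float)])

# Fit the metadata vocabulary over the catalog, then build V per item.
vectorizer = CountVectorizer(ngram_range=(1, 2))
vectorizer.fit(["red silk blouse by ExampleBrand", "black leather ankle boots"])
V = build_feature_vector("red silk blouse by ExampleBrand", [0.2, 0.5, 0.3], vectorizer)
```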
  • a set of products are manually tagged by fashion experts with primary, secondary, and tertiary genre tags to form a ground truth set (420).
  • Machine learning algorithms are used to learn the mapping from the extracted feature vector to different genres for these products (430). For each genre, given the feature vector V, a binary classifier can be learned to determine the probability that a product belongs to that genre or not.
  • Genre prediction can then be performed for individual products that are not in the ground truth set (440). For each product, the probabilities of all genres are estimated and the top genres are selected as the genre predictions for that product.
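  • The per-genre binary classifiers can be sketched as one-vs-rest probabilistic models over V; logistic regression is used here purely for illustration, since the patent does not name a specific learner in this extract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_genre_classifiers(X, genre_tags, genres):
    """Learn one binary classifier per genre from the expert-tagged ground truth (430)."""
    classifiers = {}
    for g in genres:
        y = np.array([1 if g in tags else 0 for tags in genre_tags])
        classifiers[g] = LogisticRegression(max_iter=1000).fit(X, y)
    return classifiers

def predict_top_genres(classifiers, v, top_k=3):
    """Estimate the probability of each genre for a product and keep the top ones (440)."""
    probs = {g: clf.predict_proba(v.reshape(1, -1))[0, 1] for g, clf in classifiers.items()}
    return sorted(probs, key=probs.get, reverse=True)[:top_k]
```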
  • a multilevel classification can be performed in which secondary or tertiary genres are conditioned on the primary genre.
  • Primary genre classifiers are trained as previously stated.
  • a new set of secondary genre (g2) and tertiary genre (g3) classifiers is trained for each primary genre g1.
  • the joint probability of the primary and secondary/tertiary genres given the feature vector, P(g1, g2, g3 | V), can be computed as follows.
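  • The joint probability expression is not reproduced in this extract; a chain-rule factorization consistent with conditioning the secondary/tertiary classifiers on the primary genre would be:

$$P(g_1, g_2, g_3 \mid V) = P(g_1 \mid V)\, P(g_2, g_3 \mid g_1, V)$$

  where P(g1 | V) comes from the primary genre classifier and P(g2, g3 | g1, V) from the secondary/tertiary classifiers trained for that primary genre.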
  • product recommendations are made by (i) identifying predicted product genres of products (as described with an embodiment of FIG. 4); (ii) identifying a given user's genre or style preference for clothing and apparel (as described with an embodiment of FIG. 2); and (iii) matching products to users using (i) and (ii).
  • products can be boosted for recommendation by boosting products which match user preferences to higher ranks and de-weighting products which do not match user preferences to lower ranks.
  • products which do not match user preferences can be de-weighted as follows: (i) filter non-matching products completely from presentation to the user, or (ii) down-weight such products towards the end of the results.
  • Matching products (or recommendations) can be viewed by the user via periodic automatic emails (for example, emailed daily, twice a week, once a week, or once a month) or by logging onto a website.
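  • A small sketch of the boosting/de-weighting described in the preceding items, under the assumption that each product carries a base relevance score and a predicted genre; the boost, penalty and filtering rule are illustrative choices.

```python
def rerank_products(products, preferred_genres, boost=2.0, penalty=0.25, drop_mismatch=False):
    """Boost products matching user preferences; de-weight or filter the rest."""
    ranked = []
    for p in products:                            # p: {"name", "genre", "base_score"}
        score = p["base_score"]
        if p["genre"] in preferred_genres:
            score *= boost                        # push matches to higher ranks
        elif drop_mismatch:
            continue                              # option (i): filter non-matching products
        else:
            score *= penalty                      # option (ii): down-weight toward the end
        ranked.append((score, p))
    return [p for _, p in sorted(ranked, key=lambda t: t[0], reverse=True)]
```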
  • FIG. 5 illustrates a method for matching a product to a customer preference, according to one or more embodiments. Reference is made to components of FIG. 1 in order to describe suitable components for performing a step or sub-step being described.
  • The primary and secondary/tertiary genre combination with the highest joint probability can be selected as the genres of the product.
  • For a given user, the user's primary, secondary and tertiary genres are identified (510). For example, the results of a process such as described by FIG. 2 can be used.
  • the visual aid component 140 may present visuals 152 to prompt the user for a response.
  • a series of prompts may be solicited from the user in order to have the user specify comparative preferences of various different genres.
  • the resulting score (determined from the user's responses) is used to determine the user's fashion genre preferences.
  • a pool of products that match the user's preferences is identified from the product database 150 (520).
  • the matching products are subjected to a process of selection, filtering, or weighting, in order to identify a subset of fashion products to recommend to the user (530).
  • selection and filtering may be performed to exclude fashion products that are not available in the size of the user, or which are of a color, pattern or shape that the user has specified as being disliked.
  • the matching products may be filtered to eliminate items that have the color, brand or keywords that the user does not like.
  • the matching products may also be weighted to favor/disfavor fashion products that satisfy, for example, specified preferences of the user as to color, pattern, shape, or brand.
  • Matching products can then be presented to the user as, for example, a search or browse list (540).
  • the remaining products are then sorted by a matching score to determine the order in which they should be sent to the user.
  • the matching score can be computed as a linear combination of different individual matching scores:
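  • The linear combination itself is not reproduced in this extract; written out, it would take the form of a weighted sum over the individual matching scores described in the next item, with the weights being an assumption here:

$$S(\text{product}, \text{user}) = \sum_k w_k \, s_k(\text{product}, \text{user})$$

  where each s_k is one individual matching score and the w_k are tunable weights.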
  • the individual matching scores include the product's primary, secondary or tertiary genre probabilities, age matching score, price preferences, and other color, style or pattern preferences.
  • RESULT PRESENTATION
  • results of various processes, algorithms and system output can be provided to the user in various forms. Some embodiments include an interactive tool that the user can use in order to determine the user's fashion genre preferences.
  • FIG. 6 illustrates a result panel for communicating the programmatically determined fashion genre preferences of the user, according to an embodiment.
  • a result panel 610 can be output in response to an individual partaking in, for example, a quiz or challenge generated through the visual aid component 140.
  • result panel 610 may identify the user's primary genre (Sporty), and one or more secondary (Conservative) or tertiary genres (Modern, Boho).
  • the result panel 610 may also display fashion products that meet the user's genre/style preferences.
  • the images of fashion products may be preselected, based on the images being deemed representative of the particular genre or genre combination. Alternatively, some or all of the fashion products depicted may be selected for the user. For example, parameters such as user-specified color preferences may be used to present some items of clothing or apparel. Likewise, if a user prefers a certain style of shoes (e.g. boots, as specified by the user via an interface such as shown in FIG. 3C), footwear in the result panel 610 may be depicted by boots.
  • a certain style of shoes e.g. boots, as specified by the user via an interface such as shown in FIG. 3C
  • Embodiments described herein may incorporate enhanced feature representation of descriptive classifications for fashion products.
  • descriptive classifications can be defined by human operators (e.g. experts) to include multiple categories (or sub-classifications).
  • fashion product content items e.g. catalog or web image of clothing
  • the extracted features are then analyzed to associate the fashion product with one or more descriptive classifications (of fashion products), and with one or more categories for each associated descriptive classification.
  • FIG. 7 illustrates a method for determining descriptive classifications and categories of fashion products provided by fashion product content items, under one or more embodiments.
  • Descriptive classifications and categories (or sub-classifications) for fashion products are defined by human operators (710).
  • the descriptive classifications include (but are not limited to): genre, shape or silhouette, pattern, and color.
  • For each fashion product content item, a set of primitive visual and text features are extracted from the content item (720). These features include, for example, color histogram, shape descriptors, texture features and text description features. To determine such features, image recognition and text analysis (including textual metadata analysis) can be performed on individual content items.
  • Analysis is performed on the primitive features in order to determine the classification and categorization (or sub-classifications) of the products depicted in the content items (730).
  • the analysis can be quantitative. More specifically, in one embodiment, the analysis can be statistical. Furthermore, multiple methods can be implemented to associate a fashion product with the classification. For color classification, a set of cluster centers is created based on manually labeled ground truth. Each product (or image thereof) is assigned to the nearest cluster based on its distance in histogram space, where: f is the primitive feature vector comprehensive of visual and textual information;
  • X_i^{CFT} are components of the color family hyperdimension X^{CFT}; and
  • J is a mapping from distances to likelihoods.
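  • The assignment formula is garbled in this extract; a plausible reading consistent with the stated symbols is a nearest-cluster rule with a distance-to-likelihood mapping:

$$\hat{c} = \arg\min_i d\left(f, X_i^{CFT}\right), \qquad \ell_i = J\left(d\left(f, X_i^{CFT}\right)\right)$$

  where d is a distance in histogram space; the exact forms of d and J are not recoverable from this extract.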
  • a support vector machine classifier may be used to associate or assign the products to the classifications.
  • human operators e.g. fashion experts
  • the trained SVM is used to generate a decision value from the visual and text feature of the item.
  • α*_t are the learned SVM parameters corresponding to each tag t of each hyperdimension T ∈ {GT, ST, PT}.
  • f is the primitive feature vector of the item, while g_j are the primitive feature vectors of all other items in the training set.
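  • The decision-value expression is garbled in this extract; for a kernel SVM it would typically take the form below, with the kernel and bias term being assumptions rather than the patent's stated formula:

$$d_t^T(f) = \sum_j \alpha^{*}_{t,j}\, y_j\, K\left(f, g_j\right) + b_t$$

  where K is the kernel, the g_j and labels y_j come from the training items, and α*_{t,j}, b_t are the learned parameters for tag t of hyperdimension T ∈ {GT, ST, PT}.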
  • FIG. 8 illustrates a system that makes fashion product recommendations to users using product class/category determinations and user activity information, according to an embodiment.
  • a system such as described by an embodiment of FIG. 8 may represent a modification or variations to an embodiment described in FIG. 1, as well as elsewhere in this application.
  • functionality and components of FIG. 8 may optionally be viewed as supplementing or augmenting a system such as described with FIG. 1.
  • a system 800 may comprise the user database 120 and the product database 150.
  • the user database 120 may associate certain information with individual users, such as the user's fashion genre preferences (which may be programmatically determined) as well as parameters specified by the user (e.g. see FIG. 3B and FIG. 3C). In addition, the user database 120 may be coupled to a monitor component 810 that monitors or detects user actions concerning fashion product content items and related activity.
  • the monitor component 810 may detect activity such as one or more of the following: (i) user interaction with search results, including the user selecting or otherwise indicating interest in a particular item in the search result; (ii) user interaction with an online browsing or shopping environment.
  • Information 812 that identifies items (e.g.
  • this information 812 includes items that were displayed to the user and which the user clicked on, as well as items that were displayed to the user and not clicked on.
  • the user monitor 810 may detect session specific activity, or historical activity 814 from the user's past sessions.
  • the historical activity can extend to search terms that the user entered at, for example, a search engine or e-commerce site.
  • the user interaction may be detected through interface 810, or through the browser or browser data (e.g. browser history and cookie information).
  • the historical activity 814 includes the queries that the user typed in, the impressions (i.e. the items retrieved by the search engine and presented to the user) and the buy clicks (i.e. the items clicked by the user).
  • the set of queries is projected onto the fashion-aware feature space described above and several positive training samples are obtained.
  • the product database 150 is coupled to a product category/class determinator 820.
  • the category/class determinator 820 may analyze fashion product content items in order to determine one or more classifications/categories 822 of each product.
  • the category/class determinator 820 implements a process such as described by FIG. 7.
  • the resulting descriptive classification/categorization is stored in the product database 150.
  • a user preference profiler 830 generates a user profile 832 based on activity information 812 and/or historical information 814. The profiler 830 updates the user profile 832 for individual users.
  • In creating and updating the user profile 832, the profiler 830 (i) identifies fashion products from the user activity information 812 (e.g. products that the user selected to view when browsing or searching, and products the user elected not to view); (ii) uses the product database 150 to determine classifications and categorizations of those products (as determined by FIG. 7); and (iii) uses the descriptive classifications and categorizations of the products identified from the activity information 812 to develop the user's profile 832.
  • the user's profile 832 may augment, supplement or otherwise identify the fashion genre preferences of the user.
  • the user profile 832 may be combined with, or used as an alternative to, the programmatic fashion genre determination described by other embodiments.
  • the user profile 832 may be session-specific and robust enough to determine that the user is looking for an event-specific outfit (e.g. evening gown), which otherwise may not be in the user's preferred genre.
  • the profiler 830 may also use the historical information 814 to develop the profile 832.
  • the recommendation engine 170 is configured to recommend products 176 selected for the user based at least in part on the genre preferences identified by the user profile 832 and/or genre preferences identified via the visual aid and genre score components.
  • the recommendation engine 170 may also include historical data 814 as a component for determining its recommended product 176.
  • the recommendation engine 170 may also be used to recommend and/or retrieve and/or rerank products in response to a user query/search or a request for products of a specific type of fashion product.
  • SHORT-TERM USAGE
  • Embodiments recognize that in an online scenario, the short-term preference of the user can become important. Embodiments further recognize a need for an online algorithm that quickly learns from the user's actions and enhances the user's shopping and search experience right away. For example, when a user is shopping for a formal holiday party vs. a resort vacation, his long-term preferences about colors, patterns, brands, etc. will be of little use for improving the overall shopping experience. Hence, a system that learns about the user in real time as the user is interacting with the site can deliver more pertinent results.
  • In the online system, as the user performs queries and clicks, these actions are incorporated into a daily user profile.
  • a summary of the preferences is created via kernel density estimation and is kept to be used in the ranking.
  • the feature vectors describing the properties of item i are fetched (from a precomputed table) and efficiently aggregated in a generative model of the daily user profile by on-line update of a kernel density estimator:
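  • The estimator itself is not reproduced in this extract; a standard kernel density estimate over the clicked items' feature vectors, matching the symbols defined in the next items, would be:

$$p(x) = \frac{1}{n} \sum_{i=1}^{n} K_h\left(x - x_i\right)$$

  where the x_i are the feature vectors of the items clicked so far in the session and K_h is a kernel with bandwidth h.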
  • n is the number of clicks in the user's session, while h is the kernel bandwidth.
  • the function p can be used to score the relevancy of an item feature vector x to the current session.
  • a quadratic kernel may be used.
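  • A minimal Python sketch of the on-line update and session scoring, assuming an Epanechnikov (quadratic) kernel over Euclidean distances and a fixed bandwidth; the feature vectors and normalization are illustrative.

```python
import numpy as np

class SessionProfile:
    """Daily/session user profile kept as an on-line kernel density estimate."""

    def __init__(self, bandwidth=1.0):
        self.h = bandwidth
        self.clicks = []                          # feature vectors x_i of clicked items

    def update(self, x):
        """Aggregate one clicked item's (precomputed) feature vector into the profile."""
        self.clicks.append(np.asarray(x, dtype=float))

    def _quadratic_kernel(self, u):
        # Epanechnikov / quadratic kernel: 0.75 * (1 - u^2) for |u| <= 1, else 0.
        return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)

    def score(self, x):
        """p(x): relevancy of an item's feature vector to the current session."""
        if not self.clicks:
            return 0.0
        x = np.asarray(x, dtype=float)
        dists = np.array([np.linalg.norm(x - xi) / self.h for xi in self.clicks])
        return float(np.mean(self._quadratic_kernel(dists)))

# Usage: update on each click in the session, then score candidate items for re-ranking.
profile = SessionProfile(bandwidth=2.0)
profile.update([0.1, 0.9, 0.0])
print(profile.score([0.2, 0.8, 0.1]))
```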

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Primary Health Care (AREA)
  • Human Resources & Organizations (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

A fashion preference of a user is determined based on a user's interaction with a plurality of fashion product content items that individually depict a corresponding fashion product. A recommendation is made to a user of a fashion product based at least in part on the fashion preference of the user.

Description

SYSTEM AND METHOD FOR LEARNING USER GENRES AND STYLES AND MATCHING
PRODUCTS TO USER PREFERENCES
Background
[0001] Digital photography has become a consumer application of great significance. It has afforded individuals convenience in capturing and sharing digital images. Devices that capture digital images have become low-cost, and the ability to send pictures from one location to the other has been one of the driving forces in the drive for more network bandwidth.

[0002] Due to the relatively low cost of memory and the availability of devices and platforms from which digital images can be viewed, the average consumer maintains most digital images on computer-readable mediums, such as hard drives, CD-ROMs, and flash memory. The use of file folders is the primary means of organization, although applications have been created to aid users in organizing and viewing digital images.

[0003] On-line learning is a machine learning paradigm in which an algorithm learns from one instance or sample at a time. While off-line learning is composed of well-established techniques that have been thoroughly dissected, on-line algorithms have received considerable attention in the last decade, with applications ranging from learning complex background and appearance models to object detection and classification, and modeling and predicting user behavior. On-line learning can become the only viable solution in applications where the training data is never available in batch, but is instead gathered concurrently with the decision/classification process; hence the need to design an adaptive learning technique. On the other hand, off-line or batch paradigms need to be retrained once new/unseen data is presented.
[0004] Several on-line variants of the most popular off-line machine learning algorithms have been proposed in the literature. Some approaches have sought to address the problem of training Support Vector Machines (SVM) with large amounts of data. Since training an SVM requires solving a quadratic programming problem in a number of coefficients equivalent to the cardinality of the training set, memory requirements can become the bottleneck, and therefore an on-line alternative is necessary. One approach has sought to introduce incremental decision tree classifiers that can be updated and retrained using new, unseen data instances. Several contributions have been proposed to extend the popular AdaBoost algorithm to the online scenario, with several interesting variants ranging from Semi-Supervised Boosting to Multiple Instance Learning.

BRIEF DESCRIPTION OF THE FIGURES
[0005] FIG. 1 illustrates a system that uses visual information to identify genre and fashion style preferences of a user, according to one or more embodiments.
[0006] FIG. 2 illustrates a method for predicting a preference of a user to a particular genre, according to one or more embodiments. [0007] FIG. 3A depicts an example of a panel that can be generated to present a set of visual aids to the user in order to prompt the user into providing a response, under an embodiment.
[0008] FIG. 3B shows a panel that enables the user to select size information for various types of fashion products, such as shoes, tops, bottoms, and dresses.
[009] FIG. 3C illustrates a panel that enables a user to specify or indicate the user's preference for characteristics such as patterns, color, and shape. [0010] FIG. 4 describes a method for programmatically predicting the genre or style of a product, under an embodiment.
[0011] FIG. 5 illustrates a method for matching a product to a customer preference, according to one or more embodiments. [0012] FIG. 6 illustrates a result panel for communicating the programmatically determined fashion genre preferences of the user, according to an embodiment.
[0013] FIG. 7 illustrates a method for determining descriptive classifications and categories of fashion products provided by fashion product content items, under one or more embodiments. [0014] FIG. 8 illustrates a system that makes fashion product recommendations to users using product class/category determinations and user activity information, according to an embodiment.
DETAILED DESCRIPTION
[0015] In fashion, people generally have their own unique preferences of style, color and genre of clothing. Their preferences as to genre and style are developed through their personal experience. For example, in a physical store, customers can describe their preferences and styles to a salesperson, who can then recommend to them the right set of clothes and fashion accessories which match the customer's preferences. Embodiments recognize, however, that the same is not true for online shopping. In online shopping, a customer is forced to search and scan through many products for matching styles and preferences.
[0016] Accordingly, embodiments described herein provide a computer implemented method or system in which a user's genre preference to style or fashion can be determined programmatically.
[0017] Still further, embodiments enable programmatic classification and categorization of fashion products using image, text and metadata associated with a corresponding fashion product content item.
[0018] Still further, some embodiments enable a service or system to make programmatically determined recommendations relating to fashion products, based on information determined about the user's genre preferences and/or the determined genre or style of a fashion product represented by a content item.
[0019] More specifically, embodiments described herein include a computer-implemented method for determining user preferences for fashion products. In an embodiment, a fashion preference of a user is determined based on a user's interaction with a plurality of fashion product content items that individually depict a corresponding fashion product. A recommendation is made to a user of a fashion product based at least in part on the fashion preference of the user. [0020] According to another embodiment, a fashion product content item is analyzed to determine a set of features of a fashion product depicted in the fashion product content item. The fashion product is associated with a pre-defined descriptive category for each of a plurality of descriptive classifications, based on a quantitative analysis of the determined set of features. The product content item and its pre-defined descriptive category for each of the plurality of descriptive classifications are used to determine or predict a user preference.
[0021] In another embodiment, one or more processors (such as provided in any computing environment, such as server-client) are structured to analyze individual fashion product content items representing a catalog of fashion products to determine, for each fashion product content item, a set of features of a fashion product depicted in that fashion product content item. Each fashion product represented by one of the fashion product content items is assigned to a pre-defined descriptive category for one or more corresponding descriptive classifications. The assignment is based on a quantitative analysis of the determined set of features. One or more fashion product content items are detected which are deemed to be of interest to the user. A fashion preference of the user is determined using the pre-defined descriptive category for each of the plurality of descriptive classifications of the one or more fashion product content items that are deemed of interest to the user.
[0022] Embodiments described herein include systems and methods for (i) learning a user's or customer's preferences in clothing styles, fashion and genres, (ii) predicting genres of different clothing products and fashion accessories, and/or (iii) using (a) known shopping parameters of a user (e.g. the user's size information, price preferences, hate or love for certain styles, patterns and colors) and/or (b) predicted genres and styles for each individual user, to propose the best matching products and accessories to customers.
[0023] A fashion product includes, for example, clothing, accessories and apparel. Specific examples include blouses, shirts, dresses, shoes, socks, pants and bottoms, belts, jewelry (e.g. watches, earrings, necklaces), ties, hats, jackets and coats.
[0024] A fashion product content item corresponds to a document or file that includes visual, textual and/or metadata information about a particular product. The fashion product content items are generally available as part of an online catalog or e-commerce search engine. Typical aspects of such content items include (i) one or more images of a product, (ii) textual information about the product, including information about available sizes and variations to the product, (iii) pricing information, and/or (iv) links or data elements that facilitate the viewer of the content item in purchasing the depicted fashion product.
[0025] Some embodiments recognize that the computational complexity and latency of all these on-line learning techniques remain an open problem and can become critical in time-constrained applications such as real-time object tracking or the on-line shopping scenario described herein. In fact, a large number of high-dimensional feature vectors imposes strict limits on the number of operations allowed in order to meet the stringent time requirements.
[0026] Some embodiments described herein include computer-implemented techniques for learning user preferences from a user's interaction with an on-line interface (e.g. one provided at a shopping website). By predicting what the user likes, a better search ranking algorithm can be designed, which in turn results in a better experience perceived by the user. In terms of feature selection, embodiments combine heterogeneous cues coming from visual and text features and, in particular, provide a compact yet discriminative representation of the user's preferences that traditional features are not able to achieve. In addition, embodiments implement a learning stage which can process relatively large feature vectors in less than a few milliseconds to avoid compromising the overall user experience.
[0027] As used herein, the terms "programmatic", "programmatically" or variations thereof mean through execution of code, programming or other logic. A programmatic action may be performed with software, firmware or hardware, and generally without user intervention, albeit not necessarily automatically, as the action may be manually triggered. [0028] One or more embodiments described herein may be implemented using programmatic elements, often referred to as modules or components, although other names may be used. Such programmatic elements may include a program, a subroutine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules/components, or a module/component can be a shared element or process of other modules/components, programs or machines. A module or component may reside on one machine, such as on a client or on a server, or a module/component may be distributed amongst multiple machines, such as on multiple clients or server machines. Any system described may be implemented in whole or in part on a server, or as part of a network service. Alternatively, a system such as described herein may be implemented on a local computer or terminal, in whole or in part. In either case, implementation of a system provided for in this application may require use of memory, processors and network resources (including data ports and signal lines (optical, electrical, etc.)), unless stated otherwise. [0029] Embodiments described herein generally require the use of computers, including processing and memory resources. For example, systems described herein may be implemented on a server or network service. Such servers may connect to and be used by users over networks such as the Internet, or by a combination of networks, such as cellular networks and the Internet. Alternatively, one or more embodiments described herein may be implemented locally, in whole or in part, on computing machines such as desktops, cellular phones, personal digital assistants or laptop computers. Thus, memory, processing and network resources may all be used in connection with the establishment, use or performance of any embodiment described herein (including with the performance of any method or with the implementation of any system).
[0030] Furthermore, one or more embodiments described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium. Machines shown in figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing embodiments of the invention can be carried and/or executed. In particular, the numerous machines shown with embodiments of the invention include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on many cell phones and personal digital assistants (PDAs)), and magnetic memory. Computers, terminals, network enabled devices (e.g. mobile devices such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums.
[0031] LEARNING GENRE AND STYLE PREFERENCES OF A SHOPPER [0032] FIG. 1 illustrates a system that uses visual information to identify genre and fashion style preferences of a user, according to one or more embodiments. A system such as described in FIG. 1 presents preselected images of fashion products to individuals in an attempt to determine likes, dislikes, preferences and other user feedback for ascertaining the user's style or genre preference. In contrast to embodiments described, conventional techniques for estimating a shopper's (e.g. user or customer) style or genre preference typically involve asking the individual about genres/styles that best describe their personal preferences in style and genre. However, the conventional approach is problematic: among other reasons, words are not sufficiently precise to capture fashion preferences and statements. Additionally, users do not always know what their preferences are.
[0033] Accordingly, embodiments described herein and with FIG. 1 include a system that programmatically learns user fashion style and genre preferences using visual aids or pictures. The system 100 may be provided in a variety of computing environments, including in a client-server architecture. For example, system 100 may be implemented on one or more servers (or other computing machines) to provide a service such as described by one or more embodiments detailed herein. In this environment, system 100 may be implemented on a website, such as in an e-commerce site, search engine or shopping portal.
[0034] System 100 may rely on genre definitions that are defined by experts or operators. For example, fashion genres include (and are not limited to) 'looks' that are of the following genres: chic, street, Boho, urban/hip-hop, and conservative. For example, experts may select clothing and clothing ensembles that are representative of the various categories (the number of which is set by design or choice). In some cases such representative clothing and clothing ensembles form ground truth data, or points of comparison, in determining (i) genre preferences of the user, and (ii) predicting the genre of another item of clothing or apparel. In an embodiment, a system 100 depicts images of clothing and clothing ensembles in a worn state. For example, images of people (including celebrity images) wearing different genres of clothes and accessories can be shown to the user. The user is enabled to respond to individual images to specify whether the depicted clothing is of a style or type that is in the user's preference. Thus the system can learn from user choices made on images, rather than from text descriptions or from the user's self-reporting of preferences.
[0035] More specifically, system 100 includes a user-interface 110, a user database 120, a genre score component 130, a genre determinator 134, a visual aid component 140, and a product database 150. A user of system 100 may correspond to a shopper or a customer of fashion products. In some embodiments, system 100 is implemented on an online medium. For example, system 100 can be implemented as part of an e-commerce site, shopping portal, or other web-based or networked environment in which individuals are given the opportunity to view (and potentially purchase) fashion products. The interface 110 may correspond to, for example, a webpage, or an interactive feature provided on a webpage. [0036] The user of system 100 is associated with a profile in the user database 120. For example, the user may have an account with an operator of a service that provides system 100. Alternatively, the user may be known by cookie/computer information, by account/login, or for a solitary online session with a provider (e.g. e-commerce site) of system 100. Independent of system 100 determining style or genre preferences of the user, the user may interact with the interface 110 and provide parameters 112 relating to fashion products that the user can wear. The parameters that the user may specify include, for example, the user's clothing size, preferred price range for fashion and clothing items, and preferred brand names. The user may also volunteer information about visual characteristics of clothing and apparel that the user likes or dislikes. For example, the user may specify preferred colors for certain types of clothing, preference information about fabrics or materials, preferred styles of shoes or apparel, types of jewelry, and aversions or preferences for particular types of patterns. [0037] The product database 150 retains information from fashion product content items. As described with some embodiments, the product database 150 may store information about fashion products depicted in the product content items. Such information may be programmatically determined from image, text and metadata analysis of fashion product content items, as provided by retailers, manufacturers and other suppliers of fashion products. The information that is programmatically determined about depicted fashion products is associated in database 150 with corresponding product content items, such as electronic catalog pages and sections. [0038] In order to programmatically determine genre/style preferences of the user, the visual aid component 140 is configured to present to the user images, or visual aids, from which responses are elicited to make the genre/style preference determinations. Visual aid component 140 communicates visuals 152 of fashion products to the user via the interface 110. In one embodiment, the visuals 152 depict fashion products, or ensembles of fashion products, in a worn state (e.g. as worn by a celebrity or model, on a mannequin, or computer generated onto an image of a person). FIG. 2 and FIG. 3A illustrate examples of how the visuals 152 can be structured for presentation to the user.
[0039] The user can provide input through the interface 110 that indicates (i) the user's like or dislike of a particular fashion product or ensemble; (ii) the user's preference of one fashion product over another; and/or (iii) a rating or feedback that indicates the level of the user's like or dislike for the fashion product. The visual aid component 140 presents a set of visuals 152 that prompt the user to enter a response that indicates the user's visual preference for the fashion genre depicted by that visual. Still further, as described with an embodiment of FIG. 2 or FIG. 3A, the visuals may be presented to the user in a quiz or game fashion. In the quiz or game fashion, the user is shown panels that individually depict competing fashion products of different genres. The user can respond to each panel by indicating their preference, or like/dislike, of one fashion product over the other in the panel.
[0040] The genre score component 130 records and determines a genre score from the user's input. The genre score component 130 may record the responses the user has to the presented visuals 152, in order to score individual classifications of fashion genre. Optionally, the genre score component 130, in combination with the visual aid component 140, can record and score the user's responses to subcategories of fashion genre. [0041] Numerous techniques may be employed to ascertain the fashion genre preferences of the user. In one embodiment, the set of visuals 152 is predesigned to depict a number of images of fashion products for each identified genre. The user simply responds with preference or like/dislike input when viewing images of the fashion products in order to indicate his like or preference of one fashion genre over another. The genre score component 130 maintains a genre score 133 that is indicative of the user's genre preference, for genres represented by the set of visuals 152. The genre score 133 can be recorded in the user database 120, in association with the profile of the user.
[0042] As an addition or alternative, the genre determinator 134 determines one or more preferred genres and/or subcategories (e.g. primary, secondary, and tertiary genres) of the user based at least in part on the score 133. The genre determinator 134 and/or score 133 may also influence the visuals 152 outputted for the user by the component 140, in that intelligence may be used by way of probabilistic assumptions that those users who have a certain genre preference are likely to have a particular like or dislike of another genre. For example, a user with a business genre preference may be deemed unlikely to also like street genre clothing. [0043] Additionally, one or more embodiments provide that the fashion products identified in the product database 150 are tagged with genre descriptors 151. The descriptors include programmatically determined genre descriptors, which can be determined by a product genre predictor component (PGPC) 154. In particular, PGPC 154 analyzes the product content items in order to obtain information that can be used to determine the genre(s) of the fashion product depicted in the content item. Thus, system 100 can be used to determine genre preferences of the user, as well as to predict the genre classifications and categories of fashion products. The genre descriptors 151 determined from the PGPC may include sub-genres or genre categories, including secondary and tertiary genre determinations. For example, many fashion products may share more than one genre. FIG. 4 illustrates a method for predicting genre(s) of fashion products using fashion product content items, according to some embodiments. As described, the PGPC 154 may base its determination on learning behavior, using a ground truth product set 155 provided by operators of system 100. [0044] In an embodiment, system 100 also includes a product recommendation engine 170. According to one or more embodiments, product recommendation engine 170 recommends a fashion product to the user, based on (i) user information that identifies genre/style preferences and parameters for fashion products that the user may purchase, and (ii) fashion product information. User information 172 is provided by user database 120. In particular, user information 172 is provided by genre preferences as output by the genre score component 130 (e.g. the genre score 133) and/or genre preference information 137. The fashion product information 174 is retrieved from the product database 150. The fashion product information 174 includes programmatically predicted genre classifications and/or subcategories, associated with individual products. The fashion product information 174 may also include information retrieved from the fashion product content item, as well as tags (e.g. metadata) provided by a supplier of the fashion product content item or the underlying fashion product. With user information and fashion product information, the recommendation engine 170 is able to recommend individual fashion products from, for example, products identified in the product database 150. The recommended products 176 may be communicated to the user via the interface 110.
[0045] Additionally, embodiments provide that system 100 is able to show its confidence in predicting user genres and styles. As will be described, system 100 includes an interface in which users are able to also record known parameters, such as the user's clothing size, price preference, and/or their like/dislike for certain styles, patterns and colors. This information is used while matching products to user preferences. The overall system allows for multiple hierarchies of genre prediction: a primary or top-level genre predicting broad genre or style matches, a secondary or second-level genre predicting multiple fine-grained genres and styles, a tertiary or third-level genre predicting multiple domain-specific styles, and so on.
[0046] FIG. 2 illustrates a method for predicting a preference of a user to a particular genre, according to one or more embodiments. More specifically, a method such as described determines, for a particular user, the user's primary, secondary and tertiary genres of preference. A method such as described may be implemented using a system such as described with FIG. 1. Accordingly, reference may be made to elements and numerals of FIG. 1 in order to describe suitable elements and components for performing a step or sub-step being described.
[0047] A set of images is shown to a user (210). In one embodiment, visual aid generator 140 selects and displays individual images of the set to the user via user-interface 110. The set of images can be pre-selected to be from a diverse range of genres. In one embodiment, some or all of the genres are determined using manual definitions and selections. Thus, the set of images may be sorted into different genres using manual input to classify each image in a particular genre. Alternatively, some or all of the images in the set are programmatically determined to be associated with a genre. For example, programmatic methods may be used to identify similarity between items of clothing, and the similarity comparisons may be used to associate clothing with a particular genre.
[0048] In response to being shown each image individually, the user is prompted to respond by providing an input (via interface 110) that indicates whether the user liked or disliked the image. The user's responses are recorded (220). In one implementation, the input is prompted from the user as part of a game in which the user can participate with input that states whether the user considered an individual image from the set as hot-or-not ("Hot-or-not game").
[0049] Based on user input, the user's genre preference is determined (230). With reference to an embodiment of FIG. 1, genre determinator 134 determines a user's preference to genre. In an embodiment, the genre determinator 134 uses an algorithm to determine the user's genre preferences (e.g. primary, secondary and tertiary). In one embodiment, an algorithm is used as follows:
[0050] Denote the whole set of genres as S = {S_i, i = 1 ... n}, where n is the number of genres. Assume that each user has a predetermined set of favorable genres, denoted as F = {F_j, j = 1 ... m}. Also assume that a set of images, denoted as T_i, is provided for each genre S_i. Now assume that when the user is shown a pair of images (I_1, I_2) from different image sets T_i and T_j: (a) if only one of them belongs to F (without loss of generality, assume it to be I_1), then the user has a higher probability p > 0.5 of picking I_1; (b) if both images belong to F, or neither image belongs to F, the user picks either image randomly with a probability of 0.5.
[0051] The algorithm maintains a vector of probability estimates, denoted as Q_r = [q_r1, q_r2, ..., q_rn], of the user belonging to each genre after each round r. After each style question in a round, the algorithm updates the user's genre probabilities. The update can be performed as follows: of the two genres that are presented to the user, the one picked by the user is updated using
q_r+1 = q_r * p (1)
[0052] The one that is not picked by the user is updated using
q_r+1 = q_r * (1 - p) (2)
[0053] All the rest of the genres are updated using
q_r+1 = q_r * 0.5 (3)
[0054] After the update, Q_r+1 is normalized so that the sum of all the probabilities equals 1.
[0055] Based on the current genre probabilities, do one of the following:
1. If one of the genre probabilities is above a certain threshold, then the algorithm terminates the test and returns the best genre to the user.
2. If none of the genre probabilities are above the threshold, then the algorithm picks two genre images to be shown to the user in the next round.
[0056] Different strategies can be used to choose the two images for the next round of the test, such as: (i) Randomly pick two genres; and/or (ii) Pick the top two genres that have the highest probabilities (this strategy helps the probability converge to the correct guess faster and minimizes the number of image pairs shown to the user).
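By way of illustration, the following is a minimal Python sketch of the probability-update loop described above. It assumes the multiplicative updates of equations (1)-(3), their generalization to k images described below, and the renormalization step; the ask_user callback, the value of p and the stopping threshold are illustrative placeholders rather than parameters prescribed by this description.

```python
def update_genre_probabilities(q, shown, picked, p=0.8):
    """One round of the quiz: the picked genre is multiplied by p, the other
    shown genres by (1 - p) / (k - 1), genres not shown by 1 / k, and the
    vector is then renormalized (assumed update rule)."""
    k = len(shown)
    updated = {}
    for genre, prob in q.items():
        if genre == picked:
            updated[genre] = prob * p
        elif genre in shown:
            updated[genre] = prob * (1.0 - p) / (k - 1)
        else:
            updated[genre] = prob * (1.0 / k)
    total = sum(updated.values())
    return {genre: prob / total for genre, prob in updated.items()}

def run_quiz(genres, ask_user, threshold=0.6, max_rounds=20):
    """ask_user(genre_a, genre_b) is a hypothetical callback that shows one
    image per genre and returns the genre whose image the user picked."""
    q = {genre: 1.0 / len(genres) for genre in genres}
    for _ in range(max_rounds):
        best = max(q, key=q.get)
        if q[best] > threshold:
            break  # confident enough: stop and report the best genre
        # Strategy: show the two genres that currently have the highest probability.
        first, second = sorted(q, key=q.get, reverse=True)[:2]
        picked = ask_user(first, second)
        q = update_genre_probabilities(q, (first, second), picked)
    return max(q, key=q.get), q
```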
[0057] The algorithm can be generalized to present k (k > 2) images to the user. In this case the update equations for (2) and (3) become, respectively: q_r+1 = q_r * (1 - p) / (k - 1) (4)
q_r+1 = q_r * (1 / k) (5)
[0058] The algorithm can also be generalized to determine t (t > 1) genres. In this case, the stopping criterion can be modified to check the top t probabilities. The strategy to select the next set of images should pick images from both the top t genres and the rest of the genres. [0059] According to an embodiment, the user's responses indicating likes or dislikes are used to determine the primary, the secondary and the tertiary genres of preference for the user (240). The primary, the secondary and the tertiary genres of preference can be determined at the same time. One way to implement this is to sequentially predict the primary, secondary and tertiary genres. However, to minimize the number of images shown to the user (and hence reduce the amount of user response required), an approximation algorithm can be used. If all the images used for primary genre prediction are also tagged with secondary and tertiary genres, then the images that the user selected during the primary genre prediction can be used to build multiple histograms: one for secondary genres and multiple (one per domain) for tertiary genres. The top genres in these histograms can be used to predict the secondary and tertiary genres (see the sketch below). [0060] To offer a good user experience, some embodiments provide progress feedback to indicate the amount of progress the user has made towards the computer-learning of his genres of preference. In one embodiment, a progress bar can be shown to the user to indicate the progress of the genre prediction. The distance between the threshold and the current best genre probability, max_i q_ri, can be used as the progress indicator.
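As a hedged illustration of the histogram-based approximation mentioned above, the following Python sketch tallies the secondary and tertiary tags attached to the images the user picked during primary-genre prediction; the 'secondary' and 'tertiary' field names and data layout are hypothetical.

```python
from collections import Counter

def predict_secondary_and_tertiary(selected_images, top_n=2):
    """selected_images: the images the user picked during the primary quiz,
    each assumed to carry secondary tags (a list) and tertiary tags
    (a dict mapping domain -> list of tags)."""
    secondary = Counter()
    tertiary_by_domain = {}
    for image in selected_images:
        secondary.update(image.get("secondary", []))
        for domain, tags in image.get("tertiary", {}).items():
            tertiary_by_domain.setdefault(domain, Counter()).update(tags)
    top_secondary = [genre for genre, _ in secondary.most_common(top_n)]
    top_tertiary = {domain: [genre for genre, _ in counts.most_common(top_n)]
                    for domain, counts in tertiary_by_domain.items()}
    return top_secondary, top_tertiary
```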
[0061] FIG. 3A depicts an example of a panel that can be generated to present the visual aids 152 (FIG. 1) to the user in order to prompt the user into providing a response, under an embodiment. In FIG. 3A, panel 310 is presented through the interface 110 (see FIG. 1). Thus, for example, panel 310 may be formatted as a webpage. The panel 310 comprises a pair of images 312, 314 that each depict clothing (as worn by a celebrity or model) of a particular genre. The user can select one image over the other to indicate his preference for the particular genre depicted by that image (as compared to the genre depicted in the other image). Thus, the user's selection of one image over another is the input that indicates the user's preference of one genre over another. Once the user makes a selection, the visual aid component 140 presents another panel comprising another pair of images (depicting clothing of different genres) to the user in order to solicit a similar selection from the user. According to an embodiment, the comparison game between image pairs can continue for a number of rounds, with a user selection in each round providing information as to the user's likes/dislikes of the various genres defined with system 100. [0062] FIG. 3B shows a panel 330 that enables the user to select size information for various types of fashion products, such as shoes, tops, bottoms, and dresses. Parameters such as size can be used to make fashion product recommendations, filter recommendations to the user based on lack of availability of a given size, or skew the user's genre preference to accommodate a specific size or body type of the individual. [0063] In addition to recording user feedback of genre selection (via competing images of clothing), some embodiments provide that the user is able to enhance or augment the genre determination with input that specifies certain preferences of the user. FIG. 3C illustrates a panel 350 that enables a user to specify or indicate the user's preferences for characteristics such as pattern, color, and shape. The characteristics that the user can specify preferences for may be specific to a particular type of fashion product. For example, the shape preferences that the user can specify may be presented as being specific to the category of fashion products for shoes, or more specifically women's shoes. [0064] ASSOCIATED AND ENSEMBLE RECOMMENDATIONS FOR CLOTHING, APPAREL AND ACCESSORIES
[0065] According to embodiments, an online commerce environment (such as implemented by a system of FIG. 1) implements a recommendation engine to recommend additional clothing, apparel, or accessories. Such recommendations may be made to, for example, provide a fashion ensemble or matching set of clothing/apparel.
[0066] In order to facilitate recommendation of clothing/apparel or accessories, one or more embodiments provide that at least some available products for a commerce medium are programmatically analyzed in order to predict the individual product's genre and style. FIG. 4 describes a method for programmatically predicting the genre or style of a product, under an embodiment.
[0067] Product genre prediction combines several different feature types, such as metadata features (based on textual description) and visual features (based on visual vocabularies computed from several thousand images).
[0068] For individual products in a catalog, programmatic feature extraction can utilize different forms of features (410). The feature extraction includes metadata extraction (414) and visual feature extraction (418). In metadata feature extraction, metadata features are identified and represented as a vector, where each word or word pair that appears in one of the metadata fields (such as title, description, brand, prices, etc.) represents one dimension in the vector. Visual features can be determined using image analysis, and represented as vectors. Here, a vector can represent one global feature computed over the whole image, or one based on a visual vocabulary computed over thousands of images. These visual features include color, shape, and/or texture. A final feature vector can be computed by combining the metadata and visual vectors, for example, by concatenating the metadata features and visual features one after another to form a single feature vector V (see the sketch below). [0069] A set of products is manually tagged by fashion experts with primary, secondary, and tertiary genre tags to form a ground truth set (420).
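A minimal sketch of this kind of feature combination, assuming scikit-learn and NumPy are available; the metadata strings, the 64-dimensional visual descriptor and the random placeholder values are purely illustrative.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

# Toy metadata for two catalog items (fields and values are illustrative).
metadata = [
    "title: silk blouse description: floral print brand: ExampleBrand",
    "title: leather boot description: black ankle boot brand: OtherBrand",
]

# Word and word-pair (bigram) counts over the metadata fields.
text_features = CountVectorizer(ngram_range=(1, 2)).fit_transform(metadata).toarray()

# Stand-ins for visual descriptors (e.g. a color histogram plus shape and
# texture statistics that an image-analysis pipeline would compute).
visual_features = np.random.rand(2, 64)

# Final feature vector V for each product: metadata and visual parts concatenated.
V = np.hstack([text_features, visual_features])
```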
[0070] Machine learning algorithms (Support Vector Machines, boosting, or Bayesian learning) are used to learn the mapping from the extracted feature vector to different genres for these products (430). For each genre, given the feature vector V, a binary classifier can be learned to determine the probability that a product belongs to that genre or not. [0071] Genre prediction can then be performed for individual products that are not in the ground truth set (440). For each product, the probabilities of all genres are estimated and the top genres are selected as the genre predictions for that product.
[0072] To estimate the primary (444), secondary (446) and tertiary (448) genres for a product, a multilevel classification can be performed in which secondary or tertiary genres are conditioned on the primary genre. Primary genre classifiers are trained as previously stated. Then, given the primary genre g1 of a product, a new set of secondary g2 and tertiary g3 genre classifiers is trained for each primary genre g1. During testing, the joint probability of primary and secondary/tertiary genres given the feature vector, P(g1 g2 g3 | V), can be computed as
P(g1 g2 g3 | V) = P(g1 | V) * P(g2 | g1, V) * P(g3 | g1, V) (6)
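A hedged Python sketch of this multilevel prediction, assuming one binary classifier per genre with a scikit-learn-style predict_proba interface; the dictionary structure of the classifier collections is an assumption made for illustration. The combination with the highest joint probability is retained as the product's genres, as discussed in the product matching section below.

```python
def predict_product_genres(V, primary_clfs, secondary_clfs, tertiary_clfs):
    """primary_clfs maps each primary genre g1 to its binary classifier;
    secondary_clfs[g1] and tertiary_clfs[g1] map secondary/tertiary genres,
    trained per primary genre, to their classifiers (structure assumed)."""
    best_combo, best_joint = None, 0.0
    for g1, clf1 in primary_clfs.items():
        p1 = clf1.predict_proba([V])[0][1]          # P(g1 | V)
        for g2, clf2 in secondary_clfs[g1].items():
            p2 = clf2.predict_proba([V])[0][1]      # P(g2 | g1, V)
            for g3, clf3 in tertiary_clfs[g1].items():
                p3 = clf3.predict_proba([V])[0][1]  # P(g3 | g1, V)
                joint = p1 * p2 * p3                # equation (6)
                if joint > best_joint:
                    best_combo, best_joint = (g1, g2, g3), joint
    return best_combo, best_joint
```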
[0073] PRODUCT MATCHING
[0074] According to embodiments, product recommendations are made by (i) identifying predicted product genres of products (as described with
FIG. 4), (ii) identifying a given user's genre or style preference for clothing and apparel (as described with an embodiment of FIG. 2); and (iii) matching product to user using (i) and (ii).
[0075] In some embodiments, products can be boosted for recommendation by promoting products which match user preferences to higher ranks and de-weighting products which do not match user preferences to lower ranks. [0076] As described herein, products which do not match user preferences can be de-weighted as follows: (i) filter non-matching products completely from presentation to the user, or (ii) down-weight such products toward the end of the results. Matching products (or recommendations) can be viewed by the user via periodic automatic emails (for example, emailed daily, twice a week, once a week, or once a month) or by logging onto a website. Also, depending on how often a product has been shown to the user and how often the user has clicked on it, the system keeps learning the user's overall genre and domain-specific genre preferences. [0077] In more detail, FIG. 5 illustrates a method for matching a product to a customer preference, according to one or more embodiments. Reference is made to components of FIG. 1 in order to describe suitable components for performing a step or sub-step being described. [0078] The primary and secondary/tertiary genre combination with the highest joint probability can be selected as the genres of the product. [0079] For a given user, the user's primary, secondary and tertiary genres are identified (510). For example, the results of a process such as described by FIG. 2 may be analyzed or retrieved to determine the user's preferred genres. The visual aid component 140 may present visuals 152 to prompt the user for a response. A series of prompts may be solicited from the user in order to have the user specify comparative preferences among various different genres. The resulting score (determined from the user's responses) is used to determine the user's fashion genre preferences. [0080] Once a user's primary, secondary and tertiary genres of preference are identified, a pool of products is identified from the product database 150 that match the user's preferences (520). [0081] In one embodiment, the matching products are subjected to a process of selection, filtering, and weighting, in order to identify a subset of fashion products to recommend to the user (530). For example, selection and filtering may be performed to exclude fashion products that are not available in the size of the user, or which are of a color, pattern or shape that the user has specified as being disliked. As another example, the matching products may be filtered to eliminate items that have a color, brand or keywords that the user does not like. The matching products may also be weighted to favor/disfavor fashion products that satisfy, for example, specified preferences of the user as to color, pattern, shape, or brand.
Matching products can then be presented to the user as, for example, a search or browse list (540). In one embodiment, the remaining products are then sorted by a matching score to determine the order in which they should be sent to the user.
[0082] According to an embodiment, the matching score can be computed as a linear combination of different individual matching scores:
s = w1 * a1 + w2 * a2 + ..., where w_i > 0
[0083] The individual matching scores include the product's primary, secondary or tertiary genre probabilities, an age matching score, price preferences, and other color, style or pattern preferences.
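A minimal sketch of this weighted combination and of the filtering and sorting steps of FIG. 5, in Python; the field names of the product and user records, the component scores and the weights are illustrative assumptions, not values specified by this description.

```python
def matching_score(product, user, weights):
    """Linear combination s = w1*a1 + w2*a2 + ... of individual scores (w_i > 0)."""
    components = {
        "primary_genre": product["genre_probs"].get(user["primary_genre"], 0.0),
        "secondary_genre": product["genre_probs"].get(user["secondary_genre"], 0.0),
        "price": 1.0 if product["price"] <= user["max_price"] else 0.0,
        "color": 1.0 if product["color"] in user["liked_colors"] else 0.0,
    }
    return sum(weights[name] * value for name, value in components.items())

def rank_products(products, user, weights):
    # Filter out items not offered in the user's size or in a disliked color,
    # then sort the remaining candidates by descending matching score.
    candidates = [p for p in products
                  if user["size"] in p["sizes"]
                  and p["color"] not in user["disliked_colors"]]
    return sorted(candidates,
                  key=lambda p: matching_score(p, user, weights),
                  reverse=True)
```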
[0084] RESULT PRESENTATION
[0085] While the results of various processes, algorithms and system output can be provided to the user in various forms, some embodiments include an interactive tool that the user can use in order to determine the user's fashion genre preferences. FIG. 6 illustrates a result panel for communicating the programmatically determined fashion genre preferences of the user, according to an embodiment. A result panel 610 can be output in response to an individual partaking in, for example, a quiz or challenge generated through the visual aid component 140. Through processes such as described by various embodiments, result panel 610 may identify the user's primary genre (Sporty), and one or more secondary (Conservative) or tertiary genres (Modern, Boho). The result panel 610 may also display fashion products that meet the user's genre/style preferences. The images of fashion products may be preselected, based on the images being deemed representative of the particular genre or genre combination. Alternatively, some or all of the fashion products depicted may be selected for the user. For example, parameters such as user-specified color preferences may be used to present some items of clothing or apparel. Likewise, if a user prefers a certain style of shoes (e.g. boots, as specified by the user via an interface such as shown in FIG. 3C), footwear in the result panel 610 may be depicted by boots.
[0086] ENHANCED FEATURE REPRESENTATION
[0087] Embodiments described herein may incorporate enhanced feature representation of descriptive classifications for fashion products. In particular, descriptive classifications can be defined by human operators (e.g. experts) to include multiple categories (or sub-classifications). According to embodiments, fashion product content items (e.g. a catalog or web image of clothing) are analyzed to extract features from images, text and metadata. The extracted features are then analyzed to associate the fashion product with one or more descriptive classifications (of fashion products), and with one or more categories for each associated descriptive classification.
[0088] FIG. 7 illustrates a method for determining descriptive classifications and categories of fashion products provided by fashion product content items, under one or more embodiments. [0089] Descriptive classifications and categories (or sub-classifications) for fashion products are defined by human operators (710). In one embodiment, the descriptive classifications include (but are not limited to): genre, shape or silhouette, pattern, and color. Each such classification may be associated with a set of pre-defined example categories (e.g. particular silhouettes, patterns, or color families).
[0090] For individual fashion product content items, a set of primitive visual and text features is extracted from the content item (720). These features include, for example, a color histogram, shape descriptors, texture features and text description features. To determine such features, image recognition and text analysis (including textual metadata analysis) can be performed on individual content items.
[0091] Analysis is performed on the primitive features in order to determine the classification and categorization (or sub-classification) of the products depicted in the content items (730). The analysis can be quantitative. More specifically, in one embodiment, the analysis can be statistical. Furthermore, multiple methods can be implemented to associate a fashion product with a classification. For color classification, a set of cluster centers is created based on manually labeled ground truth. Each product (or image thereof) is assigned to the nearest cluster based on its distance in histogram space:
x_i^CFT = J(|| f - c_i ||), i = 1, ..., N_CFT
where f is the primitive feature vector, comprising visual and textual information; c_i, with i = 1, ..., N_CFT, are the cluster centroids for the color family classification; x_i^CFT are the components of the color family hyper-dimension X^CFT; and J is a mapping from distances to likelihoods.
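As a hedged illustration of this nearest-cluster assignment, the following Python sketch computes a likelihood per color-family centroid; the choice of exp(-d) for the mapping J is a placeholder assumption, since the description above does not fix a particular form.

```python
import numpy as np

def color_family_components(f, centroids, J=lambda d: np.exp(-d)):
    """f: primitive feature vector (e.g. a color histogram) of the item;
    centroids: (N_CFT, D) array of cluster centers from labeled ground truth;
    J: assumed mapping from distances to likelihoods (illustrative choice)."""
    distances = np.linalg.norm(centroids - f, axis=1)   # distance in histogram space
    x_cft = J(distances)                                # one component per color family
    nearest = int(np.argmin(distances))                 # index of the assigned cluster
    return x_cft / x_cft.sum(), nearest
```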
[0092] A support vector machine classifier (SVM) may be used to associate or assign the products to the classifications. For each classification, human operators (e.g. fashion experts) select a set of positive examples that possess the properties corresponding to the tag, and a set of negative examples that do not have those properties. As a new (unknown) item arrives, the trained SVM is used to generate a decision value from the visual and text features of the item. The decision value represents the item's distance to the separating hyperplane. Only the values on the positive side of the hyperplane are retained: x_t^T = τ( Σ_j α_j^t c_j^t K(f, f_j) )
where α_j^t and c_j^t are the learned SVM parameters corresponding to each tag t of each hyper-dimension T ∈ {GT, ST, PT}. As before, f is the primitive feature vector of the item, while the f_j are the primitive feature vectors of the other items in the training set. K is the kernel function and τ(x) = x·H(x), where H(x) is the Heaviside function.
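A minimal sketch of this per-tag SVM step using scikit-learn; clipping the decision value to the positive side follows the description above, while the RBF kernel choice and data layout are assumptions.

```python
import numpy as np
from sklearn import svm

def train_tag_classifier(positive_feats, negative_feats):
    """Train one binary SVM per tag from expert-selected positive/negative examples."""
    X = np.vstack([positive_feats, negative_feats])
    y = np.array([1] * len(positive_feats) + [-1] * len(negative_feats))
    clf = svm.SVC(kernel="rbf")          # kernel choice is an assumption
    clf.fit(X, y)
    return clf

def tag_component(clf, f):
    """Signed distance to the separating hyperplane, clipped so that only
    values on the positive side are retained (tau(x) = x * H(x))."""
    decision = clf.decision_function([f])[0]
    return max(decision, 0.0)
```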
[0093] FIG. 8 illustrates a system that makes fashion product recommendations to users using product class/category determinations and user activity information, according to an embodiment. A system such as described by an embodiment of FIG. 8 may represent a modification of, or variation to, an embodiment described in FIG. 1, as well as elsewhere in this application. Thus, functionality and components of FIG. 8 may optionally be viewed as supplementing or augmenting a system such as described with FIG. 1.
[0094] A system 800 may comprise the user database 120 and the product database 150. As described with other embodiments, the user database 120 may associate certain information with individual users, such as the user's fashion genre preferences (which may be programmatically determined) as well as parameters specified by the user (e.g. see FIG. 3B and FIG. 3C). In addition, the user database 120 may be coupled to a monitor component 810 that monitors or detects user actions about fashion product content items and related activity. The monitor component 810 may detect activity such as one or more of the following: (i) user interaction with search results, including the user selecting or otherwise indicating interest in a particular item in the search result; (ii) user interaction with an online browsing or shopping environment. Information 812 that identifies items (e.g. products) of interest can be stored in the database 120. In one embodiment, this information 812 includes items that were displayed to the user and which the user clicked on, as well as items that were displayed to the user and not clicked on. [0095] The user monitor 810 may detect session-specific activity, or historical activity 814 from the user's past sessions. The historical activity can extend to search terms that the user entered at, for example, a search engine or e-commerce site. The user interaction may be detected through interface 810, or through the browser or browser data (e.g. browser history and cookie information). In some embodiments, the historical activity 814 includes the queries that the user typed in, the impressions (i.e. the items retrieved by the search engine and presented to the user) and the buy clicks (i.e. the items clicked by the user). The set of queries is projected onto the fashion-aware feature space described above, and several positive training samples are obtained.
[0096] In an embodiment, the product database 150 is coupled to a product category/class determinator 820. The category/class determinator 820 may analyze fashion product content items in order to determine one or more classifications/categories 822 of each product. In an embodiment, the category/class determinator 820 implements a process such as described by FIG. 7. In an embodiment, the resulting descriptive classification/categorization is stored in the product database 150. [0097] According to one or more embodiments, a user preference profiler 830 generates a user profile 832 based on activity information 812 and/or historical information 814. The profiler 830 updates the user profile 832 for individual users. In creating and updating the user profile 832, the profiler 830 (i) identifies fashion products from the user activity information 812 (e.g. products that the user selected to view when browsing or searching, and products the user elected not to view); (ii) uses the product database 150 to determine classifications and categorizations of those products (as determined by FIG. 7); and (iii) uses the descriptive classifications and categorizations of the products identified from the activity information 812 to develop the user's profile 832. The user's profile 832 may augment, supplement or otherwise identify the fashion genre preferences of the user. Thus, the user profile 832 may be combined with, or be used as an alternative to, the programmatic fashion genre determination described by other embodiments. For example, the user profile 832 may be session-specific and robust enough to determine that the user is looking for an event-specific outfit (e.g. an evening gown), which otherwise may not be in the preferred genre of the user. The profiler 830 may also use the historical information 814 to develop the profile 832.
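The profiling step can be pictured with a short Python sketch that tallies the programmatically assigned categories of clicked versus skipped items; the structure of the product records and the simple click-minus-skip scoring are illustrative assumptions.

```python
from collections import Counter

def build_user_profile(clicked_ids, skipped_ids, product_db):
    """product_db[item_id] is assumed to map an item to its assigned categories,
    e.g. {"genre": "boho", "pattern": "floral", "color_family": "red"}."""
    liked, disliked = Counter(), Counter()
    for item_id in clicked_ids:
        liked.update(product_db[item_id].values())
    for item_id in skipped_ids:
        disliked.update(product_db[item_id].values())
    # Net preference per category value: clicks count in favor, skipped impressions against.
    categories = set(liked) | set(disliked)
    return {category: liked[category] - disliked[category] for category in categories}
```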
[0098] In one embodiment, the recommendation engine 170 is configured to recommend products 176 selected for the user based at least in part on the genre preferences as identified by the user profile 832 and/or genre preferences identified via the visual aid/score components. The recommendation engine 170 may also include historical data 814 as a component for determining its recommended products 176. The recommendation engine 170 may also be used to recommend and/or retrieve and/or re-rank products in response to a user query/search or a request for products of a specific type of fashion product. [0099] SHORT-TERM USAGE
[0100] Embodiments recognize that in an online scenario, the short-term preference of the user can become important. Embodiments further recognize a need for an online algorithm that quickly learns from the user's actions, and enhances the user's shopping and search experience right away. For example, when a user is shopping for a formal holiday party vs. a resort vacation, his long-term preferences about colors, patterns, brands, etc. will be of little use for improving the overall shopping experience. Hence a system that learns about the user in real time, as the user is interacting with the site, can deliver more pertinent results.
[0101] In one embodiment, in the online system, as the user performs queries and clicks, these are incorporated into a daily user profile. A summary of the preferences is created via kernel density estimation and is kept to be used in the ranking. As the user enters queries and clicks on item i, the feature vectors describing the properties of item i are fetched (from a precomputed table) and efficiently aggregated in a generative model of the daily user profile by on-line update of a kernel density estimator:
p(x) = (1/n) * Σ_{i=1...n} K_h(x - x_i)
[0102] where n is the number of clicks in the user's session, while h is the kernel bandwidth. The function p can be used to score the relevancy of an item feature vector x to the current session. A quadratic kernel may be used. After the user enters the query, all the items relevant to the query (visual and text based relevancy) are fetched from the item database along with the corresponding absolute rankings. The scores for each retrieved item are then computed according to the off-line and on-line models described above.
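A minimal Python sketch of this on-line session profile, using a quadratic (Epanechnikov) kernel as suggested above; the bandwidth value and class structure are illustrative assumptions.

```python
import numpy as np

def quadratic_kernel(u):
    """Epanechnikov (quadratic) kernel, zero outside the unit ball."""
    squared = np.sum(u * u, axis=-1)
    return np.where(squared < 1.0, 0.75 * (1.0 - squared), 0.0)

class SessionProfile:
    """Kernel density estimate over the feature vectors of items clicked
    during the current session (a sketch; names are illustrative)."""
    def __init__(self, bandwidth=1.0):
        self.h = bandwidth
        self.clicks = []                     # feature vectors of clicked items

    def add_click(self, x_i):
        self.clicks.append(np.asarray(x_i, dtype=float))

    def relevancy(self, x):
        """Estimate p(x): the item's relevancy to the current session."""
        if not self.clicks:
            return 0.0
        diffs = (np.asarray(x, dtype=float) - np.stack(self.clicks)) / self.h
        return float(np.mean(quadratic_kernel(diffs)))
```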
[0103] CONCLUSION
[0104] Although numerous embodiments are described herein in terms of fashion products, alternative embodiments may extend to different types of products. In particular, embodiments may extend to other products that are generally classified by personal taste and appearance, such as furniture, carpets (and drapes), and design exteriors. [0105] CONCLUSION
[0106] Although illustrative embodiments have been described in detail herein with reference to the accompanying drawings, it is to be understood that the embodiments described are not limited to specific examples recited. As such, many modifications and variations are possible, including the matching of features described with one embodiment to another embodiment that makes no reference to such feature. Moreover, a particular feature described either individually or as part of an embodiment can be combined with other individually described features, or parts of other embodiments, even if the other features and embodiments make no mention of the particular feature.

Claims

What is claimed is:
1. A computer-implemented method for determining user preferences for fashion products, the method comprising:
using one or more processors to perform steps comprising: programmatically determining a fashion preference of a user based on a user's interaction with a plurality of fashion product content items that individually depict a corresponding fashion product; making a recommendation to a user of a fashion product based at least in part on the fashion preference of the user.
2. The computer-implemented method of claim 1, further comprising individually displaying the plurality of fashion product content items to the user, and prompting the user for a response that indicates a like or dislike of the plurality of fashion product content items.
3. The computer-implemented method of claim 1, wherein programmatically determining the fashion preference includes: identifying a set of images that individually depict one or more fashion items; displaying a sequence comprising a plurality of panels, in which each panel includes at least two images from the set to the user; for each panel, recording a response from the user that indicates which of the at least two images in that panel the user most likes or most dislikes.
4. The computer-implemented method of claim 3, wherein displaying the sequence includes creating each panel so that each fashion product content item of the individual panels displays a corresponding fashion product that is of a corresponding genre that is different than the fashion product of the other fashion product content item of the panel.
5. The computer-implemented method of claim 4, wherein the fashion product content items of each panel are determined to belong to the corresponding genre by manual input.
6. The computer-implemented method of claim 1, further comprising prompting the user to provide input that specifies one or more known parameters about the user's fashion preference.
7. The computer-implemented method of claim 1, wherein the one or more known parameters include a size or a price preference of the user.
8. The computer-implemented method of claim 1, wherein making the recommendation to a user of the fashion product includes making the recommendation of one or more fashion products based on the determined fashion preference and known parameters of the user.
9. A computer-implemented method for using programmatic descriptors for fashion products, the method comprising:
using one or more processors to perform steps comprising: analyzing a fashion product content item to determine a set of features of a fashion product depicted in the fashion product content item; programmatically associating the fashion product to a pre-defined descriptive category for each of a plurality of descriptive classifications, based on a quantitative analysis of the determined set of features; using the product content item and its pre-defined descriptive category for each of the plurality of descriptive classifications to determine or predict a user preference.
10. The method of claim 9, wherein the plurality of descriptive classifications include one or more of a genre class, a pattern class, a shape class, or a color family class.
11. The method of claim 9, wherein analyzing a fashion product content item includes performing image analysis on an image portion of the fashion product content item.
12. The method of claim 9, wherein programmatically associating the fashion product to the pre-defined descriptive category includes determining a probability that the fashion product has a visual characteristic of each predefined category of one or more of the descriptive classifications.
13. The method of claim 9, wherein using the fashion product content item and its pre-defined descriptive category for each of the plurality of descriptive classifications includes detecting user selection or interaction with the fashion product content item, and using the pre-defined descriptive category of each of the descriptive classification in order to determine the user preference.
14. The method of claim 13, wherein detecting user selection or interaction with the fashion product content item includes monitoring which fashion product content items the user selects to view in order to determine a profile for that user based on the pre-defined descriptive category of the individual descriptive classifications for each product that the user viewed.
15. The method of claim 9, wherein using the fashion product content item and its pre-defined descriptive category for each of the plurality of descriptive classifications includes identifying a fashion genre or style preference of a user, and recommending, or not recommending, the fashion product based on the pre-defined descriptive categories associated with the fashion product content item and the fashion genre or style preference of the user.
16. The method of claim 9, further comprising: recording historical information pertaining to a user's online activity about fashion products; and determining the user's genre preferences for fashion products based in part on the historical information.
17. A computer-implemented method for determining user preferences for fashion products, the method comprising:
using one or more processors to perform steps comprising: analyzing individual fashion product content items representing a catalog of fashion products to determine, for each fashion product content item, a set of features of a fashion product depicted in that fashion product content item; programmatically associating each fashion product represented by one of the fashion product content items to a pre-defined descriptive category for each of a plurality of descriptive classifications, based on a quantitative analysis of the determined set of features; detecting one or more fashion product content items that are deemed to be of interest to the user; determining a fashion preference of the user using the pre-defined descriptive category for each of the plurality of descriptive classifications of the one or more fashion product content items that are deemed of interest to the user.
18. The method of claim 7, wherein determining the preference of the user includes using historical information that includes search terms previously used by the user.
PCT/US2010/037139 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences WO2010141637A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020127000140A KR20120085707A (en) 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences
JP2012514104A JP2012529122A (en) 2009-06-03 2010-06-02 System and method for learning user genre and style and matching products to user preferences
CA2764056A CA2764056A1 (en) 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences
EP10784043.1A EP2438509A4 (en) 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences
AU2010256641A AU2010256641A1 (en) 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US18396509P 2009-06-03 2009-06-03
US61/183,965 2009-06-03
US39679010P 2010-06-01 2010-06-01
US61/396,790 2010-06-01

Publications (1)

Publication Number Publication Date
WO2010141637A1 true WO2010141637A1 (en) 2010-12-09

Family

ID=43298124

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/037139 WO2010141637A1 (en) 2009-06-03 2010-06-02 System and method for learning user genres and styles and matching products to user preferences

Country Status (6)

Country Link
EP (1) EP2438509A4 (en)
JP (1) JP2012529122A (en)
KR (1) KR20120085707A (en)
AU (1) AU2010256641A1 (en)
CA (1) CA2764056A1 (en)
WO (1) WO2010141637A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110307294A1 (en) * 2010-06-10 2011-12-15 International Business Machines Corporation Dynamic generation of products for online recommendation
US20120197891A1 (en) * 2011-01-27 2012-08-02 Electronic Entertainment Design And Research Genre discovery engines
WO2013086022A1 (en) * 2011-12-05 2013-06-13 Houzz, Inc. Consistent presentation of content and passive relevance determination of content relationship in an on-line commerce system
US9697232B2 (en) 2015-03-19 2017-07-04 International Business Machines Corporation System and method for creating a preference profile from shared images
WO2018052906A1 (en) * 2016-09-13 2018-03-22 Sophistio, Inc. Automatic wearable item classification systems and methods based upon normalized depictions
IT201600132446A1 (en) * 2016-12-29 2018-06-29 Else Corp S R L Learning-based system and recommendation method
WO2020018489A1 (en) * 2018-07-16 2020-01-23 Wantable, Inc. System and method determining individual style preference and delivering said style preferences
IT201800007812A1 (en) * 2018-08-03 2020-02-03 Else Corp Srl A 3D visual search and AI-based recommendation system
US10635952B2 (en) 2018-04-11 2020-04-28 International Business Machines Corporation Cognitive analysis and classification of apparel images
US10726474B2 (en) 2015-12-23 2020-07-28 Alibaba Group Holding Limited Displaying an online product on a product shelf
US10755229B2 (en) 2018-04-11 2020-08-25 International Business Machines Corporation Cognitive fashion-ability score driven fashion merchandising acquisition
US10904346B2 (en) 2018-12-03 2021-01-26 International Business Machines Corporation Weighted digital image object tagging
US10956928B2 (en) 2018-05-17 2021-03-23 International Business Machines Corporation Cognitive fashion product advertisement system and method
US10963744B2 (en) 2018-06-27 2021-03-30 International Business Machines Corporation Cognitive automated and interactive personalized fashion designing using cognitive fashion scores and cognitive analysis of fashion trends and data
US11068549B2 (en) 2019-11-15 2021-07-20 Capital One Services, Llc Vehicle inventory search recommendation using image analysis driven by machine learning
US11373231B2 (en) 2019-01-31 2022-06-28 Walmart Apollo, Llc System and method for determining substitutes for a requested product and the order to provide the substitutes
US11373228B2 (en) 2019-01-31 2022-06-28 Walmart Apollo, Llc System and method for determining substitutes for a requested product
WO2022140796A1 (en) * 2020-12-23 2022-06-30 BLNG Corporation Systems and methods for generating jewelry designs and models using machine learning
US11386301B2 (en) 2019-09-06 2022-07-12 The Yes Platform Cluster and image-based feedback system
US11538083B2 (en) 2018-05-17 2022-12-27 International Business Machines Corporation Cognitive fashion product recommendation system, computer program product, and method
US11599929B2 (en) 2014-04-17 2023-03-07 Ebay Inc. Fashion preference analysis
EP4139874A4 (en) * 2020-05-07 2023-08-16 Caastle, Inc. Methods and systems for providing a personalized user interface

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11120491B2 (en) 2013-09-24 2021-09-14 Ebay Inc. Method, medium, and system for social media based recommendations
JP6199685B2 (en) * 2013-10-03 2017-09-20 Necソリューションイノベータ株式会社 Fashion coordination support device and fashion coordination support system
JP6322070B2 (en) * 2014-07-08 2018-05-09 児玉 昇司 Information processing apparatus, information processing method, and program
US20160189274A1 (en) * 2014-12-31 2016-06-30 Ebay Inc. Fashion administration
US10032203B2 (en) 2015-02-18 2018-07-24 Microsoft Technology Licensing, Llc Dynamic property surfacing
CN106294420B (en) * 2015-05-25 2019-11-05 阿里巴巴集团控股有限公司 The method and device of business object collocation information is provided
KR20160146273A (en) 2015-06-12 2016-12-21 강산 System and method for providing intelligent matching commerce
JP2017033071A (en) * 2015-07-29 2017-02-09 株式会社タカヤコミュニケーションズ Electronic catalog system and retrieval device to be used in the same
US10918150B2 (en) * 2017-03-07 2021-02-16 Bodygram, Inc. Methods and systems for customized garment and outfit design generation
WO2018182068A1 (en) * 2017-03-30 2018-10-04 스노우 주식회사 Method and apparatus for providing recommendation information for item
KR20200104013A (en) * 2019-02-26 2020-09-03 주식회사 틸투원 Method and apparatus for recommending products
KR102245492B1 (en) * 2019-04-08 2021-04-27 오현상 Electronic terminal device for providing a style checking function for a user based on a clothing image selected by the user
KR102270989B1 (en) 2019-06-20 2021-06-30 (주)대왕시스템 Artificial intelligence fashion coordination system
US20220301042A1 (en) * 2019-08-16 2022-09-22 Subfiber OÜ Method and system for navigating within and determining non-binary, subjective preferences within very large and specific data sets having objectively characterized metadata
KR102268009B1 (en) * 2019-08-27 2021-06-22 엔에이치엔 주식회사 Shopping mall system and method for recommending goods using text analysis
KR102284148B1 (en) * 2019-09-11 2021-07-30 주식회사 인텔리시스 Method and system for recommending fashion based on vector-based deep learning
KR20210041730A (en) * 2019-10-08 2021-04-16 오드컨셉 주식회사 Method, apparatus and computer program for fashion item recommendation
KR102392674B1 (en) * 2020-01-30 2022-04-29 오드컨셉 주식회사 Fashion goods recommendation methods, devices and systems
KR102211813B1 (en) * 2020-09-18 2021-02-02 가영 임 Method and apparatus for recommending the best shoes for user's feet
IT202100021545A1 (en) * 2021-08-09 2023-02-09 Luxottica Group S P A PRODUCT RECOMMENDATION METHOD.
JP7575422B2 (en) 2022-05-19 2024-10-29 Lineヤフー株式会社 Information processing device, information processing method, and information processing program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080080745A1 (en) * 2005-05-09 2008-04-03 Vincent Vanhoucke Computer-Implemented Method for Performing Similarity Searches
US20080144943A1 (en) * 2005-05-09 2008-06-19 Salih Burak Gokturk System and method for enabling image searching using manual enrichment, classification, and/or segmentation
US20080154625A1 (en) * 2006-12-18 2008-06-26 Razz Serbanescu System and method for electronic commerce and other uses
US20080162574A1 (en) * 2006-11-22 2008-07-03 Sheldon Gilbert Analytical E-Commerce Processing System And Methods
US20080177640A1 (en) * 2005-05-09 2008-07-24 Salih Burak Gokturk System and method for using image analysis and search in e-commerce
US20080235604A1 (en) * 2007-03-23 2008-09-25 Peter Ebert Model-based customer engagement techniques
US20090019008A1 (en) * 2007-04-27 2009-01-15 Moore Thomas J Online shopping search engine for vehicle parts
US20090248599A1 (en) * 2008-04-01 2009-10-01 Hueter Geoffrey J Universal system and method for representing and predicting human behavior
US20100082604A1 (en) * 2008-09-22 2010-04-01 Microsoft Corporation Automatic search query suggestions with search result suggestions from user history

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001175745A (en) * 1999-12-21 2001-06-29 Matsushita Electric Ind Co Ltd System and method for electronic commercial transaction
WO2002079942A2 (en) * 2001-03-29 2002-10-10 Artmecca.Com System for visual preference determination and predictive product selection
JP2003345943A (en) * 2002-05-22 2003-12-05 Hitachi Ltd Coordinate search method and system
JP2004220200A (en) * 2003-01-10 2004-08-05 Sony Ericsson Mobilecommunications Japan Inc Coordinate information providing method and device, coordinate information providing system, and coordinate information providing program

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080080745A1 (en) * 2005-05-09 2008-04-03 Vincent Vanhoucke Computer-Implemented Method for Performing Similarity Searches
US20080144943A1 (en) * 2005-05-09 2008-06-19 Salih Burak Gokturk System and method for enabling image searching using manual enrichment, classification, and/or segmentation
US20080177640A1 (en) * 2005-05-09 2008-07-24 Salih Burak Gokturk System and method for using image analysis and search in e-commerce
US20080162574A1 (en) * 2006-11-22 2008-07-03 Sheldon Gilbert Analytical E-Commerce Processing System And Methods
US20080162269A1 (en) * 2006-11-22 2008-07-03 Sheldon Gilbert Analytical E-Commerce Processing System And Methods
US20080154625A1 (en) * 2006-12-18 2008-06-26 Razz Serbanescu System and method for electronic commerce and other uses
US20080235604A1 (en) * 2007-03-23 2008-09-25 Peter Ebert Model-based customer engagement techniques
US20090019008A1 (en) * 2007-04-27 2009-01-15 Moore Thomas J Online shopping search engine for vehicle parts
US20090248599A1 (en) * 2008-04-01 2009-10-01 Hueter Geoffrey J Universal system and method for representing and predicting human behavior
US20100082604A1 (en) * 2008-09-22 2010-04-01 Microsoft Corporation Automatic search query suggestions with search result suggestions from user history

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110307294A1 (en) * 2010-06-10 2011-12-15 International Business Machines Corporation Dynamic generation of products for online recommendation
US20120197891A1 (en) * 2011-01-27 2012-08-02 Electronic Entertainment Design And Research Genre discovery engines
US10657573B2 (en) 2011-12-05 2020-05-19 Houzz, Inc. Network site tag based display of images
WO2013086022A1 (en) * 2011-12-05 2013-06-13 Houzz, Inc. Consistent presentation of content and passive relevance determination of content relationship in an on-line commerce system
US9230223B2 (en) 2011-12-05 2016-01-05 Houzz, Inc. Consistent presentation of content and passive relevance determination of content relationship in an on-line commerce system
US10664892B2 (en) 2011-12-05 2020-05-26 Houzz, Inc. Page content display with conditional scroll gesture snapping
US11599929B2 (en) 2014-04-17 2023-03-07 Ebay Inc. Fashion preference analysis
US9697232B2 (en) 2015-03-19 2017-07-04 International Business Machines Corporation System and method for creating a preference profile from shared images
US9892345B2 (en) 2015-03-19 2018-02-13 International Business Machines Corporation System and method for creating a preference profile from shared images
US10042865B2 (en) 2015-03-19 2018-08-07 International Business Machines Corporation System and method for creating a preference profile from shared images
US10169371B2 (en) 2015-03-19 2019-01-01 International Business Machines Corporation System and method for creating a preference profile from shared images
US10726474B2 (en) 2015-12-23 2020-07-28 Alibaba Group Holding Limited Displaying an online product on a product shelf
US11334937B2 (en) 2015-12-23 2022-05-17 Advanced New Technologies Co., Ltd. Displaying an online product on a product shelf
US11030679B2 (en) 2015-12-23 2021-06-08 Advanced New Technologies Co., Ltd. Displaying an online product on a product shelf
WO2018052906A1 (en) * 2016-09-13 2018-03-22 Sophistio, Inc. Automatic wearable item classification systems and methods based upon normalized depictions
IT201600132446A1 (en) * 2016-12-29 2018-06-29 Else Corp S R L Learning-based system and recommendation method
US10755229B2 (en) 2018-04-11 2020-08-25 International Business Machines Corporation Cognitive fashion-ability score driven fashion merchandising acquisition
US10685265B2 (en) 2018-04-11 2020-06-16 International Business Machines Corporation Cognitive analysis and classification of apparel images
US10891585B2 (en) 2018-04-11 2021-01-12 International Business Machines Corporation Cognitive fashion-ability score driven fashion merchandising acquisition
US10635952B2 (en) 2018-04-11 2020-04-28 International Business Machines Corporation Cognitive analysis and classification of apparel images
US11538083B2 (en) 2018-05-17 2022-12-27 International Business Machines Corporation Cognitive fashion product recommendation system, computer program product, and method
US10956928B2 (en) 2018-05-17 2021-03-23 International Business Machines Corporation Cognitive fashion product advertisement system and method
US10963744B2 (en) 2018-06-27 2021-03-30 International Business Machines Corporation Cognitive automated and interactive personalized fashion designing using cognitive fashion scores and cognitive analysis of fashion trends and data
WO2020018489A1 (en) * 2018-07-16 2020-01-23 Wantable, Inc. System and method determining individual style preference and delivering said style preferences
IT201800007812A1 (en) * 2018-08-03 2020-02-03 Else Corp Srl A 3D visual search and AI-based recommendation system
US10904346B2 (en) 2018-12-03 2021-01-26 International Business Machines Corporation Weighted digital image object tagging
US11373231B2 (en) 2019-01-31 2022-06-28 Walmart Apollo, Llc System and method for determining substitutes for a requested product and the order to provide the substitutes
US11373228B2 (en) 2019-01-31 2022-06-28 Walmart Apollo, Llc System and method for determining substitutes for a requested product
US11386301B2 (en) 2019-09-06 2022-07-12 The Yes Platform Cluster and image-based feedback system
US11068549B2 (en) 2019-11-15 2021-07-20 Capital One Services, Llc Vehicle inventory search recommendation using image analysis driven by machine learning
US11775597B2 (en) 2019-11-15 2023-10-03 Capital One Services, Llc Vehicle inventory search recommendation using image analysis driven by machine learning
EP4139874A4 (en) * 2020-05-07 2023-08-16 Caastle, Inc. Methods and systems for providing a personalized user interface
WO2022140796A1 (en) * 2020-12-23 2022-06-30 BLNG Corporation Systems and methods for generating jewelry designs and models using machine learning

Also Published As

Publication number Publication date
KR20120085707A (en) 2012-08-01
EP2438509A1 (en) 2012-04-11
CA2764056A1 (en) 2010-12-09
JP2012529122A (en) 2012-11-15
EP2438509A4 (en) 2013-04-10
AU2010256641A1 (en) 2011-12-22

Similar Documents

Publication Publication Date Title
US20100313141A1 (en) System and Method for Learning User Genres and Styles and for Matching Products to User Preferences
WO2010141637A1 (en) System and method for learning user genres and styles and matching products to user preferences
US11823059B2 (en) Generating a personalized preference ranking network for providing visually-aware item recommendations
CN107424043B (en) Product recommendation method and device and electronic equipment
US7610255B2 (en) Method and system for computerized searching and matching multimedia objects using emotional preference
US11062379B2 (en) Automatic fashion outfit composition and recommendation system and method
US20180181569A1 (en) Visual category representation with diverse ranking
US11809985B2 (en) Algorithmic apparel recommendation
US20070081744A1 (en) System and method for use of images with recognition analysis
US20110194777A1 (en) System and method for use of images with recognition analysis
US10776417B1 (en) Parts-based visual similarity search
US20150170250A1 (en) Recommendation engine for clothing and apparel
KR20190117584A (en) Method and apparatus for detecting, filtering and identifying objects in streaming video
US20120072405A1 (en) Simulation-assisted search
US20190355041A1 (en) Cognitive fashion product recommendation system and method
CN108960945A (en) Commodity recommendation method and device
US10769524B1 (en) Non-binary gender filter
EP1493118A1 (en) Determination of attributes based on product descriptions
US11238515B1 (en) Systems and method for visual search with attribute manipulation
CN112488781A (en) Search recommendation method and device, electronic equipment and readable storage medium
US11941681B2 (en) System, method, and computer program product for determining compatibility between items in images
Jaradat et al. Dynamic CNN models for fashion recommendation in Instagram
WO2007041647A2 (en) System and method for use of images with recognition analysis
CN118921498A (en) Big data-based commodity analysis method and system for live broadcasting room
Mattsson et al. Optimize Ranking System With Machine Learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 10784043; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 2010256641; Country of ref document: AU
WWE Wipo information: entry into national phase
    Ref document number: 2764056; Country of ref document: CA
WWE Wipo information: entry into national phase
    Ref document number: 2012514104; Country of ref document: JP
ENP Entry into the national phase
    Ref document number: 2010256641; Country of ref document: AU; Date of ref document: 20100602; Kind code of ref document: A
WWE Wipo information: entry into national phase
    Ref document number: 2010784043; Country of ref document: EP
ENP Entry into the national phase
    Ref document number: 20127000140; Country of ref document: KR; Kind code of ref document: A
REG Reference to national code
    Ref country code: BR; Ref legal event code: B01A; Ref document number: PI1010949; Country of ref document: BR
NENP Non-entry into the national phase
    Ref country code: DE
ENPW Started to enter national phase and was withdrawn or failed for other reasons
    Ref document number: PI1010949; Country of ref document: BR