
US10803532B1 - Processing insured items holistically with mobile damage assessment and claims processing - Google Patents

Processing insured items holistically with mobile damage assessment and claims processing

Info

Publication number
US10803532B1
US10803532B1 US16/570,421 US201916570421A
Authority
US
United States
Prior art keywords
damage
mobile device
processor
images
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/570,421
Inventor
Jennifer A. Brandmaier
Mark E. Faga
Robert H. Johnson
Daniel Koza
William Loo
Clint J. Marlow
Kurt M. Stricker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Allstate Insurance Co
Original Assignee
Allstate Insurance Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Allstate Insurance Co filed Critical Allstate Insurance Co
Priority to US16/570,421 (US10803532B1)
Assigned to ALLSTATE INSURANCE COMPANY reassignment ALLSTATE INSURANCE COMPANY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FAGA, MARK E., JOHNSON, ROBERT H., STRICKER, KURT M., BRANDMAIER, JENNIFER A., KOZA, DANIEL, LOO, WILLIAM, MARLOW, CLINT J.
Assigned to ALLSTATE INSURANCE COMPANY reassignment ALLSTATE INSURANCE COMPANY CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING ADDRESS PREVIOUSLY RECORDED AT REEL: 050617 FRAME: 0216. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: FAGA, MARK E., JOHNSON, ROBERT H., STRICKER, KURT M., BRANDMAIER, JENNIFER A., KOZA, DANIEL, LOO, WILLIAM, MARLOW, CLINT J.
Priority to US17/008,079 (US11386503B2)
Application granted granted Critical
Publication of US10803532B1
Priority to US17/862,159 (US12079877B2)
Legal status: Active
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00: Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08: Insurance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the present disclosure relates to systems and methods for analyzing damage to an insured item such as a vehicle and processing an insurance claim related to the analyzed damage.
  • Conventional insurance claims processing is a complex process that starts with a first notification of loss related to an insured item. Upon notification of loss, the claim may be routed to multiple claims adjusters that analyze different aspects of the damage associated with the insured item in order to determine whether compensation for the loss is appropriate.
  • a mobile device may transmit data (e.g., images, video, etc.) related to damage associated with an insured item to an enhanced claims processing server.
  • the enhanced claims processing server may manage analysis of damage associated with the insured item and settlement of a claim related to the damage.
  • an enhanced claims processing server may analyze damage data received from a mobile device to generate a repair cost estimate for repairing the insured item.
  • FIG. 1 shows an illustrative operating environment in which various aspects of the disclosure may be implemented.
  • FIG. 2 shows a system of network devices and servers that may be used to implement the processes and functions of certain aspects of the present disclosure.
  • FIG. 3 shows a flow chart for an automated damage assessment process in accordance with certain aspects of the present disclosure.
  • FIG. 4 shows a series of initial display screens displayed when a user starts a damage assessment and claims processing application stored on a mobile device in accordance with certain aspects of the present disclosure.
  • FIG. 5A and FIG. 5B show a first series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with certain aspects of the present disclosure.
  • FIG. 5C shows a second series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with certain aspects of the present disclosure.
  • FIG. 6A and FIG. 6B show a series of display screens displayed on a mobile device for enabling a user to delete photos that have already been taken in accordance with certain aspects of the present disclosure.
  • FIG. 7A and FIG. 7B show a series of display screens displayed on a mobile device for enabling a user to submit photos for review by an enhanced claims processing server, in accordance with certain aspects of the present disclosure.
  • FIG. 8A and FIG. 8B show a series of display screens displayed on a mobile device for enabling a user to receive feedback from an enhanced claims processing server regarding previously submitted photos, in accordance with certain aspects of the present disclosure.
  • an enhanced claims processing server receives data regarding an insured item (e.g., a vehicle, etc.) from a computing device (e.g., a mobile device), the server processes the data and manages settlement of a claim associated with the insured item.
  • the automated process may utilize various hardware components (e.g., processors, communication servers, memory devices, sensors, etc.) and related computer algorithms to generate image data related to damage associated with an insured item, determine if the image data conforms to a predetermined set of criteria, analyze the image data to assess loss associated with the insured item, and determine if a payment is appropriate to the claimant as compensation for assessed loss.
  • FIG. 1 illustrates a block diagram of an enhanced claims processing server 101 (e.g., a computer server) in communication system 100 that may be used according to an illustrative embodiment of the disclosure.
  • the server 101 may have a processor 103 for controlling overall operation of the enhanced claims processing server 101 and its associated components, including RAM 105 , ROM 107 , input/output module 109 , and memory 115 .
  • I/O 109 may include a microphone, keypad, touch screen, and/or stylus through which a user of enhanced claims processing server 101 may provide input, and may also include one or more of a speaker for providing audio output and a video display device for providing textual, audiovisual, and/or graphical output.
  • Software may be stored within memory 115 to provide instructions to processor 103 for enabling device 101 to perform various functions.
  • memory 115 may store software used by the device 101 , such as an operating system 117 , application programs 119 , and an associated database 121 .
  • Processor 103 and its associated components may allow the device 101 to run a series of computer-readable instructions to analyze image data depicting damage to an insured item (e.g., vehicle, etc.).
  • Processor 103 may determine the general location of damage associated with the vehicle by analyzing images of the vehicle and comparing these images with reference images of a similar vehicle with no damage or with similar damage. In addition, processor 103 may assess the loss associated with the damaged vehicle and transmit terms for settling an insurance claim related to the loss to a user of a mobile device.
  • the server 101 may operate in a networked environment supporting connections to one or more remote computers, such as terminals 141 and 151 .
  • the terminals 141 and 151 may be personal computers or servers that include many or all of the elements described above relative to the server 101 .
  • terminal 141 and/or 151 may be data stores for storing image data of insured items that have been analyzed by the enhanced claims processing server 101 in the past.
  • terminals 141 and 151 may represent mobile devices with built-in cameras for capturing image data associated with a damaged item.
  • the network connections depicted in FIG. 1 include a local area network (LAN) 125 and a wide area network (WAN) 129 , but may also include other networks.
  • the server 101 When used in a LAN networking environment, the server 101 is connected to the LAN 125 through a network interface or adapter 123 .
  • the server 101 When used in a WAN networking environment, the server 101 may include a modem 127 or other means for establishing communications over the WAN 129 , such as the Internet 131 .
  • the network connections shown are illustrative and other means of establishing a communications link between the computers may be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed.
  • an application program 119 used by the enhanced claims processing server 101 may include computer executable instructions for invoking functionality related to calculating an appropriate payment for assessed damage associated with an insured item.
  • Enhanced claims processing server 101 and/or terminals 141 or 151 may also be mobile terminals including various other components, such as a battery, speaker, camera, and antennas (not shown).
  • the disclosure is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the disclosure include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices, and the like.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the disclosure may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including non-transitory memory storage devices, such as a hard disk, random access memory (RAM), and read only memory (ROM).
  • system 200 may include one or more network devices 201 .
  • Devices 201 may be local or remote, and are connected by one or more communications links 202 to computer network 203 that is linked via communications links 205 to enhanced claims processing server 101 .
  • network devices 201 may run different algorithms used by server 101 for analyzing image data showing damage associated with an insured item, or, in other embodiments, network devices 201 may be data stores for storing reference image data of insured items.
  • network devices 201 may represent mobile user devices configured to capture image data (e.g., via a camera, etc.) associated with a damaged insured item and to transmit the image data to server 101 .
  • enhanced claims processing server 101 may be any suitable server, processor, computer, or data processing device, or combination of the same.
  • Computer network 203 may be any suitable computer network including the Internet, an intranet, a wide-area network (WAN), a local-area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), or any combination of any of the same.
  • Communications links 202 and 205 may be any communications links suitable for communicating between network devices 201 and server 101 , such as network links, dial-up links, wireless links, hard-wired links, etc.
  • a user (e.g., a claimant) of a mobile device (e.g., mobile phone, personal digital assistant (PDA), etc.) may take a variety of photos associated with damage to an insured vehicle.
  • the photos may include wide shots of the damaged vehicle, pictures of an identification number associated with the damaged vehicle (e.g., a vehicle identification number (VIN), etc.), and/or multiple angles/close-up shots of the damage associated with the insured vehicle.
  • the enhanced claims processing server 101 may be configured to receive and analyze the photos to determine if they meet a predefined set of criteria (e.g., not too blurry, correct angles, etc.) for completeness, accuracy, etc. If the photos do not meet the minimum criteria, server 101 may transmit a message (e.g., via a feedback loop), informing the mobile device that alternative and/or additional photos must be taken. This process of assuring that the photos are compliant for further analysis may be repeated until the user of device 201 has complied with all of the rules set forth by enhanced claims processing server 101 .
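  • As a hedged illustration of how such a criteria check could be automated, the sketch below screens a submitted photo for decodability, resolution, and blur before it is accepted; the OpenCV-based approach, the threshold values, and the function name are assumptions made for this example rather than details taken from the disclosure.

```python
# Hypothetical server-side acceptability check for a submitted photo.
# Thresholds are illustrative assumptions, not values from the disclosure.
import cv2


def is_photo_acceptable(image_path: str,
                        blur_threshold: float = 100.0,
                        min_width: int = 640,
                        min_height: int = 480) -> tuple[bool, str]:
    """Return (acceptable, reason) for a single submitted photo."""
    image = cv2.imread(image_path)
    if image is None:
        return False, "file could not be decoded"

    h, w = image.shape[:2]
    if w < min_width or h < min_height:
        return False, "resolution too low"

    # Variance of the Laplacian is a common sharpness proxy:
    # a low variance suggests a blurry photo that should be retaken.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    if sharpness < blur_threshold:
        return False, "photo appears blurry"

    return True, "ok"
```

In such a sketch, the server would loop over all submitted photos and return the collected reasons to the mobile device, repeating until every photo passes.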
  • Server 101 may then analyze the photos to generate an output, including a cost estimate to repair the damage associated with the insured vehicle and/or to replace a damaged part of the insured vehicle.
  • server 101 may analyze the photos and determine the location of damage (e.g., exterior parts, etc.), extent of damage, and/or the cost of parts/labor to fix the damage.
  • the cost estimate may represent the cost of replacing the insured vehicle itself.
  • server 101 may also output various claims documents, including disclosures, brochures, guarantees, etc. If appropriate, server 101 may transmit a payment to the user and/or to an account associated with the user, for the cost of repairing the damage or replacing a part. In addition, server 101 may inform the user approximately how long it will take to repair/replace the insured vehicle.
  • damage inspection and appraisal in the automated claims processing scheme discussed herein may be completed in thirty minutes or less.
  • server 101 may aid in cutting down time between a first notice of loss and settlement of the claim (e.g., real-time settlement of a claim) associated with the loss (e.g., via a payment and/or information regarding repair/replacement of an insured item).
  • the methods discussed herein are automated and involve minimal and/or no involvement from claims adjusters, less time and money may be spent to transport these adjusters to inspection locations. The automated nature of this process may also create the opportunity for remote human inspections of damage associated with insured items.
  • server 101 may aid in attracting technology savvy consumers to an entity (e.g., an insurance company) managing server 101 .
  • FIG. 3 shows an automated damage assessment process 300 in accordance with at least one aspect of the present disclosure.
  • an application related to damage assessment and claims processing may be downloaded onto a mobile device (e.g., iPhone™, Android™, etc.) associated with a user (e.g., a customer of an insurance company) to facilitate one or more steps of the process in FIG. 3 .
  • the process of FIG. 3 may start out at step 301 where a user (e.g., a customer) associated with an entity managing enhanced claims processing server 101 (e.g., insurance company) may enter a claim number (e.g., a number related to damage associated with an insured vehicle, etc.) into a damage assessment and claims processing application running on a mobile device (e.g., network device 201 ).
  • a claimant may contact an entity managing enhanced claims processing server 101 (e.g., an insurance company, etc.) with a first notice of loss (FNOL).
  • the claimant may contact the insurance company in any number of ways, including via phone, by email, via a company web site, etc.
  • the claimant may provide basic identifying and/or validating information (e.g., name, age, claim number, etc.) and vehicle information, including the make, model, and year of manufacture.
  • the claimant may also provide the general areas of damage to the vehicle and any other relevant details (e.g., condition of glass, under carriage, engine, wheels, airbags, etc. associated with the vehicle).
  • this information may be provided from a remote location (e.g., location of an accident, claimant's home, etc.) using an application loaded onto a smart phone (e.g., iPhone™, Android™, etc.).
  • the mobile device may then transmit the entered claim number and related information to enhanced claims processing server 101 .
  • the process may then move to step 303 where server 101 may determine if the claim number received in step 301 is valid. If server 101 determines that the claim number is not valid, then server 101 may transmit a message to the mobile device, stating that the claim number is invalid in step 305 . The user may then enter another claim number (step 301 ).
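  • A minimal sketch of this validation step (steps 301 through 307) appears below; the ten-digit claim-number format, the in-memory lookup, and the instruction text are invented placeholders standing in for a real claims database.

```python
# Sketch of the claim-number validation step; format and messages are assumed.
import re

VALID_CLAIMS = {"0123456789"}  # stand-in for a claims-database lookup

PHOTO_INSTRUCTIONS = [
    "Wide shots of the entire vehicle",
    "Close-up of the VIN door tag",
    "At least two angles of each damaged panel",
]


def handle_claim_number(claim_number: str) -> dict:
    """Validate a submitted claim number and return the server's response."""
    if not re.fullmatch(r"\d{10}", claim_number):
        return {"status": "invalid", "message": "Claim number is malformed; please re-enter."}
    if claim_number not in VALID_CLAIMS:
        return {"status": "invalid", "message": "Claim number not found; please re-enter."}
    return {"status": "valid", "instructions": PHOTO_INSTRUCTIONS}


print(handle_claim_number("0123456789")["status"])  # -> valid
```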
  • server 101 may send the user instructions of the types of image data (e.g., photos, video, etc.) that should be captured of damage associated with the insured vehicle.
  • server 101 may not receive a claim number at all and may instead proceed directly to providing instructions on the types of image data that should be captured.
  • the user may receive instructions on various types of photos/video, including photos/video of the entire vehicle, VIN door tag, and/or the damaged areas.
  • the user may capture image data related to at least two different angles of the damage for each panel (e.g., hood, fender, door, bumper, etc.) based on an initial claim description.
  • the user may use a camera associated with the mobile device to take the photos and transmit these photos to the server 101 .
  • the user may be allowed to preview each photo before selecting the image.
  • the image may be shown on a display associated with the mobile device under a photo type (e.g., a photo of the entire vehicle, VIN door tag, and/or damaged area). If the user is not satisfied with any photo, the user may delete the photo by selecting it.
  • the user may annotate the photos (e.g., by drawing a line from one end of the dent to the other, etc.) prior to transmitting them to server 101 .
  • server 101 may itself annotate any received photos/video.
  • any approved photo may not be sent to server 101 until all of the images have been captured.
  • server 101 may support a website interface through which photos may be uploaded by a user of a mobile device.
  • the use of multiple photos (e.g., via stereoscopic techniques), video (e.g., by walking around the vehicle to generate a complete view), and/or three-dimensional photos/video may assist in determining the depth of damage to a vehicle.
  • determining the depth of damage may help in classifying the damage (e.g., a turbulent dent versus a dish dent).
  • the degree of damage by area and depth may be automatically estimated through tools similar to ultrasound tools. Knowing the depth of damage may also assist in automatically determining the cost of repair or replacement.
  • a claims adjuster associated with an entity managing server 101 may interface with the user in real-time (e.g., via phone, email, etc.) as the photos are being sent to the adjuster and/or as the video is being streamed to the adjuster and describe to the user the photos/video that still need to be taken and/or where to place a camera as the photos/video are captured.
  • server 101 may determine if the photos are acceptable in step 311 . For instance, server 101 may determine that the photos are too blurry and/or that the photos do not capture the correct angles to clearly show damage associated with the insured vehicle.
  • server 101 may employ a bar code scanning mechanism and/or an optical character recognition (OCR) system for detecting the VIN from a submitted photo.
  • the mobile device itself may use a bar code scanning mechanism and/or an OCR system for determining the VIN. In this example, if the VIN cannot be detected from the photo and/or using these techniques, then the submitted photo may be deemed to be unacceptable.
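  • To illustrate, the sketch below validates a VIN string that a bar code scanner or OCR engine is assumed to have already extracted from the door-tag photo; the format and check-digit tests follow the standard North American VIN scheme and are not details prescribed by the disclosure.

```python
# Validate an OCR/bar-code result before accepting the VIN door-tag photo.
# The OCR step itself is assumed to happen elsewhere.
import re

_TRANSLITERATION = {**{str(d): d for d in range(10)},
                    "A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": 7,
                    "H": 8, "J": 1, "K": 2, "L": 3, "M": 4, "N": 5, "P": 7,
                    "R": 9, "S": 2, "T": 3, "U": 4, "V": 5, "W": 6, "X": 7,
                    "Y": 8, "Z": 9}
_WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]


def is_valid_vin(vin: str) -> bool:
    vin = vin.strip().upper()
    # 17 characters; the letters I, O, and Q are never used in a VIN.
    if not re.fullmatch(r"[A-HJ-NPR-Z0-9]{17}", vin):
        return False
    # Check digit (9th character) under the North American scheme.
    total = sum(_TRANSLITERATION[c] * w for c, w in zip(vin, _WEIGHTS))
    check = total % 11
    return vin[8] == ("X" if check == 10 else str(check))
```

If the validation fails, the photo would be flagged as unacceptable and the user asked to retake it.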
  • server 101 may move back to step 307 where the server 101 may send the user instructions on what types of photos to take and/or what changes need to be made to the previously submitted photos.
  • the mobile device may itself determine if any given photo is blurry and/or inaccurate and prompt the user to retake the photo.
  • the application for damage assessment and claims processing running on the mobile device may have computer-executable instructions stored within a memory of the mobile device for automatically detecting and/or rejecting a photo/video captured within a given category.
  • server 101 may attach the photos to the user's claim in a database associated with server 101 .
  • Server 101 may also determine a damage estimate (e.g., an estimate for repairing and/or replacing any damaged parts) after analyzing the photos in step 313 based on predefined rules.
  • the damage estimate may be generated by comparing the photos submitted by the mobile device with photos of similarly damaged vehicles or with photos of non-damaged vehicles of similar make/model. To perform this comparison, server 101 may access a database (e.g., network device 201 ) of photos of vehicles with various types of damage and/or vehicles with no damage. To initially populate the database with photos for later use, each user may be required to upload various photos of a vehicle upon purchase of the vehicle.
  • as server 101 analyzes recently submitted photos, previously uploaded photos of a given vehicle may be used to determine any pre-existing damage on the vehicle.
  • server 101 may determine a damage estimate for a new case based on the prior cases.
  • Server 101 may not need to build a new damage estimate piece-by-piece for a given damaged vehicle.
  • server 101 (or an individual/group associated with the entity managing server 101 ) may generate a new damage estimate based on a holistic view of a damaged vehicle.
  • server 101 may build a database (e.g., network device 201 ) of specific damage templates (e.g., damages to more than one part of a vehicle that are commonly associated with one another) and estimated/actual costs for repairing damages associated with these templates.
  • damage estimates associated with subsequently analyzed vehicles may be generated from a holistic view of the vehicles by accessing information within the historical database.
  • server 101 may use this repair cost to generate a new estimate for subsequent vehicles that exhibit damage similar to this damage template.
  • the damage estimates retrieved from the historical database may be adjusted based on differences associated with a current case. For instance, the damage estimate may be adjusted based on the average inflation rate (e.g., for parts, labor, etc.) between the date at which the damage estimate within the historical database was generated and the current date. In other embodiments, the damage estimate may be adjusted for small differences such as the make, model, and year of manufacture when the vehicle in the historical database and the currently analyzed vehicle are compared. Similarly, the damage estimate may be adjusted based on differences in the precise damage associated with the vehicle in the historical database and the damage associated with the vehicle currently being analyzed. In yet other examples, the damage estimate may be adjusted based on the terms of an insurance policy that covers damage to the insured vehicle currently being analyzed. One of ordinary skill in the art would understand that any number of factors may be considered when adjusting the damage estimate retrieved for vehicles stored in the historical database to more accurately reflect a damage estimate for a currently analyzed vehicle.
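  • A hedged sketch of such an adjustment follows; the 3% annual inflation rate and the single multiplicative similarity factor are assumptions chosen only to make the example concrete.

```python
# Adjust a damage estimate retrieved from the historical database to the
# current case. Rates and factors are illustrative assumptions.
from datetime import date


def adjust_historical_estimate(base_estimate: float,
                               estimate_date: date,
                               today: date,
                               annual_inflation: float = 0.03,
                               similarity_factor: float = 1.0) -> float:
    """similarity_factor captures small make/model/year or damage differences
    (1.0 = essentially identical case, >1.0 = costlier-to-repair case)."""
    years_elapsed = (today - estimate_date).days / 365.25
    inflation_multiplier = (1.0 + annual_inflation) ** years_elapsed
    return base_estimate * inflation_multiplier * similarity_factor


# Example: a $1,200 estimate recorded three years ago for a near-identical case.
print(round(adjust_historical_estimate(1200.0, date(2021, 6, 1), date(2024, 6, 1)), 2))
```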
  • server 101 may access the historical database multiple times (once for each type of damage) and then add one or more interaction terms to the sum of the cost estimates for each type of damage. For instance, extending the example above of damage to a front bumper and to the headlights of a vehicle, server 101 may generate a first damage estimate for repairing the front bumper and a second damage estimate for repairing the headlights. Server 101 may then add these two damage estimates to generate a total damage estimate.
  • server 101 may also calculate an interaction term (which may be a positive or a negative value) that represents either an increased (e.g., because the damages taken collectively introduce more complexity and are thus more expensive to repair than if handled individually) or decreased (e.g., because the damages taken collectively have overlapping repair procedures and are thus less expensive to repair than if handled individually) cost of repairing the vehicle when both of these types of damages occur together.
  • the effective total damage estimate may then be the sum of the total damage estimate and the interaction term.
  • server 101 may generate any number of interaction terms for a given analysis. For instance, if a damage estimate is based on damage to three parts of a vehicle, server 101 may generate interaction terms that relate to increased/decreased cost associated with repair to the following part groups: the first two parts, the first and third parts, the second and third parts, and all three parts at once. In other embodiments, server 101 may generate an interaction term for only some of the damaged parts.
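  • The sketch below shows one way the per-part estimates and interaction terms described above could be combined; the part names, costs, and the negative interaction value (shared teardown work) are invented for the example.

```python
# Combine per-part repair estimates with interaction terms over part groups.
from itertools import combinations


def total_with_interactions(part_estimates: dict[str, float],
                            interactions: dict[frozenset, float]) -> float:
    """Sum the individual estimates, then add any interaction terms defined
    for subsets of the damaged parts (positive = extra complexity,
    negative = overlapping repair procedures)."""
    total = sum(part_estimates.values())
    parts = sorted(part_estimates)
    for r in range(2, len(parts) + 1):
        for group in combinations(parts, r):
            total += interactions.get(frozenset(group), 0.0)
    return total


estimates = {"front bumper": 800.0, "headlights": 450.0}
interactions = {frozenset({"front bumper", "headlights"}): -120.0}
print(total_with_interactions(estimates, interactions))  # -> 1130.0
```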
  • server 101 may also query the claimant with regards to the type of third party service provider (e.g., repair shop, etc.) they would prefer after damage analysis and claims processing is complete.
  • exterior damage associated with the vehicle may be used to predict (e.g., via predictive modeling using the database of past assessed exterior/interior damage for other similar cases, etc.) the likelihood of interior (e.g., mechanical, cabin, etc.) damage to the vehicle and/or potential difficulties in repairing the vehicle.
  • server 101 may include computer-executable instructions to recognize the extent of damage to various parts of the vehicle (e.g., chassis, etc.), including various types of dents and edge damage, and to identify various parts of the vehicle.
  • the detection of damage to the vehicle may be based on object recognition algorithms that compare images (e.g., comparing x, y, and z coordinates of each point on the images) of the vehicle in question to reference images of similar vehicles (e.g., same model, make, year of manufacture, etc.) with no damage.
  • server 101 may access a database of images storing the reference images of vehicles of various models and makes.
  • object recognition/edge detection algorithms (e.g., involving blur filters, gray-scaling, custom algorithms, etc.) may be employed to detect and isolate the damaged areas within the image data.
  • Server 101 may also access internal/external databases storing images, damage depth map information (e.g., from previously assessed analyses, etc.), and/or processed claims reports from damaged vehicles that server 101 has assessed previously.
  • server 101 may access images/depth map information from previously assessed damaged vehicles for use as a guidepost in assessing the damage of a new vehicle. If no reference information (e.g., data, images) exists, axis symmetry information may also be used to identify possible irregularities and/or damage.
  • the algorithm employed by server 101 may use a comparison of an image of a damaged vehicle with an image of an undamaged version of the same vehicle to “subtract out” and isolate the damaged area of a vehicle. If an exact replica of an undamaged vehicle corresponding to a damaged vehicle under study is not available for this comparison, server 101 may further use various additional image processing algorithms, including blurring filters, etc. to detect a damaged portion of a vehicle.
  • server 101 may grayscale all image data to make processing faster. Further, edge filters may be applied to both the image data from the damaged vehicle and its corresponding reference image data so that the edges of a damaged area may be “subtracted out” and identified in the image data of the damaged vehicle. Once the damaged area has been identified in the image data, server 101 may further process the damaged area to sharpen the area, to make the edges more prominent, and to fill any missing links found in the edges. Afterwards, server 101 may color, texture, and/or otherwise “fill in” the damaged area surrounded by the edges and extract the damaged area from the surrounding image data. Once the damaged area has been isolated, server 101 may calculate the precise area of the damage.
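  • As a rough illustration of the grayscale, edge-filter, and "subtract out" pipeline described above, the sketch below differences the edge maps of a damaged image and its reference image, closes gaps in the resulting outline, and reports the area of the largest isolated region; the kernel size and Canny thresholds are assumptions, and a production system would also need image registration, which is omitted here.

```python
# Rough sketch of isolating a damaged region by differencing edge maps.
import cv2
import numpy as np


def isolate_damage_area(damaged_path: str, reference_path: str) -> float:
    damaged = cv2.imread(damaged_path, cv2.IMREAD_GRAYSCALE)
    reference = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    reference = cv2.resize(reference, (damaged.shape[1], damaged.shape[0]))

    # Edge maps of the damaged vehicle and its undamaged reference.
    edges_damaged = cv2.Canny(damaged, 50, 150)
    edges_reference = cv2.Canny(reference, 50, 150)

    # Edges present only in the damaged image approximate the damage outline.
    diff = cv2.subtract(edges_damaged, edges_reference)

    # Close small gaps ("fill any missing links") in the outline.
    kernel = np.ones((7, 7), np.uint8)
    closed = cv2.morphologyEx(diff, cv2.MORPH_CLOSE, kernel)

    # Report the largest contiguous region's area, in pixels.
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max((cv2.contourArea(c) for c in contours), default=0.0)
```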
  • server 101 may determine the depth of a damaged area (e.g., via stereoscopic methods, etc.) and may analyze raw depth data to further investigate points of interest (e.g., a point that has a much larger depth than surrounding points, etc.). Using this analysis, the damaged area may be further characterized (e.g., a dented area may be detected and if, for example, the general slope of the dent is high, the dent may be characterized as deep and rounded whereas if the slope is low, the dent may be characterized as shallow.)
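  • A toy sketch of that slope heuristic is shown below; the slope threshold, the use of numpy gradients, and the synthetic bowl-shaped depth map are all assumptions made for illustration.

```python
# Characterize a dent from a depth map using a simple mean-slope heuristic.
import numpy as np


def characterize_dent(depth_map: np.ndarray, steep_slope: float = 0.5) -> str:
    """depth_map: 2-D array of depth offsets over the dented region
    (e.g., derived from stereoscopic imaging)."""
    gy, gx = np.gradient(depth_map.astype(float))
    mean_slope = float(np.mean(np.hypot(gx, gy)))
    max_depth = float(np.max(depth_map))
    if mean_slope >= steep_slope:
        return f"deep, rounded dent (max depth {max_depth:.1f})"
    return f"shallow dent (max depth {max_depth:.1f})"


# Toy example: a smooth, shallow bowl-shaped depression.
y, x = np.mgrid[-10:11, -10:11]
bowl = np.clip(2.0 - 0.02 * (x ** 2 + y ** 2), 0, None)
print(characterize_dent(bowl))  # -> shallow dent (max depth 2.0)
```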
  • server 101 may use a damage analysis or cost estimate of identifying/repairing the damage or replacing a damaged part of the previously analyzed vehicle to generate a damage analysis/cost estimate for the currently analyzed vehicle.
  • server 101 may perform one or more database queries to match characteristics of the current analysis with previous analyses.
  • the queries may seek to match the size, depth, and location of a dent on a current vehicle with a similar dent on a vehicle with a similar chassis configuration, make, model, and year of manufacture. For instance, consider a case where the vehicle in question is a new model that has not been analyzed before by server 101 . In this scenario, server 101 may attempt to match the vehicle currently being analyzed with its closest match, which in this case may be a similar model from the previous year with the same chassis configuration (e.g., a twin chassis configuration).
  • server 101 may assign a confidence factor to the match.
  • Server 101 may assign the highest confidence factor (e.g., a confidence factor of 100%) to a comparison between the exact same types of vehicles (e.g., cars of the same make, model, year of manufacture, etc.) having the exact same type of damage (e.g., a predetermined type of dent, etc.). Conversely, a comparison between vehicles with two completely different types of damage may have a confidence factor of 0%. As the similarities between the currently analyzed vehicle and previously analyzed vehicles are reduced, server 101 may assign a lower confidence factor to the comparison.
  • server 101 may assign a threshold confidence factor (e.g., 70%, etc.) below which output generated by a comparison performed by server 101 may not be considered reliable.
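  • One hedged way to score such a match and apply the reliability threshold is sketched below; the attribute weights are invented, and the 70% cut-off is used only because the disclosure mentions it as an example threshold.

```python
# Score how comparable a prior case is and apply a reliability threshold.
def match_confidence(current: dict, prior: dict, weights: dict | None = None) -> float:
    """Return a 0-100 confidence that a prior case is a usable comparison."""
    weights = weights or {"make": 25, "model": 25, "year": 15,
                          "chassis": 15, "damage_type": 20}
    return float(sum(w for attr, w in weights.items()
                     if current.get(attr) == prior.get(attr)))


current_case = {"make": "X", "model": "Y", "year": 2020,
                "chassis": "twin-A", "damage_type": "turbulent dent"}
prior_case = dict(current_case, year=2019)  # same chassis, one model year older

THRESHOLD = 70.0  # below this, the comparison is not treated as reliable
confidence = match_confidence(current_case, prior_case)
print(confidence, "reliable" if confidence >= THRESHOLD else "not reliable")  # 85.0 reliable
```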
  • server 101 may then use physical details of the damage (e.g., size, location, area, etc.) to provide output such as a cost estimate for damage repair/replacement and/or the amount of time required for repair/replacement.
  • Server 101 may also use stored data to determine appropriate vendors for repairing/replacing the vehicle and the amount of time for repair/replacement.
  • the wait time for repair/replacement may depend on various factors, including the size (e.g., area, depth, etc.), classification (e.g., turbulent dent, etc.), and location of the damage.
  • server 101 may determine if parts nearby to damaged parts may also need to be blended into the damaged area. In other words, if a part of the vehicle needs to be refinished (e.g., repainted) either because it is being replaced or repaired, parts within a predetermined distance of the repaired/replaced part may need to be blended (e.g., color-matched) to the repaired/replaced part.
  • server 101 may acquire the knowledge of all previous claims processed by server 101 , as well as the knowledge of human adjusters, to accurately process future claims. In this way, server 101 may use machine learning to evolve its cost and/or repair estimation procedure based on past experience.
  • server 101 may also consider the extent/severity of the damage (area, depth, location, classification, etc.). For instance, damage to a character line (e.g., edge of a door associated with the vehicle) would be more difficult (e.g., more expensive and/or more time-consuming, etc.) to repair than damage to a more central location on the vehicle. Server 101 may also consider the actual cash value and the salvage value of the vehicle and any relevant local, state, and national laws in this analysis.
  • server 101 may generate a rough cost estimate of repairing the damage just based on the extent of the damage; then server 101 may refine this estimate by analyzing previous cost estimates provided by server 101 and/or actual repair data received from third party service providers (e.g., repair shops, etc.) that have repaired similar vehicles with similar damage.
  • server 101 may generate a basic cost estimate by taking into account factors such as the number of hours predicted for the repair, the labor rate, and the current market conditions.
  • server 101 may compare this basic cost estimate with the cost of merely replacing the vehicle (e.g., a total loss) or the damaged part within the vehicle and based on the comparison, server 101 may recommend the cheaper option.
  • These estimates may also be transmitted to existing platforms (e.g., Audatex®, Mitchell®, etc.) for comparison purposes.
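  • The repair-versus-replace comparison can be sketched as below; the labor rate, repair hours, parts cost, and market multiplier are placeholder inputs, and the hand-off of estimates to external platforms is not shown.

```python
# Compare a basic repair estimate with the replacement (total-loss) cost.
def recommend_disposition(repair_hours: float,
                          labor_rate: float,
                          parts_cost: float,
                          replacement_cost: float,
                          market_multiplier: float = 1.0) -> dict:
    """Recommend whichever of repair or replacement is cheaper."""
    repair_estimate = (repair_hours * labor_rate + parts_cost) * market_multiplier
    if repair_estimate <= replacement_cost:
        return {"recommendation": "repair", "estimate": round(repair_estimate, 2)}
    return {"recommendation": "replace", "estimate": round(replacement_cost, 2)}


print(recommend_disposition(repair_hours=12, labor_rate=95.0,
                            parts_cost=600.0, replacement_cost=9000.0))
# -> {'recommendation': 'repair', 'estimate': 1740.0}
```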
  • server 101 may query the claimant as to the discrepancy. For instance, if the claimant initially provided information relating to damage on the left side of the vehicle but server 101 discovers that the primary damage occurred on the right side, server 101 may question the claimant as to when the damage occurred (e.g., was the damage due to a previous incident or preexisting condition?, is the claimant being truthful?, etc.). Server 101 may also ask the claimant to sign a statement as to the truth of the information provided. The claimant may have the option of answering the questions as they come up or the questions may be queued until the server 101 has finished processing the image analysis of the vehicle. If discrepancies between the claimant's answers and the analyzed damage to the vehicle continue to exist, server 101 may request the involvement of a human claims adjuster.
  • a technician associated with an entity managing server 101 may analyze the photos to determine a damage estimate. Also, in certain aspects, the process discussed herein may allow a user to upload photos/video that fall into alternative and/or additional categories (e.g., photos for each vehicle part, etc.).
  • server 101 may ask the user to compare damage associated with the insured vehicle to damage depicted in a series of photos/video sent by server 101 .
  • server 101 may request that the user classify the type of damage associated with the insured vehicle. For instance, server 101 may ask the user questions such as, “Does the damage to your vehicle look more like the damage shown in photo A or photo B?” Server 101 may ask any number of questions until server 101 has reached a clear understanding of all the damage to the insured vehicle and a damage estimate can be calculated. In some ways, this process may allow the user to estimate the damage to the insured vehicle.
  • server 101 may transmit, to a user device, one or more images depicting various types of damage to the driver's side window and door of four-door sedans that have been previously analyzed and/or stored in memory.
  • the first image or images transmitted to the user device may be based on previously submitted information regarding an accident that caused the damage or any other type of input provided by a claimant and/or related parties.
  • the first image or images transmitted to the user device may not depict damage that precisely conforms to the damage of the sedan currently being analyzed. For instance, if two images are initially transmitted to the user device, one of the images may depict damage to the corner of the driver's side window and door and the other image may depict damage that is located closer to the center.
  • a user of the user device (e.g., a mobile phone), upon analyzing the two images, may select the image that depicts the centrally-located damage.
  • the mobile device may then transmit the selection to server 101 , and server 101 may use this information to generate a damage estimate.
  • both images initially transmitted from server 101 depict damage to the corner of the driver's side door and window in a four-door sedan.
  • the user may transmit a message to server 101 , stating how the reference images are equally unrepresentative.
  • server 101 may transmit another image or images responsive to the information provided by the user in the message.
  • the user may select one or more images that most closely depict damage to the sedan in question.
  • server 101 again transmits two images and that, in this instance, both images depict damage to four-door sedans with centrally-located damage to the driver's side door and window.
  • the user may choose the image that depicts damage with the severity level consistent with the damage to the sedan in question.
  • server 101 may, with each successive round, determine more precisely the damage associated with the sedan in question.
  • server 101 may use the various responses provided by the user device to calculate a damage estimate for the damage to the sedan and transmit a settlement based on the calculated estimate.
  • server 101 may transmit an insurance claim to a claims adjuster for manual processing of the claim if server 101 cannot calculate an accurate damage estimate after a predetermined number of question/answer rounds.
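  • A hedged sketch of these question/answer rounds follows; the reference records, their attributes, and the three-round limit are invented solely to illustrate how successive selections could narrow the match set before the claim is referred to an adjuster.

```python
# Narrow the damage match through successive user selections, with a fallback
# to a human claims adjuster after a fixed number of rounds.
REFERENCE_CASES = [
    {"id": 1, "location": "corner", "severity": "minor", "estimate": 450.0},
    {"id": 2, "location": "center", "severity": "minor", "estimate": 600.0},
    {"id": 3, "location": "center", "severity": "severe", "estimate": 1800.0},
]
MAX_ROUNDS = 3


def refine_estimate(answers: list[dict]) -> dict:
    """answers: one {attribute: value_chosen_by_user} per completed round."""
    candidates = REFERENCE_CASES
    for rounds_used, answer in enumerate(answers, start=1):
        candidates = [c for c in candidates
                      if all(c.get(k) == v for k, v in answer.items())]
        if len(candidates) == 1:
            return {"status": "estimated", "estimate": candidates[0]["estimate"]}
        if rounds_used >= MAX_ROUNDS:
            break
    return {"status": "refer to claims adjuster"}


# The user first indicates centrally-located damage, then severe damage.
print(refine_estimate([{"location": "center"}, {"severity": "severe"}]))
# -> {'status': 'estimated', 'estimate': 1800.0}
```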
  • the user may transmit audio (e.g., by speaking into the mobile device, etc.) and/or an audio file that includes a description of what happened to cause the damage to the vehicle (e.g., the specifics of an accident, etc.).
  • This audio/audio file may be translated into text and incorporated into the photos/video of damage and/or analyzed to determine if the damage matches any narrative description provided by the user.
  • the user may transmit a text file describing damage and/or an accident that caused the damage.
  • the user may capture and transmit the sound of the vehicle being started and/or the sound of the vehicle running to server 101 (e.g., to determine if a muffler associated with the damaged vehicle is broken, etc.).
  • server 101 may transmit a proposed settlement (e.g., cash compensation, etc.) for the assessed loss to the user of the mobile device in step 315 .
  • the user may notify server 101 whether or not the proposed settlement is acceptable in step 317 .
  • if the settlement terms are not acceptable, the process may move to step 319 where server 101 may transmit the settlement to a claims adjuster for manual processing. If the settlement terms are acceptable, the process may move to step 321 where server 101 may transfer any funds related to the assessed loss directly to a bank account associated with the user.
  • users may provide feedback designed to evaluate their experience through process 300 .
  • This feedback may be used to improve process 300 for future users and may involve the use of surveys, questionnaires, email, etc.
  • server 101 may determine and/or transmit supplemental adjustments to an initial damage/repair estimate. For instance, server 101 may determine that there is a 95% chance that repair option A must be performed, a 50% chance that additional repair option B must also be performed, and a 10% chance that additional repair option C must also be performed. When a repair shop examines the damage to a damaged vehicle and notices that there is less/additional damage, server 101 may use this information to revise an initial damage estimate with a supplemental adjustment to the initial estimate.
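  • Those likelihoods can also be rolled into an expected supplemental adjustment, as in the sketch below; the option costs and the initial estimate are illustrative numbers.

```python
# Expected-value view of the supplemental repair options described above.
def expected_supplement(options: list[tuple[str, float, float]]) -> float:
    """options: (label, probability, additional_cost) triples."""
    return sum(probability * cost for _, probability, cost in options)


supplemental_options = [
    ("repair option A", 0.95, 400.0),
    ("repair option B", 0.50, 250.0),
    ("repair option C", 0.10, 900.0),
]
initial_estimate = 1740.0
print(initial_estimate + expected_supplement(supplemental_options))  # -> 2335.0
```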
  • a claims adjuster may manually evaluate the damage and determine the likelihood of each of the supplemental adjustments.
  • server 101 may provide the user with a list of repair facilities for repairing the vehicle.
  • messages may be pushed to the mobile device of the user to identify where the vehicle is in the repair process (e.g., which step of the repair process is the current step, etc.). These messages may identify who is working on the vehicle and/or may include photos/video of the vehicle as it is being repaired. The messages may also identify when the repair process may be completed.
  • claims may be excluded from the automated process illustrated in FIG. 3 . These claims may include comprehensive claims, claims with injuries to any involved parties, claims involving non-drivable vehicles or air bag deployments, claims with loss descriptions that include undercarriage/mechanical damage, claims involving motorcycle and/or recreational vehicle (RV) losses, and claims involving users that already have an estimate for damage associated with an insured vehicle.
  • FIGS. 4-8B show various display screens displayed to a user of a mobile device in accordance with at least one aspect of the present disclosure.
  • FIG. 4 shows a series of initial display screens displayed when a user starts a damage assessment and claims processing application stored on a mobile device (e.g., network device 201 ) in accordance with at least one aspect of the present disclosure.
  • Screen 401 may be the initial screen that the user views upon starting the application.
  • Screen 401 may allow the user to enter a claim number to begin a damage assessment and claims processing method.
  • the claim number may be used to compare a damage estimate generated by analysis of photos submitted by the user to a damage estimate generated manually by a claims adjuster using more conventional claims adjustment techniques.
  • the mobile device may display screen 403 , where the user is presented with photo instructions that explain to the user the types of photos that should be taken.
  • Screen 403 may include instructions on taking photos of the entire insured vehicle, VIN door tag, and any damaged areas of the insured vehicle.
  • the mobile device may display screen 405 , which allows a user to select and start taking any of the types of photos listed in screen 403 (e.g., photos of the entire vehicle, VIN door tag, and/or damaged areas).
  • the “Submit Photos” button 405 a on screen 405 may be inactive until at least one photo of each type is taken by the user.
  • FIG. 5A and FIG. 5B show a first series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with at least one aspect of the present disclosure.
  • the user may select to take a photo of the entire vehicle, the VIN door tag, and/or the specific damaged area(s).
  • a user selects to take a photo of the entire vehicle.
  • screen 503 may allow the user to select the “Capture Photo” button 503 a to start the camera functionality within the mobile device, the “Adding Existing” button 503 b to choose a photo from the photo roll, and/or the “Cancel” button 503 c to cancel out of the previous command.
  • the mobile device may display screen 505 where instructions related to the current photo type (e.g., a wide view of the entire vehicle) may be overlaid on top of the camera.
  • the user may select the “OK” button 505 a on screen 505 to close the overlay and cause display of the camera screen 507 .
  • Camera screen 507 may include a camera shutter button 507 a (e.g., for taking a photo) and flash button 507 b (e.g., for turning the camera flash on/off).
  • the “Instructions” button 507 c on screen 507 may open the instructions overlay from screen 505
  • the “Done” button 507 d on screen 507 may save all photos that have been taken to a memory of the mobile device and may return the user to the main photos screen 501 .
  • the mobile device may display screen 509 to indicate that a photo is being taken. In some aspects, all buttons on screen 509 may be disabled after the user selects the shutter button 507 a.
  • FIG. 5C shows a second series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with at least one aspect of the present disclosure.
  • Screen 511 may allow a user to preview a photo that has been taken and take an appropriate action on this photo.
  • the user may select a “Discard” button 511 a to discard the photo or a “Use” button 511 b to use the photo for damage assessment and claims processing.
  • “Use” button 511 b the user may proceed to take other photos within the selected photo type.
  • the user may select the “Done” button 513 a on screen 513 .
  • the mobile device may display screen 515 , where thumbnail image(s) of the photo(s) that the user has already taken may be displayed in the corresponding categories.
  • FIG. 6A and FIG. 6B show a series of display screens displayed on a mobile device for enabling a user to delete photos that have already been taken in accordance with at least one aspect of the present disclosure.
  • Screen 601 displays thumbnails of all photos that have already been taken.
  • the mobile device may display screen 603 , where a series of buttons may be displayed, including an additional options button 603 a for displaying additional options associated with the current photo (e.g., email photo, use photo as wallpaper, etc.), a scroll to previous photo button 603 b for scrolling to the previously-viewed photo in the photo reel, a play photo reel button 603 c for sequentially displaying each photo in the photo reel, a scroll to next photo button 603 d for scrolling to the next photo in the reel, and a delete button 603 e for deleting the currently-viewed photo.
  • Screen 605 includes an action panel with a “Delete Photo” button 605 a for confirming that the currently-viewed photo is to be deleted and a “Cancel” button 605 b for cancelling deletion of the currently-viewed photo. If the user selects “Delete Photo” button 605 a , the currently-viewed photo is deleted and the next photo in the current category is displayed in screen 607 . If the user selects a back button 607 a on screen 607 , the user may back out to return to photos screen 609 . Screen 609 may display the remaining thumbnails stored in a memory of the mobile device, with the image that the user deleted in screen 605 removed from the list of thumbnails.
  • FIG. 7A and FIG. 7B show a series of display screens displayed on a mobile device for enabling a user to submit photos for review by an enhanced claims processing server 101 , in accordance with at least one aspect of the present disclosure.
  • Screen 701 may include a “Submit Photos” button 701 a for submitting photos to server 101 when all photos have been taken.
  • the mobile device may display screen 703 , which includes an action panel with the “Submit Photos” button 703 a for confirming that the captured photos are to be submitted to server 101 and a “Cancel” button 703 b for cancelling the submission.
  • the mobile device may display screen 705 where an upload progress bar may indicate the progress of the photo upload. Once the photos have been fully uploaded, the mobile device may display screen 707 , which indicates that the photos have been uploaded and explains any next steps that should be taken.
  • FIG. 8A and FIG. 8B show a series of display screens displayed on a mobile device for enabling a user to receive feedback from an enhanced claims processing server 101 regarding previously submitted photos, in accordance with at least one aspect of the present disclosure.
  • server 101 may transmit a notification to the mobile device that feedback is ready for review.
  • screen 801 which includes a notification that feedback is ready, may be displayed.
  • the mobile device may display screen 803 , which may include a detailed description of any feedback received from server 101 .
  • server 101 has transmitted a message that asks the user to take additional photos (e.g., of the damage to the left side of a bumper).
  • Screen 803 may also include a “Take Photos” button 803 a which may allow the user to take additional photos of the damaged vehicle.
  • the mobile device may display screen 805 which allows the user to take more photos of the damaged vehicle (e.g., in response to the feedback received in screen 803 ) using the same process depicted in FIG. 5A and FIG. 5C .
  • the user may press the “Submit Photos” button 807 a in screen 807 to submit the photos taken via screen 805 to enhanced claims processing server 101 .
  • the mobile device may display screen 809 , which includes a progress bar that shows the progress of the photo upload to server 101 .

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Engineering & Computer Science (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Systems and methods provide for an automated system for analyzing damage and processing claims associated with an insured item, such as a vehicle. An enhanced claims processing server may analyze damage associated with the insured item using photos/video transmitted to the server from a user device (e.g., a mobile device). The mobile device may receive feedback from the server regarding the acceptability of submitted photos/video, and if the server determines that any of the submitted photos/video is unacceptable, the mobile device may capture additional photos/video until all of the data are deemed acceptable. To aid in damage analysis, the server may also interface with various internal and external databases storing reference images of undamaged items and cost estimate information for repairing previously analyzed damages to similar items. Further still, the server may generate a payment for compensating a claimant for repair of the insured item.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 13/892,598, filed May 13, 2013, which is a divisional of U.S. application Ser. No. 13/587,635, filed Aug. 16, 2012, which is related to U.S. application Ser. No. 13/587,620, filed Aug. 16, 2012, and U.S. application Ser. No. 13/587,630, filed Aug. 16, 2012. All of the aforementioned applications are herein incorporated by reference in their entirety.
TECHNICAL FIELD
The present disclosure relates to systems and methods for analyzing damage to an insured item such as a vehicle and processing an insurance claim related to the analyzed damage.
BACKGROUND
Conventional insurance claims processing is a complex process that starts with a first notification of loss related to an insured item. Upon notification of loss, the claim may be routed to multiple claims adjusters that analyze different aspects of the damage associated with the insured item in order to determine whether compensation for the loss is appropriate.
In general, conventional claims adjustment can involve paperwork processing, telephone calls, and potentially face-to-face meetings between claimant and adjuster. In addition, a significant amount of time can elapse between a first notice of loss from the claimant and the final settlement of the claim.
SUMMARY
The following presents a simplified summary of the present disclosure in order to provide a basic understanding of some aspects of the disclosure. This summary is not an extensive overview of the disclosure. It is not intended to identify key or critical elements of the disclosure or to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the more detailed description provided below.
Aspects of the disclosure involve a streamlined and efficient process for claims management and disclose methods, computer-readable media, and apparatuses for automating the processing and settling of claims related to an insured item. A mobile device may transmit data (e.g., images, video, etc.) related to damage associated with an insured item to an enhanced claims processing server. The enhanced claims processing server may manage analysis of damage associated with the insured item and settlement of a claim related to the damage.
In another aspect of the disclosure, an enhanced claims processing server may analyze damage data received from a mobile device to generate a repair cost estimate for repairing the insured item.
Further aspects of the disclosure may be provided in a computer-readable medium having computer-executable instructions that, when executed, cause a computer, user terminal, or other apparatus to at least perform one or more of the processes described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
All descriptions are exemplary and explanatory only and are not intended to restrict the disclosure, as claimed. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure. In the drawings:
FIG. 1 shows an illustrative operating environment in which various aspects of the disclosure may be implemented.
FIG. 2 shows a system of network devices and servers that may be used to implement the processes and functions of certain aspects of the present disclosure.
FIG. 3 shows a flow chart for an automated damage assessment process in accordance with certain aspects of the present disclosure.
FIG. 4 shows a series of initial display screens displayed when a user starts a damage assessment and claims processing application stored on a mobile device in accordance with certain aspects of the present disclosure.
FIG. 5A and FIG. 5B show a first series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with certain aspects of the present disclosure.
FIG. 5C shows a second series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with certain aspects of the present disclosure.
FIG. 6A and FIG. 6B show a series of display screens displayed on a mobile device for enabling a user to delete photos that have already been taken in accordance with certain aspects of the present disclosure.
FIG. 7A and FIG. 7B show a series of display screens displayed on a mobile device for enabling a user to submit photos for review by an enhanced claims processing server, in accordance with certain aspects of the present disclosure.
FIG. 8A and FIG. 8B show a series of display screens displayed on a mobile device for enabling a user to receive feedback from an enhanced claims processing server regarding previously submitted photos, in accordance with certain aspects of the present disclosure.
DETAILED DESCRIPTION
In accordance with various aspects of the disclosure, methods, computer-readable media, and apparatuses are disclosed through which insurance claims may be settled through an enhanced automated process. In certain aspects, when an enhanced claims processing server receives data regarding an insured item (e.g., a vehicle, etc.) from a computing device (e.g., a mobile device), the server processes the data and manages settlement of a claim associated with the insured item.
The automated process may utilize various hardware components (e.g., processors, communication servers, memory devices, sensors, etc.) and related computer algorithms to generate image data related to damage associated with an insured item, determine if the image data conforms to a predetermined set of criteria, analyze the image data to assess loss associated with the insured item, and determine if a payment is appropriate to the claimant as compensation for assessed loss.
FIG. 1 illustrates a block diagram of an enhanced claims processing server 101 (e.g., a computer server) in communication system 100 that may be used according to an illustrative embodiment of the disclosure. The server 101 may have a processor 103 for controlling overall operation of the enhanced claims processing server 101 and its associated components, including RAM 105, ROM 107, input/output module 109, and memory 115.
I/O 109 may include a microphone, keypad, touch screen, and/or stylus through which a user of enhanced claims processing server 101 may provide input, and may also include one or more of a speaker for providing audio output and a video display device for providing textual, audiovisual, and/or graphical output. Software may be stored within memory 115 to provide instructions to processor 103 for enabling device 101 to perform various functions. For example, memory 115 may store software used by the device 101, such as an operating system 117, application programs 119, and an associated database 121. Processor 103 and its associated components may allow the device 101 to run a series of computer-readable instructions to analyze image data depicting damage to an insured item (e.g., vehicle, etc.). Processor 103 may determine the general location of damage associated with the vehicle by analyzing images of the vehicle and comparing these images with reference images of a similar vehicle with no damage or with similar damage. In addition, processor 103 may assess the loss associated with the damaged vehicle and transmit terms for settling an insurance claim related to the loss to a user of a mobile device.
The server 101 may operate in a networked environment supporting connections to one or more remote computers, such as terminals 141 and 151. The terminals 141 and 151 may be personal computers or servers that include many or all of the elements described above relative to the server 101. Also, terminal 141 and/or 151 may be data stores for storing image data of insured items that have been analyzed by the enhanced claims processing server 101 in the past. In yet other embodiments, terminals 141 and 151 may represent mobile devices with built-in cameras for capturing image data associated with a damaged item.
The network connections depicted in FIG. 1 include a local area network (LAN) 125 and a wide area network (WAN) 129, but may also include other networks. When used in a LAN networking environment, the server 101 is connected to the LAN 125 through a network interface or adapter 123. When used in a WAN networking environment, the server 101 may include a modem 127 or other means for establishing communications over the WAN 129, such as the Internet 131. It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computers may be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed.
Additionally, an application program 119 used by the enhanced claims processing server 101 according to an illustrative embodiment of the disclosure may include computer executable instructions for invoking functionality related to calculating an appropriate payment for assessed damage associated with an insured item.
Enhanced claims processing server 101 and/or terminals 141 or 151 may also be mobile terminals including various other components, such as a battery, speaker, camera, and antennas (not shown).
The disclosure is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the disclosure include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices, and the like.
The disclosure may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The disclosure may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including non-transitory memory storage devices, such as a hard disk, random access memory (RAM), and read only memory (ROM).
Referring to FIG. 2, a system 200 for implementing methods according to the present disclosure is shown. As illustrated, system 200 may include one or more network devices 201. Devices 201 may be local or remote, and are connected by one or more communications links 202 to computer network 203 that is linked via communications links 205 to enhanced claims processing server 101. In certain embodiments, network devices 201 may run different algorithms used by server 101 for analyzing image data showing damage associated with an insured item, or, in other embodiments, network devices 201 may be data stores for storing reference image data of insured items. In yet other embodiments, network devices 201 may represent mobile user devices configured to capture image data (e.g., via a camera, etc.) associated with a damaged insured item and to transmit the image data to server 101. In system 200, enhanced claims processing server 101 may be any suitable server, processor, computer, or data processing device, or combination of the same.
Computer network 203 may be any suitable computer network including the Internet, an intranet, a wide-area network (WAN), a local-area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), or any combination of any of the same. Communications links 202 and 205 may be any communications links suitable for communicating between network devices 201 and server 101, such as network links, dial-up links, wireless links, hard-wired links, etc.
The steps that follow in the Figures may be implemented by one or more of the components in FIGS. 1 and 2 and/or other components, including other computing devices.
In accordance with aspects of the disclosure, a user (e.g., a claimant) of a mobile device (e.g., mobile phone, personal digital assistant (PDA), etc.) may take a variety of photos associated with damage to an insured vehicle. The photos may include wide shots of the damaged vehicle, pictures of an identification number associated with the damaged vehicle (e.g., a vehicle identification number (VIN), etc.), and/or multiple angles/close-up shots of the damage associated with the insured vehicle.
Once the user is satisfied that the appropriate photos have been taken, the user may transmit the photos to an enhanced claims processing server 101. The enhanced claims processing server 101 may be configured to receive and analyze the photos to determine if they meet a predefined set of criteria (e.g., not too blurry, correct angles, etc.) for completeness, accuracy, etc. If the photos do not meet the minimum criteria, server 101 may transmit a message (e.g., via a feedback loop), informing the mobile device that alternative and/or additional photos must be taken. This process of assuring that the photos are compliant for further analysis may be repeated until the user of device 201 has complied with all of the rules set forth by enhanced claims processing server 101. Server 101 may then analyze the photos to generate an output, including a cost estimate to repair the damage associated with the insured vehicle and/or to replace a damaged part of the insured vehicle. In some aspects, to generate this output, server 101 may analyze the photos and determine the location of damage (e.g., exterior parts, etc.), extent of damage, and/or the cost of parts/labor to fix the damage.
In some instances, depending on the amount of damage to the insured vehicle, the cost estimate may represent the cost of replacing the insured vehicle itself. Along with the cost estimate for repair/replacement of the insured vehicle, server 101 may also output various claims documents, including disclosures, brochures, guarantees, etc. If appropriate, server 101 may transmit a payment to the user and/or to an account associated with the user, for the cost of repairing the damage or replacing a part. In addition, server 101 may inform the user approximately how long it will take to repair/replace the insured vehicle.
In some aspects, damage inspection and appraisal in the automated claims processing scheme discussed herein may be completed in thirty minutes or less.
Although embodiments of the disclosure discussed herein relate to an insured vehicle analyzed by enhanced claims processing server 101, one of ordinary skill in the art would recognize that other types of insured items, including homes, may be employed with a similar scheme.
In certain aspects, the use of server 101 may aid in cutting down time between a first notice of loss and settlement of the claim (e.g., real-time settlement of a claim) associated with the loss (e.g., via a payment and/or information regarding repair/replacement of an insured item). In addition, because the methods discussed herein are automated and involve minimal and/or no involvement from claims adjusters, less time and money may be spent to transport these adjusters to inspection locations. The automated nature of this process may also create the opportunity for remote human inspections of damage associated with insured items.
Also, the technologies used in the claims adjustment processes implemented by server 101 may aid in attracting technology savvy consumers to an entity (e.g., an insurance company) managing server 101.
FIG. 3 shows an automated damage assessment process 300 in accordance with at least one aspect of the present disclosure. In certain aspects, an application related to damage assessment and claims processing may be downloaded onto a mobile device (e.g., iPhone™, Android™, etc.) associated with a user (e.g., a customer of an insurance company) to facilitate one or more steps of the process in FIG. 3.
The process of FIG. 3 may start out at step 301 where a user (e.g., a customer) associated with an entity managing enhanced claims processing server 101 (e.g., insurance company) may enter a claim number (e.g., a number related to damage associated with an insured vehicle, etc.) into a damage assessment and claims processing application running on a mobile device (e.g., network device 201). To generate a claim number, a claimant may contact an entity managing enhanced claims processing server 101 (e.g., an insurance company, etc.) with a first notice of loss (FNOL). The claimant may contact the insurance company in any number of ways, including via phone, by email, via a company web site, etc. As part of the FNOL, the claimant may provide basic identifying and/or validating information (e.g., name, age, claim number, etc.) and vehicle information, including the make, model, and year of manufacture. The claimant may also provide the general areas of damage to the vehicle and any other relevant details (e.g., condition of glass, under carriage, engine, wheels, airbags, etc. associated with the vehicle). In one embodiment, this information may be provided from a remote location (e.g., location of an accident, claimant's home, etc.) using an application loaded onto a smart phone (e.g., iPhone™, Android™, etc.).
The mobile device may then transmit the entered claim number and related information to enhanced claims processing server 101. The process may then move to step 303 where server 101 may determine if the claim number received in step 301 is valid. If server 101 determines that the claim number is not valid, then server 101 may transmit a message to the mobile device, stating that the claim number is invalid in step 305. The user may then enter another claim number (step 301).
If server 101 determines that the claim number is valid, the process may move to step 307 where server 101 may send the user instructions on the types of image data (e.g., photos, video, etc.) that should be captured of damage associated with the insured vehicle. It should also be noted that in some embodiments server 101 may proceed to provide these instructions without first receiving a claim number. The user may receive instructions on various types of photos/video, including photos/video of the entire vehicle, VIN door tag, and/or the damaged areas. In some aspects, the user may capture image data related to at least two different angles of the damage for each panel (e.g., hood, fender, door, bumper, etc.) based on an initial claim description.
When the user of the mobile device receives these instructions, the user may use a camera associated with the mobile device to take the photos and transmit these photos to the server 101. The user may be allowed to preview each photo before selecting the image. Once a photo has been selected, the image may be shown on a display associated with the mobile device under a photo type (e.g., a photo of the entire vehicle, VIN door tag, and/or damaged area). If the user is not satisfied with any photo, the user may delete the photo by selecting it. In some aspects, the user may annotate the photos (e.g., by drawing a line from one end of the dent to the other, etc.) prior to transmitting them to server 101. In yet other embodiments, server 101 may itself annotate any received photos/video.
In some embodiments, any approved photo may not be sent to server 101 until all of the images have been captured. In some aspects, server 101 may support a website interface through which photos may be uploaded by a user of a mobile device. Also, the use of multiple photos (e.g., via stereoscopic techniques), video (e.g., by walking around the vehicle to generate a complete view), and/or three-dimensional photos/video may assist in determining the depth of damage to a vehicle. In some aspects, determining the depth of damage may help in classifying the damage (e.g., a turbulent dent versus a dish dent). In addition, the degree of damage by area and depth may be automatically estimated through tools similar to ultrasound tools. Knowing the depth of damage may also assist in automatically determining the cost of repair or replacement. In addition, as the user is taking video/photos of damage associated with the insured vehicle, a claims adjuster associated with an entity managing server 101 (e.g., an insurance company) may interface with the user in real-time (e.g., via phone, email, etc.) as the photos are being sent to the adjuster and/or as the video is being streamed to the adjuster and describe to the user the photos/video that still need to be taken and/or where to place a camera as the photos/video are captured.
After server 101 receives image data transmitted from a mobile device in step 309, server 101 (or an individual/group associated with the entity managing server 101) may determine if the photos are acceptable in step 311. For instance, server 101 may determine that the photos are too blurry and/or that the photos do not capture the correct angles to clearly show damage associated with the insured vehicle. As an example, server 101 may employ a bar code scanning mechanism and/or an optical character recognition (OCR) system for detecting the VIN from a submitted photo. In other aspects, the mobile device itself may use a bar code scanning mechanism and/or an OCR system for determining the VIN. In this example, if the VIN cannot be detected from the photo and/or using these techniques, then the submitted photo may be deemed to be unacceptable. If server 101 determines that the photos are not acceptable, the process may move back to step 307 where server 101 may send the user instructions on what types of photos to take and/or what changes need to be made to the previously submitted photos. In yet other embodiments, a dispatcher associated with an entity managing server 101 (e.g., an insurance company) may determine if submitted photos are acceptable. In other embodiments, the mobile device may itself determine if any given photo is blurry and/or inaccurate and prompt the user to retake the photo. In this aspect, the application for damage assessment and claims processing running on the mobile device may have computer-executable instructions stored within a memory of the mobile device for automatically detecting and/or rejecting a photo/video captured within a given category.
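Purely by way of illustration, and not as a description of any particular embodiment, the following sketch shows one way such an acceptability check might be implemented on the server or on the mobile device, assuming the OpenCV and pytesseract libraries are available; the blur threshold, the helper names, and the shape of the feedback message are hypothetical choices made for this example.

```python
import re

import cv2
import pytesseract

BLUR_THRESHOLD = 100.0  # illustrative focus-measure cutoff; would be tuned on real submissions
VIN_PATTERN = re.compile(r"[A-HJ-NPR-Z0-9]{17}")  # 17-character VINs omit I, O, and Q

def is_sharp(image_path, threshold=BLUR_THRESHOLD):
    """Flag blurry photos using the variance of the Laplacian as a focus measure."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= threshold

def extract_vin(image_path):
    """Attempt to read a VIN from a door-tag photo via OCR; return None if none is found."""
    text = pytesseract.image_to_string(cv2.imread(image_path))
    match = VIN_PATTERN.search(text.replace(" ", "").upper())
    return match.group(0) if match else None

def photo_feedback(photo_paths, vin_photo_path):
    """Assemble feedback of the kind that could be sent back for resubmission."""
    retake = [p for p in photo_paths if not is_sharp(p)]
    if extract_vin(vin_photo_path) is None:
        retake.append(vin_photo_path)
    return {"acceptable": not retake, "retake": retake}
```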
If server 101 determines that the photos are acceptable, server 101 may attach the photos to the user's claim in a database associated with server 101. Server 101 may also determine a damage estimate (e.g., an estimate for repairing and/or replacing any damaged parts) after analyzing the photos in step 313 based on predefined rules. The damage estimate may be generated by comparing the photos submitted by the mobile device with photos of similarly damaged vehicles or with photos of non-damaged vehicles of similar make/model. To perform this comparison, server 101 may access a database (e.g., network device 201) of photos of vehicles with various types of damage and/or vehicles with no damage. To initially populate the database with photos for later use, each user may be required to upload various photos of a vehicle upon purchase of the vehicle. Also, as server 101 analyzes recently submitted photos, previously uploaded photos of a given vehicle may be used to determine any pre-existing damage on the vehicle. Once database 201 includes photos/video from many cases of vehicle damage, server 101 may determine a damage estimate for a new case based on the prior cases.
Server 101 may not need to build a new damage estimate piece-by-piece for a given damaged vehicle. In this regard, server 101 (or an individual/group associated with the entity managing server 101) may generate a new damage estimate based on a holistic view of a damaged vehicle. Over time, server 101 may build a database (e.g., network device 201) of specific damage templates (e.g., damages to more than one part of a vehicle that are commonly associated with one another) and estimated/actual costs for repairing damages associated with these templates. Once this database has been built, damage estimates associated with subsequently analyzed vehicles may be generated from a holistic view of the vehicles by accessing information within the historical database.
For instance, if a first type of damage to the front bumper of a vehicle is commonly associated with a second type of damage to the headlights of the same vehicle and this damage template is associated with a predetermined repair cost in the database, server 101 may use this repair cost to generate a new estimate for subsequent vehicles that exhibit damage similar to this damage template.
In one example, the damage estimates retrieved from the historical database may be adjusted based on differences associated with a current case. For instance, the damage estimate may be adjusted based on the average inflation rate (e.g., for parts, labor, etc.) between the date at which the damage estimate within the historical database was generated and the current date. In other embodiments, the damage estimate may be adjusted for small differences such as the make, model, and year of manufacture when the vehicle in the historical database and the currently analyzed vehicle are compared. Similarly, the damage estimate may be adjusted based on differences in the precise damage associated with the vehicle in the historical database and the damage associated with the vehicle currently being analyzed. In yet other examples, the damage estimate may be adjusted based on the terms of an insurance policy that covers damage to the insured vehicle currently being analyzed. One of ordinary skill in the art would understand that any number of factors may be considered when adjusting the damage estimate retrieved for vehicles stored in the historical database to more accurately reflect a damage estimate for a currently analyzed vehicle.
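As a non-limiting illustration of this kind of adjustment, the sketch below scales a stored estimate for inflation and for small differences between the historical case and the vehicle currently being analyzed; the adjustment factors and the function name are assumptions made for the example rather than values drawn from the disclosure.

```python
def adjust_historical_estimate(base_cost, years_elapsed, annual_inflation=0.03,
                               model_year_delta=0, severity_factor=1.0):
    """Scale a repair estimate retrieved from the historical database to a current claim.

    annual_inflation, the per-model-year correction, and severity_factor are
    illustrative adjustment knobs only.
    """
    cost = base_cost * (1.0 + annual_inflation) ** years_elapsed  # inflation for parts/labor
    cost *= 1.0 - 0.01 * model_year_delta                         # small make/model-year correction
    cost *= severity_factor                                       # damage somewhat worse or better than the template
    return round(cost, 2)

# e.g., a $1,200 template estimate generated two years ago, with slightly less severe damage
print(adjust_historical_estimate(1200.00, years_elapsed=2, severity_factor=0.9))
```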
In other aspects, when a vehicle exhibits more than one type of damage, server 101 may access the historical database multiple times (one for each type of damage) and then add one or more interaction terms to the sum of the cost estimates for each type of damage. For instance, extending the example above of damage to a front bumper and to the headlights of a vehicle, server 101 may generate a first damage estimate for repairing the front bumper and a second damage estimate for repairing the headlights. Server 101 may then add these two damage estimates to generate a total damage estimate.
In this embodiment, server 101 may also calculate an interaction term (which may be a positive or a negative value) that represents either an increased (e.g., because the damages taken collectively introduce more complexity and are thus more expensive to repair than if handled individually) or decreased (e.g., because the damages taken collectively have overlapping repair procedures and are thus less expensive to repair than if handled individually) cost of repairing the vehicle when both of these types of damages occur together. The effective total damage estimate may then be the sum of the total damage estimate and the interaction term.
One of ordinary skill in the art would understand that a given damage template may be built based on any number of specific damage types/locations. In addition, server 101 may generate any number of interaction terms for a given analysis. For instance, if a damage estimate is based on damage to three parts of a vehicle, server 101 may generate interaction terms that relate to increased/decreased cost associated with repair to the following part groups: the first two parts, the first and third parts, the second and third parts, and all three parts at once. In other embodiments, server 101 may generate an interaction term for only some of the damaged parts.
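The combination of per-part estimates and interaction terms described above can be expressed in a brief sketch; the dictionary layout, the part names, and the dollar amounts are placeholders chosen for illustration only.

```python
def effective_total_estimate(part_estimates, interaction_terms):
    """Sum per-part repair estimates and apply interaction terms for part groups.

    part_estimates: e.g. {"front_bumper": 850.0, "headlights": 400.0}
    interaction_terms: e.g. {("front_bumper", "headlights"): -120.0}
        A negative term models overlapping repair procedures (cheaper together);
        a positive term models added complexity (more expensive together).
    """
    total = sum(part_estimates.values())
    for group, term in interaction_terms.items():
        if all(part in part_estimates for part in group):
            total += term
    return total

estimates = {"front_bumper": 850.0, "headlights": 400.0}
interactions = {("front_bumper", "headlights"): -120.0}
print(effective_total_estimate(estimates, interactions))  # 1130.0
```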
In certain aspects, server 101 may also query the claimant with regards to the type of third party service provider (e.g., repair shop, etc.) they would prefer after damage analysis and claims processing is complete.
In other aspects, exterior damage associated with the vehicle may be used to predict (e.g., via predictive modeling using the database of past assessed exterior/interior damage for other similar cases, etc.) the likelihood of interior (e.g., mechanical, cabin, etc.) damage to the vehicle and/or potential difficulties in repairing the vehicle.
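One non-limiting way to express such a predictive model is sketched below using a logistic regression from scikit-learn; the feature set, the handful of training rows, and the query are illustrative placeholders rather than actual claims data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative exterior-damage features per prior claim:
# [dent_area_cm2, dent_depth_mm, damage_near_wheel_well (0/1)]
X_hist = np.array([[120, 5, 0], [900, 40, 1], [300, 15, 1], [60, 3, 0]])
y_hist = np.array([0, 1, 1, 0])  # 1 = interior/mechanical damage was later found

model = LogisticRegression().fit(X_hist, y_hist)
p_interior = model.predict_proba(np.array([[450, 22, 1]]))[0, 1]
print(f"Estimated probability of hidden interior damage: {p_interior:.0%}")
```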
In analyzing the image data, server 101 may execute computer-executable instructions to recognize the extent of damage to various parts of the vehicle (e.g., chassis, etc.), including various types of dents and edge damage, and to identify the individual parts of the vehicle.
In some aspects, the detection of damage to the vehicle may be based on object recognition algorithms that compare images (e.g., comparing x, y, and z coordinates of each point on the images) of the vehicle in question to reference images of similar vehicles (e.g., same model, make, year of manufacture, etc.) with no damage. More specifically, server 101 may access a database of images storing the reference images of vehicles of various models and makes. By using object recognition/edge detection algorithms (e.g., involving blur filters, gray-scaling, custom algorithms, etc.), server 101 may determine where damage is located as well as the potential size/area of the damage. Server 101 may also access internal/external databases storing images, damage depth map information (e.g., from previously assessed analyses, etc.), and/or processed claims reports from damaged vehicles that server 101 has assessed previously. In particular, server 101 may access images/depth map information from previously assessed damaged vehicles for use as a guidepost in assessing the damage of a new vehicle. If no reference information (e.g., data, images) exists, axis symmetry information may also be used to identify possible irregularities and/or damage.
In some aspects, the algorithm employed by server 101 may use a comparison of an image of a damaged vehicle with an image of an undamaged version of the same vehicle to “subtract out” and isolate the damaged area of a vehicle. If an exact replica of an undamaged vehicle corresponding to a damaged vehicle under study is not available for this comparison, server 101 may further use various additional image processing algorithms, including blurring filters, etc. to detect a damaged portion of a vehicle.
In additional aspects, server 101 may grayscale all image data to make processing faster. Further, edge filters may be applied to both the image data from the damaged vehicle and its corresponding reference image data so that the edges of a damaged area may be “subtracted out” and identified in the image data of the damaged vehicle. Once the damaged area has been identified in the image data, server 101 may further process the damaged area to sharpen the area, to make the edges more prominent, and to fill any missing links found in the edges. Afterwards, server 101 may color, texture, and/or otherwise “fill in” the damaged area surrounded by the edges and extract the damaged area from the surrounding image data. Once the damaged area has been isolated, server 101 may calculate the precise area of the damage.
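A minimal sketch of this grayscale, edge-filter, and subtraction pipeline, assuming OpenCV and assuming the damaged image and the reference image were captured from comparable poses, might look as follows; the blur kernel, Canny thresholds, and morphology parameters are illustrative choices rather than prescribed values.

```python
import cv2

def isolate_damage_area(damaged_path, reference_path):
    """Difference the edge maps of a damaged photo and an undamaged reference photo
    to localize the damaged region and return its area in pixels."""
    damaged = cv2.cvtColor(cv2.imread(damaged_path), cv2.COLOR_BGR2GRAY)
    reference = cv2.cvtColor(cv2.imread(reference_path), cv2.COLOR_BGR2GRAY)
    reference = cv2.resize(reference, (damaged.shape[1], damaged.shape[0]))

    # Blur, then edge-filter both images so the reference edges can be "subtracted out"
    damaged_edges = cv2.Canny(cv2.GaussianBlur(damaged, (5, 5), 0), 50, 150)
    reference_edges = cv2.Canny(cv2.GaussianBlur(reference, (5, 5), 0), 50, 150)
    diff = cv2.subtract(damaged_edges, reference_edges)

    # Close gaps in the remaining edges and measure the largest enclosed region
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    closed = cv2.morphologyEx(diff, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max((cv2.contourArea(c) for c in contours), default=0.0)
```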
Similarly, server 101 may determine the depth of a damaged area (e.g., via stereoscopic methods, etc.) and may analyze raw depth data to further investigate points of interest (e.g., a point that has a much larger depth than surrounding points, etc.). Using this analysis, the damaged area may be further characterized (e.g., a dented area may be detected and if, for example, the general slope of the dent is high, the dent may be characterized as deep and rounded whereas if the slope is low, the dent may be characterized as shallow.)
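By way of illustration, and assuming a per-pixel depth map has already been derived (e.g., stereoscopically), the characterization of a dented region might be sketched as follows; the slope threshold that separates deep, rounded dents from shallow ones is an arbitrary placeholder.

```python
import numpy as np

def characterize_dent(depth_map_mm, slope_threshold=0.5):
    """Summarize a dent from a per-pixel depth map (millimeters relative to the panel surface).

    A high average gradient suggests a deep, rounded dent; a low gradient suggests
    a shallow dent. slope_threshold is illustrative only.
    """
    gy, gx = np.gradient(depth_map_mm.astype(float))
    mean_slope = float(np.hypot(gx, gy).mean())
    return {
        "max_depth_mm": float(depth_map_mm.max()),
        "mean_slope": mean_slope,
        "classification": "deep and rounded" if mean_slope > slope_threshold else "shallow",
    }
```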
In addition, if the server 101 retrieves image data or claims reports associated with a similar or the same previously analyzed vehicle that has similar or the same types of damage (e.g., as a result of a similar accident to a similar vehicle or part, etc.) as a vehicle currently being analyzed, server 101 may use a damage analysis or cost estimate of identifying/repairing the damage or replacing a damaged part of the previously analyzed vehicle to generate a damage analysis/cost estimate for the currently analyzed vehicle. In other words, server 101 may perform one or more database queries to match characteristics of the current analysis with previous analyses. For instance, the queries may seek to match the size, depth, and location of a dent on a current vehicle with a similar dent on a vehicle with a similar chassis configuration, make, model, and year of manufacture. For instance, consider a case where the vehicle in question is a new model that has not been analyzed before by server 101. In this scenario, server 101 may attempt to match the vehicle currently being analyzed with its closest match, which in this case may be a similar model from the previous year with the same chassis configuration (e.g., a twin chassis configuration).
In matching a vehicle currently being analyzed with one that has been previously analyzed, server 101 may assign a confidence factor to the match. Server 101 may assign the highest confidence factor (e.g., a confidence factor of 100%) to a comparison between the exact same types of vehicles (e.g., cars of the same make, model, year of manufacture, etc.) having the exact same type of damage (e.g., a predetermined type of dent, etc.). For instance, a comparison between vehicles with two completely different types of damage would have a confidence factor of 0%. As the similarities between the currently analyzed vehicle and previously analyzed vehicles are reduced, server 101 may assign a lower confidence factor to the comparison. For instance, output drawn from comparisons between vehicles of the same make and model but with different years of manufacture may be associated with a slightly lower confidence factor than 100%. In some aspects, confidence factors may decrease further when vehicles of different models and years of manufacture (e.g., vehicles with different chassis configurations, trim line configurations, etc.) but the same make are compared. In one embodiment, server 101 may assign a threshold confidence factor (e.g., 70%, etc.) below which output generated by a comparison performed by server 101 may not be considered reliable. If the confidence factor associated with a comparison between two vehicles falls below this threshold and there is no reliable comparison within the database, server 101 may then use physical details of the damage (e.g., size, location, area, etc.) to provide output such as a cost estimate for damage repair/replacement and/or the amount of time required for repair/replacement.
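A simplified, non-limiting sketch of such a confidence calculation appears below; the attribute weights are invented for the example, while the 70% cutoff mirrors the illustrative threshold mentioned above.

```python
def comparison_confidence(current, candidate, threshold=0.70):
    """Score how reliable a match against a previously analyzed vehicle is.

    The weights below are arbitrary illustrative values; a production system
    would calibrate them against past estimating accuracy.
    """
    weights = {"make": 0.25, "model": 0.25, "year": 0.15,
               "chassis": 0.15, "damage_type": 0.20}
    score = round(sum(w for attr, w in weights.items()
                      if current.get(attr) == candidate.get(attr)), 2)
    return {"confidence": score, "reliable": score >= threshold}

current = {"make": "A", "model": "B", "year": 2012, "chassis": "twin", "damage_type": "dish dent"}
candidate = {"make": "A", "model": "B", "year": 2011, "chassis": "twin", "damage_type": "dish dent"}
print(comparison_confidence(current, candidate))  # {'confidence': 0.85, 'reliable': True}
```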
Server 101 may also use stored data to determine appropriate vendors for repairing/replacing the vehicle and the amount of time for repair/replacement. The wait time for repair/replacement may depend on various factors, including the size (e.g., area, depth, etc.), classification (e.g., turbulent dent, etc.), and location of the damage.
In addition, server 101 may determine if parts nearby to damaged parts may also need to be blended into the damaged area. In other words, if a part of the vehicle needs to be refinished (e.g., repainted) either because it is being replaced or repaired, parts within a predetermined distance of the repaired/replaced part may need to be blended (e.g., color-matched) to the repaired/replaced part.
In some aspects, server 101 may acquire the knowledge of all previous claims processed by server 101, as well as the knowledge of human adjusters, to accurately process future claims. In this way, server 101 may use machine learning to evolve its cost and/or repair estimation procedure based on past experience.
To estimate the cost and repair/replacement time associated with the damage to the vehicle and to determine whether to recommend that the vehicle be replaced or repaired, server 101 may also consider the extent/severity of the damage (area, depth, location, classification, etc.). For instance, damage to a character line (e.g., edge of a door associated with the vehicle) would be more difficult (e.g., more expensive and/or more time-consuming, etc.) to repair than damage to a more central location on the vehicle. Server 101 may also consider the actual cash value and the salvage value of the vehicle and any relevant local, state, and national laws in this analysis. In some aspects, server 101 may generate a rough cost estimate of repairing the damage just based on the extent of the damage; then server 101 may refine this estimate by analyzing previous cost estimates provided by server 101 and/or actual repair data received from third party service providers (e.g., repair shops, etc.) that have repaired similar vehicles with similar damage. In additional aspects, server 101 may generate a basic cost estimate by taking into account factors such as the number of hours predicted for the repair, the labor rate, and the current market conditions. In this aspect, server 101 may compare this basic cost estimate with the cost of merely replacing the vehicle (e.g., a total loss) or the damaged part within the vehicle and based on the comparison, server 101 may recommend the cheaper option. These estimates may also be transmitted to existing platforms (e.g., Audatex®, Mitchell®, etc.) for comparison purposes.
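For illustration only, the basic repair-versus-replacement comparison described above might be sketched as follows; the labor hours, rates, and valuation figures in the usage example are placeholders, and market_factor is an assumed multiplier for current market conditions.

```python
def repair_or_replace(labor_hours, labor_rate, parts_cost,
                      actual_cash_value, salvage_value, market_factor=1.0):
    """Compare a basic repair estimate with the net cost of treating the vehicle
    as a total loss, and recommend the cheaper option."""
    repair_cost = labor_hours * labor_rate * market_factor + parts_cost
    replacement_cost = actual_cash_value - salvage_value
    return {
        "repair_cost": repair_cost,
        "replacement_cost": replacement_cost,
        "recommendation": "repair" if repair_cost <= replacement_cost else "total loss",
    }

print(repair_or_replace(labor_hours=12, labor_rate=55.0, parts_cost=600.0,
                        actual_cash_value=9500.0, salvage_value=2500.0))
```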
If the analyzed damage to the vehicle is different from the damage indicated by the claimant during the FNOL, server 101 may query the claimant as to the discrepancy. For instance, if the claimant initially provided information relating to damage on the left side of the vehicle but server 101 discovers that the primary damage occurred on the right side, server 101 may question the claimant as to when the damage occurred (e.g., was the damage due to a previous incident or preexisting condition?, is the claimant being truthful?, etc.). Server 101 may also ask the claimant to sign a statement as to the truth of the information provided. The claimant may have the option of answering the questions as they come up or the questions may be queued until the server 101 has finished processing the image analysis of the vehicle. If discrepancies between the claimant's answers and the analyzed damage to the vehicle continue to exist, server 101 may request the involvement of a human claims adjuster.
In other embodiments, a technician associated with an entity managing server 101 (e.g., an insurance company) may analyze the photos to determine a damage estimate. Also, in certain aspects, the process discussed herein may allow a user to upload photos/video that fall into alternative and/or additional categories (e.g., photos for each vehicle part, etc.).
As part of the image/video damage analysis, server 101 may ask the user to compare damage associated with the insured vehicle to damage depicted in a series of photos/video sent by server 101. In other embodiments, server 101 may request that the user classify the type of damage associated with the insured vehicle. For instance, server 101 may ask the user questions such as, “Does the damage to your vehicle look more like the damage shown in photo A or photo B?” Server 101 may ask any number of questions until server 101 has reached a clear understanding of all the damage to the insured vehicle and a damage estimate can be calculated. In some ways, this process may allow the user to estimate the damage to the insured vehicle.
As an example, consider a scenario where a driver's side door is dented and the driver's side window is cracked in a four-door sedan. Assume that the damage is centrally located on the driver's side window and door. Once server 101 receives a valid claim number related to this damaged sedan, server 101 may transmit, to a user device, one or more images depicting various types of damage to the driver's side window and door of four-door sedans that have been previously analyzed and/or stored in memory. The first image or images transmitted to the user device may be based on previously submitted information regarding an accident that caused the damage or any other type of input provided by a claimant and/or related parties. Thus, the first image or images transmitted to the user device may not depict damage that precisely conforms to the damage of the sedan currently being analyzed. For instance, if two images are initially transmitted to the user device, one of the images may depict damage to the corner of the driver's side window and door and the other image may depict damage that is located closer to the center. In this scenario, a user of the user device (e.g., a mobile phone), upon analyzing the two images, may select the image that depicts the centrally-located damage. The mobile device may then transmit the selection to server 101, and server 101 may use this information to generate a damage estimate.
Alternatively, suppose that both images initially transmitted from server 101 depict damage to the corner of the driver's side door and window in a four-door sedan. In this scenario, if both images are equally unrepresentative of the damage to the sedan in question, the user may transmit a message to server 101, stating how the reference images are equally unrepresentative. In response to this message, server 101 may transmit another image or images responsive to the information provided by the user in the message. Once again, the user may select one or more images that most closely depict damage to the sedan in question. Suppose that, on the second pass, server 101 again transmits two images and that, in this instance, both images depict damage to four-door sedans with centrally-located damage to the driver's side door and window. However, suppose that one of the images does not depict damage that is as severe as that exhibited by the sedan in question. In this scenario, the user may choose the image that depicts damage with the severity level consistent with the damage to the sedan in question.
By iterating through multiple rounds of image analysis and data exchange between server 101 and a user device, server 101 may, with each successive round, determine more precisely the damage associated with the sedan in question. When server 101 determines that the damage to the sedan has been fully characterized, server 101 may use the various responses provided by the user device to calculate a damage estimate for the damage to the sedan and transmit a settlement based on the calculated estimate.
In other embodiments, server 101 may transmit an insurance claim to a claims adjuster for manual processing of the claim if server 101 cannot calculate an accurate damage estimate after a predetermined number of question/answer rounds.
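The iterative narrowing of candidate reference images, together with the fallback to manual processing after a bounded number of rounds, might be sketched as follows; the callback, the similarity rule, and the record fields are assumptions made purely for illustration.

```python
def refine_damage_match(candidates, ask_user, max_rounds=5):
    """Narrow a pool of reference-damage records by repeatedly showing the user two
    example records and keeping only records similar to the one selected.

    ask_user(option_a, option_b) stands in for the round trip to the mobile device
    and returns whichever record the user says better resembles their damage.
    """
    pool = list(candidates)
    for _ in range(max_rounds):
        if len(pool) <= 1:
            break
        chosen = ask_user(pool[0], pool[1])
        pool = [chosen] + [r for r in pool
                           if r is not chosen
                           and r["location"] == chosen["location"]
                           and abs(r["severity"] - chosen["severity"]) <= 1]
    # A single remaining record is treated as fully characterized; anything else
    # (no candidates, or a still-ambiguous pool) is routed to a human adjuster.
    return pool[0] if len(pool) == 1 else None
```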
In additional embodiments, the user may transmit audio (e.g., by speaking into the mobile device, etc.) and/or an audio file that includes a description of what happened to cause the damage to the vehicle (e.g., the specifics of an accident, etc.). This audio/audio file may be translated into text and incorporated into the photos/video of damage and/or analyzed to determine if the damage matches any narrative description provided by the user. Also, the user may transmit a text file describing damage and/or an accident that caused the damage. In yet other embodiments, the user may capture and transmit the sound of the vehicle being started and/or the sound of the vehicle running to server 101 (e.g., to determine if a muffler associated with the damaged vehicle is broken, etc.).
Based on the analysis and the damage estimate, server 101 may transmit a proposed settlement (e.g., cash compensation, etc.) for the assessed loss to the user of the mobile device in step 315. After the user receives the proposed settlement, the user may notify server 101 whether or not the proposed settlement is acceptable in step 317.
If the settlement terms are not acceptable, then the process may move to step 319 where server 101 may transmit the settlement to a claims adjuster for manual processing. If the settlement terms are acceptable, the process may move to step 321 where server 101 may transfer any funds related to the assessed loss directly to a bank account associated with the user.
In some aspects, users may provide feedback designed to evaluate their experience through process 300. This feedback may be used to improve process 300 for future users and may involve the use of surveys, questionnaires, email, etc.
In other aspects, server 101 may determine and/or transmit supplemental adjustments to an initial damage/repair estimate. For instance, server 101 may determine that there is a 95% chance that repair option A must be performed, a 50% chance that additional repair option B must also be performed, and a 10% chance that additional repair option C must also be performed. When a repair shop examines the damage to a damaged vehicle and notices that there is less/additional damage, server 101 may use this information to revise an initial damage estimate with a supplemental adjustment to the initial estimate. Also, in cases where server 101 predicts that there may be many supplemental adjustments (e.g., above a predetermined threshold number of supplemental adjustments) to the initial estimate of damage, a claims adjuster may manually evaluate the damage and determine the likelihood of each of the supplemental adjustments.
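As a non-limiting illustration, the expected effect of such supplemental adjustments on an initial estimate might be computed as sketched below; the probabilities, repair costs, and the threshold for routing the claim to a human adjuster are placeholders.

```python
def apply_supplements(initial_estimate, supplements, manual_review_threshold=3):
    """Fold probable supplemental repairs into an initial damage estimate.

    supplements: list of (probability, cost) pairs, e.g. option A (0.95, 300.0),
    option B (0.50, 450.0), option C (0.10, 900.0).
    """
    expected_extra = sum(p * cost for p, cost in supplements)
    probable_count = sum(1 for p, _ in supplements if p >= 0.5)
    return {
        "revised_estimate": round(initial_estimate + expected_extra, 2),
        "needs_adjuster": probable_count >= manual_review_threshold,
    }

print(apply_supplements(1200.0, [(0.95, 300.0), (0.50, 450.0), (0.10, 900.0)]))
# {'revised_estimate': 1800.0, 'needs_adjuster': False}
```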
In addition, server 101 may provide the user with a list of repair facilities for repairing the vehicle. Once the vehicle enters the repair process, messages may be pushed to the mobile device of the user to identify where the vehicle is in the repair process (e.g., which step of the repair process is the current step, etc.). These messages may identify who is working on the vehicle and/or may include photos/video of the vehicle as it is being repaired. The messages may also identify when the repair process may be completed.
In some aspects, some types of claims may be excluded from the automated process illustrated in FIG. 3. These claims may include comprehensive claims, claims with injuries to any involved parties, claims involving non-drivable vehicles or air bag deployments, claims with loss descriptions that include undercarriage/mechanical damage, claims involving motorcycle and/or recreational vehicle (RV) losses, and claims involving users that already have an estimate for damage associated with an insured vehicle.
FIGS. 4-8B show various display screens displayed to a user of a mobile device in accordance with at least one aspect of the present disclosure. FIG. 4 shows a series of initial display screens displayed when a user starts a damage assessment and claims processing application stored on a mobile device (e.g., network device 201) in accordance with at least one aspect of the present disclosure. Screen 401 may be the initial screen that the user views upon starting the application. Screen 401 may allow the user to enter a claim number to begin a damage assessment and claims processing method. In certain aspects, the claim number may be used to compare a damage estimate generated by analysis of photos submitted by the user to a damage estimate generated manually by a claims adjuster using more conventional claims adjustment techniques. Once a user enters a valid claim number, the mobile device may display screen 403, where the user is presented with photo instructions that explain to the user the types of photos that should be taken. Screen 403 may include instructions on taking photos of the entire insured vehicle, VIN door tag, and any damaged areas of the insured vehicle. When a user presses the “Get Started” button 403 a on screen 403, the mobile device may display screen 405, which allows a user to select and start taking any of the types of photos listed in screen 403 (e.g., photos of the entire vehicle, VIN door tag, and/or damaged areas). The “Submit Photos” button 405 a on screen 405 may be inactive until at least one photo of each type is taken by the user.
FIG. 5A and FIG. 5B show a first series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with at least one aspect of the present disclosure. In display screen 501, the user may select to take a photo of the entire vehicle, the VIN door tag, and/or the specific damaged area(s). In the example of FIG. 5A and FIG. 5B, a user selects to take a photo of the entire vehicle. When a user selects one of the photo categories in screen 501, screen 503 may allow the user to select the “Capture Photo” button 503 a to start the camera functionality within the mobile device, the “Adding Existing” button 503 b to choose a photo from the photo roll, and/or the “Cancel” button 503 c to cancel out of the previous command.
Assuming that the user selects the “Capture Photo” button 503 a in screen 503, the mobile device may display screen 505 where instructions related to the current photo type (e.g., a wide view of the entire vehicle) may be overlaid on top of the camera. The user may select the “OK” button 505 a on screen 505 to close the overlay and cause display of the camera screen 507. Camera screen 507 may include a camera shutter button 507 a (e.g., for taking a photo) and flash button 507 b (e.g., for turning the camera flash on/off). The “Instructions” button 507 c on screen 507 may open the instructions overlay from screen 505, and the “Done” button 507 d on screen 507 may save all photos that have been taken to a memory of the mobile device and may return the user to the main photos screen 501. When the user selects the shutter button 507 a in screen 507, the mobile device may display screen 509 to indicate that a photo is being taken. In some aspects, all buttons on screen 509 may be disabled after the user selects the shutter button 507 a.
FIG. 5C shows a second series of display screens displayed on a mobile device as a user takes photos of a damaged vehicle in accordance with at least one aspect of the present disclosure. Screen 511 may allow a user to preview a photo that has been taken and take an appropriate action on this photo. In particular, the user may select a “Discard” button 511 a to discard the photo or a “Use” button 511 b to use the photo for damage assessment and claims processing. Assuming that the user selects “Use” button 511 b, the user may proceed to take other photos within the selected photo type. When the user has taken all the photos of a given photo type, the user may select the “Done” button 513 a on screen 513. After selecting the “Done” button 513 a on screen 513, the mobile device may display screen 515, where thumbnail image(s) of the photo(s) that the user has already taken may be displayed in the corresponding categories.
FIG. 6A and FIG. 6B show a series of display screens displayed on a mobile device for enabling a user to delete photos that have already been taken in accordance with at least one aspect of the present disclosure. Screen 601 displays thumbnails of all photos that have already been taken. When a user selects one of the thumbnails in screen 601, the mobile device may display screen 603, where a series of buttons may be displayed, including an additional options button 603 a for displaying additional options associated with the current photo (e.g., email photo, use photo as wallpaper, etc.), a scroll to previous photo button 603 b for scrolling to the previously-viewed photo in the photo reel, a play photo reel button 603 c for sequentially displaying each photo in the photo reel, a scroll to next photo button 603 d for scrolling to the next photo in the reel, and a delete button 603 e for deleting the currently-viewed photo. If the user selects delete button 603 e, the photo currently displayed may be queued for deletion and mobile device may display screen 605. Screen 605 includes an action panel with a “Delete Photo” button 605 a for confirming that the currently-viewed photo is to be deleted and a “Cancel” button 605 b for cancelling deletion of the currently-viewed photo. If the user selects “Delete Photo” button 605 a, the currently-viewed photo is deleted and the next photo in the current category is displayed in screen 607. If the user selects a back button 607 a on screen 607, the user may back out to return to photos screen 609. Screen 609 may display the remaining thumbnails stored in a memory of the mobile device, with the image that the user deleted in screen 605 removed from the list of thumbnails.
FIG. 7A and FIG. 7B show a series of display screens displayed on a mobile device for enabling a user to submit photos for review by an enhanced claims processing server 101, in accordance with at least one aspect of the present disclosure. Screen 701 may include a “Submit Photos” button 701 a for submitting photos to server 101 when all photos have been taken. When a user presses “Submit Photos” button 701 a, the mobile device may display screen 703, which includes an action panel with the “Submit Photos” button 703 a for confirming that the captured photos are to be submitted to server 101 and a “Cancel” button 703 b for cancelling the submission. If the user selects “Submit Photos” button 703 a, the mobile device may display screen 705 where an upload progress bar may indicate the progress of the photo upload. Once the photos have been fully uploaded, the mobile device may display screen 707, which indicates that the photos have been uploaded and explains any next steps that should be taken.
FIG. 8A and FIG. 8B show a series of display screens displayed on a mobile device for enabling a user to receive feedback from an enhanced claims processing server 101 regarding previously submitted photos, in accordance with at least one aspect of the present disclosure. When enhanced claims processing server 101 completes review of the photos submitted in FIG. 7A and FIG. 7B, server 101 may transmit a notification to the mobile device that feedback is ready for review. When the mobile device receives the notification, screen 801, which includes a notification that feedback is ready, may be displayed. When a user selects the “View Notification” button 801 a, the mobile device may display screen 803, which may include a detailed description of any feedback received from server 101. In this case, server 101 has transmitted a message that asks the user to take additional photos (e.g., of the damage to the left side of a bumper). Screen 803 may also include a “Take Photos” button 803 a which may allow the user to take additional photos of the damaged vehicle. When the user presses “Take Photos” button 803 a, the mobile device may display screen 805 which allows the user to take more photos of the damaged vehicle (e.g., in response to the feedback received in screen 803) using the same process depicted in FIG. 5A and FIG. 5C.
Once all required photos have been taken, the user may press the “Submit Photos” button 807 a in screen 807 to submit the photos taken via screen 805 to enhanced claims processing server 101. When the user presses the “Submit Photos” button 807 a in screen 807, the mobile device may display screen 809, which includes a progress bar that shows the progress of the photo upload to server 101.
The foregoing descriptions of the disclosure have been presented for purposes of illustration and description. They are not exhaustive and do not limit the disclosure to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practicing of the disclosure. For example, the described implementation includes software but the present disclosure may be implemented as a combination of hardware and software or in hardware alone. Additionally, although aspects of the present disclosure are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer-readable media, such as secondary storage devices, like hard disks, floppy disks, or CD-ROM; a carrier wave from the Internet or other propagation medium; or other forms of RAM or ROM.

Claims (20)

What is claimed is:
1. A non-transitory computer-readable storage medium having computer-executable program instructions stored thereon that when executed by a processor, cause the processor at least to:
send, to a mobile device, user instructions to capture images depicting damage to an insured item;
receive, from a software application executing on the mobile device, a first plurality of images depicting a plurality of damaged areas of an insured vehicle;
send one or more feedback instructions to the mobile device based on an analysis of the first plurality of images;
receive, from the mobile device and responsive to the one or more feedback instructions, a second plurality of images;
identify the plurality of damaged areas of the insured vehicle within the second plurality of images by applying an object recognition algorithm to the second plurality of images;
access a damage template associated with a reference image and comprising a damage estimate;
compare three-dimensional coordinates of the second plurality of images to three-dimensional coordinates of the reference image;
generate, for a first damaged area of the plurality of damaged areas, and based on the damage estimate and a result of the comparing, a modified damage estimate;
generate a total damage estimate based on the modified damage estimate and a second damage estimate for a second damaged area of the plurality of damaged areas;
generate an effective total damage estimate based on the total damage estimate and an interaction term, wherein the interaction term represents a modified cost associated with collectively repairing the first damaged area and the second damaged area;
determine a settlement based on the effective total damage estimate;
generate a user interface screen comprising the settlement; and
send, to the mobile device, the user interface screen.
2. The non-transitory computer-readable storage medium of claim 1, wherein the interaction term is generated based on information stored in a database.
3. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
translate an audio file, comprising data associated with one or more of the plurality of damaged areas of the insured vehicle, into a text file.
4. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
perform optical character recognition on an image depicting a vehicle identification number (VIN) of the insured vehicle.
5. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
predict damage to an interior of the insured vehicle based on damage to an exterior of the insured vehicle.
6. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
send a request for additional images to the mobile device.
7. The non-transitory computer-readable storage medium of claim 6, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
receive one or more additional images depicting the plurality of damaged areas of the insured vehicle.
8. The non-transitory computer-readable storage medium of claim 7, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
determine whether or not the one or more additional images are acceptable.
9. The non-transitory computer-readable storage medium of claim 8, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
when the one or more additional images are not acceptable, send a second request for additional images to the mobile device.
10. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
send, to the mobile device, a notification associated with a repairing of the insured vehicle.
11. The non-transitory computer-readable storage medium of claim 1, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
receive, from the mobile device, a notification indicating whether or not the settlement is acceptable.
12. The non-transitory computer-readable storage medium of claim 11, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
when the settlement is acceptable, send funds to an account of a user of the mobile device.
13. The non-transitory computer-readable storage medium of claim 11, further comprising one or more additional computer-executable program instructions that, when executed by the processor, cause the processor to:
when the settlement is not acceptable, send the settlement to a claims adjuster.
14. A method comprising:
receiving, by an enhanced claims processor, from a software application executing on a mobile device, a first plurality of images depicting a plurality of damaged areas of an insured vehicle;
sending one or more feedback instructions to the mobile device based on an analysis of the first plurality of images;
receiving, from the mobile device and responsive to the one or more feedback instructions, a second plurality of images;
identifying, by the enhanced claims processor, the plurality of damaged areas of the insured vehicle within the second plurality of images by applying an object recognition algorithm to the second plurality of images;
comparing three-dimensional coordinates of the second plurality of images to three-dimensional coordinates of a reference image associated with a damage template;
generating, by the enhanced claims processor, a total damage estimate based on a first damage estimate, a second damage estimate, and an interaction term, wherein the first damage estimate is for a first damaged area and is based on the comparing, and wherein the interaction term represents a modified cost associated with collectively repairing the first damaged area and a second damaged area associated with the second damage estimate;
determining a settlement based on the total damage estimate;
generating a user interface screen comprising the settlement; and
sending, by the enhanced claims processor, to the mobile device, the user interface screen.
15. The method of claim 14, wherein the interaction term is generated based on information stored in a database.
16. The method of claim 14, further comprising:
translating, by the enhanced claims processor, an audio file comprising data associated with one or more of the plurality of damaged areas into a text file.
17. The method of claim 14, further comprising:
performing, by the enhanced claims processor, optical character recognition on an image depicting a vehicle identification number (VIN) of the insured vehicle.
18. The method of claim 14, further comprising:
sending a request for additional images to the mobile device.
19. An apparatus comprising:
a processor; and
a memory storing computer-readable instructions that, when executed by the processor, cause the apparatus at least to:
send, to a mobile device, user instructions to capture images depicting damage to an insured item;
receive, from a software application executing on the mobile device, a first plurality of images depicting a plurality of damaged areas of an insured vehicle;
send one or more feedback instructions to the mobile device based on an analysis of the first plurality of images;
receive, from the mobile device and responsive to the one or more feedback instructions, a second plurality of images;
identify the plurality of damaged areas of the insured vehicle within the second plurality of images by applying an object recognition algorithm to the second plurality of images;
compare three-dimensional coordinates of the second plurality of images to three-dimensional coordinates of a reference image associated with a damage template;
generate a total damage estimate based on a first damage estimate and an interaction term, wherein the first damage estimate is for a first damaged area and is based on the comparing, and wherein the interaction term represents a modified cost associated with collectively repairing the first damaged area and a second damaged area;
determine a settlement based on the total damage estimate;
generate a user interface screen comprising the settlement; and
send, to the mobile device, the user interface screen.
20. The apparatus of claim 19, wherein the interaction term is generated based on information stored in a database.
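For illustration only, the image analysis that drives the feedback instructions recited in claims 1, 6, and 14 could be sketched as a simple sharpness and exposure check; the OpenCV-based checks, threshold values, and function name below are assumptions, not the claimed method.

```python
# Minimal sketch of a server-side quality check that could generate feedback
# instructions (e.g., "retake this photo"). Thresholds are assumed values.
import cv2

BLUR_THRESHOLD = 100.0   # variance of the Laplacian below this suggests blur
BRIGHTNESS_MIN = 60.0    # mean pixel value below this suggests underexposure

def feedback_for_image(path: str) -> list[str]:
    """Return human-readable retake instructions for a single uploaded photo."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        return ["Image could not be read; please retake the photo."]
    instructions = []
    if cv2.Laplacian(gray, cv2.CV_64F).var() < BLUR_THRESHOLD:
        instructions.append("Photo appears blurry; hold the device steady and retake.")
    if gray.mean() < BRIGHTNESS_MIN:
        instructions.append("Photo is too dark; retake in better lighting.")
    return instructions
```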
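Claims 1, 14, and 19 recite combining per-area damage estimates with an interaction term that reflects the modified cost of repairing two damaged areas together. A minimal sketch of that aggregation, assuming a hypothetical interaction lookup table and illustrative dollar figures, might look like this:

```python
# Minimal sketch of combining per-area damage estimates with an interaction
# term. The lookup table and amounts below are assumed examples only.
INTERACTION_TERMS = {
    # (area_a, area_b): adjustment applied when both areas are repaired together
    ("front_bumper", "hood"): -150.00,   # shared teardown/paint work lowers cost
    ("rear_bumper", "trunk_lid"): -90.00,
}

def effective_total_estimate(estimates: dict[str, float]) -> float:
    """Sum per-area estimates, then apply interaction terms for each area pair."""
    total = sum(estimates.values())
    areas = sorted(estimates)
    for i, a in enumerate(areas):
        for b in areas[i + 1:]:
            total += INTERACTION_TERMS.get((a, b), INTERACTION_TERMS.get((b, a), 0.0))
    return total

# Example: two damaged areas whose combined repair shares labor.
print(effective_total_estimate({"front_bumper": 820.00, "hood": 1240.00}))  # 1910.0
```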
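Claims 4 and 17 recite optical character recognition on an image depicting the vehicle identification number (VIN). One possible sketch, assuming the open-source pytesseract wrapper for Tesseract OCR, is shown below; the page-segmentation mode and character whitelist are illustrative choices rather than part of the claims.

```python
# Minimal sketch of OCR over a VIN-plate photo. Assumes the pytesseract and
# Pillow packages. The whitelist is the standard 17-character VIN alphabet
# (the letters I, O, and Q are excluded).
from PIL import Image
import pytesseract

VIN_ALPHABET = "ABCDEFGHJKLMNPRSTUVWXYZ0123456789"

def read_vin(image_path: str) -> str:
    """Run OCR on a VIN-plate photo and keep only valid VIN characters."""
    raw = pytesseract.image_to_string(
        Image.open(image_path),
        config=f"--psm 7 -c tessedit_char_whitelist={VIN_ALPHABET}",
    )
    return "".join(ch for ch in raw.upper() if ch in VIN_ALPHABET)[:17]

# Example: vin = read_vin("vin_plate.jpg")
```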
US16/570,421 2012-08-16 2019-09-13 Processing insured items holistically with mobile damage assessment and claims processing Active US10803532B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/570,421 US10803532B1 (en) 2012-08-16 2019-09-13 Processing insured items holistically with mobile damage assessment and claims processing
US17/008,079 US11386503B2 (en) 2012-08-16 2020-08-31 Processing insured items holistically with mobile damage assessment and claims processing
US17/862,159 US12079877B2 (en) 2012-08-16 2022-07-11 Processing insured items holistically with mobile damage assessment and claims processing

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/587,635 US10430885B1 (en) 2012-08-16 2012-08-16 Processing insured items holistically with mobile damage assessment and claims processing
US13/892,598 US10430886B1 (en) 2012-08-16 2013-05-13 Processing insured items holistically with mobile damage assessment and claims processing
US16/570,421 US10803532B1 (en) 2012-08-16 2019-09-13 Processing insured items holistically with mobile damage assessment and claims processing

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/892,598 Continuation US10430886B1 (en) 2012-08-16 2013-05-13 Processing insured items holistically with mobile damage assessment and claims processing

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/008,079 Continuation US11386503B2 (en) 2012-08-16 2020-08-31 Processing insured items holistically with mobile damage assessment and claims processing

Publications (1)

Publication Number Publication Date
US10803532B1 true US10803532B1 (en) 2020-10-13

Family

ID=68063988

Family Applications (6)

Application Number Title Priority Date Filing Date
US13/587,635 Active US10430885B1 (en) 2012-08-16 2012-08-16 Processing insured items holistically with mobile damage assessment and claims processing
US13/892,598 Active US10430886B1 (en) 2012-08-16 2013-05-13 Processing insured items holistically with mobile damage assessment and claims processing
US14/671,602 Active US10878507B1 (en) 2012-08-16 2015-03-27 Feedback loop in mobile damage assessment and claims processing
US16/570,421 Active US10803532B1 (en) 2012-08-16 2019-09-13 Processing insured items holistically with mobile damage assessment and claims processing
US17/008,079 Active US11386503B2 (en) 2012-08-16 2020-08-31 Processing insured items holistically with mobile damage assessment and claims processing
US17/862,159 Active US12079877B2 (en) 2012-08-16 2022-07-11 Processing insured items holistically with mobile damage assessment and claims processing

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US13/587,635 Active US10430885B1 (en) 2012-08-16 2012-08-16 Processing insured items holistically with mobile damage assessment and claims processing
US13/892,598 Active US10430886B1 (en) 2012-08-16 2013-05-13 Processing insured items holistically with mobile damage assessment and claims processing
US14/671,602 Active US10878507B1 (en) 2012-08-16 2015-03-27 Feedback loop in mobile damage assessment and claims processing

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/008,079 Active US11386503B2 (en) 2012-08-16 2020-08-31 Processing insured items holistically with mobile damage assessment and claims processing
US17/862,159 Active US12079877B2 (en) 2012-08-16 2022-07-11 Processing insured items holistically with mobile damage assessment and claims processing

Country Status (1)

Country Link
US (6) US10430885B1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220343435A1 (en) * 2012-08-16 2022-10-27 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US11861137B2 (en) 2020-09-09 2024-01-02 State Farm Mutual Automobile Insurance Company Vehicular incident reenactment using three-dimensional (3D) representations
US12079878B2 (en) 2012-08-16 2024-09-03 Allstate Insurance Company Feedback loop in mobile damage assessment and claims processing
US12142039B1 (en) 2022-06-23 2024-11-12 State Farm Mutual Automobile Insurance Company Interactive insurance inventory and claim generation

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150324924A1 (en) * 2011-04-28 2015-11-12 Allstate Insurance Company Streamlined Claims Processing
US8712893B1 (en) 2012-08-16 2014-04-29 Allstate Insurance Company Enhanced claims damage estimation using aggregate display
US11532048B2 (en) 2012-08-16 2022-12-20 Allstate Insurance Company User interactions in mobile damage assessment and claims processing
US10783585B1 (en) 2012-08-16 2020-09-22 Allstate Insurance Company Agent-facilitated claims damage estimation
US10572944B1 (en) * 2012-08-16 2020-02-25 Allstate Insurance Company Claims damage estimation using enhanced display
US11455691B2 (en) 2012-08-16 2022-09-27 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US10580075B1 (en) 2012-08-16 2020-03-03 Allstate Insurance Company Application facilitated claims damage estimation
US10552911B1 (en) * 2014-01-10 2020-02-04 United Services Automobile Association (Usaa) Determining status of building modifications using informatics sensor data
US11017477B1 (en) 2016-09-07 2021-05-25 United Services Automobile Association (Usaa) Digital imagery, audio, and meta-data
US11361380B2 (en) * 2016-09-21 2022-06-14 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
AU2018233168A1 (en) * 2017-03-16 2019-10-10 Randall Innovations Pty Ltd Improved insurance system
US10791265B1 (en) * 2017-10-13 2020-09-29 State Farm Mutual Automobile Insurance Company Systems and methods for model-based analysis of damage to a vehicle
US11144997B2 (en) * 2017-11-14 2021-10-12 James Mark Chappell System and method for expediting purchase of vehicular insurance
US11562436B2 (en) 2018-02-08 2023-01-24 The Travelers Indemnity Company Systems and methods for automated accident analysis
MX2021003882A (en) * 2018-10-03 2021-08-05 Solera Holdings Inc Apparatus and method for combined visual intelligence.
US11182860B2 (en) * 2018-10-05 2021-11-23 The Toronto-Dominion Bank System and method for providing photo-based estimation
US10949553B1 (en) 2019-09-03 2021-03-16 Airmika Inc. System for and methods of securing vehicle electronic data
US11468198B2 (en) * 2020-04-01 2022-10-11 ImageKeeper LLC Secure digital media authentication and analysis
US20210365887A1 (en) * 2020-05-22 2021-11-25 Mitchell International, Inc. Near-real-time collaborative vehicle repair estimating tool
US11769120B2 (en) * 2020-10-14 2023-09-26 Mitchell International, Inc. Systems and methods for improving user experience during damage appraisal
CN112597931B (en) * 2020-12-28 2024-06-18 京东科技控股股份有限公司 Screen state detection method, device, electronic equipment, server and storage medium
US11087278B1 (en) * 2021-01-29 2021-08-10 Coupang Corp. Computerized systems and methods for managing inventory by grading returned products
US12002192B2 (en) 2021-11-16 2024-06-04 Solera Holdings, Llc Transfer of damage markers from images to 3D vehicle models for damage assessment
US20240012856A1 (en) * 2022-07-08 2024-01-11 Motorola Solutions, Inc. Method and system that accounts for appearance changes in an object that are attributable to object deformation or restoration

Citations (101)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4899292A (en) 1988-03-02 1990-02-06 Image Storage/Retrieval Systems, Inc. System for storing and retrieving text and associated graphics
US5128859A (en) * 1990-09-12 1992-07-07 Carbone Albert R Electronic accident estimating system
US5317503A (en) 1992-03-27 1994-05-31 Isao Inoue Apparatus for calculating a repair cost of a damaged car
US5432904A (en) 1991-02-19 1995-07-11 Ccc Information Services Inc. Auto repair estimate, text and graphic system
US5504674A (en) 1991-02-19 1996-04-02 Ccc Information Services, Inc. Insurance claims estimate, text, and graphics network and method
US5950169A (en) 1993-05-19 1999-09-07 Ccc Information Services, Inc. System and method for managing insurance claim processing
JP2000050156A (en) * 1998-08-03 2000-02-18 Nippon Telegr & Teleph Corp <Ntt> News supporting system
US6141611A (en) * 1998-12-01 2000-10-31 John J. Mackey Mobile vehicle accident data system
US6185540B1 (en) 1994-12-28 2001-02-06 Automatic Data Processing Insurance estimating system
US20020002475A1 (en) * 2000-04-13 2002-01-03 Joel Freedman Automated insurance system and method
KR20020008466A (en) 2000-07-20 2002-01-31 이수창 Insurance system used by on-line image transmitting and processing method thereof
US20020055861A1 (en) * 2000-11-08 2002-05-09 King Daniel A. Claiming system and method
EP1215612A1 (en) 2000-02-15 2002-06-19 E.A.C. Co., Ltd. System for estimating car repair expense and estimating method
US20020161533A1 (en) 2000-02-15 2002-10-31 Tateo Uegaki System for recognizing damaged part of accident-involved car and computer-readable medium on which program is recorded
US20020188479A1 (en) * 2001-06-05 2002-12-12 Renwick Glenn M. Method of processing vehicle damage claims
US20030219169A1 (en) 2002-02-22 2003-11-27 Piergiorgio Sartor Method and apparatus for improving picture sharpness
US20040064345A1 (en) 2002-09-27 2004-04-01 Ajamian Setrak A. Internet claims handling services
US20040153346A1 (en) 2003-02-04 2004-08-05 Allstate Insurance Company Remote contents estimating system and method
US20050125127A1 (en) 1998-02-04 2005-06-09 Bomar John B.Jr. System and method for determining post-collision vehicular velocity changes
US20050228683A1 (en) 2004-04-13 2005-10-13 Stephen Saylor Integrated use of a portable image capture device into a workflow process
US20050246108A1 (en) 2004-03-25 2005-11-03 Airbus France Method and system for characterizing structural damage from observing surface distortions
US20050251427A1 (en) 2004-05-07 2005-11-10 International Business Machines Corporation Rapid business support of insured property using image analysis
KR20060031208A (en) 2004-10-07 2006-04-12 김준호 A system for insurance claim of broken cars(automoble, taxi, bus, truck and so forth) of a motoring accident
US20060080154A1 (en) 2004-07-27 2006-04-13 Larsen Donovan R Method of verifying insurance claims
US7092369B2 (en) 1995-11-17 2006-08-15 Symbol Technologies, Inc. Communications network with wireless gateways for mobile terminal access
CN1828336A (en) 2006-04-04 2006-09-06 张路平 Rapid estimation method for earthquake disaster damage based on GIS technology
US20070027726A1 (en) 2004-09-08 2007-02-01 Warren Gregory S Calculation of driver score based on vehicle operation for forward looking insurance premiums
US7203654B2 (en) 2003-01-04 2007-04-10 Dale Menendez Method of expediting insurance claims
US7263493B1 (en) 2002-01-11 2007-08-28 P5, Inc. Delivering electronic versions of supporting documents associated with an insurance claim
US20080052134A1 (en) * 2006-05-18 2008-02-28 Vikki Nowak Rich claim reporting system
US20080059238A1 (en) 2006-09-01 2008-03-06 Athenahealth, Inc. Medical image annotation
US7346523B1 (en) 2002-01-11 2008-03-18 P5, Inc. Processing an insurance claim using electronic versions of supporting documents
US7432938B1 (en) 1996-08-19 2008-10-07 Qwest Communications International, Inc. System and method for annotating electronic documents
US20080255887A1 (en) * 2007-04-10 2008-10-16 Autoonline Gmbh Informationssysteme Method and system for processing an insurance claim for a damaged vehicle
US20080255722A1 (en) 2006-05-22 2008-10-16 Mcclellan Scott System and Method for Evaluating Driver Behavior
US20080267487A1 (en) 2004-05-11 2008-10-30 Fausto Siri Process and System for Analysing Deformations in Motor Vehicles
US20090018859A1 (en) 2006-09-01 2009-01-15 Purifoy Jonathan P Method for vehicle repair estimate and scheduling
US20090018874A1 (en) 1999-12-16 2009-01-15 Hartford Fire Insurance Company Method and system for issuing insurance underwriting instruments
US20090100106A1 (en) 2007-10-12 2009-04-16 Anthony Marcus System and Method for Securely Storing Wirelessly Transmitted Text, Images and Video
US20090138290A1 (en) 2006-09-26 2009-05-28 Holden Johnny L Insurance adjustment through digital imaging system and method
US20090147988A1 (en) 2007-12-05 2009-06-11 Jones Paul W Image transfer with secure quality assessment
US7586654B2 (en) 2002-10-11 2009-09-08 Hewlett-Packard Development Company, L.P. System and method of adding messages to a scanned image
US20090234678A1 (en) * 2008-03-11 2009-09-17 Arenas Claims Consulting, Inc. Computer systems and methods for assisting accident victims with insurance claims
US20090265193A1 (en) 2008-04-17 2009-10-22 Collins Dean Methods and systems for automated property insurance inspection
US20090309893A1 (en) 2006-06-29 2009-12-17 Aftercad Software Inc. Method and system for displaying and communicating complex graphics file information
US20100036683A1 (en) 2008-08-05 2010-02-11 Logan Andrew J Diagramming tool for vehicle insurance claims
WO2010026170A1 (en) 2008-09-02 2010-03-11 Ecole Polytechnique Federale De Lausanne (Epfl) Image annotation on portable devices
US20100066012A1 (en) 2008-09-17 2010-03-18 Canon Kabushiki Kaisha Image forming system, image forming apparatus, and sheet feeding apparatus
US20100088123A1 (en) 2008-10-07 2010-04-08 Mccall Thomas A Method for using electronic metadata to verify insurance claims
US7702529B2 (en) 2002-11-27 2010-04-20 Computer Sciences Corporation Computerized method and system for estimating an effect on liability using claim data accessed from claim reporting software
US20100138298A1 (en) 2008-04-02 2010-06-03 William Fitzgerald System for advertising integration with auxiliary interface
US7734485B1 (en) 2008-06-25 2010-06-08 United Services Automobile Association (Usaa) Systems and methods for insurance coverage
US20100174564A1 (en) 2009-01-06 2010-07-08 Mark Stender Method and system for connecting an insured to an insurer using a mobile device
JP2010157267A (en) 2002-11-08 2010-07-15 Tokio Marine & Nichido Fire Insurance Co Ltd Program, method and device for collecting data for damage insurance processing
US7873710B2 (en) 2007-02-06 2011-01-18 5O9, Inc. Contextual data communication platform
US7889931B2 (en) 2004-10-22 2011-02-15 Gb Investments, Inc. Systems and methods for automated vehicle image acquisition, analysis, and reporting
US20110040692A1 (en) 2009-08-14 2011-02-17 Erik Ahroon System and method for acquiring, comparing and evaluating property condition
US20110054806A1 (en) 2009-06-05 2011-03-03 Jentek Sensors, Inc. Component Adaptive Life Management
US7953615B2 (en) 2000-04-03 2011-05-31 Mitchell International, Inc. System and method of administering, tracking and managing of claims processing
US7962485B1 (en) 2008-04-25 2011-06-14 Trandal David S Methods and systems for inventory management
US20110161117A1 (en) 2009-12-31 2011-06-30 Busque Keven J Insurance processing system and method using mobile devices for proof of ownership
US20110196707A1 (en) 2002-08-07 2011-08-11 Metropolitan Property And Casualty Insurance Company System and method for identifying and assessing comparative negligence in insurance claims
US20110213628A1 (en) 2009-12-31 2011-09-01 Peak David F Systems and methods for providing a safety score associated with a user location
US8015036B1 (en) * 2007-04-02 2011-09-06 ClaimAssistant.com, LLC Automated claim mediation system and associated methods
US20110218825A1 (en) 2010-03-03 2011-09-08 International Business Machines Corporation Three-dimensional interactive vehicle damage claim interface
US8019629B1 (en) 2008-04-07 2011-09-13 United Services Automobile Association (Usaa) Systems and methods for automobile accident claims initiation
US8035639B2 (en) 2006-10-13 2011-10-11 Gerhard Witte Method and apparatus for determining the alteration of the shape of a three dimensional object
US20110270641A1 (en) 2006-02-15 2011-11-03 Allstate Insurance Company Retail Location Services
US8081795B2 (en) 2008-05-09 2011-12-20 Hartford Fire Insurance Company System and method for assessing a condition of property
US20110313951A1 (en) 2010-06-19 2011-12-22 SHzoom LLC Vehicle Repair Cost Estimate Acquisition System and Method
WO2011157064A1 (en) 2010-06-18 2011-12-22 中兴通讯股份有限公司 System and method for vehicle insurance claims based on mobile communication network
US20110313936A1 (en) 2010-06-18 2011-12-22 Joseph Michael Sieger Method and apparatus for estimating value of a damaged vehicle
US8239220B2 (en) * 2006-06-08 2012-08-07 Injury Sciences Llc Method and apparatus for obtaining photogrammetric data to estimate impact severity
WO2012113084A1 (en) 2011-02-25 2012-08-30 Audatex Gmbh System and method for estimating collision damage to a car
US20120290333A1 (en) 2006-10-18 2012-11-15 Hartford Fire Insurance Company System and method for repair calculation, replacement calculation, and insurance adjustment
US20120297337A1 (en) 2006-05-31 2012-11-22 Manheim Investments, Inc. Computer-based technology for aiding the repair of motor vehicles
WO2013003957A1 (en) 2011-07-05 2013-01-10 Shunock Michael Stewart System and method for annotating images
US8510196B1 (en) * 2012-08-16 2013-08-13 Allstate Insurance Company Feedback loop in mobile damage assessment and claims processing
US20130297353A1 (en) * 2008-01-18 2013-11-07 Mitek Systems Systems and methods for filing insurance claims using mobile imaging
US20130317861A1 (en) * 2012-05-24 2013-11-28 Nathan Lee Tofte System And Method For Real-Time Accident Documentation And Claim Submission
US20140032430A1 (en) 2012-05-25 2014-01-30 Insurance Auto Auctions, Inc. Title transfer application and method
US8650106B1 (en) 2007-06-13 2014-02-11 United Sevices Automobile Association Systems and methods for processing overhead imagery
US20140108058A1 (en) 2012-10-11 2014-04-17 Agero, Inc. Method and System to Determine Auto Insurance Risk
US8712893B1 (en) * 2012-08-16 2014-04-29 Allstate Insurance Company Enhanced claims damage estimation using aggregate display
US20140122133A1 (en) 2012-10-31 2014-05-01 Bodyshopbids, Inc. Method of virtually settling insurance claims
US20140229207A1 (en) 2011-09-29 2014-08-14 Tata Consultancy Services Limited Damage assessment of an object
US20140266789A1 (en) 2013-03-12 2014-09-18 The Inner Circle Group LLC System and method for determining a driver in a telematic application
US20150073864A1 (en) * 2011-01-11 2015-03-12 Accurence, Inc. Method and System for Property Damage Analysis
US20150348204A1 (en) 2014-05-28 2015-12-03 James C. Daues Method for assessing hail damage
US20150363717A1 (en) 2014-06-11 2015-12-17 Hartford Fire Insurance Company System and method for processing of uav based data for risk mitigation and loss control
US20170116494A1 (en) * 2015-10-22 2017-04-27 Abbyy Development Llc Video capture in data capture scenario
US20170270650A1 (en) * 2016-03-17 2017-09-21 Conduent Business Services, Llc Image analysis system for property damage assessment and verification
US20170270612A1 (en) 2016-03-17 2017-09-21 Conduent Business Services, Llc Image capture system for property damage assessment
US20170293894A1 (en) 2016-04-06 2017-10-12 American International Group, Inc. Automatic assessment of damage and repair costs in vehicles
US9824453B1 (en) * 2015-10-14 2017-11-21 Allstate Insurance Company Three dimensional image scan for vehicle
US9970881B1 (en) * 2011-09-08 2018-05-15 United Services Automobile Association (Usaa) Property inspection devices, methods, and systems
US20180260793A1 (en) 2016-04-06 2018-09-13 American International Group, Inc. Automatic assessment of damage and repair costs in vehicles
US10373262B1 (en) * 2014-03-18 2019-08-06 Ccc Information Services Inc. Image processing system for vehicle damage
US10373387B1 (en) * 2017-04-07 2019-08-06 State Farm Mutual Automobile Insurance Company Systems and methods for enhancing and developing accident scene visualizations
US10380696B1 (en) * 2014-03-18 2019-08-13 Ccc Information Services Inc. Image processing system for vehicle damage
US10430886B1 (en) * 2012-08-16 2019-10-01 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE374798A (en) * 1929-11-19
US6386560B2 (en) 2000-04-25 2002-05-14 Joseph P. Calender Dolly for large appliances
US7895063B2 (en) 2002-11-27 2011-02-22 Computer Sciences Corporation Computerized method and system for creating pre-configured claim reports including liability in an accident estimated using a computer system
US7707039B2 (en) 2004-02-15 2010-04-27 Exbiblio B.V. Automatic modification of web pages
US7812860B2 (en) 2004-04-01 2010-10-12 Exbiblio B.V. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US9137417B2 (en) 2005-03-24 2015-09-15 Kofax, Inc. Systems and methods for processing video data
US10102583B2 (en) 2008-01-18 2018-10-16 Mitek Systems, Inc. System and methods for obtaining insurance offers using mobile image capture
US9691189B1 (en) 2008-09-29 2017-06-27 United Services Automobile Association Accident assessment and reconstruction systems and applications
US9584774B2 (en) 2011-10-24 2017-02-28 Motorola Solutions, Inc. Method and apparatus for remotely controlling an image capture position of a camera
US20170111532A1 (en) 2012-01-12 2017-04-20 Kofax, Inc. Real-time processing of video streams captured using mobile devices
TWI492166B (en) 2012-01-12 2015-07-11 Kofax Inc Systems and methods for mobile image capture and processing
US9275281B2 (en) 2012-01-12 2016-03-01 Kofax, Inc. Mobile image capture, processing, and electronic form generation
US8768038B1 (en) 2012-06-19 2014-07-01 Wells Fargo Bank, N.A. System and method for mobile check deposit
US10783585B1 (en) 2012-08-16 2020-09-22 Allstate Insurance Company Agent-facilitated claims damage estimation
US11532048B2 (en) 2012-08-16 2022-12-20 Allstate Insurance Company User interactions in mobile damage assessment and claims processing
US10572944B1 (en) 2012-08-16 2020-02-25 Allstate Insurance Company Claims damage estimation using enhanced display
US10580075B1 (en) 2012-08-16 2020-03-03 Allstate Insurance Company Application facilitated claims damage estimation
US10304137B1 (en) 2012-12-27 2019-05-28 Allstate Insurance Company Automated damage assessment and claims processing
CN105518704A (en) 2013-05-03 2016-04-20 柯法克斯公司 Systems and methods for detecting and classifying objects in video captured using mobile devices
US20150029346A1 (en) 2013-07-23 2015-01-29 Insurance Auto Auctions, Inc. Photo inspection guide for vehicle auction
US9210327B2 (en) 2013-12-02 2015-12-08 Yahoo! Inc. Blur aware photo feedback
US10182187B2 (en) 2014-06-16 2019-01-15 Playvuu, Inc. Composing real-time processed video content with a mobile device
US20160171622A1 (en) 2014-12-15 2016-06-16 Loss of Use, Inc. Insurance Asset Verification and Claims Processing System
EP3800532B1 (en) 2014-12-24 2024-06-19 Nokia Technologies Oy Automated monitoring of a scene
US20210312557A1 (en) 2015-01-21 2021-10-07 State Farm Mutual Automobile Insurance Company Automatically Generating Personal Articles Insurance Data Based Upon Digital Images
US10083551B1 (en) 2015-04-13 2018-09-25 Allstate Insurance Company Automatic crash detection
EP3131311B1 (en) 2015-08-14 2019-06-19 Nokia Technologies Oy Monitoring
US11769212B2 (en) 2016-06-08 2023-09-26 Allstate Insurance Company Predictive claims platform for managing repairs
US10902525B2 (en) 2016-09-21 2021-01-26 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
US11361380B2 (en) 2016-09-21 2022-06-14 Allstate Insurance Company Enhanced image capture and analysis of damaged tangible objects
CN107610091A (en) 2017-07-31 2018-01-19 阿里巴巴集团控股有限公司 Vehicle insurance image processing method, device, server and system
CN109145903A (en) 2018-08-22 2019-01-04 阿里巴巴集团控股有限公司 A kind of image processing method and device
US10636211B1 (en) 2018-10-05 2020-04-28 The Toronto-Dominion Bank System and method for remotely indicating vehicular damage
US11468198B2 (en) 2020-04-01 2022-10-11 ImageKeeper LLC Secure digital media authentication and analysis

Patent Citations (116)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4899292A (en) 1988-03-02 1990-02-06 Image Storage/Retrieval Systems, Inc. System for storing and retrieving text and associated graphics
US5128859A (en) * 1990-09-12 1992-07-07 Carbone Albert R Electronic accident estimating system
US5432904A (en) 1991-02-19 1995-07-11 Ccc Information Services Inc. Auto repair estimate, text and graphic system
US5504674A (en) 1991-02-19 1996-04-02 Ccc Information Services, Inc. Insurance claims estimate, text, and graphics network and method
US5317503A (en) 1992-03-27 1994-05-31 Isao Inoue Apparatus for calculating a repair cost of a damaged car
US5950169A (en) 1993-05-19 1999-09-07 Ccc Information Services, Inc. System and method for managing insurance claim processing
US6185540B1 (en) 1994-12-28 2001-02-06 Automatic Data Processing Insurance estimating system
US7092369B2 (en) 1995-11-17 2006-08-15 Symbol Technologies, Inc. Communications network with wireless gateways for mobile terminal access
US7432938B1 (en) 1996-08-19 2008-10-07 Qwest Communications International, Inc. System and method for annotating electronic documents
US7197444B2 (en) 1998-02-04 2007-03-27 Injury Sciences Llc System and method for determining post-collision vehicular velocity changes
US20050125127A1 (en) 1998-02-04 2005-06-09 Bomar John B.Jr. System and method for determining post-collision vehicular velocity changes
JP2000050156A (en) * 1998-08-03 2000-02-18 Nippon Telegr & Teleph Corp <Ntt> News supporting system
US6141611A (en) * 1998-12-01 2000-10-31 John J. Mackey Mobile vehicle accident data system
US20090018874A1 (en) 1999-12-16 2009-01-15 Hartford Fire Insurance Company Method and system for issuing insurance underwriting instruments
EP1215612A1 (en) 2000-02-15 2002-06-19 E.A.C. Co., Ltd. System for estimating car repair expense and estimating method
US20020161533A1 (en) 2000-02-15 2002-10-31 Tateo Uegaki System for recognizing damaged part of accident-involved car and computer-readable medium on which program is recorded
US7953615B2 (en) 2000-04-03 2011-05-31 Mitchell International, Inc. System and method of administering, tracking and managing of claims processing
US20020002475A1 (en) * 2000-04-13 2002-01-03 Joel Freedman Automated insurance system and method
KR20020008466A (en) 2000-07-20 2002-01-31 이수창 Insurance system used by on-line image transmitting and processing method thereof
US20020055861A1 (en) * 2000-11-08 2002-05-09 King Daniel A. Claiming system and method
US20020188479A1 (en) * 2001-06-05 2002-12-12 Renwick Glenn M. Method of processing vehicle damage claims
US7324951B2 (en) 2001-06-05 2008-01-29 Renwick Glenn M Method of processing vehicle damage claims
US7346523B1 (en) 2002-01-11 2008-03-18 P5, Inc. Processing an insurance claim using electronic versions of supporting documents
US7263493B1 (en) 2002-01-11 2007-08-28 P5, Inc. Delivering electronic versions of supporting documents associated with an insurance claim
US20030219169A1 (en) 2002-02-22 2003-11-27 Piergiorgio Sartor Method and apparatus for improving picture sharpness
US20110196707A1 (en) 2002-08-07 2011-08-11 Metropolitan Property And Casualty Insurance Company System and method for identifying and assessing comparative negligence in insurance claims
US20040064345A1 (en) 2002-09-27 2004-04-01 Ajamian Setrak A. Internet claims handling services
US7586654B2 (en) 2002-10-11 2009-09-08 Hewlett-Packard Development Company, L.P. System and method of adding messages to a scanned image
JP2010157267A (en) 2002-11-08 2010-07-15 Tokio Marine & Nichido Fire Insurance Co Ltd Program, method and device for collecting data for damage insurance processing
US7702529B2 (en) 2002-11-27 2010-04-20 Computer Sciences Corporation Computerized method and system for estimating an effect on liability using claim data accessed from claim reporting software
US7203654B2 (en) 2003-01-04 2007-04-10 Dale Menendez Method of expediting insurance claims
US20040153346A1 (en) 2003-02-04 2004-08-05 Allstate Insurance Company Remote contents estimating system and method
US20050246108A1 (en) 2004-03-25 2005-11-03 Airbus France Method and system for characterizing structural damage from observing surface distortions
US20050228683A1 (en) 2004-04-13 2005-10-13 Stephen Saylor Integrated use of a portable image capture device into a workflow process
US7809587B2 (en) * 2004-05-07 2010-10-05 International Business Machines Corporation Rapid business support of insured property using image analysis
US20050251427A1 (en) 2004-05-07 2005-11-10 International Business Machines Corporation Rapid business support of insured property using image analysis
US20080267487A1 (en) 2004-05-11 2008-10-30 Fausto Siri Process and System for Analysing Deformations in Motor Vehicles
US20060080154A1 (en) 2004-07-27 2006-04-13 Larsen Donovan R Method of verifying insurance claims
US20070027726A1 (en) 2004-09-08 2007-02-01 Warren Gregory S Calculation of driver score based on vehicle operation for forward looking insurance premiums
KR20060031208A (en) 2004-10-07 2006-04-12 김준호 A system for insurance claim of broken cars(automoble, taxi, bus, truck and so forth) of a motoring accident
US7889931B2 (en) 2004-10-22 2011-02-15 Gb Investments, Inc. Systems and methods for automated vehicle image acquisition, analysis, and reporting
US20110270641A1 (en) 2006-02-15 2011-11-03 Allstate Insurance Company Retail Location Services
CN1828336A (en) 2006-04-04 2006-09-06 张路平 Rapid estimation method for earthquake disaster damage based on GIS technology
US20080052134A1 (en) * 2006-05-18 2008-02-28 Vikki Nowak Rich claim reporting system
US8095394B2 (en) * 2006-05-18 2012-01-10 Progressive Casualty Insurance Company Rich claim reporting system
US8554587B1 (en) 2006-05-18 2013-10-08 Progressive Casualty Insurance Company Rich claim reporting system
US20080255722A1 (en) 2006-05-22 2008-10-16 Mcclellan Scott System and Method for Evaluating Driver Behavior
US20120297337A1 (en) 2006-05-31 2012-11-22 Manheim Investments, Inc. Computer-based technology for aiding the repair of motor vehicles
US8239220B2 (en) * 2006-06-08 2012-08-07 Injury Sciences Llc Method and apparatus for obtaining photogrammetric data to estimate impact severity
US20090309893A1 (en) 2006-06-29 2009-12-17 Aftercad Software Inc. Method and system for displaying and communicating complex graphics file information
US20080059238A1 (en) 2006-09-01 2008-03-06 Athenahealth, Inc. Medical image annotation
US20090018859A1 (en) 2006-09-01 2009-01-15 Purifoy Jonathan P Method for vehicle repair estimate and scheduling
US20090138290A1 (en) 2006-09-26 2009-05-28 Holden Johnny L Insurance adjustment through digital imaging system and method
US8035639B2 (en) 2006-10-13 2011-10-11 Gerhard Witte Method and apparatus for determining the alteration of the shape of a three dimensional object
US20120290333A1 (en) 2006-10-18 2012-11-15 Hartford Fire Insurance Company System and method for repair calculation, replacement calculation, and insurance adjustment
US7873710B2 (en) 2007-02-06 2011-01-18 5O9, Inc. Contextual data communication platform
US8015036B1 (en) * 2007-04-02 2011-09-06 ClaimAssistant.com, LLC Automated claim mediation system and associated methods
US20080255887A1 (en) * 2007-04-10 2008-10-16 Autoonline Gmbh Informationssysteme Method and system for processing an insurance claim for a damaged vehicle
US8650106B1 (en) 2007-06-13 2014-02-11 United Sevices Automobile Association Systems and methods for processing overhead imagery
US20090100106A1 (en) 2007-10-12 2009-04-16 Anthony Marcus System and Method for Securely Storing Wirelessly Transmitted Text, Images and Video
US20090147988A1 (en) 2007-12-05 2009-06-11 Jones Paul W Image transfer with secure quality assessment
US20130297353A1 (en) * 2008-01-18 2013-11-07 Mitek Systems Systems and methods for filing insurance claims using mobile imaging
US20090234678A1 (en) * 2008-03-11 2009-09-17 Arenas Claims Consulting, Inc. Computer systems and methods for assisting accident victims with insurance claims
US20100138298A1 (en) 2008-04-02 2010-06-03 William Fitzgerald System for advertising integration with auxiliary interface
US8019629B1 (en) 2008-04-07 2011-09-13 United Services Automobile Association (Usaa) Systems and methods for automobile accident claims initiation
US20090265193A1 (en) 2008-04-17 2009-10-22 Collins Dean Methods and systems for automated property insurance inspection
US7962485B1 (en) 2008-04-25 2011-06-14 Trandal David S Methods and systems for inventory management
US8306258B2 (en) 2008-05-09 2012-11-06 Hartford Fire Insurance Company System and method for assessing a condition of an insured property
US8081795B2 (en) 2008-05-09 2011-12-20 Hartford Fire Insurance Company System and method for assessing a condition of property
US20120066012A1 (en) 2008-05-09 2012-03-15 Hartford Fire Insurance Company System and method for assessing a condition of an insured property
US7734485B1 (en) 2008-06-25 2010-06-08 United Services Automobile Association (Usaa) Systems and methods for insurance coverage
US20100036683A1 (en) 2008-08-05 2010-02-11 Logan Andrew J Diagramming tool for vehicle insurance claims
WO2010026170A1 (en) 2008-09-02 2010-03-11 Ecole Polytechnique Federale De Lausanne (Epfl) Image annotation on portable devices
US20100066012A1 (en) 2008-09-17 2010-03-18 Canon Kabushiki Kaisha Image forming system, image forming apparatus, and sheet feeding apparatus
US20100088123A1 (en) 2008-10-07 2010-04-08 Mccall Thomas A Method for using electronic metadata to verify insurance claims
US20100174564A1 (en) 2009-01-06 2010-07-08 Mark Stender Method and system for connecting an insured to an insurer using a mobile device
US20110054806A1 (en) 2009-06-05 2011-03-03 Jentek Sensors, Inc. Component Adaptive Life Management
US20110040692A1 (en) 2009-08-14 2011-02-17 Erik Ahroon System and method for acquiring, comparing and evaluating property condition
US20110213628A1 (en) 2009-12-31 2011-09-01 Peak David F Systems and methods for providing a safety score associated with a user location
US20110161116A1 (en) 2009-12-31 2011-06-30 Peak David F System and method for geocoded insurance processing using mobile devices
US20110161100A1 (en) 2009-12-31 2011-06-30 Peak David F Insurance processing systems and methods using mobile devices for medical monitoring
US20110161118A1 (en) 2009-12-31 2011-06-30 Borden Richard M Insurance processing systems and methods using mobi
US20110161117A1 (en) 2009-12-31 2011-06-30 Busque Keven J Insurance processing system and method using mobile devices for proof of ownership
US20110218825A1 (en) 2010-03-03 2011-09-08 International Business Machines Corporation Three-dimensional interactive vehicle damage claim interface
US20110313936A1 (en) 2010-06-18 2011-12-22 Joseph Michael Sieger Method and apparatus for estimating value of a damaged vehicle
WO2011157064A1 (en) 2010-06-18 2011-12-22 中兴通讯股份有限公司 System and method for vehicle insurance claims based on mobile communication network
US20110313951A1 (en) 2010-06-19 2011-12-22 SHzoom LLC Vehicle Repair Cost Estimate Acquisition System and Method
US20170330207A1 (en) * 2011-01-11 2017-11-16 Accurence, Inc. Method and system for property damage analysis
US20150073864A1 (en) * 2011-01-11 2015-03-12 Accurence, Inc. Method and System for Property Damage Analysis
WO2012113084A1 (en) 2011-02-25 2012-08-30 Audatex Gmbh System and method for estimating collision damage to a car
US20150294419A1 (en) * 2011-02-25 2015-10-15 Jorge Fernando Gonzalez Miranda System and method for estimating collision damage to a car
WO2013003957A1 (en) 2011-07-05 2013-01-10 Shunock Michael Stewart System and method for annotating images
US9970881B1 (en) * 2011-09-08 2018-05-15 United Services Automobile Association (Usaa) Property inspection devices, methods, and systems
US20140229207A1 (en) 2011-09-29 2014-08-14 Tata Consultancy Services Limited Damage assessment of an object
US20130317861A1 (en) * 2012-05-24 2013-11-28 Nathan Lee Tofte System And Method For Real-Time Accident Documentation And Claim Submission
US20140032430A1 (en) 2012-05-25 2014-01-30 Insurance Auto Auctions, Inc. Title transfer application and method
US10332209B1 (en) * 2012-08-16 2019-06-25 Allstate Insurance Company Enhanced claims damage estimation using aggregate display
US10430885B1 (en) * 2012-08-16 2019-10-01 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US10430886B1 (en) * 2012-08-16 2019-10-01 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US8510196B1 (en) * 2012-08-16 2013-08-13 Allstate Insurance Company Feedback loop in mobile damage assessment and claims processing
US8712893B1 (en) * 2012-08-16 2014-04-29 Allstate Insurance Company Enhanced claims damage estimation using aggregate display
US20140108058A1 (en) 2012-10-11 2014-04-17 Agero, Inc. Method and System to Determine Auto Insurance Risk
US20140122133A1 (en) 2012-10-31 2014-05-01 Bodyshopbids, Inc. Method of virtually settling insurance claims
US20140266789A1 (en) 2013-03-12 2014-09-18 The Inner Circle Group LLC System and method for determining a driver in a telematic application
US10380696B1 (en) * 2014-03-18 2019-08-13 Ccc Information Services Inc. Image processing system for vehicle damage
US10373262B1 (en) * 2014-03-18 2019-08-06 Ccc Information Services Inc. Image processing system for vehicle damage
US20150348204A1 (en) 2014-05-28 2015-12-03 James C. Daues Method for assessing hail damage
US20150363717A1 (en) 2014-06-11 2015-12-17 Hartford Fire Insurance Company System and method for processing of uav based data for risk mitigation and loss control
US9824453B1 (en) * 2015-10-14 2017-11-21 Allstate Insurance Company Three dimensional image scan for vehicle
US20170116494A1 (en) * 2015-10-22 2017-04-27 Abbyy Development Llc Video capture in data capture scenario
US9846915B2 (en) * 2016-03-17 2017-12-19 Conduent Business Services, Llc Image capture system for property damage assessment
US20170270612A1 (en) 2016-03-17 2017-09-21 Conduent Business Services, Llc Image capture system for property damage assessment
US20170270650A1 (en) * 2016-03-17 2017-09-21 Conduent Business Services, Llc Image analysis system for property damage assessment and verification
US20180260793A1 (en) 2016-04-06 2018-09-13 American International Group, Inc. Automatic assessment of damage and repair costs in vehicles
US20170293894A1 (en) 2016-04-06 2017-10-12 American International Group, Inc. Automatic assessment of damage and repair costs in vehicles
US10373387B1 (en) * 2017-04-07 2019-08-06 State Farm Mutual Automobile Insurance Company Systems and methods for enhancing and developing accident scene visualizations

Non-Patent Citations (335)

* Cited by examiner, † Cited by third party
Title
"How Can We Stop Insurers Writing Estimates from Photos", Angelo DiTullio, Babcox Media, Inc., BodyShop Business 35.4: 48(2), Apr. 2016.
"Stereoscopic Imaging", https://www.Techopedia.com/definition/91/stereoscopic-imaging, retrieved on Apr. 7, 2019, Year 2019.
Applying Image Analysis to Auto Insurance Triage: A Novel Application, Institute of Electrical and Electronics Engineers, Ying Li and Chitra Dorai, MMSP 2007 Proceedings, Chania, Crete, Greece, Oct. 1-3, 2007, pp. 280-283, Year: 2007.
Apr. 12, 2019-U.S. Final Office Action-U.S. Appl. No. 13/933,576.
Apr. 13, 2020-U.S. Final Office Action-U.S. Appl. No. 14/269,387.
Apr. 23, 2014-U.S. Office Action-U.S. Appl. No. 13/587,635.
Apr. 25, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,630.
Apr. 29, 2015 U.S. Final Office Action-U.S. Appl. No. 13/834,193.
Apr. 3, 2018-U.S. Final Office Action-U.S. Appl. No. 14/190,976.
Apr. 30, 2015 U.S. Final Office Action-U.S. Appl. No. 14/063,570.
Apr. 5, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/892,598.
Apr. 5, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/892,598.
Apr. 6, 2017-U.S. Final Office Action-U.S. Appl. No. 13/933,576.
Apr. 9, 2014 U.S. Office Action U.S. Appl. No. 14/063,533.
Apr. 9, 2014 U.S. Office Action U.S. Appl. No. 14/063,570.
Aug. 1, 2016-U.S. Office Action-U.S. Appl. No. 13/834,161.
Aug. 11, 2015-U.S. Non-Final Office Action-U.S. Appl. No. 14/076,473.
Aug. 12, 2015-U.S. Final Office Action-U.S. Appl. No. 14/269,387.
Aug. 13, 2015-U.S. Non-Final Office Action-U.S. Appl. No. 14/190,976.
Aug. 15, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,170.
Aug. 21, 2014 U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Aug. 21, 2014 U.S. Non-Final Office Action-U.S. Appl. No. 13/587,630.
Aug. 21, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,517.
Aug. 22, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,170.
Aug. 22, 2017-U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Aug. 25, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,210.
Aug. 29, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,210.
Aug. 29, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 16/450,270.
Aug. 3, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
Aug. 31, 2015-U.S. Response to Non-Final Office Action-U.S. Appl. No. 14/671,617.
Aug. 8, 2013-U.S. Office Action-U.S. Appl. No. 13/933,576.
Dec. 1, 2015 U.S. Final Office Action-U.S. Appl. No. 13/892,598.
Dec. 1, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/933,576.
Dec. 2, 2016-U.S. Final Office Action-U.S. Appl. No. 13/834,210.
Dec. 2, 2016-U.S. Final Office Action-U.S. Appl. No. 14/190,976.
Dec. 2, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/269,387.
Dec. 26, 2013-U.S. Office Action-U.S. Appl. No. 13/933,576.
Dec. 3, 2013-U.S. Office Action-U.S. Appl. No. 13/834,161.
Dec. 3, 2014 U.S. Non-Final Office Action-U.S. Appl. No. 14/076,473.
Dec. 3, 2015-U.S. Office Action-U.S. Appl. No. 14/671,617.
Dec. 3, 2019 U.S. Non-Final Office Action-U.S. Appl. No. 14/269,387.
Dec. 4, 2014-U.S. Office Action-U.S. Appl. No. 13/834,161.
Dec. 5, 2019 U.S. Final Office Action-U.S. Appl. No. 15/855,057.
Dec. 7, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,617.
Dec. 8, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,570.
DiTullio, Angelo. "How Can We Stop Insurers Writing Estimates from Photos." Babcox Media, Inc., BodyShop Business 35.4, Apr. 19, 2016, 2 pages. (Year: 2016). *
Download our pocket agent app for iPhone today; retrieved from https://www.statefarm.com/about-us/innovation-research/mobile-apps/pocket-agent-for-iphone/; Copyright, State Farm Mutual Automobile Insurance Company, 2013.
Farmers iClaim, downloaded from http://www.farmers.com/iclaim.html, 2014 Farmers Insurance Company.
Feb. 12, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/341,043.
Feb. 16, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/341,139.
Feb. 19, 2020-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
Feb. 21, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
Feb. 23, 2016-U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Feb. 26, 2014 U.S. Office Action-U.S. Appl. No. 13/834,170.
Feb. 26, 2014 U.S. Office Action-U.S. Appl. No. 14/063,517.
Feb. 27, 2013 U.S. Office Action-U.S. Appl. No. 13/587,620.
Feb. 27, 2020-U.S. Final Office Action-U.S. Appl. No. 16/450,270.
Feb. 5, 2015 U.S. Final Office Action-U.S. Appl. No. 13/587,630.
Feb. 6, 2019-U.S. Notice of Allowance-U.S. Appl. No. 14/190,976.
Feb. 7, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,635.
Get the Nationwide Mobile App for Android and iPhone, downloaded from http://www.nationwide.com/mobile-support.isp, 2014 Nationwide Mutual Insurance Company and Affiliated Companies.
Imaging Technology Saves the Day for Insurance Claim Processing; downloaded from www.snowbound.com/resources/articles/business-benetits-industry-information/imaging-technology-saves-day-insurance; 2013 Snowbound Software.
Jan. 12, 2018-U.S. Final Office Action-U.S. Appl. No. 13/834,161.
Jan. 13, 2015 U.S. Non-Final Office Action-U.S. Appl. No. 13/587,635.
Jan. 14, 2019-U.S. Final Office Action-U.S. Appl. No. 13/834,170.
Jan. 20, 2015 U.S. Non-Final Office Action-U.S. Appl. No. 13/933,576.
Jan. 22, 2016-U.S. Final Office Action-U.S. Appl. No. 14/190,976.
Jan. 23, 2014 U.S. Office Action-U.S. Appl. No. 13/834,210.
Jan. 29, 2020-U.S. Notice of Allowance-U.S. Appl. No. 14/458,826.
Jan. 9, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/892,598.
Jul. 10, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 14/190,976.
Jul. 12, 2016-U.S. Final Office Action-U.S. Appl. 13/587,630.
Jul. 27, 2015-U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Jul. 29, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,170.
Jul. 30, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 15/855,057.
Jul. 6, 2018-U.S. Final Office Action-U.S. Appl. No. 14/671,602.
Jun. 1, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,635.
Jun. 13, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/076,473.
Jun. 17, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,517.
Jun. 20, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,161.
Jun. 25, 2013 U.S. Office Action-U.S. Appl. No. 13/587,635.
Jun. 26, 2018-U.S. Final Office Action-U.S. Appl. No. 13/892,598.
Jun. 3, 2015-U.S. Office Action-U.S. Appl. No. 14/671,617.
Mar. 1, 2016-U.S. Final Office Action-U.S. Appl. No. 14/076,473.
Mar. 12, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,517.
Mar. 13, 2015 U.S. Final Office Action-U.S. Appl. No. 13/834,170.
Mar. 13, 2015 U.S. Office Action-U.S. Appl. No. 13/834,161.
Mar. 13, 2017-U.S. Final Office Action-U.S. Appl. No. 14/269,387.
Mar. 14, 2018-U.S. Final Office Action-U.S. Appl. No. 13/834,170.
Mar. 14, 2018-U.S. Final Office Action-U.S. Appl. No. 13/834,210.
Mar. 16, 2015 U.S. Final Office Action-U.S. Appl. No. 13/834,210.
Mar. 16, 2018-U.S. Final Office Action-U.S. Appl. No. 13/933,576.
Mar. 16, 2018-U.S. Final Office Action-U.S. Appl. No. 14/063,570.
Mar. 20, 2019-U.S. Final Office Action-U.S. Appl. No. 13/834,210.
Mar. 20, 2019-U.S. Final Office Action-U.S. Appl. No. 14/063,570.
Mar. 21, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
Mar. 22, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,630.
Mar. 22, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,517.
Mar. 22, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,617.
Mar. 26, 2014 U.S. Office Action U.S. Appl. No. 13/834,193.
Mar. 26, 2020-U.S. Non-Final Office Action-U.S. Appl. No. 16/586,008.
Mar. 26, 2020—U.S. Non-Final Office Action—U.S. Appl. No. 16/586,008.
Mar. 27, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 15/855,057.
Mar. 27, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 15/855,057.
Mar. 28, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,635.
Mar. 28, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 13/587,635.
Mar. 29, 2017-U.S. Final Office Action-U.S. Appl. No. 13/834,193.
Mar. 29, 2017—U.S. Final Office Action—U.S. Appl. No. 13/834,193.
Mar. 30, 2017-U.S. Final Office Action-U.S. Appl. No. 14/063,533.
Mar. 30, 2017—U.S. Final Office Action—U.S. Appl. No. 14/063,533.
Mar. 30, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,630.
Mar. 30, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 13/587,630.
Mar. 31, 2017-U.S. Final Office Action-U.S. Appl. No. 14/063,570.
Mar. 31, 2017—U.S. Final Office Action—U.S. Appl. No. 14/063,570.
Mar. 4, 2015 U.S. Final Office Action-U.S. Appl. No. 14/063,517.
Mar. 4, 2015 U.S. Final Office Action—U.S. Appl. No. 14/063,517.
Mar. 9, 2018-U.S. Final Office Action-U.S. Appl. No. 14/269,387.
Mar. 9, 2018—U.S. Final Office Action—U.S. Appl. No. 14/269,387.
May 12, 2015 U.S. Final Office Action-U.S. Appl. No. 13/933,576.
May 12, 2015 U.S. Final Office Action—U.S. Appl. No. 13/933,576.
May 14, 2015-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
May 14, 2015—U.S. Non-Final Office Action—U.S. Appl. No. 14/671,602.
May 15, 2019 U.S. Notice of Allowance and Fees Due-U.S. Appl. No. 16/892,598.
May 15, 2019 U.S. Notice of Allowance and Fees Due—U.S. Appl. No. 16/892,598.
May 16, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,602.
May 16, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 14/671,602.
May 17, 2019-U.S. Non-Final Office Action-U.S. Appl. No. 15/374,278.
May 17, 2019—U.S. Non-Final Office Action—U.S. Appl. No. 15/374,278.
May 18, 2015-U.S. Non-Final Office Action-U.S. Appl. No. 14/269,387.
May 18, 2015—U.S. Non-Final Office Action—U.S. Appl. No. 14/269,387.
May 18, 2017-U.S. Final Office Action-U.S. Appl. No. 14/671,617.
May 18, 2017—U.S. Final Office Action—U.S. Appl. No. 14/671,617.
May 19, 2014-U.S. Office Action-U.S. Appl. No. 13/834,161.
May 19, 2014—U.S. Office Action—U.S. Appl. No. 13/834,161.
May 20, 2015 U.S. Non-Final Office Action-U.S. Appl. No. 13/892,598.
May 20, 2015 U.S. Non-Final Office Action—U.S. Appl. No. 13/892,598.
May 20, 2019 U.S. Notice of Allowance and Fees Due-U.S. Appl. No. 13/587,635.
May 20, 2019 U.S. Notice of Allowance and Fees Due—U.S. Appl. No. 13/587,635.
May 23, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/190,976.
May 23, 2016—U.S. Non-Final Office Action—U.S. Appl. No. 14/190,976.
May 24, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,517.
May 24, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,517.
May 3, 2019-U.S. Final Office Action-U.S. Appl. No. 15/184,580.
May 3, 2019—U.S. Final Office Action—U.S. Appl. No. 15/184,580.
May 6, 2014 U.S. Office Action U.S. Appl. No. 13/892,598.
May 6, 2015 U.S. Final Office Action-U.S. Appl. No. 14/076,473.
May 6, 2015 U.S. Final Office Action—U.S. Appl. No. 14/076,473.
May 8, 2015 U.S. Final Office Action-U.S. Appl. No. 14/063,533.
May 8, 2015 U.S. Final Office Action—U.S. Appl. No. 14/063,533.
May 9, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/587,630.
May 9, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 13/587,630.
Mort et al. "Marketing m-services: Establishing a usage benefit typology related to mobile user characteristics" Jul. 2005, Journal of Database Marketing & Customer Strategy Management, vol. 12, Issue 4, pp. 327-341 (Year: 2005). *
Mort, et al, "Marketing m-services: Establishing a Usage Benefit Typology Related to Mobile User Characteristics," Jul. 2005, Journal of Database Martketing & Customer Strategy Management, vol. 12, Issue 4, pp. 327-341.
Nov. 1, 2017-U.S. Final Office Action-U.S. Appl. No. 14/671,602.
Nov. 1, 2017—U.S. Final Office Action—U.S. Appl. No. 14/671,602.
Nov. 10, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,533.
Nov. 10, 2016—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,533.
Nov. 11, 2015-U.S. Response to Non-Final Office Action-U.S. Appl. No. 14/076,473.
Nov. 11, 2015—U.S. Response to Non-Final Office Action—U.S. Appl. No. 14/076,473.
Nov. 12, 2015-U.S. Response to Non-Final Office Action-U.S. Appl. No. 14/190,976.
Nov. 12, 2015—U.S. Response to Non-Final Office Action—U.S. Appl. No. 14/190,976.
Nov. 13, 2018-U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Nov. 13, 2018—U.S. Final Office Action—U.S. Appl. No. 13/587,635.
Nov. 14, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/892,598.
Nov. 14, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 13/892,598.
Nov. 18, 2014-U.S. Final Office Action-U.S. Appl. No. 13/892,598.
Nov. 18, 2014—U.S. Final Office Action—U.S. Appl. No. 13/892,598.
Nov. 18, 2019-U.S. Notice of Allowance-U.S. Appl. No. 14/063,517.
Nov. 18, 2019—U.S. Notice of Allowance—U.S. Appl. No. 14/063,517.
Nov. 18, 2019-U.S. Notice of Allowance-U.S. Appl. No. 14/063,570.
Nov. 18, 2019—U.S. Notice of Allowance—U.S. Appl. No. 14/063,570.
Nov. 2, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 14/269,387.
Nov. 2, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 14/269,387.
Nov. 20, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,533.
Nov. 20, 2014—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,533.
Nov. 20, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,570.
Nov. 20, 2014—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,570.
Nov. 25, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/671,617.
Nov. 25, 2016—U.S. Non-Final Office Action—U.S. Appl. No. 14/671,617.
Nov. 4, 2016-U.S. Final Office Action-U.S. Appl. No. 13/834,170.
Nov. 4, 2016—U.S. Final Office Action—U.S. Appl. No. 13/834,170.
Nov. 9, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,570.
Nov. 9, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,570.
Oct. 13, 2016-U.S. Final Office Action-U.S. Appl. No. 13/834,161.
Oct. 13, 2016—U.S. Final Office Action—U.S. Appl. No. 13/834,161.
Oct. 13, 2016-U.S. Office Action-U.S. Appl. No. 13/933,576.
Oct. 13, 2016—U.S. Office Action—U.S. Appl. No. 13/933,576.
Oct. 17, 2016-U.S. Final Office Action-U.S. Appl. No. 13/892,598.
Oct. 17, 2016—U.S. Final Office Action—U.S. Appl. No. 13/892,598.
Oct. 18, 2016-U.S. Final Office Action-U.S. Appl. No. 14/671,602.
Oct. 18, 2016—U.S. Final Office Action—U.S. Appl. No. 14/671,602.
Oct. 18, 2016-U.S. Office Action-U.S. Appl. No. 14/671,602.
Oct. 18, 2016—U.S. Office Action—U.S. Appl. No. 14/671,602.
Oct. 18, 2019-U.S. Notice of Allowance-U.S. Appl. No. 14/671,617.
Oct. 18, 2019—U.S. Notice of Allowance—U.S. Appl. No. 14/671,617.
Oct. 20, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,210.
Oct. 20, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,210.
Oct. 21, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 14/063,570.
Oct. 21, 2016—U.S. Non-Final Office Action—U.S. Appl. No. 14/063,570.
Oct. 24, 2016-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,193.
Oct. 24, 2016—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,193.
Oct. 30, 2014-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,193.
Oct. 30, 2014—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,193.
Oct. 30, 2015 U.S. Non-Final Office Action-U.S. Appl. No. 13/587,635.
Oct. 30, 2015 U.S. Non-Final Office Action—U.S. Appl. No. 13/587,635.
Oct. 4, 2018-U.S. Final Office Action-U.S. Appl. No. 15/855,057.
Oct. 4, 2018—U.S. Final Office Action—U.S. Appl. No. 15/855,057.
Oct. 4, 2019-U.S. Final Office Action-U.S. Appl. No. 15/374,278.
Oct. 4, 2019—U.S. Final Office Action—U.S. Appl. No. 15/374,278.
Oct. 5, 2018-U.S. Final Office Action-U.S. Appl. No. 13/587,630.
Oct. 5, 2018—U.S. Final Office Action—U.S. Appl. No. 13/587,630.
Oct. 6, 2017-U.S. Final Office Action-U.S. Appl. No. 13/587,630.
Oct. 6, 2017—U.S. Final Office Action—U.S. Appl. No. 13/587,630.
Oct. 9, 2018-U.S. Final Office Action-U.S. Appl. No. 13/834,161.
Oct. 9, 2018—U.S. Final Office Action—U.S. Appl. No. 13/834,161.
Security First Mobile, downloaded from http://www.securityfirstflorida.com/security-first-mobile/, Copyright 2014 Security First Insurance Company.
Sep. 13, 2017-U.S. Final Office Action-U.S. Appl. No. 14/063,517.
Sep. 13, 2017—U.S. Final Office Action—U.S. Appl. No. 14/063,517.
Sep. 15, 2016-U.S. Final Office Action-U.S. Appl. No. 14/063,517.
Sep. 15, 2016—U.S. Final Office Action—U.S. Appl. No. 14/063,517.
Sep. 17, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/933,576.
Sep. 17, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 13/933,576.
Sep. 18, 2017-U.S. Final Office Action-U.S. Appl. No. 13/892,598.
Sep. 18, 2017—U.S. Final Office Action—U.S. Appl. No. 13/892,598.
Sep. 21, 2018-U.S. Final Office Action-U.S. Appl. No. 14/671,617.
Sep. 21, 2018—U.S. Final Office Action—U.S. Appl. No. 14/671,617.
Sep. 23, 2013 U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Sep. 23, 2013 U.S. Final Office Action—U.S. Appl. No. 13/587,635.
Sep. 23, 2015-U.S. Final Office Action-U.S. Appl. No. 14/671,602.
Sep. 23, 2015—U.S. Final Office Action—U.S. Appl. No. 14/671,602.
Sep. 25, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,161.
Sep. 25, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,161.
Sep. 25, 2017-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,170.
Sep. 25, 2017—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,170.
Sep. 29, 2016 U.S. Final Office Action-U.S. Appl. No. 13/587,635.
Sep. 29, 2016 U.S. Final Office Action—U.S. Appl. No. 13/587,635.
Sep. 30, 2013 U.S. Office Action-U.S. Appl. No. 13/587,630.
Sep. 30, 2013 U.S. Office Action—U.S. Appl. No. 13/587,630.
Sep. 30, 2019-U.S. Final Office Action-U.S. Appl. No. 14/671,602.
Sep. 30, 2019—U.S. Final Office Action—U.S. Appl. No. 14/671,602.
Sep. 4, 2018-U.S. Non-Final Office Action-U.S. Appl. No. 13/834,210.
Sep. 4, 2018—U.S. Non-Final Office Action—U.S. Appl. No. 13/834,210.
Sep. 5, 2018-U.S. Final Office Action-U.S. Appl. No. 14/063,517.
Sep. 5, 2018—U.S. Final Office Action—U.S. Appl. No. 14/063,517.
Welcome to USAA! to get started, please enable cookies, downloaded from https://www.usaa.com/inet/pages/mobile_access_methods_mobileapps?akredirect=true, 2013 USAA.
Ying Li et al. "Applying Image Analysis to Auto Insurance Triage: A Novel Application" Oct. 1-3, 2007. Institute of Electrical and Electronics Engineers, MMSP 2007 Proceedings, Chania, Crete, Greece. pp. 280-283 (Year: 2007). *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220343435A1 (en) * 2012-08-16 2022-10-27 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US12079878B2 (en) 2012-08-16 2024-09-03 Allstate Insurance Company Feedback loop in mobile damage assessment and claims processing
US12079877B2 (en) * 2012-08-16 2024-09-03 Allstate Insurance Company Processing insured items holistically with mobile damage assessment and claims processing
US11861137B2 (en) 2020-09-09 2024-01-02 State Farm Mutual Automobile Insurance Company Vehicular incident reenactment using three-dimensional (3D) representations
US12142039B1 (en) 2022-06-23 2024-11-12 State Farm Mutual Automobile Insurance Company Interactive insurance inventory and claim generation

Also Published As

Publication number Publication date
US10430885B1 (en) 2019-10-01
US10878507B1 (en) 2020-12-29
US11386503B2 (en) 2022-07-12
US10430886B1 (en) 2019-10-01
US20210056640A1 (en) 2021-02-25
US12079877B2 (en) 2024-09-03
US20220343435A1 (en) 2022-10-27

Similar Documents

Publication Publication Date Title
US12079878B2 (en) Feedback loop in mobile damage assessment and claims processing
US11386503B2 (en) Processing insured items holistically with mobile damage assessment and claims processing
US11915321B2 (en) Configuration and transfer of image data using a mobile device
US11625791B1 (en) Feedback loop in mobile damage assessment and claims processing
US11756131B1 (en) Automated damage assessment and claims processing
US11361385B2 (en) Application facilitated claims damage estimation
US11783428B2 (en) Agent-facilitated claims damage estimation
US9824453B1 (en) Three dimensional image scan for vehicle
US11455691B2 (en) Processing insured items holistically with mobile damage assessment and claims processing
US10572944B1 (en) Claims damage estimation using enhanced display
US20230334589A1 (en) User devices in claims damage estimation
US20230020642A1 (en) Processing insured items holistically with mobile damage assessment and claims processing

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4