US20140267830A1 - Identifying and Tracking Digital Images With Customized Metadata
- Publication number: US20140267830A1 (application US 14/102,567)
- Authority: United States (US)
- Prior art keywords: data, image, subject, camera, code
- Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06F17/3028
- H04N5/77 — Interface circuits between a recording apparatus and a television camera
- H04N9/8205 — Transformation of the television signal for recording, involving the multiplexing of an additional signal and the colour video signal
- G06F16/51 — Information retrieval of still image data; indexing; data structures therefor; storage structures
- H04N23/60 — Control of cameras or camera modules
- H04N5/232
- H04N5/772 — Interface circuits where the recording apparatus and the television camera are placed in the same enclosure
Definitions
- This invention relates generally to the field of portrait photography and more particularly to a method and apparatus for facilitating the identification and tracking of photographic portraits.
- Photographic studios, professional photographers, and others performing commercial portrait work (collectively referred to herein as “photographers”) often need to identify and track large numbers of images through the workflow process from image capture to portrait delivery.
- Captured images are tracked by associating each image with organizational information indicating how the image should be processed, to whom the processed image should be delivered, and other descriptive information useful for later identification, selection, or sorting of captured images. Without such organizational information, each captured image would need to be individually reviewed, which could be time consuming when processing a large number of images. Moreover, such review of individual images by others not present when the images were captured can be an arduous and inaccurate process.
- The organizational information includes information identifying the customer or the subject of the image.
- Identifying information can include the name, school, and grade of the subject of each photograph.
- The organizational information also includes order information. For example, each individual who has a photographic portrait taken may want multiple copies of the portrait in one or more sizes using one or more finishing techniques. Photographic portraits may also include multiple individuals, each of whom may want a different number of copies of the photograph in different sizes.
- Photographers create order packages to facilitate tracking order information.
- Each package has a set number of copies for a set number of sizes.
- A first package may include two pictures having a dimension of five inches by seven inches and ten “wallet” pictures having a dimension of two inches by three inches.
- Another package may include one picture having a dimension of eight inches by ten inches and eight “wallet” pictures.
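The package structure described above maps naturally to a small lookup table. The sketch below is illustrative only; the package letters and the helper function are invented here, though the print counts mirror the two example packages given above.

```python
# Hypothetical order-package table. Letters "A"/"B" and the helper name
# are assumptions for illustration; the quantities follow the two
# example packages described above.
PACKAGES = {
    "A": [("5x7", 2), ("wallet", 10)],   # two 5"x7" prints, ten 2"x3" wallets
    "B": [("8x10", 1), ("wallet", 8)],   # one 8"x10" print, eight wallets
}

def prints_required(package_code):
    """Total number of prints needed to fulfil one order of a package."""
    return sum(qty for _size, qty in PACKAGES[package_code])
```

A lab tracking many orders could sum `prints_required` over all scanned package codes to plan a print run.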
- Tracking organizational information in some prior systems includes printing the organizational information directly onto a negative of each portrait.
- The association between the organizational information and the image must be otherwise tracked between taking the picture and printing the negative.
- This tracking challenge exists whether the photographic medium utilized is physical (e.g., a printed negative) or digital (e.g., a digital picture file).
- Computer applications can be used to add metadata to an image downloaded from a digital camera.
- Even then, the association between the digital picture and the identity information must be otherwise maintained before obtaining the picture with the computer application.
- One example prior method of associating an image with identifying information at the time the image is taken includes sequential ordering. For example, a photographer can record a list of photographic sessions, the number of pictures taken during each session, and the customer associated with each session in the same order in which the sessions are conducted. In such a system, however, the pictures must be kept in the same order in which they are taken or the association between each picture and the identity information may be lost.
- Digital cameras typically record some types of information about a picture, known as metadata, along with the picture. In general, however, such information describes the picture itself and does not identify a customer or the subject of the image. For example, a typical camera encodes information regarding the make and model of the camera, the camera settings, and whether a flash is used. Some digital cameras also include information regarding the date and time the picture is taken. Some other digital cameras include a Global Positioning System (GPS) unit to track and record the physical location, Coordinated Universal Time (UTC) date, and UTC time at which each picture is taken.
- Data can be associated with a digital image file on a digital camera through emulation of the memory card of the digital camera.
- US Publication No. 2005/0036034 discloses a system by which a processing unit emulates a memory card of a digital camera such that the digital camera's electronics operate with the memory of the processing unit through a camera interface card as if such memory was on a memory card located in the digital camera. Images captured by the digital camera are transferred, via the camera interface card, to the memory of the processing unit and the digital camera can access (e.g., read) images stored in the memory of the processing unit.
- Data can be associated with the image files by partitioning the memory of the processing unit to accommodate both originally captured images and copies of the images including file names and/or headers modified to indicate the data desired to be associated with the images.
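As a rough illustration of the filename-modification half of that prior-art scheme (not the publication's actual implementation), a copy of each captured file can carry the associated data in its name while the originally captured file stays untouched. The `JOB<number>_` naming convention below is invented for this sketch.

```python
import os
import shutil

def tag_copy(image_path, dest_dir, job_number):
    """Create a copy of a captured image whose filename carries the
    tracking data, leaving the original untouched -- a minimal sketch of
    the two-copy (original plus modified-name) approach described above.
    The "JOB<number>_" prefix is a hypothetical convention."""
    tagged = os.path.join(
        dest_dir, f"JOB{job_number}_{os.path.basename(image_path)}"
    )
    shutil.copy2(image_path, tagged)  # byte-for-byte copy, metadata preserved
    return tagged
```

The original file remains available for reprocessing, while the tagged copy can be sorted or routed by its embedded job number.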
- The invention relates to a method and system for associating customized information with a digital image and, in particular, identifying and tracking digital images using customizable user metadata.
- Customized information includes any information a photographer desires to associate with an image. Examples of customized information include information identifying a customer, information identifying an order package, and any internal tracking information of interest to the photographer. The invention thereby enables more accurate identification and tracking of such information with regard to each image.
- The invention is implemented using a camera.
- The camera obtains and stores an image.
- The camera also stores customized information along with the image, thereby associating the customized information with the image.
- The association between the customized information and the image is automatically maintained when the image is retrieved from the camera at a later time.
- The camera obtains the customized data from an input device.
- A photographer can enter the customized information into the input device and can transmit the entered information to the camera.
- A processor device acts as an intermediary between the camera and the input device.
- The processor device resequences, reformats, or otherwise modifies the input information to be readable by the camera.
- A method for obtaining and tracking digital images includes inputting data identifying a subject of an image into a camera, acquiring an image with the camera, and storing the image and the inputted data, as metadata, in an image file when the image is acquired.
- A system for obtaining and tracking digital images includes a digital camera, an input device, and a data processor.
- The digital camera obtains digital images and embeds metadata into digital image files encoding the digital images.
- The input device obtains information relating to a subject of a digital image.
- The obtained information has a format different from the data formats processable by the digital camera.
- The data processor receives the obtained information from the input device, converts the format of the obtained information to a format processable by the digital camera, and inputs the converted information into the digital camera as metadata.
- The camera is configured to accept and process a GPS data string.
- The input device includes a barcode scanner.
- The data processor includes a microcontroller.
- The camera, the input device, and the data processor are provided within a housing to form an integrated device.
- FIG. 1 illustrates an operation flow for an example process for providing photographic portraits to customers according to one embodiment of the present invention.
- FIG. 2 illustrates an operation flow for an example process for aiding in processing images according to one embodiment of the present invention.
- FIG. 3 illustrates an operation flow for a process for identifying and tracking digital images according to one embodiment of the present invention.
- FIG. 4 illustrates a block diagram of a system for implementing the process shown in FIG. 3 according to one embodiment of the present invention.
- FIG. 5 illustrates one example of a data processing device including processing software according to one embodiment of the present invention.
- FIG. 6 illustrates one example embodiment of an integrated device configured according to the teaching of the present invention.
- FIG. 7 illustrates an operation flow for a process for associating the processed user data with a digital picture according to one embodiment of the present invention.
- FIG. 8 illustrates three events occurring during the use of one embodiment of the present invention.
- FIG. 9 illustrates an example identification card according to one embodiment of the present invention.
- FIG. 10 illustrates a block diagram depicting loading user data into GPS fields of a GPS data stream.
- FIGS. 11A-11G depict example GPS fields and a valid input format for each example field according to one embodiment of the present invention.
- FIG. 12 illustrates an example operation flow for a process for converting user-customized information to information processable by a digital camera.
- FIGS. 13A-13C illustrate example GPS fields resulting from some applications of the conversion process.
- FIG. 14 illustrates a block diagram depicting how GPS metadata is obtained and restored to a useable form according to one embodiment of the present invention.
- FIG. 15 illustrates an operation flow for a process for obtaining a saved image and the corresponding metadata from a camera and converting the metadata to useable data according to one embodiment of the present invention.
- FIG. 16 illustrates an operation flow for a process for processing and delivering a captured image according to organizational data associated with the image.
- The invention relates to methods and systems for identifying and tracking digital images using metadata.
- The invention relates to methods and systems for associating customizable metadata, such as organizational data aiding in image production, with a digital picture at the time the digital picture is taken and tracking the digital picture with the associated metadata.
- FIG. 1 illustrates an operation flow for an example process 100 for providing photographic portraits to customers.
- The process 100 begins at start module 105 and proceeds to a capture operation 110.
- The capture operation 110 obtains a photographic image of a subject.
- The photographic image is obtained with a digital camera.
- A process operation 115 produces one or more copies of the image.
- The process operation 115 renders the captured image according to order information provided by the customer.
- The process operation 115 edits the captured image before rendering the image.
- A deliver operation 120 sends the processed image to the customer.
- The deliver operation 120 includes packaging and labeling the rendered image. The process 100 ends at stop module 125.
- FIG. 2 illustrates an operation flow for an example process 200 for aiding in processing images.
- The process 200 begins at start module 205 and proceeds to a first acquire operation 210.
- The first acquire operation 210 obtains data pertaining to an image to be captured.
- The obtained data includes organizational data aiding in the process operation 115 discussed in FIG. 1. Examples of organizational data include a job number, a reference number, and other tracking information. Details on acquiring the data are disclosed herein.
- A second acquire operation 215 captures an image.
- The second acquire operation 215 captures the image by taking a picture of a subject using a digital camera.
- An associate operation 220 creates an association between the captured image and the acquired data.
- The associate operation 220 stores the data as metadata in the image file encoding the captured image.
- The associate operation 220 stores organizational data, such as a job number, with a captured image in the Exchangeable Image File Format (EXIF).
- FIG. 3 illustrates an operation flow for a process 300 for identifying and tracking digital images.
- The process 300 begins at start module 305 and proceeds to input operation 310.
- Input operation 310 obtains, from a user, data to be associated with an image.
- The obtained data is formatted to be readable by an input device but not by a digital camera.
- The obtained data is formatted as barcode data.
- The obtained data is a sequence of numbers and/or letters.
- Generally, the obtained data can be in any format not readable by a camera.
- A process operation 320 reformats (or otherwise converts) the obtained data into a format readable by a digital camera. Generally, the process operation 320 reformats the obtained data to resemble data the camera is configured to receive. In some embodiments, the obtained data is broken up (i.e., fragmented) and rearranged into a new sequence. For example, in one embodiment, the process operation 320 converts the obtained data into a GPS data string.
- A store operation 330 encodes the reformatted data as metadata in a digital picture file at the time the corresponding picture is obtained. The data to be associated with the digital picture is thereby stored in the digital image file.
- The process 300 ends at module 335.
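The conversion in process operation 320 is the heart of the approach: arbitrary user data is repackaged so the camera will accept it through its GPS metadata input. The patent's actual field layout appears in FIGS. 11-13, which are not reproduced here, so the sketch below invents its own mapping: a six-digit job number packed into the latitude field and a seven-digit subject ID packed into the longitude field of a GGA-style NMEA 0183 sentence, with the standard NMEA checksum.

```python
def nmea_checksum(body):
    """Standard NMEA 0183 checksum: XOR of every character between '$' and '*'."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return f"{cs:02X}"

def encode_tracking_data(job_number, subject_id):
    """Pack tracking digits into GGA-style latitude/longitude fields.
    The field mapping here is an invented illustration of the general
    technique, NOT the patent's FIG. 11 layout."""
    job = f"{int(job_number):06d}"          # e.g. 12345 -> "012345"
    sub = f"{int(subject_id):07d}"          # e.g. 67890 -> "0067890"
    lat = f"{job[:4]}.{job[4:]}00"          # ddmm.mmmm  -> "0123.4500"
    lon = f"{sub[:5]}.{sub[5:]}00"          # dddmm.mmmm -> "00678.9000"
    body = f"GPGGA,120000.00,{lat},N,{lon},E,1,08,1.0,0.0,M,0.0,M,,"
    return f"${body}*{nmea_checksum(body)}"

def decode_tracking_data(sentence):
    """Recover the job number and subject ID from a sentence built above,
    mirroring the metadata-restoration step depicted in FIGS. 14-15."""
    body, checksum = sentence[1:].split("*")
    if nmea_checksum(body) != checksum:
        raise ValueError("corrupted sentence")
    fields = body.split(",")
    job = int(fields[2].replace(".", "")[:6])   # latitude field
    sub = int(fields[4].replace(".", "")[:7])   # longitude field
    return job, sub
```

Because the sentence is syntactically valid NMEA, a camera expecting GPS input would store it as ordinary GPS metadata; the decode step later recovers the tracking fields from the saved image file.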
- FIG. 4 illustrates a block diagram of a system for implementing the process 300 shown in FIG. 3.
- The system 400 includes a data input device 410, a data processing device 420, and a digital camera 430.
- The system 400 also includes one or more external lights 440 for illuminating a subject when a picture of the subject is taken.
- A power unit 425 is also typically electrically coupled to at least one of the above devices 410, 420, 430, 440.
- The data input device 410 enables a user to enter data that the user desires to associate with a digital picture into the system 400.
- Examples of such user-entered data include information identifying the subject of the image, information regarding desired future applications of the image, and other data of interest to the user.
- The user-entered data includes the subject's name and an order package choice.
- Examples of the data input device 410 include a scanner 412, a keypad 414, and a character sequencer 416. In other embodiments, however, any suitable input device can be used.
- The scanner 412 is a barcode scanner and the data of interest to the user is formatted and printed as a barcode.
- The data of interest is a sequence of numbers and/or letters that the user can manually enter using the keypad 414.
- The character sequencer 416 generates a sequence of numbers and/or letters to be associated with the user.
- The data processing device 420 converts the data obtained by the input device 410 into a form of data readable by the camera 430.
- The data processing device 420 includes a processing unit 421, such as a microcontroller, a computer, or a logic simulator, and memory storage 422.
- The data processing device 420 is communicatively coupled to the data input device 410 and to the camera 430.
- The data processing device 420 includes a communications interface 423 for communicating with at least one of the input device 410 or the camera 430.
- The data processing device 420 includes a camera control unit 424 for operating the camera 430.
- The data processing device 420 will be described in greater detail herein.
- The digital camera 430 obtains images (e.g., shoots pictures) and creates image storage files encoding the obtained images.
- The digital camera 430 is also configured to obtain and process certain types of metadata associated with the obtained image.
- The digital camera 430 recognizes one or more metadata formats and stores metadata having a recognizable format in the image storage files along with the obtained images.
- The digital camera 430 includes a Global Positioning System (GPS) unit and is configured to receive and process a GPS data string as metadata.
- The GPS data string includes one or more fields configured to contain GPS information.
- One or more external lights 440 are operably coupled to the data processing device 420 or to the digital camera 430.
- The lights 440 are synchronized with the camera 430 to illuminate the subject of the digital picture when the picture is taken.
- FIG. 5 illustrates a block diagram of one example of a data processing device including processing software.
- The data processing device 520 includes a microcontroller 550 and program memory 560.
- The data processing device 520 also includes an interface 570 for communicating between the processing device 520 and the camera, such as the camera 430 of FIG. 4.
- The data processing device 520 also includes input ports and output ports for communicatively coupling to a data input device, a camera, and external lights.
- The input ports include a shutter synchronization input port 552, a data input port 554, and a flash input port 558.
- The shutter input port 552 couples to a shutter output of the digital camera. The shutter output indicates whether the camera's shutter was triggered (i.e., whether a picture was taken).
- The data input port 554 couples to a data input device, such as the scanner 412 of FIG. 4.
- The flash input port 558 couples to a flash synchronization output of the digital camera that indicates when the camera flash, and hence the external lighting, should illuminate.
- The data processing device 520 couples to an external power device, such as power device 425 of FIG. 4, via power input port 556.
- The data processing device 520 includes an internal power source (not shown), such as a battery.
- The output ports include a shutter control port 582, a metadata output port 584, and a focus/wakeup output port 586 coupled to the digital camera.
- The shutter control port 582 enables the data processing device 520 to inhibit and uninhibit the operation of the shutter of the digital camera.
- The metadata output port 584 couples the data processing device 520 to a metadata input port of the digital camera.
- A digital camera metadata input port includes a GPS input port configured to receive information indicating the geographic position of the camera from a GPS unit.
- The focus/wakeup output port 586 couples the data processing device 520 to the settings input of a digital camera.
- The focus/wakeup output port 586 couples the data processing device to the auto focus feature of the digital camera.
- The output ports also include a light output port 588 coupled to one or more external lights, such as light 440 of FIG. 4, for operating the lights.
- The external lights may be coupled directly to the flash synchronization output of the digital camera.
- The program memory 560 of the data processing device 520 stores at least one program executed by the controller 550. In other embodiments, some of the functions of the programs are hardwired into the controller 550.
- The program memory 560 stores a shutter control module 562, a conversion module 563, a trip timer 565, and a wakeup module 566.
- The shutter control module 562 prevents and enables triggering of the camera shutter. A user cannot take a picture using the camera if the shutter control module is inhibiting operation of the shutter.
- The conversion module 563 reformats or converts the data entered by a user to a format readable by the digital camera.
- The trip timer 565 determines whether a particular sequence of events has occurred within a set time period.
- The wakeup module 566 has control over at least one setting or parameter of the camera.
- The wakeup module 566 prepares the camera to take a picture (i.e., “wakes up” the camera) by adjusting the setting. In one example embodiment, the wakeup module 566 adjusts the focus of the camera.
- The program memory 560 of the data processing device 520 also stores an interface driver 564 for operating the interface 570.
- The interface 570 enables the data processing device 520 to communicate with the digital camera.
- The program memory 560 stores a flash synchronization module 568 configured to control when the camera flash and/or the external lights illuminate the subject of a picture.
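The trip timer 565 can be sketched as a small helper around a monotonic clock: start it when a sequence of events begins, then ask whether the allowed window has elapsed. This is an illustrative reconstruction, not the patent's firmware; the class name and API are invented.

```python
import time

class TripTimer:
    """Sketch of a trip timer like module 565: reports whether the
    configured window has elapsed since the timer was started."""
    def __init__(self, window_seconds):
        self.window = window_seconds
        self._started_at = None

    def start(self):
        # Begin timing a new sequence of events.
        self._started_at = time.monotonic()

    def expired(self):
        # A timer that was never started cannot have expired.
        if self._started_at is None:
            return False
        return time.monotonic() - self._started_at > self.window
```

A controlling loop would call `start()` when data is received and poll `expired()` to decide whether the entered data has become stale.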
- FIG. 6 illustrates one example embodiment of an integrated camera and data processing device configured according to the teaching of the present invention.
- The integrated device 600 includes an enclosure 622 housing a circuit board 623 and a camera 630.
- The circuit board 623 includes a microcontroller 650 having a clock 651, program memory 660, non-volatile memory 653, and at least one input/output port.
- The microcontroller 650 is operatively coupled to an input device 610.
- The input device 610 is housed within the enclosure 622.
- The enclosure 622 houses a barcode scanner 610.
- Alternatively, the input device 610 is external of the enclosure 622 and couples to the controller through an input port on the enclosure 622.
- The microcontroller 650 couples to the input device 610 through a serial port 654.
- Alternatively, the input device 610 couples to the microcontroller 650 through a USB port (not shown) and a USB hub, such as USB hub 627.
- The USB hub 627 couples to a USB interface 657 of the microcontroller 650 through another USB port.
- Any suitable data transfer means can be used.
- The microcontroller 650 converts the data received through the input device port 654 and transmits the converted data through an output port 682 to a metadata input port 634 of the camera 630.
- The input device 610 includes a trigger 611 that operates the input device 610.
- The trigger 611 is a button on the enclosure 622 that couples to a trigger input port on the microcontroller 650.
- Alternatively, the input device 610 is external of the integrated device 600, and the trigger 611 operationally couples to the input device 610 and not to the microcontroller 650.
- The microcontroller 650 is further coupled to the camera via a shutter output port 652 on the microcontroller 650 and a shutter input port 632 on the camera, enabling the microcontroller 650 to control the camera shutter.
- A focus output port 686 on the microcontroller 650 and a focus input port 636 on the camera 630 enable the microcontroller 650 to adjust the camera focus.
- The microcontroller 650 may control other camera settings instead of the focus. In such a case, the appropriate ports would be used.
- The circuit board 623 also includes a flash synchronization interface 668.
- A signal indicating when the camera flash should illuminate is transmitted from an output port 638 on the camera 630 to an input port 658 on the flash interface 668.
- The signal is then transmitted via an output port 688 to external lights, such as the external lights 440 of FIG. 4.
- Alternatively, the flash synchronization signal is transmitted from the camera output port 638 directly to the external lights.
- The integrated device 600 is configured to run on one or more power sources.
- The circuit board 623 includes a power supply/switch 656 that is configured to receive input from an external 9 Volt DC source, an external 13.5 Volt power source, and/or a 6 Volt power source located in the camera 630.
- The power switch 656 is also configured to transmit power to a 13.5 Volt DC input port on the camera.
- The camera 630 is also configured to use a removable battery 635.
- The camera 630 includes a lens 631 and memory 639 in which to store digital pictures.
- The memory 639 includes a removable flash memory card.
- Any suitable memory can be used.
- The camera 630 includes an operation interface 633.
- An external computer (not shown) controls the camera 630.
- The camera 630 is operatively coupled to the external computer with a cable connection.
- Alternatively, the camera 630 is operatively coupled to the external computer with a wireless connection.
- FIG. 7 illustrates an operation flow for an example process 700 for associating the converted user data with a digital picture at the time the picture is taken.
- the process 700 begins at module 705 and proceeds to an inhibit operation 710 .
- the inhibit operation 710 prevents the user from triggering the camera shutter, thereby preventing the user from taking a picture.
- the inhibit operation 710 prevents a user from taking a picture until after the user has entered data to be associated with the picture.
- a receive data operation 715 receives user-entered data from a data input device, such as the data input device 410 of FIG. 4 .
- the receive data operation 715 receives a data string including a sequence of numbers and/or letters.
- the receive data operation 715 receives two or more data strings.
- a start timer operation 720 activates a timer that counts down over a preset length of time.
- a convert operation 725 reformats (i.e., or otherwise converts) the data received in operation 715 into a format recognizable to the digital camera.
- a transmit operation 730 sends the reformatted data to the digital camera. In some embodiments, the transmit operation 730 sends the reformatted data to the digital camera only once. In other embodiments, the transmit operation 730 repeatedly sends the reformatted data to the digital camera in a continuous stream.
- a wakeup camera operation 735 adjusts at least one of the parameters of the digital camera in order to ensure that the camera is ready to receive the data and/or to take a picture.
- the wakeup camera operation 735 changes the focus of the camera. In other embodiments, however, the wakeup camera operation 735 turns the camera flash on or off, changes the zoom setting, or adjusts another setting on the camera.
- an uninhibit operation 745 then releases the shutter trigger and enables the user to take a picture using the camera.
- a delay operation 740 occurs after the wakeup operation 735 , but before the uninhibit operation 745 .
- the delay operation waits for a predetermined amount of time before proceeding to the uninhibit operation 745 .
- the delay operation 740 has a sufficiently long duration to ensure that the camera has time to both wakeup and to receive the transmitted data.
- the process 700 monitors at module 750 whether or not the shutter has actually been triggered. If and when the shutter is triggered, indicating that a picture has been taken, the process 700 returns to the inhibit operation 710 to begin another cycle. The process 700 also monitors at module 755 the status of the timer that was activated by operation 720 . When a predetermined length of time has passed without detecting operation of the camera shutter, the process 700 returns to the inhibit operation 710 . Returning to the inhibit operation 710 prevents the user from taking a picture until a new set of data has once again been received and all steps of the process 700 have been repeated. This timeout feature helps to ensure that the entered data is associated with the correct picture.
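The inhibit/timeout cycle of process 700 can be sketched as a small controller. The class name, method names, and the 30-second default timeout are illustrative assumptions, not details from the description.

```python
class ShutterController:
    """Illustrative sketch of the inhibit/transmit/timeout cycle of process 700."""

    def __init__(self, timeout_seconds=30.0):
        self.timeout_seconds = timeout_seconds  # assumed timeout length
        self.shutter_enabled = False            # operation 710: start inhibited
        self.deadline = None
        self.pending_data = None

    def receive_data(self, data_string, now):
        # Operations 715-745: accept the entered data, start the timer,
        # and release (uninhibit) the shutter for one capture.
        self.pending_data = data_string
        self.deadline = now + self.timeout_seconds
        self.shutter_enabled = True

    def try_capture(self, now):
        # Modules 750/755: allow a capture only while the shutter is
        # enabled and the timer has not expired; in either outcome the
        # process returns to the inhibit state (operation 710).
        if not self.shutter_enabled or now > self.deadline:
            self.shutter_enabled = False
            return None
        data, self.pending_data = self.pending_data, None
        self.shutter_enabled = False  # one picture per data entry
        return data
```

A capture attempt before data entry, or after the timeout, yields nothing; a new data entry is required before the next picture, which keeps each data set paired with exactly one image.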
- FIG. 8 illustrates three events occurring during such an application.
- the three events occur at times T1, T2, and T3, respectively.
- an individual 890 receives an identification card 895 (or another indicia-bearing instrument) indicating information by which the individual's school picture can be tracked.
- the identification card 895 indicates a job number and the individual's name. In other embodiments, however, the identification card 895 can also indicate the individual's school, class, town, whether the individual is a student or a teacher, or any other information useful for identifying the individual 890 .
- the identification card 895 also includes order information. For example, in the embodiment shown, the identification card 895 indicates at 804 a that the individual 890 has ordered Package B. The order information is also encoded as a barcode at 804 b. In other embodiments, however, an individual 890 indicates her order preference separately. In some example embodiments, the individual 890 indicates her order preference on an envelope enclosing money equal to the cost of the order. In other example embodiments, the individual 890 indicates her order preference orally to the photographer 897 .
- information displayed on the identification card 895 is printed in a format readable to a photographer 897 .
- the information is written in a format readable by the input device 810 , but not readable to the photographer 897 .
- the information is written in multiple formats.
- the identification card 895 displays the individual's name and order package choice in both a typeface format 802 a, 804 a and a barcode format 802 b, 804 b, respectively.
- subject portraits are not the only types of pictures taken by the photographer 897 during a session.
- the term subject portrait refers to a picture of a subject such as an individual.
- Examples of other types of pictures are pictures taken to calibrate or monitor the performance of the camera.
- Still other examples include pictures taken to aid in tracking and identifying the pictures taken during a session.
- pictures are taken of start boards, calibration boards, room boards, end boards, and other such instruments. Start and end boards indicate the beginning and end, respectively, of a sequence of shots.
- a picture of a calibration board can be used to tailor the processes used to develop a picture.
- a dust shot can also be taken to determine the effectiveness of the camera over time.
- a photographer 897 can enter additional information, such as a control code, into the input device to indicate the type of picture being taken.
- the photographer 897 enters a control code separately.
- the control code is included in the information displayed on the identification card 895 and entered when the displayed information is entered.
- the information on the identification card 895 is entered using an input device 810 .
- the input device 810 is coupled to a data processing device 820 .
- the input device 810 is coupled to an integrated camera and data processing device as described above with reference to FIG. 7 .
- the input device 810 is a barcode scanner coupled to the data processing device 820 .
- a subject barcode 802 and an order barcode 804 displayed on the identification card 895 are scanned using a barcode scanner 810 coupled to an integrated data processor 820 and camera 830 .
- the photographer 897 can scan a barcode corresponding to an individual's order preference from a listing of barcodes encoding possible order preferences, such as the different package choices.
- One or more pictures are taken of the individual 890 at time T3 using the digital camera 830 .
- the digital camera 830 saves the picture information in a storage file.
- the storage file also encodes as metadata the information entered at time T2.
- the camera 830 stores the subject's name and order information in the image file.
- the digital camera 830 is controlled directly by an operator 897 .
- the digital camera 830 is controlled by an external computer (not shown) operatively coupled to the digital camera 830 .
- FIG. 9 depicts one example embodiment of an identification card 900 indicating a job number at 910 , information identifying the individual 890 at 920 , and order information at 930 .
- the information identifying the individual 890 includes the name, school, homeroom, and grade of the individual 890 . In other embodiments, however, the identification card 900 could provide other identifying information.
- the identification card 900 includes a barcode 940 encoding at least some of the information found on the identification card 900 .
- the step of converting user-customized data includes loading the data into one or more GPS fields and assembling the GPS fields into a GPS data string.
- FIG. 10 illustrates a block diagram depicting loading user-customized data into GPS fields 1011 - 1018 of a GPS data stream 1010 .
- the conversion process reformats two strings of data—a first barcode 1002 and a second barcode 1004 .
- the conversion process can convert any desired amount of data limited only by the capacity of the digital camera for accepting the data.
- the conversion process fragments and/or resequences the user-entered data. For example, while the conversion process loads all of the first barcode 1002 into a second GPS field 1012 , the conversion process breaks up the second barcode 1004 and loads only a portion of the second barcode 1004 into each of the fourth, sixth, and seventh GPS fields 1014 , 1016 , 1017 , respectively.
- the user-entered data is loaded into all fields 1011 - 1018 of the GPS data string 1010 . In other embodiments, the user-entered data is loaded into only some of the GPS fields. The remaining GPS fields are set to default values.
- the GPS data string 1010 includes a prefix field 1011 and a checksum field 1018 .
- the checksum field 1018 enables verification that the GPS data string 1010 was transferred correctly from the processing device to the digital camera.
- the checksum field 1018 encodes a number calculated by the processing device based on at least some of the other fields in the GPS data string 1010 . The data string 1010 is then passed to the digital camera.
- the digital camera repeats the calculation and compares the result to the value encoded in the checksum field 1018 to verify that the data string 1010 has not been corrupted. If the value calculated by the digital camera matches the checksum value encoded in field 1018 , then the camera captures an image and associates the data encoded in the data string 1010 with the captured image. In some embodiments, if the calculated checksum does not match the encoded checksum, however, a warning is issued to the photographer. In one example embodiment, a light on the camera flashes in warning. In other embodiments, if the value calculated by the digital camera does not match the checksum value, then the digital camera inhibits the image capture ability of the camera.
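The description does not name the checksum algorithm, so this sketch assumes a simple NMEA-style XOR over the bytes of the other fields; the function names are hypothetical.

```python
from functools import reduce

def checksum(fields):
    """XOR all bytes of the non-checksum GPS fields (an assumed,
    NMEA-style algorithm; the patent does not specify one)."""
    payload = ",".join(fields)
    return reduce(lambda acc, ch: acc ^ ord(ch), payload, 0)

def verify(fields, encoded_checksum):
    """Camera-side check: recompute the checksum from the received
    fields and compare it to the encoded value before capturing."""
    return checksum(fields) == encoded_checksum
```

If `verify` fails, the camera could warn the photographer or inhibit capture, as described above.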
- each GPS field is configured to accept a valid input format. Data having a different format will not be accepted into the field.
- FIG. 11A illustrates a GPS Altitude field 1100 A configured to hold a rational number.
- FIG. 11B illustrates a North/South Latitude Indicator (NSLI) field 1100 B configured to hold an ASCII decimal representation of the letter “N” or the letter “S” (i.e., 78 or 83, respectively).
- FIG. 11C illustrates an East/West Longitude Indicator (EWLI) field 1100 C configured to hold an ASCII decimal representation of the letter “E” or the letter “W” (i.e., 69 or 87, respectively).
- FIG. 11D illustrates an example Latitude field 1100 D configured to contain three rational numbers.
- the three rational numbers indicate the latitude in degrees, minutes, and seconds, respectively.
- the first rational number is two integers long indicating the ten's place and the one's place of the degree value. Minutes and seconds latitude are also typically expressed as two integers.
- because the Latitude field 1100 D expects the second rational number to represent minutes, the first integer of the second rational number must range from 0 to 5 to be accepted.
- FIG. 11E illustrates an example Longitude field 1100 E also configured to contain three rational numbers.
- the three rational numbers indicate the longitude in degrees, minutes, and seconds, respectively.
- the first rational number is three integers long indicating the hundred's place, the ten's place, and the one's place of the degree value. Minutes and seconds longitude are typically expressed as two integers.
- because the Longitude field 1100 E expects the second rational number to represent minutes, the first integer of the second rational number must range from 0 to 5 to be considered valid.
- FIG. 11F illustrates an example Universal Time field 1100 F configured to contain three rational numbers.
- the three rational numbers indicate the universal time in hours, minutes, and seconds, respectively.
- hours, minutes, and seconds are each expressed as two integers.
- because the Universal Time field 1100 F expects the first rational number to represent hours, the first integer of the first rational number must equal 0, 1, or 2 to be accepted as valid.
- the first integer of the second and third rational numbers, which indicate minutes and seconds, respectively, must range from 0 to 5 to be accepted.
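The digit constraints described for the Latitude, Longitude, and Universal Time fields can be expressed as small validators. The helper names are hypothetical, and the hours check uses the equivalent 00-23 range.

```python
def valid_minutes(mm):
    """A two-integer minutes (or seconds) value is accepted only
    when its first integer ranges from 0 to 5."""
    return len(mm) == 2 and mm.isdigit() and mm[0] <= "5"

def valid_utc_time(hhmmss):
    """Format rules sketched for the Universal Time field 1100F:
    two-digit hours in 00-23, then minutes and seconds whose first
    digit ranges from 0 to 5 (helper names are hypothetical)."""
    if len(hhmmss) != 6 or not hhmmss.isdigit():
        return False
    hours, minutes, seconds = hhmmss[:2], hhmmss[2:4], hhmmss[4:]
    return int(hours) <= 23 and valid_minutes(minutes) and valid_minutes(seconds)
```

Data that fails these checks would not be accepted into the corresponding field.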
- FIG. 11G illustrates an example Universal Date field 1100 G, which can be configured to hold a number of different formats.
- the field is configured to hold eleven ASCII characters in the format YYYY:MM:DD_ with “_” indicating a null character.
- the field is configured to hold six ASCII characters in the format DDMMYY.
- each ASCII character is expressed as a two-integer decimal value.
- the date need not be read as an ASCII value.
- the Universal Date field 1100 G is configured to hold six integers indicating a numerical date in the format DDMMYY.
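The two ASCII layouts described for the Universal Date field 1100 G can be produced with standard date formatting; the function name is illustrative.

```python
from datetime import date

def date_field_formats(d):
    """The two ASCII layouts described for the Universal Date field
    1100G: 'YYYY:MM:DD' followed by a null character (eleven
    characters total), and the compact six-character 'DDMMYY'."""
    long_form = d.strftime("%Y:%m:%d") + "\x00"
    short_form = d.strftime("%d%m%y")
    return long_form, short_form
```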
- FIG. 12 illustrates an operation flow of an example process for converting user-entered information to information processable by a digital camera.
- FIG. 12 illustrates one example conversion process 1200 for loading up to three inputted data strings into the fields of a GPS data string.
- the example process 1200 is designed to load data associated with taking school portraits, such as a control code, a subject identifier, and an order identifier.
- the process 1200 can be used to convert any user-entered data to a format readable by a digital camera.
- the process 1200 begins at module 1205 and proceeds to a first receive operation 1210 that receives a first data string.
- the first data string is a control code indicating the type of picture being taken.
- a first validate operation 1215 determines whether the first data string is readable. In some embodiments, the first validate operation 1215 may also determine whether the first data string is correctly formatted. For example, the first validate operation 1215 may require that the first data string begin with a particular character, for example an “S.” If the first validate operation 1215 determines that the received information is unreadable and/or incorrectly formatted, then the process 1200 ends at module 1217 .
- a first store operation 1220 loads the first data string into a predetermined position in a predetermined GPS field.
- the first store operation 1220 may load the first data string into the LAT_DEG_10 position of the Latitude field 1100 D of a GPS string.
- all of the received data is loaded into the GPS field.
- only a portion of the received data is loaded into the GPS field.
- the validate operation 1215 requires that all control barcodes begin with a “$” and only the characters following the “$” are loaded into the GPS field by the first store operation 1220 .
- the first data string is loaded into multiple GPS fields.
- the process 1200 next proceeds to a picture type determination module 1225 , which determines the type of picture being taken based on the first received data string.
- a first data string of “1” indicates that the picture is a portrait and a first data string of “3” indicates that the picture shows a calibration board. If the picture being taken is not a portrait, then the process 1200 ends at module 1227 . Identification information and order information are generally not entered in such a scenario. If the picture being taken is a student portrait, however, or if a user desires to enter further information, then the process 1200 proceeds to a second receive operation 1230 .
- the second receive operation 1230 receives a second data string, for example, a subject code identifying the subject of the portrait.
- a second validate operation 1235 determines whether the second data string is readable. In some embodiments, the second validate operation 1235 may also determine whether the second data string is correctly formatted. If the second validate operation 1235 determines that the second data string is unreadable or incorrectly formatted, then process 1200 ends at module 1237 with an error message. If the second validate operation 1235 determines that the second data string is readable and correctly formatted, however, then a character type determination module 1240 determines whether the second data string includes alphanumeric characters or numeric characters only.
- the process 1200 proceeds directly to a store indicia operation 1250 .
- the store indicia operation 1250 loads a predetermined value in a predetermined position in a predetermined GPS field.
- the predetermined value is configured to indicate that the second data string is numeric only.
- a store identity operation 1255 next loads the subsequent characters of the second data string into predetermined positions in at least one predetermined field.
- the store indicia operation 1250 loads the predetermined value into the first position of the Altitude field 1100 A and the store identity operation 1255 loads the subsequent characters of the second data string into the subsequent positions of the Altitude field 1100 A.
- other GPS fields may be used to store the information.
- the translate operation 1245 converts each of the characters found in the second data string to a numerical equivalent of the character. For example, each letter and number has a decimal ASCII value that can be expressed with two integers.
- the store indicia operation 1250 loads a different predetermined value in the predetermined position of the predetermined field. The value would indicate that the second data string contains alphanumeric characters.
- the store identification operation 1255 would then load the translated numerical equivalents of the characters from the second data string into the appropriate positions in the appropriate field.
- a third receive operation 1260 receives a third data string, for example, an order code representing order information.
- a third validate operation 1265 determines whether the third data string is readable. In some embodiments, the third validate operation 1265 also determines whether the third data string is correctly formatted. For example, the third validate operation 1265 may require that the third data string begin with a particular character, for example a “-” sign. If the third validate operation 1265 determines that the third data string is unreadable and/or incorrectly formatted, then the process 1200 ends at module 1267 with an error message. If the third validate operation 1265 determines that the third data string is readable and formatted correctly, however, then the process 1200 proceeds to a determine order length operation 1270 .
- the order length operation 1270 determines the number of characters included in the third data string.
- a store order length operation 1275 loads the length of the third data string into one or more predetermined positions in one or more predetermined fields.
- the order length is indicated by a first integer representing the ten's column value of the length and a second integer representing the one's column value of the length.
- the first and second integers are stored in the HR_10 and HR_1 positions, respectively, of the UTC Time field 1100 F.
- a store order operation 1280 loads each character of the third data string into a predetermined position of a predetermined GPS field.
- the characters need not be loaded in consecutive order within a field, however.
- the store order operation 1280 loads the first character of the third data string into the LAT_DEG_1 position of the Latitude field 1100 D of the GPS string and the second character into the LAT_MIN_1 position of the Latitude field 1100 D.
- the characters of the third data string need not even be loaded into consecutive fields.
- the store order operation 1280 loads the third character of the third data string into the MN_1 position of the UTC Time field 1100 F.
- any of the three data strings can be stored in non-consecutive positions and/or fields.
- An assemble operation 1285 sets any unloaded position in each field to a default value and forms a data string using the fields.
- a calculate operation 1290 determines a checksum value based on at least some of the other values stored in the GPS fields. The checksum value is encoded into a checksum field in the GPS data string, such as checksum field 1018 in GPS string 1010 .
- a transmit operation 1295 passes the assembled GPS data string 1010 including the checksum value to a digital camera. The process 1200 ends at module 1297 .
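The assemble operation 1285 (fill any unloaded position with a default, then join the positions of each field) can be sketched as follows; the layout representation and the "0" default are assumptions.

```python
DEFAULT = "0"  # assumed default value for unloaded positions

def assemble_fields(loaded, layout):
    """Sketch of the assemble operation 1285: 'layout' maps each GPS
    field name to its ordered position names; any position not present
    in 'loaded' is filled with the default before the field is joined."""
    return {field: "".join(loaded.get(pos, DEFAULT) for pos in positions)
            for field, positions in layout.items()}
```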
- Referring to FIGS. 13A-13C , some of the steps of the process 1200 can best be understood through example applications. For the purposes of these examples, it is assumed that the picture to be taken is a student portrait.
- FIG. 13A illustrates an example GPS Altitude field 1300 A and a subject barcode 1302 A.
- the subject barcode 1302 A encodes the character string “123456.”
- the second receive operation 1230 receives the character string and the second validate operation 1235 determines that the string is readable.
- the determination module 1240 determines that the characters in the second data string are numbers only and, as shown in FIG. 13A , the store indicia operation 1250 stores the character “1” in the first position 1310 A of the Altitude field 1300 A.
- the character “1” indicates that the remaining characters stored in the Altitude field are numbers only.
- the store identification operation 1255 loads the characters into the remaining positions 1320 A of the Altitude field 1300 A. As shown in the illustrated example, the remaining characters stored in the Altitude field are “123456.”
- the Altitude field 1300 A therefore, contains the number “1123456”.
- FIG. 13B illustrates an alternative example of a GPS Altitude field 1300 B being loaded with a subject barcode 1302 B.
- the subject barcode 1302 B encodes the character string “ABC1.”
- the second receive operation 1230 of the process 1200 receives the character string and the second validate operation 1235 determines that the string is readable, similar to the example of FIG. 13A .
- the determination module 1240 determines that the characters in the second data string include both letters and numbers.
- the translate operation 1245 converts each of the letters and numbers into a decimal ASCII equivalent. For example, the number “65” is the decimal ASCII equivalent of the letter “A” and the number “49” is the decimal ASCII equivalent of the number “1.”
- the store indicia operation 1250 loads the character “2” in the first position 1310 B of the Altitude field 1300 B.
- the character “2” in this predetermined position indicates that the remaining characters stored in the Altitude field are decimal ASCII equivalents of letters and numbers.
- the store identification operation 1255 loads the translated characters into the remaining positions 1320 B of the Altitude field 1300 B. As shown, the remaining characters stored in the Altitude field are “65666749.” The Altitude field 1300 B, therefore, contains the number “265666749”.
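The numeric and alphanumeric branches illustrated in FIGS. 13A and 13B can be reproduced with a short encoder; the function name is hypothetical.

```python
def encode_subject_code(code):
    """Sketch of operations 1240-1255: prefix '1' for numeric-only
    subject codes, or prefix '2' followed by the two-digit decimal
    ASCII value of each character for alphanumeric codes."""
    if code.isdigit():
        return "1" + code
    return "2" + "".join(f"{ord(ch):02d}" for ch in code)
```

With this sketch, the barcode "123456" yields the Altitude value of FIG. 13A and "ABC1" yields the value of FIG. 13B.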
- FIG. 13C illustrates example UTC Time, Longitude, Latitude, and UTC Date fields 1312 , 1314 , 1316 , and 1318 , respectively, and an order barcode 1304 .
- the order barcode 1304 encodes the character string “-9876543.”
- the third receive operation 1260 receives the character string and the third validation operation 1265 determines that the string is readable. In some embodiments, the third validation operation 1265 also determines that the first character of the third data string is a “-” character.
- the determine length operation 1270 determines the number of characters in the third data string. Generally, the determine length operation 1270 does not count any symbol characters indicating that the string is a particular type of barcode, for example, the “-” character indicating an order barcode. In the illustrated example, the third data string has seven characters not including the “-” character. Consequently, the store order length operation 1275 loads a “0” in the first character position of the UTC Time field 1312 and a “7” in the second character position.
- the store order operation 1280 loads the characters of the order barcode 1304 following the “-” symbol into predetermined positions in different GPS fields.
- the first character in the order barcode 1304 , “9”, is stored in the LAT_DEG_1 position of the Latitude field 1316 .
- the second character, “8”, is loaded into the LAT_MIN_1 position of the Latitude field 1316 .
- the third character, “7”, is loaded into the MN_1 position of the GPS Time field 1312 .
- the fourth, fifth, sixth, and seventh characters are loaded into the SE_1 position of the GPS Time field 1312 , the LON_DEG_10 and LON_DEG_1 positions of the Longitude field 1314 , and the YR_10 position of the GPS Date field 1318 , respectively.
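The scatter pattern of FIG. 13C can be expressed as a position map; the dictionary representation and function name are illustrative.

```python
def load_order_code(barcode):
    """Scatter the characters of an order barcode (e.g. '-9876543')
    into the named GPS positions used in the FIG. 13C example, with
    the two-digit length stored in HR_10 and HR_1."""
    if not barcode.startswith("-"):
        raise ValueError("order barcodes are expected to start with '-'")
    digits = barcode[1:]                      # the '-' marker is not counted
    length = f"{len(digits):02d}"
    positions = ["LAT_DEG_1", "LAT_MIN_1", "MN_1", "SE_1",
                 "LON_DEG_10", "LON_DEG_1", "YR_10"]
    fields = {"HR_10": length[0], "HR_1": length[1]}
    fields.update(zip(positions, digits))
    return fields
```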
- information identifying the subject of a picture and information regarding desired processing of the picture can be directly associated with the digital picture at the time the picture is taken. Furthermore, the identifying information and order information can be inputted in a user-customized format. The user is not restricted to encoding only the type or format of information for which the camera is configured to accept and process.
- FIG. 14 illustrates a block diagram showing how the GPS string (or other metadata) is obtained from the image storage file 1435 on the camera 1430 by a data processor 1450 .
- the data processor 1450 can be the same data processor used to reformat the user data.
- the data processor 1450 is a separate data processor.
- the data processor 1450 converts the data from a format processable by the digital camera 1430 to a useable format.
- a useable format is a format readable by a user.
- a useable format is a format readable by an image application.
- an image application 1460 enables a user to display the picture and to display the metadata in a format readable to the user.
- FIG. 15 illustrates an operation flow for an example process 1500 for obtaining the saved image and metadata from the camera and restoring the metadata to a useable format.
- the process 1500 begins at module 1505 and proceeds to read operation 1510 .
- the read operation 1510 obtains the metadata encoded in a digital image file.
- the read operation 1510 obtains a string of multiple GPS fields from a digital image file.
- the desired user data is scattered throughout multiple fields of the metadata and not in sequential order.
- An extract operation 1515 pulls user-entered data from the metadata and an order operation 1520 reassembles any disparate sections of the user data into the proper sequence.
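The extract and order operations (1515, 1520) reverse the encoding. For the subject-code portion shown in FIGS. 13A-13B, a decoder sketch (hypothetical name) is:

```python
def decode_subject_code(field_value):
    """Reverse of the subject-code encoding of FIGS. 13A-13B:
    a leading '1' means the remainder is the numeric code itself;
    a leading '2' means the remainder is a run of two-digit decimal
    ASCII values to be translated back into characters."""
    indicia, body = field_value[0], field_value[1:]
    if indicia == "1":
        return body
    return "".join(chr(int(body[i:i + 2])) for i in range(0, len(body), 2))
```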
- a store operation 1525 creates an electronic file to store the user-entered data on a computer or other storage medium other than the digital camera.
- the digital image and the user data are stored in a database and organized according to the user data.
- the image file can be indexed and searched based on the stored user data.
- the process 1500 ends at module 1530 .
- FIG. 16 illustrates an operation flow for a process 1600 by which an image is processed and delivered using the restored user-customized data.
- the process 1600 begins at start module 1605 and proceeds to arrange operation 1610 .
- the arrange operation 1610 arranges the image files obtained from the digital camera in a particular order. In some embodiments, the arrange operation 1610 organizes the image files within a database. In one example embodiment, the arrange operation 1610 organizes the image files into alphanumerical order based on a job number or reference number associated with the image file. In another example embodiment, the arrange operation 1610 organizes the image files into order first by school, then by homeroom, then by last name, and then by first name. Of course, in other embodiments, the arrange operation 1610 can organize the image files into any order that aids in the production of the image.
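One of the orderings described for the arrange operation 1610 (school, then homeroom, then last name, then first name) maps directly to a key-tuple sort; the record keys used here are assumptions.

```python
def arrange_image_files(records):
    """Sketch of the arrange operation 1610: order image-file records
    by school, then homeroom, then last name, then first name."""
    return sorted(records, key=lambda r:
                  (r["school"], r["homeroom"], r["last"], r["first"]))
```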
- a sequence operation 1615 obtains and analyzes the data associated with each image file in sequential order.
- a determine operation 1620 analyzes the obtained data from sequence operation 1615 to determine the order preference and other production details associated with the image encoded by the image file.
- the data obtained from each image file can include, but is not limited to, the name of the subject, subject contact information, school name, school contact information, grade of subject, customer order information, requested finishing techniques, and a delivery date.
- a render operation 1625 then prints using standard printing techniques one or more copies of the encoded image based on the order preference. For example, in one embodiment, two “five by seven”-inch pictures and eight wallet-size pictures of the image are printed.
- the render operation 1625 modifies the encoded image before printing. For example, different finishing options can be performed on the image. Non-limiting examples of possible finishing options include removing stray hairs, removing “red eye,” covering skin blemishes, color tinting, adding backgrounds and borders, and adding text to an image.
- a package operation 1630 packages and labels the rendered images in preparation for delivery.
- the package operation 1630 places the images in one or more frames, folios, or albums.
- the package operation 1630 encodes the images onto a CD.
- any suitable packaging method can be used to prepare the rendered images for delivery.
- a ship operation 1635 sends the packaged photographs to the customer.
- the ship operation 1635 delivers packaged school portraits to a school for distribution to the students.
- each package includes a label identifying the student to whom the package belongs.
- the process 1600 ends at stop module 1640 .
Abstract
A method of tracking digital images includes inputting data identifying a subject of an image into a camera, acquiring an image with the camera, and storing the image and the inputted data, as metadata, in an image file when the image is acquired. The method can be implemented using a scanner, a digital camera, and a data processor. The scanner obtains the identifying data and transmits the data to the camera. The camera obtains digital images and embeds the data into digital image files encoding the digital images. The identifying data has a format different from any of the formats processable by the digital camera. The data processor converts the format of the identifying data to one of the plurality of formats processable by the digital camera and loads the converted information into the digital camera as metadata.
Description
- This application is a continuation of U.S. patent application Ser. No. 12/729,898 filed on Mar. 23, 2010, entitled IDENTIFYING AND TRACKING DIGITAL IMAGES WITH CUSTOMIZED METADATA, which is a continuation of U.S. Pat. No. 7,714,908 filed on May 26, 2006, entitled IDENTIFYING AND TRACKING DIGITAL IMAGES WITH CUSTOMIZED METADATA, the disclosures of which are hereby incorporated by reference in their entireties. To the extent appropriate, a claim of priority is made to each of the above disclosed applications.
- This invention relates generally to the field of portrait photography and more particularly to a method and apparatus for facilitating the identification and tracking of photographic portraits.
- Photographic studios, professional photographers, and others performing commercial portrait work (collectively referred to herein as “photographers”) often need to identify and track large numbers of images through the workflow process from image capture to portrait delivery. In general, captured images are tracked by associating each image with organizational information indicating how the image should be processed, to whom the processed image should be delivered, and other descriptive information useful for later identification, selection, or sorting of captured images. Without such organizational information, each captured image would need to be individually reviewed, which could be time consuming when processing a large number of images. Moreover, such review of individual images by others not present when the images were captured can be an arduous and inaccurate process.
- In some cases, the organizational information includes information identifying the customer or the subject of the image. For example, in the case of school portraits, identifying information can include the name, school, and grade of the subject of each photograph. In other embodiments, the organizational information also includes order information. For example, each individual who has a photographic portrait taken may want multiple copies of the portrait in one or more sizes using one or more finishing techniques. Photographic portraits may also include multiple individuals, each of whom may want a different number of copies of the photograph in different sizes.
- In some prior systems, photographers create order packages to facilitate tracking order information. Each package has a set number of copies for a set number of sizes. For example, a first package may include two pictures having a dimension of five inches by seven inches and ten “wallet” pictures having a dimension of two inches by three inches. Another package may include one picture having a dimension of eight inches by ten inches and eight “wallet” pictures.
- Tracking organizational information in some prior systems includes printing the organizational information directly onto a negative of each portrait. However, the association between the organizational information and the image must be otherwise tracked between taking the picture and printing the negative. These issues are the same regardless of whether the photographic medium utilized is physical (e.g., a printed negative) or digital (e.g., a digital picture file). For example, in some prior digital systems, computer applications can be used to add metadata to an image downloaded from a digital camera. However, the association between the digital picture and the identity information must be otherwise maintained before obtaining the picture with the computer application.
- One example prior method of associating an image with identifying information at the time the image is taken includes sequential ordering. For example, a photographer can record a list of photographic sessions, the number of pictures taken during each session, and the customer associated with each session in the same order in which the sessions are conducted. In such a system, however, the pictures must be kept in the same order in which they are taken or the association between each picture and the identity information may be lost.
- In other systems, limited types of information can be associated with a digital image at the time the image is taken. Digital cameras typically record some types of information about a picture, known as metadata, along with the picture. In general, however, such information includes information about the picture itself and does not identify a customer or the subject of the image. For example, a typical camera encodes information regarding the make and model of the camera, the camera settings, and whether a flash is used. Some digital cameras also include information regarding the date and time the picture is taken. Some other digital cameras include a "Global Positioning System" (GPS) unit to track and record the physical location, Coordinated Universal Time (UTC) date, and UTC time at which each picture is taken.
- In still other systems, data can be associated with a digital image file on a digital camera through emulation of the memory card of the digital camera. US Publication No. 2005/0036034 discloses a system by which a processing unit emulates a memory card of a digital camera such that the digital camera's electronics operate with the memory of the processing unit through a camera interface card as if such memory was on a memory card located in the digital camera. Images captured by the digital camera are transferred, via the camera interface card, to the memory of the processing unit and the digital camera can access (e.g., read) images stored in the memory of the processing unit. Data can be associated with the image files by partitioning the memory of the processing unit to accommodate both originally captured images and copies of the images including file names and/or headers modified to indicate the data desired to be associated with the images.
- Therefore, there is a need in the art for a tracking system that facilitates tracking of the association between an image and organizational information facilitating production and delivery of the image. Aspects of the present invention overcome these and other shortcomings of the prior art and address these needs in the art.
- The invention relates to a method and system for associating customized information with a digital image and, in particular, identifying and tracking digital images using customizable user metadata.
- The invention enables a photographer to immediately associate customized information with an image at the time the image is taken. Customized information includes any information a photographer desires to associate with an image. Examples of customized information include information identifying a customer, information identifying an order package, and any internal tracking information of interest to the photographer. The invention thereby enables more accurate identification and tracking of such information with regard to each image.
- Generally, the invention is implemented using a camera. The camera obtains and stores an image. When the image is being stored, the camera also stores customized information along with the image, thereby associating the customized information with the image. The association between the customized information and image is automatically maintained when the image is retrieved from the camera at a later time.
- In some systems, the camera obtains the customized data from an input device. A photographer can enter the customized information into the input device and can transmit the entered information to the camera.
- In other systems, the camera cannot read or process the customized data entered into the input device. In such systems, a processor device acts as an intermediary between the camera and the input device. The processor device resequences, reformats, or otherwise modifies the input information to be readable by the camera.
- According to one embodiment, a method for obtaining and tracking digital images includes inputting data identifying a subject of an image into a camera, acquiring an image with the camera, and storing the image and the inputted data, as metadata, in an image file when the image is acquired.
- According to another embodiment, a system for obtaining and tracking digital images includes a digital camera, an input device, and a data processor. The digital camera obtains digital images, and embeds metadata into digital image files encoding the digital images. The input device obtains information relating to a subject of a digital image. The obtained information has a format different from the data formats processable by the digital camera. The data processor receives the obtained information from the input device, converts the format of the obtained information to a format processable by the digital camera, and inputs the converted information into the digital camera as metadata.
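As a rough illustration of this embodiment's data flow, the sketch below models the three components in Python. All names (`Camera`, `scan_input`, `convert_for_camera`) are invented for illustration, and the "$"-prefixed sentence is only a placeholder for a camera-readable format, not the patent's actual encoding.

```python
class Camera:
    """Stand-in digital camera: it accepts only metadata that looks
    like the GPS-style sentences it was built to parse."""
    def __init__(self):
        self.metadata = None

    def receive_metadata(self, sentence: str) -> None:
        if not sentence.startswith("$"):
            raise ValueError("camera cannot read this format")
        self.metadata = sentence


def scan_input(card_text: str) -> str:
    """Stand-in input device (e.g. a barcode scanner): returns raw
    subject data in a format the camera cannot process directly."""
    return card_text


def convert_for_camera(raw: str) -> str:
    """Stand-in data processor: re-wraps the raw input as a
    camera-readable, comma-delimited sentence."""
    return "$GPDATA," + raw.replace(" ", ",")


camera = Camera()
camera.receive_metadata(convert_for_camera(scan_input("JOB1234 SMITH PKG-B")))
```

Passing the raw scanner output straight to `receive_metadata` would raise an error, which is the point of interposing the data processor between the input device and the camera.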
- In one example embodiment, the camera is configured to accept and process a GPS data string, the input device includes a barcode scanner, and the data processor includes a microcontroller.
- In another example embodiment, the camera, the input device, and data processor are provided within a housing to form an integrated device.
- While the invention will be described with respect to preferred embodiment configurations and with respect to particular devices used therein, it will be understood that the invention is not to be construed as limited in any manner by either such configuration or components described herein. Variations of the invention will become apparent to those skilled in the art upon a more detailed description of the invention. The advantages and features that characterize the invention are pointed out with particularity in the claims annexed hereto and forming a part hereof. For a better understanding of the invention, however, reference should be had to the drawings which form a part hereof and to the accompanying descriptive matter, in which there is illustrated and described a preferred embodiment of the invention.
- In the drawings in which like elements are identified with the same designation numeral:
- FIG. 1 illustrates an operation flow for an example process for providing photographic portraits to customers according to one embodiment of the present invention;
- FIG. 2 illustrates an operation flow for an example process for aiding in processing images according to one embodiment of the present invention;
- FIG. 3 illustrates an operation flow for a process for identifying and tracking digital images according to one embodiment of the present invention;
- FIG. 4 illustrates a block diagram of a system for implementing the process shown in FIG. 3 according to one embodiment of the present invention;
- FIG. 5 illustrates one example of a data processing device including processing software according to one embodiment of the present invention;
- FIG. 6 illustrates one example embodiment of an integrated device configured according to the teaching of the present invention;
- FIG. 7 illustrates an operation flow for a process for associating the processed user data with a digital picture according to one embodiment of the present invention;
- FIG. 8 illustrates three events occurring during the use of one embodiment of the present invention;
- FIG. 9 illustrates an example identification card according to one embodiment of the present invention;
- FIG. 10 illustrates a block diagram depicting loading user data into GPS fields of a GPS data stream;
- FIGS. 11A-11G depict example GPS fields and a valid input format for each example field according to one embodiment of the present invention;
- FIG. 12 illustrates an example operation flow for a process for converting user-customized information to information processable by a digital camera;
- FIGS. 13A-13C illustrate example GPS fields resulting from some applications of the conversion process;
- FIG. 14 illustrates a block diagram depicting how GPS metadata is obtained and restored to a useable form according to one embodiment of the present invention;
- FIG. 15 illustrates an operation flow for a process for obtaining a saved image and the corresponding metadata from a camera and converting the metadata to useable data according to one embodiment of the present invention; and
- FIG. 16 illustrates an operation flow for a process for processing and delivering a captured image according to organizational data associated with the image.
- The invention relates to methods and systems for identifying and tracking digital images using metadata. In particular, the invention relates to methods and systems for associating customizable metadata, such as organizational data aiding in image production, with a digital picture at the time the digital picture is taken and tracking the digital picture with the associated metadata.
- FIG. 1 illustrates an operation flow for an example process 100 for providing photographic portraits to customers. The process 100 begins at start module 105 and proceeds to a capture operation 110. The capture operation 110 obtains a photographic image of a subject. In some embodiments, the photographic image is obtained with a digital camera.
- A process operation 115 produces one or more copies of the image. In some example embodiments, the process operation 115 renders the captured image according to order information provided by the customer. In other example embodiments, the process operation 115 edits the captured image before rendering the image. A deliver operation 120 sends the processed image to the customer. In some example embodiments, the deliver operation 120 includes packaging and labeling the rendered image. The process 100 ends at stop module 125.
- FIG. 2 illustrates an operation flow for an example process 200 for aiding in processing images. The process 200 begins at start module 205 and proceeds to a first acquire operation 210. The first acquire operation 210 obtains data pertaining to an image to be captured. In some embodiments, the obtained data includes organizational data aiding in the process operation 115 discussed in FIG. 1. Examples of organizational data include a job number, a reference number, and other tracking information. Details on acquiring the data are disclosed herein.
- A second acquire operation 215 captures an image. In some embodiments, the second acquire operation 215 captures the image by taking a picture of a subject using a digital camera. An associate operation 220 creates an association between the captured image and the acquired data. In some embodiments, the associate operation 220 stores the data as metadata in the image file encoding the captured image. In one example embodiment, the associate operation 220 stores organizational data, such as a job number, with a captured image in the Exchangeable Image File Format (EXIF). The process 200 ends at stop module 225.
- FIG. 3 illustrates an operation flow for a process 300 for identifying and tracking digital images. The process 300 begins at start module 305 and proceeds to input operation 310. Input operation 310 obtains from a user data to be associated with an image. Generally, the obtained data is formatted to be readable by an input device and not by a digital camera. In some example embodiments, the obtained data is formatted as barcode data. In other example embodiments, the obtained data is a sequence of numbers and/or letters. Of course, in still other embodiments, the obtained data can be in any format not readable by a camera.
- A process operation 320 reformats (or otherwise converts) the obtained data into a format readable by a digital camera. Generally, the process operation 320 reformats the obtained data to resemble data the camera is configured to receive. In some embodiments, the obtained data is broken up (or fragmented) and rearranged into a new sequence. For example, in one embodiment, the process operation 320 converts the obtained data into a GPS data string.
- A store operation 330 encodes the reformatted data as metadata in a digital picture file at the time the corresponding picture is obtained. The data to be associated with the digital picture is thereby stored in the digital image file. The process 300 ends at module 335.
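The GPS-string conversion of process operation 320 can be sketched as follows. This is a minimal illustration that assumes an NMEA 0183-style sentence (comma-separated fields between a "$" and a "*", followed by a two-digit XOR checksum); the patent's actual field layout is the one described with FIG. 10, and the function names here are invented.

```python
def xor_checksum(payload: str) -> str:
    """NMEA 0183-style checksum: XOR of every character between
    the '$' and the '*', rendered as two hex digits."""
    value = 0
    for ch in payload:
        value ^= ord(ch)
    return f"{value:02X}"


def pack_user_data(fields: list[str]) -> str:
    """Load user-entered values into a comma-delimited sentence the
    camera can accept as GPS data (field layout is illustrative)."""
    payload = ",".join(fields)
    return f"${payload}*{xor_checksum(payload)}"
```

A receiving camera would recompute the XOR over the payload and compare it with the trailing two digits before storing the metadata with the image.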
- FIG. 4 illustrates a block diagram of a system for implementing the process 300 shown in FIG. 3. Generally, the system 400 includes a data input device 410, a data processing device 420, and a digital camera 430. In some embodiments, the system 400 also includes one or more external lights 440 for illuminating a subject when a picture of the subject is taken. A power unit 425 is also typically electrically coupled to at least one of the above devices.
- The data input device 410 enables a user to enter, into the system 400, data that the user desires to associate with a digital picture. Examples of such user-entered data include information identifying the subject of the image, information regarding desired future applications of the image, and other data of interest to the user. For example, in one embodiment, the user-entered data includes the subject's name and an order package choice.
- Examples of the data input device 410 include a scanner 412, a keypad 414, and a character sequencer 416. In other embodiments, however, any suitable input device can be used. In one embodiment, the scanner 412 is a barcode scanner and the data of interest to the user is formatted and printed as a barcode. In another embodiment, the data of interest is a sequence of numbers and/or letters that the user can manually enter using a keypad 414. In yet another embodiment, a character sequencer 416 generates a sequence of numbers and/or letters to be associated with the user.
- The data processing device 420 converts the data obtained by the input device 410 into a form of data readable by the camera 430. Generally, the data processing device 420 includes a processing unit 421, such as a microcontroller, a computer, or a logic simulator, and memory storage 422. The data processing device 420 is communicatively coupled to the data input device 410 and to the camera 430. In some embodiments, the data processing device 420 includes a communications interface 423 for communicating with at least one of the input device 410 or the camera 430. In other embodiments, the data processing device 420 includes a camera control unit 424 for operating the camera 430. The data processing device 420 will be described in greater detail herein.
- The digital camera 430 obtains images (e.g., shoots pictures) and creates image storage files encoding the obtained images. The digital camera 430 is also configured to obtain and process certain types of metadata associated with the obtained image. In general, the digital camera 430 recognizes one or more metadata formats and stores metadata having a recognizable format in the image storage files along with the obtained images. For example, in some embodiments, the digital camera 430 includes a Global Positioning System (GPS) unit and is configured to receive and process a GPS data string as metadata. In general, a GPS data string includes one or more fields configured to contain GPS information.
- In some example embodiments, one or more external lights 440 are operably coupled to the data processing device 420 or to the digital camera 430. The lights 440 are synchronized with the camera 430 to illuminate the subject of the digital picture when the picture is taken.
- FIG. 5 illustrates a block diagram of one example of a data processing device including processing software. The data processing device 520 includes a microcontroller 550 and program memory 560. In some embodiments, the data processing device 520 also includes an interface 570 for communicating between the processing device 520 and the camera, such as the camera 430 of FIG. 4.
- The data processing device 520 also includes input ports and output ports for communicatively coupling to a data input device, a camera, and external lights. In some example embodiments, the input ports include a shutter synchronization input port 552, a data input port 554, and a flash input port 558. The shutter input port 552 couples to a shutter output of the digital camera. The shutter output indicates whether the camera's shutter was triggered (i.e., whether a picture was taken). The data input port 554 couples to a data input device, such as the scanner 412 of FIG. 4. The flash input port 558 couples to a flash synchronization output of the digital camera that indicates when the camera flash, and hence the external lighting, should illuminate. In one embodiment, the data processing device 520 couples to an external power device, such as power device 425 of FIG. 4, via power input port 556. In another embodiment, the data processing device 520 includes an internal power source (not shown), such as a battery.
- In some example embodiments, the output ports include a shutter control port 582, a metadata output port 584, and a focus/wakeup output port 586 coupled to the digital camera. The shutter control port 582 enables the data processing device 520 to inhibit and uninhibit the operation of the shutter of the digital camera. The metadata output port 584 couples the data processing device 520 to a metadata input port of the digital camera. One example embodiment of a digital camera metadata input port includes a GPS input port configured to receive information indicating the geographic position of the camera from a GPS unit.
- The focus/wakeup output port 586 couples the data processing device 520 to the settings input of a digital camera. In one example embodiment, the focus/wakeup output port 586 couples the data processing device to the auto focus feature of the digital camera. In other embodiments, the output ports also include a light output port 588 coupled to one or more external lights, such as light 440 of FIG. 4, for operating the lights. Of course, in still other embodiments, the external lights may be coupled directly to the flash synchronization output of the digital camera.
- In some embodiments, the program memory 560 of the data processing device 520 stores at least one program executed by the controller 550. In other embodiments, some of the functions of the programs are hardwired into the controller 550.
- Generally, the program memory 560 stores a shutter control module 562, a conversion module 563, a trip timer 565, and a wakeup module 566. The shutter control module 562 prevents and enables triggering of the camera shutter. A user cannot take a picture using the camera if the shutter control module is inhibiting operation of the shutter. The conversion module 563 reformats or converts the data entered by a user to a format readable by the digital camera. The trip timer 565 determines whether a particular sequence of events has occurred within a set time period. The wakeup module 566 has control over at least one setting or parameter of the camera. The wakeup module 566 prepares the camera to take a picture (i.e., "wakes up" the camera) by adjusting the setting. In one example embodiment, the wakeup module 566 adjusts the focus of the camera.
- In some embodiments, the program memory 560 of the data processing device 520 also stores an interface driver 564 for operating the interface 570. The interface 570 enables the data processing device 520 to communicate with the digital camera. In other embodiments, the program memory 560 stores a flash synchronization module 568 configured to control when the camera flash and/or the external lights illuminate the subject of a picture.
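The interplay of the shutter control module and the trip timer can be pictured as a small state machine. The sketch below is illustrative only: the class and method names are invented, and the timeout value is arbitrary.

```python
import time

class ShutterInterlock:
    """Illustrative model: the shutter stays inhibited until data is
    received, and re-inhibits after a picture is taken or after the
    trip timer expires without one."""

    def __init__(self, timeout_s: float = 30.0):
        self.timeout_s = timeout_s
        self._armed_at = None  # None means the shutter is inhibited

    def on_data_received(self) -> None:
        # Data entered: start the trip timer and uninhibit the shutter.
        self._armed_at = time.monotonic()

    def shutter_enabled(self) -> bool:
        if self._armed_at is None:
            return False
        if time.monotonic() - self._armed_at > self.timeout_s:
            self._armed_at = None  # timed out: inhibit again
            return False
        return True

    def on_shutter_fired(self) -> None:
        # Picture taken: inhibit until the next data entry.
        self._armed_at = None
```

Re-inhibiting on both paths is what ties each set of entered data to exactly one exposure.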
- FIG. 6 illustrates one example embodiment of an integrated camera and data processing device configured according to the teaching of the present invention. The integrated device 600 includes an enclosure 622 housing a circuit board 623 and a camera 630. The circuit board 623 includes a microcontroller 650 having a clock 651, program memory 660, non-volatile memory 653, and at least one input/output port.
- The microcontroller 650 is operatively coupled to an input device 610. In some embodiments, the input device 610 is housed within the enclosure 622. For example, in one embodiment, the enclosure 622 houses a barcode scanner 610. In other embodiments, the input device 610 is external of the enclosure 622 and couples to the controller through an input port on the enclosure 622.
- In one embodiment, the microcontroller 650 couples to the input device 610 through a serial port 654. In another embodiment, the input device 610 couples to the microcontroller 650 through a USB port (not shown) and a USB hub, such as USB hub 627. The USB hub 627 couples to a USB interface 657 of the microcontroller 650 through another USB port. In other embodiments, however, any suitable data transfer means can be used. The microcontroller 650 converts the data received through the input device port 654 and transmits the converted data through an output port 682 to a metadata input port 634 of the camera 630.
- In some embodiments, the input device 610 includes a trigger 611 that operates the input device 610. In one example embodiment, the trigger 611 is a button on the enclosure 622 that couples to a trigger input port on the microcontroller 650. In other embodiments, the input device 610 is external of the integrated device 600, and the trigger 611 operationally couples to the input device 610 and not to the microcontroller 650.
- The microcontroller 650 is further coupled to the camera via shutter output port 652 on the microcontroller 650 and shutter input port 632 on the camera, enabling the microcontroller 650 to control the camera shutter. A focus output port 686 on the microcontroller 650 and a focus input port 636 on the camera 630 enable the microcontroller 650 to adjust the camera focus. Of course, in other embodiments, the microcontroller 650 may control other camera settings instead of the focus. In such a case, the appropriate ports would be used.
- In some embodiments, the circuit board 623 also includes a flash synchronization interface 668. In one example embodiment, a signal indicating when the camera flash should illuminate is transmitted from an output port 638 on the camera 630 to an input port 658 on the flash interface 668. The signal is then transmitted via an output port 688 to external lights, such as external lights 440 of FIG. 4. In another example embodiment, however, the flash synchronization signal is transmitted from the camera output port 638 directly to the external lights.
- In general, the integrated device 600 is configured to run on one or more power sources. For example, the circuit board 623 includes a power supply/switch 656 that is configured to receive input from an external 9 Volt DC source, an external 13.5 Volt power source, and/or a 6 Volt power source located in the camera 630. The power switch 656 is also configured to transmit power to a 13.5 Volt DC input port on the camera. In one embodiment, the camera 630 is also configured to use a removable battery 635.
- The camera 630 includes a lens 631 and memory 639 in which to store digital pictures. In one example embodiment, the memory 639 includes a removable flash memory card. Of course, any suitable memory can be used. In some embodiments, the camera 630 includes an operation interface 633. In other embodiments, an external computer (not shown) controls the camera 630. In one example embodiment, the camera 630 is operatively coupled to the external computer with a cable connection. In another example embodiment, the camera 630 is operatively coupled to the external computer with a wireless connection.
- FIG. 7 illustrates an operation flow for an example process 700 for associating the converted user data with a digital picture at the time the picture is taken. The process 700 begins at module 705 and proceeds to an inhibit operation 710. The inhibit operation 710 prevents the user from triggering the camera shutter, thereby preventing the user from taking a picture. Generally, the inhibit operation 710 prevents a user from taking a picture until after the user has entered data to be associated with the picture.
- A receive data operation 715 receives user-entered data from a data input device, such as the data input device 410 of FIG. 4. In some embodiments, the receive data operation 715 receives a data string including a sequence of numbers and/or letters. In other embodiments, the receive data operation 715 receives two or more data strings. After the data is received, a start timer operation 720 activates a timer that counts down over a preset length of time.
- A convert operation 725 reformats (or otherwise converts) the data received in operation 715 into a format recognizable to the digital camera. A transmit operation 730 sends the reformatted data to the digital camera. In some embodiments, the transmit operation 730 sends the reformatted data to the digital camera only once. In other embodiments, the transmit operation 730 repeatedly sends the reformatted data to the digital camera in a continuous stream.
- A wakeup camera operation 735 adjusts at least one of the parameters of the digital camera in order to ensure that the camera is ready to receive the data and/or to take a picture. In some embodiments, the wakeup camera operation 735 changes the focus of the camera. In other embodiments, however, the wakeup camera operation 735 turns the camera flash on or off, changes the zoom setting, or adjusts another setting on the camera.
- In some embodiments, an uninhibit operation 745 then releases the shutter trigger and enables the user to take a picture using the camera. In other embodiments, however, a delay operation 740 occurs after the wakeup operation 735, but before the uninhibit operation 745. The delay operation 740 waits for a predetermined amount of time before proceeding to the uninhibit operation 745. Generally, the delay operation 740 has a sufficiently long duration to ensure that the camera has time both to wake up and to receive the transmitted data.
- After the uninhibit operation 745 has released the camera shutter, the process 700 monitors at module 750 whether or not the shutter has actually been triggered. If and when the shutter is triggered, indicating that a picture has been taken, the process 700 returns to the inhibit operation 710 to begin another cycle. The process 700 also monitors at module 755 the status of the timer that was activated by operation 720. When a predetermined length of time has passed without detecting operation of the camera shutter, the process 700 returns to the inhibit operation 710. Returning to the inhibit operation 710 prevents the user from taking a picture until a new set of data has once again been received and all steps of the process 700 have been repeated. This timeout feature helps to ensure that the entered data is associated with the correct picture.
- Referring now to FIG. 8, one example embodiment of the present invention can be used to identify and track professional school pictures. FIG. 8 illustrates three events occurring during such an application. The three events occur at times T1, T2, and T3, respectively. At time T1, an individual 890 receives an identification card 895 (or another indicia-bearing instrument) indicating information by which the individual's school picture can be tracked. For example, in one example embodiment, the identification card 895 indicates a job number and the individual's name. In other embodiments, however, the identification card 895 can also indicate the individual's school, class, town, whether the individual is a student or a teacher, or any other information useful for identifying the individual 890.
- In some embodiments, the identification card 895 also includes order information. For example, in the embodiment shown, the identification card 895 indicates at 804 a that the individual 890 has ordered Package B. The order information is also encoded as a barcode at 804 b. In other embodiments, however, an individual 890 indicates her order preference separately. In some example embodiments, the individual 890 indicates her order preference on an envelope enclosing money equal to the cost of the order. In other example embodiments, the individual 890 indicates her order preference orally to the photographer 897.
- In some embodiments, information displayed on the identification card 895 is printed in a format readable to a photographer 897. In other example embodiments, the information is written in a format readable by the input device 810, but not readable to the photographer 897. In still other embodiments, the information is written in multiple formats. For example, in one embodiment, the identification card 895 displays the individual's name and order package choice in both a typeface format and a barcode format.
- In general, subject portraits are not the only types of pictures taken by the photographer 897 during a session. As used herein, the term subject portrait refers to a picture of a subject such as an individual. Examples of other types of pictures are pictures taken to calibrate or monitor the performance of the camera. Still other examples include pictures taken to aid in tracking and identifying the pictures taken during a session. For example, in some embodiments, pictures are taken of start boards, calibration boards, room boards, end boards, and other such instruments. Start and end boards indicate the beginning and end, respectively, of a sequence of shots. A picture of a calibration board can be used to tailor the processes used to develop a picture. A dust shot can also be taken to determine the effectiveness of the camera over time.
- In some embodiments, a photographer 897 can enter additional information, such as a control code, into the input device to indicate the type of picture being taken. In one example embodiment, the photographer 897 enters a control code separately. In other embodiments, however, the control code is included in the information displayed on the identification card 895 and entered when the displayed information is entered.
- Still referring to FIG. 8, at time T2, the information on the identification card 895 is entered using an input device 810. In some embodiments, the input device 810 is coupled to a data processing device 820. In other embodiments, the input device 810 is coupled to an integrated camera and data processing device as described above with reference to FIG. 6. In one example embodiment, the input device 810 is a barcode scanner coupled to the data processing device 820.
- In the example shown in FIG. 8, a subject barcode 802 and an order barcode 804 displayed on the identification card 895 are scanned using a barcode scanner 810 coupled to an integrated data processor 820 and camera 830. In other embodiments, however, the photographer 897 can scan a barcode corresponding to an individual's order preference from a listing of barcodes encoding possible order preferences, such as the different package choices.
- One or more pictures are taken of the individual 890 at time T3 using the digital camera 830. The digital camera 830 saves the picture information in a storage file. The storage file also encodes as metadata the information entered at time T2. For example, in the embodiment shown in FIG. 8, the camera 830 stores the subject's name and order information in the image file. In some embodiments, the digital camera 830 is controlled directly by an operator 897. In other embodiments, the digital camera 830 is controlled by an external computer (not shown) operatively coupled to the digital camera 830.
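The outcome of the T2/T3 sequence can be pictured as a capture step that writes the scanned data into the image file itself, so the association travels with the file rather than being matched up afterwards. The record layout below is purely illustrative (a real camera would write EXIF/GPS tags into the image file, not a Python dict), and the field names are invented.

```python
def capture(pixels: bytes, converted_data: dict) -> dict:
    """Illustrative 'image file': the entered data is stored as
    metadata at the moment of capture."""
    return {"pixels": pixels, "metadata": dict(converted_data)}


shot = capture(b"\x89raw-image-bytes",
               {"subject": "SMITH, JANE", "package": "B", "job": "1234"})
```

Any later processing step can then read the order information directly from the file it is rendering, with no separate lookup table to keep in sync.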
FIG. 9 depicts one example embodiment of anidentification card 900 indicating a job number at 910, information identifying the individual 890 at 920, and order information at 930. In the example embodiment shown, the information identifying the individual 890 includes the name, school, homeroom, and grade of the individual 890. In other embodiments, however, theidentification card 900 could provide other identifying information. In one example embodiment, theidentification card 900 includes abarcode 940 encoding at least some of the information found on theidentification card 900. - Referring now to
FIG. 10, in some embodiments, the step of converting user-customized data, such as the subject and order barcodes 802, 804 of FIG. 8, includes loading the data into one or more GPS fields and assembling the GPS fields into a GPS data string. FIG. 10 illustrates a block diagram depicting loading user-customized data into GPS fields 1011-1018 of a GPS data stream 1010. In the example shown, the conversion process reformats two strings of data: a first barcode 1002 and a second barcode 1004. Of course, in other embodiments, the conversion process can convert any desired amount of data, limited only by the capacity of the digital camera for accepting the data. - As shown in
FIG. 10, in some embodiments, the conversion process fragments and/or resequences the user-entered data. For example, while the conversion process loads all of the first barcode 1002 into a second GPS field 1012, the conversion process breaks up the second barcode 1004 and loads only a portion of the second barcode 1004 into each of the fourth, sixth, and seventh GPS fields 1014, 1016, and 1017, respectively. - In some embodiments, the user-entered data is loaded into all fields 1011-1018 of the
GPS data string 1010. In other embodiments, the user-entered data is loaded into only some of the GPS fields, and the remaining GPS fields are set to default values. In some embodiments, the GPS data string 1010 includes a prefix field 1011 and a checksum field 1018. In some example embodiments, the checksum field 1018 enables verification that the GPS data string 1010 was transferred correctly from the processing device to the digital camera. In one example embodiment, the checksum field 1018 encodes a number calculated by the processing device based on at least some of the other fields in the GPS data string 1010. The data string 1010 is then passed to the digital camera. - The digital camera repeats the calculation and compares the result to the value encoded in the
checksum field 1018 to verify that the data string 1010 has not been corrupted. If the value calculated by the digital camera matches the checksum value encoded in field 1018, then the camera captures an image and associates the data encoded in the data string 1010 with the captured image. In some embodiments, however, if the calculated checksum does not match the encoded checksum, a warning is issued to the photographer. In one example embodiment, a light on the camera flashes in warning. In other embodiments, if the value calculated by the digital camera does not match the checksum value, then the digital camera inhibits the image capture ability of the camera. - Referring now to
FIGS. 11A-11G, each GPS field is configured to accept a valid input format. Data having a different format will not be accepted into the field. For example, FIG. 11A illustrates a GPS Altitude field 1100A configured to hold a rational number. FIG. 11B illustrates a North/South Latitude Indicator (NSLI) field 1100B configured to hold an ASCII decimal representation of the letter "N" or the letter "S" (i.e., 78 or 83, respectively). FIG. 11C illustrates an East/West Longitude Indicator (EWLI) field 1100C configured to contain an ASCII decimal representation of the letter "E" or the letter "W" (i.e., 69 or 87, respectively). In other embodiments, the NSLI and EWLI fields 1100B, 1100C are configured to accept the appropriate letters. -
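By way of a non-limiting illustration, the indicator-field checks described above can be sketched in Python (the specification does not mandate any language; the function names are the editor's, not the patent's):

```python
# Each indicator field holds the decimal ASCII code of one permitted letter.
NSLI_VALUES = {ord("N"), ord("S")}  # 78 and 83
EWLI_VALUES = {ord("E"), ord("W")}  # 69 and 87

def valid_nsli(code):
    # Accept only the ASCII decimal representation of "N" or "S".
    return code in NSLI_VALUES

def valid_ewli(code):
    # Accept only the ASCII decimal representation of "E" or "W".
    return code in EWLI_VALUES
```

Data carrying any other value, such as 65 ("A"), would be rejected by the field.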
FIG. 11D illustrates an example Latitude field 1100D configured to contain three rational numbers. Generally, the three rational numbers indicate the latitude in degrees, minutes, and seconds, respectively. In some embodiments, because degrees latitude are typically indicated using the notation 0 to 90° North and 0 to 90° South, the first rational number is two integers long, indicating the tens place and the ones place of the degree value. Minutes and seconds latitude are also typically expressed as two integers. In one example embodiment, because the Latitude field 1100D expects the second rational number to represent minutes, the first integer of the second rational number must range from 0 to 5 to be accepted. -
FIG. 11E illustrates an example Longitude field 1100E also configured to contain three rational numbers. Generally, the three rational numbers indicate the longitude in degrees, minutes, and seconds, respectively. In some embodiments, because degrees longitude are typically indicated using the notation 0 to 180° East and 0 to 180° West, the first rational number is three integers long, indicating the hundreds place, the tens place, and the ones place of the degree value. Minutes and seconds longitude are typically expressed as two integers. In one example embodiment, because the Longitude field 1100E expects the second rational number to represent minutes, the first integer of the second rational number must range from 0 to 5 to be considered valid. -
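A minimal sketch of the latitude and longitude constraints just described, assuming the degree, minute, and second values have already been parsed into integers (the function names are illustrative only):

```python
def valid_latitude(degrees, minutes, seconds):
    # Degrees latitude run 0 to 90 (two integers); requiring the first
    # digit of minutes and seconds to be 0-5 means both fall in 00-59.
    return 0 <= degrees <= 90 and 0 <= minutes <= 59 and 0 <= seconds <= 59

def valid_longitude(degrees, minutes, seconds):
    # Degrees longitude run 0 to 180 (three integers).
    return 0 <= degrees <= 180 and 0 <= minutes <= 59 and 0 <= seconds <= 59
```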
FIG. 11F illustrates an example Universal Time field 1100F configured to contain three rational numbers. Generally, the three rational numbers indicate the universal time in hours, minutes, and seconds, respectively. In some embodiments, hours, minutes, and seconds are each expressed as two integers. In one example embodiment, because the Universal Time field 1100F expects the first rational number to represent hours, the first integer of the first rational number must equal 0, 1, or 2 to be accepted as valid. Similarly, in some embodiments, the first integer of the second and third rational numbers, which indicate minutes and seconds, respectively, must range from 0 to 5 to be accepted. -
FIG. 11G illustrates an example Universal Date field 1100G, which can be configured to hold a number of different formats. For example, in some embodiments, the field is configured to hold eleven ASCII characters in the format YYYY:MM:DD_ with "_" indicating a null character. In other example embodiments, the field is configured to hold six ASCII characters in the format DDMMYY. In some embodiments, each ASCII character is expressed as a two-integer decimal value. In still other embodiments, the date need not be read as an ASCII value. In the example shown, the Universal Date field 1100G is configured to hold six integers indicating a numerical date in the format DDMMYY. -
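The time and date checks described for FIGS. 11F and 11G can be sketched together; the helper names and the digit-string representation are the editor's assumptions:

```python
def valid_utc_time(hh, mm, ss):
    # Hours, minutes, and seconds are each two digits; the leading digit
    # of minutes and seconds must be 0-5, so both fall in 00-59.
    return (len(hh) == len(mm) == len(ss) == 2
            and (hh + mm + ss).isdigit()
            and int(hh) <= 23
            and mm[0] <= "5" and ss[0] <= "5")

def valid_ddmmyy(date):
    # Six digits in the DDMMYY format used in the example shown.
    return (len(date) == 6 and date.isdigit()
            and 1 <= int(date[:2]) <= 31
            and 1 <= int(date[2:4]) <= 12)
```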
FIG. 12 illustrates an operation flow of an example process for converting user-entered information to information processable by a digital camera. In particular, FIG. 12 illustrates one example conversion process 1200 for loading up to three inputted data strings into the fields of a GPS data string. The example process 1200 is designed to load data associated with taking school portraits, such as a control code, a subject identifier, and an order identifier. Of course, in other embodiments, the process 1200 can be used to convert any user-entered data to a format readable by a digital camera. - The
process 1200 begins at module 1205 and proceeds to a first receive operation 1210 that receives a first data string. In one example embodiment, the first data string is a control code indicating the type of picture being taken. A first validate operation 1215 determines whether the first data string is readable. In some embodiments, the first validate operation 1215 may also determine whether the first data string is correctly formatted. For example, the first validate operation 1215 may require that the first data string begin with a particular character, for example a "$". If the first validate operation 1215 determines that the received information is unreadable and/or incorrectly formatted, then the process 1200 ends at module 1217. - If the first validate
operation 1215 determines that the received information is readable and/or correctly formatted, however, then a first store operation 1220 loads the first data string into a predetermined position in a predetermined GPS field. For example, the first store operation 1220 may load the first data string into the LAT_DEG_10 position of the Latitude field 1100D of a GPS string. In some embodiments, all of the received data is loaded into the GPS field. In other embodiments, only a portion of the received data is loaded into the GPS field. For example, in one embodiment, the validate operation 1215 requires that all control barcodes begin with a "$", and only the characters following the "$" are loaded into the GPS field by the first store operation 1220. In still other embodiments, the first data string is loaded into multiple GPS fields. - The
process 1200 next proceeds to a picture type determination module 1225, which determines the type of picture being taken based on the first received data string. - For example, in one embodiment, a first data string of "1" indicates that the picture is a portrait and a first data string of "3" indicates that the picture shows a calibration board. If the picture being taken is not a portrait, then the
process 1200 ends at module 1227. Identification information and order information are generally not entered in such a scenario. If the picture being taken is a student portrait, however, or if a user desires to enter further information, then the process 1200 proceeds to a second receive operation 1230. - The second receive
operation 1230 receives a second data string, for example, a subject code identifying the subject of the portrait. A second validate operation 1235 determines whether the second data string is readable. In some embodiments, the second validate operation 1235 may also determine whether the second data string is correctly formatted. If the second validate operation 1235 determines that the second data string is unreadable or incorrectly formatted, then the process 1200 ends at module 1237 with an error message. If the second validate operation 1235 determines that the second data string is readable and correctly formatted, however, then a character type determination module 1240 determines whether the second data string includes alphanumeric characters or numeric characters only. - If the
module 1240 determines that the second data string includes only numeric characters, then the process 1200 proceeds directly to a store indicia operation 1250. The store indicia operation 1250 loads a predetermined value in a predetermined position in a predetermined GPS field. The predetermined value is configured to indicate that the second data string is numeric only. A store identity operation 1255 next loads the subsequent characters of the second data string into predetermined positions in at least one predetermined field. For example, in one embodiment, the store indicia operation 1250 loads the predetermined value into the first position of the Altitude field 1100A and the store identity operation 1255 loads the subsequent characters of the second data string into the subsequent positions of the Altitude field 1100A. In other embodiments, however, other GPS fields may be used to store the information. - Referring back to the
determination module 1240, if the determination module 1240 determines that the second data string includes both numbers and letters, however, then the process 1200 proceeds to a translate operation 1245. The translate operation 1245 converts each of the characters found in the second data string to a numerical equivalent of the character. For example, each letter and number has a decimal ASCII value that can be expressed with two integers. In such a case, the store indicia operation 1250 loads a different predetermined value in the predetermined position of the predetermined field. The value would indicate that the second data string contains alphanumeric characters. The store identity operation 1255 would then load the translated numerical equivalents of the characters from the second data string into the appropriate positions in the appropriate field. - Next, a third receive
operation 1260 receives a third data string, for example, an order code representing order information. A third validate operation 1265 determines whether the third data string is readable. In some embodiments, the third validate operation 1265 also determines whether the third data string is correctly formatted. For example, the third validate operation 1265 may require that the third data string begin with a particular character, for example a "-" sign. If the third validate operation 1265 determines that the third data string is unreadable and/or incorrectly formatted, then the process 1200 ends at module 1267 with an error message. If the third validate operation 1265 determines that the third data string is readable and formatted correctly, however, then the process 1200 proceeds to a determine order length operation 1270. - The
order length operation 1270 determines the number of characters included in the third data string. A store order length operation 1275 loads the length of the third data string into one or more predetermined positions in one or more predetermined fields. For example, in some embodiments, the order length is indicated by a first integer representing the tens-column value of the length and a second integer representing the ones-column value of the length. In one embodiment, the first and second integers are stored in the HR_10 and HR_1 positions, respectively, of the UTC Time field 1100F. - A
store order operation 1280 loads each character of the third data string into a predetermined position of a predetermined GPS field. The characters need not be loaded in consecutive order within a field, however. For example, in one embodiment, the store order operation 1280 loads the first character of the third data string into the LAT_DEG_1 position of the Latitude field 1100D of the GPS string and the second character into the LAT_MIN_1 position of the Latitude field 1100D. The characters of the third data string need not even be loaded into consecutive fields. For example, continuing with the example discussed above, the store order operation 1280 loads the third character of the third data string into the MN_1 position of the UTC Time field 1100F. Of course, in other embodiments, any of the three data strings can be stored in non-consecutive positions and/or fields. - An assemble
operation 1285 sets any unloaded position in each field to a default value and forms a data string using the fields. A calculate operation 1290 determines a checksum value based on at least some of the other values stored in the GPS fields. The checksum value is encoded into a checksum field in the GPS data string, such as checksum field 1018 in GPS string 1010. A transmit operation 1295 passes the assembled GPS data string 1010, including the checksum value, to a digital camera. The process 1200 ends at module 1297. - Referring now to
FIGS. 13A-13C, some of the steps of the process 1200 can best be understood through some example applications. For the purposes of these examples, it is assumed that the picture to be taken is a student portrait. -
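Before turning to the figures, the assemble, calculate, and transmit steps (operations 1285, 1290, and 1295) and the camera-side comparison can also be sketched. The checksum algorithm below is an assumption for illustration only; the specification says only that the value is calculated from at least some of the other fields:

```python
def checksum(fields):
    # Assumed algorithm: sum of the byte values of every field, mod 256.
    return sum(sum(f.encode()) for f in fields) % 256

def assemble(fields):
    # Append the checksum as the final field of the GPS data string.
    return fields + [str(checksum(fields))]

def camera_verifies(data_string):
    # The camera recomputes the checksum over the leading fields and
    # compares it with the transmitted checksum field.
    *body, sent = data_string
    return checksum(body) == int(sent)

packet = assemble(["1123456", "4512", "074523"])
ok = camera_verifies(packet)                     # uncorrupted transfer
bad = camera_verifies(["1123457"] + packet[1:])  # one corrupted digit
```

On a mismatch, as described above, the camera could flash a warning light or inhibit image capture.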
FIG. 13A illustrates an example GPS Altitude field 1300A and a subject barcode 1302A. In the illustrated example, the subject barcode 1302A encodes the character string "123456." The second receive operation 1230 receives the character string and the second validate operation 1235 determines that the string is readable. The determination module 1240 then determines that the characters in the second data string are numbers only and, as shown in FIG. 13A, the store indicia operation 1250 stores the character "1" in the first position 1310A of the Altitude field 1300A. In the illustrated example, the character "1" indicates that the remaining characters stored in the Altitude field are numbers only. The store identity operation 1255 loads the characters into the remaining positions 1320A of the Altitude field 1300A. As shown in the illustrated example, the remaining characters stored in the Altitude field are "123456." The Altitude field 1300A, therefore, contains the number "1123456". -
FIG. 13B illustrates an alternative example of a GPS Altitude field 1300B being loaded with a subject barcode 1302B. In the illustrated example, the subject barcode 1302B encodes the character string "ABC1." In this example, the second receive operation 1230 of the process 1200 receives the character string and the second validate operation 1235 determines that the string is readable, similar to the example of FIG. 13A. The determination module 1240, however, determines that the characters in the second data string include both letters and numbers. The translate operation 1245 converts each of the letters and numbers into a decimal ASCII equivalent. For example, the number "65" is the decimal ASCII equivalent of the letter "A" and the number "49" is the decimal ASCII equivalent of the number "1." - As shown in
FIG. 13B, the store indicia operation 1250 loads the character "2" in the first position 1310B of the Altitude field 1300B. In the illustrated example, the character "2" in this predetermined position indicates that the remaining characters stored in the Altitude field are decimal ASCII equivalents of letters and numbers. The store identity operation 1255 loads the translated characters into the remaining positions 1320B of the Altitude field 1300B. As shown, the remaining characters stored in the Altitude field are "65666749." The Altitude field 1300B, therefore, contains the number "265666749". -
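The two Altitude-field examples can be reproduced with a short sketch. The helper name is the editor's; the indicia values "1" and "2" are those shown in FIGS. 13A and 13B:

```python
def encode_subject_code(code):
    # Numeric-only codes are stored as-is behind the indicia "1";
    # alphanumeric codes are translated character by character into
    # decimal ASCII values behind the indicia "2".
    if code.isdigit():
        return "1" + code
    return "2" + "".join(str(ord(ch)) for ch in code)

numeric_field = encode_subject_code("123456")  # "1123456", as in FIG. 13A
alnum_field = encode_subject_code("ABC1")      # "265666749", as in FIG. 13B
```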
FIG. 13C illustrates example UTC Time, Longitude, Latitude, and UTC Date fields 1312, 1314, 1316, 1318 being loaded with an order barcode 1304. In the illustrated example, the order barcode 1304 encodes the character string "-9876543." During the conversion process 1200, the third receive operation 1260 receives the character string and the third validation operation 1265 determines that the string is readable. In some embodiments, the third validation operation 1265 also determines that the first character of the third data string is a "-" character. - The determine
length operation 1270 then determines the number of characters in the third data string. Generally, the determine length operation 1270 does not count any symbol characters indicating that the string is a particular type of barcode, for example, the "-" character indicating an order barcode. In the illustrated example, the third data string has seven characters, not including the "-" character. Consequently, the store order length operation 1275 loads a "0" in the first character position of the UTC Time field 1312 and a "7" in the second character position. - In the example shown, the
store order operation 1280 loads the characters of the order barcode 1304 following the "-" symbol into predetermined positions in different GPS fields. In particular, the first character in the order barcode 1304, "9", is stored in the LAT_DEG_1 position of the Latitude field 1316. The second character, "8", is loaded into the LAT_MIN_1 position of the Latitude field 1316. The third character, "7", is loaded into the MN_1 position of the UTC Time field 1312. Similarly, the fourth, fifth, sixth, and seventh characters are loaded into the SE_1 position of the UTC Time field 1312, the LON_DEG_10 and LON_DEG_1 positions of the Longitude field 1314, and the YR_10 position of the UTC Date field 1318, respectively. - Using the above-described process, information identifying the subject of a picture and information regarding desired processing of the picture, such as order information, can be directly associated with the digital picture at the time the picture is taken. Furthermore, the identifying information and order information can be inputted in a user-customized format. The user is not restricted to encoding only the type or format of information that the camera is configured to accept and process.
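The scattering shown in FIG. 13C can be captured with a simple position map. The field and position names follow the text; the map structure itself is an illustrative assumption:

```python
# Destination of each order-code character, in sequence, per FIG. 13C.
ORDER_POSITIONS = [
    ("Latitude", "LAT_DEG_1"), ("Latitude", "LAT_MIN_1"),
    ("UTC_Time", "MN_1"), ("UTC_Time", "SE_1"),
    ("Longitude", "LON_DEG_10"), ("Longitude", "LON_DEG_1"),
    ("UTC_Date", "YR_10"),
]

def store_order(barcode, fields):
    # The leading "-" marks an order barcode and is not itself stored.
    for ch, (field, pos) in zip(barcode.lstrip("-"), ORDER_POSITIONS):
        fields.setdefault(field, {})[pos] = ch
    return fields

fields = store_order("-9876543", {})  # "9" lands in LAT_DEG_1, and so on
```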
- Referring now to
FIG. 14, the data stored with the digital picture files as metadata can be recovered and converted to a useable form for identifying and tracking the pictures. FIG. 14 illustrates a block diagram showing how the GPS string (or other metadata) is obtained from the image storage file 1435 on the camera 1430 by a data processor 1450. In one embodiment, the data processor 1450 can be the same data processor used to reformat the user data. In another embodiment, the data processor 1450 is a separate data processor. The data processor 1450 converts the data from a format processable by the digital camera 1430 to a useable format. In some embodiments, a useable format is a format readable by a user. In other embodiments, a useable format is a format readable by an image application. Generally, an image application 1460 enables a user to display the picture and to display the metadata in a format readable to the user. -
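One way to picture the restore step, assuming the data processor holds the same kind of position map that was used when loading the fields (names hypothetical):

```python
def restore(fields, positions):
    # Walk the position map in its original sequence, pulling each
    # scattered character back out of the metadata fields.
    return "".join(fields[f][p] for f, p in positions)

positions = [("Latitude", "LAT_DEG_1"), ("Latitude", "LAT_MIN_1"),
             ("UTC_Time", "MN_1")]
metadata = {"Latitude": {"LAT_DEG_1": "9", "LAT_MIN_1": "8"},
            "UTC_Time": {"MN_1": "7"}}
order_code = restore(metadata, positions)  # reassembles "987"
```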
FIG. 15 illustrates an operation flow for an example process 1500 for obtaining the saved image and metadata from the camera and restoring the metadata to a useable format. The process 1500 begins at module 1505 and proceeds to read operation 1510. The read operation 1510 obtains the metadata encoded in a digital image file. In one example embodiment, the read operation 1510 obtains a string of multiple GPS fields from a digital image file. - In some example embodiments, the desired user data is scattered throughout multiple fields of the metadata and not in sequential order. An
extract operation 1515 pulls user-entered data from the metadata and an order operation 1520 reassembles any disparate sections of the user data into the proper sequence. A store operation 1525 creates an electronic file to store the user-entered data on a computer or other storage medium other than the digital camera. In some embodiments, the digital image and the user data are stored in a database and organized according to the user data. In one embodiment, the image file can be indexed and searched based on the stored user data. The process 1500 ends at module 1530. - Referring now to
FIG. 16, after restoring the user-customized data, the data is used to process and deliver each captured image to the appropriate customer. FIG. 16 illustrates an operation flow for a process 1600 by which an image is processed and delivered using the restored user-customized data. The process 1600 begins at start module 1605 and proceeds to arrange operation 1610. - The arrange
operation 1610 arranges the image files obtained from the digital camera in a particular order. In some embodiments, the arrange operation 1610 organizes the image files within a database. In one example embodiment, the arrange operation 1610 organizes the image files into alphanumerical order based on a job number or reference number associated with the image file. In another example embodiment, the arrange operation 1610 organizes the image files into order first by school, then by homeroom, then by last name, and then by first name. Of course, in other embodiments, the arrange operation 1610 can organize the image files into any order that aids in the production of the image. - A
sequence operation 1615 obtains and analyzes the data associated with each image file in sequential order. A determine operation 1620 analyzes the data obtained from the sequence operation 1615 to determine the order preference and other production details associated with the image encoded by the image file. For example, in the case of a school portrait, the data obtained from each image file can include, but is not limited to, the name of the subject, subject contact information, school name, school contact information, grade of subject, customer order information, requested finishing techniques, and a delivery date. - A render
operation 1625 then prints one or more copies of the encoded image, using standard printing techniques, based on the order preference. For example, in one embodiment, two "five by seven"-inch pictures and eight wallet-size pictures of the image are printed. In some embodiments, the render operation 1625 modifies the encoded image before printing. For example, different finishing options can be performed on the image. Non-limiting examples of possible finishing options include removing stray hairs, removing "red eye," covering skin blemishes, color tinting, adding backgrounds and borders, and adding text to an image. - A
package operation 1630 packages and labels the rendered images in preparation for delivery. In some embodiments, the package operation 1630 places the images in one or more frames, folios, or albums. In other embodiments, the package operation 1630 encodes the images onto a CD. Of course, any suitable packaging method can be used to prepare the rendered images for delivery. A ship operation 1635 sends the packaged photographs to the customer. In one example embodiment, the ship operation 1635 delivers packaged school portraits to a school for distribution to the students. In such an embodiment, each package includes a label identifying the student to whom the package belongs. The process 1600 ends at stop module 1640. - While particular embodiments of the invention have been described with respect to its application, it will be understood by those skilled in the art that the invention is not limited by such application or embodiment or the particular components disclosed and described herein. It will be appreciated by those skilled in the art that other components that embody the principles of this invention, and other applications therefor other than as described herein, can be configured within the spirit and intent of this invention. The arrangement described herein is provided as only one example of an embodiment that incorporates and practices the principles of this invention. Other modifications and alterations are well within the knowledge of those skilled in the art and are to be included within the broad scope of the appended claims.
Claims (20)
1-20. (canceled)
21. A method of tracking digital images, the method comprising:
obtaining a first code using one or more input devices;
generating subject data identifying a subject from the first code;
obtaining a second code with the one or more input devices;
generating other data from the second code;
capturing an image of the subject with a digital camera;
storing the image in an image file on the digital camera, the image file including metadata including a metadata field; and
storing the subject data and the other data into the metadata field of the image file to associate the subject data and the other data with the image.
22. The method of claim 21 , wherein the first code is a barcode.
23. The method of claim 22 , wherein the second code is a barcode.
24. The method of claim 21 , wherein the other data generated from the second code are selected from: classroom, grade, town, and school.
25. The method of claim 21 , wherein storing the subject data includes inserting the other data into a GPS data string.
26. The method of claim 21 , wherein the digital camera includes a GPS input port and wherein storing the subject data includes transmitting the GPS data string to the GPS input port of the digital camera.
27. The method of claim 21 , wherein the subject is a first subject, and further comprising:
capturing an image of a second subject with a digital camera;
storing the image of the second subject in a second image file on the digital camera, the second image file including metadata including a metadata field; and
storing subject data identifying the second subject and the other data into the metadata field to associate the subject data identifying the second subject and the other data with the image of the second subject, wherein the subject data for the second subject is different than the subject data for the first subject and wherein the other data for the first and second subjects is the same when the first and second subjects share a same characteristic selected from: classroom, grade, group, team, town, event, and school.
28. The method of claim 27 , wherein the second code is obtained prior to capturing an image of any subject having the same characteristic, and the second code is obtained only once for all of the subjects having the same characteristic.
29. The method of claim 21 , further comprising analyzing the other data pertaining to the image by extracting the metadata stored in the image file.
30. The method of claim 21 , further comprising:
processing the image according to the other data associated with the image to form a processed image.
31. The method of claim 30 , wherein processing the image includes selecting the image based on a group number extracted from the metadata stored with the image file.
32. The method of claim 30 , further comprising:
delivering the processed image to a customer based on the other data associated with the image.
33. The method of claim 30 , wherein processing the image includes rendering the image.
34. The method of claim 30 , wherein the other data includes at least one requested finishing technique and processing the image includes modifying the image based on the requested finishing technique.
35. The method of claim 30 , wherein the input device is a barcode scanner.
36. A digital camera system for tracking digital images, the system comprising:
an image capture device configured to capture a digital image;
one or more processing devices; and
one or more computer readable storage devices storing data instructions, which when executed by the one or more processing devices cause the one or more processing devices to:
obtain a first code from one or more input devices;
generate subject data identifying a subject from the first code;
obtain a second code from the one or more input devices;
generate other data from the second code;
capture an image of the subject with the image capture device;
store the image in an image file in the one or more computer readable storage devices, the image file including metadata including a metadata field; and
store the subject data and the other data into the metadata field of the image file to associate the subject data and the other data with the image.
37. The system of claim 36 , wherein at least one of the one or more input devices is a barcode scanner.
38. The system of claim 36 , wherein the other data are organizational information.
39. The system of claim 36 , wherein storing the subject data and the other data into the metadata field comprises converting a format of the subject data and the other data into a GPS data string format.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/102,567 US20140267830A1 (en) | 2006-05-26 | 2013-12-11 | Identifying and Tracking Digital Images With Customized Metadata |
US14/611,545 US9924128B2 (en) | 2006-05-26 | 2015-02-02 | Identifying and tracking digital images with customized metadata |
US15/888,151 US10341603B2 (en) | 2006-05-26 | 2018-02-05 | Identifying and tracking digital images with customized metadata |
US16/457,458 US11095846B2 (en) | 2006-05-26 | 2019-06-28 | Identifying and tracking digital images with customized metadata |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/441,904 US7714908B2 (en) | 2006-05-26 | 2006-05-26 | Identifying and tracking digital images with customized metadata |
US12/729,898 US8619157B2 (en) | 2006-05-26 | 2010-03-23 | Identifying and tracking digital images with customized metadata |
US14/102,567 US20140267830A1 (en) | 2006-05-26 | 2013-12-11 | Identifying and Tracking Digital Images With Customized Metadata |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/729,898 Continuation US8619157B2 (en) | 2006-05-26 | 2010-03-23 | Identifying and tracking digital images with customized metadata |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/611,545 Continuation US9924128B2 (en) | 2006-05-26 | 2015-02-02 | Identifying and tracking digital images with customized metadata |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140267830A1 true US20140267830A1 (en) | 2014-09-18 |
Family
ID=38749137
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/441,904 Active 2027-09-09 US7714908B2 (en) | 2006-05-26 | 2006-05-26 | Identifying and tracking digital images with customized metadata |
US12/729,898 Active US8619157B2 (en) | 2006-05-26 | 2010-03-23 | Identifying and tracking digital images with customized metadata |
US14/102,567 Abandoned US20140267830A1 (en) | 2006-05-26 | 2013-12-11 | Identifying and Tracking Digital Images With Customized Metadata |
US14/611,545 Active US9924128B2 (en) | 2006-05-26 | 2015-02-02 | Identifying and tracking digital images with customized metadata |
US15/888,151 Active 2026-05-28 US10341603B2 (en) | 2006-05-26 | 2018-02-05 | Identifying and tracking digital images with customized metadata |
US16/457,458 Active 2026-08-12 US11095846B2 (en) | 2006-05-26 | 2019-06-28 | Identifying and tracking digital images with customized metadata |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/441,904 Active 2027-09-09 US7714908B2 (en) | 2006-05-26 | 2006-05-26 | Identifying and tracking digital images with customized metadata |
US12/729,898 Active US8619157B2 (en) | 2006-05-26 | 2010-03-23 | Identifying and tracking digital images with customized metadata |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/611,545 Active US9924128B2 (en) | 2006-05-26 | 2015-02-02 | Identifying and tracking digital images with customized metadata |
US15/888,151 Active 2026-05-28 US10341603B2 (en) | 2006-05-26 | 2018-02-05 | Identifying and tracking digital images with customized metadata |
US16/457,458 Active 2026-08-12 US11095846B2 (en) | 2006-05-26 | 2019-06-28 | Identifying and tracking digital images with customized metadata |
Country Status (2)
Country | Link |
---|---|
US (6) | US7714908B2 (en) |
CA (1) | CA2589837C (en) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7446800B2 (en) | 2002-10-08 | 2008-11-04 | Lifetouch, Inc. | Methods for linking photographs to data related to the subjects of the photographs |
WO2007012041A2 (en) * | 2005-07-20 | 2007-01-25 | Lab Partners Associates, Inc. | Wireless photographic communication system and method |
US7437063B2 (en) | 2006-04-07 | 2008-10-14 | Lab Partners Associates, Inc. | Wireless camera flash synchronizer system and method |
US7774023B2 (en) * | 2007-03-26 | 2010-08-10 | Kyocera Corporation | System and method for associating device information with digital images |
US20080261567A1 (en) * | 2007-04-17 | 2008-10-23 | Xerox Corporation | Mobile telephony device having a print request dedicated key for transmitting digital images to a printing system |
WO2008150902A1 (en) | 2007-05-29 | 2008-12-11 | Lab Partners Associates, Inc. | System and method for maintaining hot shoe communications between a camera and a wireless device |
US8270303B2 (en) * | 2007-12-21 | 2012-09-18 | Hand Held Products, Inc. | Using metadata tags in video recordings produced by portable encoded information reading terminals |
US20100198876A1 (en) | 2009-02-02 | 2010-08-05 | Honeywell International, Inc. | Apparatus and method of embedding meta-data in a captured image |
US8326136B1 (en) | 2009-02-12 | 2012-12-04 | Lab Partners Associates, Inc. | Systems and methods for communicating with a device using one or more camera body controls |
US8326141B1 (en) | 2009-02-12 | 2012-12-04 | Lab Partners Associates, Inc. | Systems and methods for changing power states of a remote device using one or more camera body controls and a preset delay |
US8614766B1 (en) | 2009-02-12 | 2013-12-24 | Lab Partners Associates, Inc. | Systems and methods for controlling a power state of a remote device using camera body backlighting control signaling |
US8718461B2 (en) | 2009-02-12 | 2014-05-06 | Lab Partners Associates, Inc. | Photographic synchronization optimization system and method |
CN102388342A (en) | 2009-02-12 | 2012-03-21 | 拉布合伙人联合公司 | Early photographic synchronization system and method |
US8794506B2 (en) * | 2009-02-23 | 2014-08-05 | Digitaqq | System for automatic image association |
US9519814B2 (en) | 2009-06-12 | 2016-12-13 | Hand Held Products, Inc. | Portable data terminal |
US8312137B1 (en) * | 2010-01-04 | 2012-11-13 | Google Inc. | Live experiment framework |
WO2012009537A1 (en) | 2010-07-14 | 2012-01-19 | Lab Partners Associates, Inc. | Photographic wireless communication protocol system and method |
US8565540B2 (en) * | 2011-03-08 | 2013-10-22 | Neal Solomon | Digital image and video compression and decompression methods |
US9195869B2 (en) * | 2011-11-10 | 2015-11-24 | Colorvision International, Inc. | Theme park photograph tracking and retrieval system for park visitors and associated methods |
US10573045B2 (en) | 2012-12-19 | 2020-02-25 | Shutterfly, Llc | Generating an assembled group image from subject images |
US9025906B2 (en) | 2012-12-19 | 2015-05-05 | Lifetouch Inc. | Generating an assembled group image from subject images |
US9690169B2 (en) | 2013-11-04 | 2017-06-27 | Lab Partners Associates, Inc. | Photographic lighting system and method |
EP2879371B1 (en) * | 2013-11-29 | 2016-12-21 | Axis AB | System for following an object marked by a tag device with a camera |
US20150286651A1 (en) * | 2014-04-04 | 2015-10-08 | Mach 1 Development, Inc. | Marked image file security system and process |
WO2016053331A1 (en) * | 2014-10-01 | 2016-04-07 | Landmark Graphics Corporation | Image based transfer of wellsite data between devices in a petroleum field |
US10284537B2 (en) | 2015-02-11 | 2019-05-07 | Google Llc | Methods, systems, and media for presenting information related to an event based on metadata |
US11048855B2 (en) | 2015-02-11 | 2021-06-29 | Google Llc | Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application |
US9769564B2 (en) | 2015-02-11 | 2017-09-19 | Google Inc. | Methods, systems, and media for ambient background noise modification based on mood and/or behavior information |
US10223459B2 (en) | 2015-02-11 | 2019-03-05 | Google Llc | Methods, systems, and media for personalizing computerized services based on mood and/or behavior information from multiple data sources |
US10339489B2 (en) * | 2015-02-13 | 2019-07-02 | One Stop Mailing LLC | Parcel processing system and method |
CN107624241B (en) * | 2015-04-09 | 2021-01-12 | 欧姆龙株式会社 | Method, system, and computer-readable storage medium for embedded web server |
WO2019246132A1 (en) | 2018-06-18 | 2019-12-26 | Digimarc Corporation | Methods and arrangements for reconciling data from disparate data carriers |
US11030730B2 (en) | 2019-04-02 | 2021-06-08 | Shutterfly, Llc | Composite group image |
CN111343335B (en) * | 2020-02-14 | 2021-02-26 | Tcl移动通信科技(宁波)有限公司 | Image display processing method, system, storage medium and mobile terminal |
Family Cites Families (120)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4062026A (en) | 1975-10-28 | 1977-12-06 | Fuji Photo Film Co., Ltd. | Data recording means for use in a single lens reflex camera |
CH632347A5 (en) | 1978-08-17 | 1982-09-30 | Gretag Ag | METHOD AND DEVICE FOR PRODUCING PHOTOGRAPHIC COPIES. |
US4269495A (en) | 1979-10-31 | 1981-05-26 | L. M. Dearing Associates | Camera with back-up shutter and provisions for recording data in conjunction with a primary image |
JPS56109328A (en) | 1980-02-04 | 1981-08-29 | Olympus Optical Co Ltd | Data recording |
US4422745A (en) | 1981-07-31 | 1983-12-27 | National School Studios, Inc. | Camera system |
IL66817A0 (en) | 1982-09-16 | 1982-12-31 | Gaz Moshe | Special visual and sound effects in cinematography using beam lasers on positive and negative copies |
US4714332A (en) | 1984-10-08 | 1987-12-22 | Canon Kabushiki Kaisha | Film data reading device |
JPS6214630A (en) | 1985-07-12 | 1987-01-23 | Canon Inc | Data imprinting device |
US4780735A (en) | 1985-08-29 | 1988-10-25 | Minolta Camera Kabushiki Kaisha | Photographic camera |
US4860039A (en) | 1985-10-31 | 1989-08-22 | Minolta Camera Kabushiki Kaisha | Photographic camera system and using thereof |
EP0222364B1 (en) | 1985-11-11 | 1992-08-05 | Minolta Camera Kabushiki Kaisha | Photographic camera |
JPS62133436A (en) | 1985-12-04 | 1987-06-16 | Fuji Photo Film Co Ltd | Deciding method for frame position |
EP0225597A3 (en) | 1985-12-04 | 1989-05-03 | Fuji Photo Film Co., Ltd. | Image date recording medium and apparatus for determining tape position of the same |
US4668984A (en) | 1985-12-24 | 1987-05-26 | Rca Corporation | Optical pattern generation technique |
JPH0617966B2 (en) | 1985-12-25 | 1994-03-09 | キヤノン株式会社 | camera |
US4814802A (en) | 1986-04-08 | 1989-03-21 | Nikon Corporation | Camera system |
US4864229A (en) | 1986-05-03 | 1989-09-05 | Integrated Ionics, Inc. | Method and apparatus for testing chemical and ionic sensors |
US4671648A (en) | 1986-07-11 | 1987-06-09 | Fuji Photo Film Co., Ltd. | Image data processing apparatus |
FR2610736B3 (en) | 1987-02-06 | 1989-02-10 | App Telecommandes Ste F | METHOD AND DEVICE FOR RECORDING AT HIGH SPEED IMAGES AND INFORMATION ASSOCIATED WITH IMAGES |
DE3888483T2 (en) | 1987-12-14 | 1994-06-23 | Fuji Photo Film Co Ltd | Process for determining the frame number of a photographic film. |
DE68926803T2 (en) | 1988-05-02 | 1997-02-06 | Nippon Kogaku Kk | Information recording system |
DE68929345T2 (en) | 1988-08-01 | 2002-07-11 | Asahi Kogaku Kogyo K.K., Tokio/Tokyo | electronically controlled camera |
US5021820A (en) | 1988-10-07 | 1991-06-04 | Eastman Kodak Company | Order entry process for magnetically encodable film with dedicated magnetic tracks |
JP2851284B2 (en) | 1988-10-18 | 1999-01-27 | 旭光学工業株式会社 | Photographing data recording device |
US5208706A (en) | 1988-10-24 | 1993-05-04 | Lemelson Jerome H | Magnetic reproduction apparatus and method |
US5179266A (en) | 1988-12-05 | 1993-01-12 | Fuji Photo Film Co., Ltd. | Photographic film and method of identifying frame numbers of photographic film |
US5128711A (en) | 1989-04-28 | 1992-07-07 | Fuji Photo Film Co., Ltd. | Apparatus for recording position information of principal image and method of detecting principal image |
JP2842614B2 (en) | 1989-05-02 | 1999-01-06 | 旭光学工業株式会社 | Camera data imprinting device |
US5227837A (en) | 1989-05-12 | 1993-07-13 | Fuji Photo Film Co., Ltd. | Photograph printing method |
US5107290A (en) | 1989-06-06 | 1992-04-21 | Canon Kabushiki Kaisha | Camera |
US5255031A (en) | 1989-11-13 | 1993-10-19 | Fuji Photo Film Co., Ltd. | Data-retainable photographic film cartridge |
US5124735A (en) | 1989-11-21 | 1992-06-23 | Nikon Corporation | Electromotive camera equipped with data photographing device |
US5479226A (en) | 1990-02-09 | 1995-12-26 | Nikon Corporation | Camera reading information applied to a film unit |
JPH03236030A (en) | 1990-02-14 | 1991-10-22 | Asahi Optical Co Ltd | Electronic control camera |
JP2817986B2 (en) | 1990-02-14 | 1998-10-30 | 旭光学工業株式会社 | Camera automatic shooting device |
US5289216A (en) | 1990-06-22 | 1994-02-22 | Canon Kabushiki Kaisha | Camera with film information setting device |
JPH04250436A (en) | 1991-01-11 | 1992-09-07 | Pioneer Electron Corp | Image pickup device |
JP3061473B2 (en) | 1991-04-15 | 2000-07-10 | オリンパス光学工業株式会社 | Data imprinting device |
US5260740A (en) | 1991-05-28 | 1993-11-09 | Fuji Photo Film Co., Ltd. | Method of detecting image frame and apparatus thereof, method of positioning image frame, photographic film carrier, and method of printing photographic film |
US6179494B1 (en) | 1991-07-19 | 2001-01-30 | Canon Kabushiki Kaisha | Camera |
EP0821265B1 (en) | 1991-09-12 | 2002-02-06 | Fuji Photo Film Co., Ltd. | Method of making photographic prints |
JPH05100305A (en) | 1991-10-11 | 1993-04-23 | Fuji Photo Optical Co Ltd | Camera with printing information recording function |
JP2613335B2 (en) | 1991-12-12 | 1997-05-28 | 富士写真フイルム株式会社 | Data recording device for camera |
US5274408A (en) | 1991-12-12 | 1993-12-28 | Konica Corporation | Electronic camera |
US5532774A (en) | 1992-03-05 | 1996-07-02 | Olympus Optical Co., Ltd. | Film data recording/reproducing apparatus for a camera by writing/reading pits recorded on a film |
US5570147A (en) | 1992-03-17 | 1996-10-29 | Sony Corporation | Photographic camera system |
US6151456A (en) | 1993-03-04 | 2000-11-21 | Sony Corporation | Photographic camera system |
US5600386A (en) | 1992-03-17 | 1997-02-04 | Sony Corporation | Photographic camera system |
KR100225112B1 (en) | 1992-09-28 | 1999-10-15 | 기시모토 마사도시 | Dot/code and information recording/reproducing system for recording/reproducing the system |
JP2948036B2 (en) | 1992-10-29 | 1999-09-13 | キヤノン株式会社 | Camera and camera film feeder |
US5311228A (en) | 1993-02-09 | 1994-05-10 | Eastman Kodak Company | Camera with optical film metering and image frame data encodement |
JPH06308595A (en) | 1993-03-15 | 1994-11-04 | Olympus Optical Co Ltd | Information recorder for camera
JP3289988B2 (en) | 1993-04-13 | 2002-06-10 | オリンパス光学工業株式会社 | Camera data imprinting device |
US5666578A (en) | 1993-04-28 | 1997-09-09 | Nikon Corporation | Camera and print information control apparatus |
US5555047A (en) | 1993-05-18 | 1996-09-10 | Minolta Camera Kabushiki Kaisha | Data reading and writing apparatus |
CA2127765C (en) | 1993-08-24 | 2000-12-12 | James Gifford Evans | Personalized image recording system |
US5822436A (en) | 1996-04-25 | 1998-10-13 | Digimarc Corporation | Photographic products and methods employing embedded information |
US5748783A (en) | 1995-05-08 | 1998-05-05 | Digimarc Corporation | Method and apparatus for robust information coding |
JP3101139B2 (en) | 1993-12-24 | 2000-10-23 | 富士写真フイルム株式会社 | Index photo, film package, package creation method and creation machine |
JP2938745B2 (en) | 1993-12-27 | 1999-08-25 | キヤノン株式会社 | camera |
DE69521928T2 (en) | 1994-03-31 | 2002-04-11 | Canon K.K., Tokio/Tokyo | Camera capable of recording information relating to the production of photographic prints |
US5530501A (en) | 1994-06-29 | 1996-06-25 | Eastman Kodak Company | Photographic camera with data recording on film |
US5634156A (en) | 1994-12-15 | 1997-05-27 | Eastman Kodak Company | Printing exposure reference |
JPH08234865A (en) | 1995-02-24 | 1996-09-13 | Canon Inc | Equipment provided with microcomputer |
US5525459A (en) | 1995-03-15 | 1996-06-11 | Minnesota Mining And Manufacturing Company | Photographic film having a mask bar code |
US5740484A (en) | 1995-06-01 | 1998-04-14 | Olympus Optical Co., Ltd. | Camera capable of recording data onto film |
US5646713A (en) | 1995-06-13 | 1997-07-08 | Eastman Kodak Company | Apparatus and method for exposing data characters onto a strip region of moving photosensitive media |
US5664248A (en) | 1995-08-08 | 1997-09-02 | Fuji Photo Optical Co., Ltd. | Photographic camera |
US5819126A (en) | 1996-01-19 | 1998-10-06 | Fuji Photo Film Co., Ltd. | Lens-fitted photo film unit and data recording method therefor |
JPH09211660A (en) | 1996-01-31 | 1997-08-15 | Nikon Corp | Information recording device of camera |
US5745811A (en) | 1996-03-27 | 1998-04-28 | Fuji Photo Film Co., Ltd. | Image recording device, developing device and image recording system |
TW351466U (en) | 1996-04-25 | 1999-01-21 | Seiko Epson Corp | Data-projection device for disposable cameras |
JP3643176B2 (en) | 1996-05-17 | 2005-04-27 | 富士写真フイルム株式会社 | Development processing control method, development apparatus, and image exposure apparatus |
JPH09325403A (en) | 1996-06-05 | 1997-12-16 | Fuji Photo Film Co Ltd | Magnetic recording/reproducing device for photographic film |
JPH1010619A (en) | 1996-06-19 | 1998-01-16 | Fuji Photo Optical Co Ltd | Camera |
US5991550A (en) | 1996-07-25 | 1999-11-23 | Nikon Corporation | Camera |
JPH1078610A (en) | 1996-09-03 | 1998-03-24 | Olympus Optical Co Ltd | Data imprinting device |
US5901245A (en) | 1997-01-23 | 1999-05-04 | Eastman Kodak Company | Method and system for detection and characterization of open space in digital images |
JPH1184511A (en) | 1997-09-10 | 1999-03-26 | Fuji Photo Film Co Ltd | Optical data exposure circuit |
US6396537B1 (en) * | 1997-11-24 | 2002-05-28 | Eastman Kodak Company | Photographic system for enabling interactive communication between a camera and an attraction site |
US6324345B1 (en) | 1997-12-10 | 2001-11-27 | Fuji Photo Film Co., Ltd. | Photographic film with recorded information, method of acquiring the information recorded on photographic film, image processing method using the acquired information, and print system using the same |
US6205296B1 (en) | 1997-12-12 | 2001-03-20 | Minolta Co., Ltd. | Film information writing device, film information reading device and film information handling device |
US6122520A (en) * | 1998-02-13 | 2000-09-19 | Xerox Corporation | System and method for obtaining and using location specific information |
JPH11231374A (en) | 1998-02-16 | 1999-08-27 | Fuji Photo Optical Co Ltd | Camera |
JPH11252337A (en) | 1998-02-27 | 1999-09-17 | Fuji Photo Film Co Ltd | Image data recording method and frame image reproducing method |
US6321040B1 (en) | 1998-06-04 | 2001-11-20 | Eastman Kodak Company | System and method of scanning, viewing and writing information on a magnetic layer of photosensitive film |
GB2344433B (en) | 1998-10-07 | 2002-12-18 | Asahi Optical Co Ltd | Camera having a data imprinting device |
GB2344432B (en) | 1998-10-07 | 2002-10-09 | Asahi Optical Co Ltd | Camera having a data imprinting device |
US6038408A (en) | 1999-02-18 | 2000-03-14 | Eastman Kodak Company | Optical data recording circuit for a film camera |
US6950800B1 (en) | 1999-12-22 | 2005-09-27 | Eastman Kodak Company | Method of permitting group access to electronically stored images and transaction card used in the method |
US6608563B2 (en) | 2000-01-26 | 2003-08-19 | Creative Kingdoms, Llc | System for automated photo capture and retrieval |
US6587839B1 (en) | 2000-02-04 | 2003-07-01 | Eastman Kodak Company | Method and system for notifying a consumer that the photofinishing order is ready and for controlling inventory of photofinishing orders in a business |
US6180312B1 (en) | 2000-03-22 | 2001-01-30 | Eastman Kodak Company | Photographic imaging system incorporating metadata recording capability |
US6674923B1 (en) | 2000-03-28 | 2004-01-06 | Eastman Kodak Company | Method and system for locating and accessing digitally stored images |
US6407767B1 (en) | 2000-08-09 | 2002-06-18 | Eastman Kodak Company | Apparatus for exposing sensitometric and bar code data onto photosensitive media |
US6280914B1 (en) | 2000-08-09 | 2001-08-28 | Eastman Kodak Company | Photographic element with reference calibration data |
US6745186B1 (en) | 2000-08-17 | 2004-06-01 | Eastman Kodak Company | Product and method for organizing and searching digital images |
US6629104B1 (en) | 2000-11-22 | 2003-09-30 | Eastman Kodak Company | Method for adding personalized metadata to a collection of digital images |
US6429924B1 (en) | 2000-11-30 | 2002-08-06 | Eastman Kodak Company | Photofinishing method |
US6553187B2 (en) | 2000-12-15 | 2003-04-22 | Michael J Jones | Analog/digital camera and method |
US20020101519A1 (en) * | 2001-01-29 | 2002-08-01 | Myers Jeffrey S. | Automatic generation of information identifying an object in a photographic image |
US6628895B2 (en) * | 2001-02-16 | 2003-09-30 | Eastman Kodak Company | Apparatus and method for obtaining special images from a one time use camera |
GB2374225A (en) | 2001-03-28 | 2002-10-09 | Hewlett Packard Co | Camera for recording linked information associated with a recorded image |
GB0113026D0 (en) | 2001-05-30 | 2001-07-18 | Eastman Kodak Co | A photographic processing system |
KR100582628B1 (en) * | 2001-05-31 | 2006-05-23 | 캐논 가부시끼가이샤 | Information storing apparatus and method therefor |
BR0306813A (en) | 2002-01-11 | 2004-12-28 | Gentpr Dev Group Inc | Systems and methods for portrait production |
US20030161009A1 (en) | 2002-02-22 | 2003-08-28 | Kenji Yokoo | System and method for processing and ordering photographic prints |
US7430003B2 (en) * | 2002-08-23 | 2008-09-30 | Candid Color Systems, Inc. | Digital camera/computer synchronization method |
US20040135902A1 (en) * | 2003-01-09 | 2004-07-15 | Eventshots.Com Incorporated | Image association process |
US7372482B2 (en) * | 2003-03-28 | 2008-05-13 | Hewlett-Packard Development Company, L.P. | System and method of capturing and providing supplemental data associated with a digital image |
JP2007502035A (en) * | 2003-07-29 | 2007-02-01 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Digital photo enriched photo viewing experience |
US7685428B2 (en) * | 2003-08-14 | 2010-03-23 | Ricoh Company, Ltd. | Transmission of event markers to data stream recorder |
US20050036034A1 (en) | 2003-08-15 | 2005-02-17 | Rea David D. | Apparatus for communicating over a network images captured by a digital camera |
US20050104976A1 (en) * | 2003-11-17 | 2005-05-19 | Kevin Currans | System and method for applying inference information to digital camera metadata to identify digital picture content |
US7528868B2 (en) * | 2003-12-18 | 2009-05-05 | Eastman Kodak Company | Image metadata attachment |
US20050270381A1 (en) * | 2004-06-04 | 2005-12-08 | James Owens | System and method for improving image capture ability |
US7347373B2 (en) * | 2004-07-08 | 2008-03-25 | Scenera Technologies, Llc | Method and system for utilizing a digital camera for retrieving and utilizing barcode information |
US20070073770A1 (en) * | 2005-09-29 | 2007-03-29 | Morris Robert P | Methods, systems, and computer program products for resource-to-resource metadata association |
US7797337B2 (en) * | 2005-09-29 | 2010-09-14 | Scenera Technologies, Llc | Methods, systems, and computer program products for automatically associating data with a resource as metadata based on a characteristic of the resource |
US7714908B2 (en) | 2006-05-26 | 2010-05-11 | Lifetouch Inc. | Identifying and tracking digital images with customized metadata |
Family history
Filing Date | Country | Application Number | Publication | Status |
---|---|---|---|---|
2006-05-26 | US | US11/441,904 | US7714908B2 | Active |
2007-05-23 | CA | CA2589837A | CA2589837C | Active |
2010-03-23 | US | US12/729,898 | US8619157B2 | Active |
2013-12-11 | US | US14/102,567 | US20140267830A1 | Abandoned |
2015-02-02 | US | US14/611,545 | US9924128B2 | Active |
2018-02-05 | US | US15/888,151 | US10341603B2 | Active |
2019-06-28 | US | US16/457,458 | US11095846B2 | Active |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6282362B1 (en) * | 1995-11-07 | 2001-08-28 | Trimble Navigation Limited | Geographical position/image digital recording and display system |
US20040066455A1 (en) * | 2002-10-08 | 2004-04-08 | Lifetouch, Inc. | Photography system |
US20080117309A1 (en) * | 2006-11-16 | 2008-05-22 | Samsung Techwin Co., Ltd. | System and method for inserting position information into image |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10341603B2 (en) | 2006-05-26 | 2019-07-02 | Lifetouch Inc. | Identifying and tracking digital images with customized metadata |
US11095846B2 (en) | 2006-05-26 | 2021-08-17 | Shutterfly, Llc | Identifying and tracking digital images with customized metadata |
Also Published As
Publication number | Publication date |
---|---|
US7714908B2 (en) | 2010-05-11 |
US8619157B2 (en) | 2013-12-31 |
US20100177212A1 (en) | 2010-07-15 |
US9924128B2 (en) | 2018-03-20 |
CA2589837A1 (en) | 2007-11-26 |
US20160014368A1 (en) | 2016-01-14 |
US20180234659A1 (en) | 2018-08-16 |
US10341603B2 (en) | 2019-07-02 |
US20070273774A1 (en) | 2007-11-29 |
US11095846B2 (en) | 2021-08-17 |
CA2589837C (en) | 2016-08-16 |
US20200014882A1 (en) | 2020-01-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11095846B2 (en) | Identifying and tracking digital images with customized metadata | |
US20190068844A1 (en) | Photography system to organize digital photographs and information regarding the subjects therein | |
US20060176516A1 (en) | System and method for embedding and retrieving information in digital images and using the information to copyright the digital images | |
US5383027A (en) | Portrait printer system with digital image processing editing | |
US20120013766A1 (en) | Device and method for embedding and retrieving information in digital images | |
US20050134707A1 (en) | Image metadata attachment | |
US5965859A (en) | Automated system and method for associating identification data with images | |
WO2005076898A2 (en) | Method for organizing photographic images in a computer for locating, viewing and purchase | |
JP4117675B2 (en) | Method and program for creating card photo data | |
JP4009926B2 (en) | Image output method and apparatus | |
JP2021048450A (en) | Imaging apparatus and imaging system | |
JP7130977B2 (en) | Inspection processing equipment and print production system | |
JP2010045435A (en) | Camera and photographing system | |
US20160085770A1 (en) | System and method of digital image matching with subject descriptors | |
KR20110037546A (en) | Method for manufacturing album using internet | |
JP2005017923A (en) | Index print creation system, sample photographic print creation system and index print | |
JP2005347802A (en) | Print processing method and print processing system | |
JP2005012638A (en) | Digital camera, print preparing apparatus and print ordering method | |
JP2005010531A (en) | Index print and sample print creation system using same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: LIFETOUCH INC., MINNESOTA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HOLMES, JON A.; REEL/FRAME: 033017/0381. Effective date: 20060525 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |