WO1992020184A1 - Computer/human interface for performing data transformations by manipulating graphical objects on a video display
- Publication number
- WO1992020184A1 (PCT/US1992/003135)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- transform
- user
- array
- data
- data values
Classifications
- H04N1/00 — Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof (within H — Electricity; H04 — Electric communication technique; H04N — Pictorial communication, e.g. television)
- H04N1/0035 — User-machine interface; Control console
- H04N1/00408 — Display of information to the user, e.g. menus
- H04N1/00413 — Display of information to the user using menus, i.e. presenting the user with a plurality of selectable options
- H04N1/00411 — Display of information to the user, the display also being used for user input, e.g. touch screen
- H04N1/46 — Colour picture communication systems
- H04N1/56 — Processing of colour picture signals
- H04N1/60 — Colour correction or control
- H04N1/6011 — Colour correction or control with simulation on a subsidiary picture reproducer
Definitions
- the invention relates generally to the interface between a human operator and a computer hardware and software system and in particular to a system for performing data transformations by manipulating graphical objects on a video display monitor.
- a preferred embodiment of the invention relates to color image processing systems employing look-up tables for transforming from a first coordinate space to a second coordinate space.
- Color image processing systems typically include an input device for generating an electronic representation of a color image. The input device provides the electronic image representation to a computer workstation, which processes the image in accordance with a user's instructions and forwards the processed image to a color monitor for display.
- the user interacts with the workstation, typically through input devices such as a mouse and keyboard, and an output device such as a control monitor, repeatedly instructing the computer to adjust the electronic image until the color monitor displays a desired image.
- the user can also generate a hard copy of the image by instructing the workstation to provide the processed electronic image to a selected printing device.
- the electronic image processed by the workstation consists of a two dimensional array of picture elements (pixels).
- the color of each pixel may be represented in any of a variety of color notations or "color spaces."
- the RGB color space represents pixel colors according to the relative contributions of three primary colors, red, green and blue.
- each pixel of the monitor's display contains three primary color phosphors.
- the monitor stimulates each primary phosphor with an intensity determined by the corresponding R, G, B value.
- the CMYK color space represents color using four variables, C, M, Y, K, each corresponding to the relative (subtractive) contribution of the colorants, cyan, magenta, yellow and black.
- each of the parameters C, M, Y, K determines the amount of a colorant (e.g. ink, dye) used by the printer in producing a desired color.
- Black is used to increase printed ink density and minimize the use of costly colored colorants, in situations where the overlay of multiple colorants would appear substantially black.
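The patent prescribes no formula for this black substitution; as a rough illustration only, a naive grey-component-replacement rule (the function name and the specific rescaling are assumptions) can be sketched in Python:

```python
def rgb_to_cmyk(r, g, b):
    """Naive CMYK separation with grey-component replacement:
    the grey component shared by C, M and Y is printed with
    black ink instead of overlaying three costly colorants."""
    c, m, y = 1.0 - r, 1.0 - g, 1.0 - b   # subtractive complements
    k = min(c, m, y)                      # common grey component -> black
    if k == 1.0:
        return 0.0, 0.0, 0.0, 1.0         # pure black: K ink only
    return ((c - k) / (1.0 - k),          # rescale remaining colorants
            (m - k) / (1.0 - k),
            (y - k) / (1.0 - k),
            k)
```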
- Color spaces such as linear RGB and CMYK are useful for image scanning devices and image printing devices, respectively, since each parameter of the color space closely corresponds to a physical mechanism by which these devices measure and generate color.
- the three parameters R, G, B define a three dimensional, linear color space, each point within the space corresponding to a unique color.
- a selected change in the values of the parameters may not result in a commensurate change in color perceived by a human viewer.
- at one location in the space, increasing the parameter R by n units yields little perceived change in color.
- at another point in the space, increasing R by the same n units yields a dramatic change in the perceived color. Accordingly, it may be difficult for a user to manipulate the primaries R, G, B, to achieve a desired change in color.
- the "u'v'L*" space for example, is a three dimensional color space defined by the parameters u', v', L*.
- the chromaticity of each color in this space is uniformly characterized by the parameters u', v 1 .
- L* perceptually uniform variations in the lightness of the color
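The patent gives no formulas for this space. Assuming it follows the standard CIE 1976 definitions of u', v' and L*, the mapping from tristimulus XYZ values would look roughly like this:

```python
def xyz_to_uvL(X, Y, Z, Yn=100.0):
    """CIE 1976 u'v'L* from tristimulus XYZ (Yn: white-point luminance).
    u', v' carry chromaticity; L* carries perceptually uniform lightness."""
    d = X + 15.0 * Y + 3.0 * Z
    u = 4.0 * X / d if d else 0.0
    v = 9.0 * Y / d if d else 0.0
    t = Y / Yn
    L = 116.0 * t ** (1.0 / 3.0) - 16.0 if t > 0.008856 else 903.3 * t
    return u, v, L
```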
- to process a color image within the u'v'L* color space, the workstation simply maps each point (u'0, v'0, L*0) in the color space to a new point (u'1, v'1, L*1). For example, if the user desires to display the image on a monitor, he may wish to adjust the colors of the image to compensate for lighting conditions of the room. Accordingly, the user selects a transform which maps each point (u'0, v'0, L*0) to a new point having the same chromaticity values u'0, v'0 but having a greater luminance value L*1.
- the image processing system typically contains a predetermined transform definition for each such color image transformation. Based on a selected definition, the system maps certain points of the color space to new points. Accordingly, the color at each pixel of an electronic image is sequentially mapped in accordance with the transform definition to yield the desired visual effect. To perform another image transformation, the system remaps the color values to yet another point in accordance with a second transform definition. Any number of transformations can thus be performed by sequentially mapping color values according to the available predetermined transform definitions. However, such sequential processing of images can be extremely time consuming, particularly if a large number of predetermined transforms are selected and a large number of data points must be transformed.
- the method features the steps of receiving a user's selection of an image transformation to be performed on the array of input pixel values.
- a plurality of transform definitions are automatically selected from stored transform definitions.
- Each transform definition includes sample values in three dimensions representing an input/output relation of a predetermined image transformation.
- a composite transform definition is generated containing sample values of an input/output relation of a composite image transformation which is equivalent to the several image transformations effectively selected by the user.
- the composite transform is preferably compiled and implemented sufficiently quickly (e.g., in real time) to allow the user to interactively change his selections until a desired composite transformation is created.
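To make the composition step concrete, here is a minimal sketch, not taken from the patent: the function names and the (N, N, N, 3) grid layout are assumptions. It builds a composite table whose single application is equivalent to applying table A and then table B:

```python
import numpy as np

def compose_futs(fut_a, fut_b, apply_fut):
    """Build one composite 3-D lookup table equivalent to applying
    fut_a and then fut_b.

    fut_a, fut_b: (N, N, N, 3) arrays of output samples on a regular
    grid over the input color space.
    apply_fut(fut, color): evaluates a table at one color, e.g. by
    interpolation (a sketch of which follows below).
    """
    n = fut_a.shape[0]
    composite = np.empty_like(fut_a)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                # push each grid point's A-output through table B
                composite[i, j, k] = apply_fut(fut_b, fut_a[i, j, k])
    return composite
```

Once built, the composite is applied once per pixel, so the per-pixel cost no longer grows with the number of transforms the user has selected.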
- At least one sample value is selected from the composite transform definition, based on the value of an input color to be modified.
- a processed color value is then determined based on the at least one selected sample value. For example, in one embodiment, a nearest neighbor of the input color value is selected as the sample value. In another embodiment, a plurality of sample values are selected and the processed color value is determined by interpolating between these values.
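A sketch of the interpolating lookup follows, again under the assumed layout of an (N, N, N, 3) table sampled uniformly over [0, 1]^3; rounding the grid coordinate instead of blending would give the nearest-neighbor variant:

```python
import numpy as np

def apply_fut(fut, color):
    """Evaluate a 3-D lookup table at one input color by trilinear
    interpolation between the 8 samples surrounding the input."""
    n = fut.shape[0]
    g = np.clip(np.asarray(color, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.minimum(g.astype(int), n - 2)    # lower grid corner
    f = g - i0                               # fractional position in cell
    out = np.zeros(3)
    for dx in (0, 1):                        # blend the cell's 8 corners
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1.0 - f[0]) *
                     (f[1] if dy else 1.0 - f[1]) *
                     (f[2] if dz else 1.0 - f[2]))
                out += w * fut[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return out
```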
- the stored transform definitions may include custom transform definitions made in accordance with a user's instruction as well as the predetermined transform definitions.
- a user's instructions specifying desired color changes are received.
- a custom transform definition is then prepared for implementing the selected color changes.
- Such custom transform definitions may be created by modifying predetermined transform definitions or by generating entirely new transform definitions based on the user's input.
- the three-dimensional transform definitions are functional look-up tables and are referred to herein as "FUTs.”
- the apparatus of the invention, in a first preferred embodiment, generates an array of modified data values related to an array of input data values in response to commands from a user.
- the apparatus includes storage means for storing at least one data transform definition and data processing means for identifying at least one of the transform definitions in response to a user's selection of at least one data transformation and for applying the identified transform definitions to the array of input data values.
- Interface means are provided between the user and the data processing means for receiving instructions from the user and for indicating the outputs of the data processing means.
- the interface means comprises an input device for the user to transmit instructions, signal generating means for generating signals representative of the one transform definition and the user instructions, and display means.
- the display means displays a graphical representation of the signals representing the user instructions and the transform definitions.
- Signal processing means are also included for receiving the user instructions to select and move on the display means the graphical representation of at least one of said transform definitions.
- the signal processing means generates commands to the data processing means to identify the selected at least one transform definition and to apply the selected transform definitions to the array of input data values to arrive at modified values.
- a second preferred embodiment of the invention further comprises means for generating a single composite transform definition, such that application of the single composite transform definition to the input data values generates the same output data values as would the individual serial application of a selected at least two transform definitions.
- the invention interface includes means for selecting a primary range of the input data array to be transformed according to a selected transform definition and designating at least one secondary range of the input data array to be transformed according to a transform definition that provides a selected rate of change of the transformation of data values within the secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
- the interface includes means for sending a message to and means for receiving a message from tools and objects, at least one data transform definition tool having means for sending a message to and receiving a message from other tools and objects and an object representing the input data values having means for sending a message to and means for receiving a message from tools, where the message sent by the transform tool to the data object includes a message to apply a selected transform to the input data values.
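The flavor of this message passing can be sketched as a toy model; the class names, the message spellings and the one-dimensional "picture" below are invented for illustration:

```python
class PictureObject:
    """Represents input data values; answers messages from tools."""
    def __init__(self, data):
        self.data = data

    def receive(self, message, sender):
        if message == "apply":
            # tell the tool where the data to be transformed lives
            sender.receive(("process", self.data), self)

class TransformTool:
    """Owns one transform definition; applies it when asked."""
    def __init__(self, transform):
        self.transform = transform

    def receive(self, message, sender):
        kind, data = message
        if kind == "process":
            data[:] = [self.transform(v) for v in data]  # transform in place

# releasing a tool on a picture reduces to one message exchange:
picture = PictureObject([0.1, 0.5, 0.9])
brighten = TransformTool(lambda v: min(1.0, v * 1.2))
picture.receive("apply", brighten)
print(picture.data)   # approximately [0.12, 0.6, 1.0]
```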
- the user interface further includes signal processing means for receiving from the user commands to identify on the display means the graphical representation of at least one master transform definition and to generate commands to the transform controller to generate a changeable working transform definition that is the same as the identified master transform definition, other than the ability to be changed.
- Figure 1 is a schematic diagram of the RGB color space.
- Figure 2 is a block diagram of an image processing system used in connection with the claimed invention.
- Figure 3 is a block diagram of a variety of transform operations in accordance with instructions generated by the user through a CHI of the invention.
- Figure 4 is a flow chart schematically showing the basic operation of a hand of the CHI of the invention.
- Figure 5 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are activated or applied.
- Figure 6 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are released on an object.
- Figures 7a-7f are a flow chart schematically showing the basic steps a user would follow, using the method of the invention, to display a picture on a monitor, try various modifications to that picture and finally make permanent the alterations to the picture.
- Figure 8 is a flow chart schematically showing the general steps the user would take according to the CHI of the invention to change a FUT relating to the grey balance of a picture.
- Figure 9 is a schematic showing basic elements of the CHI, including room, door and picture objects and hand, input, output, adjustment, grey balance, tonal, monitor, work order, memory storage, and inking tools.
- Figure 10 is a schematic representation of the hardware elements of the CHI of the invention.
- Figure 11 is a schematic showing the basic actions that will take place as a result of a user's activation of various combinations of two mouse buttons.
- Figure 12 is a schematic representation of the tool room of the CHI of the invention.
- Figure 13 is a schematic representation of an activated tonal tool of the CHI of the invention.
- Figure 14 is a schematic representation of an activated grey balance tool of the CHI of the invention.
- Figure 15 is a schematic representation of an activated monitor tool of the CHI of the invention applied to a picture object.
- Figure 16 is a schematic representation of an activated work order tool, including a grey balance tool adjusted to affect shadows, another grey balance tool adjusted to affect highlights and a tonal tool adjusted to affect the entire range.
- an image processing system 8 includes a plurality of input devices 10 for scanning a source image (such as a photograph, film, or a scene within the view of a camera) to create an electronic digital representation of the image.
- the electronic representation is provided to an image processor 14 which adjusts the colors of the electronic image and either stores the adjusted image on storage device 17 (e.g., for later retrieval and processing) or forwards it to various output devices 16 for printing, display or transmission over a network 15 or any other communication channel.
- Image processor 14 is connected to a user interface 22 through which a user indicates the desired transformations to be performed on an electronic image.
- image processor 14 and user interface 22 are implemented with a properly programmed general purpose computer or computer workstation.
- the principal aspect of the present invention is a user interface, described below.
- a transform controller 20 selects a set of FUTs from a collection 19 of stored predetermined FUTs.
- transform controller 20 may be implemented as separate hardware elements.
- Each predetermined FUT describes a unique transform for mapping the values representing each color of an image in a first color space to a different set of values (e.g., a different color in the same or a different color space) thus yielding a desired image transformation.
- the user can also create his own custom FUT in accordance with the invention.
- the user interface allows the user to select a set of colors to be changed (e.g., from a palette of possible colors).
- the user can then specify the desired changes to these colors (e.g., a specific increase in the brightness).
- the controller can then prepare a custom FUT corresponding to the user's selections.
- controller 20 can compose the selected FUTs into a single composite FUT 28 as illustrated in Figure 3 for processing and displaying (or FUT 32 for printing) the image from the input device without intermediate storage of the image.
- This selection and composition is performed with sufficient speed to allow the user to interact with the system, altering his selections until the system displays a desired image. Processing at a speed whereby the user experiences only a minimal or no delay between making a command and the system's response to that command is referred to as "real-time" processing.
- controller 20 provides the composite FUT 28 to transform processor 18 which implements the transform in accordance with the composite FUT 28.
- Transform processor 18 is also preferably implemented in software on a general purpose computer, performing any transform specified in a given FUT, although, it may also be composed of individual hardware elements.
- the user may instruct image processor 14 to accept an electronic image from scanner 10(a), perform selected transformations of the image, and display the transformed image on color monitor 16(a).
- controller 20 first (and preferably automatically) selects input FUT 12(a) for converting the electronic image from the particular scanner 10 (a) into a "reference" color space used in performing subsequent transformations.
- the defined input transform maps each point in the scanner's RGB space to a corresponding point in the perceptually based color space, u, v, L (i.e., the "reference" space). In performing this translation, the input transform compensates for idiosyncrasies of the associated scanner 10(a).
- each scanner 10(a), 10(b) may generate coordinates in the RGB space different from each other. Accordingly, the input FUTs 12(a), 12(b) (see Fig. 4) are calibrated to compensate for idiosyncrasies of each scanner such that each scanner generates the same point in the reference space uvL when scanning the same color.
- FUT 12(a) is set once for a particular scanner and is not modified by the user thereafter. However, the user may select from a limited number of such FUTs, depending on input conditions.
- input transform definition 12(a) can be applied to the input data by transform processor 18 alternatively before image store 17, or after as part of a composite transform definition.
- the user may request that the image from scanner 10(a), after calibration 12(a), receive further "input" processing through modifiable input FUT 13, and then be stored on storage device 17 before being further processed and displayed.
- the user modifiable input transform definition 13 may be used by a user, for instance, to change the effects of calibration transform definition 12(a), if it does not suit his particular needs. Accordingly, after controller 20 instructs transform processor 18 to process the electronic image according to input FUT 12(a) it then instructs processing according to input FUT 13.
- the resultant calibrated, transformed image is then stored in a storage device 17.
- the electronic image is stored as an array of data values in reference space for each segment of the picture.
- controller 20 automatically selects three FUTs 26, 30, and 15(a) required to implement the task.
- Display calibration FUT 15(a) is designed to convert the image from reference color space into the RGB color space required by the specific monitor 16(a), calibrating the image to compensate for the characteristics of the monitor so that it appears to the user on display 16(a) as it would be printed on printer 16(b).
- Gamut compression FUT 30 is designed to modify the image so that it will appear the same on monitor 16(a) as if printed on printer 16(b). For example, in most cases, electronic printers cannot print as wide a range of colors as a monitor can display. Thus, it is not useful to display these colors on a monitor. If the electronic image contains colors which the printer cannot print (but the monitor can display), the gamut compression transform maps these colors to acceptably similar colors within the printer's gamut.
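The patent does not specify how out-of-gamut colors are mapped to "acceptably similar" ones. One simple strategy, sketched under that caveat (the in_gamut predicate and the pull-toward-neutral rule are assumptions), desaturates a color at constant lightness until it becomes printable:

```python
def compress_to_gamut(u, v, L, in_gamut, steps=32):
    """Map a color to a similar printable one by pulling its
    chromaticity toward neutral while keeping its lightness.

    in_gamut(u, v, L) -> bool: printability test for the target printer.
    """
    if in_gamut(u, v, L):
        return u, v, L
    un, vn = 0.1978, 0.4683           # approx. D65 neutral point in u'v'
    for i in range(1, steps + 1):
        t = i / steps                 # fraction of the way to neutral
        cu, cv = u + t * (un - u), v + t * (vn - v)
        if in_gamut(cu, cv, L):
            return cu, cv, L
    return un, vn, L                  # worst case: fully neutral
```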
- User modifiable display FUT 26 is used to adjust the calibration of display 16(a) if the standard calibration FUT 15(a) does not suit the user's needs, for instance with respect to room light. FUT 26 adjusts all colors to compensate for the ambient lighting conditions surrounding display 16(a).
- the three FUTs 26, 30 and 15(a) are typically selected once for a given site to accommodate the monitor, room and printer.
- controller 20 After selecting FUTs 26, 30, and 15(a), controller 20 next selects pleasing FUT 24 corresponding to the color transformations requested by the user for a specific image.
- Pleasing FUT 24, for example, may increase the luminance parameter L of certain blue colors in the image.
- a user might, alternatively, for instance, be processing a picture taken by a photographer who habitually overexposes the photos.
- a different pleasing transform 24 might decrease the brightness over all ranges to counteract this incorrect overexposure.
- Controller 20 composes the four selected FUTs into a single composite FUT 28. It then fetches the previously stored data array representing the original image from storage 17 and instructs processor 18 to process the image according to composite FUT 28 for display on monitor 16(a).
- controller 20 dynamically composes the FUTs 24, 26, 30 and 15(a) into the composite FUT 28.
- the electronic image from scanner 10(a) is then processed in a single transformation step in accordance with composite FUT 28 and thereafter is displayed on color monitor 16(a).
- transform processor 18 applied input calibration FUT 12(a) and user modifiable input FUT 13 sequentially to the input data and then the data was stored. Afterward, composite FUT 28 was applied to the data which had already been transformed through FUTs 12(a) and 13.
- transform controller 20 can compose FUTs 12(a) and 13 together before image store 17 and instruct transform processor 18 to apply the composed FUT to the data and then store the resultant.
- the invention may also be practiced by storing the data in image store 17 just as it is scanned, without passing through any FUTs, and then including those FUTs 12(a) and 13 in the group of FUTs composed by transform controller 20 after image store 17.
- the user, if satisfied with the displayed image, next instructs the controller to generate a hard copy of the image on printer 16(b). Accordingly, the controller selects the FUTs 24, 30 and 15(b).
- the first two of the three definitions, pleasing FUT 24 and gamut compression FUT 30, are identical to the pleasing and gamut FUTs used in generating the display image above.
- Printer calibration FUT 15(b) is designed to convert the image into the CMYK color space, calibrating the image to compensate for characteristics of a specific printer 16(b), which have been previously determined. If input FUTs 12(a) and 13 had not been applied to the data before image store 17, then they too would be composed with FUTs 24, 30 and 15(b) to create output composite FUT 32.
- the foregoing description is of the hardware and software components used to implement a user's commands.
- the subject invention relates principally to the interface (22 Fig. 2) between a user and the hardware elements.
- while the invention has broad applicability to many types of data array processing, a particularly preferred embodiment is with respect to color image processing. Such an embodiment of a user interface is described below.
- a typical user interface 22 includes an input pointing device, such as a mouse 120 or trackball (not shown) having at least one and beneficially two user activated buttons L, R.
- a video display 122 is used to display icons or graphical objects representing data arrays that the user will instruct the transform processor 18 to use for transformation operations and the results of his instructions on those data arrays.
- the mouse 120 or trackball can be moved by the user.
- a sensor in the mouse senses its relative motions and transmits signals representing those motions to a processor that causes the display 122 to display a correspondingly moving cursor 100.
- a keyboard 124 may also be used for this purpose.
- the terminal 122 is not the same as the monitor 16(a), upon which the picture image is displayed and which is also part of the user interface. However, it is possible to dedicate a portion of a single display device to both tasks, according to well known techniques.
- the cursor 100 (Fig. 9) is an icon that resembles a hand. As explained below, the user uses the hand to accomplish certain tasks.
- the user interface also includes tools, which are displayed on the terminal as graphical icons.
- Additional tools include a scanner input tool 102, a monitor output tool 104, a hard copy printer tool 106, a memory storage tool 108, a grey balance tool 110, a tonality tool 112, a work order tool 117, an adjustment tool 114 and an inking tool 116.
- a picture object 118 is also shown.
- a scanner input tool allows the user to control a scanner to scan pictures into the system.
- a picture object allows the user to select data for a given picture to be treated.
- a monitor tool allows the user to control a color monitor to display and view pictures as they might be modified by use of the other tools.
- An output tool allows the user to control a printer to make hard copies such as prints or films of the image displayed on the monitor.
- a memory storage tool allows the user to direct that image data be written into or read from computer disk memory.
- a grey balance tool allows the user to add or remove color cast from a picture, either as displayed on the monitor or as stored or printed.
- a tonal tool allows the user to adjust the brightness and contrast of a picture over specified tonal ranges.
- a work order tool allows the user to combine the effects of several tools into one tool.
- An inking tool specifies the press/proof conditions under which to display or output a picture and permits control of under color removal.
- the processor of the user interface generates icons displayed on monitor 122 representing each of the tools shown in Fig. 9.
- each tool is controlled by a program entity according to methods well known in the art. As discussed below, there can be multiple instances of the same tool, either identically parameterized, or variably parameterized. Each instance of a tool is represented by a tool data element, and each type of tool is represented by a software routine.
- the tools may be thought of as independent processors that receive instructions from the user and act according to those instructions.
- the user activates and implements the tools by manipulating the icons of the tools with the hand icon, which is a cursor.
- the user can invoke in the hand two states of openness, i.e. open or closed, and two states of activation, i.e. activated and not activated.
- the states are invoked by use of two buttons L (left) or R (right) on mouse 120.
- the hand 100 can be closed by pressing the left mouse button.
- the hand icon has a "hot spot" (i.e. the "+" in Fig. 11, bottom). If the hot spot of the hand icon is overlapping any part of any other icon on the terminal screen 122 at the instant it is closed, it will grasp that object, as shown at 130.
- the icon representing the hand changes shape to show a hand grasping the tool or object being grasped, to signal to the user the grasped state of the tool.
- the drawing of a hand 98 is not part of the user interface, but is a representation of an actual hand of a human user, pressing the buttons of the mouse 120.
- This representational aspect of the user interface, i.e. the change of appearance of the hand icon when it changes state, closely represents the actions of three dimensional objects in the real world.
- This aspect is implemented in other fashions, as will be discussed, with respect to other actions implemented by the user.
- the tools are program entities that have attributes, such as display location, applied state, active state, control settings, etc.
- Each tool has an icon associated with it.
- the user actually causes the tool object, the program entity, to move in a space that is represented by the display 122. This is shown to the user by a change in appearance of the icon, e.g. it moves on the screen, or opens up to reveal additional controls.
- the hand opens 132 and its icon changes shape. If the hand was grasping an object before it was opened, it will release that object. If the hand is open and the right button is pushed, the hand activates any object or tool (or portion of a tool) to which its pointing finger is adjacent 134. The effect of activating a tool differs from tool to tool, as is discussed.
- the grasped tool is applied 136. If the grasped tool is adjacent to another object or another tool to which the grasped tool can be applied, then some activity will occur with respect to that second tool or object. If the grasped tool is not adjacent to any object or tool upon which it can be applied, the grasped tool and hand icon pair will change shape to indicate to the user that the grasped tool is being applied, but there is no object for its application. For instance, the adjustment tool 114 rotates as would a screwdriver in use, to illustrate its use and aid in learning.
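Under the hood, the hand amounts to a small state machine driven by the two mouse buttons. A rough sketch follows; the state names and the exact button-to-action mapping are a simplified reading of Fig. 11 and the surrounding text, not the patent's code:

```python
from enum import Enum, auto

class Hand(Enum):
    OPEN = auto()          # default cursor
    GRASPING = auto()      # closed on an object
    CLOSED_EMPTY = auto()  # closed with nothing under the hot spot

def hand_step(state, event, target):
    """One transition of the hand cursor. target is the object under
    the hot spot (or the held object when grasping); may be None."""
    if event == "L_press":
        if state is Hand.OPEN:
            return Hand.GRASPING if target is not None else Hand.CLOSED_EMPTY
        if state is Hand.GRASPING and target is not None:
            target.receive("apply", None)      # apply the held tool (217)
        return state
    if event == "L_release":
        if state is Hand.GRASPING and target is not None:
            target.receive("drop", None)       # invoke the drop routine (228)
        return Hand.OPEN
    if event == "R_press" and state is Hand.OPEN and target is not None:
        target.receive("activate", None)       # activate controls (220)
    return state
```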
- the interaction between the user, the hand 100 and other tools or objects in general is shown schematically in Fig. 4.
- the hand is initialized 200 and is made open 202.
- the system evaluates 204 the user command. If the user commands the hand to grasp (by pressing the L button), at 208 the system evaluates whether there is an object at that display location (which location would be part of the data parameters defining the program or processor entity that constitutes that object). If there is, at 206 the system sends a message identifying the hand's location on the display 122 along with a message that whatever is at that location is being grasped. Then at 210 the object receives the instruction to be grasped and sends back a message that something has been grasped and what it is.
- the hand icon changes shape or replaces the cursor with a view of the object held by a hand, and the grasped object moves with the hand cursor 100 in response to the user's motion of the mouse.
- otherwise, the user interface operating system would indicate that there was no object, and the hand would close empty handed at 212.
- messages may be sent by the system or a tool, as the case may be, by sending a message including the hand's (or a tool's) location, which message is intercepted by the object at that location.
- alternatively, the system or tool may examine the location beneath the hand and determine if there is another object there. If so, the system or tool sends the message; if not, it sends no message.
- the general case will be discussed as if the hand (or a tool) simply sends the message to the second tool, without regard to the manner by which it is determined if a second tool is at the location or the fact that the operating system, rather than the hand, sends and receives messages.
- if the user had next commanded the hand to open, it would return to 202 and be made open. If at 214 the user had transmitted any other command, such as no command or a movement of the mouse, the hand 100 would remain closed.
- the hand, which is grasping something, would evaluate the user command at 216. If the user had pressed the L button, directing the hand 100 to apply the held tool or object, the hand 100 would send 217 an "Apply" message to the held tool or object to apply itself. The held object would at 218 act accordingly upon receiving a command to apply itself, depending on its environment. A specific implementation of a tool responding to an "apply" message is shown schematically in Fig. 5 and is discussed below. After the object has been applied, the hand 100 again evaluates 216 the user command and proceeds as described.
- if the user command is to "release" (also referred to as "drop") the object, the hand 100 sends a "Drop" message to the grasped object that it is released, and the hand drops 228 the object.
- the specific routine for the object being released is invoked at 230.
- a routine for a specific object having been dropped is shown schematically at Fig. 6. After the completion of such a routine, the hand 100 is made open at 202 and again evaluates a user command. If at 216 the user command is other than "apply" or "release", the hand again evaluates the user command at 216. If, for instance, the user command is "move", the hand returns to 216 and then, if an "apply" command is in place, it again goes to the specific routine for the application of that object.
- the hand 100 sends 220 a message that whatever is at that location is being activated and the hand icon changes shape, e.g. to a hand with a pointing finger.
- the object receives 224 the instruction to be activated.
- the routine for the particular object, when activated, is invoked 226.
- Such a procedure for a particular activated object is shown schematically at Fig. 5.
- an object can be applied without being activated, depending on whether that condition would be an appropriate analogy to the real world.
- if the hand 100 is not over an object, it will change its shape to point at nothing at 240, to illustrate its use and aid in learning.
- the hand will evaluate the user command. If the user command is "deactivate", the hand returns to 202 and is made open, proceeding as before. If the command is other than "deactivate", such as "no command" or a movement of the mouse, the hand remains activated.
- if a tool is released or activated, that tool invokes its own specific routine for being released or activated.
- the routine for released and activated tools of the type that "create" a FUT is shown schematically at Fig. 5.
- Such tools include the grey balance tool 110, the tonal tool 112 and the work order tool 117. It will be understood to those of ordinary skill in the art that tools that "create" FUTs do so by instructing transform controller 20 to create a FUT.
- the tool creates a FUT, albeit at a distance, and through the agency of another program entity, the transform controller 20; similarly, with respect to modifying and composing FUTs.
- the tool enters the routine from the stage discussed above, at A or C of Fig. 4.
- the tool receives 304 a message, if any, from the target.
- the tool evaluates 306 the message. If 308 there is no message from a target, then the tool returns to Fig. 4. Such a situation could arise if there is no object under the tool, or if the object under the tool is not the type that can receive a FUT. If the message is "process the picture at memory location X", the tool gets 310 the data at memory location X and transforms 312 the data according to the FUT then associated with the tool. The tool writes 314 the data to the memory location X and returns 316 to Fig. 4.
- the tool computes a new FUT if necessary 318 and sends 320 the FUT to the target object. The tool then returns 322 to Fig. 4. The target object will proceed along its own command path. A typical such command path is discussed with respect to a monitor tool 104 and Fig. 7.
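In pseudocode terms, the Fig. 5 apply/activate routine reduces to a dispatch on the target's reply. The helper names and the dict-backed "memory" below are stand-ins, not the patent's identifiers:

```python
MEMORY = {}  # memory location -> picture data (stand-in for the image store)

def on_apply(tool, target_message):
    """Dispatch on the message, if any, received from the target
    (steps 304-322 of Fig. 5, in simplified form)."""
    if target_message is None:
        return                                        # no usable target: 308
    kind, payload = target_message
    if kind == "process":                             # "process picture at X"
        data = MEMORY[payload]                        # get data: 310
        MEMORY[payload] = [tool.fut(v) for v in data] # transform, write: 312/314
    else:                                             # target accepts a FUT
        tool.recompute_fut_if_needed()                # 318: new FUT if necessary
        payload.receive_fut(tool.fut)                 # 320: send FUT to target
```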
- the routine for a dropped object is shown schematically at Fig. 6, which is invoked at B of Fig. 4.
- the tool receives 328 a message from the object.
- the tool evaluates 330 the received message. If the received message is "send your FUT," the tool sends 332 its FUT and returns 334. If at 330 the message is anything other than "send your FUT," the tool returns at 334 without sending its FUT.
- the tool must compute a new FUT.
- This routine is shown schematically in Fig. 8 with respect to the grey balance tool 110, shown in more detail in Fig. 14.
- the grey balance tool 110 is used to add or remove a color cast to an image.
- the grey balance tool permits a color cast change with respect to all or only specified parts of the tonal range of an image. For instance, if the highlights of an image are too blue, it is possible to reduce the blueness of only the highlights of the picture, leaving untouched the blueness of all other tonal portions of the picture e.g. the shadows and midtones.
- the grey balance tool 110 is divided into zones, which zones act as controls to be separately activated by the user.
- the grey balance tool 110 (Fig. 14) includes a color control 520, a maximum effect control 522 and decreasing and increasing effect controls 526, 524 respectively.
- the user indicates with the hand 100 a location on the color circle 520, for instance as marked by the small x.
- the color circle represents all hues from neutral to their most saturated forms; the center represents neutral and the outside edge represents the most saturated form of each hue. Hues vary continuously around the circle.
- the letters C, G, Y, R, M, B stand for cyan, green, yellow, red, magenta and blue, respectively. Pointing with the activated hand 100 at the center of the color circle indicates no color change.
- Pointing the hand 100 at a location away from the center indicates the amount of a color change (not the absolute amount of color cast).
- if the user wants to remove a purple color cast, he adds a yellowish green color cast to all hues in the picture, by pointing at the spot indicated by the x. If the user finds that the added cast was correct in color, but too weak in saturation, the user points further out along the radius on which x lies, until the desired color cast is achieved.
- the user must also select the range of tones over which the color cast is desired. This is done with adjustment of the maximum effect control 522, increasing effect control 524 and decreasing effect control 526.
- the scale 528 indicates schematically what range of tonality will be affected by application of the three other controls. The shadows are indicated on the left of the scale 528 and the highlights on the right. Because, as shown in this example, the maximum effect control extends equally about the mid region of tonality, the full effect of the yellowish green color shift will take effect only in regions of midrange brightness.
- the slope of the edges of increasing effect control 524 and decreasing effect control 526 can be moved by activating the hand in that region and moving the hand while activated.
- if the increasing and decreasing effect controls were not inclined, the range of maximum effect would span the tonal range and the indicated color cast change would be applied fully to all tonality ranges, from the darkest shadows to the lightest highlights. If, however, the increasing effect control 524 is inclined as indicated, then tonal regions that are only slightly darker than the mid tonal region indicated by the maximum effect control 522 will have their color cast changed, but at less than the full change. The further from the midtones that a tonal range is, the less will there be an effect in parts of the image having that tone. Inclining the increasing effect control 524 more steeply, so that it does not intersect the side margin of the panel, as shown with respect to the grey balance tool 110b of Fig. 16, will result in no color change whatsoever at the darkest shadows.
- the decreasing effect control 526 works similarly with respect to the other side of the area of maximum effect. It is also possible, in a preferred embodiment, to move the area of maximum effect from left to right or vice versa, i.e. from darker to lighter tonalities, by activating the maximum effect control 522 with the hand and moving it, side to side, in the desired direction.
- the width of the region of maximum effect can also be widened or narrowed by activating the maximum effect control 522 and moving up or down. Both side to side and up and down motion may be done in combination.
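Taken together, the three tonal controls amount to a trapezoidal weighting of the color correction across the tonal axis. A sketch under that assumption (the patent shows the controls but no explicit falloff formula; linear ramps are assumed here):

```python
def tonal_weight(tone, max_lo, max_hi, ramp_in, ramp_out):
    """Fraction (0..1) of the color-cast change applied at a given tone.

    tone: 0.0 = darkest shadows, 1.0 = lightest highlights.
    [max_lo, max_hi]: region of maximum (full) effect.
    ramp_in, ramp_out: widths of the increasing / decreasing
    effect ramps flanking the maximum-effect region.
    """
    if max_lo <= tone <= max_hi:
        return 1.0                                # full effect in the band
    if tone < max_lo:                             # shadow (increasing) side
        return max(0.0, 1.0 - (max_lo - tone) / ramp_in) if ramp_in > 0 else 0.0
    return max(0.0, 1.0 - (tone - max_hi) / ramp_out) if ramp_out > 0 else 0.0
```

Steepening a ramp (shrinking ramp_in toward zero) reproduces the Fig. 16 behavior in which the darkest shadows receive no change at all.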
- the effect of moving these controls on the FUT generated by the grey balance tool is shown schematically in Fig. 8. The grey balance tool starts the routine at 350.
- the tool computes 354 the color parameters for its FUT based on the location of the hand, recomputes 356 its FUT and returns 358 to the main routine shown in Fig. 6(a), then moving on to send 320 its FUT to the target object.
- if the hand is not in the color circle control, the grey balance tool evaluates whether the hand is in the maximum effect control 522. If so, the grey balance tool identifies 362 the tonal range of the maximum effect based on the location of the hand and the parameters that will be used to generate the new FUT, and then proceeds to recompute 356 the new FUT as described above.
- if the hand is not in the maximum effect control 522, the tool evaluates 364 whether it is in the increasing effect control 524. If it is, the grey balance tool computes 366 the degree of increasing effect based on the location of the hand, and identifies the parameters that will be used to generate the new FUT, then proceeding to generate the new FUT as before. If the hand is not in the increasing effect control 524, the tool evaluates 368 whether it is in the decreasing effect control 526, with similar results. If the hand is not in the decreasing effect control 526, then the tool returns 372 to the main routine shown in Fig. 6(a), then moving on to send 320 its FUT to the target object.
- the tool will generate a new FUT for each new control location and send that FUT to the target object, for instance a monitor tool.
- the system reviews the location of the hand at least two times per second. This is fast enough to generate real-time processing.
- the monitor tool 104 allows the user to control the high resolution monitor 16(a), and more. In its inactive state, the monitor tool icon 104 appears as shown in Fig. 9. When dropped on a picture, the monitor tool icon enlarges to surround the icon of the picture object 118, as shown in Fig. 15.
- the monitor tool includes several controls, which work in the same general fashion as the controls of the grey balance tool.
- a zoom slider 111 is used to control the degree of magnification at which the high resolution monitor will show the picture.
- the view area control 113 can be moved around the monitor, by activating with the hand, to select the portion of the picture to be displayed on the high resolution monitor.
- the zoom setting indicator 115 displays the magnification of the chosen degree of zoom.
- the hand 100 grasps the monitor tool 104 when the user presses the L button, according to the message interchange discussed above with respect to the general operation of the hand 100.
- the user moves 408 the monitor tool 104 to a picture object 118 and releases 410 the monitor tool.
- the monitor tool sends 412 a message "process.”
- the picture object, being at the location of the monitor tool, receives 414 the message and sends a message: "a data array for the picture to be processed is at memory location [specified]."
- the monitor tool 104 receives 416 the message from the picture object, gets the data array from the specified memory location and processes the picture data using the FUTs then in place. It will be assumed for the purpose of this discussion that the only FUT in place is a display calibration FUT 15(a) (Fig. 3).
- the monitor tool has associated with it a composite FUT 28, which may combine a plurality of individual FUTs into one, thereby minimizing the computation time.
- composite FUT 28 is the same as display calibration FUT 15(a) .
- the monitor tool 104 stores the data for the transformed array in memory at a location different from that where the original picture data resides. Thus, the original data remains intact for further manipulation.
- the monitor tool causes the data for the transformed array to be sent to the color monitor 16(a), where it is displayed for the user's inspection. It will be understood that the data array that results from the application of composite FUT 28 will be in RGB space, the type of signals that the display device 16(a) uses to display an image. If the user wants to add a gamut compression FUT 30, to display on the monitor the image as it would actually be printed by a specific printer, at 420 he causes the hand 100 to grasp an adjustment tool 114, which in a preferred embodiment appears as a screwdriver. The hand moves 422 the adjustment tool 114 over the monitor tool and applies 424 the adjustment tool.
- the adjustment tool sends a message "adjust.”
- the monitor tool receives the message and the icon changes 426 appearance to reveal an adjustment panel 105.
- the adjustment panel 105 includes a zone 107, referred to as an "inking sticky pad.”
- the hand moves away 428 from the monitor and releases the adjustment tool 114, which remains where released and causes no further action.
- the user moves 430 the hand 100 to and grasps an ink tool/object 116.
- the user moves 432 the ink object over the sticky pad and causes the hand to send a "release" message to the inking tool 116.
- the inking tool 116 sends 434 a "process" message.
- the monitor tool receives the message and sends 436 a message, "send your FUT," which is received by the inking tool.
- the inking tool 116 sends 438 its FUT for the gamut compression of a particular inking process and that FUT is received by the monitor tool 104.
- the monitor tool 104 composes 440 the gamut compression FUT with whatever FUTs it has been previously set up with. This results in a composite FUT 28 (Fig. 3), which provides the same transformation as would first application of gamut compression FUT 30 and next display calibration FUT 15(a).
- the monitor tool gets the originally stored picture data and processes the data through the newly composed FUT, and displays the picture as modified on the color monitor 16(a) .
- the user interface provides an elegant method by which a user can cause a composite FUT to be created from two individual FUTs, simply by selecting tools on the display and moving those tools relative to one another and picture objects.
- the user may want to effectuate additional transformations to the image being worked upon, which are also of a general nature.
- the room in which the monitor 16(a) is used may be unusually bright and thus a change in the brightness is desired so that the image appears as would a printed image in normal lighting conditions. Further, it may be desired to offset an undesired color cast caused by the lighting in the viewing room.
- This second type of adjustment is a grey balance adjustment.
- a work order tool 117 is a tool that combines other tools which have FUTs associated with them, such as inking tools, grey balance tools and tonality tools.
- a work order tool 117 is shown in an activated configuration in Fig. 16.
- the work order tool shown has three other tools associated with it: a grey balance tool 110a for affecting the shadows, a grey balance tool 110b for affecting the highlights and a tonal tool 112a for an overall effect.
- the work order tool can accommodate a large number of tools. It should be noted that there is no limit to the number of identical types of tools that a user can use in one image processing environment.
- new tonal tools 112 can be freely made by copying from existing tonal tools, and then modifying them. If a tool is copied, according to the software embodiment of the invention, a new data structure is created that is initially virtually identical to the data structure of the tool copied.
- the user simply causes the hand 100 to grasp the desired tool and release it on the work order tool 117.
- the released object sends a "process" message to the work order tool.
- the work order tool sends a message that is received by the released tool to "send FUT".
- the work order tool receives the FUT, and combines it with the previously combined FUTs received from tools already a part of the work order.
- the work order tool then graphically merges the added tool into the work order.
- Each "pocket" of a work order tool is actually a sticky pad.
- the work order tool is another tool by which the user can combine transformations, simply by causing the hand to move graphical objects relative to one another on a terminal display.
- the user grasps 444 a previously prepared work order tool 117 and moves it to the sticky pad 109 on the monitor tool adjustment panel 105, where it is released.
- the work order tool 117 sends its composed FUT to the monitor tool, which receives it and composes 450 a new FUT based on the FUTs already installed, for instance the display calibration FUT 15(a) and gamut compression FUT 30 generated by inking tool 116, stuck onto sticky pad 107.
- the order in which the monitor tool composes the FUTs is important. Further, it is not always possible for the monitor tool to simply compose a new FUT, for instance from the work order tool 117, with a previously existing one, e.g. from the display calibration combined with the gamut compression. Sometimes, depending on the types of FUTs involved, it is necessary to begin anew with its uncomposed source FUTs, and recompose them all.
- the order of composition is to first compose customer modifiable monitor-to-proof ("CMMP") FUTs, such as grey balance, tonal tool or work order FUTs, with inking FUTs, and next with display calibration FUTs, such as 15(a) for a monitor or 15(b) for a printer.
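In outline, the recomposition rule could look like this; compose and the argument names are assumptions, and the ordering is the point:

```python
from functools import reduce

def rebuild_composite(cmmp_futs, inking_fut, display_cal_fut, compose):
    """Recompose a monitor (or printer) tool's composite FUT from its
    uncomposed sources, in the required order: CMMP FUTs first, then
    the inking FUT, then the device calibration FUT.
    compose(a, b) returns a FUT equivalent to applying a, then b."""
    ordered = list(cmmp_futs) + [inking_fut, display_cal_fut]
    return reduce(compose, [f for f in ordered if f is not None])
```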
- the monitor tool 104 gets 452 the original picture data and processes the data through the newly composed FUT and redisplays the altered picture on color monitor 16(a).
- the foregoing changes are the type of changes that the user must make so that the picture is displayed as it will be printed on the chosen printer.
- the user may also want to add an artificial or artistic alteration to the picture, such as to increase the contrast of the midtones of the picture.
- the hand 100 grasps a tonal tool 112 and moves 456 it to the monitor, where it is released and then opens up.
- the user points 458 the hand at the tonal tool 112 to activate it, whereby the tonal tool sends its FUT to the monitor tool 104, which composes a new FUT as described above in connection with step 450.
- the monitor tool applies the new FUT to the data for the picture and displays the further modified picture on the color monitor 16(a).
- the tonal tool is shown in detail in Fig. 13.
- the user may, for instance, desire to increase the brightness and contrast of certain portions of the picture, for instance those presently of medium brightness.
- the tonal tool includes a brightness control 502 and a contrast control 504, a maximum effect control 506, an increasing effect control 508, a decreasing effect control 510, and region 599 of both brightness and contrast control together.
- the user points 462 the hand 100 at the contrast control 504 and presses the R button sending 464 a message to activate the slider control 504.
- the slider control 504 sends 466 a message to the tonal tool 112, indicating its current position.
- the tonal tool 112 computes 468 its FUT based on the location of the contrast slider control 504.
- the means by which the parameters of the FUT vary with respect to the location of the controls is unimportant with respect to the present invention.
- changes in location of the contrast control invoke changes in the FUT such that when the FUT is applied to data, the picture generated changes with respect to its contrast. Similarly with respect to brightness.
- the tonal tool sends 470 the newly computed FUT to the monitor tool as already discussed, and the monitor tool recomputes 472 its FUT including the new FUT from the tonal tool 112 and recomputes 474 and redisplays the picture, indicating the effect of the tonal adjustment. If the user again moves the contrast control 504, at 476 the tonal tool loops back to 466 and generates a new FUT, which is sent to the monitor tool and used as discussed.
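The feedback loop behind steps 466-476 can be summarized in a few lines; the method names here are illustrative only:

```python
def on_contrast_changed(tonal_tool, monitor_tool, slider_pos, picture_loc):
    """One iteration of the slider loop: a new slider position yields a
    new tonal FUT, which the monitor tool folds into its composite FUT
    before reprocessing and redisplaying the picture."""
    tonal_tool.fut = tonal_tool.compute_fut(contrast=slider_pos)  # step 468
    monitor_tool.recompose(tonal_tool.fut)                        # step 472
    monitor_tool.redisplay(picture_loc)                           # step 474
```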
- the pixel elements that will be affected are highlighted in some fashion, for instance all are made orange or all are made white.
- when the R button is released, the actual contrast transformation takes place.
- the user grasps 478 the monitor tool and removes it from the picture object 118.
- the monitor 16(a) will no longer display 480 the picture in question.
- the user grasps the tonal tool 112 with the hand and applies 482 it to the picture object 118.
- the tonal tool gets the picture data from memory and transforms it 484 according to the FUT generated by the tonal tool 112 based on its settings. This data is stored in the location for the picture data, and the picture data is thus permanently changed.
- the user interface of the invention keeps inviolate the data for the picture during the interactive user modifications made while the monitor tool is displaying the picture.
- the original data is kept in one memory location and modified picture data is kept in another memory location.
- a tool room is shown in Fig. 12.
- multiple "rooms" are provided.
- the user can move objects or tools from one room to another by grasping an object and carrying it to a door 121.
- the door opens and the tool or object can be carried to the adjoining room.
- the user can create new rooms and connect them to each other as desired.
- the CHI of the invention includes a room with special properties, referred to as a "tool room," which acts as a room having an endless supply of standard tools.
- the tool room is shown in Fig. 12 and includes one master copy of each tool that will be used, set up according to default parameters.
- the tool room creates a working copy of the tool which the user takes to the new room.
- this creation amounts to the generation of a copy of the data structure for the tool.
- the creation is effected by the activation of an additional processor designed to carry out the activities of the tool.
- the user goes to the tool room and simply takes the tool desired. It is not necessary to consciously keep track of the last copy of a tool, so that it is not inadvertently destroyed or altered. Further, the master copies of the tools are all kept in a central place.
- the invention also includes a method and apparatus for visual or graphical programming.
- the user programs the transform controller 20 by selecting and moving graphical objects on a screen relative to each other. For instance, as described in connection with Fig. 7, the user programmed the transform controller 20 to select first FUTs 15(a), 30, 26 and 24 and to apply those to the data array for purposes of viewing on monitor 16(a). Next, the user programmed the controller 20 to select FUTs 24, 27, 30 and 15(b) and apply them to the data for purposes of printing. All of this programming was done by manipulating graphics rather than by typing commands in a word-based programming language.
- the CHI of the invention is also useful in connection with hardware that operates at a high enough instruction rate per second that it is not necessary to compose multiple FUTs together in order to achieve real time image processing.
- the claimed invention is intended to cover those embodiments that do not include composition of FUTs.
- the invention may be implemented in software or hardware.
- each tool is controlled by a processor or a portion of a processor, which generates signals causing the terminal to display the icon of the tool.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
The apparatus of the invention generates an array of modified data values related to an array of input data values in response to commands from a user. The invention is particularly suited for color image processing. The apparatus includes storage means for storing at least one data transform definition and data processing means for identifying at least one of the transform definitions in response to a user's selection of at least one data transformation and for applying the identified transform definitions to the array of input data values. A user interface is provided. The interface comprises an input device and a display. The display displays a graphical representation of the user instructions and the transform definitions. The user selects and moves on the display the graphical representation of at least one of the transform definitions. A signal processor identifies the selected at least one of the transform definitions and applies the selected transform definitions to the array of input data values to arrive at modified values. Thus, the tonal properties of a picture can be changed. The invention further comprises means for generating a single composite transform definition, such that application of the single composite transform definition to the input data values generates the same output data values as would the individual serial application of a selected at least two transform definitions.
Description
COMPUTER/HUMAN INTERFACE FOR PERFORMING DATA TRANSFORMATIONS BY MANIPULATING GRAPHICAL OBJECTS ON A VIDEO DISPLAY
Background of The Invention
The invention relates generally to the interface between a human operator and a computer hardware and software system and in particular to a system for performing data transformations by manipulating graphical objects on a video display monitor. A preferred embodiment of the invention relates to color image processing systems employing look-up tables for transforming from a first coordinate space to a second coordinate space. Color image processing systems typically include an input device for generating an electronic representation of a color image. The input device provides the electronic image representation to a computer workstation, which processes the image in accordance with a user's instructions and forwards the processed image to a color monitor for display. The user interacts with the workstation, typically through input devices such as a mouse and keyboard, and an output device such as a control monitor, repeatedly instructing the computer to adjust the electronic image until the color monitor displays a desired image. The user can also generate a hard copy of the image by instructing the workstation to provide the processed electronic image to a selected printing device. The electronic image processed by the workstation consists of a two dimensional array of picture elements (pixels). The color of each pixel may be represented in any of a variety of color notations or "color spaces." For example, the RGB color space represents pixel colors according to the relative contributions of three primary colors, red, green and blue. This color notation is commonly used in connection with color
monitors, since the three parameters (R, G, B) correspond to the mechanism and signals by which the monitor generates color. More specifically, each pixel of the monitor's display contains three primary color phosphors. To generate a color defined by a set of RGB values, the monitor stimulates each primary phosphor with an intensity determined by the corresponding R, G, B value.
Similarly, the CMYK color space represents color using four variables, C, M, Y, K, each corresponding to the relative (subtractive) contribution of the colorants, cyan, magenta, yellow and black. This notation is commonly used in connection with printing devices since each parameter C, M, Y and K determines the amount of a colorant (e.g. ink, dye) used by the printer in producing a desired color. Black is used to increase printed ink density and minimize the use of costly colored colorants, in situations where the overlay of multiple colorants would appear substantially black.
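By way of illustration only (the patent does not prescribe a formula), a textbook grey-component-replacement rule shows how black ink can stand in for the common portion of cyan, magenta and yellow; the function below is a hypothetical Python sketch, not part of the disclosed apparatus:

```python
def rgb_to_cmyk(r, g, b):
    """Convert RGB in [0, 1] to a naive CMYK with full black replacement."""
    c, m, y = 1.0 - r, 1.0 - g, 1.0 - b   # subtractive complements of RGB
    k = min(c, m, y)                      # the grey component becomes black ink
    if k == 1.0:                          # pure black needs no colored colorant
        return 0.0, 0.0, 0.0, 1.0
    scale = 1.0 - k                       # rescale the remaining colorant
    return (c - k) / scale, (m - k) / scale, (y - k) / scale, k

print(rgb_to_cmyk(0.2, 0.3, 0.4))  # dark slate blue: mostly K, some C and M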
Color spaces such as linear RGB and CMYK are useful for image scanning devices and image printing devices, respectively, since each parameter of the color space closely corresponds to a physical mechanism by which these devices measure and generate color.
However, for a variety of reasons, these color spaces may not be well suited for processing color images. For example, as shown in Figure 1, the three parameters R, G, B define a three dimensional, linear color space, each point within the space corresponding to a unique color. At various points within the space, a selected change in the values of the parameters may not result in a commensurate change in color perceived by a human viewer. For example, at one location in the space, increasing the parameter R by n units yields little perceived change in color. Yet, at another point in the space, increasing R by the same n units yields a
dramatic change in the perceived color. Accordingly, it may be difficult for a user to manipulate the primaries R, G, B, to achieve a desired change in color. In response to this problem, a variety of perceptually based color spaces have been proposed for defining color in terms of parameters which more closely correspond to the manner in which humans perceive color. The most prominent perceptually based standards for color representation are collectively referred to as the CIE system promulgated by the International Commission on Illumination.
The "u'v'L*" space, for example, is a three dimensional color space defined by the parameters u', v', L*. The chromaticity of each color in this space is uniformly characterized by the parameters u', v1. The third parameter, L*, denotes perceptually uniform variations in the lightness of the color, (e.g., L*=0 is black, L*=100 is white) . (For simplicity in the following discussion, the reference space will be referred to, simply as u, v, L.)
To process a color image change within the "uvL" color space, the workstation simply maps each point u0, v0, L0 in the color space to a new point u1, v1, L1. For example, if the user desires to display the image on a monitor, he may wish to adjust the colors of the image to compensate for lighting conditions of the room. Accordingly, the user selects a transform which maps each point u0, v0, L0 to a new point having the same values u0, v0 but having greater luminance value L1.
The image processing system typically contains a predetermined transform definition for each such color image transformation. Based on a selected definition, the system maps certain points of the color space to new points. Accordingly, the color at each pixel of an electronic image is sequentially mapped in accordance
with the transform definition to yield the desired visual effect. To perform another image transformation, the system remaps the color values to yet another point in accordance with a second transform definition. Any number of transformations can thus be performed by sequentially mapping color values according to the available predetermined transform definitions. However, such sequential processing of images can be extremely time consuming, particularly if a large number of predetermined transforms are selected and a large number of data points must be transformed.
Copending, co-assigned application entitled "Color Image Processing System For Preparing a Composite Image Transformation Module For Performing a Plurality of Selected Image Transformations," U.S.S.N. , incorporated by reference herein, discloses a method and apparatus for generating an array of modified pixel values in response to an array of input pixel values, vastly reducing the required computations. The method features the steps of receiving a user's selection of an image transformation to be performed on the array of input pixel values. In response to the user's selections, a plurality of transform definitions are automatically selected from stored transform definitions. Each transform definition includes sample values in three dimensions representing an input/output relation of a predetermined image transformation. From these selected transform definitions, a composite transform definition is generated containing sample values of an input/output relation of a composite image transformation which is equivalent to the several image transformations effectively selected by the user. The composite transform is preferably compiled and implemented sufficiently quickly (e.g., in real time) to allow the user to interactively change his selections until a desired composite transformation is created.
To implement a composite transform definition, at least one sample value is selected from the composite transform definition, based on the value of an input color to be modified. A processed color value is then determined based on the at least one selected sample value. For example, in one embodiment, a nearest neighbor of the input color value is selected as the sample value. In another embodiment, a plurality of sample values are selected and the processed color value is determined by interpolating between these values.
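A minimal sketch of the two sampling strategies just described, assuming a transform definition is held as an N x N x N grid of output triples over a normalized input cube; the function names and the 17-point grid size are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def nearest(lut, p):
    """Select the single sample nearest the input color."""
    n = lut.shape[0]
    idx = np.clip(np.rint(np.asarray(p) * (n - 1)).astype(int), 0, n - 1)
    return lut[tuple(idx)]

def trilinear(lut, p):
    """Interpolate between the 8 samples surrounding the input color."""
    n = lut.shape[0]
    x = np.clip(np.asarray(p, float) * (n - 1), 0, n - 1 - 1e-9)
    i0 = x.astype(int)            # lower grid corner of the enclosing cell
    f = x - i0                    # fractional position within the cell
    out = np.zeros(lut.shape[-1])
    for corner in range(8):       # blend the 8 corner samples
        bits = [(corner >> k) & 1 for k in range(3)]
        w = np.prod([f[k] if bits[k] else 1 - f[k] for k in range(3)])
        out += w * lut[i0[0] + bits[0], i0[1] + bits[1], i0[2] + bits[2]]
    return out

# Identity table on a 17-point grid, a size commonly used for 3-D tables.
g = np.linspace(0, 1, 17)
lut = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
print(trilinear(lut, (0.25, 0.5, 0.8)))  # ~[0.25, 0.5, 0.8]
print(nearest(lut, (0.25, 0.5, 0.8)))    # ~[0.25, 0.5, 0.8125], snapped to grid
```

The printed difference between the two calls shows the trade-off the text implies: nearest-neighbor is cheaper but quantizes to the grid, while interpolation reconstructs intermediate colors.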
In another aspect of the method and apparatus described in the copending application, the stored transform definitions may include custom transform definitions made in accordance with a user's instruction as well as the predetermined transform definitions. A user's instructions specifying desired color changes are received. A custom transform definition is then prepared for implementing the selected color changes. Such custom transform definitions may be created by modifying predetermined transform definitions or by generating entirely new transform definitions based on the user's input. In the preferred embodiment, the three-dimensional transform definitions are functional look-up tables and are referred to herein as "FUTs."
In order to enable a human operator to use predetermined FUTs and to create custom FUTs, it is necessary to provide an interface between the computer and the human that is easy and intuitive to use. Known interfaces for similar systems employ video display graphical objects, but the manipulation of the graphical objects does not correspond intuitively to the real world tools or data structures being controlled, or to the data transformations being
implemented. Further, they require a multiplicity of different icons and interaction commands. As used herein, a "CHI" shall refer to a computer/human interface between a computer system and a human operator.
Thus, the several objects of the invention are to provide a CHI for a system that transforms data, for example a color image processing system; which presents a graphical representation of the tools and objects that make up the system; and which corresponds intuitively to the real world objects and the data transformations being controlled. It is a further object of the invention to provide a CHI that minimizes the number of icon representations of objects and tools and standardizes interaction and commands among them. It is a further object of the invention to provide a convenient means for creating various custom implementations of standardized master tools. Yet a further object of the invention is to provide a CHI that permits the user to combine a plurality of FUTs into a composite FUT, simply by manipulating graphical objects on a video display. It is yet a further object of the invention to provide a convenient, graphical means for altering the parameters of a data FUT and, in particular, to graphically alter the degree to which such a transformation will be applied to the data in question.
Other objects, features and advantages of the invention are apparent from the following description of particular preferred embodiments taken together with the drawings.
Summary of the Invention
The apparatus of the invention, in a first preferred embodiment, generates an array of modified data values related to an array of input data values in response to commands from a user. The apparatus
includes storage means for storing at least one data transform definition and data processing means for identifying at least one of the transform definitions in response to a user's selection of at least one data transformation and for applying the identified transform definitions to the array of input data values. Interface means are provided between the user and the data processing means for receiving instructions from the user and for indicating the outputs of the data processing means. The interface means comprises an input device for the user to transmit instructions, signal generating means for generating signals representative of the one transform definition and the user instructions, and display means. The display means displays a graphical representation of the signals representing the user instructions and the transform definitions. Signal processing means are also included for receiving the user instructions to select and move on the display means the graphical representation of at least one of said transform definitions. The signal processing means generates commands to the data processing means to identify the selected at least one transform definition and to apply the selected transform definitions to the array of input data values to arrive at modified values.
A second preferred embodiment of the invention further comprises means for generating a single composite transform definition, such that application of the single composite transform definition to the input data values generates the same output data values as would the individual serial application of a selected at least two transform definitions.
In a third preferred embodiment, the invention interface includes means for selecting a primary range of the input data array to be transformed according to a selected transform definition and designating at
least one secondary range of the input data array to be transformed according to a transform definition that provides a selected rate of change of the transformation of data values within the secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
In a fourth preferred embodiment of the invention, the interface includes means for sending a message to and means for receiving a message from tools and objects, at least one data transform definition tool having means for sending a message to and receiving a message from other tools and objects and an object representing the input data values having means for sending a message to and means for receiving a message from tools, where the message sent by the transform tool to the data object includes a message to apply a selected transform to the input data values.
In a fifth preferred embodiment of the apparatus of the invention, the user interface further includes signal processing means for receiving from the user commands to identify on the display means the graphical representation of at least one master transform definition and to generate commands to the transform controller to generate a changeable working transform definition that is the same as the identified master transform definition except for its ability to be changed.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the RGB color space.
Figure 2 is a block diagram of an image processing system used in connection with the claimed invention.
Figure 3 is a block diagram of a variety of transform operations performed in accordance with instructions generated by the user through a CHI of the invention.
Figure 4 is a flow chart schematically showing the basic operation of a hand of the CHI of the invention.
Figure 5 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are activated or applied.
Figure 6 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are released on an object.
Figures 7a-7f are a flow chart schematically showing the basic steps a user would follow, using the method of the invention, to display a picture on a monitor, try various modifications to that picture and finally make permanent the alterations to the picture.
Figure 8 is a flow chart schematically showing the general steps the user would take according to the CHI of the invention to change a FUT relating to the grey balance of a picture.
Figure 9 is a schematic showing basic elements of the CHI, including a room, door and picture objects and hand, input, output, adjustment, grey balance, tonal, monitor, work order, memory storage, and inking tools.
Figure 10 is a schematic representation of the hardware elements of the CHI of the invention.
Figure 11 is a schematic showing the basic actions that will take place as a result of a user's activation of various combinations of two mouse buttons.
Figure 12 is a schematic representation of the tool room of the CHI of the invention.
Figure 13 is a schematic representation of an activated tonal tool of the CHI of the invention.
Figure 14 is a schematic representation of an activated grey balance tool of the CHI of the invention.
Figure 15 is a schematic representation of an activated monitor tool of the CHI of the invention applied to a picture object.
Figure 16 is a schematic representation of an activated work order tool, including a grey balance tool adjusted to affect shadows, another grey balance tool adjusted to affect highlights and a tonal tool adjusted to affect the entire range.
Description of The Preferred Embodiment
The invention has applicability to a wide range of data processing applications. A preferred application is in the field of color image processing, which will be used as an illustrative example. Referring to Figure 2, an image processing system 8 includes a plurality of input devices 10 for scanning a source image (such as a photograph, film, or a scene within the view of a camera) to create an electronic digital representation of the image. The electronic representation is provided to an image processor 14 which adjusts the colors of the electronic image and either stores the adjusted image on storage device 17 (e.g., for later retrieval and processing) or forwards it to various output devices 16 for printing, display or transmission over a network 15 or any other communication channel.
Image processor 14 is connected to a user interface 22 through which a user indicates the desired transformations to be performed on an electronic image. In general, image processor 14 and user interface 22 are implemented with a properly programmed general purpose computer or computer workstation. The principal aspect of the present invention is a user interface, described below. In response to the user's request, as discussed below, in one mode of operation, a transform controller 20 selects a set of FUTs from a collection 19 of stored
predetermined FUTs. (Rather than being implemented in software, transform controller 20 may be implemented as separate hardware elements.) Each predetermined FUT describes a unique transform for mapping the values representing each color of an image in a first color space to a different set of values (e.g., a different color in the same or a different color space) thus yielding a desired image transformation. The user can also create his own custom FUT in accordance with the invention. For example, the user interface allows the user to select a set of colors to be changed (e.g., from a palette of possible colors). The user can then specify the desired changes to these colors (e.g., a specific increase in the brightness). In response to these selections, the controller can then prepare a custom FUT corresponding to the user's selections. Such custom transforms may be created by modifying predetermined transforms or by generating entirely new transforms based on the user's input. In accordance with the user's instructions, controller 20 can compose the selected FUTs into a single composite FUT 28 as illustrated in Figure 3 for processing and displaying (or FUT 32 for printing) the image from the input device without intermediate storage of the image. This selection and composition is performed with sufficient speed to allow the user to interact with the system, altering his selections until the system displays a desired image. Processing at a speed whereby the user experiences only a minimal or no delay between making a command and the system's response to that command is referred to as "real-time" processing.
Referring again to Figure 2, controller 20 provides the composite FUT 28 to transform processor 18 which implements the transform in accordance with the composite FUT 28. Transform processor 18 is also preferably implemented in software on a general purpose
computer, performing any transform specified in a given FUT, although it may also be composed of individual hardware elements.
Referring to Figure 3, the user, for example, may instruct image processor 14 to accept an electronic image from scanner 10(a), perform selected transformations of the image, and display the transformed image on color monitor 16(a). In response, controller 20 first (and preferably automatically) selects input FUT 12(a) for converting the electronic image from the particular scanner 10(a) into a "reference" color space used in performing subsequent transformations. More specifically, the defined input transform maps each point in the scanner's RGB space to a corresponding point in the perceptually based color space, u, v, L (i.e., the "reference" space). In performing this translation, the input transform compensates for idiosyncrasies of the associated scanner 10(a). For example, in response to a given color, each scanner 10(a), 10(b) may generate coordinates in the RGB space different from each other. Accordingly, the input FUTs 12(a), 12(b) (see Fig. 4) are calibrated to compensate for idiosyncrasies of each scanner such that each scanner generates the same point in the reference space uvL when scanning the same color. Typically, FUT 12(a) is set once for a particular scanner and is not modified by the user thereafter. However, the user may select from a limited number of such FUTs, depending on input conditions. As discussed below, input transform definition 12(a) can be applied to the input data by transform processor 18 alternatively before image store 17, or after as part of a composite transform definition. The user may request that the image from scanner 10(a), after calibration 12(a), receive further "input" processing through modifiable input FUT 13, and then be
stored on storage device 17 before being further processed and displayed. The user modifiable input transform definition 13 may be used by a user, for instance, to change the effects of calibration transform definition 12(a), if it does not suit his particular needs. Accordingly, after controller 20 instructs transform processor 18 to process the electronic image according to input FUT 12(a), it then instructs processing according to input FUT 13. The resultant calibrated, transformed image is then stored in a storage device 17. The electronic image is stored as an array of data values in reference space for each segment of the picture.
At a later time, the user instructs controller 20 to modify the image with a "pleasing" transform and display the modified image on monitor 16(a) as it would appear if printed on printer 16(b). Accordingly, controller 20 automatically selects three FUTs 26, 30, and 15(a) required to implement the task. Display calibration FUT 15(a) is designed to convert the image from reference color space into the RGB color space required by the specific monitor 16(a), calibrating the image to compensate for the characteristics of the monitor so that it appears to the user on display 16(a) as it would be printed on printer 16(b). As with input calibration FUT 12(a), it is not usual for the user to change this FUT 15(a). However, the user may select from a group of suitable FUTs.
Gamut compression FUT 30 is designed to modify the image so that it will appear the same on monitor 16(a) as if printed on printer 16(b). For example, in most cases, electronic printers cannot print as wide a range of colors as a monitor can display. Thus, it is not useful to display these colors on a monitor. If the electronic image contains colors which the printer cannot display (but the monitor can), the gamut compression transform maps these colors to acceptably
similar colors within the printer's gamut. User modifiable display FUT 26 is used to adjust the calibration of display 16(a) if the standard calibration FUT 15(a) does not suit its needs, for instance with respect to room light. FUT 26 adjusts all colors to compensate for the ambient lighting conditions surrounding display 16(a).
The three FUTs 26, 30 and 15(a) are typically selected once for a given site to accommodate the monitor, room and printer. After selecting FUTs 26, 30, and 15(a), controller 20 next selects pleasing FUT 24 corresponding to the color transformations requested by the user for a specific image. Pleasing FUT 24, for example, may increase the luminance parameter L of certain blue colors in the image. A user might, alternatively, for instance, be processing a picture taken by a photographer who habitually overexposes the photos. A different pleasing transform 24 might decrease the brightness over all ranges to counteract this incorrect overexposure.
Controller 20 composes the four selected FUTs into a single composite FUT 28. It then fetches the previously stored data array representing the original image from storage 17 and instructs processor 18 to process the image according to composite FUT 28 for display on monitor 16(a).
The selected transformations could be performed sequentially by transform processor 18 by processing the image with each FUT in the sequence. However, this could result in a very large number of required calculations. To expedite processing, according to the co-assigned invention identified above, described in
U.S.S.N. , controller 20 dynamically composes the FUTs 24, 26, 30 and 15(a) into the composite FUT 28. The electronic image from scanner 10(a) is then processed in a single transformation step in accordance
with composite FUT 28 and thereafter is displayed on color monitor 16(a).
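The composition step can be pictured as baking the chain of transforms into one table by pushing every grid sample of the input space through the transforms in order, so that one table lookup later replaces four. The sketch below uses toy stand-ins for the pleasing, gamut, room-adjustment and calibration FUTs; all names and formulas are hypothetical, chosen only to show the mechanism:

```python
import numpy as np

def compose(transforms, n=17):
    """Bake a chain of point transforms into one n^3 sample table."""
    g = np.linspace(0.0, 1.0, n)
    grid = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
    samples = grid.reshape(-1, 3)
    for t in transforms:              # serial application, performed once
        samples = np.apply_along_axis(t, 1, samples)
    return samples.reshape(n, n, n, 3)

# Toy stand-ins for pleasing FUT 24, gamut FUT 30, display-adjust FUT 26
# and calibration FUT 15(a):
pleasing = lambda c: np.clip(c * [1.0, 1.0, 1.1], 0, 1)   # boost third channel
gamut    = lambda c: np.clip(c, 0.02, 0.98)               # squeeze the range
adjust   = lambda c: c ** 0.95                            # room-light tweak
calib    = lambda c: np.clip(1.02 * c - 0.01, 0, 1)       # device fit

composite = compose([pleasing, gamut, adjust, calib])
print(composite.shape)   # (17, 17, 17, 3): one table replacing four transforms
```

The up-front cost is one pass over the table's grid points, after which every pixel needs only a single lookup rather than four, which is what makes interactive-speed processing feasible.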
In the above discussion, transform processor 18 applied input calibration FUT 12(a) and user modifiable input FUT 13 sequentially to the input data and then the data was stored. Afterward, composite FUT 28 was applied to the data which had already been transformed through FUTs 12(a) and 13. Alternatively, transform controller 20 can compose FUTs 12(a) and 13 together before image store 17 and instruct transform processor 18 to apply the composed FUT to the data and then store the resultant. The invention may also be practiced by storing the data in image store 17 just as it is scanned, without passing through any FUTs, and then including those FUTs 12(a) and 13 in the group of FUTs composed by transform controller 20 after image store 17.
The user, if satisfied with the displayed image, next instructs the controller to generate a hard copy of the image on printer 16(b). Accordingly, the controller selects the FUTs 24, 30 and 15(b). The first two of the three definitions, pleasing FUT 24 and gamut compression FUT 30, are identical to the pleasing and gamut FUTs used in generating the display image above. Printer calibration FUT 15(b), however, is designed to convert the image into the CMYK color space, calibrating the image to compensate for characteristics of a specific printer 16(b), which have been previously determined. If input FUTs 12(b) and 13 had not been applied to the data before image store 17, then they too would be composed with FUTs 24, 30 and 15(b) to create output composite FUT 32.
It is also possible at this stage for the user to modify the calibration of the output device by adding user modifiable output FUT 27. This FUT would be used, for instance, if the user had a supply of abnormal paper. It might be possible to obtain normal,
acceptable prints by adjusting the FUT 27 to compensate for the abnormality.
The foregoing description is of the hardware and software components used to implement a user's commands. The subject invention relates principally to the interface (22 Fig. 2) between a user and the hardware elements. As mentioned above, although the invention has broad applicability to many types of data array processing, a particularly preferred embodiment is with respect to color image processing. Such an embodiment of a user interface is described below.
A typical user interface 22 (Fig. 10) includes an input pointing device, such as a mouse 120 or trackball (not shown) having at least one and beneficially two user activated buttons L, R. A video display 122 is used to display icons or graphical objects representing data arrays that the user will instruct the transform processor 18 to use for transformation operations and the results of his instructions on those data arrays. The mouse 120 or trackball can be moved by the user. A sensor in the mouse senses its relative motions and transmits signals representing those motions to a processor that causes the display 122 to display a correspondingly moving cursor 100. A keyboard 124 may also be used for this purpose. A separate processor
126 may be used, or the data processor that handles the FUT implementation discussed above may also be used to generate the display on terminal 122. It should be noted that in the normal case, the terminal 122 is not the same as the monitor 16(a) upon which the picture image is displayed, although both are part of the user interface. However, it is possible to dedicate a portion of a single display device to both tasks, according to well known techniques. In a preferred embodiment of the invention, the cursor 100 (Fig. 9) is an icon that resembles a hand. As explained below, the user uses the hand to
accomplish certain tasks. The user interface also includes tools, which are displayed on the terminal as graphical icons. These tools include a scanner input tool 102, a monitor output tool 104, a hard copy printer tool 106, a memory storage tool 108, a grey balance tool 110, a tonality tool 112, a work order tool 117, an adjustment tool 114 and an inking tool 116. A picture object 118 is also shown.
The tools will be described in more detail below. In brief, a scanner input tool allows the user to control a scanner to scan pictures into the system. A picture object allows the user to select data for a given picture to be treated. A monitor tool allows the user to control a color monitor to display and view pictures as they might be modified by use of the other tools. An output tool allows the user to control a printer to make hard copies such as prints or films of the image displayed on the monitor. A memory storage tool allows the user to direct that image data be written into or read from computer disk memory. A grey balance tool allows the user to add or remove color cast from a picture, either as displayed on the monitor or as stored or printed. A tonal tool allows the user to adjust the brightness and contrast of a picture over specified tonal ranges. A work order tool allows the user to combine the effects of several tools into one tool. An inking tool specifies the press/proof conditions under which to display or output a picture and permits control of under color removal. The processor of the user interface generates icons displayed on monitor 122 representing each of the tools shown in Fig. 9. In a computer software implementation, each tool is controlled by a program entity according to methods well known in the art. As discussed below, there can be multiple instances of the same tool, either identically parameterized, or variably parameterized. Each instance of a tool is
represented by a tool data element, and each type of tool is represented by a software routine. The tools may be thought of as independent processors that receive instructions from the user and act according to those instructions.
In general, the user activates and implements the tools by manipulating the icons of the tools with the hand icon, which is a cursor. As shown in Fig. 11, the user can invoke in the hand two states of openness, i.e. open or closed, and two states of activation, i.e. activated and not activated. In a preferred implementation, the states are invoked by use of two buttons L (left) or R (right) on mouse 120. The hand 100 can be closed by pressing the left mouse button. The hand icon has a "hot spot" (i.e. the "+" in Fig. 11, bottom). If the hot spot of the hand icon is overlapping any part of any other icon on the terminal screen 122 at the instant it is closed, it will grasp that object, as shown at 130. The icon representing the hand changes shape to show a hand grasping the tool or object being grasped, to signal to the user the grasped state of the tool. (It should be noted that as shown in Fig. 11, the drawing of a hand 98 is not part of the user interface, but is a representation of an actual hand of a human user, pressing the buttons of the mouse 120.)
This representational aspect of the user interface, i.e. the change of appearance of the hand icon when it changes state, closely represents the actions of three dimensional objects in the real world. This aspect is implemented in other fashions, as will be discussed, with respect to other actions implemented by the user. Thus, the CHI of the invention provides the user with an environment that behaves very closely to the real, physical world with which the user is intuitively familiar.
The tools are program entities that have attributes, such as display location, applied state, active state, control settings, etc. Each tool has an icon associated with it. The user actually causes the tool object, the program entity, to move in a space that is represented by the display 122. This is shown to the user by a change in appearance of the icon, e.g. it moves on the screen, or opens up to reveal additional controls. In the following discussion, reference will be made to the tool itself or the hand (i.e. the program entity) unless the tool icon is specifically identified.
If the left button is released, the hand opens 132 and its icon changes shape. If the hand was grasping an object before it was opened, it will release that object. If the hand is open and the right button is pushed, the hand activates any object or tool (or portion of a tool) to which its pointing finger is adjacent 134. The effect of activating a tool differs from tool to tool, as is discussed.
If the right button is pressed while the hand is grasping a tool, the grasped tool is applied 136. If the grasped tool is adjacent to another object or another tool to which the grasped tool can be applied, then some activity will occur with respect to that second tool or object. If the grasped tool is not adjacent to any object or tool upon which it can be applied, the grasped tool and hand icon pair will change shape to indicate to the user that the grasped tool is being applied, but there is no object for its application. For instance, the adjustment tool 114 rotates as would a screwdriver in use, to illustrate its use and aid in learning.
It will be understood by those of ordinary skill in the art that the activities of and relationships among the tools and objects can be described in terms
of messages being sent to and received from one another.
The interaction between the user, the hand 100 and other tools or objects in general is shown schematically in Fig. 4. The hand is initialized 200 and is made open 202. The system evaluates 204 the user command. If the user commands the hand to grasp (by pressing the L button), at 208, the system evaluates if there is an object at that display location (which location would be part of the data parameters defining the program or processor entity that constitutes that object). If there is, at 206 the system sends a message identifying the hand's location on the display 122 along with a message that whatever is at that location is being grasped. Then at 210 the object receives the instruction to be grasped and sends back a message that something has been grasped and what it is. The hand icon changes shape or replaces the cursor with a view of the object held by a hand, and the grasped object moves with the hand cursor 100 in response to the user's motion of the mouse.
If at 208 there were no object, the user interface operating system would indicate that there was no object, and the hand would close empty-handed at 212. Rather than the system (or a tool, as the case may be) sending a message including the hand's (or a tool's) location, which message is intercepted by the object at that location, it is also possible for the system (or tool) to examine the location beneath the hand, and determine if there is another object there. If so, the system or tool sends the message. If not, it sends no message. In order to simplify the following discussion, the general case will be discussed as if the hand (or a tool) simply sends the message to the second tool, without regard to the manner by which it is determined if a second tool is at
the location or the fact that the operating system, rather than the hand, sends and receives messages.
At 214, if the user had next commanded the hand to open, it would return to 202 and be made open. If at 214 the user had transmitted any other command, such as no command, or a movement of the mouse, the hand 100 would remain closed.
After 210, the hand, which is grasping something, would evaluate the user command at 216. If the user had pressed the R button, directing the hand 100 to apply the held tool or object, the hand 100 would send 217 an "Apply" message to the held tool or object to apply itself. The held object would at 218 act according to its environment upon receiving the command to apply itself. A specific implementation of a tool responding to an "apply" message is shown schematically in Fig. 5 and is discussed below. After the object has been applied, the hand 100 again evaluates 216 the user command and proceeds as described.
If at 216 the user command is to "release" (also referred to as "drop") the object, at 227 the hand 100 sends a "Drop" message to the grasped object that it is released, and the hand drops 228 the object. The specific routine for the object being released is invoked at 230. A routine for a specific object having been dropped is shown schematically at Fig. 6. After the completion of such a routine, the hand 100 is made open at 202 and again evaluates a user command. If at 216 the user command is other than
"release" or "apply", such as "no command", or a "move" command, the hand again evaluates the user command at 216. If, for instance, the user command is "move", the hand returns to 216 and then, if an "apply" command is in place, it again goes to the specific routine for the application of that object.
At 204, if, rather than "grasp", the user command had been "activate", the hand 100 sends 220 a message that whatever is at that location is being activated and the hand icon changes shape, e.g. to a hand with a pointing finger. At 222, if there is an object at that location then the object receives 224 the instruction to be activated. The routine for the particular object, when activated, is invoked 226. Such a procedure for a particular activated object is shown schematically at Fig. 5. In some cases, an object can be applied without being activated, depending on whether that condition would be an appropriate analogy to the real world.
At 222, if the hand 100 is not over an object, it will change its shape to point to nothing at 240 to illustrate its use and aid in learning. At 242 the hand will evaluate the user command. If the user command is "deactivate", the hand returns to 202 and is made open, proceeding as before. If the command is other than "deactivate", such as "no command" or a
"move" command, then the hand will remain in the active configuration, and will move with the mouse motion.
As discussed above with respect to Fig. 4, if a tool is released or activated, that tool invokes its own specific routine for being released or activated. The routine for released and activated tools of the type that "create" a FUT is shown schematically at Fig. 5. Such tools include the grey balance tool 110, the tonal tool 112 and the work order tool 117. It will be understood by those of ordinary skill in the art that tools that "create" FUTs do so by instructing transform controller 20 to create a FUT. Thus, the tool creates a FUT, albeit at a distance, and through the agency of another program entity, the transform controller 20; similarly, with respect to modifying and composing FUTs.
At 300, the tool enters the routine from the stage discussed above at A or C of Fig. 4. The tool receives 304 a message, if any, from the target. The tool evaluates 306 the message. If 308 there is no message from a target, then the tool returns to Fig. 4. Such a situation could arise if there is no object under the tool, or if the object under the tool is not the type that can receive a FUT. If the message is "process the picture at memory location X" the tool gets 310 the data at memory location X and transforms 312 the data according to the FUT then associated with the tool. The tool writes 314 the data to the memory location X and returns 316 to Fig. 4.
If the message at 306 is "send your FUT" then the tool computes a new FUT if necessary 318 and sends 320 the FUT to the target object. The tool then returns 322 to Fig. 4. The target object will proceed along its own command path. A typical such command path is discussed with respect to a monitor tool 104 and Fig. 7.
The routine for a dropped object is shown schematically at Fig. 6, which is invoked at B of Fig. 4. At 324 the object enters. The tool receives 328 a message from the object. The tool evaluates 330 the received message. If the received message is "send your FUT," the tool sends 332 its FUT and returns 334. If at 330 the message is anything other than "send your FUT, " the tool returns at 334 without sending its FUT.
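A hedged sketch of the two message cases of Figs. 5 and 6 — "process the picture at memory location X" and "send your FUT" — with memory modeled as a dictionary; the "process:<location>" message format and all class and method names are assumptions made for illustration:

```python
class FutTool:
    """Stand-in for a FUT-creating tool (grey balance, tonal, work order)."""
    def __init__(self, fut):
        self.fut = fut            # a callable standing in for a FUT
        self.dirty = False        # set when a control has been moved

    def on_message(self, message, memory=None, target=None):
        if message is None:
            return                                    # no target (308)
        kind, _, location = message.partition(":")
        if kind == "process":                         # "process:<location>"
            data = memory[location]                   # get the data (310)
            memory[location] = [self.fut(v) for v in data]  # transform (312), write (314)
        elif kind == "send your FUT":
            if self.dirty:                            # recompute (318) if needed
                self.recompute()
            target.receive_fut(self.fut)              # send (320)

    def recompute(self):
        self.dirty = False        # placeholder for the Fig. 8 control logic

class MonitorStandIn:
    def receive_fut(self, fut):
        print("monitor tool received a FUT; sample:", fut(0.5))

memory = {"X": [0.1, 0.5, 0.9]}
tool = FutTool(lambda v: min(1.0, v * 1.2))   # e.g. brighten by 20%
tool.on_message("process:X", memory=memory)
print(memory["X"])                            # approximately [0.12, 0.6, 1.0]
tool.on_message("send your FUT", target=MonitorStandIn())
```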
As indicated above at 318 of Fig. 5, in some cases, the tool must compute a new FUT. This routine is shown schematically in Fig. 8 with respect to the grey balance tool 110, shown in more detail in Fig. 14.
The grey balance tool 110 is used to add or remove a color cast to an image. The grey balance tool permits a color cast change with respect to all or only specified parts of the tonal range of an image. For instance, if the highlights of an image are too blue,
it is possible to reduce the blueness of only the highlights of the picture, leaving untouched the blueness of all other tonal portions of the picture, e.g. the shadows and midtones. The grey balance tool 110 is divided into zones, which act as controls to be separately activated by the user. The grey balance tool 110 (Fig. 14) includes a color control 520, a maximum effect control 522 and decreasing and increasing effect controls 526, 524 respectively.
To use the grey balance tool 110, the user indicates with the hand 100 a location on the color circle 520, for instance as marked by the small x. The color circle represents all hues from neutral to their most saturated forms; the center represents neutral and the outside edge represents the most saturated form of each hue. Hues vary continuously around the circle. The letters C, G, Y, R, M, B stand for cyan, green, yellow, red, magenta and blue, respectively. Pointing with the activated hand 100 at the center of the color circle indicates no color change. Pointing the hand 100 at a location away from the center indicates the amount of a color change (not the absolute amount of color cast). Thus, if the user wants to remove a purple color cast, he adds a yellowish green color cast to all hues in the picture, by pointing at the spot indicated by the x. If the user finds that the added cast was correct in color, but too weak in saturation, the user points further out along the radius on which x lies, until the desired color cast is achieved.
The user must also select the range of tones over which the color cast is desired. This is done with adjustment of the maximum effect control 522, increasing effect control 524 and decreasing effect control 526. The scale 528 indicates schematically what range of tonality will be affected by application of the three other controls. The shadows are indicated on
the left of the scale 528 and the highlights on the right. Because, as shown in this example, the maximum effect control extends equally about the mid region of tonality, the full effect of the yellowish green color shift will take effect only in regions of midrange brightness. The slope of the edges of increasing effect control 524 and decreasing effect control 526 can be moved by activating the hand in that region and moving the hand while activated. If they were both moved to be horizontal, then the range of maximum effect would span the tonal range and the indicated color cast change would be applied fully to all tonality ranges, from the darkest shadows to the lightest highlights. If, however, the increasing effect control 524 is inclined as indicated, then tonal regions that are only slightly darker than the mid tonal region indicated by the maximum effect control 522 will have their color cast changed, but at less than the full change. The further from the midtones that a tonal range is, the less will there be an effect in parts of the image having that tone. Inclining the increasing effect control 524 more steeply, so that it does not intersect the side margin of the panel, as shown with respect to the grey balance tool 110b of Fig. 16, will result in no color change whatsoever at the darkest shadows.
The decreasing effect control 526 works similarly with respect to the other side of the area of maximum effect. It is also possible, in a preferred embodiment, to move the area of maximum effect from left to right or vice versa, i.e. from darker to lighter tonalities, by activating the maximum effect control 522 with the hand and moving it side to side in the desired direction. The width of the region of maximum effect can also be widened or narrowed by activating control 522 and moving up or down. Both side to side and up and down motions may be done in combination.
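The three range controls can be read as defining a per-tone weight: full effect inside the maximum-effect band, and linear ramps on either side whose slopes come from the increasing- and decreasing-effect controls. The patent describes this behaviour rather than a formula, so the trapezoidal weighting below is only one plausible reading, with all parameter names assumed:

```python
def tonal_weight(tone, lo, hi, ramp_dark, ramp_light):
    """Weight in [0, 1] for a tone in [0, 1] (0 = shadow, 1 = highlight)."""
    if lo <= tone <= hi:
        return 1.0                            # inside the maximum-effect band
    if tone < lo:                             # darker side: increasing effect
        if ramp_dark <= 0:
            return 0.0
        return max(0.0, 1.0 - (lo - tone) / ramp_dark)
    if ramp_light <= 0:                       # lighter side: decreasing effect
        return 0.0
    return max(0.0, 1.0 - (tone - hi) / ramp_light)

def apply_cast(pixel_uvl, cast_uv, lo=0.4, hi=0.6, rd=0.3, rl=0.3):
    """Shift chroma (u, v) by the chosen cast, scaled by the weight of L."""
    u, v, L = pixel_uvl
    w = tonal_weight(L, lo, hi, rd, rl)
    return u + w * cast_uv[0], v + w * cast_uv[1], L

print(apply_cast((0.30, 0.35, 0.50), (0.02, -0.01)))  # midtone: full shift
print(apply_cast((0.30, 0.35, 0.05), (0.02, -0.01)))  # deep shadow: no shift
```

A steep ramp (small rd) reproduces the behaviour described above for tool 110b of Fig. 16: the weight reaches zero before the darkest shadows, so those tones are left entirely unchanged.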
The effect of moving these controls on the FUT generated by the grey balance tool is shown schematically in Fig. 8. The grey balance tool starts the routine at 350. If the hand is pointing 352 in the color circle control 520, the tool computes 354 the color parameters for its FUT based on the location of the hand, recomputes 356 its FUT and returns 358 to the main routine shown in Fig. 5, then moving on to send 320 its FUT to the target object. If the hand is not in the color circle control
520, the grey balance tool evaluates if the hand is in the maximum effect control 522. If so, the grey balance tool identifies 362 the tonal range of the maximum effect based on the location of the hand and the parameters that will be used to generate the new FUT, and then proceeds to recompute 356 the new FUT as described above.
If the hand is not in the maximum effect control 522, it evaluates 364 if it is in the increasing effect control 524. If it is, the grey balance tool computes 366 the degree of increasing effect based on the location of the hand, and identifies the parameters that will be used to generate the new FUT, then proceeding to generate the new FUT as before. If the hand is not in the increasing effect control 524, the tool evaluates 368 whether it is in the decreasing effect control 526, with similar results. If the hand is not in the decreasing effect control 526, then the tool returns 372 to the main routine shown in Fig. 5, then moving on to send 320 its FUT to the target object.
Thus, it will be understood that as the user activates the grey balance tool and moves the controls, the tool will generate a new FUT for each new control location and send that FUT to the target object, for instance a monitor tool. In a preferred embodiment, the system reviews the location of the hand at least two times per second. This is fast enough to provide real-time processing.
It will be understood by one of skill in the art that, while the foregoing description of applying the grey balance transformation with a zone of maximum effect and zones of increasing and decreasing effect has been with respect to three dimensional functional transform definitions, this aspect of the invention is also useful with respect to input/output relationships less complex than a FUT, such as a one or two dimensional relationship, or a look-up table.
The steps of the method of the invention that a user would practice to apply a pleasing transform to a picture are shown schematically in Figure 7. Starting at 400, the user would at 402 move the hand 100 to be over the monitor tool 104.
The monitor tool 104 allows the user to control the high resolution monitor 16(a), and more. In its inactive state, the monitor tool icon 104 appears as shown in Fig. 9. When dropped on a picture, the monitor tool icon enlarges to surround the icon of the picture object 118 as shown in Fig. 15. The monitor tool includes several controls, which work in the same general fashion as the controls of the grey balance tool. A zoom slider 111 is used to control the degree of magnification at which the high resolution monitor will show the picture. The view area control 113 can be moved around the monitor, by activating with the hand, to select the portion of the picture to be displayed on the high resolution monitor. The zoom setting indicator 115 displays the magnification of the chosen degree of zoom.
The hand 100 grasps the monitor tool 104 when the user presses the L button, according to the message interchange discussed above with respect to the general operation of the hand 100. The user moves 408 the monitor tool 104 to a picture object 118 and releases
410 the monitor tool. The monitor tool sends 412 a message "process." The picture object, being at the location of the monitor tool, receives 414 the message and sends a message "a data array for the picture to be processed is at memory location [specified]." The monitor tool 104 receives 416 the message from the picture object, gets the data array from the specified memory location and processes the picture data using the FUTs then in place. It will be assumed for the purpose of this discussion that the only FUT in place is a display calibration FUT 15(a) (Fig. 3).
The monitor tool has associated with it a composite FUT 28, which may combine a plurality of individual FUTs into one, thereby minimizing the computation time. At this point, since there is only one FUT 15(a) associated with monitor tool 104, composite FUT 28 is the same as display calibration FUT 15(a). The monitor tool 104 stores the data for the transformed array in memory at a location different from that where the original picture data resides. Thus, the original data remains intact for further manipulation.
The monitor tool causes the data for the transformed array to be sent to the color monitor 16(a), where it is displayed for the user's inspection. It will be understood that the data array that results from the application of composite FUT 28 will be in RGB space, which is the type of signal that the display device 16(a) uses to display an image. If the user wants to add a gamut compression FUT 30, to display on the monitor the image as it would actually be printed by a specific printer, at 420 he causes the hand 100 to grasp an adjustment tool 114, which in a preferred embodiment appears as a screwdriver. The hand moves 422 the adjustment tool 114 over the monitor tool and applies 424 the adjustment tool. The adjustment tool sends a message
"adjust." The monitor tool receives the message and the icon changes 426 appearance to reveal an adjustment panel 105. The adjustment panel 105 includes a zone 107, referred to as an "inking sticky pad." The hand moves away 428 from the monitor and releases the adjustment tool 114, which remains where released and causes no further action. The user moves 430 the hand 100 to and grasps an ink tool/object 116. The user moves 432 the ink object to over the sticky pad and causes the hand to send a "release" message to the inking tool 116. The inking tool 116 sends 434 a "process" message. The monitor tool receives the message and sends 436 a message that is received by the inking tool "send your FUT." The inking tool 116 sends 438 its FUT for the gamut compression of a particular inking process and that FUT is received by the monitor tool 104. The monitor tool 104 composes 440 the gamut compression FUT with whatever FUTs it has been previously set up with. This results in a composite FUT 28 (fig. 3), which provides the same transformation as would first application of gamut compression FUT 30 and next display calibration FUT 15(a). Next 442 the monitor tool gets the originally stored picture data and processes the data through the newly composed FUT, and displays the picture as modified on the color monitor 16(a) .
It can be seen that the user interface provides an elegant method by which a user can cause a composite FUT to be created from two individual FUTs, simply by selecting tools on the display and moving those tools relative to one another and picture objects.
Next, the user may want to effectuate additional transformations to the image being worked upon, which are also of a general nature. For instance, the room in which the monitor 16(a) is used may be unusually bright and thus a change in the brightness is desired
so that the image appears as would a printed image in normal lighting conditions. Further, it may be desired to offset an undesired color cast caused by the lighting in the viewing room. This second type of adjustment is a grey balance adjustment.
The apparatus of the invention provides a compact method of making these general changes. A work order tool 117 is a tool that combines other tools which have FUTs associated with them, such as inking tools, grey balance tools and tonality tools. A work order tool 117 is shown in an activated configuration in Fig. 16. The work order tool shown has three other tools associated with it: a grey balance tool 110a for affecting the shadows, a grey balance tool 110b for affecting the highlights and a tonal tool 112a for an overall effect. The work order tool can accommodate a large number of tools. It should be noted that there is no limit to the number of identical types of tools that a user can use in one image processing environment. For instance, new tonal tools 112 can be freely made by copying from existing tonal tools, and then modifying them. If a tool is copied, according to the software embodiment of the invention, a new data structure is created that is initially virtually identical to the data structure of the tool copied.
To add an additional tool to the work order tool 117, the user simply causes the hand 100 to grasp the desired tool and release it on the work order tool 117. The released object sends a "process" message to the work order tool. The work order tool sends a message that is received by the released tool to "send your FUT." The work order tool receives the FUT, and combines it with the previously combined FUTs received from tools already a part of the work order. The work order tool then graphically merges the added tool into the work order. Each "pocket" of a work order tool is actually a sticky pad. Thus, by grasping and moving the work
order tool, the other tools in the work order are moved also.
For instance, as the tonal tool 112a is released on the work order tool 117 (Fig. 16), it composes the tonality FUT with a composite FUT previously composed from FUTs sent by the two grey balance tools 110a and 110b. Thus, the work order tool is another tool by which the user can combine transformations, simply by causing the hand to move graphical objects relative to one another on a terminal display.
Returning to the discussion of the use shown in Fig. 7 of the work order tool 117 in conjunction with the monitor tool 104, the user grasps 444 a previously prepared work order tool 117 and moves it to the sticky pad 109 on the monitor tool adjustment panel 105, where it is released. The work order tool 117 sends its composed FUT to the monitor tool, which receives it and composes 450 a new FUT based on the FUTs already installed, for instance the display calibration FUT 15(a) and gamut compression FUT 30 generated by inking tool 116, stuck onto sticky pad 107.
It should be noted that the order in which the monitor tool composes the FUTs is important. Further, it is not always possible for the monitor tool to simply compose a new FUT, for instance from the work order tool 117, with a previously existing one, e.g. from the display calibration combined with the gamut compression. Sometimes, depending on the types of FUTs involved, it is necessary to begin anew with its uncomposed source FUTs, and recompose them all. The order of composition is to first compose customer modifiable monitor-to-proof ("CMMP") FUTs, such as grey balance, tonal tool or work orders, with inking FUTs, and next with display calibration FUTs, such as 15(a) for a monitor or 15(b) for a printer.
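Since function composition does not commute, the stated order matters. The toy sketch below (all three transforms are hypothetical stand-ins, not the patent's FUTs) shows a CMMP map, an inking map and a display calibration map composed in the required order:

```python
def compose2(f, g):
    """Return the composition 'apply f, then g'."""
    return lambda x: g(f(x))

cmmp    = lambda x: x ** 0.9                  # e.g. a tonal adjustment
inking  = lambda x: min(x, 0.95)              # gamut limit of the press
display = lambda x: 1.02 * x - 0.01           # monitor calibration

# CMMP first, then inking, then display calibration:
composite = compose2(compose2(cmmp, inking), display)
print(composite(1.0))                         # inking clips before calibration
print(display(inking(cmmp(1.0))) == composite(1.0))   # True: same ordering
```

Swapping, say, inking and display calibration would clip already-calibrated values and give a visibly different composite, which is the point of fixing the order.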
The monitor tool 104 gets 452 the original picture data and processes the data through the newly composed FUT and redisplays the altered picture on color monitor 16(a).
The foregoing changes are the type of changes that the user must make so that the picture is displayed as it will be printed on the chosen printer. The user may also want to add an artificial or artistic alteration to the picture, such as to increase the contrast of the midtones of the picture.
At 454 the hand 100 grasps a tonal tool 112 and moves 456 it to the monitor, where it is released and then opens up. The user points 458 the hand at the tonal tool 112 to activate it, whereupon the tonal tool sends its FUT to the monitor tool 104, which composes a new FUT as described above in connection with step 450. The monitor tool applies the new FUT to the picture data and displays the further modified picture on the color monitor 16(a).
After viewing the contrast variation on the display terminal, the user will likely want to adjust it. The tonal tool is shown in detail in Fig. 13. The user may, for instance, desire to increase the brightness and contrast of certain portions of the picture, for instance those presently of medium brightness. The tonal tool includes a brightness control 502, a contrast control 504, a maximum effect control 506, an increasing effect control 508, a decreasing effect control 510, and a region 599 that controls brightness and contrast together. To increase the contrast, the user points 462 the hand 100 at the contrast control 504 and presses the R button, sending 464 a message to activate the slider control 504. The slider control 504 sends 466 a message to the tonal tool 112 indicating its current position. The tonal tool 112 computes 468 its FUT based on the location of the contrast slider control 504. The means by which the parameters of the FUT vary with the locations of the controls is unimportant to the present invention. In general, changes in the location of the contrast control invoke changes in the FUT such that, when the FUT is applied to the data, the contrast of the generated picture changes. The brightness control operates similarly.
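Since the text leaves the control-to-FUT mapping open, the following is purely one illustrative possibility: a contrast setting that steepens the transfer curve about mid-grey, with 1.0 as the identity.

```c
/* Hypothetical mapping from slider position (0.0..2.0) to a FUT. */
void tonal_compute_fut(Fut *fut, double contrast)
{
    for (int i = 0; i < 256; i++) {
        double v = (i - 127.5) * contrast + 127.5;
        if (v < 0.0)   v = 0.0;               /* clamp to valid range  */
        if (v > 255.0) v = 255.0;
        fut->curve[i] = (unsigned char)(v + 0.5);
    }
}
```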
The tonal tool sends 470 the newly computed FUT to the monitor tool as already discussed; the monitor tool recomputes 472 its composite FUT to include the new FUT from the tonal tool 112, then recomputes 474 and redisplays the picture, showing the effect of the tonal adjustment. If the user again moves the contrast control 504, at 476 the tonal tool loops back to 466 and generates a new FUT, which is sent to the monitor tool and used as discussed.
In a preferred embodiment, if the user activates and moves any of the range controls 508, 506, 510 without releasing the right mouse button, rather than showing the actual contrast change, the display highlights the pixel elements that will be affected, for instance by making them all orange or all white. When the R button is released, the actual contrast transformation takes place. If, at 476, the user is pleased with the contrast change, the user grasps 478 the monitor tool and removes it from the picture object 118. The monitor 16(a) will no longer display 480 the picture in question. The user grasps the tonal tool 112 with the hand and applies 482 it to the picture object 118. The tonal tool gets the picture data from memory and transforms 484 it according to the FUT generated from the tonal tool's current settings. This data is stored in the location for the picture data, and the picture data is thus permanently changed.
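The range-highlighting preview described at the start of this step might be sketched as follows, assuming the affected range reduces to a pair of bounds on the pixel value (a simplification; the names are hypothetical).

```c
#include <stddef.h>

#define HIGHLIGHT 255              /* e.g. paint affected pixels white */

/* While a range control is held down, pixels inside the affected
 * range are flagged rather than transformed.                         */
void preview_affected(const unsigned char *src, unsigned char *dst,
                      size_t n, unsigned char lo, unsigned char hi)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = (src[i] >= lo && src[i] <= hi) ? HIGHLIGHT : src[i];
}
```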
The user interface of the invention keeps the picture data inviolate during the interactive user modifications made while the monitor tool is displaying the picture. In other words, the original data is kept in one memory location and the modified picture data is kept in another. When the tonal tool 112 (or any other tool) is applied directly to the picture object, this inviolability is no longer maintained and the original picture data is changed.
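The two-location discipline can be sketched as follows (hypothetical names, one channel, and the simple FUT above): previewing reads from the original and writes to the display copy, while applying a tool directly writes through to the original.

```c
#include <stddef.h>

typedef struct {
    unsigned char *original;       /* master picture data               */
    unsigned char *display;        /* scratch copy shown on the monitor */
    size_t         npixels;
} Picture;

/* Non-destructive: the original picture data stays inviolate.        */
void picture_preview(Picture *p, const Fut *fut)
{
    for (size_t i = 0; i < p->npixels; i++)
        p->display[i] = fut->curve[p->original[i]];
}

/* Destructive: applying a tool directly to the picture object
 * permanently changes the stored picture data.                       */
void picture_apply(Picture *p, const Fut *fut)
{
    for (size_t i = 0; i < p->npixels; i++)
        p->original[i] = fut->curve[p->original[i]];
}
```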
It must also be reiterated that all of the FUTs applied as discussed above, except for the final contrast transform, are applied to the picture data to ensure that the image displayed on the monitor 16(a) faithfully shows what will be printed by a specified printer. The gamut compression FUT 30 generated by the inking tool 116 stuck onto the inking sticky pad 107 ensures that the monitor does not display colors that the printer cannot print. The pleasing brightness FUT 26, discussed with respect to the room lighting of the monitor environment, is set only once for the specific monitor site. The display calibration FUT 15(a) is used only for the specific monitor. Therefore, none of these three FUTs should be permanently applied to the picture data, since they relate only to viewing the data on a specific monitor.
Another aspect of the present invention is an apparatus for the straightforward creation of new or multiple instances of tools. A tool room is shown in Fig. 12. In a preferred embodiment of the invention, multiple "rooms" are provided. The user can move objects or tools from one room to another by grasping an object and carrying it to a door 121. The door opens and the tool or object can be carried to the adjoining room. The user can create new rooms and connect them to each other as desired.
The CHI of the invention includes a room with special properties, referred to as a "tool room," which acts as a room having an endless supply of standard tools. The tool room is shown in Fig. 12 and includes one master copy of each tool that will be used, set up according to default parameters. When the user grasps a master copy of a tool and takes it to the door leading from the room, the tool room creates a working copy of the tool, which the user takes to the new room. In the software embodiment of the invention, this creation amounts to the generation of a copy of the data structure for the tool. In a hardware, multi-processor embodiment, the creation is effected by activating an additional processor designed to carry out the activities of the tool. Thus, the user goes to the tool room and simply takes the desired tool. It is not necessary to consciously keep track of the last copy of a tool so that it is not inadvertently destroyed or altered. Further, the master copies of the tools are all kept in a central place.
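In software terms the tool room amounts to a table of read-only prototypes; taking a tool through the door hands back a fresh working copy. A sketch under the same hypothetical structures:

```c
#define N_MASTERS 16

/* Master copies, set up with default parameters and never modified. */
static Tool masters[N_MASTERS];

Tool *tool_copy(const Tool *src, const char *new_name);   /* see above */

/* Grasping master 'which' and carrying it to the door yields a
 * working copy; the master itself cannot be destroyed or altered.   */
Tool *tool_room_take(int which, const char *working_name)
{
    if (which < 0 || which >= N_MASTERS)
        return NULL;
    return tool_copy(&masters[which], working_name);
}
```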
It will be understood by one of ordinary skill in the art that the invention also includes a method and apparatus for visual or graphical programming. The user programs the transform controller 20 by selecting and moving graphical objects on a screen relative to each other. For instance, as described in connection with Fig. 7, the user programmed the transform controller 20 to select FUTs 15(a), 30, 26 and 24 and to apply them to the data array for purposes of viewing on monitor 16(a). Next, the user programmed the controller 20 to select FUTs 24, 27, 30 and 15(b) and apply them to the data for purposes of printing. All of this programming was done by manipulating graphics rather than by typing commands in a word-based programming language.
While the foregoing has discussed an embodiment of the invention suited for color image processing, it will be understood that the invention is applicable in any data processing context where a user must manipulate multiple transform definitions. Other applications include three-dimensional modeling, signal processing, and dynamical system modeling.
It will also be understood that the CHI of the invention is also useful with hardware that executes instructions at a high enough rate that it is not necessary to compose multiple FUTs together to achieve real-time image processing. The claimed invention is intended to cover those embodiments that do not include composition of FUTs. The invention may be implemented in software or hardware. In a hardware implementation, each tool is controlled by a processor, or a portion of a processor, which generates signals causing the terminal to display the icon of the tool.
A software appendix showing in the C programming language a preferred embodiment of a software implementation of important portions of the CHI of the invention is attached hereto and incorporated by reference.
The foregoing discussion should be considered to be illustrative only and not limiting in any sense. What is claimed is:
Claims
1. Apparatus for generating an array of modified data values related to an array of input data values in response to commands from a user, comprising: a. storage means for storing at least one data transform definition; b. data processing means for:
(i) identifying at least one of said transform definitions in response to a user's selection of at least one data transformation; and
(ii) applying said identified at least one transform definition to said array of input data values; and c. interface means between the user and the data processing means for receiving instructions from the user and for indicating the outputs of said data processing means, said interface means comprising:
(i) an input device for the user to transmit instructions; (ii) signal generating means for generating signals representative of at least one of said at least one transform definition and said user instructions; and
(iii) display means for displaying a graphical representation of said signals representing said user instructions and said transform definitions; and
(iv) signal processing means for receiving said user instructions to select and move on the display means the graphical representation of at least one of said at least one transform definitions and to generate commands to the data processing means to identify said selected at least one transform definitions and to apply said selected transform definitions to said array of input data values to generate an array of modified data values.
2. The apparatus of claim 1, said data processing means further comprising means for generating a single composite transform definition whereby application of said single composite transform definition to said input data values generates the same modified data values as does the individual serial application of at least two selected transform definitions.
3. The apparatus of claim 1 wherein at least one of said selected transform definitions comprises a predetermined transform definition.
4. The apparatus of claim 1, said data processing means further comprising means for generating a custom transform definition in accordance with a user's instructions.
5. The apparatus of claim 4, said graphical representation of at least one of said transform definitions comprising at least one zone having a marker that is moveable in accordance with instructions from the user, and said data processing means further comprising means for generating a custom transform definition based on the location of said marker in the zone.
6. The apparatus of claim 4, said data processing means further comprising means for applying said at least one custom transform definition to said array of input data values to generate an array of output data values, said interface means further comprising means for displaying said output data.
7. The apparatus of claim 5, said interface further comprising means for designating a primary range of said input data array to be transformed according to a selected transform definition and means for designating at least one secondary range of said input data array to be transformed according to a transform definition that provides a selected rate of change of the transformation of data values within said secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
8. The apparatus of claim 7, said means for selecting a primary range comprising a first subzone of said graphical representation of said transform definition, said subzone having a marker that is moveable in accordance with instructions from the user, said data processing means further comprising means for generating a custom transform definition that applies the selected transform to said primary data range based on the location of said marker in said first subzone.
9. The apparatus of claim 7, said means for selecting a secondary range comprising a second subzone of said graphical representation of said transform definition, said second subzone having a marker that is moveable in accordance with instructions from the user, said means for generating a custom transform definition further comprising means for applying a transform to said secondary data range wherein the transformation of data values within said secondary range of said input data array changes with respect to the proximity of said data values to the primary range of said input data array, said rate of change based on the location of said marker in said second subzone.
10. The apparatus of claim 9, said second subzone comprising two subsubzones, each having a moveable marker, said means for applying a transform to said secondary data range further comprising means for changing the transformation of a first subset of data values within said secondary range of said input data array values with respect to the proximity of said first subset of data values to the primary range of said input data array, said rate of change based on the location of a first of said markers in said first subsubzone and means for changing the transformation of a second subset of data values within said secondary range of said input data array values with respect to the proximity of said second subset of data values to the primary range of said input data array, said rate of change based on the location of a second of said markers in said second subsubzone.
11. The apparatus of claim 2 wherein said input data values comprise image data values.
12. Apparatus for generating an array of modified data values related to an array of input data values in response to commands from a user, comprising: a. storage means for storing at least one input/output relationship definition; b. data processing means for:
(i) identifying and modifying at least one of said at least one input/output relationship definitions in response to a user's selection of at least one input/output transformation; and
(ii) applying said identified at least one input/output relationship definition to said array of input data values; and c. interface means between the user and the data processing means for receiving instructions from the user and for indicating the outputs of said data processing means, said interface means comprising: (i) an input device for the user to transmit instructions; (ii) signal generating means for generating signals representative of said at least one input/output relationship definition and said user instructions; and (iii) display means for displaying a graphical representation of said signals representing said user instructions and said input/output relationship definition;
(iv) signal processing means for receiving commands to move on the display means the graphical representation of at least one of said input/output relationship definitions and to generate commands to the data processing means to identify said
selected input/output relationship definition, and to apply said selected at least one modified input/output relationship definition to said array of input data values; and (v) means for designating a primary range of said input data array to be transformed according to said selected input/output relationship definition and means for designating at least one secondary range of said input data array to be transformed according to an input/output relationship definition that provides a selected rate of change of the transformation of data values within said secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
13. A method for generating an array of modified data values related to an array of input data values in response to commands from a user, comprising the steps of: a. providing: (i) means for data processing; and
(ii) interface means between the user and the data processing means for receiving instructions from the user and for indicating the outputs of said data processing means, said interface means comprising:
(a) an input device for the user to transmit instructions;
(b) signal generating means for generating signals; (c) display means for displaying a graphical representation; and
(d) signal processing means for receiving said user instructions; b. generating signals representative of at least two transform definitions and said user instructions; c. displaying a graphical representation of said signals representing said user instructions and said transform definitions; and d. selecting and moving on the display means the graphical representation of at least two of said transform definitions relative to each other; and e. applying said transform definitions represented by said selected graphical representations to said array of input data values.
14. The method of claim 13, further comprising, before the step of applying said selected transform definitions, the step of generating a single composite transform definition whereby application of said single composite transform definition to said input data values generates the same modified data values as does the individual serial application of said selected at least two transform definitions.
15. The method of claim 14, further comprising before the step of applying said selected transform definition, the step of generating a custom transform definition.
16. Apparatus for generating modified data comprising: a. storage means for storing at least one unchangeable master data transform definition and at least one changeable working transform definition; b. transform controller means for selecting at least one working transform definition in response to a user's selection of at least one data transformation; c. data processing means for applying said selected at least one working transform definition to said input data values; and d. interface means between the user and the data processing means for receiving instructions from the user and indicating the results of data processing by the data processing means, said interface means comprising:
(i) an input device for the user to transmit instructions;
(ii) signal generating means for generating signals representative of said input and output data values, said at least one master and working transform definitions and said user instructions; and
(iii) display means for displaying to said user a graphical representation of said signals representing said input and output data, said user instructions and said master and working transform definitions; and
(iv) signal processing means for receiving from the user commands to identify on the display means the graphical representation of at least one master transform definition and to generate commands to the transform controller means to generate a changeable working transform definition that is the same as said identified master transform definition, other than the ability to be changed.
17. Apparatus for generating an array of modified data values related to an array of input data values in response to commands from a user, comprising: a. data processing means for applying a plurality of transform definitions to said array of input data values; and b. interface means between the user and the data processing means for receiving instructions from the user and for indicating the outputs of said data processing means, said interface means comprising: (i) means for sending a message to and means for receiving a message from tools and objects; (ii) at least one data transform definition tool having means for sending a message to and receiving a message from other tools and objects; and (iii) an object representing said input data values having means for sending a message to and means for receiving a message from tools, said message sent by said transform tool to said data object including a message to apply a selected transform to said input data values.
18. The apparatus of claim 17, said interface means further comprising a plurality of data transform definition tools.
19. The apparatus of claim 18, said messages sent by said transform definition tool further comprising a message specifying the transform definition associated with said transform definition tool, said interface means further comprising a work order tool having means for sending and receiving messages to and from other tools, said messages sent by said work order tool comprising a message to said input data object specifying a single composite transform definition related to a plurality of transform definitions sent to said work order tool by a plurality of transform definition tools.
20. The apparatus of claim 19, said work order tool further comprising means for generating said single composite transform definition which, when applied to said input data values, generates the same output data values as does the individual serial application of said plurality of transform definitions.
21. The apparatus of claim 17, said data transform definition tool further comprising moveable controls, said messages that the data transform definition tool sends varying depending on the locations of the moveable controls.
22. The apparatus of claim 21, said moveable controls further comprising a first control specifying a primary range of said input data array to be transformed according to a selected transform definition and a second control specifying at least one secondary range of said input data array to be transformed according to a transform definition that provides a selected rate of change of the transformation of data values within said secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
23. A method for generating an array of modified data values related to an array of input data values in response to commands from a user, comprising the steps of: a. providing
(i) means for data processing; and (ii) interface means between the user and the data processing means for receiving instructions from the user and for indicating the outputs of said data processing means, said interface means comprising:
(a) means for sending a message to and means for receiving a message from tools and objects;
(b) at least one data transform definition tool having means for sending a message to and receiving a message from other tools and objects; and (c) an object representing said input data values having means for sending a message to and means for receiving a message from tools; and b. sending a message from said transform tool to said data object to apply a selected transform to said input data values.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP4511852A JPH06502055A (en) | 1991-04-26 | 1992-04-15 | A computer/human interface that allows data transformation operations to be performed by manipulating graphical objects on a video display |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US69328191A | 1991-04-26 | 1991-04-26 | |
US693,281 | 1991-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1992020184A1 (en) | 1992-11-12
Family
ID=24784041
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1992/003135 WO1992020184A1 (en) | 1991-04-26 | 1992-04-15 | Computer/human interface for performing data transformations by manipulating graphical objects on a video display |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0538462A1 (en) |
JP (1) | JPH06502055A (en) |
WO (1) | WO1992020184A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3163987B2 (en) * | 1995-09-04 | 2001-05-08 | 富士ゼロックス株式会社 | Image processing apparatus and gamut adjustment method |
- 1992-04-15 WO PCT/US1992/003135 patent/WO1992020184A1/en not_active Application Discontinuation
- 1992-04-15 EP EP19920913518 patent/EP0538462A1/en not_active Withdrawn
- 1992-04-15 JP JP4511852A patent/JPH06502055A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2633792A1 (en) * | 1988-06-29 | 1990-01-05 | Sony Corp | PRINTING APPARATUS FOR VIDEO SIGNAL |
EP0350870A2 (en) * | 1988-07-12 | 1990-01-17 | Dainippon Screen Mfg. Co., Ltd. | Method of simulating object image to be printed |
EP0416654A2 (en) * | 1989-09-08 | 1991-03-13 | Fuji Photo Film Co., Ltd. | Color scanner |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5490089A (en) * | 1993-06-15 | 1996-02-06 | Xerox Corporation | Interactive user support system and method using sensors and machine knowledge |
EP0630146A1 (en) * | 1993-06-15 | 1994-12-21 | Xerox Corporation | Interactive user support and method using sensors and machine knowledge |
EP0648042A1 (en) * | 1993-10-06 | 1995-04-12 | Fuji Photo Film Co., Ltd. | Colour reproduction system |
US6269184B1 (en) | 1994-03-24 | 2001-07-31 | Eastman Kodak Company | Method and apparatus for interactive color transformation of color values between color spaces |
EP0674430A1 (en) * | 1994-03-24 | 1995-09-27 | Eastman Kodak Company | Method and apparatus for interactive color transformation of color values between color spaces |
EP0680201A2 (en) * | 1994-04-27 | 1995-11-02 | Canon Kabushiki Kaisha | Image processing apparatus and method |
EP0680201A3 (en) * | 1994-04-27 | 1996-06-05 | Canon Kk | Image processing apparatus and method. |
US5677741A (en) * | 1994-04-27 | 1997-10-14 | Canon Kabushiki Kaisha | Image processing apparatus and method capable of adjusting hues of video signals in conversion to display signals |
US5844542A (en) * | 1995-07-11 | 1998-12-01 | Fuji Xerox Co., Ltd. | Image processing apparatus and method with multi-dimensional display of image adjustment levels |
US5910796A (en) * | 1996-05-20 | 1999-06-08 | Ricoh Corporation | Monitor gamma determination and correction |
EP1014688A3 (en) * | 1998-12-14 | 2003-08-20 | Canon Kabushiki Kaisha | Image processing method and apparatus, image processing system, and storage medium |
US7038810B1 (en) | 1998-12-14 | 2006-05-02 | Canon Kabushiki Kaisha | Image processing method and apparatus, image processing system, and storage medium |
EP1014688A2 (en) | 1998-12-14 | 2000-06-28 | Canon Kabushiki Kaisha | Image processing method and apparatus, image processing system, and storage medium |
US7590308B2 (en) | 1999-10-01 | 2009-09-15 | Seiko Epson Corporation | Image processing apparatus, an image processing method, and a computer readable medium having recorded thereon a processing program for permitting a computer to perform image processing routines |
US6868189B1 (en) | 1999-10-01 | 2005-03-15 | Seiko Epson Corporation | Image processing apparatus, an image processing method, and a computer readable medium having recorded thereon a processing program for permitting a computer to perform image processing routines |
US7356202B2 (en) | 1999-10-01 | 2008-04-08 | Seiko Epson Corporation | Image processing apparatus, an image processing method, and a computer readable medium having recorded thereon a processing program for permitting a computer to perform image processing routines |
EP1089553A1 (en) * | 1999-10-01 | 2001-04-04 | Seiko Epson Corporation | Colour image processing apparatus and method |
US7916973B2 (en) | 1999-10-01 | 2011-03-29 | Seiko Epson Corporation | Image processing apparatus, an image processing method, and a computer readable medium having recorded thereon a processing program for permitting a computer to perform image processing routines |
EP1427185A2 (en) * | 2002-12-05 | 2004-06-09 | Canon Kabushiki Kaisha | Incremental color transform creation |
EP1427185A3 (en) * | 2002-12-05 | 2006-07-19 | Canon Kabushiki Kaisha | Incremental color transform creation |
US7342682B2 (en) | 2002-12-05 | 2008-03-11 | Canon Kabushiki Kaisha | Incremental color transform creation |
US7619755B2 (en) * | 2004-09-01 | 2009-11-17 | Ricoh Company, Ltd. | Apparatus, method, system, and computer program for managing image processing |
EP1821518A1 (en) * | 2006-02-16 | 2007-08-22 | Hewlett-Packard Development Company, L.P. | Personalized color reproduction |
US7796296B2 (en) | 2006-02-16 | 2010-09-14 | Hewlett-Packard Development Company, L.P. | Personalized color reproduction |
Also Published As
Publication number | Publication date |
---|---|
JPH06502055A (en) | 1994-03-03 |
EP0538462A1 (en) | 1993-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5254978A (en) | Reference color selection system | |
US5898436A (en) | Graphical user interface for digital image editing | |
US6031543A (en) | Image processing apparatus for correcting color space coordinates and method | |
US4721951A (en) | Method and apparatus for color selection and production | |
US5311212A (en) | Functional color selection system | |
US6266103B1 (en) | Methods and apparatus for generating custom gamma curves for color correction equipment | |
US5627950A (en) | Real-time three-dimensional color look-up table interactive editor system and method | |
JP3869910B2 (en) | Image processing method and apparatus, and storage medium | |
US6701011B1 (en) | Image processing apparatus and image processing method and storage medium | |
US5208911A (en) | Method and apparatus for storing and communicating a transform definition which includes sample values representing an input/output relation of an image transformation | |
US4794382A (en) | Image retouching | |
US5243414A (en) | Color processing system | |
US7110595B2 (en) | Method of and apparatus for image processing, and computer product | |
EP0800150B1 (en) | Image process apparatus and method | |
EP0503051B1 (en) | Color image processing system for preparing a composite image transformation module for performing a plurality of selected image transformations | |
EP0070680B1 (en) | Reproduction of coloured images | |
EP0674430A1 (en) | Method and apparatus for interactive color transformation of color values between color spaces | |
EP1821518A1 (en) | Personalized color reproduction | |
EP0310388B1 (en) | Interactive image modification | |
JPH05210719A (en) | Image editing device for image processing system | |
US6057931A (en) | Method and apparatus for controlling color image reproduction | |
WO1992020184A1 (en) | Computer/human interface for performing data transformations by manipulating graphical objects on a video display | |
JP4356697B2 (en) | Color settings for monochrome images | |
JP2001358962A (en) | Color image processor and method for image rendering | |
JPH11196285A (en) | Image processing method, device and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1; Designated state(s): JP |
| AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): AT BE CH DE DK ES FR GB GR IT LU MC NL SE |
| WWE | Wipo information: entry into national phase | Ref document number: 1992913518; Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 1992913518; Country of ref document: EP |
| WWW | Wipo information: withdrawn in national office | Ref document number: 1992913518; Country of ref document: EP |