US11803101B2 - Method for setting the focus of a film camera - Google Patents
- Publication number: US11803101B2
- Application number: US17/280,472
- Authority
- US
- United States
- Prior art keywords: image, measuring device, film camera, real image, camera
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G03B13/20—Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
- G03B13/30—Focusing aids indicating depth of field
- G03B13/32—Means for focusing
- G03B13/36—Autofocus systems
- G03B17/02—Bodies
- G03B17/18—Signals indicating condition of a camera member or suitability of light
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/24—Classification techniques
- G06T7/11—Region-based segmentation
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/50—Depth or shape recovery
- G06V10/40—Extraction of image or video features
- G06V20/10—Terrestrial scenes
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/671—Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
- H04N5/2228—Video assist systems used in motion picture production, e.g. video cameras connected to viewfinders of motion picture cameras or related video signal processing
- G06T2207/20021—Dividing image into blocks, subimages or windows
- H04N2013/0074—Stereoscopic image analysis
Definitions
- the present invention relates to a method for setting the focus of a film camera, in which distance information is obtained by means of a measuring device arranged in the area of the film camera, which distance information is used for setting the focus of the film camera.
- in some cases, measuring devices are used that assign distance data to different image areas. This allows an operator to select an image area, for example, in order to adjust the focus to that area.
- a depth image can be generated with a stereoscopic camera arrangement in order to thereby control the focus of a camera.
- the real image of the measuring camera and the depth image of the measuring camera are displayed on a touch PC or monitor.
- the distance can be measured in this area and subsequently the focus of the film or TV camera is adjusted.
- a 3D sensor (e.g. a stereoscopic measuring device) can be used to generate such a depth image.
- the real image of the measuring device will always show a different image section than the image of the film camera 1 .
- the image of the measuring device is often more wide-angled than the image of the film camera 1 , this has the advantage that an operator can already measure objects before they enter the image of the film camera 1 .
- the operator cannot see how the film material is being recorded. For this he would need a second monitor at the measuring device displaying the image of the film camera 1 , and he would have to look back and forth constantly between the monitor of the measuring device and the so-called video assist (or viewfinder).
- the image of the film camera is not suitable for a tracking algorithm; it cannot be used for video tracking.
- this object is achieved by the measuring device producing both a real image and a depth image, generating from them a real image augmented with depth information, and calculating this into the image of the film camera by means of an image transformation.
- An essential feature of the method according to the invention is that the real image of the measuring device is of high resolution and has a large depth of field. This makes it possible to establish a match with the image of the film camera and to create a mapping of the image elements so that the individual pixels of one image are assigned corresponding pixels of the other image. This makes it possible to compensate for the unavoidable parallax that results due to the different position of the film camera and the measuring device.
- this is an image transformation in which the image of the film camera could be reconstructed from the real image of the measuring device if the film camera were not present. Only pixels that are occluded in one of the images by an object in front of them are excluded from this transformation.
- if the distance between the film camera and the measuring device is not too great, the proportion of such pixels is relatively small. Since the real image of the measuring device has previously been augmented with distance information from the depth image, this depth information can also be transferred to the image of the film camera.
- An essential feature of the present invention is that, with the transformation, there is an exact correspondence in perspective between the view of the film camera and the view of the measuring device, and thus depth information is assigned to each (or at least to a sufficient number of) pixels of the real image of the film camera. A depth image of the film camera is obtained by this method.
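The transfer of depth information into the film camera's image can be illustrated with a small sketch. It assumes, for simplicity, that the mapping between the two views can be approximated by a single planar homography; the patent's transformation is more general, and all function names here are illustrative, not from the patent.

```python
# Sketch: transfer depth values from the measuring device's depth image into
# the film camera's pixel grid via a planar homography H (a simplifying
# assumption). Inverse mapping: for each film pixel, look up the
# corresponding measuring-device pixel.

def apply_homography(H, x, y):
    """Map a pixel (x, y) through the 3x3 homography H."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def transfer_depth(depth_image, H, film_width, film_height):
    """Build a depth map in film-camera coordinates."""
    src_h, src_w = len(depth_image), len(depth_image[0])
    film_depth = [[None] * film_width for _ in range(film_height)]
    for fy in range(film_height):
        for fx in range(film_width):
            sx, sy = apply_homography(H, fx, fy)
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < src_w and 0 <= sy < src_h:
                film_depth[fy][fx] = depth_image[sy][sx]
            # film pixels with no counterpart in the measuring image stay None
    return film_depth

# Identity homography: the two views coincide, depth transfers unchanged.
H = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
depth = [[2.0, 2.1], [2.2, 2.3]]
print(transfer_depth(depth, H, 2, 2))  # → [[2.0, 2.1], [2.2, 2.3]]
```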
- the real image of the measuring device has a large depth of field, preferably covering the entire range of distances to be expected for the shot. In this way, focusing can be performed efficiently even if the focus of the film camera at the relevant moment differs significantly from the distance of the object that is ultimately to be focused on. This also applies when the aperture of the film camera is fully open and, accordingly, the depth of field is shallow.
- the image of the film camera may be displayed on a further display device, in which distance information is superimposed. In this way, the exact section in question is always displayed without any parallactic distortion.
- the display can also take the form of a fade-in (overlay), which the precise, positionally correct perspective assignment makes practical.
- the distance information is related to the focus setting of the film camera. This means, for example, that objects in front of the focal plane are highlighted in one color and objects behind the focal plane are highlighted in another color, wherein the color intensity and/or shading can vary depending on the distance from the focal plane.
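The color highlighting relative to the focal plane described above might be sketched as follows; the depth-of-field threshold, the 5 m saturation distance and the label names are assumptions for illustration, not values from the patent.

```python
# Sketch: classify one pixel relative to the focal plane and derive a color
# intensity that grows with the distance from that plane.

def focus_color(pixel_distance, focal_plane, depth_of_field=0.3):
    """Return (label, intensity) for a pixel at the given measured distance."""
    delta = pixel_distance - focal_plane
    if abs(delta) <= depth_of_field / 2:
        return ("in-focus", 0.0)            # inside the depth of field: unmarked
    intensity = min(1.0, abs(delta) / 5.0)  # saturate 5 m from the focal plane
    return ("front", intensity) if delta < 0 else ("behind", intensity)

print(focus_color(1.0, 3.0))  # → ('front', 0.4)
print(focus_color(3.1, 3.0))  # → ('in-focus', 0.0)
print(focus_color(5.5, 3.0))  # → ('behind', 0.5)
```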
- Automatic focusing can be realized particularly advantageously by tracking objects in the real image of the film camera. Tracking objects outside the depth-of-field range is normally considerably more difficult or impossible due to the blurring. Due to the perspective correspondence between the image of the film camera and the real image of the measuring device, tracking can be carried out without any problems even in the out-of-focus area, since the image recognition processes or the like are carried out on the real image of the measuring device and then the results are transferred to the image of the film camera.
- image transformation is performed by image recognition and feature detection algorithms with translation, rotation, distortion correction and scaling of views.
- the image transformation is carried out by presetting the geometric and optical parameters of the film camera and the measuring device and arranging them relative to each other. This can reduce the required computing power.
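For the parameter-preset variant with parallel optical axes, the pixel offset between the two views follows the classic disparity relation. A minimal sketch, with illustrative numbers:

```python
# Sketch: with parallel optical axes and rectified images, the horizontal
# pixel shift between the measuring camera's view and the film camera's view
# for a point at distance Z is f * b / Z (f in pixels, baseline b in meters).

def disparity_px(focal_length_px, baseline_m, distance_m):
    """Pixel shift between the two views for a point at the given distance."""
    return focal_length_px * baseline_m / distance_m

# 800 px focal length, cameras 12.5 cm apart:
print(disparity_px(800, 0.125, 2.0))   # → 50.0 px at 2 m
print(disparity_px(800, 0.125, 10.0))  # → 10.0 px at 10 m
```

The shift depends on the distance Z, which is why the depth image is needed: a single global shift is exact for only one object plane.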
- a particularly advantageous embodiment variant of the invention provides that, on the basis of the depth information, areas of the image of the film camera are combined into elements of groups, each of which can be selected separately. Due to the three-dimensional detection, the pixels belonging to certain real objects, such as, for example, a person, can be grouped particularly efficiently.
- the formation of groups is possible not only because of the depth information, but also because of the real image, e.g. a group “eyes” (as elements of the groups).
- image areas can be divided into groups and marked in the image. These groups or elements of the groups are included and displayed as an overlay in the image of the film camera with perspective accuracy.
- An operating device can select a group or switch between groups. A tracking algorithm can be started based on this selection. Due to the stored depth data, the distances of the group elements to the film camera are known, on which the focus can be adjusted.
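Grouping image areas by depth, as described above, can be sketched as a simple region-labeling pass over the depth image. Real segmentation would also use the real image, and the threshold here is an assumption.

```python
# Sketch: label 4-connected regions whose neighboring depth values differ by
# less than a threshold; each label is one selectable group (e.g. one person).

def group_by_depth(depth, threshold=0.5):
    """Return a label map of contiguous, depth-similar regions."""
    h, w = len(depth), len(depth[0])
    labels = [[None] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] is not None:
                continue
            stack = [(sx, sy)]                  # flood fill from a new seed
            labels[sy][sx] = next_label
            while stack:
                x, y = stack.pop()
                for nx, ny in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
                    if (0 <= nx < w and 0 <= ny < h
                            and labels[ny][nx] is None
                            and abs(depth[ny][nx] - depth[y][x]) < threshold):
                        labels[ny][nx] = next_label
                        stack.append((nx, ny))
            next_label += 1
    return labels

depth = [[2.0, 2.1, 6.0],
         [2.0, 2.1, 6.1]]                       # a near object and a far one
print(group_by_depth(depth))  # → [[0, 0, 1], [0, 0, 1]]
```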
- the markings, identifications and distance information determined in the measuring device can be included and superimposed as a layer in the image of the film camera with perspective accuracy.
- Efficient processing of the data can be achieved by linking image data and depth data from the film camera with a time code signal and storing them together.
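Linking image and depth data with a time code could look like the following sketch; the hh:mm:ss:ff time code format and the storage layout are assumptions for illustration.

```python
# Sketch: store the film camera image together with its transformed depth map,
# keyed by a time code, so both are retrievable for any defined frame.

from dataclasses import dataclass

@dataclass
class Frame:
    timecode: str   # e.g. "01:02:03:04" (hh:mm:ss:ff)
    image: bytes    # encoded film camera image
    depth: bytes    # depth map transformed into the film camera's view

class FrameStore:
    def __init__(self):
        self._frames = {}

    def add(self, frame):
        self._frames[frame.timecode] = frame

    def get(self, timecode):
        return self._frames.get(timecode)

store = FrameStore()
store.add(Frame("01:02:03:04", b"img", b"depth"))
print(store.get("01:02:03:04").depth)  # → b'depth'
```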
- the present invention also relates to a device for adjusting the focus of a film camera, wherein a measuring device is arranged in the area of the film camera to obtain distance information usable for adjusting the focus of the film camera.
- this device is characterized in that the measuring device consists of a real image camera and a depth camera, which are fixedly arranged on a common carrier.
- In the measuring device, or as a separate device, there is a computing unit which produces the real image of the measuring device ( 7 ) augmented with the depth information.
- This computing unit has the possibility to run algorithms of image processing.
- the device according to the invention can also provide that markings and the corresponding distance values can be selected by an operating device and processed further at the operating device.
- the real image camera has an image sensor with HDR function (High Dynamic Range Image).
- a compensation of large brightness differences is carried out by known methods. It is important for the processing according to the invention that the depth-of-field range in the real image of the measuring device is sufficiently large. It is advantageous if the real image sensor has an HDR function in order to display the real image of the measuring device with a reasonably uniform illumination so that no information is lost.
- the real image camera has a small sensor and/or small aperture.
- the choice of aperture and the focus setting are selected so that the entire area of interest can be imaged sharply. In the case of film recordings, for example, this can be an area in front of the camera between 40 cm and infinity.
- a video overlay unit can also be provided in order to include the real image of the measuring device in the image of the film camera.
- the video overlay unit has an input for the data from the measuring device, the operating devices and the film camera.
- the data can be output to the display device.
- the video overlay unit is installed in the measuring device, in the film camera or in the display device or mounted as a separate device at the film camera.
- the measuring device is arranged on a lens hood of the film camera.
- a lens hood is a device that is usually arranged on the lens of the film camera. Its purpose is to reduce or prevent the incidence of stray light into the lens, such as sunlight or light from light sources obliquely behind or next to the camera.
- Such lens hoods, sometimes called compendiums or matte boxes, often have adjustable flaps (French flags).
- the measuring device can thus be placed very close to the film camera, where it does not interfere and forms a compact unit with it. This enables a space-saving embodiment with a simple structure.
- the measuring device can be integrated in the lens hood.
- the lens hood has a square outer frame, wherein the measuring device is arranged in the region of at least one corner of the outer frame.
- the outer frame is arranged along the axis of the lens of the film camera at the end of the lens hood facing away from the film camera.
- FIG. 1 shows a schematic representation of the device according to the invention
- FIG. 2 shows a diagram explaining the configuration of the device.
- the device of FIG. 1 consists of a film camera 1 with a lens 2 and a viewfinder 3 .
- a carrier 4 , which carries a real image camera 5 and a depth camera 6 , is detachably attached to the film camera 1 . These are arranged vertically one above the other in the position of use.
- the position of use is the usual position of the film camera 1 in which the long side of the rectangular image is horizontal.
- the connecting line between the real image camera 5 and the depth camera 6 is perpendicular to the long side of the rectangular image and thus of the sensor of the film camera 1 .
- the carrier 4 , including the real image camera 5 and the depth camera 6 , forms a measuring device 7 that allows image information to be enriched with distance information: each pixel (or sufficiently many pixels) of the real image is assigned a distance value.
- a touch PC 8 as a display device enables the display of various representations that facilitate automatic or, in particular, manual focus control.
- Another operating device 9 with a rotary control can also be used for control.
- FIG. 2 shows the logical interconnection of the individual components.
- the film camera 1 is in communication with a video overlay unit 10 to transmit the image to it.
- the video overlay unit 10 also receives image and distance information from the measuring device 7 so that the image from the film camera 1 can be matched in correct position with the real image from the real image camera 5 and the depth image from the depth camera 6 of the measuring device 7 .
- a servo motor 11 on the film camera 1 can be controlled via the measuring device 7 to control the focus adjustment.
- Various operating devices are designated 8 , 9 and 12 , namely a touch PC 8 , another operating device 9 with a rotary control and an operating device 12 with a joystick.
- the viewfinder 3 can optionally be supplied with the desired displays. Alternatively or in addition to the viewfinder 3 , a screen can be provided as video assist 3 a.
- the 3D sensor consists of a stereoscopic camera arrangement, a TOF camera, a laser scanner, a lidar sensor, a radar sensor, or a combination of different 3D sensors to improve measurement quality, range and resolution.
- the measuring device 7 has a video camera which generates a real image and which is therefore referred to here as real image camera 5 .
- the 3D sensor and the real image camera 5 are mechanically fixed to each other and calibrated.
- the display perspectives are the same.
- a distance value can be assigned to each recognizable pixel of the video camera. This assignment is called depth image.
- ideally, this real image camera 5 has a practically unlimited depth-of-field range in order to be able to depict all objects in the image sharply.
- this video camera has a large exposure range (e.g. through an HDR mode) in order to be able to image subjects of different brightness uniformly.
- the measuring device 7 consists of the depth camera 6 (3D sensor) and the real image camera 5 as measuring unit and a computer unit for processing the measuring results.
- the measuring unit and the computing unit are in the same housing. However, they can also exist as separate units.
- the measuring device 7 is arranged on a film or television camera (here film camera 1 ) in such a way that it can be brought into correspondence with the image of the film camera 1 , namely so that the image of the film camera 1 is contained in a partial area of the image of the real image camera 5 .
- the field of view of the measuring device 7 is very wide-angled and usually larger than the field of view of the film camera 1 .
- the measuring device 7 is detachably arranged on or near the film camera 1 .
- the optical axes of the real image camera 5 of the measuring device 7 and the film camera 1 are parallel.
- a control device is preferably implemented in the measuring device and/or an interface is provided for the focus motor or all three lens motors (focus, iris, zoom) of the film camera 1 .
- IMU (Inertial Measurement Unit)
- tracking algorithms for automated tracking of objects in the video image are executed through analyses of video images. Also, the closest point to the film camera 1 can be calculated.
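Calculating the closest point to the film camera from the depth map, as mentioned above, is a simple search; a sketch, where None marks pixels without a valid depth reading:

```python
# Sketch: find the pixel nearest to the camera in a depth map.

def closest_point(depth):
    """Return ((x, y), distance) of the nearest valid pixel, or None."""
    best = None
    for y, row in enumerate(depth):
        for x, d in enumerate(row):
            if d is not None and (best is None or d < best[1]):
                best = ((x, y), d)
    return best

depth = [[3.0, None],
         [1.5, 4.0]]
print(closest_point(depth))  # → ((0, 1), 1.5)
```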
- image areas can be divided into groups using the available real image and depth data.
- features can be extracted from images by image recognition algorithms.
- Such features can be eyes, faces of people, entire persons or various predefined objects. For example, all faces can be divided into elements of the group “face” and marked with a frame in the real image.
- contiguous image areas can be defined in elements of a group and color-coded depending on the distance. Areas of 1-2 m, for example, are displayed in red, areas of 2-3 m in blue, and so on.
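The distance-band coloring quoted above (1-2 m red, 2-3 m blue, and so on) can be sketched as a palette lookup; the colors beyond the two quoted bands are assumptions.

```python
# Sketch: map a measured distance to its band color.
# Band [1, 2) m -> red, [2, 3) m -> blue, further bands assumed.

PALETTE = ["red", "blue", "green", "yellow", "magenta"]

def band_color(distance_m):
    """Return the color of the 1 m wide distance band, or 'none'."""
    if distance_m < 1.0:
        return "none"
    band = int(distance_m - 1.0)
    return PALETTE[band] if band < len(PALETTE) else "none"

print(band_color(1.5))  # → red
print(band_color(2.5))  # → blue
```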
- contiguous image areas with a regular depth profile can form an element of a group.
- a floor has a regular depth profile as long as it is sufficiently flat and no objects are lying on the floor. Therefore, the group element “Floor” can be formed in the image and combined as a single image object and marked in the real image.
- Video Overlay Unit 10
- the video overlay unit 10 has interface input and output for the video image of the film camera 1 , allowing a video image to be read in and output again.
- the video overlay unit 10 has an interface input for the real image and the depth image of the measuring device 7 .
- the real image of the measuring device 7 is included in the perspective of the image of the film camera 1 by transformation.
- the transformation can be performed in the following ways:
- the image transformation between the real image of the film camera and the images of the measuring device can also be carried out by a purely mathematical shift if the geometric and optical parameters of the film camera and the measuring device and their relative arrangement are known.
- the transformation can also be derived by displaying the two real images graphically and aligning them on top of each other by manual graphical manipulation on the screen (shift, rotate, tilt, scale).
- this transformation of the real images also establishes a correlation between the pixels of the image of the film camera 1 and their distance.
- a depth image of the image of the film camera 1 is realized.
- the image of the film camera 1 , the real image of the measuring device 7 and the depth image of the measuring device 7 can be displayed together as an overlay, or only individual layers of the overlay are displayed.
- Additional information can be placed over the image of the film camera 1 as another overlay layer.
- Markings, labeled image areas and the position of the tracking points or tracking areas, cursors, etc. are also included in the video image of the main camera as a perspective-accurate overlay by the transformation, and can thus be displayed and checked. Even if the image of the film camera 1 is blurred or too dark, these markings, labels and tracking functions can still run correctly because they are calculated in the background using the real image of the measuring device.
- Groups or elements of a group and the labeling (marking) of these can be included in the real image of the film camera by the image transformation with exact perspective.
- image groups are displayed in the real image in color, by shading, marked as a pattern or enclosed in a frame as an overlay.
- a colored overlay can be placed over the image of the film camera.
- the image of the film camera 1 is ideally rendered as a grayscale image.
- Each distance of a pixel or image area gets its own color coding. This provides depth information for the image of the film camera 1 .
- the depth-of-field range is included as an overlay in the video image of the film camera 1 and can be displayed. Those pixels/image points of the film camera 1 that lie in the focus area are marked in color. This corresponds to a conventional focus peaking function for video cameras.
- Each pixel or image area of the image of the film camera 1 can be given its own color representation depending on the distance to the focal plane.
- This color marking can be displayed as an overlay. This makes it possible to see at a glance which areas are in the focal plane and what distance image areas are from the focal plane.
- This function is called Visual Focus Peaking. It is similar to the focus peaking function in traditional video cameras or cameras, where only the pixels that lie in the focal plane are marked. In contrast to this, the distance of all other pixels can also be displayed in color.
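Visual Focus Peaking as described can be sketched per pixel: pixels inside the depth-of-field range are marked sharp, and every other pixel is classified by which side of the range it falls on. How the near and far limits are derived from lens data is outside this sketch.

```python
# Sketch: classify each pixel of one image row against the depth-of-field
# range [near_m, far_m].

def focus_peaking(depth_row, near_m, far_m):
    """Return a 'sharp' / 'front' / 'behind' label per pixel."""
    marks = []
    for d in depth_row:
        if near_m <= d <= far_m:
            marks.append("sharp")
        elif d < near_m:
            marks.append("front")
        else:
            marks.append("behind")
    return marks

print(focus_peaking([1.0, 2.5, 9.0], near_m=2.0, far_m=3.0))
# → ['front', 'sharp', 'behind']
```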
- the distance of an image area can also be displayed in different section lines.
- a horizontal or vertical line is placed in the video image.
- a bar is used to show how far the corresponding pixel is from the focal plane. Pixels in front of the focal plane are drawn with a bar above the section line; pixels behind the focal plane, with a bar below it. The result is a histogram that shows the distance of the pixels from the focal plane.
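The section-line display might be computed as signed bar heights along one image row; the sign convention and scale factor here are assumptions.

```python
# Sketch: signed bar heights for the section-line histogram. Positive values
# (pixel in front of the focal plane) are drawn above the line, negative
# values (behind) below it.

def section_bars(depth_line, focal_plane_m, scale=10):
    """Signed bar heights (display units) for one image row."""
    return [round((focal_plane_m - d) * scale) for d in depth_line]

# Focal plane at 3 m; a pixel at 2 m is in front (+10), one at 4 m behind (-10):
print(section_bars([2.0, 3.0, 4.0], 3.0))  # → [10, 0, -10]
```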
- the video image of the film camera 1 can be tilted in the display perspective by linking it to the depth information.
- the focal plane can be superimposed and it is easy to see how the image areas are spatially related to the focal plane.
- General settings and information of the measuring device are included in the video image of the film camera 1 as an overlay and can be displayed.
- the thus processed and enhanced image of the film camera 1 is displayed on a video assist monitor 3 a or in the video viewfinder 3 of the film camera 1 .
- the thus processed image of the film camera, real image of the measuring device 7 and the depth image can also be output to a touch PC, on which image areas are selected, stored and retrieved manually or automatically.
- the processed image of the film camera 1 can be linked to a time code signal and stored. This means that the real image and the depth information of the film camera 1 are available at any defined time.
- This video overlay unit is preferably located in the measuring device 7 or the video assist 3 a and is a part thereof.
- Video Assist 3 a and/or Viewfinder 3
- the video assist 3 a or the viewfinder 3 are display elements for the image of the film camera 1 .
- Video assist 3 a or viewfinder 3 can have a touchscreen and/or control buttons on which the measuring device 7 can be controlled directly. In this way, it is also possible to select an image area on which to focus.
- Control Unit 9 , 12
- the control unit 9 , 12 has a control element for setting the focus. Turning/shifting causes a shift of the focal plane in space.
- the control unit 9 , 12 has a control element for adjusting a focus ramp.
- the control options described below can be implemented with one control element or with two.
- the control element, preferably a haptic control element (e.g. slider, rotary knob), has an adjustment path and two stops.
- the two stops are designated as the start value and end value.
- Focal planes (distances) can be assigned to the control element.
- the start value is assigned focal plane A and the end value is assigned focal plane E.
- Focal plane A and focal plane E can be distances, tracking areas, image areas, etc.
- Different focus levels (A1, A2, A3, …, E1, E2, …) can be assigned to the start value or end value of the control element. This can be done by pressing a button or by other operation.
- the initial value and the final value are not fixed distance values, but variable distance planes in space.
- plane A & E can be derived from the tracking algorithm.
- plane A can also be the set value from the manual control element, and plane E can be derived from the tracking algorithm.
- the control unit has switches or buttons for switching between an automatic focusing mode and a manual focusing mode.
- a focal plane A is assigned to the start value and a focal plane E is assigned to the end value.
- a shift/rotation of the control element causes a shift of the focal plane.
- the speed of the rotation/shifting determines how fast the focus should move from the initial value to the final value, i.e. from plane A to plane E.
- In known controllers, control elements (usually a rotary knob for adjusting the focal plane) have two fixed distances assigned to the start value and the end value of the rotary travel. For example, the focus can be moved from 1 m to 10 m in space.
- variable focal planes are assigned to the start value and end value here.
- These variable planes can be, for example, the tracking points of two different subjects. The subjects can naturally move in space, constantly changing their distance to the camera. The operator is thus no longer actually pulling focus over a distance, but only controlling the timing of how long and when the focus should be at the new end point. He is relieved of the difficult task of permanently estimating ("knowing") the correct distance to the end subject and can devote himself exclusively to the temporal course of the focus ramp, which leads to artistically more interesting results.
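The ramp behaviour described above can be sketched as follows; the function name and the fixed example distances are illustrative, and in practice plane A and plane E would be supplied live by the tracking algorithm:

```python
# Sketch (illustrative names): the start and end stops of the control element
# are bound to *variable* focal planes, e.g. the live tracked distances of two
# subjects. The control position t in [0, 1] then only times the transition;
# the distances themselves come from the tracker at each instant.

def ramp_focus(distance_a, distance_e, t):
    """Interpolate the focus distance between plane A and plane E.

    distance_a, distance_e: current distances of the two tracked planes (m)
    t: control-element position, 0.0 at the start stop, 1.0 at the end stop
    """
    t = max(0.0, min(1.0, t))  # clamp to the adjustment path between the stops
    return distance_a + t * (distance_e - distance_a)

# Subject A is tracked at 2.0 m, subject E at 8.0 m; knob halfway:
print(ramp_focus(2.0, 8.0, 0.5))  # -> 5.0
```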
- the focal planes are fixed values.
- a corresponding distance range is assigned to the adjustment path.
- a rotation/shift causes a shift of the focal plane in space.
- the operating device has operating elements or switches for setting, retrieving and deleting tracking points or markers or for moving a cursor on the video image.
- The operating elements or switches and the control element can be combined in one device or distributed over several devices.
- The control element can also be designed as a slider, joystick, sensor surface, touchscreen or other actuator instead of a rotary knob.
- the operating element can be used to position a cursor in the video image.
- This operating element can be a switch, joystick, touchpad, eye tracking of the user, gyroscope or other element to control x/y coordinates.
- the corresponding distance value can be output, or the optics can be driven to the corresponding focus position.
- the control unit is connected directly to the measuring device via cable or radio link.
- the control unit can also be connected to a lens control system, which in turn has a connection to the measuring device 7 .
- the control unit can additionally control the iris and/or zoom of the film camera 1 .
- the rotary knob has a motorized drive. This makes it possible to automatically turn the rotary knob to the corresponding distance position of the measuring device.
- the rotary knob therefore always has the position which corresponds to the current distance value of the measuring device.
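A sketch of such a motorized follow-up, assuming an illustrative distance-to-position mapping and a simple proportional drive (both are assumptions, not from the patent):

```python
# Hypothetical sketch: a motorized rotary knob that is continuously driven to
# the position corresponding to the current distance value of the measuring
# device. The distance range and the control gain are illustrative.

def knob_position_for_distance(distance, d_min=0.5, d_max=20.0):
    """Map a measured distance (m) onto a normalized knob position in [0, 1]."""
    distance = max(d_min, min(d_max, distance))
    return (distance - d_min) / (d_max - d_min)

def drive_knob(current_pos, target_pos, gain=0.5):
    """One proportional control step moving the knob toward the target position."""
    return current_pos + gain * (target_pos - current_pos)

pos = 0.0
target = knob_position_for_distance(10.25)  # measuring device reports 10.25 m
for _ in range(20):                         # motor steps until the knob settles
    pos = drive_knob(pos, target)
```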
- the operating element can be used to select subjects or image groups from the video image/depth image in different ways:
- Tracking points on the video assist 3 a are approached with the cursor and set at the push of a button. The distance measurement can then run automatically.
- the distance to a specific object is measured.
- this object is located in the optical center of the main camera.
- a tracking point is started.
- the object can then be automatically tracked and focused by image recognition on the real image. This means that the operator does not need to look at a monitor to set a tracking point and save or start automatic focusing.
- Image feature recognition can be started via the control unit. These image features are, for example, eyes, faces or persons. These features are displayed in the video image of the main camera as an overlay. By switching, a feature can be selected and saved as a tracking point or a tracking can be started.
- using depth data transformed into the correct perspective for the image of the film camera, the depth of field (DoF) range around the plane of focus can be superimposed on the image of the film camera in color or represented by patterns. Only those pixels which lie within the DoF are marked; the other pixels remain in the real representation.
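A minimal sketch of such a DoF overlay, assuming thin-lens/hyperfocal formulas and illustrative parameter values (none of the names below come from the patent):

```python
# Sketch: near/far limits of the depth of field are computed from focus
# distance s, focal length f, f-number N and circle of confusion c; only
# pixels whose depth falls inside [near, far] are marked for the overlay.

def dof_limits(s, f, N, c):
    """Return (near, far) depth-of-field limits in metres.

    s: focus distance (m), f: focal length (m), N: f-number,
    c: circle of confusion (m). far is infinite beyond the hyperfocal distance.
    """
    H = f * f / (N * c) + f               # hyperfocal distance
    near = H * s / (H + (s - f))
    far = H * s / (H - (s - f)) if s < H else float("inf")
    return near, far

def dof_mask(depth_map, near, far):
    """Mark exactly those pixels whose depth lies inside the DoF."""
    return [[near <= d <= far for d in row] for row in depth_map]

# 50 mm lens at f/2.8, circle of confusion 0.03 mm, focused at 3 m:
near, far = dof_limits(3.0, 0.050, 2.8, 0.00003)
mask = dof_mask([[2.5, 2.95, 3.1, 4.0]], near, far)
```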
- DoF = depth of field
- Image feature recognition can be started via the operating device. These image features are, for example, eyes, faces or persons. By turning the rotary knob, the features at the corresponding distance are marked. The marked feature is selected and can be saved as a tracking point.
- the corresponding distances in the video image are marked in color. If only one image area in the video image is marked, it can be saved as a tracking point. If several image areas are marked in the video image, switching selects one area and saves it as a tracking point.
- the elements of a group can be selected via the control unit. In this way, all elements can be selected in sequence. If an element is selected at the push of a button, the corresponding distance value can be output or the focus position can be determined. It is also possible to save this element as a tracking point or to start tracking based on this element.
- the distance of a first group from a focal plane is assigned to a first stop of an operating device and that a further distance of a further group from the focal plane is assigned to a further stop of the operating device.
- the adjustment range of the operating device is dynamically adapted to the distance of the two groups, so that the focal plane can be continuously adjusted between the groups.
Abstract
Description
-
- The cameraman cannot operate a touch PC or other monitor during his activity (guiding the camera, adjusting the image area) in order to pull focus himself.
- The video image of the measuring device has a different perspective and image area than the image of the film camera. The cameraman or focus assistant cannot see exactly which subjects are in the scene because the image areas (perspectives) of the auxiliary camera and film camera are different.
- The image of the real camera cannot be viewed on the TouchPC, i.e. the result of the focusing process cannot be seen. The focus assistant therefore needs a second auxiliary monitor (the video assist 3 a) in order to also view the image of the real camera.
-
- Film lenses have a shallow depth of field and deliberately render certain image areas very blurred. However, if image areas are out of focus, tracking cannot be performed there because objects in the video image cannot be detected and analyzed. If the focal plane is to be moved from area A to area B, this may not be possible because area B is not detected.
- Often the images of a film camera 1 are exposed in such a way that only certain subjects are shown illuminated; other image areas are kept in the dark and are poorly visible or not visible at all in the real image.
-
- Automatically by image recognition/feature detection. The basis of this transformation is detection of features (Feature Detection), wherein matches are searched for in both images. The transformation of the image of the measuring device into the image of the film camera is performed by
- translation,
- rotation,
- distortion correction, and
- scaling
of the two video images on the basis of image features which are found in both images. Theoretically, three matching image features are sufficient; preferably, several features distributed over the image are detected in both images. This type of transformation has the advantage that the image perspectives are automatically realigned when the lens of the film camera is changed, without any manual interaction.
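A similarity transform estimated from matched feature points can serve as a minimal sketch of this alignment step; distortion correction is omitted here, and all point data and function names are illustrative, not from the patent:

```python
# Sketch: given feature points matched between the measuring-device image and
# the film-camera image, estimate translation, rotation and uniform scaling
# (a similarity transform) by least squares, so that dst ~ scale * R @ src + t.
import numpy as np

def estimate_similarity(src, dst):
    """Estimate scale s, rotation R and translation t with dst ~ s*R@src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d           # centered point clouds
    cov = xd.T @ xs / len(src)                # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / xs.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Synthetic check: a handful of matched features suffices, as noted above.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(6, 2))        # features in the aux image
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
matched = 1.2 * pts @ R_true.T + np.array([5.0, -3.0])  # same features, main image
s, R, t = estimate_similarity(pts, matched)
```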
- Semi-automatic by entering the geometric and optical parameters.
-
- the distances and the alignment of the optical centers,
- the distances between the image sensors, and
- the image sections of the optics
are entered. A simpler image transformation, and thus only an imprecise match of the perspective, can also be achieved with only two or three of the parameters listed.
- Manually, by nesting the representations of the real image of the film camera with the real image or depth image of the measuring device on a monitor.
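As a rough illustration of the semi-automatic variant, the entered parameters can be reduced to a scale factor (ratio of the two image sections) and a parallax offset (from the distance between the optical centres). The pinhole-model simplification and all numeric values below are assumptions for illustration:

```python
# Sketch: turn hand-entered geometric/optical parameters into a coarse
# transform mapping the auxiliary (measuring-device) image onto the main
# (film-camera) image. Names and values are illustrative assumptions.

def simple_transform(f_main_mm, f_aux_mm, sensor_main_mm, sensor_aux_mm,
                     baseline_mm, subject_distance_mm, aux_width_px):
    """Return (scale, parallax_px) for aligning the auxiliary image.

    scale:       ratio of the two angular image sections (fields of view)
    parallax_px: horizontal offset caused by the distance between the
                 optical centres, valid only at subject_distance_mm
    """
    # Relative image section per camera (pinhole model): sensor / focal length
    fov_main = sensor_main_mm / f_main_mm
    fov_aux = sensor_aux_mm / f_aux_mm
    scale = fov_aux / fov_main
    # Parallax in auxiliary pixels at the given subject distance
    px_per_mm = aux_width_px / sensor_aux_mm
    parallax_px = baseline_mm * f_aux_mm / subject_distance_mm * px_per_mm
    return scale, parallax_px

# 50 mm main lens on a 36 mm sensor, 8 mm aux lens on a 6.4 mm sensor,
# optical centres 100 mm apart, subject at 3 m:
scale, shift = simple_transform(f_main_mm=50, f_aux_mm=8,
                                sensor_main_mm=36, sensor_aux_mm=6.4,
                                baseline_mm=100, subject_distance_mm=3000,
                                aux_width_px=1280)
```

With only two or three of the parameters entered, the corresponding term is simply dropped, which is why the match becomes imprecise, as noted above.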
Claims (26)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ATA60156/2018 | 2018-09-26 | ||
AT601562018A AT521845B1 (en) | 2018-09-26 | 2018-09-26 | Method for adjusting the focus of a film camera |
PCT/AT2019/060316 WO2020061604A1 (en) | 2018-09-26 | 2019-09-26 | Method for setting the focus of a film camera |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220035226A1 US20220035226A1 (en) | 2022-02-03 |
US11803101B2 true US11803101B2 (en) | 2023-10-31 |
Family
ID=68109072
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/280,487 Active US12066743B2 (en) | 2018-09-26 | 2019-09-26 | Method for focusing a camera |
US17/280,472 Active US11803101B2 (en) | 2018-09-26 | 2019-09-26 | Method for setting the focus of a film camera |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/280,487 Active US12066743B2 (en) | 2018-09-26 | 2019-09-26 | Method for focusing a camera |
Country Status (4)
Country | Link |
---|---|
US (2) | US12066743B2 (en) |
EP (2) | EP3857303A1 (en) |
AT (1) | AT521845B1 (en) |
WO (2) | WO2020061604A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE202020105870U1 (en) * | 2020-10-14 | 2020-10-22 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Remote control device for a video camera |
DE102022207014A1 (en) * | 2022-07-08 | 2024-01-11 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Camera assistance system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2746368A (en) * | 1953-06-02 | 1956-05-22 | Rollei Werke Franke Heidecke | Range finder for photographic cameras |
WO2012126868A1 (en) * | 2011-03-18 | 2012-09-27 | Martin Waitz | Method and device for focusing a film camera |
EP2947510A1 (en) | 2014-05-23 | 2015-11-25 | Howard Preston | Camera focusing system |
US20160088212A1 (en) * | 2014-09-24 | 2016-03-24 | Panavision Federal Systems, Llc | Distance measurement device for motion picture camera focus applications |
US20180176439A1 (en) * | 2016-12-20 | 2018-06-21 | Microsoft Technology Licensing, Llc | Dynamic range extension to produce high dynamic range images |
US10368057B1 (en) * | 2016-09-28 | 2019-07-30 | Amazon Technologies, Inc. | Synchronizing data streams |
Family Cites Families (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07114065A (en) * | 1993-10-19 | 1995-05-02 | Fuji Photo Optical Co Ltd | Automatic visibility adjusting device for finder of camera |
US6104398A (en) * | 1998-02-13 | 2000-08-15 | International Business Machines Corporation | Fast and efficient means for grouped object selection and deselection |
US20150297949A1 (en) * | 2007-06-12 | 2015-10-22 | Intheplay, Inc. | Automatic sports broadcasting system |
DE19926559A1 (en) * | 1999-06-11 | 2000-12-21 | Daimler Chrysler Ag | Method and device for detecting objects in the vicinity of a road vehicle up to a great distance |
US7954719B2 (en) * | 2000-11-24 | 2011-06-07 | Metrologic Instruments, Inc. | Tunnel-type digital imaging-based self-checkout system for use in retail point-of-sale environments |
US20090134221A1 (en) * | 2000-11-24 | 2009-05-28 | Xiaoxun Zhu | Tunnel-type digital imaging-based system for use in automated self-checkout and cashier-assisted checkout operations in retail store environments |
US7132636B1 (en) * | 2001-07-06 | 2006-11-07 | Palantyr Research, Llc | Imaging system and methodology employing reciprocal space optical design |
US7162101B2 (en) * | 2001-11-15 | 2007-01-09 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US7544137B2 (en) * | 2003-07-30 | 2009-06-09 | Richardson Todd E | Sports simulation system |
US8134637B2 (en) * | 2004-01-28 | 2012-03-13 | Microsoft Corporation | Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing |
US9094615B2 (en) * | 2004-04-16 | 2015-07-28 | Intheplay, Inc. | Automatic event videoing, tracking and content generation |
US20070182812A1 (en) * | 2004-05-19 | 2007-08-09 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
JP2006074634A (en) * | 2004-09-03 | 2006-03-16 | Nippon Hoso Kyokai <Nhk> | Imaging apparatus and method for displaying depth of field |
EP2149067A1 (en) * | 2007-04-19 | 2010-02-03 | D.V.P. Technologies Ltd. | Imaging system and method for use in monitoring a field of regard |
JP5247076B2 (en) * | 2007-07-09 | 2013-07-24 | 株式会社ニコン | Image tracking device, focus adjustment device, and imaging device |
JP2010091669A (en) * | 2008-10-06 | 2010-04-22 | Canon Inc | Imaging device |
JPWO2012023168A1 (en) * | 2010-08-19 | 2013-10-28 | パナソニック株式会社 | Stereoscopic imaging device and stereoscopic imaging method |
KR101638919B1 (en) * | 2010-09-08 | 2016-07-12 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
WO2013088917A1 (en) * | 2011-12-13 | 2013-06-20 | ソニー株式会社 | Image processing device, image processing method, and recording medium |
WO2014041859A1 (en) * | 2012-09-11 | 2014-03-20 | ソニー株式会社 | Imaging control device, imaging device, and method for controlling imaging control device |
US10331207B1 (en) * | 2013-03-15 | 2019-06-25 | John Castle Simmons | Light management for image and data control |
US9392129B2 (en) * | 2013-03-15 | 2016-07-12 | John Castle Simmons | Light management for image and data control |
CN106575027B (en) * | 2014-07-31 | 2020-03-06 | 麦克赛尔株式会社 | Image pickup apparatus and subject tracking method thereof |
US9836635B2 (en) * | 2014-10-09 | 2017-12-05 | Cognex Corporation | Systems and methods for tracking optical codes |
GB2596029B (en) * | 2015-09-16 | 2022-07-27 | Canon Kk | Image sensor and image capturing apparatus |
US9609307B1 (en) * | 2015-09-17 | 2017-03-28 | Legend3D, Inc. | Method of converting 2D video to 3D video using machine learning |
US9842402B1 (en) * | 2015-12-21 | 2017-12-12 | Amazon Technologies, Inc. | Detecting foreground regions in panoramic video frames |
US9824455B1 (en) * | 2015-12-21 | 2017-11-21 | Amazon Technologies, Inc. | Detecting foreground regions in video frames |
US20180352167A1 (en) * | 2016-02-19 | 2018-12-06 | Sony Corporation | Image pickup apparatus, image pickup control method, and program |
SE541141C2 (en) | 2016-04-18 | 2019-04-16 | Moonlightning Ind Ab | Focus pulling with a stereo vision camera system |
US10981061B2 (en) * | 2016-06-13 | 2021-04-20 | Sony Interactive Entertainment LLC | Method and system for directing user attention to a location based game play companion application |
CN109314800B (en) * | 2016-06-13 | 2022-02-01 | 索尼互动娱乐有限责任公司 | Method and system for directing user attention to location-based game play companion application |
US10147191B1 (en) * | 2016-07-26 | 2018-12-04 | 360fly, Inc. | Panoramic video cameras, camera systems, and methods that provide object tracking and object based zoom |
US10274610B2 (en) * | 2016-09-09 | 2019-04-30 | Minnesota Imaging And Engineering Llc | Structured detectors and detector systems for radiation imaging |
EP3547003B1 (en) * | 2016-11-25 | 2021-09-22 | Sony Group Corporation | Focus control device, focus control method, program, and image capturing device |
EP3340106B1 (en) * | 2016-12-23 | 2023-02-08 | Hexagon Technology Center GmbH | Method and system for assigning particular classes of interest within measurement data |
US11250947B2 (en) * | 2017-02-24 | 2022-02-15 | General Electric Company | Providing auxiliary information regarding healthcare procedure and system performance using augmented reality |
GB201709199D0 (en) * | 2017-06-09 | 2017-07-26 | Delamont Dean Lindsay | IR mixed reality and augmented reality gaming system |
WO2019040329A2 (en) * | 2017-08-22 | 2019-02-28 | Ping Li | Dual-axis resonate light beam steering mirror system and method for use in lidar |
US10440276B2 (en) * | 2017-11-02 | 2019-10-08 | Adobe Inc. | Generating image previews based on capture information |
US10122969B1 (en) * | 2017-12-07 | 2018-11-06 | Microsoft Technology Licensing, Llc | Video capture systems and methods |
US10554921B1 (en) * | 2018-08-06 | 2020-02-04 | Microsoft Technology Licensing, Llc | Gaze-correct video conferencing systems and methods |
-
2018
- 2018-09-26 AT AT601562018A patent/AT521845B1/en active
-
2019
- 2019-09-26 EP EP19780139.2A patent/EP3857303A1/en active Pending
- 2019-09-26 US US17/280,487 patent/US12066743B2/en active Active
- 2019-09-26 WO PCT/AT2019/060316 patent/WO2020061604A1/en unknown
- 2019-09-26 US US17/280,472 patent/US11803101B2/en active Active
- 2019-09-26 WO PCT/AT2019/060317 patent/WO2020061605A2/en unknown
- 2019-09-26 EP EP19783969.9A patent/EP3857304B1/en active Active
Non-Patent Citations (1)
Title |
---|
WO-2012126868, machine English translation, downloaded from WIPO Patentscope. *
Also Published As
Publication number | Publication date |
---|---|
WO2020061605A2 (en) | 2020-04-02 |
WO2020061604A1 (en) | 2020-04-02 |
EP3857303A1 (en) | 2021-08-04 |
AT521845A1 (en) | 2020-05-15 |
AT521845B1 (en) | 2021-05-15 |
US12066743B2 (en) | 2024-08-20 |
WO2020061605A3 (en) | 2020-06-18 |
WO2020061605A9 (en) | 2020-09-03 |
EP3857304B1 (en) | 2023-07-05 |
EP3857304C0 (en) | 2023-07-05 |
EP3857304A2 (en) | 2021-08-04 |
US20220035226A1 (en) | 2022-02-03 |
US20220030157A1 (en) | 2022-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109151439B (en) | Automatic tracking shooting system and method based on vision | |
JP5178553B2 (en) | Imaging device | |
EP1466210B1 (en) | Digital camera with viewfinder designed for improved depth of field photographing | |
US20060044399A1 (en) | Control system for an image capture device | |
JP5281972B2 (en) | Imaging device | |
US20050094019A1 (en) | Camera control | |
US4851901A (en) | Stereoscopic television apparatus | |
CN104427225B (en) | The control method of picture pick-up device and picture pick-up device | |
JP6838994B2 (en) | Imaging device, control method and program of imaging device | |
JP4306006B2 (en) | Three-dimensional data input method and apparatus | |
CN107735713A (en) | Medical image processing unit, medical image processing method and medical viewing system | |
CN107852467A (en) | Supported with video and switch the/Based on Dual-Aperture zoom camera without switching dynamic control | |
JPH09181966A (en) | Image processing method and device | |
CA2962977C (en) | Method, apparatus, system and software for focusing a camera | |
US20120013757A1 (en) | Camera that combines images of different scene depths | |
JP2010093422A (en) | Imaging apparatus | |
US10984550B2 (en) | Image processing device, image processing method, recording medium storing image processing program and image pickup apparatus | |
CN102843509A (en) | Image processing device and image processing method | |
CN102843510A (en) | Imaging device and distance information detecting method | |
JP3907008B2 (en) | Method and means for increasing the depth of field for a photograph | |
WO1997025690B1 (en) | Increased depth of field for photography | |
US11803101B2 (en) | Method for setting the focus of a film camera | |
US20120069149A1 (en) | Photographing device and controlling method thereof, and three-dimensional information measuring device | |
US20220148223A1 (en) | Lens Calibration System | |
JP2000152285A (en) | Stereoscopic image display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: QINEMATIQ GMBH, AUSTRIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAITZ, MARTIN;REEL/FRAME:059028/0134 Effective date: 20210607 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |