
Image stabilization (IS) is a family of techniques that reduce blurring associated with the motion of a camera or other imaging device during exposure.

Comparison of simplified image stabilisation systems:
  1. unstabilised
  2. lens-based optical stabilisation
  3. sensor-shift optical stabilisation
  4. digital or electronic stabilisation

Generally, it compensates for pan and tilt (angular movement, equivalent to yaw and pitch) of the imaging device, though electronic image stabilization can also compensate for rotation about the optical axis (roll).[1] It is mainly used in high-end image-stabilized binoculars, still and video cameras, astronomical telescopes, and also smartphones. With still cameras, camera shake is a particular problem at slow shutter speeds or with long focal length lenses (telephoto or zoom). With video cameras, camera shake causes visible frame-to-frame jitter in the recorded video. In astronomy, the problem of lens shake is added to variation in the atmosphere, which changes the apparent positions of objects over time.

Application in still photography

Photograph of a sound reinforcement system prior to a pop concert, in a room that was nearly dark except for the blue spotlight and the dim white light from the device's rear panel. Though the exposure time of 1/4 s at a (35 mm equivalent) focal length of 180 mm would typically result in relatively strong blur according to the "1/mm rule", the image is quite sharp – the result of the activated image stabilizer of the Lumix digital camera employed.

In photography, image stabilization can facilitate shutter speeds 2 to 4.5 stops slower (exposures 4 to 22.5 times longer), and even slower effective speeds have been reported.

A rule of thumb to determine the slowest shutter speed possible for hand-holding without noticeable blur due to camera shake is to take the reciprocal of the 35 mm equivalent focal length of the lens, also known as the "1/mm rule".[a] For example, at a focal length of 125 mm on a 35 mm camera, vibration or camera shake could affect sharpness if the shutter speed is slower than 1/125 second. As a result of the 2-to-4.5-stops slower shutter speeds allowed by IS, an image taken at 1/125 second with an ordinary lens could be taken at 1/15 or 1/8 second with an IS-equipped lens and produce almost the same quality. The sharpness obtainable at a given speed can increase dramatically.[3] When calculating the effective focal length, it is important to take into account the image format a camera uses. For example, many digital SLR cameras use an image sensor that is 2/3, 5/8, or 1/2 the size of a 35 mm film frame. This means that the 35 mm frame is 1.5, 1.6, or 2 times the size of the digital sensor. The latter values are referred to as the crop factor, field-of-view crop factor, focal-length multiplier, or format factor. On a 2× crop factor camera, for instance, a 50 mm lens produces the same field of view as a 100 mm lens used on a 35 mm film camera, and can typically be handheld at 1/100 second.
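
As a rough illustration of the arithmetic above, a minimal sketch (in Python, using the figures from this section; the rated stop gain is simply treated as fully effective) might look like this:

    def slowest_handheld_shutter(focal_length_mm, crop_factor=1.0, is_stops=0.0):
        """Estimate the slowest handheld shutter speed (seconds) per the "1/mm rule"."""
        equivalent_focal_length = focal_length_mm * crop_factor
        base_speed = 1.0 / equivalent_focal_length       # e.g. 1/125 s for a 125 mm equivalent
        return base_speed * (2 ** is_stops)              # each stop of IS doubles the usable time

    print(slowest_handheld_shutter(125))                  # 0.008 -> about 1/125 s without IS
    print(slowest_handheld_shutter(125, is_stops=4))      # 0.128 -> about 1/8 s with a 4-stop IS
    print(slowest_handheld_shutter(50, crop_factor=2))    # 0.01  -> about 1/100 s (50 mm on a 2x crop body)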

However, image stabilization does not prevent motion blur caused by the movement of the subject or by extreme movements of the camera. Image stabilization is only designed for and capable of reducing blur that results from normal, minute shaking of a lens due to hand-held shooting. Some lenses and camera bodies include a secondary panning mode or a more aggressive 'active mode', both described in greater detail below under optical image stabilization.

Astrophotography makes much use of long-exposure photography, which requires the camera to be fixed in place. However, fastening it to the Earth is not enough, since the Earth rotates. The Pentax K-5 and K-r, when equipped with the O-GPS1 GPS accessory for position data, can use their sensor-shift capability to reduce the resulting star trails.[4]

Stabilization can be applied in the lens, or in the camera body. Each method has distinctive advantages and disadvantages.[5]

Techniques

Optical image stabilization

A comparison of close-up photographs of a calculator keypad with and without optical image stabilization

An optical image stabilizer (OIS, IS, or OS) is a mechanism used in still or video cameras that stabilizes the recorded image by varying the optical path to the sensor. This technology is implemented in the lens itself, as distinct from in-body image stabilization (IBIS), which operates by moving the sensor as the final element in the optical path. The key element of all optical stabilization systems is that they stabilize the image projected on the sensor before the sensor converts the image into digital information. IBIS can have up to five axes of movement: X, Y, roll, yaw, and pitch. IBIS has the added advantage of working with all lenses.

Benefits of OIS

Optical image stabilization extends the range of shutter speeds usable for handheld photography by reducing the likelihood of blur from camera shake during the exposure.

For handheld video recording, regardless of lighting conditions, optical image stabilization compensates for minor shakes whose appearance magnifies when watched on a large display such as a television set or computer monitor.[6][7][8]

Names by vendors

Different companies have different names for the OIS technology, for example:

  • Vibration Reduction (VR) – Nikon (produced the first optical two-axis stabilized lens, a 38–105 mm f/4–7.8 zoom built into the Nikon Zoom 700VR (US: Zoom-Touch 105 VR) camera in 1994)[9][10]
  • Image Stabilizer (IS) – Canon (introduced the EF 75–300 mm f/4–5.6 IS USM in 1995; in 2009, they introduced their first lens, the EF 100 mm f/2.8 Macro L, to use a four-axis Hybrid IS)
  • Anti-Shake (AS) – Minolta and Konica Minolta (Minolta introduced the first sensor-based two-axis image stabilizer with the DiMAGE A1 in 2003)
  • IBIS – In-Body Image Stabilisation – Olympus and Fujifilm
  • Optical SteadyShot (OSS) – Sony (for Cyber-shot and several α E-mount lenses)
  • Optical Image Stabilization (OIS) – Fujifilm
  • MegaOIS, PowerOIS – Panasonic and Leica
  • SteadyShot (SS), Super SteadyShot (SSS), SteadyShot INSIDE (SSI) – Sony (based on Konica Minolta's Anti-Shake originally, Sony introduced a 2-axis full-frame variant for the DSLR-A900 in 2008 and a 5-axis stabilizer for the full-frame ILCE-7M2 in 2014)
  • Optical Stabilization (OS) – Sigma
  • Vibration Compensation (VC) – Tamron
  • Shake Reduction (SR) – Pentax
  • PureView – Nokia (produced the first optically stabilised cell-phone camera, built into the Lumia 920)
  • UltraPixel – HTC (Image Stabilization is only available for the 2013 HTC One & 2016 HTC 10 with UltraPixel. It is not available for the HTC One (M8) or HTC Butterfly S, which also have UltraPixel)

Most high-end smartphones as of late 2014 use optical image stabilization for photos and videos.[11]

Lens-based

In Nikon and Canon's implementation, it works by using a floating lens element that is moved orthogonally to the optical axis of the lens using electromagnets.[12] Vibration is detected using two piezoelectric angular velocity sensors (often called gyroscopic sensors), one to detect horizontal movement and the other to detect vertical movement.[13] As a result, this kind of image stabilizer corrects only for pitch and yaw axis rotations,[14][15] and cannot correct for rotation around the optical axis. Some lenses have a secondary mode that counteracts vertical-only camera shake. This mode is useful when using a panning technique. Some such lenses activate it automatically; others use a switch on the lens.
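
Schematically, such a system integrates the gyros' pitch and yaw rates into angles and converts each angle into the image-plane displacement (roughly focal length × angle, for small angles) that the floating element must cancel. The sketch below is a simplified, hypothetical model of that loop, not any manufacturer's algorithm; the leaky integrator stands in for the filtering real systems use to ignore slow, deliberate movement:

    import math

    def lens_shift_correction(gyro_rates_deg_per_s, dt, focal_length_mm, leak=0.98):
        """Turn a stream of angular-rate samples (one axis) into the lens-element
        shift (mm) that cancels the resulting image motion."""
        angle_deg = 0.0
        shifts_mm = []
        for rate in gyro_rates_deg_per_s:
            angle_deg = leak * angle_deg + rate * dt                 # leaky integration of the rate
            shifts_mm.append(-focal_length_mm * math.tan(math.radians(angle_deg)))
        return shifts_mm

    # 1 kHz gyro samples, 300 mm lens, a brief 0.2 deg/s wobble and return:
    samples = [0.2] * 50 + [-0.2] * 50
    print(max(abs(s) for s in lens_shift_correction(samples, dt=0.001, focal_length_mm=300)))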

To compensate for camera shake when shooting video while walking, Panasonic introduced Power Hybrid OIS+ with five-axis correction: rotation about the optical axis, horizontal rotation, vertical rotation, and horizontal and vertical motion.[16]

Some Nikon VR-enabled lenses offer an "active" mode for shooting from a moving vehicle, such as a car or boat, which is supposed to correct for larger shakes than the "normal" mode.[17] However, active mode used for normal shooting can produce poorer results than normal mode.[18] This is because active mode is optimized for reducing higher angular velocity movements (typically when shooting from a heavily moving platform using faster shutter speeds), whereas normal mode tries to reduce lower angular velocity movements over a larger amplitude and timeframe (typically body and hand movement when standing on a stationary or slowly moving platform while using slower shutter speeds).

Most manufacturers suggest that the IS feature of a lens be turned off when the lens is mounted on a tripod as it can cause erratic results and is generally unnecessary. Many modern image stabilization lenses (notably Canon's more recent IS lenses) are able to auto-detect that they are tripod-mounted (as a result of extremely low vibration readings) and disable IS automatically to prevent this and any consequent image quality reduction.[19] The system also draws battery power, so deactivating it when not needed extends the battery charge.

A disadvantage of lens-based image stabilization is cost. Each lens requires its own image stabilization system. Also, not every lens is available in an image-stabilized version. This is often the case for fast primes and wide-angle lenses. However, the fastest lens with image stabilisation is the Nocticron with a speed of f/1.2. While the most obvious advantage for image stabilization lies with longer focal lengths, even normal and wide-angle lenses benefit from it in low-light applications.

Lens-based stabilization also has advantages over in-body stabilization. In low-light or low-contrast situations, the autofocus system (which has no stabilized sensors) is able to work more accurately when the image coming from the lens is already stabilized.[citation needed] In cameras with optical viewfinders, the image seen by the photographer through the stabilized lens (as opposed to in-body stabilization) reveals more detail because of its stability, and it also makes correct framing easier. This is especially the case with longer telephoto lenses. This is not an issue for mirrorless interchangeable-lens camera systems, because the sensor output to the screen or electronic viewfinder is stabilized.

Sensor-shift

The sensor capturing the image can be moved in such a way as to counteract the motion of the camera, a technology often referred to as mechanical image stabilization. When the camera rotates, causing angular error, gyroscopes encode information to the actuator that moves the sensor.[20] The sensor is moved to maintain the projection of the image onto the image plane, which is a function of the focal length of the lens being used. Modern cameras can automatically acquire focal length information from modern lenses made for that camera. Minolta and Konica Minolta used a technique called Anti-Shake (AS), now marketed as SteadyShot (SS) in the Sony α line and Shake Reduction (SR) in the Pentax K-series and Q-series cameras, which relies on a very precise angular rate sensor to detect camera motion.[21] Olympus introduced image stabilization with their E-510 D-SLR body, employing a system built around their Supersonic Wave Drive.[22] Other manufacturers use digital signal processors (DSP) to analyze the image on the fly and then move the sensor appropriately. Sensor shifting is also used in some cameras by Fujifilm, Samsung, Casio Exilim and Ricoh Caplio.[23]

The advantage with moving the image sensor, instead of the lens, is that the image can be stabilized even on lenses made without stabilization. This may allow the stabilization to work with many otherwise-unstabilized lenses, and reduces the weight and complexity of the lenses. Further, when sensor-based image stabilization technology improves, it requires replacing only the camera to take advantage of the improvements, which is typically far less expensive than replacing all existing lenses if relying on lens-based image stabilization. Some sensor-based image stabilization implementations are capable of correcting camera roll rotation, a motion that is easily excited by pressing the shutter button. No lens-based system can address this potential source of image blur. A by-product of available "roll" compensation is that the camera can automatically correct for tilted horizons in the optical domain, provided it is equipped with an electronic spirit level, such as the Pentax K-7/K-5 cameras.

One of the primary disadvantages of moving the image sensor itself is that the image projected to the viewfinder is not stabilized. Similarly, the image projected to a phase-detection autofocus system that is not part of the image sensor, if used, is not stabilized. This is not an issue on cameras that use an electronic viewfinder (EVF), since the image projected on that viewfinder is taken from the image sensor itself.

Some, but not all, camera bodies capable of in-body stabilization can be pre-set manually to a given focal length. Their stabilization system then corrects as if a lens of that focal length were attached, so the camera can stabilize older lenses and lenses from other makers. This is not viable with zoom lenses, because their focal length is variable. Some adapters relay focal-length information from a lens of one maker to a body of another maker. Some lenses that do not report their focal length can be retrofitted with a chip that reports a pre-programmed focal length to the camera body. Sometimes none of these techniques works, and image stabilization cannot be used with such lenses.

In-body image stabilization requires the lens to have a larger output image circle because the sensor is moved during exposure and thus uses a larger part of the image. Compared to lens movements in optical image stabilization systems the sensor movements are quite large, so the effectiveness is limited by the maximum range of sensor movement, where a typical modern optically-stabilized lens has greater freedom. Both the speed and range of the required sensor movement increase with the focal length of the lens being used, making sensor-shift technology less suited for very long telephoto lenses, especially when using slower shutter speeds, because the available motion range of the sensor quickly becomes insufficient to cope with the increasing image displacement.
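
A back-of-the-envelope comparison makes the limitation concrete: the image displacement to cancel is roughly focal length × shake angle, so the same small angular shake demands far more sensor travel behind a long telephoto lens. The shake angle and travel limit below are illustrative assumptions, not specifications of any particular camera:

    import math

    SENSOR_TRAVEL_LIMIT_MM = 1.0            # assumed usable travel of a typical IBIS unit (illustrative)

    def required_sensor_travel_mm(focal_length_mm, shake_angle_deg):
        """Approximate sensor travel needed to cancel a given angular shake."""
        return focal_length_mm * math.tan(math.radians(shake_angle_deg))

    for f in (24, 100, 300, 600):
        need = required_sensor_travel_mm(f, shake_angle_deg=0.1)
        verdict = "within range" if need <= SENSOR_TRAVEL_LIMIT_MM else "exceeds range"
        print(f"{f:>3} mm lens: {need:.2f} mm of travel needed ({verdict})")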

In September 2023, Nikon announced the Nikon Z f, which has the world's first Focus-Point VR technology: it centers the axis of sensor-shift image stabilization at the autofocus point rather than at the center of the sensor, as conventional sensor-shift systems do. This allows for vibration reduction at the focused point rather than just at the center of the image.[24]

Dual

Free-hand museum shot of a historic universal theodolite, taken without flash but with dual image stabilization. The image was taken with a Panasonic Lumix DMC-GX8 and a Nocticron of almost twice the normal focal length of the camera system (42.5 mm) at f/1.2, with a polarizing filter to remove reflections from the glass of the display case. ISO speed = 800, exposure time = 1/8 s, exposure value = 0.5.

Starting with the Panasonic Lumix DMC-GX8, announced in July 2015, and subsequently in the Panasonic Lumix DC-GH5, Panasonic, which had previously offered only lens-based stabilization in its Micro Four Thirds interchangeable-lens camera system, introduced sensor-shift stabilization that works in concert with the existing lens-based system ("Dual IS").

In 2016, Olympus also offered two lenses with image stabilization that can be synchronized with the in-built sensor-shift stabilization of Olympus' Micro Four Thirds cameras ("Sync IS"). With this technology a gain of up to 6.5 f-stops can be achieved without blurred images.[25] This is ultimately limited by the rotation of the Earth's surface, which fools the camera's motion sensors. Therefore, depending on the angle of view, the maximum exposure time should not exceed 1/3 second for long telephoto shots (with a 35 mm equivalent focal length of 800 millimeters) and a little more than ten seconds for wide-angle shots (with a 35 mm equivalent focal length of 24 millimeters), if the movement of the Earth is not taken into consideration by the image stabilization process.[26]
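
This Earth-rotation limit can be sanity-checked with a short calculation: if the stabilizer treats the Earth's rotation (about 15 arcseconds per second in the worst-case orientation) as shake and "corrects" it, the image drifts by roughly focal length × rotation angle over the exposure. The pixel pitch below is an assumption chosen only to express the drift in pixels:

    import math

    EARTH_RATE_RAD_S = 2 * math.pi / 86164    # sidereal rotation rate (worst-case orientation)
    PIXEL_PITCH_MM = 0.006                    # assumed ~6 µm pixels, full-frame scale

    def earth_rotation_drift_px(equiv_focal_length_mm, exposure_s):
        """Image drift (pixels) caused by compensating the Earth's rotation during the exposure."""
        return equiv_focal_length_mm * EARTH_RATE_RAD_S * exposure_s / PIXEL_PITCH_MM

    print(round(earth_rotation_drift_px(800, 1 / 3), 1))   # ~3 px at 800 mm equivalent, 1/3 s
    print(round(earth_rotation_drift_px(24, 10), 1))       # ~3 px at 24 mm equivalent, 10 s

Both quoted limits correspond to a drift of only a few pixels, which is consistent with the exposure-time figures above.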

In 2015, the Sony E camera system also allowed image stabilization systems in lenses and camera bodies to be combined, but without synchronizing correction of the same degrees of freedom: the in-body sensor stabilization only handles those degrees of freedom that the lens stabilizer does not correct.[27]

Canon and Nikon now have full-frame mirrorless bodies with IBIS that also support each company's lens-based stabilization. Canon's first two such bodies, the EOS R and RP, do not have IBIS, but the feature was added to the more recent, higher-end R3, R5, R6 (and its Mark II version) and the APS-C R7. However, the full-frame R8 and APS-C R10 do not have IBIS. All of Nikon's full-frame Z-mount bodies (the Z 6, Z 7, the Mark II versions of both, the Z 8 and the Z 9) have IBIS; however, the APS-C Z 50 lacks IBIS.

Digital image stabilization

Short video showing image stabilization done purely in software, in the post-processing stage

Real-time digital image stabilization, also called electronic image stabilization (EIS), is used in some video cameras. This technique shifts the cropped area read out from the image sensor for each frame to counteract the motion. This requires the resolution of the image sensor to exceed the resolution of the recorded video, and it slightly reduces the field of view because the area on the image sensor outside the visible frame acts as a buffer against hand movements.[28] This technique reduces distracting vibrations from videos by smoothing the transition from one frame to another.
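
In outline, this amounts to sliding a crop window over the larger sensor frame in the direction opposite the measured motion, clamped to the buffer of unused border pixels. A minimal sketch, assuming per-frame motion estimates are already available (from gyro data or frame matching):

    import numpy as np

    def stabilize_frames(frames, motions_px, out_h, out_w):
        """Shift a crop window against the measured per-frame camera motion.

        frames     -- full-resolution frames (H x W arrays), larger than the output
        motions_px -- cumulative camera motion (dx, dy) per frame, in pixels
        """
        stabilized = []
        for frame, (dx, dy) in zip(frames, motions_px):
            h, w = frame.shape[:2]
            cx, cy = (w - out_w) // 2, (h - out_h) // 2          # centered crop origin
            x = min(max(cx - int(round(dx)), 0), w - out_w)      # clamp to the border buffer
            y = min(max(cy - int(round(dy)), 0), h - out_h)
            stabilized.append(frame[y:y + out_h, x:x + out_w])
        return stabilized

    # Toy example: a 1080-line output read from a 1200-line sensor.
    frames = [np.zeros((1200, 2100), dtype=np.uint8) for _ in range(3)]
    print(stabilize_frames(frames, [(0, 0), (5, -3), (12, 4)], out_h=1080, out_w=1920)[0].shape)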

This technique cannot do anything about existing motion blur, which may result in an image seemingly losing focus as motion is compensated, due to movement during the exposure times of individual frames. The effect is more visible in darker scenes, because of the longer exposure time per frame.

Some still camera manufacturers marketed their cameras as having digital image stabilization when they really only had a high-sensitivity mode that uses a short exposure time—producing pictures with less motion blur, but more noise.[29] It reduces blur when photographing something that is moving, as well as from camera shake.

Others now also use digital signal processing (DSP) to reduce blur in stills, for example by sub-dividing the exposure into several shorter exposures in rapid succession, discarding blurred ones, re-aligning the sharpest sub-exposures and adding them together, and using the gyroscope to detect the best time to take each frame.[30][31][32]
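
A simplified version of that multi-shot idea: take several short sub-exposures, score their sharpness, drop the most blurred, align the rest to the sharpest one, and average. The sketch below uses a plain gradient measure for sharpness and whole-pixel alignment by cross-correlation; real implementations also use gyro data and sub-pixel registration.

    import numpy as np

    def sharpness(img):
        """Crude sharpness score: mean gradient magnitude."""
        gy, gx = np.gradient(img.astype(float))
        return float(np.mean(np.hypot(gx, gy)))

    def align_and_stack(frames, keep_fraction=0.75):
        """Discard the most blurred sub-exposures, align the rest to the sharpest
        frame (whole-pixel shift via FFT cross-correlation), and average them."""
        ranked = sorted(frames, key=sharpness, reverse=True)
        kept = ranked[:max(1, int(len(ranked) * keep_fraction))]
        ref = kept[0].astype(float)
        acc = np.zeros_like(ref)
        for frame in kept:
            f = frame.astype(float)
            corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(f))).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)   # shift that best aligns f with ref
            acc += np.roll(f, (dy, dx), axis=(0, 1))
        return acc / len(kept)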

Stabilization filters

Many video non-linear editing systems use stabilization filters that can correct a non-stabilized image by tracking the movement of pixels in the image and correcting the image by moving the frame.[33][34] The process is similar to digital image stabilization, but since there is no larger image to work with, the filter either crops the image down to hide the motion of the frame or attempts to recreate the lost image at the edges through spatial or temporal extrapolation.[35]
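
The core of such a filter can be sketched as follows: track the frame-to-frame position of the image, smooth that trajectory, and offset each frame by the difference between its actual and smoothed position (the frame is then cropped, or the exposed edges filled in, as described above). A minimal one-axis sketch, with the tracked positions assumed to come from the editing system's motion tracker:

    import numpy as np

    def reframe_offsets(positions_px, window=15):
        """Per-frame offsets that move each frame from its tracked position
        onto a smoothed (moving-average) camera path."""
        positions = np.asarray(positions_px, dtype=float)
        kernel = np.ones(window) / window
        padded = np.pad(positions, window // 2, mode="edge")
        smoothed = np.convolve(padded, kernel, mode="valid")
        return smoothed - positions

    # Jittery horizontal track of a feature across 100 frames:
    track = np.cumsum(np.random.normal(0, 2, size=100))
    print(reframe_offsets(track)[:5])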

Online services, including YouTube, are also beginning to provide video stabilization as a post-processing step after content is uploaded. This has the disadvantage of not having access to real-time gyroscopic data, but the advantage of more computing power and the ability to analyze images both before and after a particular frame.[36]

Orthogonal transfer CCD

Used in astronomy, an orthogonal transfer CCD (OTCCD) shifts the image within the CCD itself while the image is being captured, based on analysis of the apparent motion of bright stars. This is a rare example of digital stabilization for still pictures. An example is the gigapixel Pan-STARRS telescope in Hawaii.[37]

Stabilizing the camera body

A moving TV camera, remote controlled and gyro-stabilized by a Newton head on a rail dolly system.

A technique that requires no additional capabilities of any camera body–lens combination consists of stabilizing the entire camera body externally rather than using an internal method. This is achieved by attaching a gyroscope to the camera body, usually using the camera's built-in tripod mount. This lets the external gyro (gimbal) stabilize the camera, and is typically used in photography from a moving vehicle, when a lens or camera offering another type of image stabilization is not available.[38]

Since around 2015, a common way to stabilize a moving camera has been to use a camera stabilizer such as a stabilized remote camera head. The camera and lens are mounted in a remote-controlled camera holder, which is then mounted on anything that moves, such as rail systems, cables, cars or helicopters. An example of a remote stabilized head used to stabilize moving TV cameras broadcasting live is the Newton stabilized head.[39]

Another technique for stabilizing a video or motion picture camera body is the Steadicam system, which isolates the camera from the operator's body using a harness and a camera boom with a counterweight.[40]

Camera stabilizer

A camera stabilizer is any device or object that externally stabilizes the camera. This can refer to a Steadicam, a tripod, the camera operator's hand, or a combination of these.

In close-up photography, using rotation sensors to compensate for changes in pointing direction becomes insufficient. Moving, rather than tilting, the camera up/down or left/right by a fraction of a millimeter becomes noticeable when trying to resolve millimeter-size details on the subject. Linear accelerometers in the camera, coupled with information such as the lens focal length and focused distance, can feed a secondary correction into the drive that moves the sensor or optics, to compensate for linear as well as rotational shake.[41]
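
Put numerically, the image-plane shift has an angular term (roughly focal length × tilt) and a translational term (magnification × camera displacement), and the translational term dominates at high magnification. A rough illustration, with all values assumed and a thin-lens approximation for the magnification:

    import math

    def image_shift_mm(focal_length_mm, subject_distance_mm, tilt_deg, translation_mm):
        """Approximate image-plane shift from a small tilt plus a small sideways translation."""
        magnification = focal_length_mm / (subject_distance_mm - focal_length_mm)  # thin-lens approximation
        angular_term = focal_length_mm * math.tan(math.radians(tilt_deg))
        translational_term = magnification * translation_mm
        return angular_term + translational_term

    # 100 mm macro lens focused at about 30 cm, 0.01 deg of tilt plus 0.1 mm of hand drift:
    print(round(image_shift_mm(100, 300, 0.01, 0.1), 3))   # ~0.067 mm, roughly 11 pixels at a 6 µm pitch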

In biological eyes

In many animals, including human beings, the inner ear functions as the biological analogue of an accelerometer in camera image stabilization systems, to stabilize the image by moving the eyes. When a rotation of the head is detected, an inhibitory signal is sent to the extraocular muscles on one side and an excitatory signal to the muscles on the other side. The result is a compensatory movement of the eyes. Typically eye movements lag the head movements by less than 10 ms.[42]

Notes

  1. ^ This rule was invented in the film era; with modern high-resolution digital sensors, a minimum shutter speed of the reciprocal of twice the focal length may be more appropriate, i.e. 1/(2 × focal length in mm).[2]

References

  1. ^ "Correct Excessive Shake in Final Cut Pro". Apple, Inc.
  2. ^ "A Simple Rule to Guarantee Sharp Photos". 27 July 2015. Retrieved 2021-01-29.
  3. ^ "Image Stabilization (IS) and Vibration Reduction (VR)". www.kenrockwell.com.
  4. ^ PENTAX O-GPS1 - News Release, Pentax.jp (archived)
  5. ^ "Image Stabilization - Lens vs. Body". Bobatkins.com. Retrieved 2009-12-11.
  6. ^ "Sony α7R IV 35 mm full-frame camera with 61.0 MP". Sony.
  7. ^ "Sony A7R IV sensor review". November 14, 2019.
  8. ^ "Lens Stabilization vs In-camera Stabilization". photographylife.com. 20 February 2012.
  9. ^ Nikon camera models 1992-1994 MIR
  10. ^ Nikon Zoom 700VR Camera Gossip
  11. ^ "15 smartphone cameras with optical image stabilization". December 14, 2014.
  12. ^ What is Optical Image Stabilizer? Archived 2006-05-16 at the Wayback Machine, Technology FAQ, Canon Broadcast Equipment
  13. ^ Glossary : Optical : Image Stabilization, Vincent Bockaert, Digital Photography Review
  14. ^ "Panasonic MegaOIS Explained". Archived from the original on January 12, 2009. Retrieved March 16, 2007.{{cite web}}: CS1 maint: unfit URL (link)
  15. ^ "Mega OIS". Archived from the original on 2011-01-07. Retrieved 2011-11-05.
  16. ^ "Why does your compact camera need the O.I.S.?". Archived from the original on July 3, 2013. Retrieved December 31, 2013.
  17. ^ "Vibration Reduction (VR) Technology". Archived from the original on 2007-11-04. Retrieved 2007-05-19.
  18. ^ "CameraHobby: Nikon AF-S VR 70–200mm f2.8 Review". Archived from the original on 2007-05-23. Retrieved 2007-05-19.
  19. ^ "Technical report". Canon.com. Archived from the original on 2009-12-25. Retrieved 2009-12-11.
  20. ^ "Development of a Test Method for Image Stabilization Systems" (PDF). Archived from the original (PDF) on 2009-01-17. Retrieved 2008-04-04.
  21. ^ Dynax 7D Anti-Shake Technology Archived 2006-06-19 at the Wayback Machine, Konica Minolta
  22. ^ "Olympus Image Stabilization Technology". Archived from the original on 2007-07-02. Retrieved 2007-06-30.
  23. ^ "Image Stabilization".
  24. ^ Nikon Z f | Focus Point VR, retrieved 2023-10-11
  25. ^ DL Cade: Olympus Says Earth’s Rotation Limits Image Stabilization to 6.5 Stops Max, petapixel.com, 26 September 2016, retrieved 26 October 2017
  26. ^ Markus Bautsch: Optomechanische Bildstabilisierung, German, Optomechanical image stabilisation, in: Wikibook Digital imaging techniques, retrieved 30 October 2017
  27. ^ A Comparison of How Olympus and Sony’s 5 Axis Stabilization Work, thephoblographer.com, 17 December 2014, retrieved 26 October 2017
  28. ^ Chereau, R.; Breckon, T.P. (September 2013). "Robust Motion Filtering as an Enabler to Video Stabilization for a Tele-operated Mobile Robot" (PDF). In Kamerman, Gary W; Steinvall, Ove K; Bishop, Gary J; Gonglewski, John D (eds.). Proc. SPIE Electro-Optical Remote Sensing, Photonic Technologies, and Applications VII. Vol. 8897. SPIE. pp. 88970I. doi:10.1117/12.2028360. S2CID 11556469. Retrieved 5 November 2013.
  29. ^ "Stop misleading 'Image Stabilization' labels: Digital Photography Review". Dpreview.com. 2007-01-05. Retrieved 2009-12-11.
  30. ^ "Sony DSC-HX5V Features". sony.co.uk. 2010-04-01. Retrieved 2012-06-24.
  31. ^ "Fujifilm FinePix HS20EXR features - Triple Image Stabilization". fujifilm.ca. 2011-01-05. Archived from the original on 2012-01-12. Retrieved 2012-06-26.
  32. ^ Zimmerman, Steven (12 October 2016). "Sony IMX378: Comprehensive Breakdown of the Google Pixel's Sensor and its Features". XDA Developers. Retrieved 17 October 2016.
  33. ^ "The Event Videographer's Resource". EventDV.net. Retrieved 2009-12-11.
  34. ^ "Stabilization per Software". studiodaily.com. 2011-02-28. Retrieved 2014-03-17.
  35. ^ "Capabilities | Stabilization". 2d3. Archived from the original on 2009-11-25. Retrieved 2009-12-11.
  36. ^ "Secrets of Video Stabilization on YouTube". May 15, 2013. Retrieved October 17, 2014.
  37. ^ Pan-STARRS Orthogonal Transfer CCD Camera Design Archived 2004-08-07 at the Wayback Machine, Gareth Wynn-Williams, Institute for Astronomy
  38. ^ Multimedia: Use Image Stabilization, Andy King, Web Site Optimization, 2004
  39. ^ "NEWTON stabilized remote camera heads". newtonnordic.com. 2020-01-09. Retrieved 2020-06-26.
  40. ^ Harris, Tom (2001-11-22). "How Steadicams Work". HowStuffWorks.com. Discovery Communications LLC. Retrieved 2008-07-26.
  41. ^ "Hybrid Image Stabilizer". Canon Global News Releases. canon.com. 2009-07-22. Archived from the original on 2012-06-17. Retrieved 2012-06-26.
  42. ^ Barnes, G. R. (1979). "Vestibulo-ocular function during co-ordinated head and eye movements to acquire visual targets". The Journal of Physiology. 287: 127–147. doi:10.1113/jphysiol.1979.sp012650. PMC 1281486. PMID 311828.