Tactile Image Sensors Employing Camera: A Review
<p><b>Figure 1.</b> Typical structure of a tactile image sensor employing a camera. A tactile skin with a contact-light conversion mechanism converts physical tactile stimuli into light signals that can be captured by the camera. In the camera, an optical system that includes a lens determines the spatial measurement range, while the spatial and temporal resolutions are determined by the specifications of the image sensor. The image sequence captured by the camera is analyzed by a computer running image processing software to extract tactile information.</p>
<p><b>Figure 2.</b> Principles of the three typical methods for converting physical tactile stimuli into light signals. (<b>a</b>) Light conductive plate-based method. Total reflection occurs if the incident angle <math display="inline"><semantics> <mi>θ</mi> </semantics></math> is larger than the critical angle. When the object (medium 3, with refractive index <math display="inline"><semantics> <msub> <mi>n</mi> <mn>3</mn> </msub> </semantics></math>) makes contact with the sensor (medium 1), the total reflection condition is broken at the contact area, and the scattered light (shown by the red arrows) is observed (direct method). When a flexible cover is used, this cover makes contact with the sensor instead (indirect method). (<b>b</b>) Marker displacement-based method. The markers (shown by the gray circles) move from their initial positions (dotted circles) according to the deformation of the elastomer. These movements can be observed as two-dimensional motion in the camera image. (<b>c</b>) Reflective membrane-based method. Deformation of the sensor surface, which is coated with a reflective membrane, causes a gradient change at each point on the surface. The intensity of the reflected light changes depending on these gradient changes.</p>
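As a numerical illustration of the total-reflection condition in (<b>a</b>) (not part of the original figure), the critical angle follows from Snell's law: light traveling in the plate (index <i>n</i><sub>1</sub>) is totally reflected at a boundary with a lower-index medium (<i>n</i><sub>2</sub>) when the incident angle exceeds arcsin(<i>n</i><sub>2</sub>/<i>n</i><sub>1</sub>). A minimal sketch, with example indices that are assumptions for illustration only:

```python
import math

def critical_angle_deg(n1, n2):
    """Critical angle (degrees) at the boundary from medium 1 (index n1)
    to medium 2 (index n2 < n1); total reflection occurs above this angle."""
    if n2 >= n1:
        raise ValueError("total internal reflection requires n1 > n2")
    return math.degrees(math.asin(n2 / n1))

# Example (assumed values): an acrylic plate (n1 ~ 1.49) against air
# (n2 = 1.00) gives a critical angle of roughly 42 degrees.
angle = critical_angle_deg(1.49, 1.00)
```

Contact by skin or an elastomer, whose refractive index is higher than that of air, raises <i>n</i><sub>2</sub> locally, destroying the total reflection at the contact area and scattering light toward the camera.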
<p><b>Figure 3.</b> Example of a tactile image sensor using a camera based on the light conductive plate. (<b>a</b>) Picture of the combined contact and proximity sensor (©IEEE 2016. Reprinted with permission from [<a href="#B25-sensors-19-03933" class="html-bibr">25</a>]); (<b>b</b>,<b>c</b>) Application to robotic grasp control. The bottom row shows the output images, including the contact detection (shown in purple) and the visible-light image of the object in the gripper. The contact detection is based on the direct light conductive plate method.</p>
<p><b>Figure 4.</b> Example of a tactile image sensor using a camera based on the marker displacement method. (<b>a</b>) Picture of a full-resolution optical tactile sensor developed by an ETH Zurich research group [<a href="#B26-sensors-19-03933" class="html-bibr">26</a>]; (<b>b</b>,<b>c</b>) The original image and the dense optical flow computed with the DIS algorithm. Note that the flow is estimated at each pixel, and a subsampled version is shown in (<b>c</b>) for ease of visualization [<a href="#B26-sensors-19-03933" class="html-bibr">26</a>]. (Source: C. Sferrazza and R. D’Andrea, “Design, Motivation and Evaluation of a Full-Resolution Optical Tactile Sensor,” <span class="html-italic">Sensors</span>, 19, 2019 [<a href="#B26-sensors-19-03933" class="html-bibr">26</a>]).</p>
<p><b>Figure 5.</b> Comparison of the appearance of tactile skins with three different reflective membranes used in the GelSight tactile sensor developed by an MIT research group [<a href="#B40-sensors-19-03933" class="html-bibr">40</a>]. Three kinds of elastomer coatings are shown: semi-specular coatings painted with bronze flake and with aluminum flake paint, and a matte coating with aluminum powder. In the second and third rows, the three pieces of elastomer are pressed against a ball of 6 mm diameter; in the third row, the elastomer is additionally illuminated by light from the side [<a href="#B40-sensors-19-03933" class="html-bibr">40</a>]. (Source: W. Yuan, S. Dong, and E. H. Adelson, “GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force,” <span class="html-italic">Sensors</span>, 17, 2017 [<a href="#B40-sensors-19-03933" class="html-bibr">40</a>]).</p>
<p><b>Figure 6.</b> (<b>a</b>) Standard shape of the sensor with a flat sensing skin. (<b>b</b>) Reducing the thickness of the sensor using a mirror, as in [<a href="#B101-sensors-19-03933" class="html-bibr">101</a>]. (<b>c</b>) The inside of the cylinder can be seen using a perspective camera, as in [<a href="#B97-sensors-19-03933" class="html-bibr">97</a>,<a href="#B98-sensors-19-03933" class="html-bibr">98</a>].</p>
<p><b>Figure 7.</b> Rolling cylindrical tactile image sensor. (<b>a</b>) Picture of the sensor rolling on the surface of an object (a Japanese banknote); (<b>b</b>) structure of the sensor; (<b>c</b>) output image obtained through image mosaicking; (<b>d</b>) enlarged output image. A video of the experiment is available at: <a href="https://youtu.be/6mm4fgTJWB0" target="_blank">https://youtu.be/6mm4fgTJWB0</a>.</p>
<p><b>Figure 8.</b> Examples of combined tactile image sensors. (<b>a</b>) Structure of the sensor for combined force and surface texture sensing. Force sensing and surface texture imaging are based on the marker displacement and reflective membrane methods, respectively. The bottom three images show texture images of a screw, a fingertip, and a coin (©IEEE 2018. Reprinted with permission from [<a href="#B37-sensors-19-03933" class="html-bibr">37</a>]). (<b>b</b>) Combined contact and proximity sensing. Proximity and the three-dimensional shape of the object are detected from visible-light stereo pair images, while contact and its area are detected from the infrared image (©IEEE 2016. Reprinted with permission from [<a href="#B25-sensors-19-03933" class="html-bibr">25</a>]). A compound-eye camera, a compact and thin multi-camera system, is used to obtain both visible and infrared light images through different optical filters.</p>
Abstract
1. Introduction
2. Basic Structure of Tactile Image Sensor
- High spatial resolution can be realized. The pixel count of image sensors is now several to ten megapixels, even in inexpensive ones, and 100-megapixel image sensors are being applied to consumer digital cameras and smartphones. The pixel pitch on the image sensor is on the order of μm, and the resolution on the sensor surface can be controlled by an imaging lens. This feature makes it easy not only to match the spatial resolution of a human fingertip (mentioned in Section 1), but also to realize tactile functions far beyond human tactile sensing ability, such as accurate measurement of the contact object’s shape.
- The measurement area can be controlled by the optical system. The view angle of the camera is determined by the imaging lens, and one can realize either small or large measurement areas by choosing an appropriate lens; smaller view angles provide higher spatial resolution. Additionally, one can use special optical systems developed in the field of computer vision to modify the measurement area. Details are provided in Section 4.
- The sensor surface is physically isolated from the camera. This feature leads to physical robustness and design flexibility for the sensor shape. Some examples are described in Section 4.
- Computer vision algorithms and tools can be used. Tactile information is extracted by analyzing the images provided by the camera. Here, one can use computer vision libraries such as OpenCV [23] and machine learning (e.g., deep learning) frameworks to analyze the images and extract complex tactile information. Details are given in Section 5.
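The marker displacement principle behind points like these can be sketched in a few lines of plain Python: detect the dark marker blobs in a tactile image, compute their centroids, and report how far each marker moved between two frames. This is an illustrative toy (the function names and 1-bit "images" are assumptions for this sketch, not the interface of any sensor in this review); a real system would use a library such as OpenCV for blob detection or dense optical flow.

```python
def marker_centroids(image, threshold=0):
    """Return sorted (row, col) centroids of 4-connected dark blobs
    (pixel value <= threshold) in a 2D list-of-lists image."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] <= threshold and not seen[r][c]:
                # Flood-fill one marker blob.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] <= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return sorted(centroids)

def displacements(before, after):
    """Pair each marker in `before` with its nearest centroid in
    `after` and return the (d_row, d_col) displacement vectors."""
    moves = []
    for cy, cx in marker_centroids(before):
        ny, nx = min(marker_centroids(after),
                     key=lambda p: (p[0] - cy) ** 2 + (p[1] - cx) ** 2)
        moves.append((ny - cy, nx - cx))
    return moves
```

For a single marker that moves two pixels down and one to the right between frames, `displacements` returns `[(2.0, 1.0)]`; the field of such vectors over all markers is what the force- and slip-estimation methods of Section 3.2 operate on.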
3. Physical Contact to Light Conversion
3.1. Light Conductive Plate-Based Method
3.2. Marker Displacement-Based Method
3.3. Reflective Membrane-Based Method
3.4. Other Methods
4. Shape of Tactile Skin and Sensor Size
5. Image Analysis for Extracting Tactile Information
6. Combined Sensing of Multiple Modalities
7. Conclusions
Funding
Acknowledgments
Conflicts of Interest
References
- Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2010, 26, 1–20. [Google Scholar] [CrossRef]
- Yousef, H.; Boukallel, M.; Althoefer, K. Tactile sensing for dexterous in-hand manipulation in robotics—A review. Sens. Actuators A Phys. 2011, 167, 171–187. [Google Scholar] [CrossRef]
- Nicholls, H.R.; Lee, M.H. A survey of robot tactile sensing technology. Int. J. Robot. Res. 1989, 8, 3–30. [Google Scholar] [CrossRef]
- Reinecke, J.; Dietrich, A.; Schmidt, F.; Chalon, M. Experimental comparison of slip detection strategies by tactile sensing with the biotac on the DLR hand arm system. In Proceedings of the IEEE International Conference on Robotics and Automation, Hong Kong, China, 31 May–7 June 2014; pp. 2742–2748. [Google Scholar]
- Kaboli, M.; Yao, K.; Cheng, G. Tactile-based Manipulation of Deformable Objects with Dynamic Center of Mass. In Proceedings of the IEEE-RAS 16th International Conference on Humanoid Robots, Cancun, Mexico, 15–17 November 2016; pp. 752–757. [Google Scholar]
- Lee, M.H. Tactile Sensing: New Directions, New Challenges. Int. J. Robot. Res. 2000, 19, 636–643. [Google Scholar] [CrossRef]
- Ho, V.A.; Nagatani, T.; Noda, A.; Hirai, S. What Can Be Inferred From a Tactile Arrayed Sensor in Autonomous In-Hand Manipulation? In Proceedings of the IEEE International Conference on Automation Science and Engineering, Seoul, South Korea, 20–24 August 2012; pp. 461–468. [Google Scholar]
- Wan, Q.; Howe, R.D. Modeling the Effects of Contact Sensor Resolution on Grasp Success. IEEE Robot. Autom. Lett. 2018, 3, 1933–1940. [Google Scholar] [CrossRef]
- Someya, T.; Sekitani, T.; Iba, S.; Kato, Y.; Kawaguchi, H.; Sakurai, T. A large-area, flexible pressure sensor matrix with organic field-effect transistors for artificial skin applications. Proc. Natl. Acad. Sci. USA 2004, 101, 9966–9970. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shimojo, M.; Namiki, A.; Ishikawa, M.; Makino, R.; Mabuchi, K. A tactile sensor sheet using pressure conductive rubber with electrical-wires stitched method. IEEE Sens. J. 2004, 4, 589–596. [Google Scholar] [CrossRef]
- Shimojo, M.; Araki, A.; Ming, A.; Ishikawa, M. A High-Speed Mesh of Tactile Sensors Fitting Arbitrary Surfaces. IEEE Sens. J. 2010, 10, 822–830. [Google Scholar] [CrossRef]
- Drimus, A.; Kootstra, G.; Bilberg, A.; Kragic, D. Design of a flexible tactile sensor for classification of rigid and deformable objects. Robot. Auton. Syst. 2014, 62, 3–15. [Google Scholar] [CrossRef]
- Mittendorfer, P.; Cheng, G. Humanoid Multimodal Tactile-Sensing Modules. IEEE Trans. Robot. 2011, 27, 401–410. [Google Scholar] [CrossRef]
- Mittendorfer, P.; Yoshida, E.; Cheng, G. Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. Adv. Robot. 2015, 29, 51–67. [Google Scholar] [CrossRef]
- Kappassov, Z.; Baimukashev, D.; Adiyatov, O.; Salakchinov, S.; Massalin, Y.; Varol, H.A. A Series Elastic Tactile Sensing Array for Tactile Exploration of Deformable and Rigid Objects. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 520–525. [Google Scholar]
- Takao, H.; Sawada, K.; Ishida, M. Monolithic Silicon Smart Tactile Image Sensor With Integrated Strain Sensor Array on Pneumatically Swollen Single-Diaphragm Structure. IEEE Trans. Electron Devices 2006, 53, 1250–1259. [Google Scholar] [CrossRef]
- Dahiya, R.S.; Metta, G.; Valle, M. Development of Fingertip Tactile Sensing Chips for Humanoid Robots. In Proceedings of the IEEE International Conference on Mechatronics, Malaga, Spain, 14–17 April 2009. [Google Scholar]
- Shimojo, M.; Ishikawa, M.; Kanaya, K. A flexible high resolution tactile imager with video signal output. In Proceedings of the IEEE International Conference on Robotics and Automation, Sacramento, CA, USA, 9–11 April 1991; pp. 384–391. [Google Scholar]
- Hiraishi, H.; Suzuki, N.; Kaneko, M.; Tanie, K. An Object Profile Detection by a High Resolution Tactile Sensor Using an Optical Conductive Plate. In Proceedings of the 14th Annual Conference of IEEE Industrial Electronics Society, Singapore, 24–28 October 1988; pp. 982–987. [Google Scholar]
- Begej, S. Fingertip-shaped optical tactile sensor for robotic applications. In Proceedings of the IEEE International Conference on Robotics and Automation, Philadelphia, PA, USA, 24–29 April 1988. [Google Scholar]
- Schneiter, J.L.; Sheridan, T.B. An optical tactile sensor for manipulators. Robot. Comput. Integr. Manuf. 1984, 1, 65–71. [Google Scholar] [CrossRef]
- Yamaguchi, A.; Atkeson, C.G. Recent progress in tactile sensing and sensors for robotic manipulation: Can we turn tactile sensing into vision? Adv. Robot. 2019, 33, 661–673. [Google Scholar] [CrossRef]
- OpenCV. Available online: https://opencv.org/ (accessed on 12 August 2019).
- Ohka, M.; Kobayashi, H.; Tanaka, J.; Mitsuya, Y. An experimental optical three-axis tactile sensor featured with hemispherical surface. J. Adv. Mech. Des. Syst. Manuf. 2008, 2, 860–873. [Google Scholar] [CrossRef]
- Shimonomura, K.; Nakashima, H.; Nozu, K. Robotic grasp control with high-resolution combined tactile and proximity sensing. In Proceedings of the IEEE International Conference on Robotics and Automation, Stockholm, Sweden, 16–21 May 2016; pp. 138–143. [Google Scholar]
- Sferrazza, C.; D’Andrea, R. Design, Motivation and Evaluation of a Full-Resolution Optical Tactile Sensor. Sensors 2019, 19, 928. [Google Scholar] [CrossRef] [PubMed]
- Cramphorn, L.; Lloyd, J.; Lepora, N.F. Voronoi Features for Tactile Sensing: Direct Inference of Pressure, Shear, and Contact Locations. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018; pp. 2752–2757. [Google Scholar]
- Ito, Y.; Kim, Y.; Obinata, G. Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad. Sensors 2014, 14, 5805–5822. [Google Scholar] [CrossRef] [Green Version]
- Kamiyama, K.; Kajimoto, H.; Kawakami, N.; Tachi, S. Evaluation of a Vision-based Tactile Sensor. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; pp. 1542–1547. [Google Scholar]
- Sato, K.; Kamiyama, K.; Kawakami, N.; Tachi, S. Finger-shaped GelForce: Sensor for measuring surface traction fields for robotic hand. IEEE Trans. Haptics 2010, 3, 37–47. [Google Scholar] [CrossRef]
- Yamaguchi, A.; Atkeson, C.G. Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables. In Proceedings of the IEEE-RAS 16th International Conference on Humanoid Robotics, Cancun, Mexico, 15–17 November 2016; pp. 1045–1051. [Google Scholar]
- Yuan, W.; Li, R.; Srinivasan, M.; Adelson, E. Measurement of shear and slip with a GelSight tactile sensor. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 304–311. [Google Scholar]
- Song, H.; Bhattacharjee, T.; Srinivasa, S.S. Sensing Shear Forces During Food Manipulation: Resolving the Trade-Off Between Range and Sensitivity. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, QC, Canada, 20–24 May 2019; pp. 8367–8373. [Google Scholar]
- Ito, Y.; Kim, Y.; Obinata, G. Slippage degree estimation for dexterous handling of vision-based tactile sensor. In Proceedings of the IEEE Sensors 2009, Christchurch, New Zealand, 25–28 October 2009. [Google Scholar]
- James, J.; Pestell, N.; Lepora, N. Slip detection with a biomimetic tactile sensor. IEEE Robot. Autom. Lett. 2018, 3, 3340–3346. [Google Scholar] [CrossRef]
- Li, R.; Adelson, E. Sensing and recognizing surface textures using a GelSight sensor. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 1241–1247. [Google Scholar]
- Nozu, K.; Shimonomura, K. Robotic bolt insertion and tightening based on in-hand object localization and force sensing. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Auckland, New Zealand, 9–12 July 2018. [Google Scholar]
- Li, R.; Platt, R., Jr.; Yuan, W.; ten Pas, A.; Roscup, N.; Srinivasan, M.A.; Adelson, E. Localization and Manipulation of Small Parts Using GelSight Tactile Sensing. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 3988–3993. [Google Scholar]
- Nakashima, H.; Kagawa, K.; Shimonomura, K. Combined tactile and proximity sensor employing compound-eye camera. ITE Trans. Media Technol. Appl. 2015, 3, 227–233. [Google Scholar] [CrossRef]
- Yuan, W.; Dong, S.; Adelson, E.H. GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force. Sensors 2017, 17, 2717. [Google Scholar] [CrossRef] [PubMed]
- Fang, B.; Sun, F.; Yang, C.; Xue, H.; Chen, W.; Zhan, C.; Guo, D.; Liu, H. A Dual-Modal Vision-Based Tactile Sensor for Robotic Hand Grasping. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018; pp. 4740–4745. [Google Scholar]
- Johnson, M.; Adelson, E. Retrographic sensing for the measurement of surface texture and shape. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009. [Google Scholar]
- Kajimoto, H.; Ando, S. Active Tactile Sensor using Deformable Sheet Reflector. In Proceedings of the Technical Digest of the 16th Sensor Symposium, Kanagawa, Japan, 2–3 June 1998; pp. 99–104. [Google Scholar]
- Nakao, N.; Kaneko, M.; Suzuki, N.; Tanie, K. A finger shaped tactile sensor using an optical waveguide. In Proceedings of the 16th Annual Conference of IEEE Industrial Electronics Society, Pacific Grove, CA, USA, 27–30 November 1990; pp. 300–301. [Google Scholar]
- Maekawa, H.; Tanie, K.; Komoriya, K.; Kaneko, M.; Horiguchi, C.; Sugawara, T. Development of a finger-shaped tactile sensor and its evaluation by active touch. In Proceedings of the IEEE International Conference on Robotics and Automation, Nice, France, 12–14 May 1992. [Google Scholar]
- Ohka, M.; Mitsuya, Y.; Hattori, K.; Higashioka, I. Data Conversion Capability of Optical Tactile Sensor Featuring an Array of Pyramidal Projections. In Proceedings of the IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, Washington, DC, USA, 8–11 December 1996; pp. 573–580. [Google Scholar]
- Ikai, T.; Kamiya, S.; Ohka, M. Robot control using natural instructions via visual and tactile sensations. J. Comput. Sci. 2016, 12, 246–254. [Google Scholar]
- Lee, J.H.; Won, C.H. High-Resolution Tactile Imaging Sensor Using Total Internal Reflection and Nonrigid Pattern Matching Algorithm. IEEE Sens. J. 2011, 11, 2084–2093. [Google Scholar] [CrossRef]
- Li, W.; Konstantinova, J.; Noh, Y.; Ma, Z.; Alomainy, A.; Althoefer, K. An Elastomer-based Flexible Optical Force and Tactile Sensor. In Proceedings of the 2nd IEEE International Conference on Soft Robotics, Seoul, Korea, 14–18 April 2019; pp. 361–366. [Google Scholar]
- Shimonomura, K.; Nakashima, H. A combined tactile and proximity sensing employing a compound-eye camera. In Proceedings of the IEEE SENSORS 2013, Baltimore, MD, USA, 3–6 November 2013; pp. 1464–1465. [Google Scholar]
- Shimonomura, K.; Nakashima, H.; Kagawa, K. A miniaturized compound-eye camera for combined position, proximity and tactile sensing. In Proceedings of the IEEE SENSORS 2014, Valencia, Spain, 2–5 November 2014; pp. 406–407. [Google Scholar]
- Han, J.H. Low Cost Multi-Touch Sensing through Frustrated Total Internal Reflection. In Proceedings of the 18th annual ACM symposium on User interface software and technology, Seattle, WA, USA, 23–26 October 2005; pp. 115–118. [Google Scholar]
- Kim, Y.; Park, S.; Park, S.K.; Yun, S.; Kyung, K.U.; Sun, K. Transparent and flexible force sensor array based on optical waveguide. Opt. Express 2012, 20, 14486–14493. [Google Scholar] [CrossRef] [PubMed]
- Nagata, K.; Ooki, M.; Kakikura, M. Feature Detection with an Image Based Compliant Tactile Sensor. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Kyongju, Korea, 17–21 October 1999; pp. 838–843. [Google Scholar]
- Ferrier, N.J.; Brockett, R.W. Reconstructing the shape of a deformable membrane from image data. Int. J. Robot. Res. 2000, 19, 795–816. [Google Scholar] [CrossRef]
- Ikeda, A.; Kurita, Y.; Ueda, J.; Matsumoto, Y.; Ogasawara, T. Grip Force Control for an Elastic Finger using Vision-based Incipient Slip Feedback. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, 28 September–2 October 2004; pp. 810–815. [Google Scholar]
- Ueda, J.; Ishida, Y.; Kondo, M.; Ogasawara, T. Development of the NAIST-Hand with Vision-based Tactile Fingertip Sensor. In Proceedings of the IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 2332–2337. [Google Scholar]
- Kolski, S. Vision Based Tactile Sensor Using Transparent Elastic Fingertip for Dexterous Handling. Mobile Robots: Perception & Navigation; IntechOpen: London, UK, 2007; pp. 137–148.
- Kamiyama, K.; Vlack, K.; Mizota, T.; Kajimoto, H.; Kawakami, K.; Tachi, S. Vision-based sensor for real-time measuring of surface traction fields. IEEE Comput. Graph. Appl. 2005, 25, 68–75. [Google Scholar] [CrossRef] [PubMed]
- Yamaguchi, A.; Atkeson, C.G. Implementing Tactile Behaviors Using FingerVision. In Proceedings of the IEEE-RAS 17th International Conference on Humanoid Robotics, Birmingham, UK, 15–17 November 2017; pp. 241–248. [Google Scholar]
- Guo, F.; Zhang, C.; Yan, Y.; Li, P.; Wang, Z. Measurement of three-dimensional deformation and load using vision-based tactile sensor. In Proceedings of the IEEE International Symposium on Industrial Electronics, Santa Clara, CA, USA, 8–10 June 2016; pp. 1252–1257. [Google Scholar]
- Zhang, T.; Cong, Y.; Li, X.; Peng, Y. Robot Tactile Sensing: Vision Based Tactile Sensor for Force Perception. In Proceedings of the IEEE Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems, Tianjin, China, 19–23 July 2018; pp. 1360–1365. [Google Scholar]
- Zhang, Y.; Kan, Z.; Yang, Y.; Alexander, T.; Wang, M.Y. Effective Estimation of Contact Force and Torque for Vision-based Tactile Sensors with Helmholtz-Hodge Decomposition. IEEE Robot. Autom. Lett. 2019. [Google Scholar] [CrossRef]
- Chen, C.; McInroe, B. Towards a Soft Fingertip with Integrated Sensing and Actuation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 6431–6436. [Google Scholar]
- Sakuma, T.; von Drigalski, F.; Ding, M.; Takamatsu, J.; Ogasawara, T. A universal gripper using optical sensing to acquire tactile information and membrane deformation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 6431–6436. [Google Scholar]
- Ito, Y.; Kim, Y.; Nagai, C.; Obinata, G. Vision-Based Tactile Sensing and Shape Estimation Using a Fluid-Type Touchpad. IEEE Trans. Autom. Sci. Eng. 2012, 9, 734–744. [Google Scholar] [CrossRef]
- Chorley, C.; Melhuish, C.; Pipe, T.; Rossiter, J. Development of a tactile sensor based on biologically inspired edge encoding. In Proceedings of the IEEE International Conference on Advanced Robotics, Munich, Germany, 22–26 June 2009. [Google Scholar]
- Ward-Cherrier, B.; Cramphorn, L.; Lepora, N.F. Tactile Manipulation With a TacThumb Integrated on the Open-Hand M2 Gripper. IEEE Robot. Autom. Lett. 2016, 1, 169–175. [Google Scholar] [CrossRef] [Green Version]
- Lepora, N.; Ward-Cherrier, B. Tactile quality control with biomimetic active touch. IEEE Robot. Autom. Lett. 2016, 1, 646–652. [Google Scholar] [CrossRef]
- Lepora, N.; Aquilina, K.; Cramphorn, L. Exploratory tactile servoing with active touch. IEEE Robot. Autom. Lett. 2017, 2, 1156–1163. [Google Scholar] [CrossRef]
- Ward-Cherrier, B.; Pestell, N.; Cramphorn, L.; Winstone, B.; Giannaccini, M.E.; Rossiter, J.; Lepora, N.F. The TacTip family: Soft optical tactile sensors with 3D-printed biomimetic morphologies. Soft Robot. 2018, 5, 216–227. [Google Scholar] [CrossRef]
- Lepora, N.F.; Church, A.; Kerckhove, C.D.; Hadsell, R.; Lloyd, J. From pixels to percepts: Highly robust edge perception and contour following using deep learning and an optical biomimetic tactile sensor. IEEE Robot. Autom. Lett. 2019, 4, 2101–2107. [Google Scholar] [CrossRef]
- Huang, I.; Liu, J.; Bajcsy, R. A Depth Camera-Based Soft Fingertip Device for Contact Region Estimation and Perception-Action Coupling. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, QC, Canada, 20–24 May 2019; pp. 8443–8449. [Google Scholar]
- Alspach, A.; Hashimoto, K.; Kuppuswamy, N.; Tedrake, R. Soft-bubble: A highly compliant dense geometry tactile sensor for robot manipulation. arXiv 2019, arXiv:1904.02252. [Google Scholar]
- Kroeger, T.; Timofte, R.; Dai, D.; Van Gool, L. Fast optical flow using dense inverse search. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016; pp. 471–488. [Google Scholar]
- Johnson, M.K.; Cole, F.; Raj, A.; Adelson, E.H. Microgeometry capture using an elastomeric sensor. ACM Trans. Graph. 2011, 30. [Google Scholar] [CrossRef] [Green Version]
- Yuan, W.; Zhu, C.; Owens, A.; Srinivasan, M.A.; Adelson, E.H. Shape-independent Hardness Estimation Using Deep Learning and a GelSight Tactile Sensor. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 951–958. [Google Scholar]
- Tian, S.; Ebert, F.; Jayaraman, D.; Mudigonda, M.; Finn, C.; Calandra, R.; Levine, S. Manipulation by Feel: Touch-Based Control with Deep Predictive Models. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, Canada, 20–24 May 2019; pp. 818–824. [Google Scholar]
- Lin, J.; Calandra, R.; Levine, S. Learning to Identify Object Instances by Touch: Tactile Recognition Via Multimodal Matching. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, Canada, 20–24 May 2019; pp. 3644–3650. [Google Scholar]
- Bauza, M.; Canal, O.; Rodriguez, A. Tactile Mapping and Localization from High-Resolution Tactile Imprints. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, Canada, 20–24 May 2019; pp. 3811–3817. [Google Scholar]
- Saga, S.; Kajimoto, H.; Tachi, S. High-resolution tactile sensor using the deformation of a reflection image. Sens. Rev. 2007, 27, 35–42. [Google Scholar] [CrossRef]
- Saga, S.; Taira, R.; Deguchi, K. Precise Shape Reconstruction by Active Pattern in Total-Internal-Reflection-Based Tactile Sensor. IEEE Trans. Haptics 2014, 7, 67–77. [Google Scholar] [CrossRef]
- Koike, M.; Saga, S.; Okatani, T.; Deguchi, K. Sensing method of total-internal-reflection-based tactile sensor. In Proceedings of the IEEE World Haptics, Istanbul, Turkey, 21–24 June 2011; pp. 615–619. [Google Scholar]
- Meyer, G.; Amer, N.M. Novel optical approach to atomic force microscopy. Appl. Phys. Lett. 1988, 53, 1045–1047. [Google Scholar] [CrossRef]
- Massig, J.H. Deformation measurement on specular surfaces by simple means. Opt. Eng. 2001, 40, 2315–2318. [Google Scholar] [CrossRef]
- Dubey, V.N.; Crowder, R.M. A dynamic tactile sensor on photoelastic effect. Sens. Actuators A Phys. 2006, 128, 217–224. [Google Scholar] [CrossRef] [Green Version]
- Hoshino, K.; Mori, D. Three-dimensional tactile sensor with thin and soft elastic body. In Proceedings of the IEEE International Conference on Advanced Robotics and Its Social Impacts, Taipei, Taiwan, 23–25 August 2008; pp. 1–6. [Google Scholar]
- Hoshino, K.; Mori, D.; Tomida, M. An Optical Tactile Sensor Assuming Cubic Polynomial Deformation of Elastic Body. J. Robot. Mechatron. 2009, 21, 781–788. [Google Scholar] [CrossRef]
- Igo, N.; Hoshino, K. Small optical tactile sensor for robots. In Proceedings of the IEEE/SICE International Symposium on System Integration, Fukuoka, Japan, 16–18 December 2012; pp. 746–751. [Google Scholar]
- Xie, H.; Jiang, A.; Seneviratne, L.; Althoefer, K. Pixel-based optical fiber tactile force sensor for robot manipulation. In Proceedings of the IEEE Sensors 2012, Taipei, Taiwan, 28–31 October 2012. [Google Scholar]
- Li, W.; Konstantinova, J.; Noh, Y.; Alomainy, A.; Althoefer, K. Camera-Based Force and Tactile Sensor. In Towards Autonomous Robotic Systems; TAROS 2018. Lecture Notes in Computer Science; Giuliani, M., Assaf, T., Giannaccini, M., Eds.; Springer: Cham, Germany, 2018; Volume 10965. [Google Scholar]
- Lin, X.; Wiertlewski, M. Sensing the Frictional State of a Robotic Skin via Subtractive Color Mixing. IEEE Robot. Autom. Lett. 2019, 4, 2386–2392. [Google Scholar] [CrossRef]
- Kappassov, Z.; Baimukashev, D.; Kuanyshuly, Z.; Massalin, Y.; Urazbayev, A.; Varol, H.A. Color-Coded Fiber-Optic Tactile Sensor for an Elastomeric Robot Skin. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, QC, Canada, 20–24 May 2019; pp. 2146–2152. [Google Scholar]
- Kaneko, M.; Nanayama, N.; Tsuji, T. Vision-based active sensor using a flexible beam. IEEE/ASME Trans. Mechatron. 2001, 6, 7–16. [Google Scholar] [CrossRef]
- Lepora, N.F.; Pearson, M.; Cramphorn, L. TacWhiskers: Biomimetic optical tactile whiskered robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 7628–7634. [Google Scholar]
- Shimizu, K.; Kim, Y.; Nagai, C.; Obinata, G. Omnidirectional vision-based tactile sensor implementable in the middle of a link mechanism. In Proceedings of the SICE Annual Conference, Akita, Japan, 20–23 August 2012; pp. 19–22. [Google Scholar]
- Winstone, B.; Melhuish, C.; Pipe, T.; Callaway, M.; Dogramadzi, S. Toward Bio-Inspired Tactile Sensing Capsule Endoscopy for Detection of Submucosal Tumors. IEEE Sens. J. 2017, 17, 848–857. [Google Scholar] [CrossRef]
- Duong, L.V.; Asahina, R.; Wang, J.; Ho, V.A. Development of a Vision-Based Soft Tactile Muscularis. In Proceedings of the 2nd IEEE International Conference on Soft Robotics, Seoul, Korea, 14–18 April 2019; pp. 343–348. [Google Scholar]
- Fearing, R.S. Using a Cylindrical Tactile Sensor for Determining Curvature. IEEE Trans. Robot. Autom. 1991, 7, 806–817. [Google Scholar] [CrossRef]
- Yamazawa, K.; Yagi, Y.; Yachida, M. Omnidirectional imaging with hyperboloidal projection. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Yokohama, Japan, 26–30 July 1993; pp. 1029–1034. [Google Scholar]
- Donlon, E.; Dong, S.; Liu, M.; Li, J.; Adelson, E.; Rodriguez, A. GelSlim: A high-resolution, compact, robust, and calibrated tactile-sensing finger. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 1927–1934. [Google Scholar]
- Ma, D.; Donlon, E.; Dong, S.; Rodriguez, A. Dense Tactile Force Distribution Estimation using GelSlim and inverse FEM. arXiv 2018, arXiv:1810.04621. [Google Scholar]
- Tanida, J.; Kumagai, T.; Yamada, K.; Miyatake, S.; Ishida, K.; Morimoto, T.; Kondou, N.; Miyazaki, D.; Ichioka, Y. Thin observation module by bound optics (TOMBO): Concept and experimental verification. Appl. Opt. 2001, 40, 1806–1813. [Google Scholar] [CrossRef]
- Shogenji, R.; Kitamura, Y.; Yamada, K.; Miyatake, S.; Tanida, J. Multispectral imaging using compact compound optics. Optics Express 2004, 12, 1643–1655. [Google Scholar] [CrossRef]
- Fujimoto, J.; Mizuuchi, I.; Sodeyama, Y.; Yamamoto, K.; Muramatsu, N.; Ohta, S.; Hirose, T.; Hongo, K.; Okada, K.; Inaba, M. Picking up dishes based on active groping with multisensory robot hand. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 220–225. [Google Scholar]
- Koyama, K.; Hasegawa, H.; Suzuki, Y.; Ming, A.; Shimojo, M. Preshaping for various objects by the robot hand equipped with resistor network structure proximity sensors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 4027–4033. [Google Scholar]
- Sato, K.; Shinoda, H.; Tachi, S. Vision-based cutaneous sensor to measure both tactile and thermal information for telexistence. In Proceedings of the IEEE International Symposium on VR Innovation, Singapore, 19–20 March 2011; pp. 119–122. [Google Scholar]
- Sato, K.; Shinoda, H.; Tachi, S. Finger-shaped thermal sensor using thermo-sensitive paint and camera for telexistence. In Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 1120–1125. [Google Scholar]
- Soter, G.; Conn, A.; Hauser, H.; Lepora, N.F.; Rossiter, J. MultiTip: A multimodal mechano-thermal soft fingertip. In Proceedings of the IEEE International Conference on Soft Robotics, Livorno, Italy, 24–28 April 2018; pp. 239–244. [Google Scholar]
- Lichtsteiner, P.; Posch, C.; Delbruck, T. A 128 × 128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE J. Solid-State Circuits 2008, 43, 566–576. [Google Scholar] [CrossRef]
- Kumagai, K.; Shimonomura, K. Event-based Tactile Image Sensor for Detecting Spatio-Temporal Fast Phenomena in Contacts. In Proceedings of the IEEE World Haptics Conference, Tokyo, Japan, 9–12 July 2019. [Google Scholar]
- Rigi, A.; Naeini, F.B.; Makris, D.; Zweiri, Y. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS). Sensors 2018, 18, 333. [Google Scholar] [CrossRef]
- Naeini, F.B.; Alali, A.; Al-Husari, R.; Rigi, A.; AlSharman, M.K.; Makris, D.; Zweiri, Y. A Novel Dynamic-Vision-Based Approach for Tactile Sensing Applications. IEEE Trans. Instrum. Meas. 2019. [Google Scholar] [CrossRef]
- Izatt, G.; Mirano, G.; Adelson, E.; Tedrake, R. Tracking Objects with Point Clouds from Vision and Touch. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 4000–4007. [Google Scholar]
- Luo, S.; Yuan, W.; Adelson, E.; Cohn, A.G.; Fuentes, R. ViTac: Feature Sharing Between Vision and Tactile Sensing for Cloth Texture Recognition. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, Australia, 21–25 May 2018; pp. 2722–2727. [Google Scholar]
- Wang, S.; Wu, J.; Sun, X.; Yuan, W.; Freeman, W.; Tenenbaum, J.; Adelson, E. 3D shape perception from monocular vision, touch, and shape priors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 1606–1613. [Google Scholar]
- Calandra, R.; Owens, A.; Jayaraman, D.; Lin, J.; Yuan, W.; Malik, J.; Adelson, E.; Levine, S. More than a feeling: Learning to grasp and regrasp using vision and touch. IEEE Robot. Autom. Lett. 2018, 3, 3300–3307. [Google Scholar] [CrossRef]
Method | Light Conductive Plate | Marker Displacement | Reflective Membrane |
---|---|---|---|
Physical quantity to be changed by physical contact | Refractive index of the material that makes contact with the light conductive plate (Equations (1)–(3)) | Marker position (reflecting a geometrical change of flexible material) (Equation (4)) | Surface gradient (reflecting geometrical change of flexible material) (Equation (5)) |
Typical measurable quantities | Contact position and area [19,24,25]/force ** [19,24] | Contact position [26,27,28]/force [29,30,31]/shear [27,29,30,31,32,33]/torque [31]/slip [32,34,35] | Contour and surface texture of contact object [36]/position and orientation [37,38] |
Typical post image processing | Thresholding [19,39] | Tracking [29,31]/optical flow computation [26] | Pattern matching [37,38] |
Advantages | High spatial resolution (at the imager’s resolution) */easy to detect contact (a simple pixel-value change suffices) | Easy to make/no special lighting arrangement is required/arbitrary sensor-skin shapes are possible | High spatial resolution/can capture fine surface texture (such as a fingerprint) |
Disadvantages | The response depends on the optical characteristics of the object */lower spatial resolution when an elastomer cover is used ** | Measurement points are fixed by the marker layout (but can be improved via post-processing [26]) | Does not respond to objects without edges or texture (e.g., a flat object larger than the sensing area) |
Remarks | The direct method does not require a deformable sensor skin *, but indirect methods rely on the deformability of an elastomer cover ** [19] | Can be combined with the reflective membrane method [37,40,41] | Three-dimensional shape reconstruction is possible based on photometric stereo [42] or with sensor movement [43] |
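For the light conductive plate method, the table lists simple thresholding as the typical post-processing: contact appears as scattered light, so pixels that brighten relative to a no-contact reference frame mark the contact region. The sketch below illustrates this idea with NumPy on synthetic data; the function name, threshold value, and image sizes are illustrative assumptions, not part of any published sensor pipeline.

```python
import numpy as np

def detect_contact(background, frame, threshold=30):
    """Detect a contact region by thresholding the difference between a
    no-contact reference image and the current camera frame.

    background, frame: 2-D uint8 grayscale arrays of equal shape.
    Returns a boolean contact mask and the (row, col) centroid of the
    contact region, or None for the centroid if nothing is detected.
    """
    # Signed difference avoids uint8 wrap-around before taking |.|
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = diff > threshold  # pixels where scattered light appears
    if not mask.any():
        return mask, None
    rows, cols = np.nonzero(mask)
    return mask, (rows.mean(), cols.mean())

# Synthetic example: dark reference image, and a bright 10x10 patch
# where contact breaks total internal reflection and scatters light.
bg = np.zeros((64, 64), dtype=np.uint8)
frame = bg.copy()
frame[20:30, 40:50] = 200
mask, centroid = detect_contact(bg, frame)
print(mask.sum(), centroid)  # 100 contact pixels, centroid (24.5, 44.5)
```

Real sensors typically add a noise-robust step (e.g., morphological filtering or a minimum-area criterion) before reporting contact, since single-pixel intensity changes can be caused by sensor noise rather than touch.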
© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Shimonomura, K. Tactile Image Sensors Employing Camera: A Review. Sensors 2019, 19, 3933. https://doi.org/10.3390/s19183933