Fast 3D Rotation Estimation of Fruits Using Spheroid Models
Figure 1. Roller conveyor unit used to obtain different views of the rotated fruits.
Figure 2. Four consecutive camera frames.
Figure 3. Set of views of the fruit highlighted in Figure 2.
Figure 4. Example of a green tomato with no texture. No 3D motion can be estimated in this case.
Figure 5. Left: oblate spheroid model; center: sphere model; right: prolate spheroid model.
Figure 6. Samples of different fruit shapes. Left: oblate; center: spherical; right: prolate.
Figure 7. Relation between the variances and the semi-principal axes for an axis-aligned ellipse (circle).
Figure 8. Relation between the variances and the principal axes in the case of a rotated ellipse. λ1 and λ2 are the eigenvalues of the covariance matrix.
Figure 9. Views of an oblate fruit. The major principal axes are very similar in all views (2a ≈ 2A). The range of the minor principal axis in each view is 2B < 2b_i < 2A. The minor principal axis of the spheroid is visible in the fourth view from the left (b_4 ≈ B).
Figure 10. Sample camera view of an oblate object. The red axis is oriented along the eigenvector corresponding to the largest eigenvalue of Σ, and its length equals the major axis of the spheroid, 2A. The yellow circle is located at the center of mass of the fruit view.
Figure 11. Cross section of the fruit across the 3D plane v_1 = 0 in Figure 8. The camera position above the fruit is shown.
Figure 12. Ambiguity in the estimation of the elevation angle. The shape perceived from the camera is the same in both cases.
Figure 13. Sequence of views. Blue and red arrows indicate local minima and maxima, respectively, of the sequence B = {b_i} of semi-minor axes. If the sequence B is increasing at instant i, then θ > 0. This means (for this direction of rotation) that the part below the fruit center in the view is higher than the part above the center.
Figure 14. Search grid of rotation vectors.
Figure 15. Error map for all rotations in the grid search.
Figure 16. Local increase in resolution of the error map around the local minimum. The initial local minimum, obtained with step γ, is shown as a filled yellow circle.
Figure 17. Result of image pre-processing.
Figure 18. Area from which relevant points are obtained.
Figure 19. Top: sample images from the Fruits-360 dataset; from left to right: kiwi, peach, apple golden, coconut, and watermelon. Bottom: sample images from the FruitRot3D dataset; from left to right: orange, tomato, and mandarin.
Figure 20. Sequence of estimated rotations for the coconut sequence with Δn = 20.
Figure 21. Mean rotation as a function of Δn for different fruit types.
Figure 22. Mean rotation speed as a function of Δn for different fruit types.
Figure 23. Standard deviation of rotations as a function of Δn for different fruit types.
Figure 24. Interface used to annotate ground truth for estimating the reprojection error. The user is asked to select corresponding points in both views.
Figure 25. Example of point tracking for two different tomatoes. The initial tracked point is shown in white; green circles indicate predicted visible positions. The geometry model is oblate in this case. The sequence has been truncated to the views where the tracked point remains visible.
Figure 26. Example of point tracking for two different oranges. The geometry model in this case is a sphere. The sequence has been truncated to the views where the tracked point remains visible.
Figure 27. Example of point tracking for three different mandarins. Dark circles indicate predicted occluded positions of the initial point. The third row shows the same fruit as the second, but a different point is tracked. Oblate geometry has been used.
Figure 28. Examples of point tracking of fruits in the Fruits-360 dataset.
Abstract
1. Introduction
- Many views of the fruit do not guarantee that the whole surface has been observed. Therefore, a method is needed to assess which fraction has been viewed.
- To prevent multiple counting of defects, it is necessary to match points in different views.
2. Related Work
- Capturing multiple images from different views by using multiple cameras.
- Using a single camera with several helping mirrors.
- Rotating the fruits using rollers or robot hands.
3. Materials and Methods
3.1. Modeling the 3D Shape of the Fruits
3.1.1. Principal Axes of the Projected Ellipses
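The captions of Figures 7, 8 and 10 describe how each view is summarized: the center of mass of the foreground pixels and the eigen-decomposition of their covariance matrix Σ give the center, orientation and semi-principal axes of the projected ellipse. A minimal NumPy sketch of that computation is given below; the function name and the factor 2√λ (exact for a uniformly filled ellipse) are assumptions rather than details taken from the paper.

```python
import numpy as np

def view_ellipse_axes(mask):
    """Estimate the principal axes of the projected ellipse of one view.

    mask: 2D boolean array, True for fruit (foreground) pixels.
    Returns (a, b, v1, center): semi-major / semi-minor axes in pixels, the
    unit eigenvector of the major axis, and the center of mass.
    Assumes a uniformly filled ellipse, so semi-axis = 2 * sqrt(eigenvalue).
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    center = pts.mean(axis=0)              # yellow circle in Figure 10
    cov = np.cov((pts - center).T)         # 2x2 covariance matrix (Sigma)
    lam, vec = np.linalg.eigh(cov)         # eigenvalues in ascending order
    b, a = 2.0 * np.sqrt(lam)              # semi-minor, semi-major axes
    v1 = vec[:, 1]                         # eigenvector of the largest eigenvalue (red axis)
    return a, b, v1, center
```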
3.1.2. Determination of the Spheroid Principal Axes
Spherical Model
Oblate Model
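Figure 9 suggests how the oblate spheroid semi-axes can be recovered from a sequence of views: the projected major semi-axes a_i stay close to the equatorial semi-axis A, while the minor semi-axes satisfy B < b_i < A and reach B in the view where the polar axis becomes visible. The sketch below follows that observation; it is a plausible reading of the figure, not necessarily the paper's exact procedure (the prolate case is analogous with the roles of the axes exchanged).

```python
import numpy as np

def oblate_spheroid_axes(a_seq, b_seq):
    """Estimate the semi-axes (A, B) of an oblate fruit from a sequence of views.

    a_seq, b_seq: per-view semi-major and semi-minor axes of the projected
    ellipses (e.g. from view_ellipse_axes). Following Figure 9, A is taken as
    a robust average of the nearly constant major axes, and B as the smallest
    observed minor axis, reached when the polar axis is visible.
    """
    A = float(np.median(a_seq))   # 2a_i ~ 2A in every view
    B = float(np.min(b_seq))      # B < b_i < A, with the minimum at b_i ~ B
    return A, B
```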
Prolate Model
3.1.3. Elevation Angle Estimation
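Assuming an approximately orthographic projection, the minor semi-axis observed in a view of an oblate spheroid satisfies b_i² = A² sin²θ + B² cos²θ, where θ is the elevation angle of the symmetry axis. This determines only |θ|, which is the ambiguity of Figure 12; following Figure 13, the sign can be taken from whether the sequence of minor axes {b_i} is increasing or decreasing. The orthographic assumption and the sign convention in the sketch below are assumptions, not statements of the paper.

```python
import numpy as np

def elevation_angle(b_i, b_prev, A, B):
    """Elevation angle of the spheroid symmetry axis in view i (oblate case, A > B).

    Assumes an approximately orthographic projection, so that
    b_i**2 = A**2 * sin(theta)**2 + B**2 * cos(theta)**2,
    which only fixes |theta| (the ambiguity of Figure 12). The sign is taken
    from the trend of the minor-axis sequence (Figure 13): an increasing
    sequence gives theta > 0.
    """
    ratio = np.clip((b_i**2 - B**2) / (A**2 - B**2), 0.0, 1.0)
    theta = float(np.arcsin(np.sqrt(ratio)))   # |theta| in [0, pi/2]
    return theta if b_i >= b_prev else -theta  # decreasing sequence -> negative elevation
```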
3.1.4. Pixels 3D Coordinates
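Appendix A solves a quadratic equation to obtain the z-coordinate of each foreground pixel on the spheroid surface. The sketch below writes the spheroid as pᵀ M p = 1 in fruit-centered coordinates, with M = R·diag(1/A², 1/A², 1/B²)·Rᵀ and the z axis assumed to point towards the camera, and keeps the root closest to the camera; these conventions are assumptions.

```python
import numpy as np

def pixel_z_on_spheroid(x, y, M):
    """Z-coordinate of a pixel lying on the spheroid surface.

    (x, y): pixel coordinates relative to the fruit center (orthographic camera).
    M: 3x3 symmetric quadric matrix of the spheroid, p.T @ M @ p = 1, expressed
    in the (assumed) camera frame with z pointing towards the camera.
    Returns None when the pixel falls outside the projected contour.
    """
    a = M[2, 2]
    b = 2.0 * (M[0, 2] * x + M[1, 2] * y)
    c = M[0, 0] * x**2 + 2.0 * M[0, 1] * x * y + M[1, 1] * y**2 - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # outside the spheroid silhouette
    # The visible surface point is the root closest to the camera (largest z).
    return (-b + np.sqrt(disc)) / (2.0 * a)
```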
3.2. 3D Rotation Estimation
- If the transformed point is not visible in the target image, it is ignored in the similarity computation.
- Otherwise, the pair of corresponding points is added to a list of valid relevant points.
- New intermediate rotations, sampled with a finer step, are computed around the local minimum found on the coarse grid. The new sampled rotations are shown as empty circles in Figure 16. Then, the minimum on this denser subgrid is found.
- A parabola is fitted locally around the new minimum, and the final rotation is taken as the position of the parabola minimum. This idea is similar to that proposed by Lowe for local extrema detection in SIFT [34]. More details about this parabolic refinement are given in Appendix B; a small sketch of the idea follows this list.
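The sketch below illustrates the refinement of the last two steps: after the denser subgrid search, a parabola through three error samples along each axis of the rotation vector gives a sub-grid estimate of the minimum, in the spirit of Lowe's extremum interpolation [34]. The per-axis fit and the function names are assumptions; Appendix B gives the paper's actual derivation.

```python
import numpy as np

def parabolic_refine_1d(e_minus, e_0, e_plus, step):
    """Offset of the vertex of a parabola through three error samples.

    e_minus, e_0, e_plus: error values at positions -step, 0, +step around the
    discrete minimum. Returns the vertex offset, clipped to half a step.
    """
    denom = e_minus - 2.0 * e_0 + e_plus
    if denom <= 0.0:
        return 0.0                       # degenerate / non-convex fit
    offset = 0.5 * step * (e_minus - e_plus) / denom
    return float(np.clip(offset, -0.5 * step, 0.5 * step))

def refine_rotation(error, r_min, step):
    """Refine the rotation vector found on the dense subgrid.

    error(r): similarity error for a rotation vector r (axis * angle).
    r_min: rotation vector of the subgrid minimum; step: subgrid spacing.
    A separate parabola per axis is an assumption; the paper may fit a full
    3D quadric instead.
    """
    r0 = np.asarray(r_min, dtype=float)
    r = r0.copy()
    e0 = error(r0)
    for k in range(3):
        d = np.zeros(3)
        d[k] = step
        r[k] += parabolic_refine_1d(error(r0 - d), e0, error(r0 + d), step)
    return r
```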
3.3. Implementation Details
3.3.1. Image Pre-Processing
- Reduce the number of color channels. Unlike many approaches, this step is accomplished by simply taking the green component. Compared with standard RGB to luminance conversion, taking the green component is computationally free.
- Reduce the image resolution. In our experiments, we use a downsampling factor of 4 in both axes. To mitigate aliasing, each pixel of the downsized image is computed using the average of the corresponding block in the original image.
- High-pass filtering. This step is performed by computing the signed difference between the downsampled image and a Gaussian-blurred version of it. The resulting image is zero at smooth portions of the fruit and exhibits large positive or negative values at details or texture. A fast recursive and separable implementation of the Gaussian filter was used [35]. A compact sketch of the whole pre-processing pipeline is given after this list.
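The sketch below strings the three pre-processing steps together (green channel, 4× block-average downsampling, high-pass filtering). The value of the Gaussian σ is left as a placeholder parameter (the paper's exact value is not reproduced here), and scipy's gaussian_filter stands in for the recursive implementation of [35].

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(rgb, downsample=4, sigma=2.0):
    """Pre-processing sketch: green channel, block averaging, high-pass filter.

    rgb: H x W x 3 image with the background set to black.
    sigma is a placeholder; the paper uses a recursive Gaussian filter [35].
    """
    g = rgb[:, :, 1].astype(np.float32)              # keep only the green channel
    h, w = g.shape
    h, w = h - h % downsample, w - w % downsample    # crop to a multiple of the factor
    g = g[:h, :w]
    # Block average = anti-aliased downsampling by 'downsample' in both axes.
    small = g.reshape(h // downsample, downsample,
                      w // downsample, downsample).mean(axis=(1, 3))
    # Signed difference with a Gaussian-blurred version: ~0 on smooth skin,
    # large positive or negative values on texture and defects.
    return small - gaussian_filter(small, sigma)
```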
3.3.2. Selecting Relevant Points
3.4. Datasets
3.4.1. FruitRot3D Dataset
- The dataset contains three types of fruits, namely oranges, mandarins, and tomatoes.
- There are 15 fruit sequences for each fruit type.
- The length of each fruit sequence varies between 13 and 16 images.
- The 3D rotation between consecutive views of the same fruit is not constant due to slipping on the roller conveyor and irregularities in the fruit shape. Notice that not only the magnitude of the rotation can change but also its axis. The typical magnitude of the rotation is between 10 and 30 degrees.
- The Foreground/Background segmentation was automatically performed by the inspection machine. Background pixels were set to black (RGB = {0,0,0}).
- The imaged fruit diameters are in the range between 250 and 350 pixels depending on the fruit type.
- The images are stored in PNG format with lossless compression.
3.4.2. Fruits-360 Dataset
- The dataset contains more than 100 types of fruits. However, only one sequence per fruit type is available.
- If it is assumed that both the webcam frame rate (no image drops) and the motor speed are constant, then the 3D rotation between consecutive views must be constant. Under these assumptions, the rotation between consecutive images is approximately one degree.
- The images in the dataset were resized to a fixed common size of 100 × 100 pixels.
- The images were stored using JPEG lossy compression.
4. Results
4.1. Rotation Error Analysis
- The proposed method seems to work well for a relatively broad range of fruit types. The only restriction is the presence of texture and a reasonable similarity to the geometric model.
- Typically, industrial inspection machines are adjusted for rotations between consecutive views in the range 18–30 degrees. The method provides consistent rotation estimates for rotations in that range. This fact can be derived from Figure 22 and Table 2, where the average rotation speed is almost constant regardless of the value of Δn.
4.2. Reprojection Error Analysis
4.3. Point Tracking along a Sequence of Views
- No bias is observed. If the estimated rotations were biased, then a drift in the predicted position of tracked points would be observed.
- From Figure 28, it can be seen that the geometric model is relatively robust to imperfections in the foreground/background segmentation. The presence of stems (watermelon) or noisy contours (coconuts) did not affect the ability of the method to track points.
- The tracking precision is sufficient for pairing defects across views and preventing multiple counting of the same defect (a minimal sketch of the tracking procedure is given after this list).
- The method has proved its applicability to very different kinds of fruit. The only requirements are that the geometry of the fruit can be reasonably modeled by a spheroid and that the fruit skin contains enough texture variation.
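The tracking shown in Figures 25 to 28 follows from the estimated rotations: the 3D coordinates of the selected point are rotated by each estimated rotation in turn and reprojected onto the next view, and a prediction is marked occluded when the point moves to the far side of the fruit. A minimal sketch under assumed conventions (orthographic reprojection, visibility from the sign of the z-coordinate) is:

```python
import numpy as np

def track_point(p0, rotations):
    """Track a 3D surface point through a sequence of estimated rotations.

    p0: initial 3D point in fruit-centered coordinates (e.g. obtained with
    pixel_z_on_spheroid). rotations: list of 3x3 rotation matrices, one per
    consecutive pair of views. Returns, for every view, the predicted image
    position (orthographic projection, an assumption) and a visibility flag
    (z >= 0 towards the camera, also an assumption); occluded predictions
    correspond to the dark circles of Figure 27.
    """
    p = np.asarray(p0, dtype=float)
    track = [(p[:2].copy(), p[2] >= 0.0)]
    for R in rotations:
        p = R @ p                          # accumulate the estimated 3D motion
        track.append((p[:2].copy(), p[2] >= 0.0))
    return track
```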
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Solution of Quadratic Equation to Compute the Z-Coordinate of a Pixel
Appendix B. Quadratic Refinement
References
- Food and Agriculture Organization of the United Nations. Assuring Food Safety and Quality: Guidelines for Strengthening National Food Control Systems; Food and Agriculture Organization of the United Nations, World Health Organization: Geneva, Switzerland, 2003.
- Gao, H.; Zhu, F.; Cai, J. A Review of Non-destructive Detection for Fruit Quality. In Computer and Computing Technologies in Agriculture III; Li, D., Zhao, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 133–140.
- Patel, K.K.; Kar, A.; Jha, S.N.; Khan, M.A. Machine vision system: A tool for quality inspection of food and agricultural products. J. Food Sci. Technol. 2012, 49, 123–141.
- Guo, Z.; Zhang, M.; Dah-Jye, L.; Simons, T. Smart Camera for Quality Inspection and Grading of Food Products. Electronics 2020, 9, 505.
- Saldana, E.; Siche, R.; Lujan, M.; Quevedo, R. Review: Computer vision applied to the inspection and quality control of fruits and vegetables. Braz. J. Food Technol. 2013, 16, 254–272.
- Anish, P. Quality Inspection of Fruits and Vegetables using Colour Sorting in Machine Vision System: A review. Int. J. Emerg. Trends Eng. Dev. 2017, 6.
- Li, J.; Huang, W.; Zhao, C. Machine vision technology for detecting the external defects of fruits—A review. Imaging Sci. J. 2015, 63, 241–251.
- Demant, C.; Streicher-Abel, B.; Waszkewitz, P.; Strick, M.; Schmidt, G. Industrial Image Processing: Visual Quality Control in Manufacturing; Springer: Berlin/Heidelberg, Germany, 1999.
- Barjatya, A. Block matching algorithms for motion estimation. IEEE Trans. Evol. Comput. 2004, 8, 225–239.
- Cubero, S.; Aleixos, N.; Moltó, E.; Gómez-Sanchis, J.; Blasco, J. Advances in Machine Vision Applications for Automatic Inspection and Quality Evaluation of Fruits and Vegetables. Food Bioprocess Technol. 2011, 4, 487–504.
- Blasco, J.; Aleixos, N.; Cubero, S.; Gómez-Sanchís, J.; Moltó, E. Automatic sorting of satsuma (Citrus unshiu) segments using computer vision and morphological features. Comput. Electron. Agric. 2009, 66, 1–8.
- Shiraishi, Y.; Takeda, F. Proposal of whole surface inspection system by simultaneous six-image capture of prolate spheroid-shaped fruit and vegetables. In Proceedings of the 2011 Fourth International Conference on Modeling, Simulation and Applied Optimization, Kuala Lumpur, Malaysia, 19–21 April 2011; pp. 1–5.
- Zhang, C.; Zhao, C.; Huang, W.; Wang, Q.; Liu, S.; Li, J.; Guo, Z. Automatic detection of defective apples using NIR coded structured light and fast lightness correction. J. Food Eng. 2017, 203, 69–82.
- Zou, X.-B.; Zhao, J.-W.; Li, Y.X.; Holmes, M. In-line detection of apple defects using three color cameras system. Comput. Electron. Agric. 2010, 70, 129–134.
- Li, Q.; Wang, M.; Gu, W. Computer vision based system for apple surface defect detection. Comput. Electron. Agric. 2002, 36, 215–223.
- Imou, K.; Kaizu, Y.; Morita, M.; Yokoyama, S. Three-dimensional shape measurement of strawberries by volume intersection method. Trans. ASABE 2006, 49, 449–456.
- Reese, D.Y.; Lefcourt, A.M.; Kim, M.S.; Lo, Y.M. Whole Surface Image Reconstruction for Machine Vision Inspection of Fruit. In Optics for Natural Resources, Agriculture, and Foods II; Chen, Y.R., Meyer, G.E., Tu, S.I., Eds.; International Society for Optics and Photonics, SPIE: Boston, MA, USA, 2007; Volume 6761, pp. 140–148.
- Reese, D.; Lefcourt, A.M.; Kim, M.S.; Martin Lo, Y. Using parabolic mirrors for complete imaging of apple surfaces. Bioresour. Technol. 2009, 100, 4499–4506.
- Kondo, N. Robotization in fruit grading system. Sens. Instrum. Food Qual. Saf. 2009, 3, 81–87.
- Pham, Q.T.; Liou, N.S. Hyperspectral Imaging System with Rotation Platform for Investigation of Jujube Skin Defects. Appl. Sci. 2020, 10, 2851.
- Rivera, N.; Gómez-Sanchis, J.; Chanona-Pérez, J. Early detection of mechanical damage in mango using NIR hyperspectral images and machine learning. Biosyst. Eng. 2014, 122, 91.
- Mohammadi Baneh, N.; Navid, H.; Kafashan, J. Mechatronic components in apple sorting machines with computer vision. J. Food Meas. Charact. 2018, 12, 1135–1155.
- Huang, W.; Li, J.; Wang, Q.; Chen, L. Development of a multispectral imaging system for online detection of bruises on apples. J. Food Eng. 2015, 146, 62–71.
- Wang, Y.; Chen, Y. Fruit Morphological Measurement Based on Three-Dimensional Reconstruction. Agronomy 2020, 10, 455.
- Kriegman, D.J.; Ponce, J. On recognizing and positioning curved 3-D objects from image contours. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 1127–1137.
- Kalldahl, A.; Forchheimer, R.; Roivainen, P. 3D-Motion Estimation from Projections. In Applications of Digital Image Processing IX; Tescher, A.G., Ed.; International Society for Optics and Photonics, SPIE: San Diego, CA, USA, 1986; Volume 0697, pp. 301–307.
- Wijewickrema, S.N.R.; Paplinski, A.P.; Esson, C.E. Reconstruction of Spheres using Occluding Contours from Stereo Images. In Proceedings of the 18th International Conference on Pattern Recognition, Hong Kong, China, 20–24 August 2006; IEEE Computer Society: Washington, DC, USA, 2006; Volume 1, pp. 151–154.
- Fang, F.; Lee, Y.T. 3D reconstruction of polyhedral objects from single perspective projections using cubic corner. 3D Res. 2012, 3, 1.
- Hilbert, D.; Cohn-Vossen, S. Geometry and the Imagination; AMS Chelsea Publishing Series; AMS Chelsea Pub.: New York, NY, USA, 1999.
- Karl, W.; Verghese, G.; Willsky, A. Reconstructing Ellipsoids from Projections. CVGIP Graph. Model. Image Process. 1994, 56, 124–139.
- Hartley, R.I.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2000.
- Stirzaker, D. Jointly Distributed Random Variables. In Probability and Random Variables: A Beginner's Guide; Cambridge University Press: Cambridge, UK, 1999; pp. 238–308.
- Dai, J.S. Euler–Rodrigues formula variations, quaternion conjugation and intrinsic connections. Mech. Mach. Theory 2015, 92, 144–152.
- Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
- Young, I.T.; van Vliet, L.J. Recursive implementation of the Gaussian filter. Signal Process. 1995, 44, 139–151.
- Image Data Set. 2021. Available online: https://github.com/alalbiol/3d-rotation-estimation-fruits (accessed on 1 January 2021).
- Mureșan, H.; Oltean, M. Fruit recognition from images using deep learning. Acta Univ. Sapientiae Inform. 2018, 10, 26–42.
Table 1. Mean rotation (degrees) as a function of Δn for different fruit types in the Fruits-360 dataset. Rows correspond to increasing values of Δn; the geometric model used for each fruit is given in the header.

| Coconut (prolate) | Peach (oblate) | Watermelon (spherical) | Kiwi (prolate) | Apple (oblate) |
|---|---|---|---|---|
| 11.23 | 7.92 | 11.04 | 9.63 | 8.60 |
| 20.70 | 14.79 | 19.95 | 17.97 | 16.49 |
| 29.22 | 19.67 | 27.01 | 23.88 | 21.36 |
| 34.25 | 25.31 | 32.79 | 29.57 | 26.37 |
| 42.60 | 29.04 | 38.72 | 35.55 | 31.17 |
Table 2. Mean rotation speed as a function of Δn for different fruit types in the Fruits-360 dataset. Rows correspond to increasing values of Δn.

| Coconut | Peach | Watermelon | Kiwi | Apple |
|---|---|---|---|---|
| 1.40 | 0.99 | 1.38 | 1.20 | 1.08 |
| 1.38 | 0.98 | 1.33 | 1.19 | 1.10 |
| 1.46 | 0.98 | 1.35 | 1.19 | 1.06 |
| 1.37 | 1.01 | 1.31 | 1.18 | 1.05 |
| 1.42 | 0.96 | 1.29 | 1.18 | 1.03 |
Table 3. Standard deviation of the estimated rotations as a function of Δn for different fruit types in the Fruits-360 dataset. Rows correspond to increasing values of Δn.

| Coconut | Peach | Watermelon | Kiwi | Apple |
|---|---|---|---|---|
| 1.29 | 2.54 | 1.78 | 1.54 | 1.42 |
| 2.48 | 2.13 | 3.24 | 2.31 | 1.84 |
| 3.21 | 2.27 | 2.37 | 2.21 | 2.13 |
| 2.93 | 2.61 | 3.22 | 3.53 | 3.68 |
| 4.30 | 4.46 | 3.94 | 4.06 | 4.12 |
Table 4. RMS reprojection error per fruit type (FruitRot3D dataset).

| Fruit Type | RMS Error (pixels) | RMS Error / Diameter (%) |
|---|---|---|
| Oranges | 2.53 | 0.94 |
| Mandarins | 6.5 | 2.99 |
| Tomatoes | 5.2 | 3.14 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Albiol, A.; Albiol, A.; Sánchez de Merás, C. Fast 3D Rotation Estimation of Fruits Using Spheroid Models. Sensors 2021, 21, 2232. https://doi.org/10.3390/s21062232