Sediment Sampling in Estuarine Mudflats with an Aerial-Ground Robotic Team
Figure 1. The robotic team in the field trials. The ground vehicle is retrofitted with the drilling tool, which is reloaded by a six-DOF arm. The aerial vehicle is equipped with a high-resolution camera whose pose is controlled with an active gimbal. (a) The ground vehicle over irregular terrain; a passive longitudinal axle allows the robot to follow the terrain's unevenness. (b) The ground vehicle traveling over muddy terrain. (c) The ground vehicle over terrain covered with seaweed, small stones and clam shells. (d) The ground vehicle traveling over water after collecting three samples, with the aerial vehicle surveying the field trials area.
Figure 2. The ground vehicle drilling the terrain side by side with a human expert handling a conventional manual tool. The human expert hammers an acrylic hollow cylinder until the desired depth is reached; the cylinder must then be pulled out gently to ensure that the sediments do not fall. This process is time-consuming and physically demanding. Conversely, the robot drills effortlessly and stores up to nine samples before it needs to return to the base.
Figure 3. Diagram representing the sediment sampling mission workflow.
Figure 4. The Unmanned Ground Vehicle (UGV). (a) An exploded CAD view of the UGV; (b) the UGV turning around its geometric center.
Figure 5. The drilling and dredging tools. (a) A hollow tube being attached to the drilling tool; (b) the robotic arm grasping a hollow tube to insert it into the drilling tool; (c) a dredger being pushed downwards by the dredging tool; (d) the hollow tube's tip, designed to prevent the core from slipping during extraction; (e) a sediment sample laid on the PVC half-tube after extraction.
Figure 6. The UGV's modular hardware architecture. Dashed arrows represent power supply connections. Solid arrows represent device-to-device intra-robot wired communication links (e.g., RS232, FireWire, CAN, USB, Ethernet). The UGV communications hardware block abstracts the components for wireless communications, whose details are discussed in Section 5.2.
Figure 7. Motions attainable by the UGV in two different locomotion modes. (a) The orange arrow depicts the motion the robot can attain when turning on the spot; (b) the green arrows depict the motions the robot can perform with double-Ackermann locomotion.
Figure 8. The UAV's hardware architecture. Orange arrows represent serial connections between low-level and high-level control boards, sensors and actuators. The UAV communications hardware block abstracts the components for wireless communications, whose details are discussed in Section 5.2. This block communicates with the high-level and low-level boards through Ethernet and PPM connections, respectively, represented with blue arrows.
Figure 9. The UAV's software architecture. The rounded rectangles depict low-level actuators and sensors, while the others depict software modules. Red arrows depict messages with geo-referenced information: the dashed line depicts the points that form the aerial survey area and the solid line the next point for the UAV. Green arrows depict the images exchanged between the software modules: the solid line is the camera's raw data and the dashed line the images used for the aerial mosaic. Orange arrows depict action messages, and black arrows the sensor and actuator streams. The blue arrow depicts the actions sent to the UAV's navigation module in order to perform the aerial survey.
Figure 10. The aerial mosaic creation process. (a) The expert defines the survey area by setting three geo-referenced points in the GUI (depicted in orange); (b) a set of waypoints (in yellow) is automatically generated to ensure that the UAV captures key frames suitable for the registration process; (c) satellite imagery of the site with the UAV's executed path overlaid; this path was generated by following several waypoints in sequence, and the UAV's starting spot is depicted by the red circle overlay. (d) The resulting aerial panorama, with the UAV's starting point depicted by the red circle overlay.
Figure 11. Human-robot interaction devices and the communication channels underlying teamwork.
Figure 12. Communications hardware architecture for both the UAV and UGV. Black arrows represent the wireless connections between interaction devices (e.g., remote RF controllers), the base station, the UAV and the UGV. Blue arrows represent the wired connections between the UAV's and UGV's communications and control hardware (see Figure 6 and Figure 8).
Figure 13. The diagnostic tool. (a) Graphical summary of the system's health; (b) detailed diagnostic information; in this case, the system is reporting an issue related to one of the antennas.
Figure 14. Mission control graphical user interfaces. (a) UGV's tele-operation view; (b) UAV's aerial survey configuration view. The geo-referenced points that define the area of interest are depicted in orange, while the waypoints the UAV must follow during the survey are in yellow.
Figure 15. Mission preparation interaction pattern.
Figure 16. Mission execution interaction pattern.
Figure 17. The field trials sites. (a) Satellite image covering the field trials sites; the circles represent the two sampling sites, A and B, visited during the field trials along the longitudinal transect, and the filled circles represent the locations of the base station at both sampling sites. (b) A close-up perspective over site A; (c) a close-up perspective over site B. The labels in (b,c) represent sampling locations. The satellite images were acquired during high tide, which explains why the overlaid sampling sites appear over water regions. Map data © 2016 Google.
Figure 18. Monitoring mission execution with the aerial vehicle.
Figure 19. Snapshots of the sampling procedure. (a–f) The robot moving the sampling tube from the storage area to the drilling tool; (g–j) the robot retrieving the core using the drilling tool; (k,l) the robot returning the sampling tube to the storage area.
Figure 20. The ground vehicle during two sampling runs. (a–d) Sampling process at location B2; (e–h) sampling process at location B3.
Figure 21. Sample collected by the robotic system being post-processed in the lab. (a) The sample in its PVC case; (b) profile layers are kept for further processing.
Abstract
1. Introduction
2. Related Work
3. Experimental Protocol for Robotic Sampling of Estuarine Mudflats
3.1. Case Study
3.2. Mission Workflow
- The human-robot team reaches the operations site and the robots are unloaded from the transportation vehicle. Based on satellite imagery of the site, the human expert defines the workspace boundaries, which the robotic system will use to constrain its operation range.
- The aerial robot takes off and performs a scan covering the workspace defined by the expert. As a result of this scanning procedure, the UAV builds a high-resolution geo-referenced mosaic from a set of mutually registered aerial images (see the waypoint-generation sketch after this list). The mosaic is then presented to the expert, who can now discard the satellite imagery, as it is most likely outdated.
- At the base station, the expert segments potential key features of the environment, such as water ponds and channels, sea grass coverage, salt marsh vertical vegetation, sand banks and all sorts of physical obstacles to the UGV's navigation. Then, based on this metadata, the expert specifies a set of transects to be sampled by the robot (see Section 3.3).
- With the information collected in the previous step, the expert tele-operates the ground vehicle to traverse the transects and periodically sample the terrain while avoiding any hazards in its way. At each sampling point, the expert requests the ground vehicle to perform a sediment sampling behavior. If desired, the aerial vehicle provides aerial images, augmenting the operator's perception of the mission execution.
- When the ground vehicle has either filled its sample containers or visited all of the defined sampling locations, the expert tele-operates the robot back to the base station.
- Back at the base station, the expert unloads the sample containers into isothermal boxes with cooling pads, which are subsequently brought to the lab for post-processing. If the mission is not complete, the expert loads the ground vehicle with empty sample containers and resumes the mission (returning to the previous step).
- Once the mission at the current site is complete, the expert washes the ground vehicle with fresh water to remove dirt and salt residues. The expert may then carry out maintenance procedures, such as recharging batteries or re-inflating tires. Finally, the human-robot team leaves the operations site. The complete loop is summarized in the state-machine sketch below.
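The workflow above can be summarized as a small supervisory state machine. The following is a minimal sketch assuming a single supervisor that tracks container and mission status; the phase names and transition conditions are our own shorthand for the listed steps, not the authors' implementation.

```python
from enum import Enum, auto

class Phase(Enum):
    """Mission phases mirroring the workflow steps above (illustrative names)."""
    SETUP = auto()            # unload robots, define workspace boundaries
    AERIAL_SURVEY = auto()    # UAV scan builds the geo-referenced mosaic
    PLANNING = auto()         # expert segments features and defines transects
    SAMPLING = auto()         # tele-operated traversal and periodic drilling
    RETURN_TO_BASE = auto()   # containers full or all locations visited
    UNLOAD = auto()           # samples moved into isothermal boxes
    WRAP_UP = auto()          # washing, maintenance, departure

def next_phase(phase: Phase, containers_full: bool,
               all_locations_visited: bool, mission_complete: bool) -> Phase:
    """Advance the supervisory state machine by one step."""
    if phase is Phase.SETUP:
        return Phase.AERIAL_SURVEY
    if phase is Phase.AERIAL_SURVEY:
        return Phase.PLANNING
    if phase is Phase.PLANNING:
        return Phase.SAMPLING
    if phase is Phase.SAMPLING:
        if containers_full or all_locations_visited:
            return Phase.RETURN_TO_BASE
        return Phase.SAMPLING          # keep sampling along the transects
    if phase is Phase.RETURN_TO_BASE:
        return Phase.UNLOAD
    if phase is Phase.UNLOAD:
        # resume with empty containers unless all transects are done
        return Phase.WRAP_UP if mission_complete else Phase.SAMPLING
    return Phase.WRAP_UP
```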
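Regarding the aerial scan in step 2 (and the automatic waypoint generation shown in Figure 10b), a plausible way to derive survey waypoints from the expert-defined area is a boustrophedon ("lawnmower") sweep whose line spacing guarantees enough image overlap for mosaic registration. The sketch below assumes the three GUI points span a parallelogram expressed in a local metric frame; the function name, parameters and default 70% overlap are illustrative assumptions, not the paper's actual algorithm.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # local metric coordinates, e.g., metres east/north

def survey_waypoints(corner_a: Point, corner_b: Point, corner_d: Point,
                     altitude_m: float, fov_deg: float,
                     overlap: float = 0.7) -> List[Point]:
    """Boustrophedon waypoints over the parallelogram spanned by corner_a,
    corner_b (end of the sweep direction) and corner_d (advance direction).
    Line spacing is chosen so consecutive camera strips overlap by `overlap`."""
    footprint = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    spacing = footprint * (1.0 - overlap)      # distance between sweep lines
    ab = (corner_b[0] - corner_a[0], corner_b[1] - corner_a[1])  # sweep edge
    ad = (corner_d[0] - corner_a[0], corner_d[1] - corner_a[1])  # advance edge
    width = math.hypot(ad[0], ad[1])
    n_lines = max(2, int(math.ceil(width / spacing)) + 1)
    waypoints: List[Point] = []
    for i in range(n_lines):
        t = i / (n_lines - 1)
        start = (corner_a[0] + t * ad[0], corner_a[1] + t * ad[1])
        end = (start[0] + ab[0], start[1] + ab[1])
        # alternate sweep direction on every other line to minimise travel
        waypoints.extend([start, end] if i % 2 == 0 else [end, start])
    return waypoints

# Example: an 80 m x 50 m area surveyed from 30 m with a 60-degree field of view
wps = survey_waypoints((0.0, 0.0), (80.0, 0.0), (0.0, 50.0),
                       altitude_m=30.0, fov_deg=60.0)
```

Flying the returned list in order yields a serpentine path akin to the one overlaid in Figure 10c.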
3.3. Sampling Procedure
3.4. Sampling Processing
4. The Robotic Team
4.1. The Unmanned Ground Vehicle
4.1.1. Mechanical Hardware
4.1.2. Electronics Hardware
4.1.3. Control Software
4.2. The Unmanned Aerial Vehicle
4.2.1. Mechanical, Hardware and Control System
4.2.2. Flight Behavior and Mosaic Creation
5. Human-Robot Teamwork
5.1. Human-Robot Interaction Devices
5.2. Communications Infrastructure
System’s Health Supervision
5.3. Mission Graphical User Interface
5.4. Human-Robot Interaction Patterns
6. Field Trials
6.1. Sampling Locations Selection
6.2. Logistics
6.3. Testing Ground Mobility
6.4. Drilling Robustness
6.5. Drilling Performance
7. Conclusions
Supplementary Materials
Acknowledgments
Author Contributions
Conflicts of Interest
Abbreviations
GUI | Graphical User Interface
UAV | Unmanned Aerial Vehicle
UGV | Unmanned Ground Vehicle
GPS | Global Positioning System
References
[Table: Area | Drilling Phase (Linear Actuator, Rotation Actuator) | Extraction Phase (Linear Actuator, Rotation Actuator); the numeric cell values were not preserved in this extraction, apart from N/A entries.]
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).