- Poster, November 2024
Poster: Cooperative UAV Sensor Fusion for Precision Localization and Navigation in Load Transport
SenSys '24: Proceedings of the 22nd ACM Conference on Embedded Networked Sensor Systems, Pages 895–896, https://doi.org/10.1145/3666025.3699425
Cooperative UAV transport operations in GPS-denied environments pose significant challenges in localization, coordination, and payload stability. This paper introduces a vision-based Leader-Follower drone system using MAVROS and depth cameras for real-...
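The excerpt does not detail the controller, but the leader-follower pattern it names can be sketched roughly as below, assuming ROS 1 with rospy and MAVROS, and a hypothetical /leader/pose topic carrying the leader's pose; this is an illustrative sketch, not the paper's implementation.

```python
# Minimal leader-follower offset controller (illustrative only, not the paper's method).
# Assumes ROS 1 + MAVROS; /leader/pose is a hypothetical topic for the leader's pose.
import rospy
from geometry_msgs.msg import PoseStamped

FOLLOW_OFFSET = (-2.0, 0.0, 0.0)  # stay 2 m behind the leader in the local frame

class Follower:
    def __init__(self):
        self.leader_pose = None
        rospy.Subscriber("/leader/pose", PoseStamped, self.on_leader)
        self.setpoint_pub = rospy.Publisher(
            "/mavros/setpoint_position/local", PoseStamped, queue_size=10)

    def on_leader(self, msg):
        self.leader_pose = msg

    def spin(self):
        rate = rospy.Rate(20)  # OFFBOARD mode needs a continuous setpoint stream
        while not rospy.is_shutdown():
            if self.leader_pose is not None:
                sp = PoseStamped()
                sp.header.stamp = rospy.Time.now()
                sp.header.frame_id = "map"
                sp.pose.position.x = self.leader_pose.pose.position.x + FOLLOW_OFFSET[0]
                sp.pose.position.y = self.leader_pose.pose.position.y + FOLLOW_OFFSET[1]
                sp.pose.position.z = self.leader_pose.pose.position.z + FOLLOW_OFFSET[2]
                sp.pose.orientation = self.leader_pose.pose.orientation
                self.setpoint_pub.publish(sp)
            rate.sleep()

if __name__ == "__main__":
    rospy.init_node("follower_offset_controller")
    Follower().spin()
```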
- Poster, November 2024
Poster: Fusing radio tomography and RGB camera data for enhanced multi-person detection and tracking
SenSys '24: Proceedings of the 22nd ACM Conference on Embedded Networked Sensor Systems, Pages 891–892, https://doi.org/10.1145/3666025.3699423
This paper presents a system that fuses radio tomography data with RGB camera information for enhanced multi-person detection and tracking in indoor environments. Experiments were conducted in an irregularly shaped room with four subjects. Due to its ...
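As a rough illustration of this kind of fusion, the sketch below combines a radio-tomographic attenuation grid with camera detections projected onto the same floor grid by weighted averaging; it is a generic NumPy sketch, not the poster's system.

```python
# Illustrative fusion of a radio-tomographic image (RTI) with camera detections
# on a shared floor grid; generic sketch only.
import numpy as np

def fuse_rti_camera(rti_grid, cam_positions, grid_res=0.25, sigma=0.5, w_rti=0.5):
    """rti_grid: (H, W) attenuation map normalized to [0, 1].
    cam_positions: list of (x, y) floor coordinates (metres) of camera detections.
    Returns a fused (H, W) person-likelihood map."""
    H, W = rti_grid.shape
    ys, xs = np.mgrid[0:H, 0:W]
    cam_grid = np.zeros_like(rti_grid, dtype=float)
    for x, y in cam_positions:
        cx, cy = x / grid_res, y / grid_res          # metres -> grid cells
        d2 = (xs - cx) ** 2 + (ys - cy) ** 2
        cam_grid += np.exp(-d2 * grid_res ** 2 / (2 * sigma ** 2))
    if cam_grid.max() > 0:
        cam_grid /= cam_grid.max()
    return w_rti * rti_grid + (1.0 - w_rti) * cam_grid

# Example: a weak RTI map fused with two camera detections.
fused = fuse_rti_camera(np.random.rand(40, 40) * 0.1, [(2.0, 3.0), (6.5, 4.0)])
peak = np.unravel_index(np.argmax(fused), fused.shape)
```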
- Research article, October 2024
FARFusion V2: A Geometry-based Radar-Camera Fusion Method on the Ground for Roadside Far-Range 3D Object Detection
MM '24: Proceedings of the 32nd ACM International Conference on Multimedia, Pages 8421–8430, https://doi.org/10.1145/3664647.3681128
Fusing the data of millimeter-wave Radar sensors and high-definition cameras has emerged as a viable approach to achieving precise 3D object detection for roadside traffic surveillance. For roadside perception systems, earlier studies have pointed out ...
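The geometric association underlying radar-camera fusion of this kind can be illustrated with a simple projection of ground-plane radar returns into the image; the calibration values below are made up, and this is not FARFusion's actual method.

```python
# Generic radar-to-image projection on a known ground plane (illustrative sketch).
import numpy as np

K = np.array([[1200.0, 0.0, 960.0],      # hypothetical camera intrinsics
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # hypothetical radar->camera rotation
t = np.array([0.0, 1.5, 0.0])            # hypothetical radar->camera translation (m)

def radar_to_pixels(ranges, azimuths_rad):
    """Convert ground-plane radar returns (range, azimuth) to pixel coordinates."""
    x = ranges * np.sin(azimuths_rad)    # lateral offset
    z = ranges * np.cos(azimuths_rad)    # forward distance
    y = np.zeros_like(ranges)            # returns assumed to lie on the ground plane
    pts_radar = np.stack([x, y, z], axis=1)
    pts_cam = pts_radar @ R.T + t        # radar frame -> camera frame
    uv = pts_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]        # perspective divide

pixels = radar_to_pixels(np.array([35.0, 80.0, 150.0]),
                         np.deg2rad(np.array([-5.0, 0.0, 8.0])))
```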
- Research article, October 2024
Fit2Ear: Generating Personalized Earplugs from Smartphone Depth Camera Images
UbiComp '24: Companion of the 2024 on ACM International Joint Conference on Pervasive and Ubiquitous Computing, Pages 679–684, https://doi.org/10.1145/3675094.3680525
Earphones, due to their deep integration into daily life, have been developed for unobtrusive and ubiquitous health monitoring. These advanced algorithms rely heavily on high-quality sensing data. However, the data collected with universal ...
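A common first step in pipelines that reconstruct geometry from a smartphone depth camera is back-projecting the depth image into a point cloud with a pinhole model; the sketch below uses placeholder intrinsics and is not Fit2Ear's actual pipeline.

```python
# Back-project a depth image to a 3D point cloud (generic sketch, placeholder intrinsics).
import numpy as np

def depth_to_pointcloud(depth_m, fx, fy, cx, cy):
    """depth_m: (H, W) depth in metres; returns (N, 3) points in the camera frame."""
    H, W = depth_m.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    z = depth_m
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]            # drop invalid (zero-depth) pixels

cloud = depth_to_pointcloud(np.random.uniform(0.05, 0.3, (480, 640)),
                            fx=590.0, fy=590.0, cx=320.0, cy=240.0)
```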
- Research article, October 2024
M-MOVE-IT: Multimodal Machine Observation and Video-Enhanced Integration Tool for Data Annotation
UbiComp '24: Companion of the 2024 on ACM International Joint Conference on Pervasive and Ubiquitous Computing, Pages 911–915, https://doi.org/10.1145/3675094.3678479
M-MOVE-IT is an open-source framework that simplifies data acquisition, annotation, and AI training for wearable technology. It addresses the challenges of synchronizing video and IMU data, making it easier to develop AI models for healthcare, sports, ...
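Synchronizing video and IMU streams typically reduces to resampling IMU samples onto video frame timestamps; the sketch below does this by linear interpolation and is a generic illustration, not necessarily how M-MOVE-IT implements it.

```python
# Align IMU samples to video frame timestamps by linear interpolation (generic sketch).
import numpy as np

def align_imu_to_frames(imu_t, imu_values, frame_t):
    """imu_t: (N,) IMU timestamps (s); imu_values: (N, C) samples;
    frame_t: (M,) video frame timestamps (s). Returns (M, C) IMU values per frame."""
    imu_values = np.asarray(imu_values, dtype=float)
    return np.stack([np.interp(frame_t, imu_t, imu_values[:, c])
                     for c in range(imu_values.shape[1])], axis=1)

# Example: 100 Hz accelerometer resampled onto 30 fps frame times.
imu_t = np.arange(0.0, 10.0, 0.01)
accel = np.random.randn(imu_t.size, 3)
frame_t = np.arange(0.0, 10.0, 1.0 / 30.0)
accel_per_frame = align_imu_to_frames(imu_t, accel, frame_t)
```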
- Article, September 2024
UniMAE: Multi-modal Masked Autoencoders with Unified 3D Representation for 3D Perception in Autonomous Driving
- Research article, October 2024
AstroSLAM: Autonomous monocular navigation in the vicinity of a celestial small body—Theory and experiments
International Journal of Robotics Research (RBRS), Volume 43, Issue 11, Pages 1770–1808, https://doi.org/10.1177/02783649241234367
We propose AstroSLAM, a standalone vision-based solution for autonomous online navigation around an unknown celestial target small body. AstroSLAM is predicated on the formulation of the SLAM problem as an incrementally growing factor graph, facilitated ...
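The incrementally growing factor-graph formulation mentioned here can be illustrated with a toy pose graph, assuming GTSAM's Python bindings; this shows only the general mechanism, not AstroSLAM's actual factors (small-body dynamics, vision landmarks, etc.).

```python
# Toy incrementally-growing pose graph with GTSAM (illustrative, not AstroSLAM's graph).
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.1))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.05))

x0 = gtsam.symbol('x', 0)
graph.add(gtsam.PriorFactorPose3(x0, gtsam.Pose3(), prior_noise))
initial.insert(x0, gtsam.Pose3())

# Add odometry (between) factors as new keyframes arrive.
delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
for k in range(1, 4):
    xk_prev, xk = gtsam.symbol('x', k - 1), gtsam.symbol('x', k)
    graph.add(gtsam.BetweenFactorPose3(xk_prev, xk, delta, odom_noise))
    initial.insert(xk, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(k + 0.1, 0.05, 0.0)))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
```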
- Research article, October 2024
A mathematical characterization of minimally sufficient robot brains
- Basak Sakcak,
- Kalle G Timperi,
- Vadim Weinstein,
- Steven M LaValle,
- Jason M O’Kane,
- Michael Otte,
- Dorsa Sadigh,
- Pratap Tokekar
International Journal of Robotics Research (RBRS), Volume 43, Issue 9, Pages 1342–1362, https://doi.org/10.1177/02783649231198898
This paper addresses the lower limits of encoding and processing the information acquired through interactions between an internal system (robot algorithms or software) and an external system (robot body and its environment) in terms of action and ...
- Research article, July 2024
AONeuS: A Neural Rendering Framework for Acoustic-Optical Sensor Fusion
SIGGRAPH '24: ACM SIGGRAPH 2024 Conference Papers, Article No.: 127, Pages 1–12, https://doi.org/10.1145/3641519.3657446
Underwater perception and 3D surface reconstruction are challenging problems with broad applications in construction, security, marine archaeology, and environmental monitoring. Treacherous operating conditions, fragile surroundings, and limited ...
- Other, August 2024
The INSANE dataset: Large number of sensors for challenging UAV flights in Mars analog, outdoor, and out-/indoor transition scenarios
- Christian Brommer,
- Alessandro Fornasier,
- Martin Scheiber,
- Jeff Delaune,
- Roland Brockers,
- Jan Steinbrener,
- Stephan Weiss
International Journal of Robotics Research (RBRS), Volume 43, Issue 8, Pages 1083–1113, https://doi.org/10.1177/02783649241227245
For real-world applications, autonomous mobile robotic platforms must be capable of navigating safely in a multitude of different and dynamic environments with accurate and robust localization being a key prerequisite. To support further research in this ...
- Research article, May 2024
ER‐Mapping: An extrinsic robust coloured mapping system using residual evaluation and selection
Abstract: The colour‐enhanced point cloud map is increasingly being employed in fields such as robotics, 3D reconstruction and virtual reality. The authors propose ER‐Mapping (Extrinsic Robust coloured Mapping system using residual evaluation and ...
- Research article, May 2024
Unity is Strength? Benchmarking the Robustness of Fusion-based 3D Object Detection against Physical Sensor Attack
WWW '24: Proceedings of the ACM Web Conference 2024, Pages 3031–3042, https://doi.org/10.1145/3589334.3645612
As a safety-critical application, Autonomous Driving (AD) has received growing attention from security researchers. AD heavily relies on sensors for perception. However, sensors themselves are susceptible to various threats since they are exposed to the ...
- Short paper, June 2024
9 in 10 cameras agree: Pedestrians in front possibly endangered
AST '24: Proceedings of the 5th ACM/IEEE International Conference on Automation of Software Test (AST 2024), Pages 219–223, https://doi.org/10.1145/3644032.3644468
Modern cyber-physical systems integrate data from many sensors as a regular part of their operations. Over the years, researchers have proposed methods ranging from statistical approaches to neural networks to achieve this sensor fusion, along with high-...
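A minimal example of the statistical end of that fusion spectrum is a vote across redundant cameras, in the spirit of the paper's title; this is a toy sketch, not the authors' fusion or testing method.

```python
# Toy "vote across redundant cameras" fusion (illustrative only).
from collections import Counter

def fuse_detections(per_camera_labels, quorum=0.9):
    """per_camera_labels: one label per camera, e.g. 'pedestrian_ahead' or 'clear'.
    Returns (label, agreed), where agreed is True if at least `quorum` of the
    cameras report the winning label."""
    votes = Counter(per_camera_labels)
    label, count = votes.most_common(1)[0]
    return label, count / len(per_camera_labels) >= quorum

label, agreed = fuse_detections(["pedestrian_ahead"] * 9 + ["clear"])  # 9 in 10 agree
print(label, agreed)  # pedestrian_ahead True
```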
- Research article, January 2024
A novel multi‐model 3D object detection framework with adaptive voxel‐image feature fusion
Abstract: The multifaceted nature of sensor data has long been a hurdle for those seeking to harness its full potential in the field of 3D object detection. Although the utilisation of point clouds as input has yielded exceptional results, the challenge ...
A voxel‐based single‐shot multi‐model network for 3D object detection is introduced, namely AVIFF. The authors made some new attempts in fusing features of point cloud and image by designing the adaptive feature fusion (AFF) module and dense fusion (DF) ...
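The general idea of adaptively weighting voxel (point-cloud) features against image features can be sketched with a small gated fusion block in PyTorch; this is a generic sketch, not the AFF or DF modules defined in the paper.

```python
# Generic gated fusion of voxel and image feature maps (illustrative sketch).
import torch
import torch.nn as nn

class AdaptiveFusion(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, voxel_feat, image_feat):
        # Both inputs: (B, C, H, W) feature maps already in a common grid/BEV space.
        w = self.gate(torch.cat([voxel_feat, image_feat], dim=1))
        return w * voxel_feat + (1.0 - w) * image_feat

fused = AdaptiveFusion(64)(torch.randn(2, 64, 128, 128), torch.randn(2, 64, 128, 128))
```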
- Research article, July 2024
SensNet: An End-to-End Deep Learning-based BLE-IMU Fusion Positioning System for Industry 4.0
Procedia Computer Science (PROCS), Volume 237, Issue C, Pages 397–404, https://doi.org/10.1016/j.procs.2024.05.120
Abstract: Existing wireless-inertial fusion positioning mechanisms predominantly rely on empirical propagation models of wireless signals and are often fused by filtering algorithms such as the Kalman filter or particle filter. Despite their wide usage, ...
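The Kalman-filter style of fusion that the abstract contrasts itself with looks roughly like the constant-velocity filter below, which predicts from IMU acceleration and corrects with noisy BLE position fixes; it is a textbook baseline sketch, not SensNet itself.

```python
# Constant-velocity Kalman filter fusing IMU predictions with BLE position fixes (sketch).
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)  # state: x, y, vx, vy
B = np.array([[0.5 * dt**2, 0], [0, 0.5 * dt**2], [dt, 0], [0, dt]])             # IMU acceleration input
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)                                # BLE measures position only
Q = np.eye(4) * 0.01                                                             # process noise
R = np.eye(2) * 1.0                                                              # BLE measurement noise (m^2)

x, P = np.zeros(4), np.eye(4)

def step(accel_xy, ble_xy=None):
    """One predict step driven by IMU acceleration, plus an optional BLE update."""
    global x, P
    x = F @ x + B @ accel_xy
    P = F @ P @ F.T + Q
    if ble_xy is not None:
        y = ble_xy - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
    return x[:2]

pos = step(np.array([0.2, 0.0]), ble_xy=np.array([1.1, 0.3]))
```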
- Research article, December 2023
Data Pre-processing and Sensor-Fusion for Multivariate Statistical Process Control of an Extrusion Process
SEA4DQ 2023: Proceedings of the 3rd International Workshop on Software Engineering and AI for Data Quality in Cyber-Physical Systems/Internet of Things, Pages 9–15, https://doi.org/10.1145/3617573.3618029
In most manufacturing processes, data related to a product are collected across several process steps. Ensuring good data quality is essential for subsequent process modeling, monitoring, and control. Although data for a given process might already be ...
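A standard multivariate SPC monitor over fused sensor channels is a Hotelling's T² chart; the sketch below computes T² scores and a control limit for new observations and is a generic baseline, not the paper's specific pipeline.

```python
# Hotelling's T^2 control chart over fused multi-sensor features (generic MSPC sketch).
import numpy as np
from scipy.stats import f as f_dist

def t2_scores_and_limit(X_ref, X_new, alpha=0.01):
    """X_ref: (n, p) in-control reference data; X_new: (m, p) new observations.
    Returns (T^2 scores, upper control limit for new observations)."""
    n, p = X_ref.shape
    mu = X_ref.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
    d = X_new - mu
    scores = np.einsum('ij,jk,ik->i', d, S_inv, d)
    ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * f_dist.ppf(1 - alpha, p, n - p)
    return scores, ucl

X_ref = np.random.randn(200, 4)                              # e.g. 4 fused extrusion channels
X_new = np.random.randn(20, 4) + np.array([0, 0, 0, 1.5])    # one drifted channel
scores, ucl = t2_scores_and_limit(X_ref, X_new)
out_of_control = scores > ucl
```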
- Research article, November 2023
ViWise: Fusing Visual and Wireless Sensing Data for Trajectory Relationship Recognition
ACM Transactions on Internet of Things (TIOT), Volume 4, Issue 4, Article No.: 23, Pages 1–29, https://doi.org/10.1145/3614441
People usually form a social structure (e.g., a leader-follower, companion, or independent group) for better interactions among them and thus share similar perceptions of visible scenes and invisible wireless signals encountered while moving. Many ...
- Research article, October 2023
Wave-for-Safe: Multisensor-based Mutual Authentication for Unmanned Delivery Vehicle Services
MobiHoc '23: Proceedings of the Twenty-fourth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, Pages 230–239, https://doi.org/10.1145/3565287.3610253
In recent years, the deployment of unmanned vehicle delivery services has increased unprecedentedly, leading to a need for enhanced security due to the risk of leaving high-value packages to an unauthorized third party during pickup or delivery. ...
- Research article, October 2023
Exploring the Potential of Multimodal Emotion Recognition for Hearing-Impaired Children Using Physiological Signals and Facial Expressions
ICMI '23 Companion: Companion Publication of the 25th International Conference on Multimodal Interaction, Pages 398–405, https://doi.org/10.1145/3610661.3616240
This study proposes an approach for emotion recognition in children with hearing impairments by utilizing physiological and facial cues and fusing them using machine learning techniques. The study is a part of a child-robot interaction project to ...
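One common way to fuse physiological and facial cues with machine learning is decision-level (late) fusion of two classifiers; the sketch below averages predicted probabilities on synthetic toy data and is a generic illustration, not the study's models.

```python
# Late fusion of a physiological-signal classifier and a facial-expression classifier (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
y = rng.integers(0, 3, 300)                                  # 3 toy emotion classes
X_physio = rng.normal(size=(300, 12)) + y[:, None] * 0.3     # e.g. HR/EDA-style features
X_face = rng.normal(size=(300, 20)) + y[:, None] * 0.2       # e.g. facial-expression features

physio_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_physio[:200], y[:200])
face_clf = LogisticRegression(max_iter=1000).fit(X_face[:200], y[:200])

# Average the per-class probabilities from both modalities, then take the argmax.
proba = 0.5 * physio_clf.predict_proba(X_physio[200:]) + 0.5 * face_clf.predict_proba(X_face[200:])
fused_pred = proba.argmax(axis=1)
accuracy = (fused_pred == y[200:]).mean()
```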
- Research article, October 2023
Enhancing Transportation Mode Detection using Multi-scale Sensor Fusion and Spatial-topological Attention
UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, Pages 534–539, https://doi.org/10.1145/3594739.3610751
Mobile sensors have improved traffic prediction through transportation mode recognition. Researchers are interested in exploring mobile sensor-based recognition methods for transportation modes. The SHL Recognition Challenge is a prominent competition ...
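Attention-based sensor fusion for mode detection can be illustrated with a minimal attention pooling over per-sensor embeddings; this PyTorch sketch is generic and does not reproduce the paper's multi-scale spatial-topological architecture.

```python
# Minimal attention pooling over per-sensor embeddings for mode classification (sketch).
import torch
import torch.nn as nn

class SensorAttentionClassifier(nn.Module):
    def __init__(self, n_sensors, feat_dim, n_modes):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)       # one attention logit per sensor stream
        self.head = nn.Linear(feat_dim, n_modes)  # mode classifier on the fused feature

    def forward(self, sensor_feats):
        # sensor_feats: (B, n_sensors, feat_dim), one embedding per sensor stream
        attn = torch.softmax(self.score(sensor_feats), dim=1)   # (B, n_sensors, 1)
        fused = (attn * sensor_feats).sum(dim=1)                # weighted sum over sensors
        return self.head(fused), attn.squeeze(-1)

logits, weights = SensorAttentionClassifier(4, 32, 8)(torch.randn(16, 4, 32))
```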