Abstract:
In this study, we explore the integration of a handheld device with a ZED Mini depth sensor to produce Z-HandAR, a framework that enhances occlusion handling in handheld Augmented Reality (AR) applications. The aim is to improve the occlusion handling of conventional depth-based AR frameworks by leveraging the stereo vision technology of an RGB-D depth sensor. By combining the handheld device with the depth sensor, we gain access to accurate depth information that enables more realistic AR experiences. The setup fuses the handheld device with the depth sensor through a fusion module built on Unity Render Streaming, a WebRTC-based streaming framework, which allows the handheld device to access the frame data captured by the depth sensor in real time. We conducted a pilot test to evaluate whether this framework enhances occlusion handling and reduces flickering in handheld AR applications, comparing our results against Lightship ARDK, a state-of-the-art handheld AR framework. Our results show an improvement over Lightship ARDK's occlusion handling. This paper thus demonstrates a promising approach to improving occlusion handling in handheld AR: by leveraging the stereo vision of the depth sensor, the experiments and framework presented here could lead to significant advancements in handheld AR technology.
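To make the occlusion idea concrete, the following is a minimal Python sketch of the per-pixel depth test that depth-based occlusion relies on, written against the ZED Python SDK (pyzed). It is an illustration only, not the paper's implementation: the actual framework streams the sensor frames to the handheld device via Unity Render Streaming, and the function name occlusion_mask and the virtual_depth_mm input are hypothetical names introduced here for clarity.

import numpy as np
import pyzed.sl as sl

# Open the ZED Mini and request metric depth in millimeters.
zed = sl.Camera()
init_params = sl.InitParameters()
init_params.depth_mode = sl.DEPTH_MODE.ULTRA
init_params.coordinate_units = sl.UNIT.MILLIMETER
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open the ZED Mini")

depth_mat = sl.Mat()
runtime = sl.RuntimeParameters()

def occlusion_mask(virtual_depth_mm: np.ndarray) -> np.ndarray:
    """Return a boolean mask that is True where virtual content should be drawn,
    i.e. where the virtual fragment is closer to the camera than the real scene."""
    if zed.grab(runtime) != sl.ERROR_CODE.SUCCESS:
        # No fresh sensor frame: conservatively draw nothing.
        return np.zeros_like(virtual_depth_mm, dtype=bool)
    zed.retrieve_measure(depth_mat, sl.MEASURE.DEPTH)
    real_depth_mm = depth_mat.get_data()   # per-pixel depth from stereo matching
    valid = np.isfinite(real_depth_mm)     # ignore holes and failed stereo matches
    # Keep a virtual pixel only where it lies in front of the measured real surface;
    # everywhere else the real scene occludes it.
    return valid & (virtual_depth_mm < real_depth_mm)

In the full framework this comparison happens on the handheld device after the depth frames arrive over the WebRTC stream, but the test itself is the same: real depth versus virtual depth, resolved per pixel.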