Abstract
This paper addresses a three-dimensional (3D) mission-domain coverage control problem combined with camera pose control to align the camera toward specific objects of interest. We consider an unmanned ground vehicle (UGV) governed by a unicycle kinematics model, equipped with an onboard camera described by a visual perspective sensor model. Coverage control has been studied largely for planar domains, which is not sufficient for real-world UGV navigation. Furthermore, in contrast to covering points in the environment, when dealing with objects of interest it is more amenable to consider that there exist certain orientations to which the camera must align itself to properly cover the object and make 'sense' of it. Hence, we derive both a UGV coverage control law for 3D mission domains and an onboard camera pose control law that accounts for target orientation. The goal is to allow the UGV to survey each point in the environment to a preset coverage level while also steering the camera to look toward objects of interest, thereby increasing situational awareness. Analytic control laws for the UGV are formulated based on a Lyapunov error function defined on a time-varying 3D domain. L1 and L2 rotation averaging techniques are applied and analyzed for multiple target points and are coupled with a convergence function to give analytic control laws on the manifold for the camera pose. Finally, the control laws are verified and simulated in CoppeliaSim using a Python API, and the results are analyzed and discussed.
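As a rough illustration of the L2 rotation averaging mentioned above, the sketch below computes the chordal L2 mean of several rotation matrices by projecting their arithmetic mean back onto SO(3). This is a minimal, generic numerical example, not the paper's formulation: the function names and the placeholder target rotations are hypothetical, and the paper couples the averaging with a convergence function to obtain control laws on the manifold.

```python
import numpy as np

def project_to_SO3(M):
    """Nearest rotation to M in the Frobenius norm, via SVD."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

def chordal_L2_mean(rotations):
    """L2 (chordal) rotation average: minimizes the sum of squared
    Frobenius distances to the given rotations, which reduces to
    projecting their arithmetic mean onto SO(3)."""
    return project_to_SO3(np.mean(rotations, axis=0))

# Hypothetical desired camera orientations toward two objects of interest
# (placeholder values, not taken from the paper).
R_targets = [
    np.eye(3),
    np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]]),  # 90 degree rotation about the z-axis
]
print(chordal_L2_mean(R_targets))
```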