
Research Title - Remotely Piloted Aircraft: A Human Factors Investigation into Human Performance Limitations


Remotely Piloted Aircraft Systems (RPAS) have facilitated new growth in civil aviation. However, the reduced sensory cueing available to remote pilots, including auditory and visual information, has been associated with higher accident rates than those of traditional manned aircraft. RPAS are typically operated under two visual conditions: within direct visual line of sight of the remote pilot (VLOS), or beyond VLOS (BVLOS) using first-person-view imagery transmitted from onboard cameras. Under BVLOS conditions, real-time auditory feedback is not usually available. This research examined how audiovisual feedback affected RPAS operator performance. Eighteen pilots (three female) were trained to operate an RPAS and completed six navigation and twelve spotting tasks. Participant performance, in terms of timeliness, horizontal accuracy, and vertical accuracy, was examined under three visual conditions (VLOS (Control), BVLOS-Monitor, and BVLOS-Goggles), two auditory conditions (with and without auditory feedback), and two wind conditions (no wind and wind). Horizontal deviation and timeliness each improved in the BVLOS-Monitor condition, whilst auditory feedback produced nuanced effects, improving pilot performance in some conditions and degrading it in others. These results contribute to the audiovisual literature: auditory information is highly valued when the visual stimulus lacks detail; however, when the visual stimulus is sufficient, auditory information can act as noise. Within civilian operating environments, additional and dynamic sensory cueing could help remote pilots perform BVLOS tasks, including medical deliveries, search and rescue, and environmental assessments, more safely. These findings support providing remote pilots with dynamic sensory cueing during RPAS operations.