Date of Submission


Document Type


Degree Name

Master of Science in Mechanical Engineering


Mechanical and Industrial Engineering


Dr. Eric A. Dieckman

Committee Member

Dr. Cheryl Li

Committee Member

Dr. Gokhan Egilmez


Drone detection, Drone classification, Commercial off-the-shelf (COTS) sensors, RGB camera, Thermal camera, Deep convolutional neural networks (CNNs)


Drone aircraft, Machine learning, Microphone arrays


The increasing commercial availability of mini-drones and quadrotors has led to their greater use, highlighting the need for detection and classification systems to ensure safe operation. Since 2019 alone, drones have caused serious complications, including shutting down airports [1-2], spying on individuals [3-4], and smuggling drugs and other prohibited items across borders and into prisons [5-6]. Some regulatory measures have been taken, such as mandatory registration of drones above a certain size and the establishment of no-fly zones around sensitive areas such as airports, military bases, and national parks. While commercial drone-detection systems exist [7-8], they are expensive, often unreliable, and typically rely on a single sensor. This thesis explores the practicality of using low-cost, commercial off-the-shelf (COTS) sensors and machine learning to detect and classify drones.

A red-green-blue (RGB) USB camera [9], a FLIR Lepton 3.0 thermal camera [10], a miniDSP UMA-16 acoustic microphone array [11], and a Garmin LIDAR [12] were mounted on a robotic sensor platform and integrated using a Minisforum Z83-F mini-PC (Intel Atom x5-Z8350 CPU, 4 GB RAM) to collect data from drones operating in unstructured, real-world outdoor environments. Approximately 1,000 unique measurements were taken from three mini-drones (Parrot Swing, Parrot Quadcopter, and Tello Quadcopter) using the RGB, thermal, and acoustic sensors. Deep convolutional neural networks (CNNs) based on ResNet-50 [13-14], trained to classify the drones, achieved accuracies of 96.6% on the RGB images, 82.9% on the thermal images, and 71.3% on the passive acoustic microphone array data.