Date of Submission

5-15-2025

Document Type

Thesis

Degree Name

Master of Science in Mechanical Engineering

Department

Mechanical and Industrial Engineering

Advisor

Cheryl Li, Ph.D.

Committee Member

Ganesh Balasubramanian, Ph.D.

Committee Member

Sumith Yesudasan, Ph.D.

Keywords

Robotic Automation, UR3e Collaborative Arms, Real-Time Dual-Arm Teleoperation, Similarity-Transform Gesture-Mapping Framework

LCSH

Robots—Dynamics

Abstract

Robotic teleoperation often relies on mode switching, scripted waypoints, or offline learning pipelines that constrain responsiveness and raise the barrier to deployment. This thesis presents a markerless, similarity-transform gesture-mapping framework that enables a single operator to drive two UR3e collaborative arms in real time using natural bimanual hand motions captured by a Leap Motion sensor. A one-shot Umeyama calibration computes a uniform-scale rigid transform between the 60 Hz Leap frame and each robot base, after which millimeter-scale palm displacements are permuted, sign-corrected, and scaled (k ≈ 1.5×10⁻³ m/mm) into meter-scale tool-center-point waypoints. The mapping, plus a five-sample moving-average filter, adds only 20 ms of processing latency; combined with a 10 Hz Real-Time Data Exchange stream and UR servo interpolation, the sensor-to-actuator delay averages 178 ms, well inside the 200 ms transparency threshold for human agency. Bench-top trials on a Windows 11 workstation demonstrate a mean positional error of 8 mm across a 0.3 m workspace, 21 ms internal pipeline latency, sub-30 ms inter-arm skew, and 0.41 s grasp-gesture latency with ±0.3 mm gripper repeatability. A layered safety architecture couples gesture-based halts, occlusion freezes, and workspace clamps with the cobot's native force limits, yielding immediate, intuitive braking without external sensors. Compared with state-of-the-art single-arm or mode-switching systems, the proposed method offers faster setup, lower cognitive load, and dual-arm synchronicity using only commodity hardware. The work establishes a reproducible software stack for rapid evaluation of gesture-controlled manipulators and outlines future extensions in six-DOF orientation, multi-sensor frustum stitching, and vibrotactile feedback.
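The per-frame mapping stage described in the abstract (axis permutation, sign correction, scaling by k ≈ 1.5×10⁻³ m/mm, and a five-sample moving average) can be sketched as follows. This is a minimal illustration only: the permutation and sign values are placeholders, since the actual Leap-to-base alignment comes from the one-shot Umeyama calibration, which is not reproduced here.

```python
from collections import deque

K = 1.5e-3                 # scale factor, m per mm (from the abstract)
PERM = (0, 2, 1)           # illustrative axis permutation, Leap -> robot base
SIGNS = (1.0, 1.0, -1.0)   # illustrative per-axis sign correction


class GestureMapper:
    """Maps Leap palm samples (mm) to smoothed TCP offsets (m)."""

    def __init__(self, window=5):
        # Fixed-length history implements the five-sample moving average.
        self.history = deque(maxlen=window)

    def map_palm(self, palm_mm):
        # Permute, sign-correct, and scale one palm sample.
        raw = tuple(SIGNS[i] * K * palm_mm[PERM[i]] for i in range(3))
        self.history.append(raw)
        # Average over up to `window` most recent samples.
        n = len(self.history)
        return tuple(sum(s[i] for s in self.history) / n for i in range(3))
```

In a teleoperation loop, each smoothed offset would be sent as a waypoint over the robot's Real-Time Data Exchange stream; the deque's `maxlen` keeps the filter O(1) per frame.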

Available for download on Sunday, May 30, 2027
