An immersive American Sign Language (ASL) hand pose recognition prototype built with SwiftUI, RealityKit, and ARKit Hand Tracking.
The project visualizes hand joints in 3D and recognizes basic ASL letters when a pose is held steadily.
ASLProject uses Apple’s visionOS hand tracking to:
- Track left and right hands in real time
- Visualize hand joints as 3D spheres
- Detect finger extension and thumb contact
- Recognize ASL letters based on hand pose
- Provide visual feedback through color changes
- Uses `ARKitSession` with `HandTrackingProvider`
- Updates all `HandSkeleton.JointName` joints every frame
- Supports both left and right hands
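A minimal sketch of what this tracking loop can look like on visionOS, using `ARKitSession`, `HandTrackingProvider`, and `HandSkeleton.JointName`; the function name and the way joint transforms are consumed are illustrative, not the project's exact code:

```swift
import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

// Run the session and consume hand-anchor updates as they arrive.
func startTracking() async {
    do {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked,
                  let skeleton = anchor.handSkeleton else { continue }
            // anchor.chirality distinguishes the left and right hand.
            // Every joint is visited each frame.
            for jointName in HandSkeleton.JointName.allCases {
                let joint = skeleton.joint(jointName)
                _ = joint.anchorFromJointTransform // pose relative to the hand anchor
            }
        }
    } catch {
        print("Hand tracking failed: \(error)")
    }
}
```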
For each hand:
- Finger extension is inferred from the distance between each fingertip and the wrist
- Thumb extension is handled separately
- Finger contact (e.g., thumb touching index) is detected via the distance between the two joints
- A `HandPose` structure represents the current state
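The distance heuristics above can be sketched in plain Swift using the standard library's `SIMD3<Float>`; the `HandPose` fields and the threshold values (0.12 m and 0.02 m) are illustrative assumptions, not the project's tuned numbers:

```swift
// Simplified snapshot of one hand, per the README's description.
struct HandPose {
    var extendedFingers: Set<String>   // e.g. ["index", "middle"]
    var thumbTouchesIndex: Bool
}

func distance(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float {
    let d = a - b
    return (d * d).sum().squareRoot()
}

// A finger counts as extended when its tip is far enough from the wrist.
func isFingerExtended(tip: SIMD3<Float>, wrist: SIMD3<Float>,
                      threshold: Float = 0.12) -> Bool {
    distance(tip, wrist) > threshold
}

// Thumb-to-index contact: the two tips are nearly touching.
func isThumbTouchingIndex(thumbTip: SIMD3<Float>, indexTip: SIMD3<Float>,
                          threshold: Float = 0.02) -> Bool {
    distance(thumbTip, indexTip) < threshold
}
```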
- Each ASL letter is defined as a target pose
- The current hand pose is compared to known ASL poses
- A letter is recognized only if the pose is held consistently for a short duration (temporal filtering)
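The temporal filtering step can be sketched as a small state machine: a letter is emitted only once the same candidate has been observed continuously for a hold window. The `TemporalFilter` name and the 0.5 s default are assumptions for illustration:

```swift
import Foundation

// Reports a letter only after the same candidate has been matched
// continuously for `holdDuration` seconds; any change resets the clock.
final class TemporalFilter {
    private var candidate: Character?
    private var candidateSince: TimeInterval?
    let holdDuration: TimeInterval

    init(holdDuration: TimeInterval = 0.5) {
        self.holdDuration = holdDuration
    }

    /// Feed the letter matched this frame (or nil if no pose matched).
    func update(match: Character?, at time: TimeInterval) -> Character? {
        guard let match = match else {
            candidate = nil
            candidateSince = nil
            return nil
        }
        if match != candidate {
            candidate = match
            candidateSince = time
            return nil
        }
        guard let since = candidateSince else { return nil }
        return time - since >= holdDuration ? match : nil
    }
}
```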
- 🔴 Red joints: no valid ASL pose detected
- 🟢 Green joints: ASL letter successfully recognized
Currently implemented:
- A
- B
- D
- E
- I
- L
- U
- W
- Y
Each letter is defined by:
- Which fingers are extended
- Whether the thumb is touching the index finger
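For illustration, a few of the letters encoded this way and matched by exact lookup; the type and field names are assumptions, and the extended-finger sets follow the standard ASL handshapes:

```swift
// Target pose for one letter: which fingers are extended, plus thumb contact.
struct LetterPose {
    let extendedFingers: Set<String>
    let thumbTouchesIndex: Bool
}

let letterPoses: [Character: LetterPose] = [
    "B": LetterPose(extendedFingers: ["index", "middle", "ring", "little"],
                    thumbTouchesIndex: false),
    "L": LetterPose(extendedFingers: ["thumb", "index"],
                    thumbTouchesIndex: false),
    "U": LetterPose(extendedFingers: ["index", "middle"],
                    thumbTouchesIndex: false),
    "Y": LetterPose(extendedFingers: ["thumb", "little"],
                    thumbTouchesIndex: false),
]

// Exact match of the observed hand state against the letter table.
func recognize(extended: Set<String>, thumbTouchesIndex: Bool) -> Character? {
    letterPoses.first {
        $0.value.extendedFingers == extended &&
        $0.value.thumbTouchesIndex == thumbTouchesIndex
    }?.key
}
```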