Improve local eye tracking: rotation space, axis calibration, offline bone actuation#663

Open
kidkwazine wants to merge 3 commits into BasisVR:developer from kidkwazine:eye-tracking-improvements
Conversation

@kidkwazine
Summary

  • BasisLocalEyeDriver.Initalize() captured the initial eye rotations using .rotation (world-space), but they were later used in .localRotation writes for eye tracking. This bakes the parent chain into a parent-relative value. I'm not sure how or whether this manifested in practice; I presume it wasn't very noticeable because the avatar is in T-pose when the capture happens?
  • SetEyeRotation() assumed a specific bone orientation. Hai flagged this in a comment but was waiting on a WIP normalized muscle system. As a stopgap for the local path, I integrated BasisLocalEyeDriver's existing per-eye calibration, since that already accounts for the eye bone's axis orientation during init.
  • EyeTrackingBoneActuation gated local eye tracking on IsLocal, which is only set during OnHVRReadyBothAvatarAndNetwork, so bone actuation didn't work unless you were connected or in Host Mode. I switched it to _eyeFollowDriverApplicable (set on OnHVRAvatarReady) to match how BlendshapeActuation already handles this with _isWearer.

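To illustrate the first bullet, here is a minimal Python quaternion sketch (not Basis code; all names are illustrative) showing why capturing `.rotation` (world-space) and later writing it back through `.localRotation` double-applies the parent chain, while capturing `.localRotation` round-trips cleanly:

```python
# Minimal quaternion helpers (w, x, y, z) illustrating the world-vs-local
# capture bug: Unity composes transform.rotation == parentWorld * localRotation,
# so writing a world-space capture into localRotation applies the parent twice.
import math

def qmul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def axis_angle_y(deg):
    """Quaternion for a rotation of `deg` degrees about the Y axis."""
    h = math.radians(deg) / 2
    return (math.cos(h), 0.0, math.sin(h), 0.0)

parent_world = axis_angle_y(90)      # head/parent chain rotated 90 deg about Y
eye_local    = (1.0, 0.0, 0.0, 0.0)  # eye bone at identity relative to parent

# What Unity reports as the eye's world rotation:
eye_world = qmul(parent_world, eye_local)

# Buggy capture: store .rotation (world), then write it back via .localRotation.
buggy_world = qmul(parent_world, eye_world)   # parent baked in twice: ~180 deg about Y

# Fixed capture: store .localRotation, then write it back via .localRotation.
fixed_world = qmul(parent_world, eye_local)   # matches the original eye_world
```

With an identity eye bone the buggy path ends up at twice the parent's rotation, which is only inconspicuous if the captured pose (T-pose here) is close to what gets written back.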
Changes

  • BasisLocalEyeDriver.cs:
    • Capture the initial eye rotation in local space (.localRotation) instead of world space (.rotation)
    • Refactor leftEyeInitialRotation/rightEyeInitialRotation into the EyeCalibration struct
    • Expose calLeft/calRight as public static for use by EyeTrackingBoneActuation¹
  • EyeTrackingBoneActuation.cs:
    • Replace Quaternion.Euler on the local path with axis-aware rotation via BasisLocalEyeDriver
    • Gate local eye bone rotation tracking on _eyeFollowDriverApplicable instead of IsLocal so it works without a network connection
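I don't know the exact form of BasisLocalEyeDriver's per-eye calibration, so the Python sketch below (illustrative names only, not Basis or Unity APIs) just shows the general pattern behind the Quaternion.Euler change: compose the gaze rotation with a captured rest rotation for the bone, rather than writing raw Euler angles that assume default bone axes.

```python
# Illustrative sketch: applying a gaze rotation in a bone's calibrated rest
# frame instead of assuming the bone uses default axes.
import math

def qmul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def qconj(q):
    """Conjugate (inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def axis_angle(axis, deg):
    h = math.radians(deg) / 2
    s = math.sin(h)
    return (math.cos(h), axis[0]*s, axis[1]*s, axis[2]*s)

# Hypothetical per-eye calibration: the bone's captured rest local rotation
# on a rig whose eye bone axes are rotated 90 deg about Z.
initial = axis_angle((0, 0, 1), 90)

# Gaze as pitch about X, the kind of value Quaternion.Euler(pitch, yaw, 0) builds.
gaze = axis_angle((1, 0, 0), 10)

# Naive path: write the Euler gaze straight to the bone (assumes default axes).
naive = gaze

# Axis-aware path: apply the gaze relative to the calibrated rest frame.
corrected = qmul(initial, gaze)

# Sanity check: undoing the calibration recovers the pure gaze rotation.
recovered = qmul(qconj(initial), corrected)
```

On a rig with non-default eye bone axes, `naive` and `corrected` differ; the calibration term is what absorbs the rig-specific orientation.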

I smoke-tested with a few avatars and an iPhone app that sends face-tracking data, since I don't have any fancy eye-tracking HMDs, so this may benefit from a review pass by someone with actual eye trackers!

Footnotes

  1. I exposed calLeft/calRight as bare public static fields to be consistent with how other fields are exposed there. I presume they're never meant to be set outside of CalibrateEyes(), though, so it might be better to encapsulate them; happy to change if you'd prefer that.

- Change initial eye rotation capture from .rotation (world) to .localRotation (local)
- Replace the Quaternion.Euler axis assumption with BasisLocalEyeDriver's per-eye calibration
- Expose calLeft/calRight for use by EyeTrackingBoneActuation
- Align the leftEyeInitialRotation type

Move leftEyeInitialRotation/rightEyeInitialRotation from separate static fields on BasisLocalEyeDriver into the EyeCalibration struct, captured during CalibrateOneEye().

Gate local eye tracking state management on `_eyeFollowDriverApplicable` (set on avatar ready) instead of `IsLocal` (set only after network ready). `BlendshapeActuation` already works offline because it gates on `_isWearer`; `EyeTrackingBoneActuation` used `IsLocal` for the equivalent checks.

(Network methods are unaffected)