Hand Tracking
Note
This is an experimental feature and is subject to change. Experimental features should not be used in production products and are unsupported. We provide experimental features as a preview of what might be coming in future releases. If you have feedback on an experimental feature, post it in the Lightship Developer Community.
Use hand tracking to track the position of a person's hand in the camera frame. Hand tracking performs palm detection and provides the 2D position of the hand along with its width and height.
Hand tracking uses the hand tracking machine learning model from MediaPipe, as described in this model card.
Enable hand tracking by adding an ARHandTrackingManager to your scene and registering an event handler for ARHandTrackingManager.HandTrackingUpdated in your scene script.
```csharp
using Niantic.ARDK.AR.Awareness;
using Niantic.ARDK.Extensions;
using UnityEngine;

public class MySceneScript : MonoBehaviour
{
    [SerializeField]
    private ARHandTrackingManager _handTrackingManager;

    private void Start()
    {
        // Subscribe to hand tracking updates.
        _handTrackingManager.HandTrackingUpdated += OnHandTrackingUpdated;
    }
}
```
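Because the handler is registered in Start, it is good Unity practice to unregister it when the script is destroyed so the manager does not hold a reference to a dead object. A minimal sketch (not from the original documentation):

```csharp
private void OnDestroy()
{
    // Unsubscribe to avoid callbacks into a destroyed component.
    if (_handTrackingManager != null)
        _handTrackingManager.HandTrackingUpdated -= OnHandTrackingUpdated;
}
```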
In your event handler, inspect the TrackingData.AlignedDetections list for hand tracking data.
```csharp
private void OnHandTrackingUpdated(HumanTrackingArgs args)
{
    var data = args.TrackingData;
    for (var i = 0; i < data.AlignedDetections.Count; i++)
    {
        var item = data.AlignedDetections[i];
        Debug.Log(item.X + " " + item.Y + " -- " + item.W + " " + item.H);
    }
}
```
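To place UI or effects over a detected hand, you typically need the detection in screen coordinates. The sketch below assumes the detection's X, Y, W, and H values are normalized to the [0, 1] range (an assumption here; verify the coordinate space against your ARDK version); `ToScreenRect` is a hypothetical helper, not an ARDK API:

```csharp
// Hypothetical helper: converts a normalized detection rectangle to screen pixels.
// Assumes X/Y/W/H are normalized to [0, 1]; confirm this for your ARDK version.
private Rect ToScreenRect(float x, float y, float w, float h)
{
    return new Rect(
        x * Screen.width,
        y * Screen.height,
        w * Screen.width,
        h * Screen.height);
}
```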
If you can't use ARHandTrackingManager in your script, you can enable hand tracking by enabling IsPalmDetectionEnabled in your session's ARWorldTrackingConfiguration, and then adding your own event handler for ARSession.HandTracker.HandTrackingStreamUpdated.
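The manual setup described above can be sketched as follows. This assumes you already have an IARSession instance (`_session` here is a placeholder name) and that your handler matches the stream's delegate signature; treat it as a sketch rather than a definitive implementation:

```csharp
// Create a world tracking configuration with palm detection enabled.
var config = ARWorldTrackingConfigurationFactory.Create();
config.IsPalmDetectionEnabled = true;

// Register for hand tracking updates directly on the session's HandTracker,
// then run the session with the configuration.
_session.HandTracker.HandTrackingStreamUpdated += OnHandTrackingStreamUpdated;
_session.Run(config);
```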
See more code examples in the HandTracking example Unity scene in ARDK-examples under ContextAwareness/HumanAR.