philip lelyveld The world of entertainment technology

9 Jun 2019

AR Hand Gesture UI test results – Self-made AR glasses – Testing Project North Star

During development, we wanted to implement and test the following functionality:

  • A UI that hovers parallel to the user’s hand
  • Activating and deactivating the UI with a hand gesture
  • A UI containing several tabs with different functionality
  • Switching between the UI tabs using finger gestures
  • A tab where numbers or text can be entered
  • A tab with different sliders
  • A tab with object interaction
  • A new gesture built from scratch: the snap
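
As a rough illustration of the first item, a panel that hovers parallel to the hand can be placed with simple vector math. The palm position and palm normal are assumed inputs from a hand-tracking SDK; the function names and the 8 cm hover distance are illustrative assumptions, not part of any specific API.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ui_panel_pose(palm_pos, palm_normal, hover_dist=0.08):
    """Place a UI panel hover_dist metres above the palm, parallel to it.

    Returns (position, facing_normal); the panel faces back toward the
    palm so the user can read it.
    """
    n = normalize(palm_normal)
    pos = tuple(p + hover_dist * c for p, c in zip(palm_pos, n))
    facing = tuple(-c for c in n)
    return pos, facing

# Example: palm at the origin, palm normal pointing up (+y);
# the panel ends up 8 cm above the palm, facing down toward it.
pos, facing = ui_panel_pose((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

Recomputing this pose every frame makes the panel follow the hand, which is what gives the "hovering" effect.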

The result of our augmented reality glasses can be seen in the following video. For our own gesture, we took inspiration from Marvel’s Avengers and implemented a snap. With this gesture, the user can halve the number entered in the first tab. Gathering the objects in the third tab also felt a bit like collecting the Infinity Stones.

Our experiences and takeaways from the camp:

  • The position and alignment of the palms, forefingers and thumbs are detected very reliably and accurately. Detection of the little finger is almost as reliable.
  • The position and orientation of the middle and ring fingers are not very reliable and only provide correct values in about 60% of cases. When you move your hand, there are many blind spots where these fingers are occluded by others from the sensor’s point of view. This is hard to work with, and the user experience suffers greatly.
  • For gestures triggered by a rotation movement, the angular limits should not be too tight, because the hand cannot be held 100% still. We therefore recommend that the deactivation radius be at least 10 degrees greater than the activation radius, so that no flickering occurs.
  • Finger gestures should be very simple and tested with many different people, as fine motor skills in the fingers are extremely different from person to person. The same applies to the differences between the right and left hands.
  • When interacting with objects, the user must be actively supported with visual and auditory aids. It helps a lot if interaction elements adjust or intensify their colours based on the finger’s distance, and if the interaction itself is emphasized by a tone.
  • Even the best aids will not help if they are occluded by the hand during the interaction.
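
One pragmatic way to soften the middle- and ring-finger problem is to gate each frame on a per-finger confidence value and hold the last trusted position while the finger is likely occluded. A minimal sketch, assuming the tracker exposes such a confidence score; the field name and the 0.6 threshold are hypothetical:

```python
class FingerFilter:
    """Hold the last trusted finger position across low-confidence frames."""

    def __init__(self, min_confidence=0.6):
        self.min_confidence = min_confidence
        self.last_good = None

    def update(self, position, confidence):
        # Accept the new sample only if the tracker trusts it (or we have
        # nothing better yet); otherwise keep the last reliable position.
        if confidence >= self.min_confidence or self.last_good is None:
            self.last_good = position
        return self.last_good
```

This does not recover the true pose during occlusion, but it keeps the UI from jumping while the sensor has a blind spot.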
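
The hysteresis recommendation for rotation gestures can be sketched as a small state machine: the gesture activates when the hand’s angle comes within an activation radius of a target angle, and only deactivates once it leaves a larger deactivation radius, so jitter near the limit cannot make the UI flicker. The concrete angles below are illustrative assumptions:

```python
class RotationGesture:
    """Rotation-triggered gesture with hysteresis to prevent flicker."""

    def __init__(self, target_deg=90.0, activate_radius=15.0,
                 deactivate_radius=25.0):
        # Per the recommendation above, the deactivation radius is at
        # least 10 degrees greater than the activation radius.
        assert deactivate_radius >= activate_radius + 10.0
        self.target_deg = target_deg
        self.activate_radius = activate_radius
        self.deactivate_radius = deactivate_radius
        self.active = False

    def update(self, angle_deg):
        err = abs(angle_deg - self.target_deg)
        if not self.active and err <= self.activate_radius:
            self.active = True
        elif self.active and err > self.deactivate_radius:
            self.active = False
        return self.active
```

Within the 10-degree band between the two radii, the gesture simply keeps its current state, which is what absorbs the natural tremor of a held hand.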
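
The distance-based feedback described above amounts to mapping fingertip distance to a highlight intensity and flagging contact so the application can play a tone. A minimal sketch with illustrative distance thresholds (2 cm contact, 15 cm fade-in range):

```python
def feedback(dist_m, near=0.02, far=0.15):
    """Map fingertip distance (metres) to (highlight_intensity, play_tone).

    Intensity is 0 beyond `far`, ramps linearly to 1 at `near`,
    and contact (dist <= near) additionally triggers the tone.
    """
    if dist_m <= near:
        return 1.0, True           # touching: full highlight plus tone
    if dist_m >= far:
        return 0.0, False          # too far away: no highlight
    t = (far - dist_m) / (far - near)
    return t, False                # fade in as the finger approaches
```

The renderer would then blend the element’s base colour toward its highlight colour by the returned intensity each frame.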