Interactions overview¶
Hand interactions for XR can be divided into two main categories: Physical and Learned interactions.
Physical interactions tend to be intuitive and require little or no user instruction, because they closely mirror how things work in the real world. They use direct interaction; examples include handling objects, opening drawers, or waving to other players. Design objects well so that their function is obvious to users.
Learned interactions are different. They don’t relate to real-world experiences and tend to be interactions that only happen in VR or AR. Examples include locomotion and teleportation, distance selection, and opening UI panels. These can be easy to use once learned, but must be introduced to users via help or onboarding content.
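As a minimal illustration of one learned interaction, distance selection is commonly built on a raycast from the controller that picks the nearest target the ray intersects. The sketch below is engine-agnostic pseudocode in Python with hypothetical names (`Sphere`, `distance_select` are assumptions, not part of any engine API), treating targets as spheres for simplicity:

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple  # (x, y, z) world position of the target
    radius: float  # selection radius around the target

def distance_select(origin, direction, targets):
    """Return the nearest target hit by a ray cast from the controller."""
    # Normalize the ray direction.
    mag = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / mag for c in direction)
    best, best_t = None, float("inf")
    for s in targets:
        # Vector from the ray origin to the sphere center.
        oc = tuple(s.center[i] - origin[i] for i in range(3))
        # Distance along the ray to the closest point to the center.
        proj = sum(oc[i] * d[i] for i in range(3))
        if proj < 0:
            continue  # target is behind the controller
        # Squared perpendicular distance from the center to the ray.
        perp2 = sum(oc[i] * oc[i] for i in range(3)) - proj * proj
        if perp2 <= s.radius * s.radius and proj < best_t:
            best, best_t = s, proj
    return best
```

In a real engine you would use its physics raycast instead (and highlight the hovered target before confirming the selection), but the nearest-hit logic is the same.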
Physical interactions¶
Learned interactions¶
Want to learn more about implementing these features in the game engine of your choice? Check out the implementation guides: