
Life Sized Pool

A Mixed Reality Concept Using Unity Hand Tracking via OpenXR

MORE ABOUT MY CONCEPT

While reading through the Unity XR documentation, I found a new preview update that allows Unity developers to create an XR rig with hand tracking. This new XR hand tracking support not only brings the feature to a wider range of devices, but also lets one's hands interact with each other, making what the rig can track and display much more accurate.


The ability for one's hands to interact with each other opened up a whole new level of immersion and realism in XR experiences.
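
For the curious, here is roughly what reading a tracked joint looks like in code. This is a minimal sketch based on my understanding of the com.unity.xr.hands preview API; the class name is mine, and it assumes an XR plug-in with hand tracking support is already running.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: find the running XRHandSubsystem and read one joint pose.
public class HandJointReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily locate the hand subsystem once the XR plug-in has started it.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var rightHand = m_Subsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        // Each tracked hand exposes a full set of joints; here, the index fingertip.
        // Poses are reported relative to the XR Origin, not world space.
        var indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```

Attach it to any GameObject in the scene and the right hand's index fingertip position will stream to the console whenever the hand is tracked.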


Intrigued by this concept, I set out to showcase the new feature in my Senior Gallery Exhibition. To do so, I created a sample game that would serve as a testing platform for the technology.

As I worked on the game, I was struck by its immense potential. Not only did it make XR experiences more immersive and engaging, but it also had practical applications in fields such as education, training, and healthcare.


The ability to interact with virtual objects using one's own hands could revolutionize the way we approach these fields, allowing for more effective and engaging learning and training experiences.

Overall, my experience exploring Unity's XR hand tracking was both enlightening and exciting. As XR technology continues to advance, I believe this feature will play a pivotal role in creating more deeply immersive, interactive, and transformative experiences.


NEXT STEPS

Developing the test game for my exhibit piece proved to be a crucial step in showcasing Unity's XR hand tracking. As I delved deeper into the technology, I discovered some interesting quirks and challenges that needed to be addressed to create a seamless user experience.


One of the first things I noticed while testing the hand tracking feature was that it required a pinching motion to grab objects. This is the default among the three preset action poses that ship with the technology. While it is possible to manually adjust the hand pose to better suit individual needs, I found that impractical for my purposes.
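
To give a sense of what that pinch pose boils down to, here is a rough approximation: treat the hand as pinching when the thumb tip and index tip joints are nearly touching. To be clear, this is my own illustrative sketch, not the package's actual detection code, and the 2 cm threshold is a guess rather than a documented value.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Rough stand-in for the built-in pinch pose: the hand counts as "pinching"
// when thumb tip and index tip are within a small distance of each other.
public static class PinchUtility
{
    // Illustrative threshold in meters; not a value taken from the package.
    const float k_PinchThreshold = 0.02f;

    public static bool IsPinching(XRHand hand)
    {
        var thumbTip = hand.GetJoint(XRHandJointID.ThumbTip);
        var indexTip = hand.GetJoint(XRHandJointID.IndexTip);

        // If either joint has no pose this frame, treat it as not pinching.
        if (!thumbTip.TryGetPose(out Pose thumbPose) ||
            !indexTip.TryGetPose(out Pose indexPose))
            return false;

        return Vector3.Distance(thumbPose.position, indexPose.position) < k_PinchThreshold;
    }
}
```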


Unfortunately, I also discovered that the tracker had difficulty following objects grabbed within my hands. While it performed admirably at poking buttons, palming objects, and making natural gestures, it was less reliable at grabbing: assets would often lose their guided movement, resulting in a less than optimal user experience.
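
One way to work around that kind of drift is to stop relying on physics while an object is held and instead pin it to the palm joint every frame, so a brief tracking hiccup doesn't send the asset flying. The sketch below only illustrates the idea; heldObject, the offset, and the right-hand assumption are placeholders for whatever grab logic a game actually uses, and it assumes the XR Origin sits at the world origin.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch of one mitigation for grabbed objects losing their guided movement:
// while something is held, snap it directly to the palm joint's pose instead
// of letting per-frame physics drive it.
public class PalmFollower : MonoBehaviour
{
    public Transform heldObject;               // set by your grab logic (placeholder)
    public Vector3 localOffset = Vector3.zero; // where the object sits in the palm

    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily locate the hand subsystem, as in the earlier sketch.
        if (m_Subsystem != null && m_Subsystem.running)
            return;
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            m_Subsystem = subsystems[0];
    }

    void LateUpdate()
    {
        if (m_Subsystem == null || heldObject == null)
            return;

        var palm = m_Subsystem.rightHand.GetJoint(XRHandJointID.Palm);
        if (palm.TryGetPose(out Pose pose))
        {
            // Kinematic follow: brief tracking glitches nudge the object
            // rather than launching it the way physics velocities can.
            heldObject.SetPositionAndRotation(
                pose.position + pose.rotation * localOffset,
                pose.rotation);
        }
    }
}
```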


Despite these challenges, I remained determined to create a compelling XR experience that would showcase the potential of Unity's hand tracking. Through trial and error, I fine-tuned the game mechanics and created a more seamless and intuitive user experience.


Developing the test game for Unity's XR hand tracking was a challenging but ultimately rewarding experience. It allowed me to push the boundaries of what is possible with XR technology and create an immersive, engaging user experience. While there were certainly challenges along the way, the end result was well worth the effort, and I came away with a fun game to enjoy myself.

