MiVRy – 3D Gesture Recognition AI for VR, Android, and iOS

What if you could train any 3D gesture for your game or software in 30 seconds?

Introducing MiVRy, our patented 3D gesture recognition machine-learning library, designed to help game, app, and experience developers quickly and reliably program gesture input for Unity, Unreal, and (basically) any software in their pipeline. MiVRy works for VR, AR, Android, iOS, and any device with 3D input.

(APK, loadable through SideQuest)

How our patented machine learning library shaves crucial hours off your development time

The biggest bottleneck in production processes is often user input. A perfect example is the modern QWERTY keyboard layout – a relic designed to slow typists down so that typewriter hammers wouldn’t jam. One of the biggest promises of VR is that you can not only look around in a full 360 degrees, but also use both controllers as free 3D input devices, allowing far greater fidelity. On mobile, limited touchscreen real estate means you may not be able to fit all the controls you want onto the screen – even though smartphone screens are bigger than ever! With MiVRy you can add simple or complex gestures to your game or app for VR, mobile (iOS or Android), or anything you can think of!

Programming 3D gestures manually is extremely tedious. That’s where MiVRy’s AI steps in: instead of hand-coding each gesture, you hand the work over to an advanced neural network that can learn any gesture with 98% reliability after just 30 repetitions – about 30 seconds of work.

This frees up precious development time so you can work on the things that matter instead of tweaking gestures endlessly. Any gesture – trained and implemented – in the time it takes your coffee to brew.

  • Want to draw a bow and arrow and shoot them in your VR game? You can do that.
  • Hoping to easily implement movement gestures into your Android app? Done in seconds.
  • Want to let players program their own spells that cast specific effects in your spellcaster game? Easy.
  • How about a series of exercises for an iOS fitness app? Piece of cake.

Our 3D Gesture Recognition AI turns what would have taken dozens or perhaps hundreds of hours of manual programming time into something you can do in minutes.
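To make the workflow concrete, here is a minimal toy sketch of the record-then-recognize idea – this is not MiVRy’s actual API or neural network, and all names here are hypothetical. It records a few repetitions of each gesture as 3D strokes, then classifies a new stroke by nearest-neighbor matching on resampled paths:

```cpp
// Illustration only: MiVRy's real API and neural network are not shown here.
// This toy classifier mimics the developer-facing workflow -- record sample
// strokes for each gesture, then ask "which gesture was that?" -- using
// nearest-neighbor matching on resampled 3D paths.
#include <algorithm>
#include <array>
#include <limits>
#include <string>
#include <utility>
#include <vector>

using Point  = std::array<double, 3>;
using Stroke = std::vector<Point>;

// Resample a stroke to a fixed number of points so strokes of different
// lengths can be compared point-by-point.
Stroke resample(const Stroke& s, size_t n = 16) {
    Stroke out;
    for (size_t i = 0; i < n; ++i) {
        double t = (double)i / (n - 1) * (s.size() - 1);
        size_t a = (size_t)t, b = std::min(a + 1, s.size() - 1);
        double f = t - a;
        out.push_back({s[a][0] + f * (s[b][0] - s[a][0]),
                       s[a][1] + f * (s[b][1] - s[a][1]),
                       s[a][2] + f * (s[b][2] - s[a][2])});
    }
    return out;
}

// Sum of squared distances between two equally-sized strokes.
double distance(const Stroke& a, const Stroke& b) {
    double d = 0;
    for (size_t i = 0; i < a.size(); ++i)
        for (int k = 0; k < 3; ++k)
            d += (a[i][k] - b[i][k]) * (a[i][k] - b[i][k]);
    return d;
}

struct ToyRecognizer {
    std::vector<std::pair<std::string, Stroke>> samples;

    // "Training": just store a resampled copy of the demonstration.
    void addSample(const std::string& name, const Stroke& s) {
        samples.push_back({name, resample(s)});
    }

    // Recognition: return the name of the closest stored sample.
    std::string recognize(const Stroke& s) const {
        Stroke q = resample(s);
        double best = std::numeric_limits<double>::max();
        std::string label = "?";
        for (const auto& [name, ref] : samples) {
            double d = distance(q, ref);
            if (d < best) { best = d; label = name; }
        }
        return label;
    }
};
```

A real recognizer replaces the naive distance match with a trained network, but the workflow from the developer’s point of view – record a handful of repetitions, then query new strokes – is the same.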

The gestures can be either direction-specific (“swipe left” vs. “swipe right”) or direction-independent (“draw an arrow facing in any direction”) – either way, you will receive the direction, position, and scale at which the user performed the gesture!
Draw a large 3D cube and there it will appear, with the appropriate scale and orientation.
One-handed gestures, two-handed gestures, and multi-part sequential gestures are all supported. MiVRy works with any sort of 3D input – from VR controllers to a phone’s internal gyroscope and accelerometer.
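As an illustration of what “returns the position, scale, and orientation” can mean in practice – this is a simplified sketch under our own assumptions, not MiVRy’s implementation – here is one simple way such values can be derived from a recorded stroke:

```cpp
// Illustration only (not MiVRy's actual method): one simple way a recognizer
// can report *where* and *how big* a gesture was, separate from the shape
// match itself. The struct and field names are assumptions.
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

using Point = std::array<double, 3>;

struct GesturePose {
    Point  position;   // centroid of the stroke
    double scale;      // largest distance of any point from the centroid
    Point  direction;  // unit vector from first to last point (a crude "orientation")
};

GesturePose analyze(const std::vector<Point>& stroke) {
    GesturePose g{{0, 0, 0}, 0.0, {0, 0, 0}};
    // Position: average of all sample points.
    for (const auto& p : stroke)
        for (int k = 0; k < 3; ++k) g.position[k] += p[k] / stroke.size();
    // Scale: radius of the stroke around its centroid.
    for (const auto& p : stroke) {
        double d = 0;
        for (int k = 0; k < 3; ++k)
            d += (p[k] - g.position[k]) * (p[k] - g.position[k]);
        g.scale = std::max(g.scale, std::sqrt(d));
    }
    // Direction: normalized start-to-end vector.
    double len = 0;
    for (int k = 0; k < 3; ++k) {
        g.direction[k] = stroke.back()[k] - stroke.front()[k];
        len += g.direction[k] * g.direction[k];
    }
    len = std::sqrt(len);
    if (len > 0)
        for (int k = 0; k < 3; ++k) g.direction[k] /= len;
    return g;
}
```

Normalizing strokes this way before matching is also what makes a recognizer direction- and scale-independent: the pose is stripped off, reported to the caller, and the remaining shape is compared.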

Develop for mobile with MiVRy (Android)! 

Check out the video demonstrating a quick test app we made to show MiVRy’s functionality on Android. Using MiVRy you can quickly and easily add 3D movement-based gestures to your game or app. Gesture recognition has never been easier for Android and iOS!

Additionally, MiVRy now also works with continuous gestures. This means it can “listen” to the motion stream and trigger the associated action the moment a known gesture is performed.

  • Learns your gestures in a few seconds with >99% reliability
  • Works with any VR or AR device
  • Works on smartphones (Android and iOS) with the internal motion sensors
  • Real 3D gestures - like waving a magic wand
  • Record your own gestures - simple and straightforward
  • Easy to use - single C++ or C# class or C function set
  • Can have multiple sets of gestures simultaneously (for example: different sets of gestures for different buttons)
  • Continuous gesture recognition - "listen" for gestures and perform a function once the user makes the gesture
  • High recognition fidelity
  • Outputs the position, scale, and orientation at which the gesture was performed
  • High performance (back-end written in optimized C/C++)
  • Integrate as a library - either static or dynamic
  • Integrate as a Unity plug-in - with a no-coding-required wrapper object