MiVRy Documentation

This article guides you through the use of the MiVRy gesture recognition library.

For a detailed guide on the Unity Plug-in please see:
https://www.marui-plugin.com/documentation-mivry-unity/

For a detailed guide on the Unreal Engine Plug-in please see:
https://www.marui-plugin.com/documentation-mivry-unreal/

Package Overview

MiVRy is always downloaded from GitHub:
https://github.com/MARUI-PlugIn/MiVRy

(1) Plug-in library files (binaries):

To use MiVRy in your own project, you must add the binary library files to your build environment. MiVRy is available for the following platforms:

Windows:
In the windows/ folder you can find the plug-in library files for Windows – both statically linked (lib/ folder) and dynamically linked (dll/ folder) for x86 32-bit and 64-bit platforms.

Universal Windows Platform / HoloLens:
In the UWP/ folder you can find the plug-in library files for Universal Windows Platform – both statically linked (lib/ folder) and dynamically linked (dll/ folder) for x86 and ARM platforms (32-bit and 64-bit).
Note that the HoloLens 1 uses an x86 32-bit processor,
and the HoloLens 2 uses an ARM 64-bit processor.

Linux / Unix:
In the linux/ folder you can find the plug-in library files for Linux – both statically linked (lib/ folder) and dynamically linked (dll/ folder) for x86 and ARM platforms (32-bit and 64-bit).

Android:
In the android/ folder you can find the plug-in library files for Android (dynamically linked .so) for x86 and ARM platforms (32-bit and 64-bit).
You will also find an .aar package to use in Android development. Details on how to use the .aar package can be found in the section “How to use the Android AAR package” below.

NOTE:
For use in Unity and Unreal Engine, special plug-ins are available:
Unity: https://www.marui-plugin.com/documentation-mivry-unity/
Unreal: https://www.marui-plugin.com/documentation-mivry-unreal/

(2) Header Files (C / C++)

MiVRy provides two C/C++ headers for using the plug-in library. You only ever need one of the two, depending on your requirements and development goals.
GestureRecognition.h : C/C++ header for using one-handed, one-part gestures.
GestureCombinations.h: C/C++ header for using two-handed or multi-part gestures.
There are also C# wrappers, GestureRecognition.cs and GestureCombinations.cs.
Please see the sections below on how to use these headers.

Licensing and Activation

MiVRy is free to use for commercial, personal, and academic purposes.
However, the free version of MiVRy has certain limitations.
It can only identify 100 gestures per session (that is, each time you run the app). When using continuous gesture identification, it is limited to a total of 100 seconds of gesture identification per session.

To unlock unlimited gesture recognition, you must purchase a license at:
https://www.marui-plugin.com/mivry/

The license key will be sent to you automatically and immediately after purchase.
If the license email does not arrive, please check your spam filter, and contact support@marui-plugin.com

The license credentials must then be used to activate MiVRy, either by calling the activateLicense() function at runtime or by entering them into the license settings fields of the Unity or Unreal plug-in.

How to use the GestureRecognition.h header (for one-handed one-motion gestures)

(1) Include the GestureRecognition.h header file in your project, together with the appropriate library file for your platform (Windows DLL x86 / 64, Linux .so, Android .so, UWP, …).

(2) Create a new Gesture recognition object and register the gestures that you want to identify later.

IGestureRecognition* gr = new IGestureRecognition();
int myFirstGesture = gr->createGesture("my first gesture");
int mySecondGesture = gr->createGesture("my second gesture");

(3) Record a number of samples for each gesture by calling startStroke(), contdStroke() and endStroke() for your registered gestures, each time inputting the headset and controller transformation.

double hmd_p[3] = { /* Headset position (x,y,z) */ };
double hmd_q[4] = { /* Headset rotation quaternion (x,y,z,w) */ };
gr->startStroke(hmd_p, hmd_q, myFirstGesture);
[…]
// repeat the following while performing the gesture with your controller:
double p[3] = { /* Controller position (x,y,z) */ };
double q[4] = { /* Controller rotation quaternion (x,y,z,w) */ };
gr->contdStrokeQ(p,q);
// ^ repeat while performing the gesture with your controller.
[…]
gr->endStroke();

Repeat this multiple times for each gesture you want to identify.
We recommend recording at least 20 samples for each gesture.
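
The startStroke() and contdStrokeQ() calls above expect rotations as (x, y, z, w) quaternions. If your tracking API reports Euler angles instead, you can convert them first. The following is a self-contained sketch; the intrinsic yaw-pitch-roll rotation order is an assumption and must be matched to your tracking API's convention:

```cpp
#include <cmath>

// Convert yaw (about Y), pitch (about X), and roll (about Z), in radians,
// into the (x, y, z, w) quaternion layout that startStroke() and
// contdStrokeQ() expect. The intrinsic yaw -> pitch -> roll rotation order
// used here is an assumption - match it to your tracking API's convention.
void eulerToQuaternion(double yaw, double pitch, double roll, double q[4])
{
    const double cy = std::cos(yaw * 0.5),   sy = std::sin(yaw * 0.5);
    const double cp = std::cos(pitch * 0.5), sp = std::sin(pitch * 0.5);
    const double cr = std::cos(roll * 0.5),  sr = std::sin(roll * 0.5);
    q[0] = cy * sp * cr + sy * cp * sr; // x
    q[1] = sy * cp * cr - cy * sp * sr; // y
    q[2] = cy * cp * sr - sy * sp * cr; // z
    q[3] = cy * cp * cr + sy * sp * sr; // w
}
```

For example, a controller turned by pure yaw yields a quaternion with only its y and w components non-zero.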

(4) Start the training process by calling startTraining().
You can optionally register callback functions to receive updates on the learning progress by calling setTrainingUpdateCallback() and setTrainingFinishCallback().

gr->setMaxTrainingTime(10); // Set the maximum training time to 10 seconds.
gr->startTraining();

You can stop the training process by calling stopTraining().
After training, you can check the gesture identification performance by calling recognitionScore() (a value of 1 means 100% correct recognition).

(5) Now you can identify new gestures performed by the user in the same way as you were recording samples:

double hmd_p[3] = { /* Headset position (x,y,z) */ };
double hmd_q[4] = { /* Headset rotation quaternion (x,y,z,w) */ };
gr->startStroke(hmd_p, hmd_q);
[…]
// repeat the following while performing the gesture with your controller:
double p[3] = { /* Controller position (x,y,z) */ };
double q[4] = { /* Controller rotation quaternion (x,y,z,w) */ };
gr->contdStrokeQ(p,q);
// ^ repeat while performing the gesture with your controller.
[…]
int identifiedGesture = gr->endStroke();
if (identifiedGesture == myFirstGesture) {
// ...
}

(6) Beyond just getting the most likely candidate for which gesture was performed, you can also get a similarity value indicating how closely the performed motion resembles the identified gesture:

double similarity;
int identifiedGesture = gr->endStroke(&similarity);

This provides a value between 0 and 1, where 0 indicates that the performed motion is very much unlike the previously recorded gestures, and 1 indicates that the performed motion is the exact average of all previously recorded samples and thus highly similar to the intended gesture.

(7) You can save and load your gestures to a gesture database file.

gr->saveToFile("C:/myGestures.dat");
// ...
gr->loadFromFile("C:/myGestures.dat");

How to use the GestureCombinations.h header (for two-handed gestures and gesture-combos)

(1) Add the GestureRecognition.h and GestureCombinations.h header files to your project, together with the appropriate library file for your platform (e.g. a .DLL file on Windows).

(2) Create a new GestureCombinations object with the number of “parts” that you want to use. A “part” here usually means a separate object or body part (e.g. left hand and right hand), but parts can also be used to identify consecutive gestures (i.e. multi-part sequential gestures). Then register the gesture combinations that you want to identify later.
In this example, we use gesture part “0” to mean “left hand” and gesture part “1” to mean “right hand”, but you can use any indexing of parts that you want.

IGestureCombinations* gc = new IGestureCombinations(2);
int myFirstCombo = gc->createGestureCombination("wave your hands");
int mySecondCombo = gc->createGestureCombination("play air-guitar");

Also, create the individual gestures (aka sub-gestures) that each combo will consist of.

const int left = 0;
const int right = 1;
int myFirstCombo_leftHandGesture = gc->createGesture(left, "Wave left hand");
int myFirstCombo_rightHandGesture = gc->createGesture(right, "Wave right hand");
int mySecondCombo_leftHandGesture = gc->createGesture(left, "Hold guitar neck");
int mySecondCombo_rightHandGesture = gc->createGesture(right, "Hit strings");

Then set the Gesture Combinations to be the connection of those gestures.

gc->setCombinationPartGesture(myFirstCombo, left, myFirstCombo_leftHandGesture);
gc->setCombinationPartGesture(myFirstCombo, right, myFirstCombo_rightHandGesture);
gc->setCombinationPartGesture(mySecondCombo, left, mySecondCombo_leftHandGesture);
gc->setCombinationPartGesture(mySecondCombo, right, mySecondCombo_rightHandGesture);

(3) Record a number of samples for each gesture by calling startStroke(), contdStroke() and endStroke() for your registered gestures, each time inputting the headset and controller transformation.

double hmd_p[3] = { /* insert headset position here */ };
double hmd_q[4] = { /* insert headset rotation quaternion here */ };
gc->startStroke(left, hmd_p, hmd_q, myFirstCombo_leftHandGesture);
gc->startStroke(right, hmd_p, hmd_q, myFirstCombo_rightHandGesture);
[…]
// repeat the following while performing the gesture with your controller:
double p_left[3] = { /* insert left controller position here */ };
double q_left[4] = { /* insert left controller rotation quaternion here */ };
gc->contdStrokeQ(left, p_left, q_left);
double p_right[3] = { /* insert right controller position here */ };
double q_right[4] = { /* insert right controller rotation quaternion here */ };
gc->contdStrokeQ(right, p_right, q_right);
// ^ repeat the above while performing the gesture with your controllers.
[…]
// at the end of the gesture, call:
gc->endStroke(left);
gc->endStroke(right);

Repeat this multiple times for each gesture you want to identify.
We recommend recording at least 20 samples for each gesture, ideally performed by different people.

(4) Start the training process by calling startTraining().
You can optionally register callback functions to receive updates on the learning progress by calling setTrainingUpdateCallback() and setTrainingFinishCallback().

gc->setMaxTrainingTime(60); // Set the maximum training time to 60 seconds.
gc->startTraining();

You can stop the training process by calling stopTraining(). After training, you can check the gesture identification performance by calling recognitionScore() (a value of 1 means 100% correct recognition).

(5) Now you can identify new gestures performed by the user in the same way as you were recording samples:

double hmd_p[3] = { /* headset position */ };
double hmd_q[4] = { /* headset rotation quaternion */ };
gc->startStroke(left, hmd_p, hmd_q);
gc->startStroke(right, hmd_p, hmd_q);
[…]
// repeat the following while performing the gesture with your controller:
double p_left[3] = { /* left controller position */ };
double q_left[4] = { /* left controller rotation quaternion */ };
gc->contdStrokeQ(left, p_left, q_left);
double p_right[3] = { /* right controller position */ };
double q_right[4] = { /* right controller rotation quaternion */ };
gc->contdStrokeQ(right, p_right, q_right);
// ^ repeat while performing the gesture with your controllers.
[…]
gc->endStroke(left);
gc->endStroke(right);
int identifiedGestureCombo = gc->identifyGestureCombination();
if (identifiedGestureCombo == myFirstCombo) {
// ...
}

(6) Now you can save and load the trained gestures to a gesture database file.

gc->saveToFile("C:/myGestureCombos.dat");
// ...
gc->loadFromFile("C:/myGestureCombos.dat");

How to use the Android AAR package

(1) Import the library into your Android Studio project:
1: Select File > New > New Module.
2: Select Import .JAR/.AAR Package then click Next.
3: Select the location of MiVRy.aar and click “Finish”.
4: In your project’s build.gradle, add the following to the “dependencies” list: implementation project(':mivry')

(2) Include in your java files:
1: Add the import to the top of your file:
import com.maruiplugin.mivry.MiVRy;
2: Create a MiVRy object:
MiVRy mivry = new MiVRy();

(3) Create gestures:
1: Register a new gesture with:
int gesture_id = mivry.CreateGesture("my gesture");
2: Start the gesture recording with:
mivry.StartGesture(gesture_id);
3: Move the device to perform the gesture.
4: Finish the recording with:
mivry.EndGesture();
5: Repeat the process (3.2 ~ 3.4) multiple times. We recommend recording at least 20 samples.
6: After recording samples for all your gestures, start the training process:
mivry.StartTraining();

(4) Identify gestures:
1: Start the gesture with:
mivry.StartGesture(-1);
2: Move the device to perform the gesture.
3: End the gesture and get the ID of the identified gesture with
int gesture_id = mivry.EndGesture();
4: If needed, the name of the identified gesture can be acquired with
String gesture_name = mivry.GetGestureName(gesture_id);

Troubleshooting and Frequently Asked Questions

(1) Where / when in my own program do I have to create the GestureRecognition or GestureCombination object?

You can create the gesture recognition object anywhere in your project; there are no special requirements as to where.

(2) Can MiVRy be used to identify hand gestures, finger gestures, or motion patterns of objects other than VR controllers?

Yes, MiVRy can identify the motion pattern of any object or body part. Just input the motion of each tracked object, finger, or body part separately (for example, by creating a GestureCombinations object and then using its contdStroke function for each part).

(3) Can I use MiVRy to identify gestures in a continuous stream of motion data without a clear start or end?

Yes, MiVRy can identify gestures continuously. Just use the contdIdentify function of either the GestureRecognition or GestureCombinations object.

(4) How can I open and edit gesture database (.DAT) files?

Please use the “GestureManager” to open and edit .DAT gesture database files.

(5) The Gesture Recognition library does not detect if a gesture is different from all recorded gestures. I want to know if the user makes the gesture I recorded or not.

The gesture recognition plug-in will always return the ID of whichever known gesture is most similar to the one you just performed.
If you want to check whether the gesture you made is different from all the recorded gestures, use the following instead of the plain “endStroke()” function:

double similarity;
int identified_gesture = gr->endStroke(&similarity);

The similarity variable will then give you a measure of how similar the performed gesture was to the detected gesture. A value of one indicates perfect similarity; a value close to zero indicates great differences between the performed gesture and the recorded gestures. You can use this value to judge whether the performed gesture is sufficiently similar to the recorded ones.
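
As an illustration of that check, you could reject identifications below a similarity threshold. The helper below and its 0.5 default threshold are illustrative assumptions, not part of the MiVRy API; tune the threshold by testing deliberate non-gestures against your own gesture database.

```cpp
// Decide whether an identification result should be treated as a known
// gesture. Illustrative helper (not part of the MiVRy API); the 0.5
// default threshold is an arbitrary starting point.
bool isKnownGesture(int gestureId, double similarity, double threshold = 0.5)
{
    if (gestureId < 0) // endStroke() returns a negative error code on failure
        return false;
    return similarity >= threshold;
}
```

After `int id = gr->endStroke(&similarity);`, a call to `isKnownGesture(id, similarity)` then tells you whether to treat the motion as one of your recorded gestures.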

(6) I want to use Gesture Recognition in my commercial project. What commercial licensing options do you provide?

We offer single-payment licenses for one project as well as profit-sharing licenses where we receive a share of the sales price of each unit sold.
Pricing is dependent on the size of your project.
Please contact us at support@marui-plugin.com for details.

(7) Do I have to call “startTraining()” every time I start my game? Does it have to keep running in the background while my app is running?

No, you only need to call startTraining() after you have recorded new gesture data (samples) and want these new recordings to be used by the AI. However, you need to save the AI after training to a database file (.DAT) and load this file in your game before using the other gesture recognition functions.
While the training is running, you cannot use any of the other functions, so you cannot let training run in the background. You must start (and stop) training in between using the AI.

(8) How long should I let the training run to achieve optimal recognition performance?

Usually, the AI will reach its peak performance within one minute of training, but if you’re using a large number of gestures and samples, it may take longer. You can check the current recognition performance from the training callback functions and see if the performance still keeps increasing. If not, feel free to stop the training.
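
The "stop when the score no longer increases" advice can be implemented as a small plateau check fed from the training-update callback. The class below is an illustrative sketch (the window size and tolerance are assumed values), not part of the MiVRy API:

```cpp
#include <cstddef>
#include <deque>

// Track the recognition scores reported during training and report a plateau
// once the most recent `window` scores improved by less than `tolerance`
// overall. Illustrative sketch; window size and tolerance are assumptions.
class PlateauDetector {
public:
    PlateauDetector(std::size_t window = 5, double tolerance = 0.01)
        : window_(window), tolerance_(tolerance) {}

    // Feed each score from the training-update callback; returns true when
    // training has plateaued and stopTraining() could be called.
    bool update(double score) {
        scores_.push_back(score);
        if (scores_.size() > window_)
            scores_.pop_front();
        if (scores_.size() < window_)
            return false; // not enough data yet
        return (scores_.back() - scores_.front()) < tolerance_;
    }

private:
    std::size_t window_;
    double tolerance_;
    std::deque<double> scores_;
};
```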

(9) Gestures aren’t recognized correctly when I look up/down/left/right or tilt my head.

You can choose whether the frame of reference for your gestures is the player's point of view (“head”) or the real world / game world (“world”). For example, if the player is looking up at the sky when performing a gesture towards the sky, then from a “world” frame of reference the direction is “up”, but from the player's “head” point of view, the direction is “forward”. Therefore, if you consider your gestures to be relative to the world's “up” (sky) and “down” (ground) rather than the visual “upper end of the screen” and “lower end of the screen”, then change frameOfReferenceUpDownPitch to FrameOfReference.World. The same setting is available for the yaw (compass direction) and head tilt.
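
To build intuition for the difference: a “head” frame of reference effectively expresses world-space directions relative to the headset, i.e. rotates them by the inverse of the headset's orientation quaternion. The following self-contained sketch illustrates that transform (it is not MiVRy's internal code):

```cpp
#include <cmath>

// Rotate a world-space direction v by the conjugate (inverse) of the unit
// headset quaternion q = (x, y, z, w), i.e. express the direction in the
// headset's own frame of reference. Illustrative sketch of the "head"
// frame of reference, not MiVRy's internal code.
void worldToHead(const double q[4], const double v[3], double out[3])
{
    // Conjugate vector part c; out = v + 2*w*(c x v) + 2*c x (c x v).
    const double c[3] = { -q[0], -q[1], -q[2] };
    const double w = q[3];
    const double t[3] = { // t = 2 * (c x v)
        2.0 * (c[1] * v[2] - c[2] * v[1]),
        2.0 * (c[2] * v[0] - c[0] * v[2]),
        2.0 * (c[0] * v[1] - c[1] * v[0]),
    };
    out[0] = v[0] + w * t[0] + (c[1] * t[2] - c[2] * t[1]);
    out[1] = v[1] + w * t[1] + (c[2] * t[0] - c[0] * t[2]);
    out[2] = v[2] + w * t[2] + (c[0] * t[1] - c[1] * t[0]);
}
```

With the head pitched 90° about its side axis, the world “up” direction (0, 1, 0) comes out as a horizontal direction in the head frame, which is why head-relative and world-relative gestures diverge when the player looks up or down.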

(10) After some time, all attempts to identify a gesture fail with error code -16.

You have used up all “free” gesture recognitions of the free version of MiVRy for this session. To identify more gestures, restart the app, or purchase an “unlimited” license at:
https://www.marui-plugin.com/mivry/