MiVRy Gesture Recognition Documentation
This article guides you through the use of the MiVRy Gesture Recognition plug-in.
For a detailed guide on the Unity Plug-in please see:
https://www.marui-plugin.com/documentation-mivry-unity/
For a detailed guide on the Unreal Engine Plug-in please see:
https://www.marui-plugin.com/documentation-mivry-unreal/
For a detailed guide on MiVRy for Oculus Hands Hand Tracking please see:
https://www.marui-plugin.com/documentation-mivry-questhands
Package Overview
You can always download the latest version of MiVRy from GitHub:
https://github.com/MARUI-PlugIn/MiVRy
(1) Plug-in library files (binaries):
To use MiVRy in your own project, you must add the binary library files to your build environment. MiVRy is available for the following platforms:
Windows:
In the windows/ folder you can find the plug-in library files for Windows – both statically linked (lib/ folder) and dynamically linked (dll/ folder) – for x86 32-bit and 64-bit platforms.
UniversalWindowsPlatform / Hololens:
In the UWP/ folder you can find the plug-in library files for UniversalWindowsPlatform – both statically linked (lib/ folder) and dynamically linked (dll/ folder) – for x86 and arm platforms (32-bit and 64-bit).
Note that the Hololens 1 uses an x86 32-bit processor,
and the Hololens 2 uses an arm 64-bit processor.
Linux / Unix:
In the linux/ folder you can find the plug-in library files for Linux – both statically linked (lib/ folder) and dynamically linked (dll/ folder) – for x86 and arm platforms (32-bit and 64-bit).
Android:
In the android/ folder you can find the plug-in library files for Android (dynamically linked .so) for x86 and arm platforms (32-bit and 64-bit).
You will also find an .aar package to use in Android development. Details on how to use the .aar package can be found in the “How to use the Android AAR package” section below.
NOTE:
For use in Unity and Unreal Engine, special plug-ins are available:
Unity: https://www.marui-plugin.com/documentation-mivry-unity/
Unreal: https://www.marui-plugin.com/documentation-mivry-unreal/
(2) Header Files (C / C++)
MiVRy provides the following scripts for using the plug-in library. You only ever need one of them, depending on your requirements and development goals.
– GestureRecognition.h : C/C++ header for using one-handed one-part gestures.
– GestureCombinations.h: C/C++ header for using two-handed or multi-part gestures.
There are also C# wrappers, GestureRecognition.cs and GestureCombinations.cs, for use in C# projects.
Please see the sections below on how to use these headers.
The Gesture Manager
The Gesture Manager is an app that allows you to record, test, and organize your gestures and handle Gesture Database (.DAT) files.
You can download the Gesture Manager App here:
Windows (Oculus or SteamVR):
https://www.marui-plugin.com/download/mivry/MiVRy_GestureManager_Win.zip
Android (Oculus Quest & Quest 2):
https://www.marui-plugin.com/download/mivry/MiVRy_GestureManager_ARM64.apk
How to use the GestureManager:
Important input fields in the GestureManager
Number of Parts: How many motions – at most – comprise a gesture. A gesture consisting of one single hand motion has one part. A two-handed gesture has two parts, one for the left hand and one for the right hand. It is also possible to use gesture combinations where one hand has to perform multiple sequential motions (such as writing three letters – the individual letters are the parts of a combination triplet). The number you put in this field sets the maximum. You can still have combinations with fewer parts (for example: one-handed gestures among two-handed gestures).
Rotational Frame of Reference: How directions like “up”, “down”, “left”, “right”, “forward”, and “back” are defined. For example, if a player is looking at the ceiling and performs a gesture in front of his face, in the “world” frame-of-reference the gesture was performed “upward”, because it was performed above the player’s head. But in the “head” frame-of-reference, the gesture was performed “forward”. This can decide which gesture is identified. For example, if you have a “punch the ceiling” gesture and a “punch the ground” gesture, you must choose a “world” frame-of-reference, but if you have a “touch my forehead” gesture and a “touch my chin” gesture, a “head” frame-of-reference may be more appropriate. The frame of reference can be selected separately for yaw (left-right / north-south), pitch (up/down), and roll (tilting the head). The “Rotation Order” should be Yaw -> Pitch -> Roll (ZYX in Unreal, YXZ in Unity).
Record Gesture Samples: This selects for which gesture you want to record new samples, or whether you want to test identification instead (please note that new samples do not have any effect until “training” has been performed). When you record samples, please make sure that you record the gesture in many different ways. For example, if the player should be allowed to perform the gesture with a small motion and a large motion, be sure to record both small and large samples. It can also help to record gesture samples from several people to ensure that particular habits of one person don’t affect the recognition for other players.
Compensate head motion during gesture: Whether the motion of your frame of reference should be removed from the motion of the joints of the hands. For example, if you’re turning your head to the right while constantly holding your hand in front of your face, the head rotation will be compensated and your hand will appear motionless to the gesture recognition AI. This setting affects how gestures are recorded – it is not possible to change it after the gesture sample has been recorded. Please note that activating the “compensate head motion” option is likely to result in lower recognition scores and lower similarity values for identified gestures. This is because people tend to look at their hands when gesturing. Imagine making a “swipe-left” gesture and a “swipe-right” gesture. If you don’t compensate head motion, they are easy to distinguish. But if you look at your hands while making the gesture and compensate for that head motion, then all that MiVRy gets is that your hand stays in front of your face. Then it’s difficult to distinguish whether you swiped left or right.
Coordinate System Conversion: These two settings help to ensure that the same VR coordinates (the directions of “x”, “y”, and “z” of the headset and controllers) are used by the Gesture Manager and your final project. Set “Unity XR Plug-in” to the XR plug-in that you are using in Unity (in Unity Project Settings -> XR Plug-in Management). Set “MiVRy Coordinate System” to whichever coordinate system you want to use in your own project (for example: “Unreal” for Unreal Engine coordinates). If you don’t wish to use different coordinate systems, you don’t need to adjust these values.
Start Training / Stop Training: This starts or interrupts the training process in which the AI tries to learn your gestures. The “Performance” value, which is updated during training, indicates how many of your gestures the AI can already correctly identify. Even when the training is stopped prematurely, the result is preserved, so you can stop it as soon as you are satisfied. Sometimes the AI ‘misunderstands’ your intentions and the future recognition of gestures is not satisfactory. In this case, just re-run the training process. If the result still is not good, please record more gesture samples with greater variation to make it clearer to the AI what you intend.
Licensing and Activation
MiVRy is free for commercial, personal, and academic use.
However, the free version of MiVRy has certain limitations.
The free version of MiVRy can only be used to identify 100 gestures per session (meaning every time you run the app). When using continuous gesture identification, it can only be used for a total of 100 seconds of identifying gestures.
To unlock unlimited gesture recognition, you must purchase a license at:
https://www.marui-plugin.com/mivry/
The license key will be sent to you automatically and immediately after purchase.
If the license email does not arrive, please check your spam filter, and contact support@marui-plugin.com
The license credentials must then be used to activate MiVRy, either by calling the activateLicense() function at runtime or by inserting them into the license settings fields of the Unity or Unreal plug-in.
Using a License File:
Alternatively, you can save the license name (ID) and key into a file and load it with the activateLicenseFile() function (or insert the path to the license file into the Mivry.cs component / MivryActor if you use either of those).
The license file is a simple text file that you can create with any text editor. It contains the keywords “NAME” and “KEY”, each followed by a colon (“:”) or equals sign (“=”) and then the respective license credential.
Here is an example of how the contents of a valid license file may look:
NAME: your@email.com_3z0UvQ3GBkAc74VW9nQKPlbm
KEY : b701b7235a483698e61a2b8d69479ed013a03069fcb9b892302277a0f394c257
How to use the GestureRecognition.h header (for one-handed one-motion gestures)
(1) Include the GestureRecognition.h header file in your project, together with the appropriate library file for your platform (Windows DLL x86 / x64, Linux .so, Android .so, UWP, …).
(2) Create a new Gesture recognition object and register the gestures that you want to identify later.
IGestureRecognition* gr = new IGestureRecognition();
int myFirstGesture = gr->createGesture("my first gesture");
int mySecondGesture = gr->createGesture("my second gesture");
(3) Record a number of samples for each gesture by calling startStroke(), contdStroke() and endStroke() for your registered gestures, each time inputting the headset and controller transformation.
double hmd_p[3] = { /* Headset position (x,y,z) */ };
double hmd_q[4] = { /* Headset rotation quaternion (x,y,z,w) */ };
gr->startStroke(hmd_p, hmd_q, myFirstGesture);
[…]
// repeat the following while performing the gesture with your controller:
double p[3] = { /* Controller position (x,y,z) */ };
double q[4] = { /* Controller rotation quaternion (x,y,z,w) */ };
gr->contdStrokeQ(p,q);
// ^ repeat while performing the gesture with your controller.
[…]
gr->endStroke();
Repeat this multiple times for each gesture you want to identify.
We recommend recording at least 20 samples for each gesture.
(4) Start the training process by calling startTraining().
You can optionally register callback functions to receive updates on the learning progress by calling setTrainingUpdateCallback() and setTrainingFinishCallback().
gr->setMaxTrainingTime(10); // Set training time to 10 seconds.
gr->startTraining();
You can stop the training process by calling stopTraining().
After training, you can check the gesture identification performance by calling recognitionScore() (a value of 1 means 100% correct recognition).
(5) Now you can identify new gestures performed by the user in the same way as you were recording samples:
double hmd_p[3] = { /* Headset position (x,y,z) */ };
double hmd_q[4] = { /* Headset rotation quaternion (x,y,z,w) */ };
gr->startStroke(hmd_p, hmd_q);
[…]
// repeat the following while performing the gesture with your controller:
double p[3] = { /* Controller position (x,y,z) */ };
double q[4] = { /* Controller rotation quaternion (x,y,z,w) */ };
gr->contdStrokeQ(p,q);
// ^ repeat while performing the gesture with your controller.
[…]
int identifiedGesture = gr->endStroke();
if (identifiedGesture == myFirstGesture) {
// ...
}
(6) Beyond just getting the most likely candidate for which gesture was performed, you can also get a similarity value that indicates how closely the performed motion resembles the identified gesture:
double similarity;
int identifiedGesture = gr->endStroke(&similarity);
This provides a value between 0 and 1, where 0 indicates that the performed gesture is very much unlike the previously recorded gestures, and 1 indicates that the performed gesture is the exact average of all previously recorded samples and thus highly similar to the intended gesture.
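For example, an application could treat an identification whose similarity falls below some cutoff as "no gesture at all". The following is a minimal sketch; the helper function and the threshold are illustrative assumptions, not part of the MiVRy API:

```cpp
// Illustrative helper (not part of the MiVRy API): discard identification
// results whose similarity is below a project-specific cutoff.
// The threshold is a value you would tune for your own application.
int filterBySimilarity(int identifiedGesture, double similarity, double threshold)
{
    // Return -1 to signal "no valid gesture" in this sketch.
    return (similarity >= threshold) ? identifiedGesture : -1;
}
```

One would call this right after `endStroke(&similarity)`, passing the returned gesture ID and similarity value.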
(7) You can save and load your gestures to and from a gesture database file.
gr->saveToFile("C:/myGestures.dat");
// ...
gr->loadFromFile("C:/myGestures.dat");
How to use the GestureCombinations.h header (for two-handed gestures and gesture-combos)
(1) Include the appropriate library file (.DLL file on Windows) and the GestureRecognition.h and GestureCombinations.h header files in your project.
(2) Create a new GestureCombinations object with the number of “parts” that you want to use. “Part” here means a separate object or body part (e.g. left hand and right hand), but it can also be used to identify consecutive gestures (i.e. multi-part sequential gestures). Then register the gestures that you want to identify later.
In this example, we use gesture part “0” to mean “left hand” and gesture part “1” to mean right hand, but you can use any indexing of parts that you want.
IGestureCombinations* gc = new IGestureCombinations(2);
int myFirstCombo = gc->createGestureCombination("wave your hands");
int mySecondCombo = gc->createGestureCombination("play air-guitar");
Also, create the individual gestures (aka sub-gestures) that each combo will consist of.
const int left = 0;
const int right = 1;
int myFirstCombo_leftHandGesture = gc->createGesture(left, "Wave left hand");
int myFirstCombo_rightHandGesture = gc->createGesture(right, "Wave right hand");
int mySecondCombo_leftHandGesture = gc->createGesture(left, "Hold guitar neck");
int mySecondCombo_rightHandGesture = gc->createGesture(right, "Hit strings");
Then set the Gesture Combinations to be the connection of those gestures.
gc->setCombinationPartGesture(myFirstCombo, left, myFirstCombo_leftHandGesture);
gc->setCombinationPartGesture(myFirstCombo, right, myFirstCombo_rightHandGesture);
gc->setCombinationPartGesture(mySecondCombo, left, mySecondCombo_leftHandGesture);
gc->setCombinationPartGesture(mySecondCombo, right, mySecondCombo_rightHandGesture);
(3) Record a number of samples for each gesture by calling startStroke(), contdStroke() and endStroke() for your registered gestures, each time inputting the headset and controller transformation.
double hmd_p[3] = { /* insert headset position here */ };
double hmd_q[4] = { /* insert headset rotation quaternion here */ };
gc->startStroke(left, hmd_p, hmd_q, myFirstCombo_leftHandGesture);
gc->startStroke(right, hmd_p, hmd_q, myFirstCombo_rightHandGesture);
[…]
// repeat the following while performing the gesture with your controller:
double p_left[3] = { /* insert left controller position here */ };
double q_left[4] = { /* insert left controller rotation quaternion here */ };
gc->contdStrokeQ(left, p_left, q_left);
double p_right[3] = { /* insert right controller position here */ };
double q_right[4] = { /* insert right controller rotation quaternion here */ };
gc->contdStrokeQ(right, p_right, q_right);
// ^ repeat the above while performing the gesture with your controllers.
[…]
// at the end of the gesture, call:
gc->endStroke(left);
gc->endStroke(right);
Repeat this multiple times for each gesture you want to identify.
We recommend recording at least 20 samples for each gesture, and having different people perform each gesture.
(4) Start the training process by calling startTraining().
You can optionally register callback functions to receive updates on the learning progress by calling setTrainingUpdateCallback() and setTrainingFinishCallback().
gc->setMaxTrainingTime(60); // Set training time to 60 seconds.
gc->startTraining();
You can stop the training process by calling stopTraining(). After training, you can check the gesture identification performance by calling recognitionScore() (a value of 1 means 100% correct recognition).
(5) Now you can identify new gestures performed by the user in the same way as you were recording samples:
double hmd_p[3] = { /* headset position */ };
double hmd_q[4] = { /* headset rotation quaternion */ };
gc->startStroke(left, hmd_p, hmd_q);
gc->startStroke(right, hmd_p, hmd_q);
[…]
// repeat the following while performing the gesture with your controller:
double p_left[3] = { /* left controller position */ };
double q_left[4] = { /* left controller rotation quaternion */ };
gc->contdStrokeQ(left, p_left, q_left);
double p_right[3] = { /* right controller position */ };
double q_right[4] = { /* right controller rotation quaternion */ };
gc->contdStrokeQ(right, p_right, q_right);
// ^ repeat while performing the gesture with your controllers.
[…]
gc->endStroke(left);
gc->endStroke(right);
int identifiedGestureCombo = gc->identifyGestureCombination();
if (identifiedGestureCombo == myFirstCombo) {
// ...
}
(6) Now you can save and load the artificial intelligence.
gc->saveToFile("C:/myGestureCombos.dat");
// ...
gc->loadFromFile("C:/myGestureCombos.dat");
Special Applications
(1) Identifying gestures while walking in the real world:
If you want to identify a gesture while changing your position or orientation in the real world, it can help to activate the “Compensate Head Motion” option of MiVRy.
This will cause MiVRy to record the controller position relative to the current headset position and rotation.
In the Gesture Manager, this can be activated in the menu below the Recording submenu, under “Compensate Head Motion”.
In Unreal, the MivryActor component offers the setting “CompensateHeadMotion”.
In Unity, the Mivry.cs component offers the setting “compensateHeadMotion”.
Please note that activating the “compensate head motion” option is likely to result in lower recognition scores and lower similarity values for identified gestures. This is because people tend to look at their hands when gesturing. Imagine making a “swipe-left” gesture and a “swipe-right” gesture. If you don’t compensate head motion, they are easy to distinguish. But if you look at your hands while making the gesture and compensate for that head motion, then all that MiVRy gets is that your hand stays in front of your face. Then it’s difficult to distinguish whether you swiped left or right.
(2) Continuous gesture identification
If you don’t want to explicitly define the beginning and end of a gesture motion (e.g. with a trigger button), you can use MiVRy to continuously try to identify ongoing gesture motions while they are in progress.
In the Unreal Plug-in’s `MiVRyActor` component, this can be activated with the `ContinuousGestureRecognition` option.
In the Unity Plug-in’s `Mivry.cs` script component, this can be activated with the `ContinuousGestureRecognition` option.
Please note that you still need to define a trigger to start/stop the overall identification process. Between the start and stop indicated by the trigger, MiVRy will continuously try to identify the gesture motion. However, you can set the trigger to an absolute value higher than the threshold instead of using user input – then MiVRy will start immediately when the level is started.
To do this with the `GestureRecognition` or `GestureCombinations` objects directly, you still have to call “startStroke” once (for example at the start of the level),
and then you still have to continuously provide the controller position with the “contdStroke” function.
After each call to the “contdStroke” function, you can use the “contdIdentify” function to identify the currently performed gesture. Use the contdIdentificationPeriod value to control how long a time frame to consider in the identification. You can also use contdIdentificationSmoothing to keep the identification result from jumping from one gesture ID to another too easily. For recording continuous gestures, you can use the “contdRecord” function, which should also be called after each call to “contdStroke” during recording. You don’t need to call “endStroke” until you are finishing your gesture recording or identification session.
Check the settings for `ContinuousGesturePeriod` and `ContinuousGestureSmoothing` to specify details regarding the continuous gesture identification.
How to use the Android AAR package
(1) Import the library into your Android Studio project:
1: Select File > New > New Module.
2: Select Import .JAR/.AAR Package then click Next.
3: Select the location of MiVRy.aar and click “Finish”.
4: In your project’s build.gradle, add the following to the “dependencies” list: implementation project(':mivry')
(2) Include in your java files:
1: Add the import to the top of your file:
import com.maruiplugin.mivry.MiVRy;
2: Create a MiVRy object:
MiVRy mivry = new MiVRy();
(3) Create gestures:
1: Register a new gesture with:
int gesture_id = mivry.CreateGesture("my gesture");
2: Start the gesture recording with:
mivry.StartGesture(gesture_id);
3: Move the device to perform the gesture.
4: Finish the recording with:
mivry.EndGesture();
5: Repeat the process (3.2 ~ 3.4) multiple times. We recommend recording at least 20 samples.
6: After recording samples for all your gestures, start the training process:
mivry.StartTraining();
(4) Identify gestures:
1: Start the gesture with:
mivry.StartGesture(-1);
2: Move the device to perform the gesture.
3: End the gesture and get the ID of the identified gesture with
int gesture_id = mivry.EndGesture();
4: If needed, the name of the identified gesture can be acquired with
String gesture_name = mivry.GetGestureName(gesture_id);
Troubleshooting and Frequently Asked Questions
(1) Where / when in my own program do I have to create the GestureRecognition or GestureCombination object?
You can create the gesture recognition object anywhere in your project. There are no special requirements as to where to do it.
(2) Can MiVRy be used to identify hand gestures, finger gestures, or motion patterns of other objects than VR controllers?
Yes, MiVRy can identify the motion pattern of any object or body part. Just input the motion of each tracked object, finger, or body part separately (for example, by creating a GestureCombinations object and then using its contdStroke function for each part).
(3) Can I use MiVRy to identify gestures in a continuous stream of motion data without a clear start or end?
Yes, MiVRy can identify gestures continuously. Just use the contdIdentify function of either the GestureRecognition or GestureCombinations object.
(4) How can I open and edit gesture database (.DAT) files?
Please use the “GestureManager” to open and edit .DAT gesture database files.
(5) The Gesture Recognition library does not detect if a gesture is different from all recorded gestures. I want to know if the user makes the gesture I recorded or not.
The gesture recognition plug-in will always return the ID of whichever known gesture is most similar to the one you just performed.
If you want to check if the gesture you made is different from all the recorded gestures, use the following code instead of the normal “endStroke()” function:
double similarity;
int identified_gesture = gr->endStroke(&similarity); // C# wrapper: gr.endStroke(ref similarity)
Then the similarity variable will give you a measurement of how similar the performed gesture was to the detected gesture. A value of one indicates perfect similarity; a low value close to zero indicates great differences between the performed gesture and the recorded gesture. You can use this value to judge whether the performed gesture is sufficiently similar to the recorded one.
(6) I want to use Gesture Recognition in my commercial project. What commercial licensing options do you provide?
We offer both single-payment licenses for one project and profit-sharing licenses where we receive a part of the sales price of each unit sold.
Pricing is dependent on the size of your project.
Please contact us at support@marui-plugin.com for details.
(7) Do I have to call “startTraining()” every time I start my game? Does it have to keep running in the background while my app is running?
No, you only need to call startTraining() after you have recorded new gesture data (samples) and want these new recordings to be used by the AI. However, you need to save the AI after training to a database file (.DAT) and load this file in your game before using the other gesture recognition functions.
While the training is running, you cannot use any of the other functions, so you cannot let training run in the background. You must start (and stop) training in between using the AI.
(8) How long should I let the training run to achieve optimal recognition performance?
Usually, the AI will reach its peak performance within one minute of training, but if you’re using a large number of gestures and samples, it may take longer. You can check the current recognition performance from the training callback functions and see if the performance still keeps increasing. If not, feel free to stop the training.
(9) Gestures aren’t recognized correctly when I look up/down/left/right or tilt my head.
You can choose whether the frame of reference for your gestures is the player’s point of view (“head”) or the real world or game world (“world”). For example, if the player is looking up at the sky when performing a gesture towards the sky, then from a “world” frame-of-reference the direction is “up”, but from the player’s “head” point-of-view, the direction is “forward”. Therefore, if you consider your gestures to be relative to the world’s “up” (sky) and “down” (ground) rather than the visual “upper end of the screen” and “lower end of the screen”, then change the frameOfReferenceUpDownPitch to FrameOfReference.World. The same setting is available for the yaw (compass direction) and head tilt.
(10) After some time, all attempts to identify a gesture fail with error code -16.
You have used up all “free” gesture recognitions of the free version of MiVRy for this
session. To identify more gestures, restart the app, or purchase an “unlimited” license at
https://www.marui-plugin.com/mivry/
(11) MiVRy identifies any motion as some gesture, even when it doesn’t resemble any of the recorded gestures. Why? How can I tell if no valid gesture motion was performed?
MiVRy will always tell you the “most likely” best guess as to which gesture was just performed, no matter how different the currently performed motion is from all recorded gestures. This is because we cannot decide for you how much difference is tolerable.
In order to disqualify “wrong” motions, you have two options:
(A) you can check the “similarity” value returned by MiVRy. This value describes how similar the gesture motion was compared to previous recordings on a scale from 0 (very different) to 1 (very similar).
(B) you can check the “probability” value. Especially when you compare the probability values for all recorded gestures (for example via the “endStrokeAndGetAllProbabilitiesAndSimilarities” function) and see that they are all very low and not very different from one another, you may want to decide that the current gesture performance was not valid.
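As a sketch of option (B), one could compare the per-gesture probability values and reject the identification when the best probability is low or barely ahead of the runner-up. The function name and both thresholds below are illustrative assumptions, not MiVRy defaults:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Illustrative check (not part of the MiVRy API): given the probability
// values for all gestures (e.g. as obtained via
// endStrokeAndGetAllProbabilitiesAndSimilarities), consider the result
// valid only if the best probability is high enough AND clearly ahead
// of the second-best. Both thresholds are arbitrary example values.
bool isLikelyValidGesture(std::vector<double> probabilities,
                          double minProbability, double minMargin)
{
    if (probabilities.empty())
        return false;
    std::sort(probabilities.begin(), probabilities.end(), std::greater<double>());
    if (probabilities[0] < minProbability)
        return false; // even the best guess is a poor match
    if (probabilities.size() > 1 && probabilities[0] - probabilities[1] < minMargin)
        return false; // the best guess is not clearly ahead of the rest
    return true;
}
```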
(12) What exactly does the “similarity” value of a gesture performance mean? How is it different from the probability value?
The “similarity” value expresses how much the identified gesture differs from the average of the recorded samples for that gesture. When you record several samples, MiVRy internally calculates a “mean” (“average”, “typical”) gesture motion based on those samples. It also calculates how much the recorded samples differ from this “mean” (i.e. the “variance” of the samples). The “similarity” value is then calculated based on this “mean”. If your newly performed gesture motion hits exactly this “average”, then the similarity value will be one. The more it differs, the lower the “similarity” value will be, going towards zero.

How fast it falls depends on how similar the recorded samples were to each other. If all recorded samples looked exactly the same, then MiVRy will be very strict, and the “similarity” value will fall fast when the currently performed motion isn’t also exactly alike. If, however, the samples differed a lot, MiVRy will be more tolerant when calculating the “similarity” value and it will be higher. The value is always between 0 and 1.

This “similarity” is different from the “probability” values, which are estimates by the artificial intelligence (neural network). “Probability” may take many more considerations into account, for example whether there are other gestures that resemble the identified gesture (probability may drop while similarity is unaffected), or whether there is a multitude of distinct motions lumped together as one “gesture” (for example: having a gesture “alphabet” which contains drawings of “A”, “B”, “C” etc. all lumped together as one gesture – then “similarity” will be calculated based on an “average” character that doesn’t resemble any sample, but the AI may still successfully understand what you mean and give high “probability” values).
(13) Instead of triggering the start and end of a gesture motion, I want MiVRy to constantly run in the background and detect gestures as they occur.
You can use the “Continuous Gesture Identification” feature of MiVRy. When using the “GestureRecognition” or “GestureCombinations” objects directly, use the “contdIdentify” function – you can call this function repeatedly (for example on every frame, or whenever something in your app happens) and every time it will tell you which gesture is currently being performed. When using the Unity “Mivry” component or the UnrealEngine “MivryActor”, use the “Continuous Gesture Identification” switch.

Either way, two settings are important for Continuous Gesture Identification: “Continuous Gesture Period” and “Continuous Gesture Smoothing”. “Continuous Gesture Period” is the time frame (in milliseconds) that continuous gestures are expected to take. So if your gestures take 1 second to perform, set this to “1000” so that MiVRy will consider the last 1000 milliseconds when identifying the gesture. “Continuous Gesture Smoothing” is the number of samples (previous calls to “contdIdentify”) to use for smoothing continuous gesture identification results. When setting this to zero, each attempt to identify the gesture will stand alone, which may lead to sudden changes when switching from one gesture to another. If ContinuousGestureSmoothing is higher than zero, MiVRy will remember previous attempts to identify the gesture and will produce more stable output.
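To illustrate the effect that a smoothing window has, here is a simple majority vote over the last few per-frame results. This is only a sketch of the idea, not MiVRy’s actual smoothing algorithm, and the class name is hypothetical:

```cpp
#include <cstddef>
#include <deque>
#include <map>

// Illustration only (not MiVRy's actual algorithm): smooth a stream of
// per-frame identification results by majority vote over a sliding window,
// so that a single outlier frame does not flip the reported gesture.
class IdentificationSmoother {
public:
    explicit IdentificationSmoother(std::size_t windowSize) : windowSize(windowSize) {}

    // Feed the newest raw result (e.g. from contdIdentify) and get the
    // smoothed gesture ID back.
    int smooth(int gestureId) {
        history.push_back(gestureId);
        if (history.size() > windowSize)
            history.pop_front();
        std::map<int, int> counts;
        int best = gestureId, bestCount = 0;
        for (int id : history) {
            if (++counts[id] > bestCount) {
                bestCount = counts[id];
                best = id;
            }
        }
        return best;
    }

private:
    std::size_t windowSize;
    std::deque<int> history;
};
```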
(14) What is the “Update Head Position Policy” / “Compensate Head Motion” setting?
This setting decides whether the AI should consider changes in head position during the gesturing.
During gesturing, the current position of the VR headset can/will be updated via the “updateHeadPosition” procedure.
This data is saved together with the motion data.
However, in many cases it is not advisable to take head motions during gesturing into account, because people may watch their hands while gesturing.
Following the moving hands with the head would then eliminate the hand motion relative to the headset (the hands would always be “in front of the headset”).
However, in some cases it may be useful to use the changing head position, for example if the user might be walking during a gesture.
You can choose whether the data provided via calls to the “updateHeadPosition” functions will be used with the UpdateHeadPositionPolicy (or a call to GestureRecognition.setUpdateHeadPositionPolicy()).
“UseLatest” will cause MiVRy to use the changing head position, thus compensating the relative head motion during gesturing.
“UseInitial” will not consider changes in head motion during gesturing, but only the head position at the start of the gesture.
Note that if you use a GestureRecognition or GestureCombinations object directly, you also need to provide the changing head position via “updateHeadPosition()” for this to have any effect.
Also note that the data provided via “updateHeadPosition” is stored regardless of the policy, even when it is not used later.
(15) I’m getting return value / error code “-1” when trying to identify a gesture.
The value “-1” indicates that MiVRy doesn’t know of any gestures to which it could compare your current gesture motion – for example, if you performed a one-handed gesture motion but MiVRy has only been trained on two-handed gestures, or has not been trained at all. There are three common causes:
(1) Make sure that your gesture database file was loaded successfully. Check the log for any MiVRy error messages about failing to load your gesture database file.
(2) If you are using two-handed gestures (combinations), make sure that both triggers (for your left and right hand) are working and that you engage both of them at the same time.
(3) Make sure that you performed training after recording gestures as samples.
(16) I have a lot of gestures in my database file and the recognition performance is not so good anymore. Is there a way to limit the gestures without having to record new database files?
You can use the “setGestureEnabled” function to limit the gestures to which MiVRy compares the current motion and increase the recognition performance. You can do this at run-time in your app, for example to offer different gestures in different levels. There are, however, two caveats:
(1) It disables the gesture (per one hand), not the combination (i.e. the two-handed gesture). So if you have a combination “clap hands” which consists of the gestures “clap left hand” and “clap right hand”, then you disable “clap left hand” and “clap right hand” individually. This also means that if another gesture combination besides “clap hands” uses either “clap left hand” or “clap right hand”, that other gesture combination will also be affected by disabling these gestures.
(2) Enabling/disabling gestures works by omitting these gestures during training, so after disabling gestures, you need to run training again. After training, you can save a new copy of your gesture database file – the enabled/disabled states are saved in the database file.
Example code for disabling the gestures (parts) of a combination:
```
for (int combination = gc.numberOfGestureCombinations() - 1; combination >= 0; combination--) {
    bool enabled = enabledCombinations.Contains(combination);
    int leftHandGesture = gc.getCombinationPartGesture(combination, 0);
    int rightHandGesture = gc.getCombinationPartGesture(combination, 1);
    gc.setGestureEnabled(0, leftHandGesture, enabled);
    gc.setGestureEnabled(1, rightHandGesture, enabled);
}
gc.startTraining();
```