This information was graciously excerpted from the printed documentation that comes with Object Glove. This information is Copyright 1992, 1993 by Mark Thomas Pflaging. No support for defining or using Gestures will be given to unregistered users of Object Glove and Court Jesture.

Using the "Court Jesture" Recognition System

Court Jesture is a gesture recognition system that is supplied with Object Glove. It operates in real time, allows definition of gestures by the end user, and dispenses the gestures selectively to application objects in an object-oriented way. Gestures can be grouped into "sets", and gesture sets can be activated and deactivated dynamically by the application. When two gloves are used, each glove can have its own gesture sets or share the same ones. As with Object Glove, Court Jesture works under DOS and Windows and has been combined with Rend386.

How Court Jesture Works

Court Jesture works by measuring changes in the glove input and reporting when the change is "close" to the values defined for a gesture. A gesture is nothing but a predefined change in the glove input over time. (There are other ways of defining gestures, but this is the definition used in Court Jesture.) Given this simple definition, a wide variety of gestures is possible. One of the primary purposes of a gesture recognition system should be to provide "commands" to the application instead of a steady flow of unfiltered data, and Court Jesture succeeds in this task.

Let's look at a couple of examples. Suppose we want to define a gesture like "turn the wrist upward and pull toward the body, while the fingers are in a fist." First, we observe that the gesture does not depend on the X or Y values that come from the glove. Second, we have not defined how long it will take the user to make this gesture. This is important because, if this time is too short, the user will not have enough time to move his hand.
If the duration count is too long, the user has to wait to see the effect of the gesture. In Court Jesture, a count of 100 corresponds to a gesture duration of about a second, so a duration of 80 might be chosen for this gesture. Third, we should realize that we are asking for the "change" in the finger input to be zero! (Because we said "while" the fingers were in a fist.) For the fingers, it is obviously more convenient to specify a "static" value - one that is not a change, but a specification of the position of the finger. In a closed fist position, the finger values are all zero. Lastly, we have to choose changes for the Z value and for the rotation. Over a duration of 0.8 seconds, the user can probably move their hand about two feet toward their body. This corresponds to a change in the Z value of positive 30. A rotation of positive 5 corresponds to the wrist moving clockwise (from the point of view of the user) about 165 degrees. We have decided on the following "delta" (change in input):

    Duration of change:         80
    Change in X:                don't care
    Change in Y:                don't care
    Change in Z:                positive 30
    Change in Rotation:         positive 5
    Static Thumb value:         zero
    Static Index finger value:  zero
    Static Middle finger value: zero
    Static Ring finger value:   zero

Another possible gesture might be "open your fist while the glove is near the center of the sensing area. We don't care how far the glove is from the sensors, so just use the X and Y values." Opening your fist implies that the glove fingers change from a closed position (of 0) to an open position (of 3). So, the delta for each finger is negative 3. We don't care about the change in X and Y; we only care that they are near zero. The Z and rotation values should be totally ignored. The duration will probably be pretty short, so we can use a value of 40.
Summing up:

    Duration of change:       40
    Static X value:           0
    Static Y value:           0
    Change in Z:              don't care
    Change in Rotation:       don't care
    Change in Thumb:          -3
    Change in Index finger:   -3
    Change in Middle finger:  -3
    Change in Ring finger:    -3

As you can see, a wide variety of gestures can be specified in this way.

There is one more aspect of Court Jesture that is somewhat esoteric. Every gesture has a priority that is used to "lock out" other gestures. Each time a gesture is sensed, it sends an "on" message to the application only if there is no other "on" gesture with a higher priority. When the gesture is no longer detected by the program, an "off" message is sent to the application, and lower priority gestures can occur again. This can be very useful if a particular motion tends to set off more than one gesture. Generally, you want to assign the highest priority to the gesture that gets triggered first; that way, only one gesture is sent to the application - the one with the highest priority. If two gestures arrive at exactly the same time (it is possible!), then the priority of the gesture with the longer duration is checked first. It is typical to assign a priority of 100 when priority is not an issue.

Editing Gestures for the Demo Program

Now we can try out our example gestures. An easy way to do this is to use one of the demo programs (either the DOS or the Windows version). They automatically read the gestures specified in the ``[Gestures]'' and ``[Buttons]'' sections. We will deal with the ``[Gestures]'' section for now. When a gesture from the ``[Gestures]'' section is received by the demo application, its name is printed on the screen. On each line, the following parameters are specified in order: Priority, Duration, Change in X, Change in Y, Change in Z, Change in Rotation, Change in Thumb, Change in Index finger, Change in Middle finger, Change in Ring finger, and Change in Keypad buttons.
If a parameter has a ``static'' attribute, meaning that it specifies a fixed value instead of a delta, then it should be directly preceded by an asterisk ('*'). If we don't care about a particular value, its place should be held by an 'X'. The number of spaces between the values is arbitrary. Here are the lines for the two examples given above:

    Example1=100 80 X X 30 5 *0 *0 *0 *0 X
    Example2=100 40 *0 *0 X X -3 -3 -3 -3 X

You should study the relationship between the specifications given above and the gesture lines given here.

There is one gesture parameter that hasn't been discussed - the ``change in Keypad buttons'' parameter. It works like the other parameters, except that its value is not a number - it can only be one of the following: ``0'', ``Center'', ``1'', ``2'', ``3'', ``4'', ``5'', ``6'', ``7'', ``8'', ``9'', ``Left'', ``Up'', ``Down'', ``Right'', ``Enter'', ``Select'', ``Start'', ``A'', or ``B''. These correspond to the buttons on the glove keypad. ``0'' and ``Center'' are wired together on the glove, so they correspond to the same value. You cannot put a negative sign in front of a button name, but you can put a '*' in front of it, which means the gesture needs to have that button down only at the time the gesture is finished.

The gestures in the ``[Buttons]'' section are used to trigger events in the demo programs. You can see that they are just gestures that respond to buttons being pushed. They can be redefined if you wish, but make sure to use the same parameter names (i.e. keep the left-hand sides of the settings the same).

In the Windows version of the demo program, you can associate gestures with sounds by specifying the association in the ``[Sounds]'' section of ``GLOVE.INI''. The sound is played through the Windows 3.1 Multimedia Interface, so this works best when you have a sound card installed. You can go into the Control Panel application, and from there to the ``Sound'' applet, to change and edit the system sound assignments.
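To make the line format above concrete, here is a minimal sketch of how such a gesture line could be parsed into its eleven parameters. This code is purely illustrative and is not part of Object Glove or Court Jesture; the function and field names are invented for the example.

```python
# Hypothetical parser for one [Gestures] line, e.g.
#   Example1=100 80 X X 30 5 *0 *0 *0 *0 X
# Not from the real Court Jesture code; for illustration only.

BUTTON_NAMES = {"0", "Center", "1", "2", "3", "4", "5", "6", "7", "8", "9",
                "Left", "Up", "Down", "Right", "Enter", "Select", "Start",
                "A", "B"}

FIELD_NAMES = ["Priority", "Duration", "X", "Y", "Z", "Rotation",
               "Thumb", "Index", "Middle", "Ring", "Buttons"]

def parse_gesture(line):
    """Split a 'Name=...' gesture line into its name and parameters."""
    name, _, rest = line.partition("=")
    fields = rest.split()                  # any amount of whitespace is fine
    if len(fields) != len(FIELD_NAMES):
        raise ValueError("expected 11 parameters, got %d" % len(fields))
    params = {"Priority": int(fields[0]), "Duration": int(fields[1])}
    for field_name, token in zip(FIELD_NAMES[2:], fields[2:]):
        static = token.startswith("*")     # '*' marks a static (fixed) value
        if static:
            token = token[1:]
        if token == "X":
            params[field_name] = None      # 'X' means "don't care"
        elif field_name == "Buttons":
            if token not in BUTTON_NAMES:  # button values are names, not numbers
                raise ValueError("unknown button: " + token)
            params[field_name] = ("static" if static else "delta", token)
        else:
            params[field_name] = ("static" if static else "delta", int(token))
    return name.strip(), params
```

Running it on the first example line yields priority 100, duration 80, don't-care X/Y, a Z delta of 30, a rotation delta of 5, and static zero values for all four fingers.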
See the file ``GLOVE.INI'' for further information and examples.

Addendum: There are several sections in GLOVE.INI that define gestures for the various demo programs:

    [1.Gestures] and [2.Gestures] are used by the DOS and Windows demo programs.
    [1.Rend386] and [2.Rend386] are used by "DEMO4B.EXE". (See DEMO4B.DOC.)
    [Windows] is used by "GLOVMOUS.EXE".
    [Buttons] is used by the DOS and Windows demo programs.
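The "on"/"off" messaging and priority lock-out described earlier can be sketched as follows. This is a toy model, assuming invented names (GestureDispatcher, update) that do not appear in the real Object Glove or Court Jesture API; it only illustrates the behavior: a gesture goes "on" only when no higher-priority gesture is currently on, and its "off" message lets lower-priority gestures fire again.

```python
# Toy model of Court Jesture's priority lock-out; all names are hypothetical.

class GestureDispatcher:
    def __init__(self):
        self.on = {}      # name -> priority of gestures currently "on"
        self.events = []  # ("on"/"off", name) messages sent to the application

    def update(self, name, priority, detected):
        """Report whether a gesture is currently being sensed."""
        if detected and name not in self.on:
            # Lock-out: suppress this gesture while any "on" gesture outranks it.
            if not any(p > priority for p in self.on.values()):
                self.on[name] = priority
                self.events.append(("on", name))
        elif not detected and name in self.on:
            # Gesture no longer sensed: send "off", freeing lower priorities.
            del self.on[name]
            self.events.append(("off", name))
```

For example, if a priority-200 "Fist" gesture is on, a priority-100 "Wave" gesture is suppressed; once "Fist" goes off, "Wave" can fire.

```python
d = GestureDispatcher()
d.update("Fist", 200, True)   # "on" sent for Fist
d.update("Wave", 100, True)   # locked out by Fist
d.update("Fist", 200, False)  # "off" sent for Fist
d.update("Wave", 100, True)   # now "on" sent for Wave
```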