/****************************************************************************
**
** Copyright (C) 2012 Digia Plc and/or its subsidiary(-ies).
** Contact: http://www.qt-project.org/legal
**
** This file is part of the documentation of the Qt Toolkit.
**
** $QT_BEGIN_LICENSE:FDL$
** Commercial License Usage
** Licensees holding valid commercial Qt licenses may use this file in
** accordance with the commercial license agreement provided with the
** Software or, alternatively, in accordance with the terms contained in
** a written agreement between you and Digia.  For licensing terms and
** conditions see http://qt.digia.com/licensing.  For further information
** use the contact form at http://qt.digia.com/contact-us.
**
** GNU Free Documentation License Usage
** Alternatively, this file may be used under the terms of the GNU Free
** Documentation License version 1.3 as published by the Free Software
** Foundation and appearing in the file included in the packaging of
** this file.  Please review the following information to ensure
** the GNU Free Documentation License version 1.3 requirements
** will be met: http://www.gnu.org/copyleft/fdl.html.
** $QT_END_LICENSE$
**
****************************************************************************/

/*!
\group sensorgesture_plugins_topics
\title QtSensorGestures Plugins
\brief Information about the QtSensorGestures recognizer plugins.

The QtSensorGestures recognizer plugins are the way to create your own
sensor gestures.

Creating software that recognizes motion gestures from sensor data is a large subject that is not
covered here.

The QSensorGesture API does not limit you to any of the common classification methods of gesture
recognition, such as Hidden Markov Models, Neural Networks, or Dynamic Time Warping, nor to the
ad hoc heuristic recognizers used by Qt's built-in sensor gesture recognizers. It is essentially a
signaling system that lets lower-level gesture recognition methods and algorithms communicate with
higher-level applications.

\tableofcontents

\section1 Overview

The steps to create a sensor gesture plugin are as follows:
\list
\li Subclass QSensorGesturePluginInterface.
\li Subclass QSensorGestureRecognizer and implement the gesture recognizer logic using QtSensors.
\li Create an instance of that recognizer in the derived QSensorGesturePluginInterface class, and call
    QSensorGestureManager::registerSensorGestureRecognizer(sRec); in your registerRecognizers()
    function. QSensorGestureManager will retain ownership of the recognizer object.
\endlist

QSensorGestureRecognizer is the class in which the gesture recognizer logic should be implemented.
The following snippet shows the plugin class; a sketch of a matching recognizer follows it.

\snippet sensorgestures/plugin.cpp Plugin

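A matching recognizer could be sketched as follows. This sketch is not taken from the Qt sources:
the class name QMySensorGestureRecognizer and the "QtSensors.mygesture" id are placeholders, but
the virtual functions shown are the ones declared by QSensorGestureRecognizer.

\code
#include <QSensorGestureRecognizer>

class QMySensorGestureRecognizer : public QSensorGestureRecognizer
{
    Q_OBJECT
public:
    explicit QMySensorGestureRecognizer(QObject *parent = 0)
        : QSensorGestureRecognizer(parent), active(false)
    {
    }

    void create()
    {
        // Create and connect the QtSensors sensors (for example a
        // QAccelerometer) that the recognizer logic needs.
    }

    QString id() const
    {
        return QString("QtSensors.mygesture"); // placeholder recognizer id
    }

    bool start()
    {
        // Start the underlying sensors. Emit detected("mygesture") from the
        // sensor handling code when the gesture is recognized.
        active = true;
        return active;
    }

    bool stop()
    {
        // Stop the underlying sensors.
        active = false;
        return true;
    }

    bool isActive()
    {
        return active;
    }

private:
    bool active;
};
\endcode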

\section2 Recognizer Classes

If you are making sensor gestures available through the QtSensorGestures API, these are the
classes to subclass.

\annotatedlist sensorgestures_recognizer

\target Qt Sensor Gestures
\section3 Recognizer Plugins

The sensor gesture recognizers that come with Qt are implemented using an ad hoc heuristic approach.
The user cannot define their own gestures, and must learn how to perform and accommodate the
pre-defined gestures described here.

A developer may use any method, including computationally and training intensive well-known
classifiers, to produce gesture recognizers. There are currently no classes in Qt for gesture
training, nor any ability for the user to define their own sensor-based motion gestures.

A procedure for writing ad hoc recognizers might include the following steps (a sketch of such a
recognizer appears after this list):
\list
    \li Obtain and gather accelerometer output, through QAccelerometerReading, while the gesture is being performed.
    \li Use various methods and constraints on the accelerometer data to recognize the various states:
    \list i
        \li Initial 'listening' state for a gesture
        \li Start of a possible gesture, moving into a 'detecting' state
        \li End of a possible gesture, moving into a 'recognizing' state
        \li Finally, if it is recognized, the 'recognized' state; otherwise, a move back to
           the 'listening' state
    \endlist
    \li Test the gesture to make sure it is easy to perform and does not produce too many false
        positive recognitions or, when used alongside other gestures, collisions, where one
        gesture is recognized as another.

\endlist
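
The following sketch, which is not part of Qt, illustrates one way to structure such a state
machine in a QSensorGestureRecognizer driven by a QAccelerometer. The class name, recognizer id,
thresholds, and emitted signal name are all made up for illustration; a real recognizer needs
tuning against recorded gesture data.

\code
#include <QAccelerometer>
#include <QAccelerometerReading>
#include <QSensorGestureRecognizer>

// Hypothetical recognizer showing the listening/detecting/recognizing states.
class QMyStateMachineRecognizer : public QSensorGestureRecognizer
{
    Q_OBJECT
public:
    explicit QMyStateMachineRecognizer(QObject *parent = 0)
        : QSensorGestureRecognizer(parent), state(Listening), active(false) {}

    void create() { connect(&accel, SIGNAL(readingChanged()), this, SLOT(onReading())); }
    QString id() const { return QString("QtSensors.mystatemachine"); } // placeholder id
    bool start() { active = accel.start(); return active; }
    bool stop() { accel.stop(); active = false; return true; }
    bool isActive() { return active; }

private slots:
    void onReading()
    {
        const qreal x = accel.reading()->x();

        switch (state) {
        case Listening:               // wait for a movement large enough to matter
            if (qAbs(x) > 15)         // hypothetical start threshold, in m/s^2
                state = Detecting;
            break;
        case Detecting:               // gesture in progress, wait for it to settle
            if (qAbs(x) < 1)          // hypothetical rest threshold
                state = Recognizing;
            break;
        case Recognizing:
            // A real recognizer would test the gathered data against its
            // constraints here; this sketch simply reports success.
            emit detected("mystatemachine");
            state = Listening;        // recognized or not, go back to listening
            break;
        }
    }

private:
    enum State { Listening, Detecting, Recognizing } state;
    QAccelerometer accel;
    bool active;
};
\endcode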

Here is a list of the included plugins and their signals.

For the ShakeGestures plugin:
    \table
        \row
            \li Recognizer Id
            \li Signals
        \row
            \li QtSensors.shake
            \li shake
\endtable

For the QtSensorGestures plugin:
    \table
        \row
            \li Recognizer Id
            \li Signals
            \li Description
            \li Images
        \row
            \li QtSensors.cover
            \li cover
            \li Hand covers the phone display for one second while the phone is face up, using the Proximity and Orientation sensors.
            \li \image sensorgesture-cover.png
        \row
            \li QtSensors.doubletap
            \li doubletap
            \li Double tap of finger on phone, using the DoubleTap sensor.
            \li \image sensorgesture-doubletap.png
        \row
            \li QtSensors.hover
            \li hover
            \li Hand hovers about 4 cm above the face-up phone for more than one second and is then removed,
               using the IR Proximity sensor.
            \li \image sensorgesture-hover.png
        \row
            \li QtSensors.pickup
            \li pickup
            \li Phone is resting face up on a flat surface, and is then picked up and brought into a viewing position, using the Accelerometer sensor.
            \li \image sensorgesture-faceup.png
        \row
            \li QtSensors.shake2
            \li shakeLeft, shakeRight, shakeUp, shakeDown
            \li Shake phone in a certain direction, using the Accelerometer sensor.
            \li \image sensorgesture-shake.png
        \row
            \li QtSensors.slam
            \li slam
            \li Phone is held top up with one side facing forward for a moment, then swung quickly downward,
                as if pointing at something with its top corner, using the Accelerometer and Orientation sensors.
            \li \image sensorgesture-slam_1.png
            \image sensorgesture-slam_2.png
        \row
            \li QtSensors.turnover
            \li turnover
            \li Phone is turned face down and placed on a surface, using Proximity and Orientation sensors.
            \li \image sensorgesture-facedown.png
        \row
            \li QtSensors.twist
            \li twistLeft, twistRight
            \li Phone is held face up and then twisted left or right (left side up or right side up) and back, using the Accelerometer and Orientation sensors.
            \li \image sensorgesture-twist.png
        \row
            \li QtSensors.whip
            \li whip
            \li Phone is moved quickly down and then back up, using the Accelerometer and Orientation sensors.
            \li \image sensorgesture-whip.png
   \endtable

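As a usage sketch, an application could listen for one of the recognizers listed above through
QSensorGesture. The GestureListener class and its onDetected() slot are hypothetical; the
"QtSensors.shake2" id and its signals are taken from the table above.

\code
#include <QDebug>
#include <QSensorGesture>

class GestureListener : public QObject
{
    Q_OBJECT
public:
    explicit GestureListener(QObject *parent = 0) : QObject(parent)
    {
        // Request the shake2 recognizer listed in the table above.
        QSensorGesture *gesture = new QSensorGesture(
                QStringList() << "QtSensors.shake2", this);
        // The per-gesture signals (shakeLeft, shakeRight, ...) are created
        // dynamically, so the SIGNAL()/SLOT() connection syntax is used here.
        connect(gesture, SIGNAL(detected(QString)),
                this, SLOT(onDetected(QString)));
        gesture->startDetection();
    }

private slots:
    void onDetected(const QString &gestureId)
    {
        qDebug() << "gesture detected:" << gestureId;
    }
};
\endcode
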
*/