/****************************************************************************
**
** Copyright (C) 2012 Nokia Corporation and/or its subsidiary(-ies).
** Contact: http://www.qt-project.org/
**
** This file is part of the Qt3D documentation of the Qt Toolkit.
**
** $QT_BEGIN_LICENSE:FDL$
** GNU Free Documentation License
** Alternatively, this file may be used under the terms of the GNU Free
** Documentation License version 1.3 as published by the Free Software
** Foundation and appearing in the file included in the packaging of
** this file.
**
** Other Usage
** Alternatively, this file may be used in accordance with the terms
** and conditions contained in a signed written agreement between you
** and Nokia.
**
**
**
**
**
** $QT_END_LICENSE$
**
****************************************************************************/

/*!
    \title Using GLSL shaders in QML
    \example quick3d/shaders


    In this tutorial, we will show how to apply QML property animations to
    GLSL shaders in a 3D application.

    Starting with a relatively simple shader program, we are going to
    manipulate various parameters to explain how both the shader and the QML
    integration work (this is quite a lot to start a tutorial with, but we'll
    focus on each small piece of it in turn):

    \raw HTML
        <table align="left" width="100%">
        <tr class="qt-code"><td>
    \endraw
    \snippet quick3d/shaders/shader-tutorial.qml 1
    \raw HTML
        </td><td align="right">
    \endraw
    \inlineimage tutorials/shader-tutorial.png
    \raw HTML
        </td></tr>
        </table>
    \endraw

    At the highest level, shaders in Qt3D are created using the ShaderProgram
    element.  The ShaderProgram element in this example has the id \c program,
    and applying it to the teapot is as simple as assigning it to the \c effect
    property of an Item3D-derived element.

    The two most important properties of the ShaderProgram element are the
    \c vertexShader and \c fragmentShader properties, which are strings
    containing the code for the respective shader programs.  Broadly speaking,
    the vertex shader is run on each vertex in a model (for example, the 8
    corners of a cube) to calculate the final position of each point, while the
    fragment shader is run on each pixel visible onscreen to calculate its
    color.

    The other ShaderProgram property we use in this tutorial is \c texture.
    This property is connected to the \c qt_Texture0 sampler uniform in our
    shader programs, and we'll cover how it's used shortly.
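
    In outline, the QML structure looks something like this (a trimmed-down
    sketch rather than the full tutorial file; the texture file name is just
    a placeholder):
    \code
    ShaderProgram {
        id: program
        texture: "textures/qtlogo.png"    // placeholder image path
        vertexShader: "
            // GLSL vertex shader source, as a string
        "
        fragmentShader: "
            // GLSL fragment shader source, as a string
        "
    }

    TutorialTeapot {
        effect: program    // apply the shader to an Item3D-derived element
    }
    \endcode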

    \section1 Texture Coordinates and Textures (shader-tutorial-1-textures.qml)

    To start with, there are a couple of obvious problems with our starting
    point - the Qt logo is upside down and back to front, and it's also boring
    (before you ask, the reasons for the image being the wrong way around are
    also boring).

    First, let's get the logo the right way up.  The key here is the texture
    coordinates.  Let's look at its declaration in the vertex shader for a
    moment: \code attribute highp vec4 qt_MultiTexCoord0; \endcode

    The \c attribute declaration indicates that this is a per-vertex value
    that we get from our model.  Our teapot has a position, a normal, and
    texture coordinates for each vertex, and Qt3D automatically provides
    these to us through the \c qt_Vertex, \c qt_MultiTexCoord0, and \c
    qt_Normal attributes.  (We won't care about normals until we get to
    lighting.)

    The precision qualifier (\c highp here, \c mediump in later snippets)
    indicates how much accuracy we need for this attribute.  It's a hint that
    lets embedded systems save bandwidth: \c highp is usually used for
    positions, \c mediump is fine for texture coordinates and normals, and
    \c lowp is generally only suitable for colors.  The rule of thumb is to
    use as little precision as you can get away with.
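
    Putting the standard attributes together with typical precision choices,
    a vertex shader's declarations might start out like this (a sketch; the
    exact qualifiers are up to you, and you only declare the attributes your
    shader actually uses):
    \code
    attribute highp vec4 qt_Vertex;            // per-vertex position
    attribute mediump vec4 qt_MultiTexCoord0;  // per-vertex texture coordinates
    attribute mediump vec3 qt_Normal;          // per-vertex normal (used for lighting)
    \endcode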

    \c vec4 indicates that the value is a 4-element vector.  Our texture is
    2D, so we only care about the first two elements, but there are 3D
    textures and cube maps out there.

    To fix the image's orientation we can simply reverse the sign when we
    pass the texture coordinate to our fragment shader, because the default
    behaviour is for texture coordinates to wrap around if they are higher
    than 1.0 or lower than 0.0 (for example, a coordinate of -0.25 samples
    the same texel as 0.75, which mirrors the image):
    \code texCoord = -qt_MultiTexCoord0; \endcode

    In order to fix the boring, we're going to have to add a bit more QML
    first.  We need to add a property to our ShaderProgram element, and a
    matching uniform variable to the shader program.  The types don't match
    up exactly between QML and GLSL, but as long as the two have the same
    name they are automatically hooked up, so we just add this to the
    ShaderProgram element:
    \code
    property real textureOffsetX : 1.0
    NumberAnimation on textureOffsetX
    {
        running: true; loops: Animation.Infinite
        from: 0.0; to: 1.0;
        duration: 1000
    }
    \endcode
    and we add this to the vertex shader:
    \code uniform mediump float textureOffsetX; \endcode

    The final step is to work our new variable into the program, by changing
    the texCoord assignment.  We'll add our animated variable to our horizontal
    texture coordinate, which will effectively scroll our texture around our
    teapot:
    \code
        texCoord = vec4(-qt_MultiTexCoord0.s - textureOffsetX,
                        -qt_MultiTexCoord0.t,
                        0,0);
    \endcode

    Adding an additional texture is done by adding another uniform.  The
    ShaderProgram element interprets string properties as URIs for texture
    resources, so adding a second texture is as easy as:
    \code
    property string texture2: "textures/basket.jpg"
    \endcode
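
    On the shader side, the matching declaration (which we'll see again in
    the full fragment shader below) is simply:
    \code
    uniform sampler2D texture2;
    \endcode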
    In order to have a smooth transition back and forth, we'll also add and
    animate a second property that we'll use to blend the two textures:
    \code
        property real interpolationFactor : 1.0
        SequentialAnimation on interpolationFactor
        {
            running: true; loops: Animation.Infinite
            NumberAnimation {
                from: 0.0; to: 1.0;
                duration: 2000
            }
            PauseAnimation { duration: 500 }
            NumberAnimation {
                from: 1.0; to: 0.0;
                duration: 2000
            }
            PauseAnimation { duration: 500 }
        }
    \endcode

    Next we need to use all of that information in the fragment shader:
    \code
        varying highp vec4 texCoord;
        uniform sampler2D qt_Texture0;
        uniform sampler2D texture2;
        uniform mediump float interpolationFactor;

        void main(void)
        {
            mediump vec4 texture1Color = texture2D(qt_Texture0, texCoord.st);
            mediump vec4 texture2Color = texture2D(texture2, texCoord.st);
            mediump vec4 textureColor = mix(texture1Color, texture2Color,
                                        interpolationFactor);
            gl_FragColor = textureColor;
        }
    \endcode

    In general, textures need to have the same name in the shaders as in the
    ShaderProgram element in order to be automatically connected, but \c
    qt_Texture0 is special-cased and very slightly faster, so it's good to use
    it first.

    \c mix() is another handy built-in GLSL function that interpolates linearly
    between two values or vectors.  (You can find details of all the built-in
    functions in the official OpenGL Shading Language specification available
    at http://www.khronos.org/opengl/ )
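
    For reference, \c {mix(a, b, t)} is equivalent to the linear blend
    \c {a * (1.0 - t) + b * t}.  Using the names from our fragment shader:
    \code
    // t == 0.0 gives texture1Color, t == 1.0 gives texture2Color,
    // and values in between give a proportional blend of the two.
    mediump vec4 blended = mix(texture1Color, texture2Color, interpolationFactor);
    \endcode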

    Finally, let's make one more change to make this example pop.  If you're
    a performance fanatic, it just might have rankled that we padded out our
    texture coordinates with two zeros, and passed them in for processing
    on every single pixel of our teapot.

    Let's make use of that space by putting a second set of co-ordinates in
    there to use with our second texture.  We'll change our texture coordinate
    assignment to this:
    \code
        texCoord.st = vec2(-qt_MultiTexCoord0.s - textureOffsetX,
                        -qt_MultiTexCoord0.t);
        texCoord.pq = vec2(-qt_MultiTexCoord0.s + textureOffsetX,
                        -qt_MultiTexCoord0.t);
    \endcode
    Now the top half of our vector contains co-ordinates spinning in the
    opposite direction.  Back in the fragment shader, we just need to use
    these for our second texture color, so let's change the \c texture2Color
    assignment to this, and really fix that boring:
    \code
        mediump vec4 texture2Color = texture2D(texture2, texCoord.pq);
    \endcode

    \section1 Varying values (shader-tutorial-varying.qml)

    The left-hand side of the \c texCoord assignments above is a \c varying
    variable.  \c varying values are how the vertex shader communicates with
    the fragment shader, and the declaration has to be identical in both
    shaders.  \c varying values are calculated once for each vertex, but the
    values are interpolated across the shape between the vertexes.  The
    shader-tutorial-varying.qml example shows this visually using the Pane
    class and a neat debugging trick - using the texture coordinates as
    a color.  Even with only 4 vertexes the texture coordinates are smeared
    smoothly across the shape:
    \table
    \row
    \o \code gl_FragColor = vec4(mod(texCoord.x, 1.0),
        mod(texCoord.y, 1.0), 0.0, 1.0);  \endcode
    \o
    \o \image tutorials/shader-tutorial-varying.png
    \endtable
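
    As a reminder of how the two halves connect, here is a minimal sketch of
    the matching declarations (the precision shown is illustrative):
    \code
    // In the vertex shader: texCoord is written once per vertex ...
    varying mediump vec4 texCoord;

    // ... and in the fragment shader the matching declaration receives the
    // value interpolated for every pixel.
    varying mediump vec4 texCoord;
    \endcode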


    \section1 Vertexes and Matrices (shader-tutorial-2-vertex-shader.qml)

    Let's go back to the vertex shader.  As already mentioned, the
    vertex shader's primary function is to generate the final position of a
    vertex.

    First, let's look at \c qt_Vertex.  This is the value we're getting out
    of our model, the actual points on our shape.  Manipulating this value
    lets us change the position of each point somewhat independently.

    For this tutorial, we'll create a squashing effect that might work for
    something rising out of water, or something rubbery being squashed down.

    What we need to do is create a floor, where vertexes above the floor
    retain their position, and vertexes below it are clamped to (nearly) that
    value.  Then we move the model relative to this floor to create a nice
    effect.

    We need to use some knowledge that we have about the model for this to
    work - most notably its height and its bottom.  A bit of experimenting
    suggests that 2.0 for height and -1.0 for bottom are close enough.

    We need to introduce a couple more GLSL things here.  \c max() is one of
    the GLSL built-in functions, selecting the higher of its two arguments as
    you might expect.

    The \c foo.xyzw suffixes are called \i twiddles (more commonly known as
    swizzles), and are convenient and efficient ways to pull out specific
    elements of vector arguments.  You can see in this example that we use
    twiddles to get the x, z, and w values out of our original vector and
    substitute in our own y value.  For convenience, there are 3 different
    sets of twiddles that are all equivalent: foo.xyzw (co-ordinates),
    foo.rgba (colors), and foo.stpq (texture coordinates).  As far as GLSL is
    concerned, though, they're all just vectors, and the different names are
    effectively just a way of commenting your code.

    The \c vec4() function will accept whatever combination of values and
    twiddles you throw at it, and mold them back into a 4-element vector.
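
    For example (a standalone sketch, not a line from the tutorial code):
    \code
    mediump vec4 v = vec4(1.0, 2.0, 3.0, 4.0);
    mediump vec2 st  = v.st;                  // (1.0, 2.0) - the first two elements
    mediump vec3 bgr = v.bgr;                 // (3.0, 2.0, 1.0) - reordered
    mediump vec4 v2  = vec4(v.x, 0.0, v.zw);  // substitute our own y value
    \endcode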

    What this code is doing is moving the model down the screen (which for
    us, here, is along the y axis in the negative direction).  We draw an
    imaginary line where the bottom of the model used to be, and if a vertex
    ends up below that line, we move it back to just past the line.

    "Just past" is important.  Graphics cards get confused if vertexes end up
    too close together, and the result is ugly.  Try taking out the
    "\c {qt_Vertex.y * 0.01}" term if you'd like to see what it looks like.
    Here's the line that does the clamping:

    \code
            float newY = max(qt_Vertex.y - squashFactor * modelSize,
                               qt_Vertex.y * 0.01 + modelBottom);
    \endcode

    In context, with the constants describing our model defined, the full
    position calculation looks like this:
    \code
            const float modelSize = 2.0;
            const float modelBottom = -1.0;
            float newY = max(qt_Vertex.y - squashFactor * modelSize,
                               qt_Vertex.y * 0.01 + modelBottom);

            gl_Position = qt_ModelViewProjectionMatrix *
                                  vec4(qt_Vertex.x, newY, qt_Vertex.zw);
    \endcode

    Hopefully, that makes the function of the \c qt_Vertex attribute clear, so
    next we'll look at the Model/View/Projection matrices.

    The model matrix is generally used to place an object in a scene.  In the
    simplest case, this might just be "up and to the left a bit", but it's
    often inherited from a parent.  It's very easy mathematically to combine
    the matrices in such a way that one object is always in the same position
    relative to another ("the hip bone's connected to the thigh bone").

    The view matrix is generally used much like a camera is in a movie, for
    panning around a whole scene at once.  Manipulating the view matrix is also
    commonly used for effects like mirrors.

    The projection matrix (\c qt_ProjectionMatrix) functions much like a
    camera does when you take a picture, converting all the 3d points in a
    scene into the 2d points on the screen.
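
    Conceptually the three are applied in sequence.  This tutorial's shaders
    use the combined \c qt_ModelViewProjectionMatrix, but if we used Qt3D's
    separate matrix uniforms instead (shown here only as an illustration of
    the idea, with assumed uniform names), the final position would be
    computed like this:
    \code
    // model and view transforms first, then projection onto the screen
    gl_Position = qt_ProjectionMatrix * qt_ModelViewMatrix * qt_Vertex;
    \endcode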

    We won't be exploring the matrices individually, but let's see what
    happens if we apply the same effect that we just used on the vertex after
    the transformation matrices have been applied.

    Firstly, we're going to need multiple teapots to see the difference, so
    let's add those in.  We want to move them all together, so we'll wrap
    them in an Item3D, and apply our animations to that item instead of
    the teapots directly.
    \code
    Item3D {
        z: -8.0
        transform: [
            Rotation3D {
                NumberAnimation on angle {
                    running: true; loops: Animation.Infinite
                    from: 0; to: 360; duration: 5000
                }
                axis: Qt.vector3d(0, 0, 1.0)
            }
        ]

        TutorialTeapot {id: teapot1; effect: program; y:2.0; x:0.0}
        TutorialTeapot {id: teapot2; effect: program; y:-1.0; x:-1.732}
        TutorialTeapot {id: teapot3; effect: program; y:-1.0; x:1.732}
    }
    \endcode

    In order to show the difference, what we want to do now is the same sort
    of effect as the previous example, only applied to the positions after
    the matrices have been applied, so our new vertex shader looks like this:
    \code
    attribute highp vec4 qt_Vertex;
    uniform mediump mat4 qt_ModelViewProjectionMatrix;

    attribute mediump vec4 qt_MultiTexCoord0;
    varying mediump vec4 texCoord;

    void main(void)
    {
        const float modelBottom = -4.0;

        vec4 workingPosition = qt_ModelViewProjectionMatrix * qt_Vertex;
        float newY = max(workingPosition.y,
                           workingPosition.y * 0.15 + modelBottom);
        workingPosition.y = newY;
        gl_Position = workingPosition;

        texCoord = -qt_MultiTexCoord0;
    }
    \endcode

    There's nothing new here; we're just tweaking a few numbers for the new
    effect and manipulating the vertexes after the matrices have been applied.
    The result is an imaginary line across the whole scene, and when any part
    of any teapot dips below that line we deform it as though it's being
    squished or refracted.

    The obvious difference is that when you're manipulating \c qt_Vertex,
    the inputs, outputs, and changes are all relative to the model.  After the
    matrices are applied, you're working in screen space, so the same clamping
    affects the whole scene at once - the imaginary line stays fixed on the
    screen while the teapots move through it.

    We'll leave adding a pretty watery surface as an exercise for the reader.



    \section1 Lighting

    Finally, we'll add lighting.  We've left lighting till last because it
    requires a lot of additional variables, and it is not within the scope of
    this tutorial to explore them all individually.  There are many better
    resources readily available, and the techniques already covered can be
    used to explore each element.


    For further reading, the full specification for the OpenGL Shading
    Language is available from the Khronos website at
    http://www.khronos.org/opengl/

    \l{qt3d-examples.html}{Return to the main Tutorials page}.
*/