My WebGL saga continues. If we are
talking tortoise vs hare here, then I am certainly the tortoise
member of this fable.
Slowly but surely we beat the thing into
submission. It's not really bad, just badly documented, IMHO, like so
much of the GL family. With WebGL we are definitely in a world
circa 1982 or so, but MUCH faster than we ever would have achieved
in 1982 no matter how rich you were. So we have vast power on all
our desktops, but for some reason "we" have "chosen"
to program it in a rather low-level way. Well, hell, I like low
level now and then. I like debugging this stuff that could have
been documented, but why bother, they can figure it out on their own.
So here are the kinds of things I have
been working on instead of figuring out how to end war, torturing the
rich, or writing great fiction.
1. The Element Array Debacle
So, for example, if I had been paying
attention, I would have noticed that the "element array"
feature of WebGL (1) was defined to take a "short", a 16-bit
number. Warning alarms should have gone off in my head, but it's
been a long time since I programmed 16-bit machines. Because it's a
short, the largest number of vertices one can address is 64K
(65,536). In other words, it is useless for objects of modern
complexity. Remember, there are no higher-order surfaces rendered
directly by WebGL, so we get the appearance of complexity by having a
lot of triangles, I mean a lot of triangles, and maybe playing with
the shading. Maybe this limitation was noted in the documentation,
but I don't think so.
The result was that I had to rewrite
the core of the object viewer to drop element arrays and just use
the vertices, the vertex attributes, and so on. It took about two to
three days and resulted in a much cleaner, if slightly more verbose,
piece of code that could actually be maintained if it had to be.
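The rewrite boils down to "de-indexing" the mesh: expanding the index list into a flat per-vertex array that can be handed to gl.drawArrays directly, with no index buffer at all. A minimal sketch of the idea, with my own helper names (nothing here is part of the WebGL API):

```javascript
// Expand an indexed mesh into a flat, non-indexed vertex array.
// positions: flat [x,y,z, x,y,z, ...] array of unique vertices.
// indices:   one entry per corner of each triangle.
// Returns the vertex data repeated once per corner, suitable for
// gl.drawArrays(gl.TRIANGLES, 0, indices.length).
function deindex(positions, indices) {
  const out = new Float32Array(indices.length * 3);
  for (let i = 0; i < indices.length; i++) {
    const v = indices[i] * 3;
    out[i * 3]     = positions[v];
    out[i * 3 + 1] = positions[v + 1];
    out[i * 3 + 2] = positions[v + 2];
  }
  return out;
}

// Two triangles sharing an edge: 4 unique vertices, 6 corners.
const positions = [0,0,0,  1,0,0,  1,1,0,  0,1,0];
const indices   = [0,1,2,  0,2,3];
const flat = deindex(positions, indices); // 18 floats, 6 vertices
```

The cost is repeated vertex data (shared vertices are duplicated per corner), which is why the result is more verbose but no longer subject to the 16-bit index ceiling.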
2. The How Many of WHAT Exactly? Problem
The documentation says that you need to
specify the number of things you are transferring. Well now, that
could mean the number of triangles, or it could mean the number of
vertices it takes to specify the triangles, or it could mean the
number of bytes that it takes to hold the data, or ...
And the answer is: it's the number of
vertices you use to define the object. So the count you send is (3 *
number of triangles) or (2 * number of lines). Maybe that was obvious
to you, but it sure was not obvious to me from the documentation.
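Spelled out as a tiny helper (my own function, not part of WebGL), the rule looks like this:

```javascript
// The count passed to drawArrays/drawElements is a count of
// *vertices*, not of triangles, lines, or bytes.
function vertexCount(primitive, primitiveCount) {
  switch (primitive) {
    case 'TRIANGLES': return 3 * primitiveCount; // 3 corners each
    case 'LINES':     return 2 * primitiveCount; // 2 endpoints each
    case 'POINTS':    return primitiveCount;     // 1 vertex each
    default: throw new Error('unhandled primitive: ' + primitive);
  }
}

// 500 triangles -> gl.drawArrays(gl.TRIANGLES, 0, 1500)
console.log(vertexCount('TRIANGLES', 500)); // 1500
```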
3. The Ongoing Normal Mystery
Look at the following picture. See
anything odd? Well, it's made out of flat triangles, and you should
see flat surfaces. OK, so it's interpolating the normals, what's so
odd about that? It's just that all the normals for a face (all
triangles, really) point in the same direction. Unless WebGL
is rebuilding the topology of the object by doing a giant vertex
sort, there is no way it could be interpolating the normals.
So what is going on? No one knows, but
I suspect that it is a bug in my shader that somehow does not compute
the diffuse component correctly. The specular would normally, and
correctly, be "smooth shaded", i.e. it would not show flat
surfaces for the most part. So maybe this is just flat shading, with
transparency, and a specular. If that is not the problem, then we are
definitely in the twilight zone here.
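For the record, flat shading works by giving every corner of a triangle the same face normal, so that the interpolated normal is constant across the face. A sketch of computing that face normal in plain JavaScript (the helper name is mine):

```javascript
// Face normal of a triangle via the cross product of two edges.
function faceNormal(ax, ay, az, bx, by, bz, cx, cy, cz) {
  const ux = bx - ax, uy = by - ay, uz = bz - az; // edge a->b
  const vx = cx - ax, vy = cy - ay, vz = cz - az; // edge a->c
  let nx = uy * vz - uz * vy;                     // cross product
  let ny = uz * vx - ux * vz;
  let nz = ux * vy - uy * vx;
  const len = Math.hypot(nx, ny, nz) || 1;        // normalize
  return [nx / len, ny / len, nz / len];
}

// Triangle lying in the z = 0 plane: normal is (0, 0, 1).
const n = faceNormal(0,0,0, 1,0,0, 0,1,0);
// Store this same normal at all three corners and the shader's
// interpolated normal is constant, so the face renders flat.
```

If instead each vertex carries an average of the normals of the faces that share it, interpolation smooths the lighting across edges, which is the "smooth shaded" look described above.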
"This war will be over one day".
You get extra credit for knowing what
movie that is from and who said it.
____________________________________
1. Where one specifies a line or
triangle by keeping a list of vertex numbers rather than repeating
the vertex itself over and over again.