Wednesday, November 5, 2014

Potential Collapse of Civilization Seen as a Result of Weak Type Checking


In a world filled with the threat of war, with midterm elections that once again demonstrate the self-destructive credulity of the American people, with the collapse of the American economy due to the greed and stupidity of the American elites, is now the proper time to talk about the looming crisis of weak type checking in our programming languages?

Yes, now is the time. The need has never been greater to stop this foolish slide into moral decay and socialized health care.

The promise of weakly typed or untyped languages such as Javascript is that you can quickly and flexibly create new, complex data structures without getting bogged down by being forced to go through your entire system and make everything work together in a pedantic and literal way. You can throw together arrays and lists and pass them as parameters and feel a certain pleasant lack of mental overhead in doing so.

This can be very productive but it can also generate a false sense of correctness when in fact one has introduced a subtle incompatibility which the system is blandly ignoring for a while, only to fail in a non-trivial way when you are least expecting it.

In fact, it is shocking how well a system like Javascript will accommodate undefined variables, missing functions, and incompatible types, and just keep moving along as if nothing were wrong.
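
A minimal sketch, mine rather than the post's, of just how far Javascript will go before complaining; the object and names are invented for illustration:

    var order = { price: 10 };
    var total = order.price + order.tax;   // order.tax is undefined
    console.log(total);                    // NaN, and no error is thrown

    console.log("5" * "4");                // 20: strings silently coerced

    function ship(o) { return o.adress; }  // note the typo in "adress"
    console.log(ship(order));              // undefined, still no complaint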

But having seduced the programmer into a false sense of security, the language waits for the program to reach a certain size, or to grow beyond a single programmer, and suddenly the author or authors have to start tracking down a bug that comes from one side not enforcing or maintaining a data structure in the way it was intended, or only partially implementing it, or implementing it and then changing it out of incomplete knowledge.

The larger the system, the more people who contribute to it, and the longer the software is in use and being evolved, the more likely this is to happen. And when it happens, one is left with no tools to find the problem other than reading the code carefully and providing one's own type checking.
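
Providing one's own type checking in practice means writing little assertion helpers by hand; a hedged sketch, with invented names, of what that drudgery looks like:

    function assertPoint(p) {
        if (typeof p !== "object" || p === null ||
            typeof p.x !== "number" || typeof p.y !== "number") {
            throw new TypeError("expected {x, y} numbers, got " +
                                JSON.stringify(p));
        }
    }

    function midpoint(a, b) {
        assertPoint(a);                    // fail loudly at the boundary,
        assertPoint(b);                    // not three modules downstream
        return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
    }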

How could this lead to the end of civilization? It can do so in two different ways. The first is that by permitting this mental weakness, this accommodation of those who would advocate weak type safety, we are letting the lazier among us enter positions of responsibility in society. This will inevitably lead to sloppy programming resulting in falling buildings, collapsed bridges, exploding nuclear power plants, and God only knows what else.

But second, this nation is under relentless attack by inscrutable oriental criminal elements that are sponsored by their evil, slave-owning government. Can you imagine their glee whenever they penetrate another freedom-loving web page or program in America that has been left defenseless by a weakly type-checked programming language?

We must stand firm against these efforts to leave America defenseless against these threats and rebuild American strength through strongly typed languages.

Thank you.

Friday, October 18, 2013

Bad User Interface Design: Celebrate the Nightmare from Hell that is Gimp


Many philosophers over the centuries have asked: "what makes bad user interface design?" Oh the arguments that have raged over that apparently simple question. Is it all inspiration, accident or genius that leads to bad user interface design? Are there principles we can deduce to help new and inexperienced designers write bad user interface code, maybe even dreadful user interface code?

I believe that the answer is yes, we can help people design and write bad user interface code, code that demeans the user, insults the user, makes their life worse, and makes their work impossible or nearly so. True inspiration may be beyond our capability to teach, true genius may break these principles we write down here, but for the great majority we can deduce principles that can act as guidelines for a truly bad user interface or "user experience" as we say.

We will take the case study approach as pioneered by Harvard Business School and from these case studies try to create principles to apply to new situations.

I recently sat down to learn GIMP, the GNU Image Manipulation Program, and was thrilled by its bad user interface ideas. From these I derived some principles and will then discuss how GIMP achieves these worthwhile goals.

Principle #1: Make a bad first impression.

If you can make a bad first impression, then you may even be able to make the user give up entirely. But how, specifically, can one make a bad first impression?

Principle #2: Increase frustration by focusing on what the beginning user has to do and making that more difficult.

In other words, concentrate your bad design in those areas that the beginner has to work through. It is less important to inconvenience advanced users, as they are far fewer in number and have more capability and skill to work through your stupidity.

Principle #3: Use a GUI design or glyph from a similar program that the beginner almost certainly knows, but give it a completely different meaning, and hide something important that the user needs to do under it.

Our third principle here is a particularly nasty one. It solves several problems at once: it confuses users and makes them less confident that the skills they bring to your program will be useful.

Let us examine how GIMP achieves these three principles in a truly elegant manner. What is the first thing a beginner with a "paint" program might wish to do? Well, I would argue that finding a paint brush and setting a color are pretty much right at the top of the list for a beginning user. GIMP makes it tolerably easy to find a paint brush, although there is some good confusion even there, but it makes choosing a color completely inscrutable. How does it do this? It does this by hiding the pick-a-color function under a glyph that means something else entirely in Photoshop.

Then ask yourself how many new users of GIMP will have been exposed to Photoshop. I would argue that at least 80% of GIMP users will have learned at least something of Photoshop, and the percentage may well be higher. Then what could be more devious and defeating than to hide "pick a color" under a graphic that, in Photoshop, has nothing to do with picking a color? And that is exactly what they do. In GIMP, pick-a-color is carefully hidden under the following icon:

[icon: the double-arrow "swap foreground/background colors" glyph]
which in Photoshop means, of course, to exchange the foreground color with the background color, and actually has nothing to do with choosing a color.

What genius! What mad genius! What a clever and nasty person whoever did this must be!

No one would ever think to look under switch-fg/bg-color, so they have to look up how to pick a color in GIMP on the internet (of course there is no online documentation for GIMP), and eventually they find it, but only after many minutes or even hours of frustration and hair-pulling.

So our first case study suggests: find something a beginner certainly has to do, and hide it in a place where, based on previous experience, he will explicitly not look.

In future posts we will examine other examples of genius bad user interface design.

Thursday, October 17, 2013

Global Wahrman Retreats from Dynamic Views


I think that I must have unrealistic expectations about how these things are documented.  It makes perfect sense to me that I would want to have my posts truncated to 4 lines, no, wait, that doesn't make any sense at all.  In fact, that would be nutty.  Oh well, I guess making billions of dollars as Blogspot and Google do is no guarantee of quality.  

We temporarily retreat from dynamic views, but we will return to the subject one day soon!


Friday, October 11, 2013

Is Anyone Home at Khronos / WebGL? Hello?


Is anyone home at WebGL / Khronos? I mean, does anyone work there? Or is it just a front for some sort of criminal organization or venture capital firm that is involved in corporate fraud or something similar?

Maybe, if someone does work there, they could update the documentation so it does not waste everyone's time?

About two years ago, a flaw was found in WebGL security, and a fix was implemented by Google and Mozilla that completely broke the usual way of loading texture maps. There is a workaround for those who want to use texture mapping, but it requires implementing something known as CORS on the server, and most sites have not. So bye bye texture mapping. Who needs it anyway? No problem.
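
For the record, the workaround looks roughly like the sketch below. The gl context and texture object are assumed to already exist, the URL is invented, and it only works if the server actually sends the Access-Control-Allow-Origin header:

    var img = new Image();
    img.crossOrigin = "anonymous";   // request a CORS-enabled fetch
    img.onload = function () {
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
                      gl.UNSIGNED_BYTE, img);
    };
    img.src = "http://example.com/textures/rock.png";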

However, isn't it odd that Khronos and WebGL don't say a word about this on their web site? It's been over two years since this was broken and they don't mention it once, nor is it mentioned in the few tutorials on their web site that happen to be about texture mapping. So the question is, what are those people thinking? A few words on the topic could certainly have saved me some time. Of course, WebGL makes it clear that your time doesn't matter to them.

There is more misinformation out there about WebGL than there is genuine, up-to-date, helpful information. Good old internet. Gotta love it. Bold new paradigm, you know.


See how Perlin noise adds a certain grunginess to our rocketship without using texture mapping per se. Hey, I have an idea! WebGL could have had noise in the specification instead of making everyone implement it on their own. Oh, that would have been too much trouble, I guess.
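
And what everyone implements on their own tends to be a few lines of value noise, a cheap stand-in for true Perlin noise. A hedged sketch of the usual hack, embedded as a Javascript shader-source string:

    var noiseGLSL =
        "float hash(vec2 p) {\n" +
        "    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);\n" +
        "}\n" +
        "float vnoise(vec2 p) {\n" +
        "    vec2 i = floor(p);\n" +
        "    vec2 f = fract(p);\n" +
        "    vec2 u = f * f * (3.0 - 2.0 * f);\n" +   // smoothstep fade
        "    return mix(mix(hash(i),                  hash(i + vec2(1.0, 0.0)), u.x),\n" +
        "               mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), u.x), u.y);\n" +
        "}\n";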

Thursday, October 10, 2013

Information about WebGL for the Graphics Professional


There are many aspects of WebGL 1.0 that are somewhat documented and somewhat undocumented. These lacunae are very annoying. Well-meaning web pages that repeat tutorials for absolute beginners over and over again clutter the search engine space and make finding solutions difficult.

This post does not have information for beginners or dilettantes.   These notes are for the serious graphics person who is trying to get work done and wants to be more productive. Some of what follows are hard facts or details, some are loose impressions or generalizations.  

-- WebGL is deeply intertwined with Javascript, the DOM and the browser, so you must know exactly what you are using because there are differences.

-- Chrome's javascript does not allow "const" in strict mode
-- Chrome's browser throws security violations when attempting to load a texture map
-- Firefox's javascript allows const and does not throw security violations (on linux)
-- Failure to test on your selected OS & browser delivery targets will bite you in the ass
-- You must be very aware of this as you desperately try to get information from the internet

-- Do not assume anything from the other OpenGLs will be there.

-- WebGL is not OpenGL ES 2.0
-- WebGL 1.0.2 is derived from OpenGL ES 2.0, but they are not identical
-- WebGL is closer to OpenGL ES than to any of the other OpenGLs, though
-- There are almost no carryovers of concepts, ideas and intent from the earlier OpenGLs
-- This is a VERY low-level graphics interface; you will reinvent the wheel over and over
-- The WebGL shader language is GLSL ES 1.0, the shader language of OpenGL ES 2.0

-- Documentation is broken

-- The Khronos site does have a WebGL 1.0.2 specification.
-- The specification is highly concise (at best) or just incomplete (at worst)
-- The shader language is not described beyond the statement that it is the same as GLSL ES 1.0
-- The GLSL ES 1.0 Shader Language documentation is excellent (see link)
-- There is a pretty good concise WebGL synopsis in PDF form (see link)
-- There is no active forum, the one on Khronos was killed years ago
-- There are a few helpful tutorials on the Khronos site, but not many, and some are wrong
-- The examples on Khronos use techniques that have not worked for years (loading textures)
-- There are relevant articles on the Mozilla developer site, but they are incomplete and the
comments are closed

-- WebGL is fragile and unhelpful

-- Things have to be done exactly their way or they will not work
-- Once something does not work, other errors cascade from it, some of them inexplicable
-- Relentlessly check for WebGL errors, do not assume things worked (see the sketch after this list)
-- Errors appear either as return codes from WebGL, or in browser error windows, or not at all
-- Most of the WebGL error messages just say "error"
-- Remember that checking for errors will slow you down, so they must be removed for
production code
-- Sometimes when a problem is unsolvable, I move on to other problems, then come back,
rewrite from scratch using the exact same techniques and everything works.
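
A minimal sketch of the kind of error-checking helper implied above; the helper name and vertexCount are mine:

    function checkGL(gl, where) {
        var err = gl.getError();
        if (err !== gl.NO_ERROR) {
            var names = {};
            names[gl.INVALID_ENUM]      = "INVALID_ENUM";
            names[gl.INVALID_VALUE]     = "INVALID_VALUE";
            names[gl.INVALID_OPERATION] = "INVALID_OPERATION";
            names[gl.OUT_OF_MEMORY]     = "OUT_OF_MEMORY";
            console.error("WebGL error at " + where + ": " +
                          (names[err] || err));
        }
    }

    gl.drawArrays(gl.TRIANGLES, 0, vertexCount);
    checkGL(gl, "drawArrays");   // strip these calls for production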

-- Specific Issues

-- drawArrays "count" is not well defined. It probably wants the number of vertices,
not the number of drawn elements (e.g. triangles) or number of floats, etc.
-- drawElements restricts the number of elements to what will fit in a short. This is not
enough to be useful for many if not most applications
-- The documentation for how to load a texture is wrong and will cause security violations
on modern browsers; it is not clear how much the CORS solution helps
-- texture maps must be exact powers of two in size (see the sketch after this list)
-- "const" is not cross-browser, do not use
-- functions are only defined in the shader portion being compiled. A function defined in
a vertex shader will not be available in a fragment shader
-- functions must be defined before use (prototypes may work)
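
For the power-of-two restriction, the usual dodge is to rescale through a 2D canvas before calling texImage2D. A hedged sketch with invented names (note the rescale blurs the image slightly):

    function nextPow2(n) {
        var p = 1;
        while (p < n) p *= 2;   // avoids floating point log() surprises
        return p;
    }

    function toPow2(img) {
        var c = document.createElement("canvas");
        c.width  = nextPow2(img.width);
        c.height = nextPow2(img.height);
        c.getContext("2d").drawImage(img, 0, 0, c.width, c.height);
        return c;   // a canvas is a legal source for texImage2D
    }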

-- Recommendations

-- Use the strict mode of javascript
-- Write or pick a matrix manipulation package for javascript. I use gl-matrix.js (see the sketch after this list)
-- Expect to write your own display list manager, object manager, texture manager, etc
-- Expect to spend a lot of time reverse engineering the documentation
-- Be very cautious about what you read on the internet. It is often out of date or applies to
something else that sounds similar but isn't
-- Of course the above warning applies to this post as well.  
-- Test on your target platforms as you develop.  Don't be surprised at the end.
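
The first two recommendations together, as a hedged sketch in the gl-matrix 2.x style of API; angle, mvMatrixLoc and the gl context are assumed to exist:

    "use strict";                               // per file or per function

    var mvMatrix = mat4.create();               // gl-matrix: 4x4 identity
    mat4.translate(mvMatrix, mvMatrix, [0, 0, -6]);
    mat4.rotateY(mvMatrix, mvMatrix, angle);
    gl.uniformMatrix4fv(mvMatrixLoc, false, mvMatrix);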

-- Links

-- Article on cross domain textures disabled

-- The WebGL 1.0.2 spec is what passes for documentation

-- The OpenGL ES 1.0 Shader Language Document

-- The WebGL 1.0 Quick Reference Card

Sunday, October 6, 2013

Unspeakable Evil, Esoteric Knowledge and WebGL


What madness possessed me that day I will never be able to explain. Having accomplished what needed to be done, why did I press on?  I do not know myself what unconscious self-destructive character flaw caused me to innocently go forward and attempt to learn the forbidden knowledge that lurks just below the surface of WebGL.

Those who watch, those who control, those who oppress our pathetic lives through means both grand and small,  yes even through those apparently innocent graphics API committees that design our APIs. 

Or put another way, I decided to move forward and add texture mapping and multiple objects to my little WebGL object viewer which is really just an excuse for learning WebGL.  

1. I attempted to use attribute arrays to download the UV coordinates for use by the shader. But even though I had used this feature successfully for the vertices, colors and normals, I could not make it work for UVs.
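
For reference, the plumbing in question looks roughly like this; program, uvBuffer, uvs and "aUV" are invented names, and one classic silent failure is noted in the comments:

    var uvLoc = gl.getAttribLocation(program, "aUV");
    // If the shader never actually uses aUV, the compiler removes it
    // and uvLoc comes back as -1 -- with no error anywhere.
    gl.bindBuffer(gl.ARRAY_BUFFER, uvBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(uvs), gl.STATIC_DRAW);
    gl.enableVertexAttribArray(uvLoc);
    gl.vertexAttribPointer(uvLoc, 2, gl.FLOAT, false, 0, 0);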

2. Texture mapping would not work. After a while I figured out that my texture map was not an exact power of two in size, and that seems to be fatal. First, I do not think that is clear in the documentation. Second, why is the only error code WebGL seems to know about "error"? Why can there not be more informative error messages?

3. I use a trick to get the UV coordinates to the shader, and that works, but I cannot use that trick in the final software.

Yeah! A cube with a texture map!

4. In the process of writing slightly more sophisticated shaders, I discovered that (a) functions are tricky, (b) it is not clear which GLSL (shader language) WebGL implements unless you read very carefully (the answer is that it implements GLSL ES 1.0 only), and (c) there is no obvious documentation. There is documentation, it turns out, but you have to go searching for the documentation for OpenGL ES 2.0's 1.0 shader language. Confused? You should be; there are many, many OpenGL shader languages defined out there, and they are all different yet similar.
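
What "functions are tricky" means in practice, as a hedged sketch: in GLSL ES 1.0 a function must be defined, or at least prototyped, before use, and it exists only in the shader stage being compiled:

    var fragSrc =
        "precision mediump float;\n" +
        "float bright(vec3 c);\n" +            // prototype; body comes later
        "void main() {\n" +
        "    vec3 col = vec3(0.2, 0.6, 0.9);\n" +
        "    gl_FragColor = vec4(vec3(bright(col)), 1.0);\n" +
        "}\n" +
        // Defined here, in this fragment shader only; a vertex shader
        // compiled separately knows nothing about bright().
        "float bright(vec3 c) { return dot(c, vec3(0.299, 0.587, 0.114)); }\n";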

5. In the process of doing this, using textures and so forth, Firefox on Windows gets a hard failure because I am not using vertex attribute 0.   Really, why does it care?
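
The usual appeasement, for what it is worth, is to pin the position attribute to location 0 before linking; "aPosition" and program are assumed names:

    gl.bindAttribLocation(program, 0, "aPosition");
    gl.linkProgram(program);   // must be called after the binding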

6. Once I start using texture maps, Google Chrome, which had been very compatible before, now fails hard with the following security error.

Security violation?  What security violation?  I am loading a texture map, ok?

Here are the conclusions I drew from this not very fun 72 hours: 

1. WebGL is fragile. You may think you have learned how to do something in WebGL, but it's possible that you did not and that you just got lucky. You may not be able to use that feature in the future, or you might, but you do not know.

2. WebGL is designed to a very low level. Even the API for the 2D canvas is at a much higher level (and much more usable) than the 3D API.   The only solution seems to be to write your own management system to do basic stuff that could or should have been done for you.  

3. The WebGL error system is nearly useless. It does not give enough information to diagnose problems and errors.

4. The WebGL documentation is problematic. There is no reference manual, there is no shader language documentation (it turns out there is, but you have to be a detective to figure out which one it is and where it is), and there is far too much reliance on crowd-sourced internet tutorials that are very low level and often incorrect or unhelpful.

5. It is not clear how much of a problem browser compatibility is. Initially there did not seem to be any problem, but as of yesterday I am dead in the water and cannot figure out what it is complaining about. Therefore be warned: browser compatibility is a real issue.

6. This might all just be a problem of software (im)maturity. Maybe it will all get better with time. Maybe someone will write a Javascript package to smooth over the loose ends and everyone will use that API instead of the WebGL one.

7. One does what one always does in this situation.  You write your own software to hide the ugliness of the underlying system and after a while the problems go away.   The irony here is that OpenGL was originally created to make it easier for people to start doing 3D.   It has evolved into something that seems to take pride in making it as difficult as possible for a beginner to do 3D.  Weird.

There are a lot of positive things to say about WebGL (including the ability to run a 3D application from a browser, which means that the end-user does not have to install a program). But if I were advising a client on whether or not to commit to doing an application that was important to the company using WebGL, I would tell them "Maybe, but proceed with caution". This is a very low-level API; it is not yet robust or well documented. You will need to run tests and carefully evaluate whether what you need to do can in fact be done in this environment, and at what cost. Look before you leap.



Thursday, September 26, 2013

WebGL and the Learning Curve Saga Part III


My WebGL saga continues. If we are talking tortoise vs hare here, then I am certainly the tortoise member of this fable.

Slowly but surely we beat the thing into submission. It's not really bad, just badly documented, IMHO, like so much of the GL family. With WebGL we are definitely in a world circa 1982 or so. But MUCH faster than we ever would have achieved in 1982, no matter how rich you were. So we have vast power on all our desktops, but for some reason "we" have "chosen" to program it in a rather low-level way. Well, hell, I like low level now and then. I like debugging stuff that could have been documented, but why bother, they can figure it out on their own.

So here are the kinds of things I have been working on instead of figuring out how to end war or torture the rich or writing great fiction.

1. The Element Array Debacle

So, for example, if I had been paying attention, I would have noticed that the "element array" feature of WebGL (1) was defined to take a "short", a 16-bit number, as its index type. Warning alarms should have gone off in my head, but it's been a long time since I programmed 16-bit machines. Because it's a short, the largest number of vertices one can address is 64K. In other words, it is useless for objects of modern complexity. Remember, there are no higher-order surfaces rendered directly by WebGL, so we get the appearance of complexity by having a lot of triangles, I mean a lot of triangles, and maybe playing with the shading. Maybe this limitation was noted in the documentation, but I don't think so.

The result was that I had to rewrite the core of the object viewer to not use element arrays, but just the vertices, the vertex attributes, etc. It took about 2-3 days and resulted in a much cleaner, if slightly more verbose, piece of code that could actually be maintained if it had to be.
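
The core of that rewrite is mechanical. A hedged sketch, with invented names, of expanding an indexed mesh into a flat vertex array so drawArrays can replace drawElements:

    // positions: flat [x,y,z, x,y,z, ...]; indices: one entry per vertex drawn
    function deindex(positions, indices) {
        var out = new Float32Array(indices.length * 3);
        for (var i = 0; i < indices.length; i++) {
            var v = indices[i] * 3;
            out[i * 3]     = positions[v];
            out[i * 3 + 1] = positions[v + 1];
            out[i * 3 + 2] = positions[v + 2];
        }
        return out;   // vertices repeated as needed, no 16-bit limit
    }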

2. The How Many of WHAT Exactly? Problem

The documentation says that you need to specify the number of things you are transferring. Well now, that could mean the number of triangles, or it could mean the number of vertices it takes to specify the triangles, or it could mean the number of bytes that it takes to hold the data, or ...

And the answer is: it's the number of vertices you use to define the object. So the count you send is (3 * number of triangles) or (2 * number of lines). Maybe it was obvious to you, but it sure was not obvious to me from the documentation.
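
Or, in code, under the reading above (the counts are assumed variables):

    gl.drawArrays(gl.TRIANGLES, 0, numTriangles * 3);  // 3 vertices per triangle
    gl.drawArrays(gl.LINES, 0, numLines * 2);          // 2 vertices per line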

3. The Ongoing Normal Mystery

Look at the following picture. See anything odd? Well, it's made out of flat triangles, and you should see flat surfaces. Ok, so it's interpolating the normals; what's so odd about that? Just that all the normals for a face (all its triangles, really) point in the same direction. Unless WebGL is rebuilding the topology of the object by doing a giant vertex sort, there is no way it could be interpolating the normals.
[image: render of the object, showing smooth-looking shading despite the flat per-face normals]
So what is going on? No one knows, but I suspect that it is a bug in my shader that somehow does not compute the diffuse component correctly. The specular would normally and correctly be "smooth shaded", i.e. not show flat surfaces for the most part. So maybe this is just flat shading, with transparency, and a specular. If that is not the problem, then we are definitely in the twilight zone here.

"This war will be over one day".

You get extra credit for knowing what movie that is from and who said it.

____________________________________

1. Where one specifies a line or triangle by keeping a list of vertex numbers rather than repeating the vertex itself over and over again.