Thursday, March 7, 2013

Toaster Oven or Computer? How Can We Tell The Difference?


[Note: I use the term "computer" here to refer to both the hardware and the software.  Did anyone think that a computer did not include software?]

As citizens of our modern world (1), we are expected to promiscuously hop from computer platform to computer platform at the slightest hint of trendiness or novelty. To fail to do so is an unmistakable sign of imminent and unstoppable senility and creates the genuine possibility that you will be thrown under the wheels of the train of progress by your helpful and loving friends and colleagues.

So we embrace our new platforms and devices and pretend to be excited by whatever bad implementation of an old idea is poked in our face as "the latest thing".

But some of us, for reasons that may not be our fault, are also called upon to do our "work" on these exciting new platforms (3), and that can cause a lot of problems, especially if the new device claiming to be a computer actually turns out to be more like a toaster oven.


Does that modern-looking toaster oven have an Ethernet interface?   How about wireless routing?


Now this is not in any way to put down the noble art of designing a good toaster oven, far from it. Toast can turn a formerly inedible piece of old bread into a tasty culinary element, no small feat.  Most of us would not consider having a toaster oven in our kitchen that did not have a satellite uplink and at least 1 GB of main memory.  But a toaster oven is still conceptually different from a computer in at least one important way.

I maintain that the key distinguishing concept separating the computer from the toaster oven is the need to get work done beyond the controlled burning of bread. It is this idea that a computer is used to "get work done" that is considered so revolutionary and so threatening to the computer manufacturers of today, who believe that a computer is first, last, and always a device to extract money from the consumer.

A computer is not just a vehicle for demonstrating a bankrupt user interface idea discredited 20 years ago at SIGCHI and implemented by morons: a computer is actually a tool intended to accomplish something that the biped mammal thinks is worthwhile... something as simple as writing a letter or as complicated as mapping the human genome (4).  Or that was the naive and idealistic belief held by many of the original users of computers back in the day, when we thought computers were going to help the world and not just torture it.

How can we easily tell the computer from the toaster oven in actual practice?  We have developed a procedure, which is outlined here.  First, find a comfortable location within easy view of a clock.  Cozy up to your computer candidate, note what time it is, and then try to perform the following simple tasks, taking note of how long it takes you to complete them.

A. How hard is it to find a command line interface? How hard is it to find a text editor that does not insist on changing your data in order to "fix" it?   Can you create a file without the computer screaming bloody murder and asking stupid questions about whether you want linefeeds in Vietnam? (2)

B. How hard is it to create a new program for the computer, even the simplest program, and run it on the computer?  Not their program (or "app" if you insist), but your program.  Almost any computer language will do, whether or not it is the "native" language of the computer; the kind of trivial program I have in mind is sketched just after this list.    Do you need to get permission from Jesus or the Pope before you run this program of yours?   Boy, that would be pretty fucking arrogant if computer companies were actually trying to control the software you could run on the computer you just bought from them in a sleazy bid to extract more money from you, don't you think?

C. How hard is it to find good (i.e., useful) technical documentation for the computer?  Documentation containing what a reasonably knowledgeable technical person would want to know when programming or operating that computer?  Does such documentation even exist?  Or is it carefully kept only for the elite in order to avoid giving actual users the information they would need to program their computer?  They might hurt themselves!

D. Does the computer support open standards and protocols, or did the manufacturer work with tremendous diligence and cynicism to make sure that any application written for this platform could never in a billion years be ported?

E. Does the computer allow you to easily get data onto the computer and off of it again? Why would anyone want to do that?
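For concreteness, here is roughly the kind of trivial program that test B has in mind.  This is only a sketch, and the choice of Java here is mine, not a requirement; as the test says, almost any language will do.

    // The canonical trivial test program, written here in Java.
    // On a real computer this is the entire exercise:
    // save it as Hello.java, compile it with "javac Hello.java",
    // and run it with "java Hello".
    public class Hello {
        public static void main(String[] args) {
            System.out.println("hello, world");
        }
    }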

There are cases where something may not meet all five of the criteria above and still be a computer, but generally it is a special-purpose computer that has a large support team around it, say the kind of computer we might use to blow up Iraq.

Consider the following four case studies: Red Hat Linux circa 1998, the DEC PDP 8E circa 1970, Mac OS X, and the Android Nexus 7.

Red Hat Linux circa 1998.    The subject found a shell within about 30 seconds and a text editor in about 5 seconds, wrote a program in about 1 minute, found a compiler in about 20 seconds, and compiled and ran the program about 30 seconds after that.  The subject had trouble finding documentation because he had inadvertently left it out of the default installation, and he had to learn about the stupidity of the Info system, for which GNU should be shot.  Definitely a computer.

DEC PDP 8E.   The subject discovered that the DEC PDP 8E, which his high school acquired around 1970, came with a built-in line editor and a built-in interpreter (for FOCAL), and he was running his own program within about 5 minutes.  One got data on and off with paper tape.  Definitely a computer.


A Real Computer


Mac OS X.  The subject had a project that required him to port a program from Linux to the Mac.   He found wonderful technical documentation instantly, a good text editor in seconds, and a compiler in a few minutes, and had the program running in about an hour. Definitely a computer.

Android Nexus 7.   The subject had to deal with the immensely patronizing bullshit surrounding programming for Android for something like 6 weeks before getting a simple "hello, world"-like program to run.  Said program was a page of insane Java calls, it needed to be embedded in a crazy hierarchy of useless directories, and it was painful to get onto the designated tablet and to figure out how to run.  There is no serious technical documentation. Anything involving a text editor, or getting data on and off, relies entirely on unsupported third-party software that you have to find and install yourself, without help or documentation from the manufacturer.
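For comparison, and purely as an illustration (this is a reconstruction, not the subject's actual program), the Android equivalent of that one-line program looks roughly like the class below.  Note that the class by itself runs nothing: it also has to be embedded in a project tree with an AndroidManifest.xml, res/ directories, and build configuration before the tablet will have anything to do with it.

    // Roughly the minimal "hello, world" Activity on the Android of that era.
    // This file alone is not enough; it must live inside a project
    // hierarchy with a manifest, resource directories, and build files.
    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.TextView;

    public class HelloActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);   // required boilerplate
            TextView tv = new TextView(this);     // build the view in code
            tv.setText("hello, world");
            setContentView(tv);                   // display it
        }
    }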

More toaster oven than computer, I think.

_______________________________

1. And "modern" is such an old-fashioned word, too.

2. In order to discourage users from using public and open standards, Microsoft Office would put you through a battery of questions before allowing you to save a .txt file, including questions about line-feed encoding in the Democratic Republic of Vietnam, or whatever that socialist paradise is called these days.

3. Sarcasm intended.

4. Or, conversely, as complicated as writing a letter and as simple as decoding the human genome.
