I am in a contradictory position regarding technology: I am at once a bit of a tech geek and deeply suspicious of technophilia. I was researching hypertext before the web existed, when the only way to distribute such products was the not-so-network-aware floppy disk. I was developing web applications before that term was coined. And I have been recording digital audio since the days of the Atari [1]. With some justification I label myself a geek.
The problem is that I prefer to live my life as an artist. In practice, this means that though I'd like to be up-to-date with the latest electronics, I can never afford them. Nor do I delight in gadgets for their own sake, once the initial honeymoon period is over.
As a result, I am far from the "bleeding edge", and have a good many anecdotes to illustrate the point. One that springs to mind: Back when I was developing vertical market applications I regularly outfitted my clients with brand-new Pentium-class personal computers. But I was still developing on a '386! (Some of you will remember...) Yes, I was patient and kept lots of coffee on hand for the compile cycles.
This explains why I have rarely been a fan of Apple Computer, as the company was once known. (The name seems quaint now, doesn't it?) This stance was not born of ignorance, since I was depending on an Apple Classic for typesetting in the early eighties. Back in those days the company had the lower end of the art and design market all to themselves, and for good reason [2]. The design process was painful, but at least one could get the job done with an Apple. The WYSIWYG screens rocked [3].
That was then... But for some years now the re-branded Apple have concentrated on making shiny toys for technophiles who want the honeymoon to last forever... or at least until the next iteration. Apparently, all devices must be as thin and silver as possible. It's a retrograde view of "the future", through aluminum-tinted 1930s glasses [4]. Apple's goal is to encourage conspicuous consumption; purchasers buy largely to ensure bragging rights.
That's not my thing, as you can tell from my tone. I won't get into the ethical objections here, though they largely drive my decisions. Instead I will focus on the practical. I want a tool I can shape to my desires, not the other way around [5].
Apple gadgets work passably well for the things they do, just so long as you don't want to do anything else. This is a cliché and I repeat it here to be generous. Actually, some of their software is absolutely horrible (iTunes springs to mind). And they have made a lot of interface errors over the years while consistently claiming interface perfection. Drag-to-trash-to-eject! No doubt it's the arrogance that sets my teeth on edge. (While many of my points apply equally to other brands, Apple remain the exemplar.)
OK, so by now the Apple fans have all stopped reading, and the Windows boys think I am going to say something nice about their favourite toy. Er, no. All desktop operating systems use the same paradigm, invented in large part by the late lamented Douglas Engelbart and his team at the Stanford Research Institute, and later elaborated at Xerox PARC. I was writing about him two decades ago, when no-one remembered his name. Even now, it's mostly HCI experts who sing his praises [6]. Apple had a lot to do with that, since they wrote him out of history after taking all his best ideas.
All desktop computers confine us to much the same paradigms, some of which work but most of which really don't, especially not for those users who want a simple tool and nothing more. The proof of this is simple: Try to teach a new computer user. Especially someone with a physical or mental impairment [7]. There is simply too much to learn. Our WIMP interfaces are a minefield of cognitive dissonance and dexterity challenges.
In any case, desktop computers are disappearing, and this makes good sense. Most people want a tool for a particular task: a phone, a camera, a web browser, a chat interface, a music player. While one can certainly bundle several of these tasks together, the result is still far from a general-purpose computer.
So what we need on the one hand are devices targeted to simple tasks, enabling people to get these done simply, with few prerequisites. And, on the other hand, general-purpose processing devices for deep and broad tasks. Of course a lot of common tasks get missed out in between, especially those that are not about communications or entertainment. Do you really want to do inventory control or accounting on your phone? And yet why should you learn the thousand idiosyncrasies of your operating system just to balance your books? This problem remains unsolved.
Anyone who has been awake the last five years will realise that a new alternative mobile device interface has sprung to life -- touch. I have remained quite sceptical of touch interfaces in general, especially when applied indiscriminately to desktop computers. (Ahem, Microsoft.) This attitude largely stems from my interest in music production, where touch interfaces have made inroads on hardware devices. Yet a knob or fader provides feedback that is infinitely more useful. And the ergonomics are less likely to lead to Repetitive Strain Injuries.
Nonetheless, it is past time for me to check out this domain. Add to this the random fact that my eight-year-old mobile phone is getting rather temperamental. So, time to finally buy a smart phone! Which is a good place to end this ramble. I'll be back soon with some practical advice.
1. To be precise, it was MIDI not digital audio. Computers were not yet capable of manipulating the large streams of digital audio in anything like "real time".
2. Of course, those with lots of money had purpose-built gear, against which personal computers were only making small incursions.
3. Though it must be said that I laid out my first book in WordPerfect 5.1, a task for which I deserve some sort of a prize.
4. Read William Gibson's amusing short story "The Gernsback Continuum".
5. Strictly impossible, of course, pace Marshall McLuhan. But I will leave the philosophical considerations for a different forum.
6. We used to abbreviate Computer-Human Interaction as "CHI" so it could be pronounced as the Greek letter. Now the acronym puts the "Human" first, but I prefer it the old way as a constant reminder of a common bias.
7. Yes, I worked in an assistive devices programme for a while, too.