Are you a digital native, a digital immigrant or an analogue?

k-punk

Spectres of Mark
All k-punks out there should check out the IT survey in the Economist, which confirms what you don't have to be a rabid cyberpunk to know: most mass market IT is over-complex at the front end. Or too Prog as we k-punks have it.

It seems that a staggering 66% of IT projects either fail outright or take much longer to install than they should because of the arcane complexity of the interfaces. The survey starts with the despairing cry of one John Maeda: 'The computer knows me as its enemy. Everything I touch doesn't work.' OK, many of us have felt like that, but Maeda is a professor in computer design.

How has it come to this?

Well, the tendency of Prog Tech, as k-punks have long complained, is towards 'featuritis.' Most consumers only use 10% of the features in MS Word, for instance, and the obvious and depressingly predictable result of the MetaStatic proliferation of superfluous Prog clutter is that users find it harder to locate those features they DO use.

The survey shows that this is in fact typical of many popular technologies, which start off very feature-laden and demanding at the front end. Clocks and sewing machines used to come with telephone directory-size instruction manuals. Cars used to require their driver to be conversant with the inner workings of the machine (which is the reason that chauffeurs were so popular; running a car required you to be a mechanic, so better to have one on hand).

This can't be overlooked from the producers' POV, because it is estimated that 70% of the world's population are 'analogues', i.e. those who live in terrified flight from technology (I think I work with most of them, and most of them are in IT support). 15% are 'digital immigrants' - typically thirtysomethings like myself who didn't grow up with the technology, but are now reasonably at home with it. That leaves the remaining 15% - the 'digital natives', teenagers and young adults who have been born into IT and take it for granted. As we all know, it is technology, not music, that produces the real generation gap today.

But if IT is to improve, i.e. become less complex at the front end, it will have to become more invisible. Complexity will go back a step, retreat behind the screen. While this is no doubt positive for consumers, that very fact means that the computer is increasingly becoming only a consumer tool. Kittler has warned about the way in which PCs increasingly lock users out of their inner workings. Doesn't this mean more MyStagoguery, more dependence upon a cadre of initiate-priests?
 

xero

was minusone
I read an interview with Nicholas Negroponte where he complained about this problem of featuritis; he ended up citing those Blackberry phones as one positive example of feature-lean coding. I think it stems from years of accrued laziness on the part of programmers, or perhaps just deadline pressures. My brother once worked in IT for American Express and told me that the software they use for card transactions etc. is basically a palimpsest of code which has been added over the years to the original software they wrote for their first mainframes back in the early seventies - it's just been modified & tweaked whenever it had to be to keep it going.

As for me I must be an immigrant, unless you count my pisspoor attempts to programme the Spectrum when I was a young'un
 

Woebot

Well-known member
without wanting to fall into the default apple mac evangelist position, isn't this something that apple have actually excelled at? that's to say they haven't created apps weighed down by gizmos, but have created elegantly simple ones (certainly under os x), underpinned by the strength of unix, with just the right amount of depth. iTunes and QuickTime (which needs a bit of a spring-clean, but is the jewel in apple's crown) are lovely things.

this could turn into a diatribe against bill and the darkside!
 

mms

sometimes
you're quite right, but it strikes me that the more things go wrong with my computer, especially with security and all the crappy little faults with programmes everyone gets shafted by, the more you begin to learn. but it's all completely illogical: there are lots of great things in xp, like system restore etc, but again you wouldn't know that without a manual, as it's part of system tools in accessories.
 

k-punk

Spectres of Mark
Matt, I think yr right about Mac obv (and as a neophyte PC user I am shocked by the fact that MS Plague really is as bad as --- no, scratch that, even WORSE than --- I had caricatured it as an inveterate Mac user). PC users, it simply is not normal or acceptable to have programs freezing for no reason whatsoever and having to continually re-boot yr machine. :)

BUT

from the Kittlerian POV Mac is the enemy, in that in producing the Graphic User Interface, it effectively locked out users from the inner workings of their computers. Previously, users were also programmers. Of course, that is also the great benefit for consumers; they are no longer required to program etc. But their relationship with their machine is now no longer properly cybernetic; they are dealing with it as a ROM 'judgement of God' whose basic nature is already decided. In other words, the computer becomes simply a consumer tool, like a toaster. Now MicroShit are bad toasters, and Macs are good toasters, but they are both toasters.

What's particularly appalling about MS Plague machines is that, like the Blairite political regime they resemble, they are the worst of all worlds: they have a GUI front end which locks out users from their MyStagogic Inner Core, whilst at the same time demanding arcane knowledge on the part of users. In other words, they are like toasters that require users to understand an incoherent geek hermeticode every time they go wrong, which is very often.

Open Source, which is more demanding on consumers, is the cyberpunk way. That is rigorously anti-authoritarian in that it restores a fully cybernetic circuit between machine and user (so, at the limit, there is no distinction between the two). Mac OS is a consumer product; Linux is a cyberpunk machine.

Sphaleotas has promised to write more on Open Source here, since he's an expert....
 
k-punk said:
Open Source, which is more demanding on consumers, is the cyberpunk way. That is rigorously anti-authoritarian in that it restores a fully cybernetic circuit between machine and user (so, at the limit, there is no distinction between the two). Mac OS is a consumer product; Linux is a cyberpunk machine.

let's get real here. there is nobody who understands all of a computer, from its highest-level abstractions down to the nitty-gritty of microcode. at some point we're all ignorant and will have to yield to somebody else's authority. linux hype notwithstanding, the hardware has to be taken as given, as unchangeable to the individual user, if only for economic reasons, because designing and producing hardware cannot be done in mum's basement. so the choice is really between different levels and areas of ignorance.

but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.
 

xero

was minusone
echo-friendly said:
but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.

can you elaborate? I thought a turing machine was a hypothetical computer successfully passing itself off as a human being
 

Mika

Active member
While Kittler's 'There is no software' position is valid, I think his arguments are most effective as a provocation - a bit extreme to imagine, for instance, that the future for cultural studies lies in engineering and code (i.e. that critics should know arithmetic, the integral function, the sine function and at least two software languages, etc.). Ironically, a bit of this feels like an old Marxist/Enzensberger-type socialist strategy for the 'emancipatory use of media' - maybe a bit naive to imagine a decoding digital massive? I think that his point is simply to emphasize the materialities of communication, for surely there are other modes of becoming cybernetic than writing code or soldering circuits on a motherboard?

As the default option for 'creatives', however - from graphic design and DV editing to DSP - Apple raises an interesting paradox. For according to Lev Manovich, the uptake of new media has meant that at some point during the late twentieth century, technology effectively overtook art:

"that is, not only have new media technologies - computer programming, graphical human-computer interface, hypertext, computer multimedia, networking (both wired-based and wireless) - actualized the ideas behind projects by artists, they have extended them much further then the artists originally imagined. As a result these technologies themselves have become the greatest works of art today. The greatest hypertext is the web itself, because it is more complex, unpredictable and dynamic than any novel that could have been written by a single human writer, even James Joyce. The greatest interactive work is the interactive human-computer interface itself: the fact that the user can easily change everything that appears on her screen, in the process changing the internal state of a computer or even commanding reality outside it."

From this perspective, people like J.C.R. Licklider, Douglas Engelbart, Ivan Sutherland, Ted Nelson, Seymour Papert, Tim Berners-Lee become the most important artists of our time, the true visionaries and innovators, rather than the slew of tech-based artists that animate and inhabit these environments. The GUI might not be imagined as just a consumer item then, but also a significant (romantic) piece of art in and of itself (as opposed to a strictly repressive disciplinary apparatus).
 
minusone said:
can you elaborate? I thought a turing machine was a hypothetical computer successfully passing itself off as a human being

no, you mean the turing test.

a turing machine is a mathematical construct, one of the many formalisms that are what is called turing-universal. a formalism is turing-universal if anything that could conceivably be a machine can be simulated by that formalism.
 

be.jazz

Guest
k-punk said:
from the Kittlerian POV Mac is the enemy, in that in producing the Graphic User Interface, it effectively locked out users from the inner workings of their computers.
So, are you saying that cars would be better if we still needed to be mechanics to be able to use them? Should programmers still be using Assembler or even directly typing in 1s and 0s?
 

k-punk

Spectres of Mark
The question of realism is neither here nor there.... just because it isn't likely to happen doesn't mean that it wouldn't be good if it did... it's about destratification, i.e. punk...

There's no question that the machinic potential of computers is locked down by the graphic user interface; yes, this makes certain activities possible, but it also massively limits what a computer is. There are of course good practical reasons why one's interactions with computers should be limited. But when they are so limited, they are then the equivalent of toasters, i.e. OBJECTS which (only) have a use value, not machines whose functions and purposes are in every sense open. I'm as guilty of this as the next neurobot of course, in that I want a quick-fix anthropomorphically configured object to perform recognizable human tasks.

As for the Marxist/Enzensberger position: my argument about computers as toasters is actually stolen from Baudrillard in For a Critique of the Political Economy of the Sign, which is specifically directed against what he takes to be Enzensberger's naivete. For Baudrillard, Enzensberger has failed to assimilate the basic lesson of McLuhan: that the medium is the message. There is no positive or emancipatory use of media, because the media themselves are the problem. A television really is no different from a toaster: it is something that people use, which therefore has a fixed purpose and --- more importantly --- whose use entails certain passive behaviours.

But computers need not be media, need not be consumer objects that are just used. Precisely because a computer is nothing in itself - it has no essence in the philosophical sense, it is just a simulation machine - it can potentially be anything.

Responding to a ROM menu is not interacting with a computer. Genuine interaction only comes when the machine and its user (and in a strict cybernetic sense, the user is a component of the machine, not extrinsic to it) engage in a mutual becoming (i.e. where the behaviours and structures of both the computer and its user start to affect and influence one another). This can only happen at the level of code, not at the anthropomorphic level of image-metaphors (GUI), or the - in the worst sense fantasmatic - speculation about computers you can talk to.

It depends whether computers are deployed as tools of Human OS or as escape routes from it.

As for cars: it would be better if they didn't exist at all, that's obvious .... :D
 

Woebot

Well-known member
echo-friendly said:
but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.
hey there. im genuinely curious as to what you mean by the turing machine here. i have read a bit about alan turing but im stuffed if i can remember precisely what the machine did. also what kind of tasks would you get a turing machine running in these conditions to perform?
 
WOEBOT said:
hey there. im genuinely curious as to what you mean by the turing machine here. i have read a bit about alan turing but im stuffed if i can remember precisely what the machine did. also what kind of tasks would you get a turing machine running in these conditions to perform?

a TM is a mathematical model of a computer. in fact you can think TM = computer. everything a computer can do a TM can do and vice versa. http://en.wikipedia.org/wiki/Turing_machine you can also think of TM = programming language, TM = Java, TM = C. so the tasks performed "under these conditions" would be anything programmable at all.
 

Mika

Active member
Still seems to be an element of remorse in your argument, a suggestion that consumers should somehow become the producers of technology - which is a bit different from Baudrillard's orientation on simulation (isn't his response from the 'Requiem for the Media' essay?), especially the humanist investment in the Gutenberg Galaxy of 'the real' etc. For him, there is no potential for intervention in the process of digitalization, since technological media destroy the aura of an event - the failed revolution of '68 never survived being broadcast without being completely negated and eviscerated of meaning. In this sense, a toaster is radically different from a television - one being directly implicated in the production of simulacra, the other being simply an effect.

This is not the case with other commentators on 'new media' - there are some more nuanced conceptualizations of the computer than as a simulation machine.
 

DigitalDjigit

Honky Tonk Woman
If I remember my class correctly then C/Java/etc. are not quite Turing machines. Programming languages are context-free grammars (believe it or not, Chomsky is relevant to computer science because of his work with grammars). In a context-free grammar the context of any symbol is irrelevant to its meaning. Turing machines are equivalent to context-sensitive grammars. Context-sensitive grammars are more "powerful" than context-free grammars.

Anyway, the original Turing machine is a theoretical construct. Imagine a tape of infinite length with a reading/recording head. The head can move right or left and read or write a symbol. The head is attached to a finite-state machine. Just think of it as a machine that keeps track of what state it is in. There is also a table of rules that says what state the machine will go into on receiving a given input. There is a starting state and a final state and any number of intermediate states. If you give an input to a machine and it results in it going to the final state then you say the machine accepted the input, otherwise it rejects the input. The set of inputs the machine accepts is the language that the machine accepts. The set of rules that generates the language is called the grammar.

Anyway, the claim is that given a machine like that you can compute anything that is computable. Moreover it is possible to build a universal Turing machine that will read a description of any machine from the tape and run as if it was that machine.
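
To make the construction concrete, here is a minimal sketch - in Python rather than Java, with a made-up even-parity machine as the example - of exactly the pieces described above: a tape, a head, a finite set of states and a table of rules.

```python
# A minimal Turing machine simulator: an "infinite" tape (a sparse dict), a head,
# a finite control, and a transition table of the form
#   (state, symbol) -> (new_state, symbol_to_write, head_move)
BLANK = "_"

def run_tm(rules, start, accept, reject, tape_input, max_steps=10_000):
    tape = {i: s for i, s in enumerate(tape_input)}  # only written cells are stored
    head, state = 0, start
    for _ in range(max_steps):
        if state == accept:
            return True   # machine halted in the final (accepting) state
        if state == reject:
            return False  # machine halted in the rejecting state
        symbol = tape.get(head, BLANK)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("machine did not halt within max_steps")

# Example machine: accepts binary strings containing an even number of 1s.
rules = {
    ("even", "0"):   ("even", "0", "R"),
    ("even", "1"):   ("odd",  "1", "R"),
    ("even", BLANK): ("ACCEPT", BLANK, "R"),
    ("odd", "0"):    ("odd",  "0", "R"),
    ("odd", "1"):    ("even", "1", "R"),
    ("odd", BLANK):  ("REJECT", BLANK, "R"),
}

print(run_tm(rules, "even", "ACCEPT", "REJECT", "1011"))  # False (three 1s)
print(run_tm(rules, "even", "ACCEPT", "REJECT", "1001"))  # True  (two 1s)
```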

This is one of the premier intellectual achievements of the last century and as such I think it is important that people are at least familiar with it.
 

DigitalDjigit

Honky Tonk Woman
OK, now that that is out of the way I can go back and try to remember what my original response was meant to be.

I do not think that a GUI necessarily locks people out of anything. To continue the car analogy: despite most drivers not being mechanics nowadays, many of them are able to drive from New York to Los Angeles (or even to the mall). Pretty much all development these days happens in Integrated Development Environments, which are themselves very complex GUIs. It's a bootstrapping process. Just like these days people don't use digging sticks to dig holes: they use shovels manufactured on complex machinery that was built on less complex machinery that was built with hand tools that eventually, down the line, were built with the help of sticks.

As for Linux, while I appreciate it, it really is a huuuuuge pain in the ass. I take it you have never used it. It is fine for a short while until you have to do that thing that you haven't done yet, such as printing. Or for example you can browse the web just fine until you encounter Flash and you need to get the plug-in. Prepare for some serious headache. Yes, maybe after a few hours you will be able to download the right source code, compile it (after making sure you have the dependencies for any of which you may have to go through this whole process before you can go on), install it in the right place, modify the configuration files and get it running. But really, all you wanted to do was watch that short movie.

Most needs are taken care of by software that already exists out there. Complete mastery of your machine is a prerequisite only for pursuits so esoteric that I am having a hard time even thinking of them.
 
DigitalDjigit said:
If I remember my class correctly then C/Java/etc. are not quite Turing machines. Programming languages are context-free grammars (believe it or not, Chomsky is relevant to computer science because of his work with grammars). In a context-free grammar the context of any symbol is irrelevant to its meaning. Turing machines are equivalent to context-sensitive grammars. Context-sensitive grammars are more "powerful" than context-free grammars.

I'm afraid you've got things slightly mixed up. Syntactically correct Java (et al) programs may well be describable by context-free grammars, but so is the input to any Turing machine (as it is a single finite string over a finite alphabet). What is relevant here is that the set of languages TMs decide is exactly the same as the set of languages all Java programs decide. Basically you need to distinguish between (1) the language that contains exactly the syntactically valid Java programs (this may be recognisable by a context-free grammar, though I don't think so, because after parsing there's type checking) and (2) the languages decided by the syntactically correct Java programs.

Another way of looking at it is that it's pretty trivial to implement TMs in Java and Java interpreters as a TM. So both formalisms must be equivalent.
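
A small illustration of that distinction (sketched in Python rather than Java for brevity; the function name is made up): the language { a^n b^n c^n } is the textbook example of a language no context-free grammar can generate, yet an ordinary program decides it in a couple of lines - the languages that programs decide are not bounded by the kind of grammar that describes the programs' own syntax.

```python
# Decide the language { a^n b^n c^n : n >= 0 }, which is not context-free,
# with a plain program - what programs/TMs can *decide* outstrips what
# context-free grammars can *generate*.
def in_anbncn(s: str) -> bool:
    n = len(s) // 3
    return len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

print(in_anbncn("aabbcc"))   # True
print(in_anbncn("aabbbcc"))  # False
```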
 

Woebot

Well-known member
I do like the principle that the GUI is inhibitive, and it must be admirable that artists get into coding. Actually the feeling that one ought to get into coding has (for me at least) assumed the voice once entirely adopted by my protestant work ethic. "Must study UNIX. Must abandon frivolous pursuits."

I'm glad the Mac's (and "graphics'") slightly awkward relationship to the kind of hardline position laid out by Mark K has been brought up. The thing is, Computer "Art" - if there is such a thing now that almost all art has a digital component in its construction - was essentially incredibly poor. The herculean efforts involved in coding one's own art seem to preclude making it any good. The end product became almost an afterthought, the pieces' meanings ponderously "overcoded" theoretically. Actually this period of computer art has JUST the purity of approach that Mark (quite rightly) celebrates.

<img alt="abel.jpg" src="http://www.woebot.com/images/dissensus/abel.jpg" width="480" height="340" border="0" />

This by Robert Abel from 1984. Just atrociously cringe-making. Actually wrestling with visual/technical issues like shadows and textures, the kind of thing that nowadays looks like a few primitives hit with preset textures.

<img alt="todd.jpg" src="http://www.woebot.com/images/dissensus/todd.jpg" width="466" height="470" border="0" />

This by William Latham from 1989. Latham had/has an extremely developed theory called "evolutionary art"; the pieces are quite beautiful in a kind of Giger-ish way, but ULTIMATELY cripplingly naff.

<img alt="max.jpg" src="http://www.woebot.com/images/dissensus/max.jpg" width="911" height="301" border="0" />

This last one is one of the only exceptions to a "thoroughbred coding" approach to video art which is really adorable (oh, this AND David Em's things): "Carla's Island" by Nelson Max, from 1981.

When you delve deep into motion graphics there are junctures when you enter into the software's "shell". Most of the time (certainly in 2d graphics) there's no cause to start tinkering with code. But the 3d package Maya has its own scripting language called MEL, which is extremely useful when you are unable to get the software to perform how you want it to (or want to automate stuff). Indeed the deeper you go into motion graphics, for instance in formidable software like Houdini and Shake which adopt user-defined procedures as the algorithms defining motion, you'll find that coding is central to the program's operation.

Most of the time you can get away without this depth of knowledge, and i suspect that it's only in the larger software houses where people are thoroughly distributed through the pipeline (ie one person in charge of modelling the feathers on the bird, one person animating it, one person coding how it interacts with other birds in the scene) that it gets implemented. Though there is some kind of filtering down happening at the moment, with apps like After Effects using expressions.

Part of me likes to be a little more trusting with regards to the intentions of the people making the software that one uses. Isn't that what's at stake here to some degree - that one doesn't TRUST the engineers? Matt Silverman, who was behind one of my all-time fave pieces of software called Commotion (really a dreamy, exquisite piece of software), is sound as a pound, just another human being.
 

be.jazz

Guest
k-punk said:
Responding to a ROM menu is not interacting with a computer. Genuine interaction only comes when the machine and its user (and in a strict cybernetic sense, the user is a component of the machine, not extrinsic to it) engage in a mutual becoming (i.e. where the behaviours and structures of both the computer and its user start to affect and influence one another). This can only happen at the level of code, not at the anthropomorphic level of image-metaphors (GUI), or the - in the worst sense fantasmatic - speculation about computers you can talk to.
But, even at the level of code, programmers use their equivalent of GUIs: APIs (Application Programming Interfaces), which voluntarily and helpfully limit and direct what you can do. Also, the aim of advanced languages is to free the programmer from having to talk directly to the computer, worry about where things are in memory and how much of it they take up, how different hardware elements are communicating and what they are doing, etc. So, if the interface is present at lower levels, why is the GUI your chosen bottleneck? Even if all users become Java or C# programmers, they'll hit another "barrier."
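
To make the 'interfaces all the way down' point concrete, a small sketch (in Python; example.com is just a stand-in host): the same page fetched once through a high-level library call and once over a raw socket. The lower route doesn't remove the interface, it just swaps the library for the less forgiving OS socket API.

```python
# Two ways to fetch the same web page: every "level of code" is still an interface.
import socket
import urllib.request

# High level: the library hides sockets, buffering and the HTTP protocol entirely.
with urllib.request.urlopen("http://example.com/") as resp:
    page = resp.read()

# Lower level: speak HTTP over a raw TCP socket "by hand" - still an API
# (the operating system's socket interface), just a less forgiving one.
sock = socket.create_connection(("example.com", 80))
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
chunks = []
while True:
    data = sock.recv(4096)
    if not data:         # server closed the connection: response is complete
        break
    chunks.append(data)
sock.close()
raw_response = b"".join(chunks)
```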
 
a programmer's perspective

"Windows and Mac OS are products, contrived by engineers in the service of specific companies. Unix, on the other hand, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture"

"Though Linux works for me and many other users, its sheer power and generality is its Achilles' heel. If you know what you are doing, you can buy a cheap PC from any store, throw away the Windows disks that come with it, turn it into a Linux system of mind-boggling complexity and power. You can configure it so that a hundred different people can be logged onto it at once over the Internet, via as many modem lines, Ethernet cards, TCP/IP sockets, and packet radio links. You can hang half a dozen different monitors off it and play Doom with someone in Australia while tracking communications satellites in orbit and controlling your house's lights and thermostats and streaming live video from your web-cam whilst surfing the net and designing a circuit board on another screen... it's so vastly technically superior to other OS'es that sometimes its just too formidable for routine day-to-day use. Sometimes, in other words, i just want to go to Disneyland."
- Neal Stephenson, 'In the beginning... was the command line'

That's the crux of it. Our brain's greatest playground can at once be a simple consumer product with purpose-built packages designed to ensure quick results AND the most radical man-made object in the history of humanity (the Intel chip as a modern wonder of the world). Its multiplicity and ubiquity feed into its potential, rather than detract from it. Without the spreadsheet, Apple may have flopped in its early stages. Without that piece of corporate software i may never have encountered the Unix shell and got back into computing through my use of OS X.

Technology moves at a pace that renders impossible its demystification. To paraphrase DeLillo:

every advance in technology is matched by a similar rise in the superstition of a population

This is unavoidable: it is sheer utopianism to strive for a technologically enlightened society. The fact is GUI design has to move to meet practical concerns, the interface to provide a seamless coat: apple and nokia have done a fantastic job under enormous economic pressures - product-driven companies in a world that spares no extra time for design excellence. They didn't have to do it. But rather than sit back and ruminate on how the population might be better off as a mass group of enlightened hackers, they (such companies are built by individuals, last time i checked) chose to jump in and help democratise an object in the manner they best saw fit.

Some level of digital literacy in the populace renders the WIMP vs CLI debate null: both are acquired skills (you weren't born knowing what a mouse was, or what an icon, a folder, a directory and all these superfluous metaphors meant) and in perspective both have their place. An individual can choose on what level to interact with their own particular digital device. For example, some people in my department are running Linux on a playstation. That's fun and all, and it would be nice to show people how malleable and hackable any device potentially is, but people don't need to know that. People have aims and ends that simply incorporate digital technology in a highly functional manner. There seems to me nothing inferior in this approach to computational devices.

However, the next generation (you know, the kids) needs to be given some perspective - educated on what, broadly speaking, a computer is, because every aspect of their life will involve computing. It'll become so ubiquitous and distributed through our environment that if interface design (I mean both GUI and device interface: protocols with other devices) doesn't improve to simplify all those processes, everything is going to take longer than it should (see the linux flash plug-in nightmare - i've heard that from about 6 people!). Imagine sixteen wireless digital devices strewn throughout your home. Do you really want a CLI for all that? I have at least 8, and am thankful for great UI design on most of them.

99% of modern computing as we know it is an abstraction. If it weren't so, the computer wouldn't have inspired such obsessive genius as it attracted in the last fifty years. Stripping back the layers is half the fun, but so is building newer, more sophisticated layers, such as UNIX and C - there's a beauty to that which can exist entirely outside the physical world, in the abstracted realm of high-level programming. Looking for art in computing always leads me to high-level software. Like programming AI in Prolog. GUI design is simply design excellence - functional, yes; sublime, no. Using the applications coded in C to produce art seems totally valid to me, if only i could think of any examples. The applications themselves need to strike a balance between demanding too much of the user and limiting usability. I think the sound software MAX/MSP is a good example of that compromise struck well. So there's one example. Creativity, as always, can be found in the system of signs itself (language) or in the manipulation of said signs. A true visionary of modern computing? Try Dennis Ritchie's work at Bell Labs. Quietly coding away for forty years. Pivotal in both C and UNIX.
 