
View Full Version : Are you a digital native, a digital immigrant or an analogue?



k-punk
31-10-2004, 06:59 PM
All k-punks out there should check out the IT survey in the Economist, which confirms what you don't have to be a rabid cyberpunk to know: most mass market IT is over-complex at the front end. Or too Prog as we k-punks have it.

It seems that a staggering 66% of IT projects either fail outright or take much longer to install than they should because of the arcane complexity of the interfaces. The survey starts with the despairing cry of one John Maeda: 'The computer knows me as its enemy. Everything I touch doesn't work.' OK, many of us have felt like that, but Maeda is a professor in computer design.

How has it come to this?

Well, the tendency of Prog Tech, as k-punks have long complained, is towards 'featuritis.' Most consumers use only 10% of the features in MS Word, for instance, and the obvious and depressingly predictable result of the MetaStatic proliferation of superfluous Prog clutter is that users find it harder to locate those features they DO use.

The survey shows that this is in fact typical of many popular technologies, which start off very feature-laden and demanding at the front end. Clocks and sewing machines used to come with telephone directory-size instruction manuals. Cars used to require their driver to be conversant with the inner workings of the machine (which is the reason that chauffeurs were so popular; running a car required you to be a mechanic, so better to have one on hand).

This can't be overlooked from the producers' POV, because it is estimated that 70% of the world's population are 'analogues', i.e. those who live in terrified flight from technology (I think I work with most of them, and most of them are in IT support). 15% are 'digital immigrants' - typically thirtysomethings like myself who didn't grow up with the technology but are now reasonably at home with it. That leaves the remaining 15% of teenagers and young adults who have been born into IT and take it for granted. As we all know, it is technology, not music, that produces the real generation gap today.

But if IT is to improve, i.e. become less complex, it will become more invisible. Complexity will go back a step, retreat behind the screen. While this is no doubt positive for consumers, that very fact means that the computer is increasingly becoming only a consumer tool. Kittler has warned about the way in which PCs increasingly lock users out of their inner workings. Doesn't this mean more MyStagoguery, more dependence upon a cadre of initiate-priests?

xero
01-11-2004, 07:45 AM
I read an interview with Nicholas Negroponte where he complained about this problem of featuritis; he ended up citing those Blackberry phones as one positive example of feature-lean coding. I think it stems from years of accrued laziness on the part of programmers, or perhaps just deadline pressures. My brother once worked in IT for American Express and told me that the software they use for card transactions etc. is basically a palimpsest of code which has been added over the years to the original software they wrote for their first mainframes back in the early seventies - it's just been modified & tweaked whenever it had to be to keep it going.

As for me I must be an immigrant, unless you count my pisspoor attempts to programme the Spectrum when I was a young'un

Woebot
01-11-2004, 09:33 AM
without wanting to fall into the default apple mac evangelist position, isn't this something that apple have actually excelled at? that's to say they haven't created apps weighed down by gizmos, but have created elegantly simple ones (certainly under os x) underpinned by the strength of unix, with just the right amount of depth. iTunes and QuickTime (which needs a bit of a spring-clean, but is the jewel in apple's crown) are lovely things.

this could turn into a diatribe against bill and the darkside!

mms
01-11-2004, 09:46 AM
you're quite right, but it strikes me that the more things go wrong with my computer, especially with security, and all the crappy little faults with programmes everyone gets shafted by, the more you begin to learn - but it's all completely illogical. there are lots of great things in xp like system restore etc, but again you wouldn't know that without a manual, as it's tucked away under system tools in accessories.

k-punk
01-11-2004, 10:34 AM
Matt, I think yr right about Mac obv (and as a neophyte PC user I am shocked by the fact that MS Plague really is as bad as --- no, scratch that, it's even WORSE than --- I had caricatured it as an inveterate Mac user). PC users, it simply is not normal or acceptable to have programs freezing for no reason whatsoever and having to continually reboot yr machine. :-)

BUT

from the Kittlerian POV Mac is the enemy, in that in producing the Graphic User Interface, it effectively locked out users from the inner workings of their computers. Previously, users were also programmers. Of course, that is also the great benefit for consumers; they are no longer required to program etc. But their relationship with their machine is now no longer properly cybernetic; they are dealing with it as a ROM 'judgement of God' whose basic nature is already decided. In other words, the computer becomes simply a consumer tool, like a toaster. Now MicroShit are bad toasters, and Macs are good toasters, but they are both toasters.

What's particularly appalling about MS Plague machines is that, like the Blairite political regime they resemble, they are the worst of all worlds: they have a GUI front end which locks out users from their MyStagogic Inner Core, whilst at the same time demanding arcane knowledge on the part of users. In other words, they are like toasters that require users to understand an incoherent geek hermeticode every time they go wrong, which is very often.

Open Source, which is more demanding on consumers, is the cyberpunk way. That is rigorously anti-authoritarian in that it restores a fully cybernetic circuit between machine and user (so, at the limit, there is no distinction between the two). Mac OS is a consumer product; Linux is a cyberpunk machine.

Sphaleotas has promised to write more on Open Source here, since he's an expert....

echo-friendly
01-11-2004, 11:05 AM
Open Source, which is more demanding on consumers, is the cyberpunk way. That is rigorously anti-authoritarian in that it restores a fully cybernetic circuit between machine and user (so, at the limit, there is no distinction between the two). Mac OS is a consumer product; Linux is a cyberpunk machine.

let's get real here. there is nobody who understands all of a computer, from its highest level abstractions down to the nitty-gritty of microcode. at some point we're all ignorant and will have to yield to somebody else's authority. linux hype notwithstanding, the hardware has to be taken as given, as unchangeable to the individual user, if only for economic reasons, because designing and producing hardware cannot be done in mum's basement. so the choice is really between different levels and areas of ignorance.

but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.

xero
01-11-2004, 11:11 AM
but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.

can you elaborate? I thought a turing machine was a hypothetical computer successfully passing itself off as a human being

Mika
01-11-2004, 11:53 AM
While Kittler's 'There is no software' position is valid, I think his arguments are most effective as a provocation - a bit extreme to imagine, for instance, that the future for cultural studies lies in engineering and code (i.e. that critics should know arithmetic, the integral function, the sine function and at least two software languages, etc.). Ironically, a bit of this feels like an old Marxist/Enzensberger-type socialist strategy for the 'emancipatory use of media' - maybe a bit naive to imagine a decoding digital massive? I think that his point is simply to emphasize the materialities of communication, for surely there are other modes of becoming cybernetic than writing code or soldering circuits on a motherboard?

As the default option for 'creatives', however - from graphic design to DV editing and DSP - Apple raises an interesting paradox. For according to Lev Manovich, the uptake of new media has meant that at some point during the late twentieth century, technology effectively overtook art:

"that is, not only have new media technologies - computer programming, graphical human-computer interface, hypertext, computer multimedia, networking (both wired-based and wireless) - actualized the ideas behind projects by artists, they have extended them much further then the artists originally imagined. As a result these technologies themselves have become the greatest works of art today. The greatest hypertext is the web itself, because it is more complex, unpredictable and dynamic than any novel that could have been written by a single human writer, even James Joyce. The greatest interactive work is the interactive human-computer interface itself: the fact that the user can easily change everything that appears on her screen, in the process changing the internal state of a computer or even commanding reality outside it."

From this perspective, people like J.C.R. Licklider, Douglas Engelbart, Ivan Sutherland, Ted Nelson, Seymour Papert, Tim Berners-Lee become the most important artists of our time, the true visionaries and innovators, rather than the slew of tech-based artists that animate and inhabit these environments. The GUI might not be imagined as just a consumer item then, but also a significant (romantic) piece of art in and of itself (as opposed to a strictly repressive disciplinary apparatus).

echo-friendly
01-11-2004, 12:41 PM
can you elaborate? I thought a turing machine was a hypothetical computer successfully passing itself off as a human being

no, you mean the turing test.

a turing machine is a mathematical construct, one of many formalisms that are what is called turing-universal. a formalism is turing-universal if anything that could conceivably be a machine can be simulated by that formalism.

be.jazz
01-11-2004, 01:43 PM
from the Kittlerian POV Mac is the enemy, in that in producing the Graphic User Interface, it effectively locked out users from the inner workings of their computers.
So, are you saying that cars would be better if we still needed to be mechanics to be able to use them? Should programmers still be using Assembler or even directly typing in 1s and 0s?

k-punk
01-11-2004, 05:09 PM
The question of realism is neither here nor there.... just because it isn't likely to happen doesn't mean that it wouldn't be good if it did... it's about destratification, i.e. punk...

There's no question that the machinic potential of computers is locked down by the graphic user interface; yes, this makes certain activities possible, but it also massively limits what a computer is. There are of course good practical reasons why one's interactions with computers should be limited. But when they are so limited, they are then the equivalent of a toaster, i.e. OBJECTS which (only) have a use value, not machines whose function and purpose are in every sense open. I'm as guilty of this as the next neurobot of course, in that I want a quick-fix anthropomorphically configured object to perform recognizable human tasks.

As for the Marxist/Enzensberger position: my argument about computers as toasters is actually stolen from Baudrillard in For a Critique of the Political Economy of the Sign, which is specifically directed against what he takes to be Enzensberger's naivete. For Baudrillard, Enzensberger has failed to assimilate the basic lesson of McLuhan: that the medium is the message. There is no positive or emancipatory use of media, because media are themselves the problem. A television really is no different from a toaster: it is something that people use and which therefore has a fixed purpose and, more importantly, its use entails certain passive behaviours.

But computers need not be media, need not be consumer objects that are just used. Precisely because a computer is nothing in itself, it has no essence in the philosophical sense, it is just a simulation machine, and therefore can potentially be anything.

Responding to a ROM menu is not interacting with a computer. Genuine interaction only comes when the machine and its user (and in a strict cybernetic sense, the user is a component of the machine, not extrinsic to it) engage in a mutual becoming (i.e. where the behaviours and structures of both the computer and its user start to affect and influence one another). This can only happen at the level of code, not at the anthropomorphic level of image-metaphors (GUI), or the - in the worst sense fantasmatic - speculation about computers you can talk to.

It depends whether computers are deployed as tools of Human OS or as escape routes from it.

As for cars: it would be better if they didn't exist at all, that's obvious .... :D

Woebot
01-11-2004, 05:21 PM
but the beauty of turing computability is that it's hard to suppress. almost anything is a turing machine. if you want you can code up turing machines on top of a mac's shiny GUI and do all the computation you want.
hey there. im genuinely curious as to what you mean by the turing machine here. i have read a bit about alan turing but im stuffed if i can remember precisely what the machine did. also what kind of tasks would you get a turing machine running in these conditions to perform?

echo-friendly
01-11-2004, 05:27 PM
hey there. im genuinely curious as to what you mean by the turing machine here. i have read a bit about alan turing but im stuffed if i can remember precisely what the machine did. also what kind of tasks would you get a turing machine running in these conditions to perform?

a TM is a mathematical model of a computer. in fact you can think TM = computer. everything a computer can do a TM can do and vice versa. http://en.wikipedia.org/wiki/Turing_machine you can also think of TM = programming language, TM = Java, TM = C. so the tasks performed "under these conditions" would be anything programmable at all.

Mika
01-11-2004, 07:35 PM
Still seems to be an element of remorse in your argument, a suggestion that consumers should somehow become the producers of technology - which is a bit different from Baudrillard's orientation on simulation (isn't his response from the 'Requiem for the Media' essay?), especially the humanist investment in the Gutenberg Galaxy of 'the real' etc. For him, there is no potential for intervention in the process of digitalization, since technological media destroy the aura of an event - the failed revolution of '68 never survived being broadcast without being completely negated and eviscerated of meaning. In this sense, a toaster is radically different from a television - one being directly implicated in the production of simulacra, the other being simply an effect.

This is not the case with other commentators on 'new media' - there are some other more nuanced conceptualizations of the computer other than a simulation machine.

DigitalDjigit
01-11-2004, 07:44 PM
If I remember my class correctly then C/Java/etc. are not quite Turing machines. Programming languages are context-free grammars (believe it or not, Chomsky is relevant to computer science because of his work with grammars). In a context-free grammar the context of any symbol is irrelevant to its meaning. Turing machines are equivalent to context-sensitive grammars. Context-sensitive grammars are more "powerful" than context-free grammars.

Anyway, the original Turing machine is a theoretical construct. Imagine a tape of infinite length with a reading/recording head. The head can move right or left and read or write a symbol. The head is attached to a finite-state machine. Just think of it as a machine that keeps track of what state it is in. There is also a table of rules that says what state the machine will go into on receiving a given input. There is a starting state and a final state and any number of intermediate states. If you give an input to the machine and it results in it going to the final state then you say the machine accepted the input; otherwise it rejects the input. The set of inputs the machine accepts is the language that the machine accepts. A set of rules that generates the language is called a grammar.

Anyway, the claim is that given a machine like that you can compute anything that is computable. Moreover it is possible to build a universal Turing machine that will read a description of any machine from the tape and run as if it was that machine.

This is one of the premier intellectual achievements of the last century and as such I think it is important that people are at least familiar with it.
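[The tape-and-rule-table construction described above translates almost line for line into code. Here is a hedged sketch in Python - the particular machine, state names and symbols are invented for illustration, not taken from the thread. The example machine decides the language 0^n 1^n by repeatedly crossing off a leading 0 (as X) and its matching 1 (as Y). Note that the simulator itself is, in a crude sense, the universal machine mentioned above: it takes any machine's rule table as data and runs it.]

```python
def run_tm(rules, start, accept, tape):
    """Run a one-tape Turing machine.

    rules maps (state, symbol) -> (new_state, new_symbol, move),
    where move is -1 (left) or +1 (right). Returns True if the
    machine reaches the accepting state, False if it has no rule
    for the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))      # sparse tape; blank cells read as ' '
    state, head = start, 0
    while state != accept:
        symbol = cells.get(head, ' ')
        if (state, symbol) not in rules:
            return False               # no applicable rule: reject
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    return True

# Rule table for deciding { 0^n 1^n : n >= 0 }.
rules = {
    ('q0', '0'): ('q1', 'X', +1),   # mark a 0, go find its matching 1
    ('q0', 'Y'): ('q3', 'Y', +1),   # no unmarked 0s left: verify only Ys remain
    ('q0', ' '): ('acc', ' ', +1),  # empty input: accept
    ('q1', '0'): ('q1', '0', +1),   # scan right over unmarked 0s
    ('q1', 'Y'): ('q1', 'Y', +1),   # ...and already-marked 1s
    ('q1', '1'): ('q2', 'Y', -1),   # mark the matching 1, head back left
    ('q2', '0'): ('q2', '0', -1),   # scan left back towards the marked 0s
    ('q2', 'Y'): ('q2', 'Y', -1),
    ('q2', 'X'): ('q0', 'X', +1),   # step past the last marked 0, repeat
    ('q3', 'Y'): ('q3', 'Y', +1),   # check nothing but Ys up to the blank
    ('q3', ' '): ('acc', ' ', +1),  # everything matched: accept
}

assert run_tm(rules, 'q0', 'acc', '0011')      # in the language
assert not run_tm(rules, 'q0', 'acc', '0010')  # not in the language
```

[Swapping in a different rule table gives a different machine, which is the whole point of the universality claim.]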

DigitalDjigit
01-11-2004, 07:55 PM
OK, now that that is out of the way I can go back and try to remember what my original response was meant to be.

I do not think that a GUI necessarily locks people out of anything. To continue the car analogy: despite most drivers not being mechanics nowadays, many of them are able to drive from New York to Los Angeles (or even to the mall). Pretty much all development these days happens in Integrated Development Environments, which are themselves very complex GUIs. It's a bootstrapping process. Just as these days people don't use digging sticks to dig holes - they use shovels manufactured on complex machinery that was built on less complex machinery that was built with hand tools that eventually, down the line, were built with the help of sticks.

As for Linux, while I appreciate it, it really is a huuuuuge pain in the ass. I take it you have never used it. It is fine for a short while until you have to do that thing that you haven't done yet, such as printing. Or for example you can browse the web just fine until you encounter Flash and you need to get the plug-in. Prepare for some serious headache. Yes, maybe after a few hours you will be able to download the right source code, compile it (after making sure you have the dependencies for any of which you may have to go through this whole process before you can go on), install it in the right place, modify the configuration files and get it running. But really, all you wanted to do was watch that short movie.

Most needs are taken care of by software that already exists out there. Complete mastery of your machine is a prerequisite only for pursuits so esoteric that I am having a hard time even thinking of them.

echo-friendly
01-11-2004, 08:01 PM
If I remember my class correctly then C/Java/etc. are not quite Turing machines. Programming languages are context-free grammars (believe it or not, Chomsky is relevant to computer science because of his work with grammars). In a context-free grammar the context of any symbol is irrelevant to its meaning. Turing machines are equivalent to context-sensitive grammars. Context-sensitive grammars are more "powerful" than context-free grammars.


I'm afraid you've got things slightly mixed up. Syntactically correct Java (et al.) programs may well be describable by context-free grammars. But so is the input to any Turing machine (as it is a single finite string over a finite alphabet). What is relevant here is that the set of languages TMs decide is exactly the same as the set of languages Java programs decide. Basically you need to distinguish between (1) the language that contains exactly the syntactically valid Java programs (this may be recognisable by a context-free grammar, though I don't think so, because after parsing there's type checking) and (2) the languages decided by the syntactically correct Java programs.

Another way of looking at it is that it's pretty trivial to implement a TM in Java, and a Java interpreter as a TM. So both formalisms must be equivalent.
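[A concrete instance of the gap being pointed to here: a^n b^n c^n is the textbook example of a language that no context-free grammar generates, yet a couple of lines in any general-purpose language decide it - so the class of languages programs decide is strictly larger than the context-free class that (roughly) describes program syntax. A sketch, with an invented function name:]

```python
def decide_anbncn(s):
    """Return True iff s is a^n b^n c^n for some n >= 0.

    This language is provably not context-free (by the pumping
    lemma), but it is trivially decidable by a program or a TM.
    """
    n = len(s) // 3
    return len(s) == 3 * n and s == 'a' * n + 'b' * n + 'c' * n

assert decide_anbncn('aabbcc')      # in the language
assert not decide_anbncn('aabbc')   # counts don't match: rejected
```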

Woebot
01-11-2004, 08:28 PM
I do like the principle that the GUI is inhibitive, and it must be admirable that artists get into coding. Actually the feeling that one ought to get into coding has (for me at least) assumed the voice once entirely adopted by my protestant work ethic. "Must study UNIX. Must abandon frivolous pursuits."

I'm glad the Mac's (and graphics') slightly awkward relationship to the kind of hardline position laid out by Mark K has been brought up. The thing is, Computer "Art" - if there is such a thing now that almost all art has a digital component in its construction - was essentially incredibly poor. The herculean efforts involved in coding one's own art seem to preclude making it any good. The end product became almost an afterthought, the pieces' meanings ponderously "overcoded" theoretically. Actually this period of computer art has JUST the purity of approach that Mark (quite rightly) celebrates.

<img alt="abel.jpg" src="http://www.woebot.com/images/dissensus/abel.jpg" width="480" height="340" border="0" />

This is by Robert Abel from 1984. Just atrociously cringe-making. Actually wrestling with visual/technical issues like shadows and textures - the kind of thing that nowadays looks like a few primitives hit with preset textures.

<img alt="todd.jpg" src="http://www.woebot.com/images/dissensus/todd.jpg" width="466" height="470" border="0" />

This is by William Latham from 1989. Latham had/has an extremely developed theory called "evolutionary art"; the pieces are quite beautiful in a kind of Giger-ish way, but ULTIMATELY cripplingly naff.

<img alt="max.jpg" src="http://www.woebot.com/images/dissensus/max.jpg" width="911" height="301" border="0" />

This last one is one of the only exceptions to the "thoroughbred coding" approach to video art which is really adorable (oh, this AND David Em's things): "Carla's Island" by Nelson Max from 1981.

When you delve deep into motion graphics there are junctures when you enter the software's "shell". Most of the time (certainly in 2d graphics) there's no cause to start tinkering with code. But the 3d package Maya has its own scripting language called MEL, which is extremely useful when you are unable to get the software to perform how you want it to (or want to automate stuff). Indeed, the deeper you go into motion graphics - for instance in formidable software like Houdini and Shake, which adopt user-defined procedures as the algorithms defining motion - you'll find that coding is central to the program's operation.

Most of the time you can get away without this depth of knowledge, and i suspect that it's only in the larger software houses where people are thoroughly distributed through the pipeline (ie one person in charge of modelling the feathers on the bird, one person animating it, one person coding how it interacts with other birds in the scene) that it gets implemented. Though there is some kind of filtering down happening at the moment, with apps like After Effects using expressions.

Part of me likes to be a little more trusting with regard to the intentions of the people making the software one uses. Isn't that what's at stake here to some degree - that one doesn't TRUST the engineers? Matt Silverman, who was behind one of my all-time fave pieces of software called Commotion (really a dreamy, exquisite piece of software), is sound as a pound, just another human being.

be.jazz
01-11-2004, 08:57 PM
Responding to a ROM menu is not interacting with a computer. Genuine interaction only comes when the machine and its user (and in a strict cybernetic sense, the user is a component of the machine, not extrinsic to it) engage in a mutual becoming (i.e. where the behaviours and structures of both the computer and its user start to affect and influence one another). This can only happen at the level of code, not at the anthropomorphic level of image-metaphors (GUI), or the - in the worst sense fantasmatic - speculation about computers you can talk to.
But even at the level of code, programmers use their equivalent of GUIs: APIs (Application Programming Interfaces), which voluntarily and helpfully limit and direct what you can do. Also, the aim of advanced languages is to free the programmer from having to talk directly to the computer - from worrying about where things are in memory and how much of it they take up, how different hardware elements are communicating and what they are doing, etc. So, if the interface is present at lower levels, why is the GUI your chosen bottleneck? Even if all users become Java or C# programmers, they'll hit another "barrier."
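[The point that every layer is just another interface can be made concrete. The sketch below (file handling details are illustrative, not from the thread) performs the same byte count twice: once through a one-line high-level call, once through explicit file-descriptor and buffer management. Neither escapes "interface" - the lower one simply exposes more of the machinery while still hiding the syscalls beneath it.]

```python
import os
import tempfile

data = b'hello, dissensus'

# Write a scratch file to count (name and contents are arbitrary).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

# High-level interface: one call, no visible buffers or descriptors.
with open(path, 'rb') as f:
    total_hi = len(f.read())

# Lower-level interface: explicit descriptor and a manual read loop.
fd = os.open(path, os.O_RDONLY)
total_lo = 0
while True:
    chunk = os.read(fd, 4096)   # read up to 4096 bytes per call
    if not chunk:               # empty bytes object signals end of file
        break
    total_lo += len(chunk)
os.close(fd)

os.unlink(path)                 # clean up the scratch file
assert total_hi == total_lo == len(data)
```

[Both paths agree, and both are "barriers" in the sense argued above; dropping a level buys visibility into buffers and descriptors, not an exit from mediation.]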

joanofarctan
01-11-2004, 10:02 PM
"Windows and Mac OS are products, contrived by engineers in the service of specific companies. Unix, on the other hand, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture"

"Though Linux works for me and many other users, its sheer power and generality is its Achilles' heel. If you know what you are doing, you can buy a cheap PC from any store, throw away the Windows disks that come with it, turn it into a Linux system of mind-boggling complexity and power. You can configure it so that a hundred different people can be logged onto it at once over the Internet, via as many modem lines, Ethernet cards, TCP/IP sockets, and packet radio links. You can hang half a dozen different monitors off it and play Doom with someone in Australia while tracking communications satellites in orbit and controlling your house's lights and thermostats and streaming live video from your web-cam whilst surfing the net and designing a circuit board on another screen... it's so vastly technically superior to other OS'es that sometimes its just too formidable for routine day-to-day use. Sometimes, in other words, i just want to go to Disneyland."
- Neal Stephenson, 'In the beginning... was the command line' (http://www.amazon.co.uk/exec/obidos/ASIN/0380815931/qid=1099342047/ref=sr_8_xs_ap_i1_xgl/202-8202017-0148620)

That's the crux of it. Our brain's greatest playground can at once be a simple consumer product with purpose-built packages designed to ensure quick results AND the most radical man-made object in the history of humanity (the Intel chip as a modern wonder of the world). Its multiplicity and ubiquity feed into its potential, rather than detract from it. Without the spreadsheet, Apple may have flopped in its early stages. Without that piece of corporate software i may never have encountered the Unix shell and got back into computing through my use of OS X.

Technology moves at a pace that renders impossible its demystification. To paraphrase DeLillo:

every advance in technology is matched by a similar rise in the superstition of a population

This is unavoidable: it's sheer utopianism to strive for a technologically enlightened society. The fact is GUI design has to move to meet practical concerns, the interface to provide a seamless coat: apple and nokia have done a fantastic job under enormous economic pressures - product-driven companies in a world that spares no extra time for design excellence. They didn't have to do it. But rather than sit back and ruminate on how the population might be better off as a mass group of enlightened hackers, they (such companies are built by individuals last time i checked) chose to jump in and help democratise an object in the manner they best saw fit.

Some level of digital literacy in the populace renders the WIMP vs CLI debate null: both are acquired skills (you weren't born knowing what a mouse was, or what an icon, a folder, a directory and all these superfluous metaphors meant), and in perspective both have their place. An individual can choose on what level to interact with their own particular digital device. For example, some people in my department are running Linux on a playstation. That's fun and all, and it would be nice to show people how malleable and hackable any device potentially is, but people don't need to know that. People have aims and ends that simply incorporate digital technology in a highly functional manner. There seems to me nothing inferior in this approach to computational devices.

However, the next generation (you know, the kids) needs to be given some perspective - educated on what, broadly speaking, a computer is, because every aspect of their life will involve computing. It'll become so ubiquitous and distributed through our environment that if interface design (I mean both GUI and device interface: protocols with other devices) doesn't improve to simplify all those processes, everything is going to take longer than it should (see the linux flash plug-in nightmare - i've heard that from about 6 people!). Imagine sixteen wireless digital devices strewn throughout your home. Do you really want a CLI for all that? I have at least 8, and am thankful for great UI design on most of them.

99% of modern computing as we know it is an abstraction. If it weren't so, the computer wouldn't have inspired such obsessive genius as it has attracted in the last fifty years. Stripping back the layers is half the fun, but building newer, more sophisticated layers, such as UNIX and C - there's a beauty to that which can exist entirely outside the physical world, in the abstracted realm of high-level programming. Looking for art in computing always leads me to high-level software. Like programming AI in Prolog. GUI design is simply design excellence - functional, yes; sublime, no. Using the applications coded in C to produce art seems totally valid to me, if only i could think of any examples. The applications themselves need to strike a balance between demanding too much of the user and limiting usability. I think the sound software MAX/MSP is a good example of that compromise struck well. So there's one example. Creativity, as always, can be found in the system of signs itself (language) or in the manipulation of said signs. A true visionary of modern computing? Try Dennis Ritchie's (http://cm.bell-labs.com/cm/cs/who/dmr/bigbio1st.html) work at Bell labs. Quietly coding away for forty years. Pivotal in both C and UNIX.

k-punk
02-11-2004, 04:31 AM
I still think some people are missing my point.

Articulating things in terms of existing human 'needs' - there are of course no human needs of any kind, the whole notion of need is ideological through and through - immediately massively limits the potential of any technology, especially computers, which in being Nothing - in having no essence - can potentially be anything.

The idea that there is a more 'complex and nuanced' view of computers than that they are simulation machines is really quite stunning. What could be more complex and nuanced than the idea of a machine that can simulate the function of any other machine but which itself is Nothing? (Sadly I think that using the magic word 'Baudrillard' has closed down thought here -- because aha, we all know what Baudrillard says, don't we, and it's not very interesting :) ). The idea of computers as simulators is really not at all controversial, and certainly not reducible to Baudrillard (who in any case is a master of the subtle and the nuanced, whatever reductive readings of him maintain). Much of Sadie Plant's stuff in the nineties, in which she parallels computers with women - both of which have been defined as nothing but simulation, lacking in any essence - makes good use of this idea that computers are simulation machines.

No, I haven't used Linux, but then I'm lazy and I don't really want to defend such laziness. My default position would be to argue that yes, Macs are superior etc etc, but this is to avoid the broader and more crucial technopolitical point about potentials. 'User friendliness' basically slaves computers to the human pleasure principle - to becoming, if we pursue Sadie's parallel, prostitutes, dressing up in familiar garb to service the same old dreary desires. 'All I wanted to do was watch the movie': well, yes, precisely. But there are more destratifying potentials that human-computer interaction could explore --- once both use value and the pleasure principle are left behind.

k-punk
02-11-2004, 04:38 AM
Link for those interested in open source (especially in relation to education):

<a href="http://www.oss-watch.ac.uk">Oxford University's Open Source watch</a>

As I mentioned on another thread, I met the fellow involved with this the other week; he's a real Stephensonian k-punk geek, totally inspiring in his no-nonsense, can-do anti-mystagogic approach....

Mika
02-11-2004, 06:01 AM
I guess, like most people, when I read 'simulation', I think of Baudrillard - whose writing in many ways presents a deeply cynical and pessimistic take on popular media. Not a particularly helpful one either, for the most part.

But in terms of a more nuanced reading, I'm even thinking of stuff like Kittler in Gramophone, Film, Typewriter - whose comparably bleak posthumanist perspective is based on, I believe, a more useful notion of data-processing in the post-Gutenberg universe. Particularly, the notion of technological media as bypassing symbolic mediation to record visual and acoustic effects of the real (i.e. the Lacanian slant of the text), separating out the senses and fragmenting the subject. Which I find is a more illuminating take on the experience of technology in terms of recording devices, keyboards, strobe-lighting, visual FX, etc. But also notions of time and death, etc.

Of course, he then goes on to suggest that as we enter the post-medium era - as formerly distinct media like television, radio, telephone and mail begin to converge in the computer - the collected bits of the subject are remade technologically so that, "instead of wiring people and technologies, absolute knowledge will run as an endless loop". Naturally, whether we'll ever reach this stage of artificial intelligence is highly questionable.

I guess that Lev Manovich is someone else who I find is equally useful in capturing the current process of new media as evolutionary, and gauging these emerging technologies against the history/influence of cinema. In this account, computer mediated technology is not a blank 'Nothing' - but is a processor continually tied to RL, and born from the genealogies and historical trajectories of film.

And also, the work of Mark Hansen, whose critique of Kittler and posthumanism/poststructuralism is particularly innovative in reconceptualizing embodiment and new media in terms of haptic vision, affect, and endogenous bodily framing processes, through re-readings of Bergson, Deleuze and the neurobiology of Francisco Varela.

Maybe I'm just not really getting what you mean by a computer philosophically being defined as 'Nothing'? For instance, what I think is interesting/unique about these writers is the attempt to historically contextualize the emergence of computer technology, connect it with other media and frame it in terms of RL. While I guess hypothetically computers could be 'anything' or a 'void', in this literature the evolution of new media appears as quite a distinct object - historical, contextual and physical.

Woebot
02-11-2004, 07:00 AM
Those pictures really are revolting Mr. Woebot!

I think you mistake what an artist's role is. It's never about the materials my friend! Unless you really think there's such a thing as 'process art'. Ha ha ha! Obviously, people with a lot of time on their hands will do all kinds of things, but who wants to look at 'em?

er, that's actually what i say in the post, if you took the time to read it. it's difficult to tell from your remark whether you're in agreement, or whether you just haven't taken this on board. maybe you're just trolling as usual?!?

xero
02-11-2004, 07:25 AM
When you delve deep into motion graphics there are junctures when you enter into the software's "shell". Most of the time (certainly in 2D graphics) there's no cause to start tinkering with code. But the 3D package Maya has its own scripting language called MEL, which is extremely useful when you are unable to get the software to perform how you want it to (or want to automate stuff). Indeed, the deeper you go into motion graphics - for instance in formidable software like Houdini and Shake, which adopt user-defined procedures as the algorithms defining motion - you'll find that coding is central to the program's operation.
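To make the "user-defined procedures as the algorithms defining motion" idea concrete, here's a toy sketch in plain Python (not actual MEL or Houdini code - the `orbit` procedure is my own illustration, not anything shipped with those packages). The point is just that in procedural animation, motion is literally a function you write, which the package then samples per frame:

```python
import math

# A "user-defined procedure" in the procedural-animation sense:
# an object's position is simply a function of the frame number.
def orbit(frame, radius=5.0, period=24):
    """Return the (x, y, z) position of a point circling the origin,
    completing one revolution every `period` frames."""
    angle = 2 * math.pi * frame / period
    return (radius * math.cos(angle), 0.0, radius * math.sin(angle))

# Sample the procedure at a handful of frames, as the renderer would;
# tweaking radius or period re-generates the whole motion for free.
path = [orbit(f) for f in range(0, 25, 6)]
```

That regenerability is the draw: change one parameter and every frame updates, rather than re-keyframing by hand.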


Apparently at Pixar, Maya is thought of as an operating system rather than an application - it has enormous flexibility. But flexibility can be the enemy of artistry - especially on an individual level, i.e. when you don't have a vast hierarchy of technical operators performing specialised functions like they do at Pixar. It's possible to draw a parallel with audio synthesis: huge modular synthesisers have the potential to create a massive variety of sounds, but huge creative leaps have been made with hardwired little Roland boxes with a minimum number of knobs. Art becomes interesting when there is a critical or emotional response to the technology, rather than just an exploration of what it can do.

joanofarctan
02-11-2004, 07:36 AM
Most comprehensive open source publication available on the net:

Open Sources (http://www.oreilly.com/catalog/opensources/book/toc.html) , courtesy of the crucial publisher O'Reilly

johneffay
02-11-2004, 10:04 AM
It depends whether computers are deployed as tools of Human OS or as escape routes from it.


Can you elaborate on exactly how you think computers can be deployed in this way?

I get the feeling that all this antipathy towards GUIs, etc. is rather missing the point: it is the GUI that allows most people to do anything at all on a computer (including most Linux users). I guess your point is that they are not doing what you think they should be doing on computers, but I don't understand what it is that you think they should be doing.

I actually agree with Echo-Friendly :eek: about the question of levels of complexity: I'm digital native enough to have spent the best part of a decade in the data comms industry, speccing and managing wide area networks for a company which used to manufacture and maintain them. The software engineers worked on raw UNIX code and made the same disparaging noises about DOS that you are making about GUIs, but they still didn't know how the hardware worked; that was what the hardware engineers were for. In fact, all the engineers had their own specialisations, and none of them could have built an entire WAN.

Now you might want to argue that this is because the stuff is ridiculously overcomplicated, but I would suggest that if you simplify down to a level where everybody has full user transparency, we will be back to paper and pencil. Come to think of it though, just how do they get the lead into a pencil? ;)

Woebot
02-11-2004, 10:30 AM
But flexibility can be the enemy of artistry - especially on an individual level, i.e. when you don't have a vast hierarchy of technical operators performing specialised functions like they do at Pixar.

Yes, and it's so rarely important (for me at least!) to delve that far under the hood. Did Leonardo make his own paint? (Not that I'm suggesting there is anything particularly important about the works I produce in my own small capacity)

Still, I'm very interested to hear what Mark will make of effay's point. That's to say "What could these machines be liberated from the GUI to do?" I've a feeling he's gonna say something along the lines of "We'll never know until we abandon it!" Lol.

xero
02-11-2004, 01:42 PM
Did Leonardo make his own paint?

without wanting to undermine the point here by being a pedant - i've a feeling he did! although no doubt assistants did the legwork - I think it was quite common for painters until relatively recently (someone help me out here) to mix up paint from separate pigments, binders etc.

Woebot
02-11-2004, 01:44 PM
i've a feeling he did!

lol :-) did warhol?

xero
02-11-2004, 02:47 PM
Warhol used the cheapest, naffest paints imaginable - from a hardware store, or the kind of powder paints you used to get at primary school. I found this out because someone I know put his foot through a Warhol painting whilst moving Saatchi's collection around, and the expert restorer who patched it up told him that's the kind of paint he had to use to match it in. This was unfortunately an accident, not an act of postmodern iconoclasm, before you ask...

polystyle desu
02-11-2004, 03:56 PM
Nice thread K
much more interesting than the whole W Gibson site by now ...

relative of Cossack barbarians ,
father worked in Pentagon on the Monet (MoNet ?),
Futants jammed with Robin Simon '79,
called Gibson in '84 , setting relationship in motion ("Hip Tech High Lit" '87 w/ WG, Bruce Sterling, Judy Nylon, Sean Young ; Original music for "Neuromancer" Audio Book '94 ; same for "Johnny Mnemonic" , yeah, i know it sucked, we got bought out by Sony before the movie was even done),
tried cobbling together software mod's for OS site in '97,
got bored with tech,
went to Himalayas

Valhalla , we are coming

be.jazz
03-11-2004, 11:38 AM
without wanting to undermine the point here by being a pedant - i've a feeling he did! although no doubt assistants did the legwork - I think it was quite common for painters until relatively recently (someone help me out here) to mix up paint from separate pigments, binders etc.
I caught a BBC documentary on Da Vinci (or was it Michelangelo?) some weeks ago, and apparently (whichever old Italian it was about) he innovated in the making of paint.

be.jazz
19-11-2004, 09:38 PM
I don't think anyone posted the link to the article k-punk mentions, here it is: http://www.economist.com/displayStory.cfm?Story_id=3307363

I think I'm a digital native: I've never found installing a printer or plugging my digital camera into the USB port to be particularly difficult.

mms
19-11-2004, 10:52 PM
I caught a BBC documentary on Da Vinci (or was it Michelangelo?) some weeks ago, and apparently (whichever old Italian it was about) he innovated in the making of paint.

on a tangent: it's almost a given that most of the Brit Art artists of the '90s use the factory methods of Warhol etc. it makes me kinda sad - talented artists making "to order" artworks out of dead flies in a warehouse in Stroud for Damien Hirst's rich clients.

can any "artistry" come out of this method of making things? not sure it can - just a stinking menagerie of dead flies ready to be gassed and stuck to a canvas.
have any of the Brit Art artists turned this state of "artistry" into anything useful and viable?