Neurosis as a lifestyle: remixing revisited

“We stand on the last promontory of the centuries! Why should we look back, when what we want is to break down the mysterious doors of the impossible? Time and Space died yesterday. We already live in the absolute, because we have created eternal, omnipresent speed.”
– Filippo Marinetti, 1909

When, a year ago, I looked at some of the strange attitudes to copyright and creativity that abound on the internet, vilification followed swiftly. I wondered what was behind odd assertions that “the power of creativity has been granted to a much wider range of creators because of a change in technology”, which grew, without pausing for punctuation, into even odder and grander claims, such as “the law of yesterday no longer makes sense.” ‘Remix Culture’ as defined by the technology utopians wasn’t so much a celebration of culture as of the machines that make it possible, we noted. But many people simply find such thinking quite alien. So it’s heartening to see writers like Nick Carr and, today, the Wall Street Journal’s columnist Lee Gomes join the debate that so animates Reg readers, and question these silly assumptions too.

Gomes hears a dot com executive sell his movie editing service with the claim that, “until now, watching a movie has been an entirely passive experience.”

(We heard a similar, silly claim from Kevin Kelly recently, only about reading.)

Passive? Not at all, Gomes explains today:

“Watching a good movie is ‘passive’ in the same way that looking at a great painting is ‘passive’ – which is, not very. You’re quite actively lost in thought. For my friend, though, the only activity that seemed ‘active’, and thus worthwhile, was when a person sitting at a PC engaged in digital busy work of some kind.”

Which is the world view in a nutshell. The future in which the scribbles of the digerati adorn every book or movie is a nightmare, he agrees. It’s also rather presumptuous. Who does this self-selecting group claim to represent?

We’ve had a glimpse into this “future” with Google for the past three years, where to reach some original source material, one must wade through thickets of drivel, some of it generated by bloggers, the rest by machines pretending to be bloggers. It’s hardly anyone’s idea of enhancement. Gomes calls it “dismally inferior”, and offers a lovely simile:

“Reading some stray person’s comment on the text I happen to be reading is about as appealing as hearing what the people in the row behind me are saying about the movie I’m watching.”

Read more

BBC seeks ‘Digital Assassins’

What if they held a digital media revolution – and nobody came? The BBC is having trouble finding citizens to attend a conference devoted to the exciting new world of Citizens Media. It’s a Beeb-sponsored day about the “democratization of the media”, but despite a 50 quid bribe to attend – that’s more than you … Read more

People more drunk at weekends, researchers discover

A parody from 2000

It’s open season on Wikipedia these days. The project’s culture of hatred for experts and expertise has become the subject of widespread ridicule. Nick Carr christened it “the cult of the amateur”.

But what has professional academia done for us lately? Here’s a study from the University of Amsterdam to ponder.

New Scientist reports that researchers led by Professor Maarten de Rijke at the Informatics Institute have been recording words used by bloggers, in an attempt to find interesting or unusual patterns. What revelations did the team’s MoodViews software unearth?

The team discovered that the LiveJournal label “drunk” becomes increasingly popular each weekend. And around Valentine’s Day, “there is a spike in the numbers of bloggers who use the labels ‘loved’ or ‘flirty’, but also an increase in the number who report feeling ‘lonely’.”

It gets better.

The team also noticed that on the weekend of the publication of the most recent Harry Potter book, bloggers used “words like ‘Harry’, ‘Potter’, ‘shop’ and ‘book’,” PhD student Gilad Mishne reveals.
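The article doesn’t describe how MoodViews actually works under the hood, but the kind of aggregation being mocked here is simple enough to sketch. A minimal illustration, assuming nothing more than a list of timestamped mood labels – the sample data and the `weekend_share` helper below are invented for illustration, not taken from the study:

```python
from datetime import date

# Hypothetical sample of (date, mood_label) pairs, standing in for
# the timestamped mood tags MoodViews harvested from LiveJournal.
posts = [
    (date(2005, 7, 15), "drunk"),   # a Friday
    (date(2005, 7, 16), "drunk"),   # a Saturday
    (date(2005, 7, 16), "drunk"),
    (date(2005, 7, 18), "tired"),   # a Monday
    (date(2005, 7, 19), "drunk"),   # a Tuesday
]

def weekend_share(posts, label):
    """Fraction of posts tagged `label` that fall on Fri/Sat/Sun."""
    tagged = [d for d, mood in posts if mood == label]
    if not tagged:
        return 0.0
    # date.weekday(): Monday is 0, so 4, 5, 6 are Fri, Sat, Sun.
    weekend = [d for d in tagged if d.weekday() >= 4]
    return len(weekend) / len(tagged)

print(weekend_share(posts, "drunk"))  # 3 of the 4 'drunk' posts fall Fri-Sun
```

Counting label frequencies by day of week is the whole trick – which is rather the point of the Nobel committee joke that follows.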

This work really should put the Nobel Prize Committee on red alert. Alongside the existing scientific prizes for Chemistry, Physics, and Physiology or Medicine, the Laureate Committee should add a new category for the “Bleeding Obvious”, or the “Dying Ridiculous”.

More seriously, let’s look at what this episode teaches us.

Two things are immediately obvious: Mishne’s study was considered worthy of academic funding, and it was considered worthy of an article in a popular science magazine.

The study doesn’t tell us anything we didn’t know before – unless you’re surprised by the revelation that people get more drunk at weekends, or that people talk about Harry Potter books more when a new one goes on sale. The study is really considered funding-worthy and newsworthy because of what’s unsaid – the implication that the aggregation of internet chatter will reveal some new epistemological truth.

Read more

Meet the Jefferson of ‘Web 2.0’

If Google’s PageRank reflects the “uniquely democratic nature of the web” – and if weblogs are the most empowering technology of our age – then how can we begin to fete a humble entrepreneur based in St Paul, MN? Very probably as the Gutenberg of the digital age. And the Jefferson. All rolled into one. … Read more

Flock founder flees

Soon all browsers would look like Flock, predicted Business Week. Included here for this quote, found on the internets. Wasn’t this the quintessential Web 2.0 business plan? Raise a bunch of capital in order to hire old people for pennies on the dollar. Use this vast, untapped resource in order to ‘develop’ a browser that … Read more

‘Lightweight, high-velocity and very connected’

At ZDNet, it’s Microsoft’s “Pearl Harbor”! Forbes screams, “Google’s office invasion is on!” Only it isn’t – and we have the founder’s word for it. As we reported yesterday, Google has paid an undisclosed sum for a web-based document editor, Writely. It’s a product that seems as mature as the company which produced it, Upstartle. … Read more

The Internet Services Puddle

What Ray Ozzie’s strategic memo really says.

Orgone

Ever the master of public relations, Microsoft has always been able to manoeuvre its way out of a tight spot with the help of a judiciously leaked memo.

Remember when AOL merged with Netscape back in 1998? Time to take a leak. Remember 2000, when Symbian was stealing the thunder from Microsoft’s cellphone strategy? Time to take a leak. Remember when the antitrust settlement talks had hit a sticky patch? Time to take a leak. Remember when Microsoft’s security woes finally became an issue? Time, once again, to take a leak.

The purpose of these releases is to bolster morale and focus the staff – Microsoft always seems to need a No.1 Enemy – and inform the press that it’s on the case.

(The memos Microsoft doesn’t want you to read, such as this one and these two, are always more entertaining and enlightening.)

And so it goes. We know you’re very busy people, so in the spirit of the excellent 500-word “digested reads” offered by some of our better newspapers, we give you the précis of the latest Gates and Ozzie memos.

Read more

Web 2.0: It’s … like your brain on LSD!

My invitation to define Web 2.0 – Tim O’Reilly was clearly struggling – drew the biggest postbag at The Register, ever: five a minute for 24 hours. You’ll see from the suggestions that even before most people had heard the buzzword, they already knew what it portended: a consultancy racket. See the original here, and the … Read more

Six Things you need to know about Bubble 2.0

We follow the money

Web 2.0 techno-utopian types love their earthy metaphors. The web is a new planet that’s being “terraformed” before our eyes, one advertising consultant likes to say. Or the “web is a garden”, if you believe Sun Microsystems’ director of research.

Even my overgrown garden doesn’t have something like this lurking in the corner, and I hope there isn’t such a horror in yours.

But enough with the hot-tub psychobabble. The future of computer networks is both a lot more promising and a lot more ominous than anything you’ll hear at the “Web 2.0” conference in San Francisco this week, where some of the web’s horticulturists will be gathering for an evangelical uplift.

There’s every reason to be optimistic, now in 2005, that computer networks can begin to fulfill their potential. They can even start to be really useful – but only by dispensing with such utopian nonsense can we really begin to see what these tools can do for us. Here’s a reality-based guide to what’s happening – and if you hear a futurist omit more than one of these in a presentation, send them to the Exit toot-sweet, with a firm smack on the backside.

Read more