Tim Kring

The audience are the actors in writer Tim Kring’s latest adventure. In his famous creation, the TV show Heroes, people discover they have superhero powers, and go off and battle Evil. In his latest, people go and battle Evil, and discover they have been given Nokia smartphones.

The ambitious, Nokia-sponsored interactive extravaganza began this weekend, and it’s an interesting experiment. In Kring’s own words, this series of events, called Conspiracy For Good, is “not quite a drama, not quite a flashmob, not quite an ARG [alternate reality game]”.

What is it, then, and how did it come about?

Read more

Web politics: The honeymoon is over

Parallel moves in Canada and the US may signal the end of the honeymoon for web-based political campaigning – or change it beyond recognition. Politicians are becoming increasingly familiar with sudden squalls of email filling up their inboxes, and policy makers with responses to public consultations arriving via a web intermediary. But not surprisingly many … Read more

How the photographers won, while digital rights failed

How did the music business end up with a triumph with the new Digital Economy Act? How did photographers, whose resources were one laptop and some old fashioned persuasion, carry an unlikely and famous victory? How did the digital rights campaigners fail so badly? Back in January, a senior music business figure explained to me … Read more

Obama’s got a Google problem

Obama has created an exquisite problem by hiring so many senior executives from Google – some of the Oompa Loompas don’t seem to realise they no longer work for the company. Now a Congressman has called for an enquiry.

The issue was made apparent when a trail of correspondence by administration official Andrew McLaughlin was exposed recently. McLaughlin is Obama’s deputy CTO – a freshly minted post, with CTO meaning either Citizens’ Twitter Overlord or Chief Technology Officer – we believe it’s the latter. He was previously Google’s chief lobbyist, or ‘Head of Global Public Policy and Government Affairs’.

McLaughlin’s contacts were also exposed. In an irony to savour, the exposure was by Google itself, as it introduced its privacy-busting Buzz feature in February. As our Cade pointed out, it would be hard to imagine a better Google story.

Read more

Google abandons Search

It’s hard to explain to people new to the web since 2004 – the Digg kids – the effect that Google had on the internet at the turn of the decade. They can’t conceive of the Before and the After. Google was miraculous, and so much better than the competition that rivals effectively gave up trying to compete with it. But Google’s PageRank also unleashed social and political fads which reverberate right through to this day.

Much of the junk science of the web comes from the Googlemania of this period. New institutes and venerable academic departments today all drink from the seemingly bottomless well. It permeates Birtspeak 2.0, and you can see it in the Thumbs Up and Thumbs Down buttons in comments, for example. The mini-industry called “social media marketing” wouldn’t really exist without it, either.

Google kindled the idea that the Web was a democracy, a great big voting machine. But Google alone was qualified to divine the voters’ intentions – only Google had the capability and know-how to discern the ‘Hive Mind’. Google said so itself: its PR blurb explicitly made the connection between a New Form of Democracy and its own innovation, the “uniquely democratic nature of the web”.

For a couple of years, PageRank™ worked wonders. Then reality began to mess things up. What had worked well for conferring authority to peer-reviewed academic papers didn’t work quite so well in the wild. As Google grew, the importance of appearing in its rankings also grew. SEO and dirty tricks became big business. (See Meet the Jefferson of Web 2.0.)

This was first pointed out by your reporter in 2003, and it was manifest in two ways. Firstly, via the ease with which a small group of motivated people could hijack search terms, thanks to the dense interlinking nature of blogs. (A more perfect machine for rigging PageRank has yet to be invented.) This was Googlewashing. And secondly, via the ease with which spammers could clog the system with noise. The period also saw the migration of large amounts of information to the web in a searchable format. Real-time chatter from protocols that had previously been beyond the reach of search engines – such as AOL chatrooms – found its way into Google. The result, by mid-2003, was a system that was broken.
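To make the first point concrete, here is a minimal sketch of the power iteration at PageRank’s heart, in Python. The link graph, damping factor and page names below are invented for illustration – the real system was vastly more elaborate – but the arithmetic shows why dense interlinking was such an effective rig: whatever rank flows into a clique circulates among its members and never leaks back out.

    # Toy PageRank by power iteration. A, B and C are ordinary pages;
    # X, Y and Z are a densely interlinked clique (think cross-linking blogs).
    # A innocently links into the clique; the clique links only to itself.
    DAMPING = 0.85      # probability of following a link vs jumping at random
    ITERATIONS = 50

    links = {
        "A": ["X"],
        "B": ["A"],
        "C": ["B"],
        "X": ["Y", "Z"],
        "Y": ["X", "Z"],
        "Z": ["X", "Y"],
    }

    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(ITERATIONS):
        new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # rank splits among outlinks
            for target in outlinks:
                new_rank[target] += DAMPING * share
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

Run it and each clique member ends up with four or five times the rank of even the best-connected ordinary page – an effect a small, motivated group could buy for the price of a few hyperlinks.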

You may recall that it was heresy at the time to doubt Google’s quite magical technical ability to get it ‘right’. The bandwagon of Web 2.0 had barely started to roll – it wasn’t christened until the following year – but there was already serious money riding on it. And it was an even greater heresy to question the moral authority that the technology utopians had by then conferred on Google.

For Google wasn’t just ranking web pages, but adding to the human epistemological canon – it was telling us what was wrong and right – filtered and legitimised through the people-powered Hive Mind. Thanks to the now-burdensome “Don’t Be Evil”, it constantly reminded us of its impeccable moral credentials.

Well, as you may have seen, PageRank™ is now dead. Google has given up on the job of ranking pages – it can’t cope any more – and outsourced the task to the user. Needs must, and so it will make a virtue of the very feature that helped destroy the index – real-time noise. As Danny Sullivan points out, this is very big news indeed. I think it’s even bigger than Danny thinks it is – with an extra penthouse layer of bigness on top – for all the social and political implications mentioned above.

By outsourcing the ranking of pages to the hoi polloi, Google is saying that it is no longer in the business of ‘arbitrating’ democracy. This is now the job of hordes of roaming single-issue fanatics, voting pages up and down. You could say the internet has returned to its primordial soup.

Read more

Kick me again, RIAA!

“The anti-copyright gaggle has an insatiable need to feel victimized. Injustice burns deep, and is triggered by the merest hint that ‘The Man’ might be tampering with one’s ‘bits’. Another example of technology utopians trying to bypass politics and claim victimhood – the ‘Net Neutrality’ campaign – shows very similar characteristics.” A while ago … Read more

The Tragedy of the Creative Commons

The Creative Commons initiative fulfilled a major ambition last week – but it’s taken only days for the dream to turn to crap.

Google granted the wish by integrating the ability to search images by rights licence into Google Image Search. Yahoo! has offered a separate Creative Commons image search for years, but Google has built the feature into its main index.

The idea of making the licences machine-readable was a long-standing desire of the project, and lauded as a clever one. It was intended to automate the business of negotiating permissions for using material, so machine would instead negotiate with machine, in a kind of cybernetic utopia. Alas, it hasn’t quite worked out.
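For the curious, ‘machine-readable’ in practice meant embedding the licence in page markup where software could find it – Creative Commons recommends a rel="license" attribute on the link to the licence deed. Here is a minimal sketch of the reading side in Python; the sample HTML and the LicenceFinder class are invented for illustration, and a real crawler would fetch pages over HTTP and cope with messier markup.

    # A minimal licence-sniffer: scan a page's markup for links tagged
    # rel="license", the Creative Commons convention for machine-readable
    # licences. Note it trusts whatever label the publisher applied.
    from html.parser import HTMLParser

    class LicenceFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.licences = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("a", "link") and attrs.get("rel") == "license":
                self.licences.append(attrs.get("href"))

    sample = '''<p>Photo by A. Nonymous, licensed under
    <a rel="license" href="http://creativecommons.org/licenses/by/2.0/">
    CC BY 2.0</a>.</p>'''

    finder = LicenceFinder()
    finder.feed(sample)
    print(finder.licences)
    # ['http://creativecommons.org/licenses/by/2.0/']

Note what the sniffer cannot do: it simply believes the label. Nothing in the markup proves the publisher had any right to apply that licence in the first place – which is precisely where things fell down.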

As Daryl Lang at professional photography website PDN writes, the search engine is now choked with copyright images that have been incorrectly labelled with Creative Commons licences. These include world-famous images by photographers including Bert Stern and Steve McCurry. As a result, the search feature is all but useless.

Since there’s no guarantee that the licence really allows you to use the photo as claimed, the publisher (amateur or professional) must still perform the due diligence they always had to. So it’s safer (and quicker) not to use the feature at all.

What’s gone wrong, as Lang explains, is the old engineering principle of GIGO, or Garbage In, Garbage Out:

“The system relies on Internet users to properly identify the status of the images they publish. Unfortunately, many don’t… Many Flickr users still don’t understand the concept of a Creative Commons licence, or don’t care.

“It’s time consuming to put a different label on every image [in their collection], and there are no checks in place [our emphasis] to hold users accountable for unauthorized copying or incorrect licensing labels.”

So Google won’t take responsibility for the accuracy of the licensing metadata, and Creative Commons, as a small private internet quango, says it can’t afford to. (The disclaimer on the website is simple: go find yourself a lawyer.)

Just as we predicted, in fact: the filtering is less than perfect, and it pays mere lip-service to creators. Now, why did it have to fail?

Read more

“A country bumpkin approach to slinging generalizations around”

Anderson plagiarism

WiReD magazine Editor-in-Chief Chris Anderson has copped to lifting chunks of material from Wikipedia and other sources, without credit, for his second book, Free. But it could be about to get a lot worse.

In addition to the Wikipedia cut’n’pastes, Anderson appears to have lifted passages from several other texts too. And in a quite surreal twist, we discover that the Long Tail author had left a hard drive backup wide open and unsecured for Google to index, then accused one of his accusers of “hacking”.

Does the WiReD editor and New Economy guru need basic lessons in how to use a computer?

Waldo Jaquith of Virginia Quarterly Review unearthed a dozen suspect passages after what he called “a cursory investigation”, and posted his findings here on Tuesday. Wikipedia entries for ‘There Ain’t No Such Thing as a Free Lunch’, ‘Learning Curve’ and ‘Usury’ had been pasted into Anderson’s book.

In addition to the Wikipedia citations, which Anderson reproduced with the errors intact (oops), Jaquith suggests he also lifted from an essay and a recent book. Presented with the evidence, Anderson blamed haste and (curiously) an inability to decide on a presentation format for citations for his decision to omit them altogether. Other examples were “writethroughs”, he said.

Then lit blogger Edward Champion documented several more examples which he says show

“a troubling habit of mentioning a book or an author and using this as an excuse to reproduce the content with very few changes — in some cases, nearly verbatim.”

Champion’s examples of churnalism include blog posts, corporate websites and (again) Wikipedia.

Read more

Newspapers: David Simon vs Google

Google, the nemesis of newspapers, was at Congress yesterday, to turn a blonde deaf ear to their troubles. The company’s pin-up VP of products Marissa Mayer described quite a bright future to the Senate’s commerce committee – but it’s a bright future for Google, and for people with a lot of time to fiddle with their computers. Also testifying was The Wire creator David Simon.

Let’s contrast how each of them addressed the crisis.

Mayer said Google’s policy “first and foremost” was to respect the wishes of content producers, but offered nothing in the way of new business partnerships. Instead, she gave them a short but haughty lecture on how they should present their stories – they should become more like Wikipedia:

“Consider instead how the authoritativeness of news articles might grow if an evolving story were published under a permanent, single URL as a living, changing, updating entity,” she said in her statement. “We see this practice today in Wikipedia’s entries and in the topic pages at NYTimes.com. The result is a single authoritative page with a consistent reference point that gains clout and a following of users over time.”

So instead of publishing 50 stories a day, the implication is that publications should only publish 50 a year – tweaking those 50 constantly, in the hope they wriggle up through the Google search results. Yes, that’ll fix things.

She also said they should offer more scope for mash-ups. At both ends of the news chain, then, you have people fiddling – instead of writing (at one end) and reading (at the other). That’s very Web 2.0, and you couldn’t get a clearer statement that Google doesn’t really understand what news is for. (It’s merely the stuff that goes between the BODY tags, silly.)

David Simon, creator of The Wire and a former reporter, said he found the phrase “citizen journalism” Orwellian. He added:

“A neighbor who is a good listener and cares about people is a good neighbor – he is not in any sense a citizen social worker. Just as a neighbor with a garden hose and good intentions is not a citizen firefighter. To say so is a heedless insult to social workers and firefighters.”

Read more

Charlie Nesson’s trip

L.S. and D.

Has Charlie Nesson been at the magic mushrooms again? The hippy head of the Berkman Center, the influential New Age techno-utopian think tank that’s attached to Harvard Law School, wants to enlist Radiohead in his fight against the Recording Industry Association of America (RIAA).

Nesson, a long-time opponent of creators’ digital rights, is contesting the statutory damages in infringement cases. A Boston graduate student called Joel Tenenbaum was ordered to reach a settlement with the record companies after being sued for copyright infringement, having shared files on the Kazaa P2P network back in 2003. Nesson’s strategy in Sony BMG Music vs Tenenbaum is to put the music business on trial. That’s fine – suing freetards isn’t going to stop P2P file sharing and it isn’t going to save the music business. It only adds to the anoraks’ persecution complex. Even the RIAA has now concluded it’s the wrong strategy.

But is Nesson the man to fight The Man? Nesson’s novel argument is that unlicensed P2P file sharing is “fair use”. Even his Harvard students, who are doing the work for him, think that’s a stretch. And maybe he doesn’t want to win, just to preen about in front of a camera. He wants it televised, he told Arse Technica, because:

“It’s like a reality show that we can all be participants in as we go along… It’s an incredibly powerful expansion of the idea of teaching.”

Read more