I paid to have my latest Wired story promoted on social networks like Twitter and Facebook to try to show that a lot of the metrics* we use to measure a story’s success are bullshit. It worked. When the story went live today, the page appeared with more than 15,500 links on Twitter and 6,500 likes on Facebook. The story is part of Wired’s Cheats package for the latest issue of the magazine. It needed to go live online at the same time readers encountered it in print, and it needed to have all those social shares set up in advance.
…
The entire package was going live at once. I could publish my story a little bit early, but the timing needed to be very close. I wanted all the public-facing stats (like the 15,000 links on Twitter and 6,000 Facebook shares) to be live by the time the text appeared. Certainly, if someone found it in print or on the tablet, those metrics needed to already be there. To make that happen, we cheated.
…
This morning (or last night), a little after 1 am, I added the story text, set the timestamp to the current time, and hit update. Now it showed up in RSS readers and I could openly tweet it from my main account. (I had originally linked to it from a secondary Twitter account I keep for testing third-party stuff, to score retweets.)
So now the story goes “live” and, as if by magic, it has tens of thousands of social shares listed on it the instant real people start to encounter it. It worked.
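(The reason this works at all is that the share counters on a page are just lookups keyed to the URL, not to the publish date, so anything accumulated against that URL before launch shows up the moment the buttons render. Here is a minimal sketch of that kind of lookup, assuming Python with the requests library and Facebook’s Graph API URL-engagement query; the article URL and access token below are placeholders, not anything involved here.)

    import requests

    # Placeholder values, not the real article or a real token.
    ARTICLE_URL = "https://www.wired.com/some-story/"
    ACCESS_TOKEN = "YOUR_APP_ACCESS_TOKEN"

    # The Graph API reports engagement for a URL object. The count is
    # keyed to the URL alone, so shares racked up before a story is
    # publicly promoted still show up the instant anyone checks.
    resp = requests.get(
        "https://graph.facebook.com/v19.0/",
        params={
            "id": ARTICLE_URL,
            "fields": "engagement",
            "access_token": ACCESS_TOKEN,
        },
    )
    resp.raise_for_status()
    engagement = resp.json().get("engagement", {})
    print(engagement.get("share_count", 0))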
*As is site traffic, to a very large extent. My original idea was to use a botnet to throw traffic at it, but Wired’s lawyers said “no, no. Don’t do that.”
And, of course, people tend to associate lots of shares with an article’s significance or influence. Consequently, by ‘cheating’ ahead of time, a content owner can lend false gravitas to the content in question. I’m curious how search companies that rely in part on social signals to surface content deal with this kind of ‘hacking the social.’