by JCM | 6 Sep, 2014 |
This is exactly why I always try to muscle in on the design process: “what are the ethics of platform design? …When designers create a… news app, they aren’t just designing software. They are creating a platform that participates in constructing an *idea* of news.”
by JCM | 6 Sep, 2014 |
The filtered feeds of Facebook (and LinkedIn) are the things I dislike most about them, while the unfiltered, most-recent-first approach of Twitter is what I love about it, so the possibility that Twitter is going down the algorithmic-filter route worries me – and not just because of recent concerns over how algorithms can affect net neutrality and news reporting.
I very much hope Twitter at least retains the option of turning on the firehose, though I fully get the need to tame the chaos with some kind of algo or filter to pull in new users. Not everyone can get to grips with lists and Tweetdeck – too confusing for the newcomer.
Now don’t get me wrong: algorithmic filtering has its place. One of my favourite apps is Zite, and I was an early adopter of StumbleUpon (well over a decade ago) – precisely because of their ability to get to know my interests and serve up interesting content from sources I’d usually not discover by myself. For Facebook to offer this kind of service, with its vast database of users’ Likes, makes perfect sense (though I’d still prefer a raw feed, or category feeds, so I can split off news about the world from news about my actual friends – a new baby or a wedding is not the same as a terrorist attack).
This is why I love Twitter – it is raw, unfiltered. And at 140 characters a pop, it’s (more or less) manageable. Especially if these old stats are still accurate, suggesting the majority of Twitter users only follow around 50 other accounts. If you end up following a few hundred, you’re already a power user, and likely know how to order them via lists. If you end up following a few thousand, then frankly you no longer care if you miss a few things.
Could Twitter be improved with a bit of algo? For sure. Why am I only ever shown three related accounts when I follow a new one? Why isn’t MagicRecs built in?
But the fact is we’ve already got this option on Twitter – it’s called the Discover tab. And I never use it, because it somehow manages to feel even more random than the raw feed. The problem isn’t a lack of algorithms, it’s a lack of intelligent algorithms, intelligently integrated.
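To make that last point slightly more concrete, here’s a toy sketch (in Python, and entirely hypothetical – it’s not how Twitter, Zite or StumbleUpon actually rank anything) of what a more intelligent, more integrated filter might look like: score items against a user’s interests, but keep recency in the mix rather than throwing the chronological feed away.

```python
# Purely illustrative interest-aware ranking -- an assumption-laden toy,
# not any real product's algorithm. Each item is scored by overlap with
# the user's interests plus a recency decay, so the raw chronological
# order still counts for something.

from datetime import datetime, timezone

def score_item(item, interests, now, half_life_hours=6.0):
    """Topical overlap plus a simple recency decay."""
    overlap = len(set(item["topics"]) & interests)
    age_hours = (now - item["posted_at"]).total_seconds() / 3600.0
    recency = 0.5 ** (age_hours / half_life_hours)
    return overlap + recency

def rank_feed(items, interests, now=None):
    now = now or datetime.now(timezone.utc)
    return sorted(items, key=lambda i: score_item(i, interests, now), reverse=True)

if __name__ == "__main__":
    interests = {"media", "analytics"}
    now = datetime(2014, 9, 6, 12, 0, tzinfo=timezone.utc)
    feed = [
        {"id": 1, "topics": ["media", "paywalls"],
         "posted_at": datetime(2014, 9, 6, 8, 0, tzinfo=timezone.utc)},
        {"id": 2, "topics": ["football"],
         "posted_at": datetime(2014, 9, 6, 11, 0, tzinfo=timezone.utc)},
    ]
    for item in rank_feed(feed, interests, now):
        print(item["id"], item["topics"])
```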
by JCM | 4 Sep, 2014 |
I loved the concept when I first heard about it, and love that it seems to be working. Proof of concept done – now it’s time to take that concept and expand. Preferably globally.
In short, Blendle is a cunning system that allows you to pay for individual articles from publications, avoiding the constant frustration of not being able to read that great piece from the likes of the FT, Times or Economist because it’s hiding behind a paywall.
If this sort of thing takes off, it could be a whole new business model – making paywalls more viable, while allowing monetisable ways around them.
But there’s also an interesting quote from Blendle’s founder:
“People want to read articles or want to follow specific journalists but aren’t particularly interested in the newspaper that it comes from anymore.”
This is especially true in the age of social, where URL-shorteners are so endemic that half the time you have no idea which site you’ll end up on.
A hefty RSS addiction got me used to reading content stripped of its branding. In recent years that’s been replaced by an addiction to aggregation apps like Zite, Flipboard and Feedly, where what matters is the content itself, not the packaging or where it came from.
If the content is good enough, it will stand on its own – it won’t need to hide behind the brand. In fact, the brand can sometimes be a disadvantage, because it leads to preconceptions that can skew the reader’s opinion before they’ve even started to read a piece. There are some publications I avoid simply because I assume that they have nothing to offer me, for reasons of politics, prejudices, or whatever – and I know I’m far from being alone in this.
If you removed the publication’s branding and presented me with the content as is, would my preconceptions be different? Of course. And if I liked the content, that could win them a new long-term reader.
by JCM | 2 Sep, 2014 |
Useful look at how detailed, adaptable, *tailored* performance data (and people who know how to analyse and explain it) is essential if you want to be successful in modern media. As so often, Buzzfeed seems to be ahead of the curve.
It never ceases to amaze me how often online publishers get het up about the wrong metrics. Tools like Omniture are obscenely powerful, yet all we tend to use them for is finding page views, unique users, the occasional time-spent figure, and sometimes how particular headlines are performing. Used properly, web analytics can help us keep our sites in a state of constant evolution, adapting to the tiniest shifts in user behaviour through minor design/code tweaks.
This isn’t about becoming Keanu Reeves and learning how to read the Matrix – it’s just knowing how to use the tools that are available to us.
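For what it’s worth, here’s a minimal sketch of the sort of “beyond page views and unique users” analysis I mean. The event-log format is invented for illustration – in practice you’d export something equivalent from Omniture/Adobe Analytics or your own server logs – but the point stands: loyalty and attention metrics are only a few lines of code away once you have the raw data.

```python
# Hypothetical example: compute loyalty and attention metrics from a raw
# event log rather than stopping at page views and unique users.
# The field names ("url", "user", "seconds_on_page") are assumptions,
# not any particular analytics tool's export format.

from collections import defaultdict

def per_page_metrics(events):
    views = defaultdict(int)
    users = defaultdict(set)
    dwell = defaultdict(list)
    for e in events:
        views[e["url"]] += 1
        users[e["url"]].add(e["user"])
        dwell[e["url"]].append(e["seconds_on_page"])
    report = {}
    for url in views:
        times = sorted(dwell[url])
        report[url] = {
            "page_views": views[url],
            "unique_users": len(users[url]),
            "views_per_user": views[url] / len(users[url]),  # loyalty, not just reach
            "median_seconds": times[len(times) // 2],        # attention, not just clicks
        }
    return report

if __name__ == "__main__":
    sample = [
        {"url": "/news/a", "user": "u1", "seconds_on_page": 95},
        {"url": "/news/a", "user": "u1", "seconds_on_page": 40},
        {"url": "/news/a", "user": "u2", "seconds_on_page": 120},
        {"url": "/news/b", "user": "u3", "seconds_on_page": 5},
    ]
    for url, metrics in per_page_metrics(sample).items():
        print(url, metrics)
```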
by JCM | 29 Aug, 2014 |
*cue thousands of SEO experts desperately trying to work out what this means*
by JCM | 29 Aug, 2014 |
Useful study on metrics vs journalistic pride, but it leaves out a key aspect: the sales guys – because that’s how the money is made and how the metrics are ultimately determined.
Time was, quality audiences were worth more to advertisers than sheer quantity. Why hasn’t online ad selling (and buying) caught up yet? Only when it does will there be an incentive to move beyond page views and unique users as the key metrics. Programmatic ad sales could be the answer, or could worsen the situation further – too early to tell.
Anyway, worth a read:
“Online media is made of clicks.
Readers click from one article to the next. Advertising revenue is based on the number of unique visitors for each site. Editors always keep in mind their traffic targets to secure the survival of their publications. Writers and bloggers interpret clicks as a signal of popularity.
The economic realities underpinning the click-based web are well documented. Yet much work remains to be done on the cultural consequences of the growing importance of Internet metrics.
I conducted two years of ethnographic research (observing newsrooms and interviewing journalists, editors, and bloggers) exploring whether web analytics are changing newsroom cultures. The answer is a qualified yes, but in ways that differ from the ones we might expect.”
by JCM | 27 Aug, 2014 |
Twitter Analytics will be fun and useful, but why no way to sort by best/worst performers? How can we tell what does/doesn’t work if we can’t see what does/doesn’t work? Intro here. The analytics themselves are here (you need to activate them before you’ll start seeing stats).
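In the meantime, the sorting the dashboard won’t do is easy enough to do yourself against an exported CSV. A rough sketch below – the column names (“impressions”, “engagements”, “Tweet text”) are my assumptions about the export, so check them against the actual file:

```python
# Rank exported tweet stats by engagement rate, best and worst.
# Column names are assumed; adjust to match the real CSV headers.

import csv

def best_and_worst(csv_path, n=10):
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for r in rows:
        impressions = float(r.get("impressions") or 0)
        engagements = float(r.get("engagements") or 0)
        r["engagement_rate"] = engagements / impressions if impressions else 0.0
    ranked = sorted(rows, key=lambda r: r["engagement_rate"], reverse=True)
    return ranked[:n], ranked[-n:]

if __name__ == "__main__":
    best, worst = best_and_worst("tweet_activity.csv", n=5)
    for r in best:
        print(f'{r["engagement_rate"]:.2%}  {(r.get("Tweet text") or "")[:60]}')
```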
by JCM | 26 Aug, 2014 |
Oh yes please!
I’m a massive CMS geek, yet in well over a decade and a half of online publishing, I still haven’t found one I truly adore. Mid-period WordPress came close, but now it’s too complex and chunky. Buzzfeed’s seems decent, from what I’ve seen. The one they have at ITV News looks great, from the screenshots. But I hear truly great things about the Vox Chorus CMS.
Want!
by JCM | 25 Aug, 2014 |
Lots to agree with here:
“by some measures, journalism has never been healthier. And there’s every reason to believe that it is actually getting stronger because of the web, not weaker — regardless of what’s happening to print”
Are jobs being lost? Yep. Are publications shutting down? Yep. But are readers getting more of what they want? Yep.
My only worry with this optimistic take on the current situation is that, despite years of hand-wringing and over a decade of confident assertions that hyperlocal “citizen journalism” would fill the void left as uneconomic newspapers shut down, there is still a major risk that many communities will be left without a reliable source of local news coverage.
I’m based in London, so there are any number of hyperlocal Twitter accounts and small blogs covering the area, but none of these are comprehensive, even combined, and few have the skills or abilities to dig deeper into what’s going on in the local council. Local newspapers were never especially economically worthwhile, but they did (well, sometimes) provide a valuable public service in holding local government to account – something they were only really able to do because of the level of access they were afforded by their permanent, professional position.
On a local level, as local papers shut, the most common publication to fill the void isn’t a blog, it’s an official local government publication – we’re replacing public service with propaganda.
by JCM | 24 Aug, 2014 |
* assuming you don’t read anything else about clickbait today
This article focuses on content produced by content marketers, but it applies just as much to regular publishers who are constantly trying to ride the latest wave of social-media fads to suck in a few unsuspecting punters with low-rent, instantly forgettable clickbait. Short, cheap, trend-driven, fast-turnaround content may well help you hit short-term engagement metrics, but in the long term it will kill audience retention:
“The internet is ballooning with fluff, and bad content marketing is to blame. In our obsession with ‘engaging’ our ‘audience’ in ‘real-time’ with ‘targeted content’ that goes ‘viral,’ we are driving people insane… When a publishing agenda is too ambitious, people can’t afford to shoot anything down… They’re under too much pressure to fill… slots”
I particularly like the concept of “click-flu” – the sense of annoyance and disappointment you get (both with the content and, more importantly, with yourself) when you click on a clickbaity link, and the page you end up on fails to deliver on its hyperbolic promise. The resentment builds and builds – and over time, leads to hatred of the people who lured you in time and again.
If you make a big promise, as so many of these “This is the most important thing you will see today” clickbaity headlines do, you’d damned well better live up to it.