On “systems creativity”

This is pretty much what I’ve been talking about for the last few years, via Joe Burns.

A diagram showing the split in focus within agencies between Account, Strategy, and Creative teams - and how it's not as simple as that



The problem isn’t just that the old model doesn’t work in a more complex environment – it’s that the very terminology precludes understanding and alignment, as everyone has a different idea of what the labels mean.

The key to success has always been systems thinking – but many agencies (and even more so in-house marketing teams) continue to work in silos, with nowhere near as much discussion and collaboration as is needed to come up with truly effective approaches.

As Joe Burns put it in his post on this:

“Coherence has to come from the system, not just one execution. The idea of a ‘Campaign’ only works if you can muster a critical mass of attention to carry people through it.”

Maybe it’s my “content” background speaking – because really strong content strategies need to work at multiple levels, across multiple channels and formats, and for multiple audiences with multiple needs. Without understanding the big picture *and* the details, it’s impossible to deliver content effectively across a campaign – individual assets may be solid, but the whole ends up less than the sum of its parts.

This is why I’ll continue trying to play in those overlap areas – not only do I find the diversity and clash of approaches and ideas stimulating, but I see it as the only way to work out the best way to succeed. You have to try to see the big picture to work out the best individual brush strokes.

The impact of Meta’s Canadian news boycott

Facebook logo, with other Meta brand icons. Creative Commons license from Anthony Quintano on Flickr.

The decline in news audiences reported here – 43%, or 11 million daily views – is shockingly high. This follows Canada’s ill-considered battle with Meta, which led to Meta pulling news from its platforms, including Facebook, in the Canadian market last year, rather than arranging content licensing agreements with news publishers.

This amply demonstrates the vast power these tech platforms have in society and over the media industry, and so justifies the Canadian government’s worries. But it also more than shows – once again – how utterly dependent the online content ecosystem is on these channels for distribution.

Meta/Facebook obviously isn’t a monopoly, but a 43% decline in news consumption thanks to the shutting down of one set of distribution channels? It’s a safe bet that much of the rest of the traffic will be from Google, so it’s more of a duopoly.

What impact is this level of reliance on a couple of gatekeeping tech platforms – who can change their policies on a whim at any time – going to have on public awareness of current events and society at large?

Elsewhere in the article we have an answer: “just 22 per cent of Canadians are aware a ban is in place”.

Shut down access to news, and it’s little wonder that awareness of news stories stays low.

Both Canada (with Meta) and Australia (with Google and Meta) have tried forcing the tech giants into doing licensing deals for content that their platforms promote. In both cases, this has – predictably – backfired, and led to the opposite effect to that intended.

But what’s the solution?

This question is becoming more urgent now that GenAI is in the mix, and starting to provide summaries of stories rather than just a headline, image, and link.

Meta/Google were effectively acting like a newsstand – showing passing punters a range of headlines to attract their attention and pull in an audience.

GenAI’s summarisation approach, meanwhile, is much closer to what Meta and Google were being (unfairly) accused of doing by the Canadian and Australian governments: taking traffic away from news sites by providing an overview of the story on their own platforms.

But the GenAI Pandora’s Box has already been opened. Publishers need to move away from wishful thinking – the main cause of the failed Australian/Canadian experiments – and back to harsh reality.

Unlike the Meta news withdrawal – which could be reversed – this new threat to content distribution models isn’t going away.

On Perplexity’s content deal with WordPress

Perplexity logo

“If your website is referenced in a Perplexity search result where the company earns advertising revenue, you’ll be eligible for revenue share.”

How many qualifiers can be fitted into one sentence, all while providing next to no information?

To be clear, I’ve loved WordPress ever since I migrated my old blog to it [checks archives] *18* years ago [damn…]. I also fully get why they’re doing this – some money is better than none, it may work out, and it may actually lead to more traffic / engagement / visibility for WordPress sites.

But this all feels a little like promises of scraps falling from the table of people who are getting scraps falling from an even higher table.

Perplexity currently claims to be making US$20 million from paid subscriptions to its pro service – about the only source of income it currently seems to have, despite its $2.5-3 billion valuation. If they’re now giving away some of that limited income, I can’t see an obvious path to profitability, given the hefty running costs of GenAI.

This doesn’t just go for Perplexity, but for all these GenAI tools:

  1. What’s the path to a sustainable content-publishing business model (and all these GenAI companies are content companies) when the ability to produce infinite content on demand means the traditional route to making money for these kinds of companies – advertising inventory – is also infinite?
  2. Value comes from scarcity. Content as inventory is no longer scarce. How do you make something that isn’t scarce seem valuable enough to get people to pay for it?
  3. And when all GenAI models offer more or less the same output, and more or less the same level of reliability, and successful features and approaches can be replicated by the competition in next to no time, how do you stand out from the crowd?

Being a content/tech geek I’ve been thinking about this a lot over the last couple of years. Perplexity’s approach is one I like (I did history at university, so I love a good list of sources, even if they’ve mostly just been added to make your work look more credible and most of them are irrelevant, as is often the case with Perplexity) – but I’m far from convinced it has money-making potential. As Wired has put it, Perplexity is a bullshit machine. How valuable is bullshit?

Basically, we’re firmly in the destruction phase of creative destruction. The creative part is yet to come.

But still – at least the providers of the raw material these LLMs are so reliant on are starting to get thrown a few bones. That’s a step in the right direction – because as that recent Nature study made clear, the proliferation of AI-generated content risks surprisingly rapid synthetic data-induced model collapse.

Human-created content may no longer be king, but it remains vitally important. Without it – and a hefty dose of critical thinking – the whole system comes tumbling down.

The growing social media advertising boycott

The most surprising thing with this growing move away from social media advertising is that it has taken this long for brands to realise that they can’t control the context in which their adverts appear – and that context can change perception of their messaging.

The real lesson here is not that social media needs stricter controls (an ethical debate), it’s that in the classic Paid/Earned/Owned model, the *only* part brands can fully control is Owned. Many are only now beginning to wake up to the fact that their social accounts are not Owned platforms.

All this should have been obvious for years – every fresh story about an algorithm change destroying business models that were relying on social audiences has been an alarm bell. But perhaps now brands are finally realising that social isn’t as straightforward as they’ve long seemed to believe.

What does this mean for brands?

1) They need more robust, nuanced social strategies. Chucking money at paid posts and adverts doesn’t cut it. It never has.

2) The quality of their genuinely Owned platforms is becoming more important than ever. These are the only places they have complete control over the context and the message.

And it’s also notable that many brands joining the boycott have solid Owned strategies in place…

On the death of the cookie

This move will reshape the internet, and change how publishers, advertisers, brands and marketers operate.

“View-through attribution, third-party data, DMP and multitouch attribution will be ‘dead’ under the proposals. We’re now facing a world with significantly less measurement and targeting.”

What does this mean? Initial thoughts:

  1. Less audience targeting from 3rd party cookies => more need for audience insights from other data sources. Owned web properties will become more important.
  2. Google’s stranglehold on advertising will tighten, as Chrome will track engagement metrics instead.
  3. Throwing money at supposedly targeted distribution will stop appealing to advertisers, many of whom are already suspicious of the purported ROI of such campaigns.
  4. Digital ads we see will become less obviously personalised to us.
  5. Instead, marketing will need to work on its merits – attracting audiences via sustained campaigns based around creative concepts rather than algorithms.
  6. Yet another revenue source will be cut off for publishers, making it harder than ever to fund traditional journalism.
  7. This will in turn either open up more gaps for niche non-profit publishers (and brands) to fill, or lead to a decline in the amount of content produced.

Interesting times…

Review: Who Owns the Future?, by Jaron Lanier

4/5 stars

Impressively prescient, considering it was published five years ago but is about technology – something that’s been moving madly fast during that timeframe. The Facebook / Cambridge Analytica scandal was effectively predicted, and many of the debates still going on in business and government today – about things like the gig economy, autonomous vehicles and more – were anticipated and summarised before they’d really started happening. The impression is that Lanier had seen all this coming decades ago – and he probably did.

As such, lots here to spark thought, lots to be impressed with, and it’s hard to disagree with the central thesis that the informational economics of the internet age are fundamentally broken. But at the same time, the only alternative to the current way of doing things – micropayments for data exchange/generation – still seems insanely impractical, even employing a technology like blockchain (something similar to which Lanier kinda proposes here).

So while Lanier ends on an optimistic note, the book left me more pessimistic than ever about our tech-driven future.

Review: Saving the Media: Capitalism, Crowdfunding, and Democracy, by Julia Cagé

5/5 stars

A short, readable book, well worth a read for anyone interested in the media – specifically how to tackle the ongoing challenge of funding news, and the role of journalism in democracy.

The solution proposed for the ongoing challenges of monetisation and the maintenance of independence from vested interests is an interesting one. Plausible too – if governments can be persuaded that news is a public good, that is.

And even if you don’t buy into the news-as-public-good argument that underpins the entire thesis, along the way come a number of interesting – often surprising – nuggets about the media industry across various countries that make this worth a look by themselves. I was particularly intrigued by the finding that an increase in the number of newspapers leads to a decrease in democratic engagement – initially counterintuitive, but it makes perfect sense once explained.

The “Netflix of News” and the death of the publishing brand

I loved the concept when I first heard about it, and love that it seems to be working. Proof of concept done – now it’s time to take that concept and expand. Preferably globally.

In short, it’s a cunning system that allows you to pay for individual articles from publications, thus avoiding the constant frustration of not being able to read that great piece from the likes of the FT, Times or Economist because it’s hiding behind a paywall.

If this sort of thing takes off, it could be a whole new business model – making paywalls more viable, while allowing monetisable ways around them.

But there’s also an interesting quote from Blendle’s founder:

“People want to read articles or want to follow specific journalists but aren’t particularly interested in the newspaper that it comes from anymore.”

This is especially true in the age of social, where URL-shorteners are so endemic that half the time you have no idea which site you’ll end up on.

I’ve got used to reading content that’s been de-branded via a hefty RSS addiction. That’s been replaced in recent years with an addiction to aggregation apps like Zite, Flipboard and Feedly, where what matters is the content itself, not the packaging, or where it’s from.

If the content is good enough, it will stand on its own – it won’t need to hide behind the brand. In fact, the brand can sometimes be a disadvantage, because it leads to preconceptions that can skew the reader’s opinion before they’ve even started to read a piece. There are some publications I avoid simply because I assume that they have nothing to offer me, for reasons of politics, prejudices, or whatever – and I know I’m far from being alone in this.

If you removed the publication’s branding and presented me with their content as is, would my preconceptions be different? Of course. And if I liked the content, this could win them a new long-term reader.

Numbers are our friends

Useful look at how detailed, adaptable, *tailored* performance data (and people who know how to analyse and explain it) is essential if you want to be successful in modern media. As so often, Buzzfeed seems to be ahead of the curve.

It never ceases to amaze how often online publishers get het up about the wrong metrics. Tools like Omniture are obscenely powerful, yet all we tend to use them for is to find PVs, UUs, occasionally time spent, and sometimes how particular headlines are performing. Used properly, web analytics can help us keep our sites in a state of constant evolution, adapting to the tiniest shifts in user behaviour through minor design/code tweaks.

This isn’t about becoming Keanu Reeves and learning how to read the Matrix – it’s just knowing how to use the tools that are available to us.

Journalistic quality vs the money men

Useful study on metrics vs journalistic pride, but it leaves out a key aspect: the sales side – because that’s where the money is made, and where the metrics are ultimately determined.

Time was, quality audiences would be worth more to advertisers than quantity. Why hasn’t online ad selling (and buying) caught up yet? Only when it does will there be incentive to move beyond page views and unique users as the key metric. Programmatic ad sales could be the answer, or could worsen the situation further – too early to tell.

Anyway, worth a read:

“Online media is made of clicks.

Readers click from one article to the next. Advertising revenue is based on the number of unique visitors for each site. Editors always keep in mind their traffic targets to secure the survival of their publications. Writers and bloggers interpret clicks as a signal of popularity.

The economic realities underpinning the click-based web are well documented. Yet much work remains to be done on the cultural consequences of the growing importance of Internet metrics.

I conducted two years of ethnographic research (observing newsrooms and interviewing journalists, editors, and bloggers) exploring whether web analytics are changing newsroom cultures. The answer is a qualified yes, but in ways that differ from the ones we might expect.”

A golden age for journalism

Lots to agree with here:

“by some measures, journalism has never been healthier. And there’s every reason to believe that it is actually getting stronger because of the web, not weaker — regardless of what’s happening to print”

Are jobs being lost? Yep. Are publications shutting down? Yep. But are readers getting more of what they want? Yep.

My only worry with this optimistic take on the current situation is that, despite years of worrying about it, and over a decade of confident assertions that hyperlocal “citizen journalism” will fill the void left as uneconomic newspapers shut down, there is still a major risk that many communities will be left without a reliable source of local news coverage.

I’m based in London, so there are any number of hyperlocal Twitter accounts and small blogs covering the area, but none of these are comprehensive, even combined, and few have the skills or abilities to dig deeper into what’s going on in the local council. Local newspapers were never especially economically worthwhile, but they did (well, sometimes) provide a valuable public service in holding local government to account – something they were only really able to do because of the level of access they were afforded by their permanent, professional position.

On a local level, as local papers shut, the most common publication to fill the void isn’t a blogger, it’s an official local government publication – we’re replacing public service with propaganda.

Web writing, hate reading, and the decline of quality

Nothing new, but this is worth a read on web writing and hate-reading – that old trick of being as controversial as possible in order to provoke an extreme response, purely because extremes get more attention. In a pageview-driven business model, controversy is seen as good because, according to the metrics, it’s the controversial stuff that drives engagement.

This infantile attitude of provocation to get attention is increasingly being combined with ream upon ream of cheap content, because the more content you’ve got, the more potential PVs you can attract. We end up with the most depressing (and false) equation of online publishing:

Cheap content + Controversy = Clicks = Cash

It’s an attitude that’s lazy *and* massively short-termist in thinking – over the long term, quality can and should trump quantity. But even if it doesn’t, cheap, crappy content is a turn-off for audiences. The more sites that start to rely on hastily-produced, poorly-checked copy, or lazy semi-plagiarisms of things that desperate teams of poorly-paid hacks with deadlines and quotas to hit have found elsewhere, the less distinctive sites get, and the fewer returning visitors you’ll get. As that linked article puts it:

“With a business model based on a ton of cheap content, Web publishers can rely too heavily on acid-reflux-style aggregation, in which young writers destroy the savor of interesting stories and an interesting world by constantly regurgitating the news with added bile.”

There’s also an interesting point made by John Waters in the Irish Times (now behind a paywall), on the impact of comment sections under online articles:

“Because everything written specifically for online consumption is written in the expectation of addressing a hostile community, the writing process demands, as a prerequisite, either a defensive or antagonistic demeanor.”

Having learned my online publishing trade in the realm of message boards, chatrooms and blogs, I’m incredibly aware of the vast levels of bile that exist in comment sections. But it doesn’t have to be this way. With careful community management, it’s perfectly possible to build online communities that are supportive, friendly, and constructive, rather than the supposed default of objectionable and offensive. Check out the likes of b3ta, imgur and Metafilter for some prime examples of sites with vast *positive* communities of commenters. And then contrast those with the comments sections of pretty much any national newspaper site – packed with trolls and maniacs.

It doesn’t have to be this way.

The failure of the supermarket model of publishing

Fascinating, thought-provoking piece – another of those ones you come away from thinking “damn, that’s so obvious – why didn’t I make the connection before?” A few highlights:

Quality doesn’t mean popularity:

every single newspaper that I talk with. They are saying the same thing, which is that their journalistic work is top of the line and amazing. The problem is ‘only’ with the secondary thing of how it is presented to the reader.

And we have been hearing this for the past five to ten years, and yet the problem still remains. There is a complete and total blind spot in the newspaper industry that, just maybe, part of the problem is also the journalism itself.

Instead, they move the problem out of the editorial room, and into separate and isolated ‘innovation teams’… who are then charged with coming up with ideas for how to reformat their existing journalistic product in a digital way.

But let me ask you this. If The NYT is ‘winning at journalism‘, why is its readership falling significantly? If their daily report is smart and engaging, why are they failing to get its journalism to its readers?

If its product is ‘the world’s best journalism‘, why does it have a problem growing its audience?

Newspapers (and all-in-one-place sites) are an outdated concept:

No matter how hard they try, supermarkets with a mass-market/low-relevancy appeal will never appear on a list of the most ‘engaging brands’, or on list of brands that people love.

And this is the essence of the trouble newspapers are facing today. It’s not that we now live in a digital world, and that we are behaving in a different way. It’s that your editorial focus is to be the supermarket of news.

The New York Times is publishing 300 new articles every single day, and in their Innovation Report they discuss how to surface even more from their archives. This is the Walmart business model.

The problem with this model is that supermarkets only work when visiting the individual brands is too hard to do. That’s why we go to supermarkets. In the physical world, visiting 40 different stores just to get your groceries would take forever, so we prefer to only go to one place, the supermarket, where we can get everything… even if most of the other products there aren’t what we need.

It’s the same with how print newspapers used to work. We needed this one place to go because it was too hard to get news from multiple sources.

But on the internet, we have solved this problem. You can follow as many sources as you want, and it’s as easy to visit 1000 different sites as it is to just visit one. Everything is just one click away. In fact, that’s how people use social media. It’s all about the links.

One of the clearest examples of this is how the Washington Post is absolutely failing to engage people on YouTube. Every single day, they are posting a bunch of news videos about random things. Each video is well made (great production quality), but there is no editorial focus.

The result is this:

[Chart: view counts for the Washington Post’s YouTube uploads]

Here we have a large US newspaper that is barely reaching any people when it uploads a video to YouTube. And it’s not that the videos are uninteresting. There is one about iPhone cases that you can buy at the 9/11 museum (and the controversy of that), with only 687 views. There is a motivational speech (usually a popular thing to post on YouTube), with only 819 views. We have social tactics, like “5 awkward political fundraising moments”, with only 101 views.

Then we have a video by the super-popular George Takei that we all know from Star Trek. This is a person with millions of fans, but his video on Washington Post only attracted 844 views… in two weeks! If this had been posted by any Star Trek focused channel, this very same video would have reached 50,000 views, easy!

What the Washington Post is doing can only be described as a complete and total failure. It cannot get any worse than this.

Does internet advertising work?

This obsession with measurable outcomes from online advertising – something it’s impossible to do with TV or print or billboards – is idiotic. Advertising is about brand/product recognition and building familiarity/trust as much if not more than direct sales, and always has been.

This is a solid overview of the issues. Maybe, one day, the industry will wake up to its idiocy. Yes, detailed data is useful – but just because some things are measurable doesn’t mean everything is. A sale may be a long time in coming.

The power of Google

Another Google algo update and, as ever, original, interesting, useful content is key to SEO success.

The hit eBay’s taken is interesting, though… An 80% drop in Google traffic could be a business-killer for anyone smaller. And their content surely *is* original and relevant, what with the products changing all the time?

Possibly another impact from the authorship/Google+ changes the Google guys have introduced? After all, eBay product page writers are hardly likely to be verified Google+ authors. Is this why eBay are starting to invest in creating narrative content around their auctions?

Update: See also the ever-excellent Mathew Ingram on this, who points out the extremely worrying hit the long-running and much-loved Metafilter has taken:

“Reliant on Google not only for the bulk of its traffic but also the bulk of its advertising revenue, Metafilter has had to lay off almost half of its staff.”

The lesson?

Google can kill a site on a whim, and even the experts can’t tell us how or why, because Google’s algorithms are even more secret than the Colonel’s delicious blend of herbs and spices. Any site dependent on search for the bulk of its traffic is playing a very, very dangerous game.

Update 2: More detail on the Metafilter revenue/traffic decline, complete with stats.

The related power of Facebook to stifle updates from sources it has deemed suspect for whatever reason – and even the charts in the New York Times’ recently-leaked innovation report showing the decline of its homepage – makes an obvious cliché all the more true when it comes to web traffic: don’t put all your eggs in one basket.

If more than 25% of your traffic/revenue comes from one source, you’re in danger. More than 50%, you have a potential death sentence. All it takes is one thing to change, and you’re screwed.
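The rule of thumb above can be sketched in a few lines of code – a minimal illustration, where the function name and data shape are my own assumptions, with only the 25% and 50% thresholds taken from the text:

```python
# Sketch of the traffic-concentration rule of thumb: compute each
# source's share of total traffic and flag the biggest dependency.
# Thresholds: >25% of traffic from one source = danger; >50% = a
# potential death sentence, per the rule above.

def traffic_risk(visits_by_source):
    """Return (source, share, risk_level) for the largest referrer."""
    total = sum(visits_by_source.values())
    source, visits = max(visits_by_source.items(), key=lambda kv: kv[1])
    share = visits / total
    if share > 0.50:
        risk = "potential death sentence"
    elif share > 0.25:
        risk = "danger"
    else:
        risk = "acceptable"
    return source, share, risk

# Example: a site getting most of its visits from search.
monthly_visits = {"google": 60_000, "facebook": 25_000, "direct": 15_000}
print(traffic_risk(monthly_visits))  # ('google', 0.6, 'potential death sentence')
```

The same check works just as well on revenue by source – the point is simply to make the dependency visible before the algorithm changes, rather than after.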

Is collaborative newsgathering the way forward?

Being a foreign affairs geek, the decline of overseas bureaus has long been a concern.

Yes, the web could mean that information from overseas is easier to access and verify remotely than ever before (see the success of the Dublin-based Storyful in rapidly verifying UGC from all over the world), but having your own trusted people on the ground? Surely that’s an advantage?

Well, yes and no. A correspondent can’t be everywhere at once. In a fast-moving situation like the one ongoing in Ukraine, with so many unverified stories and deliberate falsehoods and fabrications circulating, this becomes even more of a problem.

And so the just-announced Ukraine Desk collaboration between Vice, Quartz, Mashable, Digg, Mother Jones, and BreakingNews.com – pooling their on-the-ground resources to improve the reliability of their information – is a fascinating one, which I’ll be following with interest (in both the subject and the process).

Could collaborative newsgathering and media coalitions be a way to break down the economic challenges of having reporters on the ground?