This. My biggest data lessons from 25 years in digital publishing and marketing, to add to the efficiency/effectiveness debate:
1) There’s an important distinction between being data-driven and data-informed; more organisations need to lean towards the latter, because…
2) No numbers mean anything without context – almost everything measurable needs multiple other datapoints, timescales, and points of comparison to have any meaning
3) Most data tracked by marketing departments are vanity metrics with almost zero long-term value for the business as a whole
4) Pick the wrong KPIs (pageviews being the most obvious, revenue growth perhaps the least) and you’re more likely to harm the business than help it, by focusing on improving the *indicator* rather than business-wide performance, because…
5) Almost every metric can be gamed or significantly impacted by outliers or picking the wrong points of comparison, but…
6) Not enough people check to see if this is what’s happening, especially if the results are looking good
7) Equally, just because you *think* you can measure something doesn’t mean this is what you’re actually measuring, or that it’s helpful to do so, but…
8) Tables of numbers and nice pretty charts (especially with trend lines) are addictive, while cross-referencing multiple metrics and trying to make sense of it all is difficult – not helped by most of the tools available being deeply unintuitive, so…
9) Most laypeople don’t bother asking about the methodology for fear of looking stupid, and just nod along, so…
10) Keep on questioning the data – who compiled it, how, when, where, why, and what could we be missing? Data interpretation is as much art as science – the more we question what we’re seeing, the more likely it is someone will have one of those sparks of inspiration that help you find something genuinely meaningful
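To make points 5 and 6 concrete, here’s a minimal sketch – with entirely made-up pageview figures – of how a single outlier can wreck a headline number. It’s the kind of check worth running before anyone celebrates a “record average”:

```python
# Hypothetical daily pageview figures: nine ordinary days plus one viral spike.
# A single outlier drags the "average" far from typical behaviour.
daily_pageviews = [1_000, 1_100, 950, 1_050, 1_020, 980, 1_010, 1_040, 990, 250_000]

mean = sum(daily_pageviews) / len(daily_pageviews)

def median(values):
    """Middle value of a sorted list - far more resistant to outliers."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(f"mean:   {mean:,.0f}")                      # dominated by the one viral day
print(f"median: {median(daily_pageviews):,.0f}")   # closer to a "typical" day
```

The mean says traffic is up twenty-five-fold; the median says almost nothing has changed. Neither number is “wrong” – which is exactly why a single figure without context is so dangerous.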
There’s some fascinating stuff in this SEO long read, based on impressive research and analysis. Just bear in mind that, as leaked Google documents put it, “If you think you understand how [search algorithms] work, trust us: you don’t. We’re not sure that we do either.”
To save you time, the main lesson is that “achieving a high ranking isn’t solely about having a great document or implementing the right SEO measures with high-quality content”. Search results shift in near realtime based on thousands of utterly opaque, interconnected assessments of obscure demand and user intent signals, so there’s only so much website managers can do.
For me, this all confirms a few core content principles:
Context is king, not content. You can have an amazing page full of astounding insight, but if it doesn’t clearly meet the needs of the user at that moment in time, it will go unviewed.
Page structure is at least as important as substance – if (human and bot) audiences can’t quickly tell that your page is interesting and relevant, they’ll bounce.
But don’t worry – the key to success is rarely going to be a single webpage. More important is the authority of the domain and brand.
This means the impact of content is at least as much about cumulative brand building as it is immediate engagement. Think of the long tail, not just the short spike – and focus your content strategy on building this long-term growth over the short-term quick hit.
Given so much about how this works is unknown, and so many factors are outside your control, it’s best not to over-think it. Follow all the advice SEO experts offer, and you’ll end up with something so over-engineered it’ll lose its coherence and flow. This will increase bounce rates.
So how to succeed?
Go back to basics: Focus on ensuring your content fulfils a clear audience need (ideally one currently unmet by other sources), using the language audiences are searching for, presented in ways audiences are likely to engage with, and with clear links to and from other relevant content to help both humans and bots understand its relevance within the broader context.
In other words, SEO may be complex when you dig into the details – but it’s really just a combination of common sense, long-term authority building, and a good bit of luck.
As the FT points out, big tech has so much data on us that surely ad targeting should be good by now?
The real solution to increasing your chances of reaching the right people isn’t marketing automation, it’s user experience. One’s a tactic, the other’s a strategy.
After all, if even Facebook struggles to identify audience interests with any degree of accuracy, what hope do more limited platforms have?
The risk isn’t just that you’re wasting your paid media spend on micro-targeting, it’s that you’re wasting your production budget producing multiple variants of marketing content for audiences that may never see your material. It’s lose-lose.
The magic bullet isn’t audience segmentation in promotion plans – it’s focusing on your audiences’ interests in the content and messaging development phase. This helps ensure what you’re saying (and how you’re saying it) can appeal to multiple target groups at the same time – from niche to broad. Then you can let your audiences self-select the next step on their customer journey via clear signposting of where to go to find what they want.
One size may not fit all perfectly, but with a skilled tailor one size can be given the *illusion* of fitting all. People will pay attention to the things they’re interested in, not the things they aren’t. Which makes people far more capable of deciding what’s relevant to them than any algorithm.
Seeing this graphic doing the rounds. Pretty. Still, call me a cynic, but:
1) [citation needed] – the full graphic lists multiple top-level sources, but without details – what were the exact sources? What methodology did each of them use to gather this data? How credible is this information?
2) So what? What useful insight do these lump sums tell us without context? Most of the numbers are random, unrelated big figures, so how does this help us understand the world? What are the trends? What’s the insight?
This is superficially a great bit of marketing, as it’s getting shared a lot and is designed to promote a company flogging a data analytics platform. But there’s no further detail on their site, which is a masterclass in promising a lot (e.g. “Solve back-end integration of any data, at cloud scale, without moving data”) without actually saying or revealing anything about how their tools actually work. To find out more, you need to give them your contact details.
For true data geeks, as for ex-journalists like me, alarm bells start going off at this point:
– Data without context is meaningless
– Single data points don’t equal insight
– Data needs to be well sourced to warrant trust
– Don’t give away your data if you don’t know what you’re getting
This long piece neatly sums up the paradox of the age of algorithmic analytics:
“Algorithms that tell us which topics are trending don’t merely reflect trends; they can also help create them…
“The internet has shown us that the oddest of subcultures and smallest of niches can develop followings… I don’t think readers weren’t interested. It’s that they were told not to be interested. The algorithms had already decided my subjects were not breaking news. Those algorithms then ensured that they would never be.”
This approach of following your analytics is a *terrible* content strategy. By pursuing a mass audience and popularity above all, same as everyone else, you’re doomed to lose your distinctiveness – and relevance to your true target audiences. Even though the algorithms supposedly love relevance above all, they’re still (usually) not sophisticated enough to identify your priority audiences among all those visits.
This is why we’re seeing so many traditional publications fail, and ad revenues collapse: They’ve all become alike, because the algorithms have told them all the same things. That’s made them less valuable, in terms of both price and utility.
Don’t get me wrong: audience analytics are essential. But you need to know how to read them – and their limitations.
Everyone’s going to be sharing this NYT piece on location data – and rightly so. Scary stuff, with some superb journalism backed up with excellent presentation that should make the telecoms, tech and advertising industries (as well as regulators) all take a good hard look at themselves.
But the real challenge (and huge opportunity) is finding ways to enable safe sharing of this kind of data without impinging on privacy or personal security. Because – even anonymised – this kind of data can lead to insanely useful insight that goes far, far beyond serving up targeted advertising:
“Researchers can use the raw data to provide key insights for transportation studies and government planners. The City Council of Portland, Ore., unanimously approved a deal to study traffic and transit by monitoring millions of cellphones. Unicef announced a plan to use aggregated mobile location data to study epidemics, natural disasters and demographics.”
This isn’t a problem with the concept of location tracking. It’s a problem with the execution.
The New Statesman has a long piece on the ongoing slow death of the advertising industry, with some fun distinctions between the ad industry (creative, visionary) and the ad business (dull, obsessed with data).
Can you guess which part the person who wrote it comes from?
Of course, the simple response to the majority of the article’s debate about whether high-impact artistic visions or hyper-efficient attempts to ensure relevancy are the best way forward is: why not both?
But while there’s much to disagree (and agree) with throughout, it was this particular passage that sparked a realisation about the real challenge for the marketing industry:
“Now that people carry media around with them everywhere, advertisers have less incentive to create memorable brands. Instead, they concentrate on forcing our attention towards the message or offer of the moment. The ad business doesn’t care about the future of its audience, only its present.”
This, in the context of modern ad microtargeting and algorithms (as well as the general proliferation of TV channels, streaming video, and the decline in newspaper readership), is kinda true – with no clear way to ensure a follow-up interaction, the classic old ad model of getting a message in front of someone eight times (or whatever) until it sticks is no longer as straightforward as it once was. Even if you succeed, it’ll be by using cookies to track someone across multiple sites, firing the same advert at them so relentlessly that it seems desperate – and obvious.
But the obsession with the fast-paced present also shows how many marketing campaigns continue to utterly miss the point of social media.
The clue’s in the name
Social – done properly – *isn’t* simply of the moment, as much as it’s often dismissed as ephemeral.
To think of social posts as throw-away one-offs, as much marketing does, is like viewing a single frame of a film that’s designed to be watched at 24 frames per second. It’s like the blind men and the elephant – you may *think* you know what’s going on, and how your audience is responding, but you’re not seeing the whole (motion) picture.
Yes, a single tweet or Facebook post *can* work in isolation. It can have impact. A person with a couple of hundred followers can see something they post go viral and reach hundreds of thousands of likes. An influencer can amplify it to the point the original poster can monetise that single moment, or use it as the starting point to become an influencer in their own right.
But the clue’s in the name – social is *social*. It’s about relationships, not one-off interactions. And the internet is the same – again, the clue’s in the name. It’s a network. It’s interconnected. Nothing online operates in isolation.
This is why an approach to online advertising that thinks only about the advert – in isolation – is always going to be doomed to fail. (And yes, if your social media post or article or video or whatever is put out on a schedule to broadcast to your followers – whether you put paid behind it or not – if you have no plan or resources to follow up and respond to the replies, then all it is is an advert.)
Even if you aggregate all your social data to see trends over time, you may *think* you’re seeing the big picture – but you’re not seeing it from the perspective of your audience. You’re lumping them together as stats, when in reality they’re all individuals – each having a distinct interaction with your brand. The long-term trends hide the fact that your audience is not always the same audience – different people will see different posts at different times, and many won’t see some of what you’re putting out at all. This means they’ll all be getting different impressions of what it is you’re about.
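A trivial sketch of that point, using made-up users and posts: two impression logs that produce identical aggregate stats while describing completely different audience experiences:

```python
from collections import Counter

# Hypothetical impression logs: which (illustrative) user saw which post.
# Both scenarios produce the same aggregate totals: 6 impressions, 3 posts.
scenario_a = [("alice", "p1"), ("alice", "p2"), ("alice", "p3"),
              ("bob", "p1"), ("bob", "p2"), ("bob", "p3")]
scenario_b = [("alice", "p1"), ("bob", "p2"), ("carol", "p3"),
              ("dave", "p1"), ("erin", "p2"), ("frank", "p3")]

def aggregate(log):
    """The dashboard view: total impressions and distinct posts served."""
    return {"impressions": len(log), "posts": len({post for _, post in log})}

def posts_seen_per_user(log):
    """The audience view: how much of the series each individual saw."""
    return dict(Counter(user for user, _ in log))

print(aggregate(scenario_a) == aggregate(scenario_b))  # identical headline stats
print(posts_seen_per_user(scenario_a))  # two people followed the whole series
print(posts_seen_per_user(scenario_b))  # six people each caught one fragment
```

Scenario A is two people following the whole story; scenario B is six people each getting a single, disconnected impression of what you’re about. Your aggregate dashboard can’t tell them apart.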
I remember when all this were fields…
When I started playing about in IRC and messageboards in the 90s, it took months to be recognised as a regular. When I started blogging in the early 2000s, it again took months to build a following and reputation.
And that’s months of multiple posts a day. Multiple replies to comments. Discussions. Following commenters back to their own blogs and reading *their* stuff. Getting a sense of how they thought.
This was all pre-Twitter, pre-Facebook – but post-IRC, and after messageboards, MSN Messenger and the like had become passé. We’d encounter each other on other people’s blogs and in their comment sections; notice we were discussing the same things via trackbacks and RSS aggregators (after 2004 or so); check now-defunct sites like Technorati and IceRocket to find others covering the same topics (because Google was still rubbish for realtime search back then); and occasionally email each other directly.
Looking beneath the surface
The public face of blogging was our individual blogs. The individual posts. But those were just the tip of the proverbial iceberg – the starting points for interactions between blogger and reader that in some cases have lasted years. Some of the people I met virtually through my various blogs have become real-life friends. Some discussions inspired people to take up blogging for themselves, or to pursue different careers. Some of those interactions even led to real-world, paid work (as they did for me – which, in turn, led to my transition from print journalism to digital, and from there to my current role developing multiplatform, multimedia digital marketing strategies).
All these deep, lasting, sometimes life-changing relationships started with a connection around shared interests – just as, today, algorithms try to match adverts to people who may be interested in them. Superficially, to anyone looking from outside, those initial interactions in the comment sections under individual posts would have looked like that was all there was. If you’d looked at the stats on our blogs, the numbers would have looked *tiny*.
But the *real* story was the ongoing conversations and subconscious assimilation of each others’ ideas. The discussions and collaborations that stretched over months, and led to the short-lived rise of group-blogs, real-world meet-ups, grand plans that (in my case at least) never quite came to fruition. It was about the relationships and trust we built up over time.
The *real* impact took *years*, and in some cases was more significant than any of us ever imagined when we first put finger to keyboard.
How humans work
We’re all humans. We latch onto stories. We need big ideas. Emotional connections. Things to inspire and entertain. Things that speak to our gut instincts as well as to our heads. We’ve all read Daniel Kahneman, and know these heuristics are classic marketing creative territory.
And yes – as we’re humans we can also be manipulated if we’re targeted with the right message at the right time. Some of us will be more susceptible to some messaging than others. We will all have slightly different interests, meaning you can’t speak to us all in the same way. So a data-driven approach makes sense to try and finally give some clarity to John Wanamaker’s classic “Half the money I spend on advertising is wasted” conundrum.
But where big idea creative can attract attention, and data-driven targeting can increase relevance, what’s still missing for many brands is the follow-up. The vital thing that comes next.
In some cases this is where CRM comes in – but I can tell you from my blogging and chatroom days, in most cases being overly keen to initiate a conversation is going to have precisely the opposite response from the one you want. No one wants a pop-up window asking if they want help the second they land on a site any more than they want cookie notifications or requests to turn off their adblocker. Overly keen CRM = instant bounce, often with feelings of mild violation and anger. Not great for the start of a relationship. There’s a reason Microsoft killed Clippy…
My point? Let your audience go at their own pace
The reason the brief Golden Age of blogging (from around 2003 to 2006, by my reckoning) led to so many strong, lasting relationships is that those relationships could be built at our own pace.
There was no realtime chat. There was no “unread” notification to put pressure on us to respond unless and until we were ready. We all gradually built up archives of work that our readers and fellow bloggers could all check out at their leisure to get a sense of who we were and what we stood for. We linked to our past work – and each other – where relevant, showing how our thinking was developing over time, and allowing others to follow our trains of thought at their own pace to catch up and join in the conversation.
So when you encountered an unfamiliar blog or blogger – which was frequently – you could dip your toe in, test the water, and go back and check the context before engaging only when you had an idea what you were going to get involved in.
It was a slower-paced, more civilised way of communicating online that the likes of Twitter seem to have permanently destroyed with the constant need for instantaneous responses to everything.
But today’s pressure to live in the moment and make instant decisions is deeply off-putting. It’s not how people like to work. It’s not how any successful relationship has ever been built. It goes against all the instincts of the high-pressured world we’re now in to say so, but today’s emphasis on the hard sell and the call to action – not just the obvious “BUY NOW!” but also the more subtle “CLICK HERE TO…” and “FIND OUT HOW…” – may deliver a short-term nudge, but not long-term engagement.
Engagement – true, lasting engagement – comes through recognition, familiarity, and trust. This can only ever be built over time – often a long time. It will never come through a hard sell, and rarely through a single call to action.
In short:
Rather than worry about big ideas vs targeting, what the marketing industry really needs to learn how to do is revive the art of the soft sell and the long tail. That’s the more human way of building relationships that last – but to work it needs a significantly more nuanced understanding of how people will be interacting with you than I’ve seen from pretty much any modern brand marketing campaign.
So remember:
Every interaction with every part of your brand’s marketing campaign may seem like a one-off to you, but it’s part of a series to your audience. It’s all connected – but one bad experience could break the chain.
This means you need a truly integrated combination of high-impact big ideas and detailed data and longer-term storytelling and archives of the earlier bits of the story so people can catch up and targeting to the people who’ll be most interested and a true understanding of how people – and the internet – actually work.
No one said it was easy. But some things take time.
Impressively prescient, considering it was published five years ago but is about technology – something that’s been moving madly fast during that timeframe. The Facebook/Cambridge Analytica scandal is effectively predicted, and many of the debates still going on in business and government today – about things like the gig economy, autonomous vehicles and more – were anticipated and summarised before they’d really started happening. The impression is that Lanier had seen all this coming decades ago – and he probably did.
As such, lots here to spark thought, lots to be impressed with, and it’s hard to disagree with the central thesis that the informational economics of the internet age are fundamentally broken. But at the same time, the only alternative to the current way of doing things – micropayments for data exchange/generation – still seems insanely impractical, even employing a technology like blockchain (something similar to which Lanier kinda proposes here).
So while Lanier ends on an optimistic note, the book left me more pessimistic than ever about our tech-driven future.
Useful look at how detailed, adaptable, *tailored* performance data (and people who know how to analyse and explain it) is essential if you want to be successful in modern media. As so often, Buzzfeed seems to be ahead of the curve.
It never ceases to amaze how often online publishers get het up about the wrong metrics. Tools like Omniture are obscenely powerful, yet all we tend to use them for is to find PVs, UUs, occasionally time spent, and sometimes how particular headlines are performing. Used properly, web analytics can help us keep our sites in a state of constant evolution, adapting to the tiniest shifts in user behaviour through minor design/code tweaks.
This isn’t about becoming Keanu Reeves and learning how to read the Matrix – it’s just knowing how to use the tools that are available to us.
Time was, quality audiences would be worth more to advertisers than quantity. Why hasn’t online ad selling (and buying) caught up yet? Only when it does will there be incentive to move beyond page views and unique users as the key metric. Programmatic ad sales could be the answer, or could worsen the situation further – too early to tell.
Anyway, worth a read:
“Online media is made of clicks.
Readers click from one article to the next. Advertising revenue is based on the number of unique visitors for each site. Editors always keep in mind their traffic targets to secure the survival of their publications. Writers and bloggers interpret clicks as a signal of popularity.
The economic realities underpinning the click-based web are well documented. Yet much work remains to be done on the cultural consequences of the growing importance of Internet metrics.
I conducted two years of ethnographic research (observing newsrooms and interviewing journalists, editors, and bloggers) exploring whether web analytics are changing newsroom cultures. The answer is a qualified yes, but in ways that differ from the ones we might expect.”
Twitter Analytics will be fun and useful, but why no ability to sort by best/worst performers? How can we tell what does/doesn’t work if we can’t see what does/doesn’t work? Intro here. Analytics themselves here (you need to activate before you’ll start seeing stats).
“Now, we expect documentation, live-feeds, streaming video, real time Tweets… [Ferguson] unfolded in real time on my social media feed which was pretty soon taken over by the topic…
And then I switched to non net-neutral Internet to see what was up. I mostly have a similar composition of friends on Facebook as I do on Twitter.
Nada, zip, nada.
This morning, though, my Facebook feed is also very heavily dominated by discussion of Ferguson. Many of those posts seem to have been written last night, but I didn’t see them then. Overnight, “edgerank” – or whatever Facebook’s filtering algorithm is called now – seems to have bubbled them up, probably as people engaged them more.
But I wonder: what if Ferguson had started to bubble, but there was no Twitter to catch on nationally? Would it ever make it through the algorithmic filtering on Facebook? Maybe, but with no transparency to the decisions, I cannot be sure.
Would Ferguson be buried in algorithmic censorship?
Would we even have a chance to see her?
This isn’t about Facebook per se—maybe it will do a good job, maybe not—but the fact that algorithmic filtering, as a layer, controls what you see on the Internet. Net neutrality (or lack thereof) will be yet another layer determining this. This will come on top of existing inequalities in attention, coverage and control.”
It’s a continual worry – how to ensure we see what’s important? Though, of course, the concept is nothing new – the algorithm is just an editor or an editorial policy in a different form. It’s something I’ve written about before when it relates to the EU, focusing on a BBC editorial policy that fails to cover EU affairs in mainstream news most of the time, and then serves up extremes.
This kind of human editorial determination of the appropriate news agenda, based on perceived audience interests, is arguably not massively different from a Facebook algorithm determining what is important based on how it interprets user interests. If anything, there’s a strong argument that Facebook knows its audience better than any editor on any publication or TV show ever has, thanks to the sheer quantity of data it possesses on its userbase.
But then what of *importance* – who determines this? Who overrides the algorithmic or standard editorial policy assumption? Is there a chance that an important story will get buried because a bit of code doesn’t see it as significant? Yes. But the same is true of any number of important news stories that human editors don’t pick up on, or choose to bury on page 23 because they don’t think their readers will be that interested.
As so often, the web may be a bit different, but there’s nothing that *new* here.
“In the age of ever-present social media, our collective attentions have never been spread thinner. According to Facebook, each user has the potential to be served 1,500 stories in their newsfeed each time. For a heavy user, that number could be as much as 15,000. In this climate, how do you get people to pay attention? And, more importantly, how do you know they’re actually engaged?
“Clicks and pageviews, long the industry standards, are drastically ill-equipped for the job. Even the share isn’t a surefire measure that the user has spent any time engaging with the content itself. It’s in everyone’s interest — from publishers to readers to advertisers — to move to a metric that more fully measures attention spent consuming content. In other words, the best way to answer the question is to measure what happens between the click and the share. Enter Attention Minutes.”
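For illustration only – this isn’t Upworthy’s actual implementation – an “attention minutes” style metric might be computed from heartbeat events a page emits while the reader is actively engaged. The event names and the 15-second interval below are my assumptions, not anyone’s published spec:

```python
# Assumed design: the page fires a "heartbeat" every 15 seconds of activity,
# so attention = heartbeats counted, not wall-clock time between click and share.
HEARTBEAT_SECONDS = 15

events = [
    {"type": "click", "t": 0},
    {"type": "heartbeat", "t": 15},
    {"type": "heartbeat", "t": 30},
    {"type": "heartbeat", "t": 45},
    # reader tabs away; heartbeats stop...
    {"type": "heartbeat", "t": 300},  # ...then briefly resume
    {"type": "share", "t": 310},
]

def attention_seconds(events):
    """Count each heartbeat as one interval of active attention,
    ignoring the idle gap a raw (share_time - click_time) would include."""
    beats = sum(1 for e in events if e["type"] == "heartbeat")
    return beats * HEARTBEAT_SECONDS

print(attention_seconds(events))         # 60 seconds actually spent engaged
print(events[-1]["t"] - events[0]["t"])  # 310 seconds between click and share
```

The gap between the two numbers is the whole point: the click and the share alone say nothing about whether anyone actually read the thing in between.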
This obsession with measurable outcomes from online advertising – something it’s impossible to do with TV or print or billboards – is idiotic. Advertising is about brand/product recognition and building familiarity/trust as much if not more than direct sales, and always has been.
This is a solid overview of the issues. Maybe, one day, the industry will wake up to its idiocy. Yes, detailed data is useful – but just because some things are measurable doesn’t mean everything is. A sale may be a long time in coming.
So unsurprisingly yesterday’s launch of Quartz’s new Glass site – focused on the future of news via an experimental bite-sized format – got me rather excited.
But a day in, I can’t see the point of the atomisation format for this kind of site.
The perils of high expectations
What we get are Tweet-length (or thereabouts) snippets of media news, usually with a link – similar to the linklogs popular around the late 90s / early 2000s (think Memepool, Fark, LinkMachineGo) – or some kind of opinion, often with a little arrow indicating that you can click for more.
A linklog aggregating media news is fine – a useful addition to my Twitter list of handy sources of industry info, with some useful selections.
But why this atomised opinion approach? It’s like a choose your own adventure book, only with argument/opinion – subsequent points hidden until you click – for reasons that largely escape me.
Form vs function
Take this piece on the (excellent) Fargo TV series. That link takes you to the full post – with all the subsections expanded. It reads fine – just like a regular blog post.
But come to it from the front page? You get the first paragraph only.
Click down and you’re presented with the tier-two paragraphs (numbers 2, 4, 6, 8 and 10).
To get the full post, you have to click an additional four times to get paragraphs 3, 5, 7, 9 and 11. That’s five clicks to get one story.
What matters more – metrics or readers?
Now yes, this will give Quartz lots of useful data that they can analyse to check reader engagement – just as Circa does with their atomised news stories.
But where Circa’s use of “atoms” for presenting their stories makes sense and is backed up by a clear philosophy*, for the opinion piece parts of Glass I simply can’t see the rationale.
If I’m interested in your opinions about Fargo, I’m interested – so give them to me when I click. Don’t make me work harder to get your nuggets of wisdom – you risk annoying and disappointing me when the additional clicks prove pointless.
So from being excited, I’ve become annoyed – the content may be good, but the presentation gets in its way. It’s bullet-point lists with hidden child bullets, nothing more.
Or am I missing something?
* Short version of my understanding of Circa’s news philosophy (as an aside):
1) news is fast-paced, so keep coverage short and to the point
2) news is made up of facts, and facts change, but themes and stories persist/evolve
3) some facts can be recycled into new stories on the same theme
4) therefore breaking stories into their component (factual) parts makes sense in both the long and short term, as they a) make the news easier/quicker to understand (when properly presented), b) can be recycled into other stories on the same theme down the line, and c) can have tracking attached to each element to see how/if audiences are engaging with that content, giving far more detail about user behaviour than is possible from a standard article
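A rough sketch of how I understand that atomised model: facts as standalone records that stories merely reference, with tracking attached to the atom rather than the article. All the names and structures here are illustrative, not Circa’s actual data model:

```python
# Facts are the atoms - stored once, referenced by ID from any story.
facts = {
    "f1": "The vote is scheduled for Tuesday.",
    "f2": "Turnout in the last election was 61%.",
    "f3": "The incumbent is seeking a third term.",
}

# Stories are just ordered lists of fact IDs; a fact can appear in many.
stories = {
    "election-preview": ["f1", "f2", "f3"],
    "turnout-analysis": ["f2"],  # the same fact recycled into a new story
}

# Engagement counters attach to the atom, not the article.
views = {fact_id: 0 for fact_id in facts}

def render(story_id):
    """Assemble a story from its atoms, recording that each atom was served."""
    for fact_id in stories[story_id]:
        views[fact_id] += 1
    return " ".join(facts[f] for f in stories[story_id])

render("election-preview")
render("turnout-analysis")
print(views)  # f2 appears in both stories, so it has been served twice
```

Update a fact once and every story referencing it stays current; and because tracking hangs off each atom, you can see which individual facts audiences engage with, not just which articles.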
Notes and Essays
To help shape my thinking, I write essays and shorter notes examining the ideas and narratives that shape media, marketing, technology and culture.
A core focus: The way context and assumptions can radically change how ideas are interpreted. Much of modern business, marketing, and media thinking is built on other people's frameworks, models, theories, and received wisdom. This can help clarify complex problems – but as ideas travel between disciplines and organisations they are often simplified, misapplied or treated as universal truths. I'm digging into these, across the following categories – the first being a catch-all for shorter thoughts: