In his book Analytic Activism, Dave Karpf looks at how information from digital media feeds into decisions about tactics, strategy and power.
The book sets out how data is, and could be, used at these different levels –
Data can be particularly useful when considering tactical choices. Developing your approaches through testing is better than relying on intuition, which is often wrong.
But even at this level, it’s still easy to focus on the wrong metrics. Measures such as supporter base size, or numbers taking action, can be deceptive, and may not tell you what you actually want to know. You might be maximising support for individual asks while decreasing overall commitment, for example.
Thinking about how data can feed into decisions about overall strategy, and about what approaches have been or are likely to be effective, things get a bit more complicated. Data can support difficult decisions, but it can’t eliminate the need to take them. Data can inform judgments, and there are ways it can do that better, but there still needs to be an element of (human) interpretation. At the strategy level, it’s about “[balancing] editorial judgement with technology and data”.
Thinking more conceptually about how you see power, how you believe change can come about, and how you use and maximise the power you have to contribute most effectively to change, it’s difficult to see how data can meaningfully guide those kinds of considerations.
The ‘digital frontier’ is the space where organisations are exploring new ways of listening through digital channels that can provide meaningful input to decision making, particularly in these harder to measure areas.
But data shouldn’t be given a credence it doesn’t warrant: “it is both easy and dangerous to put too much faith in the data”. Well-thought-through analytics can be a valuable part of the mix, but “always be blending – don’t just rely on the data”.
It’s also important not to disregard stuff you can’t measure: “Activist organisations need numbers but they also need vision. But how do you … quantify vision? And if it can’t be directly measured, how do you make sure it does not disappear from your work?”
In my head this all looks like this:
I found this a particularly useful way of thinking about things because, in my experience of the campaign evaluation world, people tend to conflate these levels, and imagine that data can answer questions about strategic effectiveness, and about overall strategic orientation, when it’s far from able to do that.
It’s good to push at the limits, and that’s a lot easier to do when you have recognised them first.
Dave talks with me about this, and other things in the book, which is full of relevant insights, in Ep 12 of my podcast.