It’s a brief-ish edition this week, owing to a bit of birthday-induced laziness. The newsletter will be back to full strength next week.
What I’m thinking about
If anyone out there is trying to find a way of predicting Russia’s political future, I hope to heaven they’re not working public opinion polls into their models.
The Chronicles project—a survey initiative run by the erstwhile opposition politician Alexei Miniailo—released some intriguing new data this week, which made headlines in the independent sector of the Russian media. In their September survey, they decided to confront head-on the proposition that the Levada Center—long Russia’s leading independent polling group—was somehow systematically producing data that overstated public support for Vladimir Putin. Asked about approval of Putin using the same formulation as Levada, 78 percent of Chronicles respondents said they support him. Given Chronicles’ relatively small sample size (800 respondents, versus 1,600 for Levada), the difference between their number and the 85 percent shown in Levada’s polls is negligible.
That, though, wasn’t really the interesting part of the survey. They also asked respondents what policies they’d like to see Putin and the government pursue. Of those who said they support Putin, 83 percent said they’d like the government to concentrate its efforts on domestic social and economic issues, 61 percent said they’d like to see a peace deal with Ukraine involving mutual compromises, and 43 percent said they’d like to see normalization of relations with the West. These, Chronicles and numerous commentators were quick to point out, aren’t exactly the policies that Putin is pursuing.
Chronicles, as I’ve written before, is best known for its development of a set of survey questions that allows them—in their own estimation—to gauge what they take to be the real level of support for the Kremlin and the war in Ukraine, which they mark at considerably lower than the 70-something percent consistently shown by Levada and other pollsters. While I find these data interesting and often refer to them, I’ve been clear about my skepticism: without cross-verification, including through a variety of robustness checks in the surveys and the use of focus groups and other qualitative methods, I simply cannot be confident that Chronicles’ numbers mean what they argue they mean. The same, alas, is true of these latest numbers.
Miniailo and his colleagues are pointing to their numbers as evidence that Putin is increasingly out of step with his population, and that dissatisfaction with his rule and a readiness for change are likely to increase.
Maybe.
But there are at least four logical leaps here that I’m not ready to make just yet. First, Chronicles asked respondents what they would support. To translate this support into “demands” for policy change, as some commentators have, is problematic. There are a lot of things I’d be perfectly happy to have, but that doesn’t mean I’m actively seeking them or willing to expend any effort to get them. How those can really be called demands isn’t clear to me. And as a result, it’s problematic to argue that Putin isn’t delivering on Russians’ demands.
Second, these numbers mask a great deal of differentiation. A separate Levada survey run at around the same time—and which received comparable numbers when it comes to support for Putin, as noted above—found that while many Russians are willing to support a peace deal and even abstract compromises, when pressed on actual potential compromises a large majority refuse to approve of any of them.
Third, surveys since the early months of the war have shown that a majority of Russian respondents are willing to support a wide variety of outcomes of the war and of Russian policy more broadly—provided they think that Putin also supports these things. When confronted with the potential that this might put them at odds with Putin, their support falters. Alas, respondents in this case weren’t asked whether they would support these things even if Putin did not; the Chronicles team made that leap for them, and while it may be an accurate reflection of the real gap between Russians’ desires and the behavior of the Kremlin, it is likely not an accurate reflection of the way respondents actually see the world.
And that brings me to the fourth problem: much of the commentary on this survey seems predicated on the conventional wisdom that people don’t deal well with cognitive dissonance, and thus that Russians will eventually become irked by the mismatch between what they want (if it is genuinely what they want) and what the Kremlin is giving them. The problem is, of course, that people (and not just Russian people) are actually pretty good at living with both cognitive dissonance and political disappointment. Indeed, Levada surveys have for decades shown that ordinary Russians perfectly well understand that Putin does not govern in their interests, and yet they support him anyway.
I’m not going to go into the sources of support for Putin here. I’ve written about that before, as have many others, and I’m certain we’ll all write about it again. But I do want to come back to my admonition not to use public opinion as part of a model designed to predict the future. Why not? Because public opinion isn’t about the future. It’s about the present.
This is a hard thing for many to accept—even many who run surveys for a living, and many more who make their money commenting on them—but it’s true. We often look at surveys, such as the incessant drumbeat of surveys showing the down-to-the-wire competition between Kamala Harris and Donald Trump, as indicative of what’s going to happen in the future. That’s a mistake.
It may be, of course, that a survey taken two weeks or two months or two years before election day does correlate with what actually happens on election day. If that happens, it’s because a sequence of events occurred: first, voters agreed to answer the survey question two weeks/months/years before the election; second, they answered honestly; third, their opinion didn’t change between the time of the survey and the time they cast their ballot; and fourth, they actually voted. Among the many flies in this ointment is the fact that at the moment steps one and two occurred, the respondents could not know what would happen at steps three and four.
To make matters worse, the survey-as-crystal-ball conceit fails to recognize the way that public opinion is generated. Take, again, support for Kamala Harris. Back when Joe Biden was still trying to hold onto his nomination, Harris’s poll numbers were in the doldrums. She looked like an unsuccessful vice president whose lack of public support was matched only by her own lack of charisma. The moment Biden stepped back and gave Harris the nod, however, her poll numbers shot up, and she has ridden a wave of progressive enthusiasm ever since. Why? Because Kamala Harris meant one thing on 20 July 2024, and she meant something entirely different three days later. Before Biden’s withdrawal, she was not—and could not be—the vessel into which progressive voters would pour their hopes. Three days later, that changed.
My point here—and it’s a point I have tried to make again and again, so apologies to regular readers who have heard this all before—is that opinion polls are, at best, a measurement of how respondents understand the demands and opportunities created by the social and political reality they’re experiencing. If that reality demands reticence, that’s what the polls will reflect. If that reality allows for enthusiasm, it will shine through.
Those who seek to interpret polls, then, need to remember two things. First, the reality reflected in the polls is the reality of the moment and only the moment; it is not the reality of the future. And second, it’s the reality of the respondents, not the pollsters or the analysts, generated by millions of people looking around at millions of others—whether Americans or Russians or anyone else—for social cues and strength in numbers. Tempting as it might be to read our own realities into those volatile and vexing numbers, it is a temptation we would do well to resist.
What I’m reading
The only thing policy-world wanted to talk about this week, when it came to Russia, was the BRICS summit in Kazan (meh) and the evident movement of North Korean troops either to Kursk Oblast or Eastern Ukraine or both (double meh). Okay, yes, those are important. But they’re not terribly interesting. Much more interesting, if equally inscrutable, was the explosion of violence in the small Urals town of Korkino (population 36,000) after the murder of a female taxi driver that locals blamed on Roma. The fullest reporting I’ve found is in 7x7; Ivan Zhilin’s report in Novaya Gazeta is also worth reading.
Elsewhere:
Novaya Gazeta Europe on 25 October published two long reads by Pavel Kuznetsov—one on the resurgence and reorganization of Russia’s post-Covid nationalist movement, the other on how Russia’s nationalists spend their lives—that are morbidly fascinating.
Grzegorz Ekiert and Noah Dasanaike, both of Harvard, published an excellent essay in the Journal of Democracy charting the rise of new hard-line authoritarian governments (so much for the soft, hybrid-authoritarians of recent decades!) and, unlike most work on the subject, they refuse to blame the phenomenon on any one factor. Blessedly, they avoid techno-determinism almost entirely.
The Bank of Finland’s in-house think tank published a sophisticated analysis of Russia’s wartime economic data, concluding that, while there is good reason to see the data as problematic, these problems likely reflect issues of quality, data hoarding, and poor incentives, rather than a systematic attempt to manipulate Russia’s economic reporting.
Last but not least, is Musk-Russia the new Trump-Russia? Decide for yourself.
What I’m listening to
This hardly needs an explanation. Has there ever been a better Halloween song? It wasn’t until a recent listen, though, that I clocked the anti-war lyrics:
Now they’re beatin’ war drums in the Congo
And the Beatniks are beatin’ the bongo
So it’s up to us, you and me
To put an end to this catastrophe.
We must appeal to their goodness of heart
To pitch in and all do their part
’Cause if this atomic war begins
They won’t even have a pot to p*** in.
And we’ll be singin’ — back to back, oh, belly to belly
Don’t give a damn, done dead already…
Thank you Sam. The Grzegorz Ekiert and Noah Dasanaike essay is the best contextual analysis to answer a question which increasingly troubles me. Namely: WTF is happening to the world I know. I can see sinister and threatening events but lack the ability to put them into some wider context of a pattern and what this means for my ‘world view’—indeed for the very type of society I live in, my freedoms, wealth and prospects (and my children’s). The article told me nothing I didn’t know but interpreted what I do know into an explanatory framework. We’re in trouble and in general we don’t know we are. Even in my own country (UK) there is only a vague notion that we are actually involved in an ideological struggle to protect our long-established political and societal settlement. Furthermore, our ability to ‘protect’ is much diminished in relative (historical) power terms. We rely on our allies, and they are in even worse shape, consumed by political infighting with neoconservative forces whose ultimate aim is to join the ‘dictatorship club’.
Thanks for your discussion of what polls actually measure: what people indicate they will do in the moment, not what those people will actually do in the future. Also important is your clarification of the difference between saying one supports something and actually acting on that support. That distinction reminded me of surveys regarding people recognizing the perils of climate change and saying that they support measures to confront those perils.
Here's an AP/NORC survey (US only) from 2018: https://apnorc.org/projects/is-the-public-willing-to-pay-to-help-fix-climate-change/ In this one, only 57 percent of Americans surveyed SAY they would be willing to incur a $1 monthly cost to mitigate climate change. Again, if confronted with an actual $1 monthly cost, that percentage might decline.
And a more recent German survey of 130,000 respondents worldwide: https://fortune.com/2024/02/16/most-world-sacrifice-1-percent-paycheck-help-stop-climate-change-not-us-uk-canada/ Here a surprising 69 percent SAY they would sacrifice 1 percent of their salaries to combat climate change; the US, UK, and Canada are outliers: "In the U.S., just 48% of people would be willing to contribute. In comparison, over 90% of the people of Myanmar and Uzbekistan would support climate solutions—despite earning significantly less. Generally, the researchers discovered that the richer and colder a country is, the less willing its citizens would be to personally pay up in the fight to stop global warming." Again, this is what people SAY in the moment, not what they will DO in the future.