Note: Although this post cites specific real-life examples, the discussion is intended to stay entirely at the meta level.
Scott Alexander's definition is worth citing:
The straw man is a terrible argument nobody really holds, which was only invented so your side had something easy to defeat. The weak man is a terrible argument that only a few unrepresentative people hold, which was only brought to prominence so your side had something easy to defeat.
Also instructive is Bryan Caplan's gradation:
OK, what about “collective straw manning” – questionably accusing a group for its painfully foolish positions? Now we have:
3. Criticizing a viewpoint for a painfully foolish position no adherent holds.
4. Criticizing a viewpoint for a painfully foolish position some adherents hold.
5. Criticizing a viewpoint for a painfully foolish position many adherents hold.
6. Criticizing a viewpoint for a painfully foolish position most adherents hold.
What Caplan is describing as "collective straw manning" seems to be a good scale for weakmanning’s range. And lastly, consider also Julian Sanchez's disclaimer:
With a “weak man,” you don’t actually fabricate a position, but rather pick the weakest of the arguments actually offered up by people on the other side and treat it as the best or only one they have. As Steve notes, this is hardly illegitimate all the time, because sometimes the weaker argument is actually the prevalent one. Maybe the best arguments for Christianity are offered up by Thomas Aquinas or St. Augustine, but I doubt there are very many people who are believers because they read On Christian Doctrine. Probably this will be the case with some frequency, if only because the less complex or sophisticated an argument is, the easier it is for lots of people to be familiar with it. On any topic of interest, a three-sentence argument is unlikely to be very good, but it’s a lot more likely to spread.
At least in theory, I think weakmanning should be avoided, but I struggle with exactly where to draw the line. If your goal is to avoid weakmanning, there are at least two axes you must consider:
All the possible arguments for position X, ranked on a spectrum from least to most defensible.
All the possible arguments for position X, ranked on a spectrum from least to most representative of believers in X.
Weakmanning is not much of an issue if you're arguing against a single individual, because they either endorse the particular arguments or they don't. You can't showcase the error of someone's ways by refuting arguments they never held.
But generally we argue over positions endorsed by many different people, each of whom may differ in which arguments they advance or prioritize. So what should count as "representative"?
For example, many people believe in the theory of evolution, but some believers do so under the erroneous belief that evolutionary change occurs within an individual organism's lifespan.1 If you use a crude heuristic and only poll relevant experts (e.g. biology professors), you're not likely to encounter many adherents of the "change-within-lifespan" argument, so this could be a decent filter for narrowing what counts as "representative" for a given position. This is generally an effective tactic, since it helps you avoid prematurely declaring victory at Wrestlemania just because you trounced some toddlers at the playground.
But sometimes you get a crazy position believed by crazy people based on crazy arguments, with a relatively tiny minority within/adjacent to the community of believers aware of the problems and doing the Lord's work coming up with better arguments. InverseFlorida coined the term "sanewashing" to describe how the meaning of "defund the police" (DTP) shifted2 to something much more neutered and, correspondingly, much more defensible:
So, now say you're someone who exists in a left-adjacent social space, who's taken up specific positions that have arrived to you through an "SJW" space, and now has to defend them to people who don't exist in any of your usual social spaces. These are ideas that you don't understand completely, because you absorbed them through social dynamics and not by detailed convincing arguments, but they're ones you're confident are right because you were assured, in essence, that there's a mass consensus behind them. When people are correctly pointing out that the arguments behind the position people around your space are advancing fail, but you're not going to give up the position because you're certain it's right, what are you going to do? I'm arguing you're going to sanewash it. And by that I mean, what you do is go "Well, obviously the arguments that people are obviously making are insane, and not what people actually believe or mean. What you can think of it as is [more reasonable argument or position than people are actually making]".
Keep in mind that this is not an object-level discussion on the merits of DTP. Assume arguendo that the "sanewashed" arguments are much more defensible than the "crazy" ones they replaced. If someone were to argue against DTP by attacking the now-obsolete arguments, one of the sanewashers would be technically correct in accusing them of weakmanning for daring to bring up that old story again. It fits the literal definition of weakmanning, after all.
As Sanchez noted above, for most people on most positions, intuition predates rationality. They stumble around in the dark looking for any sort of foothold, then work backwards to fill in the necessary arguments. The sanewashers and the crazies rely on each other. Without the sanitization from the hygiene-minded sanewashers, the position would lack the fortification required to avoid erosion; and without the crazy masses delivering the bodies and zeal, the position would fade into irrelevance. The specific ratio may vary, but this dynamic is present in some amount for any given position. You very likely have already experienced the embarrassment that comes from a compatriot, purportedly on your side, making an ass of both of youse with their nonsensical arguments.
If your ultimate goal is truth-seeking, weakmanning will distract you into hacking away at worthless twigs rather than striking at the core. But sometimes the goal isn't seeking truth on the specific position (either because it's irrelevant or otherwise already beyond reasonable dispute) and instead the relevant topic is the collective epistemological dynamics.3 InverseFlorida's insightful analysis would not have been possible without shining a spotlight on the putative crazies — the very definition of weakmanning in other words.
Here's the point, at last. Normally, someone holding a belief for the wrong reasons is not enough to negate that belief. But wherever a sanewasher faction appears to be spending considerable effort cleaning up the mess their crazy neighbors keep leaving behind, it should arouse some suspicion about the belief, at least as a heuristic. Any honest and rational believer needs to grapple with how the crazies managed to all be accidentally right despite being outfitted, by definition, with erroneous arguments. Such a scenario is so implausible that it demands a curious inquiry into its origin.
It's possible that this inquiry unearths just another fun episode in the collective epistemological dynamics saga; it's also possible the probe ends up exposing a structural flaw in the belief itself. In either circumstance, a weakmanning objection here is made in bad faith and intended to obfuscate. Its only purpose is to get you to ignore the inconvenient and the annoying. Pay this protest no heed and keep deploying the magnifying glass; don't be afraid to focus the sun's infernal rays into a burning pyre of illumination. Can you think of any reason not to?
I know some smartass in the comments will pipe up about some endangered tropical beetle or whatever that does demonstrate "change-within-lifespan" evolutionary changes. Just remember that this is not an object-level discussion.
TracingWoodgrains described the same dynamic with the gentrification of r/antiwork. Credit also to him for most of the arborist-themed metaphor in this post.
I dare you to use this phrase at a dinner party without getting kicked out.
I mentioned this on Twitter, but I think the post in a sense, doesn't quite go far enough - obviously, I agree that accusations of weakmanning can usually be dismissed. But! I think actually talking about what is "representative of a group's beliefs" is an extremely difficult topic, for reasons I hope to write more of a post on, and not because "It's difficult to tell", but because "It's probably wrong to think of a group as actually having coherent positions and beliefs, and instead they have weird superpositions of possible beliefs and social dynamics where they hold or endorse contradictory beliefs simultaneously".
There's more - you said "Normally someone holding a belief for the wrong reasons is not enough to negate that belief. " I think I actually disagree. Obviously, someone could believe "1 + 1 = 2" for the reason "Yulia Tymoshenko told me in a dream", and that's not enough to say 1 + 1 = 2. But, I would say that people's beliefs about social things, or beliefs they hold because of a community or demographic they're part of, are *not* disconnected beliefs that only mean what they say, but weird nodes of meaning and feeling that exist to support the incoherent wrong reasons they used to believe something in the first place.
Or, perhaps this is clearer: Someone might have the belief "I believe in Free Speech", which I share, but because of the fact that all ideology is social, their belief means *other* things that just them stating it doesn't. It's as though we're speaking different dialects, and the connotations around what "believing in free speech" means in the outgroup dialect are actually totally different. We can sense this when we find ourselves wanting to resist someone saying something totally reasonable, hard to disagree with on the surface, but we know that accepting it is a trojan horse, not because it *leads* to agreeing with other arguments, but because it *is* multiple arguments simply self contained. It's why me and some conservative could talk about our beliefs in a way that makes them sound quite similar - because we'd be avoiding the deeper premise iceberg underneath each of our surface statements, the place where we may be speaking an entirely different language.
And I think, for a really significant amount of things, this means it can be correct to dismiss a belief because it's held for the wrong reason, if you're sensitive to the deeper connotations underneath it.
This is the kind of thing I've been thinking a lot about - very difficult to describe, but I suppose in one sense, I'm saying "Most beliefs are not just beliefs".
Just wondering, do you often apply rationalist/SSC/LessWrong style arguments in your trials? I’d love to see a rationalist lawyer persuading a jury.