People who frequently try to impress or persuade others with misleading exaggerations and distortions are themselves more likely to be fooled by impressive-sounding misinformation, new research from the University of Waterloo shows. The researchers found that people who frequently engage in “persuas...
So "you can't bullshit a bullshitter" is bullshit. Does it mean that people who bullshit "you can't bullshit a bullshitter" are easier to bullshit? (This gets recursive.)
[from the paper] recent research has suggested that bullshitting and lying, while clearly related, are psychologically distinguishable constructs (Littrell et al., 2020). For example, liars show a stronger negative association with self-regard and a stronger positive association with lie acceptability than bullshitters (Littrell et al., 2020).
I wonder how well the distinction would hold cross-linguistically. "Strong" Sapir-Whorf might be bullshit, but the weak version is worth checking.
My hypothesis is that the sort of people who'd engage in persuasive bullshit care less about the truth value of statements, and that's what gives them a hard time assessing the truth value of what others say. Meanwhile, evasive bullshitters are already taking an evasive approach precisely because they don't want to say something untrue.
Some of those BS things are actually pretty difficult. I mean, the "motivational quotes" sound like nonsense, but the fake science things don't sound especially fake to me (who has almost no understanding of physics).
Even the headlines didn't sound especially outrageous, given the kinds of headlines we can easily find today. Though for anything "serious", I'd probably fact-check or look for a more reliable source, lol
The trick is to look at what the paradigmatic discourse within a Cartesian frame of reference that includes co-articulation can reveal about the locutionary force.[/bullshit]
...sorry, I couldn't resist. Serious now: there's no foolproof way to detect bullshit, but often you can smell it by analysing the words being used and seeing if they convey something coherent. Especially if you can look up the meanings of any words you don't know.
And if you don't know the topic, you can still make a good guess at the meaning of the words based on other things you might know.
“Strong” Sapir-Whorf might be bullshit, but the weak version is worth checking.
Really persuasive-sounding. ;-)
My hypothesis is that the sort of people who'd engage in persuasive bullshit care less about the truth value of statements, and that's what gives them a hard time assessing the truth value of what others say.
Honestly speaking, this viewpoint isn't completely false.
In some contexts, other aspects are more important than just the straight-up truth value. For instance, some people seem to judge a view not on the merit of its reasons, but on the social consequences that would arise if the view were held by a large majority.
Even if we agree that such points should be irrelevant to a rational discussion, we already know that not all discussions are rational.
Well sure, the BSers are insecure and deep down believe that other people are more impressive than them, so they believe others' BS. That's why they feel the need to BS in order to measure up. If they were secure with themselves, they would notice that other BSers are, well, BSing.
I really hope this impressive and "scientific-sounding" headline is more than just another example of the very effect it names. ;-)
In a series of studies conducted with over 800 participants from the US and Canada, the researchers examined the relations between participants’ self-reported engagement in both types of BSing and their ratings of how profound, truthful, or accurate they found pseudo-profound and pseudo-scientific statements and fake news headlines.
Self-reporting.
And these 800 participants, where are they from? Students?