To be clear, sometimes authority bias is good and proper. For instance, valuing the opinion of a climate scientist who has been studying climate chaos for thirty years over that of your aunt, who saw Rush Limbaugh call climate change a hoax in the 1990s, is normal and rational.
Basically, authority bias as a reasoning flaw stems from misidentifying who is authoritative on a subject.
In a vacuum, appealing to authority is fallacious. An idea must stand up on its own merits.
IRL, things get fuzzy. No one has the expertise and time to derive everything from first principles and redo every experiment ever performed. Thus we sadly have to place some level of trust in other people.
Not all bias is created equal, or always something negative. Sometimes it's good to be biased towards the opinion of a scientist over the opinion of your aunt.
I have to respectfully disagree with your example. Ostensibly the researcher should be an authority. I think the example given in the chart is not quite right either; I think the confusion comes from the three definitions of "Authority":
1. The power or right to give orders, make decisions, and enforce obedience. "He had absolute authority over his subordinates."
2. A person or organization having power or control in a particular, typically political or administrative, sphere. "The health authorities."
3. The power to influence others, especially because of one's commanding manner or one's recognized knowledge about something.
In your example the "Authority" is definition 3: someone with specialized knowledge of a topic who should be listened to by those who are lay on the topic.
In the chart I think they were going for definition 1, which is the correct source of authority bias, but they didn't want to step on toes or get political. The actual example is someone who has decision-making authority, like a police officer, a politician, or a boss at a workplace, who says things that a listener automatically believes regardless of the speaker's actual specialized knowledge of the topic they are speaking on. A better example would be "Believing a vaccine is dangerous because a politician says it is."
This all feeds into a topic I have been kicking around in my head for a while and have been contemplating writing up as a book: "The Death of Expertise". So many people have been so brainwashed that authorities under definition 3 are met with a frankly asinine amount of incredulity, while authorities under definition 1 are trusted regardless of education or demonstrable specialized knowledge.
YSK: the Dunning-Kruger effect is controversial because it's part of psychology's replication crisis.
Other famous psychology experiments, like the Stanford prison experiment or the Milgram experiment, fail to show what you learned in Psych 101. The prison experiment was so flawed as to be useless, and variations on the Milgram experiment show the opposite effect from the original.
For those familiar with the Milgram experiment: one variation of the study saw the "scientist" running the test replaced with a policeman or a military officer. In these circumstances, almost everybody refused to use high voltage.
Controversial in the sense that it can be easily applied to anyone. There is some substance to the idea that a person can trick themselves into thinking they know more based on limited info. A lot of these biases are like that: they aren't cut and dried, but more of a gray area where people can be fooled in various ways. Critical thinking is hard even when it's taught, and it's not taught well enough, or at all.
And all of that is my opinion and falls into various biases, but oh well. The easiest person to fool is yourself, because our brains are hardwired to want to be right, rewarding us when we find things that help confirm it even if the evidence is not valid. I think the best way to avoid the pitfalls is to always back up your claim with something. I've often(!) found myself erasing a reply to someone because it didn't have the data behind it that I thought it did, and I couldn't show I was correct after digging a bit.
I almost deleted this for that very reason, but I want to see how it hits. I feel that knowing there are a lot of biases that anyone can fall into can help you form better reasoning and arguments.
What bias would that fall under? One could assume the variation has to do with the average American's trust of law enforcement vs their trust of a qualified person.
(Assuming the repeat experiments were done in the US that is)
For negativity bias, my wife just told me a great technique that she uses. Come up with a list of people whose opinions matter to you. Any time you question yourself, imagine how each person on that list would react to what you did. Since those are the only people whose opinions matter to you, if the reaction is mostly positive, you should feel proud of your choice.
Knowing these helps with self-talk. You trip over a curb and start scolding yourself. Then you can say to yourself "this is just the spotlight effect" and move on with your day, avoiding the impact of negative emotions. Or you might be more open to a change in restaurant plans because you know of the false consensus effect. There's subtle but real power in just naming things!
I tripped and fell spectacularly walking in a supermarket. I was annoyed that no one helped me up or checked if I was okay (I didn't need help but it made me think less of my fellow man) and that my partner was waiting in the car and didn't witness it, because it was actually really funny.
I left embarrassment behind in my 20s. I don't have the energy or interest for it now. And I know I'm not the main character: everyone's living their own lives, and the impact you make on strangers is minimal. At worst, someone said when they got home from the shops, 'I saw this chick stack it and it was kinda funny.'
Reminding yourself that no one really cares about people they don't know is a helpful way to shut down the negative self-talk.
What’s the cognitive bias for believing that any given chart is the ULTIMATE CHART? Yes yes, YOUR chart is gospel, the exhaustive, definitive, final chart 🙄
The False Consensus Effect and narcissistic personalities go hand in hand. I can't tell you the number of times my narcissistic coworker has started trash-talking people I like a hell of a lot more than them, assuming I agree.
The Availability Heuristic looks out of place. It's pretty much the only bias I have (besides confirmation bias, which is hard to avoid, sneaky as it is), but how should one survive in this world without relying on others? Without doing a scientific, bias-free study on every topic in life, you're unavoidably suffering from that bias. A healthy level would be avoiding making it a rule. I regularly disagree with friends' decisions, so maybe I don't have this bias.
I'd say a lot of those things are the result of cognitive shortcuts.
It kinda makes sense to make a lot, if not most, decisions by relying on such shortcuts (hands up, anybody who, whilst not having a skin problem, will seek out peer-reviewed studies when choosing what kind of soap to buy), because they reduce the time and energy expenditure, sometimes massively so.
Personally, I try to "balance" shortcuts vs. actual research (in a day-to-day sense, rather than Research) by making the research effort I put into a purchase proportional to the price of the item in question, while also taking into account the downsides of a misjudgement: a cheap bungee-jumping rope is still well worth the research. I'll invest more or less time into evaluating an item and seeking independent evaluations of it depending on how many days of work it will take to afford it. It's not really worth spending hours researching something worth ten minutes of your wages if the only downside is losing that money, but it's well worth investing days when you're buying a brand-new car or a house.
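If you wanted to make that rule of thumb concrete, here's a minimal Python sketch. The `research_hours` function, the 5% fraction, and the `downside_factor` values are all illustrative assumptions on my part, not anything rigorous:

```python
# Minimal sketch of the "research effort proportional to price" heuristic.
# All names and constants here (research_hours, the 5% fraction, the
# downside_factor values) are illustrative assumptions, not a real model.

def research_hours(price, hourly_wage, downside_factor=1.0):
    """Rough budget of hours to spend researching a purchase.

    price           -- cost of the item
    hourly_wage     -- what you earn per hour of work
    downside_factor -- >1 when a bad choice costs you more than the money
                       (the cheap bungee rope), ~1 when it's just the money
    """
    hours_to_afford = price / hourly_wage  # price measured in your own work time
    # Spend a small fraction of that time on research, scaled by the downside.
    return 0.05 * hours_to_afford * downside_factor

# Soap at $15 on a $30/hr wage: ~1.5 minutes of "research", i.e. just grab one.
print(research_hours(15, 30))                        # 0.025 hours
# A $300 bungee rope: cheap, but a misjudgement is catastrophic.
print(research_hours(300, 30, downside_factor=50))   # 25.0 hours
# A $30,000 car: days of research are justified by the price alone.
print(research_hours(30_000, 30))                    # 50.0 hours
```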
I’d love to see a list of names for the writing devices used by trolls/propagandists that generate completely false information of varying types: forced binary choices when a third way is valid, or when the choices aren't even related. Most of them are just plain old lies, so I don't think the list would be too long.
I was thinking about one of these earlier while talking about Fullmetal Alchemist vs. FMA: Brotherhood. Everyone I've talked to who liked Brotherhood more saw it first. Which makes me wonder if I would like it more had I not seen the original first.
I'd argue that's outcome bias or survivorship bias, where people focus on the one winner, or the one time a thing worked, and ignore all the other failures.
E.g. people think investors all swim in money because they only see the Warren Buffetts of the world, when in reality there are thousands, even millions, of people who tried almost exactly the same thing and lost some or all of their savings in bad investments.
Absolutely! It's called the Pollyanna Principle. In fact, there are counterparts to many of these biases that are immensely helpful in certain types of therapy.
But then I can't tell my friends all the ways Fox managed to fuck up Dragon Ball: Evolution.
I mean to be fair, I think everyone knew that was going to be shit going into it.
Although to be fair to Dragon Ball Evolution, it did bring Toriyama out of retirement for Super and his swan song, Daima. (No joke: he came out of retirement because the thought of the American movie being the last ever new Dragon Ball content pissed him off that much, and he knew that after GT the studio wasn't going to do anything without him.)
Of course, now the brand is so big that he has a successor (Toyotaro), and there's a wing of Toei that does nothing but Dragon Ball, which allegedly has ideas for the next 20 years.
This is basically how I see NTs, as someone on the spectrum myself. Not to say I don't do any of those; on the contrary, I'm guilty of many. But I feel like they are more common in NTs (especially ones like the Bandwagon Effect or Authority Bias).