People often hold beliefs that seem odd or simply incorrect to others, on topics ranging from conspiracy theories to scientific denialism. Political philosophers have long assumed that scientific facts can easily resolve such disputes. In practice, however, disagreements concern not only the facts themselves but also the processes by which facts are established and the question of whom to trust for them.
🌍 Real-world example: The ongoing debates about climate change and vaccinations around the globe, despite substantial scientific evidence supporting their validity.
People tend to “cherry-pick” evidence: they believe what they want to believe and select the evidence and expertise that support their position. This is particularly true for beliefs tied to our identities. If a fact aligns with what we already believe, we are less critical of it and remember it better than a fact that challenges our beliefs. Factual beliefs of this kind act as signals of identity and group solidarity.
🏔️ Real-world example: Someone believing that climate change is a myth may be signalling their allegiance to a particular group that holds the same belief, leading to political polarization.
The pessimistic view is that the appeal to expertise is a charade. Motivated reasoning and confirmation bias lead people to cherry-pick authorities who support what they already believe. If the majority of experts agree with them, they cite the quantity of evidence; if the majority is against them, they stress the quality of their own evidence, insisting that truth isn't a democracy. Either way, authorities are used not as guides towards truth but as justifications for pre-existing beliefs.
According to philosopher Quassim Cassam, beliefs are shaped more by our mental habits than by facts. These habits influence how we evaluate evidence, how we relate to authority, and how we respond to the arguments and beliefs of others. Yet this view could itself be harmful, as it pathologizes people and reduces empathy and tolerance towards them.
🔬 Real-world example: Cassam describes conspiracy theorists as gullible and careless in their reasoning, suggesting that their beliefs stem not from reasons but from habits of mind.