Motivated reasoning (motivational reasoning bias) is a cognitive and social response in which individuals, consciously or unconsciously, allow emotion-laden motivational biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and to reject new information that contradicts those beliefs, even in the face of contrary evidence.

Motivated reasoning overlaps with confirmation bias: both favor evidence that supports one's beliefs while dismissing contradictory evidence. However, confirmation bias is mainly an unconscious (innate) cognitive bias, whereas motivated reasoning (motivational bias) is an unconscious or conscious process in which one's emotions determine which evidence is accepted and which is dismissed. In confirmation bias, the evidence or arguments involved can be logical as well as emotional.

Motivated reasoning can be classified into two categories: 1) accuracy-oriented (non-directional), in which the motive is to arrive at an accurate conclusion, irrespective of the individual's beliefs, and 2) goal-oriented (directional), in which the motive is to arrive at a particular, directional conclusion.

Definitions

Motivated reasoning is a cognitive and social response in which individuals, consciously or unconsciously, allow emotion-laden motivational biases to affect how new information is perceived. Individuals tend to favor arguments that support their current beliefs and to reject new information that contradicts these beliefs.[1]

Motivated reasoning, confirmation bias and cognitive dissonance are closely related to each other.[2]

Both motivated reasoning and confirmation bias favor evidence that supports one's beliefs while dismissing contradictory evidence. Motivated reasoning (motivational bias) is an unconscious or conscious process in which personal emotions determine which evidence is accepted and which is dismissed. Confirmation bias, however, is mainly an unconscious (innate, implicit) cognitive bias, and the evidence or arguments utilised can be logical as well as emotional. More broadly, motivated reasoning may moderate cognitive biases generally, not just confirmation bias.[2]

Individual differences such as political beliefs can moderate the emotional/motivational effect. In addition, the social context (groupthink, peer pressure) partly controls which evidence is utilised for motivated reasoning, particularly in dysfunctional societies. Social context moderates emotions, which in turn moderate beliefs.

Motivated reasoning differs from critical thinking, in which beliefs are assessed with a skeptical but open-minded attitude.

Cognitive dissonance

Individuals are compelled to initiate motivated reasoning to lessen the amount of cognitive dissonance they feel. Cognitive dissonance is the feeling of psychological and physiological stress and uneasiness between two conflicting cognitive and/or emotional elements (such as the desire to smoke while knowing it is unhealthy). According to Leon Festinger, individuals can take two paths to reduce this distress. The first is altering the conflicting behavior or cognition. The second, more common, option is avoiding or discrediting information and/or situations that would create dissonance.[2]

Research studies suggest that reasoning away contradictions is psychologically easier than revising feelings. Emotions tend to color how "facts" are perceived: feelings come first, and evidence is used in service of those feelings. Evidence that supports existing beliefs is accepted; evidence that contradicts them is not.[3]

Mechanisms: Cold and hot cognition

The notion that motives or goals affect reasoning has a long and controversial history in social psychology. This is because supportive research could be reinterpreted in entirely cognitive non-motivational terms (the hot versus cold cognition controversy). This controversy existed because of a failure to explore mechanisms underlying motivated reasoning.[1]

Early research on how humans evaluated and integrated information supported a cognitive approach consistent with Bayesian probability, in which individuals weighed new information using rational calculations (cold cognition).[4] More recent theories endorse these cognitive processes as only partial explanations of motivated reasoning, and have also introduced motivational[1] or affective (emotional) processes (hot cognition).[5]
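The contrast between cold Bayesian updating and directionally motivated updating can be sketched in a toy model. This is purely illustrative: the `discount` parameter and the rule for down-weighting incongruent evidence are assumptions for the sketch, not taken from the cited studies.

```python
def bayesian_update(prior, likelihood_if_true, likelihood_if_false):
    """Cold cognition: standard Bayes' rule, P(H|E) = P(E|H)P(H) / P(E)."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

def motivated_update(prior, likelihood_if_true, likelihood_if_false, discount=0.5):
    """Hot cognition sketch: evidence against the favored belief is
    down-weighted by pulling its likelihood ratio toward 1 (uninformative).
    `discount` is a hypothetical parameter for illustration only."""
    if likelihood_if_true < likelihood_if_false:  # evidence is incongruent
        ratio = (likelihood_if_true / likelihood_if_false) ** discount
        likelihood_if_false = likelihood_if_true / ratio
    return bayesian_update(prior, likelihood_if_true, likelihood_if_false)

# Disconfirming evidence (more likely if the belief is false):
p_cold = bayesian_update(0.9, 0.2, 0.8)
p_hot = motivated_update(0.9, 0.2, 0.8)
print(round(p_cold, 3), round(p_hot, 3))  # the motivated reasoner moves less
```

On these numbers the cold Bayesian drops from 0.9 to about 0.69, while the motivated reasoner only drops to about 0.82, capturing the asymmetry described above.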

Kunda theory

Ziva Kunda reviewed research and developed a theoretical model to explain the mechanism by which motivated reasoning results in bias.[1] Motivation to arrive at a desired conclusion provides a level of arousal, which acts as an initial trigger for the operation of cognitive processes. To participate in motivated reasoning, either consciously or subconsciously, an individual first needs to be motivated. Motivation then affects reasoning by influencing the knowledge structures (beliefs, memories, information) that are accessed and the cognitive processes used.

Lodge-Taber theory

By comparison, Milton Lodge and Charles Taber introduced an empirically supported model in which affect is intricately tied to cognition, and information processing is biased toward support for positions that the individual already holds. Their model has three components.[6]

  1. On-line processing in which, when called on to make an evaluation, people instantly draw on stored information which is marked with affect;
  2. Affect is automatically activated along with the cognitive node to which it is tied;[7]
  3. A "heuristic mechanism" for evaluating new information triggers a reflection: "How do I feel?" about this topic. This process results in a bias towards maintaining existing affect, even in the face of other, disconfirming information.
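As a rough illustration, the three components above can be sketched as follows. The node structure, weights, and linear blend are hypothetical, chosen only to make the mechanism concrete; they are not Lodge and Taber's formal specification.

```python
from dataclasses import dataclass

@dataclass
class Node:
    content: str
    affect: float  # stored affective tag, -1 (negative) .. +1 (positive)

def evaluate(memory, topic, new_evidence_affect, affect_weight=0.8):
    """On-line evaluation: retrieve affect-tagged nodes for the topic (1),
    their affect activates automatically with retrieval (2), and the
    'How do I feel?' heuristic weights existing affect over the new
    evidence (3), biasing toward maintaining the stored evaluation."""
    tagged = [n.affect for n in memory if topic in n.content]
    existing_affect = sum(tagged) / len(tagged) if tagged else 0.0
    return affect_weight * existing_affect + (1 - affect_weight) * new_evidence_affect

memory = [Node("candidate tax plan", +0.9), Node("candidate speech", +0.7)]
# Strongly negative new information barely moves the overall evaluation:
print(evaluate(memory, "candidate", new_evidence_affect=-1.0))
```

With `affect_weight=0.8`, even maximally negative new information leaves the evaluation positive (0.44 here), mirroring the bias toward maintaining existing affect.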

This theory is fully developed and tested in their book The Rationalizing Voter (2013).[8] David Redlawsk (2002) found that the timing of when disconfirming information was introduced played a role in determining bias. When subjects encountered incongruity during an information search, the automatic assimilation and update process was interrupted, resulting in one of two outcomes: subjects could enhance attitude strength in a desire to support existing affect (degrading decision quality and introducing potential bias), or they could counter-argue existing beliefs in an attempt to integrate the new data.[6] The second outcome is consistent with research on how processing occurs under accuracy goals.

To summarize, the two models differ in that Kunda identifies a primary role for cognitive strategies such as memory processes, and the use of rules in determining biased information selection. In contrast, Lodge and Taber identify a primary role for affect in guiding cognitive processes and maintaining bias.

Neuroscience evidence

A neuroimaging study by Drew Westen and colleagues does not support the use of cognitive processes in motivated reasoning, lending greater support to affective processing as a key mechanism in supporting bias. This study, designed to test the neural circuitry of individuals engaged in motivated reasoning, found that motivated reasoning "was not associated with neural activity in regions previously linked with cold reasoning tasks [Bayesian reasoning] nor conscious (explicit) emotion regulation".[9]

This neuroscience data suggests that "motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached."[9] However, if a strong emotion was attached during a previous round of motivated reasoning and that emotion is again present when the individual reaches a conclusion, a strong emotional stake becomes attached to the conclusion itself. Any new information bearing on that conclusion will then trigger motivated reasoning again. This can create pathways within the neural network that further ingrain the individual's reasoned beliefs along networks similar to those where logical reasoning occurs, causing the strong emotion to recur each time contradictory information is encountered. Lodge and Taber refer to this as affective contagion:[8] instead of "infecting" other individuals, the emotion "infects" the individual's reasoning pathways and conclusions.

Categories

Motivated reasoning can be classified into two categories: 1) accuracy-oriented (non-directional), in which the motive is to arrive at an accurate conclusion, irrespective of the individual's beliefs, and 2) goal-oriented (directional), in which the motive is to arrive at a particular, directional conclusion. Politically motivated reasoning, in particular, is strongly directional, being directed towards supporting the individual's conservative or liberal (progressive) beliefs.[1][10]

Despite their differences in information processing, an accuracy-motivated and a goal-motivated individual can reach the same conclusion: both accuracy-oriented and directional-oriented messages can move beliefs in the desired direction.[10] However, the distinction matters for crafting effective communication: those who are accuracy-motivated respond better to credible evidence tailored to the community, while those who are goal-oriented feel less threatened when the issue is framed to fit their identity or values.[11]

Accuracy-oriented (non-directional) motivated reasoning

Several works on accuracy-driven reasoning suggest that when people are motivated to be accurate, they expend more cognitive effort, attend to relevant information more carefully, and process it more deeply, often using more complex rules.

Kunda asserts that accuracy goals delay the process of coming to a premature conclusion, in that accuracy goals increase both the quantity and quality of processing—particularly in leading to more complex inferential cognitive processing procedures. When researchers manipulated test subjects' motivation to be accurate by informing them that the target task was highly important or that they would be expected to defend their judgments, subjects utilized deeper processing and showed less biasing of information. This was true when accuracy motives were present at the initial processing and encoding of information.[12][13] In reviewing a line of research on accuracy goals and bias, Kunda concludes, "several different kinds of biases have been shown to weaken in the presence of accuracy goals".[1] However, accuracy goals do not always eliminate biases and improve reasoning. Some biases (such as those resulting from the availability heuristic) may be resistant to accuracy manipulations. For accuracy goals to reduce bias, the following conditions must be present:

  1. Subjects must possess appropriate reasoning strategies.
  2. They must view these as superior to other strategies.
  3. They must be capable of using these strategies at will.

However, these last two conditions imply that accuracy goals involve a conscious process of utilizing cognitive strategies in motivated reasoning. This construct is called into question by neuroscience research concluding that motivated reasoning is qualitatively distinct from reasoning in instances when there is no strong emotional stake in the outcomes.[9] Even accuracy-oriented individuals, who are thought to use "objective" processing, can vary in how they update information depending on how much faith they place in a given piece of evidence and on their ability to detect misinformation; failures here can lead to beliefs that diverge from the scientific consensus.[11]

Goal-oriented (directional) motivated reasoning

Directional goals enhance the accessibility of knowledge structures (memories, beliefs, information) that are consistent with desired conclusions. According to Kunda, such goals can lead to biased memory search and biased belief-construction mechanisms.[1] Several studies support the effect of directional goals on the selection and construction of beliefs about oneself, other people and the world.

Cognitive dissonance research provides extensive evidence that people may bias their self-characterizations when motivated to do so. For example, in one study, subjects induced to believe that extroversion was beneficial came to view themselves as more extroverted. Other biases, such as confirmation bias, the prior-attitude effect and disconfirmation bias, can also contribute to goal-oriented motivated reasoning.[11]

Michael Thaler, a professor at Princeton University, conducted a study and found that men are more likely than women to demonstrate performance-motivated reasoning, owing to a gender gap in beliefs about personal performance.[14] This led him to ask whether men are more susceptible to motivated reasoning or whether women process information in a more Bayesian manner. A second study led to the conclusion that both men and women are susceptible to motivated reasoning, but that certain motivated beliefs can differ by gender.[14]

The motivation to achieve directional goals can also influence which rules (procedural structures, such as inferential rules) are accessed to guide the search for information. Studies also suggest that the evaluation of scientific evidence may be biased by whether its conclusions are in line with the reader's beliefs.

Even in goal-oriented motivated reasoning, however, people are not at liberty to conclude whatever they want merely because they want it.[1] People tend to draw a desired conclusion only if they can muster supportive evidence: they search memory for beliefs and rules that could support the conclusion, or construct new beliefs to logically support their desired goals.

Case studies

Smoking

When an individual is trying to quit smoking, they might engage in motivated reasoning to convince themselves to keep smoking. They might focus on information that makes smoking seem less harmful while discrediting evidence that emphasizes the dangers of the behavior. Individuals in situations like this are driven to initiate motivated reasoning to lessen the cognitive dissonance they feel. This can make it harder to quit and lead to continued smoking, even though they know it is not good for their health.[15]

Political bias

Peter Ditto and his students conducted a meta-analysis in 2018 of studies relating to political bias.[16] Their aim was to assess which U.S. political orientation (left/liberal or right/conservative) was more biased and initiated more motivated reasoning. They found that both political orientations are susceptible to bias to the same extent.[16] The analysis was disputed by Jonathan Baron and John Jost,[17] to whom Ditto and colleagues responded.[18] Reviewing the debate, Stuart Vyse concluded that the answer to the question of whether U.S. liberals or conservatives are more biased is: "We don't know."[19]

On April 22, 2011, The New York Times published a series of articles attempting to explain the Barack Obama citizenship conspiracy theories. One of these articles, by political scientist David Redlawsk, explained these "birther" conspiracies as an example of politically motivated reasoning.[20] U.S. presidents are constitutionally required to be natural-born citizens. Despite ample evidence that President Barack Obama was born in the U.S. state of Hawaii, many people continued to believe that he was not born in the U.S., and therefore that he was an illegitimate president.[20] Similarly, many people believed he was a Muslim (as was his father), despite ample lifetime evidence of his Christian beliefs and practice (as was true of his mother).[20] Subsequent research by others suggested that political partisan identity was more important for motivating "birther" beliefs than for some other conspiracy beliefs, such as 9/11 conspiracy theories.[21]

Climate change


Despite a scientific consensus on climate change, citizens are divided on the topic, particularly along political lines.[22] A significant segment of the American public has fixed beliefs, either because they are not politically engaged, or because they hold strong beliefs that are unlikely to change. Liberals and progressives generally believe, based on extensive evidence, that human activity is the main driver of climate change. By contrast, conservatives are generally much less likely to hold this belief, and a subset believes that there is no human involvement and that the reported evidence is faulty (or even fraudulent). A prominent explanation is political directional motivated reasoning, in that conservatives are more likely to reject new evidence that contradicts their long-established beliefs. In addition, some highly directional climate deniers not only discredit scientific information on human-induced climate change but also seek contrary evidence, leading to a posterior belief of greater denial.[23][11]

A study by Robin Bayes and colleagues of the human-induced climate change views of 1,960 Republicans found that both accuracy and directional motives move in the desired direction, but only in the presence of politically motivated messages congruent with the induced beliefs.[10]

Social media

Social media is used for many purposes and as a means of spreading opinions. It is a primary place people go to get information, and much of that information is opinionated or biased. This matters for motivated reasoning because of how such content spreads: "However, motivated reasoning suggests that informational uses of social media are conditioned by various social and cultural ways of thinking".[24] Because all manner of ideas and opinions are shared, motivated reasoning and biases can easily come into play when people search for answers or facts online or in any news source.

COVID-19

In the context of the COVID-19 pandemic, people who refuse to wear masks or get vaccinated may engage in motivated reasoning to justify their beliefs and actions. They may reject scientific evidence that supports mask-wearing and vaccination and instead seek out information that supports their pre-existing beliefs, such as conspiracy theories or misinformation. This can lead to behaviors that are harmful to both themselves and others.[25]

In a 2020 study, Van Bavel and colleagues explored the concept of motivated reasoning as a contributor to the spread of misinformation and resistance to public health measures during the COVID-19 pandemic. Their results indicated that people often engage in motivated reasoning when processing information about the pandemic, interpreting it to confirm their pre-existing beliefs and values.[26] The authors argue that addressing motivated reasoning is critical to promoting effective public health messaging and reducing the spread of misinformation. They suggested several strategies, such as reframing public health messages to align with individuals' values and beliefs, using trusted sources to convey information, and creating social norms that support public health behaviors.[26]

Outcomes and tackling strategies

The outcomes of motivated reasoning derive from "a biased set of cognitive processes—that is, strategies for accessing, constructing, and evaluating beliefs. The motivation to be accurate enhances use of those beliefs and strategies that are considered most appropriate, whereas the motivation to arrive at particular conclusions enhances use of those that are considered most likely to yield the desired conclusion."[1] Careful or "reflective" reasoning has been linked to both overcoming and reinforcing motivated reasoning, suggesting that reflection is not a panacea but a tool that can be used for rational or irrational purposes depending on other factors.[27] For example, when people are presented with and forced to think analytically about something complex that they lack adequate knowledge of (e.g., a new study on meteorology, for someone with no training in the subject), there is no directional shift in thinking, and their existing conclusions are more likely to be supported with motivated reasoning. Conversely, if they are presented with a simpler test of analytical thinking that confronts their beliefs (e.g., recognizing implausible headlines as false), motivated reasoning is less likely to occur and a directional shift in thinking may result.[28]

Hostile media effect

Research on motivated reasoning tested accuracy goals (i.e., reaching correct conclusions) and directional goals (i.e., reaching preferred conclusions). Factors such as these affect perceptions, and results confirm that motivated reasoning affects decision-making and estimates.[29] These results have far-reaching consequences: when confronted with a small amount of information contrary to an established belief, an individual is motivated to reason away the new information, contributing to a hostile media effect.[30] If this pattern continues over an extended period of time, the individual becomes more entrenched in their beliefs.

Tipping point

However, recent studies have shown that motivated reasoning can be overcome. "When the amount of incongruency is relatively small, the heightened negative affect does not necessarily override the motivation to maintain [belief]." However, there is evidence of a theoretical "tipping point", where the amount of incongruent information received by the motivated reasoner can turn certainty into anxiety. This anxiety about being incorrect may lead to a change of opinion for the better.[3]
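The tipping-point dynamic can be made concrete with a toy simulation. The parameters (bolstering step, anxiety threshold, concession step) are assumptions for illustration, not estimates from Redlawsk and colleagues' data:

```python
def updated_confidence(confidence, incongruent_items, bolster=0.02,
                       anxiety_threshold=5, concede=0.15):
    """Below the threshold, each incongruent item slightly *strengthens*
    the belief (it is reasoned away, bolstering existing affect); past
    the threshold, anxiety drives genuine belief revision."""
    for i in range(incongruent_items):
        if i < anxiety_threshold:
            confidence = min(1.0, confidence + bolster)   # reason it away
        else:
            confidence = max(0.0, confidence - concede)   # anxious revision
    return confidence

print(updated_confidence(0.8, 3))   # small incongruency: belief strengthens
print(updated_confidence(0.8, 10))  # past the tipping point: belief erodes
```

The non-monotonic pattern — confidence rising under small doses of disconfirmation and collapsing only past a threshold — is the qualitative signature the tipping-point account describes.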

References

  1. Kunda, Z. (1990). "The case for motivated reasoning". Psychological Bulletin. 108 (3): 480–498. doi:10.1037/0033-2909.108.3.480. PMID 2270237. S2CID 9703661.
  2. Stone, Daniel F.; Wood, Daniel H. (2018). "Cognitive dissonance, motivated reasoning, and confirmation bias: Applications in industrial organization". Handbook of Behavioral Industrial Organization. doi:10.4337/9781784718985.00011. ISBN 9781784718985.
  3. Redlawsk, D. P.; Civettini, A. J. W.; Emmerson, K. M. (2010). "The affective tipping point: Do motivated reasoners ever "get it"?". Political Psychology. 31 (4): 563. doi:10.1111/j.1467-9221.2010.00772.x.
  4. Gerber, Alan; Green, Donald (1999). "Misperceptions about perceptual bias". Annual Review of Political Science. 2: 189–210. doi:10.1146/annurev.polisci.2.1.189.
  5. Redlawsk, D (2002). "Hot cognition or cool consideration". The Journal of Politics. 64 (4): 1021–1044. doi:10.1111/1468-2508.00161. S2CID 15895299.
  6. Lodge, M.; Taber, C. (2000). "Three steps toward a theory of motivated political reasoning". Elements of reason: Cognition, choice and bounds of rationality. Cambridge University Press. ISBN 978-0-521-65329-9.
  7. Fazio, Russell (1995). Richard E. Petty; Jon A. Krosnick (eds.). Attitude strength: Antecedents and consequences. Mahwah, NJ. ISBN 978-0-8058-1086-8.
  8. Lodge, Milton; Taber, Charles (2013). The rationalizing voter. New York: Cambridge University Press.
  9. Westen, D.; Blagov, P. S.; Harenski, K.; Kilts, C.; Hamann, S. (2006). "Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential Election". Journal of Cognitive Neuroscience. 18 (11): 1947–1958. CiteSeerX 10.1.1.578.8097. doi:10.1162/jocn.2006.18.11.1947. PMID 17069484. S2CID 8625992.
  10. Bayes, Robin; Druckman, James N.; Goods, Avery; Molden, Daniel C. (2020). "When and how different motives can drive motivated political reasoning". Political Psychology. 41 (5): 1031–1052. doi:10.1111/pops.12663. S2CID 218501034.
  11. Druckman, James N.; McGrath, Mary C. (February 2019). "The evidence for motivated reasoning in climate change preference formation". Nature Climate Change. 9 (2): 111–119. Bibcode:2019NatCC...9..111D. doi:10.1038/s41558-018-0360-1. ISSN 1758-678X. S2CID 85511178.
  12. Tetlock, P. (1983). "Accountability and the perseverance of first impressions". Social Psychology Quarterly. 46 (4): 285–292. doi:10.2307/3033716. JSTOR 3033716.
  13. Tetlock, P. (1985). "Accountability: A social check on the fundamental attribution error". Social Psychology Quarterly. 48 (3): 227–236. doi:10.2307/3033683. JSTOR 3033683.
  14. Thaler, Michael (November 2021). "Gender differences in motivated reasoning". Journal of Economic Behavior & Organization. 191: 501–518. arXiv:2012.01538. doi:10.1016/j.jebo.2021.09.016. ISSN 0167-2681. S2CID 227254126.
  15. Fotuhi, Omid; Fong, Geoffrey T.; Zanna, Mark P.; Borland, Ron; Yong, Hua-Hie; Cummings, K Michael (2013). "Patterns of cognitive dissonance-reducing beliefs among smokers: A longitudinal analysis from the International Tobacco Control (ITC) Four Country Survey". Tobacco Control. 22 (1): 52–58. doi:10.1136/tobaccocontrol-2011-050139. PMC 4009366. PMID 22218426.
  16. Ditto, Peter H.; Liu, Brittany S.; Clark, Cory J.; Wojcik, Sean P.; Chen, Eric E.; Grady, Rebecca H.; Celniker, Jared B.; Zinger, Joanne F. (March 2019). "At least bias is bipartisan: a meta-analytic comparison of partisan bias in liberals and conservatives" (PDF). Perspectives on Psychological Science. 14 (2): 273–291. doi:10.1177/1745691617746796. PMID 29851554. S2CID 46921775.
  17. Baron, Jonathan; Jost, John T. (March 2019). "False equivalence: are liberals and conservatives in the United States equally biased?" (PDF). Perspectives on Psychological Science. 14 (2): 292–303. doi:10.1177/1745691618788876. PMID 30836901. S2CID 73513928.
  18. Ditto, Peter H.; Clark, Cory J.; Liu, Brittany S.; Wojcik, Sean P.; Chen, Eric E.; Grady, Rebecca H.; Celniker, Jared B.; Zinger, Joanne F. (March 2019). "Partisan bias and its discontents" (PDF). Perspectives on Psychological Science. 14 (2): 304–316. doi:10.1177/1745691618817753. PMID 30836902. S2CID 73506194.
  19. Vyse, Stuart (19 March 2019). "Who are more biased: Liberals or conservatives?". Skeptical Inquirer. Retrieved 2023-04-15.
  20. Redlawsk, David P. (April 22, 2011). "A matter of motivated reasoning". The New York Times. See also: Pasek, Josh; Stark, Tobias H.; Krosnick, Jon A.; Tompson, Trevor (December 2015). "What motivates a conspiracy theory? Birther beliefs, partisanship, liberal–conservative ideology, and anti-Black attitudes". Electoral Studies. 40: 482–489. doi:10.1016/j.electstud.2014.09.009. hdl:1874/329366.
  21. Enders, Adam M.; Smallpage, Steven M.; Lupton, Robert N. (July 2020). "Are all 'birthers' conspiracy theorists? On the relationship between conspiratorial thinking and political orientations". British Journal of Political Science. 50 (3): 849–866. doi:10.1017/S0007123417000837. S2CID 149762298.
  22. Trumbo, Craig (July 1996). "Constructing climate change: claims and frames in US news coverage of an environmental issue". Public Understanding of Science. 5 (3): 269–283. doi:10.1088/0963-6625/5/3/006. ISSN 0963-6625. S2CID 147326292.
  23. Brulle, Robert J.; Carmichael, Jason; Jenkins, J. Craig (September 2012). "Shifting public opinion on climate change: An empirical assessment of factors influencing concern over climate change in the U.S., 2002–2010". Climatic Change. 114 (2): 169–188. Bibcode:2012ClCh..114..169B. doi:10.1007/s10584-012-0403-y. hdl:10.1007/s10584-012-0403-y. S2CID 8220644.
  24. Diehl, Trevor; Huber, Brigitte; Gil de Zúñiga, Homero; Liu, James (17 August 2021). "Social media and beliefs about climate change: A cross-national analysis of news use, political ideology, and trust in science". International Journal of Public Opinion Research. 33 (2): 197–213. doi:10.1093/ijpor/edz040.
  25. Sylvester, Steven M. (2021). "COVID‐19 and Motivated Reasoning: The Influence of Knowledge on COVID‐Related Policy and Health Behavior". Social Science Quarterly. 102 (5): 2341–2359. doi:10.1111/ssqu.12989. PMC 8242725. PMID 34226771.
  26. Bavel, Jay J. Van; Baicker, Katherine; Boggio, Paulo S.; Capraro, Valerio; Cichocka, Aleksandra; Cikara, Mina; Crockett, Molly J.; Crum, Alia J.; Douglas, Karen M.; Druckman, James N.; Drury, John; Dube, Oeindrila; Ellemers, Naomi; Finkel, Eli J.; Fowler, James H. (May 2020). "Using social and behavioural science to support COVID-19 pandemic response". Nature Human Behaviour. 4 (5): 460–471. doi:10.1038/s41562-020-0884-z. hdl:1721.1/125045. ISSN 2397-3374. PMID 32355299. S2CID 217166892.
  27. Byrd, Nick (2022). "Bounded reflectivism and epistemic identity". Metaphilosophy. 53: 53–69. doi:10.1111/meta.12534. S2CID 245477351.
  28. Pennycook, Gordon; Rand, David G. (July 2019). "Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning". Cognition. 188: 39–50. doi:10.1016/j.cognition.2018.06.011. PMID 29935897. S2CID 49418191.
  29. Nir, Lilach (2011). "Motivated reasoning and public opinion perception". Public Opinion Quarterly. 75 (3): 504–532. doi:10.1093/poq/nfq076.
  30. Tsang, Stephanie Jean (1 August 2018). "Empathy and the hostile media phenomenon". Journal of Communication. 68 (4): 809–829. doi:10.1093/joc/jqy031.
