Wrapped in Lew Papers: The psychology of climate psychologization – Part 1

Type three concerns the Continued Influence Effect (CIE). The paper ‘Explicit warnings reduce but do not eliminate the continued influence of misinformation’ by Ecker et al. (hereafter E2010; Lewandowsky is one of the co-authors) neatly explains the CIE in this paragraph:
    For example, H. M. Johnson and Seifert (1994) presented participants with a story about a fictitious warehouse fire, allegedly caused by volatile materials stored carelessly in a closet. Participants were later told that the closet had actually been empty. Although participants later remembered this retraction, they still used the outdated misinformation to make inferences; for example, people might argue that the fire was particularly intense because of the volatile materials or that an insurance claim may be refused due to negligence. H. M. Johnson and Seifert (1994) termed this reliance on misinformation the continued influence effect (CIE). The CIE is robust and occurs in a variety of contexts, regardless of the particular story being presented and regardless of the test applied (Ecker et al., in press; H. M. Johnson & Seifert, 1994, 1998; Wilkes & Reynolds, 1999).
The paper is available here: http://rd.springer.com/content/pdf/10.3758%2FMC.38.8.1087.pdf (you may need to cut and paste this link into your browser; for some reason it doesn’t work directly for me). Ecker works at the ‘cogsci’ cognitive science lab of the University of Western Australia, where Lewandowsky also worked before moving to Bristol University in the UK.

E2010 demonstrates the robustness and persistence of the CIE via reference to various real-world examples, such as reports relating to Weapons of Mass Destruction in the Iraq conflict, reports relating to the alleged link between autism and vaccines, a New York Times article suggesting that China had directly benefitted from a crisis in the US economy, and the quasi-real-world example of laboratory analogues of court proceedings. For example:
    The continued influence of misinformation is also detectable in real-world settings. For example, during the 2003 invasion of Iraq, the public was exposed to countless hints that weapons of mass destruction (WMDs) had been discovered in Iraq. Even though no such report was ever confirmed, these constant hints were powerful enough to engender, in a substantial proportion of the U.S. public, a longstanding belief in the presence of WMDs that has persisted, even after the nonexistence of WMDs became fully evident (Kull, Ramsay, & Lewis, 2003; Lewandowsky, Stritzke, Oberauer, & Morales, 2005). Unconfirmed hints can thus engender false memories in the public (analogous to the “sleep” example presented at the outset) that resist subsequent correction (analogous to the warehouse fire example above).

I don’t intend to explore here the potential mechanisms proposed for the CIE within E2010 or other papers, except to include this summarizing paragraph:
    The CIE typically has been explained by reference to a mental-event model that people build when trying to understand an unfolding event (H. M. Johnson & Seifert, 1994; van Oostendorp, 1996; Wilkes & Leatherbarrow, 1988). On this view, a retraction of central information creates a gap in the model, and—because people are apparently more willing to accept inconsistencies than they are voids in their event model—they continue to rely on misinformation. That is, people prefer to retain some information in crucial model positions (e.g., what caused something to happen or who was involved), even if that information is known to be discredited (H. M. Johnson & Seifert, 1994; van Oostendorp & Bonebakker, 1999).
It is notable that the above quote is directly followed by this:
    Previous efforts [i.e. before the experiments described in this paper] to reduce the CIE have been pursued along various lines, most of which have remained unsuccessful.
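
To make this event-model account more concrete, here is a minimal toy sketch in Python (my own illustration, not code or a model from E2010): an event is stored as named slots, and a retraction that empties a crucial slot without supplying an alternative leaves the discredited filler available for inference.

```python
# Toy illustration of the mental-event-model account of the CIE.
# My own sketch, not a model from E2010 or Johnson & Seifert (1994).

class EventModel:
    """An unfolding event represented as named slots (cause, actors, ...)."""

    def __init__(self):
        self.slots = {}           # slot name -> current filler
        self.discredited = set()  # fillers known to have been retracted

    def encode(self, slot, filler):
        self.slots[slot] = filler

    def retract(self, slot, alternative=None):
        """Retract the filler of a crucial slot.

        On the event-model account, people tolerate an inconsistency (a
        discredited filler) more readily than a void (an empty slot), so
        without an alternative the old filler remains in place.
        """
        self.discredited.add(self.slots[slot])
        if alternative is not None:
            self.slots[slot] = alternative  # the gap is filled; CIE reduced

    def infer(self, slot):
        """Answer an inference question that draws on the given slot."""
        filler = self.slots.get(slot)
        note = " (known to be discredited!)" if filler in self.discredited else ""
        return f"{filler}{note}"


# The warehouse-fire example: retraction alone vs. retraction plus a
# plausible alternative explanation.
model = EventModel()
model.encode("cause", "volatile materials in the closet")
model.retract("cause")
print(model.infer("cause"))  # volatile materials ... (known to be discredited!)
model.retract("cause", alternative="evidence of arson")
print(model.infer("cause"))  # evidence of arson
```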

The CIE would appear to be extremely tenacious, remaining influential even when considerable efforts are made to negate it, for instance via repeated high-profile retractions or corrections of information later found to be wrong. E2010 further states:
    ‘Contrary to the ease with which false memories can be created and true memories altered, the elimination of memories for information that is later revealed to be false—we refer to this as misinformation—has proven to be considerably more difficult. Misinformation continues to affect behavior, even if people explicitly acknowledge that this information has been retracted, invalidated, or corrected (Ecker, Lewandowsky, & Apai, in press; Ecker, Lewandowsky, Swire, & Chang, 2010; Gilbert, Krull, & Malone, 1990; Gilbert, Tafarodi, & Malone, 1993; H. M. Johnson & Seifert, 1994, 1998; Seifert, 2002; van Oostendorp, 1996; van Oostendorp & Bonebakker, 1999; Wilkes & Leatherbarrow, 1988; Wilkes & Reynolds, 1999).’

E2010 describes two modest experiments aimed at combating the CIE, run on 125 and 92 test subjects respectively. The following quote from the paper’s abstract summarizes the results of those experiments:
    The present study investigated whether the continued influence of misinformation can be reduced by explicitly warning people at the outset that they may be misled. A specific warning—giving detailed information about the continued influence effect (CIE)—succeeded in reducing the continued reliance on outdated information but did not eliminate it. A more general warning—reminding people that facts are not always properly checked before information is disseminated—was even less effective. In an additional experiment, a specific warning was combined with the provision of a plausible alternative explanation for the retracted information. This combined manipulation further reduced the CIE but still failed to eliminate it altogether.

So even when subjects are explicitly warned beforehand that such a thing as the CIE exists, and are then also told afterwards that certain information given to them in the experiment was false, along with a clear and plausible alternative explanation for why the original information was wrong, the CIE is still not eliminated; i.e. subjects still displayed some level of belief in the false information they’d received. The mention of fact checking in the above quote is also very important. More generally, it emphasizes that the uncertainties within information, whatever its source, should be clearly communicated, although this has to happen in conjunction with other measures in order to be truly effective against the CIE should the information later prove to be partially or wholly in error. This concept is crucial within the CAGW debate, so we’ll return to it later. A rough sketch of how such an experiment is structured and scored follows below.
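
As a rough illustration only (my own sketch; the actual materials, warning texts, and scoring procedure in E2010 differ in detail), a CIE experiment of this kind crosses a between-subjects warning condition with a retracted report, and the dependent measure is how many open-ended inference answers still invoke the retracted information:

```python
# Rough sketch of the structure of a CIE experiment like those in E2010.
# Conditions, materials, and scoring here are simplified illustrations,
# not the actual protocol of the paper.

from dataclasses import dataclass, field

WARNINGS = ("none", "general", "specific")  # between-subjects factor

@dataclass
class Participant:
    warning: str                                  # which warning (if any) was given first
    answers: list = field(default_factory=list)   # open-ended inference answers

def score_reliance(participant, retracted_phrases):
    """Dependent measure: count inference answers that still invoke the
    retracted (mis)information, despite the explicit correction."""
    return sum(
        any(phrase in answer.lower() for phrase in retracted_phrases)
        for answer in participant.answers
    )

# Example: a participant in the 'specific warning' condition who still
# uses the retracted cause in one of three inference answers.
p = Participant(
    warning="specific",
    answers=[
        "The fire was intense because of the volatile materials.",  # CIE at work
        "The insurance claim may be refused.",
        "The closet was actually empty.",
    ],
)
print(score_reliance(p, retracted_phrases=["volatile materials"]))  # -> 1
```

Comparing mean reliance scores across the warning conditions (and, in the second experiment, with and without a plausible alternative) is what lets the authors conclude that warnings reduce, but do not eliminate, the CIE.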

Next page for more…
