Communicating risk information

What negative consequences may occur when risk information is communicated in the "real world", and how can insights from cognitive psychology help?

What is risk? Ahmed, Naik, Willoughby and Edwards (2012) supplied a definition, stating simply that risk is the probability that something unsafe will cause harm in the future. What, then, is risk communication? Because the literature on this topic is vast, a good starting point is a remark by Benjamin Franklin (1789), who wrote in a letter: "Our Constitution is in actual operation; everything appears to promise that it will last; but in this world nothing is certain but death and taxes." Gerd Gigerenzer (2002) highlighted the last part ("Nothing is certain but death and taxes") and discussed the uncertainty that the public must live with: Franklin acknowledged the risk, but he offered no guidance on how to manage it. This paper concentrates on how people communicate risk information in the "real world", what negative consequences this can have for an individual, and whether cognitive psychology could improve this exchange of information.

Ahmed et al. (2012) went on to explain that risk communication is a tool that aims to help people understand a specific piece of information (for instance, a diagnosis of disease) so that the patient can make a better decision about his health. In many cases, however, risk information is transmitted poorly: either the information is correct but explained too ambiguously, or the individual prefers not to ask further questions. The negative consequences of this can include anxiety and depression on receiving such sensitive news. Risk communication can be found everywhere, from a news report ("the terrorism threat level is orange") to medicine ("the risk of a heart attack is 15%"). But a question emerges: how can a person understand the term "orange", or the figure of 15%, without background information? The risk might be interpreted as low because orange is not red, or because 15% seems far from 100%. The patient may then fall back on a basic intuition: if it only happened to other people, it will not happen to me. He decides not to ask further questions, yet the anxiety can stay in his mind: what if it does happen to me? That is the main purpose of this paper: to analyse how risk information is transmitted across different areas and how it is perceived by individuals. The public interpret risk according to their own understanding and background, so the crucial thing is awareness of how risk is received in people's minds.

Moreover, uncertainty is an unpleasant feeling that everyone experiences several times in a lifetime. Brashers (2001) approached this subject and highlighted the emotions one feels when a decision must be made without enough information. In the example above, where the risk was communicated as a percentage, the individual told that "the risk of having a heart attack is 15%" may struggle with the diagnosis. The uncertainty is there. What does 15% mean? Is another heart attack likely? Or is it unlikely, simply because the percentage is only 15? Both readings can seem equally correct, or equally ambiguous, because the individual has not received enough information beforehand. He will convince himself of a particular outcome, which often will not be correct. Nonetheless, some research suggests there is little evidence of how such uncertainty actually affects the public (Johnson & Slovic, 1994).

In relation to the points above, one concern in how risk is communicated is the use of single-event probabilities expressed as percentages. Gigerenzer and Hoffrage (1995) opened this debate by suggesting that this way of reporting sensitive information be replaced with natural frequencies. To see why, imagine that people possess cognitive algorithms that carry out statistical inferences, and that these algorithms were not designed to work with percentages. As humanity evolved, a person may have come to comprehend information more naturally when it is presented as frequencies of concrete events, based on previous experience. According to Gallistel (1990), even animals show a strong tendency to register changes in frequency distributions in their environment.
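The difference between the two formats can be made concrete with a short sketch. The numbers below are illustrative assumptions, not figures from Gigerenzer and Hoffrage (1995); the point is that Bayes' rule on probabilities and a simple ratio of natural-frequency counts give the same answer, but the second is far easier to follow.

```python
# Illustrative numbers (assumed, not from the cited paper).
base_rate = 0.01        # P(disease) in the population
sensitivity = 0.80      # P(positive test | disease)
false_positive = 0.10   # P(positive test | no disease)

# Probability format: Bayes' rule -- the computation most people find opaque.
p_pos = base_rate * sensitivity + (1 - base_rate) * false_positive
p_disease_given_pos = base_rate * sensitivity / p_pos

# Natural-frequency format: the same facts as counts in a sample of 1,000.
n = 1000
sick = round(n * base_rate)                       # 10 people have the disease
sick_pos = round(sick * sensitivity)              # 8 of them test positive
healthy_pos = round((n - sick) * false_positive)  # 99 healthy people also test positive

# The answer is now a plain ratio of counts: positives who are actually sick.
freq_answer = sick_pos / (sick_pos + healthy_pos)

print(f"Probability format: {p_disease_given_pos:.3f}")
print(f"Frequency format:   {freq_answer:.3f}  ({sick_pos} out of {sick_pos + healthy_pos})")
```

Both routes yield roughly 0.075: of the 107 people who test positive, only 8 are sick, which is immediate in the frequency version but hidden inside the formula in the probability version.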

Consequently, the same point can be applied to humans. A good example of this argument was offered by Gigerenzer (2002) in his book, in a story called "Prozac's Side Effects", which begins with a psychiatrist friend of the author. Whenever a patient received a prescription for Prozac, the psychiatrist informed him that he had "a 30 to 50 percent chance of developing a sexual problem", such as impotence or lack of sexual interest. The curious thing was that no patient asked for an explanation of the side effects, even though the psychiatrist noticed their raised level of anxiety. The problem to be solved was the way the risk information was being perceived; the psychiatrist realised this after reading Gigerenzer's book and changed his way of approaching patients. What was the trouble? When he told patients that they had "a 30 to 50 percent chance of developing a sexual problem", most of them believed that in 30 to 50 percent of their own sexual encounters something would go wrong. Even though the numbers were correct, psychologically they were seen in a different light. Once the psychiatrist replaced the percentages with natural frequencies ("out of every 10 patients, 3 to 5 will experience a sexual problem"), the patients' anxiety decreased substantially, and they began to ask questions about their condition, such as what would happen if they were among the 3 to 5. The moral of the story is that even when an experienced doctor gives correct information to his patient, the reference class can be left out. The negative consequence of communicating risk as a bare percentage was that each individual was left to invent a reference class for himself. With natural frequencies, a patient no longer needs to wonder whether the figure refers to his own sexual encounters.
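The psychiatrist's reframing amounts to a mechanical translation: attach an explicit reference class and convert the percentage range to counts. The helper below is a hypothetical sketch of that translation, not anything from the book; the function name and wording are invented for illustration.

```python
def as_natural_frequency(pct_low, pct_high, reference_class, out_of=10):
    """Restate an 'X to Y percent chance' as counts out of an explicit reference class."""
    low = round(out_of * pct_low / 100)    # e.g. 30% of 10 -> 3
    high = round(out_of * pct_high / 100)  # e.g. 50% of 10 -> 5
    return (f"Out of {out_of} {reference_class}, "
            f"{low} to {high} will experience this side effect.")

# "a 30 to 50 percent chance of developing a sexual problem" becomes:
print(as_natural_frequency(30, 50, "patients who take this drug"))
# -> Out of 10 patients who take this drug, 3 to 5 will experience this side effect.
```

The key design point is the `reference_class` argument: the translation forces the speaker to state who the 3 to 5 are, which is exactly the information the percentage version omitted.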

When talking about uncertainty, another major concern is the illusion of certainty. Gigerenzer (2002) concentrated on this tendency to settle on an answer that may in reality be wrong. When the mind encounters uncertainty, it tends to replace it with certainty. This bears directly on the communication of risk and its negative consequences, as when a person struggles with a diagnosis that is perceived at a cognitive level as definitely true. In real life, when an individual has to deal with a sensitive and ambiguous statement, the mind steps in: the perceptual system tries to make sense of the information, and the result is accepted as a settled fact.

Another case presented by Gigerenzer (2002) is "Susan's nightmare". Susan received a positive HIV diagnosis after she had used drugs. The diagnosis cost her her job and her home, and she even distanced herself from her child to protect him. Several months later she visited the doctor with bronchitis, and was afterwards asked to retest her blood for HIV. Imagine her surprise when the test came back negative. How could this be possible? Apparently, Susan's blood sample had been switched with another patient's. This created an illusion of certainty twice over: the certainty that Susan was infected, and the certainty that the other patient was healthy. Susan's result was a false positive, and the problem was that no doctor told her that there were two tests (the ELISA and the Western blot) or that errors might happen. The outcome was treated as 100% accurate: once a test came back positive, it was taken to mean she had the disease, regardless of whether the other test would have given a different answer. The consequences of the doctors' conduct changed Susan's life for nine months and nearly destroyed her. Of course, other scenarios are possible, such as both tests showing a negative result when she was in fact infected. But whatever the risk, it was the doctors' duty to address these issues and explain the whole situation to Susan.
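The arithmetic behind Susan's story shows why "100% accurate" is an illusion even before clerical errors like switched samples are considered. The sketch below uses assumed illustrative numbers, not the actual figures of Susan's case or of any specific HIV test: for a rare condition, even a highly accurate test leaves a positive result far from certain.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(infected | positive test) by Bayes' rule."""
    true_pos = prevalence * sensitivity            # infected people who test positive
    false_pos = (1 - prevalence) * (1 - specificity)  # healthy people who test positive
    return true_pos / (true_pos + false_pos)

# Assumed numbers for a low-risk group: 1 in 10,000 infected,
# test sensitivity 99.8%, specificity 99.99%.
ppv = positive_predictive_value(prevalence=0.0001,
                                sensitivity=0.998,
                                specificity=0.9999)
print(f"P(infected | positive) = {ppv:.2f}")  # roughly 0.50, not 1.00
```

Under these assumptions, a positive result means only about a coin-flip's chance of infection, and that is before adding the risk of a mislabelled sample, which no formula about the test itself can capture. Communicating this, rather than presenting the result as certain, was precisely what Susan's doctors failed to do.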

To mitigate the consequences of poor risk communication, some cognitive psychologists have tried to address people's uncertainty through the 'cognitive theory of choice' (Noll & Krier, 1990), developed with the sole purpose of helping people understand the risks they must confront every day. On this theory, people rely on mental shortcuts every day, and these shortcuts are, most of the time, what prevents them from handling probabilities when assessing the risks they face. The theory tries to analyse how a person might respond to risk regulation, even though the source of mistakes cannot always be identified: every person is different and perceives the environment differently. Still, such a theory may help improve how the public understands risk communication and how risks ought to be minimised.

To conclude, risk information can be presented in many ways. From the illusion of certainty to the difficult task of making people understand percentages, the communication of risk poses a significant challenge for organisations. Hardman (2009) addressed the reality people live in, arguing that for people to comprehend the risks that surround them, it is better to accept the public's imperfect perception and treat it as an opportunity for improvement. Here the 'cognitive theory of choice' (Noll & Krier, 1990) steps in, with the hope of reducing the uncertainty that an individual must live with in the 'real world'. Even so, these habits of dealing with risk will be hard to erase from people's minds.


Ahmed, H., Naik, G., Willoughby, H., & Edwards, A. G. (2012). Communicating risk. BMJ, 344, e3996.

Brashers, D. E. (2001). Communication and uncertainty management. Journal of Communication, 51(3), 477-497.

Gigerenzer, G. (2002). Reckoning with risk. London: Penguin.

Gigerenzer, G., & Hoffrage, U. (1995). How to improve Bayesian reasoning without instruction: Frequency formats. Psychological Review, 102, 684-704.

Hardman, D. (2009). Judgment and decision making. Malden, MA: Blackwell.

Johnson, B. B., & Slovic, P. (1994). "Improving" risk communication and risk management: Legislated solutions or legislated disasters? Risk Analysis, 14(6), 905-906.

Johnson, B. B., & Slovic, P. (1995). Presenting uncertainty in health risk assessment: Initial studies of its effects on risk perception and trust. Risk Analysis, 15(4), 485-494.

Noll, R. G., & Krier, J. E. (1990). Some implications of cognitive psychology for risk regulation. The Journal of Legal Studies, 19(S2), 747-779.