Abstracts Int. Workshop Rad. Sciences-Appl. Vienna 2005.doc Piero Danesi 12 January 2005
International Workshop on Radiological Sciences and Applications
March 16-18, Vienna, Austria
Consultant, Arsenal, Object 3/30, A-1300, Vienna, Austria
former Director (1986-2002) of the IAEA's Laboratories at Seibersdorf and Vienna
We live in a complex, technological world, and everything we do, or that is done to us, carries some risk to our health and welfare. Therefore, there is no such thing as zero risk or absolute safety. This holds for all technologies, including the power and non-power applications of nuclear and radiation technology.
In every society there are stimuli that arouse public fears of hypothetical dangers, vague and uncertain as they may be (e.g. collision with asteroids or invasion by extra-terrestrials). Fear arises when a hazard is imaginatively conceived but its reality is undetectable or difficult to measure. Hypothetical fears may arise from idle rumours or from scaremongering hyperbole intended to create public support for a cause. The corresponding risk then becomes a political reality and may dominate public policies and world issues. We can quote as examples ionising radiation, nuclear power, food irradiation, depleted uranium and genetically modified organisms.
Moreover, the fears and associated risks perceived by society have often been the result of illegitimate extrapolations, either of the frequency of recorded rare events or of known exposures to high levels of a given external agent (or stressor) projected down to low levels, in the absence of any convincing experimental or theoretical evidence. Frequently the policy choice for non-detectable risks has gone so far as to outlaw the source, a step sanctified as the "precautionary principle". This forgets that the principle is not necessarily correct, as is well demonstrated by the many external agents that at low concentrations or doses can have beneficial effects (e.g. chemical elements, UV and visible radiation, vitamins). We know that in these cases it is actually the dose which determines whether an agent is a "remedy" or a "poison". Last but not least, the precautionary principle is a retreat, not an answer; carried to the extreme, it maintains the status quo by stopping economic and public health progress.
Radiation fear at low doses delivered at low rates is an example of a minor public health hazard being raised to a major issue by its proponents. It also illustrates that the moral high ground assumed by well-meaning activists may well be socially immoral when evaluated against the welfare of the total population. This can be appreciated when one considers that there are presently countries that consider it acceptable to spend about US $180 million to save one human life by implementing the present radiation protection regulations, yet are reluctant to save lives by improving highway safety, providing school lunches or immunising people in developing countries against common diseases such as measles, diphtheria and pertussis, the latter measures costing about US $50 to $100 per life saved.
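The scale of the disparity quoted above can be made explicit with a line of arithmetic; the sketch below simply uses the two figures given in the text (US $180 million versus US $50-100 per life saved) and is illustrative only.

```python
# Figures taken from the text above: cost per life saved under current
# radiation protection regulations vs. basic public-health measures.
radiation_cost_per_life = 180e6   # US dollars
immunisation_cost_per_life = 100  # US dollars (upper end of the quoted $50-100 range)

# Ratio of the two: how many lives the same money could save elsewhere
ratio = radiation_cost_per_life / immunisation_cost_per_life
print(f"Disparity: about {ratio:,.0f} to 1")  # about 1,800,000 to 1
```

Even taking the upper end of the public-health cost range, the same expenditure differs by roughly six orders of magnitude per life saved.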
In order to make sensible decisions about risks, we need information and, to the extent that available data permit, we must put risks in perspective by assigning numbers that allow them, as far as possible, to be ranked. Quantification also helps to manage and control risk. Scientific principles can be used, in association with a full appreciation of human values, to fulfil the aim of all of us: to gain the greatest possible benefits for mankind from any technology or activity we perform, at the lowest possible cost. This means minimising the risk of unnecessary early death or illness while at the same time maximising the happiness of life.
Although full agreement exists on the mathematical definition of risk (specialists define risk as the product of the probability that an unpleasant event will occur and the consequences that this event will produce), the issue of comparative risk assessment (i.e. a quantitatively meaningful comparison among different risks) has been rather controversial. The reason is that risk has many attributes, and to compare risks rigorously, all factors, circumstances and assumptions that are not explicitly presented in the risk characterisation and quantification should be mutually equivalent. Comparative risk analysis, in spite of its limitations, is one of the best tools to reveal the probable consequences of society's choices among alternatives, as it can help us to allocate resources to reducing those risks which, in a given period of time, can cause the greatest detriment to human society.
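The definition above (risk as the product of probability and consequence) can be sketched in a few lines; the hazards and numbers below are hypothetical, chosen only to show how such a quantification permits a crude comparative ranking of the kind discussed here.

```python
def risk(probability_per_year, consequence):
    """Expected detriment per year: probability of the event times its consequence."""
    return probability_per_year * consequence

# Hypothetical hazards for illustration only:
# (annual probability of the event, lives lost if it occurs)
hazards = {
    "frequent, small-consequence event": risk(1e-1, 2),      # 0.2 expected lives/year
    "rare, large-consequence event":     risk(1e-6, 10_000), # 0.01 expected lives/year
}

# Rank by expected detriment, largest first
for name, r in sorted(hazards.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: expected detriment = {r} lives/year")
```

Note how the frequent small event outranks the rare catastrophe once both are expressed on the same probabilistic scale, which is exactly the point of comparative risk analysis, and also why the many implicit attributes of each risk make such comparisons controversial.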
The objective of this presentation is to discuss the issue of comparative risk assessment by providing factual information not only about radiation risk but also about other technological and common risks we experience in our everyday life. Some aspects of the public perception of nuclear and radiation risks are first described and analysed, taking into account sociological and psychological factors. Elements of a more quantitative, rational and ethical approach are then presented. These are based on probabilities derived from historical mortality rates, epidemiological studies, calculated fault and event trees, and extrapolation of animal experimental data. Using a risk quantification based on probabilities, a comparison among some typical risks is presented.
Finally, the role that assuming a linear relationship between detrimental health effects and radiation dose, even at low doses (the so-called "linear no-threshold" hypothesis), has played in generating radiophobia (the irrational fear that any level of ionising radiation is dangerous) is discussed; some evidence against its validity is examined and some of the associated socio-economic costs are reviewed.