Regulatory Limits on Radiation Dose


Safety Limits: What are they? How are they determined?

Much of the discussion concerning radiation levels and radioactive material releases has been presented in the context of safety limits set by a regulator. Examples of such limits include the I-131 limit for drinking water (210 Bq/L) or an annual occupational radiation dose limit (0.05 Sv). What is often left out of these discussions is how these limits were determined and what exceeding a limit implies. This post is intended to provide a general description of the implications of safety limits.

What is a Safety Limit and how are Safety Limits determined?

Safety limits are designed to protect the public from a potential harm and are often set well below the point of actual danger so that this point is not reached accidentally. Safety limits are determined in two steps. First, regulators identify the amount of exposure to a given agent above which a health effect is observed. This amount is determined for the most vulnerable members of the population and considers the effects of both short- and long-term exposure. The resulting number is then divided by a safety factor to ensure that the public is never exposed to dangerous levels. The safety factor exists so that, if for whatever reason a safety limit is exceeded, the regulator has time to fix the problem before exposures reach a point that could harm the public. The more uncertain the dividing line between safety and harm, the larger the safety factor used to protect the public.
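As a rough illustration of the two-step process just described, the sketch below divides a hypothetical exposure threshold by a safety factor. The numbers and names are placeholders chosen for illustration, not values from any regulation.

```python
# Illustrative sketch of the two-step derivation described above; the exposure
# threshold and safety factor are hypothetical numbers, not regulatory values.

def derive_safety_limit(observed_effect_threshold, safety_factor):
    """Divide the lowest exposure at which harm is observed by a safety factor."""
    return observed_effect_threshold / safety_factor

# Hypothetical example: harm first observed at 1000 units of exposure,
# with a safety factor of 100 chosen to reflect uncertainty.
limit = derive_safety_limit(observed_effect_threshold=1000, safety_factor=100)
print(f"Regulatory limit: {limit} units")  # -> Regulatory limit: 10.0 units
```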

Key Principles of Radiation Protection at Low Radiation Exposure

The probabilistic nature of low-dose radiation health effects makes it impossible to draw a clear distinction between ‘safe’ and ‘dangerous’ levels of radiation. This also creates difficulties in explaining the control of radiation risks. The major policy implication is that some finite risk, however small, must be assumed, and a level of protection established based on what is deemed acceptable. This leads to a system of protection based on three key principles recognized by the International Commission on Radiological Protection (ICRP) and endorsed by the US National Council on Radiation Protection and Measurements (NCRP) and all other national agencies:

 

– Principle of Justification, based on the analysis of benefit versus risk of exposure;

– Principle of Optimization of Exposure, based on the ALARA (As Low As Reasonably Achievable) principle;

– Principle of Limitation of exposure to any person.

The ICRP, in its latest Recommendations on Radiological Protection, stated that for radiation doses below around 100 mSv in a year, the increase in the incidence of stochastic effects is assumed to occur with a small probability and in proportion to the increase in radiation dose over the background dose. The use of this so-called linear no-threshold (LNT) model is considered by the ICRP and the NCRP to be the best practical approach to managing risk from radiation exposure and consistent with the precautionary principle, providing a prudent basis for radiological protection at low doses and low dose rates. However, uncertainties about the possible over-conservatism of this judgment are recognized by the ICRP and the NCRP, which have stated the need for further evaluation based on new research results.
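The LNT assumption amounts to a simple proportionality: excess risk ≈ risk coefficient × excess dose. The sketch below uses the ICRP 103 nominal whole-population risk coefficient of about 5.5% per sievert as an indicative value; it is a population-level planning figure, not a prediction for any individual.

```python
# Minimal sketch of the linear no-threshold (LNT) assumption: excess risk of
# stochastic effects is taken to scale linearly with dose above background,
# with no threshold. The ~5.5e-2 per Sv coefficient is the ICRP 103 nominal
# whole-population detriment value; treat it as indicative only.

RISK_COEFFICIENT_PER_SV = 5.5e-2  # nominal detriment-adjusted risk per sievert

def lnt_excess_risk(excess_dose_sv):
    """Excess lifetime risk of stochastic effects under the LNT assumption."""
    return RISK_COEFFICIENT_PER_SV * excess_dose_sv

# Example: an extra 10 mSv (0.01 Sv) above background.
print(lnt_excess_risk(0.01))  # ~5.5e-4, i.e. roughly a 0.06% excess risk
```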

Despite the fact that the actual onset of latent cancer and other long-term effects in relation to radiation exposure is unknown, we do know that those effects are not statistically significant at very low doses. In simpler terms, the number of cancers caused by exposure to low doses of radiation is so small that it cannot be separated from the noise – the natural rate of cancer incidence.

In 1980, the US National Council on Radiation Protection and Measurements (NCRP) published a report examining and quantifying the dose-rate effect. In examining all laboratory data on tumor induction published at that time, they found that lowering the dose rate from acute (e.g., 180 mSv/hr) to about 4.8 mSv/hr reduced the rate of tumor induction by an average factor of 4. They called this the ‘dose rate effectiveness factor’ (DREF). When the irradiations were much longer term, comprising “a significant or sizeable fraction of the life span,” an even larger reduction in effect was observed, an average factor of 10; this was called the ‘protraction factor’ (PF). With few exceptions, the dose rates used in the laboratory studies cited in NCRP 64 were ‘low dose rates’ at least a factor of 4000 times higher than normal background dose rates. It is the results of these experiments and others like them, plus corresponding safety factors, that are used to establish regulatory limits on dose and dose rate to the general public.
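These factors enter a risk estimate by simple division, as sketched below. The factors of 4 and 10 are the averages quoted above; the acute risk slope itself is a made-up placeholder, not a value from NCRP 64.

```python
# Sketch of how the NCRP dose-rate effectiveness factor (DREF) and protraction
# factor (PF) are applied: a risk slope derived from acute, high-dose-rate data
# is divided by the relevant factor. The factors of 4 and 10 are the averages
# quoted above; the acute risk estimate itself is a hypothetical placeholder.

DREF = 4   # average reduction when the dose rate drops to ~4.8 mSv/hr
PF = 10    # average reduction when exposure spans much of the lifespan

def adjusted_risk(acute_risk_estimate, factor):
    """Scale an acute-exposure risk estimate down by DREF or PF."""
    return acute_risk_estimate / factor

hypothetical_acute_risk = 4e-3  # risk per unit dose, illustrative only
print(adjusted_risk(hypothetical_acute_risk, DREF))  # 0.001
print(adjusted_risk(hypothetical_acute_risk, PF))    # 0.0004
```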

However, what is of interest today in Japan are dose rates more like 10, 30, or 100 times background. What about these dose rates? The problem noted by the NCRP was that deleterious effects of these very low dose rates could not be observed. In fact, low doses and low dose rates led to increased longevity rather than the decreased lifespan seen at higher doses and dose rates. In addressing the apparent life lengthening at low dose rates, the NCRP interpreted this effect as reflecting “a favorable response to low grade injury leading to some degree of systemic stimulation.” They go on to state that “…there appears to be little doubt that mean life span in some animal populations exposed to low level radiation throughout their lifetimes is longer than that of the un-irradiated control population.” In the future, accurate examination of residents of high background radiation areas around the world might generate the needed information on this phenomenon, which is termed “radiation hormesis”. Based on the presently available data, residents of high background radiation areas, where sizeable populations are exposed to as much as 20 mSv per year from natural background, do not appear to suffer adverse effects from these doses.
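For a sense of scale, the sketch below converts those multiples of background into approximate annual doses, assuming a worldwide-average natural background of about 2.4 mSv per year (the UNSCEAR figure); actual local backgrounds vary considerably.

```python
# Back-of-the-envelope conversion of "N times background" into an annual dose,
# assuming a worldwide-average natural background of roughly 2.4 mSv per year
# (UNSCEAR figure); actual local backgrounds vary considerably.

BACKGROUND_MSV_PER_YEAR = 2.4

for multiple in (10, 30, 100):
    annual_dose = multiple * BACKGROUND_MSV_PER_YEAR
    print(f"{multiple}x background ≈ {annual_dose:.0f} mSv/yr")
# 10x ≈ 24 mSv/yr, 30x ≈ 72 mSv/yr, 100x ≈ 240 mSv/yr
```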

Areas with natural background radiation significantly higher than average can be found in Iran, Brazil, India, Australia and China. In the U.S., the population of Denver receives more than 10 mSv per year from natural background.

 
