A large part of what makes the DGR risky is that this has never been done before. We don’t know what we’re up against. But we do know that there are some particular areas of concern – such as public health – where it is imperative to explore the risks in detail and consider whether suitable preventative measures and backup plans can be implemented to make the risk worth taking. Unfortunately, much of the research that would be needed to better evaluate these risks has not been done.
One of the major flaws in the way that the DGR is being proposed is that there is no unified assessment of its impact on human health. This is rather strange, since its impact on human health is the primary concern of most interested parties, and is the reason for situating it so deep underground. The lack of such a report is especially troubling because the project is complex and drawn out and involves many different populations. The scattered and fragmentary nature of what health discussions there are imposes a logistical and cognitive burden on those seeking to evaluate health effects, and ensures that those discussions are going to be less well understood than if they were presented in a coherent manner.
“There is no stand‐alone document on human health,” reported the International Institute of Concern for Public Health (IICPH) when it was considering the previous DGR proposal close to the Bruce reactor itself:
This means that intervenors and the public have to search through numerous lengthy documents to find mention or discussion of the effects of the Project on human health, rather than having a consolidated “Technical Support Document” (TSD) or an equivalent document focused on human health.
Given the significant concern about the impact of the proposed DGR on human health, both in the short term and in the very long term, we are requesting that OPG be required to produce a document specifically on the potential impact on human health, including worst‐case scenarios.
There are many aspects to human health effects – workers, the public, local communities, native communities, sensitive (vulnerable) populations, generational and long term effects, radiological hazards, hazards from non‐radiological substances, routes of exposure (ingestion, absorption, inhalation), accidents and malfunctions, cumulative effects from multiple exposures, including the various phases of the project and concurrent projects, etc.
A comprehensive report on health effects is needed to address all these concerns.
– IICPH submission 89441E (PDF), p. 17
The main health concern, of course, is cancer caused by radiation from the various forms of nuclear waste that this project may be designed to store. We do not yet know whether the site will also hold thousands of tonnes of low-level and intermediate-level nuclear waste, but we do know it will store all of Ontario's high-level nuclear waste deep underground in caskets. The CNSC did in fact complete a report on the incidence of cancer near its power plants, but unfortunately this report was a whitewash whose inappropriate assumptions and methodology could produce nothing other than its intended result.
As Frank Greening, a retired engineer from the Bruce reactor, points out in his paper, the report does not use actual doses to the public, which would have had to be measured; instead it relies on estimates:
The authors of the RADICON report state that the study was based on “all available annual total dose data”. This statement implies that doses to members of the public were measured, when in reality average doses to hypothetical individuals were estimated using models
– RADICON Critique (PDF), p.1
We have already seen that when OPG and the CNSC use models and estimates of exposure, these can be up to 100 times lower than the actual measured values.
[…]the most serious problem with the RADICON study is its use of averaged meteorological and annual emissions data to estimate doses to members of the public living near Canadian NPPs. Averaging data in this way is commonly employed in the calculation of derived release limits (DRLs) and is arguably a valid approach to dose estimation for relatively constant (continuous) emissions. However, CANDU NPP’s emissions are far from constant, being dominated by short-term spike releases. Such releases are subject to far less dispersion than long-term “routine” emissions. In addition, doses resulting from the wet deposition of radionuclides – especially doses from spike releases that coincide with periods of heavy precipitation – are inevitably underestimated by long-term averaging. A detailed analysis of CANDU emissions over extended periods of time (up to 10 years) shows that the data invariably exhibit power law, rather than Gaussian distributions. Gaussian distributions drop off quickly, because under this statistic large release events are extremely rare; by comparison, power law distributions drop off more slowly. Thus large release events – the events in the tail of the distribution – are more likely to happen in a power law distribution than in a Gaussian.
– RADICON Critique, p.19
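Greening's point about distribution tails can be checked with a few lines of arithmetic. The sketch below compares the probability of a large release under a Gaussian model with that under a power-law (Pareto) model with a comparable typical release size; the parameters are invented for illustration and are not actual emissions data:

```python
import math

def gaussian_tail(x, mean, sd):
    """P(X > x) for a normal distribution, via the complementary error function."""
    return 0.5 * math.erfc((x - mean) / (sd * math.sqrt(2)))

def pareto_tail(x, xmin, alpha):
    """P(X > x) for a Pareto (power-law) distribution with minimum xmin and exponent alpha."""
    return (xmin / x) ** alpha

# Hypothetical parameters chosen so both distributions describe similar
# "typical" release sizes (arbitrary units).
mean, sd = 10.0, 5.0
xmin, alpha = 5.0, 1.5

for x in [20, 50, 100]:
    g = gaussian_tail(x, mean, sd)
    p = pareto_tail(x, xmin, alpha)
    print(f"release > {x}: Gaussian {g:.2e}, power law {p:.2e}")
```

At ten times the typical release size, the Gaussian model rates a spike release as essentially impossible, while the power-law model still gives it a probability of around one percent – which is exactly why averaging over spiky data underestimates the events that matter.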
Moreover, the RADICON estimates and modeling included some patently false data: for instance, the report claims that the amount of radioactive iodine released by the Ontario plants was too low to measure, when in fact the CNSC itself had reported values (RADICON Critique, p. 4) for these emissions for all the years covered by the report. Iodine is an essential nutrient for proper functioning of the thyroid, so radioactive iodine is the main cause of radiation-induced thyroid cancer. Claiming that there were no radioactive iodine releases when there in fact were undermines both the integrity and the explanatory power of the RADICON report.
The problems extend to the CNSC’s model of the dispersal of radioactive releases, which uses the concept of a plume of radiation traveling in the most common wind direction at the nuclear plant in question. However, as Greening’s critique shows, the directions that the RADICON report gives as most common for each plant do not match the directions that the CNSC itself gives for those plants in its annual reporting (RADICON Critique, pp. 5–6). Furthermore, the data from this annual reporting show that the wind direction at these facilities is fairly evenly distributed, with none of the sixteen compass directions prevailing more than 11% of the time. Indeed, for each nuclear plant, three to five wind directions show prevalence within a percentage point and a half of the most common one; so if the radioactive contamination is indeed spread by the wind, using a single “most common” wind direction will not provide a good separation between contaminated and uncontaminated areas.
And as Greening notes,
Finally it should be noted that atmospheric dispersion models cannot deal with periods when winds are calm, which usually means wind velocities between zero and 1 m/s. In real situations these “calm” hours may account for up to 10 % of the data but are usually excluded from the calculations altogether because the wind velocity appears in the denominator of the IMPACT software’s plume dispersion equation and therefore cannot be assigned a value of zero, a fact that is not mentioned in the RADICON Report.
– RADICON Critique, p. 6
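The division-by-zero problem Greening describes follows directly from the standard textbook form of the Gaussian plume equation, in which wind speed sits in the denominator. The sketch below uses that textbook equation with made-up source parameters – it is not the IMPACT software's actual implementation:

```python
import math

def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
    """Ground-reflected Gaussian plume concentration (Bq/m^3) at crosswind
    offset y (m) and height z (m), for source strength Q (Bq/s), wind speed
    u (m/s), release height H (m), and dispersion parameters sigma_y, sigma_z (m).
    Wind speed u appears in the denominator, so u = 0 (calm) is undefined."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# A 1 m/s "calm" wind already gives twice the concentration of a 2 m/s wind...
c1 = plume_concentration(1e9, 1.0, 50, 25, 0, 0, 40)
c2 = plume_concentration(1e9, 2.0, 50, 25, 0, 0, 40)
print(c1 / c2)  # → 2.0
# ...and u = 0 cannot be evaluated at all (ZeroDivisionError), which is why
# calm hours get dropped from the dose calculation entirely.
```

So the hours when dispersion is weakest – and local doses plausibly highest – are precisely the hours the model cannot represent.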
In such calm conditions a radioactive release would settle in the immediate vicinity of the facility, so the people closest to the reactor in all directions would get the heaviest dose. The same is true of winds that blow in inconsistent directions – it’s just that the closest people get the heaviest dose in one direction at a time, rather than all at once. If that is the case, you would expect the people closest to the facility to show the most health effects, and this does turn out to be the case on a widespread basis.
The 2008 KiKK study is the most statistically powerful research ever done on health risks and proximity to nuclear power plants, with 1592 cases and 4735 controls covering a 23-year period. The KiKK study found increased cancer (especially leukemia) near German nuclear installations. In fact, it found that
[…] a logistic regression analysis of the ratio of KiKK cases to controls as a function of proximity (= 1/r with r the residential distance in meters, chosen as the independent variable) showed a strongly increasing risk for all cancers, and especially for leukemia, the closer the children had lived to nuclear plants at the time of diagnosis, with the sharpest rise within 5 km. During the study period 1980–2003, children < 5 years living within 5 km of a nuclear power plant were more than twice as likely to develop leukemia compared to children living > 5 km distant (OR 2.19; lower 95% CI limit, 1.51). The increase in leukemia remained significant in children < 5 years living in the < 10 km zone compared to the > 10 km zone (RR 1.33; lower 95% CI limit, 1.06)
– Rudi Nussbaum, “Childhood Leukemia and Cancers Near German Nuclear Reactors: Significance, Context, and Ramifications of Recent Studies”, International Journal of Occupational and Environmental Health. 2009; 15:318–323. p. 319 (full article here (PDF))
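For readers unfamiliar with the statistic, an odds ratio (OR) like the 2.19 quoted above compares the odds of exposure (here, living within 5 km) among cases with the odds among controls. The cell counts below are invented for illustration – they are not the KiKK study's actual tabulation:

```python
def odds_ratio(cases_near, cases_far, controls_near, controls_far):
    """Odds ratio from a 2x2 exposure table:
    (cases near / cases far) divided by (controls near / controls far)."""
    return (cases_near / cases_far) / (controls_near / controls_far)

# Hypothetical counts: 37 of 1592 cases and 54 of 4735 controls lived
# within 5 km of a plant.
print(odds_ratio(37, 1555, 54, 4681))  # → about 2.06
```

An OR of 2.19 with a lower 95% confidence bound of 1.51 means that even the most conservative reading of the data leaves the near-plant children at roughly one and a half times the odds of leukemia.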
These findings are backed up by “a sophisticated meta-analysis of incidence and mortality rates of childhood leukemia near 136 nuclear facilities in the UK, Canada, US, Germany, Japan, and Spain which shows statistically significant increases between 14% and 21% of leukemia incidence in children < 9 years near many of these sites…” (Nussbaum study, p. 320) and another very recent, very large French study showed a similar relationship.
The relevance of this to the RADICON study is that if the increased leukemia and other cancers are distributed evenly around the nuclear facility, with distance the only relevant variable, then a model like the one the CNSC uses will be misleading and yield an incorrect result, because the health problems in the most common wind direction will be the same as in any other direction. There will be more cases closer to the nuclear facility both in the control sample and among the people supposedly living in the radiation plume; but since that’s not what the model is looking for, it won’t find them. Using an incorrect model just randomizes your sampling and ensures that no effect will be found.
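The point about an incorrect model randomizing the sampling can be shown with a small simulation. Here a purely hypothetical excess risk falls off with distance identically in every direction; a "plume" comparison between wind sectors sees nothing, while a simple near/far comparison sees the effect clearly. All parameters are invented, and sampling distance uniformly is a deliberate simplification:

```python
import math
import random

random.seed(1)

# Hypothetical baseline risk and a near-plant excess that falls off as 1/r,
# the same in every direction (i.e., no plume at all).
BASE, EXCESS = 0.001, 0.02

def simulate(n=200_000):
    people = []
    for _ in range(n):
        r = random.uniform(1, 25)       # distance from plant, km (simplified)
        theta = random.uniform(0, 360)  # direction from plant, degrees
        sick = random.random() < BASE + EXCESS / r
        people.append((r, theta, sick))
    return people

def rate(people, keep):
    """Illness rate among the subset of people selected by `keep`."""
    sel = [p for p in people if keep(p)]
    return sum(1 for _, _, sick in sel if sick) / len(sel)

people = simulate()
# A plume model comparing the "most common" wind sector to the rest finds nothing:
downwind = rate(people, lambda p: 0 <= p[1] < 22.5)
elsewhere = rate(people, lambda p: p[1] >= 22.5)
# A distance model comparing < 5 km to > 5 km finds the effect immediately:
near = rate(people, lambda p: p[0] < 5)
far = rate(people, lambda p: p[0] >= 5)
print(f"downwind {downwind:.4f} vs elsewhere {elsewhere:.4f}")
print(f"near     {near:.4f} vs far       {far:.4f}")
```

The same simulated population yields a null result or a strong positive result depending entirely on which variable the model chooses to slice on.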
One possible conclusion from all this is that the nuclear industry just doesn’t know how to model or estimate environmental dosage very well. This has been borne out in industry and regulatory exercises where proponents were invited to model and estimate radiation exposure in a specific real-life situation, and compare their estimates against the actual amount of radiation:
The IAEA […] established an Environmental Modeling and Radiation Safety (EMRAS) group in 2002. […] Pickering NPP [Nuclear Power Plant] was the focus of one of the IAEA’s tritium release scenarios which was largely based on measurements made in the vicinity of the Pickering NPP in July and September of 2002. HTO [Water where one of the hydrogen atoms has been replaced by tritium] concentrations were measured in air, precipitation, soil, drinking water, plants (including the crops that make up the diet of the local farm animals) and products derived from the animals themselves. Given information on food intakes by the farm animals and the measured HTO concentrations in air, precipitation and drinking water, participants in each scenario studied were asked to calculate:
(1) HTO and non-exchangeable organically bound tritium (OBT) concentrations in sampled plants and animal products for each site and sampling period.
(2) HTO concentrations in the top 5 cm soil layer for each site and sampling period.
The modeling approaches used by the participants in the Pickering scenario varied widely. […]
Predictions for the rainwater concentrations ranged over a factor of 3, from 28 Bq/litre to 87 Bq/litre. Significantly, all the predictions were substantially lower than the observed value of 218 Bq/litre. This result is not that surprising because it is known that all of the models used assume that material in the plume has a Gaussian distribution in the vertical, which is not necessarily the case, especially for a source near ground level. In addition, the models probably do not work well for heavy rains or for a plume that meanders.
The prediction of average values of HTO concentrations in soil moisture was also made in these IAEA studies using a variety of approaches. The diversity of approaches led to results that were spread over a factor of nearly seven. This raises questions of how to explain the reasons for such a spread, and how to determine which approach is the best. Significantly, the models evaluated by these IAEA studies have all previously claimed some degree of validity based on the results of comparisons with measured data. However, a model that gives excellent predictions from within a limited range of input parameters may perform poorly when used beyond that range of inputs.
[…] This may be something of a moot point when it comes to the RADICON Report because its authors present no information on how the calculated doses were verified and also fail to discuss the question of data uncertainty. Nevertheless, as the above discussion shows, the RADICON doses are certainly uncertain by at least a factor of ten.
– RADICON Critique, p. 8-9
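It is worth doing the arithmetic on those rainwater figures: even the best-performing model underpredicted the observed value by a factor of two and a half, and the worst by nearly eight:

```python
observed = 218          # Bq/litre, measured rainwater HTO (from the critique)
predictions = [28, 87]  # lowest and highest model predictions, Bq/litre
for p in predictions:
    print(f"{p} Bq/L underpredicts by a factor of {observed / p:.1f}")
```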
The French study is interesting with respect to the modeling question, not only because of its size (2,753 cases, 30,000 controls) but because it performed both sorts of evaluation: it assessed health effects in relation to simple geographic proximity and, separately, in relation to a sophisticated model of estimated dosage, which it calls DBGZ. As with the CNSC’s RADICON report, the modeled dosage failed to show any effect, but (as in the German study) simple geographic proximity revealed elevated risk for people living close to the reactors:
The […] study evidenced an association between childhood […] [leukemia] and living less than 5 km from a NPP for the 2002–2007 period. The association was also observed in the contemporaneous incidence study, but not for the previous period, 1990–2001. The use of DBGZ yielded very different results, with … [results showing no effect] for all the DBGZ categories.
– Claire Sermage-Faure et al., “Childhood leukemia around French nuclear power plants—The Geocap study, 2002–2007”, International Journal of Cancer, 131:1, Jan 2012, pp. 769–780. p. 777 (full article here)
This shows that the difference between the lack of results obtained with models and the clear results obtained with geographic proximity are not necessarily a result of differences in what is going on at different locations: here you have the exact same data, the exact same researchers, and they obtain different results from modeling than they do from geographic proximity.
These studies are motivated by the question of whether nuclear facilities pose a danger to the public. In this case, we are exploring the first underground nuclear waste burial site in the world, so we do not have many studies to refer to. The larger and more statistically robust studies have found a clear link between living close to nuclear plants and elevated cancer rates; we have no comparable studies of populations living close to nuclear waste sites. You might expect the authors of these studies to emphasize these links in their findings. But it’s not that simple.
On paper, these studies are not about finding danger to the public, but finding whether there is a link between exposure to radiation in the vicinity of these nuclear plants and danger to health. And as far as exposure is concerned, these authors have taken the position that it was impossible for the public to be exposed to radiation enough to cause the cancers that they’ve found, so the cause of these cancers is actually unknown. The French investigators quite rightly take this as a challenge to be solved, and they call for further investigation to find out what that cause is:
Overall, the results suggest a possible excess risk of […] [leukemia] in the close vicinity of French NPPs in 2002–2007. The increased incidence observed at less than 5 km from the NPPs in the Geocap study only partially supports the recent German [KiKK] findings as the increase was limited to recent years and was not specific to the youngest children. The absence of any association with DBGZ, which is assumed to reflect the distribution of gaseous radiation discharged from NPPs, may indicate that the association observed with distance […]
Overall, the results suggest a potential excess risk over 2002–2007 that may be due to unknown factors related to the proximity of NPPs. Among the potential factors are population mixing and exposures to physical agents, including natural or man-made exposures to radiation not modeled by the DBGZ. Overall, the findings call for investigation for potential risk factors related to the vicinity of NPP and for collaborative analysis of all the evidence available from multisite studies conducted in various countries.
– Sermage-Faure study, p. 779
The French researchers’ thoughts are well-taken: whatever is causing the nearby leukemia, it could be caused by a “physical agent” – that is, a non-radioactive carcinogen. And the exposure could be happening at work instead of outside the plant. Since people tend to try to live near their jobs, workplace exposure could produce the effect that these researchers discovered. An effect, whatever its nature, that happens at the workplace would be magnified in the surrounding population since – as the French study explicitly notes – these facilities tend to be located in remote rural areas, so the people who work at the nuclear facilities are a large part of the surrounding population. Whatever the cause of these leukemias turns out to be, the French researchers take seriously their charge of finding out what that cause is.
The German researchers, on the other hand, seem to have interpreted their charge very narrowly – to investigate only whether the elevated levels of cancer near the nuclear power plants were caused by radiation. Since they also decided a priori that the levels of radiation emitted by these plants were too low to cause the levels of cancer that they found, their findings were pre-determined.
The authors of the KiKK study stated that “no effect would be expected on the basis of the usual models for the effects of low levels of radiation. […] In view of the fact that this result was not expected under current radiation-epidemiological knowledge, and considering that there is no evidence of relevant accidents, and that possible confounders could not be identified, the observed […] [occurrences of cancer] remains unexplained.”
The GCCR scientists’ surprising interpretation of their 2008 data negates the basic design of the KiKK study: it was to test whether or not there exists an association between incidence rates of malignancies and proximity to the suspected source of radioactive emission, the exhaust stack (surrogate for individual levels of exposures). The categorical dismissal of radiation as a possible cause of the observed health effects voids the design of the study as a test of this possibility. Based on unquestioned acceptance of official assumptions about radioactive emissions and related radiation risks, referred to by the GCCR scientists as “the current state of radiobiological knowledge,” they claimed that radioactive emission from these nuclear plants would have to be several orders of magnitude higher to explain the observed health effects.
– Nussbaum study, p. 319-20
Given that we’ve already seen that the field has no good way of estimating what the public’s exposure to this sort of radiation actually is (even when they aren’t misreporting the data), it seems a bit complacent to claim that the public’s exposure level can’t possibly be high enough to cause the existing health effects.
Whatever the cause – whether it is nuclear or not; whether it works through environmental exposure or exposure on the job – there is clearly some sort of danger to a child living near a nuclear power plant. The French investigators took their charge to protect the public broadly – since they weren’t sure what was causing the effect, they called for further investigation to find out what was. The German investigators, on the other hand, decided that the effect wasn’t caused by radiation (or rather, they pre-determined that before the study even began) and issued a position paper saying “that the KiKK study results were ‘compatible’ with previous findings” without saying whether those “previous findings” were the troubling ones that prompted their research, or their 1998 paper that finished with “We conclude that at present in Germany no further investigations of this kind are necessary.” (Nussbaum study, p. 320)
Into which of these camps does the CNSC fall?
There is ample evidence that living near Ontario’s nuclear power plants increases one’s risk of leukemia and of other conditions that are sometimes caused by radiation.
[…] leukemia incidence was observed to be significantly elevated in Clarington from 1993 to 2004 – the period after the Darlington NGS began operating. Similarly, thyroid cancer incidence in Ajax-Pickering males was significantly elevated from 1993 to 2004, and thyroid cancer incidence in males and females from 1981 to 1992 was also elevated.
In addition, a 1996 report on the impact of radiation on health in Durham Region observed that the occurrence of the congenital anomaly Down’s syndrome was elevated in Ajax-Pickering during the 1980s and was found to be significantly higher than the rate in Ontario for the time period 1978–1991. It is also noteworthy that these results were consistent with a study done by Health and Welfare Canada for the Atomic Energy Control Board which found significantly high rates of Down’s syndrome in Pickering in the 1973–1988 time period.
Nevertheless, and regardless of these findings, studies of the epidemiology of cancer in populations located in the vicinity of CANDU NPPs have consistently concluded that radiation is not a plausible explanation for any excess cancers observed within 25 km of Pickering, Darlington or Bruce NPPs; the stated reason for rejecting radiation as a causative factor being that the magnitude of the predicted doses is insufficient to induce cancers in exposed individuals.
– RADICON Critique, p.7
In fact the Atomic Energy Control Board (AECB), the CNSC’s predecessor, conducted a study of “Childhood Leukemia Around Canadian Nuclear Facilities”, finding elevated cancer rates within 25 km of the two nuclear generating stations that were studied. However, the AECB concluded that these increases were not significant, using statistical methods tailored to reach that conclusion (Hoel affidavit (PDF)).
The question at hand is whether the CNSC is going to take seriously its charge to protect the people of Ontario – and indeed the whole Great Lakes region – or whether it is just there to provide cover for the industry, treating investigations as pesky bureaucratic formalities to be dispatched as quickly as possible.
Here are all the studies cited in this article as PDF files:
1. IICPH submission 89441E
2. RADICON Critique
3. Rudi Nussbaum, “Childhood Leukemia and Cancers Near German Nuclear Reactors: Significance, Context, and Ramifications of Recent Studies”, International Journal of Occupational and Environmental Health. 2009; 15:318–323.
4. Claire Sermage-Faure et al., “Childhood leukemia around French nuclear power plants—The Geocap study, 2002–2007”, International Journal of Cancer, 131:1, Jan 2012, pp. 769–780.
5. Affidavit of Dr. David Hoel
Nuclear energy is a new technology, and as such it is subject to unforeseen problems. One of the major problems leading to the Windscale fire, for instance, was that the heat sensors in the core were positioned in places that turned out to be inappropriate – thus the reactor’s instruments displayed a safe temperature long after other parts of the core had started to melt and catch fire.
Likewise, it is imperative to operate nuclear facilities only in accordance with their designed and approved functionality. Both the Windscale fire and the Chernobyl disaster occurred when the staff at the facility decided to conduct a test of alternate ways of operating the equipment.
Concomitant with new and unknown technologies are unknown costs. These tend to come to light mostly when decommissioning and/or cleaning up a failed facility, as the process is more public and the operators have less of a vested interest in secrecy.
For instance, the team tasked with cleaning up the failed DGR in Asse, Germany, admits that it will have to invent new technology to extract the leaking waste, and indeed that it cannot even locate most of the waste due to unforeseen shifting of the rock in which the storage facility is housed.
Likewise, the company cleaning up the exploded nuclear waste storage shaft in Dounreay, Scotland, admits that they have no idea what’s at the bottom of the shaft, and will have to invent new engineering solutions.
The largest nuclear accidents in the world have happened in the US, Japan, Russia/Ukraine and Germany. These are not countries lacking in technical expertise – rather, they are world leaders in high-criticality technology like space flight. So the problems causing these disasters do not stem from a lack of expertise; rather, this technology is inherently difficult, and the consequences of failure can be more severe than those of any other technology currently available on earth.
Given the potentially catastrophic consequences of nuclear accidents and the unforeseeable problems with the technology, pre-emptive safety measures have been invaluable where they have been in place. For example, at Windscale, Sir John Cockcroft, the head scientist, insisted on installing extremely expensive, high-performance filters in the exhaust chimneys. This was widely derided at the time, since in the planned operation of the facility there was no gaseous exhaust – outside air simply flowed across metal fins to cool the reactor without mingling with any chemical product of the reactor. Thus, during normal operation the filters would have absolutely nothing to do. However, when the reactor caught fire and burned uncontrollably for three days, spewing radioactive uranium smoke, the filters prevented the Level 5 event from becoming an experience like Chernobyl.
There are only a few working underground repositories for nuclear waste, and several others, like Dounreay and Asse, have been closed due to explosions and containment problems. One working repository, the Forsmark DGR in Sweden, has already reported a raft of unforeseen technical malfunctions, even in very simple areas: keeping the access passageway from freezing up, the sealing and integrity of the waste canisters before they even reach the facility, contamination of groundwater, and humidity in the cavern leading to corrosion of metal parts.
Ironically, engineers from the Swedish site now note that to solve the problem of the access passageway freezing up in winter, “for heating of the roadway we nowadays use waterborne heating using the heat from the ventilation of the rock.” Since they nowhere explain where the heat in the rock ventilation ultimately comes from, we have to wonder whether it comes from the stored nuclear waste.
At this point in its operations, Forsmark, like the proposed Bruce DGR, is supposed to be a low to intermediate level waste site. One of the defining distinctions between these wastes and high level wastes is that only high level wastes are supposed to emit noticeable heat. So one has to wonder whether Forsmark, like other nuclear waste repositories, has been allowed to reclassify high level waste as intermediate in order to pretend to be in compliance with its governing regulations. And although it is certainly ingenious to use this heat to keep the access way clear, one has to wonder how much this heat will build up as Forsmark continues to receive a thousand cubic meters of waste per year, how this heat will behave when the facility is backfilled as is planned when it is full, and how the heat will interact with other technical problems like the unforeseen corrosion.
One also has to wonder – given that these fairly basic problems were unforeseen, and not simply solved (it took four tries to resolve all the listed problems with the access way) – how well they will be able to deal with more complex unforeseen problems as they arise.
Bottom line: nuclear energy technology is new and unexplored enough that unforeseen problems will be arising for some time yet. Given the cataclysmic nature of nuclear failures, multiple layers of precaution are indicated. Such measures in the past have prevented bad accidents from turning into catastrophes. Yet, just like the Japanese industrial/regulatory complex, neither the Canadian nuclear industry nor the Canadian regulators have shown any interest in planning for the unforeseen problems that inevitably arise.