Strontium-90 levels are rapidly increasing in groundwater near the reactor 2 turbine building at Fukushima Daiichi. Enenews has the story:
Xinhua: Very high radioactivity levels were detected in groundwater from an observation well at the crippled Fukushima Daiichi nuclear plant, said the plant operator Tokyo Electric Power Co. (TEPCO) Wednesday. […] The observation well was set up on the Pacific side of the plant’s No. 2 reactor turbine building last December to find out the reasons why radioactivity levels in seawater near the plant remained high. The company said the sampled water could be from the contaminated water that seeped into the ground.
Reuters: Testing of groundwater showed the reading for strontium-90 increased from 8.6 becquerels to 1,000 becquerels per litre between Dec. 8, 2012 and May 24
BBC News: High levels of a toxic radioactive isotope have been found in groundwater at Japan’s Fukushima nuclear plant, its operator says. […] Strontium-90 is formed as a by-product of nuclear fission. Tests showed that levels of strontium in groundwater at the Fukushima plant had increased 100-fold since the end of last year, Toshihiko Fukuda, a Tepco official, told media.
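The "100-fold" figure quoted by the BBC can be checked against the Reuters numbers above with a quick back-of-the-envelope calculation (assuming both readings are in Bq per litre, as Reuters reports):

```python
# Reported Sr-90 readings from the observation well (Bq per litre)
reading_dec_2012 = 8.6     # Dec. 8, 2012
reading_may_2013 = 1000.0  # May 24, 2013

fold_increase = reading_may_2013 / reading_dec_2012
print(f"Sr-90 increased roughly {fold_increase:.0f}-fold")
```

That works out to about a 116-fold increase, consistent with Tepco's rounded "100-fold" statement.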
And Fukushima Diary has a story about huge amounts of tritium and strontium-90:
On 6/19/2013, Tepco announced they measured 500,000,000 Bq/m3 of Tritium and 1,000,000 Bq/m3 of Sr-90 from the groundwater taken from the east side of reactor2 turbine building. The sample was taken on 5/24/2013.
The radiation level in seawater doesn’t decrease. In order to investigate it, Tepco drilled 3 observation holes on the east side of the reactor1~4 turbine buildings.
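The Fukushima Diary figures are given in Bq/m³ while Reuters reports Bq/L; converting (1 m³ = 1,000 L) shows the two reports describe the same Sr-90 measurement:

```python
LITRES_PER_CUBIC_METRE = 1000

# Figures from Tepco's 6/19/2013 announcement, as reported by Fukushima Diary
tritium_bq_m3 = 500_000_000
sr90_bq_m3 = 1_000_000

tritium_bq_l = tritium_bq_m3 / LITRES_PER_CUBIC_METRE
sr90_bq_l = sr90_bq_m3 / LITRES_PER_CUBIC_METRE

print(f"Tritium: {tritium_bq_l:,.0f} Bq/L")  # 500,000 Bq/L
print(f"Sr-90:   {sr90_bq_l:,.0f} Bq/L")     # 1,000 Bq/L, matching the Reuters figure
```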
These radionuclides are going to end up in the seawater, no matter what Tepco says. It is also likely that they have been released into the atmosphere in significant amounts, due to re-criticalities that have occurred this year and are still occurring, both in the underground coriums and in at least one aboveground location.
Until recently, Tepco minimized the extent of strontium deposition. I know strontium-90 was released into the Pacific in early 2011, because the Chinese government announced that it had been found in Pacific squid. Not only that, I had just eaten some of it. The worst thing I have ever eaten.
Strontium-90 is a particularly dangerous radioactive toxin. It causes bone cancer, leukemia, and immunological damage. It collects in bone and is more dangerous the younger you are.
I received an email today from Richard Bramhall at the Low-Level Radiation Campaign. Chris Busby has written a review of evidence that ingesting and inhaling radionuclides is causing DNA damage to people, and how the nuclear industry’s ICRP model grossly underestimates the dangers to human health.
The evidence shows that ICRP’s use of “absorbed dose” is invalid for many radionuclides when they are internal… The review defines and discusses situations where genetic damage is massively more likely than from external radiation at the same “dose”: 1) biochemical affinity for DNA, 2) transmutation, 3) hot particles, 4) sequential emitters (“Second Event Theory”), 5) low energy beta emitters, and 6) the “Secondary Photoelectron Effect”:
1. Some substances (for example Strontium-90 and Uranium) have high biochemical affinity for DNA so a large proportion of what is inside the body will be chemically bound to DNA. For this reason the radiation events associated with them are massively more likely to damage DNA structures than the same dose delivered externally.
Sr-90 binds to DNA, and is far more dangerous for this reason than nuclides that are simply dispersed in the cytoplasm. Uranium is very dangerous for the same reason.
2. Transmutation, where the radioactive decay of a radio-element changes it into a different element (e.g. Carbon-14 changing to Nitrogen), has mutagenic effects far greater than would be expected on the basis of “absorbed dose”. This has been known since the 1960s but it has been ignored by risk agencies such as ICRP, UNSCEAR and BEIR.
DNA is made from Carbon, Oxygen, Hydrogen and Nitrogen. Carbon-14 and tritium (radioactive hydrogen) become incorporated into DNA through eating and drinking contaminated products. When the carbon-14 atom in the DNA decays into nitrogen, the chemical bond is broken, and DNA damage results. Similarly, tritium decays into helium. This goes for plants, as well as animals and humans.
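To put the transmutation risk in perspective, the fraction of incorporated carbon-14 atoms that will decay over a human lifetime can be estimated from C-14’s 5,730-year half-life. This is a simple exponential-decay sketch; the 80-year lifespan is an illustrative assumption, not a figure from the review:

```python
import math

C14_HALF_LIFE_Y = 5730.0  # half-life of carbon-14, in years
LIFESPAN_Y = 80.0         # illustrative human lifespan (assumption)

decay_constant = math.log(2) / C14_HALF_LIFE_Y
p_decay = 1 - math.exp(-decay_constant * LIFESPAN_Y)
print(f"About {p_decay:.1%} of incorporated C-14 atoms decay over {LIFESPAN_Y:.0f} years")
```

Roughly one in a hundred incorporated C-14 atoms transmutes to nitrogen within a lifetime, each event breaking a chemical bond in whatever molecule the atom sits in.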
3. Hot particles, especially those which emit very short-range alpha radiation, have obvious implications for high local doses to tissue where they are embedded.
It should be added that due to the bystander effect, dozens of times more DNA molecules are affected than those which are directly affected by this alpha radiation. This is due to the release of inflammatory cytokines such as TNF-alpha. These cytokines are also themselves associated with cancer, heart disease, strokes, and autoimmune diseases.
4. The “Second Event Theory” concerns the decay sequences of some radionuclides which decay to a short-lived daughter. Strontium 90 decaying to Yttrium 90 is an example; the Yttrium 90 has a half-life of 2½ days so the theory is that the first event (decay of Strontium 90) may damage a cell’s DNA which then sets about repairing itself. The repair process is known to be very radiosensitive and there is a finite probability that the second event (the subsequent Yttrium decay) inflicts further damage which cannot be repaired.
Sr-90 decays into the vastly more energetic Y-90, which has a half-life of about 2½ days. While the DNA repair process triggered by the Sr-90 decay is underway, the resulting Y-90 atom decays and disrupts the repair, which is itself a radiosensitive function. The result is misrepaired DNA strands left in random configurations.
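Under the standard exponential-decay model, the chance that the Y-90 daughter fires while a repair is in progress can be sketched. The 12-hour repair window below is a hypothetical figure chosen for illustration (the review does not specify one); the ~64-hour Y-90 half-life is the standard value, consistent with the 2½ days quoted above:

```python
import math

Y90_HALF_LIFE_H = 64.0  # Y-90 half-life in hours (~2.7 days)
REPAIR_WINDOW_H = 12.0  # hypothetical time a cell spends repairing (assumption)

decay_constant = math.log(2) / Y90_HALF_LIFE_H
p_second_hit = 1 - math.exp(-decay_constant * REPAIR_WINDOW_H)
print(f"Chance the daughter Y-90 decays inside the repair window: {p_second_hit:.0%}")
```

Even with these rough inputs, a second hit during repair is far from a rare coincidence, which is the crux of the Second Event Theory.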
5. A good example of a low energy beta emitter is Tritium. (Tritium is projected to account for 99.8% of the radioactivity in discharges from the “generic” design of reactor planned for the UK). The review compares Tritium with Caesium-137. The very low decay energy of Tritium means that delivering the same absorbed dose as the Caesium requires 90 times as many radiation tracks from Tritium. This density of events occurring at low doses suggests a mechanism to explain experimental results that show Tritium is a greater mutagenic hazard than ICRP would expect.
There is an immense amount of tritium going into the Pacific, where it evaporates and comes down as rainfall over North America. Here again, the ICRP grossly underestimates the health impacts of this dangerous radionuclide, which affects all life.
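The 90× track-count figure in the review follows from the fact that, for the same absorbed dose, the number of decays needed scales inversely with the energy deposited per decay. The review does not state which energies it used; the values below are assumptions chosen to illustrate how a ratio of that size arises (tritium’s ~5.7 keV mean beta energy is well established; the 512 keV Cs-137 beta endpoint energy is used here purely illustratively):

```python
# Same absorbed dose (J/kg) => decays needed scale inversely with
# energy deposited per decay. Energies are illustrative assumptions;
# the review does not state its inputs.
E_TRITIUM_KEV = 5.7    # mean beta energy of tritium
E_CS137_KEV = 512.0    # Cs-137 beta endpoint energy (illustrative)

tracks_ratio = E_CS137_KEV / E_TRITIUM_KEV
print(f"Tritium needs ~{tracks_ratio:.0f}x as many decays for the same dose")
```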
6. Elements with large numbers of protons (e.g. Uranium, Plutonium) absorb external gamma radiation efficiently, re-emitting it in the form of very short-range photoelectrons indistinguishable from beta radiation. This is known as the Secondary Photoelectron Effect (SPE). The review criticizes papers which used Monte Carlo methodology in attempts to minimise the importance of SPE after New Scientist published a report on it in 2008…
It is bad enough when uranium and plutonium bind to DNA and cause damage via alpha radiation. When there is gamma radiation in the environment (say, during a global nuclear catastrophe, like right now), the uranium and plutonium absorb it and re-emit it as short-range photoelectrons, which act just like beta particles.
The review shows that enhancement factors arising from the mechanisms above can theoretically be as high as 10,000-fold. It lists epidemiological evidence where such enhancements are required to explain clear effects which are denied by the industry, regulators and government on the basis of low average doses. One of these is the recent KiKK study which, if the doubled risk of childhood leukaemia near NPPs in Germany is caused by radioactive discharges, implies a 10,000-fold error in ICRP risk estimates. KiKK is at one extreme of such evidence; at the other, the Seascale cluster implies an error of 200.
All the crap about “safe radiation levels” and “no immediate effect” is just that: crap. Reality is at least 200 times worse, and as much as 10,000 times worse, than the crap models that the radioactive polluters use.