homeo fail

When this appeared in my inbox yesterday, courtesy of Richard Saunders (who was kind enough to ask me if I was sitting down first), I initially *facepalmed*, then set about getting the original paper where this apparent breakthrough research had appeared (thanks to @xtaldave for the full text).

The paper that had apparently found homeopathy to be as effective as chemo for breast cancer (according to Homeopathy Plus!, yes those guys), was published in the International Journal of Oncology* and entitled “Cytotoxic Effects of Ultra Diluted Remedies on Breast Cancer Cells.” (Click the link for the full pdf of the study).

The paper examines the effects of ultra-dilute remedies (read: homeopathy) on the induction of cell death in two cancer cell lines (the commercially available MCF-7 and MDA-MB-231) and one immortalised control cell line (HMLE).

The authors use several remedies already in use for the treatment of human breast cancer, developed at the P. Banerji Homeopathic Research Foundation in Kolkata, India:

Carcinosin, 30C; Conium maculatum, 3C; Phytolacca decandra, 200C and Thuja occidentalis, 30C (for an explanation of how dilute these remedies are see here).

All remedies were diluted in 87% “extra neutral alcohol” and succussed, including the alcohol used as the control solvent.

The authors analysed each remedy with high performance liquid chromatography (HPLC) to look for differences, then measured cell death in cell culture in response to increasing doses and increasing time of incubation with each remedy. These measures include the MTT assay for cell viability, Annexin V and PI for apoptosis, FISH for DNA breakage and Western blots to measure activation of cell cycle proteins.

Technically the paper is quite well written. The problems lie with the data. And these problems are so massive, I wonder how they got past the reviewers. I don’t know whether they were dozing when they reviewed this paper, but I could immediately see some big fat gaping holes in the results.

First up, a few pointers:

“The experiments were conducted in triplicate and repeated at least twice in each case of remedy”

This would not get past me. It is accepted scientific convention that experiments are done at least three times (not twice) and also in triplicate, giving you a final “n” number of 9. These studies were done in cell culture, meaning there is plenty of material for experiments to be repeated as many times as you wish. So why were they only done twice? Three is the convention because it gives the study more statistical power.
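To see why the extra repeat matters, here is a minimal sketch (with a purely illustrative standard deviation, not the paper’s data) of how the standard error of the mean shrinks as n grows:

```python
import math

# The standard error of the mean shrinks with sqrt(n): going from
# 2 repeats x 3 wells (n = 6) to the conventional 3 repeats x 3 wells
# (n = 9) tightens every estimate, at essentially zero extra cost in
# cell culture. The standard deviation below is hypothetical.
sd = 10.0  # hypothetical SD of % viability between replicates

sem_n6 = sd / math.sqrt(6)   # what the authors settled for
sem_n9 = sd / math.sqrt(9)   # the convention

print(f"SEM with n=6: {sem_n6:.2f}")
print(f"SEM with n=9: {sem_n9:.2f}")
```

The smaller the standard error, the smaller the real difference you can reliably detect, which is exactly what “more statistical power” means here.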

Ah statistics, huh?

There is a distinct lack of statistics in this paper, by which I mean there are none at all. As my friend Jo said, “Nary a p-value nor a confidence interval to be seen”. Which raises the question: how can you get a paper accepted in a peer-reviewed journal without doing any statistical analysis?

Really? No, I mean REALLY? This is why I suspect the reviewers were dozing or drunk.

And by not doing any statistical analysis, you cannot make any statements about whether the treatments are different to each other. Statistical tests tell you whether an observed difference is bigger than you would expect by chance alone, with a stated degree of confidence (usually 95%), so that we don’t rely on visual interpretation, which is notoriously unreliable. But this doesn’t seem to have bothered these authors, or the reviewers.
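For the record, this kind of comparison takes a few lines. Here is a sketch of a simple permutation test on made-up viability readings (the numbers are hypothetical, not the paper’s), comparing solvent-only wells against solvent-plus-remedy wells:

```python
import random
import statistics

# Hypothetical % viability readings (NOT from the paper) for cells
# treated with solvent alone vs. solvent + remedy, n = 9 each.
solvent = [72, 70, 75, 68, 71, 74, 69, 73, 70]
remedy  = [65, 63, 68, 61, 66, 64, 62, 67, 63]

observed = statistics.mean(solvent) - statistics.mean(remedy)

# Permutation test: if the group labels are exchangeable (i.e. no real
# effect), differences from shuffled labels should be as extreme as the
# observed one reasonably often. The p-value is the fraction that are.
random.seed(0)
pooled = solvent + remedy
n_perm = 10_000
n_extreme = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:9]) - statistics.mean(pooled[9:])
    if abs(diff) >= abs(observed):
        n_extreme += 1

p_value = n_extreme / n_perm
print(f"mean difference: {observed:.2f}, p = {p_value:.4f}")
```

That’s it. No excuse for “nary a p-value” when the whole analysis fits on one screen.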

So let’s look more closely at the results.

Firstly the HPLC.

Oh wait a sec, there are no results shown for HPLC. And neither do the authors say “no results shown”. They just make some rather confused statements about what they think they saw and move on. What?! I need to see the chromatograms. What possible reason could they have for not including this data, especially when they go on to describe it so badly in the text.

“All four remedies had very similar HPLC chromatograms to each other, with only trace amounts of a limited number of peaks. They were not significantly distinct from the solvent and they lacked the distinct peak seen in the solvent.”

So, this means that all the remedies were the same, i.e. no different to the solvent, with no other peaks indicating any ingredients. But then they contradict themselves by saying that the remedies did not have the solvent peak? Fail.

And then:

“The chromatogram of the untreated and treated solvents appeared identical, indicating that succussion did not cause chemical changes in the solvent.”

Okay, but don’t some homeopaths claim that succussion does have an effect on the chemical structure of the water/solvent? Isn’t this how they explain how homeopathy works? I can only guess SINCE WE CANNOT SEE THE CHROMATOGRAMS, but what you are saying is that the remedies and the solvent were exactly the same, meaning they are just solvent.

No surprises there.

What about the cell death studies?

So let’s look at the cell death studies, since these constitute the crux of the study’s aims: to determine if these ultra-dilute remedies can induce cell death in cancer cell lines.

So here we have results for all three cell lines, two cancerous and one control, all treated with a control (the 87% alcohol solvent) or the remedies, and cell death measured by MTT assay. Here’s how they describe it in the text:

“Interestingly, the inhibitory effects on cell viability of the remedies in both the MCF-7 and MDA-MB-231 cells were distinctly greater for each of the doses tested than those seen in cells treated only with solvent.”

Which translates as: the treatments killed the cells better than the solvent alone. Okay, so it looks like they did when you eyeball the histograms, but you have no evidence for this – you didn’t do stats, therefore you cannot say this! Sheesh, where did you learn to write science?

But why not keep the fail going:

“MCF-7 cells were found to be more sensitive to all four remedies than the MDA-MB-231 cells”.

Again no statistics, so this statement cannot be confirmed. When you do science properly and you run statistical analysis, you are entitled to say, “MCF-7 cells were found to be significantly more sensitive to all four remedies than the MDA-MB-231 cells”.

Unless you’re these authors, then you just get a great big FAIL stamp on your work.

Also note that they state that the control treatment (that is, the solvent) also induced cell death in all cell types:

“As shown in Fig. 1A, the solvent reduced the viability of all three cell types; the overall reduction in cells at different doses of solvent was about 30% for MCF-7, 20-30% for MDA-MB-231 and 20% for HMLE cells.”

Ummm, hold on a sec.

This is your control treatment, which means it should not be causing cell death. It is designed to be inert, functioning as a carrier of your treatment, in order that you can measure the impact of the treatment alone. If your solvent or vehicle is killing your cells you have a fundamental problem. You need to go back to the drawing board and find a different solvent to deliver your treatment.

This is a very big problem right here.

If the cell death induced by the solvent is significant, then the rest of the paper is worthless. But because there are no stats here, there is no way to tell whether death by the solvent is significant. According to the above statement, the alcohol killed ~30% of the cancer cells compared to no treatment at all. Although this effect was increased when the remedy was present, there remains a large problem with your model if your solvent is killing the cells.

Perhaps this explains why there are no stats in this paper? Because they may in fact show that the “inert” solvent also significantly kills the cancer cells? Once again, there is no way for me to know this without access to the raw data, or the statistical analysis.
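To make the confound concrete, here is the arithmetic with hypothetical round numbers (illustrative only, not the paper’s raw data):

```python
# Hypothetical viability numbers (illustrative only, NOT the paper's
# raw data) showing why a cell-killing "control" wrecks the analysis.
untreated = 100.0            # % viability with no treatment at all
solvent_only = 70.0          # the solvent alone kills ~30%
solvent_plus_remedy = 55.0   # solvent + remedy together

vehicle_kill = untreated - solvent_only          # kill from the "inert" vehicle
extra_kill = solvent_only - solvent_plus_remedy  # apparent remedy effect

# Without statistics you cannot tell whether that extra kill is a real
# remedy effect, noise, or an interaction with the alcohol itself.
print(f"kill attributable to the solvent: {vehicle_kill:.0f}%")
print(f"extra kill with remedy present:  {extra_kill:.0f}%")
```

When the vehicle accounts for twice the kill of the “treatment effect”, attributing anything to the remedy without statistics is pure hand-waving.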

Man, how the hell did this embarrassment get accepted?

Well, now that I have revealed a fundamental flaw in this tripe, I have lost the will to continue. There is much more fail herein, however; we are only at Figure 1, remember.

So I will cover just a few more things that are also glaringly wrong with this paper, then I will send a large bottle of 87% alcohol to the editorial board and encourage them to keep up the good work of smiting the peer review process and science in general.

General lack of quantitation of results in this paper.


Figure 3, excerpt from Frenkel et al., showing damage to DNA.

Figure 3 (left) shows fluorescent microscopy data for DNA breakage as measured by FISH assay. But where is the quantitation of this data?

The authors show a representative image for each treatment, and this is usually acceptable if you then measure large numbers of cells and report on such changes with numbers (see below).


Dunlop et al., in press. Panels are representative DAPI/FITC overlay images of at least 10 images taken from triplicate wells. Histograms are mean ± SD of 3 independent experiments, incubations in triplicate, n = 9, p < 0.001, 1-way ANOVA, Tukey’s post-hoc analysis.

Further, even in the fluorescent images, the authors only show a maximum of 14 cells. What the hell can you glean from 14 cells? They even say:

“At least 200 cells from treated and untreated samples were analyzed for mitotic index and telomeric DNA signals with a Nikon Eclipse 80i microscope equipped with fluorescence attachment and a Photometrics CoolSNAP HQ2 monochrome digital camera.”

So where is this data? Not in this paper, I can tell you that.

Next up Western blots.

Then they move on to Western blots. Actually, this looks like the most resolved part of the paper. They have normalised everything to beta-actin, as convention goes, and they have indicated the time of exposure to treatments. But as I mentioned earlier, if their control treatment, 87% alcohol, is killing cells, then what can we glean from this data? Not much, except that the effect could be an additive effect of the alcohol and the treatment. There is no way to differentiate the impact of the treatment versus the control.

Flow cytometry, not quantitated either.

Sigh. I spend most of my days doing flow cytometry, so I am pretty familiar with how it works and the accepted ways to present the data. This is not one of them.


Figure 5, excerpt from Frenkel et al.,

The assay they use (Annexin V and PI) is a common one and I use it often. Standard procedure is to count ~10,000 cells for each condition, then plot your results on a graph, like this (see below).


Dunlop et al., in press. Flow cytometry analysis of lysosomal destabilisation in THP1 human monocytes with acridine orange as a probe. Mean & SD of three independent experiments, incubations conducted in triplicate (n = 9), ** p < 0.01, *** p < 0.001, 1-way ANOVA, Tukey’s post-hoc analysis.

You might also show your scatter plots as they have done above, as a nice visual demonstration of how the cells respond to the treatment, but this is not quantitation.
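For anyone wondering what “quantitation” means here: you classify every one of your ~10,000 events into the standard Annexin V / PI quadrants and report the fractions. A minimal sketch, using simulated events and made-up gate thresholds rather than a real FCS file:

```python
import random

random.seed(42)

# Illustrative gate positions; real thresholds come from unstained /
# single-stained controls, not from thin air like these.
ANNEXIN_GATE = 100.0
PI_GATE = 100.0

def classify(annexin, pi):
    """Standard quadrant gating for an Annexin V / PI assay."""
    if annexin < ANNEXIN_GATE and pi < PI_GATE:
        return "viable"
    if annexin >= ANNEXIN_GATE and pi < PI_GATE:
        return "early apoptotic"
    if annexin >= ANNEXIN_GATE and pi >= PI_GATE:
        return "late apoptotic/necrotic"
    return "necrotic"  # PI-positive, Annexin-negative

# Simulate ~10,000 events as a stand-in for real cytometer output;
# each event is an (Annexin V, PI) fluorescence intensity pair.
events = [(random.lognormvariate(4, 1), random.lognormvariate(3.5, 1))
          for _ in range(10_000)]

counts = {}
for annexin, pi in events:
    label = classify(annexin, pi)
    counts[label] = counts.get(label, 0) + 1

total = len(events)
for label, n in sorted(counts.items()):
    print(f"{label}: {100 * n / total:.1f}%")
```

Run that per condition, per replicate, and you have numbers you can put on a histogram and test statistically – which is exactly what is missing from the paper’s Figure 5.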

I’m going to stop there. I won’t even bother dealing with the discussion and conclusions, because by my analysis, they are based on flawed data.

One thing I will say: Homeopathy Plus! yelling “Homeopathy as good as chemotherapy for breast cancer” is not a conclusion you can draw from this study.

This is for all the reasons I have addressed above, as well as the really obvious point that these studies were conducted in cell culture, which is a very different situation to a whole animal.

Cells sitting in a bath of homeopathy are very different to the processes which occur in vivo: for example, the treatment must survive the low pH of the stomach, cross the gut, escape metabolism in the liver, get to the site of the cancer and then do its job. This is a very complex process and very difficult to control. Studies in cell culture can provide data about the mechanism of action of a compound, but rarely do they relate to the processes in a human.

Never extrapolate results from a culture dish to a whole animal. You will undoubtedly be wrong and look like a fool.

Ooh, did someone say Homeopathy Plus!?

Listen to Fran Sheffield from Homeopathy Plus! talk about how homeopathy works here (mp3, 3:19).

*The International Journal of Oncology, impact factor 2.234, fail factor 10^23.

