
Were they the "right rats"? The controversial study by Gilles-Eric Séralini of the University of Caen on the toxicity of genetically modified NK603 corn and the weed killer Roundup has raised a host of questions about the type of rodent used in the experiment.
Is the Sprague-Dawley rat appropriate for two-year feeding experiments, when this strain already tends to develop mammary tumors past a certain age? The GMO (genetically modified organism) companies used the same rats for their own tests, but fed them for only 90 days.
But the Séralini controversy has also highlighted what many see as the limits of current methods of using lab animals to evaluate toxic risks. Indeed, a growing number of researchers are advocating toxicogenomics instead. This new science, which has emerged from recent progress in genetics and biotechnology, allows scientists to evaluate the effects a given substance has on human cells in vitro.
One major advantage is that no lab animals are needed; even more importantly, the results apply directly to humans. "No animal model is valid for another species," says biochemist Claude Reiss, president of Antidote-Europe, an organization that promotes "efficient and safe biomedical research," and a former research director at the French National Center for Scientific Research (CNRS). "We are not 70-kilo (150-pound) rats!"
Toxicologist Thomas Hartung is a professor at Johns Hopkins University in Baltimore, where he leads the Center for Alternatives to Animal Testing (CAAT). To illustrate the problem, he cites aspirin: "With the current protocols, which were developed between the 1920s and the 1960s and have scarcely changed since, aspirin would never have been approved for use," he explains. "The molecule produces malformations in the embryos of rats, mice, rabbits, hamsters and guinea pigs... and if you expose a rat to the doses of aspirin used in human patients, you have a 50% chance of killing it."
Thalidomide, on the other hand, which was prescribed to pregnant women for morning sickness during the 1950s, was tested on rats and showed no teratogenicity (the capacity to cause malformations of the embryo or fetus). Its use in humans ended after a widespread scandal: about 15,000 babies were born with severe malformations.
"There are well-known techniques to influence the currently required tests," says Reiss, one of the first Europeans to advocate toxicogenomics. "For example, to artificially minimize the risk of cancer, you can test a product using C57BL mice, which are known to be up to 100 times less sensitive to carcinogens than the C3H mice."
In theory, the health-authority experts who evaluate industry studies are on the lookout for this kind of thing. In practice, they do not always catch it. In a 2005 study published in the journal Environmental Health Perspectives, University of Missouri biologist Frederick vom Saal showed that companies had evaluated the toxicity of bisphenol A (BPA) using Sprague-Dawley rats, which are 25,000 to 100,000 times less sensitive to BPA's hormonal disruption than the CF-1 mice often used in university labs. As a consequence, some health authorities remain convinced that BPA is harmless.
In vivo vs. in vitro
Would toxicogenomics do better? "You take human cells (neurons, liver cells, or cells from other tissues) and put them into contact in vitro with different concentrations of the substance you are studying," Reiss explains. "The cells react to any damage by activating certain genes." Each type of insult leaves its own identifiable genetic signature.
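To make the signature idea concrete, here is a minimal sketch in Python of how a measured gene-expression profile might be matched against reference signatures for known types of cellular damage. All gene names, damage categories, and numbers are invented for illustration; real toxicogenomic assays track thousands of genes against validated signature databases.

```python
# Illustrative sketch of toxicogenomic signature matching.
# Gene names, damage categories, and values are hypothetical.
from math import sqrt

# Reference signatures: expression change (log2 fold change) of a few
# marker genes under known types of cellular damage (invented numbers).
REFERENCE_SIGNATURES = {
    "oxidative_stress":     {"GENE_A": 2.1, "GENE_B": -0.3, "GENE_C": 1.8},
    "dna_damage":           {"GENE_A": 0.2, "GENE_B": 2.5, "GENE_C": -1.1},
    "endocrine_disruption": {"GENE_A": -1.4, "GENE_B": 0.9, "GENE_C": 0.4},
}

def cosine_similarity(profile_a, profile_b):
    """Cosine similarity between two gene -> expression-change mappings."""
    genes = sorted(set(profile_a) & set(profile_b))
    dot = sum(profile_a[g] * profile_b[g] for g in genes)
    norm_a = sqrt(sum(profile_a[g] ** 2 for g in genes))
    norm_b = sqrt(sum(profile_b[g] ** 2 for g in genes))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def classify(measured_profile):
    """Rank reference signatures by similarity to the measured profile."""
    scores = {
        damage_type: cosine_similarity(measured_profile, signature)
        for damage_type, signature in REFERENCE_SIGNATURES.items()
    }
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Expression changes measured after exposing cells to a test substance
# (again, invented numbers).
measured = {"GENE_A": 1.9, "GENE_B": -0.1, "GENE_C": 1.6}

for damage_type, score in classify(measured):
    print(f"{damage_type}: similarity {score:.2f}")
```

In this toy example the measured profile scores highest against the "oxidative_stress" reference, which is the kind of readout such signature matching is meant to provide; in practice the difficulty lies in building and validating the reference signatures themselves, as the following paragraph notes.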
However, toxicogenomics is not yet a mature science, and its results can be open to differing interpretations. Some even doubt that it can ever completely replace in vivo (live-animal) testing. Moreover, the signatures of the various kinds of damage that cells can suffer are still poorly documented.
In a 2007 report, the U.S. National Academy of Sciences called for rapid research to map the entire human "toxome" -- that is, all the possible biochemical pathways by which substances can be toxic to humans. Several projects are working toward this ambitious goal. Tox21, a program involving the U.S. EPA (Environmental Protection Agency) and other federal agencies, is already partly operational. In particular, it has been used since 2010 for rapid evaluation of the toxicity of the dispersants used after the Gulf of Mexico oil spill.
At Johns Hopkins, Hartung has also launched a Human Toxome Project at the CAAT, which obtained funding in 2011 and now aims to coordinate international research on the subject. "The project is as ambitious as the Human Genome Project, and will not be finished for 10 to 15 years," says Hartung. "But we must not wait until it is finished to begin taking advantage of toxicogenomics."
To explain the gap between scientific promise and regulatory practice, François Busquet, the CAAT's European coordinator, suggests that regulatory agencies are naturally conservative. "Those who evaluate risks are used to working with the classic tests, and are often uneasy when faced with this new science," he says.
But the value of toxicogenomics is being noticed. The newly created Human Toxicology Project Consortium brings together industrial giants such as DuPont, Dow, L'Oréal, ExxonMobil, and Johnson & Johnson. The reason is the high cost, in both time and money, of toxicology testing. "Testing costs about $3 billion a year worldwide," says Hartung. Toxicogenomics would deliver better results "100 times faster and 100 times less expensive," says Reiss.