Rethinking death to better understand the effects of chemicals

Here is a thought experiment: suppose a drum of toxic chemicals spills into a lake and kills 50 percent of the fish. TreeHugger reports on it, lots of volunteers help with the cleanup, and the remaining fish in the lake recover fully. Then disaster strikes a second time -- the same amount of chemical spills again into that same lake.

Will the spill kill 50 percent of the fish again? Or will all of the fish survive, given that they have already proven they can withstand that dose of toxins?

This question frames a recent paper on the Death Dilemma and Organism Recovery in Toxicology. The authors examine how the answer influences the predictions of computer models of chemical toxicity. The first case, in which 50 percent of the fish die again, implies a model of random death: every exposure is an independent roll of the dice for every fish. The second case implies a model of individual tolerance: if an individual has survived one exposure and has had the chance to recover completely, it will survive an identical exposure again.
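To make the contrast concrete, here is a minimal simulation sketch of the two assumptions. It is not taken from the paper; the population size, dose level, and tolerance distribution are hypothetical, chosen only so that a single spill kills roughly half the fish.

```python
import random

N_FISH = 10_000
DOSE = 1.0       # hypothetical exposure level, set so one spill kills ~50%
N_SPILLS = 2

def stochastic_death(n_fish, n_spills, kill_prob=0.5):
    """Random death: every spill is an independent coin flip for each survivor."""
    alive = n_fish
    surviving_fraction = []
    for _ in range(n_spills):
        alive = sum(1 for _ in range(alive) if random.random() > kill_prob)
        surviving_fraction.append(alive / n_fish)
    return surviving_fraction

def individual_tolerance(n_fish, n_spills, dose=DOSE):
    """Individual tolerance: a fish dies only if the dose exceeds its personal
    threshold. With full recovery between spills, survivors of one spill
    survive any identical later spill."""
    # Hypothetical tolerance distribution: about half the fish fall below the dose.
    survivors = [random.uniform(0, 2 * dose) for _ in range(n_fish)]
    surviving_fraction = []
    for _ in range(n_spills):
        survivors = [t for t in survivors if t > dose]
        surviving_fraction.append(len(survivors) / n_fish)
    return surviving_fraction

print("random death:         ", stochastic_death(N_FISH, N_SPILLS))
print("individual tolerance: ", individual_tolerance(N_FISH, N_SPILLS))
```

Under random death the surviving fraction drops from about 0.5 to about 0.25 after the second spill; under individual tolerance it stays near 0.5, because the same hardy fish survive both times.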

Many people are familiar with the use of models to assess the impacts of climate change. Fewer realize that modeling will be just as essential if toxicity testing is to keep pace with the invention of new chemicals and with new uses for the more than 100 million chemicals already documented.

So understanding whether a chemical kills at random or only above an individual tolerance threshold can be critical to getting a good prediction from the model. In some cases, such as endocrine disruptors, rethinking the old methods of chemical testing may be essential to determining whether any exposure level can truly be deemed "safe."
