By Raymond Richard Neutra, Chief Emeritus, Division of Environmental and Occupational Disease Control, California Department of Public Health
The presentations of Niels Kuster, Frank Barnes and Dariusz Leszczynski at the last session of the San Diego BEMS meeting reminded me again that those who work in this field operate within different scientific cultures, including but not limited to physics, biophysics, electrical engineering, physical chemistry, molecular biology, cellular biology, physiology, toxicology and epidemiology. These cultures have different ideas about what evidence and what inferential rules go into making a convincing claim of causality. This leads to different scientific conclusions and to a lack of agreement on research priorities.
Over the years I have noticed that each discipline uses its own inferential rules of thumb to decide what scientific observations to “enter into evidence”. Though these rules are inculcated during training in the particular discipline and by now taken for granted, they are rarely stated explicitly. For example, regulatory toxicologists want to exclude any evidence not generated under “Good Laboratory Practices” with their extensive audit trails, while most academic researchers cannot afford to adhere to these standards and instead rely on the usual scientific quality control. Each discipline uses other unstated rules of thumb to move from the evidence to claims about its certainty that the EMF mixture, or one of its ingredients, is capable of causing an effect observable within its particular domain. Physicists, used to dealing with relatively simple situations, demand a kind of replication or repeatability that is not always achievable in complex biological systems driven by multiple variables and feedback loops. Each discipline uses still other unstated inferential rules to make claims about the relevance of the effects it has observed to pathological or therapeutic effects in humans. As a result there are endless arguments about what would constitute a convincing case for causality and relevance, and unresolved arguments about what series of experiments or observations would be most likely to move the field forward. This goes beyond the usual lobbying for one’s own research unit.
When there is miscommunication within a team composed of members from different cultures, one needs a culture consultant to figure out what is going wrong and to help the team overcome its communication problems.
If I were a contract officer again for a research program, I would set aside some money for the following activities to overcome this problem:
- I would issue a request for proposals for an interdisciplinary team that, with the promise of reimbursement for its members’ time, would agree to meet face to face and through internet meeting modalities to explore the differences in their inferential assumptions about (a) what should be entered into evidence, (b) how one should move from evidence to causal claims and (c) how one should move from claims about bioeffects to claims about pathology or cure. The team’s effort would be coached by experts in argument theory and philosophy of science, and these facilitators would write the final report with input and comment from team members. I would fund at least two such parallel teams: one of self-declared doubting Thomases and one of self-declared “high index of suspicion” scientists. At the end of the process the two teams would meet in a series of facilitated sessions to discuss the inevitable differences in their approaches, and the argument theorists and philosophers of science would be charged with writing a summary report on lessons learned.
- The teams’ second task would be to recommend priorities for future research on the basis of the understandings reached above.
- Since EMF effects may be sensitive to experimental conditions (for example, the strain of animal used), I would ask the teams to work with investigators to discuss ahead of time the possible results of proposed experiments or observations. Next steps and valid inferences should be laid out before the results are seen, not made up afterward (as happened with the hen house study). If replications fail, there should be a protocol and a budget for finding out why. The budget should also include funds for skilled arbitrators and argument theorists able to help participants move through emotionally charged scientific disagreements.