
Articles on Law, Science, and Engineering


Charles Walter, Ph.D., J.D. & Edward Richards, III, J.D., M.P.H.


When I (CFW) took physical chemistry lab at Georgia Tech, the professor would collect the lab notebooks, spread them out across his desk, and summarily assign an "F" to anyone who he thought had cheated. We all marveled at how he could do this. And we all feared that his methods might unjustly convict one of us, because he wasn't always right.

But he was right much of the time, and we think we know how he did it. Perhaps the Internal Revenue Service uses the same technique. The Office of Research Integrity (ORI) at the U.S. Public Health Service (PHS) certainly does. And ORI uses it as evidence that research data have been falsified or fabricated.

In this article we discuss four such cases of research misconduct reported by ORI in the Federal Register, focusing on how ORI uses digits in data other than the leading digit as evidence that data is fabricated or falsified. In the next article of this series we examine more closely the probative value of using significant digits in ORI's analysis.

Research Misconduct

ORI's investigations arise from its legal duty to investigate research misconduct as defined in regulations promulgated by PHS, a subdivision of the Department of Health and Human Services. 42 CFR Part 50, Subpart A.

One form of research misconduct is to report data from an experiment that was not done as described, or not done at all. In considering allegations of research misconduct it is usually necessary to examine "questioned data." If the questioned data was not fabricated or falsified, the experiment was done as reported; if the data was fabricated or falsified, the experiment was either not done or reported incorrectly.

Reported data may be numeric or non-numeric information. If numeric, it may be raw numbers written down by hand from direct observations, numbers in a log or graph from electronic sensors, hand or machine calculations from raw data, data interpolations or extrapolations, differentials or integrals estimated from slopes or the area of raw data, or the like.

Regardless of the nature of the information observed, the scientific method requires that observations be entered into laboratory notebooks. The importance of retaining proper records of experimental data cannot be overemphasized. For example, an Assistant Professor at Thomas Jefferson University reported data that was inconsistent with earlier data. When he could not verify the earlier data because it had not been retained, he was accused of research misconduct and eventually agreed to sanctions by ORI. 65 FR 39149-50. Thus, even though the failure to keep proper records of data may not be considered "research misconduct" per se by a number of scientists, failure to produce a proper record of experimental data can lead to sanctions for research misconduct. It is this written record of research data that provides the basis for investigating the data's internal logical consistency or statistical structure.

The scientific method also requires that scientists record only what they actually observe in the laboratory. The concept of "significant figures" imposes a limit on the number of digits recorded. Additional digits are not recorded because they are random noise, meaningless to the experiment. Nevertheless, there are numerous publications containing data expressed with excess digits to the right of the least significant one. Often this is merely the result of copying a digital readout from a calculator or experimental equipment that is too dumb to know when it has run out of significant figures. These insignificant digits may be useful as evidence of intentional misrepresentations in significant digits. For example, two chemical engineering students taking the same examination in my (CFW) process dynamics and control course at the University of Houston reported a calculation to eight figures with all digits the same. Curious, I did the same calculation by hand and found that their last three digits were incorrect. When confronted with this fact, one of the students produced a hand calculator that gave the same result he had reported on the exam, including the three incorrect digits. The other student was able to tell the world quite truthfully that he flunked the course because he didn't have a hand calculator.
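The discipline of significant figures is easy to make concrete. The sketch below (our illustration, with made-up values, not any ORI procedure) rounds a value to a stated number of significant figures; anything a calculator displays beyond that is noise:

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to `sig` significant figures; digits beyond that are noise."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - floor(log10(abs(x))))

print(round_sig(0.0123456, 3))   # 0.0123 -- the calculator's extra digits discarded
print(round_sig(123456, 2))      # 120000
```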

ORI uses both numeric and non-numeric information as evidence of data falsification. In this article we focus on examples wherein ORI used both insignificant digits and significant digits to the right of the leading digit as evidence of data fabrication; ORI calls them "inconsequential" digits. The theory is that a person who falsifies information typically devotes attention to the details most important to establishing the desired scientific result (in ORI's view, the leading digit) and pays less attention to details not directly related to the scientific conclusions of the purported experiment. ORI uses anomalous distributions of digits to the right of the leading digit, together with testimony and other physical evidence, to resolve accusations that data is falsified and/or fabricated.

Four Examples of Research Misconduct Uncovered by ORI

As described above, ORI is using characteristics of inconsequential digits as evidence that data is falsified. We discuss four examples in this paper. The smoking gun in the first two examples was anomalous terminal digits. In the last two examples, the smoking gun was the lack of uniformity in the distribution of inconsequential digits ("Chi-Square Test"). ORI also used its Chi-Square Test to corroborate the evidence of research misconduct in Example 2.

Example 1. An Extra Terminal Digit.

Example 1 illustrates how the presence of an extra terminal digit can lead to an inference that data is falsified or fabricated. In this example, a Research Assistant Professor in the Department of Veterinary Biomedical Sciences at the University of Missouri-Columbia, whose research was supported by a grant from the National Heart, Lung, and Blood Institute of the National Institutes of Health, was found to have fabricated muscle weights for two of the six rats he claimed to have used in an experiment. 62 FR 7787.

When allegedly raw data is actually generated by a calculation, the calculated numbers may have more terminal digits than the original numbers from which the calculation is made. This can be true even when the calculation is done using other real data. An extra terminal digit "5" was the evidence against this assistant professor, who was conducting studies on the effect of rhythmic contractions of skeletal muscle on blood flow using the hind legs of rats. The reported data comprised the weights of 28 muscles and other body parts of six rats. However, the data for two of the rats contained a number of entries with one more digit than the data for the other four rats. Further, the extra digit was always a "5." This suggests that the numbers may have resulted from a calculation involving division by 2: the calculated mean of two numbers, one ending in an odd digit and the other ending in an even digit, will always end in the extra digit "5." ORI asserted that, lacking muscle-weight data for two of the rats, the investigator twice generated weights by calculating the means of the weights of two other rats for which data were available. When the six rat carcasses were checked, four had the hind legs dissected, but two were still intact. When the investigator was shown that the two rats' weights were clearly calculated as means of the weights of two other rats rather than measured, he accepted the finding of scientific misconduct and entered into a Voluntary Exclusion Agreement in which he agreed, for three years, to exclude himself from serving in any advisory capacity to PHS and to submit to supervision designed to ensure the scientific integrity of his contributions to any research supported by PHS.
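The arithmetic behind the telltale "5" is easy to verify. In this sketch (with hypothetical weights, not the case data), exact decimal arithmetic shows that averaging two readings whose terminal digits differ in parity always produces one extra place, and that place is always a 5:

```python
from decimal import Decimal

def mean_of_two(a, b):
    """Average of two readings, kept as exact decimals."""
    return (Decimal(a) + Decimal(b)) / 2

# One terminal digit even (4), one odd (7): the mean gains a
# fourth decimal place, and that place is always a 5.
print(mean_of_two("1.234", "1.567"))   # 1.4005

# Both terminal digits even: no extra place appears.
print(mean_of_two("1.232", "1.568"))   # 1.400
```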

Example 2. ANSI/IEEE Standard 854-1987.

Example 2 illustrates how the existence of terminal odd digits can be used as evidence that the data was falsified or fabricated.

In many time-course experiments, a variable is observed during a time interval. In such experiments, the time reported may be the computed average of two successive points at which the variable was observed. ANSI/IEEE standard 854-1987, which is based on a long-standing convention, requires that a number ending in 5 be rounded to the nearest even digit. Thus, the last digit of a time calculated in a computer by averaging two successive integers will always be even: one of the points will be odd and the other even, so their sum divided by two must end in "5," which is always rounded to an even digit.
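Python's built-in round() happens to follow the same round-half-to-even convention, so the effect is easy to demonstrate (a sketch with made-up time points, not the case data):

```python
# round() rounds a trailing .5 to the nearest even integer,
# the same convention as ANSI/IEEE 854-1987.
print(round(7.5))   # 8
print(round(8.5))   # 8, not 9

# The mean of two successive integer time points always ends in .5,
# so every rounded mean has an even last digit.
means = [(n + (n + 1)) / 2 for n in range(100, 110)]   # 100.5, 101.5, ...
print(all(round(m) % 2 == 0 for m in means))           # True
```

For integers of this size, n + 0.5 is exactly representable in binary floating point, so round() applies the half-to-even rule exactly.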

Odd terminal digits where there should have been none were part of the evidence against an investigator who had reported the time-course of electrophysiological measurements of spontaneous action potential spikes of muscle fiber from different genetic crosses. The time at which a spike occurred was computed from the two successive times surrounding its peak. Since the computation was carried out according to the ANSI/IEEE standard, all 1026 time values of the "unquestioned" data ended in even digits. In contrast, over a fourth of the time values in the questioned data ended in an odd digit, despite the fact that they should have been computed in an identical manner.

Example 3. Distribution of Digits-Part 1

Non-uniform distributions of significant and insignificant digits were the evidence in Example 3, which involved a Fogarty Visiting Scientist in the Laboratory of Infectious Disease at the National Institute of Allergy and Infectious Diseases (NIAID). 58 FR 47143.

The data were reaction-product counts and residual-substrate counts measured in a scintillation counter for different clones. The number of digits in the data ranges from 4 to 6. In what follows, the rightmost digit is designated "R1," the digit to its left "R2," the digit to the left of R2 "R3," and the digit to its left "R4"; the leading digit is never included in the distribution analysis.

ORI compared Chi-Square Tests for digit distributions from 222 unquestioned and 252 questioned data. The total number of digits for the analysis of the unquestioned data is 857, with 222 as R1, 222 as R2, 218 as R3, and 195 as R4. The total number of digits for the analysis of the questioned data is 939, with 252 as R1, 252 as R2, 250 as R3, and 185 as R4. ORI assumes each has 9 degrees of freedom. The Chi-Square values for the unquestioned and questioned data, respectively, were 14.3 & 34.8 (R4), 9.89 & 29.3 (R3), 8.72 & 13.2 (R2), 11.33 & 27.1 (R1), and 11.09 & 30.94 (total). Thus, all of the Chi-Square values for the unquestioned data correspond to probabilities in the range .11 to .46, suggesting that the distributions of the digits in the unquestioned data are not significantly different from uniform. On the other hand, three of the four Chi-Square values for the questioned data correspond to probabilities less than .05. ORI concluded that the distributions of these digits are not uniform.
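The statistic itself is simple to compute. The sketch below (our illustration with made-up digit lists, not ORI's software) computes the chi-square statistic for a set of digits against a uniform expectation of n/10 per digit and compares it to the conventional 5% critical value for 9 degrees of freedom:

```python
from collections import Counter

def chi_square_uniform(digits):
    """Chi-square statistic testing whether digits 0-9 occur uniformly."""
    n = len(digits)
    expected = n / 10                     # uniform expectation per digit
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

CRITICAL_05_9DF = 16.919   # 5% critical value, 9 degrees of freedom

uniform_like = [d for d in range(10) for _ in range(10)]   # 10 of each digit
skewed = [3] * 40 + [7] * 40 + [d for d in range(10) for _ in range(2)]

print(chi_square_uniform(uniform_like))                    # 0.0
print(chi_square_uniform(skewed) > CRITICAL_05_9DF)        # True: reject uniformity
```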

Faced with the evidence, the investigator admitted that he had constructed the data from rough estimates based on an autoradiogram, rather than from an actual scintillation counter experiment. He claimed that he was under pressure from his professor in Japan to publish scientific papers, so he fabricated data when he ran out of supplies because he did not want to delay his project for two to three weeks while awaiting the supplies. He agreed to exclude himself from any federal grants or contracts for two years, to refrain from serving on any PHS advisory committees for three years, and to certify as accurate and reliable any future applications for support from PHS.

Example 4. Distribution of Digits-Part 2

Unlikely digit distributions were also the evidence in Example 4, which involved a scientist then at the University of Utah and the University of California, San Diego. 59 FR 63811. The experiments comprised adding lipopolysaccharide extracts (LPS) obtained from endotoxin from various bacteria to cell cultures. In one series of experiments, the investigator claimed to have added 5000, 500, 50, 5, and .5 mg/liter of LPS to cultures to which endotoxin and stimulator cells were added simultaneously. In another series of experiments, the investigator claimed to have added these same concentrations of LPS to cultures to which endotoxin was added 24 hours prior to the addition of stimulator cells.

The data were reported as averages containing 5 digits and standard deviations containing 3-4 digits. ORI's treatment of the digits was the same as in Example 3; that is, the digits are analyzed in up to four places but no leading digit is included. The Chi-Square values for the various concentrations of LPS are 8.57 (5000 mg/l), 5.93 (500 mg/l), 8.54 (50 mg/l), 9.14 (5 mg/l), and 26.22 (.5 mg/l). Again assuming 9 degrees of freedom, the probabilities that these digits are distributed uniformly are between .424 and .747 for the four higher concentrations of LPS, but the probability that the inconsequential digits in the data for .5 mg/l are uniformly distributed is less than 1 in 500.
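Converting a chi-square value into the probabilities quoted above requires the chi-square survival function. Here is a minimal standard-library implementation (ours; ORI would have used statistical tables or software) built on the regularized incomplete gamma series:

```python
import math

def chi2_sf(x, k):
    """P(chi-square with k degrees of freedom >= x), computed from the
    series for the regularized lower incomplete gamma function."""
    a, t = k / 2, x / 2
    term = 1.0 / a
    total = term
    n = 0
    while term > 1e-15 * total:
        n += 1
        term *= t / (a + n)       # next series term
        total += term
    lower = total * math.exp(-t + a * math.log(t) - math.lgamma(a))
    return 1.0 - lower

print(round(chi2_sf(16.919, 9), 3))   # 0.05 -- the conventional cutoff at 9 df
print(chi2_sf(26.22, 9) < 1 / 500)    # True -- the .5 mg/l result in this example
```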

The investigator in this case did not admit guilt to ORI's allegations, which included claims that he falsified and misrepresented scientific experiments in grant applications and publications in the 1970's and 1980's. However, when confronted with the evidence, he agreed in 1994 to be excluded from eligibility for all federal grants, contracts and cooperative agreements, and from serving on any PHS advisory committee, boards or peer review committees for three years. Since published data was at issue, the investigator also agreed to submit letters of retraction for two articles in The Journal of Clinical Immunology, three articles in The Journal of Trauma, and letters of correction for an article in Immunology Letters and three additional articles in The Journal of Trauma. The articles in question had been published during a ten year period between 1977 and 1988. There is no statute of limitations on scientific misconduct, nor should there be.

Example 2 Revisited. Distribution of Digits-Part 3.

ORI also applied the uniform distribution test to the data in Example 2. It found that the distribution of the 1026 penultimate digits of the times for the unquestioned data is uniform (p = .1). In addition, Chi-Square for the 172 penultimate digits of the questioned times ending in an even digit is 12.3, consistent with a uniform distribution (p = .2). However, Chi-Square for the penultimate digits of the 58 questioned times ending in an odd digit is 33, showing that the probability that their distribution is uniform is only .00013.

Now that they know what ORI is up to, can data falsifiers foil ORI's uniform distribution test by focusing on the digits ORI examines? ORI believes not. It cites studies showing that most people are unable to choose digits randomly, even when trying to do so. J.E. Mosimann, C.V. Wiseman and R.E. Edelman, Accountability in Research, 4, 31-55 (1995).


It is fascinating (at least to us) that individuals who have spent years obtaining the education and training necessary to become research scientists would risk everything by falsifying or fabricating data. Who would do such a foolish thing? What would motivate a person to be so dishonest? Why would a person supposedly dedicated to contributing to science add garbage to its data base? What can be done to protect science from these miscreants?

Who Is Sanctioned for Research Misconduct?

Since 1992, there have been at least 90 findings of research misconduct reported by ORI in the Federal Register. Nearly twice as many males (59) as females (31) were sanctioned. Ph.D.'s sanctioned for research misconduct outnumbered M.D.'s 41 to 10 (3 had both degrees). Those sanctioned included 26 faculty, 8 postdoctoral fellows, 12 graduate students, and 11 technicians.

At least 71 of the 90 reported sanctions for research misconduct occurred in a medical school or other clinical environment. There were at least 140 federal grants involved, all from PHS. The only National Science Foundation grant involved had been plagiarized while under review by an Associate Professor at the University of Cincinnati who copied the material to his own PHS grant application. 61 FR 16803-4. At least 14 of the sanctions were for fabrication and/or falsification of clinical trial data. Except for Harvard University where two faculty members were sanctioned, the parties sanctioned for research misconduct in clinical trials were all nurses, coordinators, or the like.

About two thirds of the individuals sanctioned by ORI admitted the research misconduct charges against them, although half of the faculty who refused to admit the charges against them were senior faculty. As described above, the professor in Example 4 did not admit the charges against him. Likewise, a University of Pennsylvania Professor agreed to sanctions but denied all charges against him. 63 FR 35933. And a Baylor University Medical School Professor appealed ORI's sanctions against him and lost. 64 FR 12341. However a professor at the Medical College of Georgia cooperated with the college's and ORI's investigation and admitted the charges against her. 60 FR 32555-6.

What Research Misconduct is Sanctioned?

Eighty of the 90 research misconduct incidents involved falsified and/or fabricated data. Forty-two of these cases involved data that had been presented orally or submitted for publication in a journal, and 32 involved data that had been included in a federal grant application. Nine of the 90 misconduct incidents included untrue statements about educational background, publications, or other qualifications, and seven of these were incorporated in a federal grant application. Seven incidents involved plagiarism, one in a published article and six in federal grant applications.


The only evidence addressing "why" that we found is the statement in Example 3 indicating that, at least for some individuals, it can be more important to please the boss than to be honest in the conduct of research. This suggests a degree of culpability on the part of the boss as well as the individual who feels driven to fabricate data. If the research leader cannot document that he or she has made a serious attempt to establish an atmosphere that clearly places integrity above everything else, he or she should share any sanctions imposed on individuals who might be misguided by this lack of leadership. Whether the desire to please derives from job insecurity, degree progress, the need for future professional references, pressure to publish, an unprofessional relationship, or any other matter that should be secondary to integrity, the research leader must likewise share the blame for research misconduct if he or she has not emphasized integrity as the number one priority in the laboratory.

Research leaders should not be permitted to reverse the Nuremberg defense by claiming they cannot control everything that everyone in their laboratory does. If they cannot, they have taken on more than they should have, and that should be a form of misconduct itself. The basis upon which science is built requires that someone be responsible for the integrity of research published from a laboratory and for the federal granting process that supports it, and that integrity must be practiced by every single person in the laboratory. The only individual conceivably and ultimately responsible for ensuring that integrity is the person in charge. Harry Truman said it well: "The buck stops here."

A Solution Still on Hold.

A research scientist who falsifies or fabricates data has missed at least two major points during his or her education. First, there is a formal methodology called "the scientific method" which, if not used, excludes what one is doing from the domain we call "science." Second, there is a purpose to scientific research embedded in the very name we use to identify it: Whether we look at the Latin, French or English etymology, the word "science" is derived from "having knowledge," and "research" is derived from its French meaning "to investigate thoroughly." To do this we must stand on the shoulders of those who came before us, and we must provide a sound basis for those who follow us. There is simply no place in the phrase "scientific research" for incompetence or dishonesty.

How could individuals with the education of those 90 people sanctioned for "research misconduct" and named in the Federal Register have missed these points? The obvious place to look for the answer is the educational process that creates research scientists and the continuing process that maintains their professional skills and commitment.

Nearly all of the individuals sanctioned for scientific misconduct were highly educated professionals. Most had been educated or were being educated in advanced professional degree programs. At least 71 had participated in postgraduate education programs leading to nursing degrees or doctoral degrees in medicine or philosophy. Was there something missing in their education? Did they receive adequate training in the scientific method and research ethics? What about their continuing education? Every other profession, from real estate salespeople to attorneys to physicians, has continuing education covering the technical and ethical aspects of the profession. What about research scientists?

This problem has been recognized by scientists, clinicians, educators and others for many years. The following mark a few of the milestones during the past twelve years.

--In 1989, the Institute of Medicine (IOM) stated,

"...instruction in the standards and ethics of research is essential to the proper education of scientists,"

and recommended that universities provide formal instruction in good research practices. The Responsible Conduct of Research in the Health Sciences, p. 30 (IOM 1989).

--In the same year, the PHS published misconduct regulations that state that

"...[i]nstitutions shall foster a research environment that discourages misconduct in all research and that deals forthrightly with possible misconduct associated with research for which PHS funds have been provided or requested." 42 CFR 50.105.

--In 1990, the National Institutes of Health required applications for National Research Service Award Institutional Training Grants to include a description of a program to provide instruction in the responsible conduct of research.

--In 1992, three years after IOM's recommendation, the National Academy of Sciences stated,

"...[s]cientists and research institutions should integrate into their curricula educational programs that foster faculty and student awareness of concerns related to the integrity of the research process." Responsible Science: Ensuring the Integrity of the Research Process, p. 13 (NAS 1992).

--In 1995 the Commission on Research Integrity recommended that the Department of Health and Human Services add a new assurance

"...that the institution has an educational program on the responsible conduct of research." Integrity and Misconduct in Research, p.18 (HHS 1995).

--In December, 2000, after receiving extensive public input, ORI issued PHS's Final Policy on Instruction in the Responsible Conduct of Research. 65 FR 76647. That policy states,

"...that research staff ... at extramural institutions shall complete a basic program of instruction in the responsible conduct of research, as set forth in this document. Research staff who are working on the PHS‑supported project at entities other than the institution that received the PHS research grant, cooperative agreement, or contract, are also covered by the policy.... The policy pertains to all research ... or research training, conducted with ... support from ... PHS."

--In February, 2001 ORI suspended implementation of the PHS Policy on Instruction in the Responsible Conduct of Research. 66 FR 11032-3.

According to PHS, this latest action was required by the new President's January 20, 2001, Regulatory Review Plan, which calls for administrative review of all agency rules promulgated in the waning days of the Clinton administration. According to Representative Billy Tauzin (R-La), Chair of the House Committee on Energy and Commerce, which has oversight authority over PHS, it is because the PHS policy to improve the ethics of scientists and others "may have been issued by a government agency in apparent disregard of federal law." Letter dated February 5, 2001 from Rep. Tauzin to ORI Director Chris Pascal. Representative Tauzin labels PHS's policy to educate scientists about how to avoid scientific misconduct a "substantive rule," while PHS and ORI continue to insist that such education is a longstanding interest of the federal government that has been reaffirmed time and again by the research community. The bottom line is that there is to be further delay in the adoption of a policy that would help educate research scientists and prevent incidents of scientific misconduct. As a result of this action, institutions receiving PHS grants for research are no longer required to implement the responsible conduct of research policy developed during the last twelve years. Moreover, resources that otherwise would have been available to assist institutions in implementing the PHS educational program are also on hold.

Surprisingly, powerful scientific societies such as the American Society for Biochemistry and Molecular Biology (ASBMB) commented negatively on ORI's proposed implementation of PHS's educational policy on research misconduct. This opposition is based in part on ASBMB's fear that ORI will do what scientists have failed to do for themselves to enhance the integrity of scientific research. IX ASBMB News Nov/Dec, 2000, p.4. For example, ASBMB cites issues such as keeping data books as being "within the purview of academic institutions and the research community." But, despite formal procedures clearly set forth in the scientific method, scientists and academic institutions have failed to implement a concrete, uniform policy on keeping data books, nor have they implemented any other such policies to define professionalism in using the scientific method. Absent clear definitions of what the scientific method requires in modern research, it is not surprising that scientists and academic institutions have been unable to develop a workable policy to control misconduct and protect the integrity of scientific research. Nor is it surprising that the current issue of ASBMB News includes a picture of Representative Billy Tauzin (R-La) sporting a big smile. X ASBMB News Mar/Apr, 2001, p.7.

In view of this opposition to reforming the conduct of federally funded research through federal intervention, together with the lack of progress by research scientists in educating themselves on professional ethics in research, there is not much hope that institutions will suddenly start implementing educational plans to prevent scientific misconduct on their own. Indeed, the status of research professionalism and ethics today is not dissimilar to the status of race relations in the United States prior to federal intervention through civil rights laws. Until then, race relations were "within the purview" of local control. When traditional efforts to deal with problems fail, and when those who should lead the effort and take responsibility do not act, new solutions are called for.

Nevertheless, an institution would be foolish to rely on the suspension of the PHS policy and discontinue educational programs, because it remains legally responsible for the misconduct of its employees. The scope of an institution's liability could be affected, for example, if falsified data became the basis of a criminal false claims action. Under the Federal Sentencing Guidelines, effective education programs are mitigating factors to be considered by the judge and prosecutor. The converse is also true: failure to have programs in place to prevent illegal activity strengthens the presumption that the behavior was endorsed by the institution.

Meanwhile, the integrity of science suffers because scientists refuse to educate themselves about how to avoid research misconduct, and government is not filling the void, even for research sponsored by government. And government is not filling the void because of politics, bungling, and/or lobbying by some individuals in powerful scientific societies who disingenuously raise otherwise valid objections to an expanded role of government in scientific research to protect not science, but their own interests and government funding. Honest research scientists everywhere should be outraged at this interference with our attempts to improve our profession and safeguard the integrity of science. To counter this temporary setback, we should fight for adoption of professional rules of conduct and demand that the educational programs continue on their own merits, irrespective of whether they are mandated by federal regulation.

In the next two articles of this series we discuss legal defenses to ORI's statistical evidence based on "inconsequential digits" and educational efforts to prevent research misconduct and improve professionalism in science.



The Law, Science & Public Health Law Site
Copyright as to non-public domain materials
Professor Edward P. Richards, III, JD, MPH - Webmaster