ORI’s   “The Lab” – Avoiding Research Misconduct

Each university that uses funds from the Public Health Service must provide training in the Responsible Conduct of Research (RCR).  The Lab is a new interactive movie developed by the U.S. Office of Research Integrity (ORI) that allows viewers to experience an ethical challenge involving research as one of four characters.  Set in an active lab with a tier of graduate students, post-docs, PIs, and a research integrity officer (RIO), the story line follows each character as a case of data-image falsification is discovered and pursued, steering the chosen character through a series of choices and outcomes.

The case involves manipulation of Western blots, which are among the data-images most frequently falsified.  Each stakeholder’s motives, choices, and outcomes are well developed and realistic, reflecting a sophisticated understanding of the dynamics of the academic lab environment.  For example, the graduate student’s conversations with her peers show how much courage it takes to point out a flaw in the system.

The various pressures on each character create a sense of what it is like to make ethical choices in a competitive environment, and the movie pulls no punches about the ramifications of speaking up or keeping silent.  The relationships among PIs, RIOs, post-docs, and graduate students are presented believably, as are the pressures that come with different roles in a lab.  The production values are high, and this interactive movie should become an important part of the research-ethics resources for academic labs.

This movie is available as a download in the public domain, at no cost to the user, from the Office of Research Integrity at http://ori.hhs.gov/thelab.


What does a retraction tell us?

Retractionwatch.com has been a powerful partner in the examination of falsification in the published record, doing the hard work of gathering retraction notices and categorizing them, generating data from these retractions, highlighting research about retractions, and collecting wide-ranging comments, all in one readily available blog.

Data-images and retractions.  At Science Image Integrity our focus is on data-images rather than retractions—on what retractions can tell us about the extent and types of problems with data-images.  Earlier this year we looked at a small sample of 2010 and 2011 retractions in PubMed and found that the language in these retractions was inconsistent and often vague, even obscure; some gave no information other than the statement of retraction.  (In September, Ferric Fang, MD, and colleagues at the University of Washington examined this problem in detail in their 2012 article “Misconduct accounts for the majority of retracted scientific publications,” confirming our impressions.)  Because of the lack of information in many retractions, and the lack of clarity in many others, it is not possible to know what proportion of retractions involve problems with data-images.

Consistent terminology about data-images.  In addition, retractions involving data-images need a consistent language for describing different kinds of falsification.  Earlier this year we published a preliminary schema for kinds of falsification of data-images, and we’ve since refined it here.  It is inevitable that different institutions—the Office of Research Integrity, universities, research institutes—will use different terminology, but we hope that the community will reach a consensus about how to describe types of falsification.
The comments of Ferric Fang, MD, Professor of Laboratory Medicine and Microbiology at the University of Washington, apply both to the shortcomings of retractions and to the need for common terminology:

Specifically, the classification of data falsification or fabrication, plagiarism and intentional duplicate publication as forms of “author error” is confusing, as most studies have characterized these practices as misconduct, as opposed to error.  Similarly, lumping together methodological or analytical errors with non-reproducible data results in a category error because, as you know, we found a number of cases in which data irreproducibility was cited for what turned out to be suspected or documented fraud.

A Chinese proverb is said to state that “the beginning of wisdom is to call things by their right names.”  Thus, to understand retractions and to address their underlying causes, it is important not to limit our understanding to the incomplete and sometimes misleading information provided in retraction notices.

Link to Retractionwatch article: http://retractionwatch.wordpress.com/2012/10/30/most-retraction-notices-dont-involve-misconduct-or-flawed-data-new-study/

Link to Fang’s article:  http://www.pnas.org/content/109/42/17028   Subscription required

Students’ reliance on PI advisers in research ethics 

This summer, the Council of Graduate Schools published its report on research and scholarly integrity in graduate education.  According to an August 14 article by Beth Mole in the Chronicle of Higher Education, graduate students felt that they understood research ethics, but the report, produced by the Project for Scholarly Integrity, revealed that students nonetheless needed help in dealing with issues of research misconduct.

One finding was that students relied heavily on their PI advisers for guidance on research ethics rather than on university resources.  This is not surprising, but it limits the perspective available to the student, especially on issues that remain unsettled within the research community.  A prime example is digital data-images, where researcher-advisers even within the same department may have widely different perspectives and experience.  These differences are reflected in the increasing number of retractions based on questions of data-image manipulation and management.


Link to article: http://chronicle.com/article/How-to-Train-Graduate-Students/133623/  (access requires a subscription)

Link to study: http://www.cgsnet.org/benchmarking/best-practices-data/PSI-dashboard  (report by the Project for Scholarly Integrity)


Data-image misconduct in the lab:

Once, data-images were produced on film and processed in labs with established protocols and certified staff.  Now lab directors each have their own non-standardized practices and policies, with only the possibility of institutional oversight and of editorial and peer review to help them identify opportunities for error and research misconduct.  Very few societies have adopted policies or practices, and those that exist differ in content and detail.

Without national or international data-image management standards, it is up to each individual lab to decide upon protocols and best practices, with very little guidance if questions arise.

The case of Cardiff University’s investigations of publications produced by Dean Morgan’s lab was brought to our attention recently by a pseudonymous whistle-blower, Clare Francis, and we wonder whether such cases of mega-misconduct would be seen so frequently if PIs, researchers, and students could turn to a common set of standards.

Link to article:  http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=420811&c=1

Identifying self-plagiarism; clarification is needed for data-images too:

According to an article in The Chronicle of Higher Education (link at bottom), the University of Alabama at Birmingham conducted a needs assessment of graduate students on research ethics and found that the students did not feel confident in recognizing self-plagiarism.  The UAB students are not alone—senior researchers can have the same problem, as we described last fall when writing about Professor Wang at the Montreal Heart Institute.

UAB revised its plagiarism workshop, according to the article, to include interactive exercises that help students identify plagiarism.  Its online plagiarism policies have not been similarly updated, and we do not know whether the workshop covered images as well as text and tables.

For our schema on types of image falsification, including self-plagiarism, see Falsification Tables.

Standards in Data Image Management: are we seeing growing consensus?

Best practices in acquisition, post-processing, and creation of metadata for data-images in biomedicine are being developed by a variety of sources, including professional societies, research institutions, and individual labs.  As yet, however, there is no single recognized source of widely accepted practices.  Meaningful standards arise in a field when various guidelines, developed independently, begin to move toward a consensus; the standards are a formal acceptance and recognition of that consensus.

Training based on evolving guidelines can be found at national conferences such as Microscopy and Microanalysis and the Council of Science Editors annual meeting.  Some commercial ventures also offer training, such as Jerry Sedgewick’s annual Image and Analysis Workshops in Minnesota.

As yet, many institutions, societies, and publishers do not explicitly address data-images within their guidelines but instead provide generic data-management guidelines and assume that any special needs of data-images will be covered by the generic language.  The increased level of training, the discussion at national conferences, and the existence and popularity of OMERO (itself developed from a consensus that data-images require special management) all suggest that there is a place for standards for data-image management.


Stowers Institute: first research organization to give open access to data-images

The Stowers Institute, a biomedical research institute in Kansas City, Missouri, now requires members to deposit all original data-images in the Stowers Original Data Repository (ODR) or in repositories maintained by third parties.  The ODR was built on the OMERO platform by a team at the institute.

The huge data sets generated by imaging technology, among others, may require acquisition and post-processing steps to extract information and prepare images for publication. To allow members of the scientific community to make “independent, fully informed assessments of published results,” the institute now provides open access to the original data. An “Original Data” link is placed by publications listed on the Stowers website, and available data will also be accessible from the Stowers Institute homepage.
Link: http://stowers.org/media/news/jul-13-2012


OMERO comes through

The new release of OMERO 4.4 includes just the kind of improvements needed to keep it current with the rising focus on metadata creation for data-images.  The changes bring greater functionality for users; enhance the ability to capture, annotate, and view metadata; and provide clear, easy-to-find instructions for doing so.  The OME consortium clearly recognizes the importance of metadata in data-image management.

Link : https://www.openmicroscopy.org/site/products/feature-list


Keeping Guidelines Current

In 2003 the Microscopy Society of America (MSA) was almost alone among professional societies in developing policies about digital data-images.  The MSA recognized the need to provide basic ethical guidelines for digital data-images:

The MSA position on digital image processing has been approved as follows:
“Ethical digital imaging requires that the original uncompressed image file be stored on archival media (e.g., CD-R) without any image manipulation or processing operation. All parameters of the production and acquisition of this file, as well as any subsequent processing steps, must be documented and reported to ensure reproducible results.
Generally, acceptable (non-reportable) imaging operations include gamma correction, histogram stretching, and brightness and contrast adjustments. All other operations (such as Unsharp-Masking, Gaussian Blur, etc.) must be directly identified by the author as part of the experimental methodology. However, for diffraction data or any other image data that is used for subsequent quantification, all imaging operations must be reported.”
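The MSA’s distinction between non-reportable and reportable operations lends itself to simple bookkeeping.  The sketch below is our own illustration, not part of the MSA policy (the function and category names are our assumptions); it assumes each operation applied to an image is logged in order:

```python
# Illustrative only: the categories follow the MSA policy quoted above,
# but the names and data shapes here are our own assumptions.
NON_REPORTABLE = {"gamma_correction", "histogram_stretch",
                  "brightness", "contrast"}

def processing_report(steps, quantitative=False):
    """Return the applied operations that must be reported.

    steps: ordered list of (operation_name, parameters) tuples.
    For diffraction or other image data used for quantification,
    the policy requires that ALL operations be reported.
    """
    if quantitative:
        return list(steps)
    return [(op, params) for op, params in steps
            if op not in NON_REPORTABLE]

steps = [("gamma_correction", {"gamma": 1.2}),            # non-reportable
         ("unsharp_mask", {"radius": 2, "amount": 0.5})]  # reportable

reportable = processing_report(steps)  # just the unsharp mask
```

Under the policy’s stricter rule for quantitative data, passing `quantitative=True` marks every logged operation as reportable.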

Now, almost 10 years later, retractions and questions about data-images in submitted and accepted publications are frequent enough to ask whether the MSA should take the lead again by re-examining these policies and expanding them to include more specific guidelines.  We hope they will, and that they will emphasize the need to retain metadata with the unmanipulated original.  It would also be a service to the community if they clarified issues of falsification and plagiarism of data-images.

Link:  http://www.microscopy.org/resources/digital_imaging.cfm


How do we get there from here?    A look at author guidelines in data-image integrity.

Learning to do the right thing – The CSE/COPE Joint Session:
How does a journal or society create an ethics policy?  Christina Bennett, PhD, Publications Ethics Manager of the American Physiological Society (APS), shared the process the publications committee and editors-in-chief at APS used to develop their publication ethics guidelines, and provided templates for communicating concerns to authors.  Digital data-images and figures topped the list of ethics concerns in this session for editors, authors, and reviewers.  Seminars like this provide a significant service by recognizing the need for a reference point for the publishing community on this important topic, and we look forward to seeing more societies follow their lead.  We note and appreciate that image-data are featured in their Ethics poster, and we are developing a similar resource for those who use data-images in their manuscripts.

Demystifying Scientific Misconduct Issues through the Instructions to Authors – Mary Scheetz, PhD, Research Integrity, LLC; Patty Baskin, MS, Executive Director, Neurology Journals; Ken Kornfield, Senior Managing Editor, American Society of Clinical Oncology.

What should good instructions to authors (IAs) include?  Five journals’ IAs were examined for completeness and effectiveness, with discussion.  We are impressed that the journal’s stance on data-image manipulation was a line item in each IA.  We encourage all journals to update their IAs to include specific instructions on handling data-images, and we equally encourage authors to seek out the best guidelines even if their journal provides none (a gap we hope swiftly disappears).

We are grateful to Mary Scheetz and her colleagues for providing invaluable advice in these very clear slides.  We will follow up on these excellent resources in the next weeks.

CSE/COPE Joint Session: www.resourcenter.net/images/CSE/Files/2012/AnnMtg/Handouts/01_Bennett.pdf

Demystifying IAs:   http://www.resourcenter.net/images/CSE/Files/2012/AnnMtg/Handouts/16_Scheetz_Baskin_Kornfield.pdf

OMERO: a solution and model for managing data-images

An innovative approach to creating an architecture for storing data-images has been developed by the Open Microscopy Environment (OME), a collaboration among the Swedlow Lab at the University of Dundee (Scotland) and, in the United States, the National Institute on Aging and the Laboratory for Optical and Computational Instrumentation (LOCI).  The OME produces tools to support data management, especially for biological light microscopy; all OME formats and software are free and are designed to interact with commercial software.  One project is the Java-based OMERO software, which includes tools for storing, visualizing, managing, and annotating microscopic images and their metadata.

OMERO is used to power the JCB DataViewer, developed by the Journal of Cell Biology in 2008 as a voluntary repository of images in its published papers.  The American Society for Cell Biology (ASCB) uses OMERO for its Cell Image Library.  Both are browser-based applications for viewing original image data multidimensionally.  OMERO supports numerous file types, including the proprietary file types of popular image-analysis software such as Adobe Photoshop, ImageJ, and Image-Pro.

The Cell Image Library and the JCB DataViewer both have a detailed page for data-image viewing and simple analysis.  The power of OMERO combined with these public repositories gives users unprecedented access to original data-images.

The Cell Image Library and the JCB DataViewer are a preview of data management for data-images in the future.  What’s missing so far are best-practice guidelines for archiving the original images and standards for metadata that can be used to replicate post-processing.  With such guidelines in place, OMERO will be a powerful tool for institutions in promoting ethical post-processing of digital data-images for publication.

Link to Dataviewer:  http://jcb-dataviewer.rupress.org/jcb/page/imageformats
Link to Cell Library: http://www.ascb.org/ivl/design/microscopy.html


Retractions caused by inaccurate data-images:  more details needed
Data-images are increasingly a cause of retraction and correction of articles published in biomedical journals.  Although journals publish retraction and correction statements about problems with data-images, these statements often raise more questions than they answer.  Some retraction statements merely say that the article is retracted because of questions about figures; others are ambiguous.

A striking example of an uninformative statement is the pair of retractions by De Domenico et al. in Cell Metabolism, as reported by Retraction Watch (link below).  Even less informative are the 2009 retraction statements by four of the same authors in Cell Biology.

This lack of transparency serves neither readers nor those who study the integrity of data-images in science.  Readers need to know more because they relied on the published images when assessing the value of the article; when an article is retracted, they need to know what was wrong with the data.  Those who track problems with image integrity cannot understand what is happening if they do not know the type and extent of the problem that led to the retraction.
The Council of Science Editors and leading professional societies need to promote more-informative retraction notices.
Retraction Watch, May 23, 2012: “Authors retract two Cell Metabolism papers after ‘data were inappropriately removed from the laboratory’”; http://retractionwatch.wordpress.com/2012/05/23/authors-retract-two-cell-metabolism-papers-after-data-were-inappropriately-removed-from-the-laboratory


Where to store the data?

Best practices for handling data-images include retaining the original captured image, along with the capture and post-processing steps recorded as metadata.  Creating that storage space for a myriad of data types is challenging for research institutions, which can archive only some of their data.  At a recent conference on data management at the University of Virginia, librarians discussed the cost of data storage and the lack of consistent formats for data and metadata.

In the past, storage costs led some stakeholders to argue that since science is supposed to be replicable, storing data is redundant: the research paper, with its methods section, should be all that is needed to replicate the data.  During the conference James Hilton, CIO for the university, noted that while some data are easily replicated and do not need archival space, there are also bodies of data, such as observational data, for which replication is too expensive or impossible, and for these, archival storage should be provided.

Data types and metadata standards are not always uniform even within a field, creating additional barriers to developing effective institutional repositories.  Consistent retention of image-capture and post-processing steps as metadata is not yet built into the research process.  As research institutions grapple with storage issues, standardization of metadata must become part of best practices for data-image management.
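One low-cost way to build that retention into the research process is to record capture parameters and every post-processing step in a small “sidecar” file stored next to the archived original.  A minimal sketch, assuming JSON as the container (the field names are our own, not any published metadata standard):

```python
import datetime
import json

def write_sidecar(image_path, capture, steps):
    """Write a JSON sidecar next to the archived original image,
    recording capture parameters and an ordered, replicable list
    of post-processing steps."""
    record = {
        "original": image_path,
        "archived": datetime.date.today().isoformat(),
        "capture": capture,       # e.g., instrument, exposure, bit depth
        "processing": steps,      # every step, in order, with parameters
    }
    sidecar_path = image_path + ".meta.json"
    with open(sidecar_path, "w") as f:
        json.dump(record, f, indent=2)
    return sidecar_path
```

A repository ingest script could then refuse any image that arrives without its sidecar, making the metadata a condition of archiving rather than an afterthought.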

What Is a Falsified Image?

We have noticed that some authors respond to retractions of papers whose data-images have been categorized as falsified with assertions that they have done nothing wrong.  This table presents a schema for understanding the different forms of data-image falsification.  The basic distinctions in the schema have to do with whose dataset is being used, whether the dataset is the subject of the present report, and whether the image has been manipulated inappropriately.



Falsification and Plagiarism

Images from the dataset being reported:

A.
  • Image from researcher’s own dataset
  • Deliberately mislabeled
  • No inappropriate manipulation of image
  Falsification = mislabeling

B.
  • Image from researcher’s own dataset
  • Accurately labeled
  • Inappropriate manipulation of image
  Falsification = inappropriate manipulation

Images from other datasets:

C.
  • Image from another researcher’s dataset (published or unpublished)
  • Manipulation of image: none or inappropriate
  Falsification through plagiarism = claiming another’s work as one’s own
  Falsification through manipulation: only if image was also inappropriately manipulated

D.
  • Image from own dataset, previously published
  • Deliberately mislabeled
  • Manipulation of image: none or inappropriate
  Falsification through self-plagiarism: claiming that the image represents the dataset being reported
  Falsification through manipulation: only if image was also inappropriately manipulated


  • In A, the researcher uses an image from the dataset being reported, makes no inappropriate manipulations, but deliberately mislabels it.  The falsification here is the mislabeling.
  • In B, the researcher again uses his own image from the dataset being reported in the paper but manipulates it inappropriately.  The falsification is the inappropriate manipulation.

By contrast, situations C and D involve both falsification and plagiarism.

  • In C, the researcher uses another researcher’s dataset (whether it has been published or not), mislabels it as her own, and may or may not have manipulated the image inappropriately.  The falsification is claiming the image as her own; she may also have falsified the image itself.
  • In D, the researcher uses an image from his own previous research that has been published and deliberately mislabels it as belonging to the new research.  The falsification is claiming that the image is from the current research.  (This is sometimes called “self-plagiarism.”)

A better grasp of data-image handling involves knowing what is acceptable and what is not.  We hope this table can provide some clarification in the conversation on data-image integrity in the sciences.
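The schema’s distinctions are mechanical enough to express as a small decision procedure.  This sketch is our own restatement of situations A–D, not an official classification tool:

```python
def classify(own_dataset, from_reported_dataset, mislabeled, manipulated):
    """Apply the falsification schema to a single image; returns a list
    of findings (empty if no falsification category applies).

    own_dataset: image comes from the researcher's own work
    from_reported_dataset: image belongs to the dataset being reported
    mislabeled: image is deliberately mislabeled
    manipulated: image was inappropriately manipulated
    """
    findings = []
    if not own_dataset:
        # Situation C: another researcher's image claimed as one's own.
        findings.append("falsification through plagiarism")
    elif not from_reported_dataset and mislabeled:
        # Situation D: own, previously published image presented as current.
        findings.append("falsification through self-plagiarism")
    elif mislabeled:
        # Situation A: own image from the reported dataset, mislabeled.
        findings.append("falsification by mislabeling")
    if manipulated:
        # Situation B, alone or combined with C/D.
        findings.append("falsification by inappropriate manipulation")
    return findings
```

For example, situation A is `classify(True, True, True, False)`, and situation D with an untouched image is `classify(True, False, True, False)`.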

The 1%: journal manuscripts with serious image falsification

How common a problem is data-image falsification in journal articles?  Rockefeller University Press and the Office of Research Integrity have separately collected data showing that falsification of data-images occurs in about 1% of their cases.

Journal of Cell Biology.  Liz Williams, PhD, executive editor of JCB, reported in March that, after screening the images in accepted articles, 1% of the articles were found to have falsified images, and acceptance was revoked.  As an example, JCB had the following data for 2002-2011:

3341 accepted papers; all images were screened for inappropriate manipulation

 531 papers:  the journal requested the original images for examination

  35 papers:  acceptance was withdrawn because of falsification of images

This means, she concluded, that without screening the JCB would have published about four papers each year with falsified images.

Office of Research Integrity.  Likewise, John Krueger, PhD, Scientist-Investigator at the federal Office of Research Integrity, reported that data from American Journal of Respiratory and Critical Care Medicine, Blood, and Journal of Biological Chemistry also showed that 1% of accepted manuscripts had serious image falsification.  He went on to note that this 1% represents $300M in research funding.

Liz Williams and John Krueger spoke at the March 20, 2012, meeting of Science Online NYC (SoNYC), sponsored by Nature.com.
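The JCB figures can be checked directly; a quick sketch of the arithmetic:

```python
accepted = 3341   # papers accepted 2002-2011, all images screened
requested = 531   # papers whose original images were requested
withdrawn = 35    # acceptances revoked for image falsification

falsification_rate = withdrawn / accepted  # roughly 1% of accepted papers
followup_rate = requested / accepted       # originals requested for ~16%
per_year = withdrawn / 10                  # ten years of data
```

Thirty-five withdrawals over ten years is 3.5 per year, consistent with Williams’s estimate of roughly four falsified-image papers per year absent screening.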


University research librarians and data management—including images

On April 10, 2012, the University of Virginia Libraries held its first conference on data management, successfully facilitating a powerful opportunity for local universities to collaborate.  In terms of data-images, what was most interesting was not what was included on the agenda but what wasn’t.

Hosted by the University’s Scientific Data Consulting group (SciDaC) from the Charles Brown Science and Engineering Library, this conference for librarians at major Virginia universities explored the roles and resources of libraries in curating and managing research data.  The meeting grew from researchers’ need for data-management plans and librarians’ increasing role in assisting researchers and their home institutions with data management.

Although management of data-images was not on the agenda, the library participants were especially interested in information about emerging guidelines for data-images and about tools for creating metadata for them.  We will cover ideas and issues from the conference in coming weeks.


Readers are becoming science gatekeepers

In 2011, the Nature journals joined with Rockefeller University to offer a monthly online panel discussion on how the internet is changing science as we know it.  On March 20 the topic was “Setting the Research Record Straight,” a discussion of research ethics and retractions with a strong focus on data-image integrity.  The main presenters were Liz Williams, PhD, executive editor of JCB; John Krueger, PhD, scientific investigator with the ORI; and Ivan Oransky, MD, a founder of Retractionwatch.com.

Liz Williams noted that digital publication means that editorial review now continues in perpetuity and includes non-experts.  John Krueger observed that 20% of data-images presented to journals require corrections, and 1% raise questions of fraud; he pointed out that this 1% is not represented in retractions.  Ivan Oransky emphasized that readers now extend the review process and, with broad, continuous electronic access, can view data-images in unprecedented ways.

This extended review period is the new reality in science publishing, yet many of our procedures and standards have not caught up, and there is pent-up demand for guidance on data-image integrity.

The link to the live-stream video of “Setting the Research Record Straight” can be found here.  The link to Science Online NYC can be found here.


How many of ORI’s misconduct decisions involve data-images?

Each year the Office of Research Integrity posts summaries of closed cases where it found misconduct.  In 2011, the ORI reported on 13 closed cases: 38% involved digital data-images.  All but one of them involved falsifying images after capture, and the other was a case of altering the gain settings on an instrument in order to obtain false data.  One year’s data may not reflect the patterns for longer periods, so we may also look at cases closed in 2009 and 2010.  (Because ORI removes from the website all cases for which the penalty has been completed, it will not be possible to look at earlier cases on this site.)

To see the cases: http://ori.hhs.gov/case_summary

Instructions to Authors—a Front-line Defense for image integrity

Journals’ Instructions to Authors give vital rules and guidelines, but unfortunately they seldom communicate expectations for ethical handling of data-images.  At the 2011 Council of Science Editors (CSE) annual conference, we presented a poster (CSE Poster link) on how many journals, in a sample of 446, gave specific instructions for handling data-images and how many treated images as illustrations.  Only 50% of the journals had specific instructions on data-images.

At the upcoming 2012 CSE conference (May 18-21), two sessions will cover data-images as part of research ethics.  Mary Scheetz, Research Consultant for Research Integrity Consulting (and former officer at the Office of Research Integrity), and Patty Baskin, Executive Editor of Neurology, will include data-image handling as one of several topics in the Sunday panel discussion “Demystifying Scientific Misconduct Issues through Instructions to Authors,” and Christina Bennett, PhD, Publications Ethics Manager, American Physiological Society, will include a section on the integrity of data-images in Saturday’s full-day course on Publication Ethics.  It is encouraging to see these issues covered at a national meeting of editors and publishers.

Reaching consensus about best practices

In any field, it is difficult to reach consensus about best practices.  For issues involving digital data-images, consensus is even harder to reach because of the varied disciplines that use data-images and the many ways the images may be used.  Even within a single discipline, agreeing on best practices can be frustrating.  One of the few societies in the biomedical sciences to tackle these difficulties is the Microscopy Society of America (MSA), which has had an official policy on digital-image processing since 2003.

Developing consensus on best practices for appropriate capture and post-processing of data-images faces familiar obstacles: limited time in busy professional lives; inertia; and general resistance to change.  In the search for guiding principles—much less practices—for the capture, processing, and archiving of data-images, it is important not to underestimate the effort involved.  John Mackenzie of North Carolina State University makes a case for “standardization in scientific digital imaging in order to ensure proper ethical manipulation” in the description of the short course—and, we hope, in the course itself—that he will teach later this year.  (MSA’s Microscopy and Microanalysis Conference, July 20-August 2, 2012)

Pitfalls of Western Blots

Western blots belong to a category of images that involve densitometry, the measurement of brightness-to-darkness ratios, and that category dominates the field in image-related article retractions and corrections.  In Scientific Imaging with Photoshop, author Jerry Sedgewick states that, “In general, images destined for OD/I measurements should not be altered in any way.  Exceptions to that rule are acceptable as long as the procedures are documented and described in the publication.”  Images captured for the purpose of densitometry permit the least manipulation; as Sedgewick observes, “The only permissible change to electrophoretic specimens, aside from those taken to conform image to outputs, is to eliminate dust and scratches, but only by a global application of a filter.”

Why, then, are Western blots the focus of so many retractions?  In a 2009 article, “Quantifying Western blots: Pitfalls of densitometry,” Gassmann and colleagues observe, “Although Western blots are frequently quantified, densitometry is not documented.”  In their study of 100 randomly selected papers, “none provided sufficient information on how Western blot results were translated into statistical values.”  The study concludes that “the necessity of clear definition and documentation of densitometry is evident from our analysis” and adds that “the same data can be used to make – statistically proven – completely contradictory statements.”
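Documenting densitometry is not difficult in principle: what matters is that the background model, the band regions, and every correction are stated explicitly.  A toy sketch of a documented measurement (synthetic numbers and function names of our own, not any published densitometry tool):

```python
def band_density(image, rows, cols, background):
    """Integrated intensity of one band ROI after subtracting a documented,
    globally applied background estimate (higher values = more signal)."""
    total = 0.0
    for r in range(rows[0], rows[1]):
        for c in range(cols[0], cols[1]):
            total += max(image[r][c] - background, 0.0)
    return total

# Synthetic densitometric scan: flat background of 10 with two bands.
blot = [[10.0] * 200 for _ in range(50)]
for r in range(20, 30):
    for c in range(30, 60):
        blot[r][c] += 100.0   # band of interest
    for c in range(120, 150):
        blot[r][c] += 50.0    # loading-control band

target = band_density(blot, (20, 30), (30, 60), background=10.0)
control = band_density(blot, (20, 30), (120, 150), background=10.0)
ratio = target / control      # report this, plus every step above
```

With the ROIs, the background value, and the normalization recorded, another reader can reproduce the reported ratio from the archived original, which is exactly the documentation Gassmann and colleagues found missing.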

Image fraud in papers by Sylvia Bulfone-Paus

In 2009, immunologist Sylvia Bulfone-Paus faced allegations of fraud involving both image and data manipulation.  She was director of the immunology laboratory at the Research Center Borstel in Germany and also a professor of immunology at the University of Manchester in the UK.  The investigations came after a persistent (and sometimes unsavory) campaign by anonymous bloggers who reported fraud in her lab.

Papers in 2001-2009.  Investigations by the center found fraudulent image manipulation (images of protein blots from unrelated experiments, for example), and six papers were retracted.  Then the DFG, the German research funding agency, investigated separately and found more fraud, and another six papers were retracted.  She resigned as director.  She and the investigators blamed two former post-docs who were co-authors on the papers, and she claimed to have no knowledge of the fraudulent activities.

Papers before 2001.  However, examination of Bulfone-Paus articles published before 2001 (before the post-docs joined the lab) revealed further problems with data.  In 2011 Transplantation retracted a 2000 paper because of a fraudulent image—the image for a control (untreated) was the same image used in her 1997 paper in Nature Medicine.  Further, Blood is investigating duplicate image-data in a figure in a 1999 paper, and BioEssays has issued an Expression of Concern about a 2006 article coauthored with the two post-docs and her husband.  Apparently no decisions have yet been made by these two journals.


Retraction Watch: 25% of its retractions are for image manipulation

Retraction Watch allows readers to search its collection of retractions by the reason for the retraction. As of February 14, 2012, retractions for image manipulation account for a quarter of all retractions in the database, essentially tied with those retracted because they were not reproducible. The top five reasons given for the 230 retractions, together accounting for 94%, were:

Not reproducible: 24.7%
Image manipulation: 24.3%
Plagiarism: 18.3%
Duplication: 15.2%
Faked data: 11.3%

Retraction Watch (http://retractionwatch.wordpress.com) is a website devoted to “tracking retractions as a window into the scientific process.”  


Experts Agree: Preserve the original

There is no universal standard across scientific disciplines for manipulation of digital-data images. Experienced and respected colleagues, however, all voice two fundamental practices: save your original, and document all post-processing steps in a replicable fashion.

As John Russ summarized about ethics in digital-data image handling: “The heart of the scientific method is replicability. If adequate information is provided on the processing steps applied, and the original image data are preserved, then the validity of the results can be independently verified.”

Jerry Sedgewick wrote about the crucial role of the original image: “The sole means for determining the extent of the existence of alteration or additions lies in looking at the original.” (Scientific Imaging with Photoshop: Methods, Measurements, and Output, 2008)

Doug Cromey of the Southwest Environmental Health Sciences Center reminds researchers that: “Scientific digital images are data that can be compromised by inappropriate manipulations. … Maintaining a copy of the unaltered original image is the user’s only protection against accusations of misconduct. This is also the only way that users can recover from a mistake in image processing. Honesty is the best policy. If portions of an image for publication were selectively enhanced, the author should state it clearly in the figure legend.”

The Microscopy Society of America’s resolution on ethical digital image processing sets the standard for its members: “Ethical digital imaging requires that the original uncompressed image file be stored on archival media (e.g., CD-R) without any image manipulation or processing operation. All parameters of the production and acquisition of this file, as well as any subsequent processing steps, must be documented and reported to ensure reproducibility.” (Microscopy Today, Nov/Dec 2003, p. 61)

These two essential practices are the foundation for proper archiving of all digital-data images.
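Both practices can be supported with a few lines of standard-library Python. The sketch below is a minimal illustration, not an established tool; the file names and the JSON trail format are our own assumptions. The original is copied untouched into an archive with a SHA-256 fixity hash, and every later processing step is appended to an audit trail.

```python
# Sketch of the two practices: archive the unaltered original with a
# checksum, and keep a replicable log of every processing step.
# File names and the JSON trail format are hypothetical.
import hashlib
import json
import shutil
from pathlib import Path

def archive_original(image_path, archive_dir):
    """Copy the unaltered original into the archive and record its SHA-256."""
    archive_dir = Path(archive_dir)
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / Path(image_path).name
    shutil.copy2(image_path, dest)        # preserves timestamps as well as bytes
    digest = hashlib.sha256(Path(image_path).read_bytes()).hexdigest()
    return dest, digest

def log_step(trail_path, operation, parameters):
    """Append one processing step so the workflow can be replicated later."""
    trail = Path(trail_path)
    steps = json.loads(trail.read_text()) if trail.exists() else []
    steps.append({"operation": operation, "parameters": parameters})
    trail.write_text(json.dumps(steps, indent=2))

# Usage sketch (hypothetical file names):
# dest, sha = archive_original("blot_lane3.tif", "archive/")
# log_step("blot_lane3.audit.json", "brightness", {"delta": 10, "applied_to": "whole image"})
```

The hash lets anyone verify later that the archived original has not been altered, and replaying the audit trail reproduces the published figure from it.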


Dr. Wang doesn’t think he did anything wrong

Dr. Zhiguo Wang of the Montreal Heart Institute (MHI) still does not feel he has done anything wrong. In six now-retracted articles published in 2007 and 2008 in leading peer-reviewed journals, Wang used western blot bands from other, previous studies as evidence in articles about subsequent studies. This is a clear violation of research ethics, not to mention a copyright violation where the previous articles had been published. MHI funded the research and set up an expert committee, which concluded that Wang had violated the ethical policies of the institution and of the journals that published the articles. Dr. Wang was forced to resign in September 2011. See www.retractionwatch.com for details of the case.

In an interview with Retractionwatch.com staff, Wang commented, “We noticed some mistakes in the Western blot band images shown in these papers. These mistakes do not invalidate our results and conclusions, and we and others have been able to reproduce the data reported in these papers.”

This comment reveals two misunderstandings. First, no ethical researcher presents images from one study as if they came from a different, later study. Second, and more important, his reaction reflects a widespread and outdated understanding of images in reporting research results: the idea that an image is merely an illustration of a concept, transferable from one paper to another, rather than the recognition that the image is the data. No researcher would make this mistake with numerical or statistical data: claiming that numerical data from Study A is actually data from Study B (and C and D) is false and therefore unethical. It is vital that data-images be recognized as data, not illustrations.
In a presentation at the 2011 annual Council of Science Editors conference, we reported a study of over 400 biomedical journals which revealed that approximately 50% of the journals’ author instructions addressed only the handling of illustrations, with no special instructions or standards for data-images. We believe it is time for data-images to be recognized as a special form of data, and for journal author instructions to reflect the needs of digital data-images in science.

“Yes, I cut and pasted the images but didn’t do anything wrong.”
In a 2011 case submitted to the UK’s Committee on Publication Ethics (COPE), the editors of a journal were convinced that false bands had been included in an assay figure (http://www.publicationethics.org/case/duplicate-publication-and-alleged-image-manipulation). The author had not described any image manipulations when submitting the article. When the editors contacted the author, he admitted that “some of the figures had been made by copy/paste but he maintained that the conclusions of the article are correct.” The editors then investigated other papers they had published by the same author and found what appeared to be inappropriate manipulation. When contacted again, the author gave the same response, as did his co-author. The editors found themselves dealing with authors who believed that cutting and pasting assay figures (gel blots) was not wrong as long as the authors considered the results accurate. If nothing else, this COPE case points to the need for better and more thorough education about producing and manipulating data-images. Senior researchers may need this education as much as graduate students do. All parts of the research community need to be involved: research institutions, universities, academic and professional societies, granting agencies, and national organizations devoted to responsible conduct of research.
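Because pasted bands in cases like this are often pixel-for-pixel copies, even a naive comparison can expose them. The toy sketch below is our own illustration, not a COPE or journal screening tool: it finds identical patches in a small grayscale array. Real forensic screening is far more sophisticated, since it must also catch rotated, rescaled, or retouched copies.

```python
# Toy duplicate-patch detector: flags any two h-by-w regions of a
# grayscale image (a list of pixel rows) that are pixel-identical.
# Purely illustrative; real screening tools handle transformed copies too.

def find_duplicate_patches(image, h, w):
    """Return pairs of top-left corners whose h-by-w patches match exactly."""
    rows, cols = len(image), len(image[0])
    seen = {}       # patch contents -> first location found
    pairs = []
    for r in range(rows - h + 1):
        for c in range(cols - w + 1):
            key = tuple(tuple(image[r + i][c + j] for j in range(w))
                        for i in range(h))
            if key in seen:
                pairs.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return pairs

# A 4x8 "image" in which columns 0-1 were pasted into columns 6-7:
img = [
    [9, 9, 0, 1, 2, 3, 9, 9],
    [8, 8, 4, 5, 6, 7, 8, 8],
    [9, 9, 1, 2, 3, 4, 9, 9],
    [8, 8, 5, 6, 7, 8, 8, 8],
]
print(find_duplicate_patches(img, 4, 2))  # -> [((0, 0), (0, 6))]
```

The exhaustive search is quadratic in the number of patches, which is fine for a figure panel but is one reason production screening relies on smarter algorithms.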
Image-data need special handling in DMPs

Requiring data-management plans (DMPs) in grant proposals is a new attempt to protect the integrity of the scientific record. The National Science Foundation (NSF), for example, now requires a data plan in every grant application, and various organizations are working to design materials and tools to help researchers create and document their DMPs. Organizations such as DataOne and the University of California Curation Center of the California Digital Library are developing strategies to bring greater awareness of data organization during project planning. For example, a consortium of six universities, led by the University of California and including the University of Virginia, is developing DMPTool, a service to help researchers meet requirements for data management plans. Like other leading research universities, the University of Virginia has created the Science Data Consulting group (SciDaC) to formulate institutional templates for researchers who seek NSF funding.

But the NSF requirements and new tools such as DMPTool do not take into account the special nature of data-images, which require their own dedicated plan to address entrenched routines. DataOne recognizes that data-images are a special category of data because of the challenges they present to indexing for later retrieval: “Multimedia metadata is particularly important for retrieval, as the objects do not contain text that can be indexed.” Because all data management relies on indexing systems, this is a substantial obstacle to managing data-images well, and there are no clear guidelines for addressing data-images. We believe that any DMP that covers data-images should contain guidelines and requirements that address the unique properties of digital data-images.
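To illustrate why indexing matters, here is a hypothetical metadata record for one archived data-image. The field names are our invention for illustration, not a DataOne or NSF schema; the point is that the searchable text lives in the record, because the image bytes themselves cannot be indexed.

```python
# Hypothetical metadata record accompanying an archived data-image.
# Field names are illustrative, not drawn from any published schema.
import json

image_record = {
    "file": "fig2_western_blot_lane3.tif",      # archived original, never edited
    "sha256": "…",                               # fixity hash of the original (placeholder)
    "instrument": "gel documentation system",    # acquisition device (example)
    "acquired": "2012-01-15T14:32:00",
    "experiment": "protein X knockdown, replicate 2",
    "keywords": ["western blot", "protein X", "densitometry"],
    "processing_log": "fig2_lane3.audit.json",   # replicable list of all edits
}

# Indexing the keywords and experiment fields is what makes the image
# findable later; without this record the file is an opaque blob.
print(json.dumps(image_record, indent=2, ensure_ascii=False))
```

A DMP that treats data-images as a special category would require a record like this for every archived image, alongside the untouched original and its processing log.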
Getting it Right: Digital Data-Images and Journal Author Instructions

Thoughts on the Journal of Molecular Medicine’s 2008 statement on scientific misconduct and data-images
We recently re-read editors Detlev Ganten and Gregg L. Semenza’s thoughtful statement on research integrity and their journal’s new policies, and it made us think more about improving author instructions. The J Mol Med editors state clearly that their primary concern is that pressure to publish is leading to more research misconduct, and they seek to promote integrity in research and limit research misconduct in the articles they publish. It is heartening that they specifically write about manipulation of digital data-images, but their focus seems to be on deliberately inappropriate image manipulation; it is not clear from their statement that they recognize the extent to which digital data-images are especially liable to unintentionally inappropriate manipulations.

The editors seek to “further increase the readers’ confidence in the results”, and have hit the mark in requiring “authors to provide raw data of blots or gels as electronic supplementary material.” This is an excellent beginning and far more than most journals ask, but we still think more needs to be done. We think journals should make at least three specific changes to their instructions to authors in order to fully address the complexities of dealing with data-images:

1) Insist that authors retain the original, unmanipulated data-image for all data-images and make them available upon request.

2) Require that authors retain the audit trail of image manipulations for images being submitted.

3) Require that any data-image submitted for publication include an appropriate description of any manipulations.

While screening of images can be an important tool for journals, the cost is usually more than journal budgets can bear. Our suggestions require minimal oversight during routine operations. Further, these three requirements will help protect the journal and the scientific record by applying to digital data-images the same kinds of requirements long common for other types of data.

D. Ganten, G. L. Semenza, C. Nolte, “Fostering trust: J Mol Med’s scientific integrity policy,” Journal of Molecular Medicine, Volume 87, Number 1, pp. 1-2 (DOI: 10.1007/s00109-008-0428-x)

Tips for working correctly with digital images: Doug Cromey of the University of Arizona recently published an excellent one-page overview of the importance of teaching researchers appropriate manipulation of digital data-images.

“… It is imperative that researchers be taught how to correctly work with digital images”

He gives concise, practical guidelines on correct acquisition, archiving, data management plans, and journal instructions. He also cites general image-manipulation guidelines that are available for discussion and training. The article is “Scientific Digital Image Data: Handle With Care,” published in the Office of Research Integrity Newsletter (Vol 19, no 3, June 2011, page 2); http://ori.dhhs.gov/documents/newsletters/vol19_no3.pdf.

Research and ethics in publishing has an active new watchdog in Retractionwatch.wordpress.com, a blog dedicated to encouraging public discussion of ethics in science journal publishing. The authors say, “We’re interested in whether journals are consistent. How long do they wait before printing a retraction? What requires one? How much of a public announcement, if any, do they make? Does a journal with a low rate of retractions have a better peer review and editing process, or is it just sweeping more mistakes under the rug?”

Readers identify problems with published images. Nature recently added a “Comments” feature to its online articles to promote post-publication review. Soon after online publication of an article in April (Vol. 472 [April 21]:356-36), readers began to post comments about possible problems with the figures. Within 8 hours, a senior editor posted a notice saying that the issues were being investigated. The comments appear at the foot of the article.

“Twisted Pixels.” Douglas W. Cromey of the University of Arizona College of Medicine published an article in the Science and Engineering Ethics special issue on responsible data management (2010;16[Dec 2010]:639-667): “… The problem is twofold: (1) the very small, yet troubling, number of intentional falsifications that have been identified, and (2) the more common unintentional, inappropriate manipulation of images for publication.” He proposes 12 guidelines for scientific digital image manipulation and discusses the technical reasons behind them.

Changing standards for manipulation of images. “Standards for appropriate manipulation of digital data have developed more slowly than has the software for manipulating the images. Half of all cases now investigated by the federal Office of Research Integrity involve questions about digital images. Editors of biomedical journals have begun setting standards for published images, but many researchers are not aware of these changing standards.” (CSE poster, “Changing Standards,” by Addeane Caelleigh and Kirsten Miles, Council of Science Editors 2011 Annual Meeting)

Nature Group adopts policies for manipulation of data-images. Veronique Kiermer, executive editor of Nature and the Nature journals, described their policies on manipulation of digital data-images and the enforcement practices. She emphasized the importance of education about image manipulation and journals’ responsibilities. (panel presentation at the 2011 Annual Meeting of the Council of Science Editors)

ORI’s view of image integrity in scientific publishing. John W. Krueger, scientist-investigator for the U.S. Office of Research Integrity, discussed the factors that drive the incidence of inappropriate manipulation, described the ORI’s tools for examining images, gave suggestions for helping reviewers and authors, and raised questions about journals’ obligations. (panel presentation at the 2011 Annual Meeting of the Council of Science Editors)
How journals can best respond to manipulation of digital data-images. Kirsten Miles of P.I. Outcomes (Charlottesville, Virginia) presented questions and suggestions for journals as they deal with digital data-images. She emphasized the extended life-cycle of data, showed examples of image screening, and suggested four “best practices” for journals in managing images. (panel presentation at the 2011 Annual Meeting of the Council of Science Editors)
CSE Short Course for Editors now covers image fraud. The well-known course, designed for newly appointed editors of science journals, now covers image fraud as part of its two-day curriculum. The description of the 2011 course, offered at the CSE Annual Meeting in Baltimore, included image fraud in its general overview. Watch for this course at future conferences.