Quality Assurance in Research

In research contexts, quality assurance (QA) refers to the strategies and policies that ensure data integrity, quality, and reliability are maintained at every stage of a project. This includes preventing errors from entering datasets, taking precautions before data is collected, and establishing procedures for handling data throughout the study.

Quality assurance is important for many reasons. The most obvious is that the whole point of a research project is to produce reliable data that yields rigorous and reproducible results. There are other important factors as well. Institutional Review Boards (IRBs), funding agencies, and other organizations that oversee research activity often require that quality assurance procedures be built into project workflows, both to ensure that policies are followed and to confirm that disbursed funds go to well-organized and well-executed projects. There are also compliance concerns: research projects must be able to demonstrate that data collection and analysis followed all protocols for human and animal subjects, privacy rules and regulations such as HIPAA and FERPA, and other safeguards that guarantee research is conducted responsibly. In some instances, administrative audits are conducted to evaluate your project’s quality assurance and policy compliance.

Having quality assurance practices in place not only supports compliance, it also helps you evaluate your own research and data management practices so you can produce the best results possible.

Here are some steps you can take to promote quality assurance in your research:

Establishing clear data normalization protocols: Normalizing the data you record can have a substantial impact on every aspect of your research project. Normalizing means standardizing all the features and categories of data so that everyone working on the project has a clear sense of how to record it as it’s collected. Planning ahead and defining clear protocols before collection begins means that all data in the project adheres to the same standards.
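
To make this concrete, here is a minimal sketch in Python of what a normalization protocol might look like once written down; the field names, date format, and canonical labels are hypothetical stand-ins for whatever your project actually defines.

```python
from datetime import datetime

# Hypothetical rules agreed on before collection begins: one canonical label
# per category, and all dates stored as ISO 8601.
CANONICAL_SEX = {"m": "male", "male": "male", "f": "female", "female": "female"}

def normalize_record(raw: dict) -> dict:
    """Map a raw field-collected record onto the project's agreed standards."""
    return {
        "participant_id": raw["participant_id"].strip().upper(),
        # Accept the date format used on paper forms; store ISO 8601.
        "collection_date": datetime.strptime(
            raw["collection_date"], "%m/%d/%Y"
        ).date().isoformat(),
        "sex": CANONICAL_SEX[raw["sex"].strip().lower()],
    }

print(normalize_record({"participant_id": " p001 ", "collection_date": "03/14/2024", "sex": "M"}))
# {'participant_id': 'P001', 'collection_date': '2024-03-14', 'sex': 'male'}
```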

Using consistent data formats and measurement standards: Using consistent formats and measurement standards is part of the normalization process, and you can often find controlled vocabularies or ontologies that provide established structural and definitional guidelines for your data based on your discipline. This results in data that is consistent not only within your own project, but also for others who may want to use it later for further analysis or evaluation.
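
As an illustration, a controlled-vocabulary check can be as small as the sketch below; the habitat terms here are made up, and in practice you would load the published term list or ontology for your discipline.

```python
# Hypothetical vocabulary; in practice, use a published list for your field.
HABITAT_TERMS = {"forest", "grassland", "wetland", "desert"}

def check_vocabulary(records: list[dict], field: str, vocabulary: set[str]) -> list[str]:
    """Report every value of `field` that falls outside the vocabulary."""
    return [
        f"row {i}: {field}={record.get(field)!r} is not a recognized term"
        for i, record in enumerate(records)
        if record.get(field) not in vocabulary
    ]

records = [{"habitat": "forest"}, {"habitat": "Forrest"}]
print(check_vocabulary(records, "habitat", HABITAT_TERMS))
# ["row 1: habitat='Forrest' is not a recognized term"]
```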

Rigorous data handling and analysis procedures: This is one of the most crucial components of quality assurance, because data collection introduces significant opportunities for human error to undermine the integrity of data. At every stage where a researcher records, transforms, or analyzes data, there is the potential for simple mistakes. Identifying the stages where errors are most likely to occur and putting preventative measures in place can minimize those errors. Simple measures such as establishing data and measurement formats help, and the tools you select for data collection can also have a significant impact.
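
One common preventative measure is an automated validation pass that flags likely entry errors before they propagate. The sketch below shows the idea; the field name and plausible range are illustrative assumptions, not project requirements.

```python
def validate_measurement(record: dict) -> list[str]:
    """Flag values that are more likely transcription errors than real data."""
    errors = []
    try:
        temp = float(record["body_temp_c"])
        # Values outside a plausible human range usually mean a slipped
        # decimal point or a units mix-up at entry time.
        if not 30.0 <= temp <= 45.0:
            errors.append(f"body_temp_c={temp} is outside the plausible range 30-45")
    except (KeyError, ValueError):
        errors.append("body_temp_c is missing or not a number")
    return errors

print(validate_measurement({"body_temp_c": "370"}))
# ['body_temp_c=370.0 is outside the plausible range 30-45']
```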

Select data collection and storage tools that promote data consistency: Spreadsheets, for instance, are notorious for making it easy for errors to enter your data because they offer few controls on how data is entered. Other tools, such as databases or fillable forms, provide features that let you control how data is entered. If you have a large team of researchers collecting data in the field or in varying contexts, inconsistencies arise easily; if the tools those researchers use enforce consistency, you will be more successful at maintaining data integrity at every stage of handling data.
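
As a sketch of this difference, a database schema can refuse inconsistent entries outright. The example below uses SQLite from Python's standard library; the table and its constraints are hypothetical.

```python
import sqlite3

# CHECK constraints enforce at entry time what a spreadsheet leaves to
# individual discipline: rows that break the project's rules never get in.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observations (
        participant_id TEXT NOT NULL,
        sex TEXT CHECK (sex IN ('male', 'female')),
        body_temp_c REAL CHECK (body_temp_c BETWEEN 30.0 AND 45.0)
    )
""")
conn.execute("INSERT INTO observations VALUES ('P001', 'male', 36.6)")  # accepted
try:
    conn.execute("INSERT INTO observations VALUES ('P002', 'M', 370)")  # rejected
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```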

Metadata documenting how data was collected, recorded, and processed: Documenting how your data was handled throughout the project is good practice for a host of reasons, and it is particularly helpful for maintaining data integrity. Making your data handling procedures explicit and formal, as metadata requires, forces you to consider these issues carefully. It also clarifies any ambiguities in your workflow, so that a researcher working during the project, or one making use of your research outputs at a later date, can identify where the data is correct and where errors may have occurred.
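
In its simplest form, this can be a machine-readable provenance record written alongside each output file, as in the sketch below; the file names and fields are illustrative rather than a formal metadata standard.

```python
import json
from datetime import date

# Illustrative provenance record saved next to the processed file, so a later
# reader can reconstruct exactly how it was produced.
metadata = {
    "source_file": "field_survey_raw.csv",
    "output_file": "field_survey_clean.csv",
    "processed_on": date.today().isoformat(),
    "processed_by": "A. Researcher",
    "steps": [
        "normalized dates to ISO 8601",
        "mapped sex labels to the project's controlled vocabulary",
        "flagged body_temp_c values outside 30-45 C for review",
    ],
}

with open("field_survey_clean.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```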

Research staff training: Perhaps the most important thing you can do to produce consistent and reliable data is to make sure that everyone working on the research project, from seasoned researchers to graduate and undergraduate team members, has proper training in all data collection and analysis procedures. Having everyone on the same page means you can be confident that each person working on the project knows how their data handling tasks contribute to the project’s overall data quality goals.

Want to learn more about this topic? Check out the following resources: 

The UK Data Service provides detailed information on establishing quality assurance strategies in your research. 

DataONE provides guidance on how to craft a quality assurance plan that will allow you to “think systematically” as you put these protocols in place.