Working with NIBRS
Understanding NIBRS Data Quality
by Janet Whitaker, NY State Division of Criminal Justice Services
The integrity and reliability of data-based analysis and reporting depend, in large part, on the quality of the underlying data. The quality control of crime reporting data should begin at the local points of data collection, data entry, and data processing. This section gives an overview of the FBI-required edits. Related links provide examples, with code, for verifying the accuracy of local data sets.
For NIBRS, the FBI offers multiple levels of data quality controls and tests. Prior to participating in NIBRS, agencies are required to submit data on magnetic media for testing, and agency participation is dependent upon the submission of accurate data. The FBI provides detailed documentation of the coding and submission requirements: data element definitions; specification of the valid data values for each element; listings of the mandatory, conditional, and optional data elements; specification of which data segments are required under what circumstances; required sequencing of data segments; data submission schedules; and resubmission guidelines and controls.
When the FBI receives data submissions, it runs an extensive series of data quality checks and rejects an incident report if errors are found. The checks include: value type and position within field; presence of values for mandatory and conditional data elements; use of valid values; duplication of values where multiple valid values are permitted for a data element; use of logical values and cross-checks between related data elements (e.g., the age of a child must be less than the age of a parent); presence of required data segments; duplication of segments and/or reports; number and type of segments; segment sequencing; and appropriate links between segments (e.g., an offense segment requires a victim segment).
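The kinds of checks listed above can be sketched in code. This is a minimal, hypothetical illustration: the field names, valid-value tables, and rules below are assumptions for the sketch, not the FBI's actual NIBRS specification.

```python
# Illustrative valid-value tables (assumed, not the FBI's actual code lists).
VALID_VALUES = {
    "location_type": {"01", "02", "13", "25"},  # small subset for illustration
    "victim_type": {"I", "B", "S", "U"},
}
MANDATORY = ["incident_number", "offense_code", "location_type"]

def edit_check(incident):
    """Return a list of error messages for one incident report."""
    errors = []
    # Presence of values for mandatory data elements.
    for field in MANDATORY:
        if not incident.get(field):
            errors.append(f"missing mandatory field: {field}")
    # Use of valid values.
    for field, valid in VALID_VALUES.items():
        value = incident.get(field)
        if value is not None and value not in valid:
            errors.append(f"invalid value {value!r} for {field}")
    # Cross-check between related data elements
    # (e.g., age of child must be less than age of parent).
    child, parent = incident.get("child_age"), incident.get("parent_age")
    if child is not None and parent is not None and child >= parent:
        errors.append("child_age must be less than parent_age")
    # Required links between segments
    # (an offense segment requires a victim segment).
    if incident.get("offense_segments") and not incident.get("victim_segments"):
        errors.append("offense segment present without a victim segment")
    return errors
```

In a real submission pipeline, a report returning any errors from checks like these would be rejected and returned to the agency for correction.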
NIBRS data that have been edited to satisfy all of the FBI requirements may still be incomplete or misleading. For example, if "incident location" is consistently recorded as "unknown", which is a valid code, it is impossible to determine how many incidents occurred at particular location types. A comprehensive, department-level review may need to be performed to identify software problems, data collection problems, and training needs. After the problems are corrected, these additional "logic checks" should be repeated to measure the improvement in data accuracy and completeness.
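A department-level "logic check" of the kind described above can be as simple as measuring how often a placeholder code is used. The sketch below is a hypothetical example: the unknown-code list and the review threshold are assumptions, not NIBRS requirements.

```python
# Assumed placeholder codes that pass FBI edits but carry no information.
UNKNOWN_CODES = {"u", "unknown", ""}

def unknown_share(incidents, field):
    """Fraction of incidents whose value for `field` is an unknown code."""
    values = [str(inc.get(field, "")).strip().lower() for inc in incidents]
    if not values:
        return 0.0
    return sum(1 for v in values if v in UNKNOWN_CODES) / len(values)

def fields_needing_review(incidents, fields, threshold=0.5):
    """Return the fields whose unknown share exceeds the review threshold."""
    return [f for f in fields if unknown_share(incidents, f) > threshold]
```

If, say, more than half of all incidents record "incident location" as unknown, the field is flagged for review of software, data collection, or training.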
The required NIBRS edits are detailed in the FBI manuals. They range from field-by-field valid-code checking to detailed logical checks, such as whether property crimes carry the required substantive property information, or whether person crimes carry the required substantive information about victims. As noted above, data meeting all of the required edits may appear entirely correct but may still be incomplete or misleading. Techniques have been developed to evaluate the quality of the edited NIBRS data and verify that crime reporting is as complete and accurate as possible.
NIBRS data quality testing and evaluation have identified five main areas that need improvement and understanding:
- When offenses or arrest charges are not assigned the correct NIBRS code, data quality testing can help find inconsistencies and errors.
- When data are missing due to software limitations or errors, data quality testing can help identify where corrections are needed.
- Excessive use of "unknown" or "not reported" codes can signal problems with data collection, data entry, or software that require correction.
- When data are missing, data quality testing can highlight the need for training or policy attention at the local department level.
- When local department crime category definitions differ from NIBRS definitions, differences between current NIBRS and historic UCR reporting patterns can be identified. Understanding these differences will improve record keeping.
Simple ad-hoc queries and attention to response frequencies are primary tools for evaluating NIBRS data quality. State programs are discovering the kinds of things to look for: accuracy of coding tables, excessive use of "default" rather than substantive codes, uncollected information, and unusual changes from historical reporting patterns. Documenting the findings and explaining any differences from historical patterns is essential for relating NIBRS data to historical UCR data and for understanding the nature and changing characteristics of crime.
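A response-frequency query of the kind described above needs nothing more than the standard library. In this sketch the field name and sample codes are illustrative assumptions; in practice the table would be compared against the agency's historical distribution.

```python
from collections import Counter

def response_frequencies(incidents, field):
    """Tabulate how often each value of `field` appears, most common first.

    Returns (value, count, share) tuples; absent values are tallied
    under "<missing>" so gaps in collection are visible too.
    """
    counts = Counter(inc.get(field, "<missing>") for inc in incidents)
    total = sum(counts.values())
    return [(value, n, n / total) for value, n in counts.most_common()]
```

A large share of "default" codes, or a sudden break from last year's distribution in a table like this, is the signal to investigate further.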
Thanks to the required edits and data quality testing techniques, local departments are reporting increased data accuracy. Evaluating NIBRS data quality and making improvements or changes where necessary generates increased confidence in the quality of crime reporting for local departments, state programs, and policymakers.
Examples of how faulty reporting or editing can affect record keeping illustrate the range of potential problems:
- If the tests of "use or threatened use of a weapon" and/or "serious physical injury to the victim" are not applied consistently in determining the appropriate NIBRS code, assault counts may not be accurate.
- If the youth bureau does not trust or use the software and does not enter any information about juveniles into the agency records management system, it will appear that no juveniles were arrested or referred.
- If few or no incidents show multiple offenses, victims, offenders, or arrestees, it may be that the local department is not collecting NIBRS data or is entering only "top charge" information into the system.
- If few or no unknown offenders are reported, it may be that local software lacks the capability to correctly report unknown offenders.
- If IBR crime category counts for most serious vs. multiple charges differ substantially from historical UCR reporting, it may be that the agency's crime category definition is not consistent with the FBI's UCR/IBR definition.
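The "top charge only" pattern in the list above can be detected with a one-line share calculation. The segment key names in this sketch are assumptions for illustration; the same check applies to victim, offender, and arrestee segments.

```python
def multi_segment_share(incidents, segment_key):
    """Fraction of incidents carrying more than one segment of a given type.

    A share near zero across offense, victim, offender, and arrestee
    segments suggests the department may be entering only the top charge.
    """
    if not incidents:
        return 0.0
    multi = sum(1 for inc in incidents if len(inc.get(segment_key, [])) > 1)
    return multi / len(incidents)
```

For example, if `multi_segment_share(incidents, "offense_segments")` is close to zero for a busy agency, the records management system is probably not capturing full incident-based data.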
Identifying questions about NIBRS data quality is just one step in the quality control process. In many cases, data quality testing can pinpoint necessary training and/or policy changes or software changes. In other cases, differences between historical UCR reporting and IBR reporting can be identified, allowing local departments to understand and document the changes in their reporting.
NIBRS data quality testing gives local law enforcement greater crime reporting accuracy and confidence, and deepens the understanding of crime across local, state, and federal UCR/IBR programs.