Yang Liu 2020
Liu Yang for the degree of Doctor of Philosophy in Public Health presented on June 3, 2020.
Title: Improving Occupational Safety and Health (OSH) Surveillance: Focusing on Enhancing
Surveillance Methods and Evaluation
Work-related hazards and exposures affect human health and well-being. Occupational
Safety and Health (OSH) surveillance is fundamental to the public health approach. OSH
surveillance provides important data on the occurrence and magnitude of the problem and
risk/protective factors. A wide variety of OSH surveillance programs and systems have been
established and strengthened to address multiple areas of workplace safety and health over the
past few decades. This includes national systems for work-related injuries, illnesses, and
fatalities, as well as state-based OSH surveillance programs which contribute to local prevention
efforts as well as to the national data profile. However, challenges and gaps exist in current
OSH surveillance that call for more accurate, timely and comprehensive reporting of events,
including the utilization of new and existing data sources for more complete surveillance,
inconsistent OSH surveillance terminologies and conceptual models, and the lack of guiding
frameworks for OSH surveillance evaluation. Two related studies were conducted using both
qualitative and quantitative research methods, along with a translational study, to accomplish three
specific research aims. The first study aimed to develop a comprehensive list of elements relevant to OSH surveillance and its evaluation,
including surveillance components, attributes and example measures. A modified Delphi study
with three survey rounds was conducted with ten experts. Based on the experts’ ratings and
comments, a total of 211 elements were selected from a draft list compiled from the literature, with
134 (62.3%) reaching a high level of consensus. The definition/description for each element was
refined based on experts’ suggestions to reflect best practice. Experts further reviewed their
respective OSH surveillance systems, including six state-level and one national OSH
surveillance systems/programs. Experts’ review highlighted that evaluations were not a common
practice in these OSH surveillance systems/programs. Major themes from experts’ comments
and verbal answers revealed the current status and gaps of OSH surveillance in the US. Resource
constraints, including financial support, technical platforms and expertise, have limited
more advanced OSH surveillance activities. As such, feasibility should be an important
consideration in OSH surveillance and its evaluation.
Following the first study, the second translational study aimed to develop a practical
guiding framework for evaluating OSH surveillance systems. Opinions on the
identified surveillance elements as well as on the development of the framework were sought from
Delphi panel experts and potential users of the framework through the Council of State and
Territorial Epidemiologists (CSTE). The feedback indicated
that it is meaningful to develop a tailored framework for OSH surveillance evaluation and that
the framework must be flexible, simple, and straightforward. The developed framework
incorporates the identified elements, field experts’ suggestions, and experience from the research
team.
The third study analyzed surveillance data, specifically Oregon’s workers’ compensation
(WC) claims data, applying a Prevention Index (PI) to prioritize high-risk industries for a
vulnerable population, young workers aged 24 years and under. All accepted WC disabling
claims data (N=13,360) and a significant portion of accepted non-disabling claims data
(N=24,460) were obtained from 2013 to 2018. The non-disabling injury rate was estimated to be
approximately 4 times the rate of disabling injuries among Oregon young workers (401.3 vs.
111.6 per 10,000 young workers). The most common injury type among young workers for both
disabling and non-disabling claims combined was “struck by/against” (35.6%), followed by
“work-related musculoskeletal disorders” (19.5%), “fall on same level” (9.1%), and “violence”
(6.6%). Differences in the distributions of injury types were observed across industry sectors.
Based on the PI, high ranked industry sectors included agriculture, construction and
manufacturing for both genders combined. For female young workers, the high ranked sector
was healthcare. The study demonstrated that the inclusion of non-disabling WC claims data
enabled a more complete picture of work-related injuries among young workers.
The overall research project is applied in nature. The developed guiding framework is
expected to address the lack of a practical framework for evaluating OSH surveillance systems
and to help promote more evaluations and, in turn, improve OSH surveillance systems. The
exploration of non-disabling WC claims data and the use of the Prevention Index contribute to
targeted prevention efforts and enhance OSH surveillance of young workers.
by
Liu Yang
A DISSERTATION
submitted to
Oregon State University
in partial fulfillment of
the requirements for the
degree of
Doctor of Philosophy
APPROVED:
I understand that my dissertation will become part of the permanent collection of Oregon State
University libraries. My signature below authorizes release of my dissertation to any reader
upon request.
The support of my family and friends has made it possible to successfully navigate the doctoral
program and complete my dissertation research. My parents, Shengli Yang and Fuyi Shen, have
inspired me through my academic pursuits since I was young. As an amazing dad and mom, they
have provided me encouragement and selfless support in this endeavor. My husband, Xun Mo,
has been supporting me throughout this journey. Mo and Jessica Aiyang Mo have been a big part
of our lives and a constant source of joy. I couldn’t have done this without the love and support
of my family.
A PhD student could not have asked for a better committee than mine. As my major
advisor and mentor, Dr. Laurel Kincl has provided great support and help throughout my
graduate program. Over the past five years, she has not only diligently mentored me in
conducting academic research, from designing and implementing studies to writing manuscripts,
but also in leading and communicating with the research team, developing partnerships and
seeking ways to disseminate the research, which made this research possible. She also cares
about my personal and professional growth and has provided invaluable guidance for my future
career development. I also wish to thank my other committee members, Dr. Viktor
Bovbjerg, Dr. Jeong Ho “Jay” Kim, Dr. Adam Branscum, and Dr. Bryan Tilt. My minor advisor,
Dr. Viktor Bovbjerg, has supported me in my application for the PhD minor. As a
Teaching Assistant for his courses, I gained valuable teaching experience and skills from
him. Dr. Adam Branscum has taught multiple statistics courses in which I enjoyed his clear and
coherent instruction on various aspects of statistical modeling and interpretation. He has also
worked with me through multiple studies and provided tremendous help and guidance in
conducting and summarizing research findings and drafting manuscripts. All my PhD committee
members have given me great support and encouragement throughout my doctoral studies.
I wish to thank Curtis Cude and Crystal Weston at the Oregon Health Authority
for their great support during my internship with the Oregon Occupational Public Health Program
and their collaboration in my studies. I also wish to thank all participants in the Delphi study
as well as in the follow-up opinion seeking for their time, effort and enthusiasm in contributing to
this research. I would also like to thank the Council of State and Territorial Epidemiologists
(CSTE) for facilitating study enrollment and the dissemination of our study results.
At Oregon State University, I have had the great opportunity to learn from outstanding
faculty members and fellow students. I enjoyed great courses from many professors including
Dr. Perry Hystad, Dr. Ellen Smit, Dr. Molly Kile and Dr. Anthony Veltri. My friends in the
program and college, Laura N Syron, Amber Lyon-Colbert, Barbara Hudson-Hanley, Barrett
Welch, Lan N. Doan, Deeksha Vasanth Rao, Meghan Fitzgerald, Jafra D. Thomas, Linh Bui, and
many others, have motivated and encouraged me through all of the ups and downs of
graduate study and shared valuable experiences and resources. I am proud to be working
toward our dreams and career goals together with them.
Liu Yang
CONTRIBUTION OF AUTHORS
Liu Yang, MS, MPH, conceptualized and designed the research, collected and analyzed the data,
Dr. Laurel D. Kincl supervised the research, assisted with the study design, assisted with the
Dr. Adam Branscum and Dr. Viktor Bovbjerg assisted with the interpretation of findings and
Curtis Cude and Crystal Weston assisted with data collection, interpretation of findings and
TABLE OF CONTENTS
Page
CHAPTER 1 Introduction............................................................................................................... 1
Occupational safety and health (OSH) surveillance in the United States ................................... 1
References ................................................................................................................................. 19
Introduction ............................................................................................................................... 29
Methods..................................................................................................................................... 30
Results ....................................................................................................................................... 34
Discussion ................................................................................................................................. 38
Conclusion ................................................................................................................................ 42
Reference .................................................................................................................................. 48
CHAPTER 3 Guiding Framework for Evaluating Occupational Safety and Health Surveillance
Systems ......................................................................................................................................... 51
Abstract ..................................................................................................................................... 52
Introduction ............................................................................................................................... 53
Methods..................................................................................................................................... 54
Results ....................................................................................................................................... 55
References ................................................................................................................................. 60
CHAPTER 4 Assessing disabling and non-disabling injuries and illnesses using accepted
workers compensation claims data to prioritize industries of high risk for Oregon young workers
....................................................................................................................................................... 92
Abstract ..................................................................................................................................... 93
Introduction ............................................................................................................................... 95
Methods..................................................................................................................................... 97
LIST OF FIGURES
Figure Page
Figure 1-1 Frequency counts of identified guidelines and frameworks by time period and type of
surveillance ................................................................................................................................... 18
Figure 2-1 Selected components and attributes of OSH surveillance in logic model (Elements
marked yellow were selected with low consensus) ...................................................................... 47
Figure 3-1 Stages of OSH surveillance activities and types of evaluation corresponding to each
stage .............................................................................................................................................. 65
Figure 3-4 Common OSH surveillance components and attributes in Logic Model ..................... 71
Figure 3-5 Initial/need evaluation: Minimum and optional components and attributes .............. 82
Figure 3-6 Routine/rapid evaluation: Minimum and optional components and attributes .......... 84
Figure 3-7 Comprehensive evaluation: Minimum and optional components and attributes ....... 86
Figure 3-8 Data quality evaluation: Minimum and optional components and attributes ............ 88
Figure 4-1 Disabling injury rate by age group (including adult worker group) and gender, Oregon
accepted disabling claims (2013-2018) ...................................................................................... 120
Figure 4-2 Distribution of claims by injury type for young workers, Oregon accepted disabling
and non-disabling claims, 2013-2018 ......................................................................................... 121
LIST OF TABLES
Table Page
Table 1-1 Major sources of data for occupational safety and health surveillance ........................ 13
Table 1-3 Attributes abstracted from existing guidelines and frameworks .................................. 15
Table 2-1 Preset Criteria for Experts in the Delphi Study ............................................................ 44
Table 2-2 Summary statistics for framework elements (average, range) ..................................... 45
Table 3-1 Questions in the Delphi study and follow-up opinion seeking..................................... 55
Table 3-2 Major comments and suggestions for an evaluation framework from experts in the
Delphi study and follow up review (n=19) ................................................................................... 55
Table 3-3 Objectives and suitable situations by type of OSH surveillance evaluation. ............... 66
Table 4-1 Oregon workers' compensation disabling and non-disabling claims frequency, rate,
medical cost by year and demographics, 2013-2018 .................................................................. 117
Table 4-2 Claim rate for young workers, by major injury type and NORA industry sector,
Oregon workers' compensation accepted disabling and non-disabling claims, 2013-2018 ....... 118
Table 4-3 NORA industry sectors ranked by Prevention Index (PI) for young workers, Oregon
workers' compensation accepted disabling and non-disabling claims, 2013-2018 .................... 119
LIST OF APPENDICES
Appendix Page
Appendix 1-1 Historical events in occupational safety and health (OSH) and the development of
OSH surveillance in the US ........................................................................................................ 141
Appendix 1-2 Identified guidelines and frameworks for evaluating public health surveillance
systems in a thorough literature review ...................................................................................... 146
Appendix 1-3 Summary of six existing evaluations of OSH surveillance systems .................... 159
Appendix 3-5 Template working tables for developing evaluation methods for each selected
attribute and measure .................................................................................. 185
Appendix 3-6 Template working table for data collection plan ................................................. 189
Appendix 4-1 NORA industry sectors ranked by Prevention Index (PI) for male and female
young workers............................................................................................................................. 195
Appendix 4-2 Industry groups (NAICS 4-digit) ranked by Prevention Index (PI) .................... 197
CHAPTER 1 Introduction
Work-related hazards and exposures affect human health and well-being. Globally, it is
estimated that some 5-7% of deaths in industrialized countries are attributable to work-related
illnesses and injuries 1. In the United States of America (US), there were more than 5,000 work-
related fatal injuries and 2.8 million non-fatal injuries and illnesses reported by private
employers nationally in 2018, corresponding to a fatal injury rate of 3.5 per 100,000 Full-Time
Equivalent workers (FTE) and a non-fatal injury rate of 2.8 per 100 FTEs, respectively 2. It was
estimated that the total cost of occupational injuries in the US in 2018 was $170.8 billion.
Work-related injuries and illnesses are preventable. Promoting safe and secure working
environments for all workers is one of the United Nations’ Sustainable Development Goals for
2015-2030 4. In public health practice, occupational health issues are addressed using the public
health approach, which includes four steps from identifying the health problem to the
implementation and evaluation of prevention strategies. Occupational safety and
health (OSH) surveillance is central to every step in this approach by providing data on the
occurrence and magnitude of the problem and on risk/protective factors.
Public health surveillance is defined as the ongoing, systematic collection, analysis, and
interpretation of data regarding health-related events that are essential to the planning,
implementation, and evaluation of public health practices, and closely integrated with the timely
dissemination of these data linked to prevention and control 8. OSH surveillance is a subset of
public health surveillance specifically targeting issues of OSH importance 9. In the US, OSH
surveillance has largely been undertaken by government agencies at national and state level
which have the legal authority to require disease and injury reporting and access to various data
sources containing OSH information 10,11. Surveillance activities for OSH events had not been
formalized until the Occupational Safety and Health Act (OSH Act) 12. The OSH Act of 1970
designated the Bureau of Labor Statistics (BLS) to be responsible for collecting statistics on
occupational injuries and illnesses and created the National Institute for Occupational Safety and
Health (NIOSH) to conduct research to inform standards and regulations in OSH 9,12,13.
Over the past few decades, OSH surveillance has undergone tremendous improvement,
with a wide variety of OSH surveillance programs and systems being established and
strengthened to address multiple areas of workplace safety and health. National systems for
work-related injuries, illnesses, and death have been implemented and maintained including the
BLS national Survey of Occupational Injuries and Illnesses (SOII) and the Census of Fatal
Occupational Injuries (CFOI). NIOSH has also developed surveillance systems in collaboration
with other national and State partners, such as the National Traumatic Occupational Fatalities
surveillance system (NTOF) and the National Electronic Injury Surveillance System -
Occupational Supplement (NEISS-Work), to name but a few 14. States have been
playing a pivotal role in OSH surveillance by supporting the BLS SOII and CFOI programs.
Recognizing the importance of states in nationwide OSH surveillance, NIOSH has been
collaborating with states since the 1980s, resulting in a set of state-based programs such as the
Fatality Assessment and Control Evaluation (FACE) programs and the Adult Blood Lead Epidemiology
and Surveillance System (ABLES) 7. Since the early 2000s, NIOSH has funded states to establish
state-based OSH surveillance systems for occupational health indicators (OHIs) and other
expanded surveillance activities, which contribute to local prevention efforts as well as to the
national data profile 15. Other agencies such as the Occupational Safety and Health
Administration (OSHA) have also been building OSH databases such as the Integrated
Management Information System (IMIS) which contains data available for tracking workplace
exposures and hazards 16. Appendix 1-1 summarizes the historical events from the 1900s to the
present in occupational safety and health, including the development of OSH surveillance in
the US.
Challenges and gaps exist in current OSH surveillance, prompting calls for more
accurate, timely and comprehensive reporting of events and conditions of occupational health
importance. The recent report, A smarter national surveillance for occupational safety and
health in the 21st century, by the National Academies of Sciences, Engineering, and Medicine as
requested by NIOSH, BLS and OSHA, reviewed current OSH surveillance in the US and
summarized key barriers 9. These included: concerns regarding confidentiality and privacy in
data reporting and use, cost of maintaining and expanding surveillance, collection of exposure
data and multi-source surveillance, OSH surveillance structure and expertise including adequate
epidemiological and informatics capacity, as well as the culture, mission, and methodology of
the responsible agencies.
OSH surveillance has historically been a resource-limited field in public health, with
limited funding and staff capacity 9,17,18. However, increasing interest and opportunities in public
health surveillance including emerging data sources, new technologies and exploration of
surveillance methods are promising avenues to address gaps in OSH surveillance. The following
sections review these challenges and opportunities.
Complete and accurate data are essential to disease prevention and policy making.
Underreporting and undercounting have been a significant concern regarding OSH surveillance
data reliability 19–21. Studies have shown an undercount of work-related injuries and
illnesses 22–24. Certain working populations, such as self-employed or household workers, are
likely to be excluded from some OSH data systems such as the SOII.
Failures in capturing work-related injuries and illnesses can happen anywhere from the
worker and physician reporting to the documentation of the injury or illness in different data
systems 25. There has been a growing recognition and practice in OSH surveillance to leverage
various data sources to detect cases that are possibly missed in one data system to improve data
completeness. Studies showed that the use of more than one data source doubled estimates of
work-related injuries compared to the BLS statistics 26–28. A wide variety of data sources have
now been explored for their use in surveillance of occupational injuries, illnesses, and
hazards 26–37. Table 1-1 summarizes the potential data sources for OSH surveillance in the US.
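One common method for combining two overlapping data sources into a more complete case count is capture-recapture estimation. The sketch below is illustrative only, with hypothetical counts; the cited undercount studies used their own matching and estimation methods.

```python
def chapman_estimate(n1, n2, m):
    """Chapman-corrected Lincoln-Petersen estimate of total cases.

    n1: cases found in source 1 (e.g., WC claims)
    n2: cases found in source 2 (e.g., hospital records)
    m:  cases matched in both sources
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 600 WC claims, 500 hospital cases, 200 matched.
# The estimated total (~1,497) is well above the 900 unique cases
# observed across the two sources combined.
total_estimate = chapman_estimate(600, 500, 200)
```

The estimate assumes the two sources capture cases independently; violations of that assumption (likely when both sources draw on the same reporting chain) bias the result.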
Among these ample data sources, workers’ compensation (WC) claims data are commonly used
for OSH surveillance 38–47. Coded WC claims databases provide case-level data on
the injury and illness, which are sometimes difficult to obtain in other data sources 9,48. Further,
WC data may link to data on injury severity (e.g., days of lost work time), medical and
compensation cost, health care records, as well as workplace safety and health and prevention
measures to allow for in-depth research. It is recommended to explore and promote the expanded
use of WC data in OSH surveillance 9,48. However, WC claims data compiled by state
agencies, which are generally the only source available for research and OSH surveillance,
usually include only lost-work-time claims 9,48. Access to medical-service-only claims, which
contain information on less severe yet more frequent work-related injuries and illnesses, would
help produce a more complete picture.
The Centers for Disease Control and Prevention (CDC) identified issues with lexicon,
definitions, and conceptual framework as a major concern of public health surveillance in the
21st century 49,50. Agreed terms and frameworks set a foundation toward increased understanding
within and beyond the public health field. Efforts to standardize concepts, terminologies,
structural components, and methods in public health surveillance date back
decades 49,51,52. Various models have been proposed regarding surveillance processes and activities.
The CDC, following suggestions by Alexander Langmuir, who is regarded as the founder of
modern public health surveillance, regards surveillance as limited to data collection, analysis and
dissemination, with no direct responsibility for disease control activities 8,51–54. In contrast,
some researchers have included disease control and action as a key component of public health
surveillance. For example, McNabb et al. 55 suggested eight core activities for public health
surveillance, namely detection, registration, confirmation, reporting, analysis, feedback,
acute response, and planned response, along with support activities such as supervision and
resource provision. This model was further adapted in the World Health Organization (WHO)
framework for assessing communicable disease surveillance and response systems 56,57. Choi
summarized a set of common surveillance concepts and terminologies regarding surveillance
processes, epidemiologic data, and health data 51. Frameworks and key components were also proposed for
specific types of surveillance to reflect their own features and requirements, including infectious
diseases surveillance 58, animal health surveillance 59,60 and injury surveillance 61.
While these existing models provide components that can be used to guide OSH surveillance,
there has been no effort to systematically assemble components and elements that best describe
the overall process and features of current OSH surveillance systems, especially systems in the
US, to date. The recent report by the National Academies of Sciences, Engineering, and Medicine
summarized key OSH surveillance elements into three major groups: process, organizational and
technological components, and methods 9. However, definitions of and differences between these
elements have not been clearly established.
Surveillance evaluation
The value of public health surveillance lies in the effective and efficient delivery of
useful data regarding health determinants and outcomes. The need for both quality data and a
quality surveillance system stimulates rigorous evaluation of the current surveillance. Evaluation
is defined as systematic investigation of the merit, worth, or significance of an object 62,63. The
evaluation of a health program can be defined as the systematic collection of information about
the activities, characteristics, and outcomes of the program to make judgments about the
program, improve program effectiveness and/or inform decisions about future program
development, where the term “program” refers to “any set of organized activities supported by a
set of resources to achieve a specific and intended result” 62. Public health surveillance systems fit
into this broad definition of “program” and thus the principles and methodologies of program
evaluation apply to surveillance evaluation. However, the data-stream-centered organization
and structure of public health surveillance make the objectives and focus of surveillance
evaluation differ from general program evaluation. Specifically, the evaluation of public
health surveillance systems seeks to ensure that the system monitors problems of
public health importance efficiently and effectively and that surveillance resources are well utilized.
Guidelines and frameworks have been developed to guide the evaluation of public health
surveillance systems over the past three decades. However, existing guidelines and frameworks
provide only generic guidance, which is insufficient for guiding a practical evaluation 64. To date,
no guideline or framework has been developed for evaluating OSH surveillance systems. In
practice, public health surveillance systems have been evaluated with largely inconsistent
methods and criteria, and with evaluation quality varying significantly 65,66. This creates
challenges in comparing results and thus reduces the usability of evaluations in guiding the
improvement of surveillance systems. Further, publications on OSH surveillance evaluation are
very limited to date. A detailed review of current evaluation guidelines and practices is presented
in the sections below.
Existing guidelines and frameworks offer recommendations and tools for surveillance
evaluation. A thorough literature review was conducted to comprehensively identify existing
guiding approaches and to examine whether any guiding approach for OSH surveillance
evaluation has been published. A total of 34 guidelines
and frameworks were identified and reviewed (Appendix 1-2). Most of them (N=30, 88%) were
published since 2000 (Figure 1-1). Nearly one third of guidelines and frameworks (N=11) were
developed for all types of public health surveillance, followed by animal health surveillance
(N=8, 24%) and communicable diseases and animal health surveillance (N=4, 12%) (Table 1-2).
Only one framework targets occupational diseases,
which was developed as an audit tool for national registries in EU countries 67.
The three most frequently cited guidelines are Injury surveillance guidelines by WHO
and CDC (N=601) 68, Updated guidelines for evaluating public health surveillance systems by
CDC (N=574) 8 and Guidelines for evaluating surveillance systems by CDC (N=503) 53. The two
CDC guidelines, especially the updated guidelines, are by far the most well-known and the de
facto authoritative guidelines in public health surveillance evaluation, which have guided many
existing surveillance evaluations 69–78 as well as the development of other surveillance
evaluation guidelines and frameworks.
The CDC updated guidelines recommend ten attributes, i.e., measurable features of a
surveillance system: simplicity, flexibility, data quality, acceptability, sensitivity, predictive
value positive, representativeness, timeliness, stability and usefulness. Attributes have been
adopted in existing guidelines and frameworks as
well as evaluation practice as a method to assess the performance and quality of a surveillance
system 56,57,68,74,84–88. A total of 52 distinct attributes were abstracted from the 34 guidelines and
frameworks (Table 1-3), among which the most frequent attributes are those recommended in
CDC updated guidelines. Components and activities in a surveillance system discussed in these
guidelines and frameworks were also abstracted and summarized.
Researchers and evaluators have noted that the CDC guidelines tend to be general and
provide limited detail on the surveillance process and components 90,91. We have reported
elsewhere challenges in the
evaluation of Oregon OSH surveillance system based on the CDC updated guidance (manuscript
in press). In fact, similar weaknesses were also found in other existing surveillance evaluation
guidelines and frameworks. Calba et al. 64 pointed out multiple weaknesses and limitations in
their systematic review of 15 identified guidelines and frameworks and concluded that existing
approaches are insufficient for guiding a practical evaluation. Our thorough review demonstrated
that most existing guidelines and frameworks provide only generic recommendations; detailed
guidance and tools for implementing the evaluation are usually not included. There is a lack of a
comprehensive list of attributes and of guidance on how to select these attributes and match them
to the components of a specific type of surveillance. Further, methods for assessing
the attributes were usually discussed with general examples. On the other hand, some guidelines
and frameworks are too specific to a certain type of surveillance and lack generalizability to
other types of surveillance.
In practice, public health surveillance systems have been evaluated inconsistently, with
the evaluation quality varying significantly 65,66. Existing evaluations vary in methods and
criteria. Given limited resources, evaluations were mostly conducted in a convenient way, with
system attributes being selected subjectively and described largely qualitatively 66. A systematic
review conducted by Drewe et al. 94 of 99 existing evaluations of animal and public health
surveillance systems found that nearly half of the evaluations assessed only one or two attributes.
They found that comprehensive evaluation was uncommon and there were almost no evaluations
of OSH surveillance systems. Compared with the evaluation of infectious disease surveillance
and other types of public health surveillance, publications on evaluating OSH surveillance
systems are scarce: a literature search identified only six publications on comprehensive
evaluation of OSH surveillance systems 87,88,95–98. Five evaluations were based on
existing guidelines, among which four referenced the two CDC guidelines on surveillance
evaluation 8,53. Three evaluations assessed attributes recommended in the CDC guidelines 8,53.
Most of these attributes were evaluated qualitatively, with inconsistent measures used in these
evaluations. None of the six evaluations assessed the data quality related attributes quantitatively,
such as sensitivity and predictive value positive (PVP). Rather, they were evaluated with
qualitative evidence or even left without assessment due to lack of data. Two of the six
evaluations did not clearly describe the evaluation methods used; those that did reported
common methods for collecting evaluation evidence. A summary of the six evaluations is included in
Appendix 1-3.
This research aims to address the identified gaps and opportunities in current OSH
surveillance in the US, including the lack of consistent terminology in OSH surveillance and the
lack of guidelines and frameworks for OSH surveillance evaluation, as well as the necessity to
improve OSH surveillance by utilizing new and existing data sources and methods. Three related
studies have been conducted using both qualitative and quantitative research methods and
translational study design to accomplish the three specific research aims. Components and
attributes abstracted from the existing guidelines and evaluations introduced in the sections
above were used to compile the initial list of framework elements.
Aim 1: To understand the current status of OSH surveillance in the US and to develop a
comprehensive list of elements relevant to OSH surveillance and its evaluation, including
surveillance components, attributes, and example measures.
Aim 2: To develop a guiding framework for evaluating OSH surveillance systems in
the US based on the comprehensive list of elements identified in Aim 1 and follow-up
suggestions from the Delphi panel experts and potential users of the framework.
Aim 3: To explore the use of non-disabling WC claims data obtained from a WC insurance
company and the Prevention Index (PI) method in OSH surveillance by examining injury and
illness burden and prioritizing high-risk industries for a vulnerable population in Oregon.
Together, the three studies address both surveillance evaluation and surveillance methods.
Moving research to practice has been a focus in this research. Being applied in nature, this
research incorporated a translational design of multidirectional processes of knowledge
dissemination and integration 95,96. In Aim 1 and Aim 2,
current OSH surveillance practice and need for a practical guiding framework were identified
from field experts. The Delphi study results were then disseminated back to potential users of the
framework, i.e., field professionals and practitioners to collect further suggestions on the
development of a guiding framework. In Aim 3, study results were presented back to the data
provider and to field practitioners in young worker safety and health to seek opportunities for
research-to-practice translation.
This research will produce three manuscripts, which are presented in chapter 2 to chapter 4.
Table 1-1 Major sources of data for occupational safety and health surveillance
Type Example data sources
Employer-based · BLS SOII;
· BLS CFOI;
· Data reporting to regulatory agencies by employers;
Hospital/billing based · Birth and death certificates;
· Data from healthcare facilities, such as hospitals and emergency
departments, including inpatient/outpatient discharge data;
· Data from healthcare providers, such as physicians;
· Data from clinical laboratories;
· Poison control centers data;
· Syndromic surveillance system;
· All payer claims data;
· Electronic health records;
Most frequently cited guidelines and frameworks: Injury surveillance guidelines (WHO & CDC,
2004), N=601; Updated guidelines for evaluating public health surveillance systems (CDC,
2001), N=574; Guidelines for evaluating surveillance systems (CDC, 1988), N=503.
Figure 1-1 Frequency counts of identified guidelines and frameworks by time period and type of surveillance
References
1. Takala J, Hämäläinen P, Saarela KL, et al. Global Estimates of the Burden of Injury and
Illness at Work in 2012. J Occup Environ Hyg. 2014;11(5):326-337.
doi:10.1080/15459624.2013.863131
2. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm
3. National Safety Council. Work Injury Costs. Injury Facts. Accessed May 14, 2020.
https://injuryfacts.nsc.org/work/costs/work-injury-costs/
5. Centers for Disease Control and Prevention. Injury Prevention and Control. the Centers for
Disease Control and Prevention. Published 2014. Accessed April 1, 2018.
https://www.cdc.gov/injury/about/approach.html
6. Centers for Disease Control and Prevention. How We Prevent Chronic Diseases and
Promote Health. Published July 31, 2019. Accessed March 31, 2020.
https://www.cdc.gov/chronicdisease/center/nccdphp/how.htm
7. National Institute for Occupational Safety and Health. Tracking Occupational Injuries,
Illnesses, and Hazards: The NIOSH Surveillance Strategic Plan. U.S. Department of
Health and Human Services, Public Health Service, Centers for Disease Control and
Prevention, National Institute for Occupational Safety and Health; 2001.
8. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.
12. National Research Council. Counting Injuries and Illnesses in the Workplace: Proposals for a
Better System. The National Academies Press; 1987. doi:10.17226/18911
13. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811
14. National Institute for Occupational Safety and Health. Worker Health Surveillance - Our
Current Surveillance Initiatives. Published 2019. Accessed May 15, 2020.
https://www.cdc.gov/niosh/topics/surveillance/data.html
15. National Institute for Occupational Safety and Health. State Surveillance Program.
Published February 6, 2020. Accessed March 18, 2020.
https://www.cdc.gov/niosh/oep/statesurv.html
16. Occupational Safety and Health Administration (OSHA). OSHA - Information System
(OIS). Accessed April 26, 2020. https://www.dol.gov/agencies/oasam/centers-offices/ocio/privacy/osha/ois
19. Stout N, Bell C. Effectiveness of source documents for identifying fatal occupational
injuries: a synthesis of studies. Am J Public Health. 1991;81(6):725-728.
doi:10.2105/AJPH.81.6.725
20. Leigh JP, Du J, McCurdy SA. An estimate of the U.S. government’s undercount of
nonfatal occupational injuries and illnesses in agriculture. Ann Epidemiol. 2014;24(4):254-
259. doi:10.1016/j.annepidem.2014.01.006
22. Leigh JP, Marcin JP, Miller TR. An Estimate of the U.S. Government’s Undercount of
Nonfatal Occupational Injuries. J Occup Environ Med. 2004;46(1):10–18.
doi:10.1097/01.jom.0000105909.66435.53
23. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63
24. Ruser JW. Allegations of Undercounting in the BLS Survey of Occupational Injuries and
Illnesses. In: JSM Proceedings, Statistical Computing Section. American Statistical
Association; 2010:15.
25. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.
26. Kica J, Rosenman KD. Multi-source surveillance for work-related crushing injuries. Am J
Ind Med. 2018;61(2):148-156. doi:10.1002/ajim.22800
27. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335
28. Davis LK, Grattan KM, Tak S, Bullock LF, Ozonoff A, Boden LI. Use of multiple data
sources for surveillance of work-related amputations in Massachusetts, comparison with
official estimates and implications for national surveillance. Am J Ind Med.
2014;57(10):1120–1132.
29. Borjan M, Lumia M. Evaluation of a state based syndromic surveillance system for the
classification and capture of non-fatal occupational injuries and illnesses in New Jersey.
Am J Ind Med. 2017;60(7):621-626. doi:10.1002/ajim.22734
30. Sears JM, Bowman SM, Hogg-Johnson S, Shorter ZA. Occupational Injury Trends
Derived From Trauma Registry and Hospital Discharge Records Lessons for Surveillance
and Research. J Occup Environ Med. 2014;56(10):1067-1073.
doi:10.1097/JOM.0000000000000225
31. Pefoyo AJK, Genesove L, Moore K, Del Bianco A, Kramer D. Exploring the Usefulness
of Occupational Exposure Registries for Surveillance The Case of the Ontario Asbestos
Workers Registry (1986-2012). J Occup Environ Med. 2014;56(10):1100-1110.
doi:10.1097/JOM.0000000000000235
32. Lofgren DJ, Reeb-Whitaker CK, Adams D. Surveillance of Washington OSHA Exposure
Data to Identify Uncharacterized or Emerging Occupational Health Hazards. J Occup
Environ Hyg. 2010;7(7):375-388. doi:10.1080/15459621003781207
34. Sarazin P, Burstyn I, Kincl L, Lavoué J. Trends in OSHA Compliance Monitoring Data
1979–2011: Statistical Modeling of Ancillary Information across 77 Chemicals. Ann
Occup Hyg. 2016;60(4):432-452. doi:10.1093/annhyg/mev092
35. Graber JM, Smith AE. Results from a state-based surveillance system for carbon
monoxide poisoning. PUBLIC Health Rep. 2007;122(2):145-154.
doi:10.1177/003335490712200203
38. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.
39. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460
40. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Env Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743
44. Mallon M, Cherry E. Investigating the Relationship Between Worker Demographics and
Nature of Injury on Federal Department of Defense Workersʼ Compensation Injury Rates
and Costs From 2000 to 2008. J Occup Environ Med. 2015;57 Suppl 3S:S27–S30.
doi:10.1097/JOM.0000000000000416
45. Ramaswamy SK, Mosher GA. Using workers’ compensation claims data to characterize
occupational injuries in the biofuels industry. Saf Sci. 2018;103:352-360.
doi:10.1016/j.ssci.2017.12.014
46. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.
47. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819
48. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. National Institute for Occupational Safety and Health; 2014.
Accessed February 19, 2019. http://elcosh.org/document/3752/d001287/workers%3F-compensation-insurance%3A-a-primer-for-public-health.html
49. Thacker SB, Qualters JR, Lee LM. Public Health Surveillance in the United States:
Evolution and Challenges. Morb Mortal Wkly Rep. 2012;61(3):3-9.
50. Hall HI, Correa A, Yoon PW, Braden CR. Lexicon, Definitions, and Conceptual
Framework for Public Health Surveillance. 2012;61:5.
51. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:http://dx.doi.org/10.6064/2012/875253
52. Allaki FE, Bigras-Poulin M, Michel P, Ravel A. A Population Health Surveillance Theory.
Epidemiol Health. 2012;34:e2012007. doi:10.4178/epih/e2012007
53. Centers for Disease Control and Prevention. Guidelines for evaluating surveillance
systems. Morb Mortal Wkly Rep MMWR. 1988;37(No. S-5).
54. Langmuir AD. The surveillance of communicable diseases of national importance. N Engl
J Med. 1963;268(4):182-192.
55. McNabb SJ, Chungong S, Ryan M, et al. Conceptual framework of public health
surveillance and action and its application in health sector reform. BMC Public Health.
2002;2:2. doi:10.1186/1471-2458-2-2
56. World Health Organization. Communicable Disease Surveillance and Response Systems:
Guide to Monitoring and Evaluating. World Health Organization; 2006.
http://apps.who.int/iris/bitstream/10665/69331/1/WHO_CDS_EPR_LYO_2006_2_eng.pdf
59. Muellner P, Watts J, Bingham P, et al. SurF: an innovative framework in biosecurity and
animal health surveillance evaluation. Transbound Emerg Dis. 2018;0(0):1-8.
doi:10.1111/tbed.12898
60. Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the
integrated evaluation of animal health surveillance systems. Prev Vet Med.
2019;173:104777. doi:10.1016/j.prevetmed.2019.104777
61. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11(744):1-15. doi:10.1186/1471-2458-11-744
62. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf
63. Centers for Disease Control and Prevention, Program Performance and Evaluation Office.
Framework for program evaluation in public health. Morb Mortal Wkly Rep.
1999;48(No.RR-11):1-40.
64. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5
65. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Stärk KDC. Evaluation of animal and public
health surveillance systems: a systematic review. Epidemiol Infect. 2012;140(4):575-590.
doi:10.1017/S0950268811002160
67. Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries
for occupational diseases: international development and validation of an audit tool
(ODIT). BMC Health Serv Res. 2009;9:194. doi:10.1186/1472-6963-9-194
69. Adamson PC, Tafuma TA, Davis SM, Xaba S, Herman-Roloff A. A systems-based
assessment of the PrePex device adverse events active surveillance system in Zimbabwe.
PLOS ONE. 2017;12(12):e0190055. doi:10.1371/journal.pone.0190055
70. Jefferson H, Dupuy B, Chaudet H, et al. Evaluation of a syndromic surveillance for the
early detection of outbreaks among military personnel in a tropical country. J Public
Health Oxf Engl. 2008;30(4):375-383. doi:10.1093/pubmed/fdn026
71. Jhung MA, Budnitz DS, Mendelsohn AB, Weidenbach KN, Nelson TD, Pollock DA.
Evaluation and overview of the National Electronic Injury Surveillance System-
Cooperative Adverse Drug Event Surveillance Project (NEISS-CADES). Med Care.
2007;45(10 Suppl 2):S96-102. doi:10.1097/MLR.0b013e318041f737
73. Kaburi BB, Kubio C, Kenu E, et al. Evaluation of the enhanced meningitis surveillance
system, Yendi municipality, northern Ghana, 2010–2015. BMC Infect Dis. 2017;17:306.
doi:10.1186/s12879-017-2410-0
76. Pinell-McNamara VA, Acosta AM, Pedreira MC, et al. Expanding Pertussis Epidemiology
in 6 Latin America Countries through the Latin American Pertussis Project. Emerg Infect
Dis. 2017;23(Suppl 1):S94-S100. doi:10.3201/eid2313.170457
77. Thomas MJ, Yoon PW, Collins JM, Davidson AJ, Mac Kenzie WR. Evaluation of
Syndromic Surveillance Systems in 6 US State and Local Health Departments: J Public
Health Manag Pract. Published online September 2017:1.
doi:10.1097/PHH.0000000000000679
79. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems.
The Public Policy Center, University of Nebraska; 2016.
80. Centers for Disease Control and Prevention. Evaluating an NCD-Related Surveillance
System. Centers for Disease Control and Prevention; 2013.
81. Centers for Disease Control and Prevention. Framework for evaluating public health
surveillance systems for early detection of outbreaks: recommendations from the CDC
Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.
82. Salman MD, Stärk KDC, Zepeda C. Quality assurance applied to animal disease
surveillance systems. Rev Sci Tech Int Off Epizoot. 2003;22(2):689-696.
83. Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating
military surveillance systems for early detection of outbreaks on duty areas. BMC Public
Health. 2008;8(1):146.
85. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html
86. Hagemeyer A, Azofeifa A, Stroup DF, Tomedi LE. Evaluating Surveillance for Excessive
Alcohol Use in New Mexico. Prev Chronic Dis. 2018;15. doi:10.5888/pcd15.180358
87. Austin C. An evaluation of the census of fatal occupational injuries as a system for
surveillance. Compens Work Cond. 1995;1:51–54.
89. Garland R. Evaluation of the Oregon Asthma Surveillance System. Oregon Department of
Human Services, Public Health Division, Office of Disease Prevention and Epidemiology,
Health Promotion and Chronic Disease Prevention, Oregon Asthma Program; 2009.
90. Samoff E, MacDonald PDM, Fangman MT, Waller AE. Local Surveillance Practice
Evaluation in North Carolina and Value of New National Accreditation Measures. J
Public Health Manag Pract. 2013;19(2):146–152. doi:10.1097/PHH.0b013e318252ee21
92. Dufour B. Technical and economic evaluation method for use in improving infectious
animal disease surveillance networks. Vet Res. 1999;30(1):27–37.
93. World Health Organization. Assessing the National Health Information System: An
Assessment Tool (v4.0). World Health Organization; 2008.
http://apps.who.int/iris/bitstream/10665/43932/1/9789241547512_eng.pdf?ua=1
94. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160
95. Zoellner JM, Porter KJ. Translational Research: Concepts and Methods in Dissemination
and Implementation Research. In: Coulston AM, Boushey CJ, Ferruzzi MG, Delahanty
LM, eds. Nutrition in the Prevention and Treatment of Disease (Fourth Edition).
Academic Press; 2017:125-143. doi:10.1016/B978-0-12-802928-2.00006-0
96. National Institute for Occupational Safety and Health. Research to Practice (r2p).
Published October 19, 2018. Accessed May 15, 2020.
https://www.cdc.gov/niosh/r2p/default.html
Evaluation Framework
a College of Public Health and Human Sciences, Oregon State University, 123 Women’s
Building, Corvallis, Oregon, 97331, USA
Abstract
Background: To date, no tailored guidance exists for evaluating an occupational safety and
health (OSH) surveillance system. This study aimed to understand current OSH surveillance in
the US and to propose a tailored evaluation framework.
Methods: A modified Delphi study with three survey rounds invited experts to rate and
comment on OSH surveillance evaluation framework elements. An initial list of elements was
compiled from literature. The expert panel rated and commented on the elements resulting in a
final list through the panel’s consensus. Additionally, experts completed a brief review of OSH
surveillance systems they had worked with and answered questions regarding the development of
an evaluation framework. A mixed methods analysis compiled descriptive statistics of the ratings
and qualitative themes from the comments.
Results: Fifty-four people across the US were contacted to participate. Ten experts began the
first survey round, with eight and then seven experts continuing in the subsequent rounds,
respectively. A total of 211 elements were selected into the final list through panel consensus,
with 134 (62.3%) reaching high consensus. Major themes regarding current OSH surveillance
focused on resources and feasibility.
Conclusions: A Delphi process successfully identified tailored OSH surveillance elements and
major themes regarding OSH surveillance in the US. A framework developed from these results
can serve as a standard yet flexible guide for evaluating OSH surveillance systems.
Introduction
Despite declining trends, millions of workers in the United States (US) suffer injuries
and illnesses on the job each year. In 2018, there were 2.8 million nonfatal occupational injuries
and illnesses and more than 5000 fatal occupational injuries according to the US Bureau of
Labor Statistics 1. Important data on work-related fatalities, injuries and illnesses as well as
their risk factors are collected through occupational safety and health (OSH) surveillance
systems to inform prevention efforts. There has been a growing interest in improving OSH
surveillance capacity and quality for more timely, accurate and comprehensive surveillance.
Surveillance evaluation can be challenging, beginning with the selection of attributes (i.e.,
characteristics of the surveillance system that need to be evaluated) and measures (i.e.,
evaluation questions or indicators). It was reported that only one or two attributes were
measured in nearly half of existing evaluations. Evaluations can also be open to interpretation
and not comparable, due partly to the lack of consistent operational definitions and detailed
guidance corresponding to the different types of surveillance.
Although multiple guidelines and frameworks have been developed to standardize surveillance
terminologies and guide surveillance evaluations, most were geared towards communicable
diseases surveillance 5,7,8, and were critiqued as providing only generic guidance 9,10. A published
systematic literature review of 15 guiding approaches for public health and animal health
surveillance reported similar limitations. For some well-established types of surveillance,
tailored guidelines have been developed specific to those systems. The development of surveillance
systems for occupational safety and health conditions in the US has historically lagged behind
that for other health conditions and is still facing ongoing challenges including organizational capacity,
data sources and resources 2,3,12–14. A tailored framework that takes into consideration
characteristics of OSH surveillance systems can guide comparable and replicable evaluations and
thus contribute to the continuous development and improvement of OSH surveillance. In our
model, the framework should incorporate structural components (i.e., main functions and
processes of a surveillance system) and attributes; including example measures adds to the
practicality of the framework. To this end, a Delphi survey study was designed and conducted to
solicit opinions and suggestions from a panel of experts to: 1) develop a comprehensive list of
elements relevant to OSH surveillance and its evaluation, including components, attributes, and
example measures; and 2) understand the current status of OSH surveillance in the US. It is
expected that findings of this study can improve understanding of OSH surveillance and inform
the development of a tailored evaluation framework.
Methods
The Delphi technique has been widely used in various research fields, including guideline
development in the health sciences 15–17. It particularly suits research questions where value
judgements are required on a collective basis 18–20. Compared to face-to-face interactions,
advantages of the Delphi technique include anonymity, the avoidance of dominant views,
structured feedback for valid consensus, and quantitative group responses 17,20–22. Electronic
communication methods allow for more efficient organization and timely responses in the
Delphi process, reducing time, geographical and economic restrictions 17.
A modified Delphi study with three internet-based survey rounds and a background
survey was designed following recommendations from common Delphi theoretical
frameworks and practices in healthcare and related fields 16,17,19–27. Experts across the US
were invited to rate and comment on framework elements as well as to review their respective
OSH surveillance systems.
The study obtained ethics review approval from the Institutional Review Board at Oregon State University.
Framework Elements
Common surveillance components, attributes, and example measures were compiled with
a thorough review of existing guidelines and frameworks of public health surveillance evaluation
as well as published evaluations for OSH surveillance systems through January 2019 (see
Appendix 1-2 and Appendix 1-3 for a brief introduction of the guidelines, frameworks, and
evaluations reviewed). The resulting draft list of components (including 47 sub-components), 32
attributes and 133 example measures was described and organized into a logic model that was
provided to the expert panel (see Figure 2-1 for the finalized version).
Expert Panel
The panel included experts with experience in at least one of the following two main
specialty areas: 1) the operation of OSH surveillance systems in the US; and 2) evaluation of
OSH surveillance systems. Potential participants were identified through multiple channels: 1)
websites on national and state level OSH surveillance systems in the US; 2) authors of peer-
reviewed publications in OSH surveillance; 3) researchers in OSH surveillance; and 4)
participants in the Council of State and Territorial Epidemiologists.
Recruitment emails were sent to 54 identified persons from 27 national and state level
OSH surveillance systems in the US, and to authors and researchers in OSH surveillance,
seeking their participation. Fourteen people expressed interest and were invited into the first
Delphi round. Six criteria (Table 2-1) were
evaluated for each potential panelist. A panelist had to satisfy at least three criteria to qualify as
an “expert”.
Survey Rounds
Three rounds of internet-based surveys were administered from March 2019 to August
2019. In each round, an invitation email was sent to panelists who had completed the previous
round, followed by two reminder emails. The first two rounds were to review 1) surveillance
components and attributes, and 2) example measures, respectively. Panelists rated these
framework elements on a 5-point Likert scale based on their relevance to OSH surveillance in
the US and its evaluation, with 1 indicating the least relevant and 5 the most relevant. They also
provided justifications, suggested revisions, additional elements, and other relevant thoughts.
Statistics of the ratings and qualitative summaries for each element along with the panelist’s own
rating in the first review were sent back to each panelist for a second review in the next round,
where the panelist had the opportunity to reconsider his/her ratings for elements that did not have
a high panel consensus. In the third round, the panelists gave a brief review of the OSH
surveillance systems they work with. Using selected attributes from the previous two rounds, the
panelists scored their system’s performance on a scale from 0 (worst performance) to 10 (best
performance) (based on their best understanding without collecting actual evaluation evidence).
Questions regarding the development of the framework were included in each round. A
background survey to collect panelists’ educational and professional experience was also
completed.
For each element, quantitative statistics reflecting panel opinions were calculated,
including mean rating, difference between the highest and lowest ratings, mode (the most
frequent rating), SD (standard deviation of ratings), and the percent of experts who rated 3 or
higher. The panelists’ comments were taken into consideration to edit the element. The following
criteria were used to categorize each element:
1) High consensus: mean rating and mode ≥ 4, 80% or more panelists rated 3 or higher,
difference of panel ratings ≤ 2 (e.g. range from 3 to 5), and no significant edits based on
panelists’ comments;
2) Low consensus/selected for further confirmation: mean rating and mode ≥ 3, 60% or
more panelists rated 3 or higher, but the criteria for high consensus were not met;
3) Dropped: mean rating < 3, or mode < 3, or less than 60% of panelists rated the element 3
or higher.
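The three consensus rules above can be expressed directly in code. The following is an illustrative sketch only, not the study's actual analysis code; the `significant_edits` flag and the example rating lists are hypothetical, with the flag standing in for the qualitative judgment about panelists' comments.

```python
# Illustrative sketch (not the study's analysis code): classify one
# framework element's panel ratings into the three consensus categories.
# Thresholds follow the criteria stated in the text; `significant_edits`
# is a hypothetical flag for the qualitative judgment on comments.
from statistics import mean, mode

def classify(ratings, significant_edits=False):
    """ratings: 5-point Likert ratings (1 = least, 5 = most relevant)."""
    m = mean(ratings)
    mo = mode(ratings)                     # most frequent rating
    spread = max(ratings) - min(ratings)   # highest minus lowest rating
    pct_3_plus = sum(r >= 3 for r in ratings) / len(ratings)

    if (m >= 4 and mo >= 4 and pct_3_plus >= 0.8
            and spread <= 2 and not significant_edits):
        return "high consensus"
    if m >= 3 and mo >= 3 and pct_3_plus >= 0.6:
        return "low consensus"
    return "dropped"

print(classify([5, 4, 5, 4, 4, 5, 3, 4, 5, 4]))  # high consensus
print(classify([3, 2, 4, 3, 3, 5, 3, 2, 3, 3]))  # low consensus
print(classify([1, 2, 2, 3, 1, 2, 2, 1, 2, 2]))  # dropped
```

In the study, ratings came from up to ten panelists per element; the function simply encodes the three decision rules, with elements needing significant edits demoted from high consensus to a further confirmation round.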
All other questions were analyzed using descriptive statistics including mean and
percentage. Content analysis was performed for all comments and verbal answers to identify
major themes regarding the current status of OSH surveillance systems and their evaluation. Data
were initially processed using Microsoft Excel. Quantitative analyses were conducted using R.
Results
Of the 14 people who expressed interest, 10 (71.4%) completed the first round survey and
were included in the Delphi panel. Eight out of the ten panelists (80%) completed the second
round and seven (70%) completed the third round. The initial 10 panelists were geographically
distributed across the US, representing 12 different states covering all five regions in the US. Of
the 10 participants, seven (70%) worked for state health departments and three (30%) were from
an academic setting. The panelists’ experience working with OSH surveillance systems in the US
ranged from 2 to 5 years, with a mean of 4.3 years. Eight of the 10 panelists had experience with
OSH surveillance evaluation. All panelists held a master’s degree or higher and had peer-
reviewed publications and/or conference presentations relevant to OSH surveillance. All panel
members were qualified to be experts based on the preset criteria (Table 2-1).
The Delphi rounds resulted in a final list of 211 elements, comprising surveillance components
and sub-components, 31 attributes, and 116 example measures, with 134 (62.3%) reaching high
consensus. In general, components and attributes received higher mean rating and mode as well
as smaller difference and SD than example measures. Compared to elements with low consensus,
elements that received high consensus tended to have much higher mean and mode, and much
smaller difference and SD, with more than 98% of panel experts rating them as ≥ 3.
Ratings showed increasing convergence and panel agreement across rounds. The SD and
difference of elements rated in the second review were reduced (Wilcoxon signed rank test
P-values < 0.05), possibly indicating increased opinion convergence. More elements had a small
difference in the final list compared to the initial list (data not shown); for example, the
percentage of measures with a difference of 2 or below increased.
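The paired comparison described above can be sketched as follows. This is an illustration only: SciPy is assumed to be available, and the per-element SD values are invented for the example, not the study's data.

```python
# Illustrative sketch: paired Wilcoxon signed-rank test comparing each
# element's rating SD before vs. after the second review, as in the
# analysis described above. The SD values below are hypothetical.
from scipy.stats import wilcoxon

sd_first_review  = [1.3, 1.1, 0.9, 1.4, 1.2, 1.0, 1.5, 1.1, 1.3, 0.8, 1.2, 1.4]
sd_second_review = [0.9, 0.8, 0.7, 1.0, 0.9, 0.6, 1.1, 0.9, 1.0, 0.5, 0.8, 1.1]

# The test is nonparametric and paired: each element contributes the
# difference between its two SDs, so no normality assumption is needed.
stat, p = wilcoxon(sd_first_review, sd_second_review)
print(f"W = {stat}, p = {p:.4f}")  # a small p suggests reduced rating spread
```

A paired nonparametric test fits this setting because the same elements are rated in both reviews and the SD differences need not be normally distributed.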
An OSH surveillance logic model was finalized to include components and attributes that
were selected after two panel reviews (Figure 2-1). A complete list of all selected elements as
well as their finalized description can be found in Appendix 3-2 to Appendix 3-4.
Six state-level and one national OSH surveillance systems/programs in the US were
evaluated by panelists in the last Delphi round. The majority of the systems were supported by
national funds (86%). Nearly half of the systems (43%) conducted only secondary data analysis.
More than 57% of the systems were never or rarely evaluated and no system was evaluated
frequently. For those that had been evaluated, common purposes were to assess implementation
(80%) and performance (60%). Needs assessment, cost-effectiveness and data quality assessment
were rarely or never conducted. Seven attributes were scored high (>8.0) in these systems,
among which, compliance and confidentiality received the highest score (9.0), followed by
significance, relevance, adherence, accuracy, and clarity. Three attributes received the lowest
score: legislative support (5.9), timeliness (5.7), and integration (3.9). Commonly perceived
weaknesses related to high quality data sources, the ability to conduct various surveillance
activities, dissemination, and creating outcomes and impacts. Major constraints included lack of
resources, legislative support, and recognition and integration into the general public health
domain.
Content analysis of the ample commentary from panel experts over the three Delphi
rounds revealed the themes described below. Table 2-3 includes the theme categories and most
representative quotes.
Status quo and issues: Resource constraints and feasibility were common issues noted by
the panel. One expert commented that “the reality is that resources for this type of work [OSH
surveillance] are historically VERY limited.” Funding was reported to be unstable. A panelist
wrote that it would be “best if … built into the regular funding cycle.” Technological resources
and legislative supports were reported as basic or even lacking. The lack of resources included
commented that “without them [legislation and regulations] one may end up using existing data
not designed for OSH” and another wrote that “most of the data systems in use currently for
OSH surveillance were designed for an entirely separate purpose (i.e. financial tracking and
reimbursement)”. Reported issues with various existing data sources included limitations in
content and in data timeliness. A panelist commented, “Often there is a substantial lag time in the
availability of surveillance data for public health actions”. A recurring message from the panel
was the need for more stable resources and support.
Ideal OSH surveillance: Some system requirements may not seem readily feasible given
limited resources and thus belong to “an ideal” OSH surveillance. Experts’ comments helped to
outline an ideal OSH surveillance, in which all necessary infrastructures and resources
were well-integrated to support key surveillance activities from data collection to dissemination
as well as interoperability among data systems. Strategic planning was emphasized in the ideal
OSH surveillance. “Having a well thought out plan…is critical to developing a streamlined
system”. Plans and protocols should cover both existing and potential surveillance activities.
Implications for evaluation and a framework: Nearly all experts held positive
expectations about the development of a tailored framework for OSH surveillance evaluation.
One expert commented that “It would be helpful to have an evaluation framework specific to
OSH surveillance ...” and another noted that “we've been struggling with developing a
systematic approach to evaluation”. Experts called for flexibility in OSH surveillance evaluation, because “a ‘one-size
fits all’ mentality of program evaluation is often unnecessary”, and an evaluation framework
should “allow for ‘picking and choosing’ of those evaluation components”. Factors such as
differences between systems, changing surveillance objectives and OSH conditions over time,
and non-routine activities affect the evaluation choices. One expert commented that “…
evaluating it [the infrastructure] seems difficult as it constantly changes and differs from state to
state”. Another expert stated, “Case definitions are often lacking for OSH across jurisdictions,
states, areas.” As to assessing the significance of surveillance, one expert wrote that “Priorities
change and the most useful surveillance systems are those that have a historical trove of
information about priority conditions that emerge as ‘significant concerns’ at some point.”
Elements such as infrastructure, standards and guidelines, and strategies were commonly stated as impacting the system’s
performance and stability. For example, experts commented that “Infrastructure is necessary for
an effective and efficient public health surveillance system” and the surveillance strategy
“impacts standardization and consistency over time and among data sources”. Some elements
were regarded as more central in relation to other elements. For example, experts commented
that stakeholders’ acceptance and participation were impacted by the system’s compliance,
transparency, simplicity and usefulness, while they in turn impacted various aspects of the
system’s performance. They stated that “without the willingness, the data is not as good”,
“whether a system will function smoothly, meets the intended purpose and can be sustained also
depends on its acceptability by stakeholders.” Attributes such as data quality and data
timeliness were likewise described as interconnected with other elements of the system.
Discussion
The lack of comparability and replicability in evaluations limits their usefulness 28. We
have reported elsewhere the results of an OSH surveillance evaluation in Oregon and the
challenges encountered. The present study was designed to facilitate the understanding of important elements in OSH surveillance and thus contribute to the
improvement of surveillance and its evaluation. This three-round survey study, based on the
Delphi technique, also captured the current status of OSH surveillance systems in the US, which provided further contextual information.
Delphi Consensus
The resulting list (Appendix 3-2 to Appendix 3-4) contains elements receiving high and
low level of consensus. Means and modes are common statistics used in the Delphi process to
indicate central tendency of panelists’ ratings, or “average group estimate”, while difference, SD,
and the percent of experts giving a certain rating or selecting a certain category can be used to
measure the divergence of group opinions, or the amount of panel disagreement 21,22,29. This
study leveraged a combination of central tendency and divergence measures to differentiate the
framework elements.
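To make these summary measures concrete, the statistics reported for each element (mean, mode, difference, SD, and the percent of panelists rating 3 or above) can be computed from a set of panel ratings as in the sketch below. This is illustrative only: the ratings shown are hypothetical, and since the study does not state whether the sample or population standard deviation was used, the population form is assumed here.

```python
from statistics import mean, mode, pstdev

def delphi_summary(ratings):
    """Summarize one element's panel ratings on a 1-5 scale.

    Returns central-tendency measures (mean, mode) and divergence
    measures (difference = max - min, SD, and percent of panelists
    rating 3 or above), mirroring the statistics reported in the
    Delphi summary table. Population SD is assumed.
    """
    return {
        "mean": round(mean(ratings), 1),
        "mode": mode(ratings),
        "difference": max(ratings) - min(ratings),
        "sd": round(pstdev(ratings), 1),
        "pct_3_or_above": round(100 * sum(r >= 3 for r in ratings) / len(ratings)),
    }

# Hypothetical ratings from a ten-member panel for one element
ratings = [5, 5, 4, 5, 4, 3, 5, 4, 5, 4]
print(delphi_summary(ratings))
# → {'mean': 4.4, 'mode': 5, 'difference': 2, 'sd': 0.7, 'pct_3_or_above': 100}
```

A small difference and SD together with a high percent rating 3 or above would place an element in the high-consensus group under this scheme.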
A key feature of the Delphi process is that experts can potentially change their opinions after referring to the views of their peers 17,21,30. The second
review helped to reach panel opinion convergence for elements that the panel rated low and/or
viewed more divergently in the first review. After the second review, both the difference
and SD were significantly reduced. The change of ratings in this study suggested that the “group
opinions” had guided the panel experts’ reconsideration of their stance. A strength in this Delphi
study is that experts were provided with feedback in both quantitative and qualitative summaries,
as the literature has suggested that qualitative feedback is an effective way to mitigate experts’ uncritical conformity to group opinion.
Surveillance Elements
Many elements in the “Inputs” in the logic model were considered fundamental and
critical to OSH surveillance. The need for more relevant OSH surveillance data sources and
stakeholders was also identified by the panel. Regarding surveillance activities, not surprisingly,
data processing, ongoing monitoring and data dissemination and their sub-components (except
case investigation) were considered to be relevant to OSH surveillance. These activities are also
core functions of general public health surveillance.
The study also revealed components and attributes less relevant to current OSH surveillance,
due partly to limited resources, for example, the component, “early detection” and its three sub-
components, “provide timely data”, “detect clusters and unusual events” and “guide immediate
actions”. Although guiding immediate action was proposed for an ideal OSH surveillance 3,
the panel experts thought that these activities were less relevant to current OSH surveillance
because they were “resources intensive”. Among low consensus attributes, “simplicity” is worth
mentioning. In general, a surveillance system should be effective and as simple as possible 4,5,8,34.
Interestingly, the panelists had conflicting perspectives on simplicity: some felt OSH surveillance
was fairly simple, while others noted that surveillance systems can be intricate.
This study confirmed that current OSH surveillance was largely focused on secondary
data analyses with multiple external data sources. As the recent report by the National
Academies concluded, there is no “comprehensive OSH surveillance system in the US, but rather an evolving set of systems using a
variety of data sources that meet different objectives, each with strengths and weaknesses” 3.
Both the experts’ comments and their reviews of respective systems highlighted limited
resources for OSH surveillance, which is also related to the lack of or less frequent evaluation for
the systems. Experts reported barriers to evaluation including staff time and budget, as well as
feasibility considerations in deciding evaluation objectives and criteria. Although feasibility is always a
consideration, it seemed more pronounced with OSH surveillance, as the panel experts
pointed out that many criteria were ideal but not readily feasible due to limited resources.
The identified network relationships among elements are similar to those reported in another existing study.
Future work is needed to better understand interactions among OSH surveillance components and attributes and to investigate
their relative importance.
This study highlighted the need for tailored guidelines for OSH surveillance evaluation.
According to the panel experts, they struggled with developing a systematic evaluation approach
and current guidelines were not designed to accommodate systems for non-communicable
diseases and systems that rely on multiple data sources such as OSH. Relevant components,
attributes and measures identified in this study can guide more relevant evaluation of OSH
surveillance systems. The study team is further working on a more detailed evaluation framework.
Limitations
A few limitations should be recognized in this study. Similar to any other consensus
methods, the Delphi technique is subject to cognitive bias in information exchange and decision-
making processes related to the participants’ backgrounds and personal traits 36. Potential
participants were identified using multiple information sources; however, despite the effort to
recruit a panel representing both the national and state level OSH surveillance systems, most
panelists represented state-level programs.
Participation rates in the second and the third round were 80% and 70%, respectively, in
this study. The attrition may impact the quality of information being gathered in this study.
Participant fatigue and the associated attrition have been recognized as a particular challenge in
Delphi studies given their iterative nature 18,26. However, as compared to a 16% participation rate
in the third round for large Delphi studies reported in literature 16, the participation rates in this
study were relatively high. This study adopted multiple recommendations in literature to
minimize attrition including the use of endorsed individuals, a modified Delphi process using
structured (closed-ended) items in the first round, and frequent reminders and active contacts even with non-respondents.
Researchers have warned that even the most well-planned Delphi may not yield an
exhaustive nor all-inclusive set of ideas 25. The finalized list in this study may not cover every
aspect of OSH surveillance given limited existing reference for this field. With continued efforts
promoting OSH surveillance and its evaluation, more elements that are relevant and practical to
the field may be identified.
Conclusion
This study identified a list of surveillance components, attributes,
and example measures relevant to OSH surveillance systems in the US by conducting a three-
round Delphi survey study with experts across the US who had experience in OSH surveillance
operation and/or with surveillance evaluation. The study results suggested that a Delphi process
delivered valuable findings through panel consensus based on both quantitative and qualitative
analyses. Findings from the experts’ reviews of their respective OSH surveillance systems and
their comments provided contextual information to understand the current status of OSH surveillance in the US.
A framework developed from these results can serve as a flexible, yet standard guide for
evaluating OSH surveillance systems. Evaluators could choose components and attributes
that are appropriate to their systems and use example measures to develop their own evaluation
metrics. Future research and practice are needed to further refine and enrich the list and explore
its applicability.
Element category | Delphi process | Group | # Elements | Mean rating | Mode | Difference | SD | % Panelists rated 3 or above
Component | After 1st review (n=10) | All | 59 | 4.3 (3.4, 4.9) | 4.6 (3, 5) | 2.0 (1, 3) | 0.7 (0.3, 1.2) | 95 (80, 100)
Component | After 1st review (n=10) | High consensus | 34 | 4.6 (4.0, 4.9) | 4.9 (4, 5) | 1.6 (1, 2) | 0.6 (0.3, 0.9) | 99 (90, 100)
Component | After 1st review (n=10) | For confirmation | 22 | 3.9 (3.4, 4.5) | 4.1 (3, 5) | 2.6 (1, 3) | 0.9 (0.5, 1.2) | 90 (80, 100)
Component | After 1st review (n=10) | Dropped | 3 | 4.1 (3.7, 4.6) | 4.3 (4, 5) | 2.0 (2, 2) | 0.7 (0.7, 0.7) | 97 (90, 100)
Component | After 2nd review (n=8) | High consensus | 45 | 4.5 (4.0, 4.9) | 4.9 (4, 5) | 1.6 (1, 2) | 0.6 (0.3, 0.9) | 98 (80, 100)
Component | After 2nd review (n=8) | Low consensus | 19 | 3.8 (3.5, 4.4) | 4.1 (3, 5) | 2.7 (2, 3) | 0.9 (0.6, 1.2) | 88 (60, 100)
Component | After 2nd review (n=8) | Dropped | 1 | 4 | 5 | 3 | 1.3 | 50
Component | Final list | All | 64 | 4.3 (3.5, 4.9) | 4.6 (3, 5) | 2.0 (1, 3) | 0.7 (0.3, 1.2) | 95 (60, 90)
Attribute | After 1st review (n=8) | All | 32 | 4.2 (3.6, 4.8) | 4.5 (3, 5) | 2.0 (1, 3) | 0.8 (0.4, 1.2) | 96 (80, 100)
Attribute | After 1st review (n=8) | High consensus | 24 | 4.4 (4.0, 4.8) | 4.6 (4, 5) | 1.7 (1, 2) | 0.7 (0.4, 0.9) | 99 (90, 100)
Attribute | After 1st review (n=8) | For confirmation | 8 | 3.9 (3.6, 4.3) | 4.1 (3, 5) | 2.8 (2, 3) | 1.0 (0.7, 1.2) | 89 (80, 100)
Attribute | After 1st review (n=8) | Dropped | 0 | / | / | / | / | /
Attribute | After 2nd review (n=7) | High consensus | 26 | 4.4 (4.0, 4.8) | 4.6 (4, 5) | 1.7 (1, 2) | 0.7 (0.4, 0.9) | 99 (90, 100)
Attribute | After 2nd review (n=7) | Low consensus | 5 | 3.7 (3.3, 4.0) | 3.6 (3, 5) | 2.8 (2, 3) | 0.9 (0.7, 1.1) | 90 (80, 100)
Attribute | After 2nd review (n=7) | Dropped | 1 | 4.1 | 5 | 2 | 0.9 | 90
Attribute | Final list | All | 31 | 4.3 (3.3, 4.8) | 4.5 (3, 5) | 1.9 (1, 3) | 0.7 (0.4, 1.1) | 98 (80, 100)
Measure | After 1st review (n=8) | All | 133 | 3.8 (2.5, 4.8) | 4.0 (2, 5) | 2.3 (1, 4) | 0.9 (0.4, 1.4) | 92 (50, 100)
Measure | After 1st review (n=8) | High consensus | 55 | 4.2 (4.0, 4.8) | 4.6 (4, 5) | 1.9 (1, 2) | 0.7 (0.5, 1.0) | 99.8 (88, 100)
Measure | After 1st review (n=8) | For confirmation | 71 | 3.6 (3.0, 4.3) | 3.6 (3, 5) | 2.5 (1, 4) | 0.9 (0.4, 1.4) | 89 (63, 100)
Measure | After 1st review (n=8) | Dropped | 7 | 3.0 (2.5, 3.8) | 2.9 (2, 4) | 2.9 (2, 3) | 1.1 (0.9, 1.3) | 66 (50, 88)
Measure | After 2nd review (n=7) | High consensus | 63 | 4.2 (4.0, 4.8) | 4.6 (4, 5) | 1.9 (1, 2) | 0.7 (0.5, 1.0) | 99.8 (88, 100)
Measure | After 2nd review (n=7) | Low consensus | 53 | 3.5 (3.0, 4.1) | 3.5 (3, 5) | 1.8 (0, 4) | 0.7 (0.0, 1.2) | 94 (75, 100)
Measure | After 2nd review (n=7) | Dropped | 10 | 2.8 (2.6, 3.1) | 2.9 (2, 3) | 2.5 (2, 3) | 0.8 (0.7, 1.1) | 69 (63, 88)
Measure | Final list | All | 116 | 3.9 (3.0, 4.8) | 4.1 (3, 5) | 1.9 (0, 4) | 0.7 (0.0, 1.2) | 97 (75, 100)
Figure 2-1 Selected components and attributes of OSH surveillance in logic model (Elements marked yellow were selected with low consensus)
Reference
1. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm
5. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.
6. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160
7. Hoinville LJ, Alban L, Drewe JA, et al. Proposed terms and concepts for describing and
evaluating animal-health surveillance systems. Prev Vet Med. 2013;112(1-2):1-12.
doi:10.1016/j.prevetmed.2013.06.006
9. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11:744. doi:10.1186/1471-2458-11-744
11. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5
12. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811
16. Foth T, Efstathiou N, Vanderspank-Wright B, et al. The use of Delphi and Nominal Group
Technique in nursing education: A review. Int J Nurs Stud. 2016;60:112-120.
doi:10.1016/j.ijnurstu.2016.04.015
17. Murphy MK, Black NA, Lamping DL, et al. Consensus development methods, and their
use in clinical guideline development. Health Technol Assess. 1998;2(3):1-88.
18. Adler M, Ziglio E. Gazing Into the Oracle: The Delphi Method and Its Application to
Social Policy and Public Health. Jessica Kingsley Publishers; 1996.
19. Linstone HA, Turoff M. The Delphi Method: Techniques and Applications. First Edition.
Addison-Wesley Educational Publishers Inc; 1975.
20. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376-382.
doi:10.1046/j.1365-2648.2003.02537.x
21. Hsu C-C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess
Res Eval. 2007;12(10):1–8.
22. Rowe G, Wright G. The Delphi technique as a forecasting tool: issues and analysis. Int J
Forecast. 1999;15(4):353-375. doi:10.1016/S0169-2070(99)00018-7
23. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and Reporting the Delphi
Method for Selecting Healthcare Quality Indicators: A Systematic Review. Wright JM, ed.
PLoS ONE. 2011;6(6):e20476. doi:10.1371/journal.pone.0020476
25. Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast
Soc Change. 2011;78(9):1695-1704. doi:10.1016/j.techfore.2011.04.005
26. Hsu C-C, Sandford BA. Minimizing Non-Response in The Delphi Process: How to
Respond to Non-Response. Pract Assess Res Eval. 2007;12(17):1-6.
27. Sinha IP, Smyth RL, Williamson PR. Using the Delphi Technique to Determine Which
Outcomes to Measure in Clinical Trials: Recommendations for the Future Based on a
Systematic Review of Existing Studies. PLoS Med. 2011;8(1):e1000393.
doi:10.1371/journal.pmed.1000393
29. Greatorex J, Dexter T. An accessible analytical approach for investigating what happens
between the rounds of a Delphi study. J Adv Nurs. 2000;32(4):1016-1024.
30. Makkonen M, Hujala T, Uusivuori J. Policy experts’ propensity to change their opinion
along Delphi rounds. Technol Forecast Soc Change. 2016;109:61-68.
doi:10.1016/j.techfore.2016.05.020
31. Bolger F, Wright G. Improving the Delphi process: Lessons from social psychological
research. Technol Forecast Soc Change. 2011;78(9):1500-1513.
doi:10.1016/j.techfore.2011.07.007
32. Meijering JV, Tobi H. The effect of controlled opinion feedback on Delphi features:
Mixed messages from a real-world Delphi experiment. Technol Forecast Soc Change.
2016;103:166-173. doi:10.1016/j.techfore.2015.11.008
33. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:10.6064/2012/875253
34. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html
35. Calba C, Drewe JA, Goutard FL, et al. The Evaluation Attributes Used for Evaluating Animal
Health Surveillance Systems. RISKSUR Project; 2013.
Liu Yang a, Laurel Kincl a
a College of Public Health and Human Sciences, Oregon State University, 123 Women’s
Building, Corvallis, Oregon, 97331, USA
Abstract
Background: Despite increased interest in public health surveillance and its evaluation, no
tailored framework has been developed for evaluating Occupational Safety and Health (OSH)
surveillance in the United States (US).
Methods: A Delphi study was conducted with an expert panel to select framework elements
(surveillance components, attributes and measures) through consensus. The findings were shared
with experts and field practitioners in OSH surveillance in the US through the Council of State
and Territorial Epidemiologists (CSTE) Occupational Health Subcommittee, and structured
feedback was obtained on the applicability of identified elements and suggested content for a
tailored framework. This feedback guided the development of a tailored evaluation framework
for OSH surveillance.
Results: A total of ten experts from the Delphi panel and nine field professionals provided
feedback on proposed elements and guidance on their use. Suggestions covered multiple aspects
for a framework, including offering flexibility, providing recommended and optional elements,
considering stages of surveillance, and being simple and straightforward. The guiding framework
has been developed to incorporate the final elements selected in the Delphi study as well as
experts’ suggestions. The framework includes practical guidance on evaluation steps and four
different evaluation types, as well as practical tools such as template working tables.
Conclusion: The developed framework is expected to address the lack of a practical guiding tool
in OSH surveillance evaluation. It is expected that, with use, the framework will provide a
flexible yet standard guide for evaluating OSH surveillance systems.
Introduction
Despite increased interest in public health surveillance and its evaluation, no tailored
framework has been developed for evaluating Occupational Safety and Health (OSH)
surveillance in the United States (US). Evaluators refer to commonly used guidelines,
represented by the CDC’s Updated guidelines for evaluating public health surveillance systems, to
evaluate OSH surveillance systems 1. However, the CDC guidelines provide limited guidance for
evaluating OSH surveillance systems, which tend to have limited resources and lack components
of a complete surveillance system.
Public health surveillance systems are usually evaluated by attributes, which refer to a set
of qualities describing a system’s performance. To identify relevant surveillance components,
attributes, and measures for each attribute, a Delphi study was conducted. A panel of experts
across the US reviewed and rated elements (surveillance components, attributes and measures)
compiled from the literature. The panel identified relevant elements that could be used in a
tailored evaluation framework.
Using a translational study design, findings from the Delphi study were disseminated to
the potential users of the framework, i.e., health professionals and practitioners in national and
state-level OSH surveillance systems. Structured feedback was collected on the applicability of
identified elements and suggested contents in a tailored framework. This paper presents the feedback and the resulting framework.
Methods
A Delphi study was conducted from March 2019 to August 2019 to propose elements
(surveillance components, attributes and example measures) that are relevant to OSH
surveillance and its evaluation for a tailored evaluation framework. An expert panel of ten
members with experience in the operation and evaluation of OSH surveillance systems reviewed
and rated a draft list of elements, resulting in consensus on a total of 64 OSH surveillance
components, 31 attributes and 116 example measures, with 134 (62.3%) reaching a high level of
consensus.
In the Delphi study, experts answered open- and closed-ended questions regarding the
development of a tailored guiding evaluation framework. After the study, all selected elements
and a proposed framework sketch were disseminated back to Delphi experts as well as field
professionals in national and state-level OSH surveillance systems through the Council of State
and Territorial Epidemiologists (CSTE) Occupational Health (OH) Subcommittee. The follow-
up sought opinions regarding selected elements and framework content from experts and
professionals via 1) a presentation and in-person communication with participants in the CSTE
Occupational Health (OH) Subcommittee annual winter meeting (December 2019, Miami, FL)
and 2) written feedback from interested experts and professionals via emails. Table 3-1 includes
questions in the Delphi study and the follow-up opinion seeking. All feedback was compiled
into themes and descriptive statistics to guide the development of this OSH evaluation
framework.
Table 3-1 Questions in the Delphi study and follow-up opinion seeking
Study Questions
Delphi Study What other comments do you have for developing the tailored evaluation
framework, in addition to those you have provided with specific elements?
What other contents besides the surveillance components and attributes do you
think need to be included in the tailored evaluation framework?
Is it good to group framework elements into "recommended" and "optional"
based on the panel's mean ratings?
What other suggestions do you have for the to-be-developed OSH surveillance
evaluation framework?
Follow-up opinion seeking Is there any selected element for OSH surveillance you want to comment on?
Should elements selected with low consensus be excluded from the framework
or included as optional items?
Can the mean rating be used as weight for a given element?
Any specific content should be removed/added from the framework sketch?
Results
A total of ten Delphi panel experts answered open- and closed-ended questions regarding the
development of a guiding evaluation framework in the Delphi study. Nine field professionals
(including three experts in the Delphi panel) provided feedback regarding the selected elements
and the development of the guiding framework through written and in-person communications in
the follow-up review. Table 3-2 summarizes major comments and suggestions from experts for
the evaluation framework.
Table 3-2 Major comments and suggestions for an evaluation framework from experts in the Delphi study (n=10)
and the follow up review (n=9)
Aspect Major comments and suggestions
Low consensus elements Six experts in the follow-up (67%) suggested to include elements with low
consensus, given:
1) They may be relevant to advanced level OSH surveillance systems;
2) It allows for customization in the evaluation.
Be simple and straightforward  Ten experts (53%) emphasized that the framework needs to be clear and simple to be
a practical and useful tool for surveillance staff.
Ways to make the framework practical include:
Weights and quantitative Experts agreed that the framework could suggest elements that are more important to
scoring grid OSH surveillance, but they were unable to comment more on weights due to lack of
expertise.
Experts raised concern that numerical scales (the quantitative scoring grid) may not
be as interpretable as a qualitative summary.
Practicality Six experts (32%) confirmed that the list of elements was thorough and
comprehensive.
Four experts (21%) commented that some of the elements are not feasible for basic
OSH surveillance systems or are unrealistic to measure.
Three experts (16%) identified elements that reflect features more of a complete
system, describing current OSH surveillance as a "loosely defined" system relying on
data systems that are not owned/managed by organizations conducting the
surveillance.
All experts believed that it is meaningful to develop a tailored framework for OSH
surveillance evaluation. Most experts in the follow-up (67%) explicitly suggested including low
panel consensus elements, thinking they could be optional choices in a flexible framework. A
recurring suggestion was to offer flexibility for customization. Most panel experts (80%) agreed that the framework should provide guidance for
different types of evaluations based on system’s maturation stage. Many emphasized the
importance of keeping the framework simple and straightforward. One expert commented, the
framework “needs to be very clear and straightforward. Otherwise it would not be useful.”
Another expert noted that OSH surveillance systems “may not have resources” and surveillance
staff are “not trained evaluator[s]”. While we had hoped to have the experts consider how to
differentiate elements based on their level of relevance to OSH surveillance, they were unable to
provide such specific guidance.
The OSH surveillance evaluation framework (see Supplement) has been developed to
incorporate the selected elements and experts’ suggestions, with an aim to assist with evaluation
of OSH surveillance systems being carried out by public health departments and institutions at
various jurisdictions in the US. The framework offers practical guidance in two main parts: 1)
steps on formulating and implementing an evaluation and 2) guides for four different types of
evaluation. Template working tables and other tools are included to aid the implementation of an
evaluation.
The OSH surveillance evaluation framework has been developed following a principle of
being “simple and straightforward” suggested by experts. We adapted the six-step tasks
recommended in the CDC updated guidelines and other participatory-based evaluation models
into a more intuitive and practical 4-step procedure, in which engaging stakeholders is not
recommended as a main task in OSH surveillance evaluation 1,5. For resource limited
surveillance systems and systems largely relying on secondary data sources such as OSH
surveillance systems, stakeholders are usually limited to data providers and some data users with
which contacts can possibly be established. This situation makes it neither meaningful nor
feasible to treat stakeholder engagement as a main evaluation task.
Four types of evaluation were proposed in this framework based on surveillance stages. A
prevailing message from experts was to offer flexibility of customization. We grouped elements
into recommended and optional for each type of evaluation, with the objective to simplify each
type of evaluation while providing flexibility for the evaluators to tailor. Further, unlike
common surveillance evaluation guidelines represented by the CDC updated guidelines, we do not
recommend in-depth quantification of data quality attributes such as sensitivity, specificity and predictive value positive (PVP). Our experience has demonstrated that
data and resources available for a normal OSH surveillance evaluation usually would not allow
an in-depth analysis of data quality attributes (manuscript in press). Rather, we proposed a dedicated data quality evaluation type.
A logic model is included in the framework as a useful way to visualize elements along the
OSH surveillance organizational flow. Logic models have been suggested in many existing guides
for program and surveillance evaluation 1,5,6,15. By incorporating all recommended components
and attributes into one comprehensive graphic presentation, the logic model provides an easy and
straightforward way for evaluators to check and select elements for their tailored evaluations.
It is expected that the framework will address the lack of a practical guiding tool in OSH
surveillance evaluation and help to promote more evaluations and in turn improve OSH
surveillance systems. Despite the intent to be comprehensive, it is likely that not all components
and attributes relevant to OSH surveillance have been included in this framework given the wide
variety of OSH surveillance activities. Further, the recommended components and attributes for
each type of evaluation may need further refinement with more practice. Application of this
framework can help to continuously improve the framework, as one Delphi panel expert
commented, “it will be helpful in the development of a draft [the framework], which can be
amended as more people attempt to use the tool.” Although the framework is intended for OSH
surveillance systems in the US, it may also provide a reference for OSH surveillance in
other contexts.
References
1. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.
5. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016. Accessed May 1, 2020.
http://ebookcentral.proquest.com/lib/osu/detail.action?docID=4722974
6. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf
7. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:10.6064/2012/875253
11. Nobles RE. Process evaluation of the Texas occupational safety & health surveillance
system. Published online 2009. Accessed November 10, 2017.
https://search.proquest.com/openview/bbf7eddd1106df9c4ea7c4ea18613bfb/1?pq-
origsite=gscholar&cbl=18750&diss=y
12. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335
13. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63
14. Davis LK, Hunt PR, Hackman HH, McKeown LN, Ozonoff VV. Use of statewide
electronic emergency department data for occupational injury surveillance: a feasibility
study in Massachusetts. Am J Ind Med. 2012;55(4):344-352. doi:10.1002/ajim.21035
15. McNabb SJN, Surdo AM, Redmond A, et al. Applying a new conceptual framework to
evaluate tuberculosis surveillance and action performance and measure the costs,
Hillsborough County, Florida, 2002. Ann Epidemiol. 2004;14(9):640-645.
doi:10.1016/j.annepidem.2003.09.021
A Guiding Framework for Evaluating Occupational Safety and Health (OSH)
Surveillance Systems
1 Introduction
Occupational Safety and Health (OSH) surveillance is the ongoing, systematic collection,
analysis, interpretation and dissemination of data on work-related health outcomes and on the
presence of health and safety hazards that are essential to OSH practices and research. Evaluation of OSH
surveillance helps ensure that events of OSH importance are being monitored effectively and promotes the best use of public health
resources.
To aid the evaluation of OSH surveillance systems, this guiding framework has been developed based on evaluation elements (surveillance components, attributes and example measures) identified from experience and research informed by field experts in OSH surveillance systems in the US (manuscripts in press/under review). The framework is intended for use with
OSH surveillance systems carried out by public health departments and relevant institutions at
various jurisdictions in the US. However, it may also provide a reference for evaluating OSH surveillance systems in other settings.
A logic model is provided to organize all evaluation elements included in this framework
(see Figure 3-4 and section 4 through 7). This framework aims to assist evaluators to plan and conduct four types of OSH surveillance evaluation:
• Initial/need evaluation,
• Routine/rapid evaluation,
• Comprehensive evaluation, and
• Data quality evaluation.
These types of OSH surveillance evaluation and steps in conducting an evaluation are
described in Section 3. Sections 4 through 7 contain specific guidance for each
type of evaluation. For each type, minimum components and attributes to be considered are
recommended. The framework concludes with suggestions to weight attributes for an evaluation
(section 8). Appendices include templated working tables and other tools to aid the
implementation of an evaluation.
OSH surveillance is an ongoing process of systematic collection and analysis of data regarding the occurrence and trends of OSH events.
A new surveillance initiative begins with identification of event(s) of OSH importance and the design and implementation of surveillance for the event(s). After implementation, the ongoing surveillance activities produce
data and statistics which are disseminated for wide use in OSH prevention action.
Figure 3-1 Stages of OSH surveillance activities and types of evaluation corresponding to each
stage
For a new initiative, initial/need evaluation is needed to establish a baseline for ongoing
evaluation. A data quality evaluation may also be needed to provide in-depth information on data
sources and analysis methodology. When the initiative is in the ongoing operation stage, routine/rapid evaluations may be conducted regularly (e.g., every year), and a comprehensive evaluation is recommended every three years. Data quality evaluation may be needed from time to time.
Table 3-3 introduces objectives and suitable situations for each of the four types of evaluation.
Table 3-3 Objectives and suitable situations by type of OSH surveillance evaluation.
1) Initial/Need Evaluation
2) Routine/Rapid Evaluation
3) Comprehensive Evaluation
4) Data Quality Evaluation
The framework proposes a four-step practical procedure for evaluating OSH surveillance systems. The procedure is centered on the events under surveillance and forms a cycle closed by ensuring the use of evaluation findings. Figure 3-2 shows a diagram of the four evaluation steps, which apply to all four types of evaluation. Please keep in mind that an evaluation is often not a linear process, and steps may overlap or be revisited as the work proceeds.
This is a key step in an evaluation as it sets the foundation for the next steps. It contains several tasks, described below.
Form the evaluation team: A comprehensive evaluation typically calls for an evaluation team with one or two lead evaluators and a few key members including system staff, possible
external evaluators and stakeholders. In a routine/rapid evaluation, the team may have only one
or two members, usually the system staff. The lead evaluator has primary responsibility in
designing and implementing the evaluation, as well as suggesting team members. Requirements for the lead evaluator include: 1
1 Adapted from:
[1] Centers for Disease Control and Prevention (CDC). Office of Strategy and Innovation. Introduction to Program Evaluation for
Public Health Programs: A Self-Study Guide. Atlanta, GA: Centers for Disease Control and Prevention; 2011.
[2] Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons, Incorporated; 2016.
• Understands both the potential benefits and risks of evaluation and conforms to ethical and confidentiality requirements
• Able to seek additional expertise or input when needed
• Able to report full findings (i.e., will not gloss over or fail to report certain findings)
Stakeholders identified in later tasks may be invited onto the evaluation team to help establish priorities for the evaluation and to facilitate access to evaluation resources. For example, in a comprehensive evaluation or data quality evaluation, participation of main data providers and data users helps to determine an appropriate scope of work and solicit in-depth information on how the data are collected, analyzed and used. Stakeholders invited onto the evaluation team should have a fairly good understanding of the surveillance system’s objectives
and activities so that they are able to provide concrete suggestions for the evaluation.
Collect information to describe the system: Framing the evaluation design begins with
learning some details about the system under evaluation. Two key questions are 1) what event(s)
are under surveillance and 2) what stage(s) they are in. Typically, an OSH surveillance system
may include multiple surveillance initiatives/activities covering many events using data from a wide variety of sources. For each initiative/activity under evaluation, information should be collected on the public health importance of the event(s) (e.g., incidence and prevalence,
population affected, costs associated with the event), purposes of the surveillance, surveillance
strategies (e.g., case definition, data sources, data analysis methods, technical platforms),
expected outputs and outcomes, stakeholders and the infrastructure and supporting functions in
the system to support the surveillance initiative/activity. Drawing a logic model of your system is
suggested to visualize the inputs and organizational flow of the system. Figure 3-4 is an
example of a comprehensive logic model that organizes common components for OSH
surveillance systems as well as attributes. Appendix 3-2 and Appendix 3-3 describe in detail each component and attribute.
Figure 3-4 Common OSH surveillance components and attributes in Logic Model
Main sources for collecting documentation to describe the system include existing system documents and reports; further information may be obtained by interviewing system staff and other stakeholders or by onsite observation. A complete list of common questions/aspects to consider in describing the system is provided in Appendix 3-1.
Thorough data collection work usually happens once during an initial evaluation or the
first comprehensive evaluation. Once this information is collected, it may be used in future
evaluations with necessary updates. Information collected in this task may be used as evaluation evidence later, so logs and documentation should be kept. Understanding the system also helps
to define the scope of evaluation work and to select components and attributes to be evaluated.
Identify stakeholders: More often than not, the major stakeholders for an OSH
surveillance system are staff members responsible for the surveillance activities, data providers
and data users collaborating within federal and state health agencies. The developers and
maintainers of technical platforms such as Electronic Health Records (EHR) may also be
stakeholders. It is important to note that medical care providers, employers and employees, and community members, while they may be stakeholders, may be too far up- or downstream to involve directly. Identified stakeholders can be invited onto the evaluation team (if necessary) and/or serve as participants from whom evaluation evidence is collected. Stakeholder involvement in an evaluation
may be limited by the nature of the evaluation; in a routine/rapid evaluation, for example, stakeholder involvement may be minimal.
Determine the scope of evaluation: The evaluation purpose and objectives may be pre-determined at the initiation of the evaluation project; however, the purpose and scope of the evaluation may still need to be tuned with information collected while describing the system. In
some situations, the evaluation team may be assigned a broad mission and needs to determine a
clear scope of work. An important consideration is to determine the type of evaluation that is most appropriate, based on the main evaluation purpose and the stage(s) of the selected surveillance activities. Example evaluation objectives include:
• Identify gaps needing improvement and areas needing more surveillance activities;
• Determine if data collected and generated in the system are in good quality;
With the determined type of evaluation, you must decide the components and attributes to
be included. Sections 4 through 7 introduce minimum and optional components and
attributes recommended for each type of evaluation that you can tailor for use of your specific
systems under evaluation. Deciding on components and attributes under evaluation is not always straightforward. For example, infrastructure and supporting functions in the hosting agency (i.e., public health department or institution) may be shared by various
programs and functions in that agency or the OSH surveillance activities may be collaborative
efforts across different agencies. Further, an OSH surveillance system may include multiple surveillance initiatives/activities, each focusing on different OSH events, using data from a wide variety of sources, and being in different stages. The evaluation team needs to draw a clear
line on which to be included in the surveillance “system” under evaluation based on the purpose
of the evaluation. The feasibility of the evidence and methods used to assess the selected attributes is also an important consideration. There is often an iterative process between “describing the system”, “identifying stakeholders”, “determining the scope of evaluation”, and the later task of developing evaluation measures and questions.
Develop evaluation measures and questions: Each selected component is evaluated via attributes, which are measurable features of the corresponding surveillance
components. For each attribute selected for an evaluation, appropriate measures with possible information sources need to be developed. Appendix 3-4 provides example measures corresponding to each attribute listed in Figure 3-4. Multiple measures may be needed for one
attribute, with evaluation evidence collected from multiple information sources. For example, stakeholder engagement might be assessed through stakeholders’ participation in meetings (checking meeting minutes and/or interviewing staff), their responsiveness working with the surveillance staff (checking meeting minutes and/or interviewing staff), and/or the stakeholders’ feedback on surveillance outputs. A subset of effective measures can be used for routine and rapid evaluation. Availability of
information sources must be considered in selecting measures (e.g., stakeholders, data for
quantitative analysis), along with possible resources (e.g., time frame required for the evaluation)
and feasibility. Evaluation questions and data collection methods that take a great amount of
time, complex design and data analysis, and involvement of a large number of stakeholders may
not be appropriate for an evaluation with limited staff time and budget. If given a limited time
frame and/or resources, prioritize evaluation measures so that the most important and relevant ones are addressed first.
For each attribute and each corresponding evaluation measure, methods for collecting and analyzing evaluation evidence are usually specified with one or more operational evaluation questions. For example, the ease of case ascertainment may be assessed with evidence collected from the surveillance staff. Two operational questions may be used:
“Could you rate the ease of case ascertainment/definition for each event/case definition using 5-
point scale?” and “Are there some case definitions difficult or tricky to ascertain?”. Common
data collection methods include system document review, literature review, and onsite observation.
Methods to collect evidence from stakeholders (including surveillance staff) include interviews, focus groups, and surveys.
Appendix 3-5 includes demonstration working tables that help to develop evaluation measures and questions and to sort them by data collection method and participant, which will facilitate later data collection.
Make evaluation plans: At the conclusion of the evaluation design step, the evaluation
team should make a detailed plan to specify responsible persons and timelines for the following activities: collecting and analyzing evaluation evidence, reporting the findings, and proposing recommendations for system
improvement. Responsible person(s) for each activity should also be specified. Appendix 3-6 provides a template working table for the evaluation data collection plan. For a routine/rapid evaluation or an evaluation with a small team, evaluation plans and protocols may be kept brief and simple.
A plan for engaging stakeholders may also be part of the evaluation to facilitate data collection and dissemination of evaluation findings. Ways to engage stakeholders may include enrolling them into the evaluation
team, workshops and presentations on the evaluation project and findings, individual feedback
regarding evaluation results and recommendations. Often there is no budget for an evaluation,
but if there is, the team may need to make a budget plan.
Following the evaluation methodology developed in Step 1, this step includes two specific tasks: 1) preparing and 2) implementing data collection. In the preparation stage, detailed
protocols for conducting the data collection activities should be developed, such as onsite visit
plans or interview guides. Measures and evaluation questions developed in step 1 for a specific
information source (e.g., the surveillance staff) need to be sorted together and further refined for
operational guides of interviews and/or focus groups, and survey questionnaires, if applicable.
Appendix 3-7 provides sample interview guides adapted from an evaluation of a state-based
OSH surveillance system (manuscript in press). The validity of developed guides needs to be
pre-tested within the team or by relevant people (e.g., staff in other departments, or a small group of stakeholders).
Evaluations often do not require ethics review, but consult with your relevant
Institutional Review Board to confirm before implementing the data collection work.
Data collection may be as simple as convenience onsite visits and meetings with surveillance staff, or as complex as multiple surveys/interviews/focus group discussions as well as formal onsite visits involving more than one site. Phone/virtual meetings may be time- and cost-efficient ways to conduct interviews/focus groups. Surveys could be developed and delivered via common online survey tools such as Qualtrics, REDCap,
or SurveyMonkey. Each data collection activity should be documented in a log and saved for future reference. Review collected evidence/findings in a timely fashion; this helps to identify possible lapses and adjust future data collection.
• Quantitative data refer to those in numerical format that are analyzed using statistical methods. They may be collected from system documents and datasets or from stakeholders (usually with closed-ended questions). Quantitative data describe the
system and its performance, such as the number of data sources for a surveillance event, time
interval between obtaining surveillance data, or annual funding and staff time (e.g., FTE). Quantitative data in an evaluation are usually analyzed with descriptive statistics, such as mean and range (for continuous data) and frequency/percentage (for categorical data). A clear cut-off may be defined for quantitative measures to judge performance.
Quantitative data and in-depth statistical analysis are particularly essential in data quality
evaluation for attributes such as sensitivity and Predictive Value Positive (PVP).
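These two attributes can be computed directly from counts obtained by comparing the system’s case list with an independent reference source. The sketch below is a minimal illustration only; the function names and all counts are hypothetical and are not drawn from any system described in this framework.

```python
# Illustrative only: sensitivity and Predictive Value Positive (PVP)
# computed from hypothetical counts after comparing a surveillance
# system's case list against an independent "gold standard" source.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Proportion of all true cases that the system detected."""
    return true_positives / (true_positives + false_negatives)

def pvp(true_positives: int, false_positives: int) -> float:
    """Proportion of system-reported cases that are true cases."""
    return true_positives / (true_positives + false_positives)

# Hypothetical counts: 480 true cases captured, 120 missed, 60 false reports.
print(round(sensitivity(480, 120), 3))  # 0.8
print(round(pvp(480, 60), 3))           # 0.889
```

In practice the true-case denominator comes from record linkage or chart review against the reference source rather than from the system itself.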
• Qualitative data refer to those that are not numerical and do not lend themselves to statistical analysis. Qualitative data in an evaluation are usually collected with open-ended questions in interviews, focus groups and surveys. They are a useful supplement to quantitative data for describing reasons, perceptions, experiences and attitudes.
For example, following a Yes/No closed-ended question asking participants about the usefulness
of the system, an open-ended question asking why provides deeper information for
interpretation. Expertise in content analysis, which aims to abstract major themes and patterns from qualitative data, is needed. Appendix 3-9 provides further guidance on analyzing qualitative data.
Interpreting the evaluation evidence and findings is a key step that determines the
system’s performance regarding each attribute, which leads to evaluation conclusions and
recommendations. You may consider two ways to present the system’s performance. Most often
a surveillance system’s performance is presented in a descriptive way with clear statement of the
strengths and weaknesses of the system regarding each attribute. Another way is to convert findings into a quantitative scoring grid, in which the performance of each attribute as well as the overall system performance can be presented as numerical scores that are easily compared. However, a significant challenge in this method is defining clear cut-offs between “good” and “bad” performance. The scoring grid method is only recommended when meaningful and interpretable cut-offs can be established.
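To make the cut-off problem concrete, a scoring rule can be written out explicitly. The sketch below is hypothetical: the thresholds and the example measure (proportion of cases reported within 30 days) are placeholders that would need to be justified for any real system.

```python
# Illustrative only: mapping a quantitative measure onto performance
# grades using pre-defined cut-offs (all thresholds are hypothetical).

def classify(value: float, good: float, fair: float) -> str:
    """Grade a measure where higher values indicate better performance."""
    if value >= good:
        return "good"
    if value >= fair:
        return "fair"
    return "needs improvement"

# e.g., 92% of cases reported within 30 days
print(classify(0.92, good=0.90, fair=0.75))  # good
```

Writing the rule down this way forces the evaluation team to state, and defend, where each boundary lies before scores are assigned.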
Appendix 3-2 and Appendix 3-3 contain descriptions of each component and attribute
reflecting general performance criteria for OSH surveillance. However, the system’s
performance should be determined within the context of the evaluation purpose, the system’s
infrastructure and other resources, as well as other factors possibly affecting the system’s
performance. A discussion with the surveillance system leadership and other key members may help to finalize the interpretation of findings.
The evaluation project largely ends with findings being summarized into an evaluation
report, which can be kept in the system, submitted to relevant parties (e.g., funding agencies),
and disseminated to stakeholders and other interested people and entities. The evaluation report
generally includes introduction of the evaluation purpose and scope of work, description of the
surveillance system and activities, evaluation methods, evaluation results and findings, and recommendations. An example report outline is provided in Appendix 3-8.
The final step is to ensure the use of evaluation evidence, findings and recommendations. It may be difficult to ensure that the recommendations are acted upon, as implementing recommendations is often beyond the resources available to the evaluation team. It is optimal if the team can take the first step in translating evaluation findings into actions by discussing the recommendations with the surveillance system leadership and stakeholders.
The evaluation reports and materials collected should be referenced in future evaluations.
Gaps identified and recommendations proposed in a previous evaluation can inform the purpose and scope of the next evaluation.
To design and implement a quality OSH surveillance evaluation, the evaluation team
should consider the following four standards, namely: utility, feasibility, propriety, and accuracy.
For each quality standard, requirements and possible aspects to consider are listed below.
Feasibility: Requires that the evaluation process be practical and nondisruptive, that it acknowledge the political nature of the evaluation, and that resources available for conducting the evaluation are used carefully. It specifies that evaluations should be efficient and produce information that justifies the expense.
· Can the evaluation methods be implemented within the time and budget available?
· Does the evaluation team have the expertise to implement the chosen methods?
· Are logistics and protocols realistic given the time and resources available?
Propriety: Requires that the evaluation be ethical and be conducted with regard for the rights of those involved and affected. It requires that the evaluation be complete and fair in its process and reporting. It also requires that the evaluation demonstrate respect for human dignity and avoid conflicts of interest.
· Are issues of confidentiality of respondents addressed?
Accuracy: Requires that the evaluation use accurate and systematic processes to collect evaluation evidence and produce technically accurate conclusions that are defensible and justifiable.
· Is appropriate quality control in place?
· Are enough data being collected?
· Are methods and sources consistent with the nature of the evaluation?
Adapted from:
[1] Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons, Incorporated; 2016.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center,
University of Nebraska; 2016.
4 Initial/need evaluation
An initial/need evaluation is conducted when a new surveillance initiative is being planned or implemented, or when major changes occur that may impact the surveillance activities, such as major program integration or changes in data sources.
For a new surveillance initiative, it is necessary to thoroughly document the public health
importance of the event(s) under surveillance, the purpose and activities of the surveillance (e.g.,
monitoring population trends and/or tracking individual cases), and surveillance strategies and
methodologies including data sources and technical platform. Further, the infrastructure and supporting functions needed for the initiative should be documented.
As a new initiative has usually not been in operation long enough to establish credible outputs and outcomes, these are not emphasized in this type of evaluation. Minimum components and attributes to consider in the initial/need evaluation are presented in Figure 3-5.
Figure 3-5 Initial/need evaluation: Minimum and optional components and attributes *
* Boxes and text in black are recommended for this type of evaluation. Boxes and text in shade are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed description on each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.
5 Routine/rapid evaluation
Routine and/or rapid evaluations are performed when an OSH surveillance system has
been operating routinely without major changes, and/or when there are limited time and resources for evaluation; a routine evaluation may be conducted every year. The major purpose of routine/rapid evaluation is to assess and ensure that the surveillance activities are being implemented as planned. As such, the system’s outcomes are not emphasized in this type of evaluation.
Routine evaluation by its very nature is conducted rapidly and frequently. It is not
encouraged to intensively engage and involve stakeholders unless necessary and readily
accessible. Further, as no major changes have occurred to the system, describing the system may largely rely on information collected in previous evaluations.
Figure 3-6 includes minimum and optional components and attributes to consider in a
routine/rapid evaluation.
Figure 3-6 Routine/rapid evaluation: Minimum and optional components and attributes *
* Boxes and text in black are recommended for this type of evaluation. Boxes and text in shade are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed description on each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.
6 Comprehensive evaluation
Comprehensive evaluations are recommended every three years, when major changes have occurred to the system, or at significant system timelines, for example, when preparing for a funding application.
evaluation aims to collect thorough information regarding every aspect of an OSH surveillance
system to comprehensively understand the input and resources in the system, the surveillance
activities being performed, and the outputs and outcomes being produced by the system. A
comprehensive evaluation makes conclusions about the performance and usefulness of the
system, identifies gaps to be addressed and proposes recommendations to improve the system.
Some quality-related attributes need not be assessed in a comprehensive evaluation, including sensitivity, specificity and predictive value positive (PVP), as these are more appropriate for a data quality evaluation. Further, assessing the usefulness of an OSH
surveillance system should be limited to immediate and visible outputs because attributing
downstream changes in workplace and worker safety and health to OSH surveillance may be
difficult.
Figure 3-7 includes the minimum and optional components and attributes to consider in a comprehensive evaluation. Available resources are an important consideration to guide the selection of elements and evaluation design. For example, data sources and events under surveillance can be limited to a few important ones depending on available resources.
Figure 3-7 Comprehensive evaluation: Minimum and optional components and attributes *
* Boxes and text in black are recommended for this type of evaluation. Boxes and text in shade are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed description on each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.
A specific data quality evaluation may be needed for an in-depth understanding of the
data produced in an OSH surveillance system. It is sometimes necessary to conduct a data quality
evaluation, for example, when there is a new surveillance initiative, or when major changes occur to data sources or analysis methods.
Data quality evaluations can be a separate project with specifically allocated resources,
including time and expertise to ensure the depth and breadth of the assessment. This evaluation
needs to describe the surveillance strategies and the data sources but not necessarily the
surveillance infrastructure. An understanding of the core data managing and processing activities is needed. The study team should have access to all necessary source datasets as well as relevant data documentation.
Figure 3-8 includes the minimum and optional components and attributes to consider in a data quality evaluation. This evaluation should focus on quality-related attributes that cannot be easily assessed in a normal evaluation, including sensitivity, specificity and predictive value positive (PVP), as well as others. Data analysis methods may include capture-recapture techniques, data linking, use of “gold standard” data, medical chart review, and case follow-up.
Figure 3-8 Data quality evaluation: Minimum and optional components and attributes *
* Boxes and text in black are recommended for this type of evaluation. Boxes and text in shade are not
recommended as minimum requirements, but evaluators may choose them to tailor their evaluation project.
Detailed description on each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.
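As an illustration of one such analysis, the two-source (Chapman) capture-recapture estimator uses the overlap between two linked data sources to estimate the total number of cases, including those missed by both. The sketch below is illustrative only; the source pairing and all counts are hypothetical.

```python
# Illustrative only: two-source capture-recapture (Chapman estimator).
# n1, n2 are case counts in each source (e.g., WC claims and ED records);
# m is the number of cases found in both sources via record linkage.

def chapman_estimate(n1: int, n2: int, m: int) -> float:
    """Nearly unbiased estimate of the total number of cases."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

total = chapman_estimate(n1=300, n2=200, m=120)
print(round(total))           # 499 (estimated total cases)
print(round(300 / total, 2))  # 0.6 (approximate sensitivity of source 1)
```

The estimator assumes the two sources capture cases independently; when that assumption is doubtful, log-linear models with three or more sources are commonly used instead.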
8 Weighting attributes
The attributes included in this framework reflect varied aspects of the system. Among the attributes selected for evaluation, the
evaluation team may want to emphasize certain attributes in the evaluation design, data
collection and analysis to better reflect the important aspects of the system. If desired, quantitative weights can even be assigned to the attributes being measured to obtain an “overall performance” sum when the evaluation team decides to quantitatively score evaluation results.
To this end, the average rating for each attribute (on a scale of 1 to 5) from the expert panel in the Delphi study (manuscript under review) may serve as a reference for choosing important attributes. However, it must be considered that these ratings reflect the attributes’ relevance to a “general” OSH surveillance system. Attributes with a very high mean rating (≥ 4.5) are:
• Sustainability
• Confidentiality
• Feasibility
• Acceptability
• Accessibility
• Usability
• Usefulness
• Stability
• Effectiveness
• Representativeness
• Consistency
• Accuracy.
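If a quantitative summary is desired, the panel’s mean ratings can serve as weights in a simple weighted average of attribute scores. The sketch below is illustrative only; both the ratings and the scores are hypothetical placeholders, not the actual Delphi results.

```python
# Illustrative only: combining per-attribute scores into an overall
# performance figure using mean expert ratings as weights.

ratings = {"usefulness": 4.8, "accuracy": 4.6, "stability": 4.5}  # hypothetical weights
scores = {"usefulness": 4, "accuracy": 3, "stability": 5}         # hypothetical findings (1-5)

overall = sum(scores[a] * ratings[a] for a in scores) / sum(ratings.values())
print(round(overall, 2))  # 3.99
```

Because the weights are normalized by their sum, the overall figure stays on the same 1-to-5 scale as the individual attribute scores.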
activities. A system may include one or more surveillance programs, which focus on similar or different OSH events and related activities.
Initiative: Usually refers to the plan and/or the initial implementation of new surveillance
activities or of major change(s) in existing surveillance activities. It can also refer generally to ongoing surveillance activities.
Normal evaluation: Refers to any type of evaluation normally allocated with limited to
moderate resources including time, budget and expertise. It does not include evaluations with
special objectives (e.g., data quality evaluation) or with extensive supporting resources.
Formative evaluation: Usually assesses whether the initiative is likely to have the intended effect and provides early
feedback and documentation on the initiative. Process evaluation documents whether the
program is being implemented as intended and provides explanations on why or why not by examining program operations.
2 Adapted from:
[1] Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation. Introduction to Program Evaluation for
Public Health Programs: A Self-Study Guide. Atlanta, GA: Centers for Disease Control and Prevention; 2011.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center, University of Nebraska; 2016.
The two terms refer to different stages of a program. In this framework, both are addressed through the types of evaluation described above.
Measure: Refers to a specific indicator used to assess an attribute. Example measures for each attribute can be found in Appendix 3-4.
Liu Yang a
Adam Branscum a
Viktor Bovbjerg a
Curtis Cude b
Crystal Weston b
Laurel Kincl a
a College of Public Health and Human Sciences, Oregon State University, 123 Women’s Building, Corvallis, Oregon, 97331, USA
b Public Health Division, Oregon Health Authority, 800 NE Oregon Street, Portland, Oregon, 97232, USA
Abstract
Background: Young workers 24 years and under are especially vulnerable to occupational
injuries and illnesses. There is a continuing need to investigate injury burden among young
workers across demographics and industry to identify vulnerable subgroups and inform targeted
interventions. Workers compensation (WC) claims data are important for quantifying work-
related injuries and illnesses, however published studies have mostly focused on disabling WC
claims. This study extended previous research on young workers’ injuries in Oregon by
including both disabling and non-disabling claims data to identify patterns of injury and high-risk industries.
Methods: We obtained all accepted disabling claims data (N=13,360) and a significant portion
of non-disabling claims data (N=24,460) from Oregon’s largest WC provider on workers aged 24
years and under from 2013 to 2018. Injury count, rate, and claim cost were calculated by year,
age, gender, industry, and injury type. A Prevention Index (PI) method was used to rank high-risk industries.
Results: Average annual disabling and non-disabling claim rates were 111.6 and 401.3 per
10,000 young workers. Workers aged 19-21 (disabling: 119.0 per 10,000 and non-disabling:
429.3) and 22-24 years (115.7 and 396.4) and male workers (145.3 and 509.0) had higher claim
rates than workers aged 14-18 (80.6 and 297.0) and female workers (79.8 and 282.9). The most common injury types included “struck by/against” and “work-related musculoskeletal disorders (WMSDs)” (19.5%). Distribution of major injury types differed across industries. Based on the PI, high-risk industries included agriculture, construction and
manufacturing for both genders combined. For female young workers, the highest risk was in the
healthcare industry.
Conclusions: This study identified major injury types (“struck by/against” and WMSDs) and high-risk industries, which can help target research and prevention efforts among Oregon young workers. Further research can examine the
representativeness of WC claims data and other state-based data on young worker safety and
health.
Introduction
Adolescents and young adults make up a significant portion of the workforce in the US.
In 2018, there were about 19.4 million young workers aged 24 and under nationally and more
than 200,000 young workers in the State of Oregon, representing 12% of the total workforce in
both the country and the state 1. In the US, federal and state child labor laws govern youth employment, and in general, youth aged 14 years and older are allowed to work. Studies reported that about 35% and 50% of 17-year-olds work during the school year and the summer, respectively 2. While working is a key developmental
milestone for many adolescents and young adults, risk of occupational exposures and injuries can
be a serious threat to not only their immediate health but also long-term development and ability
to work throughout adulthood 3,4. Ensuring a safe and healthy working environment is essential
to protect young workers and enable them to be productive through their working life.
Youth are vulnerable to occupational hazards and injuries 2,5,6. Young workers, especially
males, are at up to two times higher risk of injuries at work as compared to their older
counterparts 7–11. Occupational injury risk factors often differ between young and adult workers.
Young workers often hold temporary jobs and work irregular hours 5,12,13. They tend to work
in certain industry sectors such as wholesale and retail trades, services, and agriculture, and their
jobs often require higher levels of physical exertion 5,8,10,12,14,15. Moreover, young workers are not a homogeneous group; they comprise subgroups defined by age, gender and other potential factors such as training and job skills.
Workplace safety and health experiences may vary across subgroups 10,19. Studies have shown
that male and older young workers had higher injury rates as compared to female and younger
workers 3,8,14,15,20,21. Male and female workers tend to have different job tasks and experience
different types of injuries 22,23. For example, one study on adolescents reported that few females
did farm work while few males worked in health care facilities 23. Compared with males, females
are more likely to have jobs that require very fast repetitive motions and static standing for
extended time periods, which are associated with musculoskeletal injuries 16. There is a
continuing need to investigate injury burden among young workers across age, gender and industry to identify vulnerable subgroups.
A number of data sources have been examined to characterize the injury burden and
nature among young workers, including administrative and hospital data such as the workers
compensation (WC) claims data, emergency department (ED) visits data, trauma registry data,
and survey-based data 3,14,20,21,24–27. Each of these data sources is subject to potential
coverage and completeness issues, and no single data source captures all injuries to young
workers 2. WC claims data have long been used as a population-based data source for research
and surveillance of work-related injuries and illnesses 3,28–31. The state of Oregon uses WC
claims data to document the incidence and characteristics of injuries and illnesses among young
and adult workers 32–34. However, earlier efforts focused on disabling claims, i.e. claims
involving compensation for lost work time or permanent disability or death, which generally
represent more severe injuries. Non-disabling claims involving injuries and illnesses without lost
work time were generally not included in existing studies due to difficulty in accessing such data 35.
Literature suggests that minor injuries and illnesses occur frequently among young
workers 15,16,23,36,37. One study suggested that young workers had been experiencing minor
injuries so frequently that they saw them as a normal part of their everyday work 16. Another
study on both disabling and non-disabling WC data in the state of Washington showed that only
15% of claims from adolescents involved compensation for lost work time 15. A previous study
conducted by Walters et al. 33 on Oregon young workers found that injury rates almost doubled
when adding data from a portion of non-disabling claims from a commercial insurer in Oregon.
That study proposed that Oregon could improve the completeness of occupational surveillance by incorporating non-disabling claims data.
The present study includes non-disabling WC claims data from the State Accident
Insurance Fund Corporation (SAIF), the largest WC insurance provider in Oregon 38. By
analyzing both disabling and non-disabling WC claims data, the present study reported work-
related injuries and illnesses from 2013-2018 among Oregon young workers aged < 25 years,
with two specific objectives: 1) quantify and compare disabling and non-disabling injuries and
illnesses by demographics, industry and injury characteristics; 2) identify industry sectors of high
disabling and non-disabling injury risk for young workers using a Prevention Index (PI). The PI
takes into account multiple metrics such as injury count, rate and associated cost to
systematically compare injury risk across industry sectors to inform research and prevention
priorities 39,40.
Methods
Data sources
Oregon law requires most employers to provide workers’ compensation coverage to their
employees, including hourly and part-time employees 41. Oregon employers can choose to have
coverage through a private insurance company, to self-insure, or to obtain insurance through SAIF. In 2018, SAIF’s market share in Oregon was 54%,
whereas the private insurers and the self-insured held 34% and 12% market share, respectively 43.
Disabling WC claims in Oregon are defined as involving missing three or more days of
regularly scheduled work, overnight hospitalization, likely permanent disability, or death 41.
Non-disabling claims refer to those in which the injured workers do not receive any
compensation for lost work days (i.e., time lost from work is generally less than three days) or
experience permanent disability (Michael Vogt & Chuck Easterly, SAIF corporation, March,
2019, written communication). All Oregon insurance companies are required to report their
accepted disabling claims to the Workers’ Compensation Division of the Oregon Department of
Consumer and Business Services (DCBS), while accepted non-disabling claims are not required
to be reported.
The DCBS manages the Oregon disabling WC claims data system and provides the data
for research purposes on request. The present study obtained 2013-2018 de-identified accepted
disabling WC claims data from the DCBS for workers of all ages. Through a research project
agreement, SAIF provided 2013-2018 de-identified accepted non-disabling claims data for workers of all ages.
Employment data
To calculate claim rates, the Quarterly Workforce Indicators (QWI) data were used as
employment data. The QWI covers 95% of US private sector jobs and includes a wide variety of
record-level data sources such as the Unemployment Insurance data and Quarterly Census of
Employment and Wages data 44. The QWI data have been used in previous studies on Oregon’s
young workers as well as for other published studies 33,45. The publicly available QWI data
provide estimates for the number of workers aged 14 years and above, stratified by industry
and demographics. For this study, data for Oregon workers from 2013 to 2018 were used.
Variables
Both the accepted disabling and non-disabling claims datasets contain the following
variables used in this study: injury year, worker’s demographics (age and gender), employer’s
industry, injury characteristics (nature, body part, and event/cause), and compensated medical
cost.
Age group
The worker’s age was recorded in years in both WC claims datasets. To match age
groups in the QWI employment data, workers were grouped into: < 10, 10-13, 14-18, 19-21, 22-24, and > 24 years (the last group in the disabling claims dataset only).
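The grouping above can be sketched as a simple lookup (an illustrative Python sketch; the original analyses were performed in R):

```python
def age_group(age: int) -> str:
    """Map an age in years to the QWI-compatible groups used in this study."""
    if age < 10:
        return "<10"   # likely miscoded; excluded from age-related analyses
    if age <= 13:
        return "10-13"
    if age <= 18:
        return "14-18"
    if age <= 21:
        return "19-21"
    if age <= 24:
        return "22-24"
    return ">24"       # adult comparison group, disabling claims data only
```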
Industry
Employer’s industry in the WC claims raw data was coded using the North American
Industry Classification System (NAICS) 46. We further coded nine industry sectors based on
National Occupational Research Agenda (NORA, the second-decade) definitions using NAICS
codes up to 4 digits 47. We combined one NORA sector, “Oil and gas extraction”, with another,
“Mining (except oil and gas extraction)”, into a single sector, “Mining”, because each had a small
number of claims.
Injury characteristics
DCBS coded injury characteristics (i.e., nature, event and body part affected) in the
disabling claims data with the Occupational Injury and Illness Classification System (OIICS)
version 2.01 48. However, SAIF coded the non-disabling data using the Workers Compensation
Insurance Organizations (WCIO) coding system 49. The study team drafted a WCIO-OIICS
crosswalk independently and then compared with two other crosswalks proposed for research
purposes to amend the crosswalk (Laura Syron, NIOSH, January 2020, written communication;
Steve Wurzelbacher, NIOSH, January 2020, written communication). Conflicting results were
resolved by consensus among the study team. We used the final WCIO-OIICS crosswalk to code injury characteristics in the non-disabling claims data.
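In code, applying such a crosswalk amounts to a dictionary lookup with a fallback for unmapped codes (a hypothetical Python sketch; the code pairs below are placeholders, not the study's actual mapping):

```python
# Hypothetical WCIO-to-OIICS nature-code crosswalk; codes are placeholders.
WCIO_TO_OIICS_NATURE = {
    "10": "111",  # placeholder pairing of a WCIO code to an OIICS code
    "40": "121",  # placeholder pairing
}

def map_nature(wcio_code, crosswalk=WCIO_TO_OIICS_NATURE):
    # Codes absent from the crosswalk are flagged for manual review,
    # mirroring the consensus-resolution step described in the text.
    return crosswalk.get(wcio_code, "NEEDS_REVIEW")
```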
Injury types
Based on previous studies 39,50, we further coded claims into 17 types based on injury
and illness characteristics (i.e., nature and event). OIICS codes in version 1.01 used in previous
studies were reviewed and converted into codes in version 2.01 by consensus among the study
team. We used the Bureau of Labor Statistics definition for work-related musculoskeletal disorders (WMSDs) 51.
Medical cost
Medical costs were standardized to the 2018 US dollar based on inflation information from the Consumer Price Index 52.
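The standardization is a simple index-ratio adjustment (an illustrative Python sketch; the index values below are placeholders, not official CPI figures):

```python
# Sketch of CPI-based standardization to 2018 dollars.
# Index values are illustrative placeholders, not official CPI figures.
CPI = {2013: 233.0, 2018: 251.1}

def to_2018_dollars(cost, year, cpi=CPI):
    """Inflate a cost from its claim year to 2018 dollars."""
    return cost * cpi[2018] / cpi[year]
```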
Analysis
We counted claims by year, age group, gender, industry, injury nature and injury type for disabling
claims and non-disabling claims separately. We also calculated median medical costs associated with these claims.
Claim rates were estimated by year, age group, gender and industry for disabling and
non-disabling claims separately, expressed as the number of claims per 10,000 workers. We
chose the unit of workers to facilitate the comparison with previous studies on Oregon young
workers. Disabling claim counts and rates for adult workers (25 years and above) were also
calculated for comparison. Because the non-disabling claims data covered only SAIF-insured workers, we adjusted the
QWI employment data at the 2-digit level NAICS industry using the percentage of SAIF-covered
disabling claims over the total number of disabling claims in each industry in a similar study
period from 2014 to 2018 (proportion data provided by DCBS). The proportions ranged from
13.5% (NAICS code 22, “Utilities”) to 90% (NAICS code 11, “Agriculture, forestry, fishing and
hunting”), with an average of 55.6%. An implicit assumption is that similar percentages hold for
non-disabling claims. Specifically, we assumed that these proportions reflected SAIF’s market
share, that is, the percentage of workers covered under its premium in these industry sectors.
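The adjustment described above scales each industry's QWI denominator by SAIF's claim share (an illustrative Python sketch; the 13.5% and 90% shares come from the text, while the employment counts are hypothetical):

```python
# Sketch of the denominator adjustment: QWI employment in each 2-digit
# NAICS industry is scaled by SAIF's share of disabling claims there.
# The 13.5% (NAICS 22) and 90% (NAICS 11) shares are quoted in the text;
# the worker counts are hypothetical.
saif_share = {"11": 0.90, "22": 0.135}
qwi_employment = {"11": 20_000, "22": 4_000}

adjusted = {naics: qwi_employment[naics] * saif_share[naics]
            for naics in qwi_employment}
# `adjusted` then serves as the denominator for non-disabling claim rates.
```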
To rank industries of high risk, we adapted the Prevention Index (PI) methodology
developed by researchers in the state of Washington 39,40. This method has been used in a number
of studies as an effective way to systematically rank industry sectors to inform research and
prevention priorities 39,40,50,53,54. Each industry is ranked by injury count, injury rate and
associated medical cost, and arithmetic means of these ranks are calculated to obtain PI ranks.
The basic PI was calculated using the following formula; in case of a tie in basic PI, rate rank was used as the tiebreaker.

Basic PI = (count rank + rate rank + cost rank) / 3
To reflect injury severity, the expanded PI further adds medical cost rate rank, in which
the medical cost rate was calculated as the total medical cost incurred in an industry sector over
the total employment in this sector (per 10,000 workers). The expanded PI was calculated using
the following formula. In case of a tie in expanded PI, cost rank was used as the tiebreaker.
Expanded PI = (count rank + rate rank + cost rank + cost rate rank) / 4
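The ranking-and-averaging logic can be sketched as follows (an illustrative Python sketch; ties in the mean rank would still be broken by rate rank or cost rank as the text describes):

```python
def ranks(values):
    """Rank with 1 = highest burden (largest value)."""
    order = sorted(values, reverse=True)
    return [order.index(v) + 1 for v in values]

def prevention_index(counts, rates, costs, cost_rates=None):
    """Mean of the component ranks; pass cost_rates for the expanded PI."""
    components = [ranks(counts), ranks(rates), ranks(costs)]
    if cost_rates is not None:       # expanded PI adds the cost-rate rank
        components.append(ranks(cost_rates))
    n = len(components)
    return [sum(col) / n for col in zip(*components)]
```

For two hypothetical sectors, `prevention_index([120, 40], [15.0, 8.0], [9000, 2000])` ranks the first sector highest on every component, so its basic PI is 1.0.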
We calculated basic and expanded PIs for each NORA sector for disabling claims and
non-disabling claims separately. In calculating PI ranks, we excluded categories with fewer
than 10 claims over the study period due to small sample size. PI ranks were
also calculated for strata by selected factors including age group, gender and major injury types.
Due to the very small number of claimants aged below 14 years, separate statistics are not
reported in this paper. We excluded claims with age below 10 years from age-related calculations,
as age was likely miscoded in these claims. For analyses involving claim rates, claims
with age below 14 years were excluded as no employment data (denominator) were available.
All statistical analyses were performed using R software (version 3.5.1). A two-sided p-value below 0.05 was considered statistically significant.
The study obtained ethics review approval by the Institutional Review Board at Oregon State University.
Results
There were 13,360 accepted disabling claims and 24,460 accepted non-disabling claims
among Oregon young workers over the six-year study period from 2013 to 2018 (Table 4-1).
Nine claims involved fatal injuries. There were no important differences between disabling
and non-disabling claims in terms of claimants’ gender and age group distribution (Table 4-1).
The overall disabling claim rate from 2013-2018 was 111.6 per 10,000 workers (95%
Confidence Interval (CI): 109.7-113.5) (Table 4-1). The overall non-disabling claim rate was
401.3 per 10,000 workers (95% CI: 393.6-406.4), which was 3.60 times the disabling
claim rate (95% CI: 3.52-3.67). The median medical cost for disabling claims was $1,758
(hereinafter, in 2018 US dollars), while the median medical cost for non-disabling claims was
$499.
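As a rough check, a normal-approximation confidence interval on the Poisson claim count reproduces the reported interval when a denominator near 1.2 million worker-years is assumed (an illustrative Python sketch; the denominator is our back-calculated assumption, not the study's published QWI total):

```python
import math

# Claim rate per 10,000 workers with a normal-approximation 95% CI on the
# Poisson count. The worker denominator is a hypothetical illustration.
def rate_per_10k(claims, workers, z=1.96):
    rate = claims / workers * 10_000
    half_width = z * math.sqrt(claims) / workers * 10_000
    return rate, rate - half_width, rate + half_width

rate, lo, hi = rate_per_10k(13_360, 1_197_000)  # ~111.6 (109.7-113.5)
```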
An increasing trend of both the claim count and rate during the study period was
observed for non-disabling claims, with an estimated 1.03-fold increase in claim rate each year
(95% CI: 1.02-1.04). The trend in disabling claim rate remained fairly constant from 2013 to
2017 but dropped substantially in 2018, likely due to data incompleteness from less claim
maturation time, a common phenomenon in recent years of WC claims data.
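An annual rate ratio of this kind can be estimated from a log-linear trend (an illustrative Python sketch using ordinary least squares on log rates; the study's exact trend model is not specified here, and the yearly rates below are hypothetical):

```python
import math

# Estimate an annual rate ratio from a log-linear trend via least squares
# on log rates. The yearly rates are hypothetical illustration values.
years = [0, 1, 2, 3, 4, 5]  # 2013..2018
rates = [380.0, 390.0, 402.0, 415.0, 428.0, 440.0]

logs = [math.log(r) for r in rates]
n = len(years)
xbar, ybar = sum(years) / n, sum(logs) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, logs)) / \
        sum((x - xbar) ** 2 for x in years)
annual_rr = math.exp(slope)  # ~1.03 here: about a 3% increase per year
```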
Workers aged 19-21 (disabling:119.0 per 10,000 workers and non-disabling: 429.3) and
22-24 years (115.7 and 396.4) had higher claim rates than the youngest workers aged 14-18 (80.6
and 297.0) (Table 4-1). Males had higher claim rates (disabling:145.3 per 10,000 workers and
non-disabling: 509.0) than females (79.8 and 282.9). For both disabling and non-disabling
claims, the injury rate ratio comparing male and female workers decreased with increasing age
(Rate Ratio (RR): >1.91 in age group 14-18 vs. <1.78 in age group 22-24, data not shown).
Figure 4-1 further displays disabling claim rates by gender for all age groups, including
adult workers aged above 24 years. The age group 19-21 years had the highest injury rates in
both genders. Compared to female and male adult workers, injury rates for females and males in
this age group were 1.05 times (95% CI: 1.00-1.10) and 1.16 times (95% CI: 1.12-1.20) as high. The
youngest age group (14-18) had the lowest injury rates, which were 0.70 (95%CI: 0.65-0.76) and
0.81 (95%CI: 0.76-0.87) times those of female and male adult workers.
Most claims involved the injury nature, “traumatic injuries and disorders” (92.6% in both
disabling and non-disabling claims combined), followed by “infectious and parasitic diseases”
(4.6%) and “diseases and disorders of body systems” (2.0%). Other and multiple diseases,
conditions, disorders, and symptoms accounted for fewer than 1% of the claims; however, they
incurred much higher medical cost. For example, the median medical cost in these injury natures
was $10,656 as compared with $1,722 in the above three more common injury natures in
disabling claims.
Regarding injury types, “struck by/against” was the most common type accounting for
35.6% of disabling and non-disabling claims combined, followed by WMSDs (19.5%), “fall on
same level” (9.1%), and violence (6.6%) (Figure 4-2). Non-disabling claims involved a higher
proportion of “struck by/against” (41.8% vs. 24.2%) and violence (7.6% vs. 4.6%) injuries
relative to the disabling claims. Young workers had more “struck by/against” injuries (36.0% vs.
13.8%) and fewer WMSDs (19.5% vs. 35.7%) as compared to adult workers (in disabling claims).
Considering medical cost, WMSDs tended to have higher medical cost, especially among disabling claims.
Claim rates by major injury types and the nine NORA sectors are presented in Table 4-2.
The “agriculture, forestry & fishing” (hereinafter referred to as agriculture) sector had much higher
than average rates for all major injury types. Although the mining sector did not have claim
records for all these major injury types, it had high rates for most injury types in which it had claims.
WMSDs had high rates in nearly all NORA sectors, with “healthcare & social assistance”
(hereinafter referred to as healthcare), manufacturing, and mining sectors having the highest rates
for both disabling and non-disabling claims. Disabling WMSDs rate was exceptionally high in
the “transportation, warehousing & utilities” (hereinafter referred to as transportation) sector (RR:
3.4; 95% CI: 3.04-3.79, compared with the average disabling WMSDs rate). “Struck by/against”
injuries had high rates in the agriculture and manufacturing sectors. It is worth noting that violence injuries were concentrated in a few sectors
including healthcare, agriculture and public safety, while the other sectors had extremely low
rates.
Nine NORA industry sectors were ranked using the basic and expanded PIs (Table 4-3).
For disabling claims, the agriculture sector ranked the highest with both the basic and expanded
PIs, followed by construction, transportation, and manufacturing. The mining sector ranked
among the lowest with basic PI, however it ranked high on expanded PI which incorporates
medical costs. For non-disabling claims, the manufacturing and construction sectors had the
highest ranks with both basic and expanded PIs. Compared to disabling claims, the agriculture
and transportation sectors ranked relatively lower. Again, mining sector ranked high with
expanded PI. In both disabling and non-disabling claims, “services (except public safety and
veterinary medicine/animal care)” (hereinafter referred to as services) and “wholesale and retail
trade” sectors accounted for large numbers of claims.
Stratified by gender, for both disabling and non-disabling claims, the healthcare sector
ranked the highest for female workers but among the lowest for male workers (Table S1-S2 in
Appendix 4-1). On the other hand, construction and agriculture were among the top ranked
sectors for male workers but among the lowest ranked for female workers. Further examination
showed that male claimants dominated the construction and agriculture sectors (>87%), while female claimants dominated the healthcare sector.
We further ranked industry groups based on NAICS 4-digit codes. These results are
included in Appendix 4-2 for interested readers but not discussed in this paper.
Discussion
Injury is the leading cause of death as well as years of potential life lost for children and
young adults 56. Preventing work-related injuries and illnesses among young workers continues
to be a critical need. The present study explored the use of both disabling and non-disabling
WC claims data to document patterns of work-related injuries and illnesses among Oregon
young workers aged 24 years and under in a 6-year period from 2013 to 2018. Using a
Prevention Index, the study systematically prioritized industry sectors with a high injury risk for
young workers.
The study identified an overall disabling injury rate of 111.6 per 10,000 workers for
young workers, which is slightly lower than the reported disabling injury rate of 122.7 per
10,000 workers for Oregon young workers from 2000-2007 in the previous study 33. Breaking
down by age and gender, we noticed that male workers and workers aged 19-21 and 22-24 had
higher rates. The findings are consistent with existing literature 3,9,11,33. The study quantified
a non-disabling injury rate approximately 4 times the rate of disabling injuries among Oregon
young workers, which has not been reported before. Findings from the study highlighted the
importance of using both disabling and non-disabling WC claims data for a more complete report of work-related injuries and illnesses among young workers.
Common injury types identified for Oregon young workers were similar to those for
Oregon adult workers in this study, as well as to those reported in the state of Washington for
workers of all ages combined 39. Consistent with existing studies 3,8,22, we found that the overall
proportion of WMSDs among young workers was less than that among adult workers. Young
workers may have less time to develop impaired musculoskeletal functioning compared with
older workers 3; however, a progressive history of WMSD and other musculoskeletal conditions
accumulated from a young age may lead to higher risk of further injuries and permanent disability
3,57. In contrast to WMSDs, “struck by/against” injuries were more common among young
workers than adult workers. They were more prevalent in non-disabling claims, indicating this
type of injury may be less severe but happen more often. “Struck by/against” injuries refer to
those produced by forcible contact or impact between the injured person and sources. Most
injuries of this type involved open wounds or surface wounds and bruises (80.9%). Young
workers often work with machinery, equipment and tools, especially in production sectors such as
agriculture and manufacturing, where these injuries were concentrated. These machines may be particularly dangerous to young workers, especially the
younger age group, as they are usually not designed to fit young workers’ physical body frames,
leading to awkward postures and injuries 58. Further, young workers often hold jobs with fast-
pace and higher levels of physical exertion, which exacerbate their injury risk 5,8,10,12,14,15. On the
other hand, young people tend to lack experience and cognitive maturity to deal with difficult
tasks or work environments 5,10,11,22. They may also lack access to training and appropriate
supervision on the job 17,18. Education and other intervention programs that increase young
people’s safety skills, knowledge and awareness of workplace hazards, and that promote
appropriate supervision, could help reduce their vulnerability to many workplace injuries, including
“struck by/against” injuries.
No previous study has systematically ranked high-risk industries for young workers. This
study reported high ranked NORA sectors in both disabling and non-disabling claims based on
PI method. The basic and expanded PI ranks can inform research and intervention targets by
incorporating injury count, rate, and cost burden. Previous studies commonly reported
agriculture, manufacturing, trade and service sectors as hazardous for adolescent and young adult
workers 20,32,33,59. Our study suggested that agriculture, construction and manufacturing sectors
should be priority targets as they had fairly high injury counts and rates, as well as high medical
cost associated with the injuries and illnesses. Services and wholesale and retail trade sectors
should be a focus in addressing the large number of injuries among young workers. It is of note
that the mining sector had high expanded PI ranks due to its very high claim rate and cost rate,
although it ranked low with basic PI due to having the fewest claims. Although the mining
industry has historically been seen as one of the most dangerous industries in the US for its high
death toll and high injury rate for all ages 60, few previous studies have highlighted concern about the
mining sector for young workers. Given the high injury rate and associated medical cost among
Oregon young workers, more research is needed to investigate young workers’ safety and health
in these industries at the state and local level. Prevention efforts in these industries must consider
how to target young workers or create provisions specific for younger workers.
Gender difference
Consistent with previous studies, male young workers had a nearly doubled injury rate compared with their female counterparts. Recognizing the high burden of
injury and illness among male workers, researchers also pointed out a bias in occupational health
that it tends to emphasize injuries and risk factors typically associated with male workers and
overlook those associated with women 16,22,61. Our study showed that male young workers
dominated in most high rank sectors. In fact, the high ranked industries for both genders
combined mostly represented patterns of male young workers. For example, agriculture and
construction, ranked high for both genders combined and for males, were ranked low for
females. On the other hand, the healthcare sector, ranked the highest for females, was listed low
in the combined list. By making lists specifically for female workers, the study revealed
industries and injury types that contributed disproportionately to occupational injury and illness among female young workers.
Limitations
Despite the strength of including both disabling and non-disabling claims data, injuries
and illnesses are likely to be underestimated due to under-coverage and under-reporting issues
with WC claims 36,41,62–64. Young workers are more likely to be under-counted as they often
hold temporary or informal jobs that may be missed by WC coverage or reporting. In addition, we adjusted
employment data using claims proportions based on the assumption that the workforce covered by
SAIF was the same as that covered by other insurers, defined by factors of interest to the study, such as age
and gender. We understand that this assumption might not hold true. However, we think the
estimation is appropriate for this exploratory study, given that SAIF is the largest WC
provider in Oregon and covers the majority of Oregon employers. Data interoperability in
disabling and non-disabling data was another challenge in this study. Despite our best effort to
develop a crosswalk between WCIO and OIICS, misclassification was unavoidable in certain injury categories.
Conclusions
The study explored the expanded use of WC data by including both disabling and non-
disabling claims data in a recent 6-year period from 2013 to 2018 in describing young workers’
injury burden. The findings of this study complement previous studies with a more complete picture of work-related injuries and illnesses among Oregon young workers.
In summary, young workers experienced higher injury rates than adult workers. There
were significantly more non-disabling injuries than disabling injuries across different industry
sectors, age and gender groups. For both disabling and non-disabling claims, male young
workers and workers aged 19-21 and 22-24 years had a greater injury burden than their female
and younger counterparts. Common injury types included “struck by/against”, WMSDs, “fall on
same level” and violence, with differences observed between disabling and non-disabling claims,
and across industries and genders. The study is the first to report a systematic ranking of industry
sectors with high injury risk among young workers as a whole as well as in sub-groups.
The study has practical implications for young workers’ occupational safety and health.
A more complete picture of work-related injuries and illnesses based on multiple data sources
provides evidence to both develop and evaluate targeted prevention interventions for Oregon
young workers. High ranked industry sectors for young workers as a whole, such as agriculture,
construction and manufacturing sectors, and for female workers separately such as healthcare
should be priority targets to reduce both the number and rate of injuries and illnesses among
young workers. Interventions targeting certain industries could aim at major injury types
prevalent in that sector. For example, WMSDs and violence injuries should be the focus in the
healthcare sector.
Exploring non-disabling claims data from commercial WC insurers and the use of the
Prevention Index methodology enhances OSH surveillance of young workers, as well as other
worker groups. Future efforts to obtain non-disabling data from more insurers and to examine
data representativeness would increase the completeness and accuracy in documenting injuries
and illnesses among young workers. The study methodology could be adopted by other states to improve their OSH surveillance of young workers.
Reference
1. National Institute for Occupational Safety and Health. NIOSH Workplace Safety and
Health Topic: Young Worker Safety and Health. Published 2019. Accessed February 9,
2020. https://www.cdc.gov/niosh/topics/youth/default.html
2. National Research Council. Protecting Youth at Work: Health, Safety, and Development of
Working Children and Adolescents in the United States. The National Academies Press;
1998. doi:10.17226/6019
3. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.
6. Windau J, Samuel M. Occupational injuries among young workers. Monthly Labor Rev.
Published online 2005:11-23.
9. Centers for Disease Control and Prevention. Nonfatal Occupational Injuries and Illnesses
Treated in Hospital Emergency Departments--United States, 2003. MMWR Morb Mortal
Wkly Rep. 2006;55(16):449-452.
10. Hanvold TN, Kines P, Nykänen M, et al. Occupational Safety and Health Among Young
Workers in the Nordic Countries: A Systematic Literature Review. Saf Health Work.
Published online December 2018. doi:10.1016/j.shaw.2018.12.003
11. Salminen S. Have young workers more injuries than older ones? An international literature
review. J Safety Res. 2004;35(5):513-521. doi:10.1016/j.jsr.2004.08.005
12. Hanvold TN. Young Workers and Sustainable Work Life: Special Emphasis on Nordic
Conditions. Nordic Council of Ministers; 2016.
13. Paterson JL, Clarkson L, Rainbird S, Etherton H, Blewett V. Occupational fatigue and
other health and safety issues for young Australian workers: an exploratory mixed
methods study. Ind Health. 2015;53(3):293-299. doi:10.2486/indhealth.2014-0257
15. Miller ME, Kaufman JD. Occupational injuries among adolescents in Washington State,
1988-1991. Am J Ind Med. 1998;34(2):121-132. doi:10.1002/(SICI)1097-
0274(199808)34:2<121::AID-AJIM4>3.3.CO;2-#
17. Dragano N, Barbaranelli C, Reuter M, et al. Young Workers’ Access to and Awareness of
Occupational Safety and Health Services: Age-Differences and Possible Drivers in a
Large Survey of Employees in Italy. Int J Environ Res Public Health. 2018;15(7):1511.
doi:10.3390/ijerph15071511
18. Runyan CW, Schulman M, Dal Santo J, Bowling JM, Agans R, Ta M. Work-related
hazards and workplace safety of US adolescents employed in the retail and service sectors.
Pediatrics. 2007;119(3):526-534. doi:10.1542/peds.2006-2009
19. Nielsen ML, Dyreborg J, Kines P, Nielsen KJ, Rasmussen K. Exploring and Expanding
the Category of ‘Young Workers’ According to Situated Ways of Doing Risk and
Safety—a Case Study in the Retail Industry. Nord J Work Life Stud. 2013;3(3):219-234.
doi:10.19154/njwls.v3i3.3019
20. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.
21. Mujuru P, Mutambudzi M. Injuries and Seasonal Risks among Young Workers in West
Virginia - A 10-Year Retrospective Descriptive Analysis. AAOHN J. 2007;55(9):381-387.
doi:10.1177/216507990705500906
22. Laberge M, Ledoux E. Occupational health and safety issues affecting young workers: A
literature review. Work- J Prev Assess Rehabil. 2011;39(3):215-232. doi:10.3233/WOR-
2011-1170
23. Parker DL, Carl WR, French LR, Martin FB. Nature and incidence of self-reported
adolescent work injury in Minnesota. Am J Ind Med. 1994;26(4):529-541.
doi:10.1002/ajim.4700260410
24. Brooks DR, Davis LK, Gallagher SS. Work-related injuries among Massachusetts
children: A study based on emergency department data. Am J Ind Med. 1993;24(3):313-
324. doi:10.1002/ajim.4700240308
25. Graves JM, Sears JM, Vavilala MS, Rivara FP. The burden of traumatic brain injury
among adolescent and young adult workers in Washington State. J Safety Res.
2013;45:133-139. doi:10.1016/j.jsr.2012.11.001
26. US Bureau of Labor Statistics. Survey of Occupational Injuries and Illnesses Data.
Published 2019. Accessed March 10, 2020. https://www.bls.gov/iif/soii-overview.htm
27. Brooks DR, Davis LK. Work-related injuries to Massachusetts teens, 1987–1990. Am J
Ind Med. 1996;29(2):153-160. doi:10.1002/(SICI)1097-0274(199602)29:2<153::AID-
AJIM5>3.0.CO;2-T
28. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460
29. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Env Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743
32. McCall BP, Horwitz IB, Carr BS. Adolescent Occupational Injuries and Workplace Risks:
An Analysis of Oregon Workers’ Compensation Data 1990–1997. J Adolesc Health.
2007;41(3):248-255. doi:10.1016/j.jadohealth.2007.02.004
33. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819
35. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. National Institute for Occupational Safety and Health; 2014.
38. SAIF. SAIF company information. Published 2020. Accessed March 7, 2020.
https://www.saif.com/x640.xml
39. Anderson NJ, Bonauto DK, Adams D. Prioritizing Industries for Occupational Injury and
Illness Prevention and Research, Washington State Workers’ Compensation Claims Data,
2002–2010. Safety and Health Assessment and Research for Prevention (SHARP)
Program, Washington State Department of Labor & Industries; 2013:1-58. Accessed
February 22, 2017. http://www.lni.wa.gov/Safety/Research/Files/bd_3F.pdf
41. Oregon State Legislature. Workers’ Compensation (2019 Edition).; 2019. Accessed March
10, 2020. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html
42. Oregon State Legislature. Workers’ Compensation (2017 Edition).; 2017. Accessed
November 20, 2017. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html
43. Department of Consumer and Business Services. State of Oregon Workers’ compensation
insurance and self insurance. Accessed April 4, 2020.
https://www.oregon.gov/dcbs/reports/compensation/insurance/Pages/index.aspx
45. Syron LN, Kincl L, Yang L, Cain DT, Smit E. Analysis of workers’ compensation
disabling claims in Oregon’s seafood preparation and packaging industry, 2007-2013. Am
J Ind Med. 2017;60(5):484-493. doi:10.1002/ajim.22706
46. US Census Bureau. North American Industry Classification System. Published 2020.
Accessed March 10, 2020. https://www.census.gov/cgi-bin/sssd/naics/naicsrch?chart=2012
47. Centers for Disease Control and Prevention. National Occupational Research Agenda
(NORA): Sectors. Published February 4, 2019. Accessed March 10, 2020.
https://www.cdc.gov/nora/sectorapproach.html
48. Centers for Disease Control and Prevention. Occupational Injury and Illness Classification
System (OIICS). Published 2019. Accessed March 10, 2020.
https://wwwn.cdc.gov/Wisards/oiics/default.aspx
50. Anderson NJ, Bonauto DK, Adams D. Prioritizing industries for occupational injury
prevention and research in the Services Sector in Washington State, 2002–2010. J Occup
Med Toxicol. 2014;9(37):1-15. doi:10.1186/s12995-014-0037-2
51. US Bureau of Labor Statistics. Occupational Safety and Health Definitions. Published
2016. Accessed March 10, 2020. https://www.bls.gov/iif/oshdef.htm
52. Bureau of Labor Statistics. Consumer Price Index (CPI). Published 2017. Accessed May
17, 2017. https://www.bls.gov/cpi/
53. Fouquet N, Bodin J, Chazelle E, Descatha A, Roquelaure Y. Use of Multiple Data Sources
for Surveillance of Work-Related Chronic Low-Back Pain and Disc-Related Sciatica in a
French Region. Ann Work Expo Health. 2018;62(5):530-546.
doi:10.1093/annweh/wxy023
55. Yang L, Branscum A, Smit E, Dreher D, Howard K, Kincl L. Work-related injuries and
illnesses and their association with hour of work: Analysis of the Oregon construction
industry in the US using workers’ compensation accepted disabling claims, 2007-2013. J
Occup Health. 2020;62(1):e12118. doi:10.1002/1348-9585.12118
56. Graitcer PL. The development of state and local injury surveillance systems. J Safety Res.
1987;18(4):191-198. doi:10.1016/0022-4375(87)90082-X
57. Saha A, Sadhu HG. Occupational Injury Proneness in Young Workers: A Survey in Stone
Quarries. J Occup Health. 2013;55(5):333-339. doi:10.1539/joh.12-0150-OA
58. Runyan CW, Zakocs RC. Epidemiology and Prevention of Injuries Among Adolescent
Workers in the United States. Annu Rev Public Health. 2000;21(1):247-269.
doi:10.1146/annurev.publhealth.21.1.247
59. Belville R, Pollack SH, Godbold JH, Landrigan PJ. Occupational injuries among working
adolescents in New York State. JAMA. 1993;269(21):2754-2759.
60. Margolis KA. Underground coal mining injury: A look at how age and experience relate to
days lost from work following an injury. Saf Sci. 2010;48(4):417-421.
doi:10.1016/j.ssci.2009.12.015
61. Taylor EJ, Neis B, Messing K, Dumais L. Invisible: issues in women’s occupational
health. Resour Fem Res Tor. 1996;25(1/2):36-37.
62. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.
63. Biddle J, Roberts K, Rosenman KD, Welch EM. What Percentage of Workers With Work-
Related Illnesses Receive Workers’ Compensation Benefits? J Occup Environ Med.
1998;40(4):325–331.
64. Shannon HS, Lowe GS. How many injured workers do not file claims for workers’
compensation benefits? Am J Ind Med. 2002;42(6):467-473. doi:10.1002/ajim.10142
Table 4-1 Oregon workers' compensation disabling and non-disabling claims frequency, rate, and medical cost by year and demographics, 2013-2018
Table 4-2 Claim rate for young workers, by major injury type and NORA industry sector, Oregon workers' compensation accepted disabling (n=13360) and non-disabling claims (n=24660), 2013-2018
Table 4-3 NORA industry sectors ranked by Prevention Index (PI) for young workers, Oregon workers' compensation accepted disabling (n=13360) and non-disabling claims (n=24660), 2013-2018
[Bar chart: disabling claim rate per 10,000 workers by age group, plotted separately for female and male workers; y-axis 0.0–160.0, data labels 155.4, 148.6, 133.9, 108.9, 85.0, 83.6, 81.2, and 57.0]
Figure 4-1 Disabling injury rate by age group (including adult worker group) and gender, Oregon accepted
disabling claims (2013-2018) (Female workers: n=568 (14-18), 1904 (19-21), 2454 (22-24), 36714 (> 24); Male
workers: n=905 (14-18), 3263 (19-21), 4264 (22-24), 64931 (> 24))
[Bar chart: distribution of claims by injury type — struck by/against, WMSDs (work-related musculoskeletal disorders), fall on same level, violence, others, caught in/under/between, rubbed or abraded, transportation, fall to lower level, extreme temperature, toxic, overexertion, bodily reaction, electrical, fire and explosion, stress, noise]
Figure 4-2 Distribution of claims by injury type for young workers, Oregon accepted disabling (n=13360) and non-
disabling claims (n=24660), 2013-2018
events has been adopted as a common practice in many countries including the US to improve
occupational health. However, challenges with data sources, surveillance methods and other
resources have plagued OSH surveillance. Evaluation has long been a weak point in OSH
surveillance. As the first study of this research showed, nearly half of the systems
reviewed conducted only secondary data analysis. Approximately two-thirds of the systems were
never or rarely evaluated and no system was evaluated frequently. No guideline or framework
has been developed for evaluating OSH surveillance systems to date. I have reported elsewhere
on challenges in the evaluation of the Oregon OSH surveillance system based on the updated
CDC guidelines (manuscript in press). These include selecting attributes and linking them to
surveillance components; a framework specifying OSH surveillance-relevant components and
attributes with standardized descriptions is promising to facilitate this process. Experts
and field practitioners engaged in this research
held positive expectations for a tailored OSH surveillance framework. This research contributes
to the field of OSH surveillance and work-related injury and illness prevention by addressing
gaps regarding surveillance data sources and evaluation to meet the needs of field practitioners.
Each specific aim of this research was accomplished (presented in Chapters 2, 3, and 4). The
findings from each of the three studies are summarized below, followed by general conclusions.
Aim 1: To understand the current status of OSH surveillance in the US and to develop a comprehensive list of elements relevant to OSH surveillance and its evaluation, including surveillance components, attributes and example measures.
Aim 1 was accomplished by conducting a Delphi study involving experts across the
US with experience in the operation and/or evaluation of OSH surveillance systems (presented
in Chapter 2). Three iterative rounds of expert review and rating identified a total of 211
elements (surveillance components, attributes and example measures), with 134 (62.3%)
reaching a high level of panel consensus. For each component and attribute, the
definition/description was finalized based on experts’ suggestions to reflect best practice in current OSH surveillance.
Experts’ review of their respective OSH surveillance systems highlighted that evaluations were
not a common practice in these OSH surveillance systems/programs. Major system constraints
and weak points included resources, feasibility, data collection and flexibility. Resource
constraints, including financial support, technical platforms and expertise, were highlighted as
limitations for more advanced OSH surveillance activities. As such, feasibility was regarded as
a key performance criterion.
Aim 2: To develop a guiding framework for OSH surveillance evaluation in
the US based on the comprehensive list of elements identified in Aim 1 and follow-up
suggestions from the Delphi panel experts and potential users of the framework.
Aim 2 was accomplished as a translational study project following the completion of
Aim 1 (presented in Chapter 3). Identified framework elements and a proposed framework
sketch were disseminated to the Delphi panel experts and potential users of the framework, i.e.,
health professionals and practitioners in national and State-level OSH surveillance systems
through the Council of State and Territorial Epidemiologists (CSTE) Occupational Health (OH)
Subcommittee. Feedback was collected on the applicability of the identified elements and the
development of the framework. A prevalent suggestion was that the framework must be flexible,
simple, and straightforward. The developed guiding framework incorporates the identified
elements, field experts’ suggestions, as well as OSH surveillance evaluation experience from the
research team.
Aim 3 was accomplished by demonstrating the use of non-disabling workers' compensation (WC)
claims data from a private insurance company and the Prevention Index (PI) method in OSH
surveillance, examining injury and illness burden and prioritizing high-risk industries for a
vulnerable worker population in Oregon, young workers (presented in Chapter 4). Non-disabling
claims data from the State Accident Insurance Fund Corporation (SAIF), the largest WC
insurance provider in Oregon, were analyzed together with accepted WC disabling claims data.
Frequency counts and rates by age, gender, industry, and injury type were calculated. The study
quantified a non-disabling injury rate approximately 4 times the rate of disabling injuries among
Oregon young workers (401.3 vs. 111.6 per 10,000 young workers), highlighting the importance
of using both disabling and non-disabling WC claims data for a more complete report of injuries
and illnesses among young workers. The use of the PI method identified high-ranked industry
sectors for Oregon young workers. For both genders combined, high-risk industries included
agriculture, construction and manufacturing. For female young workers, the high-ranked sector
was healthcare.
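The rate and PI calculations summarized above can be sketched as follows. This is an illustrative implementation only, not the dissertation's actual code: the sector names and counts are hypothetical, and it assumes the common PI formulation in which an industry's rank by claim count and its rank by claim rate are averaged.

```python
# Illustrative sketch of the claim-rate and Prevention Index (PI)
# calculations; all sector names and numbers below are hypothetical.

def claim_rate(claims, workers, per=10_000):
    """Claims per `per` workers (e.g. per 10,000)."""
    return claims / workers * per

def prevention_index(sectors):
    """sectors maps name -> (claim_count, worker_count).
    Returns (name, PI) pairs, lowest PI (highest priority) first,
    where PI averages the rank by count and the rank by rate
    (rank 1 = largest value)."""
    counts = {s: c for s, (c, _) in sectors.items()}
    rates = {s: claim_rate(c, w) for s, (c, w) in sectors.items()}

    def ranks(values):
        ordered = sorted(values, key=values.get, reverse=True)
        return {s: i + 1 for i, s in enumerate(ordered)}

    count_rank, rate_rank = ranks(counts), ranks(rates)
    pi = {s: (count_rank[s] + rate_rank[s]) / 2 for s in sectors}
    return sorted(pi.items(), key=lambda kv: kv[1])

# Hypothetical example: a sector with moderate counts but a high rate
# (Agriculture) can outrank one with many claims but a low rate (Services).
example = {
    "Agriculture":  (900, 40_000),     # rate 225.0 per 10,000
    "Construction": (2_400, 110_000),
    "Healthcare":   (3_000, 260_000),
    "Services":     (1_100, 300_000),
}
for sector, pi in prevention_index(example):
    print(sector, pi)
```

Ranking by PI rather than by rate or count alone is what lets the method surface sectors that are high on both dimensions, which is how the study prioritized industries for young workers.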
Being applied in nature, the research project aimed to bridge the gap between research and
practice. Through dissemination and integration, field needs in OSH surveillance have been
incorporated to guide framework development. Best practices in OSH surveillance as well as
practical guidance and tools on conducting OSH surveillance evaluation were incorporated into
the developed guiding framework, which is expected to provide practical support for future
evaluations.
Exploring the use of new and existing surveillance data sources and methods has been another
contribution of this research. The combined use of non-disabling WC claims data and the
Prevention Index is promising for expanding current state-based surveillance to guide
evidence-based interventions for young workers as well as other worker populations. The study
also identified data quality challenges, including inconsistent data coding systems and
possible coding errors. In-depth evaluation of non-disabling WC claims data quality would help
to document and quantify the magnitude and impact of these issues.
In the long run, further dissemination of the developed framework to potential users will help
to promote more evaluations and in turn to improve OSH surveillance systems. Field application
will help to refine and enhance the framework. Continuing efforts to obtain and evaluate non-
disabling WC claims data as well as other potential data sources would increase the
completeness of OSH surveillance.
Bibliography
1. Takala J, Hämäläinen P, Saarela KL, et al. Global Estimates of the Burden of Injury and
Illness at Work in 2012. J Occup Environ Hyg. 2014;11(5):326-337.
doi:10.1080/15459624.2013.863131
2. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm
3. National Safety Council. Work Injury Costs. Injury Facts. Accessed May 14, 2020.
https://injuryfacts.nsc.org/work/costs/work-injury-costs/
5. Centers for Disease Control and Prevention. Injury Prevention and Control. the Centers for
Disease Control and Prevention. Published 2014. Accessed April 1, 2018.
https://www.cdc.gov/injury/about/approach.html
6. Centers for Disease Control and Prevention. How We Prevent Chronic Diseases and
Promote Health. Published July 31, 2019. Accessed March 31, 2020.
https://www.cdc.gov/chronicdisease/center/nccdphp/how.htm
7. National Institute for Occupational Safety and Health. Tracking Occupational Injuries,
Illnesses, and Hazards: The NIOSH Surveillance Strategic Plan. U.S. Department of
Health and Human Services, Public Health Service, Centers for Disease Control and
Prevention, National Institute for Occupational Safety and Health; 2001.
8. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.
12. National Research Council. Counting Injuries and Illnesses in the Workplace: Proposals for a Better
System. The National Academies Press; 1987. doi:10.17226/18911
13. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811
14. National Institute for Occupational Safety and Health. Worker Health Surveillance - Our
Current Surveillance Initiatives. Published 2019. Accessed May 15, 2020.
https://www.cdc.gov/niosh/topics/surveillance/data.html
15. National Institute for Occupational Safety and Health. State Surveillance Program.
Published February 6, 2020. Accessed March 18, 2020.
https://www.cdc.gov/niosh/oep/statesurv.html
16. Occupational Safety and Health Administration (OSHA). OSHA - Information System
(OIS). Accessed April 26, 2020. https://www.dol.gov/agencies/oasam/centers-offices/ocio/privacy/osha/ois
19. Stout N, Bell C. Effectiveness of source documents for identifying fatal occupational
injuries: a synthesis of studies. Am J Public Health. 1991;81(6):725-728.
doi:10.2105/AJPH.81.6.725
20. Leigh JP, Du J, McCurdy SA. An estimate of the U.S. government’s undercount of
nonfatal occupational injuries and illnesses in agriculture. Ann Epidemiol. 2014;24(4):254-
259. doi:10.1016/j.annepidem.2014.01.006
22. Leigh JP, Marcin JP, Miller TR. An Estimate of the U.S. Government’s Undercount of
Nonfatal Occupational Injuries. J Occup Environ Med. 2004;46(1):10–18.
doi:10.1097/01.jom.0000105909.66435.53
23. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63
24. Ruser JW. Allegations of Undercounting in the BLS Survey of Occupational Injuries and
Illnesses. In: JSM Proceedings, Statistical Computing Section. American Statistical
Association; 2010:15.
25. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.
26. Kica J, Rosenman KD. Multi-source surveillance for work-related crushing injuries. Am J
Ind Med. 2018;61(2):148-156. doi:10.1002/ajim.22800
27. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335
28. Davis LK, Grattan KM, Tak S, Bullock LF, Ozonoff A, Boden LI. Use of multiple data
sources for surveillance of work-related amputations in Massachusetts, comparison with
official estimates and implications for national surveillance. Am J Ind Med.
2014;57(10):1120–1132.
29. Borjan M, Lumia M. Evaluation of a state based syndromic surveillance system for the
classification and capture of non-fatal occupational injuries and illnesses in New Jersey.
Am J Ind Med. 2017;60(7):621-626. doi:10.1002/ajim.22734
30. Sears JM, Bowman SM, Hogg-Johnson S, Shorter ZA. Occupational Injury Trends
Derived From Trauma Registry and Hospital Discharge Records Lessons for Surveillance
and Research. J Occup Environ Med. 2014;56(10):1067-1073.
doi:10.1097/JOM.0000000000000225
31. Pefoyo AJK, Genesove L, Moore K, Del Bianco A, Kramer D. Exploring the Usefulness
of Occupational Exposure Registries for Surveillance The Case of the Ontario Asbestos
Workers Registry (1986-2012). J Occup Environ Med. 2014;56(10):1100-1110.
doi:10.1097/JOM.0000000000000235
32. Lofgren DJ, Reeb-Whitaker CK, Adams D. Surveillance of Washington OSHA Exposure
Data to Identify Uncharacterized or Emerging Occupational Health Hazards. J Occup
Environ Hyg. 2010;7(7):375-388. doi:10.1080/15459621003781207
34. Sarazin P, Burstyn I, Kincl L, Lavoué J. Trends in OSHA Compliance Monitoring Data
1979–2011: Statistical Modeling of Ancillary Information across 77 Chemicals. Ann
Occup Hyg. 2016;60(4):432-452. doi:10.1093/annhyg/mev092
35. Graber JM, Smith AE. Results from a state-based surveillance system for carbon
monoxide poisoning. PUBLIC Health Rep. 2007;122(2):145-154.
doi:10.1177/003335490712200203
38. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.
39. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460
40. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Environ Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743
44. Mallon M, Cherry E. Investigating the Relationship Between Worker Demographics and
Nature of Injury on Federal Department of Defense Workers’ Compensation Injury Rates
and Costs From 2000 to 2008. J Occup Environ Med. 2015;57 Suppl 3S:S27–S30.
doi:10.1097/JOM.0000000000000416
45. Ramaswamy SK, Mosher GA. Using workers’ compensation claims data to characterize
occupational injuries in the biofuels industry. Saf Sci. 2018;103:352-360.
doi:10.1016/j.ssci.2017.12.014
46. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.
47. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819
48. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. National Institute for Occupational Safety and Health; 2014.
Accessed February 19, 2019. http://elcosh.org/document/3752/d001287/workers%3F-compensation-insurance%3A-a-primer-for-public-health.html
49. Thacker SB, Qualters JR, Lee LM. Public Health Surveillance in the United States:
Evolution and Challenges. Morb Mortal Wkly Rep. 2012;61(3):3-9.
50. Hall HI, Correa A, Yoon PW, Braden CR. Lexicon, Definitions, and Conceptual
Framework for Public Health Surveillance. 2012;61:5.
51. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:http://dx.doi.org/10.6064/2012/875253
52. Allaki FE, Bigras-Poulin M, Michel P, Ravel A. A Population Health Surveillance Theory.
Epidemiol Health. 2012;34:e2012007. doi:10.4178/epih/e2012007
53. Centers for Disease Control and Prevention. Guidelines for evaluating surveillance
systems. Morb Mortal Wkly Rep MMWR. 1988;37(No. S-5).
54. Langmuir AD. The surveillance of communicable diseases of national importance. N Engl
J Med. 1963;268(4):182-192.
55. McNabb SJ, Chungong S, Ryan M, et al. Conceptual framework of public health
surveillance and action and its application in health sector reform. BMC Public Health.
2002;2:2. doi:10.1186/1471-2458-2-2
56. World Health Organization. Communicable Disease Surveillance and Response Systems:
Guide to Monitoring and Evaluating. World Health Organization; 2006.
http://apps.who.int/iris/bitstream/10665/69331/1/WHO_CDS_EPR_LYO_2006_2_eng.pdf
59. Muellner P, Watts J, Bingham P, et al. SurF: an innovative framework in biosecurity and
animal health surveillance evaluation. Transbound Emerg Dis. 2018;0(0):1-8.
doi:10.1111/tbed.12898
60. Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the
integrated evaluation of animal health surveillance systems. Prev Vet Med.
2019;173:104777. doi:10.1016/j.prevetmed.2019.104777
61. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11(744):1-15. doi:10.1186/1471-2458-11-744
62. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf
63. Centers for Disease Control and Prevention, Program Performance and Evaluation Office.
Framework for program evaluation in public health. Morb Mortal Wkly Rep.
1999;48(No.RR-11):1-40.
64. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5
65. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Stärk KDC. Evaluation of animal and public
health surveillance systems: a systematic review. Epidemiol Infect. 2012;140(4):575-590.
doi:10.1017/S0950268811002160
67. Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries
for occupational diseases: international development and validation of an audit tool
(ODIT). BMC Health Serv Res. 2009;9:194. doi:10.1186/1472-6963-9-194
69. Adamson PC, Tafuma TA, Davis SM, Xaba S, Herman-Roloff A. A systems-based
assessment of the PrePex device adverse events active surveillance system in Zimbabwe.
PLOS ONE. 2017;12(12):e0190055. doi:10.1371/journal.pone.0190055
70. Jefferson H, Dupuy B, Chaudet H, et al. Evaluation of a syndromic surveillance for the
early detection of outbreaks among military personnel in a tropical country. J Public
Health Oxf Engl. 2008;30(4):375-383. doi:10.1093/pubmed/fdn026
71. Jhung MA, Budnitz DS, Mendelsohn AB, Weidenbach KN, Nelson TD, Pollock DA.
Evaluation and overview of the National Electronic Injury Surveillance System-
Cooperative Adverse Drug Event Surveillance Project (NEISS-CADES). Med Care.
2007;45(10 Suppl 2):S96-102. doi:10.1097/MLR.0b013e318041f737
73. Kaburi BB, Kubio C, Kenu E, et al. Evaluation of the enhanced meningitis surveillance
system, Yendi municipality, northern Ghana, 2010–2015. BMC Infect Dis. 2017;17:306.
doi:10.1186/s12879-017-2410-0
76. Pinell-McNamara VA, Acosta AM, Pedreira MC, et al. Expanding Pertussis Epidemiology
in 6 Latin America Countries through the Latin American Pertussis Project. Emerg Infect
Dis. 2017;23(Suppl 1):S94-S100. doi:10.3201/eid2313.170457
77. Thomas MJ, Yoon PW, Collins JM, Davidson AJ, Mac Kenzie WR. Evaluation of
Syndromic Surveillance Systems in 6 US State and Local Health Departments: J Public
Health Manag Pract. Published online September 2017:1.
doi:10.1097/PHH.0000000000000679
79. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems.
The Public Policy Center, University of Nebraska; 2016.
80. Centers for Disease Control and Prevention. Evaluating an NCD-Related Surveillance
System. Centers for Disease Control and Prevention; 2013.
81. Centers for Disease Control and Prevention. Framework for evaluating public health
surveillance systems for early detection of outbreaks: recommendations from the CDC
Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.
82. Salman MD, Stärk KDC, Zepeda C. Quality assurance applied to animal disease
surveillance systems. Rev Sci Tech Int Off Epizoot. 2003;22(2):689-696.
83. Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating
military surveillance systems for early detection of outbreaks on duty areas. BMC Public
Health. 2008;8(1):146.
85. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html
86. Hagemeyer A, Azofeifa A, Stroup DF, Tomedi LE. Evaluating Surveillance for Excessive
Alcohol Use in New Mexico. Prev Chronic Dis. 2018;15. doi:10.5888/pcd15.180358
87. Austin C. An evaluation of the census of fatal occupational injuries as a system for
surveillance. Compens Work Cond. 1995;1:51–54.
89. Garland R. Evaluation of the Oregon Asthma Surveillance System. Oregon Department of
Human Services, Public Health Division, Office of Disease Prevention and Epidemiology,
Health Promotion and Chronic Disease Prevention, Oregon Asthma Program; 2009.
90. Samoff E, MacDonald PDM, Fangman MT, Waller AE. Local Surveillance Practice
Evaluation in North Carolina and Value of New National Accreditation Measures. J
Public Health Manag Pract. 2013;19(2):146–152. doi:10.1097/PHH.0b013e318252ee21
92. Dufour B. Technical and economic evaluation method for use in improving infectious
animal disease surveillance networks. Vet Res. 1999;30(1):27–37.
93. World Health Organization. Assessing the National Health Information System: An
Assessment Tool (v4.0). World Health Organization; 2008.
http://apps.who.int/iris/bitstream/10665/43932/1/9789241547512_eng.pdf?ua=1
94. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160
95. Zoellner JM, Porter KJ. Translational Research: Concepts and Methods in Dissemination
and Implementation Research. In: Coulston AM, Boushey CJ, Ferruzzi MG, Delahanty
LM, eds. Nutrition in the Prevention and Treatment of Disease (Fourth Edition).
Academic Press; 2017:125-143. doi:10.1016/B978-0-12-802928-2.00006-0
96. National Institute for Occupational Safety and Health. Research to Practice (r2p).
Published October 19, 2018. Accessed May 15, 2020.
https://www.cdc.gov/niosh/r2p/default.html
97. Landrigan PJ. Improving the surveillance of occupational disease. Am J Public Health.
1989;79(12):1601–1602.
98. Centers for Disease Control and Prevention. Overview of Evaluating Surveillance
Systems.; 2013.
99. Hoinville LJ, Alban L, Drewe JA, et al. Proposed terms and concepts for describing and
evaluating animal-health surveillance systems. Prev Vet Med. 2013;112(1-2):1-12.
doi:10.1016/j.prevetmed.2013.06.006
100. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11:744. doi:10.1186/1471-2458-11-744
103. Foth T, Efstathiou N, Vanderspank-Wright B, et al. The use of Delphi and Nominal Group
Technique in nursing education: A review. Int J Nurs Stud. 2016;60:112-120.
doi:10.1016/j.ijnurstu.2016.04.015
104. Murphy MK, Black NA, Lamping DL, et al. Consensus development methods, and their
use in clinical guideline development. Health Technol Assess. 1998;2(3):1-88.
105. Adler M, Ziglio E. Gazing Into the Oracle: The Delphi Method and Its Application to
Social Policy and Public Health. Jessica Kingsley Publishers; 1996.
106. Linstone HA, Turoff M. The Delphi Method: Techniques and Applications. First Edition.
Addison-Wesley Educational Publishers Inc; 1975.
107. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376-382.
doi:10.1046/j.1365-2648.2003.02537.x
108. Hsu C-C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess
Res Eval. 2007;12(10):1–8.
109. Rowe G, Wright G. The Delphi technique as a forecasting tool: issues and analysis. Int J
Forecast. 1999;15(4):353-375. doi:10.1016/S0169-2070(99)00018-7
110. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and Reporting the Delphi
Method for Selecting Healthcare Quality Indicators: A Systematic Review. Wright JM, ed.
PLoS ONE. 2011;6(6):e20476. doi:10.1371/journal.pone.0020476
112. Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast
Soc Change. 2011;78(9):1695-1704. doi:10.1016/j.techfore.2011.04.005
113. Hsu C-C, Sandford BA. Minimizing Non-Response in The Delphi Process: How to
Respond to Non-Response. Pract Assess Res Eval. 2007;12(17):1-6.
114. Sinha IP, Smyth RL, Williamson PR. Using the Delphi Technique to Determine Which
Outcomes to Measure in Clinical Trials: Recommendations for the Future Based on a
Systematic Review of Existing Studies. PLoS Med. 2011;8(1):e1000393.
doi:10.1371/journal.pmed.1000393
116. Greatorex J, Dexter T. An accessible analytical approach for investigating what happens
between the rounds of a Delphi study. J Adv Nurs. 2000;32(4):1016-1024.
117. Makkonen M, Hujala T, Uusivuori J. Policy experts’ propensity to change their opinion
along Delphi rounds. Technol Forecast Soc Change. 2016;109:61-68.
doi:10.1016/j.techfore.2016.05.020
118. Bolger F, Wright G. Improving the Delphi process: Lessons from social psychological
research. Technol Forecast Soc Change. 2011;78(9):1500-1513.
doi:10.1016/j.techfore.2011.07.007
119. Meijering JV, Tobi H. The effect of controlled opinion feedback on Delphi features:
Mixed messages from a real-world Delphi experiment. Technol Forecast Soc Change.
2016;103:166-173. doi:10.1016/j.techfore.2015.11.008
120. Calba C, RVC JD, Goutard F, et al. The Evaluation Attributes Used for Evaluating Animal
Health Surveillance Systems. RISKSUR Project; 2013.
122. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016. Accessed May 1, 2020.
http://ebookcentral.proquest.com/lib/osu/detail.action?docID=4722974
123. McNabb SJN, Surdo AM, Redmond A, et al. Applying a new conceptual framework to
evaluate tuberculosis surveillance and action performance and measure the costs,
Hillsborough County, Florida, 2002. Ann Epidemiol. 2004;14(9):640-645.
doi:10.1016/j.annepidem.2003.09.021
124. National Institute for Occupational Safety and Health. NIOSH Workplace Safety and
Health Topic: Young Worker Safety and Health. Published 2019. Accessed February 9,
2020. https://www.cdc.gov/niosh/topics/youth/default.html
125. National Research Council. Protecting Youth at Work: Health, Safety, and Development of
Working Children and Adolescents in the United States. The National Academies Press;
1998. doi:10.17226/6019
127. Breslin FC, Day D, Tompa E, Bhattacharyya S, Clarke J, Wang A. Systematic Review of
Risk Factors for Work Injury among Youth. Institute for Work & Health; 2005:101.
128. Windau J, Samuel M. Occupational injuries among young workers. Monthly Labor Rev.
Published online 2005:11-23.
129. Bena A, Leombruni R, Giraudo M, Costa G. A new Italian surveillance system for
occupational injuries: characteristics and initial results. Am J Ind Med. 2012;55(7):584-
592. doi:10.1002/ajim.22025
130. Breslin FC, Smith P. Age-related differences in work injuries: a multivariate, population-
based study. Am J Ind Med. 2005;48(1):50-56. doi:10.1002/ajim.20185
131. Centers for Disease Control and Prevention. Nonfatal Occupational Injuries and Illnesses
Treated in Hospital Emergency Departments--United States, 2003. MMWR Morb Mortal
Wkly Rep. 2006;55(449-452):313.
132. Hanvold TN, Kines P, Nykänen M, et al. Occupational Safety and Health Among Young
Workers in the Nordic Countries: A Systematic Literature Review. Saf Health Work.
Published online December 2018. doi:10.1016/j.shaw.2018.12.003
133. Salminen S. Have young workers more injuries than older ones? An international literature
review. J Safety Res. 2004;35(5):513-521. doi:10.1016/j.jsr.2004.08.005
134. Hanvold TN. Young Workers and Sustainable Work Life: Special Emphasis on Nordic
Conditions. Nordic Council of Ministers; 2016.
135. Paterson JL, Clarkson L, Rainbird S, Etherton H, Blewett V. Occupational fatigue and
other health and safety issues for young Australian workers: an exploratory mixed
methods study. Ind Health. 2015;53(3):293-299. doi:10.2486/indhealth.2014-0257
137. Miller ME, Kaufman JD. Occupational injuries among adolescents in Washington State,
1988-1991. Am J Ind Med. 1998;34(2):121-132. doi:10.1002/(SICI)1097-
0274(199808)34:2<121::AID-AJIM4>3.3.CO;2-#
139. Dragano N, Barbaranelli C, Reuter M, et al. Young Workers’ Access to and Awareness of
Occupational Safety and Health Services: Age-Differences and Possible Drivers in a
Large Survey of Employees in Italy. Int J Environ Res Public Health. 2018;15(7):1511.
doi:10.3390/ijerph15071511
140. Runyan CW, Schulman M, Dal Santo J, Bowling JM, Agans R, Ta M. Work-related
hazards and workplace safety of US adolescents employed in the retail and service sectors.
Pediatrics. 2007;119(3):526-534. doi:10.1542/peds.2006-2009
141. Nielsen ML, Dyreborg J, Kines P, Nielsen KJ, Rasmussen K. Exploring and Expanding
the Category of ‘Young Workers’ According to Situated Ways of Doing Risk and
Safety—a Case Study in the Retail Industry. Nord J Work Life Stud. 2013;3(3):219-234.
doi:10.19154/njwls.v3i3.3019
142. Mujuru P, Mutambudzi M. Injuries and Seasonal Risks among Young Workers in West
Virginia - A 10-Year Retrospective Descriptive Analysis. AAOHN J. 2007;55(9):381-387.
doi:10.1177/216507990705500906
143. Laberge M, Ledoux E. Occupational health and safety issues affecting young workers: A
literature review. Work- J Prev Assess Rehabil. 2011;39(3):215-232. doi:10.3233/WOR-
2011-1170
144. Parker DL, Carl WR, French LR, Martin FB. Nature and incidence of self-reported
adolescent work injury in Minnesota. Am J Ind Med. 1994;26(4):529-541.
doi:10.1002/ajim.4700260410
145. Brooks DR, Davis LK, Gallagher SS. Work-related injuries among Massachusetts
children: A study based on emergency department data. Am J Ind Med. 1993;24(3):313-
324. doi:10.1002/ajim.4700240308
146. Graves JM, Sears JM, Vavilala MS, Rivara FP. The burden of traumatic brain injury
among adolescent and young adult workers in Washington State. J Safety Res.
2013;45:133-139. doi:10.1016/j.jsr.2012.11.001
147. US Bureau of Labor Statistics. Survey of Occupational Injuries and Illnesses Data.
Published 2019. Accessed March 10, 2020. https://www.bls.gov/iif/soii-overview.htm
148. Brooks DR, Davis LK. Work-related injuries to Massachusetts teens, 1987–1990. Am J
Ind Med. 1996;29(2):153-160. doi:10.1002/(SICI)1097-0274(199602)29:2<153::AID-
AJIM5>3.0.CO;2-T
149. McCall BP, Horwitz IB, Carr BS. Adolescent Occupational Injuries and Workplace Risks:
An Analysis of Oregon Workers’ Compensation Data 1990–1997. J Adolesc Health.
2007;41(3):248-255. doi:10.1016/j.jadohealth.2007.02.004
152. SAIF. SAIF company information. Published 2020. Accessed March 7, 2020.
https://www.saif.com/x640.xml
153. Anderson NJ, Bonauto DK, Adams D. Prioritizing Industries for Occupational Injury and
Illness Prevention and Research, Washington State Workers’ Compensation Claims Data,
2002–2010. Safety and Health Assessment and Research for Prevention (SHARP)
Program, Washington State Department of Labor & Industries; 2013:1-58. Accessed
February 22, 2017. http://www.lni.wa.gov/Safety/Research/Files/bd_3F.pdf
155. Oregon State Legislature. Workers’ Compensation (2019 Edition); 2019. Accessed March
10, 2020. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html
156. Oregon State Legislature. Workers’ Compensation (2017 Edition); 2017. Accessed
November 20, 2017. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html
157. Department of Consumer and Business Services. State of Oregon Workers’ compensation
insurance and self insurance. Accessed April 4, 2020.
https://www.oregon.gov/dcbs/reports/compensation/insurance/Pages/index.aspx
159. Syron LN, Kincl L, Yang L, Cain DT, Smit E. Analysis of workers’ compensation
disabling claims in Oregon’s seafood preparation and packaging industry, 2007-2013. Am
J Ind Med. 2017;60(5):484-493. doi:10.1002/ajim.22706
160. US Census Bureau. North American Industry Classification System. Published 2020.
Accessed March 10, 2020. https://www.census.gov/cgi-
bin/sssd/naics/naicsrch?chart=2012
161. Centers for Disease Control and Prevention. National Occupational Research Agenda
(NORA): Sectors. Published February 4, 2019. Accessed March 10, 2020.
https://www.cdc.gov/nora/sectorapproach.html
162. Centers for Disease Control and Prevention. Occupational Injury and Illness Classification
System (OIICS). Published 2019. Accessed March 10, 2020.
https://wwwn.cdc.gov/Wisards/oiics/default.aspx
164. Anderson NJ, Bonauto DK, Adams D. Prioritizing industries for occupational injury
prevention and research in the Services Sector in Washington State, 2002–2010. J Occup
Med Toxicol. 2014;9(37):1-15. doi:10.1186/s12995-014-0037-2
165. US Bureau of Labor Statistics. Occupational Safety and Health Definitions. Published
2016. Accessed March 10, 2020. https://www.bls.gov/iif/oshdef.htm
166. Bureau of Labor Statistics. Consumer Price Index (CPI). Published 2017. Accessed May
17, 2017. https://www.bls.gov/cpi/
167. Fouquet N, Bodin J, Chazelle E, Descatha A, Roquelaure Y. Use of Multiple Data Sources
for Surveillance of Work-Related Chronic Low-Back Pain and Disc-Related Sciatica in a
French Region. Ann Work Expo Health. 2018;62(5):530-546.
doi:10.1093/annweh/wxy023
169. Yang L, Branscum A, Smit E, Dreher D, Howard K, Kincl L. Work-related injuries and
illnesses and their association with hour of work: Analysis of the Oregon construction
industry in the US using workers’ compensation accepted disabling claims, 2007-2013. J
Occup Health. 2020;62(1):e12118. doi:10.1002/1348-9585.12118
170. Graitcer PL. The development of state and local injury surveillance systems. J Safety Res.
1987;18(4):191-198. doi:10.1016/0022-4375(87)90082-X
171. Saha A, Sadhu HG. Occupational Injury Proneness in Young Workers: A Survey in Stone
Quarries. J Occup Health. 2013;55(5):333-339. doi:10.1539/joh.12-0150-OA
172. Runyan CW, Zakocs RC. Epidemiology and Prevention of Injuries Among Adolescent
Workers in the United States. Annu Rev Public Health. 2000;21(1):247-269.
doi:10.1146/annurev.publhealth.21.1.247
173. Belville R, Pollack SH, Godbold JH, Landrigan PJ. Occupational injuries among working
adolescents in New York State. JAMA. 1993;269(21):2754-2759.
174. Margolis KA. Underground coal mining injury: A look at how age and experience relate to
days lost from work following an injury. Saf Sci. 2010;48(4):417-421.
doi:10.1016/j.ssci.2009.12.015
175. Taylor EJ, Neis B, Messing K, Dumais L. Invisible: issues in women’s occupational
health. Resour Fem Res Tor. 1996;25(1/2):36-37.
176. Biddle J, Roberts K, Rosenman KD, Welch EM. What Percentage of Workers With Work-
Related Illnesses Receive Workers’ Compensation Benefits? J Occup Environ Med.
1998;40(4):325–331.
177. Shannon HS, Lowe GS. How many injured workers do not file claims for workers’
compensation benefits? Am J Ind Med. 2002;42(6):467-473. doi:10.1002/ajim.10142
Appendices
Appendix 1-1 Historical events in occupational safety and health (OSH) and the
1971 BLS implemented the national Survey of Occupational Injuries and Illnesses (SOII).
The SOII is a collaboration between BLS and States to collect nationwide annual data
on non-fatal occupational injuries and illnesses from a sample of private sector
employers based on OSHA recordkeeping requirements.
Later, in 1973 and 1978, BLS developed other data systems to supplement the SOII,
including the Supplementary Data System (SDS), which collected characteristics of
occupational injuries and illnesses and demographic information, and the Work Injury
Report surveys, which collected detailed causal information.
BLS has continued to improve the SOII. For example, starting in 1992, the SOII collected
data on the nature and circumstances of the injury or illness. In 2006, BLS began
publishing rates stratified by occupation, age and gender in addition to rates by
industry and establishment. In 2008, BLS expanded the SOII to include state and local
government workers.
1977 Mine Safety and Health Act of 1977 established Mine Safety and Health
Administration (MSHA). The Act specifies that NIOSH is responsible for conducting
research related to mine safety and health and making recommendations to MSHA.
1979 Since 1979, OSHA has used the Integrated Management Information System (IMIS) to
collect OSHA Enforcement, Consultation, and Whistleblower information. IMIS
contains publicly available information on workplace exposure measures from surveys
performed by OSHA officers. The IMIS database has recently been replaced by the
Occupational Safety and Health Information System (OIS), which incorporates OSHA's
various enforcement activities and provides a tool to help identify injury, illness and
fatality trends at the local and national levels.
1980s OSH surveillance began to adopt the concept and methodology of "sentinel health
events" in the 1980s. Supported by NIOSH, the State-based Sentinel Event
Notification System for Occupational Risks (SENSOR) program was established to build
OSH surveillance capacity within States to carry out active surveillance, investigation
and preventive intervention for specific occupational conditions, such as work-related
asthma and pesticide poisoning.
The SENSOR-Pesticides programs have been funded by NIOSH and the
Environmental Protection Agency (EPA) to conduct surveillance on one or more
occupational illnesses or injuries in 13 States as of 2020.
1980s In the late 1980s, NIOSH established the National Surveillance System of
Pneumoconiosis Mortality (NSSPM), providing annually updated information on all
pneumoconiosis related deaths in the US.
In the 1990s, NIOSH began the periodic publication of the Work-Related Lung Disease
Surveillance Report, which presents data from the NSSPM and other sources.
1985 NIOSH began collecting death certificates from all US States and the District of
Columbia to establish the National Traumatic Occupational Fatalities (NTOF)
surveillance system. The NTOF system contains 30 variables to describe characteristics
of the injury and the victim.
1987 The Adult Blood Lead Epidemiology and Surveillance System (ABLES) was
established in 1987 as a collaboration between NIOSH and four States. As of 2018, 37
States have maintained state ABLES programs to measure trends in work-related
adult blood lead levels to better target interventions and prevent lead exposures.
1984 & 1986 The US Congress held hearings in 1984 and 1986 regarding under-reporting and
credibility issues in occupational safety and health statistics.
The 1984 Congress hearing concluded "No reliable national estimates exist today, with
the exception of a limited number of substance-specific studies (such as on asbestos),
on the level of occupational disease, cancer, disability, or deaths." and "occupational
diseases data collection was fragmented, unreliable, and seventy years behind
communicable disease surveillance."
The 1986 Congress hearing stated that statistical information on occupational illnesses
is still grossly inadequate.
1987 To address concerns about the reliability of OSH statistics, Congress appropriated
funds, and the BLS requested the National Research Council to assess the validity and
quality of SOII data and other key issues related to OSH data collection. The resulting
report, "Counting injuries and illnesses in the workplace: proposals for a better system",
provided seminal guidance on how to organize and enhance US OSH surveillance.
1989 In 1986, NIOSH initiated a project to comprehensively evaluate its surveillance efforts.
Part of the results was published as a monograph in the American Journal of
Public Health in 1989, covering reviews of its surveillance systems at the time
as well as recommendations for further improvement.
1990s Since the early 1990s, NIOSH has been collaborating with the Consumer Product
Safety Commission to collect data on non-fatal occupational injuries through an
occupational supplement to the National Electronic Injury Surveillance System
(NEISS), resulting in the NEISS-Work system, which captures work-related injuries
treated in Emergency Departments (ED).
1992 BLS implemented the national Census of Fatal Occupational Injuries (CFOI). The
CFOI is a complete census that uses multiple data sources, including death certificates,
workers’ compensation reports, and federal and state agency administrative reports.
1996 The OSHA Data Initiative collected data on work-related illnesses and injuries from
employers within specific industries and with specific employment sizes from 1996 to
2011. The data were used by OSHA to calculate establishment-specific injury and
illness rates. OSHA provides the data in a searchable online database.
2004 The CSTE Occupational Health Surveillance Work Group developed Occupational
Health Indicators (OHIs) to help States establish fundamental surveillance capacity.
OHIs are quantitative measures of work-related illnesses, injuries, and factors
associated with occupational safety and health that can be generated using state-level
data; they were developed by the Work Group to reflect OSH priorities. In NIOSH's
long-term vision of a nationwide surveillance program, all States will have the core
capacity and infrastructure to conduct surveillance of occupational injuries and
illnesses. The initial 19 OHIs were piloted in 2004 among 13 States, which generated
indicators using data from 2000. Since then, NIOSH has been funding State-based OSH
surveillance programs to conduct OHI surveillance and other surveillance activities. As
of 2020, NIOSH has been funding 26 States, including eight fundamental surveillance
programs (able to establish and maintain surveillance of a minimum of 15 OHIs),
eleven fundamental plus surveillance programs (having established a fundamental
surveillance program and demonstrated capacity in OSH surveillance) and seven
expanded surveillance programs (with longer tenure or demonstrated experience in
conducting OSH surveillance), conducting surveillance on a total of 24 OHIs.
2010 NIOSH places importance on leveraging existing surveys and data systems to
supplement the limited resources for OSH surveillance. For example, NIOSH has
worked with NCHS to add questions concerning the work-relatedness of conditions
and workplace exposures to the National Health Interview Survey (NHIS) in 2010 and
2015, and to the Asthma Supplement of the Behavioral Risk Factor Surveillance
System (BRFSS).
Other survey systems NIOSH has been working with include the National Birth
Defects Prevention Study (NBDPS), the National Ambulatory Medical Care Survey
Asthma Supplement, the National Crime Victimization Survey (NCVS), and the
National Health and Nutrition Examination Survey (NHANES).
2010 In 2010, OSHA made the Chemical Exposure Health Data (CEHD) publicly available,
which includes industrial hygiene measurements and related information from a variety
of sources. Currently, only the analyses of samples from the Salt Lake City laboratory
are available.
2018 The National Academies of Sciences, Engineering, and Medicine, at the joint request
of NIOSH, BLS and OSHA, published a study report titled "A smarter national
surveillance system for occupational safety and health in the 21st century". The report
provides a comprehensive review of current issues with nationwide OSH surveillance
and proposes suggestions for a more coordinated, cost-effective approach to OSH
surveillance in the US.
The authors proposed the concept of a "smarter OSH surveillance system" that
demonstrates efficiencies, integrates strategies across multiple data sources, coordinates
efforts across key surveillance agencies, and applies domain knowledge effectively to
interpret data and to deliver insights to key stakeholders.
Resources:
[1] National Academies of Sciences, Engineering, and Medicine. A Smarter National Surveillance System for
Occupational Safety and Health in the 21st Century. The National Academies Press; 2018.
[2] National Research Council. Counting Injuries and Illnesses in the Workplace: Proposals for a Better System. The National
Academies Press; 1987.
[3] National Institute for Occupational Safety and Health. Tracking Occupational Injuries, Illnesses, and Hazards:
The NIOSH Surveillance Strategic Plan. U.S. Department of Health and Human Services, Public Health Service,
Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health; 2001.
[4] Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in Occupational Health and
Environmental Public Health Surveillance. Annu Rev Public Health. 2011;32(1):109-132.
[5] Centers for Disease Control and Prevention (CDC). Pesticide Illness & Injury Surveillance. Published 2017.
[6] Baker EL. Sentinel Event Notification System for Occupational Risks (SENSOR): the concept. Am J Public
Health. 1989;79(Suppl):18-20.
[7] Centers for Disease Control and Prevention (CDC). Fatal occupational injuries–United States, 1980-1994. MMWR Morb
Mortal Wkly Rep. 1998;47(15):297-302.
[8] National Institute for Occupational Safety and Health (NIOSH). Adult Blood Lead Epidemiology and
Surveillance (ABLES). Published 2018.
[9] Biddle EA, Marsh SM. Comparison of two fatal occupational injury surveillance systems in the United States. J
Safety Res. 2002;33(3):337-354.
[10] National Institute for Occupational Safety and Health (NIOSH). Fatality Assessment and Control Evaluation
(FACE) Program. Published 2019.
[11] Froines J, Wegman D, Eisen E. Hazard surveillance in occupational disease. Am J Public Health.
1989;79(Suppl):26-31.
[12] Lavoue J, Friesen MC, Burstyn I. Workplace Measurements by the US Occupational Safety and Health
Administration since 1979: Descriptive Analysis and Potential Uses for Exposure Assessment. Ann Occup Hyg.
2013;57(1):77-97.
[13] Bureau of Labor Statistics (BLS). Census of Fatal Occupational Injuries (CFOI). Published 2012.
[14] Council of State and Territorial Epidemiologists. The Role of the States in a Nationwide, Comprehensive
Surveillance System for Work-Related Diseases, Injuries and Hazards: A Report from NIOSH-States Surveillance
Planning Work Group. Council of State and Territorial Epidemiologists (CSTE); 2001.
[15] National Institute for Occupational Safety and Health (NIOSH). NHIS Occupational Health Supplement: OHS
Questionnaire Content. Published March 4, 2019.
[16] National Institute for Occupational Safety and Health (NIOSH). OHS Behavioral Risk Factor Surveillance
System (BRFSS). Published December 3, 2019.
Appendix 1-2 Guidelines and frameworks for evaluating public health surveillance systems identified through a
thorough literature review
A method for evaluating systems of epidemiological surveillance
Citation: Thacker SB, Parrish RG, Trowbridge FL. A method for evaluating systems of epidemiological surveillance. World Health Stat Q Rapp Trimest Stat Sanit Mond. 1988;41(1):11-18.
Description: Discusses evaluation of public health surveillance systems using a set of attributes.
Components: collection, analysis, dissemination, quality control
Attributes: cost, sensitivity, usefulness, specificity, representativeness, timeliness, simplicity, flexibility, acceptability
Protocol for the Evaluation of Epidemiological Surveillance Systems
Citation: World Health Organization. Protocol for the Evaluation of Epidemiological Surveillance Systems. Geneva: World Health Organization; 1997.
Description: Aimed to provide a guide for assessing an existing surveillance system. It discusses evaluation steps and methods. It also includes definitions and some example measures for surveillance attributes (referred to as system capacity).
Components: objectives of the system, population under surveillance, event, priority, event detection, reporting, decision making and action taken, feedback, resources, staffing, equipment, budget
Attributes: coverage, impact, availability, utility, sensitivity, specificity, representativeness, timeliness, simplicity, flexibility, acceptability, completeness, usefulness
Guidelines for the use of performance indicators in rinderpest surveillance programmes: A part of the Global Rinderpest Eradication Programme (GREP)
Citation: Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. Guidelines for the Use of Performance Indicators in Rinderpest Surveillance Programmes: A Part of the Global Rinderpest Eradication Programme (GREP). International Atomic Energy Agency, Animal Production and Health Section; 2000.
Description: This report focuses on national-level rinderpest surveillance programs for countries in GREP. Major components for different types of surveillance are suggested. Performance indicators are proposed and described. Example indicator criteria are included.
Components: objective, surveillance methodology, data collection, data manipulation, data analysis, data interpretation, structure, staff, materials
Attributes: sensitivity, specificity, data quality, timeliness
Evaluating Public Health Surveillance
Citation: Romaguera RA, German RR, Klaucke DN. Evaluating Public Health Surveillance. In: Teutsch SM, Churchill RE, eds. Principles and Practice of Public Health Surveillance. 2nd ed. Oxford; New York: Oxford University Press; 2000. (An older version of the chapter in the 2010 edition.)
Description: This textbook chapter explains the evaluation steps and attributes. Important concepts and considerations for each step are discussed. It includes definitions and methods for evaluating attributes. Attributes beyond the CDC updated guidelines are also discussed. Example evaluations are provided.
Components: objective, event, public health importance, organization, quality control, surveillance methods, data source, stakeholder, report, retrieved data
Attributes: cost, cost-effectiveness, usefulness, simplicity, flexibility, acceptability, sensitivity, PVP, representativeness, timeliness
Updated guidelines for evaluating public health surveillance systems
Citation: Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems: recommendations from the guidelines working group. Morb Mortal Wkly Rep. 2001;50(No. RR-13):1-36.
Description: This is the most well-known guideline, applicable to all kinds of public health surveillance systems. It provides six evaluation steps, adapted from the Framework for Program Evaluation in Public Health [1]. Ten general evaluation attributes are recommended and discussed.
[1] Centers for Disease Control and Prevention (CDC), Program Performance and Evaluation Office. Framework for program evaluation in public health. Morbidity and Mortality Weekly Report. 1999;48(No. RR-11):1-40.
Components: stakeholders, data users, data providers, practitioners, providers, community, government agencies, professional and non-profit organizations, public health importance, objective, operation, population under surveillance, data management, data analysis, dissemination, confidentiality, standards, report, publication, presentation, poster, data release, resource, funding, personnel, training, supplies, cost
Attributes: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness
Protocol for the assessment of national communicable disease surveillance and response systems
Citation: World Health Organization, Department of Communicable Disease Surveillance and Response. Protocol for the Assessment of National Communicable Disease Surveillance and Response Systems. Guidelines for Assessment Teams. World Health Organization; 2001.
Description: This protocol was developed to guide the assessment of national-level communicable disease surveillance and response systems. Specific guidance is included on evaluating the overall structure and performance of surveillance activities.
Components: detection, registration, confirmation, reporting, early warning, routine reporting, analysis, interpretation, response, control, outbreak investigation, program adjustment, changes in policy and planning, feedback, evaluation and monitoring, setting standards, case definition, case management, standard procedures for investigation, training, supervision, communication system, resources (human, material, financial), communication, technology platform, active case finding
Attributes: acceptability, sensitivity, specificity
Quality assurance applied to animal disease surveillance systems
Citation: Salman MD, Stärk KDC, Zepeda C. Quality assurance applied to animal disease surveillance systems. Rev - Off Int Epizoot. 2003;22(2):689-696.
Description: This paper provides general considerations for evaluating animal health monitoring and surveillance systems. It introduces factors affecting surveillance quality and surveillance system elements. Methods such as scoring systems and quality indicators are recommended for evaluating surveillance systems.
Components: objectives, event, legislation and regulation, authorities, resources, target population, surveillance design, diagnostic methods, data management, data analysis, feedback, dissemination, protocol
Attributes: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness
Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group
Citation: Centers for Disease Control and Prevention (CDC). Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.
Description: This CDC guide is for the evaluation of public health outbreak surveillance systems (e.g., syndromic surveillance). It discusses four tasks in an evaluation and important attributes in each task. Emphasis is given to timeliness and the balance among other attributes (e.g., sensitivity).
Components: objectives, stakeholders, data providers, data users, practitioners, providers, government agencies, officials, organizations, commercial systems developers, operation, data transmission standards, data sources, data processing, data analysis, interpretation, investigation, outbreak detection, case definition
Attributes: timeliness, validity, sensitivity, PVP, PVN, data quality, representativeness, completeness, usefulness, flexibility, acceptability, portability, stability, reliability, cost, cost-effectiveness
Framework for the evaluation and assessment of EU-wide surveillance networks in 2006-2008
Citation: European Center for Disease Prevention and Control (ECDC). Framework for the Evaluation and Assessment of EU-Wide Surveillance Networks in 2006-2008. European Center for Disease Prevention and Control (ECDC); 2006.
Description: This framework was developed for planning an evaluation project of EU-wide surveillance networks. A specific protocol, including scope of work, evaluation team, data collection, evaluation process, timetable, performance indicators and measurements, is included, which can be referenced in other evaluations.
Components: coordination structure, decision-making, project management, administration, supervision, case definitions, data to be collected, data management, data protection, data access, data confidentiality, interoperability, data dissemination, data reporting, public health action, infection control procedures, laboratory procedures
Attributes: representativeness, timeliness, completeness of reporting, validity of data, quality assurance, workload, use of resources, usefulness
Communicable Disease Surveillance and Response Systems: Guide to Monitoring and Evaluating
Citation: World Health Organization (WHO). Communicable Disease Surveillance and Response Systems: Guide to Monitoring and Evaluating. Geneva: World Health Organization (WHO); 2006.
Description: WHO proposes a structural model for communicable disease surveillance systems. It includes a detailed introduction to each component and attribute. It also discusses evaluation steps for two types of evaluation, i.e., routine tracking and periodic assessment.
Components: core function, case detection, registration, confirmation, reporting, data analysis, interpretation, epidemic preparedness, response and control, feedback, standards and guidelines, training, supervision, communication facilities, resources, logistics, monitoring, evaluation, coordination, system structure, legislation, surveillance strategy, surveillance implementers and stakeholders, networking, partnership, event under surveillance, protocol
Attributes: completeness, timeliness, usefulness, simplicity, acceptability, flexibility, sensitivity, specificity, PVP, representativeness, transparency, compliance, reliability
Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas
Citation: Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas. BMC Public Health. 2008;8(1):146.
Description: This military framework is for evaluating syndromic surveillance systems. It incorporates characteristics of French and British military syndromic surveillance systems. Evaluation parameters (i.e., attributes) are specified for different types of evaluation corresponding to the development of a system (e.g., initial evaluation, final evaluation).
Components: (none listed)
Attributes: timeliness, validity, data quality, usefulness, costs, sensitivity, reliability, acceptability, portability, stability, flexibility, specificity, relevance, feasibility, consistency, security, availability, efficiency, interoperability, impact
Effective environmental public health surveillance programs: a framework for identifying and evaluating data resources and indicators
Citation: Malecki KC, Resnick B, Burke TA. Effective environmental public health surveillance programs: a framework for identifying and evaluating data resources and indicators. J Public Health Manag Pract. 2008;14(6):543-551.
Description: This framework is for the evaluation of surveillance data sources and indicators. It discusses different aspects to be considered in an evaluation, such as the scientific relevance and feasibility of an indicator. Attributes are listed as criteria for each aspect.
Components: system purpose, priority topic areas, resource, data sources
Attributes: validity, representativeness, trusted, availability, measurability, feasibility, accessibility, accuracy, reliability, robustness, data quality, sensitivity, cost-effectiveness, timeliness
Characteristics of national registries for occupational diseases: international development and validation of an audit tool (ODIT)
Citation: Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries for occupational diseases: international development and validation of an audit tool (ODIT). BMC Health Serv Res. 2009;9:194.
Description: Aimed to guide the evaluation of national registries of occupational diseases in EU countries. The audit tool includes indicators assessing a national registry system with monitor and/or alert functions.
Components: case investigation, case reporting, case registration, physicians, publication of alert information
Attributes: completeness, coverage, acceptability
OASIS: an assessment tool of epidemiological surveillance systems in animal health and food safety
Citation: Hendrikx P, Gay E, Chazel M, et al. OASIS: an assessment tool of epidemiological surveillance systems in animal health and food safety. Epidemiol Infect. 2011;139(10):1486-1496.
Description: A research project involving 10 experts, based on three existing methods.
Components: objectives, scope, organizational structure, central organization, field organization, diagnostic, laboratory, surveillance procedure, data management, coordination, supervision, training, integration, evaluation, resources, budget, quality assurance, standardization, protocol, active surveillance, passive surveillance, news bulletin, performance indicators
Attributes: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness, relevance, consistency, integration, suitability, accuracy
154
Evaluating an NCD-Related Surveillance System
Citation: Centers for Disease Control and Prevention (CDC). Evaluating an NCD-Related Surveillance System. Atlanta, GA: Centers for Disease Control and Prevention; 2013.
Description: A guide/tutorial for evaluating non-communicable disease (NCD) surveillance systems based on CDC's updated guidelines. It discusses the six steps recommended in the CDC updated guidelines. Methods and measures for evaluating attributes in the context of NCD surveillance are discussed.
Components: surveillance objective, stakeholders, practitioners, providers, community, data provider, data user, professional organization, government agency; public health importance, cost, use of data, event, case definition
Attributes: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, clarity
The EVA Tool developed in the RISKSUR Project
Citation: Calba C, Cameron A, Goutard F, et al. The EVA Tool: An Integrated Approach for Evaluation of Animal Health Surveillance Systems. RISKSUR Project; 2013. See also: Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the integrated evaluation of animal health ...
Description: The EVA tool provides comprehensive guidance for the evaluation of animal health surveillance systems. It lists structural/organizational, functional, and effectiveness attributes and considers the ranking of attributes. Example evaluation questions are provided. Economic evaluation is also considered.
Components: organization and management, training, evaluation, performance indicator, resource, data collection, sampling strategy, data management, internal communication, dissemination, laboratory testing, data analysis, quality assurance, multiple hazard, risk-based criteria definition
Attributes: availability, sustainability, acceptability, simplicity, flexibility, adaptability, coverage, portability, interoperability, representativeness, utility, accuracy, timeliness, PVP, PVN, repeatability, robustness, cost, impact, benefit, cost-effectiveness
Animal Health Surveillance Terminology: Final Report from Pre-ICAHS Workshop
Citation: Hoinville L. Animal Health Surveillance Terminology: Final Report from Pre-ICAHS Workshop. In: International Conference on Animal Health Surveillance; Vol 17. 2013:1-27.
Description: This terminology book is tailored to animal health surveillance. Three groups of terminologies are listed and defined: general terms, characteristics of surveillance activities, and attributes.
Components: surveillance purpose, scope of surveillance, unit of interest, stakeholders, management, personnel, organizational structure, basis of participation, legislation and regulation, species/breed, sampling strategy, population stream, disease focus, data source, data collection method, study design, case definition, data analysis, data providers, dissemination, data management, communication, laboratory management
Attributes: stability, sustainability, simplicity, flexibility, repeatability, coverage, representativeness, utility, completeness, accuracy, sensitivity, timeliness, bias, cost, impact, cost-effectiveness
Evaluating a National Surveillance System
Citation: UNAIDS/WHO Working Group on Global HIV/AIDS and STI Surveillance. Evaluating a National Surveillance System. Geneva: World Health Organization; 2013.
Description: This UNAIDS/WHO framework provides guidance for the evaluation of national-level HIV surveillance systems. It covers various aspects of evaluating HIV/AIDS surveillance and programs, such as inventory, design and implementation, and surveillance data quality. Case studies are also provided as illustrations.
Components: flow chart, level of integration, objectives, data sources, surveillance approach, activities, geographical dimension, data collection method, data providers, resources, human, financial, stakeholders, planners, implementers, data users, guidelines, case reporting, size estimation, protocol, quality standards, staffing, supervision, laboratory, equipment, ethical standards, data analysis, data dissemination, data use, data management
Attributes: flexibility, timeliness, coverage, completeness, accuracy, confidentiality, simplicity, data quality, acceptability, sensitivity, PVP, representativeness, stability, usefulness
European Data Quality Monitoring and Surveillance System Evaluation Handbook
Citation: European Center for Disease Prevention and Control (ECDC). Data Quality Monitoring and Surveillance System Evaluation - A Handbook of Methods and Applications. Stockholm: ECDC; 2014.
Description: A guide for the evaluation of public health surveillance systems in EU/EEA Member States. It includes a detailed introduction and guide on evaluating quality attributes in the context of communicable disease surveillance.
Components: case ascertainment, case reporting, objective, event, case definition, data source, data flow, networks, population under surveillance, geographical coverage, data management, data entry, data reporting process, database structure, data quality monitoring, legislation, stakeholders, data provider
Attributes: completeness, validity, sensitivity, specificity, PVP, PVN, timeliness, usefulness, representativeness, simplicity, flexibility, acceptability, stability, reliability
Development of an index system for the comprehensive evaluation of the public health emergency events surveillance system in China
Description: This article reports a comprehensive evaluation index system for the China Public Health Emergency Events Surveillance System (CPHEESS).
Components: structure, data processing, data utilization
SERVAL (SuRveillance EVALuation framework)
Citation: Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Gunn G, Stärk KDC. SERVAL: a new framework for the evaluation of animal health surveillance. Transbound Emerg Dis. 2015;62(1):33-45.
Description: A generic framework for evaluating animal health surveillance. It discusses evaluation steps and key questions related to each step. Guides on attributes and their selection are included. It also includes a guide on economic analysis for the system.
Components: conditions under surveillance, current situation, surveillance objectives, target population, structure of the system, design of the system, system users, organizational structure, communication, data analysis, data collection, data management, management
Attributes: benefit, bias, cost, coverage, data completeness, flexibility, historical data, accessibility, impact, laboratory, multiple utility, participation, precision, repeatability, representativeness, sensitivity, specificity, stability, sustainability, timeliness
Brief Evaluation Protocol for Public Health Surveillance Systems
Citation: Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center, University of Nebraska; 2016.
Description: This protocol provides a practical tool based on the CDC updated guidelines (2001) to assist the evaluation of a public health surveillance system in a short period of time and with limited resources. Examples and working tables are included to illustrate evaluation steps and the evaluation of attributes.
Components: stakeholders, staff, data providers, data managers, data users, general public, community members, system purpose, goals, standards, guidance, structure, reporting, resource, budget, personnel (staff members/FTEs), training, computer
Attributes: cost, simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness, mutual understanding
[Figure: conceptual model of OSH surveillance, with components including Infrastructure, Surveillance Strategy, Data Sources, and Stakeholders.]
Adapted from:
[1] Centers for Disease Control and Prevention. Updated guidelines for evaluating public health surveillance systems: recommendations from the guidelines working group. Morb Mortal Wkly Rep. 2001;50(No. 13):1-36.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center, University of Nebraska; 2016.
Component and sub-component definitions, with mean expert ratings in parentheses:

Infrastructure (4.6): Basic structure and foundations for establishing and maintaining a surveillance system. It should be flexible and able to change/evolve as the system grows and as needed.
- Legislation & regulations (3.8): Legal requirements and stipulations supporting the establishment of the surveillance activities and mandatory requirements that the system should follow.
- Standards & guidelines (4.8): Available standards and guidelines identified to guide the development, implementation, maintenance and evaluation of the surveillance system.
- Funding resources (4.5): Funding sources, funds available for developing and implementing basic and advanced/expanded surveillance activities, and funding continuity.
- Organizational structure & human resources (4.1): Organization of the surveillance system and human resources, including the system's leadership, staff, communication and coordination mechanism.
- Material resources (4.6): Logistics, material items and technological resources vital to the surveillance system.

Surveillance strategy (4.9): Guiding standards, methodologies and procedures for the establishment and maintenance of a surveillance system, including surveillance objectives, events under surveillance and case definitions, available data sources and surveillance techniques.
- Surveillance objectives (4.4): Objectives and priorities relevant to the current and emerging occupational safety and health needs should be clearly stated to provide an overall guide to the surveillance system. Objectives should be specific, adaptable to changing needs, and measurable.
- Events under surveillance (4.2): Selection and prioritization of occupational safety and health cases/events under surveillance in accordance with the surveillance objectives. For each event, the case definition is clear and valid without ambiguity. Available data sources and analysis methods are specified.
- Surveillance protocols (4.8): Detailed protocols established to regulate surveillance activities, including the list of prioritized events under surveillance, case definitions, applicable data sources, data collection and processing methods, frequency of reports and ways for data dissemination, as well as the data quality control process and periodic evaluation mechanism.

Data sources:
- Mandatory reports (4.6): Certain diseases are required to be notified to state health departments or Poison Control Centers by health care providers, hospitals and emergency departments, and clinical laboratories. The list of notifiable diseases is historically geared toward infectious diseases but now includes some occupational and environmental health conditions and exposures, such as lead poisoning.
- Administrative data (4.6): Includes hospital discharge data, emergency department data, hospital outpatient data, workers' compensation data, OSHA records, Coast Guard records, etc. All-payer claims data and electronic health records are also potential data sources.
- Registry data (4.2): Includes birth and death certificates, cancer registries, birth defect registries, trauma registries, burn registries, etc.
- Census data (4.6): Provides injury and illness data such as the Census of Fatal Occupational Injuries (CFOI), and demographics data such as the Quarterly Census of Employment and Wages (QCEW).
- Survey data (4.2): Includes state and national Survey of Occupational Injuries and Illnesses (SOII) data, Behavioral Risk Factor Surveillance System (BRFSS) data, Youth Risk Behavior Surveillance System (YRBS), National Health Interview Survey (NHIS), National Health and Nutrition Examination Survey (NHANES), etc.
- Other data sources (3.8): Other potential data sources for OSH surveillance may include news reports and Google alerts. Social media and smart devices are promising new data sources.

Stakeholders (4.4): Those who provide data, funding or technical support; are involved in the surveillance activities, such as survey participants; are impacted by surveillance activities and results; or are surveillance data users.
- Federal agencies (4.4): Various federal agencies may play a role, including the Bureau of Labor Statistics (BLS), the National Center for Health Statistics (NCHS), the Occupational Safety and Health Administration (OSHA), the Mine Safety and Health Administration (MSHA), and the National Institute for Occupational Safety and Health (NIOSH). Federal agencies have responsibilities and programs pertaining to OSH surveillance. They may serve as major funding sources for OSH surveillance programs. They may also oversee and coordinate OSH surveillance activities.
- State agencies (4.6): State agencies such as state health departments, labor departments, workers' compensation bureaus/divisions, or employment security departments play a critical role in collecting, analyzing and disseminating data from local sources and in conducting various OSH surveillance systems in partnership with federal agencies and other national and local organizations.
- Employers & employees (3.8): Employers are asked to participate in the collection and submission of data relevant to their company and their industry, including injuries and illnesses that happen to their employees. Employees themselves may choose to (or not to) seek medical care, file a workers' compensation claim, or report poisoning or other work-related conditions. Employers' and employees' engagement may directly impact the quality and representativeness of the data collected. Employers and employees (e.g. unions) can also be helpful distributors of surveillance findings.
- Medical care providers (4.2): Include clinics/physicians and testing laboratories who make diagnoses and treat health conditions. They are required to report selected diseases and health conditions to state health agencies. Hospitals and emergency departments can provide important medical record data.
- Professionals and professional organizations (3.6): Provide technical support. For example, the occupational health advisory committee of the surveillance system provides subject matter expertise on surveillance activities and improvement. NGOs, businesses (e.g. Electronic Medical Record developers) and research institutions provide education programs, technological assistance or resources for promoting data dissemination and use. They may also provide data to the surveillance system.
- Other stakeholders (3.9): Include communities and individuals who are affected by or benefit from the surveillance activities, and organizations and the general public who are interested in the surveillance and the data it generates.
Data processing (4.7): The function of the system to synthesize and analyze data at individual and aggregated levels and produce interpretable results. Protocols should be in place specifying data collection methods and procedures.
- Data collection (4.6): The process of collecting and recording data into the system. Different local and state health agencies, medical care providers, insurance companies, businesses and other organizations may be involved in data collection and curation before the surveillance system obtains access to the data.
- Provide timely data (3.5): In current OSH surveillance practice, there can be a lag of months to years before data become available. New and emerging data sources such as syndromic data systems, electronic records data, social media or other crowdsourcing data may provide hope for more timely data.
- Detect clusters & unusual events (3.8): Clusters or unusual events, such as multiple injuries/illnesses happening in the same workplace at the same time, may signal unsafe working conditions. More timely data can help to detect clusters and unusual events quickly.
- Guide immediate actions (3.5): More timely data help to guide immediate actions to prevent more events from happening to workers. Data can be provided to relevant organizations if the surveillance system does not have the action capacity. For example, OSHA may use the information to guide workplace inspections for workplace hazards.
- Track burdens & trends (4.7): Measure burdens of work-related injuries or illnesses (e.g. frequency, rate, and severity) and monitor trends over time and space.
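Frequency and rate measures of burden are commonly expressed as incidence rates per 100 full-time-equivalent workers, using the BLS/OSHA convention that 200,000 hours represents 100 workers at 40 hours per week for 50 weeks. A minimal sketch (the function name and example figures are illustrative):

```python
def incidence_rate_per_100_fte(cases: int, hours_worked: float) -> float:
    """BLS/OSHA-style incidence rate: cases per 100 full-time workers.

    200,000 hours = 100 workers x 40 hours/week x 50 weeks/year.
    """
    return cases * 200_000 / hours_worked

# e.g. 12 recordable injuries among employees who worked 480,000 hours
rate = incidence_rate_per_100_fte(12, 480_000)  # 5.0 per 100 FTE
```

The same rate can be stratified by industry, occupation, or demographic group to support trend monitoring over time and space.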
- Identify populations at risk (4.7): Identify industries, occupations, and worksites, as well as populations defined by sociodemographic characteristics or work arrangements, at high risk for work-related injury, illness, or hazardous exposures.
- Detect workplace hazards (4.8): Detect workplace hazards and facilitate the investigation of diseases/events linked to occupational exposures.

Data dissemination (4.6): The function to routinely and actively disseminate data and information to the public to guide actions on improving occupational safety and health.
- Produce dissemination materials (4.3): Consistently produce data and information for dissemination. Customizing contents and formats relevant to targeted users promotes the use of information for action by stakeholders.
- Dissemination timing (4.1): Data dissemination on a routine and timely basis. Also consider flexibility in the plan.
- Dissemination channels/mechanisms (4.3): Plan the dissemination mechanism and maximize dissemination channels. Wide collaboration with local, state and national government departments, institutions, industry associations, employee unions, and other agencies helps to identify more ways and channels for dissemination.
- Surveillance evaluation (4.0): Evaluation of the system periodically and when needed throughout the surveillance system's life cycle. Important to ensure that the planned activities are on track and the surveillance objectives are being achieved. Findings and recommendations resulting from the evaluation should be disseminated and utilized to improve the system.

Publications (4.8): Data (e.g. summary statistics, case investigations, trend analyses) and interpretable information generated routinely and on demand need to be reported in various forms, depending on the surveillance purposes and dissemination methods.
- Technical reports (4.2): Describe the process, progress, and/or results of surveillance activities, including data collection, generation and interpretation, in a scientific manner. Technical reports may include findings and recommendations.
- Conference presentations (4.2): A way to disseminate surveillance findings and connect with academic researchers and professionals from various fields, who could potentially be data users and collaborate with the surveillance system.

Datasets (3.8): Individual-level and aggregated data records released by the surveillance system to the public and interested organizations/individuals for research and other purposes.

Other outputs (3.8): Other outputs of the surveillance system may include research and outreach, intervention and prevention guidance, strategic policy recommendations, etc., depending on the surveillance system's objectives and available resources.

Short & mid-term outcomes:
- Support research (3.7): Provides data for hypothesis generation and supports applied research and epidemiological studies.
- Guide intervention programs (4.7): Data and information produced help to guide the planning, implementation, and evaluation of actions and programs intended to prevent and control work-related injuries, illnesses, and hazardous workplace exposures.
- Increase awareness and knowledge (4.5): Data and information produced help to increase awareness and knowledge of occupational safety and health among public health and occupational health professionals, decision makers, and working populations.

Long-term outcomes:
- Policy changes at national/state/local level (4.5): Surveillance data and findings inform policy-making at different administrative levels, which addresses issues and gaps in the occupational safety and health field.
- Safer workplace (4.9): Results and findings of the surveillance system help to create a safer workplace, where hazardous exposures are eliminated and interventions to promote workers' health are implemented.
- Reduced workplace injuries, illnesses & diseases (4.9): The ultimate goal and overarching outcome of the surveillance system is to lead to reduced work-related injuries, illnesses, and diseases.
Attribute definitions by component, with mean expert ratings in parentheses:

Infrastructure:
- Legislative support (3.6): Existence of legal and regulatory requirements to support the establishment and maintenance of the surveillance system and its activities.
- Compliance (4.2): Degree to which the surveillance system complies with all relevant legislation, regulations and policies, including ethics and confidentiality requirements.
- Sustainability (4.8): The system is able to sustain itself with adequate financial, human and material resources.

Surveillance strategy:
- Confidentiality (4.5): Privacy and data confidentiality requirements for the collection, storage, backup, transport and retrieval of information (especially over the internet), based on relevant standards and guidelines.
Outputs:
- Accessibility (4.5): Ability of the surveillance system to make data and information accessible to those who need it, when they need it.
- Usability (4.6): Data and products produced by the surveillance system should be tailored to meet the needs of intended users.

Outcomes:
- Usefulness (4.7): Ability of the surveillance system to directly and indirectly contribute to the prevention and control of adverse occupational health conditions and improve workers' safety and health.

System related:
- Simplicity (3.8): A guiding principle in designing and implementing the surveillance system, including the organizational structure, the surveillance strategies, information flow from data providers to data users, and coordination of various surveillance activities. Simplicity is conducive to effectiveness and efficiency. All components of the surveillance system should be designed to avoid unnecessary complexity.

Data quality:
- Validity (4.3): Degree to which information correctly describes the health-related event it was designed to measure.
- Sensitivity (4.1): The ability of the surveillance system to capture true cases/events or outbreaks/clusters.
- Specificity (4.1): The proportion of individuals not having the occupational health-related event that the system identifies as not having the event.
- Predictive Value Positive (PVP) (4.1): The proportion of reported cases/events that are actually true cases/events.
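Sensitivity, specificity, and PVP as defined here can be computed from a 2x2 comparison of system-reported cases against a gold standard (e.g. medical record review). A minimal sketch, with hypothetical counts:

```python
def surveillance_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Attribute measures from a 2x2 table comparing the surveillance
    system against a gold standard (counts: true/false positives,
    false/true negatives)."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases captured by the system
        "specificity": tn / (tn + fp),  # non-cases correctly excluded
        "pvp": tp / (tp + fp),          # reported cases that are true cases
    }

m = surveillance_accuracy(tp=80, fp=20, fn=20, tn=880)
# sensitivity 0.80, PVP 0.80, specificity about 0.98
```

Note that PVP depends on the underlying prevalence of the event, so the same system can show very different PVP across populations even when sensitivity and specificity are unchanged.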
- Representativeness (4.6): The extent to which data adequately represent the population under surveillance and relevant sub-populations by time, place, population demographics and socio-demographics.
- Consistency (4.5): Data have the same meanings (e.g. case definition, diagnosis standards) to allow for consistent interpretation. This includes internal consistency (within a dataset) and external consistency (across different data sources), as well as consistency over time.
- Completeness (4.1): Required variables recorded in the surveillance datasets are complete without missing data.
- Accuracy (4.7): Data, statistics, and information produced in the surveillance datasets are accurate without errors.
- Clarity (4.2): Data, statistics, and information are coded/presented clearly without ambiguity. For example, variables are named appropriately, and the data dictionary is clear and easy to follow.
Example measures by attribute:

Legislative support:
- Are there mandatory requirements on the establishment of the system?
- Are there mandatory requirements on data reporting/collection for the events/cases under surveillance?

Compliance:
- Is the system in compliance with all legal and regulatory requirements?

Sustainability:
- Is funding secure for the short-term (e.g. 3-5 years) and long-term (e.g. more than 5 years) future?
- Are the necessary human resources (e.g. staffing) sustainable for the short-term and long-term future?
- Are the necessary material resources (e.g. technical infrastructure) sustainable for the short-term and long-term future?
- Can the system maintain and expand collaborations in the future (e.g. for data collection, analysis and dissemination)?
- Can financial, human and material resources support the system to adapt to envisioned changes and/or to expand activities?
Confidentiality:
- Are privacy and confidentiality requirements clearly specified in protocols/documentation?
- Are the requirements reviewed regularly and updated as necessary?
- Are well-established methodologies in place to secure confidentiality in data collection and transport, especially over the internet?
- Is there a process to check for and document deviations and corrective actions when privacy or confidentiality is breached?

Significance:
- Does the system identify significant OSH issues (i.e. potentially catastrophic, associated with large economic and/or social impacts, or of concern to the public or industries)?
- Once issues are identified, is there a process to set objectives and prioritize events to surveil (e.g. with an advisory board or stakeholders, through consensus, or through grant peer review)?
- How does the system adjust its objectives and priorities based on emerging OSH issues?
- Are there past examples of the system reflecting significant occupational safety and health concerns?
- Are populations at risk (defined by age, ethnicity, size, spatial distribution, etc.) identified and tracked in the system on an ongoing basis?
Feasibility:
- Do the system strategies correspond to objectives and priorities?
- Do necessary outside resources exist (e.g. steering committee, technical committee, government affiliation)?
- Do the staff have the required knowledge, skills and experience to operate and maintain the system? Multidisciplinary expertise may be needed for modern-day surveillance.
- Are the surveillance events/cases trackable over time given the data sources available?
- Do the resources available match the strategies and processes (data collection, data analysis, data dissemination, etc.)?

Transparency:
- Are important details (e.g. surveillance strategies, planned activities, methodologies for data collection and analysis, changes) clearly documented in system protocols/documentation on a regular basis?
Acceptability:
- Are stakeholders aware of the existence of legal and mandatory requirements for the surveillance activities?
- Do stakeholders actively communicate with the system staff?
- Participation rate/response speed of stakeholders in a given surveillance activity (e.g. meeting, data request, survey).
- The level of willingness of stakeholders to collaborate with the surveillance system.

Mutual understanding:
- Are responsibilities clearly specified for staff and collaborators in the surveillance system?
- Is there a mutual understanding of the security, confidentiality and privacy process between the system and stakeholders (e.g. data providers, data users)?
- Has a trusted relationship been established between the surveillance system and its collaborators?

Mutual benefit:
- Are opportunities created for collaborators to benefit from the system?

Relevance:
- Is there an up-to-date logic model/flowchart that links the system's functions and objectives?
- Is each function/activity necessary to achieve objectives/meet priorities?
- Are performance indicators specified in the surveillance system? Example indicators include expected frequency of case reporting, data analysis and dissemination, or frequency of work meetings.
Adherence:
- Is there an auditing process to track how function protocols are followed?
- Are methods established for data collection and processing to ensure high-quality data?

Accessibility:
- Is there an appropriate plan for dissemination to increase accessibility, including channels and formats?
- Is the system able to respond to internal and external data requests? Balancing confidentiality requirements against the benefits of providing data to researchers and the public may be a challenge.

Usability:
- Are useful data interpretations provided to the audience beyond the data themselves (e.g. statistics)?
- Are products tailored to the needs of the end users (e.g. reading level, knowledge level, language, industry/occupation, disability needs, culture and formats of communication)?
- Can statistics be stratified by factors of interest, such as sex, age, ethnicity, socioeconomic status, geography, etc.?
- Does the system seek feedback on usability from stakeholders to guide improvement?
Usefulness:
- Does the system provide timely data to detect outbreaks/clusters/unusual events? Calculate the number/percentage of events detected.
- Do the surveillance data allow for investigation of clusters, outbreaks, and/or unusual events to guide timely mitigation action? Calculate the number/percentage investigated/actionable.
- Does the system measure the burden of work-related injuries or illnesses and monitor trends over time and space?
- Can the surveillance data guide identification of worker populations at risk?
- Does the system track leading indicators of occupational safety and health (e.g. workplace hazards)?
- Can the system detect new/emerging diseases and/or workplace hazards?
- Has data dissemination helped to increase workers' and employers' safety and health awareness?
- Does the system provide data to support epidemiological studies and applied research/practice?
- Can surveillance data guide the planning, implementation, and evaluation of interventions or programs intended to prevent and control work-related injuries, illnesses, and diseases?
- Have the surveillance data led to policy/regulation changes at the national/state/local level?
Simplicity:
- Is the case definition of events/conditions under surveillance simple and easy to understand?
- Does the system rely on sophisticated data collection methods, multiple data sources and/or multiple collaborators?
- Does the system involve active surveillance activities?
- Are system protocols easy to follow?
- Is the organizational structure simple but sufficiently effective to meet the surveillance needs?
- Does the system use accepted standards for data coding and data sharing so that merging and comparison within/outside the system is easy?
- Are processes and platforms (e.g. software, databases) for data reporting, entry, coding, merging, and sharing streamlined and easy to use?
- Is any part of the system unnecessarily complicated? Is there a better way to streamline the system?
Flexibility:
- Can the system respond to event/case definition changes?
- Can the system respond to changes and/or variations across existing data sources?
- Can the system easily incorporate new data sources?
- Can the system respond to new data standards (e.g. ICD-9 to ICD-10)?
- Can the system respond to changing surveillance technologies (e.g. online data sharing, machine learning techniques for data coding)?
- Can the system adjust surveillance objectives and strategies as needed (e.g. respond to new hazards/conditions)?
- Is there funding/personnel/material redundancy to support flexibility?
Timeliness:
- Time required for collaborating agencies to process data before they are transferred to the system.
- Time needed to investigate and analyze the data for mortality/morbidity, trends, outbreaks, and risk factors.
- Time needed for the surveillance system to release warning alerts (outbreaks, unusual events) and/or data/statistics reports.
- Time needed for the system to respond to requests (e.g. data requests, regular/urgent work needs).
- Overall time from events (exposure, seeking health information, medical care, and/or medicine) to data dissemination and eventually to public health actions.
- Is the time spent appropriate for the objectives of the system?
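These timeliness measures reduce to interval calculations between dated milestones. A minimal sketch of one such measure, the event-to-receipt lag, with hypothetical dates:

```python
from datetime import date
from statistics import median


def median_reporting_lag(events) -> float:
    """Median days from event occurrence to receipt by the system.

    `events` is an iterable of (event_date, received_date) pairs.
    """
    return median((received - occurred).days for occurred, received in events)


lag = median_reporting_lag([
    (date(2019, 1, 3), date(2019, 1, 20)),   # 17 days
    (date(2019, 2, 10), date(2019, 3, 2)),   # 20 days
    (date(2019, 5, 1), date(2019, 5, 31)),   # 30 days
])
# median lag: 20 days
```

The same pattern applies to the other intervals listed above (receipt to analysis, analysis to dissemination), and the median is often preferred over the mean because reporting lags tend to be right-skewed.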
Stability Can the system continue its planned activities with staff turnover or other
resource problems?
Are data sources and data collection collaboration stable to ensure ongoing
surveillance?
Can the system maintain its activities under changing situations such as coding
system changes, technological updating, switching technical platforms?
Integration Is the system connected with other public health systems within or outside the
same organization (in terms of data collection/processing/sharing)?
Does the system follow standardized data collection and sharing for integration
ease?
Are problems/opportunities identified with functions as they relate to each other?
Are potential issues addressed for a successful integration (e.g. to avoid
annexation)?
Are there data sharing mechanisms among different surveillance functions, which
include what information to share, to whom the information is communicated, and
the data sharing platform?
Effectiveness Is the system effective in accomplishing its stated objectives?
Are various methods used to increase effectiveness, such as shared efforts on data
collection/processing/dissemination through internal and external collaboration?
Percentage of planned tasks (e.g. obtaining source data, outbreak detection & case
investigation, data analysis, producing data reports) accomplished by due dates.
What are the costs (fixed and variable) associated with each surveillance activity
in the system?
Cost-effectiveness What are the costs (direct and indirect, individual and societal) for
injuries/illnesses/events under surveillance?
Is cost-effectiveness (financial and social) a consideration in decision-making for
objectives and priorities, and the selection of surveillance methodologies?
Is there up-to-date cost-effectiveness analysis done for the system?
Validity Is there standard scientific evidence that the event under surveillance is work-
related, or significantly associated with occupational factors?
Validity of the survey constructs.
Are the data of sufficient quality to generate a valid estimate?
Are biases introduced in the data collection process (e.g. case reporting, data
coding, method to identify work-relatedness) that affect the validity of measurements?
Sensitivity The likelihood/percentage that workers will seek medical care or compensation
and be reported to/captured by the surveillance system.
Sensitivity of the screening/diagnosis procedure.
The proportion/number of cases/events captured by the surveillance system
compared to certain standard OSH data source(s).
Is there any active approach used to help detect true cases/events?
Can the system accurately detect clusters, outbreaks or unusual events in an
appropriate time frame?
Specificity Specificity of the screening/diagnosis procedure.
The proportion/number of non-cases/events correctly determined by the
surveillance methodology/algorithm (e.g. by comparing two methodologies
identifying work-related cases).
Predictive Value Positive The proportion/number of cases/events captured by the surveillance system that
(PVP) are confirmed as true cases/events.
Is there an active approach used to verify cases/events?
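The sensitivity, specificity, and PVP measures above amount to a 2x2 comparison between cases captured by the system and a reference standard (e.g. a second OSH data source used for verification). A minimal sketch with purely illustrative counts:

```python
def surveillance_metrics(tp, fp, fn, tn):
    """2x2 comparison of system-captured cases against a reference standard
    (e.g. a second OSH data source used to verify case capture)."""
    sensitivity = tp / (tp + fn)  # share of true cases the system captured
    specificity = tn / (tn + fp)  # share of non-cases correctly excluded
    pvp = tp / (tp + fp)          # share of captured cases confirmed as true
    return sensitivity, specificity, pvp

# Illustrative counts only, not from any real system
sens, spec, pvp = surveillance_metrics(tp=80, fp=20, fn=40, tn=860)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PVP={pvp:.2f}")
# sensitivity=0.67 specificity=0.98 PVP=0.80
```

Note that for surveillance data a complete 2x2 table is rarely observable; in practice sensitivity is often estimated by capture-recapture across two or more data sources rather than from known true-negative counts.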
Representativeness Does the system reflect the characteristics of working populations under
surveillance?
Does the system reflect the change of population characteristics over time?
Are there sub-populations of interest excluded or under-reported?
Consistency Do data collected by different entities within the system follow consistent and
rigorous formats?
Do data collected from different sources share a common methodology (e.g. case
definition, coding systems, and classifications)?
Are data coded the same way over time to track trends?
Completeness Proportion of missing or blank data?
What is the reason for missing data?
What is the impact of missing data?
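The completeness measures above can be operationalized as the share of records with a non-missing value for each key variable. A small sketch over hypothetical claim records (the field names are invented for illustration):

```python
def completeness(records, fields):
    """Percent of records with a non-missing, non-blank value per key variable."""
    n = len(records)
    return {f: 100.0 * sum(1 for r in records if r.get(f) not in (None, "")) / n
            for f in fields}

# Hypothetical claim records; field names are invented for illustration
claims = [
    {"industry": "1133", "occupation": "logger", "injury_type": "struck by"},
    {"industry": "2381", "occupation": "",       "injury_type": "fall"},
    {"industry": None,   "occupation": "roofer", "injury_type": "fall"},
    {"industry": "6231", "occupation": "aide",   "injury_type": ""},
]
print(completeness(claims, ["industry", "occupation", "injury_type"]))
# {'industry': 75.0, 'occupation': 75.0, 'injury_type': 75.0}
```

Tracking these rates per variable and per data source helps answer the follow-up questions on the reason for, and impact of, missing data.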
Accuracy Are proper quality control measures in place to identify and correct errors?
Is there data corruption due to incorrect data merging or conversion?
Clarity Are variables coded clearly and consistently?
Does the surveillance system keep a clear and easy-to-follow, up-to-date data
dictionary?
Are data provided by the system in easy-to-understand formats, accompanied by
accurate data dictionaries?
Are educational materials and other materials for dissemination formatted clearly
and precisely?
Appendix 3-5 Template working tables for developing evaluation methods for each selected attribute and measure
Table S1 demonstrates how methods are developed for each attribute by selecting measures and developing evidence
collection methods, data analysis and evaluation criteria (if applicable) for each measure. Table S2 further sorts evaluation methods by
Attribute: Flexibility
Definition/description: Surveillance strategies and methods are suitable and applicable to the surveillance objectives and available resources.
Measure: 1) Whether it accommodates updates and changes in data sources and coding standards?
Evidence collection methods:
· Check previous changes in surveillance activities;
· Interview key surveillance staff for experience/process on adopting changes and possible problems with it;
· Interview main data providers for information on any changes in the past years and future;
Data analysis and evaluation criteria:
· Describe possible changes;
· Describe ways the system works with possible changes
Other attributes
…
…
…
Participant: Data provider (source #1) [Position title/Name of the agency]
Attribute and measure: Flexibility - 1) Whether it accommodates updates and changes in data sources and coding standards?
Questions (open questions):
· Do you expect any changes related to the surveillance system's access to and use of your data?
· How should the system be prepared for the changes?

Participant: Supporting leader [Position title/Name of the system]
Attribute and measure: Flexibility - 2) Whether it accommodates other changes (e.g. emerging OSH issues/surveillance technologies/funding sources/legislations)?
Questions (open questions):
· Do you expect any changes related to ~ in the surveillance system (e.g., funding, or other aspects related to the responsibility of the supporting leader)?
· How should the system be prepared for the changes?
Other data collection method
…
…
…
Columns: Evidence collection method | Participants and contact | Responsible person and contact | Evaluation protocol | Planned time and location | Actual time and location | Remark

Document review | | | | | |
Interview | Surveillance key staff name, Email: | | Protocol name | | |
Interview | Data provider name (source #1), Email: | | Protocol name | | |
Interview | Data provider name (source #2), Email: | | Protocol name | | |
Interview | Supporting leader name, Email: | | Protocol name | | |
…
…
…
1. How do you see the importance and significance of the surveillance system?
2. How does the surveillance system fit into the organization’s objectives and goals?
3. What is the long-term strategic plan in your organization to further support the system?
• What are the challenges or barriers for the organization to support the system?
1. Please describe the working process/steps including data request, and the approximate
2. For each event/case definition, please rate how simple it is to identify cases on a 5-point
scale, with 1 indicating not simple and 5 the simplest (show a table list).
3. How many organizations request work reports and the surveillance data on a regular
basis?
5. Are there formal trainings on the knowledge and skills needed for OSH surveillance?
6. How quickly do the data providers and other collaborators respond to your work
requests?
4 Adapted from:
[1] Yang L., Weston C., Cude C., Kincl L. Evaluating Oregon’s Occupational Public Health Surveillance System Based on CDC Updated Guidelines. Am J Ind Med. In press.
8. Do you have any concerns about the data completeness and validity in this system?
9. Are you aware of any study done on validity, sensitivity and/or PVP for source data
11. Are there any active surveillance approaches in place to increase the sensitivity and/or
correct errors?
12. Have you experienced any failure in the work infrastructure that caused you not to be able
13. Are there any potential changes/challenges with data availability and the surveillance
1. Please briefly describe how you process your data from obtaining the raw data to
2. Please describe the data quality control process in place to monitor errors and missing
4. Do you have formal training on data processing and data quality control?
5. Please describe the process dealing with the OSH surveillance system’s data request, and
7. Do you have any concerns about the completeness, validity, sensitivity, or PVP of the data
in your system?
8. Do you have completeness rate data for the key variables that the OSH surveillance
system needs?
9. Are you aware of any study done on validity, sensitivity, and/or PVP for data in your
system?
11. Are there any active surveillance approaches in place to increase the sensitivity and/or
correct errors?
12. Do you expect major technical changes in the next 5 years for your data system?
13. Do you expect other major changes in the next 5 years in your system, which may impact
14. Do you have any suggestions for us to better collaborate with your agency?
Table of Contents
List of Tables
List of Figures
List of Acronyms
Executive summary
2. Evaluation design
3. Evaluation results
5. Conclusions
1. Centers for Disease Control and Prevention. Updated guidelines for evaluating public health
surveillance systems: recommendations from the guidelines working group. Morb Mortal Wkly
Rep. 2001;50(RR-13):1-35.
Introduction: These updated CDC guidelines are the most widely known guidelines for all kinds
of public health surveillance systems. They introduce evaluation steps and details on collecting
evidence to evaluate the ten most commonly used attributes, which are a good reference for OSH
surveillance evaluation.
2. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Atlanta,
GA: Centers for Disease Control and Prevention; 2011.
3. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. the
Public Policy Center, University of Nebraska; 2016.
Introduction: This brief protocol converts the CDC updated guidelines into more specific guides
with practical tools for designing and implementing a brief evaluation of public health
surveillance systems.
4. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016.
Introduction: This book includes a thorough introduction to many aspects of health program
evaluation, including collecting and analyzing quantitative and qualitative data. It may serve as a
guidebook for basic theory and practice in evaluation.
5. Bernard HR, Wutich A, Ryan GW. Analyzing Qualitative Data: Systematic Approaches.
SAGE publications; 2016.
Introduction: This book presents systematic methods for analyzing qualitative data in clear
and easy-to-understand steps. It can be used as a guidebook for qualitative data analysis.
Appendix 4-1 NORA industry sectors ranked by Prevention Index (PI) for male and female young workers
Table S1. NORA industry sectors ranked by Prevention Index (PI) for male young workers, Oregon workers' compensation accepted disabling and non-disabling claims,
2013-2018
Columns: NORA Sector Description | for Disabling Claims, then Non-disabling Claims, in turn: Claim Count, Claim Rate /10,000 Workers, Claim Cost Rate /100 Workers, Basic PI, Expanded PI
Agriculture, Forestry & Fishing (except Wildland 887 375.8 26357.7 1 1 1651 782.5 7908.8 3 4
Firefighting and including seafood processing)
Construction 1050 262.3 16495.2 2 3 2478 801.2 6849.2 2 2
Healthcare & Social Assistance (including 397 125.1 4162.2 9 9 859 445.0 3561.3 7 5
Veterinary Medicine/Animal Care)
Manufacturing (except seafood processing) 1219 223.5 12082.0 4 5 2503 853.6 7312.3 1 1
Mining 14 236.5 41408.4 8 4 41 942.5 11198.1 4 3
Public Safety (including Wildland Firefighting) 92 245.0 15070.7 7 6 85 396.5 3166.3 9 9
Services (except Public Safety and Veterinary 2493 94.0 3955.0 6 8 5059 366.9 2771.4 6 8
Medicine/Animal Care)
Transportation, Warehousing & Utilities 645 314.2 17184.1 3 2 279 406.3 3800.9 8 7
Wholesale and Retail Trade 1618 115.5 4865.1 5 7 2311 442.2 3541.0 5 6
NORA: National Occupational Research Agenda
Table S2. NORA industry sectors ranked by Prevention Index (PI) for female young workers, Oregon workers' compensation accepted disabling and non-disabling claims,
2013-2018
Columns: NORA Sector Description | for Disabling Claims, then Non-disabling Claims, in turn: Claim Count, Claim Rate /10,000 Workers, Claim Cost Rate /100 Workers, Basic PI, Expanded PI
Agriculture, Forestry & Fishing (except Wildland 130 141.4 7094.7 6 4 276 336.7 4392.5 4 3
Firefighting and including seafood processing)
Construction 36 57.5 2992.2 8 8 87 179.8 2370.8 7 5
Healthcare & Social Assistance (including 1590 142.3 5396.1 1 3 3689 542.7 5017.4 1 1
Veterinary Medicine/Animal Care)
Manufacturing (except seafood processing) 316 146.9 7470.5 3 2 426 368.4 3725.1 2 2
Mining / /
Public Safety (including Wildland Firefighting) 34 109.1 6721.1 7 6 25 139.5 1185.7 8 8
Services (except Public Safety and Veterinary 1719 53.4 2270.1 5 7 3510 208.7 1683.8 3 4
Medicine/Animal Care)
Transportation, Warehousing & Utilities 161 207.6 10002.9 2 1 53 206.4 1749.8 6 6
Wholesale and Retail Trade 930 68.5 2679.5 4 5 816 166.5 1430.4 5 7
NORA: National Occupational Research Agenda
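The Prevention Index used in these tables is not defined in this appendix; a common formulation averages an industry's rank by claim count and its rank by claim rate (lower PI = higher priority), with the expanded PI presumably also factoring in a cost-rate rank. A sketch under that assumption, using three disabling-claim rows from Table S1 (male young workers); under this formulation, the resulting order agrees with those sectors' basic PI ranks in the table:

```python
def prevention_index(sectors):
    """Rank sectors by a basic Prevention Index: the average of the
    claim-count rank and the claim-rate rank (1 = largest burden)."""
    def ranks(values):
        order = sorted(values, reverse=True)
        return [order.index(v) + 1 for v in values]  # ties share a rank
    names = list(sectors)
    count_rank = ranks([sectors[n]["count"] for n in names])
    rate_rank = ranks([sectors[n]["rate"] for n in names])
    pi = {n: (c + r) / 2 for n, c, r in zip(names, count_rank, rate_rank)}
    return sorted(names, key=lambda n: pi[n])

# Disabling-claim count and rate per 10,000 workers, from Table S1 (male young workers)
data = {
    "Agriculture":    {"count": 887,  "rate": 375.8},
    "Construction":   {"count": 1050, "rate": 262.3},
    "Transportation": {"count": 645,  "rate": 314.2},
}
print(prevention_index(data))
# ['Agriculture', 'Construction', 'Transportation']
```

Averaging ranks rather than raw values lets a moderate-count, high-rate sector (here Agriculture) outrank a high-count, lower-rate sector, which is the point of the PI approach to priority setting.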
Appendix 4-2 Industry groups (NAICS 4-digit) ranked by Prevention Index (PI)
Table S3. Top 5 Industry groups ranked by Prevention Index (PI) for young workers, Oregon workers' compensation accepted
disabling and non-disabling claims, 2013-2018
Columns: NORA sector | 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Disabling Claims
Agriculture 1133 Logging 217 788.8 79941.6 1 1
Agriculture 1153 Support Activities for Forestry 191 481.1 33162.7 2 2
Construction 2381 Foundation, Structure, and Building 273 429.2 31059.1 3 3
Exterior Contractors
Manufacturing 3219 Other Wood Product Manufacturing 179 433.7 21519.0 4 4
Wholesale & 4244 Grocery and Related Product Merchant 198 329.9 19507.7 5 6
Retail Wholesalers
Non-disabling Claims
Healthcare 6239 Other Residential Care Facilities 643 8712.7 71249.1 1 1
Healthcare 6231 Nursing Care Facilities (Skilled 1377 2356.7 26040.7 2 2
Nursing Facilities)
Construction 2383 Building Finishing Contractors 778 1373.6 11955.9 3 5
Manufacturing 3331 Agriculture, Construction, and Mining 178 5562.5 49842.6 4 3
Machinery Manufacturing
Manufacturing 3211 Sawmills and Wood Preservation 322 1816.1 18314.5 5 4
NORA: National Occupational Research Agenda.
Table S4. Top 5 Industry groups ranked by Prevention Index (PI) for male young workers, Oregon workers'
compensation accepted disabling and non-disabling claims, 2013-2018
Columns: NORA sector | 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Disabling Claims
Agriculture 1133 Logging 216 835.9 85001.3 1 1
Construction 2381 Foundation, Structure, and 268 469.8 34186.6 2 2
Building Exterior Contractors
Agriculture 1153 Support Activities for Forestry 184 525.7 34440.0 3 3
Manufacturing 3219 Other Wood Product 159 457.8 22283.9 4 5
Manufacturing
Transportation 4931 Warehousing and Storage 156 424.0 30393.0 5 4
Non-disabling Claims
Healthcare 6239 Other Residential Care Facilities 179 6884.6 53307.1 1 1
Manufacturing 3331 Agriculture, Construction, and 159 5845.6 46271.8 2 2
Mining Machinery Manufacturing
Construction 2383 Building Finishing Contractors 739 1512.8 12388.6 3 5
Manufacturing 3211 Sawmills and Wood Preservation 291 1796.3 18456.4 4 3
Manufacturing 3219 Other Wood Product 297 1590.8 12939.6 5 7
Manufacturing
NORA: National Occupational Research Agenda.
Table S5. Top 5 Industry groups ranked by Prevention Index (PI) for female young workers, Oregon workers'
compensation accepted disabling and non-disabling claims, 2013-2018
Columns: NORA sector | 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Disabling Claims
Healthcare 6231 Nursing Care Facilities (Skilled 252 334.3 11642.9 1 1
Nursing Facilities)
Healthcare 6221 General Medical and Surgical 293 268.6 9774.7 2 3
Hospitals
Healthcare 6232 Residential Intellectual and 201 293.6 11255.1 3 3
Developmental Disability, Mental
Health, and Substance Abuse
Facilities
Transportation 4921 Couriers and Express Delivery 56 525.3 26491.1 4 2
Services
Healthcare 6233 Continuing Care Retirement 415 179.5 6889.8 5 10
Communities and Assisted Living
Facilities for the Elderly
Non-disabling Claims
Healthcare 6239 Other Residential Care Facilities 443 9229.2 74996.9 1 1
Healthcare 6231 Nursing Care Facilities (Skilled 1177 2509.6 28178.2 2 2
Nursing Facilities)
Healthcare 5419 Other Professional, Scientific, and 292 1334.6 8217.7 3 3
Technical Services
Healthcare 8129 Other Personal Services 130 797.6 5419.0 4 8
Healthcare 6212 Offices of Dentists 244 638.4 4400.7 4 11
NORA: National Occupational Research Agenda.
Table S6. Top 5 Industry groups under each major NORA industry sector ranked by Prevention Index for young workers, Oregon
workers' compensation accepted disabling claims, 2013-2018
Columns (within each NORA sector): 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Agriculture, Forestry & Fishing (except Wildland Firefighting and including Seafood Processing):
  1133 Logging 217 788.8 79941.6 1 1
  1153 Support Activities for Forestry 191 481.11 33162.7 2 2
  1119 Other Crop Farming 123 301.32 24426.0 3 3
  1151 Support Activities for Crop Production 156 284.46 14363.3 3 4
  1121 Cattle Ranching and Farming 65 319.88 16977.2 5 5
Construction:
  2381 Foundation, Structure, and Building Exterior Contractors 273 429.18 31059.1 1 1
  2361 Residential Building Construction 197 257.89 15746.2 2 2
  2383 Building Finishing Contractors 169 230.5 15333.2 3 3
  2389 Other Specialty Trade Contractors 91 209.05 11439.7 4 5
  2382 Building Equipment Contractors 184 164.29 8028.9 4 6
Healthcare & Social Assistance (including Veterinary Medicine/Animal Care):
  6231 Nursing Care Facilities (Skilled Nursing Facilities) 275 292.8 10004.2 1 1
  6221 General Medical and Surgical Hospitals 383 278.59 9819.1 1 2
  6232 Residential Intellectual and Developmental Disability, Mental Health, and Substance Abuse Facilities 262 261.35 9727.5 3 4
Table S7. Top 5 Industry groups under each major NORA industry sector ranked by Prevention Index for young workers,
Oregon workers' compensation accepted non-disabling claims, 2013-2018
Columns (within each NORA sector): 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Agriculture, Forestry & Fishing (except Wildland Firefighting and including seafood processing):
  1133 Logging 312 1259.08 16702.5 1 1
  1153 Support Activities for Forestry 415 1160.19 13135.7 2 2
  1119 Other Crop Farming 257 698.75 6815.7 3 4
  1131 Timber Tract Operations 41 2645.16 23922.5 4 3
  1129 Other Animal Production 35 1174.5 13167.1 5 5
Construction:
  2383 Building Finishing Contractors 778 1373.59 11955.9 1 1
  2381 Foundation, Structure, and Building Exterior Contractors 349 710.22 5645.7 2 3
  2382 Building Equipment Contractors 546 631.07 4927.8 2 5
  2389 Other Specialty Trade Contractors 263 782.04 5920.4 4 3
  2379 Other Heavy and Civil Engineering Construction 74 1541.67 25863.0 5 2
Healthcare & Social Assistance (including Veterinary Medicine/Animal Care):
  6239 Other Residential Care Facilities 643 8712.74 71249.1 1 1
  6231 Nursing Care Facilities (Skilled Nursing Facilities) 1377 2356.67 26040.7 2 2
  5419 Other Professional, Scientific, and Technical Services 335 1182.07 6542.0 3 3
  6221 General Medical and Surgical Hospitals 451 527.3 4172.1 4 7
  6222 Psychiatric and Substance Abuse Hospitals 100 3921.57 27482.1 5 4
Manufacturing (except seafood processing):
  3331 Agriculture, Construction, and Mining Machinery Manufacturing 178 5562.5 49842.6 1 1
  3211 Sawmills and Wood Preservation 322 1816.13 18314.5 2 2
  3219 Other Wood Product Manufacturing 327 1474.3 12078.0 3 3
  3323 Architectural and Structural Metals Manufacturing 160 1515.15 12244.3 4 4
  3339 Other General Purpose Machinery Manufacturing 125 2055.92 11970.9 4 6
Services (except Public Safety and Veterinary Medicine/Animal Care):
  9241 Administration of Environmental Quality Programs 144 3348.84 21636.7 1 1
  6113 Colleges, Universities, and Professional Schools 565 714.74 5114.9 2 7
Table S8. Top 3 Industry groups ranked by Prevention Index (PI) for four major injury types among young workers, Oregon
workers' compensation accepted disabling and non-disabling claims, 2013-2018
Columns (within each injury type): NORA sector | 4-digit NAICS | Industry group (4-digit NAICS Description) | Claim Count | Claim Rate /10,000 Workers | Claim Cost Rate /100 Workers | Basic PI | Expanded PI
Disabling Claims
Work-related Musculoskeletal Disorders:
  Healthcare 6231 Nursing Care Facilities (Skilled Nursing Facilities) 197 209.8 6984.4 1 1
  Healthcare 6221 General Medical and Surgical Hospitals 239 173.8 6136.2 2 2
  Transportation 4931 Warehousing and Storage 94 153.0 7553.1 3 3
Struck by/against:
  Agriculture 1133 Logging 90 327.2 45906.5 1 1
  Construction 2381 Foundation, Structure, and Building Exterior Contractors 91 143.1 8365.9 2 3
  Agriculture 1153 Support Activities for Forestry 61 153.7 13077.3 3 2
Fall on same level:
  Agriculture 1133 Logging 42 162.5 7491.4 1 1
  Agriculture 1153 Support Activities for Forestry 41 117.1 7271.5 2 2
  Construction 2381 Foundation, Structure, and Building Exterior Contractors 22 38.6 1827.5 3 3
Violence:
  Healthcare 6232 Residential Intellectual and Developmental Disability, Mental Health, and Substance Abuse Facilities 145 144.6 4932.8 1 2
  Healthcare 6222 Psychiatric and Substance Abuse Hospitals 33 804.9 15067.5 2 1
  Healthcare 8129 Other Personal Services 30 73.8 3238.8 3 3
Non-disabling Claims
Work-related Musculoskeletal Disorders:
  Healthcare 6231 Nursing Care Facilities (Skilled Nursing Facilities) 516 883.1 13915.8 1 1
  Healthcare 6239 Other Residential Care Facilities 92 1246.6 16438.9 2 1
  Manufacturing 3211 Sawmills and Wood Preservation 50 282.0 4591.6 3 3
Struck by/against:
  Construction 2383 Building Finishing Contractors 459 939.6 7174.0 1 2
  Manufacturing 3331 Agriculture, Construction, and Mining Machinery Manufacturing 80 2941.2 22160.9 2 1
  Manufacturing 3219 Other Wood Product Manufacturing 178 953.4 6605.6 3 4
Fall on same level:
  Healthcare 6239 Other Residential Care Facilities 42 569.1 4449.7 1 1