

AN ABSTRACT OF THE DISSERTATION OF

Liu Yang for the degree of Doctor of Philosophy in Public Health presented on June 3, 2020.

Title: Improving Occupational Safety and Health (OSH) Surveillance: Focusing on Enhancing
Surveillance Methods and Evaluation

Abstract approved: ______________________________________________________


Laurel D. Kincl

Work-related hazards and exposures affect human health and well-being. Occupational Safety and Health (OSH) surveillance is fundamental to the public health approach. OSH surveillance provides important data on the occurrence and magnitude of the problem and on risk/protective factors. A wide variety of OSH surveillance programs and systems have been established and strengthened to address multiple areas of workplace safety and health over the past few decades. This includes national systems for work-related injuries, illnesses, and fatalities, as well as state-based OSH surveillance programs, which contribute to local prevention efforts as well as to the national data profile. However, challenges exist in current OSH surveillance that call for more accurate, timely, and comprehensive reporting of events and conditions of occupational health importance. This research focuses on three specific challenges and gaps: utilization of new and existing data sources for more complete surveillance, OSH surveillance terminologies and conceptual models, and the lack of guiding frameworks for OSH surveillance evaluation. Two related studies were conducted using both qualitative and quantitative research methods, along with a translational study, to accomplish three specific research aims.


The first study aimed to understand the current status of OSH surveillance in the US and to develop a comprehensive list of elements relevant to OSH surveillance and its evaluation, including surveillance components, attributes, and example measures. A modified Delphi study with three survey rounds was conducted with ten experts. Based on the experts' ratings and comments, a total of 211 elements were selected from a draft list compiled from the literature, with 134 (62.3%) reaching a high level of consensus. The definition/description for each element was refined based on experts' suggestions to reflect best practice. Experts further reviewed their respective OSH surveillance systems, including six state-level and one national OSH surveillance system/program. The experts' review highlighted that evaluation was not a common practice in these OSH surveillance systems/programs. Major themes from experts' comments and verbal answers revealed the current status and gaps of OSH surveillance in the US. Resource constraints, including financial support, technical platforms, and expertise, have limited more advanced OSH surveillance activities. As such, feasibility should be an important consideration in OSH surveillance evaluation as well as in setting practical performance criteria.

Following the first study, the second, translational study aimed to develop a practical guiding framework for the evaluation of OSH surveillance systems. Feedback on the applicability of the identified surveillance elements, as well as on the development of the framework, was sought from the Delphi panel experts and from potential users of the framework through the Council of State and Territorial Epidemiologists (CSTE) Occupational Health (OH) Subcommittee. Experts agreed that it is meaningful to develop a tailored framework for OSH surveillance evaluation and that the framework must be flexible, simple, and straightforward. The developed framework incorporates the identified elements, field experts' suggestions, and the research team's experience.

The third study analyzed surveillance data, specifically Oregon's workers' compensation (WC) claims data, to apply a Prevention Index (PI) to prioritize high-risk industries for a vulnerable population: young workers aged 24 years and under. All accepted WC disabling claims (N=13,360) and a significant portion of accepted non-disabling claims (N=24,460) were obtained for 2013 to 2018. The non-disabling injury rate was estimated to be approximately 4 times the rate of disabling injuries among Oregon young workers (401.3 vs. 111.6 per 10,000 young workers). The most common injury type among young workers for disabling and non-disabling claims combined was "struck by/against" (35.6%), followed by "work-related musculoskeletal disorders" (19.5%), "fall on same level" (9.1%), and "violence" (6.6%). Differences in the distributions of injury types were observed across industry sectors. Based on the PI, the highest ranked industry sectors for both genders combined included agriculture, construction, and manufacturing. For female young workers, the highest ranked sector was healthcare. The study demonstrated that the inclusion of non-disabling WC claims data was useful for a more complete documentation of work-related injuries and illnesses.
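In OSH surveillance practice, a Prevention Index is commonly computed as the average of an industry's rank by claim count and its rank by claim rate, with lower values indicating higher priority; the sketch below illustrates that common formulation, not necessarily the exact one used in this dissertation, and all industry names and numbers are hypothetical.

```python
# Illustrative Prevention Index (PI): the average of rank-by-count and
# rank-by-rate, where rank 1 goes to the largest value. A lower PI means
# a higher prevention priority. All figures below are hypothetical.
def prevention_index(counts, rates):
    def ranks(values):  # rank 1 = largest value
        ordered = sorted(values, key=values.get, reverse=True)
        return {key: position + 1 for position, key in enumerate(ordered)}
    count_rank, rate_rank = ranks(counts), ranks(rates)
    return {k: (count_rank[k] + rate_rank[k]) / 2 for k in counts}

claim_counts = {"construction": 900, "agriculture": 300, "retail": 1200}
claim_rates = {"construction": 110.0, "agriculture": 95.0, "retail": 40.0}

pi = prevention_index(claim_counts, claim_rates)
priority = sorted(pi, key=pi.get)  # lowest PI first = highest priority
```

Averaging the two ranks balances burden (how many workers are injured) against risk (how likely an individual worker is to be injured), so a small but hazardous sector can still rank highly.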

The overall research project is applied in nature. The developed guiding framework is expected to address the lack of a practical framework for evaluating OSH surveillance systems and to promote more evaluations, which in turn can improve OSH surveillance systems. The exploration of non-disabling WC claims data and the use of the Prevention Index contribute to targeted prevention efforts and enhance OSH surveillance of young workers in ways that can be extended to other worker populations.


©Copyright by Liu Yang
June 3, 2020
All Rights Reserved
Improving Occupational Safety and Health (OSH) Surveillance: Focusing on Enhancing
Surveillance Methods and Evaluation

by
Liu Yang

A DISSERTATION

submitted to

Oregon State University

in partial fulfillment of
the requirements for the
degree of

Doctor of Philosophy

Presented June 3, 2020


Commencement June 2020
Doctor of Philosophy dissertation of Liu Yang presented on June 3, 2020

APPROVED:

Major Professor, representing Public Health

Head of the School of Biological and Population Health Sciences

Dean of the Graduate School

I understand that my dissertation will become part of the permanent collection of Oregon State
University libraries. My signature below authorizes release of my dissertation to any reader
upon request.

Liu Yang, Author


ACKNOWLEDGEMENTS

The support of my family, committee members, colleagues, professors, and friends has made it possible to successfully navigate the doctoral program and complete my dissertation research. My parents, Shengli Yang and Fuyi Shen, have inspired me through my academic pursuits since I was young. As amazing parents, they have provided encouragement and selfless support in this endeavor. My husband, Xun Mo, has been encouraging and supporting me throughout my graduate program. My two children, Chenghao Mo and Jessica Aiyang Mo, have been a big part of our lives and a constant source of joy. I couldn't have done this without the love and support of my family.

A PhD student could not ask for a better committee than mine. As my major advisor and mentor, Dr. Laurel Kincl has provided great support and help throughout my graduate program. Over the past five years, she has not only diligently mentored me in conducting academic research, from designing and implementing studies to writing manuscripts and seeking publication, but also fostered my independence as an academic researcher, including leading and communicating with the research team, developing partnerships, and seeking ways to disseminate the research, all of which made this research possible. She also cares about my personal and professional growth and has provided invaluable guidance for my future career development. I also wish to thank my other committee members: Dr. Viktor Bovbjerg, Dr. Jeong Ho "Jay" Kim, Dr. Adam Branscum, and Dr. Bryan Tilt. My minor advisor, Dr. Viktor Bovbjerg, supported me in pursuing the minor PhD degree. Serving as a Teaching Assistant for his courses, I gained valuable teaching experience and skills from him. Dr. Adam Branscum taught multiple statistics courses in which I enjoyed his clear and coherent instruction on various aspects of statistical modeling and interpretation. He has also worked with me on multiple studies and provided tremendous help and guidance in conducting research, summarizing findings, and drafting manuscripts. All my PhD committee members have provided great support and encouragement throughout my PhD program, along with instruction and suggestions on the research.

I wish to thank Curtis Cude and Crystal Weston at the Oregon Health Authority for their great support during my internship with the Oregon Occupational Public Health Program and for their collaboration on my studies. I also wish to thank all participants in the Delphi study, as well as those in the follow-up opinion seeking, for the time, effort, and enthusiasm they contributed to this research. I would also like to thank the Council of State and Territorial Epidemiologists (CSTE) for facilitating study enrollment and the dissemination of our study results.

At Oregon State University, I have had the great opportunity to learn from outstanding faculty members and fellow students. I enjoyed great courses from many professors, including Dr. Perry Hystad, Dr. Ellen Smit, Dr. Molly Kile, and Dr. Anthony Veltri. My friends in the program and college, Laura N. Syron, Amber Lyon-Colbert, Barbara Hudson-Hanley, Barrett Welch, Lan N. Doan, Deeksha Vasanth Rao, Meghan Fitzgerald, Jafra D. Thomas, Linh Bui, and many others, have motivated and encouraged me through all of the ups and downs of graduate study, and we have shared valuable experiences and resources with one another. I am proud to be achieving our dreams and career goals together with them.

Liu Yang
CONTRIBUTION OF AUTHORS

Liu Yang, MS, MPH, conceptualized and designed the research, collected and analyzed the data, and drafted the manuscripts.

Dr. Laurel D. Kincl supervised the research, assisted with the study design and the interpretation of findings, and provided editorial comments on the manuscripts.

Dr. Adam Branscum and Dr. Viktor Bovbjerg assisted with the interpretation of findings and provided editorial comments on the manuscripts.

Curtis Cude and Crystal Weston assisted with data collection and the interpretation of findings, and provided editorial comments on the third manuscript.


TABLE OF CONTENTS

Page

CHAPTER 1 Introduction............................................................................................................... 1
Occupational safety and health (OSH) surveillance in the United States ................................... 1

Gaps and opportunities in OSH surveillance .............................................................................. 3

Exploring new and existing data sources .................................................................................... 4

Surveillance terminologies and conceptual models .................................................................... 5

Surveillance evaluation ............................................................................................................... 6

Research purpose and approach ................................................................................................ 10

Tables and Figures .................................................................................................................... 13

References ................................................................................................................................. 19

CHAPTER 2 Understanding Occupational Safety and Health Surveillance: Expert Consensus on


Components, Attributes and Example Measures for an Evaluation Framework .......................... 27
Abstract ..................................................................................................................................... 28

Introduction ............................................................................................................................... 29

Methods..................................................................................................................................... 30

Results ....................................................................................................................................... 34

Discussion ................................................................................................................................. 38

Conclusion ................................................................................................................................ 42

Tables & figures ........................................................................................................................ 44

Reference .................................................................................................................................. 48

CHAPTER 3 Guiding Framework for Evaluating Occupational Safety and Health Surveillance
Systems ......................................................................................................................................... 51
Abstract ..................................................................................................................................... 52

Introduction ............................................................................................................................... 53
TABLE OF CONTENTS (Continued)

Page

Methods..................................................................................................................................... 54

Results ....................................................................................................................................... 55

Discussion and conclusion ........................................................................................................ 57

References ................................................................................................................................. 60

Supplement: A Guiding Framework ......................................................................................... 62

CHAPTER 4 Assessing disabling and non-disabling injuries and illnesses using accepted workers' compensation claims data to prioritize industries of high risk for Oregon young workers ............ 92
Abstract ..................................................................................................................................... 93

Introduction ............................................................................................................................... 95

Methods..................................................................................................................................... 97

Results ..................................................................................................................................... 102

Discussion ............................................................................................................................... 106

Conclusions ............................................................................................................................. 109

Reference ................................................................................................................................ 111

Tables & Figures ..................................................................................................................... 117

CHAPTER 5 General Conclusions ............................................................................................. 122


Bibliography ............................................................................................................................... 126
Appendices.................................................................................................................................. 141
LIST OF FIGURES

Figure Page

Figure 1-1 Frequency counts of identified guidelines and frameworks by time period and type of
surveillance ................................................................................................................................... 18

Figure 2-1 Selected components and attributes of OSH surveillance in logic model (Elements
marked yellow were selected with low consensus) ...................................................................... 47

Figure 3-1 Stages of OSH surveillance activities and types of evaluation corresponding to each
stage .............................................................................................................................................. 65

Figure 3-2 Four steps to evaluate an OSH surveillance system.................................................... 67

Figure 3-3 Main tasks in Step 1: Design the evaluation ............................................................... 68

Figure 3-4 Common OSH surveillance components and attributes in Logic Model ...................... 71

Figure 3-5 Initial/need evaluation: Minimum and optional components and attributes .............. 82

Figure 3-6 Routine/rapid evaluation: Minimum and optional components and attributes .......... 84

Figure 3-7 Comprehensive evaluation: Minimum and optional components and attributes ....... 86

Figure 3-8 Data quality evaluation: Minimum and optional components and attributes ............ 88

Figure 4-1 Disabling injury rate by age group (including adult worker group) and gender, Oregon
accepted disabling claims (2013-2018) ...................................................................................... 120

Figure 4-2 Distribution of claims by injury type for young workers, Oregon accepted disabling
and non-disabling claims, 2013-2018 ......................................................................................... 121
LIST OF TABLES

Table Page

Table 1-1 Major sources of data for occupational safety and health surveillance ........................ 13

Table 1-2 Summary information of identified guidelines and frameworks .................................. 14

Table 1-3 Attributes abstracted from existing guidelines and frameworks .................................. 15

Table 1-4 Abstracted components in existing guidelines and frameworks .................................. 16

Table 2-1 Preset Criteria for Experts in the Delphi Study ............................................................ 44

Table 2-2 Summary statistics for framework elements (average, range) ..................................... 45

Table 2-3 Major themes from experts' comments ........................................................................ 46

Table 3-1 Questions in the Delphi study and follow-up opinion seeking..................................... 55

Table 3-2 Major comments and suggestions for an evaluation framework from experts in the
Delphi study and follow up review (n=19) ................................................................................... 55

Table 3-3 Objectives and suitable situations by type of OSH surveillance evaluation. ............... 66

Table 4-1 Oregon workers' compensation disabling and non-disabling claims frequency, rate,
medical cost by year and demographics, 2013-2018 .................................................................. 117

Table 4-2 Claim rate for young workers, by major injury type and NORA industry sector,
Oregon workers' compensation accepted disabling and non-disabling claims, 2013-2018 ....... 118

Table 4-3 NORA industry sectors ranked by Prevention Index (PI) for young workers, Oregon
workers' compensation accepted disabling and non-disabling claims, 2013-2018 .................... 119
LIST OF APPENDICES

Appendix Page

Appendix 1-1 Historical events in occupational safety and health (OSH) and the development of
OSH surveillance in the US ........................................................................................................ 141

Appendix 1-2 Identified guidelines and frameworks for evaluating public health surveillance
systems in a thorough literature review ...................................................................................... 146

Appendix 1-3 Summary of six existing evaluations of OSH surveillance systems .................... 159

Appendix 3-1 Questions/aspects to describe the system ............................................................ 165

Appendix 3-2 Components common to OSH surveillance ......................................................... 169

Appendix 3-3 Attributes to OSH surveillance components ........................................................ 176

Appendix 3-4 Example measures for attributes .......................................................................... 179

Appendix 3-5 Template working tables for developing evaluation methods on each selected attribute and measure ................................................................................. 185

Appendix 3-6 Template working table for data collection plan ................................................. 189

Appendix 3-7 Sample interview guides ..................................................................................... 190

Appendix 3-8 A sketch of evaluation report ............................................................................... 193

Appendix 3-9 Further references ................................................................................................ 194

Appendix 4-1 NORA industry sectors ranked by Prevention Index (PI) for male and female
young workers............................................................................................................................. 195

Appendix 4-2 Industry groups (NAICS 4-digit) ranked by Prevention Index (PI) .................... 197

Improving Occupational Safety and Health (OSH) Surveillance: Focusing on Enhancing

Surveillance Methods and Evaluation

CHAPTER 1 Introduction

Occupational safety and health (OSH) surveillance in the United States

Work-related hazards and exposures affect human health and well-being. Globally, it is estimated that some 5-7% of deaths in industrialized countries are attributable to work-related illnesses and injuries 1. In the United States of America (US), there were more than 5,000 work-related fatal injuries and 2.8 million non-fatal injuries and illnesses reported by private employers nationally in 2018, corresponding to a fatal injury rate of 3.5 per 100,000 Full-Time Equivalent workers (FTE) and a non-fatal injury rate of 2.8 per 100 FTEs, respectively 2. It was estimated that the total cost of occupational injuries in the US in 2018 was $170.8 billion, equal to a cost per worker of $1,100 3.
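Rates like those above are straightforward per-capita calculations; the sketch below shows the arithmetic using the 2018 figures cited in the text, with workforce denominators that are hypothetical round numbers (the actual BLS denominators are not given here) chosen only to reproduce the cited values.

```python
# Rate per a chosen base population, e.g. per 100,000 FTE workers.
def injury_rate(case_count, denominator, base):
    return case_count / denominator * base

# Case counts and costs are the 2018 figures cited in the text; the
# workforce denominators (~143 million FTEs, ~155 million workers) are
# hypothetical round numbers used only to illustrate the arithmetic.
fatal_rate = injury_rate(5_000, 143_000_000, 100_000)  # ~3.5 per 100,000 FTE
cost_per_worker = 170.8e9 / 155_000_000                # ~$1,100 per worker
```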

Work-related injuries and illnesses are preventable. Promoting safe and secure working environments for all workers is one of the United Nations' Sustainable Development Goals for 2015-2030 4. In public health practice, occupational health issues are addressed using the public health approach, which includes four steps, from identifying the health problem to the development and promotion of prevention and intervention strategies 5,6. Occupational safety and health (OSH) surveillance is central to every step of this approach by providing data on the occurrence and magnitude of the problem and the risk/protective factors 7.

Public health surveillance is defined as the ongoing, systematic collection, analysis, and interpretation of data regarding health-related events that are essential to the planning, implementation, and evaluation of public health practice, closely integrated with the timely dissemination of these data linked to prevention and control 8. OSH surveillance is a subset of public health surveillance specifically targeting issues of OSH importance 9. In the US, OSH surveillance has largely been undertaken by government agencies at the national and state levels, which have the legal authority to require disease and injury reporting and have access to various data sources containing OSH information 10,11. Surveillance activities for OSH events were not formalized until the Occupational Safety and Health Act (OSH Act) 12. The OSH Act of 1970 designated the Bureau of Labor Statistics (BLS) as responsible for collecting statistics on occupational injuries and illnesses and created the National Institute for Occupational Safety and Health (NIOSH) to conduct research to inform OSH standards and regulations 9,12,13.

Over the past few decades, OSH surveillance has undergone tremendous improvement, with a wide variety of OSH surveillance programs and systems established and strengthened to address multiple areas of workplace safety and health. National systems for work-related injuries, illnesses, and deaths have been implemented and maintained, including the BLS national Survey of Occupational Injuries and Illnesses (SOII) and the Census of Fatal Occupational Injuries (CFOI) 2, as well as various surveillance programs operated by NIOSH in collaboration with other national and state partners, such as the National Traumatic Occupational Fatalities surveillance system (NTOF) and the National Electronic Injury Surveillance System - Occupational Supplement (NEISS-Work), to name but a few 14. States have played a pivotal role in OSH surveillance by supporting the BLS SOII and CFOI programs. Recognizing the importance of states in nationwide OSH surveillance, NIOSH has collaborated with states since the 1980s, resulting in a set of state-based programs such as the Fatality Assessment and Control Evaluation (FACE) programs and the Adult Blood Lead Epidemiology and Surveillance System (ABLES) 7. Since the early 2000s, NIOSH has funded states to establish state-based OSH surveillance systems for occupational health indicators (OHIs) and other expanded surveillance activities, which contribute to local prevention efforts as well as to the national data profile 15. Other agencies such as the Occupational Safety and Health Administration (OSHA) have also built OSH databases such as the Integrated Management Information System (IMIS), which contains data available for tracking workplace exposures and hazards 16. Appendix 1-1 summarizes historical events from the 1900s to the present in occupational safety and health, including the development of OSH surveillance in the US.

Gaps and opportunities in OSH surveillance

Challenges and gaps exist in current OSH surveillance, prompting calls for more accurate, timely, and comprehensive reporting of events and conditions of occupational health importance. The recent report, A smarter national surveillance system for occupational safety and health in the 21st century, by the National Academies of Sciences, Engineering, and Medicine, as requested by NIOSH, BLS, and OSHA, reviewed current OSH surveillance in the US and summarized key barriers 9. These included: concerns regarding confidentiality and privacy in data reporting and use; the cost of maintaining and expanding surveillance; the collection of exposure data and multi-source surveillance; OSH surveillance structure and expertise, including adequate epidemiological and informatics capacity; and the culture, mission, and methodology of organizations as they relate to OSH surveillance.

OSH surveillance has historically been a resource-limited field in public health, with limited funding and staff capacity 9,17,18. However, growing interest and opportunities in public health surveillance, including emerging data sources, new technologies, and the exploration of surveillance methods, are promising for addressing gaps in OSH surveillance. The following sections describe how various issues are being addressed.

Exploring new and existing data sources

Complete and accurate data are essential to disease prevention and policy making. Under-reporting and under-estimation of occupational injuries and illnesses have continued to be a significant concern regarding OSH surveillance data reliability 19–21. Studies have shown under-estimation of as much as 69% in national statistics of non-fatal work-related injuries and illnesses 22–24. Certain working populations, such as self-employed or household workers, are likely to be excluded from some OSH data systems. For example, SOII has been estimated to exclude approximately 9% of the workforce 9.

Failures in capturing work-related injuries and illnesses can happen anywhere from worker and physician reporting to the documentation of the injury or illness in different data systems 25. There has been growing recognition of, and practice in, leveraging multiple data sources in OSH surveillance to detect cases possibly missed in any one data system and thereby improve data completeness. Studies showed that the use of more than one data source doubled estimates of work-related injuries compared to the BLS statistics 26–28. A wide variety of data sources have now been explored for their use in surveillance of occupational injuries, illnesses, and hazards 26–37. Table 1-1 summarizes the potential data sources for OSH surveillance in the US.
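One standard way to quantify undercount across two overlapping data sources is the two-source capture-recapture (Lincoln-Petersen) estimate; the minimal sketch below, with hypothetical counts, illustrates the general technique rather than a method this dissertation reports applying.

```python
# Two-source capture-recapture (Lincoln-Petersen) estimate of the true
# number of cases: N ~= n1 * n2 / m, where n1 and n2 are the cases found
# in each source and m is the overlap (cases found in both).
def lincoln_petersen(n1, n2, overlap):
    if overlap == 0:
        raise ValueError("no overlap between sources; estimate undefined")
    return n1 * n2 / overlap

# Hypothetical counts: 600 injuries in WC claims, 500 in an emergency
# department data set, 300 appearing in both.
estimated_total = lincoln_petersen(600, 500, 300)  # 1000.0
wc_ascertainment = 600 / estimated_total           # WC alone captured ~60%
```

Under this estimate, either source alone would miss a substantial share of cases, which is the motivation for multi-source surveillance described above.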

Among the ample data sources, workers’ compensation (WC) claims is a commonly used

data source for OSH surveillance 38–47. Coded WC claims database provides case-level data on

the injury and illness, which are sometimes difficult to obtain in other data sources 9,48. Further,

WC data may link to data on injury severity (e.g., days of lost work time), medical and
5

compensation cost, health care records, as well as workplace safety and health and prevention

measures to allow for in-depth research. It is recommended to explore and promote the expanded

use of WC data to enhance OSH surveillance 9. However, WC data collected by government

agencies, which are generally the only source available for research and OSH surveillance,

usually include only lost-work-time claims 9,48. Access to medical-service-only claims, which contain

information on less severe yet more frequent work-related injuries and illnesses, would support more

complete reporting.

Surveillance terminologies and conceptual models

The Centers for Disease Control and Prevention (CDC) identified issues with lexicon,

definitions, and conceptual framework as a major concern of public health surveillance in the

21st century 49,50. Agreed-upon terms and frameworks lay a foundation for increased understanding

within and beyond the public health field. Efforts to standardize concepts,

terminologies, structural components, and methods in public health surveillance date back

decades 49,51,52. Various models have been proposed regarding surveillance processes and activities.

The CDC, following suggestions by Alexander Langmuir, who is regarded as the founder of

modern public health surveillance, regards surveillance as limited to data collection, analysis, and

dissemination, with no direct responsibility for disease control activities 8,51–54. Some

researchers, however, include disease control and action as a key public health surveillance component.

For example, McNabb et al. 55 suggested eight core functions for public health surveillance,

namely: detection, registration, reporting, confirmation, analysis, feedback, acute response, and

planned response, as well as four supporting functions including communication, training,

supervision, and resource-provision. This model was further adapted in the World Health

Organization (WHO) guidelines for monitoring and evaluating communicable disease



surveillance and response systems 56,57. Choi summarized a set of common concepts and

components of public health surveillance, including ongoing, systematic, population-based, data

collection, data analysis, interpretation, dissemination, mortality data, morbidity data,

epidemiologic data, and health data 51. Frameworks and key components were also proposed for

specific types of surveillance to reflect their own features and requirements, including infectious

diseases surveillance 58, animal health surveillance 59,60 and injury surveillance 61.

Despite ample literature on standardizing public health surveillance concepts and

components that can be used to guide OSH surveillance, there has been no effort to date to

systematically assemble the components and elements that best describe the overall process and

features of current OSH surveillance systems, especially systems in the US. The recent

report by the National Academies of Sciences, Engineering, and Medicine summarized key OSH

surveillance elements into three major groups including process, organizational and

technological components, and methods 9. However, definitions of and differences between these

elements were not clearly described.

Surveillance evaluation

The value of public health surveillance lies in the effective and efficient delivery of

useful data regarding health determinants and outcomes. The need for both quality data and a

quality surveillance system calls for rigorous evaluation of current surveillance systems. Evaluation

is defined as systematic investigation of the merit, worth, or significance of an object 62,63. The

evaluation of a health program can be defined as the systematic collection of information about

the activities, characteristics, and outcomes of the program to make judgments about the

program, improve program effectiveness and/or inform decisions about future program

development, where the term “program” refers to “any set of organized activities supported by a

set of resources to achieve a specific and intended result” 62. Public health surveillance systems fit

into this broad definition of “program,” and thus the principles and methodologies of program

evaluation apply to surveillance evaluation. However, the data-stream-centered organization

and structure of public health surveillance make the objectives and focus of surveillance

evaluation differ from those of general program evaluation. Specifically, the evaluation of public

health surveillance systems is to ensure that the surveillance system is monitoring problems of

public health importance efficiently and effectively, and that the utility of the surveillance resources and

information collected is maximized 8.

Guidelines and frameworks have been developed to guide the evaluation of public health

surveillance systems in the past three decades. However, existing guidelines and frameworks

provide only generic guidance, which is insufficient for guiding a practical evaluation 64. To date,

no guideline or framework has been developed for evaluating OSH surveillance systems. In

practice, public health surveillance systems have been evaluated inconsistently in methods and

criteria, with evaluation quality varying significantly 65,66. This makes results difficult to

compare and thus reduces the usefulness of evaluations in guiding the improvement of

surveillance systems. Further, publications on OSH surveillance evaluation are very limited to

date. A detailed review of current evaluation guidelines and practices is presented in the sections

below.

Guidelines and frameworks for OSH surveillance evaluation

Guidelines and frameworks refer to publications providing instructions,

recommendations and tools for surveillance evaluation. A thorough literature review has been

conducted to comprehensively identify existing guiding approaches and to examine whether any

have been published for OSH surveillance evaluation. A total of 34 guidelines

and frameworks were identified and reviewed (Appendix 1-2). Most of them (N=30, 88%) were

published since 2000 (Figure 1-1). Nearly one third of guidelines and frameworks (N=11) were

developed for all types of public health surveillance, followed by animal health surveillance

(N=8, 24%) and communicable disease surveillance (N=4, 12%)

(Table 1-2). Only one framework targets occupational diseases,

which was developed as an audit tool for national registries in EU countries 67.

The three most frequently cited guidelines are Injury surveillance guidelines by WHO

and CDC (N=601) 68, Updated guidelines for evaluating public health surveillance systems by

CDC (N=574) 8 and Guidelines for evaluating surveillance systems by CDC (N=503) 53. The two

CDC guidelines, especially the updated guidelines, are by far the most well-known and the de

facto authoritative guidelines in public health surveillance evaluation, which have guided many

existing surveillance evaluations 69–78 as well as the development of other surveillance evaluation

guidelines and frameworks 68,79–84.

The CDC updated guidelines recommend ten attributes, i.e., measurable features and

characteristics of a surveillance system to be evaluated: simplicity, flexibility, data

quality, acceptability, sensitivity, predictive value positive (PVP), representativeness, timeliness,

stability, and usefulness. Attributes have been adopted in existing guidelines and frameworks as

well as evaluation practice as a method to assess the performance and quality of a surveillance

system 56,57,68,74,84–88. A total of 52 distinct attributes were abstracted from the 34 guidelines and

frameworks (Table 1-3), among which the most frequent attributes are those recommended in

CDC updated guidelines. Components and activities in a surveillance system discussed in these

guidelines and frameworks were also abstracted (Table 1-4).



Researchers and evaluators have noted that the CDC guidelines tend to be general and

oriented toward infectious disease surveillance 84,88,89. Further, they fail to guide the evaluation of

surveillance processes and components 90,91. We have reported elsewhere challenges in the

evaluation of Oregon OSH surveillance system based on the CDC updated guidance (manuscript

in press). In fact, similar weaknesses were also found in other existing surveillance evaluation

guidelines and frameworks. Calba et al. 64 pointed out multiple weaknesses and limitations in

their systematic review of 15 identified guidelines and frameworks and concluded that existing

guidelines and frameworks were insufficient in guiding a practical evaluation. Similarly, my

thorough review demonstrated that most existing guidelines and frameworks provide only generic

recommendations for surveillance evaluation, especially OSH surveillance evaluation. Specific

and detailed guidance and tools for implementing an evaluation are usually not included.

There is a lack of a comprehensive list of attributes and of guidance on how to select these attributes

and match them to the components of a specific type of surveillance. Further, methods for assessing

the attributes were usually discussed only with general examples. On the other hand, some guidelines

and frameworks are too specific to a certain type of surveillance and lack generalizability to

other systems (e.g., those with scoring grid) 67,92,93.

OSH surveillance evaluation practice

In practice, public health surveillance systems have been evaluated inconsistently, with

the evaluation quality varying significantly 65,66. Existing evaluations vary in methods and

criteria. Given limited resources, evaluations have mostly been conducted opportunistically, with

system attributes being selected subjectively and described largely qualitatively 66. A systematic

review by Drewe et al. 94 of 99 existing evaluations of animal and public health

surveillance systems found that nearly half of the evaluations assessed only one or two attributes.

They found that comprehensive evaluation was uncommon and that there were almost no evaluations

for occupational health conditions. In fact, compared to a mounting body of literature on

infectious diseases surveillance evaluation and other types of public health surveillance,

published evaluations on OSH surveillance are limited.

To further understand current OSH surveillance evaluation practice, a systematic search

for publications on evaluating OSH surveillance systems identified only six reports of

comprehensive evaluations 87,88,95–98. Five evaluations were based on

existing guidelines, among which four referenced the two CDC guidelines on surveillance

evaluation 8,53. Three evaluations assessed the attributes recommended in the CDC guidelines 8,53.

Most of these attributes were evaluated qualitatively, with inconsistent measures used across

evaluations. None of the six evaluations quantitatively assessed data-quality-related attributes

such as sensitivity and predictive value positive (PVP). Rather, these were evaluated with

qualitative evidence or even left unassessed due to a lack of data. Two of the six

evaluations did not clearly describe the evaluation methods used. Among those that described

their methods, interviews, questionnaire surveys, and literature/document reviews were

common means of collecting evaluation evidence. A summary of the six evaluations is included in

Appendix 1-3.
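For reference, the two quantitative attributes named above have simple operational definitions when an external gold-standard case list is available for record matching. A minimal sketch with hypothetical case sets (not data from any of the six evaluations):

```python
def sensitivity_pvp(system_cases: set, gold_standard: set):
    """Compute surveillance sensitivity and predictive value positive (PVP).

    sensitivity = true cases detected by the system / all true cases
    PVP         = true cases detected by the system / all cases it reported
    """
    true_positives = len(system_cases & gold_standard)
    sensitivity = true_positives / len(gold_standard)
    pvp = true_positives / len(system_cases)
    return sensitivity, pvp

# Hypothetical example: the system reports 80 cases, 60 of which appear
# among the 100 cases in an independent gold-standard source.
system = set(range(60)) | {f"fp{i}" for i in range(20)}   # 60 true + 20 false
gold = set(range(100))
sens, pvp = sensitivity_pvp(system, gold)   # sens = 0.6, pvp = 0.75
```

In practice the difficult step is the record linkage that produces the matched set, which is one reason these attributes were left unassessed in the evaluations reviewed here.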

Research purpose and approach

This research aims to address the identified gaps and opportunities in current OSH

surveillance in the US, including the lack of consistent terminology in OSH surveillance and the

lack of guidelines and frameworks for OSH surveillance evaluation, as well as the need to

improve OSH surveillance by utilizing new and existing data sources and methods. Three related

studies have been conducted using both qualitative and quantitative research methods and

a translational study design to accomplish the three specific research aims. Components and

attributes abstracted from the existing guidelines and evaluations introduced in the sections above were

used to guide Aim 1 in this research.

The specific aim of each study is introduced below:

Aim 1: To understand the current status of OSH surveillance in the US and to develop a

comprehensive list of elements relevant to OSH surveillance for an evaluation framework,

including surveillance components, attributes and example measures by conducting a Delphi

study among experts in the US.

Aim 2: To develop a guiding framework for evaluation of OSH surveillance systems in

the US based on the comprehensive list of elements identified in Aim 1 and follow-up

suggestions from the Delphi panel experts and potential users of the framework.

Aim 3: To explore the use of non-disabling WC claims data obtained from a WC insurance

company and the Prevention Index (PI) method in OSH surveillance by examining injury and

illness burden and prioritizing high-risk industries for a vulnerable population in Oregon:

young workers aged 24 years and under.
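The PI is commonly computed by ranking each industry separately on claim count and on claim rate and averaging the two ranks, so that industries with both high burden and high risk are prioritized. A minimal sketch with hypothetical industries and numbers (the actual study's ranking variables and tie-handling may differ):

```python
industries = {
    # hypothetical claim counts and rates per 100 full-time equivalents
    "Restaurants":  {"claims": 520, "rate_per_100_fte": 5.0},
    "Grocery":      {"claims": 310, "rate_per_100_fte": 4.1},
    "Construction": {"claims": 150, "rate_per_100_fte": 6.3},
}

def prevention_index(data):
    """Average each industry's rank on claim count and claim rate.

    Rank 1 = highest value on a measure; a lower PI means higher priority.
    """
    names = list(data)
    count_rank = {n: r for r, n in enumerate(
        sorted(names, key=lambda n: -data[n]["claims"]), start=1)}
    rate_rank = {n: r for r, n in enumerate(
        sorted(names, key=lambda n: -data[n]["rate_per_100_fte"]), start=1)}
    return {n: (count_rank[n] + rate_rank[n]) / 2 for n in names}

pi = prevention_index(industries)
# Restaurants: count rank 1, rate rank 2 -> PI 1.5 (highest priority)
# Construction: count rank 3, rate rank 1 -> PI 2.0
# Grocery:      count rank 2, rate rank 3 -> PI 2.5
```

Averaging the two ranks keeps a high-count, moderate-rate industry (Restaurants here) ahead of a high-rate, low-count one, which is the balance the PI method is designed to strike.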

The overarching goal of this research is to improve OSH surveillance practice in

surveillance evaluation and surveillance methods. Moving research to practice has been a focus

of this research. Being applied in nature, this research incorporated a translational design of

multidirectional processes, knowledge dissemination, and integration 95,96. In Aims 1 and 2,

current OSH surveillance practice and the need for a practical guiding framework were identified

through field experts. The Delphi study results were then disseminated back to potential users of the

framework, i.e., field professionals and practitioners, to collect further suggestions on the

development of a guiding framework. In Aim 3, study results were presented back to the data

provider and to field practitioners in young worker safety and health to identify opportunities for

guiding prevention interventions as well as for improving surveillance data sources.

This research will produce three manuscripts, which are presented in chapters 2 through

4 and have been or will be submitted to scientific journals for publication.



Tables and Figures

Table 1-1 Major sources of data for occupational safety and health surveillance
Type Example data sources
Employer-based · BLS SOII;
· BLS CFOI;
· Data reporting to regulatory agencies by employers;
Hospital/billing based · Birth and death certificates;
· Data from healthcare facilities, such as hospitals and emergency
departments, including inpatient/outpatient discharge data;
· Data from healthcare providers, such as physicians;
· Data from clinical laboratories;
· Poison control centers data;
· Syndromic surveillance system;
· All payer claims data;
· Electronic health records;

Administrative data · Workers' compensation data;


· Registry data such as cancer registry, trauma registry;
· Medical examiners;
· OSHA records;
· Coast Guard records;

Individual-based · Behavioral Risk Factor Surveillance System (BRFSS);


· Youth Risk Behavior Surveillance system;
· NHIS;

Other data sources · Newspapers;


· Social media;
Source: National Academies of Sciences, Engineering, and Medicine. A Smarter National
Surveillance System for Occupational Safety and Health in the 21st Century. Washington, DC: The
National Academies Press; 2018.

Table 1-2 Summary information of identified guidelines and frameworks


Item Summary Results
Region of authors US: 12 (35%)
WHO: 7 (21%)
EU: 3 (9%)
Others: 12 (35%)

Most frequently cited Injury surveillance guidelines (WHO & CDC, 2004), N=601;
Updated guidelines for evaluating public health surveillance systems (CDC,
2001), N=574;
Guidelines for evaluating surveillance systems (CDC, 1988), N=503

Methods for developing the guidelines/frameworks
Based on CDC guidelines: 6 (18%)
Explicitly stated literature review as a method: 9 (26%)
Delphi study: 4 (12%)

Type of surveillance All types: 11 (32%)


Animal health: 8 (24%)
Communicable diseases: 4 (12%)
Early detection/Syndromic: 3 (9%)
Injury: 3 (9%)
Occupational diseases: 1 (3%)

Level of surveillance National/State/Local: 27 (80%)


National: 5 (20%)
Developed as scoring system
Yes: 7 (21%)
No: 27 (79%)

Evaluation steps Include discussion on evaluation steps: 16 (47%)


Most guidelines and frameworks organize evaluations into 3-6 steps based on
temporal sequence from preparing and implementing the evaluation to
follow-up of evaluation recommendations.
Some organize the evaluation based on system functions/components, such as
system processing, outbreak detection, data quality.

Table 1-3 Attributes abstracted from existing guidelines and frameworks


Attribute Similar terms Frequency
Timeliness Speed, Response time, 27
Sensitivity 26
Representativeness Disaggregation, Vulnerable population 26
Flexibility Resilience, Adjustment, Dynamic 25
Acceptability 24
Simplicity Ease of use 20
Cost 20
Usefulness 19
Predictive Value Positive (PVP) 18
Data quality 17
Stability Operationality, Functionality 16
Completeness 13
Specificity 11
Effectiveness 10
Accuracy Precision, Correctness 10
Cost-effectiveness 9
Availability 8
Reliability 8
Coverage 6
Utility 5
Impact 5
Validity 5
Accessibility 5
Sustainability 4
Portability 4
Interoperability Comparability 4
Predictive Value Negative (PVN) 4
Repeatability 4
Robustness 4
Relevance Pertinence 4
Consistency Coherence 4
Attributes appearing in three or fewer guidelines/frameworks:
Bias, Integration, Feasibility, Confidentiality, Security, Efficiency, Compliance, Usability,
Benefit, Clarity, Adaptability, Mutual understanding, Generalizability, Transparency,
Suitability, Trusted, Measurability, Responsiveness, Coherence,
Interpretability

Table 1-4 Abstracted components in existing guidelines and frameworks

Component Similar terms Frequency


System organization Logistics, Operation, Structure, Design, 49
Activities, Active surveillance, Passive
surveillance
Surveillance objectives Goals, Purposes 45
Stakeholders 31
Data collection Data entry 30
Data analysis Data manipulation 30
Data providers Data producers 27
Resources 26
Management and supervision 25
Communication & coordination Networking, External communication, 20
Internal communication
Standards and guidelines Quality standards, Ethical standards, 18
Uniform classification systems
Data users System users 16
Event under surveillance Event of priority, Diseases, Conditions, 15
Cases, Unit of interest
Case reporting Notification, 14
Dissemination Data release 14
Notification/alert 13
Personnel/staff hours 13
Data management Data storage 12
Care providers Physician 11
Data sources 10
Population under surveillance Target population, Geographical coverage 10
Case/outbreak investigation Confirmation, Ascertainment 10
Use of data 10
Response and control Early warning, Epidemic preparedness 10
Technology Networks, Platform, Database structure, 10
Informatics
Laboratory 9
Surveillance monitoring and evaluation 9
Staff Surveillance planners & implementers, 8
Surveillance strategy Surveillance procedure/methodology 8
Public health importance 8
Materials Supplies, Equipment 8
Legislation and regulation 8
Training 8
Case/outbreak detection 7

Data interpretation Interpretation of analytical results, 7
Surveillance products Publications, Presentations, Posters, Media, 7
News bulletin
Funding/budget 7
Integration/standardization 7
Quality control Data quality monitoring, 7
Community 6
Government agencies/officials 6
Case definition 6
Diagnosis & screening 6
System performance indicators 6
Sampling 5
Feedback Data circulation 5
Surveillance process 5
Protocol 5
Data confidentiality Privacy, System security 5
Professionals & professional associations 4
Case registration Case management 4
Surveillance cost 4
Components appearing in three or fewer guidelines/frameworks:
Practitioners, Scope of surveillance, Data processing, Authority, Financiers, Donors, Organizations,
Species/breed, Commercial system developers, Vectors, Infrastructure, Decision making, Data flow,
Changes in policy and planning, Current situation, Public health action

Figure 1-1 Frequency counts of identified guidelines and frameworks by time period and type of surveillance

References

1. Takala J, Hämäläinen P, Saarela KL, et al. Global Estimates of the Burden of Injury and
Illness at Work in 2012. J Occup Environ Hyg. 2014;11(5):326-337.
doi:10.1080/15459624.2013.863131

2. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm

3. National Safety Council. Work Injury Costs. Injury Facts. Accessed May 14, 2020.
https://injuryfacts.nsc.org/work/costs/work-injury-costs/

4. United Nations. Sustainable Development Goals. Accessed May 1, 2020.


https://sustainabledevelopment.un.org/?menu=1300

5. Centers for Disease Control and Prevention. Injury Prevention and Control. the Centers for
Disease Control and Prevention. Published 2014. Accessed April 1, 2018.
https://www.cdc.gov/injury/about/approach.html

6. Centers for Disease Control and Prevention. How We Prevent Chronic Diseases and
Promote Health. Published July 31, 2019. Accessed March 31, 2020.
https://www.cdc.gov/chronicdisease/center/nccdphp/how.htm

7. National Institute for Occupational Safety and Health. Tracking Occupational Injuries,
Illnesses, and Hazards: The NIOSH Surveillance Strategic Plan. U.S. Department of
Health and Human Services, Public Health Service, Centers for Disease Control and
Prevention, National Institute for Occupational Safety and Health; 2001.

8. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.

9. National Academies of Sciences, Engineering, and Medicine. A Smarter National


Surveillance System for Occupational Safety and Health in the 21st Century. the National
Academies Press; 2018. Accessed January 20, 2018.
http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=24835

10. Levy BS, Wegman D, Baron S, Sokas R. Occupational and Environmental Health. Vol 1.


Oxford University Press; 2017. doi:10.1093/oso/9780190662677.001.0001

11. Koh D, Aw T-C. Surveillance in Occupational Health. Occup Environ Med.


2003;60(9):705-710. doi:10.1136/oem.60.9.705

12. National Research Council. Counting Injuries and Illnesses in the Workplace: Proposals for a Better
System. The National Academies Press; 1987. doi:10.17226/18911

13. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811

14. National Institute for Occupational Safety and Health. Worker Health Surveillance - Our
Current Surveillance Initiatives. Published 2019. Accessed May 15, 2020.
https://www.cdc.gov/niosh/topics/surveillance/data.html

15. National Institute for Occupational Safety and Health. State Surveillance Program.
Published February 6, 2020. Accessed March 18, 2020.
https://www.cdc.gov/niosh/oep/statesurv.html

16. Occupational Safety and Health Administration (OSHA). OSHA - Information System
(OIS). Accessed April 26, 2020. https://www.dol.gov/agencies/oasam/centers-
offices/ocio/privacy/osha/ois

17. CSTE. 2013 National Assessment of Epidemiology Capacity: Findings &


Recommendations. Council of State and Territorial Epidemiologiests (CSTE); 2014.

18. Arrazola J. Assessment of Epidemiology Capacity in State Health Departments — United


States, 2017. Morb Mortal Wkly Rep. 2018;67. doi:10.15585/mmwr.mm6733a5

19. Stout N, Bell C. Effectiveness of source documents for identifying fatal occupational
injuries: a synthesis of studies. Am J Public Health. 1991;81(6):725-728.
doi:10.2105/AJPH.81.6.725

20. Leigh JP, Du J, McCurdy SA. An estimate of the U.S. government’s undercount of
nonfatal occupational injuries and illnesses in agriculture. Ann Epidemiol. 2014;24(4):254-
259. doi:10.1016/j.annepidem.2014.01.006

21. Pransky G, Snyder T, Dembe A, Himmelstein J. Under-reporting of work-


related disorders in the workplace: a case study and review of the literature. Ergonomics.
1999;42(1):171-182. doi:10.1080/001401399185874

22. Leigh JP, Marcin JP, Miller TR. An Estimate of the U.S. Government’s Undercount of
Nonfatal Occupational Injuries. J Occup Environ Med. 2004;46(1):10–18.
doi:10.1097/01.jom.0000105909.66435.53

23. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63

24. Ruser JW. Allegations of Undercounting in the BLS Survey of Occupational Injuries and
Illnesses. In: JSM Proceedings, Statistical Computing Section. American Statistical
Association; 2010:15.

25. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.

26. Kica J, Rosenman KD. Multi-source surveillance for work-related crushing injuries. Am J
Ind Med. 2018;61(2):148-156. doi:10.1002/ajim.22800

27. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335

28. Davis LK, Grattan KM, Tak S, Bullock LF, Ozonoff A, Boden LI. Use of multiple data
sources for surveillance of work-related amputations in Massachusetts, comparison with
official estimates and implications for national surveillance. Am J Ind Med.
2014;57(10):1120–1132.

29. Borjan M, Lumia M. Evaluation of a state based syndromic surveillance system for the
classification and capture of non-fatal occupational injuries and illnesses in New Jersey.
Am J Ind Med. 2017;60(7):621-626. doi:10.1002/ajim.22734

30. Sears JM, Bowman SM, Hogg-Johnson S, Shorter ZA. Occupational Injury Trends
Derived From Trauma Registry and Hospital Discharge Records Lessons for Surveillance
and Research. J Occup Environ Med. 2014;56(10):1067-1073.
doi:10.1097/JOM.0000000000000225

31. Pefoyo AJK, Genesove L, Moore K, Del Bianco A, Kramer D. Exploring the Usefulness
of Occupational Exposure Registries for Surveillance The Case of the Ontario Asbestos
Workers Registry (1986-2012). J Occup Environ Med. 2014;56(10):1100-1110.
doi:10.1097/JOM.0000000000000235

32. Lofgren DJ, Reeb-Whitaker CK, Adams D. Surveillance of Washington OSHA Exposure
Data to Identify Uncharacterized or Emerging Occupational Health Hazards. J Occup
Environ Hyg. 2010;7(7):375-388. doi:10.1080/15459621003781207

33. Lavoue J, Friesen MC, Burstyn I. Workplace Measurements by the US Occupational


Safety and Health Administration since 1979: Descriptive Analysis and Potential Uses for
Exposure Assessment. Ann Occup Hyg. 2013;57(1):77-97. doi:10.1093/annhyg/mes055

34. Sarazin P, Burstyn I, Kincl L, Lavoué J. Trends in OSHA Compliance Monitoring Data
1979–2011: Statistical Modeling of Ancillary Information across 77 Chemicals. Ann
Occup Hyg. 2016;60(4):432-452. doi:10.1093/annhyg/mev092

35. Graber JM, Smith AE. Results from a state-based surveillance system for carbon
monoxide poisoning. PUBLIC Health Rep. 2007;122(2):145-154.
doi:10.1177/003335490712200203

36. Harduar Morano L, Watkins S. Evaluation of Diagnostic Codes in Morbidity and


Mortality Data Sources for Heat-Related Illness Surveillance. Public Health Rep Wash DC
1974. 2017;132(3):326-335. doi:10.1177/0033354917699826

37. Harber P, Ha J, Roach M. Arizona Hospital Discharge and Emergency Department


Database Implications for Occupational Health Surveillance. J Occup Environ Med.
2017;59(4):417-423. doi:10.1097/JOM.0000000000000971

38. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.

39. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460

40. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Env Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743

41. Tarawneh I, Lampl M, Robins D, et al. Workers’ Compensation Claims for


Musculoskeletal Disorders Among Wholesale and Retail Trade Industry Workers — Ohio,
2005–2009. MMWR Morb Mortal Wkly Rep. 2013;62(22):437-442.

42. Council of State and Territorial Epidemiologists. Occupational Health Indicators: A


Guide for Tracking Occupational Health Conditions and Their Determinants (Update
2019). Council of State and Territorial Epidemiologists (CSTE) in collaboration with the
NIOSH; 2020.
https://cdn.ymaws.com/www.cste.org/resource/resmgr/publications/OHI_Guidance_Manu
al_FINAL-2.pdf

43. McCall P, Horwitz B. Workplace Violence in Oregon: An Analysis Using


Workers’ Compensation Claims from 1990–1997. J Occup Environ Med. 2004;46(4):357–
366. doi:10.1097/01.jom.0000121131.34757.ed

44. Mallon M, Cherry E. Investigating the Relationship Between Worker Demographics and
Nature of Injury on Federal Department of Defense Workersʼ Compensation Injury Rates
and Costs From 2000 to 2008. J Occup Environ Med. 2015;57 Suppl 3S:S27–S30.
doi:10.1097/JOM.0000000000000416

45. Ramaswamy SK, Mosher GA. Using workers’ compensation claims data to characterize
occupational injuries in the biofuels industry. Saf Sci. 2018;103:352-360.
doi:10.1016/j.ssci.2017.12.014

46. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.

47. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819

48. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. National Institute for Occupational Safety and Health; 2014.
Accessed February 19, 2019. http://elcosh.org/document/3752/d001287/workers%3F-
compensation-insurance%3A-a-primer-for-public-health.html

49. Thacker SB, Qualters JR, Lee LM. Public Health Surveillance in the United States:
Evolution and Challenges. Morb Mortal Wkly Rep. 2012;61(3):3-9.

50. Hall HI, Correa A, Yoon PW, Braden CR. Lexicon, Definitions, and Conceptual
Framework for Public Health Surveillance. 2012;61:5.

51. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:http://dx.doi.org/10.6064/2012/875253

52. Allaki FE, Bigras-Poulin M, Michel P, Ravel A. A Population Health Surveillance Theory.
Epidemiol Health. 2012;34:e2012007. doi:10.4178/epih/e2012007

53. Centers for Disease Control and Prevention. Guidelines for evaluating surveillance
systems. Morb Mortal Wkly Rep MMWR. 1988;37(No. S-5).

54. Langmuir AD. The surveillance of communicable diseases of national importance. N Engl
J Med. 1963;268(4):182-192.

55. McNabb SJ, Chungong S, Ryan M, et al. Conceptual framework of public health
surveillance and action and its application in health sector reform. BMC Public Health.
2002;2:2. doi:10.1186/1471-2458-2-2

56. World Health Organization. Communicable Disease Surveillance and Response Systems:
Guide to Monitoring and Evaluating. World Health Organization; 2006.
http://apps.who.int/iris/bitstream/10665/69331/1/WHO_CDS_EPR_LYO_2006_2_eng.pdf

57. World Health Organization, Department of Communicable Disease Surveillance and
Response. Protocol for the Assessment of National Communicable Disease Surveillance
and Response Systems: Guidelines for Assessment Teams. World Health Organization;
2001. Accessed January 16, 2019.
https://www.who.int/csr/resources/publications/surveillance/WHO_CDS_CSR_ISR_2001_2_EN/en/

58. Xiong W, Lv J, Li L. A survey of core and support activities of communicable disease
surveillance systems at operating-level CDCs in China. BMC Public Health. 2010;10:704.
doi:10.1186/1471-2458-10-704

59. Muellner P, Watts J, Bingham P, et al. SurF: an innovative framework in biosecurity and
animal health surveillance evaluation. Transbound Emerg Dis. 2018;0(0):1-8.
doi:10.1111/tbed.12898

60. Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the
integrated evaluation of animal health surveillance systems. Prev Vet Med.
2019;173:104777. doi:10.1016/j.prevetmed.2019.104777

61. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11(744):1-15. doi:10.1186/1471-2458-11-744

62. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf

63. Centers for Disease Control and Prevention, Program Performance and Evaluation Office.
Framework for program evaluation in public health. Morb Mortal Wkly Rep.
1999;48(No.RR-11):1-40.

64. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5

65. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Stärk KDC. Evaluation of animal and public
health surveillance systems: a systematic review. Epidemiol Infect. 2012;140(4):575-590.
doi:10.1017/S0950268811002160

66. Allaki FE, Bigras-Poulin M, Ravel A. Conceptual evaluation of population health
surveillance programs: method and example. Prev Vet Med. 2013;108(4):241-252.

67. Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries
for occupational diseases: international development and validation of an audit tool
(ODIT). BMC Health Serv Res. 2009;9:194. doi:10.1186/1472-6963-9-194

68. Holder Y, Peden M, Krug E, et al., eds. Injury Surveillance Guidelines. World Health
Organization; 2001.

69. Adamson PC, Tafuma TA, Davis SM, Xaba S, Herman-Roloff A. A systems-based
assessment of the PrePex device adverse events active surveillance system in Zimbabwe.
PLOS ONE. 2017;12(12):e0190055. doi:10.1371/journal.pone.0190055

70. Jefferson H, Dupuy B, Chaudet H, et al. Evaluation of a syndromic surveillance for the
early detection of outbreaks among military personnel in a tropical country. J Public
Health Oxf Engl. 2008;30(4):375-383. doi:10.1093/pubmed/fdn026

71. Jhung MA, Budnitz DS, Mendelsohn AB, Weidenbach KN, Nelson TD, Pollock DA.
Evaluation and overview of the National Electronic Injury Surveillance System-
Cooperative Adverse Drug Event Surveillance Project (NEISS-CADES). Med Care.
2007;45(10 Suppl 2):S96-102. doi:10.1097/MLR.0b013e318041f737

72. Joseph A, Patrick N, Lawrence N, Lilian O, Olufemi A. Evaluation of Malaria
Surveillance System in Ebonyi state, Nigeria, 2014. Ann Med Health Sci Res. Published
online 2017. Accessed December 30, 2017. https://www.amhsr.org/abstract/evaluation-of-
malaria-surveillance-system-in-ebonyi-state-nigeria-2014-3901.html

73. Kaburi BB, Kubio C, Kenu E, et al. Evaluation of the enhanced meningitis surveillance
system, Yendi municipality, northern Ghana, 2010–2015. BMC Infect Dis. 2017;17:306.
doi:10.1186/s12879-017-2410-0

74. Liu X, Li L, Cui H, Jackson VW. Evaluation of an emergency department-based injury
surveillance project in China using WHO guidelines. Inj Prev. 2009;15(2):105-110.

75. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal
agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

76. Pinell-McNamara VA, Acosta AM, Pedreira MC, et al. Expanding Pertussis Epidemiology
in 6 Latin America Countries through the Latin American Pertussis Project. Emerg Infect
Dis. 2017;23(Suppl 1):S94-S100. doi:10.3201/eid2313.170457

77. Thomas MJ, Yoon PW, Collins JM, Davidson AJ, Mac Kenzie WR. Evaluation of
Syndromic Surveillance Systems in 6 US State and Local Health Departments. J Public
Health Manag Pract. Published online September 2017:1.
doi:10.1097/PHH.0000000000000679

78. Velasco-Mondragón HE, Martin J, Chacón-Sosa F. Technology evaluation of a USA-
Mexico health information system for epidemiological surveillance of Mexican migrant
workers. Rev Panam Salud Pública. 2000;7:185-192. doi:10.1590/S1020-
49892000000300008

79. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems.
The Public Policy Center, University of Nebraska; 2016.

80. Centers for Disease Control and Prevention. Evaluating an NCD-Related Surveillance
System. Centers for Disease Control and Prevention; 2013.

81. Centers for Disease Control and Prevention. Framework for evaluating public health
surveillance systems for early detection of outbreaks: recommendations from the CDC
Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.

82. Salman MD, Stärk KDC, Zepeda C. Quality assurance applied to animal disease
surveillance systems. Rev Sci Tech Int Off Epizoot. 2003;22(2):689-696.

83. Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating
military surveillance systems for early detection of outbreaks on duty areas. BMC Public
Health. 2008;8(1):146.

84. Hendrikx P, Gay E, Chazel M, et al. OASIS: an assessment tool of epidemiological
surveillance systems in animal health and food safety. Epidemiol Infect.
2011;139(10):1486-1496.

85. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html

86. Hagemeyer A, Azofeifa A, Stroup DF, Tomedi LE. Evaluating Surveillance for Excessive
Alcohol Use in New Mexico. Prev Chronic Dis. 2018;15. doi:10.5888/pcd15.180358

87. Austin C. An evaluation of the census of fatal occupational injuries as a system for
surveillance. Compens Work Cond. 1995;1:51–54.

88. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal
agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

89. Garland R. Evaluation of the Oregon Asthma Surveillance System. Oregon Department of
Human Services, Public Health Division, Office of Disease Prevention and Epidemiology,
Health Promotion and Chronic Disease Prevention, Oregon Asthma Program; 2009.

90. Samoff E, MacDonald PDM, Fangman MT, Waller AE. Local Surveillance Practice
Evaluation in North Carolina and Value of New National Accreditation Measures. J
Public Health Manag Pract. 2013;19(2):146–152. doi:10.1097/PHH.0b013e318252ee21

91. Erondu N. Evaluating communicable disease surveillance in resource-poor settings: A new
approach applied to meningitis surveillance in Chad. Published online 2016.

92. Dufour B. Technical and economic evaluation method for use in improving infectious
animal disease surveillance networks. Vet Res. 1999;30(1):27–37.

93. World Health Organization. Assessing the National Health Information System: An
Assessment Tool (v4.0). World Health Organization; 2008.
http://apps.who.int/iris/bitstream/10665/43932/1/9789241547512_eng.pdf?ua=1

94. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160

95. Zoellner JM, Porter KJ. Translational Research: Concepts and Methods in Dissemination
and Implementation Research. In: Coulston AM, Boushey CJ, Ferruzzi MG, Delahanty
LM, eds. Nutrition in the Prevention and Treatment of Disease (Fourth Edition).
Academic Press; 2017:125-143. doi:10.1016/B978-0-12-802928-2.00006-0

96. National Institute for Occupational Safety and Health. Research to Practice (r2p).
Published October 19, 2018. Accessed May 15, 2020.
https://www.cdc.gov/niosh/r2p/default.html

CHAPTER 2 Understanding Occupational Safety and Health Surveillance:

Expert Consensus on Components, Attributes and Example Measures for an

Evaluation Framework

Liu Yang, MS, MPH a

Adam Branscum, PhD a

Laurel Kincl, PhD a

a College of Public Health and Human Sciences, Oregon State University, 123 Women’s
Building, Corvallis, Oregon, 97331, USA

Abstract

Background: Periodic evaluation is critical to the improvement of an occupational safety and

health (OSH) surveillance system. No tailored guidance exists for evaluating OSH surveillance

systems to date. The study aimed to understand current OSH surveillance in the US and propose

elements (components, attributes and measures) for an evaluation framework.

Methods: A modified Delphi study with three survey rounds invited experts to rate and

comment on OSH surveillance evaluation framework elements. An initial list of elements was

compiled from literature. The expert panel rated and commented on the elements resulting in a

final list through the panel’s consensus. Additionally, experts completed a brief review of OSH

surveillance systems they had worked with and answered questions regarding the development of

an evaluation framework. A mixed methods analysis compiled descriptive statistics of the ratings

and identified major themes from comments.

Results: Fifty-four people across the US were contacted to participate. Ten experts began the

first survey round, with eight and then seven experts continuing in the two subsequent

rounds. A total of 64 surveillance components, 31 attributes, and 116 example measures

were selected into the final list through panel consensus, with 134 (62.3%) reaching high

consensus. Major themes regarding current OSH surveillance focused on resources and

feasibility, data collection, flexibility, and the inter-relatedness among elements.

Conclusions: A Delphi process successfully identified tailored OSH surveillance elements and

major themes regarding OSH surveillance in the US. A framework developed from these results

can serve as a standard yet flexible guide for evaluating OSH surveillance systems.

Introduction

Despite declining trends, millions of workers in the United States (US) suffer injuries

and illnesses on the job each year. In 2018, there were 2.8 million nonfatal occupational injuries

and illnesses and more than 5000 fatal occupational injuries according to the US Bureau of

Labor Statistics 1. Important data on work-related fatalities, injuries and illnesses as well as the

presence of workplace hazards have been systematically collected by established occupational

safety and health (OSH) surveillance systems to inform prevention efforts. There has been a

growing interest in improving OSH surveillance capacity and quality for more timely, accurate and

complete reporting of work-related conditions 2,3. Periodic evaluation is critical to the

implementation and continuous improvement of a surveillance system 4,5.

Surveillance evaluation can be challenging, beginning with the selection of attributes (i.e.

characteristics of the surveillance system that need to be evaluated) and measures (i.e.

evaluation questions or indicators). It has been reported that often only one or two attributes

were measured in existing surveillance evaluations and comprehensive evaluation was uncommon 6.

Evaluations can be open to interpretation and not comparable due partly to the lack of consistent

operational definitions and detailed guidance corresponding to the different types of surveillance.

Despite an increasing number of guidelines and frameworks intended to standardize

terminologies and guide surveillance evaluations, most were geared towards communicable

disease surveillance 5,7,8, and were critiqued as providing only generic guidance 9,10. A published

systematic literature review on 15 guiding approaches for public health and animal health

surveillance concluded that existing guidelines were insufficient in terms of providing a

comprehensive list of attributes and evaluation questions, as well as practical guidance on

selecting and assessing these attributes 11.



It can be particularly challenging to evaluate OSH surveillance systems and no tailored

guidelines have been developed specifically for these systems. The development of surveillance

systems for occupational safety and health conditions in the US has historically lagged behind

other health conditions and is still facing ongoing challenges including organizational capacity,

data sources and resources 2,3,12–14. A tailored framework that takes into consideration

characteristics of OSH surveillance systems can guide comparable and replicable evaluations and

thus contribute to the continuous development and improvement of OSH surveillance. In our

model, the framework should incorporate structural components (i.e., main functions and

activities) relevant to OSH surveillance systems in the US as well as a comprehensive list of

attributes corresponding to these components. In addition, a list of accessible and appropriate

measures adds to the practicality of the framework. To this end, a Delphi survey study was

designed and conducted to solicit opinions and suggestions from a panel of experts to: 1) develop

a comprehensive list of framework elements including surveillance components, attributes and

example measures; and 2) understand the current status of OSH surveillance in the US. It is

expected that findings of this study can improve understanding of OSH surveillance and inform

the development of a tailored evaluation framework.

Methods

The Delphi technique is a commonly used consensus method in a wide variety of

research fields, including guideline development in the health sciences 15–17. It particularly suits

research questions where value judgements are required on a collective basis 18–20. Compared with

face-to-face interactions, advantages of the Delphi technique include anonymity, the

avoidance of dominating views, structured feedback for valid consensus, and quantitative group

responses 17,20–22. Electronic communication methods allow for more efficient organization and

timely responses in the Delphi process to reduce time, geographical and economic restrictions 17.

This was particularly suitable for the purpose of this study.

A modified Delphi study with three internet-based survey rounds and a background

survey was designed following the recommendations from common Delphi theoretical

frameworks and practices in healthcare and related fields 16,17,19–27. Experts across the US

were invited to rate and comment on framework elements as well as to review their respective

OSH surveillance systems.

The study obtained ethics review approval by the Institutional Review Board at Oregon

State University (Review#: 8797). All participants gave informed consent.

Framework Elements

Common surveillance components, attributes, and example measures were compiled with

a thorough review of existing guidelines and frameworks of public health surveillance evaluation

as well as published evaluations for OSH surveillance systems by January 2019 (see Appendix 1-

2 and Appendix 1-3 for a brief introduction of guidelines and frameworks as well as evaluations

of OSH surveillance systems referenced in this study). A total of 59 surveillance components

(including 47 sub-components), 32 attributes and 133 example measures were described and

organized into a logic model that was provided to the expert panel (See Figure 2-1 for the

finalized logic model).

Delphi Panel Composition

The panel included experts with experience in at least one of the following two main

specialty areas: 1) the operation of OSH surveillance systems in the US and, 2) evaluation of

OSH surveillance systems. Potential participants were identified through multiple channels: 1)

websites on national and state level OSH surveillance systems in the US; 2) authors of peer-

reviewed publications of OSH surveillance evaluation; 3) recommendations from professionals

in OSH surveillance; and 4) participants in the Council of State and Territorial Epidemiologists

(CSTE) Occupational Health meeting in December 2018.

Recruitment emails were sent to 54 identified persons from 27 national and state level

OSH surveillance systems in the US, and authors and researchers in OSH surveillance seeking

their participation and/or their recommendation of possible participants. A total of 14 people

expressed interest and were invited into the first Delphi round. Six criteria (Table 2-1) were

evaluated for each potential panelist. A panelist had to satisfy at least three criteria to qualify as

an “expert”.

Survey Rounds

Three rounds of internet-based surveys were administered from March 2019 to August

2019. In each round, an invitation email was sent to panelists who had completed the previous

round, followed by two reminder emails. The first two rounds were to review 1) surveillance

components and attributes, and 2) example measures, respectively. Panelists rated these

framework elements on a 5-point Likert scale based on their relevance to OSH surveillance in

the US and its evaluation, with 1 indicating the least relevant and 5 the most relevant. They also

provided justifications, suggested revisions, additional elements, and other relevant thoughts.

Statistics of the ratings and qualitative summaries for each element along with the panelist’s own

rating in the first review were sent back to each panelist for a second review in the next round,

where the panelist had the opportunity to reconsider his/her ratings for elements that did not have

a high panel consensus. In the third round, the panelists gave a brief review of the OSH

surveillance systems they work with. Using selected attributes from the previous two rounds, the

panelists scored their system’s performance on a scale from 0 (worst performance) to 10 (best

performance) (based on their best understanding without collecting actual evaluation evidence).

Questions regarding the development of the framework were included in each round. A

background survey to collect panelists’ educational and professional experience was completed

between the first and the second rounds.

Data Analysis and Selection of Elements

For each element, quantitative statistics reflecting panel opinions were calculated,

including mean rating, difference between the highest and lowest ratings, mode (the most

frequent rating), SD (standard deviation of ratings), and the percent of experts who rated 3 or

higher. The panelist’s comments were taken into consideration to edit the element. The following

predetermined consensus criteria were used to select elements:

1) High consensus: mean rating and mode ≥ 4, 80% or more panelists rated 3 or higher,

difference of panel ratings ≤ 2 (e.g. range from 3 to 5), and no significant edits based on

panelists’ comments;

2) Low consensus/Selected for further confirmation: mean rating and mode ≥ 3, 60% or

more panelists rated 3 or higher, but do not meet criterion 1);

3) Dropped: mean rating < 3, or mode < 3, or less than 60% panelists rated the element 3

or higher.
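The selection rules above are mechanical enough to express in code. The following is a minimal illustrative sketch in Python (the study's analyses were conducted in R; the function name, the ratings input, and the `significant_edits` flag are assumptions for illustration, not the authors' implementation):

```python
import statistics

def classify_element(ratings, significant_edits=False):
    """Classify a Delphi element using the predetermined consensus criteria.

    ratings: list of 1-5 Likert ratings from the expert panel.
    significant_edits: hypothetical flag for whether panelist comments
    forced substantive edits (assessed qualitatively in the study).
    """
    mean = statistics.mean(ratings)
    mode = statistics.mode(ratings)                      # most frequent rating
    spread = max(ratings) - min(ratings)                 # highest minus lowest
    pct_3_plus = sum(r >= 3 for r in ratings) / len(ratings)

    # Criterion 1: high consensus.
    if (mean >= 4 and mode >= 4 and pct_3_plus >= 0.8
            and spread <= 2 and not significant_edits):
        return "high consensus"
    # Criterion 2: low consensus, selected for further confirmation.
    if mean >= 3 and mode >= 3 and pct_3_plus >= 0.6:
        return "low consensus"
    # Criterion 3: dropped.
    return "dropped"
```

Applying the function to a set of hypothetical panel ratings, e.g. `classify_element([5, 5, 4, 4, 5, 4, 4, 5])`, returns the criterion label for that element.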

All other questions were analyzed using descriptive statistics including mean and

percentage. Content analysis was performed for all comments and verbal answers to identify

major themes regarding the current status of OSH surveillance systems and their evaluation. Data

were initially processed using Microsoft Excel. Quantitative analyses were conducted using R

(version 3.6.1). Qualitative analysis was conducted using NVivo 12.

Results

Delphi Panel Participation and Composition

Of the 14 people who expressed interest, 10 (71.4%) completed the first-round survey and

were included in the Delphi panel. Eight out of the ten panelists (80%) completed the second

round and seven (70%) completed the third round. The initial 10 panelists were geographically

distributed across the US, representing 12 different states covering all five regions in the US. Of

the 10 participants, seven (70%) worked for state health departments and three (30%) were from

an academic setting. The panelists’ experience working with OSH surveillance systems in the US

ranged from 2 to 5 years, with a mean of 4.3 years. Eight of the 10 panelists had experience with

OSH surveillance evaluation. All panelists held a master’s degree or higher and had peer-

reviewed publications and/or conference presentations relevant to OSH surveillance. All panel

members were qualified to be experts based on the preset criteria (Table 2-1).

Consensus and Opinion Convergence

The number of elements selected and consensus statistics from the Delphi rounds are

presented in Table 2-2. The process resulted in 64 components (including 50 sub-components),

31 attributes, and 116 example measures in a final list, with 134 (62.3%) reaching high

consensus. In general, components and attributes received higher mean rating and mode as well

as smaller difference and SD than example measures. Compared to elements with low consensus,

elements that received high consensus tended to have much higher mean and mode, and much

smaller difference and SD, with more than 98% of panel experts rating them as ≥ 3.

Changes in statistics across rounds were further examined to investigate opinion

convergence and the panel agreement. The SD and difference of elements rated in the second

review decreased (Wilcoxon signed-rank test P-values < 0.05), possibly indicating increased

opinion convergence. More elements had small difference in the final list compared to the initial

list (data not shown). For example, the percentage of measures with a difference of 2 or below

increased from 66.9% to 92.2% in the second review.
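The paired comparison of dispersion statistics between reviews can be sketched as a Wilcoxon signed-rank test. The following is an illustrative Python re-implementation under a normal approximation with hypothetical data (the study's analyses were conducted in R, and the values below are not the study's data):

```python
import math
from statistics import NormalDist

def wilcoxon_signed_rank(x, y):
    """Paired Wilcoxon signed-rank test, normal approximation.

    Zero differences are dropped and tied absolute differences receive
    midranks. Returns (W+, one-sided p for 'x tends to exceed y').
    """
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        midrank = (i + j) / 2 + 1          # average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = midrank
        i = j + 1
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    mu = n * (n + 1) / 4                   # mean of W+ under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    return w_plus, 1 - NormalDist().cdf(z)

# Hypothetical per-element rating spreads (max minus min) in the first
# and second reviews; a small one-sided p suggests opinion convergence.
spread_review1 = [4, 3, 4, 3, 4, 3, 3, 2]
spread_review2 = [2, 2, 2, 2, 2, 2, 2, 2]
w, p = wilcoxon_signed_rank(spread_review1, spread_review2)
```

For datasets of this size an exact-distribution test (as in R's `wilcox.test`) would normally be preferred; the normal approximation keeps the sketch short.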

An OSH surveillance logic model was finalized to include components and attributes that

were selected after two panel reviews (Figure 2-1). A complete list of all selected elements as

well as their finalized description can be found in Appendix 3-2 to Appendix 3-4.

Review of OSH Surveillance Systems in the US

Six state-level and one national OSH surveillance systems/programs in the US were

evaluated by panelists in the last Delphi round. The majority of the systems were supported by

national funds (86%). Nearly half of the systems (43%) conducted only secondary data analysis.

More than 57% of the systems were never or rarely evaluated and no system was evaluated

frequently. For those that had been evaluated, common purposes were to assess implementation

(80%) and performance (60%). Needs assessment, cost-effectiveness and data quality assessment

were rarely or never conducted. Seven attributes were scored high (>8.0) in these systems,

among which, compliance and confidentiality received the highest score (9.0), followed by

significance, relevance, adherence, accuracy, and clarity. Three attributes received the lowest

score: legislative support (5.9), timeliness (5.7), and integration (3.9). Commonly perceived

weaknesses related to high quality data sources, the capacity to conduct various surveillance

activities, dissemination, and creating outcomes and impacts. Major constraints included lack of

resources, legislative support, and recognition and integration into the general public health

domain.

Major Themes Identified

Content analysis on the ample commentary from panel experts over the three Delphi

rounds revealed the themes described below. Table 2-3 includes the theme categories and most

frequent codes with associated frequency results.

Status quo and issues: Resource constraints and feasibility were common issues noted by

the panel. One expert commented that “the reality is that resources for this type of work [OSH

surveillance] are historically VERY limited.” Funding was reported as unstable. A panelist

wrote that it would be “best if … built into the regular funding cycle.” Technological resources

and legislative support were reported as basic or even lacking. The lack of resources included

comments on the dependence on external data in current OSH surveillance. An expert

commented that “without them [legislation and regulations] one may end up using existing data

not designed for OSH” and another wrote that “most of the data systems in use currently for

OSH surveillance were designed for an entirely separate purpose (i.e. financial tracking and

reimbursement)”. Reported issues with various existing data sources included the lack of

information/variables critical to OSH surveillance (e.g., industry and occupation information),

and data timeliness. A panelist commented, “Often there is a substantial lag time in the

availability of surveillance data for public health actions”. A recurring message from the panel

experts was to have a “realistic” expectation about current OSH surveillance.

Ideal OSH surveillance: Some system requirements may not seem readily feasible given

limited resources and thus belong to “an ideal” OSH surveillance. Experts’ comments helped to

outline an ideal OSH surveillance, in which all necessary infrastructures and resources including

funding, legislation, human resources, technological platform and stakeholders’ collaboration,

were well-integrated to support key surveillance activities from data collection to dissemination

as well as interoperability among data systems. Strategic planning was emphasized in the ideal

OSH surveillance. “Having a well thought out plan…is critical to developing a streamlined

system”. Plans and protocols should cover both existing and potential surveillance activities.

Implications for evaluation and a framework: Nearly all experts held positive

expectations about the development of a tailored framework for OSH surveillance evaluation.

One expert commented that “It would be helpful to have an evaluation framework specific to

OSH surveillance ...” and another noted that “we've been struggling with developing a

systematic evaluation approach, so it will be good to have consensus on components of an

evaluation”. Experts called for flexibility in OSH surveillance evaluation, because “a ‘one-size

fits all’ mentality of program evaluation is often unnecessary”, and an evaluation framework

should “allow for ‘picking and choosing’ of those evaluation components”. Factors such as

differences between systems, changing surveillance objectives and OSH conditions over time,

and non-routine activities affect the evaluation choices. One expert commented that “…

evaluating it [the infrastructure] seems difficult as it constantly changes and differs from state to

state”. Another expert stated, “Case definitions are often lacking for OSH across jurisdictions,

states, areas.” As to assessing the significance of surveillance, one expert wrote that “Priorities

change and the most useful surveillance systems are those that have a historical trove of

information about priority conditions that emerge as ‘significant concerns’ at some point.”

Inter-relatedness among components and attributes: Fundamental and critical

components such as funding, resources, organizational structure, management and training,



standards and guidelines, and strategies were commonly stated as impacting the system’s

performance and stability. For example, experts commented that “Infrastructure is necessary for

an effective and efficient public health surveillance system” and the surveillance strategy

“impacts standardization and consistency over time and among data sources”. Some elements

were regarded as more central in relation to other elements. For example, experts commented

that stakeholders’ acceptance and participation were impacted by the system’s compliance,

transparency, simplicity and usefulness, while they in turn impacted various aspects of the

system’s performance. They stated that “without the willingness, the data is not as good”,

“whether a system will function smoothly, meets the intended purpose and can be sustained also

depends on its acceptability by stakeholders.” Attributes such as data quality, data timeliness and

usefulness were also regarded as central elements.

Discussion

The lack of comparability and replicability in evaluations limits their usefulness 28. We

have reported elsewhere the results of an OSH surveillance evaluation in Oregon and challenges

with the development of an evaluation strategy (manuscript in review). Agreed descriptions

facilitate the understanding of important elements in OSH surveillance and thus contribute to the

improvement of surveillance and its evaluation. This three-round survey study based on the

Delphi technique sought experts’ consensus on framework elements. Experts’ perspectives on

the current status of OSH surveillance systems in the US further informed contextual information

underlying their choices.

Delphi Consensus

The resulting list (Appendix 3-2 to Appendix 3-4) contains elements receiving high and

low level of consensus. Means and modes are common statistics used in the Delphi process to

indicate central tendency of panelists’ ratings, or “average group estimate”, while difference, SD,

and the percent of experts giving a certain rating or selecting a certain category can be used to

measure the divergence of group opinions, or the amount of panel disagreement 21,22,29. This

study leveraged a combination of central tendency and divergence measures to differentiate the

framework elements.

A key to a successful Delphi process is to allow some panelists to reconsider and

potentially change their opinions after referring to the views of their peers 17,21,30. The second

review helped to reach panel opinion convergence for elements that the panel rated low and/or

held more dispersed views for in the first review. After the second review, both the difference

and SD reduced significantly. The change of ratings in this study suggested that the “group

opinions” had guided the panel experts’ reconsideration of their stance. A strength in this Delphi

study is that experts were provided with feedback in both quantitative and qualitative summaries,

as the literature has suggested that qualitative feedback is an effective way to mitigate experts’

propensity to stick to their initial opinions 31,32.

Surveillance Elements

Many elements in the “Inputs” in the logic model were considered fundamental and

critical to OSH surveillance. The need for more relevant OSH surveillance data sources and

stakeholders was also identified by the panel. Regarding surveillance activities, not surprisingly,

data processing, ongoing monitoring and data dissemination and their sub-components (except

case investigation) were considered to be relevant to OSH surveillance. These activities are also

well-accepted components of public health surveillance to date 33.



The study also revealed components and attributes that are less relevant to current OSH surveillance, due partly to limited resources: for example, the component “early detection” and its three sub-components, “provide timely data”, “detect clusters and unusual events”, and “guide immediate actions”. Although guiding immediate action has been proposed for an ideal OSH surveillance system 3, the panel experts considered these activities less relevant to current OSH surveillance because they were “resource intensive”. Among low-consensus attributes, “simplicity” is worth mentioning. In general, a surveillance system should be effective and as simple as possible 4,5,8,34. Interestingly, the panelists held conflicting views on simplicity: some considered OSH surveillance fairly simple, while others noted that surveillance systems can be intricate.

Current OSH surveillance status and implications

This study confirmed that current OSH surveillance was largely focused on secondary

data analyses with multiple external data sources. As the recent report by the National

Academies of Sciences, Engineering, and Medicine described, “there is no single,

comprehensive OSH surveillance system in the US, but rather an evolving set of systems using a

variety of data sources that meet different objectives, each with strengths and weaknesses” 3.

Both the experts’ comments and their reviews of their respective systems highlighted limited resources for OSH surveillance, which is also related to the lack of, or infrequent, evaluation of these systems. Experts reported barriers to evaluation including staff time and budget, as well as difficulty in reaching out to stakeholders. A major implication is the need to consider flexibility and feasibility when deciding evaluation objectives and criteria. Although feasibility is always a consideration, it appeared especially salient for OSH surveillance, as the panel experts pointed out that many criteria were ideal but not readily feasible given limited resources.

The identified network relationships among elements are similar to those reported in an existing study on animal health surveillance systems 35. Further research is needed to comprehensively understand interactions among OSH surveillance components and attributes and to investigate weights for elements based on their interactions.

This study highlighted the need for tailored guidelines for OSH surveillance evaluation.

The panel experts reported struggling to develop a systematic evaluation approach, noting that current guidelines were not designed to accommodate systems for non-communicable diseases or systems, such as OSH surveillance, that rely on multiple data sources. Relevant components,

attributes and measures identified in this study can guide more relevant evaluation of OSH

surveillance systems. The study team is further working on a more detailed evaluation

framework based on study findings to bridge this gap.

Limitations

A few limitations should be recognized in this study. As with other consensus methods, the Delphi technique is subject to cognitive bias in information exchange and decision-making processes related to the participants’ backgrounds and personal traits 36. Although potential participants were identified using multiple information sources, and despite the effort to recruit a panel representing both national and state-level OSH surveillance systems, most panel participants in this study were from state-level surveillance systems.

Participation rates in the second and third rounds were 80% and 70%, respectively. This attrition may affect the quality of the information gathered in this study. Participant fatigue and the associated attrition have been recognized as a particular challenge in Delphi studies given the method’s iterative nature 18,26. However, compared to the 16% third-round participation rate reported for large Delphi studies in the literature 16, the participation rates in this study were relatively high. This study adopted multiple recommendations from the literature to minimize attrition, including the use of endorsed individuals, a modified Delphi process using structured (closed-ended) items in the first round, frequent reminders, and active contacts even after the specified deadlines 26.

Researchers have warned that even the most well-planned Delphi may not yield an exhaustive or all-inclusive set of ideas 25. The finalized list in this study may not cover every aspect of OSH surveillance, given the limited existing references in this field. With continued efforts to promote OSH surveillance and its evaluation, more elements that are relevant and practical to OSH surveillance evaluation may be added to the list.

Conclusion

This paper described the development of a comprehensive list of components, attributes,

and example measures relevant to OSH surveillance systems in the US by conducting a three-

round Delphi survey study with experts across the US who had experience in OSH surveillance

operation and/or surveillance evaluation. The study results suggested that a Delphi process

delivered valuable findings through panel consensus based on both quantitative and qualitative

analyses. Findings from the experts’ reviews of their respective OSH surveillance systems and

their comments provided contextual information to understand the current status of OSH

surveillance and barriers to it.

A framework developed from these results can serve as a flexible yet standardized guide for evaluating OSH surveillance systems. Evaluators can choose components and attributes appropriate to their systems and use the example measures to develop their own evaluation

metrics. Future research and practice are needed to further refine and enrich the list and explore

its applicability.

Tables & figures

Table 2-1 Preset Criteria for Experts in the Delphi Study

No. Criteria Description


1 OSH surveillance More than one year’s experience working as a key staff member in
experience OSH surveillance systems in the U.S.
2 Evaluation experience Experience in evaluating OSH surveillance systems.
3 Education Bachelor’s degree or higher in public health, safety, or related
fields.
4 Publications More than three peer-reviewed publications/book or book
chapter on occupational safety and health topics in the past
five years, among which at least one focusing on OSH
surveillance.
5 Conference presentations More than three national level conference presentations on
occupational safety and health topics in the past five years,
among which at least one focusing on OSH surveillance.
6 Professional involvement Current member or committee member of professional organizations of
occupational safety and health, public health surveillance, or
program evaluation.

Table 2-2 Summary statistics for framework elements (average, range)

Delphi %Panelists
Element Category #Element Mean rating Mode Difference SD
process rated 3 or above
Component After 1st All 59 4.3 (3.4, 4.9) 4.6 (3, 5) 2.0 (1, 3) 0.7 (0.3, 1.2) 95 (80, 100)
review High consensus 34 4.6 (4.0, 4.9) 4.9 (4, 5) 1.6 (1, 2) 0.6 (0.3, 0.9) 99 (90, 100)
(n=10)
For confirmation 22 3.9 (3.4, 4.5) 4.1 (3, 5) 2.6 (1, 3) 0.9 (0.5, 1.2) 90 (80, 100)
Dropped 3 4.1 (3.7, 4.6) 4.3 (4, 5) 2.0 (2, 2) 0.7 (0.7, 0.7) 97 (90, 100)
After 2nd High consensus 45 4.5 (4.0, 4.9) 4.9 (4, 5) 1.6 (1, 2) 0.6 (0.3, 0.9) 98 (80, 100)
review (n=8) Low consensus 19 3.8 (3.5, 4.4) 4.1 (3, 5) 2.7 (2, 3) 0.9 (0.6, 1.2) 88 (60, 100)
Dropped 1 4 5 3 1.3 50
Final list All 64 4.3 (3.5, 4.9) 4.6 (3, 5) 2.0 (1, 3) 0.7 (0.3, 1.2) 95 (60, 90)
Attribute After 1st All 32 4.2 (3.6, 4.8) 4.5 (3, 5) 2.0 (1, 3) 0.8 (0.4, 1.2) 96 (80, 100)
review (n=8) High consensus 24 4.4 (4.0, 4.8) 4.6 (4, 5) 1.7 (1, 2) 0.7 (0.4, 0.9) 99 (90, 100)
For confirmation 8 3.9 (3.6, 4.3) 4.1 (3, 5) 2.8 (2, 3) 1.0 (0.7, 1.2) 89 (80, 100)
Dropped 0 / / / / /
After 2nd High consensus 26 4.4 (4.0, 4.8) 4.6 (4, 5) 1.7 (1, 2) 0.7 (0.4, 0.9) 99 (90, 100)
review (n=7) Low consensus 5 3.7 (3.3, 4.0) 3.6 (3, 5) 2.8 (2, 3) 0.9 (0.7, 1.1) 90 (80, 100)
Dropped 1 4.1 5 2 0.9 90
Final list All 31 4.3 (3.3, 4.8) 4.5 (3, 5) 1.9 (1, 3) 0.7 (0.4, 1.1) 98 (80, 100)
Measure After 1st All 133 3.8 (2.5, 4.8) 4.0 (2, 5) 2.3 (1, 4) 0.9 (0.4, 1.4) 92 (50, 100)
review (n=8) High consensus 55 4.2 (4.0, 4.8) 4.6 (4, 5) 1.9 (1, 2) 0.7 (0.5, 1.0) 99.8 (88, 100)
For confirmation 71 3.6 (3.0, 4.3) 3.6 (3, 5) 2.5 (1, 4) 0.9 (0.4, 1.4) 89 (63, 100)
Dropped 7 3.0 (2.5, 3.8) 2.9 (2, 4) 2.9 (2, 3) 1.1 (0.9, 1.3) 66 (50, 88)
After 2nd High consensus 63 4.2 (4.0, 4.8) 4.6 (4, 5) 1.9 (1, 2) 0.7 (0.5, 1.0) 99.8 (88, 100)
review (n=7) Low consensus 53 3.5 (3.0, 4.1) 3.5 (3, 5) 1.8 (0, 4) 0.7 (0.0, 1.2) 94 (75, 100)
Dropped 10 2.8 (2.6, 3.1) 2.9 (2, 3) 2.5 (2, 3) 0.8 (0.7, 1.1) 69 (63, 88)
Final list All 116 3.9 (3.0, 4.8) 4.1 (3, 5) 1.9 (0, 4) 0.7 (0.0, 1.2) 97 (75, 100)

Table 2-3 Major themes from experts' comments (n=10)


Theme codes Frequency
Status quo and issues
Resources and feasibility 22
Rely on secondary data 10
Authority and data collection 8
Gaining buy-in 8
Ideal OSH surveillance
Data collection 11
Data dissemination 9
Plan and strategy 7
Implications for evaluation
Inter-relatedness among elements 55
Flexibility 25
Importance of framework 10
Stage of evaluation 8
Weights of elements 8

Figure 2-1 Selected components and attributes of OSH surveillance in logic model (Elements marked yellow were selected with low consensus)

Reference

1. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm

2. Landrigan PJ. Improving the surveillance of occupational disease. Am J Public Health.


1989;79(12):1601–1602.

3. National Academies of Sciences, Engineering, and Medicine. A Smarter National


Surveillance System for Occupational Safety and Health in the 21st Century. The National
Academies Press; 2018. Accessed January 20, 2018.
http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=24835

4. Centers for Disease Control and Prevention. Overview of Evaluating Surveillance


Systems.; 2013.

5. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.

6. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160

7. Hoinville LJ, Alban L, Drewe JA, et al. Proposed terms and concepts for describing and
evaluating animal-health surveillance systems. Prev Vet Med. 2013;112(1-2):1-12.
doi:10.1016/j.prevetmed.2013.06.006

8. World Health Organization. Communicable Disease Surveillance and Response Systems:


Guide to Monitoring and Evaluating. World Health Organization; 2006.
http://apps.who.int/iris/bitstream/10665/69331/1/WHO_CDS_EPR_LYO_2006_2_eng.pd
f

9. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11:744. doi:10.1186/1471-2458-11-744

10. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal


agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

11. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5

12. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811

13. Arrazola J. Assessment of Epidemiology Capacity in State Health Departments — United


States, 2017. Morb Mortal Wkly Rep. 2018;67. doi:10.15585/mmwr.mm6733a5

14. CSTE. 2013 National Assessment of Epidemiology Capacity: Findings &


Recommendations. Council of State and Territorial Epidemiologists (CSTE); 2014.

15. Birrell L. Developing evidence-based guidelines in occupational health. Occup Med.


2001;51(2):73-74. doi:10.1093/occmed/51.2.073

16. Foth T, Efstathiou N, Vanderspank-Wright B, et al. The use of Delphi and Nominal Group
Technique in nursing education: A review. Int J Nurs Stud. 2016;60:112-120.
doi:10.1016/j.ijnurstu.2016.04.015

17. Murphy MK, Black NA, Lamping DL, et al. Consensus development methods, and their
use in clinical guideline development. Health Technol Assess. 1998;2(3):1-88.

18. Adler M, Ziglio E. Gazing Into the Oracle: The Delphi Method and Its Application to
Social Policy and Public Health. Jessica Kingsley Publishers; 1996.

19. Linstone HA, Turoff M. The Delphi Method: Techniques and Applications. First Edition.
Addison-Wesley Educational Publishers Inc; 1975.

20. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376-382.
doi:10.1046/j.1365-2648.2003.02537.x

21. Hsu C-C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess
Res Eval. 2007;12(10):1–8.

22. Rowe G, Wright G. The Delphi technique as a forecasting tool: issues and analysis. Int J
Forecast. 1999;15(4):353-375. doi:10.1016/S0169-2070(99)00018-7

23. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and Reporting the Delphi
Method for Selecting Healthcare Quality Indicators: A Systematic Review. Wright JM, ed.
PLoS ONE. 2011;6(6):e20476. doi:10.1371/journal.pone.0020476

24. Habibi A, Sarafrazi A, Izadyar S. Delphi Technique Theoretical Framework in Qualitative


Research. Int J Eng Sci. 2014;3(4):8-13.

25. Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast
Soc Change. 2011;78(9):1695-1704. doi:10.1016/j.techfore.2011.04.005

26. Hsu C-C, Sandford BA. Minimizing Non-Response in The Delphi Process: How to
Respond to Non-Response. Pract Assess Res Eval. 2007;12(17):1-6.

27. Sinha IP, Smyth RL, Williamson PR. Using the Delphi Technique to Determine Which
Outcomes to Measure in Clinical Trials: Recommendations for the Future Based on a
Systematic Review of Existing Studies. PLoS Med. 2011;8(1):e1000393.
doi:10.1371/journal.pmed.1000393

28. El Allaki F, Bigras-Poulin M, Ravel A. Conceptual evaluation of population health


surveillance programs: Method and example. Prev Vet Med. 2013;108(4):241-252.
doi:10.1016/j.prevetmed.2012.11.001

29. Greatorex J, Dexter T. An accessible analytical approach for investigating what happens
between the rounds of a Delphi study. J Adv Nurs. 2000;32(4):1016-1024.

30. Makkonen M, Hujala T, Uusivuori J. Policy experts’ propensity to change their opinion
along Delphi rounds. Technol Forecast Soc Change. 2016;109:61-68.
doi:10.1016/j.techfore.2016.05.020

31. Bolger F, Wright G. Improving the Delphi process: Lessons from social psychological
research. Technol Forecast Soc Change. 2011;78(9):1500-1513.
doi:10.1016/j.techfore.2011.07.007

32. Meijering JV, Tobi H. The effect of controlled opinion feedback on Delphi features:
Mixed messages from a real-world Delphi experiment. Technol Forecast Soc Change.
2016;103:166-173. doi:10.1016/j.techfore.2015.11.008

33. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:http://dx.doi.org/10.6064/2012/875253

34. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html

35. Calba C, RVC JD, Goutard F, et al. The Evaluation Attributes Used for Evaluating Animal
Health Surveillance Systems. RISKSUR Project; 2013.

36. Winkler J, Moser R. Biases in future-oriented Delphi studies: A cognitive perspective.


Technol Forecast Soc Change. 2016;105:63-76. doi:10.1016/j.techfore.2016.01.021

CHAPTER 3 Guiding Framework for Evaluating Occupational Safety and

Health Surveillance Systems

Liu Yang a
Laurel Kincl a

a
College of Public Health and Human Sciences, Oregon State University, 123 Women’s
Building, Corvallis, Oregon, 97331, USA

Abstract

Background: Despite increased interest in public health surveillance and its evaluation, no

tailored framework has been developed for evaluating Occupational Safety and Health (OSH)

surveillance in the United States (US).

Methods: A Delphi study was conducted with an expert panel to select framework elements

(surveillance components, attributes and measures) through consensus. The findings were shared

with experts and field practitioners in OSH surveillance in the US through the Council of State

and Territorial Epidemiologists (CSTE) Occupational Health (OH) Subcommittee. Structured

feedback was obtained on the applicability of identified elements and suggested content for a

tailored framework. This feedback guided the development of a tailored evaluation framework

for OSH surveillance.

Results: A total of ten experts from the Delphi panel and nine field professionals provided

feedback on proposed elements and guidance on their use. Suggestions covered multiple aspects

for a framework, including offering flexibility, providing recommended and optional elements,

considering stages of surveillance, and being simple and straightforward. The guiding framework

has been developed to incorporate the final elements selected in the Delphi study as well as

experts’ suggestions. The framework includes practical guidance on evaluation steps and four

different evaluation types, as well as practical tools such as template working tables.

Conclusion: The developed framework is expected to address the lack of a practical guiding tool

in OSH surveillance evaluation. It is expected that, with use, the framework will provide a

standardized approach to improve surveillance of occupational illnesses and injuries.



Introduction

Despite increased interest in public health surveillance and its evaluation, no tailored

framework has been developed for evaluating Occupational Safety and Health (OSH)

surveillance in the United States (US). Evaluators refer to commonly used guidelines, represented by the CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems, to

evaluate OSH surveillance systems 1. However, the CDC guidelines provide limited guidance for

evaluating OSH surveillance systems, which tend to have limited resources and lack components

common to other types of public health surveillance 2–4.

Public health surveillance systems are usually evaluated by attributes, which refer to a set

of measurable characteristics regarding certain components in a surveillance system, such as

simplicity, flexibility, data quality, and acceptability 1. To systematically select components,

attributes, and measures for each attribute, a Delphi study was conducted. A panel of experts

across the US reviewed and rated elements (surveillance components, attributes and measures)

compiled from the literature. The panel identified relevant elements that could be used in a

framework for OSH surveillance evaluation.

Using a translational study design, findings from the Delphi study were disseminated to

the potential users of the framework, i.e., health professionals and practitioners in national and

state-level OSH surveillance systems. Structured feedback was collected on the applicability of

identified elements and suggested contents for a tailored framework. This paper presents the resulting framework, which reflects stakeholders’ perspectives and promises to be a practical tool for frontline OSH surveillance evaluators.



Methods

A Delphi study was conducted from March 2019 to August 2019 to propose elements

(surveillance components, attributes and example measures) that are relevant to OSH

surveillance and its evaluation for a tailored evaluation framework. An expert panel of ten

members with experience in the operation and evaluation of OSH surveillance systems reviewed

and rated a draft list of elements, resulting in consensus on a total of 64 OSH surveillance

components, 31 attributes and 116 example measures, with 134 (62.3%) reaching a high level of

consensus (manuscript under review).

In the Delphi study, experts answered open- and closed-ended questions regarding the

development of a tailored guiding evaluation framework. After the study, all selected elements

and a proposed framework sketch were disseminated back to Delphi experts as well as field

professionals in national and state-level OSH surveillance systems through the Council of State

and Territorial Epidemiologists (CSTE) Occupational Health (OH) Subcommittee. The follow-

up sought opinions regarding selected elements and framework content from experts and

professionals via 1) a presentation and in-person communication with participants in the CSTE

Occupational Health (OH) Subcommittee annual winter meeting (December 2020, Miami, FL)

and 2) written feedback from interested experts and professionals via email. Table 3-1 lists the questions asked in the Delphi study and the follow-up opinion seeking. All feedback was compiled into themes and descriptive statistics to guide the development of this OSH evaluation framework.

Table 3-1 Questions in the Delphi study and follow-up opinion seeking

Study Questions
Delphi Study What other comments do you have for developing the tailored evaluation
framework, in addition to those you have provided with specific elements?
What other contents besides the surveillance components and attributes do you
think need to be included in the tailored evaluation framework?
Is it good to group framework elements into "recommended" and "optional"
based on the panel's mean ratings?
What other suggestions do you have for the to-be-developed OSH surveillance
evaluation framework?
Follow-up opinion seeking Is there any selected element for OSH surveillance you want to comment on?
Should elements selected with low consensus be excluded from the framework
or included as optional items?
Can the mean rating be used as weight for a given element?
Any specific content should be removed/added from the framework sketch?

Results

Experts Opinions Regarding the Framework

A total of ten Delphi panel experts answered open- and closed-ended questions regarding the

development of a guiding evaluation framework in the Delphi study. Nine field professionals

(including three experts in the Delphi panel) provided feedback regarding the selected elements

and the development of the guiding framework through written and in-person communications in

the follow-up review. Table 3-2 summarizes major comments and suggestions from experts on

the key aspects of an ideal evaluation framework.

Table 3-2 Major comments and suggestions for an evaluation framework from experts in the Delphi study (n=10)
and the follow up review (n=9)
Aspect Major comments and suggestions
Low consensus elements Six experts in the follow-up (67%) suggested including elements with low
consensus, given:
1) They may be relevant to advanced level OSH surveillance systems;
2) It allows for customization in the evaluation.
Be simple and Ten experts (53%) emphasized that the framework needs to be clear and simple to be
straightforward a practical and useful tool for surveillance staff.
Ways to make the framework practical include:

1) Reduce the number of elements required for an evaluation;


2) Be realistic in setting evaluation criteria for elements;
3) Make the framework short rather than lengthy;
4) Provide visualization to aid the evaluation, e.g., logic model.
Recommended vs. Ten experts (53%) suggested categorizing elements into recommended and optional
optional elements based on panel ratings.
Framework could be customizable by organizing elements:
1) Based on high/low consensus;
2) Make recommended elements those with very high mean ratings.
Types of evaluation Eight experts in the Delphi panel (80%) suggested that framework should provide
guidance on selecting elements for different evaluations, which may be based on the
system's maturity and evaluation purposes: need assessments, process evaluations,
outcome evaluations, rapid/brief evaluations, and comprehensive evaluations.

Weights and quantitative Experts agreed that the framework could suggest elements that are more important to
scoring grid OSH surveillance, but they were unable to comment more on weights due to lack of
expertise.
Experts raised concern that numerical scales (the quantitative scoring grid) may not
be as interpretable as qualitative summary.
Practicality Six experts (32%) confirmed that the list of elements was thorough and
comprehensive.
Four experts (21%) commented that some of the elements are not feasible for basic
OSH surveillance systems or are unrealistic to measure.
Three experts (16%) identified elements that reflect features more of a complete
system, describing current OSH surveillance as a "loosely defined" system relying on
data systems that are not owned/managed by organizations conducting the
surveillance.

Other comments on Experts commented that the framework should provide:
specific elements 1) An overview of evaluation methods, how to develop and conduct evaluations;
2) Probing questions to assess each attribute;
3) A starting draft which can be amended as more people attempt to use the tool.

All experts believed that it is meaningful to develop a tailored framework for OSH surveillance evaluation. Most experts in the follow-up (67%) explicitly suggested including low-consensus elements, viewing them as optional choices in a flexible framework. A common suggestion was to group elements as “recommended” or “optional” to allow for customization. Most panel experts (80%) agreed that the framework should provide guidance for different types of evaluations based on a system’s maturation stage. Many emphasized the importance of keeping the framework simple and straightforward. One expert commented that the framework “needs to be very clear and straightforward. Otherwise it would not be useful.” Another expert noted that OSH surveillance systems “may not have resources” and surveillance staff are “not trained evaluator[s]”. While we had hoped the experts would consider how to differentiate elements based on their level of relevance to OSH surveillance, they were unable to determine how best to weight elements appropriately.

The Developed Framework

The OSH surveillance evaluation framework (see Supplement) has been developed to

incorporate the selected elements and experts’ suggestions, with an aim to assist with evaluation

of OSH surveillance systems being carried out by public health departments and institutions at

various jurisdictions in the US. The framework offers practical guidance in two main parts: 1)

steps on formulating and implementing an evaluation and 2) guides for four different types of

evaluation. Template working tables and other tools are included to aid the implementation of an

evaluation.

Discussion and conclusion

The OSH surveillance evaluation framework has been developed following the principle of being “simple and straightforward” suggested by the experts. We adapted the six-step tasks recommended in the CDC updated guidelines and other participatory evaluation models into a more intuitive and practical four-step procedure, in which engaging stakeholders is not recommended as a main task in OSH surveillance evaluation 1,5. For resource-limited surveillance systems and systems largely relying on secondary data sources, such as OSH surveillance systems, stakeholders are usually limited to data providers and the few data users with whom contact can be established. This situation makes it neither meaningful nor feasible to emphasize stakeholder engagement in an evaluation.

Four types of evaluation were proposed in this framework based on surveillance stages. A prevailing message from experts was to offer flexibility for customization. We grouped elements into recommended and optional for each type of evaluation, with the objective of simplifying each type of evaluation while providing flexibility for evaluators to tailor it. Further, unlike common surveillance evaluation guidelines, represented by the CDC updated guidelines, we do not recommend assessing certain quality-related attributes, including sensitivity, specificity, and predictive value positive (PVP), in a routine evaluation. Our experience has demonstrated that the data and resources available for a routine OSH surveillance evaluation usually would not allow an in-depth analysis of data quality attributes (manuscript in press). Rather, we proposed a dedicated type of evaluation, the data quality evaluation, to specifically address these attributes.

A logic model is included in the framework as a useful way to visualize elements along the OSH surveillance organizational flow. Logic models have been suggested in many existing guides

for program and surveillance evaluation 1,5,6,15. By incorporating all recommended components

and attributes into one comprehensive graphic presentation, the logic model provides an easy and

straightforward way for evaluators to check and select elements for their tailored evaluations.

It is expected that the framework will address the lack of a practical guiding tool in OSH surveillance evaluation and help promote more evaluations, in turn improving OSH surveillance systems. Despite its intent to be comprehensive, it is likely that not all components and attributes relevant to OSH surveillance have been included in this framework, given the wide variety of OSH surveillance activities. Further, the recommended components and attributes for each type of evaluation may need further refinement with more practice. Applying the framework can help improve it continuously, as one Delphi panel expert commented: “it will be helpful in the development of a draft [the framework], which can be amended as more people attempt to use the tool.” Although the framework is intended for OSH surveillance systems in the US, it may also serve as a reference for OSH surveillance in other contexts.

References

1. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.

2. National Academies of Sciences, Engineering, and Medicine. A Smarter National


Surveillance System for Occupational Safety and Health in the 21st Century. The National
Academies Press; 2018. Accessed January 20, 2018.
http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=24835

3. CSTE. 2013 National Assessment of Epidemiology Capacity: Findings &


Recommendations. Council of State and Territorial Epidemiologists (CSTE); 2014.

4. Arrazola J. Assessment of Epidemiology Capacity in State Health Departments — United


States, 2017. Morb Mortal Wkly Rep. 2018;67. doi:10.15585/mmwr.mm6733a5

5. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016. Accessed May 1, 2020.
http://ebookcentral.proquest.com/lib/osu/detail.action?docID=4722974

6. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf

7. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:10.6064/2012/875253

8. Allaki FE, Bigras-Poulin M, Michel P, Ravel A. A Population Health Surveillance Theory.


Epidemiol Health. 2012;34:e2012007. doi:10.4178/epih/e2012007

9. Austin C. An evaluation of the census of fatal occupational injuries as a system for


surveillance. Compens Work Cond. 1995;1:51–54.

10. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal
agricultural injury surveillance in the United States: A review of national-level
survey-based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

11. Nobles RE. Process evaluation of the Texas occupational safety & health surveillance
system. Published online 2009. Accessed November 10, 2017.
https://search.proquest.com/openview/bbf7eddd1106df9c4ea7c4ea18613bfb/1?pq-origsite=gscholar&cbl=18750&diss=y

12. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335

13. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63

14. Davis LK, Hunt PR, Hackman HH, McKeown LN, Ozonoff VV. Use of statewide
electronic emergency department data for occupational injury surveillance: a feasibility
study in Massachusetts. Am J Ind Med. 2012;55(4):344-352. doi:10.1002/ajim.21035

15. McNabb SJN, Surdo AM, Redmond A, et al. Applying a new conceptual framework to
evaluate tuberculosis surveillance and action performance and measure the costs,
Hillsborough County, Florida, 2002. Ann Epidemiol. 2004;14(9):640-645.
doi:10.1016/j.annepidem.2003.09.021

Supplement: A Guiding Framework

Guiding Framework for Evaluating Occupational Safety and Health

Surveillance Systems

Table of Contents

1 Introduction ……………………………………………………………………………62

2 How to use this framework ……………………………………………………………62

3 Evaluation types, steps and standards ……………………………………………………63

3.1 Evaluation types ……………………………………………………………………63

3.2 Evaluation steps ……………………………………………………………………65

3.3 Evaluation standards ……………………………………………………………………79

4 Initial/need evaluation ……………………………………………………………………80

5 Routine/rapid evaluation ……………………………………………………………81

6 Comprehensive evaluation ……………………………………………………………83

7 Data quality evaluation ……………………………………………………………85

8 Weighting attributes ……………………………………………………………………87

9 Key concepts and definitions ……………………………………………………………89

10 Appendices …………………………………………………………………………..164

1 Introduction

Occupational Safety and Health (OSH) surveillance is the ongoing, systematic collection,

analysis, interpretation and dissemination of data on work-related health outcomes and on the

presence of health and safety hazards that are essential to OSH practices and research. OSH

surveillance systems should be evaluated periodically to ensure that problems of OSH

importance are being monitored effectively and to promote the best use of public health

resources.

To aid the evaluation of OSH surveillance systems, this guiding framework was developed

from framework elements (OSH surveillance components, attributes and measures) identified

through experience and research informed by field experts in US OSH surveillance systems

(manuscripts in press/under review). The framework is intended for use with OSH surveillance

systems carried out by public health departments and relevant institutions at various

jurisdictional levels in the US. However, it may also provide a reference for evaluating OSH

surveillance in other contexts.

2 How to use this framework

A logic model is provided to organize all evaluation elements included in this framework

(see Figure 3-4 and section 4 through 7). This framework aims to assist evaluators to plan and

implement four types of evaluations relevant to OSH surveillance:

• Initial/need evaluation,

• Routine/rapid evaluation,

• Comprehensive evaluation, and



• Data quality evaluation.

These types of OSH surveillance evaluation, and the steps in conducting an evaluation, are

described in Section 3. Sections 4 through 7 contain specific guidance for each type of

evaluation, including the recommended minimum components and attributes to consider. The

framework concludes with suggestions for weighting attributes in an evaluation (Section 8).

The appendices include template working tables and other tools to aid the implementation of

an evaluation.

3 Evaluation types, steps and standards

3.1 Evaluation types

An OSH surveillance system may be composed of new and ongoing initiatives/activities

for the systematic collection and analysis of data regarding the occurrence and trends of OSH events.

A new surveillance initiative begins with identification of event(s) of OSH importance and

assessment of surveillance strategies and organizational infrastructure supporting the

surveillance of the event(s). After implementation, the ongoing surveillance activities produce

data and statistics which are disseminated for wide use in OSH prevention action.

Figure 3-1 Stages of OSH surveillance activities and types of evaluation corresponding to each

stage

For a new initiative, an initial/need evaluation is needed to establish a baseline for ongoing

evaluation. A data quality evaluation may also be needed to provide in-depth information on data

sources and analysis methodology. Once the initiative is in the ongoing operation stage,

routine/rapid evaluation is recommended every year, while comprehensive evaluation is

recommended every three years. Data quality evaluation may be needed from time to time.

Table 3-3 introduces the objectives and suitable situations for each of the four types of evaluation.

Table 3-3 Objectives and suitable situations by type of OSH surveillance evaluation.

1) Initial/Need Evaluation

Objectives:
· To document the public health importance of the OSH event(s) being surveilled;
· To describe the purpose of the surveillance activities;
· To assess the feasibility of the surveillance strategies and activities;
· To understand the availability of input for the surveillance activities;
· To set a baseline for future evaluations.

Suitable for:
· New surveillance initiative(s);
· Major change(s) occurring that impact any part of the system (e.g., data sources/standards, technical platform, significant program integration).

2) Routine/Rapid Evaluation

Objectives:
· To assess the implementation of planned activities;
· To identify issues impacting the system's stability.

Suitable for:
· Recommended every year;
· Limited time and resources for the evaluation.

3) Comprehensive Evaluation

Objectives:
· To determine if surveillance activities are being implemented as planned and effectively;
· To determine if resources and input are sustainable;
· To determine if the system is producing proper products and outcomes;
· To determine the usefulness of the system.

Suitable for:
· Recommended every three years;
· Significant changes occurring;
· Stakeholders suggest an evaluation;
· Significant timelines in the system (e.g., mid funding cycle).

4) Data Quality Evaluation

Objectives:
· To gain in-depth understanding of the data being used and produced in the surveillance system;
· To identify gaps and issues with data analysis methodologies.

Suitable for:
· New surveillance initiatives;
· When data sources or surveillance methodologies change.

3.2 Evaluation steps

The framework proposes a practical four-step procedure for evaluating OSH surveillance

systems, which is centered on the events under surveillance and closes the cycle by ensuring the

use of evaluation findings. Figure 3-2 shows a diagram of the four evaluation steps, which apply

to all four types of evaluation. Keep in mind that an evaluation is often not a linear process;

rather, it is an iterative process across these steps.



Figure 3-2 Four steps to evaluate an OSH surveillance system

Step 1: Design the evaluation

This is a key step in an evaluation as it sets the foundation for the next steps. It contains a

series of iterative tasks demonstrated in Figure 3-3.



Figure 3-3 Main tasks in Step 1: Design the evaluation

Form an evaluation team: Generally, an evaluation starts with forming an evaluation

team with one or two lead evaluators and a few key members, including system staff, possible

external evaluators and stakeholders. In a routine/rapid evaluation, the team may have only one

or two members, usually the system staff. The lead evaluator has primary responsibility for

designing and implementing the evaluation, as well as for suggesting team members. Requirements

for a good evaluation team include the following 1:

1
Adapted from:
[1] Centers for Disease Control and Prevention (CDC). Office of Strategy and Innovation. Introduction to Program Evaluation for
Public Health Programs: A Self-Study Guide. Atlanta, GA: Centers for Disease Control and Prevention; 2011.
[2] Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons, Incorporated; 2016.

• Experience/sound understanding in the type of evaluation needed

• Able to establish contact and work with stakeholders

• Can develop practical evaluation approaches.

• Able to develop innovative approaches to address difficult evaluation questions

• Comfortable with quantitative/qualitative data sources and analysis

• Understands both the potential benefits and risks of evaluation and conforms to ethics

and confidentiality requirements in evaluation

• Able to educate program personnel in designing and conducting the evaluation if

needed

• Able to report full findings (i.e., will not gloss over or fail to report certain findings)

• Able to translate evaluation results into appropriate recommendations.

Stakeholders identified in subsequent tasks may be invited onto the evaluation team to help

establish priorities for the evaluation and to facilitate access to evaluation resources. For example, in a

comprehensive evaluation or data quality evaluation, the participation of main data providers and

data users helps to determine an appropriate scope of work and to solicit in-depth information on how

the data are collected, analyzed and used. Stakeholders deemed appropriate for the

evaluation team should have a fairly good understanding of the surveillance system’s objectives

and activities so that they are able to provide concrete suggestions for the evaluation.

Collect information to describe the system: Framing the evaluation design begins with

learning some details about the system under evaluation. Two key questions are 1) what event(s)

are under surveillance and 2) what stage(s) they are in. Typically, an OSH surveillance system

may include multiple surveillance initiatives/activities covering many events using data from a

wide variety of sources. For each surveillance initiative/activity, information needs to be

collected on the public health importance of the event(s) (e.g., incidence and prevalence,

population affected, costs associated with the event), the purposes of the surveillance, surveillance

strategies (e.g., case definition, data sources, data analysis methods, technical platforms),

expected outputs and outcomes, stakeholders, and the infrastructure and supporting functions in

the system that support the surveillance initiative/activity. Drawing a logic model of your system is

suggested to visualize the input and the organizational flow of the system. Figure 3-4 is an

example of a comprehensive logic model that organizes common components for OSH

surveillance systems as well as attributes. Appendix 3-2 and Appendix 3-3 describe

all of the components and attributes in detail.



Figure 3-4 Common OSH surveillance components and attributes in Logic Model

Main sources for collecting documentation to describe the system are listed below:

• Reports and materials from previous evaluations;

• Funding grant proposals, strategic plans or workplans;

• Working reports and copies of educational materials distributed;

• Logs, rosters, meeting minutes, administrative records and other notes;

• Newsletters, press releases, social media posts;

• Surveillance data and database platforms;

• Registration, enrollment, or intake form;

• Records held by funders and/or collaborators;

• The system’s web pages and other related web pages;

• Graphs, maps, charts, photographs, videotapes.

In addition to documentation review, surveillance information can be collected by

interviewing system staff and other stakeholders or by onsite observation. A complete list of

common questions/aspects to consider in describing the system is provided in Appendix 3-1.

Thorough data collection work usually happens once, during an initial evaluation or the

first comprehensive evaluation. Once this information is collected, it may be used in future

evaluations with necessary updates. Information collected in this task may be used as evaluation

evidence later, so logs and documentation should be kept. Understanding the system also helps

to define the scope of evaluation work and to select the components and attributes to be evaluated.

Identify stakeholders: More often than not, the major stakeholders for an OSH

surveillance system are the staff members responsible for the surveillance activities, and the data

providers and data users collaborating within federal and state health agencies. The developers and

maintainers of technical platforms such as Electronic Health Records (EHR) may also be

stakeholders. It is important to note that while they may be stakeholders, medical care providers,

employers and employees, and community members may be too up- or down-stream to consider.

Identified stakeholders can be invited onto the evaluation team (if necessary) and/or serve as

participants from whom evaluation evidence is collected. Stakeholder involvement in an evaluation

may be limited by the nature of the evaluation. For example, in a routine/rapid evaluation, the

number of stakeholders can range from minimal to none.

Determine the scope of evaluation: The evaluation purpose and objectives may be

pre-determined at the initiation of the evaluation project; however, the purpose and scope of the

evaluation may still need to be tuned using the information collected in describing the system. In

some situations, the evaluation team may be assigned a broad mission and needs to determine a

clear scope of work. An important consideration is to determine the type of evaluation based on

the main evaluation purpose and the stages of the selected surveillance activities.

Common purposes for an OSH surveillance evaluation include to:

• Demonstrate whether the system is functioning as planned and effectively;

• Examine if the system is utilizing resources efficiently;

• Identify gaps needing improvement and areas needing more surveillance activities;

• Determine if data collected and generated in the system are of good quality;

• Justify funding and other support.

Given the determined type of evaluation, you must decide on the components and attributes to

be included. Sections 4 through 7 introduce the minimum and optional components and

attributes recommended for each type of evaluation, which you can tailor to the specific

systems under evaluation. Deciding on the components and attributes under evaluation is not as

straightforward as it seems. Infrastructure and resources supporting OSH surveillance activities

in the hosting agency (i.e., a public health department or institution) may be shared by various

programs and functions in that agency, or the OSH surveillance activities may be collaborative

efforts across different agencies. Further, an OSH surveillance system may include multiple

surveillance initiatives/activities, each focusing on different OSH events, using data from a

wide variety of sources, and being in a different stage. The evaluation team needs to draw a clear

line on what is to be included in the surveillance “system” under evaluation based on the purpose

of the evaluation. The feasibility of the evidence and methods needed to assess these attributes is also an

important consideration. There is often an iterative process, with multiple team meetings, across

“describing the system”, “identifying stakeholders”, “determining the scope of evaluation”, and

the later task “develop evaluation methodology”.

Develop evaluation methodology: The performance of a surveillance system is typically

evaluated via attributes, which are measurable features of the corresponding surveillance

components. For each attribute selected for an evaluation, appropriate measures with possible

information source(s) must be identified. Appendix 3-4 provides example measures

corresponding to each attribute listed in Figure 3-4. Multiple measures may be needed for one

attribute, with evaluation evidence collected from multiple information sources. For

example, the acceptability of stakeholders can be assessed by their participation rate in work

meetings (checking meeting minutes and/or interviewing staff), their responsiveness in working with the

surveillance staff (checking meeting minutes and/or interviewing staff), and/or the stakeholders’

own perspective (surveying or interviewing the stakeholders). However, a small number or

subset of effective measures can be used for routine and rapid evaluation. The availability of

information sources must be considered in selecting measures (e.g., stakeholders, data for

quantitative analysis), along with available resources (e.g., the time frame required for the evaluation)

and feasibility. Evaluation questions and data collection methods that require a great amount of

time, complex design and data analysis, or the involvement of a large number of stakeholders may

not be appropriate for an evaluation with limited staff time and budget. If given a limited time

frame and/or resources, prioritize evaluation measures so that the most important and relevant

ones are selected.

For each attribute and each corresponding evaluation measure, methods for collecting and

analyzing evaluation evidence are usually specified with one or more operational evaluation

questions. For example, a measure of simplicity, “ease of event/case ascertainment”, is assessed

with evidence collected from the surveillance staff. Two operational questions may be used:

“Could you rate the ease of case ascertainment/definition for each event/case definition using a 5-

point scale?” and “Are there some case definitions that are difficult or tricky to ascertain?”. Common

data collection methods include system document review, literature review, and onsite observation.

Methods to collect evidence from stakeholders (including surveillance staff) include interviews

(in-person/phone/remote video), focus group discussions, and surveys (in-person/online).

Appendix 3-5 includes demonstration working tables that help to develop evaluation

measures and questions and to sort them by data collection method and participant, which will

facilitate data collection in Step 2.



Make evaluation plans: At the conclusion of the evaluation design step, the evaluation

team should make a detailed plan that specifies responsible person(s) and timelines for the remaining

evaluation activities, including developing evaluation protocols, collecting and analyzing

evaluation evidence, reporting the findings, and proposing recommendations for system

improvement. Appendix 3-6 provides a template working table for the evaluation data collection

plan. For a routine/rapid evaluation or an evaluation with a small team, the documented evaluation

plans and protocols may be kept simple.

Though not as common as in other types of program evaluation, engaging stakeholders

may also be part of the evaluation, to facilitate data collection and the dissemination of evaluation

findings. Ways to engage stakeholders include enrolling them in the evaluation

team, workshops and presentations on the evaluation project and findings, and individual feedback

regarding evaluation results and recommendations. Often there is no budget for an evaluation,

but if there is, the team may need to make a budget plan.

Step 2: Collect evaluation evidence

Following the evaluation methodology developed in Step 1, this step includes two specific

tasks: 1) preparing and 2) implementing data collection. In the preparation stage, detailed

protocols for conducting the data collection activities should be developed, such as onsite visit

plans or interview guides. Measures and evaluation questions developed in Step 1 for a specific

information source (e.g., the surveillance staff) need to be sorted together and further refined into

operational guides for interviews and/or focus groups, and into survey questionnaires, if applicable.

Appendix 3-7 provides sample interview guides adapted from an evaluation of a state-based

OSH surveillance system (manuscript in press). The validity of the developed guides needs to be

pre-tested within the team or by relevant people (e.g., staff in other departments, or a small

number of stakeholders if possible) to finalize them before actual use.

Evaluations often do not require ethics review, but consult with your relevant

Institutional Review Board to confirm before implementing the data collection work.

The implementation of data collection can be as simple as document/literature review,

convenience onsite visits and meetings with surveillance staff, or as complex as multiple

surveys/interviews/focus group discussions as well as formal onsite visits with more than one

interviewer/facilitator. In the latter case, training in the data collection processes should be

considered to ensure data collection quality. In addition to in-person communication,

phone or virtual meetings may be time- and cost-efficient ways to hold interviews/focus groups. Surveys

can be developed and delivered via common online survey tools such as Qualtrics, REDCap,

or SurveyMonkey. Each data collection activity should be documented in a log and saved for

future reference. It is recommended to transcribe audio/video recordings and summarize collected

evidence/findings in a timely fashion. This helps to identify possible lapses and adjust future

data collection.

Step 3: Analyze and interpret the data

It is common in evaluation to use a mixed-methods approach combining both quantitative

and qualitative data.

• Quantitative data refer to those in numerical format, analyzed using statistical

methods. Quantitative data in an evaluation can be collected by reviewing documents/literature

or from stakeholders (usually with closed-ended questions). Quantitative data describe the

system and its performance, such as the number of data sources for a surveillance event, the time

interval between obtaining surveillance data, annual funding and staff time (e.g., FTE), or the

percentage of planned surveillance activities accomplished. Quantitative data in an evaluation are

usually analyzed for descriptive statistics, such as mean and range (for continuous data) and

frequency/percentage (for categorical data). A clear cut-off may be determined with quantitative

data to represent the level of system performance.
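As a minimal illustration of these descriptive statistics, the following Python sketch summarizes two hypothetical evaluation measures; the measure names and all values are invented for demonstration, not drawn from any actual evaluation:

```python
# Illustrative sketch only: summarizing hypothetical quantitative evaluation
# measures with mean/range (continuous data) and frequency/percentage
# (categorical data), as described above.
from statistics import mean
from collections import Counter

# Hypothetical continuous measure: days between receipt of source data
# and release of surveillance statistics, per data source.
reporting_lags_days = [14, 30, 21, 45, 28]
print(f"Mean lag: {mean(reporting_lags_days):.1f} days, "
      f"range: {min(reporting_lags_days)}-{max(reporting_lags_days)} days")
# → Mean lag: 27.6 days, range: 14-45 days

# Hypothetical categorical measure: status of planned surveillance activities.
activity_status = ["completed", "completed", "in progress", "completed",
                   "not started", "completed"]
for status, n in Counter(activity_status).items():
    print(f"{status}: {n} ({100 * n / len(activity_status):.0f}%)")
```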

Quantitative data and in-depth statistical analysis are particularly essential in data quality

evaluation for attributes such as sensitivity and Predictive Value Positive (PVP).
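Sensitivity and PVP follow directly from comparing the surveillance case counts against an independent reference ("gold standard") case list. The sketch below is a hedged illustration; all counts are hypothetical:

```python
# Illustrative sketch only: sensitivity and predictive value positive (PVP)
# for a surveillance case definition, assuming an independent reference case
# list is available for comparison. All counts below are hypothetical.
def sensitivity(true_cases_detected: int, total_true_cases: int) -> float:
    """Proportion of all true cases that the surveillance system detected."""
    return true_cases_detected / total_true_cases

def pvp(true_cases_reported: int, total_cases_reported: int) -> float:
    """Proportion of reported cases that are true cases."""
    return true_cases_reported / total_cases_reported

# Hypothetical: the system reports 800 cases, of which 720 are confirmed
# true; an independent review identifies 1,000 true cases in total.
print(f"Sensitivity: {sensitivity(720, 1000):.0%}")  # → Sensitivity: 72%
print(f"PVP: {pvp(720, 800):.0%}")                   # → PVP: 90%
```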

• Qualitative data refer to those that do not involve numerical data or lend themselves to

statistical analysis. Qualitative data in an evaluation are usually collected from stakeholders with

open-ended questions in surveys/interviews/focus groups. Qualitative data can be a

useful supplement to quantitative data for describing reasons, perceptions, experiences and attitudes.

For example, following a Yes/No closed-ended question asking participants about the usefulness

of the system, an open-ended question asking why provides deeper information for

interpretation. Expertise in content analysis, which aims to extract the major themes and patterns

in qualitative data, is needed to analyze such data. Appendix 3-9 provides further

reference guides, including for qualitative analysis.

Interpreting the evaluation evidence and findings is the key step in determining the

system’s performance on each attribute, which leads to the evaluation conclusions and

recommendations. You may consider two ways to present the system’s performance. Most often,

a surveillance system’s performance is presented descriptively, with a clear statement of the

strengths and weaknesses of the system on each attribute. Another way is to convert the

findings into a quantitative scoring grid, in which the performance on each attribute, as well as the

overall system performance, is presented with numerical scores that are easily compared.

However, a significant challenge in this method is defining clear cut-offs between “good” and “bad”

performance. The scoring grid method is only recommended when meaningful and interpretable

cut-offs can be made.
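A minimal sketch of the scoring-grid calculation is shown below. The attribute names, 1-to-5 ratings, and weights are all hypothetical placeholders; in practice, weights would be set following the suggestions in section 8, and the cut-offs for interpreting the resulting score must still be justified:

```python
# Illustrative sketch only: converting hypothetical per-attribute ratings
# (1-5 scale) into a weighted overall score, as in the scoring-grid approach.
attribute_scores = {"simplicity": 4, "timeliness": 3,
                    "data quality": 5, "acceptability": 2}
weights = {"simplicity": 0.2, "timeliness": 0.3,
           "data quality": 0.3, "acceptability": 0.2}  # weights sum to 1

overall = sum(attribute_scores[a] * weights[a] for a in attribute_scores)
print(f"Overall weighted score: {overall:.1f} out of 5")  # → 3.6 out of 5
```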

Appendix 3-2 and Appendix 3-3 contain descriptions of each component and attribute

reflecting general performance criteria for OSH surveillance. However, the system’s

performance should be determined within the context of the evaluation purpose, the system’s

infrastructure and other resources, as well as other factors possibly affecting the system’s

performance. A discussion with the surveillance system leadership and other key members may

be needed before conclusions and recommendations are made.

Step 4: Report the findings

The evaluation project largely ends with findings being summarized into an evaluation

report, which can be kept in the system, submitted to relevant parties (e.g., funding agencies),

and disseminated to stakeholders and other interested people and entities. The evaluation report

generally includes introduction of the evaluation purpose and scope of work, description of the

surveillance system and activities, evaluation methods, evaluation results and findings,

recommendations, and conclusions. A sample outline of an evaluation report can be found in

Appendix 3-8.

Ensure the use of evaluation findings

It is recommended that the evaluation team provide good documentation of the

evaluation evidence, findings and recommendations. It may be difficult to ensure that the

evaluation findings will be used if follow-up on the implementation of evaluation

recommendations is beyond the resources available to the evaluation team. Ideally, the

team can take the first step in translating evaluation findings into action by discussing possible

correction/improvement action plans with the system leaders and stakeholders.

The evaluation reports and materials collected should be referenced in future evaluations.

Gaps identified and recommendations proposed in a previous evaluation can inform the purpose

and scope of work of the next evaluation.

3.3 Evaluation quality standards

To design and implement a quality OSH surveillance evaluation, the evaluation team

should consider the following four standards, namely: utility, feasibility, propriety, and accuracy.

For each quality standard, requirements and possible aspects to consider are listed below.

Utility
Requirements: Requires that the information provided by the evaluation is useful. It encourages the evaluation to be planned, conducted, and reported in ways that increase the likelihood of stakeholders using the evaluation findings.
Consider these aspects:
· Have key stakeholders who are available been reached and consulted?
· Are the methods developed appropriate to the intended evaluation purposes?
· Are there specific methods or sources that will enhance the credibility of the evaluation?

Feasibility
Requirements: Requires that the evaluation process be practical and nondisruptive, that it acknowledge the political nature of the evaluation, and that the resources available for conducting the evaluation are used carefully. It specifies that evaluations should be efficient and produce information that justifies the expense.
Consider these aspects:
· Can the evaluation methods be implemented within the time and budget available?
· Does the evaluation team have the expertise to implement the chosen methods?
· Are logistics and protocols realistic given the time and resources available?

Propriety
Requirements: Requires that the evaluation be ethical and be conducted with regard for the rights of those involved and affected. It requires that the evaluation be complete and fair in its process and reporting. It also requires that the evaluation demonstrate respect for human dignity and avoid conflicts of interest.
Consider these aspects:
· Are issues of confidentiality of respondents addressed?

Accuracy
Requirements: Requires that the evaluation use accurate and systematic processes to collect evaluation evidence and produce technically accurate conclusions that are defensible and justifiable.
Consider these aspects:
· Is appropriate quality control in place?
· Are enough data being collected?
· Are methods and sources consistent with the nature of the evaluation?

Adapted from:
[1] Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons, Incorporated; 2016.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center,
University of Nebraska; 2016.

4 Initial/need evaluation

Initial/need evaluation is conducted for new surveillance initiative(s) or major changes

that may impact the surveillance activities, such as major program integration or changes in data

sources, standards and surveillance methods.

For a new surveillance initiative, it is necessary to thoroughly document the public health

importance of the event(s) under surveillance, the purpose and activities of the surveillance (e.g.,

monitoring population trends and/or tracking individual cases), and surveillance strategies and

methodologies including data sources and technical platform. Further, the infrastructure and

stakeholders supporting the surveillance activities need to be described.

Because a new initiative has usually not been in operation long enough to establish credible

evidence on outputs, in-depth assessment of these components is not recommended. The minimum

components and attributes to consider in the initial/need evaluation are presented in Figure 3-5.

Figure 3-5 Initial/need evaluation: Minimum and optional components and attributes *

* Boxes and text in black are recommended for this type of evaluation. Shaded boxes and text are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed descriptions of each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.

5 Routine/rapid evaluation

Routine and/or rapid evaluations are performed when an OSH surveillance system has

been operating routinely without major changes, and/or when there are limited time and resources

for a comprehensive evaluation. It is recommended that a routine/rapid evaluation be conducted

every year. The major purpose of a routine/rapid evaluation is to assess and ensure that the

surveillance activities are being implemented as planned. As such, the system's outcomes are not

the main focus of a routine/rapid evaluation.

Routine evaluation by its very nature is conducted rapidly and frequently. Intensive

engagement and involvement of stakeholders is not encouraged unless they are necessary and readily

accessible. Further, as no major changes have occurred to the system, describing the

infrastructure and input can be brief, especially if completed in a previous evaluation.

Figure 3-6 includes minimum and optional components and attributes to consider in a

routine/rapid evaluation.

Figure 3-6 Routine/rapid evaluation: Minimum and optional components and attributes *

* Boxes and text in black are recommended for this type of evaluation. Shaded boxes and text are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed descriptions of each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.

6 Comprehensive evaluation

Comprehensive evaluations are recommended every three years, or when major changes

occur to the system's infrastructure, surveillance methodologies and activities. A



comprehensive evaluation can also be conducted at stakeholders’ suggestion or at

significant system milestones, for example, a review for a funding application. A comprehensive

evaluation aims to collect thorough information regarding every aspect of an OSH surveillance

system to comprehensively understand the input and resources in the system, the surveillance

activities being performed, and the outputs and outcomes being produced by the system. A

comprehensive evaluation makes conclusions about the performance and usefulness of the

system, identifies gaps to be addressed and proposes recommendations to improve the system.

It is not recommended to assess certain data quality attributes even in a comprehensive

evaluation, including sensitivity, specificity and predictive value positive (PVP) as these are

more appropriate for a data quality evaluation. Further, assessing the usefulness of an OSH

surveillance system should be limited to immediate and visible outputs because attributing

downstream changes in workplace and worker safety and health to OSH surveillance may be

difficult.

Figure 3-7 includes the minimum and optional components and attributes to consider in a

comprehensive evaluation. Though comprehensive in nature, feasibility is still an important

consideration to guide the selection of elements and evaluation design. For example, the data sources

and events under surveillance can be limited to a few important ones depending on available

resources and time frame for the evaluation, as can the stakeholders involved.



Figure 3-7 Comprehensive evaluation: Minimum and optional components and attributes *

* Boxes and text in black are recommended for this type of evaluation. Shaded boxes and text are not
recommended, but evaluators may choose them to tailor their evaluation project.
Detailed descriptions of each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.

7 Data quality evaluation

A specific data quality evaluation may be needed for an in-depth understanding of the

data produced in an OSH surveillance system. It is sometimes necessary to conduct a data quality

evaluation, for example, when there is a new surveillance initiative, or when major changes

occur potentially impacting the data quality significantly.

Data quality evaluations can be a separate project with specifically allocated resources,

including time and expertise, to ensure the depth and breadth of the assessment. This evaluation

needs to describe the surveillance strategies and the data sources but not necessarily the

surveillance infrastructure. An understanding of the core data managing and processing activities

is needed. The study team should have access to all necessary source datasets as well as data

generated in the system for quantitative data analysis.

Figure 3-8 includes the minimum and optional components and attributes to consider in a

data quality evaluation. This evaluation should focus on quality-related attributes that cannot be

easily assessed in a normal evaluation, including sensitivity, specificity and predictive value

positive (PVP), among others. Data analysis methods may include capture-recapture

techniques, data linking, use of “gold standard” data, medical charts review, case follow-up

survey or interview, and literature review.
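As a minimal sketch of one such method, a two-source capture-recapture analysis can estimate the total number of cases, and hence each source's sensitivity, from the overlap between two linked data sources. The Chapman estimator below is a standard choice; the counts are invented for illustration:

```python
def chapman_estimate(n1, n2, m):
    """Estimate total cases from two overlapping sources (Chapman estimator).

    n1: cases captured by source 1 (e.g., WC claims)
    n2: cases captured by source 2 (e.g., ED visits)
    m:  cases found in both sources, identified via record linkage
    """
    if not 0 <= m <= min(n1, n2):
        raise ValueError("overlap must lie between 0 and min(n1, n2)")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Illustrative counts only: 400 claims, 250 ED visits, 100 linked cases.
total = chapman_estimate(400, 250, 100)  # estimated total cases, ~995.5
sensitivity_wc = 400 / total             # estimated sensitivity of source 1, ~0.40
```

The estimator assumes the two sources capture cases independently; violations of that assumption bias the estimate, which is one reason several complementary methods are listed above.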



Figure 3-8 Data quality evaluation: Minimum and optional components and attributes *

* Boxes and text in black are recommended for this type of evaluation. Shaded boxes and text are not
recommended as minimum requirements, but evaluators may choose them to tailor their evaluation project.
Detailed descriptions of each component and attribute can be found in Appendix 3-2 and Appendix 3-3. Example
measures for each attribute are included in Appendix 3-4.

8 Weighting attributes

An OSH surveillance system’s performance is determined by a set of attributes that

reflect varied aspects of the system. Among the attributes selected for evaluation, the

evaluation team may want to emphasize certain attributes in the evaluation design, data

collection and analysis to better reflect the important aspects of the system. If desired,

quantitative weights can even be assigned to the attributes being measured to obtain an

“overall performance” score when the evaluation team decides to quantitatively score evaluation

results.

To this end, the average rating for each attribute (on a scale of 1 to 5) from the expert panel

in the Delphi study (manuscript under review) may serve as a reference for choosing important

attributes. However, it must be considered that these ratings reflect the attributes’ relevance to a

“general” OSH surveillance system. Attributes with a very high mean rating (≥ 4.5) are

suggested for more consideration or weight, including the following:

• Sustainability
• Confidentiality
• Feasibility
• Acceptability
• Accessibility
• Usability
• Usefulness
• Stability
• Effectiveness
• Representativeness
• Consistency
• Accuracy.
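If quantitative scoring is chosen, the weighting can be as simple as a weighted mean of attribute scores. The sketch below uses attribute names from the list above but invented scores and weights, not the actual Delphi ratings:

```python
def overall_performance(scores, weights):
    """Weighted mean of attribute scores, each rated on a 1-5 scale."""
    total_weight = sum(weights[a] for a in scores)
    return sum(scores[a] * weights[a] for a in scores) / total_weight

# Illustrative evaluation scores and weights (not the Delphi panel ratings).
scores = {"Sustainability": 4, "Confidentiality": 5, "Usefulness": 3}
weights = {"Sustainability": 4.6, "Confidentiality": 4.8, "Usefulness": 4.5}
overall = overall_performance(scores, weights)  # ~4.02 on the 1-5 scale
```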

9 Key concepts and definitions

Key concepts and definitions in this framework are listed below 2:

OSH surveillance system and program: An OSH surveillance system is composed of a

range of components including surveillance activities and infrastructure supporting these

activities. A system may include one or more surveillance programs, which focus on similar or

different surveillance activities.

Initiative: Usually refers to the plan and/or the initial implementation of new surveillance

activities or of major change(s) in existing surveillance activities. It can also refer to

surveillance activities in general.

Normal evaluation: Refers to any type of evaluation normally allocated with limited to

moderate resources including time, budget and expertise. It does not include evaluations with

special objectives (e.g., data quality evaluation) or with extensive supporting resources.

Need assessment and process evaluation: In program evaluation, need assessment

usually assesses whether the initiative is likely to have the intended effect and provides early

feedback and documentation on the initiative. Process evaluation documents whether the

program is being implemented as intended and provides explanations on why or why not by

identifying strengths and weaknesses of the program.

2 Adapted from:
[1] Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation. Introduction to Program Evaluation for
Public Health Programs: A Self-Study Guide. Atlanta, GA: Centers for Disease Control and Prevention; 2011.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center,
University of Nebraska; 2016.

The two terms refer to different stages of a program. In this framework, both

routine/rapid and comprehensive evaluations are process evaluations.

Component: A specific function or activity in a system.

Attribute: A measurable feature of a corresponding surveillance component.

Measure: An evaluation question or indicator that clearly specifies what is to be assessed for

an attribute. Example measures for different attributes can be found in Appendix 3-4.

CHAPTER 4 Assessing disabling and non-disabling injuries and illnesses

using accepted workers compensation claims data to prioritize industries of

high risk for Oregon young workers

Liu Yang a
Adam Branscum a
Viktor Bovbjerg a
Curtis Cude b
Crystal Weston b
Laurel Kincl a

a College of Public Health and Human Sciences, Oregon State University, 123 Women’s
Building, Corvallis, Oregon, 97331, USA
b Public Health Division, Oregon Health Authority, 800 NE Oregon Street, Portland, Oregon,
97232, USA

Abstract

Background: Young workers 24 years and under are especially vulnerable to occupational

injuries and illnesses. There is a continuing need to investigate injury burden among young

workers across demographics and industry to identify vulnerable subgroups and inform targeted

interventions. Workers compensation (WC) claims data are important for quantifying work-

related injuries and illnesses; however, published studies have mostly focused on disabling WC

claims. This study extended previous research on young workers’ injuries in Oregon by

including both disabling and non-disabling claims data to identify patterns of injury and high-risk

industries.

Methods: We obtained all accepted disabling claims data (N=13,360) and a significant portion

of non-disabling claims data (N=24,460) from Oregon’s largest WC provider on workers aged 24

years and under from 2013 to 2018. Injury count, rate, and claim cost were calculated by year,

age, gender, industry, and injury type. A Prevention Index (PI) method was used to rank

industries in order to inform prevention efforts.

Results: Average annual disabling and non-disabling claim rates were 111.6 and 401.3 per

10,000 young workers. Workers aged 19-21 (disabling: 119.0 per 10,000 and non-disabling:

429.3) and 22-24 years (115.7 and 396.4) and male workers (145.3 and 509.0) had higher claim

rates than workers aged 14-18 (80.6 and 297.0) and female workers (79.8 and 282.9). The most

frequent injury types were “struck by/against” (35.6%), followed by “work-related

musculoskeletal disorders (WMSDs)” (19.5%). Distribution of major injury types differed across

industries. Based on the PI, high risk industries included agriculture, construction and

manufacturing for both genders combined. For female young workers, the highest risk was in the

healthcare industry.

Conclusions: This study identified major injury types (“struck by/against” and WMSDs) and

high-risk industries (agriculture, construction, manufacturing, and healthcare) to guide targeted

research and prevention efforts among Oregon young workers. Further research can examine the

representativeness of WC claims data and other state-based data on young worker safety and

health.

Introduction

Adolescents and young adults make up a significant portion of the workforce in the US.

In 2018, there were about 19.4 million young workers aged 24 and under nationally and more

than 200,000 young workers in the State of Oregon, representing 12% of the total workforce in

the country and the state 1. In general, US state and federal child labor laws allow youth aged 14

years and above to work. Studies reported that about 35% and 50% of 17-year-olds work during

the school year and the summer, respectively 2. While working is a key developmental

milestone for many adolescents and young adults, risk of occupational exposures and injuries can

be a serious threat to not only their immediate health but also long-term development and ability

to work throughout adulthood 3,4. Ensuring a safe and healthy working environment is essential

to protect young workers and enable them to be productive through their working life.

Youth are vulnerable to occupational hazards and injuries 2,5,6. Young workers, especially

males, are at up to two times higher risk of injuries at work as compared to their older

counterparts 7–11. Occupational injury risk factors often differ between young and adult workers.

Young workers often hold temporary jobs and work irregular hours 5,12,13. They tend to work

in certain industry sectors such as wholesale and retail trades, services, and agriculture, and their

jobs often require higher levels of physical exertion 5,8,10,12,14,15. Moreover, young workers often

receive inadequate safety training and supervision 16–18.

A growing body of research further highlights the heterogeneity in young workers

subgroups defined by age, gender and other potential factors such as training and job skills.

Workplace safety and health experiences may vary across subgroups 10,19. Studies have shown

that male and older young workers had higher injury rates as compared to female and younger

workers 3,8,14,15,20,21. Male and female workers tend to have different job tasks and experience

different types of injuries 22,23. For example, one study on adolescents reported that few females

did farm work while few males worked in health care facilities 23. Compared with males, females

are more likely to have jobs that require very fast repetitive motions and static standing for

extended time periods, which are associated with musculoskeletal injuries 16. There is a

continuing need to investigate injury burden among young workers across age, gender and

industry to identify vulnerable subgroups for targeted interventions.

A number of data sources have been examined to characterize the injury burden and

nature among young workers, including administrative and hospital data such as the workers

compensation (WC) claims data, emergency department (ED) visits data, trauma registry data,

and survey-based data 3,14,20,21,24–27. Each of these data sources is subject to potential

coverage and completeness issues, and no single data source captures all injuries to young

workers 2. WC claims data have long been used as a population-based data source for research

and surveillance of work-related injuries and illnesses 3,28–31. The state of Oregon uses WC

claims data to document the incidence and characteristics of injuries and illnesses among young

and adult workers 32–34. However, earlier efforts focused on disabling claims, i.e. claims

involving compensation for lost work time or permanent disability or death, which generally

represent more severe injuries. Non-disabling claims involving injuries and illnesses without lost

work time were generally not included in existing studies due to difficulty in accessing such data 35.

Literature suggests that minor injuries and illnesses occur frequently among young

workers 15,16,23,36,37. One study suggested that young workers had been experiencing minor

injuries so frequently that they saw them as a normal part of their everyday work 16. Another

study on both disabling and non-disabling WC data in the state of Washington showed that only

15% of claims from adolescents involved compensation for lost work time 15. A previous study

conducted by Walters et al. 33 on Oregon young workers found that injury rates almost doubled

when adding data from a portion of non-disabling claims from a commercial insurer in Oregon.

This study proposed that Oregon could improve the completeness of occupational surveillance

by including non-disabling claims data.

The present study includes non-disabling WC claims data from the State Accident

Insurance Fund Corporation (SAIF), the largest WC insurance provider in Oregon 38. By

analyzing both disabling and non-disabling WC claims data, the present study reported work-

related injuries and illnesses from 2013-2018 among Oregon young workers aged < 25 years,

with two specific objectives: 1) quantify and compare disabling and non-disabling injuries and

illnesses by demographics, industry and injury characteristics; 2) identify industry sectors of high

disabling and non-disabling injury risk for young workers using a Prevention Index (PI). The PI

takes into account multiple metrics such as injury count, rate and associated cost to

systematically compare injury risk across industry sectors to inform research and prevention

priorities 39,40.

Methods

Data sources

Workers’ Compensation (WC) claims data

Oregon law requires most employers to provide workers’ compensation coverage to their

employees, including hourly and part-time employees 41. Oregon employers can choose to have

WC insurance for their employees by self-insurance, insurance through a private insurance

company, or insurance through SAIF. In 2018, SAIF’s market share in Oregon was 54%,

whereas the private insurers and the self-insured held 34% and 12% market share, respectively 43.

Disabling WC claims in Oregon are defined as those involving three or more missed days of

regularly scheduled work, overnight hospitalization, likely permanent disability, or death 41.

Non-disabling claims refer to those in which the injured workers do not receive any

compensation for lost work days (i.e., time lost from work is generally less than three days) or

experience permanent disability (Michael Vogt & Chuck Easterly, SAIF corporation, March,

2019, written communication). All Oregon insurance companies are required to report their

accepted disabling claims to the Workers’ Compensation Division of the Oregon Department of

Consumer and Business Services (DCBS), while accepted non-disabling claims are not required

to be reported.

The DCBS manages the Oregon disabling WC claims data system and provides the data

for research purposes on request. The present study obtained 2013-2018 de-identified accepted

disabling WC claims data from the DCBS for workers of all ages. Through a research project

agreement, SAIF provided 2013-2018 de-identified accepted non-disabling claims data for

workers aged 24 and under.

Employment data

To calculate claim rates, the Quarterly Workforce Indicators (QWI) data were used as

employment data. The QWI covers 95% of US private sector jobs and includes a wide variety of

record-level data sources such as the Unemployment Insurance data and Quarterly Census of

Employment and Wages data 44. The QWI data have been used in previous studies on Oregon’s

young workers as well as for other published studies 33,45. The publicly available QWI data

provide estimates for the number of workers aged 14 years old and above, stratified by industry

and demographics. For this study, data for Oregon workers from 2013 to 2018 were used.

Data variables and coding

Both the accepted disabling and non-disabling claims datasets contain the following

variables used in this study: injury year, worker’s demographics (age and gender), employer’s

industry, injury characteristics (nature, body part, and event/cause), and compensated medical

cost.

Age group

The worker’s age was recorded in years in both WC claims datasets. To match age

groups in the QWI employment data, workers were grouped into: < 10, 10-13, 14-18, 19-21, 22-

24, and > 24 years (in the disabling claims dataset only).

Industry sector and group

Employer’s industry in the WC claims raw data was coded using the North American

Industry Classification System (NAICS) 46. We further coded nine industry sectors based on

National Occupational Research Agenda (NORA, the second-decade) definitions using NAICS

codes up to 4 digits 47. We combined the NORA sector “Oil and gas extraction” with the sector

“Mining (except oil and gas extraction)” into one sector, “Mining,” as each had a small

number of claims.

Injury characteristics

DCBS coded injury characteristics (i.e., nature, event and body part affected) in the

disabling claims data with the Occupational Injury and Illness Classification System (OIICS)

version 2.01 48. However, SAIF coded the non-disabling data using the Workers Compensation

Insurance Organizations (WCIO) coding system 49. The study team drafted a WCIO-OIICS

crosswalk independently and then compared with two other crosswalks proposed for research

purposes to amend the crosswalk (Laura Syron, NIOSH, January 2020, written communication;

Steve Wurzelbacher, NIOSH, January 2020, written communication). Conflicting results were

resolved by consensus among the study team. We used the final WCIO-OIICS crosswalk to code

injury characteristics of non-disabling WC claims to OIICS.
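The recoding step amounts to a lookup from WCIO codes to OIICS codes, with unmatched codes flagged for manual review, which is where a consensus process would catch conflicts. The code pairs below are invented placeholders, not entries from the study's actual crosswalk:

```python
# Hypothetical WCIO -> OIICS nature-code mappings; real crosswalk entries differ.
WCIO_TO_OIICS_NATURE = {
    "52": "111",
    "10": "121",
}

def recode_nature(wcio_code):
    """Map a WCIO nature code to OIICS, flagging codes that need manual review."""
    return WCIO_TO_OIICS_NATURE.get(wcio_code, "UNRESOLVED")

claims = [{"id": 1, "nature": "52"}, {"id": 2, "nature": "99"}]
coded = [recode_nature(c["nature"]) for c in claims]
unresolved = [c["id"] for c in claims if recode_nature(c["nature"]) == "UNRESOLVED"]
```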

Injury types

Based on previous studies 39,50, we further coded claims into 17 types based on injury

and illness characteristics (i.e., nature and event). OIICS codes in version 1.01 used in previous

studies were reviewed and converted into codes in version 2.01 by consensus among the study

team. We used the Bureau of Labor Statistics definition for work-related musculoskeletal

disorders (WMSDs) (in OIICS code version 2.01) 51.

Medical cost

Medical costs were standardized to the 2018 US dollar based on inflation information

from the Bureau of Labor Statistics (BLS) website 52.
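The standardization scales each cost by the ratio of the 2018 price index to the claim-year index. A minimal sketch, using placeholder CPI values rather than the official BLS figures:

```python
# Placeholder annual-average CPI values; the official BLS figures differ.
CPI = {2013: 233.0, 2014: 236.7, 2018: 251.1}

def to_2018_dollars(cost, year):
    """Inflate a medical cost from its claim year to 2018 US dollars."""
    return cost * CPI[2018] / CPI[year]

adjusted = to_2018_dollars(1000.0, 2013)  # ~1077.68 under these placeholder values
```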

Analysis

To characterize work-related injuries and illnesses among Oregon young workers, we

counted claims by year, age group, gender, industry, injury nature and injury type for disabling

claims and non-disabling claims separately. We also calculated median medical costs associated

with the injuries and illnesses for each grouping.

Claim rates were estimated by year, age group, gender and industry for disabling and

non-disabling claims separately, expressed as the number of claims per 10,000 workers. We

chose the unit of workers to facilitate the comparison with previous studies on Oregon young

workers. Disabling claim counts and rates for adult workers (25 years and above) were also

calculated to provide a point of reference.
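The rate arithmetic can be sketched as follows, using a normal approximation to the Poisson count for the confidence interval; the worker-years denominator below is an illustrative approximation chosen to be near the study's totals, not the actual QWI figure:

```python
import math

def claim_rate_per_10k(n_claims, worker_years, z=1.96):
    """Claim rate per 10,000 workers with an approximate 95% CI."""
    rate = n_claims / worker_years * 10_000
    se = math.sqrt(n_claims) / worker_years * 10_000  # Poisson SE, rescaled
    return rate, (rate - z * se, rate + z * se)

# Illustrative: 13,360 disabling claims over ~1.197 million worker-years.
rate, (lo, hi) = claim_rate_per_10k(13_360, 1_197_000)
```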

To adjust for under-coverage in calculating non-disabling claim rates, we weighted the

QWI employment data at the 2-digit NAICS industry level using the percentage of SAIF-covered

disabling claims over the total number of disabling claims in each industry in a similar study

period from 2014 to 2018 (proportion data provided by DCBS). The proportions ranged from

13.5% (NAICS code 22, “Utilities”) to 90% (NAICS code 11, “Agriculture, forestry, fishing and

hunting”), with an average of 55.6%. An implicit assumption is that similar percentages hold for

non-disabling claims. Specifically, we assumed that these proportions reflected SAIF’s market

share, that is, the percentage of workers covered under its premium in these industry sectors.
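The adjustment can be sketched as scaling each industry's QWI employment by SAIF's estimated coverage proportion before dividing, so that SAIF-only claim counts are related to SAIF-covered employment. The proportions and figures below are illustrative, not the DCBS-provided values:

```python
# Illustrative coverage proportions and counts (not the DCBS-provided data).
coverage = {"11": 0.90, "22": 0.135, "23": 0.55}        # 2-digit NAICS -> SAIF share
employment = {"11": 20_000, "22": 1_000, "23": 15_000}  # QWI young-worker employment
claims = {"11": 700, "22": 5, "23": 300}                # SAIF non-disabling claims

# Claim rate per 10,000 SAIF-covered workers in each industry.
rates = {
    naics: claims[naics] / (employment[naics] * coverage[naics]) * 10_000
    for naics in claims
}
```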

To rank industries of high risk, we adapted the Prevention Index (PI) methodology

developed by researchers in the state of Washington 39,40. This method has been used in a number

of studies as an effective way to systematically rank industry sectors to inform research and

prevention priorities 39,40,50,53,54. Each industry is ranked by injury count, injury rate and

associated medical cost, and arithmetic means of these ranks are calculated to obtain PI ranks.

The basic PI was calculated using the following formula. In case of a tie in basic PI, rate

rank was used as the tiebreaker.


Basic PI = (count rank + rate rank) / 2

To reflect injury severity, the expanded PI further adds medical cost rate rank, in which

the medical cost rate was calculated as the total medical cost incurred in an industry sector over

the total employment in this sector (per 10,000 workers). The expanded PI was calculated using

the following formula. In case of a tie in expanded PI, cost rank was used as the tiebreaker.
Expanded PI = (count rank + rate rank + cost rank) / 3
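The two PI calculations can be sketched as below; the sector figures are invented for illustration, and the tiebreaking rules described above are omitted for brevity:

```python
def rank_desc(values):
    """Rank categories so that the largest value receives rank 1."""
    ordered = sorted(values, key=values.get, reverse=True)
    return {sector: i + 1 for i, sector in enumerate(ordered)}

def prevention_index(counts, rates, cost_rates=None):
    """Basic PI averages count and rate ranks; expanded PI adds the cost rate rank."""
    count_rank, rate_rank = rank_desc(counts), rank_desc(rates)
    if cost_rates is None:
        return {s: (count_rank[s] + rate_rank[s]) / 2 for s in counts}
    cost_rank = rank_desc(cost_rates)
    return {s: (count_rank[s] + rate_rank[s] + cost_rank[s]) / 3 for s in counts}

# Invented sector figures for illustration; a lower PI means a higher priority.
counts = {"Agriculture": 900, "Construction": 700, "Services": 1_200}
rates = {"Agriculture": 250.0, "Construction": 180.0, "Services": 60.0}
basic = prevention_index(counts, rates)  # {"Agriculture": 1.5, ...}
```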

We calculated basic and expanded PIs for each NORA sector for disabling claims and

non-disabling claims separately. In calculating PI ranks, we excluded categories with fewer

than 10 claims over the study period from the analysis due to small sample size. PI ranks were

also calculated for strata by selected factors including age group, gender and major injury types.

Due to the very small number of claimants aged below 14 years, separate statistics are not

reported in this paper. We excluded claims with age below 10 years from age-related

calculations, as age was likely miscoded in these claims. For analyses involving claim rates, claims

with age below 14 years were excluded as no employment data (denominator) were available.

All statistical analyses were performed using R software (version 3.5.1). A two-sided

95% confidence level was applied when appropriate.

The study obtained ethics review approval by the Institutional Review Board at Oregon

State University (Review#: IRB-2019-0448).

Results

Injury frequency and rate

There were 13,360 accepted disabling claims and 24,460 accepted non-disabling claims

among Oregon young workers over the six-year study period from 2013 to 2018 (Table 4-1).

Nine claims involved fatal injuries. There were no important differences between disabling

and non-disabling claims in terms of claimants’ gender and age group distributions (Table 4-1).

The overall disabling claim rate from 2013-2018 was 111.6 per 10,000 workers (95%

Confidence Interval (CI): 109.7-113.5) (Table 4-1). The overall non-disabling claim rate was

401.3 per 10,000 workers (95% CI: 393.6-406.4), which was 3.60 times the disabling

claim rate (rate ratio 95% CI: 3.52-3.67). The median medical cost for disabling claims was $1,758

(hereinafter, in 2018 US dollars), while the median medical cost for non-disabling claims was

$499.

An increasing trend in both the claim count and rate during the study period was

observed for non-disabling claims, with an estimated 1.03-fold increase in injury rate each year

(95% CI: 1.02-1.04). The trend in disabling claim rate remained fairly constant from 2013 to

2017 but dropped substantially in 2018. This drop likely reflects data

incompleteness in this year due to shorter maturation time, which is a common phenomenon in

disabling WC claims datasets 55.

Workers aged 19-21 (disabling: 119.0 per 10,000 workers and non-disabling: 429.3) and

22-24 years (115.7 and 396.4) had higher claim rates than the youngest workers aged 14-18 (80.6

and 297.0) (Table 4-1). Males had higher claim rates (disabling: 145.3 per 10,000 workers and

non-disabling: 509.0) than females (79.8 and 282.9). For both disabling and non-disabling

claims, the injury rate ratio comparing male and female workers decreased with increasing age

(Rate Ratio (RR): >1.91 in age group 14-18 vs. <1.78 in age group 22-24, data not shown).

Figure 4-1 further displays disabling claim rates by gender for all age groups, including

adult workers aged above 24 years. The age group 19-21 years had the highest injury rates in

both genders. Injury rates for females and males in this age group were 1.05 (95% CI: 1.00-1.10)

and 1.16 (95% CI: 1.12-1.20) times those of female and male adult workers, respectively. The

youngest age group (14-18) had the lowest injury rates, which were 0.70 (95%CI: 0.65-0.76) and

0.81 (95%CI: 0.76-0.87) times those of female and male adult workers.

Injury nature and major types

Most claims involved the injury nature, “traumatic injuries and disorders” (92.6% in both

disabling and non-disabling claims combined), followed by “infectious and parasitic diseases”

(4.6%) and “diseases and disorders of body systems” (2.0%). Other and multiple diseases,

conditions, disorders, and symptoms accounted for fewer than 1% of the claims; however, they

incurred much higher medical cost. For example, the median medical cost in these injury natures

was $10,656 as compared with $1,722 in the above three more common injury natures in

disabling claims.

Regarding injury types, “struck by/against” was the most common type accounting for

35.6% of disabling and non-disabling claims combined, followed by WMSDs (19.5%), “fall on

same level” (9.1%), and violence (6.6%) (Figure 4-2). Non-disabling claims involved a higher

proportion of “struck by/against” (41.8% vs. 24.2%) and violence (7.6% vs. 4.6%) injuries

relative to the disabling claims. Young workers had more “struck by/against” injuries (36.0% vs.

13.8%) and fewer WMSDs (19.5% vs. 35.7%) as compared to adult workers (in disabling claims).

Considering medical cost, WMSDs tended to have higher medical costs, which was especially

evident in non-disabling claims, where they had the highest median cost ($692).

Claim rates by major injury types and the nine NORA sectors are presented in Table 4-2.

The “agriculture, forestry & fishing” (hereinafter referred to as agriculture) sector had much higher

than average rates for all major injury types. Although the mining sector did not have claim

records for all these major injury types, it had high rates for most injury types in which it had claims.

WMSDs had high rates in nearly all NORA sectors, with the “healthcare & social assistance”

(hereinafter referred to as healthcare), manufacturing, and mining sectors having the highest rates

for both disabling and non-disabling claims. The disabling WMSD rate was exceptionally high in

the “transportation, warehousing & utilities” (hereinafter referred to as transportation) sector (RR:

3.4; 95% CI: 3.04-3.79, compared with the average disabling WMSD rate). “Struck by/against”

injuries were concentrated in production sectors such as agriculture, construction and



manufacturing sectors. It is worth noting that violence injuries were concentrated in a few sectors

including healthcare, agriculture and public safety, while the other sectors had extremely low

rates.

Identifying industries of high risk

Nine NORA industry sectors were ranked using the basic and expanded PIs (Table 4-3).

For disabling claims, the agriculture sector ranked the highest with both the basic and expanded

PIs, followed by construction, transportation, and manufacturing. The mining sector ranked

among the lowest with basic PI, however it ranked high on expanded PI which incorporates

medical costs. For non-disabling claims, the manufacturing and construction sectors had the

highest ranks with both basic and expanded PIs. Compared to disabling claims, the agriculture

and transportation sectors ranked relatively lower. Again, mining sector ranked high with

expanded PI. In both disabling and non-disabling claims, “services (except public safety and

veterinary medicine/animal care)” (hereinafter referred to as services) and “wholesale and retail”

trade” ranked fairly low given their low injury rates.

Stratified by gender, for both disabling and non-disabling claims, the healthcare sector

ranked the highest for female workers but among the lowest for male workers (Table S1-S2 in

Appendix 4-1). On the other hand, construction and agriculture were among the top ranked

sectors for male workers but among the lowest ranked for female workers. Further examination

showed that male claimants dominated the construction and agriculture sectors (>87%), while

there were mostly female claimants in the healthcare sector (80%).

We further ranked industry groups based on NAICS 4-digit codes. These results are

included in Appendix 4-2 for interested readers but not discussed in this paper.

Discussion

Injury is the leading cause of death as well as years of potential life lost for children and

young adults 56. Preventing work-related injuries and illnesses among young workers continues

to be a critical need. The present study explored the use of both disabling and non-disabling

WC claims data to document patterns of work-related injuries and illnesses among Oregon

young workers aged 24 years and under in a 6-year period from 2013 to 2018. Using a

Prevention Index, the study systematically prioritized industry sectors with a high injury risk for

young workers.

Trend and types of injuries and illnesses among young workers

The study identified an overall disabling injury rate of 111.6 per 10,000 workers for

young workers, which is slightly lower than the reported disabling injury rate of 122.7 per

10,000 workers for Oregon young workers from 2000-2007 in the previous study 33. Breaking

down by age and gender, we noticed that male workers and workers aged 19-21 and 22-24 had

higher rates. The findings are consistent with existing literature 3,9,11,33. The study quantified

a non-disabling injury rate approximately 4 times the rate of disabling injuries among Oregon

young workers, which has not been reported before. Findings from the study highlighted the

importance of using both disabling and non-disabling WC claims data for a more complete report

of injuries and illnesses among young workers.

Common injury types identified for Oregon young workers were similar to those for

Oregon adult workers in this study, as well as to those reported in the state of Washington for

workers of all ages combined 39. Consistent with existing studies 3,8,22, we found that the overall

proportion of WMSDs among young workers was less than that among adult workers. Young

workers may have less time to develop impaired musculoskeletal functioning compared with

older workers 3; however, a progressive history of WMSDs and other musculoskeletal conditions

accumulated from a young age may lead to a higher risk of further injuries and permanent

disability 3,57. In contrast to WMSDs, “struck by/against” injuries were more common among young

workers than adult workers. They were more prevalent in non-disabling claims, indicating this

type of injury may be less severe but happen more often. “Struck by/against” injuries refer to

those produced by forcible contact or impact between the injured person and a source of injury. Most

injuries of this type involved open wounds or surface wounds and bruises (80.9%). Young

workers often work with machinery, equipment and tools, especially in production sectors such as

agriculture, construction and manufacturing, where “struck by/against” injuries were

concentrated. These machines may be particularly dangerous to young workers, especially the

younger age group, as they are usually not designed to fit young workers’ physical body frames,

leading to awkward postures and injuries 58. Further, young workers often hold fast-paced jobs

with higher levels of physical exertion, which exacerbates their injury risk 5,8,10,12,14,15. On the

other hand, young people tend to lack experience and cognitive maturity to deal with difficult

tasks or work environments 5,10,11,22. They may also lack access to training and appropriate

supervision on the job 17,18. Education and other intervention programs that increase young

people’s safety skills, knowledge, and awareness of workplace hazards, and that promote

appropriate supervision, can help reduce their vulnerability to many workplace injuries,

including “struck by/against”.

Prioritizing industries for prevention intervention

No previous study has systematically ranked high-risk industries for young workers. This

study reported high-ranked NORA sectors in both disabling and non-disabling claims based on

the PI method. The basic and expanded PI ranks can inform research and intervention targets by

incorporating injury count, rate, and cost burden. Previous studies commonly reported

agriculture, manufacturing, trade and service sectors as hazardous for adolescent and young adult

workers 20,32,33,59. Our study suggested that agriculture, construction and manufacturing sectors

should be priority targets as they had fairly high injury counts and rates, as well as high medical

cost associated with the injuries and illnesses. Services and wholesale and retail trade sectors

should be a focus in addressing the large number of injuries among young workers. It is of note

that the mining sector had high expanded PI ranks due to its very high claim rate and cost rate,

despite being ranked low by the basic PI because it had the fewest claims. Although the mining

industry has historically been seen as one of the most dangerous industries in the US for its high

death toll and high injury rate across all ages 60, few previous studies have highlighted concerns about the

mining sector for young workers. Given the high injury rate and associated medical cost among

Oregon young workers, more research is needed to investigate young workers’ safety and health

in these industries at the state and local level. Prevention efforts in these industries must consider

how to target young workers or create provisions specific for younger workers.
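The PI ranking logic can be illustrated with a short sketch. The exact formula is not restated in this excerpt, so the code below assumes the common form of the Prevention Index, in which the basic PI averages a sector's rank by claim count and its rank by claim rate, and the expanded PI additionally averages in a rank by cost rate; the helper name rank_desc is ours. The three sectors and their disabling-claim figures are taken from Table 4-3.

```python
def rank_desc(metric):
    """Rank sectors 1..n by a metric, with the largest value ranked 1."""
    ordered = sorted(metric, key=metric.get, reverse=True)
    return {sector: i + 1 for i, sector in enumerate(ordered)}

# Disabling-claim figures for three NORA sectors, taken from Table 4-3.
counts = {"Agriculture": 1017, "Construction": 1086, "Mining": 15}          # claim counts
rates = {"Agriculture": 310.2, "Construction": 234.5, "Mining": 225.6}      # per 10,000 workers
cost_rates = {"Agriculture": 20965.9, "Construction": 14666.1, "Mining": 37458.1}  # per 100 workers

count_rank = rank_desc(counts)
rate_rank = rank_desc(rates)
cost_rank = rank_desc(cost_rates)

# Basic PI: average of the count rank and the rate rank (lower PI = higher priority).
basic_pi = {s: (count_rank[s] + rate_rank[s]) / 2 for s in counts}
# Expanded PI (assumed form): the cost-rate rank is averaged in as well.
expanded_pi = {s: (count_rank[s] + rate_rank[s] + cost_rank[s]) / 3 for s in counts}

print("basic PI:", basic_pi)
print("expanded PI:", expanded_pi)
```

Even in this three-sector toy example, mining's expanded PI is better (lower) than its basic PI: its very high cost rate offsets its small claim count, which mirrors the shift in ranks reported above.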

Gender difference

In line with most studies on gender differences regarding work-related injuries

3,8,14,15,20,21,33, the study showed that male young workers had the most injuries, with nearly

double the injury rate of their female counterparts. Recognizing the high burden of

injury and illness among male workers, researchers have also pointed out a bias in occupational

health: it tends to emphasize injuries and risk factors typically associated with male workers and

overlook those associated with women 16,22,61. Our study showed that male young workers

dominated most high-ranked sectors. In fact, the high-ranked industries for both genders

combined mostly represented patterns of male young workers. For example, agriculture and

construction, ranked high for both genders combined and for males, were ranked low for

females. On the other hand, the healthcare sector, ranked the highest for females, was listed low

in the combined list. By making lists specifically for female workers, the study revealed

industries and injury types that contributed disproportionately to occupational injury and illness

in female young workers.
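The roughly twofold male-to-female difference can be checked against Table 4-1 with a standard calculation. As an illustrative sketch (assuming the usual log-normal Poisson approximation for a rate-ratio confidence interval, which the study does not spell out here), the following reproduces the reported male-vs-female disabling rate ratio:

```python
import math

def rate_ratio_ci(cases1, rate1, cases0, rate0, z=1.96):
    """Rate ratio with a 95% CI under the usual log-normal (Poisson) approximation."""
    rr = rate1 / rate0
    se_log = math.sqrt(1 / cases1 + 1 / cases0)  # SE of log(RR) from the two case counts
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Male vs. female disabling claims, figures taken from Table 4-1.
rr, lo, hi = rate_ratio_ci(cases1=8433, rate1=145.3, cases0=4926, rate0=79.8)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # RR = 1.82 (95% CI 1.76, 1.89)
```

The result matches the 1.82 (1.76, 1.89) reported in Table 4-1, suggesting this is the form of interval used.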

Limitations

Despite the strength of including both disabling and non-disabling claims data, injuries

and illnesses are likely to be underestimated due to under-coverage and under-reporting issues

with WC claims 36,41,62–64. Young workers are more likely to be under-counted as they often

engage in temporary or seasonal work, or self-employment arrangements such as farm work or

with family businesses, which are not covered by Oregon WC 41.

Without data on the number of workers covered by SAIF, we weighted

employment data using claim proportions, based on the assumption that the workforce covered by

SAIF was the same as that covered by other insurers with respect to factors of interest such as age

and gender. We understand that this assumption might not hold true. However, we think the

estimation is appropriate for this exploratory study, given that SAIF is the largest WC

provider in Oregon and covers the majority of Oregon employers. Data interoperability between

disabling and non-disabling data was another challenge in this study. Despite our best effort to

develop a crosswalk between WCIO and OIICS, misclassification was unavoidable in certain

codes that could not be perfectly matched.
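The weighting step described above can be sketched in a few lines. All numbers and stratum labels below are hypothetical, purely for illustration; the sketch only encodes the stated assumption that SAIF's share of claims in a stratum approximates its share of employment in that stratum.

```python
# Hypothetical illustration of the employment-weighting assumption.
total_employment = {"female_19_21": 80_000, "male_19_21": 90_000}   # all Oregon workers
saif_claims = {"female_19_21": 300, "male_19_21": 520}              # SAIF claims per stratum
all_insurer_claims = {"female_19_21": 700, "male_19_21": 1_100}     # claims from all insurers

# Estimated SAIF-covered employment: scale total employment by SAIF's claim share.
saif_employment = {
    s: total_employment[s] * saif_claims[s] / all_insurer_claims[s]
    for s in total_employment
}

# Claim rate per 10,000 workers within the estimated SAIF-covered workforce.
rates = {s: saif_claims[s] / saif_employment[s] * 10_000 for s in saif_claims}
```

One consequence of this proportional assumption is that the stratum-specific rate algebraically reduces to all-insurer claims divided by total employment, which is why the estimate is reasonable only insofar as SAIF's covered workforce resembles the overall workforce within each stratum.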

Conclusions

The study explored the expanded use of WC data by including both disabling and non-

disabling claims data in a recent 6-year period from 2013 to 2018 in describing young workers’

injury burden. The findings of this study complement previous studies with a more complete

report on work-related injuries and illnesses among young workers.

In summary, young workers experienced higher injury rates than adult workers. There

were significantly more non-disabling injuries than disabling injuries across different industry

sectors, age and gender groups. For both disabling and non-disabling claims, male young

workers and workers aged 19-21 and 22-24 years had a greater injury burden than their female

and younger counterparts. Common injury types included “struck by/against”, WMSDs, “fall on

same level” and violence, with differences observed between disabling and non-disabling claims,

and across industries and genders. The study is the first to report a systematic ranking of industry

sectors with high injury risk among young workers as a whole as well as in sub-groups.

The study has practical implications for young workers’ occupational safety and health.

A more complete picture of work-related injuries and illnesses based on multiple data sources

provides evidence to both develop and evaluate targeted prevention interventions for Oregon

young workers. High ranked industry sectors for young workers as a whole, such as agriculture,

construction and manufacturing sectors, and for female workers separately such as healthcare

should be priority targets to reduce both the number and rate of injuries and illnesses among

young workers. Interventions targeting certain industries could aim for major injury types that

are prevalent in that sector. For example, WMSDs and violence injuries should be the focus in

the healthcare sector.

Exploring non-disabling claims data from commercial WC insurers and the use of the

Prevention Index methodology enhances OSH surveillance of young workers, as well as other

worker groups. Future efforts to obtain non-disabling data from more insurers and to examine

data representativeness would increase the completeness and accuracy in documenting injuries

and illnesses among young workers. The study methodology could be adopted by other states to

inform prevention efforts with state WC data.

References

1. National Institute for Occupational Safety and Health. NIOSH Workplace Safety and
Health Topic: Young Worker Safety and Health. Published 2019. Accessed February 9,
2020. https://www.cdc.gov/niosh/topics/youth/default.html

2. National Research Council. Protecting Youth at Work: Health, Safety, and Development of
Working Children and Adolescents in the United States. The National Academies Press;
1998. doi:10.17226/6019

3. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.

4. Pratt B, Cheesman J, Breslin C, Do MT. Occupational injuries in Canadian youth: an


analysis of 22 years of surveillance data collected from the Canadian Hospitals Injury
Reporting and Prevention Program. Health Promot Chronic Dis Prev Can Res Policy
Pract. 2016;36(5):89-98.

5. Breslin FC, Day D, Tompa E, Bhattacharyya S, Clarke J, Wang A. Systematic Review of


Risk Factors for Work Injury among Youth. Institute for Work & Health; 2005:101.

6. Windau J, Samuel M. Occupational injuries among young workers. Monthly Labor Rev.
Published online 2005:11-23.

7. Bena A, Leombruni R, Giraudo M, Costa G. A new Italian surveillance system for


occupational injuries: characteristics and initial results. Am J Ind Med. 2012;55(7):584-
592. doi:10.1002/ajim.22025

8. Breslin FC, Smith P. Age-related differences in work injuries: a multivariate, population-


based study. Am J Ind Med. 2005;48(1):50-56. doi:10.1002/ajim.20185

9. Centers for Disease Control and Prevention. Nonfatal Occupational Injuries and Illnesses
Treated in Hospital Emergency Departments--United States, 2003. MMWR Morb Mortal
Wkly Rep. 2006;55(449-452):313.

10. Hanvold TN, Kines P, Nykänen M, et al. Occupational Safety and Health Among Young
Workers in the Nordic Countries: A Systematic Literature Review. Saf Health Work.
Published online December 2018. doi:10.1016/j.shaw.2018.12.003

11. Salminen S. Have young workers more injuries than older ones? An international literature
review. J Safety Res. 2004;35(5):513-521. doi:10.1016/j.jsr.2004.08.005

12. Hanvold TN. Young Workers and Sustainable Work Life: Special Emphasis on Nordic
Conditions. Nordic Council of Ministers; 2016.

13. Paterson JL, Clarkson L, Rainbird S, Etherton H, Blewett V. Occupational fatigue and
other health and safety issues for young Australian workers: an exploratory mixed
methods study. Ind Health. 2015;53(3):293-299. doi:10.2486/indhealth.2014-0257

14. Layne L, Castillo D, Stout N, Cutlip P. Adolescent Occupational Injuries Requiring


Hospital Emergency Department Treatment - a Nationally Representative Sample. Am J
Public Health. 1994;84(4):657-660. doi:10.2105/AJPH.84.4.657

15. Miller ME, Kaufman JD. Occupational injuries among adolescents in Washington State,
1988-1991. Am J Ind Med. 1998;34(2):121-132. doi:10.1002/(SICI)1097-
0274(199808)34:2<121::AID-AJIM4>3.3.CO;2-#

16. Breslin FC, Polzer J, MacEachen E, Morrongiello B, Shannon H. Workplace injury or


“part of the job”? Towards a gendered understanding of injuries and complaints among
young workers. Soc Sci Med. 2007;64(4):782-793. doi:10.1016/j.socscimed.2006.10.024

17. Dragano N, Barbaranelli C, Reuter M, et al. Young Workers’ Access to and Awareness of
Occupational Safety and Health Services: Age-Differences and Possible Drivers in a
Large Survey of Employees in Italy. Int J Environ Res Public Health. 2018;15(7):1511.
doi:10.3390/ijerph15071511

18. Runyan CW, Schulman M, Dal Santo J, Bowling JM, Agans R, Ta M. Work-related
hazards and workplace safety of US adolescents employed in the retail and service sectors.
Pediatrics. 2007;119(3):526-534. doi:10.1542/peds.2006-2009

19. Nielsen ML, Dyreborg J, Kines P, Nielsen KJ, Rasmussen K. Exploring and Expanding
the Category of ‘Young Workers’ According to Situated Ways of Doing Risk and
Safety—a Case Study in the Retail Industry. Nord J Work Life Stud. 2013;3(3):219-234.
doi:10.19154/njwls.v3i3.3019

20. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.

21. Mujuru P, Mutambudzi M. Injuries and Seasonal Risks among Young Workers in West
Virginia - A 10-Year Retrospective Descriptive Analysis. AAOHN J. 2007;55(9):381-387.
doi:10.1177/216507990705500906

22. Laberge M, Ledoux E. Occupational health and safety issues affecting young workers: A
literature review. Work- J Prev Assess Rehabil. 2011;39(3):215-232. doi:10.3233/WOR-
2011-1170

23. Parker DL, Carl WR, French LR, Martin FB. Nature and incidence of self-reported
adolescent work injury in Minnesota. Am J Ind Med. 1994;26(4):529-541.
doi:10.1002/ajim.4700260410

24. Brooks DR, Davis LK, Gallagher SS. Work-related injuries among Massachusetts
children: A study based on emergency department data. Am J Ind Med. 1993;24(3):313-
324. doi:10.1002/ajim.4700240308

25. Graves JM, Sears JM, Vavilala MS, Rivara FP. The burden of traumatic brain injury
among adolescent and young adult workers in Washington State. J Safety Res.
2013;45:133-139. doi:10.1016/j.jsr.2012.11.001

26. US Bureau of Labor Statistics. Survey of Occupational Injuries and Illnesses Data.
Published 2019. Accessed March 10, 2020. https://www.bls.gov/iif/soii-overview.htm

27. Brooks DR, Davis LK. Work-related injuries to Massachusetts teens, 1987–1990. Am J
Ind Med. 1996;29(2):153-160. doi:10.1002/(SICI)1097-0274(199602)29:2<153::AID-
AJIM5>3.0.CO;2-T

28. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460

29. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Env Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743

30. Tarawneh I, Lampl M, Robins D, et al. Workers’ Compensation Claims for


Musculoskeletal Disorders Among Wholesale and Retail Trade Industry Workers — Ohio,
2005–2009. MMWR Morb Mortal Wkly Rep. 2013;62(22):437-442.

31. Council of State and Territorial Epidemiologists. Occupational Health Indicators: A


Guide for Tracking Occupational Health Conditions and Their Determinants (Update
2019). Council of State and Territorial Epidemiologists (CSTE) in collaboration with the
NIOSH; 2020.
https://cdn.ymaws.com/www.cste.org/resource/resmgr/publications/OHI_Guidance_Manu
al_FINAL-2.pdf

32. McCall BP, Horwitz IB, Carr BS. Adolescent Occupational Injuries and Workplace Risks:
An Analysis of Oregon Workers’ Compensation Data 1990–1997. J Adolesc Health.
2007;41(3):248-255. doi:10.1016/j.jadohealth.2007.02.004

33. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819

34. McCall BP, Horwitz IB. Workplace Violence in Oregon: An Analysis Using Workers’

Compensation Claims from 1990–1997. J Occup Environ Med. 2004;46(4):357–366.
doi:10.1097/01.jom.0000121131.34757.ed

35. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. National Institute for Occupational Safety and Health; 2014.

Accessed February 19, 2019. http://elcosh.org/document/3752/d001287/workers%3F-


compensation-insurance%3A-a-primer-for-public-health.html

36. Tucker S, Diekrager D, Turner N, Kelloway EK. Work-related injury underreporting


among young workers: Prevalence, gender differences, and explanations for
underreporting. J Safety Res. 2014;50:67-73. doi:10.1016/j.jsr.2014.04.001

37. Turner N, Tucker S, Kelloway EK. Prevalence and demographic differences in


microaccidents and safety behaviors among young workers in Canada. J Safety Res.
2015;53:39-43. doi:10.1016/j.jsr.2015.03.004

38. SAIF. SAIF company information. Published 2020. Accessed March 7, 2020.
https://www.saif.com/x640.xml

39. Anderson NJ, Bonauto DK, Adams D. Prioritizing Industries for Occupational Injury and
Illness Prevention and Research, Washington State Workers’ Compensation Claims Data,
2002–2010. Safety and Health Assessment and Research for Prevention (SHARP)
Program, Washington State Department of Labor & Industries; 2013:1-58. Accessed
February 22, 2017. http://www.lni.wa.gov/Safety/Research/Files/bd_3F.pdf

40. Bonauto D, Silverstein B, Adams D, Foley M. Prioritizing Industries for Occupational


Injury and Illness Prevention and Research, Washington State Workers’ Compensation
Claims, 1999-2003. J Occup Environ Med. 2006;48(8):840-851.
doi:10.1097/01.jom.0000225062.88285.b3

41. Oregon State Legislature. Workers’ Compensation (2019 Edition); 2019. Accessed March
10, 2020. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html

42. Oregon State Legislature. Workers’ Compensation (2017 Edition); 2017. Accessed
November 20, 2017. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html

43. Department of Consumer and Business Services. State of Oregon Workers’ compensation
insurance and self insurance. Accessed April 4, 2020.
https://www.oregon.gov/dcbs/reports/compensation/insurance/Pages/index.aspx

44. US Census Bureau Center for Economic Studies. Longitudinal Employer-Household


Dynamics. Published 2020. Accessed March 10, 2020. https://lehd.ces.census.gov/data/

45. Syron LN, Kincl L, Yang L, Cain DT, Smit E. Analysis of workers’ compensation
disabling claims in Oregon’s seafood preparation and packaging industry, 2007-2013. Am
J Ind Med. 2017;60(5):484-493. doi:10.1002/ajim.22706

46. US Census Bureau. North American Industry Classification System. Published 2020.
Accessed March 10, 2020. https://www.census.gov/cgi-
bin/sssd/naics/naicsrch?chart=2012

47. Centers for Disease Control and Prevention. National Occupational Research Agenda
(NORA): Sectors. Published February 4, 2019. Accessed March 10, 2020.
https://www.cdc.gov/nora/sectorapproach.html

48. Centers for Disease Control and Prevention. Occupational Injury and Illness Classification
System (OIICS). Published 2019. Accessed March 10, 2020.
https://wwwn.cdc.gov/Wisards/oiics/default.aspx

49. Workers Compensation Insurance Organizations. Injury Description Table -


Part/Nature/Cause. Published 2020. Accessed March 10, 2020.
https://www.wcio.org/Document%20Library/InjuryDescriptionTablePage.aspx

50. Anderson NJ, Bonauto DK, Adams D. Prioritizing industries for occupational injury
prevention and research in the Services Sector in Washington State, 2002–2010. J Occup
Med Toxicol. 2014;9(37):1-15. doi:10.1186/s12995-014-0037-2

51. US Bureau of Labor Statistics. Occupational Safety and Health Definitions. Published
2016. Accessed March 10, 2020. https://www.bls.gov/iif/oshdef.htm

52. Bureau of Labor Statistics. Consumer Price Index (CPI). Published 2017. Accessed May
17, 2017. https://www.bls.gov/cpi/

53. Fouquet N, Bodin J, Chazelle E, Descatha A, Roquelaure Y. Use of Multiple Data Sources
for Surveillance of Work-Related Chronic Low-Back Pain and Disc-Related Sciatica in a
French Region. Ann Work Expo Health. 2018;62(5):530-546.
doi:10.1093/annweh/wxy023

54. Silverstein B, Viikari-Juntura E, Kalat J. Use of a prevention index to identify industries at


high risk for work-related musculoskeletal disorders of the neck, back, and upper
extremity in Washington state, 1990-1998. Am J Ind Med. 2002;41(3):149-169.

55. Yang L, Branscum A, Smit E, Dreher D, Howard K, Kincl L. Work-related injuries and
illnesses and their association with hour of work: Analysis of the Oregon construction
industry in the US using workers’ compensation accepted disabling claims, 2007-2013. J
Occup Health. 2020;62(1):e12118. doi:10.1002/1348-9585.12118

56. Graitcer PL. The development of state and local injury surveillance systems. J Safety Res.
1987;18(4):191-198. doi:10.1016/0022-4375(87)90082-X

57. Saha A, Sadhu HG. Occupational Injury Proneness in Young Workers: A Survey in Stone
Quarries. J Occup Health. 2013;55(5):333-339. doi:10.1539/joh.12-0150-OA

58. Runyan CW, Zakocs RC. Epidemiology and Prevention of Injuries Among Adolescent
Workers in the United States. Annu Rev Public Health. 2000;21(1):247-269.
doi:10.1146/annurev.publhealth.21.1.247

59. Belville R, Pollack SH, Godbold JH, Landrigan PJ. Occupational injuries among working
adolescents in New York State. JAMA. 1993;269(21):2754-2759.

60. Margolis KA. Underground coal mining injury: A look at how age and experience relate to
days lost from work following an injury. Saf Sci. 2010;48(4):417-421.
doi:10.1016/j.ssci.2009.12.015

61. Taylor EJ, Neis B, Messing K, Dumais L. Invisible: issues in women’s occupational
health. Resour Fem Res Tor. 1996;25(1/2):36-37.

62. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.

63. Biddle J, Roberts K, Rosenman KD, Welch EM. What Percentage of Workers With Work-
Related Illnesses Receive Workers’ Compensation Benefits? J Occup Environ Med.
1998;40(4):325–331.

64. Shannon HS, Lowe GS. How many injured workers do not file claims for workers’
compensation benefits? Am J Ind Med. 2002;42(6):467-473. doi:10.1002/ajim.10142

Tables & Figures

Table 4-1 Oregon workers' compensation disabling and non-disabling claims frequency, rate, medical cost by year and demographics, 2013-2018

                 Disabling Claims                                           Non-disabling Claims
                 #Claims (%)    Rate/10,000 (95% CI)    Rate Ratio (95% CI)    #Claims (%)    Rate/10,000 (95% CI)    Rate Ratio (95% CI)
Total            13360 (100)    111.6 (109.7, 113.5)    /                      24660 (100)    401.3 (396.3, 406.4)    /
Year
  2013           2000 (15.0)    113.1 (108.2, 118.1)    1                      3258 (13.2)    360.8 (348.6, 373.3)    1
  2014           2157 (16.1)    115.4 (110.6, 120.3)    1.02 (0.96, 1.08)      3793 (15.4)    397.0 (384.5, 409.8)    1.10 (1.05, 1.15)
  2015           2234 (16.7)    112.5 (108.0, 117.3)    1.00 (0.94, 1.06)      4016 (16.3)    396.0 (383.9, 408.4)    1.10 (1.05, 1.15)
  2016           2417 (18.1)    116.7 (112.1, 121.4)    1.03 (0.97, 1.10)      4255 (17.3)    400.6 (388.7, 412.8)    1.11 (1.06, 1.16)
  2017           2410 (18.0)    113.5 (109.1, 118.1)    1.00 (0.95, 1.07)      4519 (18.3)    413.4 (401.4, 425.5)    1.15 (1.10, 1.20)
  2018           2142 (16.0)    99.3 (86.0, 95.1)       0.88 (0.83, 0.93)      4819 (19.5)    431.6 (419.5, 443.9)    1.20 (1.14, 1.25)
Gender
  Female         4926 (36.9)    79.8 (77.6, 82.1)       1                      8890 (36.1)    282.9 (277.1, 288.9)    1
  Male           8433 (63.1)    145.3 (142.3, 148.5)    1.82 (1.76, 1.89)      15284 (62.0)   509.0 (501.0, 517.1)    1.80 (1.75, 1.85)
Age Group
  14-18          1473 (11.0)    80.6 (76.6, 84.8)       1                      2853 (11.6)    297.0 (286.3, 308.1)    1
  19-21          5167 (38.7)    119.0 (115.8, 122.3)    1.48 (1.39, 1.56)      9470 (38.4)    429.3 (420.7, 438.0)    1.45 (1.39, 1.51)
  22-24          6719 (50.3)    115.7 (113.0, 118.5)    1.44 (1.36, 1.52)      11807 (47.9)   396.4 (389.3, 403.6)    1.33 (1.28, 1.39)

Table 4-2 Claim rate for young workers, by major injury type and NORA industry sector, Oregon workers' compensation accepted disabling (n=13360) and non-
disabling claims (n=24660), 2013-2018

Sectors (columns): All | Agriculture | Construction | Healthcare | Manufacturing | Mining | Public Safety | Service | Transportation | Wholesale & Retail

Disabling Claims (per 10,000 workers (95% Confidence Interval))
WMSDs: 35 (34, 36) | 56 (49, 65) | 46 (41, 53) | 67 (63, 72) | 59 (54, 65) | 60 (19, 140) | 52 (37, 71) | 18 (17, 19) | 120 (108, 133) | 36 (33, 38)
Struck by/against: 27 (26, 28) | 90 (80, 101) | 76 (68, 84) | 12 (10, 14) | 60 (54, 65) | 30 (5, 93) | 28 (17, 42) | 19 (18, 20) | 60 (51, 69) | 24 (22, 26)
Caught in/under/between: 6.4 (5.9, 6.8) | 27 (22, 33) | 15 (12, 19) | 1.3 (0.8, 2.0) | 28 (25, 32) | 60 (19, 140) | 2.9 (0.5, 9) | 3.4 (2.9, 3.9) | 13 (9, 17) | 4.8 (4, 5.6)
Rubbed or abraded: 0.4 (0.3, 0.5) | 3 (1.5, 5.4) | 0.9 (0.3, 2.0) | / | 2.1 (1.2, 3.3) | / | / | 0.2 (0.1, 0.3) | 0.4 (0, 1.6) | 0.2 (0.1, 0.4)
Fall on same level: 14 (14, 15) | 47 (40, 55) | 19 (15, 23) | 15 (13, 17) | 20 (17, 24) | 15 (1, 66) | 25 (15, 38) | 12 (11, 13) | 31 (25, 38) | 11 (9, 12)
Fall to lower level: 6.3 (5.9, 6.8) | 27 (22, 33) | 42 (37, 49) | 2 (1.5, 3) | 7.7 (5.9, 9.9) | 15 (1, 66) | 8.7 (3.5, 18) | 4.1 (3.6, 4.7) | 16 (11, 21) | 3.2 (2.6, 3.9)
Transportation: 5.6 (5.2, 6.1) | 21 (17, 27) | 12 (9, 16) | 4.2 (3.2, 5.3) | 4.5 (3.1, 6.1) | / | 16 (8, 27) | 4.3 (3.8, 4.9) | 17 (12, 22) | 5 (4.3, 5.9)
Violence: 5.1 (4.7, 5.5) | 7.3 (4.8, 11) | 1.7 (0.8, 3.2) | 27 (24, 30) | 0.5 (0.2, 1.2) | / | 34 (22, 49) | 2.2 (1.9, 2.6) | 2.1 (0.8, 4.3) | 1.2 (0.8, 1.7)

Non-disabling Claims (per 10,000 workers (95% Confidence Interval))
WMSDs: 52 (50, 54) | 66 (57, 76) | 59 (51, 67) | 105 (99, 112) | 101 (92, 111) | 123 (49, 249) | 33 (18, 54) | 29 (27, 31) | 53 (40, 69) | 50 (46, 55)
Struck by/against: 168 (164, 171) | 257 (239, 276) | 420 (399, 442) | 71 (65, 77) | 371 (352, 389) | 430 (272, 641) | 51 (32, 76) | 136 (132, 140) | 146 (123, 172) | 155 (147, 162)
Caught in/under/between: 19 (18, 20) | 35 (28, 42) | 33 (27, 39) | 11 (8.5, 13) | 63 (56, 71) | 82 (25, 190) | 18 (8, 34) | 12 (11, 13) | 19 (12, 29) | 21 (18, 24)
Rubbed or abraded: 22 (21, 23) | 41 (34, 48) | 85 (76, 95) | 5 (3.7, 6.7) | 84 (75, 93) | 123 (49, 249) | 15 (6, 31) | 11 (10, 13) | 25 (17, 37) | 16 (14, 18)
Fall on same level: 29 (27, 30) | 52 (44, 61) | 31 (25, 37) | 36 (33, 41) | 34 (28, 39) | 41 (7, 126) | 41 (24, 64) | 25 (24, 27) | 36 (25, 49) | 22 (19, 25)
Fall to lower level: 8.4 (7.7, 9.2) | 37 (31, 45) | 37 (31, 43) | 2.9 (1.9, 4) | 11 (8.3, 15) | 62 (15, 159) | 2.5 (0.1, 11.2) | 4.2 (3.6, 5.0) | 18 (11, 28) | 5.5 (4.2, 7.1)
Transportation: 10 (9.4, 11) | 24 (18, 30) | 21 (16, 26) | 8 (6.3, 10) | 8.8 (6.2, 12) | 41 (7, 126) | 5.1 (0.8, 16) | 7.8 (6.8, 8.8) | 32 (22, 45) | 11 (8.7, 13)
Violence: 31 (29, 32) | 55 (47, 64) | 8.9 (6.2, 12.4) | 121 (114, 128) | 6.4 (4.2, 9.1) | / | 36 (20, 58) | 16 (15, 17) | 9.5 (4.6, 17) | 8.9 (7.2, 11)

NORA: National Occupational Research Agenda

Table 4-3 NORA industry sectors ranked by Prevention Index (PI) for young workers, Oregon workers' compensation accepted disabling (n=13360) and non-
disabling claims (n=24660), 2013-2018

                                                     Disabling Claims                               Non-disabling Claims
NORA Sector Description                              Claim   Rate      Cost Rate  Basic  Expanded   Claim   Rate      Cost Rate  Basic  Expanded
                                                     Count   /10,000   /100       PI     PI         Count   /10,000   /100       PI     PI
                                                             Workers   Workers                              Workers   Workers
Agriculture, Forestry & Fishing (except Wildland
  Firefighting and including seafood processing)     1017    310.2     20965.9    1      1          1959    668.9     7031.8     5      4
Construction                                         1086    234.5     14666.1    2      3          2614    730.7     6359.1     1      3
Healthcare & Social Assistance (including
  Veterinary Medicine/Animal Care)                   1988    138.6     5125.5     5      6          4640    531.7     4841.0     3      5
Manufacturing (except seafood processing)            1535    201.5     10762.4    4      5          2975    726.6     6514.1     2      1
Mining                                               15      225.6     37458.1    8      4          44      901.6     10200.9    4      2
Public Safety (including Wildland Firefighting)      126     183.4     11288.4    9      8          119     302.4     3009.5     9      9
Services (except Public Safety and Veterinary
  Medicine/Animal Care)                              4212    71.8      3031.9     7      9          8760    286.2     2278.1     7      8
Transportation, Warehousing & Utilities              806     284.5     15190.0    3      2          346     366.2     3587.2     8      7
Wholesale and Retail Trade                           2548    92.4      3789.7     6      7          3176    313.7     2598.9     6      6

NORA: National Occupational Research Agenda

[Bar chart omitted in text extraction. Y-axis: disabling claim rate per 10,000 workers (0–160); bars grouped by gender (female workers, male workers) and age group (14-18, 19-21, 22-24, > 24); bar labels: 57.0, 81.2, 83.6, 85.0, 108.9, 133.9, 148.6, 155.4.]

Figure 4-1 Disabling injury rate by age group (including adult worker group) and gender, Oregon accepted
disabling claims (2013-2018) (Female workers: n=568 (14-18), 1904 (19-21), 2454 (22-24), 36714 (> 24); Male
workers: n=905 (14-18), 3263 (19-21), 4264 (22-24), 64931 (> 24))

[Horizontal bar chart omitted in text extraction. X-axis: number of claims (0–14,000), shown separately for disabling and non-disabling claims; injury types (y-axis): struck by/against, WMSDs, fall on same level, violence, others, caught in/under/between, rubbed or abraded, transportation, fall to lower level, extreme temperature, toxic, overexertion, bodily reaction, electrical, fire and explosion, stress, noise. WMSDs: work-related musculoskeletal disorders.]

Figure 4-2 Distribution of claims by injury type for young workers, Oregon accepted disabling (n=13360) and non-
disabling claims (n=24660), 2013-2018

CHAPTER 5 General Conclusions

Surveillance of work-related injuries, illnesses and fatalities and other work-related

events has been adopted as a common practice in many countries including the US to improve

occupational health. However, challenges with data sources, surveillance methods and other

resources have plagued OSH surveillance. Evaluation has long been a weak point in OSH

surveillance. As the first study of this research showed, nearly half of the systems being

reviewed conducted only secondary data analysis. Approximately two thirds of the systems were

never or rarely evaluated and no system was evaluated frequently. No guideline or framework

has been developed for evaluating OSH surveillance systems to date. I have reported elsewhere

challenges in the evaluation of the Oregon OSH surveillance system based on the CDC updated

guidelines (manuscript in press). These challenges include selecting attributes and linking them to

appropriate components in the OSH surveillance system. A tailored guiding framework

specifying OSH surveillance relevant components and attributes with standardized descriptions

holds promise for facilitating this process. Experts and field practitioners engaged in this research

held positive expectations for a tailored OSH surveillance framework. This research contributes

to the field of OSH surveillance and work-related injury and illness prevention by addressing

gaps regarding surveillance data sources and evaluation to meet the needs of field practitioners.

Each specific aim of this research was accomplished (presented in Chapters 2, 3, and 4). The

findings from each of the three studies are summarized below, followed by general conclusions

from the research project as a whole.

Aim 1: To understand the current status of OSH surveillance in the US and to develop a

comprehensive list of elements relevant to OSH surveillance for an evaluation framework,



including surveillance components, attributes and example measures by conducting a Delphi

study among experts in the US.

Aim 1 was accomplished by conducting a Delphi study involving experts across the

US with experience in the operation and/or evaluation of OSH surveillance systems (presented

in Chapter 2). Three iterative rounds of experts’ review and rating identified a total of 211

elements (surveillance components, attributes and example measures), with 134 (62.3%)

reaching a high level of panel consensus. For each component and attribute, a definition/description

was finalized based on experts' suggestions to reflect best practices in current OSH surveillance.

Experts’ review of their respective OSH surveillance systems highlighted that evaluations were

not a common practice in these OSH surveillance systems/programs. Major system constraints

and weak points included resources and feasibility, data collection, and flexibility. Resource

constraints, including financial support, technical platforms, and expertise, were highlighted as

limitations for more advanced OSH surveillance activities. As such, feasibility was regarded as

an important consideration both in OSH surveillance evaluation and in setting practical

performance criteria.
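As a minimal illustration of the element-rating step described above, the sketch below flags panel consensus on candidate framework elements. The 70% agreement threshold, the 5-point importance scale, and the element names are illustrative assumptions, not the criteria actually used in the Delphi study.

```python
# Sketch: flagging Delphi panel consensus on candidate framework elements.
# Assumptions (not the study's actual criteria): a 5-point importance
# scale and consensus defined as >= 70% of panelists rating 4 or 5.

def reaches_consensus(ratings, threshold=0.70, min_rating=4):
    """True if the share of panelists rating the element at or above
    min_rating meets the consensus threshold."""
    agreeing = sum(1 for r in ratings if r >= min_rating)
    return agreeing / len(ratings) >= threshold

# Hypothetical ratings from a 10-member panel for two elements.
elements = {
    "timeliness": [5, 4, 4, 5, 3, 4, 4, 5, 4, 2],        # 8/10 agree
    "predictive value": [3, 2, 4, 5, 3, 3, 4, 2, 3, 3],  # 3/10 agree
}

for name, ratings in elements.items():
    print(f"{name}: consensus={reaches_consensus(ratings)}")
```

Under these assumptions, "timeliness" would be retained as a consensus element and "predictive value" would be carried forward for re-rating in a later round.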

Aim 2: To develop a guiding framework for evaluation of OSH surveillance systems in

the US based on the comprehensive list of elements identified in Aim 1 and follow-up

suggestions from the Delphi panel experts and potential users of the framework.

Aim 2 was accomplished through a translational study project following the completion of

Aim 1 (presented in Chapter 3). Identified framework elements and a proposed framework

sketch were disseminated to the Delphi panel experts and potential users of the framework, i.e.,

health professionals and practitioners in national and state-level OSH surveillance systems

through the Council of State and Territorial Epidemiologists (CSTE) Occupational Health (OH)

Subcommittee. Feedbacks were collected on the applicability of identified elements and the

development of the framework. A prevalent suggestion is that the framework must be flexible,

simple, and straightforward. The developed guiding framework incorporates the identified

elements, field experts' suggestions, and the research team's own OSH surveillance evaluation

experience.

Aim 3: To explore the use of non-disabling WC claims data from a WC insurance

company and the Prevention Index (PI) method in OSH surveillance by examining injury and

illness burden and prioritizing high-risk industries for a vulnerable population in Oregon,

young workers aged 24 years and under.

Aim 3 was accomplished and is presented in Chapter 4. Accepted WC non-disabling

claims data from the State Accident Insurance Fund Corporation (SAIF), the largest WC

insurance provider in Oregon, were analyzed together with accepted WC disabling claims data.

Frequency counts and rates by age, gender, industry, and injury type were calculated. The study

quantified a non-disabling injury rate approximately 4 times the rate of disabling injuries among

Oregon young workers (401.3 vs. 111.6 per 10,000 young workers), highlighting the importance

of using both disabling and non-disabling WC claims data for a more complete report of injuries

and illnesses among young workers. The use of the PI method identified high-ranked industry

sectors for Oregon young workers. For both genders combined, high-risk industries included

agriculture, construction, and manufacturing. For young female workers, the highest-ranked sector

was healthcare.
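A common formulation of the Prevention Index averages an industry's rank by claim count and its rank by claim rate, so that industries with both a high burden and a high rate rise to the top; a minimal sketch of that calculation follows. The industry names and figures are invented for illustration and are not the Oregon SAIF data analyzed in Chapter 4.

```python
# Sketch of a Prevention Index (PI) calculation: each industry's PI is
# the average of its rank by claim count and its rank by claim rate
# (rank 1 = highest burden), so a LOWER PI means a HIGHER prevention
# priority. Figures below are invented, not the Oregon SAIF data.

def prevention_index(industries):
    """industries: {name: (claim_count, claim_rate)} -> {name: PI}.
    Note: ties get sequential (not averaged) ranks in this sketch."""
    by_count = sorted(industries, key=lambda k: industries[k][0], reverse=True)
    by_rate = sorted(industries, key=lambda k: industries[k][1], reverse=True)
    count_rank = {name: i + 1 for i, name in enumerate(by_count)}
    rate_rank = {name: i + 1 for i, name in enumerate(by_rate)}
    return {name: (count_rank[name] + rate_rank[name]) / 2 for name in industries}

data = {
    "construction": (900, 520.0),  # many claims, high rate
    "agriculture": (400, 610.0),   # fewer claims, highest rate
    "retail": (850, 180.0),        # many claims, low rate
}
pi = prevention_index(data)
for name, score in sorted(pi.items(), key=lambda kv: kv[1]):
    print(f"{name}: PI={score}")  # construction ranks first (PI=1.5)
```

Averaging the two ranks is the design point: ranking by counts alone favors large industries, while ranking by rates alone favors small, hazardous ones; the PI balances the two so prevention resources target industries scoring high on both.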

Being applied in nature, the research project aimed to bridge the gap between research and

practice. With a translational study design featuring a multidirectional process of knowledge

dissemination and integration, field needs in OSH surveillance were incorporated to guide

this research. Standardized descriptions of components and attributes relevant to OSH

surveillance as well as practical guidance and tools on conducting OSH surveillance evaluation

were incorporated into the developed guiding framework, which is expected to provide a

standard yet flexible guide for evaluation of OSH surveillance systems.

Exploring the use of new and existing surveillance data sources and methods has

continued to be an important topic in OSH surveillance. The exploration of non-disabling WC

claims data and the Prevention Index shows promise for expanding current state-based surveillance to

guide evidence-based interventions for young workers as well as other worker populations. The

study also demonstrated limitations to be addressed in non-disabling WC claims data from

commercial insurers, relating to representativeness, data quality, and data interoperability,

including inconsistent data coding systems and possible coding errors. In-depth evaluation

of non-disabling WC claims data quality would help to document and quantify the magnitude and

impact of these limitations.

In the long run, further dissemination of the developed framework to potential users will help

to promote more evaluations and, in turn, to improve OSH surveillance systems. Field application

will help to refine and enhance the framework. Continuing efforts to obtain and evaluate non-

disabling WC claims data as well as other potential data sources would increase the

completeness and accuracy in documenting work-related injuries and illnesses.


126

Bibliography

1. Takala J, Hämäläinen P, Saarela KL, et al. Global Estimates of the Burden of Injury and
Illness at Work in 2012. J Occup Environ Hyg. 2014;11(5):326-337.
doi:10.1080/15459624.2013.863131

2. Bureau of Labor Statistics. Injuries, Illnesses, and Fatalities (IIF). Published 2020.
Accessed March 19, 2020. https://www.bls.gov/iif/home.htm

3. National Safety Council. Work Injury Costs. Injury Facts. Accessed May 14, 2020.
https://injuryfacts.nsc.org/work/costs/work-injury-costs/

4. United Nations. Sustainable Development Goals. Accessed May 1, 2020.


https://sustainabledevelopment.un.org/?menu=1300

5. Centers for Disease Control and Prevention. Injury Prevention and Control. the Centers for
Disease Control and Prevention. Published 2014. Accessed April 1, 2018.
https://www.cdc.gov/injury/about/approach.html

6. Centers for Disease Control and Prevention. How We Prevent Chronic Diseases and
Promote Health. Published July 31, 2019. Accessed March 31, 2020.
https://www.cdc.gov/chronicdisease/center/nccdphp/how.htm

7. National Institute for Occupational Safety and Health. Tracking Occupational Injuries,
Illnesses, and Hazards: The NIOSH Surveillance Strategic Plan. U.S. Department of
Health and Human Services, Public Health Service, Centers for Disease Control and
Prevention, National Institute for Occupational Safety and Health; 2001.

8. Centers for Disease Control and Prevention. Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb
Mortal Wkly Rep. 2001;50(No. 13):1-36.

9. National Academies of Sciences, Engineering, and Medicine. A Smarter National


Surveillance System for Occupational Safety and Health in the 21st Century. the National
Academies Press; 2018. Accessed January 20, 2018.
http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=24835

10. BS L, Wegman D, Baron S, Sokas R. Occupational and Environmental Health. Vol 1.


Oxford University Press; 2017. doi:10.1093/oso/9780190662677.001.0001

11. Koh D, Aw T-C. Surveillance in Occupational Health. Occup Environ Med.


2003;60(9):705-710. doi:10.1136/oem.60.9.705

12. Council NR. Counting Injuries and Illnesses in the Workplace: Proposals for a Better
System. The National Academies Press; 1987. doi:10.17226/18911
127

13. Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in
Occupational Health and Environmental Public Health Surveillance. Annu Rev Public
Health. 2011;32(1):109-132. doi:10.1146/annurev-publhealth-082310-152811

14. National Institute for Occupational Safety and Health. Worker Health Surveillance - Our
Current Surveillance Initiatives. Published 2019. Accessed May 15, 2020.
https://www.cdc.gov/niosh/topics/surveillance/data.html

15. National Institute for Occupational Safety and Health. State Surveillance Program.
Published February 6, 2020. Accessed March 18, 2020.
https://www.cdc.gov/niosh/oep/statesurv.html

16. Occupational Safety and Health Administration (OSHA). OSHA - Information System
(OIS). Accessed April 26, 2020. https://www.dol.gov/agencies/oasam/centers-
offices/ocio/privacy/osha/ois

17. CSTE. 2013 National Assessment of Epidemiology Capacity: Findings &


Recommendations. Council of State and Territorial Epidemiologiests (CSTE); 2014.

18. Arrazola J. Assessment of Epidemiology Capacity in State Health Departments — United


States, 2017. Morb Mortal Wkly Rep. 2018;67. doi:10.15585/mmwr.mm6733a5

19. Stout N, Bell C. Effectiveness of source documents for identifying fatal occupational
injuries: a synthesis of studies. Am J Public Health. 1991;81(6):725-728.
doi:10.2105/AJPH.81.6.725

20. Leigh JP, Du J, McCurdy SA. An estimate of the U.S. government’s undercount of
nonfatal occupational injuries and illnesses in agriculture. Ann Epidemiol. 2014;24(4):254-
259. doi:10.1016/j.annepidem.2014.01.006

21. PRANSKY G, SNYDER T, DEMBE A, HIMMELSTEIN J. Under-reporting of work-


related disorders in the workplace: a case study and review of the literature. Ergonomics.
1999;42(1):171-182. doi:10.1080/001401399185874

22. Leigh JP, Marcin JP, Miller TR. An Estimate of the U.S. Government’s Undercount of
Nonfatal Occupational Injuries. J Occup Environ Med. 2004;46(1):10–18.
doi:10.1097/01.jom.0000105909.66435.53

23. Rosenman KD, Kalush A, Reilly MJ, Gardiner JC, Reeves M, Luo Z. How Much Work-
Related Injury and Illness is Missed By the Current National Surveillance System? J
Occup Environ Med. 2006;48(4):357–365. doi:10.1097/01.jom.0000205864.81970.63

24. Ruser JW. Allegations of Undercounting in the BLS Survey of Occupational Injuries and
Illnesses. In: JSM Proceedings, Statistical Computing Section. American Statistical
Association; 2010:15.

25. Azaroff LS, Levenstein C, Wegman DH. Occupational Injury and Illness Surveillance:
Conceptual Filters Explain Underreporting. Am J Public Health. 2002;92(9):1421-1429.
128

26. Kica J, Rosenman KD. Multi-source surveillance for work-related crushing injuries. Am J
Ind Med. 2018;61(2):148-156. doi:10.1002/ajim.22800

27. Largo TW, Rosenman KD. Surveillance of work-related amputations in Michigan using
multiple data sources: results for 2006-2012. Occup Environ Med. 2015;72(3):171-176.
doi:10.1136/oemed-2014-102335

28. Davis LK, Grattan KM, Tak S, Bullock LF, Ozonoff A, Boden LI. Use of multiple data
sources for surveillance of work-related amputations in Massachusetts, comparison with
official estimates and implications for national surveillance. Am J Ind Med.
2014;57(10):1120–1132.

29. Borjan M, Lumia M. Evaluation of a state based syndromic surveillance system for the
classification and capture of non-fatal occupational injuries and illnesses in New Jersey.
Am J Ind Med. 2017;60(7):621-626. doi:10.1002/ajim.22734

30. Sears JM, Bowman SM, Hogg-Johnson S, Shorter ZA. Occupational Injury Trends
Derived From Trauma Registry and Hospital Discharge Records Lessons for Surveillance
and Research. J Occup Environ Med. 2014;56(10):1067-1073.
doi:10.1097/JOM.0000000000000225

31. Pefoyo AJK, Genesove L, Moore K, Del Bianco A, Kramer D. Exploring the Usefulness
of Occupational Exposure Registries for Surveillance The Case of the Ontario Asbestos
Workers Registry (1986-2012). J Occup Environ Med. 2014;56(10):1100-1110.
doi:10.1097/JOM.0000000000000235

32. Lofgren DJ, Reeb-Whitaker CK, Adams D. Surveillance of Washington OSHA Exposure
Data to Identify Uncharacterized or Emerging Occupational Health Hazards. J Occup
Environ Hyg. 2010;7(7):375-388. doi:10.1080/15459621003781207

33. Lavoue J, Friesen MC, Burstyn I. Workplace Measurements by the US Occupational


Safety and Health Administration since 1979: Descriptive Analysis and Potential Uses for
Exposure Assessment. Ann Occup Hyg. 2013;57(1):77-97. doi:10.1093/annhyg/mes055

34. Sarazin P, Burstyn I, Kincl L, Lavoué J. Trends in OSHA Compliance Monitoring Data
1979–2011: Statistical Modeling of Ancillary Information across 77 Chemicals. Ann
Occup Hyg. 2016;60(4):432-452. doi:10.1093/annhyg/mev092

35. Graber JM, Smith AE. Results from a state-based surveillance system for carbon
monoxide poisoning. PUBLIC Health Rep. 2007;122(2):145-154.
doi:10.1177/003335490712200203

36. Harduar Morano L, Watkins S. Evaluation of Diagnostic Codes in Morbidity and


Mortality Data Sources for Heat-Related Illness Surveillance. Public Health Rep Wash DC
1974. 2017;132(3):326-335. doi:10.1177/0033354917699826
129

37. Harber P, Ha J, Roach M. Arizona Hospital Discharge and Emergency Department


Database Implications for Occupational Health Surveillance. J Occup Environ Med.
2017;59(4):417-423. doi:10.1097/JOM.0000000000000971

38. Breslin C, Koehoorn M, Smith P, Manno M. Age related differences in work injuries and
permanent impairment: a comparison of workers’ compensation claims among
adolescents, young adults, and adults. Occup Environ Med. 2003;60(9):e10–e10.

39. Radi S, Benke G, Schaafsma F, Sim M. Compensation claims for occupational noise
induced hearing loss between 1998 and 2008: yearly incidence rates and trends in older
workers. Aust N Z J Public Health. 2016;40(2):181–185. doi:10.1111/1753-6405.12460

40. Morassaei S, Breslin FC, Shen M, Smith PM. Examining job tenure and lost-time claim
rates in Ontario, Canada, over a 10-year period, 1999–2008. Occup Env Med.
2014;70(3):171-178. doi:10.1136/oemed-2012-100743

41. Tarawneh I, Lampl M, Robins D, et al. Workers’ Compensation Claims for


Musculoskeletal Disorders Among Wholesale and Retail Trade Industry Workers — Ohio,
2005–2009. MMWR Morb Mortal Wkly Rep. 2013;62(22):437-442.

42. Council of State and Territorial Epidemiologists. Occupational Health Indicators: A


Guide for Tracking Occupational Health Conditions and Their Determinants (Update
2019). Council of State and Territorial Epidemiologists (CSTE) in collaboration with the
NIOSH; 2020.
https://cdn.ymaws.com/www.cste.org/resource/resmgr/publications/OHI_Guidance_Manu
al_FINAL-2.pdf

43. Mccall P, Horwitz B. Workplace Violence in Oregon: An Analysis Using


Workers’Compensation Claims from 1990–1997. J Occup Environ Med. 2004;46(4):357–
366. doi:10.1097/01.jom.0000121131.34757.ed

44. Mallon M, Cherry E. Investigating the Relationship Between Worker Demographics and
Nature of Injury on Federal Department of Defense Workersʼ Compensation Injury Rates
and Costs From 2000 to 2008. J Occup Environ Med. 2015;57 Suppl 3S:S27–S30.
doi:10.1097/JOM.0000000000000416

45. Ramaswamy SK, Mosher GA. Using workers’ compensation claims data to characterize
occupational injuries in the biofuels industry. Saf Sci. 2018;103:352-360.
doi:10.1016/j.ssci.2017.12.014

46. Horwitz IB, McCall BP. Occupational injury among Rhode Island adolescents: an analysis
of workers’ compensation claims, 1998 to 2002. J Occup Environ Med Am Coll Occup
Environ Med. 2005;47(5):473-481.

47. Walters JK, Christensen KA, Green MK, Karam LE, Kincl LD. Occupational injuries to
Oregon workers 24 years and younger: An analysis of workers’ compensation claims,
2000-2007. Am J Ind Med. 2010;53(10):984-994. doi:10.1002/ajim.20819
130

48. Utterback DF, Meyers AR, Wurzelbacher SJ. Workers’ Compensation Insurance: A
Primer for Public Health. Nitional Institute for Occupational Safety and Health; 2014.
Accessed February 19, 2019. http://elcosh.org/document/3752/d001287/workers%3F-
compensation-insurance%3A-a-primer-for-public-health.html

49. Thacker SB, Qualters JR, Lee LM. Public Health Surveillance in the United States:
Evolution and Challenges. Morb Mortal Wkly Rep. 2012;61(3):3-9.

50. Hall HI, Correa A, Yoon PW, Braden CR. Lexicon, Definitions, and Conceptual
Framework for Public Health Surveillance. 2012;61:5.

51. Choi BCK. The Past, Present, and Future of Public Health Surveillance. Scientifica.
2012;2012:1-26. doi:http://dx.doi.org/10.6064/2012/875253

52. Allaki FE, Bigras-Poulin M, Michel P, Ravel A. A Population Health Surveillance Theory.
Epidemiol Health. 2012;34:e2012007. doi:10.4178/epih/e2012007

53. Centers for Disease Control and Prevention. Guidelines for evaluating surveillance
systems. Morb Mortal Wkly Rep MMWR. 1988;37(No. S-5).

54. Langmuir AD. The surveillance of communicable diseases of national importance. N Engl
J Med. 1963;268(4):182-192.

55. McNabb SJ, Chungong S, Ryan M, et al. Conceptual framework of public health
surveillance and action and its application in health sector reform. BMC Public Health.
2002;2:2. doi:10.1186/1471-2458-2-2

56. World Health Organization. Communicable Disease Surveillance and Response Systems:
Guide to Monitoring and Evaluating. World Health Organization; 2006.
http://apps.who.int/iris/bitstream/10665/69331/1/WHO_CDS_EPR_LYO_2006_2_eng.pd
f

57. World Health Organization, Department of Communicable Disease Surveillance and


Responses. Protocol for the Assessment of National Communicable Disease Surveillance
and Response Systems. Guidelines for Assessment Teams. World Health Organization;
2001. Accessed January 16, 2019.
https://www.who.int/csr/resources/publications/surveillance/WHO_CDS_CSR_ISR_2001
_2_EN/en/

58. Xiong W, Lv J, Li L. A survey of core and support activities of communicable disease


surveillance systems at operating-level CDCs in China. Bmc Public Health. 2010;10:704.
doi:10.1186/1471-2458-10-704

59. Muellner P, Watts J, Bingham P, et al. SurF: an innovative framework in biosecurity and
animal health surveillance evaluation. Transbound Emerg Dis. 2018;0(0):1-8.
doi:10.1111/tbed.12898
131

60. Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the
integrated evaluation of animal health surveillance systems. Prev Vet Med.
2019;173:104777. doi:10.1016/j.prevetmed.2019.104777

61. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11(744):1-15. doi:10.1186/1471-2458-11-744

62. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide.
Atlanta, GA: Centers for Disease Control and Prevention; 2011.
http://www.cdc.gov/eval/guide/cdcevalmanual.pdf

63. Centers for Disease Control and Prevention, Program Performance and Evaluation Office.
Framework for program evaluation in public health. Morb Mortal Wkly Rep.
1999;48(No.RR-11):1-40.

64. Calba C, Goutard FL, Hoinville L, et al. Surveillance systems evaluation: a systematic
review of the existing approaches. BMC Public Health. 2015;15:448. doi:10.1186/s12889-
015-1791-5

65. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Stärk KDC. Evaluation of animal and public
health surveillance systems: a systematic review. Epidemiol Infect. 2012;140(4):575-590.
doi:10.1017/S0950268811002160

66. Allaki FE, Bigras-Poulin M, Ravel A. Conceptual evaluation of population health


surveillance programs: method and example. Prev Vet Med. 2013;108(4):241–252.

67. Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries
for occupational diseases: international development and validation of an audit tool
(ODIT). BMC Health Serv Res. 2009;9:194. doi:10.1186/1472-6963-9-194

68. Holder Y, Peden M, Krug E, et al (Eds). Injury Surveillance Guidelines.; 2001.

69. Adamson PC, Tafuma TA, Davis SM, Xaba S, Herman-Roloff A. A systems-based
assessment of the PrePex device adverse events active surveillance system in Zimbabwe.
PLOS ONE. 2017;12(12):e0190055. doi:10.1371/journal.pone.0190055

70. Jefferson H, Dupuy B, Chaudet H, et al. Evaluation of a syndromic surveillance for the
early detection of outbreaks among military personnel in a tropical country. J Public
Health Oxf Engl. 2008;30(4):375-383. doi:10.1093/pubmed/fdn026

71. Jhung MA, Budnitz DS, Mendelsohn AB, Weidenbach KN, Nelson TD, Pollock DA.
Evaluation and overview of the National Electronic Injury Surveillance System-
Cooperative Adverse Drug Event Surveillance Project (NEISS-CADES). Med Care.
2007;45(10 Supl 2):S96-102. doi:10.1097/MLR.0b013e318041f737
132

72. Joseph A, Patrick N, Lawrence N, Lilian O, Olufemi A. Evaluation of Malaria


Surveillance System in Ebonyi state, Nigeria, 2014. Ann Med Health Sci Res. Published
online 2017. Accessed December 30, 2017. https://www.amhsr.org/abstract/evaluation-of-
malaria-surveillance-system-in-ebonyi-state-nigeria-2014-3901.html

73. Kaburi BB, Kubio C, Kenu E, et al. Evaluation of the enhanced meningitis surveillance
system, Yendi municipality, northern Ghana, 2010–2015. BMC Infect Dis. 2017;17:306.
doi:10.1186/s12879-017-2410-0

74. Liu X, Li L, Cui H, Jackson VW. Evaluation of an emergency department-based injury


surveillance project in China using WHO guidelines. Inj Prev. 2009;15(2):105–110.

75. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal


agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

76. Pinell-McNamara VA, Acosta AM, Pedreira MC, et al. Expanding Pertussis Epidemiology
in 6 Latin America Countries through the Latin American Pertussis Project. Emerg Infect
Dis. 2017;23(Suppl 1):S94-S100. doi:10.3201/eid2313.170457

77. Thomas MJ, Yoon PW, Collins JM, Davidson AJ, Mac Kenzie WR. Evaluation of
Syndromic Surveillance Systems in 6 US State and Local Health Departments: J Public
Health Manag Pract. Published online September 2017:1.
doi:10.1097/PHH.0000000000000679

78. Velasco-Mondragón HE, Martin J, Chacón-Sosa F. Technology evaluation of a USA-


Mexico health information system for epidemiological surveillance of Mexican migrant
workers. Rev Panam Salud Pública. 2000;7:185-192. doi:10.1590/S1020-
49892000000300008

79. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems.
The Public Policy Center, University of Nebraska; 2016.

80. Centers for Disease Control and Prevention. Evaluating an NCD-Related Surveillance
System. Centers for Disease Control and Prevention; 2013.

81. Centers for Disease Control and Prevention. Framework for evaluating public health
surveillance systems for early detection of outbreaks: recommendations from the CDC
Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.

82. Salman null, Stärk KDC, Zepeda C. Quality assurance applied to animal disease
surveillance systems. Rev Sci Tech Int Off Epizoot. 2003;22(2):689-696.

83. Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating
military surveillance systems for early detection of outbreaks on duty areas. BMC Public
Health. 2008;8(1):146.
133

84. Hendrikx P, Gay E, Chazel M, et al. OASIS: an assessment tool of epidemiological


surveillance systems in animal health and food safety. Epidemiol Infect.
2011;139(10):1486-1496.

85. Health Canada. Framework and Tools for Evaluating Health Surveillance Systems.
Population and Public Health Branch, Health Canada; 2004. Accessed October 18, 2017.
http://publications.gc.ca/site/eng/260337/publication.html

86. Hagemeyer A, Azofeifa A, Stroup DF, Tomedi LE. Evaluating Surveillance for Excessive
Alcohol Use in New Mexico. Prev Chronic Dis. 2018;15. doi:10.5888/pcd15.180358

87. Austin C. An evaluation of the census of fatal occupational injuries as a system for
surveillance. Compens Work Cond. 1995;1:51–54.

88. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal


agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

89. Garland R. Evaluation of the Oregon Asthma Surveillance System. Oregon Department of
Human Services, Public Health Division, Office of Disease Prevention and Epidemiology,
Health Promotion and Chronic Disease Prevention, Oregon Asthma Program; 2009.

90. Samoff E, MacDonald PDM, Fangman MT, Waller AE. Local Surveillance Practice
Evaluation in North Carolina and Value of New National Accreditation Measures. J
Public Health Manag Pract. 2013;19(2):146–152. doi:10.1097/PHH.0b013e318252ee21

91. Erondu N. Evaluating communicable disease surveillance in resource-poor settings: A new


approach applied to meningitis surveillance in Chad. Published online 2016.

92. Dufour B. Technical and economic evaluation method for use in improving infectious
animal disease surveillance networks. Vet Res. 1999;30(1):27–37.

93. World Health Organization. Assessing the National Health Information System: An
Assessment Tool (v4.0). World Health Organization; 2008.
http://apps.who.int/iris/bitstream/10665/43932/1/9789241547512_eng.pdf?ua=1

94. Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Staerk KDC. Evaluation of animal and
public health surveillance systems: a systematic review. Epidemiol Infect.
2012;140(4):575-590. doi:10.1017/S0950268811002160

95. Zoellner JM, Porter KJ. Translational Research: Concepts and Methods in Dissemination
and Implementation Research. In: Coulston AM, Boushey CJ, Ferruzzi MG, Delahanty
LM, eds. Nutrition in the Prevention and Treatment of Disease (Fourth Edition).
Academic Press; 2017:125-143. doi:10.1016/B978-0-12-802928-2.00006-0

96. National Institute for Occupational Safety and Health. Research to Practice (r2p).
Published October 19, 2018. Accessed May 15, 2020.
https://www.cdc.gov/niosh/r2p/default.html
134

97. Landrigan PJ. Improving the surveillance of occupational disease. Am J Public Health.
1989;79(12):1601–1602.

98. Centers for Disease Control and Prevention. Overview of Evaluating Surveillance
Systems.; 2013.

99. Hoinville LJ, Alban L, Drewe JA, et al. Proposed terms and concepts for describing and
evaluating animal-health surveillance systems. Prev Vet Med. 2013;112(1-2):1-12.
doi:10.1016/j.prevetmed.2013.06.006

100. Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury
surveillance guidelines for evaluation: learning from the aboriginal community-centered
injury surveillance system (ACCISS) and two institution-based systems. BMC Public
Health. 2011;11:744. doi:10.1186/1471-2458-11-744

101. Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal


agricultural injury surveillance in the United States: A review of national-level survey-
based systems. Am J Ind Med. 2017;60(7):599-620. doi:10.1002/ajim.22720

102. Birrell L. Developing evidence-based guidelines in occupational health. Occup Med.


2001;51(2):73-74. doi:10.1093/occmed/51.2.073

103. Foth T, Efstathiou N, Vanderspank-Wright B, et al. The use of Delphi and Nominal Group
Technique in nursing education: A review. Int J Nurs Stud. 2016;60:112-120.
doi:10.1016/j.ijnurstu.2016.04.015

104. Murphy MK, Black NA, Lamping DL, et al. consensus development methods, and their
use in clinical guideline development. Health Technol Assess. 1998;2(3):1-88.

105. Adler M, Ziglio E. Gazing Into the Oracle: The Delphi Method and Its Application to
Social Policy and Public Health. Jessica Kingsley Publishers; 1996.

106. Linstone HA, Turoff M. The Delphi Method: Techniques and Applications. First Edition.
Addison-Wesley Educational Publishers Inc; 1975.

107. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376-382.
doi:10.1046/j.1365-2648.2003.02537.x

108. Hsu C-C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess
Res Eval. 2007;12(10):1–8.

109. Rowe G, Wright G. The Delphi technique as a forecasting tool: issues and analysis. Int J
Forecast. 1999;15(4):353-375. doi:10.1016/S0169-2070(99)00018-7

110. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and Reporting the Delphi
Method for Selecting Healthcare Quality Indicators: A Systematic Review. Wright JM, ed.
PLoS ONE. 2011;6(6):e20476. doi:10.1371/journal.pone.0020476
135

111. Habibi A, Sarafrazi A, Izadyar S. Delphi Technique Theoretical Framework in Qualitative


Research. Int J Eng Sci. 2014;3(4):8-13.

112. Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast
Soc Change. 2011;78(9):1695-1704. doi:10.1016/j.techfore.2011.04.005

113. Hsu C-C, Sandford BA. Minimizing Non-Response in The Delphi Process: How to
Respond to Non-Response. Pract Assess Res Eval. 2007;12(17):1-6.

114. Sinha IP, Smyth RL, Williamson PR. Using the Delphi Technique to Determine Which
Outcomes to Measure in Clinical Trials: Recommendations for the Future Based on a
Systematic Review of Existing Studies. PLoS Med. 2011;8(1):e1000393.
doi:10.1371/journal.pmed.1000393

115. El Allaki F, Bigras-Poulin M, Ravel A. Conceptual evaluation of population health


surveillance programs: Method and example. Prev Vet Med. 2013;108(4):241-252.
doi:10.1016/j.prevetmed.2012.11.001

116. Greatorex J, Dexter T. An accessible analytical approach for investigating what happens
between the rounds of a Delphi study. J Adv Nurs. 2000;32(4):1016-1024.

117. Makkonen M, Hujala T, Uusivuori J. Policy experts’ propensity to change their opinion
along Delphi rounds. Technol Forecast Soc Change. 2016;109:61-68.
doi:10.1016/j.techfore.2016.05.020

118. Bolger F, Wright G. Improving the Delphi process: Lessons from social psychological
research. Technol Forecast Soc Change. 2011;78(9):1500-1513.
doi:10.1016/j.techfore.2011.07.007

119. Meijering JV, Tobi H. The effect of controlled opinion feedback on Delphi features:
Mixed messages from a real-world Delphi experiment. Technol Forecast Soc Change.
2016;103:166-173. doi:10.1016/j.techfore.2015.11.008

120. Calba C, RVC JD, Goutard F, et al. The Evaluation Attributes Used for Evaluating Animal
Health Surveillance Systems. RISKSUR Project; 2013.

121. Winkler J, Moser R. Biases in future-oriented Delphi studies: A cognitive perspective.


Technol Forecast Soc Change. 2016;105:63-76. doi:10.1016/j.techfore.2016.01.021

122. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016. Accessed May 1, 2020.
http://ebookcentral.proquest.com/lib/osu/detail.action?docID=4722974

123. McNabb SJN, Surdo AM, Redmond A, et al. Applying a new conceptual framework to
evaluate tuberculosis surveillance and action performance and measure the costs,
Hillsborough County, Florida, 2002. Ann Epidemiol. 2004;14(9):640-645.
doi:10.1016/j.annepidem.2003.09.021
136

124. National Institute for Occupational Safety and Health. NIOSH Workplace Safety and
Health Topic: Young Worker Safety and Health. Published 2019. Accessed February 9,
2020. https://www.cdc.gov/niosh/topics/youth/default.html

125. National research Council. Protecting Youth at Work: Health, Safety, and Development of
Working Children and Adolescents in the United States. The National Academies Press;
1998. doi:10.17226/6019

126. Pratt B, Cheesman J, Breslin C, Do MT. Occupational injuries in Canadian youth: an


analysis of 22 years of surveillance data collected from the Canadian Hospitals Injury
Reporting and Prevention Program. Health Promot Chronic Dis Prev Can Res Policy
Pract. 2016;36(5):89-98.

127. Breslin FC, Day D, Tompa E, Bhattacharyya S, Clarke J, Wang A. Systematic Review of
Risk Factors for Work Injury among Youth. Institute for Work & Health; 2005:101.

128. Windau J, Samuel M. Occupational injuries among young workers. Montyly Labor Rev.
Published online 2005:11-23.

129. Bena A, Leombruni R, Giraudo M, Costa G. A new Italian surveillance system for
occupational injuries: characteristics and initial results. Am J Ind Med. 2012;55(7):584-
592. doi:10.1002/ajim.22025

130. Breslin FC, Smith P. Age-related differences in work injuries: a multivariate, population-
based study. Am J Ind Med. 2005;48(1):50-56. doi:10.1002/ajim.20185

131. Centers for Disease Control and Prevention. Nonfatal Occupational Injuries and Illnesses
Treated in Hospital Emergency Departments--United States, 2003. MMWR Morb Mortal
Wkly Rep. 2006;55:449-452.

132. Hanvold TN, Kines P, Nykänen M, et al. Occupational Safety and Health Among Young
Workers in the Nordic Countries: A Systematic Literature Review. Saf Health Work.
Published online December 2018. doi:10.1016/j.shaw.2018.12.003

133. Salminen S. Have young workers more injuries than older ones? An international literature
review. J Safety Res. 2004;35(5):513-521. doi:10.1016/j.jsr.2004.08.005

134. Hanvold TN. Young Workers and Sustainable Work Life: Special Emphasis on Nordic
Conditions. Nordic Council of Ministers; 2016.

135. Paterson JL, Clarkson L, Rainbird S, Etherton H, Blewett V. Occupational fatigue and
other health and safety issues for young Australian workers: an exploratory mixed
methods study. Ind Health. 2015;53(3):293-299. doi:10.2486/indhealth.2014-0257

136. Layne L, Castillo D, Stout N, Cutlip P. Adolescent Occupational Injuries Requiring
Hospital Emergency Department Treatment - a Nationally Representative Sample. Am J
Public Health. 1994;84(4):657-660. doi:10.2105/AJPH.84.4.657

137. Miller ME, Kaufman JD. Occupational injuries among adolescents in Washington State,
1988-1991. Am J Ind Med. 1998;34(2):121-132. doi:10.1002/(SICI)1097-
0274(199808)34:2<121::AID-AJIM4>3.3.CO;2-#

138. Breslin FC, Polzer J, MacEachen E, Morrongiello B, Shannon H. Workplace injury or
“part of the job”? Towards a gendered understanding of injuries and complaints among
young workers. Soc Sci Med. 2007;64(4):782-793. doi:10.1016/j.socscimed.2006.10.024

139. Dragano N, Barbaranelli C, Reuter M, et al. Young Workers’ Access to and Awareness of
Occupational Safety and Health Services: Age-Differences and Possible Drivers in a
Large Survey of Employees in Italy. Int J Environ Res Public Health. 2018;15(7):1511.
doi:10.3390/ijerph15071511

140. Runyan CW, Schulman M, Dal Santo J, Bowling JM, Agans R, Ta M. Work-related
hazards and workplace safety of US adolescents employed in the retail and service sectors.
Pediatrics. 2007;119(3):526-534. doi:10.1542/peds.2006-2009

141. Nielsen ML, Dyreborg J, Kines P, Nielsen KJ, Rasmussen K. Exploring and Expanding
the Category of ‘Young Workers’ According to Situated Ways of Doing Risk and
Safety—a Case Study in the Retail Industry. Nord J Work Life Stud. 2013;3(3):219-234.
doi:10.19154/njwls.v3i3.3019

142. Mujuru P, Mutambudzi M. Injuries and Seasonal Risks among Young Workers in West
Virginia - A 10-Year Retrospective Descriptive Analysis. AAOHN J. 2007;55(9):381-387.
doi:10.1177/216507990705500906

143. Laberge M, Ledoux E. Occupational health and safety issues affecting young workers: A
literature review. Work- J Prev Assess Rehabil. 2011;39(3):215-232. doi:10.3233/WOR-
2011-1170

144. Parker DL, Carl WR, French LR, Martin FB. Nature and incidence of self-reported
adolescent work injury in Minnesota. Am J Ind Med. 1994;26(4):529-541.
doi:10.1002/ajim.4700260410

145. Brooks DR, Davis LK, Gallagher SS. Work-related injuries among Massachusetts
children: A study based on emergency department data. Am J Ind Med. 1993;24(3):313-
324. doi:10.1002/ajim.4700240308

146. Graves JM, Sears JM, Vavilala MS, Rivara FP. The burden of traumatic brain injury
among adolescent and young adult workers in Washington State. J Safety Res.
2013;45:133-139. doi:10.1016/j.jsr.2012.11.001

147. US Bureau of Labor Statistics. Survey of Occupational Injuries and Illnesses Data.
Published 2019. Accessed March 10, 2020. https://www.bls.gov/iif/soii-overview.htm

148. Brooks DR, Davis LK. Work-related injuries to Massachusetts teens, 1987–1990. Am J
Ind Med. 1996;29(2):153-160. doi:10.1002/(SICI)1097-0274(199602)29:2<153::AID-
AJIM5>3.0.CO;2-T

149. McCall BP, Horwitz IB, Carr BS. Adolescent Occupational Injuries and Workplace Risks:
An Analysis of Oregon Workers’ Compensation Data 1990–1997. J Adolesc Health.
2007;41(3):248-255. doi:10.1016/j.jadohealth.2007.02.004

150. Tucker S, Diekrager D, Turner N, Kelloway EK. Work-related injury underreporting
among young workers: Prevalence, gender differences, and explanations for
underreporting. J Safety Res. 2014;50:67-73. doi:10.1016/j.jsr.2014.04.001

151. Turner N, Tucker S, Kelloway EK. Prevalence and demographic differences in
microaccidents and safety behaviors among young workers in Canada. J Safety Res.
2015;53:39-43. doi:10.1016/j.jsr.2015.03.004

152. SAIF. SAIF company information. Published 2020. Accessed March 7, 2020.
https://www.saif.com/x640.xml

153. Anderson NJ, Bonauto DK, Adams D. Prioritizing Industries for Occupational Injury and
Illness Prevention and Research, Washington State Workers’ Compensation Claims Data,
2002–2010. Safety and Health Assessment and Research for Prevention (SHARP)
Program, Washington State Department of Labor & Industries; 2013:1-58. Accessed
February 22, 2017. http://www.lni.wa.gov/Safety/Research/Files/bd_3F.pdf

154. Bonauto D, Silverstein B, Adams D, Foley M. Prioritizing Industries for Occupational
Injury and Illness Prevention and Research, Washington State Workers’ Compensation
Claims, 1999-2003. J Occup Environ Med. 2006;48(8):840-851.
doi:10.1097/01.jom.0000225062.88285.b3

155. Oregon State Legislature. Workers’ Compensation (2019 Edition); 2019. Accessed March
10, 2020. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html

156. Oregon State Legislature. Workers’ Compensation (2017 Edition); 2017. Accessed
November 20, 2017. https://www.oregonlegislature.gov/bills_laws/ors/ors656.html

157. Department of Consumer and Business Services. State of Oregon Workers’ compensation
insurance and self insurance. Accessed April 4, 2020.
https://www.oregon.gov/dcbs/reports/compensation/insurance/Pages/index.aspx

158. US Census Bureau Center for Economic Studies. Longitudinal Employer-Household
Dynamics. Published 2020. Accessed March 10, 2020. https://lehd.ces.census.gov/data/

159. Syron LN, Kincl L, Yang L, Cain DT, Smit E. Analysis of workers’ compensation
disabling claims in Oregon’s seafood preparation and packaging industry, 2007-2013. Am
J Ind Med. 2017;60(5):484-493. doi:10.1002/ajim.22706

160. US Census Bureau. North American Industry Classification System. Published 2020.
Accessed March 10, 2020. https://www.census.gov/cgi-bin/sssd/naics/naicsrch?chart=2012

161. Centers for Disease Control and Prevention. National Occupational Research Agenda
(NORA): Sectors. Published February 4, 2019. Accessed March 10, 2020.
https://www.cdc.gov/nora/sectorapproach.html

162. Centers for Disease Control and Prevention. Occupational Injury and Illness Classification
System (OIICS). Published 2019. Accessed March 10, 2020.
https://wwwn.cdc.gov/Wisards/oiics/default.aspx

163. Workers Compensation Insurance Organizations. Injury Description Table -
Part/Nature/Cause. Published 2020. Accessed March 10, 2020.
https://www.wcio.org/Document%20Library/InjuryDescriptionTablePage.aspx

164. Anderson NJ, Bonauto DK, Adams D. Prioritizing industries for occupational injury
prevention and research in the Services Sector in Washington State, 2002–2010. J Occup
Med Toxicol. 2014;9(37):1-15. doi:10.1186/s12995-014-0037-2

165. US Bureau of Labor Statistics. Occupational Safety and Health Definitions. Published
2016. Accessed March 10, 2020. https://www.bls.gov/iif/oshdef.htm

166. Bureau of Labor Statistics. Consumer Price Index (CPI). Published 2017. Accessed May
17, 2017. https://www.bls.gov/cpi/

167. Fouquet N, Bodin J, Chazelle E, Descatha A, Roquelaure Y. Use of Multiple Data Sources
for Surveillance of Work-Related Chronic Low-Back Pain and Disc-Related Sciatica in a
French Region. Ann Work Expo Health. 2018;62(5):530-546.
doi:10.1093/annweh/wxy023

168. Silverstein B, Viikari-Juntura E, Kalat J. Use of a prevention index to identify industries at
high risk for work-related musculoskeletal disorders of the neck, back, and upper
extremity in Washington state, 1990-1998. Am J Ind Med. 2002;41(3):149-169.

169. Yang L, Branscum A, Smit E, Dreher D, Howard K, Kincl L. Work-related injuries and
illnesses and their association with hour of work: Analysis of the Oregon construction
industry in the US using workers’ compensation accepted disabling claims, 2007-2013. J
Occup Health. 2020;62(1):e12118. doi:10.1002/1348-9585.12118

170. Graitcer PL. The development of state and local injury surveillance systems. J Safety Res.
1987;18(4):191-198. doi:10.1016/0022-4375(87)90082-X

171. Saha A, Sadhu HG. Occupational Injury Proneness in Young Workers: A Survey in Stone
Quarries. J Occup Health. 2013;55(5):333-339. doi:10.1539/joh.12-0150-OA

172. Runyan CW, Zakocs RC. Epidemiology and Prevention of Injuries Among Adolescent
Workers in the United States. Annu Rev Public Health. 2000;21(1):247-269.
doi:10.1146/annurev.publhealth.21.1.247

173. Belville R, Pollack SH, Godbold JH, Landrigan PJ. Occupational injuries among working
adolescents in New York State. JAMA. 1993;269(21):2754-2759.

174. Margolis KA. Underground coal mining injury: A look at how age and experience relate to
days lost from work following an injury. Saf Sci. 2010;48(4):417-421.
doi:10.1016/j.ssci.2009.12.015

175. Taylor EJ, Neis B, Messing K, Dumais L. Invisible: issues in women’s occupational
health. Resour Fem Res Tor. 1996;25(1/2):36-37.

176. Biddle J, Roberts K, Rosenman KD, Welch EM. What Percentage of Workers With Work-
Related Illnesses Receive Workers’ Compensation Benefits? J Occup Environ Med.
1998;40(4):325–331.

177. Shannon HS, Lowe GS. How many injured workers do not file claims for workers’
compensation benefits? Am J Ind Med. 2002;42(6):467-473. doi:10.1002/ajim.10142

Appendices

Appendix 1-1 Historical events in occupational safety and health (OSH) and the
development of OSH surveillance in the US

Time period Event/OSH surveillance system


early 1900s Bureau of Labor Statistics (BLS) (predating the Department of Labor) reported data on
work-related injuries in certain industries such as rail transportation, and occupational
health problems such as phosphorous poisoning and industrial lead poisoning.
1937-1960s In 1937, BLS adopted the American National Standards Institute (ANSI) standard,
ANSI Z16.1, which provided a standard basis for recording and coding injuries at work,
including the nature of the injury, the part of the body affected, and the source of the
injury. Employers voluntarily recorded and reported injuries and illnesses to BLS under
ANSI Z16.1.
1950s-1960s The National Safety Council (NSC) compiled occupational injury statistics based on
ANSI Z16.1, mostly among its voluntary members.
1970 The OSH Act of 1970 was enacted. It created two federal agencies, the Occupational
Safety and Health Administration (OSHA) and the National Institute for Occupational
Safety and Health (NIOSH), and set forth the agencies’ responsibilities with respect to
occupational safety and health enforcement, statistics and research.
BLS, under the Department of Labor, was delegated responsibility for collecting
statistics on occupational injuries and illnesses.
NIOSH’s mandate is to respond to requests to investigate health hazards in the
workplace and to conduct research to inform standards and regulations in OSH.

1971 BLS implemented the national Survey of Occupational Injuries and Illnesses (SOII).
The SOII is a collaboration between BLS and the States to collect nationwide annual
data on non-fatal occupational injuries and illnesses from a sample of private sector
employers, based on OSHA recordkeeping requirements.
Later, in 1973 and 1978, BLS developed other data systems to supplement the SOII,
including the Supplementary Data System (SDS), which collected characteristics of
occupational injuries and illnesses and demographic information, and the Work Injury
Report surveys, which collected detailed causal information.
BLS has continued to improve the SOII. For example, in 1992 the SOII began
collecting data on the nature and circumstances of the injury or illness. In 2006, BLS
began publishing rates stratified by occupation, age and gender in addition to rates by
industry and establishment. In 2008, BLS expanded the SOII to include state and local
government workers.
1970s-1980s NIOSH conducted several data collection activities, including the National
Occupational Hazard Survey (NOHS) in 1972-1974. The NOHS conducted plant
surveys and walk-through inspections using a representative sample of 5,000
non-agricultural establishments covered under the OSH Act. The survey identified
more than 9,000 potential workplace hazards.
Similar surveys followed: the National Occupational Exposure Survey (NOES) in
1981-1983 and the National Occupational Health Survey of Mining (NOHSM) in the
late 1980s.
In the early 1980s, NIOSH developed the National Occupational Mortality System
(NOMS) by collaborating with the National Center for Health Statistics (NCHS) and
State vital statistics departments.

1977 The Mine Safety and Health Act of 1977 established the Mine Safety and Health
Administration (MSHA). The Act specifies that NIOSH is responsible for conducting
research related to mine safety and health and making recommendations to MSHA.
1979 In 1979, OSHA designed the Integrated Management Information System (IMIS) to
collect OSHA enforcement, consultation, and whistleblower information. IMIS
contains publicly available information on workplace exposure measures from surveys
performed by OSHA officers. The IMIS database has recently been replaced by the
Occupational Safety and Health Information System (OIS), which incorporates OSHA's
various enforcement activities and provides a tool to help identify injury, illness and
fatality trends at the local and national levels.
1980s OSH surveillance began to adopt the concept and methodology of "sentinel health
events" in the 1980s. Supported by NIOSH, the State-based Sentinel Event
Notification System for Occupational Risks (SENSOR) program was created to build
OSH surveillance capacity within States to carry out active surveillance, investigation
and preventive intervention for specific occupational conditions, such as work-related
asthma and pesticide poisoning.
As of 2020, the SENSOR-Pesticides programs, funded by NIOSH and the
Environmental Protection Agency (EPA), conduct surveillance on one or more
occupational illnesses or injuries in 13 States.
1980s In the late 1980s, NIOSH established the National Surveillance System of
Pneumoconiosis Mortality (NSSPM), providing annually updated information on all
pneumoconiosis-related deaths in the US.
In the 1990s, NIOSH began the periodic publication of the Work-Related Lung Disease
Surveillance Report, which presents data from the NSSPM and other sources.

1985 NIOSH began collecting death certificates from all US States and the District of
Columbia to establish the National Traumatic Occupational Fatalities (NTOF)
surveillance system. The NTOF system contains 30 variables to describe characteristics
of the injury and the victim.

1987 The Adult Blood Lead Epidemiology and Surveillance (ABLES) program was
established in 1987 as a collaboration between NIOSH and four States. As of 2018, 37
States maintain state ABLES programs to measure trends in work-related adult blood
lead levels to better target interventions and prevent lead exposures.
1989 NIOSH initiated the case-based national Fatality Assessment and Control Evaluation
(FACE) program in 1982, with participating States voluntarily notifying NIOSH of
traumatic occupational fatalities resulting from targeted causes of death. NIOSH
currently targets investigations of deaths associated with machinery, deaths of foreign-
born workers, energy production, and falls in construction.
State FACE programs began in 1989. As of 2020, seven States are funded by
NIOSH to conduct surveillance, targeted investigations and prevention activities using
the FACE model.

1984 & 1986 The US Congress held hearings in 1984 and 1986 regarding under-reporting and
credibility issues in occupational safety and health statistics.
The 1984 hearing concluded that "no reliable national estimates exist today, with
the exception of a limited number of substance-specific studies (such as on asbestos),
on the level of occupational disease, cancer, disability, or deaths" and that
"occupational diseases data collection was fragmented, unreliable, and seventy years
behind communicable disease surveillance."
The 1986 hearing stated that statistical information on occupational illnesses
was still grossly inadequate.

1987 To address concerns about the reliability of OSH statistics, Congress appropriated
funds and BLS requested that the National Research Council assess the validity and
quality of SOII data and other key issues related to OSH data collection. The resulting
report, "Counting Injuries and Illnesses in the Workplace: Proposals for a Better
System," provided seminal guidance on how to organize and enhance US OSH
surveillance.

1989 In 1986, NIOSH initiated a project to comprehensively evaluate its surveillance
efforts; part of the results was published as a monograph in the American Journal of
Public Health in 1989, covering reviews of its current surveillance systems as well as
recommendations for further improvement.

1990s Since the early 1990s, NIOSH has been collaborating with the Consumer Product
Safety Commission to collect data on non-fatal occupational injuries through an
occupational supplement to the National Electronic Injury Surveillance System
(NEISS), resulting in the NEISS-Work system, which captures work-related injuries
treated in hospital emergency departments (EDs).

1992 BLS implemented the national Census of Fatal Occupational Injuries (CFOI). The
CFOI is a complete census that uses multiple data sources, including death certificates,
workers’ compensation reports, and federal and state agency administrative reports.

1996 NIOSH initiated the National Occupational Research Agenda (NORA).
In the first decade of NORA (1996-2006), "surveillance research methods" was
specified as one focus area. In the second decade (2006-2016) and the third decade
(2016-2026), surveillance is considered a crosscutting concern relevant to all sectors
and health outcomes.

1996 The OSHA Data Initiative collected data on work-related illnesses and injuries from
employers within specific industries and with specific employment sizes from 1996 to
2011. OSHA used the data to calculate establishment-specific injury and illness rates
and provides the data in a searchable online database.
2001 In 1998, NIOSH sought recommendations from the NIOSH Surveillance Coordination
Group and the NIOSH-State Work Group (in collaboration with the Council of State
and Territorial Epidemiologists (CSTE)) regarding national and state-based
surveillance activities, leading to the NIOSH Surveillance Strategic Plan in 2001.
The Plan recognized the need for better coordination and collaboration of
surveillance activities, improved dissemination and use of surveillance data, and
standardized coding of OSH information, as well as State-based surveillance activities.

2004 The CSTE Occupational Health Surveillance Work Group developed Occupational
Health Indicators (OHIs) to help States establish fundamental surveillance capacity.
OHIs are quantitative measures of work-related illnesses, injuries, and factors
associated with occupational safety and health that can be generated using state-level
data and that reflect OSH priorities. In NIOSH's long-term vision of a nationwide
surveillance program, all States will have the core capacity and infrastructure to
conduct surveillance of occupational injuries and illnesses. The initial 19 OHIs were
piloted in 2004 among 13 States using data from 2000. Since then, NIOSH has been
funding State-based OSH surveillance programs to conduct OHI surveillance and other
surveillance activities. As of 2020, NIOSH funds 26 States to conduct surveillance on a
total of 24 OHIs: eight fundamental surveillance programs (able to establish and
maintain surveillance of a minimum of 15 OHIs), eleven fundamental-plus surveillance
programs (having established a fundamental surveillance program and demonstrated
capacity in OSH surveillance), and seven expanded surveillance programs (with longer
tenure or demonstrated experience in conducting OSH surveillance).
2010 NIOSH places importance on leveraging existing surveys and data systems to
supplement the limited resources for OSH surveillance. For example, NIOSH has
worked with NCHS to add questions concerning the work-relatedness of conditions
and workplace exposures to the National Health Interview Survey (NHIS) in 2010 and
2015 and to the Asthma Supplement of the Behavioral Risk Factor Surveillance
System (BRFSS).
Other survey systems NIOSH has worked with include the National Birth
Defects Prevention Study (NBDPS), the National Ambulatory Medical Care Survey
Asthma Supplement, the National Crime Victimization Survey (NCVS), and the
National Health and Nutrition Examination Survey (NHANES).
2010 In 2010, OSHA made the Chemical Exposure Health Data (CEHD) publicly available,
which includes industrial hygiene measurements and related information from a variety
of sources. Currently, only the analyses for the Salt Lake City laboratory samples are
available.

2018 At the joint request of NIOSH, BLS and OSHA, the National Academies of Sciences,
Engineering and Medicine published a report titled "A Smarter National Surveillance
System for Occupational Safety and Health in the 21st Century". The report provides a
comprehensive review of current issues with nationwide OSH surveillance and
proposes suggestions for more coordinated, cost-effective approaches to OSH
surveillance in the US.
The authors proposed the concept of a "smarter OSH surveillance system" that
demonstrates efficiencies, integrates strategies across multiple data sources, coordinates
efforts across key surveillance agencies, and applies domain knowledge effectively to
interpret data and deliver insights to key stakeholders.

Resources:
[1] National Academies of Sciences, Engineering, and Medicine. A Smarter National Surveillance System for
Occupational Safety and Health in the 21st Century. the National Academies Press; 2018.

[2] National Research Council. Counting Injuries and Illnesses in the Workplace: Proposals for a Better System. The National
Academies Press; 1987.
[3] National Institute for Occupational Safety and Health. Tracking Occupational Injuries, Illnesses, and Hazards:
The NIOSH Surveillance Strategic Plan. U.S. Department of Health and Human Services, Public Health Service,
Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health; 2001.
[4] Shire JD, Marsh GM, Talbott EO, Sharma RK. Advances and Current Themes in Occupational Health and
Environmental Public Health Surveillance. Annu Rev Public Health. 2011;32(1):109-132.
[5] Centers for Disease Control and Prevention (CDC). Pesticide Illness & Injury Surveillance. Published 2017.
[6] Baker EL. Sentinel Event Notification System for Occupational Risks (SENSOR): the concept. Am J Public
Health. 1989;79(Suppl):18-20.
[7] Centers for Disease Control and Prevention (CDC). Fatal occupational injuries–United States, 1980-1994. MMWR Morb Mortal
Wkly Rep. 1998;47(15):297-302.
[8] National Institute for Occupational Safety and Health (NIOSH). Adult Blood Lead Epidemiology and
Surveillance (ABLES). Published 2018.
[9] Biddle EA, Marsh SM. Comparison of two fatal occupational injury surveillance systems in the United States. J
Safety Res. 2002;33(3):337-354.
[10] National Institute for Occupational Safety and Health (NIOSH). Fatality Assessment and Control Evaluation
(FACE) Program. Published 2019.
[11] Froines J, Wegman D, Eisen E. Hazard surveillance in occupational disease. Am J Public Health.
1989;79(Suppl):26-31.
[12] Lavoue J, Friesen MC, Burstyn I. Workplace Measurements by the US Occupational Safety and Health
Administration since 1979: Descriptive Analysis and Potential Uses for Exposure Assessment. Ann Occup Hyg.
2013;57(1):77-97.
[13] Bureau of Labor Statistics (BLS). Census of Fatal Occupational Injuries (CFOI). Published 2012.
[14] Council of State and Territorial Epidemiologists. The Role of the States in a Nationwide, Comprehensive
Surveillance System for Work-Related Diseases, Injuries and Hazards: A Report from NIOSH-States Surveillance
Planning Work Group. Council of State and Territorial Epidemiologists (CSTE); 2001.
[15] National Institute for Occupational Safety and Health (NIOSH). NHIS Occupational Health Supplement: OHS
Questionnaire Content. Published March 4, 2019.
[16] National Institute for Occupational Safety and Health (NIOSH). OHS Behavioral Risk Factor Surveillance
System (BRFSS). Published December 3, 2019.

Appendix 1-2 Identified guidelines and frameworks for evaluating public health surveillance systems in a thorough
literature review

Guideline/framework Brief introduction Components abstracted Attributes abstracted


Guideline/framework: Guidelines for evaluating surveillance systems
(Centers for Disease Control and Prevention (CDC). Guidelines for evaluating surveillance
systems. Morb Mortal Wkly Rep. 1988;37(No. S-5).)
Brief introduction: These are the first guidelines for evaluating surveillance systems, published
by the CDC. They describe common evaluation steps and seven surveillance attributes.
Components abstracted: (none)
Attributes abstracted: simplicity, flexibility, acceptability, sensitivity, PVP, representativeness,
timeliness, usefulness

Guideline/framework: A method for evaluating systems of epidemiological surveillance
(Thacker SB, Parrish RG, Trowbridge FL. A method for evaluating systems of epidemiological
surveillance. World Health Stat Q Rapp Trimest Stat Sanit Mond. 1988;41(1):11-18.)
Brief introduction: Discusses evaluation of public health surveillance systems using a set of
attributes.
Components abstracted: collection, analysis, dissemination, quality control
Attributes abstracted: cost, sensitivity, usefulness, specificity, representativeness, timeliness,
simplicity, flexibility, acceptability

Guideline/framework: Protocol for the Evaluation of Epidemiological Surveillance Systems
(World Health Organization. Protocol for the Evaluation of Epidemiological Surveillance
Systems. Geneva: World Health Organization; 1997.)
Brief introduction: Aimed to provide a guide on assessing an existing surveillance system. It
discusses evaluation steps and methods. It also includes definitions and some example measures
for surveillance attributes (called system capacity).
Components abstracted: objectives of the system, population under surveillance, event, priority,
event detection, reporting, decision making and action taken, feedback, resources, staffing,
equipment, budget
Attributes abstracted: coverage, impact, availability, utility, sensitivity, specificity,
representativeness, timeliness, simplicity, flexibility, acceptability, completeness, usefulness
Guideline/framework: Technical and economic evaluation method for use in improving
infectious animal disease surveillance networks
(Dufour B. Technical and economic evaluation method for use in improving infectious animal
disease surveillance networks. Vet Res. 1999;30(1):27–37.)
Brief introduction: This method introduces the evaluation grids developed for animal health
surveillance systems and their application to three surveillance systems. Economic analysis is
also discussed.
Components abstracted: surveillance aims, sampling, coordination and awareness, training, staff
hours, vectors, susceptible wildlife, techniques, screening, diagnosis, laboratory, data collection,
data circulation, standardization, data processing, analysis, dissemination
Attributes abstracted: accuracy, cost-effectiveness

Guideline/framework: Guidelines for the use of performance indicators in rinderpest
surveillance programmes: A part of the Global Rinderpest Eradication Programme (GREP)
(Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. Guidelines for the
Use of Performance Indicators in Rinderpest Surveillance Programmes: A Part of the Global
Rinderpest Eradication Programme (GREP). International Atomic Energy Agency, Animal
Production and Health Section; 2000.)
Brief introduction: This report focuses on national-level rinderpest surveillance programs for
countries in GREP. Major components for different types of surveillance were suggested.
Performance indicators are proposed and described. Example indicator criteria are included.
Components abstracted: objective, surveillance methodology, data collection, data
manipulation, data analysis, data interpretation, structure, staff, materials
Attributes abstracted: sensitivity, specificity, data quality, timeliness
Guideline/framework: Evaluating Public Health Surveillance
(Romaguera RA, German RR, Klaucke DN. Evaluating Public Health Surveillance. In: Teutsch
SM, Churchill RE, eds. Principles and Practice of Public Health Surveillance. 2nd edition.
Oxford; New York: Oxford University Press; 2000. Older version of the 2010 edition.)
Brief introduction: This textbook chapter explains the evaluation steps and attributes. Important
concepts and considerations for each step are discussed. It includes definitions and methods for
evaluating attributes. Attributes beyond the CDC updated guidelines are also discussed.
Example evaluations are also provided.
Components abstracted: objective, event, public health importance, organization, quality
control, surveillance methods, data source, stakeholder, report, retrieved data
Attributes abstracted: cost, cost-effectiveness, usefulness, simplicity, flexibility, acceptability,
sensitivity, PVP, representativeness, timeliness
Guideline/framework: Injury Surveillance Guidelines
(Holder Y, Peden M, Krug E, et al., eds. Injury Surveillance Guidelines. Geneva: World Health
Organization; 2001.)
Brief introduction: These guidelines cover the design, implementation and evaluation of injury
surveillance systems. They propose and briefly discuss three types of evaluation for an injury
surveillance system, i.e., retrospective, process and system environment evaluation.
Components abstracted: (none)
Attributes abstracted: accuracy, data quality, simplicity, usefulness

Guideline/framework: Updated guidelines for evaluating public health surveillance systems
(Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public
health surveillance systems: recommendations from the guidelines working group. Morb Mortal
Wkly Rep. 2001;50(No. 13):1-36.)
Brief introduction: This is the most well-known guideline, applicable to all kinds of public
health surveillance systems. It provides six evaluation steps, adapted from the Framework for
Program Evaluation in Public Health [1]. Ten general evaluation attributes are recommended
and discussed.
[1] Centers for Disease Control and Prevention (CDC), Program Performance and Evaluation
Office. Framework for program evaluation in public health. Morbidity and Mortality Weekly
Report. 1999;48(No. RR-11):1-40.
Components abstracted: stakeholders, data users, data providers, practitioners, providers,
community, government agencies, professional, non-profit organizations, public health
importance, objective, operation, population under surveillance, data management, data
analysis, dissemination, confidentiality, standards, report, publication, presentation, poster, data
release, resource, funding, personnel, training, supplies, cost
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP,
representativeness, timeliness, stability, usefulness

Guideline/framework: Protocol for the assessment of national communicable disease surveillance and response systems
Citation: World Health Organization, Department of Communicable Disease Surveillance and Response. Protocol for the Assessment of National Communicable Disease Surveillance and Response Systems: Guidelines for Assessment Teams. World Health Organization; 2001.
Brief introduction: This protocol was developed to guide assessment of national-level communicable disease surveillance and response systems. Specific guides are included on evaluating the overall structure and performance of surveillance activities.
Components abstracted: detection, registration, confirmation, reporting, early warning, routine reporting, analysis, interpretation, response, control, outbreak investigation, program adjustment, changes in policy and planning, feedback, evaluation and monitoring, setting standards, case definition, case management, standard procedures for investigation, training, supervision, communication system, resources, human, material, financial, communication, technology platform, active case finding
Attributes abstracted: acceptability, sensitivity, specificity
Guideline/framework: Draft framework for evaluating syndromic surveillance systems
Citation: Sosin DM. Draft framework for evaluating syndromic surveillance systems. J Urban Health. 2003;80(Suppl 1):i8-i13.
Brief introduction: This framework provides a guide for evaluating syndromic surveillance systems based on the CDC updated guidelines. The framework describes five evaluation tasks/functions. Attributes to evaluate in each task are discussed. Older version of the 2004 framework.
Components abstracted: stakeholder, provider, laboratory, operation, informatics
Attributes abstracted: usefulness, acceptability, generalizability, stability, cost, flexibility, sensitivity, PVP, timeliness, data quality, representativeness, completeness, reliability

Guideline/framework: Quality assurance applied to animal disease surveillance systems
Citation: Salman MD, Stärk KDC, Zepeda C. Quality assurance applied to animal disease surveillance systems. Rev - Off Int Epizoot. 2003;22(2):689-696.
Brief introduction: This paper provides general considerations for evaluating animal health monitoring and surveillance systems. It introduces factors affecting surveillance quality and surveillance system elements. Methods such as scoring systems and quality indicators are recommended for evaluating surveillance systems.
Components abstracted: objectives, event, legislation and regulation, authorities, resources, target population, surveillance design, diagnostic methods, data management, data analysis, feedback, dissemination, protocol
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness

Guideline/framework: Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group
Citation: Centers for Disease Control and Prevention (CDC). Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. Morb Mortal Wkly Rep. 2004;53(No. RR-5):1-13.
Brief introduction: This CDC guide is for the evaluation of public health outbreak surveillance systems (e.g., syndromic surveillance). It discusses four tasks in an evaluation and important attributes in each task. Emphasis is given to timeliness and the balance among other attributes (e.g., sensitivity).
Components abstracted: objectives, stakeholders, data providers, data users, practitioners, providers, government agencies, officials, organizations, commercial systems developers, operation, data transmission standards, data sources, data processing, data analysis, interpretation, investigation, outbreak detection, case definition
Attributes abstracted: timeliness, validity, sensitivity, PVP, PVN, data quality, representativeness, completeness, usefulness, flexibility, acceptability, portability, stability, reliability, cost, cost-effectiveness
Guideline/framework: Framework and Tools for Evaluating Health Surveillance Systems
Citation: Health Canada. Framework and Tools for Evaluating Health Surveillance Systems. Canada: Population and Public Health Branch, Health Canada; 2004.
Brief introduction: This Canadian framework provides a standard approach for evaluating public health surveillance systems. It outlines and discusses six evaluation steps. System performance characteristics and quality attributes are introduced and detailed in an appendix of glossaries and terms.
Components abstracted: data collection, integration, analysis, interpretation, surveillance products, dissemination, management, coordination, legislation and regulation, purpose of surveillance, population under surveillance, use of data, technology
Attributes abstracted: acceptability, simplicity, flexibility, data quality, PVP, sensitivity, representativeness, timeliness, stability, robustness, compliance, effectiveness, efficiency, usefulness, accessibility, accuracy, availability, coherence, integration, completeness, confidentiality, cost, cost-effectiveness, interoperability, interpretability, relevance, specificity, usability, validity

Guideline/framework: Framework for the evaluation and assessment of EU-wide surveillance networks in 2006-2008
Citation: European Center for Disease Prevention and Control (ECDC). Framework for the Evaluation and Assessment of EU-Wide Surveillance Networks in 2006-2008. European Center for Disease Prevention and Control (ECDC); 2006.
Brief introduction: This framework was developed for planning an evaluation project of EU-wide surveillance networks. A specific protocol including scope of work, evaluation team, data collection, evaluation process, timetable, performance indicators, and measurements is included, which can be referenced in other evaluations.
Components abstracted: coordination structure, decision-making, project management, administration, supervision, case definitions, data to be collected, data management, data protection, data access, data confidentiality, interoperability, data dissemination, data reporting, public health action, infection control procedures, laboratory procedures
Attributes abstracted: representativeness, timeliness, completeness of reporting, validity of data, quality assurance, workload, use of resources, usefulness

Guideline/framework: Communicable Disease Surveillance and Response Systems: Guide to Monitoring and Evaluating
Citation: World Health Organization (WHO). Communicable Disease Surveillance and Response Systems: Guide to Monitoring and Evaluating. Geneva: World Health Organization (WHO); 2006.
Brief introduction: WHO proposes a structural model for communicable disease surveillance systems. It includes a detailed introduction to each component and attribute. It also discusses evaluation steps for two types of evaluation, i.e., routine tracking and periodic assessment.
Components abstracted: core function, case detection, registration, confirmation, reporting, data analysis, interpretation, epidemic preparedness, response and control, feedback, standards and guidelines, training, supervision, communication facilities, resources, logistics, monitoring, evaluation, coordination, system structure, legislation, surveillance strategy, surveillance implementers and stakeholders, networking, partnership, event under surveillance, protocol
Attributes abstracted: completeness, timeliness, usefulness, simplicity, acceptability, flexibility, sensitivity, specificity, PVP, representativeness, transparency, compliance, completeness, reliability
Guideline/framework: Assessing the National Health Information System: An Assessment Tool (v4.0)
Citation: World Health Organization (WHO). Assessing the National Health Information System: An Assessment Tool (v4.0). Geneva: World Health Organization; 2008.
Brief introduction: This tool aims to provide a guide for assessing national-level health information systems (HIS). It specifies HIS components and standards and provides a scoring framework for each component as a practical tool.
Components abstracted: resources, system indicators, data sources, data management, information products, dissemination, data use, stakeholders, producers, users, financiers, officials, government agencies, care providers, donors, professional associations, community, media, NGOs, infrastructure
Attributes abstracted: feasibility, cost-effectiveness, sustainability, timeliness, consistency, representativeness, confidentiality, security, accessibility, flexibility

Guideline/framework: Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas
Citation: Meynard J-B, Chaudet H, Green AD, et al. Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas. BMC Public Health. 2008;8(1):146.
Brief introduction: This military framework is for evaluating syndromic surveillance systems. It incorporates characteristics of French and British military syndromic surveillance systems. Evaluation parameters (i.e., attributes) are specified for different types of evaluation corresponding to the development of a system (e.g., initial evaluation, final evaluation).
Components abstracted: /
Attributes abstracted: timeliness, validity, data quality, usefulness, costs, sensitivity, reliability, acceptability, portability, stability, flexibility, specificity, relevance, feasibility, consistency, security, availability, efficiency, interoperability, impact

Guideline/framework: Effective environmental public health surveillance programs: a framework for identifying and evaluating data resources and indicators
Citation: Malecki KC, Resnick B, Burke TA. Effective environmental public health surveillance programs: a framework for identifying and evaluating data resources and indicators. J Public Health Manag Pract. 2008;14(6):543-551.
Brief introduction: This framework is for the evaluation of surveillance data sources and indicators. It discusses different aspects to be considered in an evaluation, such as scientific relevance and feasibility of an indicator. Attributes are listed as criteria for each aspect.
Components abstracted: system purpose, priority topic areas, resource, data sources
Attributes abstracted: validity, representativeness, trusted, availability, measurability, feasibility, accessibility, accuracy, reliability, robustness, data quality, sensitivity, cost-effectiveness, timeliness
Guideline/framework: The development of an evaluation framework for injury surveillance systems
Citation: Mitchell RJ, Williamson AM, O'Connor R. The development of an evaluation framework for injury surveillance systems. BMC Public Health. 2009;9(1):260.
Brief introduction: This describes the development of a framework for the evaluation of injury surveillance systems. Characteristics/attributes practical to injury surveillance are listed with experts' ratings. Definitions and evaluation criteria are included.
Components abstracted: system purpose and objectives, data collection, case definition, quality control measure, data confidentiality, individual privacy, system security, uniform classification systems, routine data analysis, guidance material to aid interpretation
Attributes abstracted: completeness, sensitivity, specificity, PVP, representativeness, timeliness, accessibility, usefulness

Guideline/framework: Characteristics of national registries for occupational diseases: international development and validation of an audit tool (ODIT)
Citation: Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries for occupational diseases: international development and validation of an audit tool (ODIT). BMC Health Serv Res. 2009;9:194.
Brief introduction: This tool aimed to guide the evaluation of national registries of occupational diseases in EU countries. The audit tool includes indicators assessing a national registry system, including its monitor and/or alert functions.
Components abstracted: case investigation, case reporting, case registration, physicians, publication of alert information
Attributes abstracted: completeness, coverage, acceptability
Guideline/framework: Evaluating Public Health Surveillance
Citation: Groseclose SL, German RR, Nsubuga P. Evaluating Public Health Surveillance. In: Lee LM, Teutsch SM, Thacker SB, St. Louis ME, eds. Principles and Practice of Public Health Surveillance. 3rd edition. Oxford; New York: Oxford University Press; 2010.
Brief introduction: This textbook chapter explains the evaluation steps and attributes in the CDC updated guidelines. Important concepts and considerations for each step are discussed. It includes definitions and methods for evaluating attributes. Attributes beyond the CDC updated guidelines are also discussed. Example evaluations are also provided.
Components abstracted: public health importance, objective, operation, event, legal authority, system organization, population under surveillance, data collection, methods, reporting sources, data management, protocol, technology, data analysis, dissemination, privacy, confidentiality, data release, funding, personnel, stakeholder, informatics, data provider, laboratory, guide intervention, guide research, data integration, direct cost, indirect cost
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, integration, accuracy, completeness, relevance, consistency, usability, availability, adaptability, portability, security, reliability, responsiveness

Guideline/framework: OASIS: an assessment tool of epidemiological surveillance systems in animal health and food safety
Citation: Hendrikx P, Gay E, Chazel M, et al. OASIS: an assessment tool of epidemiological surveillance systems in animal health and food safety. Epidemiol Infect. 2011;139(10):1486-1496.
Brief introduction: This tool was developed by a research project comprising 10 experts, based on three existing evaluation methods.
Components abstracted: objectives, scope, organizational structure, central organization, field organization, diagnostic, laboratory, surveillance procedure, data management, coordination, supervision, training, integration, evaluation, resources, budget, quality assurance, standardization, protocol, active surveillance, passive surveillance, news bulletin, performance indicators
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness, relevance, consistency, integration, suitability, accuracy
Guideline/framework: The relevance of WHO injury surveillance guidelines for evaluation: learning from the aboriginal community-centered injury surveillance system (ACCISS) and two institution-based systems
Citation: Auer AM, Dobmeier TM, Haglund BJ, Tillgren P. The relevance of WHO injury surveillance guidelines for evaluation: learning from the aboriginal community-centered injury surveillance system (ACCISS) and two institution-based systems. BMC Public Health. 2011;11(744):1-15.
Brief introduction: This article reviewed the applicability of the WHO injury surveillance guidelines and performance attributes in guiding evaluation practice through comparative analysis of three evaluation applications. It compared performance attributes in both epidemiologic and public health dimensions, as well as across different types of evaluation. It also discusses attributes that should be assigned more weight.
Components abstracted: /
Attributes abstracted: simplicity, flexibility, reliability, security, confidentiality, acceptability, utility, timeliness, sustainability

Guideline/framework: Evaluating an NCD-Related Surveillance System
Citation: Centers for Disease Control and Prevention (CDC). Evaluating an NCD-Related Surveillance System. Atlanta, GA: Centers for Disease Control and Prevention; 2013.
Brief introduction: This is a guide/tutorial for evaluating non-communicable disease (NCD) surveillance systems based on the CDC updated guidelines. It discusses the six steps recommended in the CDC updated guidelines. Methods and measures for evaluating attributes in the context of NCD surveillance are discussed.
Components abstracted: surveillance objective, stakeholders, practitioners, providers, community, data provider, data user, professional organization, government agency; public health importance, cost, use of data, event, case definition
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, clarity

Guideline/framework: The EVA Tool developed in the RISKSUR Project
Citation: Calba C, Cameron A, Goutard F, et al. The EVA Tool: An Integrated Approach for Evaluation of Animal Health Surveillance Systems. RISKSUR Project; 2013. Also: Peyre M, Hoinville L, Njoroge J, et al. The RISKSUR EVA tool (Survtool): A tool for the integrated evaluation of animal health surveillance systems. Preventive Veterinary Medicine. 2019;173:104777.
Brief introduction: The EVA tool provides comprehensive guidance for the evaluation of animal health surveillance systems. It lists structural/organizational, functional, and effectiveness attributes and considers the ranking of attributes. Example evaluation questions are provided. Economic evaluation is also considered.
Components abstracted: organization and management, training, evaluation, performance indicator, resource, data collection, sampling strategy, data management, internal communication, dissemination, laboratory testing, data analysis, quality assurance, multiple hazard, risk-based criteria definition
Attributes abstracted: availability, sustainability, acceptability, simplicity, flexibility, adaptability, coverage, portability, interoperability, representativeness, utility, accuracy, timeliness, PVP, PVN, repeatability, robustness, cost, impact, benefit, cost-effectiveness

Guideline/framework: Animal Health Surveillance Terminology: Final Report from Pre-ICAHS Workshop
Citation: Hoinville L. Animal Health Surveillance Terminology: Final Report from Pre-ICAHS Workshop. In: International Conference on Animal Health Surveillance; Vol 17. 2013:1-27.
Brief introduction: This terminology book is tailored to animal health surveillance. Three groups of terminologies are listed and defined: general terms, characteristics of surveillance activities, and attributes.
Components abstracted: surveillance purpose, scope of surveillance, unit of interest, stakeholders, management, personnel, organizational structure, basis of participation, legislation and regulation, species/breed, sampling strategy, population stream, disease focus, data source, data collection method, study design, case definition, data analysis, data providers, dissemination, data management, communication, laboratory management
Attributes abstracted: stability, sustainability, simplicity, flexibility, repeatability, coverage, representativeness, utility, completeness, accuracy, sensitivity, timeliness, accuracy, bias, cost, impact, cost-effectiveness

Guideline/framework: Conceptual evaluation of population health surveillance programs: method and example
Citation: Allaki FE, Bigras-Poulin M, Ravel A. Conceptual evaluation of population health surveillance programs: method and example. Prev Vet Med. 2013;108(4):241-252.
Brief introduction: This provides a conceptual model to evaluate a public and animal health surveillance program based on the concepts underlying the surveillance program (i.e., what the program is), which are compared to the theoretical standards for health surveillance (i.e., what the program should be). Components and attributes are discussed as theoretical standards.
Components abstracted: /
Attributes abstracted: /
Guideline/framework: Overview of Evaluating Surveillance Systems
Citation: Centers for Disease Control and Prevention. Overview of Evaluating Surveillance Systems. Atlanta, GA; 2013.
Brief introduction: A field guide to evaluation of public health surveillance systems based on the CDC updated guidelines (2001). Working tables and an example evaluation are provided.
Components abstracted: public health importance, objective, resource
Attributes abstracted: simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness

Guideline/framework: Evaluating a National Surveillance System
Citation: UNAIDS/WHO Working Group on Global HIV/AIDS and STI Surveillance. Evaluating a National Surveillance System. Geneva: World Health Organization; 2013.
Brief introduction: This UNAIDS/WHO framework provides guidance for the evaluation of national-level HIV surveillance systems. It covers various aspects of evaluating HIV/AIDS surveillance and programs, such as inventory, design and implementation, and surveillance data quality. Case studies are also provided as illustrations.
Components abstracted: flow chart, level of integration, objectives, data sources, surveillance approach, activities, geographical dimension, data collection method, data providers, resources, human, financial, stakeholders, planners, implementers, data users, guidelines, case reporting, size estimation, protocol, quality standards, staffing, supervision, laboratory, equipment, ethical standards, data analysis, data dissemination, data use, data management
Attributes abstracted: flexibility, timeliness, coverage, completeness, accuracy, confidentiality, simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, stability, usefulness

Guideline/framework: European Data Quality Monitoring and Surveillance System Evaluation Handbook
Citation: European Center for Disease Prevention and Control (ECDC). Data Quality Monitoring and Surveillance System Evaluation - A Handbook of Methods and Applications. Stockholm: ECDC; 2014.
Brief introduction: This is a guide for evaluation of public health surveillance systems in EU/EEA Member States. It includes a detailed introduction and guide on evaluating quality attributes in the context of communicable disease surveillance.
Components abstracted: case ascertainment, case reporting, objective, event, case definition, data source, data flow, networks, population under surveillance, geographical coverage, data management, data entry, data reporting process, database structure, data quality monitoring, legislation, stakeholders, data provider
Attributes abstracted: completeness, validity, sensitivity, specificity, PVP, PVN, timeliness, usefulness, representativeness, simplicity, flexibility, acceptability, stability, reliability

Guideline/framework: Development of an index system for the comprehensive evaluation on public health emergency events surveillance system in China
Citation: Hong Z, Ni D, Cao Y, et al. Development of an index system for the comprehensive evaluation on public health emergency events surveillance system in China. Zhonghua Liu Xing Bing Xue Za Zhi. 2015;36(6):547-551.
Brief introduction: This article reports a comprehensive evaluation index system for the China Public Health Emergency Events Surveillance System (CPHEESS).
Components abstracted: structure, data processing, data utilization
Attributes abstracted: /

Guideline/framework: SERVAL (SuRveillance EVALuation framework)
Citation: Drewe JA, Hoinville LJ, Cook AJC, Floyd T, Gunn G, Stärk KDC. SERVAL: a new framework for the evaluation of animal health surveillance. Transbound Emerg Dis. 2015;62(1):33-45. Also: Drewe J, Staerk K. SERVAL (SuRveillance EVALuation framework). Royal Veterinary College. https://www.rvc.ac.uk/research/research-centres-and-facilities/veterinary-epidemiology-economics-and-public-health/projects/serval. Published 2017. Accessed February 12, 2018.
Brief introduction: This is a generic framework for evaluating animal health surveillance. It discusses evaluation steps and key questions related to each step. Guides on attributes and their selection are included. It also includes a guide on economic analysis for the system.
Components abstracted: conditions under surveillance, current situation, surveillance objectives, target population, structure of the system, design of the system, system users, organizational structure, communication, data analysis, data collection, data management, management
Attributes abstracted: benefit, bias, cost, coverage, data completeness, flexibility, historical data, accessibility, impact, laboratory, multiple utility, participation, precision, repeatability, representativeness, sensitivity, specificity, stability, sustainability, timeliness

Guideline/framework: Brief Evaluation Protocol for Public Health Surveillance Systems
Citation: Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. The Public Policy Center, University of Nebraska; 2016.
Brief introduction: This protocol provides a practical tool based on the CDC updated guidelines (2001) to assist evaluation of a public health surveillance system in a short period of time and with limited resources. Examples and working tables are included to illustrate evaluation steps and the evaluation of attributes.
Components abstracted: stakeholders, staff, data providers, data managers, data users, general public, community members, system purpose, goals, standards, guidance, structure, reporting, resource, budget, personnel (staff member/FTEs), training, computer
Attributes abstracted: cost, simplicity, flexibility, data quality, acceptability, sensitivity, PVP, representativeness, timeliness, stability, usefulness, mutual understanding
Guideline/framework: SurF: an innovative framework in biosecurity and animal health surveillance evaluation
Citation: Muellner P, Watts J, Bingham P, et al. SurF: an innovative framework in biosecurity and animal health surveillance evaluation. Transbound Emerg Dis. 2018.
Brief introduction: This proposes a generic framework to ensure consistent evaluation of New Zealand biosecurity and animal health surveillance systems. It proposes four phases/steps in an evaluation as well as activities within each phase. It suggests 29 attributes, which are divided into core and accessory attributes.
Components abstracted: organization and management, performance indicators, evaluation, data analysis, data and information collection, data management and storage, field laboratory services, resources, training, external communication and dissemination, internal communication, organization, process, technical implementation, output, impact
Attributes abstracted: flexibility, availability, acceptability, interoperability, utility, reliability, availability, repeatability, robustness, timeliness, PVN, PVP, accuracy, representativeness, bias, sensitivity, specificity, efficiency

Appendix 1-3 Summary of six existing evaluations of OSH surveillance systems

Each entry lists: Publication; Brief introduction; Surveillance system(s) under evaluation; Attributes being assessed and definitions/evaluation questions; Evaluation methods used.

Publication: Austin C. An evaluation of the census of fatal occupational injuries as a system for surveillance. Compensation and Working Conditions. 1995;1:51-54.
Brief introduction: The article described the Bureau of Labor Statistics (BLS) Census of Fatal Occupational Injuries (CFOI) system and assessed the seven attributes recommended in the CDC guidelines of 1988 [1]. Overall, the evaluation found the CFOI system had high levels of acceptability, timeliness, and representativeness, and could potentially have high sensitivity and PVP. The dissemination of information could be further improved.
Surveillance system(s) under evaluation: The BLS CFOI is a passive surveillance system aimed at collecting information on fatal occupational injuries. The evaluation was conducted three years after the system's establishment.
Attributes being assessed and definitions/evaluation questions:
Simplicity: Both the structure and ease of operation.
Flexibility: The system can easily adapt to changing information needs or operating conditions.
Acceptability: The number of individuals and organizations that willingly participate in the system.
Sensitivity: The proportion of all occupational fatalities that are detected by the CFOI system.
Predictive value positive (PVP): The proportion of events identified by CFOI that actually met the CFOI case definition.
Representativeness: The ability to accurately describe an event over time and the distribution of the population by place and person.
Timeliness: The delay between steps in the system.
Usefulness: Ability to produce data releases and reports, which can be useful in monitoring trends and bringing focus to high-risk occupations and industries.
Evaluation methods used: The evaluation was based on the CDC guidelines of 1988 [1]. Evaluation methods were not described. The seven attributes were assessed qualitatively with examples, with a focus on the system's data collection and processing. No quantitative statistics were reported for the sensitivity and PVP attributes.
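The sensitivity and PVP definitions above are simple proportions; a minimal sketch with invented counts (illustrative only, not actual CFOI figures):

```python
# Hypothetical worked example of the sensitivity and PVP attributes defined
# above; all counts are invented for illustration.

def sensitivity(detected_true_cases: int, all_true_cases: int) -> float:
    """Proportion of all true occupational fatalities the system detects."""
    return detected_true_cases / all_true_cases

def predictive_value_positive(confirmed_cases: int, reported_cases: int) -> float:
    """Proportion of reported events that actually meet the case definition."""
    return confirmed_cases / reported_cases

# Suppose 950 of 1,000 true fatalities were captured, and 940 of the 960
# events the system reported met the case definition.
print(sensitivity(950, 1000))                         # 0.95
print(round(predictive_value_positive(940, 960), 3))  # 0.979
```

Evaluations like Austin's could not report these statistics precisely because the denominator of true cases was unknown, which is why the attributes were assessed qualitatively.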
Publication: Patel K, Watanabe-Galloway S, Gofin R, Haynatzki G, Rautiainen R. Non-fatal agricultural injury surveillance in the United States: A review of national-level survey-based systems. Am J Ind Med. 2017;60(7):599-620.
Brief introduction: The article described a comprehensive evaluation of six national-level survey-based surveillance systems on non-fatal agricultural injury in the US, including four adult and two youth national surveys. Based on the CDC updated guidelines [2], eight attributes were assessed for each of the six systems. Major gaps were identified for these systems, including under-coverage of certain populations, varied case definitions and reporting criteria, under-reporting, as well as the need for additional data variables. The authors acknowledged possible subjectivity in the selection of measurements and the interpretation of evaluation evidence.
Surveillance system(s) under evaluation: 1) BLS Survey of Occupational Injuries and Illnesses (SOII); 2) National Agricultural Workers Survey (NAWS); 3) Occupational Injury Surveillance of Production Agriculture (OISPA); 4) Minority Farm Operator Occupational Injury Surveillance of Production Agriculture (M-OISPA); 5) Childhood Agricultural Injury Survey (CAIS); 6) Minority Farm Operator Childhood Agricultural Injury Survey (M-CAIS).
Attributes being assessed and definitions/evaluation questions:
Simplicity: A surveillance system's structure (sampling, data collection, management, analysis, and dissemination plan) and ease of operation.
Flexibility: Ability of a system to adapt to changing information needs and technology with minimal resources.
Data quality: Validity and completeness of data recorded.
Acceptability: The willingness of people or organizations to participate in it.
Sensitivity: The proportion of true cases and trends detected by the system, which was synonymous with the completeness of data.
Representativeness: Ability to describe the injury occurrence for the survey year as well as over time for the population of interest.
Timeliness: The time taken by a system at each step from sampling to reporting of injury to data dissemination.
Stability: Ability of a system to collect, manage, and disseminate the data properly without any outages, and make it available to the public and stakeholders.
Evaluation methods used: A set of evaluation questions for each attribute was developed by the researchers based on the CDC updated guidelines [2]. Evaluation evidence was collected mostly by a systematic literature review of peer-reviewed articles, government publications, and websites of relevant government agencies. Evaluation evidence was qualitatively summarized for each evaluation question, with quantitative statistics used when possible. Due to limited information, some recommended attributes such as predictive value positive, stability, and data quality were not assessed or not fully assessed.
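Timeliness, defined above as the time taken at each step from sampling to dissemination, is typically measured as lags between dated events and summarized with a statistic such as the median; a minimal stdlib sketch with invented dates:

```python
# Illustrative timeliness calculation: lag in days from injury occurrence to
# report receipt, summarized by the median. All dates are invented.
from datetime import date
from statistics import median

injury_report_pairs = [
    (date(2016, 1, 5), date(2016, 3, 10)),
    (date(2016, 5, 20), date(2016, 6, 1)),
    (date(2016, 7, 1), date(2016, 9, 30)),
]

lags_days = [(reported - injured).days for injured, reported in injury_report_pairs]
print(lags_days)          # [65, 12, 91]
print(median(lags_days))  # 65
```

The same pattern applies to any pair of dated steps in a system (e.g., report receipt to data release), which is how step-by-step timeliness comparisons across systems are made.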
Publication: Nobles RE. Process evaluation of the Texas occupational safety & health surveillance system. 2009. https://search.proquest.com/openview/bbf7eddd1106df9c4ea7c4ea18613bfb/1?pq-origsite=gscholar&cbl=18750&diss=y.

Brief Introduction: This dissertation described a thorough process evaluation of the Texas occupational safety and health surveillance system (TOSHSS), based on two CDC guidelines [2-3]. The evaluation assessed the alignment of TOSHSS activities with the funding requirements and its stated objectives; the system's performance on the attributes recommended in the CDC updated guidelines [2] was also evaluated. The evaluation demonstrated that TOSHSS was simple, flexible, acceptable, fairly stable, partially useful, and of good data quality. It also highlighted gaps in TOSHSS, including the absence of plans to develop a scientific advisory committee or maintain contacts with stakeholders.

Surveillance system(s) under evaluation: The Texas occupational safety and health surveillance system (TOSHSS), a state-based OSH surveillance system funded by NIOSH since 2006 as a fundamental-plus state-level surveillance system, with 13 OHIs being monitored as of the evaluation. The evaluation was conducted midway through the NIOSH funding cycle.

Attributes being assessed and definitions:
• Simplicity: The structure/organization and ease of performing its stated objectives.
• Flexibility: Ability to adapt to changing information needs or operating conditions with minimal need for additional time, personnel, or funds.
• Data quality: Completeness and validity of the recorded data.
• Acceptability: The willingness of persons and organizations to participate in the activities of the surveillance system.
• Sensitivity: The level of case reporting that allows for the calculation of the proportion of occupational illnesses and injuries detected by the system, and the ability to detect outbreaks and changes in occupational injuries and illnesses over time.
• PVP: The proportion of reported cases that are determined to have the occupational injury or illness under surveillance.
• Representativeness: The ability to accurately describe the occurrence of occupational injuries and illnesses over time and the distribution of events among Texans by place and person.
• Timeliness: The speed between the action steps of the system.
• Stability: The ability to collect, manage, and provide data properly without failure (reliability) and the ability to be operational when needed (availability).
• Usefulness: 1) Contribution to the improved understanding, prevention and control of occupational injuries and illnesses. 2) Determination of occupational injury or illness previously thought to be unimportant. 3) Contribution to the development of performance measures either within the program or with stakeholder groups.

Evaluation methods used: To assess the alignment of TOSHSS objectives with the funding requirements, as well as its activities with its stated objectives, logic model analyses were conducted. To assess attributes, a set of evaluation questions was developed for each attribute based on the CDC updated guidelines [2]. Evaluation evidence was collected through interviews with the EIET staff, management and primary stakeholders, and was qualitatively summarized, with quantitative statistics used when possible.
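The sensitivity and PVP definitions above reduce to simple proportions. As a minimal illustration (all counts below are hypothetical, not data from TOSHSS), they can be computed as:

```python
# Minimal sketch of the sensitivity and PVP calculations defined above.
# All counts are hypothetical illustrations, not data from TOSHSS.

def sensitivity(detected_true_cases: int, all_true_cases: int) -> float:
    """Proportion of all true occupational injury/illness cases
    that the surveillance system detected."""
    return detected_true_cases / all_true_cases

def pvp(confirmed_cases: int, reported_cases: int) -> float:
    """Predictive value positive: proportion of reported cases
    confirmed to have the condition under surveillance."""
    return confirmed_cases / reported_cases

print(sensitivity(80, 200))  # 0.4
print(pvp(80, 100))          # 0.8
```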
Publication: Spreeuwers D, de Boer AGEM, Verbeek JHAM, van Dijk FJH. Evaluation of occupational disease surveillance in six EU countries. Occupational Medicine. 2010;60(7):509-516.

Brief Introduction: The authors, who developed the ODIT tool, the scoring system for evaluating the quality of national registries of occupational diseases in EU countries [4], further reported a systematic evaluation of the national registries in six EU countries with the ODIT tool. The evaluation focused on two functions in these registries, i.e., the monitoring and alert functions, and scored each ODIT indicator for the two functions. The evaluation showed that the monitoring functions in all six countries were of limited quality and the alert functions were poor in four countries. It highlighted the need for quality improvement of these registries of occupational diseases.

Surveillance system(s) under evaluation: 1) Statistics of Occupational Diseases in Austria; 2) Fund for Occupational Diseases in Belgium; 3) Czech National Registry of Occupational Diseases; 4) Finnish Registry of Occupational Diseases; 5) General Regime in France; 6) THOR in the UK.

Attributes being assessed and definitions: No explicit statement on attributes in this evaluation. The indicators included in the ODIT and this evaluation are more like evaluation questions and indicators which can be summarized under certain attributes. For example:
• Completeness of the notification form in both monitoring and alert functions, including information on diagnosis, exposure, occupation, economic sector, worker's demographics, etc.
• Coverage of registration.
• Guidelines or criteria for notification.
• Education and training.
• Participation of notifying physicians.
• Publication of monitor and alert information.
• Investigation of special cases.

Evaluation methods used: Evaluation evidence was collected through a questionnaire and an in-person interview with a contact person in each country. The questionnaire included questions on the objective of the registry and different quality aspects according to the ODIT tool [4]. Based on the verified audit report, the reviewers further determined a quality score for each ODIT indicator.
Publication: Henderson AK, Payne MM, Ossiander E, Evans CG, Kaufman JD. Surveillance of Occupational Diseases in the United States: A Survey of Activities and Determinants of Success. Journal of Occupational and Environmental Medicine. 1998;40(8):714-719.

Brief Introduction: This article reported a comprehensive evaluation of the 68 state-based surveillance programs in all 50 states, the District of Columbia and New York City in the US, based on the CDC guidelines of 1988 [1]. The authors defined a successful program as meeting one self-reported objective. The study found that 56% of programs were successful; in these programs, the OSH conditions under surveillance had short latency periods, were easily diagnosed, and were related to a workplace hazard. Successful programs tended to have larger budgets and more staff.

Surveillance system(s) under evaluation: 68 state-based surveillance programs in all 50 states, the District of Columbia and New York City in the US.

Attributes being assessed and definitions: No attributes were explicitly evaluated in this report.

Evaluation methods used: A standard questionnaire was developed based on the CDC guidelines of 1988 [1], including questions describing the program and its objectives. Telephone interviews with program managers were administered by trained interviewers. Evaluation evidence was analyzed descriptively and/or qualitatively.
Publication: Estes CR, Marsh SM, Castillo DN. Surveillance of traumatic firefighter fatalities: an assessment of four systems. Public Health Rep. 2011;126(4):540-551.

Brief Introduction: This article reported an evaluation of four surveillance systems of traumatic firefighter fatalities, on their usability in characterizing traumatic firefighter fatalities and their potential for informing prevention strategies. The authors reviewed the four systems' goals, scope, data sources and work activities, and compared case definitions, case ascertainment, variables collected and rate calculation methods across the four systems. They found that the magnitude of fatalities differed among systems, and that methods for estimating risk were disparate and limited fatality rate comparison.

Surveillance system(s) under evaluation: 1) BLS Census of Fatal Occupational Injuries (CFOI); 2) a system maintained by the US Fire Administration (USFA); 3) a system maintained by the US National Fire Protection Association (NFPA); 4) the NIOSH Firefighter Fatality Investigation and Prevention Program (FFFIPP).

Attributes being assessed and definitions: No attribute was explicitly assessed in this evaluation, except some discussion related to data completeness, coverage (representativeness), and data quality.

Evaluation methods used: No guidelines were followed in this evaluation. The authors did not explain how they collected information to describe and assess the four systems. Information on case definitions, case ascertainment, variables collected and rate calculation methods in the four systems was summarized and compared.
[1] Centers for Disease Control and Prevention (CDC). Guidelines for evaluating surveillance systems. Morb Mortal Wkly Rep. 1988;37(No. S-5).
[2] Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems: recommendations from the guidelines working
group. Morb Mortal Wkly Rep. 2001;50(No. 13):1-36.
[3] Centers for Disease Control and Prevention (CDC). Office of Strategy and Innovation. Introduction to Program Evaluation for Public Health Programs: A Self-Study
Guide. Atlanta, GA: Centers for Disease Control and Prevention; 2011.
[4] Spreeuwers D, de Boer AG, Verbeek JH, van Dijk FJ. Characteristics of national registries for occupational diseases: international development and validation of an audit
tool (ODIT). BMC Health Services Research. 2009;9:194.

Appendix 3-1 Questions/aspects to describe the system3

Infrastructure

1. What functional/quality/other legislation/standards/guidelines does the system need to meet?


• Are there quantified goals/requirements for this surveillance system?
2. Describe the structure of the organization that has legal responsibility for operation of the
surveillance system/activity/initiative.
3. How is the organization/surveillance system/activity/initiative funded?
• What is the overall budget for the surveillance system/activity/initiative?
• Budget/resources for training/supplies/equipment (e.g., computers)/services (e.g.,
telephone, hardware maintenance, internet connection)/technical platforms (e.g.,
databases)
4. Personnel requirements to operate the system
• How many staff members/FTEs are needed?
• Expertise required for the surveillance activities/initiatives (e.g., informatics)
5. How does the surveillance system/activity/initiative integrate with other systems/programs?
• What is the purpose of the integration?
6. How is the integration implemented?

Surveillance Strategy

1. Purpose and objectives of the surveillance system/activity/initiative?


• Are there quantified goals/indicators set for the system/activity/initiative?
2. What is the event under surveillance?
• How are cases defined?
• What is the population under surveillance?
3. Collect parameters to describe the public health importance of the event under surveillance (1,5,6,15,16)

3
Adapted from:
[1] Centers for Disease Control and Prevention. Updated guidelines for evaluating public health surveillance systems:
recommendations from the guidelines working group. Morb Mortal Wkly Rep. 2001;50(No. 13):1-36.
[2] Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. the Public Policy Center,
University of Nebraska; 2016.

• The number of people in the population affected by the event, stratified by age/gender/location, etc.
• Indices of frequency (e.g., total number of cases/incidence and prevalence rates/quality-adjusted life years (QALYs))
• Indices of severity (e.g., bed-disability days, hospitalization rates, disability/death
rates, case-fatality ratio)
• Costs associated with the occurrence/prevention of the health-related event
• Disparities or inequities associated with the health-related event (e.g., by
age/gender/location)
i. How does the system help to address the disparities/inequities
• How interested or concerned is the general public about the event and/or in
prevention of the event?
4. What is the period of time of data collection?
• How often are the data collected or submitted to the system?
5. How are data managed?
• How is data transferred/entered into the surveillance system?
i. Schemes manipulating the data (e.g., re-coding, data linking, creating new
variables)
• How is data checked for accuracy and edited?
• How is data stored and backed up?
• What are the applicable standards for managing/coding/transferring data?
• How are confidentiality/privacy/data security consideration addressed?
6. What strategies/procedures exist in the system to
• Detect the events under surveillance
• Ascertain true events (e.g., cases) and determine false positives
• Take actions based on event occurrence (if needed)
• Conform to applicable statutes/regulations/standards
• Track activities occurring in the system
7. How are data analyzed?
• Who is responsible for this?
• How often are data analyzed?

• Protocols/procedures on data analyses


8. How are data disseminated?
• Who is responsible for this?
• How often are data disseminated?
• What are the formats/channels to disseminate surveillance data/statistics?
i. What are possible uses of the data?
ii. Are tailored materials created (e.g., reports, papers, posters, pamphlets,
videos) suitable for audiences?
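The frequency indices in item 3 above (e.g., incidence rates) are straightforward to compute once case counts and the population at risk are known. A minimal sketch, with hypothetical numbers:

```python
# Minimal sketch of an incidence-rate computation per 100,000 workers.
# The case and employment counts below are hypothetical.

def incidence_rate(new_cases: int, workers_at_risk: float,
                   per: float = 100_000) -> float:
    """New cases per `per` workers over the surveillance period."""
    return new_cases * per / workers_at_risk

# Hypothetical: 250 new work-related injuries among 180,000 workers.
print(round(incidence_rate(250, 180_000), 1))  # 138.9
```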

Data sources

1. What data are collected for the event under surveillance?


• What are the required/optional data variables?
2. How are data collected?
• Direct data collection:
i. How are data collected (e.g., by mail, phone, electronically)
ii. For each source, is data submission optional or mandatory?
• Using secondary data sources:
i. Data sources and providers (e.g., publicly available sources or through
research/collaboration agreements)?
ii. Specific agency/contact person for each data source.
3. How are data managed before they enter into the surveillance system?
• How are data coded – coding systems/standards?
• How is data checked for accuracy and edited?
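Accuracy and completeness checks of the kind asked about above are often automated before records enter the system. A minimal sketch of a completeness screen (the required field names are hypothetical, not the variable list of any particular system):

```python
# Minimal sketch of a completeness screen on incoming records.
# The required fields below are hypothetical examples, not the
# variable list of any particular surveillance system.

REQUIRED_FIELDS = ("diagnosis", "occupation", "industry", "event_date")

def screen(record: dict) -> list:
    """Return a list of completeness problems for one incoming record."""
    return [f"missing {field}" for field in REQUIRED_FIELDS
            if not record.get(field)]

record = {"diagnosis": "contusion", "occupation": "roofer",
          "industry": "", "event_date": "2020-03-01"}
print(screen(record))  # ['missing industry']
```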

Stakeholders

1. Who provides data to the surveillance system?


2. Who uses the data provided by the surveillance system?
3. Advisory committee/board
4. What are other possible partners/collaborators in the system? e.g.

• Software developers/maintainers (e.g., of Electronic Health Record systems)


• Medical care providers
• Employers and employees
• Community members
5. For each identified stakeholder:
• Specific agencies/contact persons.
• How important is the stakeholder to the system/activity/initiative?
• Is it feasible to collect evaluation evidence from the stakeholder?

Appendix 3-2 Components common to OSH surveillance

Component | Sub-component | Definition | Mean rating
Infrastructure / Basic structure and foundations for establishing and 4.6
maintaining a surveillance system. It should be flexible and
able to change/evolve as the system grows and as needed.

Legislation & regulations Legal requirements and stipulations supporting the 3.8
establishment of the surveillance activities and mandatory
requirements that the system should follow.

Standards & guidelines Available standards and guidelines identified to guide the 4.8
development, implementation, maintenance and evaluation of
the surveillance system.
Funding resources Funding sources, funds available for developing and 4.5
implementing basic and advanced/expanded surveillance
activities, and funding continuity.
Organizational structure & Organization of the surveillance system and human resources, 4.1
human resources including the system's leadership, staff, communication and
coordination mechanism.
Material resources Logistics, material items and technological resources vital to 4.6
the surveillance system.
Surveillance / Guiding standards, methodologies and procedures for the 4.9
strategy establishment and maintenance of a surveillance system,
including surveillance objectives, events under surveillance and
case definitions, available data sources and surveillance
techniques.

Surveillance objectives Objectives and priorities relevant to the current and emerging 4.4
occupational safety and health needs should be clearly stated to
provide an overall guide to the surveillance system. Objectives
should be specific, adaptable to changing needs, and
measurable.

Events under surveillance Selection and prioritization of occupational safety and health 4.2
cases/events under surveillance in accordance with the
surveillance objectives. For each event, case definition is clear
and valid without ambiguity. Available data sources and
analysis methods are specified.

Surveillance technologies Technologies enable effective and efficient implementation of 3.7


surveillance activities. Common technologies include devices
for surveillance (e.g. sensors, computers, smartphones),
technical platforms (e.g. electronic database for health records,
online data sharing mechanism, analytical software), and
surveillance techniques (e.g. standardized terminology, data
encoding system, analytic techniques).
Surveillance protocols Detailed protocols established to regulate surveillance 4.8
activities, including the list of prioritized events under
surveillance, case definitions, applicable data sources, data
collection and processing methods, frequency of reports and
ways for data dissemination, as well as data quality control
process and periodical evaluation mechanism.

Data sources / Instead of being a single, comprehensive surveillance system, 4.7


current OSH surveillance systems make use of a variety of data
sources which are collected by different organizations with
different surveillance objectives, strengths and weaknesses.

Mandatory reports Certain diseases are required to be notified to state health 4.6
departments or Poison Control Centers by health care
providers, hospitals, emergency departments, and clinical
laboratories. The list of notifiable diseases is historically
geared toward infectious diseases but now includes some
occupational and environmental health conditions and
exposures such as lead poisoning.

Administrative data Includes hospital discharge data, emergency department data, 4.6
hospital outpatient data, workers' compensation data, OSHA
records, Coast Guard records, etc. All payer claims data and
electronic health records are also potential data sources.

Registry data Includes birth and death certificates, cancer registries, birth 4.2
defect registries, trauma registries, and burn registries, etc.
Census data Provides injury and illness data such as the Census of Fatal 4.6
Occupational Injuries (CFOI), and demographics data such as
Quarterly Census of Employment and Wages (QCEW).
Survey data Include state and national Survey of Occupational Injuries 4.2
and Illnesses (SOII) data, Behavioral Risk Factor Surveillance
System (BRFSS) data, Youth Risk Behavior Surveillance System (YRBS),
National Health Interview Survey (NHIS), National Health and
Nutrition Examination Survey (NHANES), etc.

Other data source Other potential data sources for OSH surveillance may include 3.8
news reports and Google Alerts. Social media and smart devices are
promising new data sources.
Stakeholders / Those who provide data, funding or technical support; or are 4.4
involved in the surveillance activities such as survey
participants; or impacted by surveillance activities and results;
or surveillance data users.
Federal agencies Various federal agencies may play a role, including the Bureau 4.4
of Labor Statistics (BLS), the National Center for Health
Statistics (NCHS), the Occupational Safety and Health
Administration (OSHA), the Mine Safety and Health
Administration (MSHA), the National Institute for
Occupational Safety and Health (NIOSH). Federal agencies
have responsibilities and programs pertaining to OSH
surveillance. They may serve as major funding sources for
OSH surveillance programs. They may also oversee and
coordinate OSH surveillance activities.

State agencies State agencies such as state health departments, labor 4.6
departments, workers' compensation bureaus/divisions, or
employment security departments, play a critical role in
collecting, analyzing and disseminating data from local sources
and in conducting various OSH surveillance systems in partnership
with federal agencies and other national and local
organizations.
Employers & employees Employers are asked to participate in the collection and 3.8
submission of data relevant to their company and their
industry, including injuries and illnesses that happened to their
employees. Employees themselves may choose to (or not to)
seek medical care, file a workers' compensation claim, or
report poisoning or other work-related conditions. Employers
and employees' engagement may directly impact the quality
and representativeness of data collected. Employers and
employees (e.g. unions) can also be helpful distributors of
surveillance findings.

Medical care providers Include clinics/physicians and testing laboratories that make 4.2
diagnoses and treat health conditions. They are required to
report selected diseases and health conditions to state health
agencies. Hospitals and emergency departments can provide
important medical record data.

Professionals and Provide technical support. For example, the occupational health 3.6
professional organizations advisory committee of the surveillance system provides subject
matter expertise on surveillance activities and improvement.
NGOs, businesses (e.g. Electronic Medical Record developers)
and research institutions provide education programs,
technological assistance or resources for promoting data
dissemination and use. They may also provide data to the
surveillance system.

Other stakeholders Include communities and individuals who are affected 3.9
by/benefited from the surveillance activities, organizations and
the general public who are interested in the surveillance and the
data it generates.
Data Processing / The function of the system to synthesize and analyze data at 4.7
individual and aggregated levels and produce interpretable
results. Protocols should be in place specifying data collection
methods & procedures.

Data collection The process of collecting and recording data into the system. 4.6
Different local and state health agencies, medical care
providers, insurance companies, businesses and other
organizations may be involved in data collection and curation
before the surveillance system obtains access to the data.

Case investigation In case-based surveillance systems, cases need to be 3.8


investigated for details on the injury/illness/disease,
causing/contributing factors, and other associated factors.
Data analysis Routine data analysis should be established on a daily/monthly 4.3
basis or over a longer time frame depending on the surveillance
purposes/needs. Also allow analysis "on demand" for more
data exploration, more specialized analysis, or dealing with
urgent issues.

Data interpretation Analysis results should be interpreted in the occupational 4.7


safety and health context to establish relevance.
Early Detection / Timely data processing, analysis and dissemination to guide 3.9
immediate actions to stop workplace hazards and possible
diseases clusters. Early detection may not be a critical function
in current OSH surveillance practice. However, there is the
need for more timely case detection and investigation to guide
effective intervention action.

Provide timely data In current OSH surveillance practice, there could be a lag of 3.5
months to years before data become available. New and emerging data sources
such as syndromic data system, electronic records data, social
media or other crowd sourcing data may provide hope for more
timely data.

Detect clusters & unusual Clusters or unusual events, such as multiple injuries/illnesses 3.8
events happening in the same workplace at the same time, may signal
unsafe working conditions. More timely data can help to detect
clusters and unusual events quickly.

Guide immediate actions More timely data help to guide immediate actions to prevent 3.5
more events from happening to workers. Data can be provided
to relevant organizations if the surveillance system does not
have the action capacity. For example, OSHA may use the
information to guide workplace inspections on workplace
hazards.

Ongoing / Ongoing surveillance activities with an emphasis on 4.9


Monitoring identifying burdens and trends over time and population.
Track burdens & trends Measure burdens of work-related injuries or illnesses (e.g. 4.7
frequency, rate, & severity) and monitor trends over time and
space.
Identify populations at risk Identify industries, occupations, and worksites as well as 4.7
populations, defined by sociodemographic characteristics or
work arrangements, at high risk for work-related injury, illness,
or hazardous exposures.

Detect workplace hazards Detect workplace hazards and facilitate the investigation of 4.8
diseases/events linked to occupational exposures.
Data / The function to routinely and actively disseminate data and 4.6
Dissemination information to the public to guide actions on improving
occupational safety and health.
Produce dissemination Consistently produce data and information for dissemination. 4.3
materials Customizing contents and formats relevant to targeted users
promotes the use of information for actions by stakeholders.
Dissemination timing Data dissemination on a routine and timely basis. Also consider 4.1
flexibility in the plan.
Dissemination Plan dissemination mechanisms and maximize dissemination 4.3
channels/mechanisms channels. Wide collaboration with local, state and national
government departments, institutions, industry associations and
employee unions, and other agencies helps to identify more
ways and channels for dissemination.

Supporting / Functions that facilitate implementation of the core functions /


activities
Supervision Activities of directing and watching over the surveillance work 3.7
and performance and ensuring the necessary resources and
logistics are in place.
Management The day-to-day management of the surveillance system. It 4
ensures that planned activities are implemented according to
working schedule and protocol.
Training The process to build skills and knowledge for staff and other 4.3
stakeholders involved within the surveillance system through
knowledge transfer and/or hands-on coaching.
Communication & Effective communication and coordination within and outside 3.9
coordination the surveillance system are needed to facilitate collaborations
and ensure various surveillance activities can be conducted
effectively and efficiently. Adequate communication improves
mutual understanding on surveillance objectives and
requirements. Coordination facilitates integration among
different functions and components to achieve cost-
effectiveness.
Surveillance evaluation Evaluation of the system periodically and when needed 4
throughout the surveillance system's life cycle. Important to
ensure that the planned activities are on track and the
surveillance objectives are being achieved. Findings and
recommendations resulting from the evaluation should be
disseminated and utilized to improve the system.

Publications / Data (e.g. summary statistics, case investigation, trend analysis, 4.8
etc.) and interpretable information generated routinely and on
demand need to be reported in various forms, depending on the
surveillance purposes and dissemination methods.

Technical reports Describes the process, progress, and/or results of surveillance 4.2
activities, including data collection, generation and
interpretation in a scientific manner. Technical reports may
include findings and recommendations.

Media reports A way to disseminate surveillance findings and information to 4.2


general audience. Could include multiple media sources, such
as newspapers, news reports, webpages, webinars. Social
media is promising in reaching out to more end-users.

Conference presentations A way to disseminate surveillance findings and connect with academic 4.2
researchers and professionals from various fields, who could
potentially be data users and collaborate with the surveillance
system.

Peer-reviewed publications An official way to disseminate important findings to academic 4


researchers, professionals and other interested people. Peer-
reviewed publications are important in producing impact.
Educational Materials A way to disseminate information to targeted audiences, such as 4.5
employers and employees dealing with work-related hazards,
injuries or illnesses. Materials need to be tailored for the
reading level of the targeted audience in terms of content, format
and language used.

Datasets / Individual level and aggregated data records released by the 3.8
surveillance system to the public and interested
organizations/individuals for research and other purposes.
Other outputs / Other outputs in the surveillance system may include research 3.8
and outreach, intervention/prevention guidance, strategic policy
recommendations, etc., depending on the surveillance system's
objectives and resources available.

Short & mid- Support research Provides data for hypothesis generation and supports applied 3.7
term outcomes research & epidemiological studies.
Guide intervention Data and information produced help to guide the planning, 4.7
programs implementation, and evaluation of actions & programs
intended to prevent and control work-related injuries, illnesses,
and hazardous workplace exposures.
Increase awareness and Data and information produced help to increase awareness 4.5
knowledge and knowledge of occupational safety and health among public
health and occupational health professionals, decision makers,
and working populations.

Long-term Policy changes at Surveillance data and findings inform policy-making at 4.5
outcomes national/state/local level different administrative levels, which addresses issues and gaps
in occupational safety and health field.
Safer workplace Results and findings in the surveillance system help to create a 4.9
safer workplace, where hazardous exposures are eliminated and
interventions to promote workers' health are implemented.
Reduced workplace The ultimate goal and overarching outcome of the surveillance 4.9
injuries, illnesses & system is to lead reduced work-related injuries, illnesses, and
diseases diseases.
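The mean ratings throughout this table aggregate individual expert scores on the rating scale. As a minimal illustration of the computation (the individual scores below are hypothetical, not the panel's actual responses):

```python
# Minimal sketch of how a mean rating aggregates expert scores
# on a 1-5 scale. The individual scores below are hypothetical.

def mean_rating(scores):
    """Mean of individual expert ratings, rounded to one decimal."""
    return round(sum(scores) / len(scores), 1)

print(mean_rating([5, 4, 5, 5, 4]))  # 4.6
```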

Appendix 3-3 Attributes to OSH surveillance components

Component | Attribute | Definition | Mean rating
Infrastructure Legislative support Existence of legal and regulatory requirements to support the 3.6
establishment and maintenance of the surveillance system and
its activities.
Compliance Degree to which the surveillance system complies with all 4.2
relevant legislation, regulations and policies, including ethics
and confidentiality requirements.
Sustainability The system is able to sustain itself with adequate financial, 4.8
human and material resources.
Surveillance Confidentiality Privacy and data confidentiality requirements for the 4.5
strategy collection, storage, backup, transport and retrieval of
information (especially over the internet), based on relevant
standards and guidelines.

Significance Objectives, priorities and events under surveillance in the 4


system reflect significant occupational safety and health
concerns.
Feasibility Surveillance strategies and methods are suitable and applicable 4.5
to the surveillance objectives and available resources.
Transparency Surveillance strategies, methodologies and activities are 4.4
planned and revised following a well-established, sound and
transparent approach and process.
Stakeholders Acceptability Stakeholders understand the purpose of the surveillance 4.5
system, advocate for the system, feel the system is useful, and/or
are willing to participate in surveillance activities.
Mutual understanding Relevant information is shared among stakeholders, including 4.2
surveillance objectives and requirements (such as privacy and
confidentiality requirements), the flow of surveillance data and
process, and the methodologies of information dissemination
and utilization. This helps to achieve mutual understanding and
expectations among stakeholders.

Mutual benefit The capability of the surveillance system to create 4


opportunities where collaborations can benefit both the system
and its partners/collaborators.
Activities Relevance Degree to which functions and activities in the surveillance 4.3
system are relevant to its objectives and priorities as per
occupational safety and health needs.
Adherence Degree to which the surveillance system follows guidelines, 4.1
standards, protocols in its core and supporting functions.
Interoperability Ability of different components and functions in the
surveillance system to work together in a smooth and
symbiotic manner.
Outputs Accessibility Ability of the surveillance system to make data and 4.5
information accessible to those who need it and when they
need it.
Usability Data and products produced by the surveillance system should 4.6
be tailored to meet the needs of intended users.
Outcomes Usefulness Ability of the surveillance system to directly and indirectly 4.7
contribute to the prevention and control of adverse
occupational health conditions and improve workers' safety and
health.
System related Simplicity A guiding principle in designing and implementing the 3.8
surveillance system, including the organizational structure, the
surveillance strategies, information flow from data providers to
data users, coordination of various surveillance activities.
Simplicity is conducive to effectiveness and efficiency. All
different components in the surveillance system should be
designed to avoid unnecessary complexity.

Flexibility Ability of the system to adapt to changes in operating 4.3
conditions, technologies or information with little additional
time, personnel, or funds.
Timeliness Capability of the system to complete working steps and fulfill its 4.2
objectives with appropriate speed.
Stability Ability of the surveillance system to be operational and provide 4.5
surveillance products reliably and stably (without failure).
Integration Ability of the surveillance system or components in the system 3.3
to integrate/connect with other surveillance or public health
systems to enhance interoperability, effectiveness and/or to
reduce cost.

Effectiveness Ability of a surveillance system to achieve its intended 4.5
objectives.
Cost-effectiveness Goes beyond effectiveness by bringing in a reference to the 3.6
amount of resources involved. A surveillance system is cost-effective
if it identifies data with the biggest impact on
improving worker safety and health while minimizing the cost
of collecting the data. Cost-effectiveness analysis should
consider the societal costs associated with the occurrence of
events under surveillance, i.e., the costs associated with the
health and productivity consequences of occupational
exposures, injuries, illnesses, and mortality on workers, their
families, and society, including both direct and indirect costs.

Data quality Validity Degree to which information correctly describes the 4.3
health-related event it was designed to measure.
Sensitivity The ability of the surveillance system to capture true 4.1
cases/events or outbreaks/clusters.
Specificity The proportion of individuals not having the occupational 4.1
health related event identified by the system as not having the
event.
Predictive Value Positive The proportion of reported cases/events that are actually true 4.1
(PVP) cases/events.
Representativeness The extent to which data adequately represent the population 4.6
under surveillance and relevant sub-populations by time, place,
population demographics and socio-demographics.
Consistency Data have the same meanings (e.g. case definition, diagnosis 4.5
standards) to allow for consistent interpretation. This includes
internal consistency (within a dataset) and external consistency
(across different data sources), as well as consistency over
time.
Completeness Required variables recorded in the surveillance datasets are 4.1
complete without missing data.
Accuracy Data, statistics, and information produced in the surveillance 4.7
datasets are accurate without errors.
Clarity Data, statistics, and information is coded/presented clearly 4.2
without ambiguity. For example, variables are named
appropriately. Data dictionary is clear and easy to follow.
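The sensitivity, specificity, and PVP attributes defined above are standard screening measures. As an illustrative aside (not part of the original framework), they can be computed from a 2x2 comparison of surveillance reports against a gold-standard case determination; the counts below are hypothetical.

```python
# Hypothetical counts comparing surveillance reports to a gold standard.
true_positives = 80   # true cases captured by the surveillance system
false_negatives = 20  # true cases the system missed
false_positives = 10  # reported cases that are not true cases
true_negatives = 890  # non-cases correctly not reported

# Sensitivity: share of true cases/events the system captures.
sensitivity = true_positives / (true_positives + false_negatives)

# Specificity: share of non-cases correctly identified as non-cases.
specificity = true_negatives / (true_negatives + false_positives)

# Predictive Value Positive (PVP): share of reported cases that are true.
pvp = true_positives / (true_positives + false_positives)

print(f"Sensitivity: {sensitivity:.2f}")  # 0.80
print(f"Specificity: {specificity:.2f}")  # 0.99
print(f"PVP: {pvp:.2f}")                  # 0.89
```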

Appendix 3-4 Example measures for attributes

Attribute Measure
Legislative support Are there mandatory requirements on the establishment of the system?
Are there mandatory requirements on data reporting/collection for the
events/cases under surveillance?
Compliance Is the system in compliance with all legal and regulatory requirements?
Sustainability Is funding secure for short-term (e.g. 3-5 years) and long-term (e.g. more than 5
years) future?
Are the necessary human resources (e.g. staffing) sustainable for the short-term
and long-term future?
Are the necessary material resources (e.g. technical infrastructure) sustainable for
the short-term and long-term future?
Can the system maintain and expand collaborations in the future (e.g. for data
collection, analysis and dissemination)?
Can financial, human and material resources support the system to adapt to
envisioned changes and/or to expand activities?
Confidentiality Are privacy and confidentiality requirements clearly specified in
protocols/documentation?
Are the requirements reviewed regularly to be updated as necessary?
Are well-established methodologies in place to secure confidentiality in the data
collection and transport, especially over the internet?
Is there a process to check for and document deviations and corrective actions
when privacy or confidentiality is breached?

Significance Does the system identify significant OSH issues (i.e. potentially catastrophic,
associated with big economic and/or social impacts, or concerns among the public
or industries)?
Once identified, is there a process to set objectives and prioritize events to
surveil (e.g. with an advisory board or stakeholders, through consensus, or
through grant peer review)?
How does the system adjust its objectives and priorities based on emerging OSH
issues?
Are there past examples of the system reflecting significant occupational safety
and health concerns?
Are populations at risk (defined by age, ethnicity, size, spatial distribution, etc.)
identified and tracked in the system on an ongoing basis?
Feasibility Do the system strategies correspond to objectives and priorities?
Do necessary outside resources exist (e.g. steering committee, technical
committee, government affiliation)?
Do the staff have the required knowledge, skills and experience to operate and
maintain the system? Multidisciplinary expertise may be needed for modern day
surveillance.
Are the surveillance events/cases trackable over time given the data sources
available?
Do the resources available match the strategies and processes (data collection,
data analysis, and data dissemination, etc.)?
Transparency Are important details (e.g. surveillance strategies, planned activities,
methodologies for data collection and analysis, changes) clearly documented in
system protocols/documentation on a regular basis?
Acceptability Are stakeholders aware of the existence of legal and mandatory requirements for
the surveillance activities?
Do stakeholders actively communicate with the system staff?
Participation rate/responding speed of stakeholders in a certain surveillance
activity (e.g. meeting, data request, survey).
The level of willingness of stakeholders to collaborate with the surveillance
system.
Mutual understanding Are responsibilities clearly specified for staff and collaborators in the surveillance
system?
Is there a mutual understanding of security, confidentiality and privacy process
between the system and stakeholders (e.g. data providers, data users)?
Has a trusted relationship been established between the surveillance system and
its collaborators?
Mutual benefit Are opportunities created for collaborators to benefit from the system?
Relevance Is there an up-to-date logic model/flowchart that links the system's functions and
objectives?
Is each function/activity necessary to achieve objective/meet priorities?
Are performance indicators specified in the surveillance system? Example
indicators include expected frequency of case reporting, data analysis and
dissemination, or frequency of work meetings.
Adherence Is there an auditing process to track how function protocols are followed?
Are methods established for data collection and processing to ensure high-quality
data?
Accessibility Is there an appropriate plan for dissemination to increase accessibility, including
channels and formats?
Is the system able to respond to internal and external data requests? Balancing
confidentiality requirements against the benefits of providing data to researchers
and the public may be a challenge.
Usability Are useful data interpretations provided to the audience beyond the data (e.g.
statistics)?
Are products tailored to the needs of the end users (e.g. reading level, knowledge
level, language, industry/occupation, disability needs, culture and formats of
communication)?
Can statistics be stratified by factors of interest, such as sex, age, ethnicity,
socioeconomic status, geography, etc?
Does the system seek feedback on usability by stakeholders to guide
improvement?
Usefulness Does the system provide timely data to detect outbreaks/clusters/unusual events?
Calculate the number/percentage of events detected.
Does the surveillance data allow for investigation of clusters, outbreaks, and/or
unusual events to guide timely mitigation action? Calculate the
number/percentage investigated/acted upon.
Does the system measure the burden of work-related injuries or illnesses and
monitor trends over time and space?
Can the surveillance data guide identification of worker populations at risk?
Does the system track leading indicators of occupational safety and health (e.g.
workplace hazards)?
Can the system detect new/emerging diseases and/or workplace hazards?
Has the data dissemination helped to increase the workers' and employers' safety
and health awareness?
Does the system provide data to support epidemiological studies and applied
research/practices?
Can surveillance data guide the planning, implementation, and evaluation of
interventions or programs intended to prevent and control work-related injuries,
illnesses, and diseases?
Has the surveillance data led to policy/regulation changes at national/state/local
level?
Simplicity Is the case definition of events/conditions under surveillance simple and easy to
understand?
Does the system rely on sophisticated data collection methods, multiple data
sources and/or multiple collaborators?
Does the system involve active surveillance activities?
Are system protocols easy to follow?
Is the organizational structure simple but sufficiently effective to meet the
surveillance needs?
Does the system use accepted standards for data coding and data sharing so that
merging and comparison within/outside the system is easy?
Are processes and platforms (e.g. software, database) for data reporting, entry,
coding, merging, and sharing streamlined and easy to use?
Is there any part of the system unnecessarily complicated? Is there a better way to
streamline the system?
Flexibility Can the system respond to event/case definition changes?
Can the system respond to changes and/or variations across existing data sources?
Can the system easily incorporate new data sources?
Can the system respond to new data standards (e.g. ICD-9 to ICD-10)?
Can the system respond to changing surveillance technologies (e.g. online data
sharing, machine learning techniques for data coding)?
Can the system adjust surveillance objectives and strategies as needed (e.g.
respond to new hazards/conditions)?
Is there funding/personnel/material redundancy to support flexibility?
Timeliness Time required for collaborating agencies to process data before it is transferred to
the system.
Time needed to investigate and analyze the data for mortality/morbidity, trends,
outbreak, risk factors.
Time needed for the surveillance system to release warning alerts (outbreak,
unusual events) and/or data/statistics reports.
Time needed for the system to respond to requests (e.g. data request,
regular/urgent work needs).
Overall time spent from events (exposure, seeking health information, medical
care, and/or medicine) to data dissemination and eventually to public health
actions.
Is the time spent appropriate for the objectives of the system?

Stability Can the system continue its planned activities with staff turnover or other
resource problems?
Are data sources and data collection collaborations stable enough to ensure
ongoing surveillance?
Can the system maintain its activities under changing situations such as coding
system changes, technological updating, switching technical platforms?
Integration Is the system connected with other public health systems within or outside the
same organization (in terms of data collection/processing/sharing)?
Does the system follow standardized data collection and sharing for integration
ease?
Are problems/opportunities identified with functions as they relate to each other?
Are potential issues addressed for a successful integration (e.g. to avoid
annexation)?
Are there data sharing mechanisms among different surveillance functions, which
include what information to share, to whom the information is communicated, and
the data sharing platform?
Effectiveness Is the system effective in accomplishing its stated objectives?
Are various methods used to increase effectiveness, such as shared efforts on data
collection/processing/dissemination through internal and external collaboration?
Percentage of planned tasks (e.g. obtaining source data, outbreak detection & case
investigation, data analysis, producing data reports) accomplished by due dates.
What are the costs (fixed and variable) associated with each surveillance activity
in the system?
Cost-effectiveness What are the costs (direct and indirect, individual and societal) for
injuries/illnesses/events under surveillance?
Is cost-effectiveness (financial and social) a consideration in decision-making for
objectives and priorities, and the selection of surveillance methodologies?
Is there an up-to-date cost-effectiveness analysis done for the system?
Validity Is there standard scientific evidence that the event under surveillance is
work-related, or significantly associated with occupational factors?
Validity of the survey constructs.
Are the data of sufficient quality to generate a valid estimate?
Are biases introduced in the data collection process (e.g. case reporting, data
coding, method to identify work-relatedness) that affect the validity of measurements?
Sensitivity The likelihood/percentage that workers will seek medical care or compensation
and be reported to/captured by the surveillance system.
Sensitivity of the screening/diagnosis procedure.
The proportion/number of cases/events captured by the surveillance system
compared to certain standard OSH data source(s).
Is there any active approach used to help detect true cases/events?
Can the system accurately detect clusters, outbreaks or unusual events in an
appropriate time frame?
Specificity Specificity of the screening/diagnosis procedure.
The proportion/number of non-cases/events correctly determined by the
surveillance methodology/algorithm (e.g. by comparing two methodologies
identifying work-related cases).
Predictive Value Positive The proportion/number of cases/events captured by the surveillance system that
(PVP) are confirmed as true cases/events.
Is there an active approach used to verify cases/events?
Representativeness Does the system reflect the characteristics of working populations under
surveillance?
Does the system reflect the change of population characteristics over time?
Are there sub-populations of interest excluded or under-reported?
Consistency Are data collected by different entities within the system following consistent and
rigorous formats?
Do data collected from different sources share a common methodology (e.g. case
definition, coding systems, and classifications)?
Are data coded the same way over time to track trends?
Completeness Proportion of missing or blank data?
What is the reason for missing data?
What is the impact of missing data?
Accuracy Are proper quality control measures in place to identify and correct errors?
Is there data corruption due to incorrect data merging or conversion?
Clarity Are variables coded clearly and consistently?
Does the surveillance system keep a clear and easy-to-follow, up-to-date data
dictionary?
Are data provided by the system in formats easy to understand and provided with
accurate data dictionaries?
Are educational materials and other materials for dissemination formatted clearly
and precisely?

Appendix 3-5 Template working tables for developing evaluation methods on each selected attribute and its measures

Table S1 demonstrates how methods are developed for each attribute by selecting measures and developing evidence

collection methods, data analysis and evaluation criteria (if applicable) for each measure. Table S2 further sorts evaluation methods by

different data collection methods.

Table S1. Evaluation methods by attributes and measures


Definition/Operational
Attributes Evaluation Measures Evidence collection methods Data analysis Evaluation criteria Remark
definition
Simplicity A guiding principle in designing 1) Number of data sources · Review surveillance · Describe for each · May set a cut-off
and implementing the surveillance needed for each event guide/protocol event number for
system, including the · Quantitative numbers simple/complex
organizational structure, the 2) Ease of obtaining and · Review surveillance · Describe for each
surveillance strategies, processing data; guide/protocol; event
information flow from data · Interview key surveillance
providers to data users, staff for information on data
coordination of various processing;
surveillance activities. Simplicity 3) Ease of event/case · Review surveillance · Describe for each · May set a cut-off score
is conducive to the effectiveness ascertainment; guide/protocol; event; for simple/complex
and efficiency. All different · Interview key surveillance · Summary statistics of
components in the surveillance staff for details on case staff ratings
system should be designed to ascertainment;
avoid unnecessary complexity. · Key surveillance staff rate the
ease of case ascertainment.
4) Number of organizations · Interview key surveillance · Describe for the
requiring data reports. staff to learn how reports are system
disseminated

Flexibility Ability of the system to adapt to 1) Whether it accommodates · Check previous changes in · Describe possible
changes in operating conditions, updates and changes in data surveillance activities; changes;
technologies or information with sources and coding · Interview key surveillance · Describe ways the
little additional time, personnel, or standards? staff for experience/process on system works with
funds. adopting changes and possible possible changes
problems with it;
· Interview main data
providers for information on
any changes in the past years
and future;

2) Whether it · Interview supporting · Describe potential
accommodates other leaders for their insights into changes proposed by
changes (e.g., emerging possible changes in the future stakeholders;
OSH issues/surveillance & how the system could · Summarize inputs to
technologies/funding prepare for them; describe past examples
sources/legislations)? · Focus group among program and future preparations
management and key
personnel for their insights
into possible changes in the
future & how the system could
prepare for them
· Survey advisory committee
for their insights into future
changes & preparation

Other attributes




Table S2. Evaluation methods by data collection methods


Evidence
collection Participant/Subject Entity represented Measurement Evaluation question/item to collect Question type Remark
method
Interview Key surveillance staff Name of the surveillance Simplicity - 2) Ease of · Please describe how you collect the · Open and
system/program obtaining and processing data? Closed-ended
data; · Do you think the process is easy? questions
· Is there any difficulty in the
process?

Simplicity - 3) Ease of · Could you rate the ease of case · Likert-scale
event/case ascertainment; ascertainment/definition for each rating
event/case definition using 5-point · Open-ended
scale? questions
· Are there some case definitions
difficult or tricky to ascertain?

Simplicity - 4) Number of · How many organizations require · Open
organizations requiring data reports? questions
reports. · Could you explain the process
working with them?

Flexibility - 1) Whether it · Could you describe some previous · Open
accommodates updates and changes in the surveillance? questions
changes in data sources and · Are there any problems adapting
coding standards? these changes?
· How did you address these
problems?

Data provider (source #1) Position title/Name of Flexibility - 1) Whether it · Do you expect any changes related · Open
the agency accommodates updates and to the surveillance system's access to questions
changes in data sources and and use of your data?
coding standards? · How should the system be prepared
for the changes?

Supporting leader Position title/Name of Flexibility - 2) Whether it · Do you expect any changes related · Open
the system accommodates other to ~ in the surveillance system (e.g., questions
changes (e.g. emerging funding, or other aspects related to
OSH issues/surveillance the responsibility of the supporting
technologies/funding leader)?
sources/legislations)? · How should the system be prepared
for the changes?

Other data
collection
method




Appendix 3-6 Template working table for data collection plan

Evidence Responsible
Participants and Evaluation Planned time Actual time
collection person and Remark
contact protocol and location and location
method contact
Document /
review
Interview Surveillance key Protocol name
staff name
Email:
Data provider name Protocol name
(source #1)
Email:
Data provider name Protocol name
(source #2)
Email:
Supporting leader Protocol name
name
Email:




Appendix 3-7 Sample interview guides 4

1. Interview guide for higher level leaders

1. How do you see the importance and significance of the surveillance system?

2. How does the surveillance system fit into the organization’s objectives and goals?

3. What is the long-term strategic plan in your organization to further support the system?

• What are the challenges or barriers for the organization to support the system?

4. What are your suggestions to further improve the surveillance system?

2. Interview guide for key OSH surveillance staff

1. Please describe the working process/steps, including data request, and the approximate

time spent for each step in the surveillance process.

2. For each event/case definition, please rate how simple it is to identify cases on a 5-point

scale, with 1 indicating not simple and 5 the simplest (show a table list).

• For those you rated 1 or 2, please explain the reasons.

3. How many organizations request work reports and the surveillance data on a regular

basis?

4. Have there been changes regarding surveillance methods and guides?

• Any problems on adapting these changes?

5. Is there formal training on the knowledge and skills needed for OSH surveillance?

6. How quickly do the data providers and other collaborators respond to your work

requests?

4
Adapted from:
[1] Yang L., Weston C., Cude C., Kincl L. Evaluating Oregon’s Occupational Public Health Surveillance System Based on CDC
Updated Guidelines. Am J Ind Med. In press.

7. What do you think of the overall data quality in the system?

8. Do you have any concerns on the data completeness and validity in this system?

9. Are you aware of any study done on validity, sensitivity and/or PVP for source data

and/or data in this system?

10. Do you think any sub-populations are excluded/under-reported in the surveillance?

11. Are there any active surveillance approaches in place to increase the sensitivity and/or

correct errors?

12. Have you experienced any failure in the work infrastructure that prevented you from

completing the surveillance work in time?

13. Are there any potential changes/challenges with data availability and the surveillance

infrastructure that could be a threat to the stability of the surveillance system?

3. Interview guide for data providers

1. Please briefly describe how you process your data from obtaining the raw data to

releasing the data.

2. Please describe the data quality control process in place to monitor errors and missing

values in your system.

3. Do you have protocols on the data quality control process?

4. Do you have formal training on data processing and data quality control?

5. Please describe the process dealing with the OSH surveillance system’s data request, and

the approximate time it takes.

6. What do you think of the overall data quality in your system?



7. Do you have any concerns on the completeness, validity, sensitivity, or PVP of the data

in your system?

8. Do you have completeness rate data for the key variables that the OSH surveillance

system needs?

• If yes, could you show me the statistics?

9. Are you aware of any study done on validity, sensitivity, and/or PVP for data in your

system?

10. Do you think any sub-populations are excluded/under-reported in your data?

11. Are there any active surveillance approaches in place to increase the sensitivity and/or

correct errors?

12. Do you expect major technical changes in the next 5 years for your data system?

• If yes, please briefly introduce them.

13. Do you expect major other changes in the next 5 years in your system, which may impact

the collaboration between you and the OSH surveillance system?

• If yes, please briefly introduce them.

14. Do you have any suggestions for us to better collaborate with your agency?

Appendix 3-8 A sketch of evaluation report

Table of Contents

List of Tables

List of Figures

List of Acronyms

Executive summary

1. Description of the OSH surveillance system

1.1 Purpose and activities of the surveillance system

1.2 Events under surveillance and public health importance

1.3 Infrastructure and resources for the system

2. Evaluation design

2.1 Purpose and objectives of the evaluation

2.2 Scope of work

2.2.1 Components under evaluation

2.2.2 Attributes under evaluation

2.3 Evaluation approach

2.3.1 Evaluation measures and data collection methods

2.3.2 Evaluation steps and timelines

3. Evaluation results

4. Recommendations for improving the system

5. Conclusions

6. Corrective actions and timelines (Optional)



Appendix 3-9 Further references

1. Centers for Disease Control and Prevention. Updated guidelines for evaluating public health
surveillance systems: recommendations from the guidelines working group. Morb Mortal Wkly
Rep. 2001;50(No. 13):1-36.

Introduction: These updated CDC guidelines are the most widely known guidance for all kinds of
public health surveillance systems. They introduce evaluation steps and details on collecting
evidence to evaluate the ten most commonly used attributes, which are a good reference for OSH
surveillance evaluation.

2. Centers for Disease Control and Prevention (CDC), Office of Strategy and Innovation.
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Atlanta,
GA: Centers for Disease Control and Prevention; 2011.

Introduction: This document includes an introduction to developing and conducting health
program evaluation, which may serve as a reference in OSH surveillance evaluation.

3. Hoffman S, Dekraai M. Brief Evaluation Protocol for Public Health Surveillance Systems. the
Public Policy Center, University of Nebraska; 2016.

Introduction: This brief protocol converts the CDC updated guidelines into more specific guides
with practical tools for designing and implementing a brief evaluation of public health
surveillance systems.

4. Harris MJ. Evaluating Public and Community Health Programs. John Wiley & Sons,
Incorporated; 2016.

Introduction: This book includes a thorough introduction to many aspects of health program
evaluation, including collecting and analyzing quantitative and qualitative data. It may serve as a
guide book for basic theory and practice in evaluation.

5. Bernard HR, Wutich A, Ryan GW. Analyzing Qualitative Data: Systematic Approaches.
SAGE publications; 2016.

Introduction: This book presents systematic methods for analyzing qualitative data with clear
and easy-to-understand steps. It can be used as a guide book for qualitative data analysis.

Appendix 4-1 NORA industry sectors ranked by Prevention Index (PI) for male and female young workers

Table S1. NORA industry sectors ranked by Prevention Index (PI) for male young workers, Oregon workers' compensation accepted disabling and non-disabling claims,
2013-2018
Claim Cost Claim Cost
Claim Rate Rate Basic Expanded Claim Rate Rate Basic Expanded
NORA Sector Description Count /10,000 /100 PI PI Count /10,000 /100 PI PI
Workers Workers Workers Workers
Disabling Claims Non-disabling Claims
Agriculture, Forestry & Fishing (except Wildland 887 375.8 26357.7 1 1 1651 782.5 7908.8 3 4
Firefighting and including seafood processing)
Construction 1050 262.3 16495.2 2 3 2478 801.2 6849.2 2 2
Healthcare & Social Assistance (including 397 125.1 4162.2 9 9 859 445.0 3561.3 7 5
Veterinary Medicine/Animal Care)
Manufacturing (except seafood processing) 1219 223.5 12082.0 4 5 2503 853.6 7312.3 1 1
Mining 14 236.5 41408.4 8 4 41 942.5 11198.1 4 3
Public Safety (including Wildland Firefighting) 92 245.0 15070.7 7 6 85 396.5 3166.3 9 9
Services (except Public Safety and Veterinary 2493 94.0 3955.0 6 8 5059 366.9 2771.4 6 8
Medicine/Animal Care)
Transportation, Warehousing & Utilities 645 314.2 17184.1 3 2 279 406.3 3800.9 8 7
Wholesale and Retail Trade 1618 115.5 4865.1 5 7 2311 442.2 3541.0 5 6
NORA: National Occupational Research Agenda
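The Basic PI columns above combine each sector's rank by claim count with its rank by claim rate. A minimal sketch, assuming the common Prevention Index construction in which the basic PI is the average of the two ranks (the sector labels and figures below are hypothetical, not the Oregon data):

```python
# Hypothetical claim counts and claim rates (per 10,000 workers) by sector.
counts = {"Sector A": 500, "Sector B": 300, "Sector C": 200, "Sector D": 100}
rates = {"Sector A": 20.0, "Sector B": 60.0, "Sector C": 40.0, "Sector D": 80.0}

def rank_desc(values):
    """Map each key to its 1-based rank, largest value ranked first."""
    ordered = sorted(values, key=values.get, reverse=True)
    return {k: i + 1 for i, k in enumerate(ordered)}

count_rank = rank_desc(counts)
rate_rank = rank_desc(rates)

for sector in counts:
    # Basic PI: average of rank by count and rank by rate.
    pi = (count_rank[sector] + rate_rank[sector]) / 2
    print(f"{sector}: count rank {count_rank[sector]}, "
          f"rate rank {rate_rank[sector]}, basic PI {pi}")
# The lowest PI marks the highest-priority sector (here Sector B, PI 2.0).
```

Averaging the two ranks balances sectors with many claims (large workforces) against sectors with high claim rates (high per-worker risk); the expanded PI in the tables additionally incorporates cost-based ranks.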

Table S2. NORA industry sectors ranked by Prevention Index (PI) for female young workers, Oregon workers' compensation accepted disabling and non-disabling claims,
2013-2018
Claim Cost Claim Cost
Claim Rate Rate Basic Expanded Claim Rate Rate Basic Expanded
NORA Sector Description Count /10,000 /100 PI PI Count /10,000 /100 PI PI
Workers Workers Workers Workers
Disabling Claims Non-disabling Claims
Agriculture, Forestry & Fishing (except Wildland 130 141.4 7094.7 6 4 276 336.7 4392.5 4 3
Firefighting and including seafood processing)
Construction 36 57.5 2992.2 8 8 87 179.8 2370.8 7 5
Healthcare & Social Assistance (including 1590 142.3 5396.1 1 3 3689 542.7 5017.4 1 1
Veterinary Medicine/Animal Care)
Manufacturing (except seafood processing) 316 146.9 7470.5 3 2 426 368.4 3725.1 2 2
Mining / /
Public Safety (including Wildland Firefighting) 34 109.1 6721.1 7 6 25 139.5 1185.7 8 8
Services (except Public Safety and Veterinary 1719 53.4 2270.1 5 7 3510 208.7 1683.8 3 4
Medicine/Animal Care)
Transportation, Warehousing & Utilities 161 207.6 10002.9 2 1 53 206.4 1749.8 6 6
Wholesale and Retail Trade 930 68.5 2679.5 4 5 816 166.5 1430.4 5 7
NORA: National Occupational Research Agenda

Appendix 4-2 Industry groups (NAICS 4-digit) ranked by Prevention Index (PI)

Table S3. Top 5 Industry groups ranked by Prevention Index (PI) for young workers, Oregon workers' compensation accepted
disabling and non-disabling claims, 2013-2018
Claim
Cost Rate
4-digit Industry group (4-digit NAICS Claim Rate Basic Expanded
NORA sector /100
NAICS Description) Count /10,000 PI PI
Workers
Workers
Disabling Claims
Agriculture 1133 Logging 217 788.8 79941.6 1 1
Agriculture 1153 Support Activities for Forestry 191 481.1 33162.7 2 2
Construction 2381 Foundation, Structure, and Building 273 429.2 31059.1 3 3
Exterior Contractors
Manufacturing 3219 Other Wood Product Manufacturing 179 433.7 21519.0 4 4
Wholesale & 4244 Grocery and Related Product Merchant 198 329.9 19507.7 5 6
Retail Wholesalers

Non-disabling Claims
Healthcare 6239 Other Residential Care Facilities 643 8712.7 71249.1 1 1
Healthcare 6231 Nursing Care Facilities (Skilled 1377 2356.7 26040.7 2 2
Nursing Facilities)
Construction 2383 Building Finishing Contractors 778 1373.6 11955.9 3 5
Manufacturing 3331 Agriculture, Construction, and Mining 178 5562.5 49842.6 4 3
Machinery Manufacturing
Manufacturing 3211 Sawmills and Wood Preservation 322 1816.1 18314.5 5 4
NORA: National Occupational Research Agenda.

Table S4. Top 5 industry groups ranked by Prevention Index (PI) for male young workers, Oregon workers' compensation accepted disabling and non-disabling claims, 2013-2018

NORA sector | NAICS | Industry group (4-digit NAICS description) | Claim Count | Claim Rate /10,000 Workers | Cost Rate /100 Workers | Basic PI | Expanded PI

Disabling Claims
Agriculture | 1133 | Logging | 216 | 835.9 | 85001.3 | 1 | 1
Construction | 2381 | Foundation, Structure, and Building Exterior Contractors | 268 | 469.8 | 34186.6 | 2 | 2
Agriculture | 1153 | Support Activities for Forestry | 184 | 525.7 | 34440.0 | 3 | 3
Manufacturing | 3219 | Other Wood Product Manufacturing | 159 | 457.8 | 22283.9 | 4 | 5
Transportation | 4931 | Warehousing and Storage | 156 | 424.0 | 30393.0 | 5 | 4

Non-disabling Claims
Healthcare | 6239 | Other Residential Care Facilities | 179 | 6884.6 | 53307.1 | 1 | 1
Manufacturing | 3331 | Agriculture, Construction, and Mining Machinery Manufacturing | 159 | 5845.6 | 46271.8 | 2 | 2
Construction | 2383 | Building Finishing Contractors | 739 | 1512.8 | 12388.6 | 3 | 5
Manufacturing | 3211 | Sawmills and Wood Preservation | 291 | 1796.3 | 18456.4 | 4 | 3
Manufacturing | 3219 | Other Wood Product Manufacturing | 297 | 1590.8 | 12939.6 | 5 | 7
NORA: National Occupational Research Agenda.

Table S5. Top 5 industry groups ranked by Prevention Index (PI) for female young workers, Oregon workers' compensation accepted disabling and non-disabling claims, 2013-2018

NORA sector | NAICS | Industry group (4-digit NAICS description) | Claim Count | Claim Rate /10,000 Workers | Cost Rate /100 Workers | Basic PI | Expanded PI

Disabling Claims
Healthcare | 6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 252 | 334.3 | 11642.9 | 1 | 1
Healthcare | 6221 | General Medical and Surgical Hospitals | 293 | 268.6 | 9774.7 | 2 | 3
Healthcare | 6232 | Residential Intellectual and Developmental Disability, Mental Health, and Substance Abuse Facilities | 201 | 293.6 | 11255.1 | 3 | 3
Transportation | 4921 | Couriers and Express Delivery Services | 56 | 525.3 | 26491.1 | 4 | 2
Healthcare | 6233 | Continuing Care Retirement Communities and Assisted Living Facilities for the Elderly | 415 | 179.5 | 6889.8 | 5 | 10

Non-disabling Claims
Healthcare | 6239 | Other Residential Care Facilities | 443 | 9229.2 | 74996.9 | 1 | 1
Healthcare | 6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 1177 | 2509.6 | 28178.2 | 2 | 2
Healthcare | 5419 | Other Professional, Scientific, and Technical Services | 292 | 1334.6 | 8217.7 | 3 | 3
Healthcare | 8129 | Other Personal Services | 130 | 797.6 | 5419.0 | 4 | 8
Healthcare | 6212 | Offices of Dentists | 244 | 638.4 | 4400.7 | 4 | 11
NORA: National Occupational Research Agenda.

Table S6. Top 5 industry groups under each major NORA industry sector ranked by Prevention Index for young workers, Oregon workers' compensation accepted disabling claims, 2013-2018

NAICS | Industry group (4-digit NAICS description) | Claim Count | Claim Rate /10,000 Workers | Cost Rate /100 Workers | Basic PI | Expanded PI

Agriculture, Forestry & Fishing (except Wildland Firefighting and including Seafood Processing):
1133 | Logging | 217 | 788.8 | 79941.6 | 1 | 1
1153 | Support Activities for Forestry | 191 | 481.11 | 33162.7 | 2 | 2
1119 | Other Crop Farming | 123 | 301.32 | 24426.0 | 3 | 3
1151 | Support Activities for Crop Production | 156 | 284.46 | 14363.3 | 3 | 4
1121 | Cattle Ranching and Farming | 65 | 319.88 | 16977.2 | 5 | 5

Construction:
2381 | Foundation, Structure, and Building Exterior Contractors | 273 | 429.18 | 31059.1 | 1 | 1
2361 | Residential Building Construction | 197 | 257.89 | 15746.2 | 2 | 2
2383 | Building Finishing Contractors | 169 | 230.5 | 15333.2 | 3 | 3
2389 | Other Specialty Trade Contractors | 91 | 209.05 | 11439.7 | 4 | 5
2382 | Building Equipment Contractors | 184 | 164.29 | 8028.9 | 4 | 6

Healthcare & Social Assistance (including Veterinary Medicine/Animal Care):
6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 275 | 292.8 | 10004.2 | 1 | 1
6221 | General Medical and Surgical Hospitals | 383 | 278.59 | 9819.1 | 1 | 2
6232 | Residential Intellectual and Developmental Disability, Mental Health, and Substance Abuse Facilities | 262 | 261.35 | 9727.5 | 3 | 4
6233 | Continuing Care Retirement Communities and Assisted Living Facilities for the Elderly | 476 | 157.3 | 6005.1 | 4 | 6
6222 | Psychiatric and Substance Abuse Hospitals | 49 | 1195.12 | 23106.6 | 5 | 3

Manufacturing (except Seafood Processing):
3219 | Other Wood Product Manufacturing | 179 | 433.73 | 21519.0 | 1 | 1
3362 | Motor Vehicle Body and Trailer Manufacturing | 106 | 436.93 | 17362.8 | 2 | 2
3211 | Sawmills and Wood Preservation | 104 | 315.34 | 20099.4 | 3 | 3
3114 | Fruit and Vegetable Preserving and Specialty Food Manufacturing | 157 | 258.18 | 10731.6 | 4 | 9
3371 | Household and Institutional Furniture and Kitchen Cabinet Manufacturing | 58 | 270.02 | 16277.9 | 5 | 7

Services (except Public Safety and Veterinary Medicine/Animal Care):
5613 | Employment Services | 740 | 201.19 | 9533.1 | 1 | 2
5617 | Services to Buildings and Dwellings | 333 | 222.64 | 10931.3 | 2 | 1
8111 | Automotive Repair and Maintenance | 138 | 145.68 | 7108.3 | 3 | 5
9241 | Administration of Environmental Quality Programs | 23 | 297.54 | 24270.1 | 4 | 3
9231 | Administration of Human Resource Programs | 23 | 165.83 | 8134.6 | 5 | 7

Transportation, Warehousing & Utilities:
4811 | Scheduled Air Transportation | 85 | 840.75 | 29126.3 | 1 | 1
4841 | General Freight Trucking | 115 | 359.38 | 23004.2 | 2 | 2
4931 | Warehousing and Storage | 185 | 301.06 | 20401.2 | 2 | 3
4842 | Specialized Freight Trucking | 89 | 387.46 | 15391.3 | 2 | 4
4921 | Couriers and Express Delivery Services | 190 | 280.48 | 15230.0 | 5 | 5

Wholesale and Retail Trade:
4244 | Grocery and Related Product Merchant Wholesalers | 198 | 329.89 | 19507.7 | 1 | 1
4413 | Automotive Parts, Accessories, and Tire Stores | 154 | 198.84 | 10316.2 | 2 | 2
4441 | Building Material and Supplies Dealers | 158 | 152.82 | 6479.3 | 3 | 7
4421 | Furniture Stores | 43 | 209.96 | 8580.4 | 4 | 4
4451 | Grocery Stores | 526 | 118.7 | 3876.1 | 4 | 10
NORA: National Occupational Research Agenda.

Table S7. Top 5 industry groups under each major NORA industry sector ranked by Prevention Index for young workers, Oregon workers' compensation accepted non-disabling claims, 2013-2018

NAICS | Industry group (4-digit NAICS description) | Claim Count | Claim Rate /10,000 Workers | Cost Rate /100 Workers | Basic PI | Expanded PI

Agriculture, Forestry & Fishing (except Wildland Firefighting and including Seafood Processing):
1133 | Logging | 312 | 1259.08 | 16702.5 | 1 | 1
1153 | Support Activities for Forestry | 415 | 1160.19 | 13135.7 | 2 | 2
1119 | Other Crop Farming | 257 | 698.75 | 6815.7 | 3 | 4
1131 | Timber Tract Operations | 41 | 2645.16 | 23922.5 | 4 | 3
1129 | Other Animal Production | 35 | 1174.5 | 13167.1 | 5 | 5

Construction:
2383 | Building Finishing Contractors | 778 | 1373.59 | 11955.9 | 1 | 1
2381 | Foundation, Structure, and Building Exterior Contractors | 349 | 710.22 | 5645.7 | 2 | 3
2382 | Building Equipment Contractors | 546 | 631.07 | 4927.8 | 2 | 5
2389 | Other Specialty Trade Contractors | 263 | 782.04 | 5920.4 | 4 | 3
2379 | Other Heavy and Civil Engineering Construction | 74 | 1541.67 | 25863.0 | 5 | 2

Healthcare & Social Assistance (including Veterinary Medicine/Animal Care):
6239 | Other Residential Care Facilities | 643 | 8712.74 | 71249.1 | 1 | 1
6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 1377 | 2356.67 | 26040.7 | 2 | 2
5419 | Other Professional, Scientific, and Technical Services | 335 | 1182.07 | 6542.0 | 3 | 3
6221 | General Medical and Surgical Hospitals | 451 | 527.3 | 4172.1 | 4 | 7
6222 | Psychiatric and Substance Abuse Hospitals | 100 | 3921.57 | 27482.1 | 5 | 4

Manufacturing (except Seafood Processing):
3331 | Agriculture, Construction, and Mining Machinery Manufacturing | 178 | 5562.5 | 49842.6 | 1 | 1
3211 | Sawmills and Wood Preservation | 322 | 1816.13 | 18314.5 | 2 | 2
3219 | Other Wood Product Manufacturing | 327 | 1474.3 | 12078.0 | 3 | 3
3323 | Architectural and Structural Metals Manufacturing | 160 | 1515.15 | 12244.3 | 4 | 4
3339 | Other General Purpose Machinery Manufacturing | 125 | 2055.92 | 11970.9 | 4 | 6

Services (except Public Safety and Veterinary Medicine/Animal Care):
9241 | Administration of Environmental Quality Programs | 144 | 3348.84 | 21636.7 | 1 | 1
6113 | Colleges, Universities, and Professional Schools | 565 | 714.74 | 5114.9 | 2 | 7
5413 | Architectural, Engineering, and Related Services | 146 | 989.16 | 8652.7 | 3 | 4
5617 | Services to Buildings and Dwellings | 477 | 634.65 | 5775.9 | 4 | 6
7212 | RV (Recreational Vehicle) Parks and Recreational Camps | 106 | 1606.06 | 16850.3 | 5 | 3

Transportation, Warehousing & Utilities:
4859 | Other Transit and Ground Passenger Transportation | 52 | 4727.27 | 40569.0 | 1 | 1
4841 | General Freight Trucking | 177 | 1614.96 | 17650.8 | 2 | 2
2211 | Electric Power Generation, Transmission and Distribution | 19 | 2878.79 | 19363.0 | 3 | 2
4911 | Postal Service | 12 | 4000 | 40691.7 | 4 | 2
4921 | Couriers and Express Delivery Services | 19 | 81.9 | 587.4 | 5 | 5

Wholesale and Retail Trade:
4233 | Lumber and Other Construction Materials Merchant Wholesalers | 121 | 2180.18 | 26269.0 | 1 | 1
4441 | Building Material and Supplies Dealers | 343 | 943.6 | 8127.8 | 2 | 6
4411 | Automobile Dealers | 349 | 938.68 | 7019.3 | 2 | 7
4235 | Metal and Mineral (except Petroleum) Merchant Wholesalers | 85 | 2656.25 | 28355.7 | 4 | 2
4239 | Miscellaneous Durable Goods Merchant Wholesalers | 105 | 1539.59 | 10735.5 | 5 | 4
NORA: National Occupational Research Agenda.

Table S8. Top 3 industry groups ranked by Prevention Index (PI) for four major injury types among young workers, Oregon workers' compensation accepted disabling and non-disabling claims, 2013-2018

NORA sector | NAICS | Industry group (4-digit NAICS description) | Claim Count | Claim Rate /10,000 Workers | Cost Rate /100 Workers | Basic PI | Expanded PI

Disabling Claims

Work-related Musculoskeletal Disorders:
Healthcare | 6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 197 | 209.8 | 6984.4 | 1 | 1
Healthcare | 6221 | General Medical and Surgical Hospitals | 239 | 173.8 | 6136.2 | 2 | 2
Transportation | 4931 | Warehousing and Storage | 94 | 153.0 | 7553.1 | 3 | 3

Struck by/against:
Agriculture | 1133 | Logging | 90 | 327.2 | 45906.5 | 1 | 1
Construction | 2381 | Foundation, Structure, and Building Exterior Contractors | 91 | 143.1 | 8365.9 | 2 | 3
Agriculture | 1153 | Support Activities for Forestry | 61 | 153.7 | 13077.3 | 3 | 2

Fall on same level:
Agriculture | 1133 | Logging | 42 | 162.5 | 7491.4 | 1 | 1
Agriculture | 1153 | Support Activities for Forestry | 41 | 117.1 | 7271.5 | 2 | 2
Construction | 2381 | Foundation, Structure, and Building Exterior Contractors | 22 | 38.6 | 1827.5 | 3 | 3

Violence:
Healthcare | 6232 | Residential Intellectual and Developmental Disability, Mental Health, and Substance Abuse Facilities | 145 | 144.6 | 4932.8 | 1 | 2
Healthcare | 6222 | Psychiatric and Substance Abuse Hospitals | 33 | 804.9 | 15067.5 | 2 | 1
Healthcare | 8129 | Other Personal Services | 30 | 73.8 | 3238.8 | 3 | 3

Non-disabling Claims

Work-related Musculoskeletal Disorders:
Healthcare | 6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 516 | 883.1 | 13915.8 | 1 | 1
Healthcare | 6239 | Other Residential Care Facilities | 92 | 1246.6 | 16438.9 | 2 | 1
Manufacturing | 3211 | Sawmills and Wood Preservation | 50 | 282.0 | 4591.6 | 3 | 3

Struck by/against:
Construction | 2383 | Building Finishing Contractors | 459 | 939.6 | 7174.0 | 1 | 2
Manufacturing | 3331 | Agriculture, Construction, and Mining Machinery Manufacturing | 80 | 2941.2 | 22160.9 | 2 | 1
Manufacturing | 3219 | Other Wood Product Manufacturing | 178 | 953.4 | 6605.6 | 3 | 4

Fall on same level:
Healthcare | 6239 | Other Residential Care Facilities | 42 | 569.1 | 4449.7 | 1 | 1
Healthcare | 6231 | Nursing Care Facilities (Skilled Nursing Facilities) | 103 | 176.3 | 2092.0 | 2 | 2
Agriculture | 1133 | Logging | 41 | 165.5 | 1339.4 | 3 | 3

Violence:
Healthcare | 6239 | Other Residential Care Facilities | 237 | 3211.4 | 29056.6 | 1 | 1
Healthcare | 5419 | Other Professional, Scientific, and Technical Services | 275 | 970.4 | 4288.3 | 2 | 2
Healthcare | 6222 | Psychiatric and Substance Abuse Hospitals | 54 | 2117.7 | 15440.9 | 3 | 3
NORA: National Occupational Research Agenda.
