CAEP Rubric Assessment Response Form and Content Validity Protocol

This document provides instructions for establishing the content validity of an assessment using the Lawshe method. It involves having a panel of subject matter experts review assessment items and rate them on how well they measure predefined constructs, representativeness, importance, and clarity. The experts also provide feedback on additional items that could be included or deleted. Their ratings and feedback are analyzed to determine how effectively each item measures the intended constructs and domains. The overall goal is to gather evidence that the assessment adequately represents and measures the target knowledge or skills.


Establishing Content Validity – Rubric/Assessment Response Form

Name of Reviewer: ________________________ Position: _______________________________

INSTRUCTIONS: This measure is designed to evaluate the content validity of (insert title of assessment). Please rate each item as
follows:

 Please rate the representativeness of each item in measuring the aligned overarching construct on a scale of 1-4, with 4 being the most
representative. Space is provided for you to comment on the item or suggest revisions.
 Please rate the importance of each item in measuring the aligned overarching construct on a scale of 1-4, with 4 being the most essential. Space
is provided for you to comment on the item or suggest revisions.
 Please rate the clarity of each item on a scale of 1-4, with 4 being the clearest. Space is provided for you to comment on the item or
suggest revisions.

Response form layout (this header row does not change; it is the same for all rubrics):

- Overarching construct (i.e., the "big idea" to measure)
- Operational definition
- Item measuring the overarching construct (uses the exact wording that appears on the assessment rubric)
- Representativeness of item in measuring the overarching construct:
   1 = item is not representative
   2 = item needs major revisions to be representative
   3 = item needs minor revisions to be representative
   4 = item is representative
- Importance of item in measuring the overarching construct:
   1 = item is not necessary to measure the construct
   2 = item provides some information but is not essential to measure the construct
   3 = item is useful but not essential to measure the construct
   4 = item is essential to measure the construct
- Clarity of item:
   1 = item is not clear
   2 = item needs major revisions to be clear
   3 = item needs minor revisions to be clear
   4 = item is clear
- Comments
Construct 1: (fill in the blank) – the construct "Content Knowledge" is used for this example. (These rows will change depending on your program rubric; they will be different for each rubric used.)

Overarching construct: Content Knowledge (example)
Operational definition: Knowledge about the actual subject matter that is to be learned or taught (example)

Items measuring the construct (each rated 1 2 3 4 for representativeness, importance, and clarity):
- K2a: Demonstrates knowledge of content (example)
- K2b: Implements interdisciplinary approaches and multiple perspectives for teaching content
- K2c: Demonstrates awareness of literacy instruction across all content areas
- K2d: Makes content relevant for all learners
These three (3) open-ended response rows are inserted after each group of items aligned with an identified overarching construct:

1. To the reviewer: What additional items would you recommend including to measure the construct? If you have no suggestions, please enter "none."
2. To the reviewer: What additional items would you recommend deleting? If you have no suggestions, please enter "none."
3. To the reviewer: Please provide any additional information you believe may be useful in assessing the identified construct with this instrument. If you have no suggestions, please enter "none."

Construct 2: (fill in the blank) – the construct "Learning Environments" is used for this example. (Start with a new competency for the next group of items.)

Overarching construct: Learning Environment (example)
Operational definition: The diverse physical locations, contexts, and cultures in which students learn (example)
Items: Etc. – the form would go on to list all items.
Assessment Instructions Step 2

Content Validity Protocol using the Lawshe Method

Content validity determines the extent to which an assessment represents all facets of a given
construct. The assessment instrument should answer the following questions:
 Does the indicator measure what it was designed to measure?
 Do the constructs include the concept, attribute, or variable that is the target of
measurement?
 Does the instrument estimate how well a measure represents every element of a
construct?
 Does the instrument assess the intended constructs or domains?
 Does the instrument assess the body of knowledge surveyed?
 To what degree does the content of the indicator reflect the content domain of interest?

The process of determining whether an assessment is valid begins with gathering evidence to
determine how accurately the assessment addresses various aspects of the specific construct in
question and adequately represents a defined domain of knowledge or performance. In other
words, do the questions assess the constructs, or are the responses influenced by other factors?
The purpose of the content validity protocol is to guide the collection of evidence documenting
the technical quality of the rubrics, surveys, etc. used to determine the validity of assessments
that evaluate Program Learning Outcomes in the College of Education at Fort Hays State
University.

How does a committee establish Content Validity for an initial EPP created assessment?

To establish content validity for EPP-created assessments/rubrics, a panel of experts identifies
the essential constructs for the assessment/rubric. Although there are other methods for
establishing content validity, the College of Education will use Lawshe's seminal method as
approved by CAEP. The Lawshe method requires a Content Evaluation Panel (e.g.,
subject-matter experts) to provide feedback on how well each question measures the construct in
question. The Content Evaluation Panel will identify the overlap between the construct and the
performance domain. Their feedback will be analyzed, and informed decisions will be made
about the effectiveness of each question.

Protocol

The EPP will determine content validity using the Lawshe method. Content validity refers to the
appropriateness of the content of an instrument. In other words, do the measures (questions,
observation logs, etc.) accurately assess what we want to know? Expert judgment (not
statistics) is the primary method used to determine content validity; it is established through the
review and ratings of subject matter experts or stakeholders. Lawshe proposed that each of
the subject matter expert (SME) raters on the judging panel respond to the following question
for each item:

Is the skill or knowledge measured by this item:
1. Essential;
2. Useful, but not essential; or
3. Not necessary to the performance of the construct?
Please follow directions to complete the Content Validity Index (CVI) and Content Validity
Ratio (CVR). This is a two-step process.

Step 1:

1. Determine the body of knowledge for each construct measured. Complete the initial
assessment/rubric review form for each assessment/rubric used to evaluate candidate
performance in the program. Make sure that all constructs measured in the identified
assessment/rubric are included.
2. Identify a Content Evaluation Panel and the credentials required for its selection. The Content
Evaluation Panel should be a combination of all stakeholders, including College of
Education faculty (i.e., content experts) and P-12 school or community practitioners (lay
experts). Each panel expert should have the minimum credentials established by program
faculty.
a. At least one content expert from the program/department in the College of
Education.
b. At least one external content expert from outside the program/department. This
person can be from FHSU or another college or university as long as the requisite
content expertise is established; and
c. At least one practitioner expert from the field.
Total number of subject-matter experts on the panel: a minimum of three (3).

3. Create the response form: For each EPP-created assessment, there should be an
accompanying response form that Content Panel members use to rate items that appear
on the rubric/instrument. Program faculty work collaboratively to develop the response
form needed for each rubric/instrument used in the program to evaluate the candidate's
performance.
The Content Evaluation Panel (subject-matter experts), per the Lawshe method:

a. Each panel member is given the list of indicators or items independently.


b. For each item, the primary construct that the item purports to measure should be
identified and defined
c. Each item should be written as it appears on the rubric/instrument
d. Each panelist rates items on a scale of 1-3, with three (3) being essential, two
(2) useful but not essential, and one (1) not necessary. The form should have space for
each item so the panelist can provide feedback with suggested corrections or revisions.


For example:

3 = Essential
2 = Useful but not essential
1 = Not essential

Constructs | Essential total | Useful but not essential total | Not essential total
Ability to create lessons | 2 | 1 | 7
Accepts Criticism (merged into Response to Feedback) | 10 | 0 | 0
Assessment Skills | 1 | 2 | 7
Reflective Educator | 10 | 0 | 0
Response to Feedback | 10 | 0 | 0

(Total members who responded to each construct: 10.)

Content Panel Experts (Example)


Name and Title of Content Panel Experts.
Nienkamp, Paul, Associate Professor, Co-Chair
Stramel, Janet, Associate Professor, Co-Chair
Henderson, Shawn--FHSU
Smith, Sara, USD 469
Jones, Elodie--FHSU

The frequency of “essential” ratings is the basis for the decisions.
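A minimal sketch of how these "essential" frequencies could be tallied from raw panel ratings (the construct names and individual ratings below are illustrative, assuming the 3/2/1 coding shown in the example scale above):

```python
from collections import Counter

# Hypothetical raw panel ratings, one list per construct
# (3 = essential, 2 = useful but not essential, 1 = not essential).
panel_ratings = {
    "Reflective Educator": [3, 3, 3, 3, 3, 3, 3, 3, 3, 3],
    "Ability to create lessons": [3, 3, 2, 1, 1, 1, 1, 1, 1, 1],
}

for construct, ratings in panel_ratings.items():
    tally = Counter(ratings)
    print(f"{construct}: essential={tally[3]}, "
          f"useful={tally[2]}, not essential={tally[1]}, "
          f"total={len(ratings)}")
```

These illustrative tallies reproduce the "Reflective Educator" (10/0/0) and "Ability to create lessons" (2/1/7) rows of the example table.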

4. Create an assessment packet for each Content Evaluation panel member. The packet
should include the following:
a. A letter explaining the purpose of the assessment/rubric, the reason the experts were
selected, a description of the measure and its scoring, and an explanation of the
response form.
b. A copy of the assessment/rubric instructions
c. A copy of the form used to evaluate the assessment/rubric
d. The response form aligned with the assessment/rubric for the panel member to rate
each item.

Approved by CAEP Steering Committee 01/2019


Assessment Instructions Step 2

5. Initiate the evaluation of the assessment/rubric.
a. Set a deadline for the panel to return the response forms to the assessment contact
person.
6. Collecting data: Once response data for each EPP-created assessment/rubric have been
collected from the panel members, that information will be submitted to the COE
assessment contact person. Copies of all forms and scores will be submitted via email and
placed in an EPP assessment share file or Tk20 designated file. The file will be accessible
by program coordinators.

7. Content validity results submitted: The COE assessment contact person will generate a
Content Validity Index (CVI) and Content Validity Ratio (CVR). The index will be
calculated based on recommendations by Rubio et al. (2003), Davis (1992), and Lynn
(1986).
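For the 4-point representativeness and clarity scales used on the response form, a common convention from this literature (Lynn, 1986; Davis, 1992) is that the item-level CVI is the proportion of panelists who rate the item 3 or 4. A minimal sketch of that calculation, with illustrative ratings:

```python
def item_cvi(ratings):
    """Item-level CVI: proportion of panelists rating the item
    3 or 4 on a 4-point scale (per Lynn, 1986, and Davis, 1992)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Hypothetical panel of ten experts rating one rubric item.
print(item_cvi([4, 4, 3, 4, 3, 3, 4, 4, 2, 4]))  # 9 of 10 -> 0.9
```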

8. The assessment contact person will use the formula CVR = (ne – n/2) / (n/2), where:

a. ne = number of panelists indicating "essential"
b. n = total number of panelists
c. Step 1: Count how many panelists rate the item "essential" (e.g., ne = 20).
d. Step 2: Count the total number of panelists (e.g., n = 36).
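The two counts above plug directly into Lawshe's formula. A minimal sketch using the illustrative numbers from steps 1 and 2 (ne = 20 essential ratings out of n = 36 panelists):

```python
def cvr(n_essential, n_total):
    """Lawshe content validity ratio: (ne - n/2) / (n/2).
    Ranges from -1 (no panelist rates the item essential)
    to +1 (every panelist does)."""
    half = n_total / 2
    return (n_essential - half) / half

print(round(cvr(20, 36), 2))  # (20 - 18) / 18 -> 0.11
print(cvr(10, 10))            # unanimous panel -> 1.0
```

A CVR of 0 means exactly half the panel rated the item essential; items with low or negative CVR are candidates for revision or deletion.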

Approved by CAEP Steering Committee 01/2019
