Institutional Assessment Process
Act: close loops, make improvements, and re-measure; engage the campus through professional development.
The committee reports to the Provost. The Assessment Committee comprises the Chair; the
Vice Provost (ex officio); the Associate Vice Provost of Academic Excellence; at least one faculty
member from each college and campus; and at least one faculty member from Online Learning.
Other membership includes the ISLO subcommittees, divided by assessment-cycle phase (plan,
assess, act), and department chairs and/or faculty designated by each academic department for a
specified term to assist with assessment. The Provost appoints one faculty member to serve as
Chair of the Assessment Committee for a three-year term.
ISLO subcommittees are charged by the Provost’s office, in conjunction with recommendations
from the Assessment Committee, with planning the assessment of their assigned outcome,
analyzing the data collected on that outcome, or facilitating university-wide actions on it. Each
subcommittee has three members. The subcommittees are as follows:
1. Communication, Teamwork, Ethical Reasoning (CTER)
2. Diverse Perspectives/Cultural Sensitivity & Global Awareness (DP)
3. Quantitative Literacy, Inquiry & Analysis (QLIA)
At least one representative from the Assessment Committee serves on the General Education
Advisory Council (GEAC). Communication between the two committees must be bidirectional:
Assessment Committee representatives ensure that assessment in general education is prioritized
within GEAC processes and that ISLO definitions are consistent with state-mandated standards
for general education.
A representative from the Diverse Perspectives ISLO subcommittee should serve on, or be in
close contact with, the Diversity, Inclusion, and Cultural Engagement (DICE) steering committee.
DICE work guides assessment work related to standards of equitable curriculum delivery and
measurements on the Diverse Perspectives ISLO; in turn, assessment work provides the DICE
office with data identifying equity gaps and actions taken to close those gaps.
The Online Learning representative should be in contact with the Online Learning Advisory
Council (OLAC) to ensure that best practices for online education are assessed in the same way
as in-person programs.
The Office of Academic Excellence maintains a webpage with current information on
assessment practices and annual institutional summary assessment reports at
https://www.oit.edu/academic-excellence. Linked from this webpage are accompanying pages
where departmental outcomes and program assessment reports are published for public
consumption. The Office of Academic Excellence webpage also contains links to data from the
Office of Institutional Research, General Education standards, the Commission on College
Teaching, DICE, and the definitions of the Institutional Outcomes.
The Office of Academic Excellence maintains a Teams drive containing a record of committee
agendas and meeting records, grades and feedback sent to departments regarding assessment
reports, trainings, and requests for actions from faculty.
Sources of Data
Student perspectives are used broadly across the institution. Every course is assigned an end-of-
course survey administered through IDEA. Faculty have direct access to the results of these
surveys for all of their courses and report these data in their Annual Performance Evaluations
(APE). Training on how to access and interpret these data is conducted by the Commission on
College Teaching (CCT) during its annual OTET Workshop.
The Office of Academic Excellence conducts a Student Exit Survey of every department’s
graduating seniors through Qualtrics. Questions cover the students’ perspectives on their
education’s impact on their performance of Programmatic Outcomes and on their post-graduation
success. These data are provided to programs for use in writing their program assessment reports
in the summer.
The Office of Institutional Research (OIR) provides headcount data on graduation, attrition, and
retention rates by term, department, and college. These data are shared with programs and are
available on the OIR website at https://www.oit.edu/institutional-research. Additionally, OIR
data dashboards that report student achievement data are readily available to faculty online
through the faculty resources page on the university’s intranet, TECHweb.
External evaluation of programs is conducted through Professional Advisory Boards and, for
individual programs, accreditation.
Table 1. Accredited Programs
Tools
The institution has created dashboards for each faculty member to review their courses. The OIR
data dashboards report student achievement data and are readily available to faculty online
through the faculty resources webpage on the university’s intranet, TECHweb, with faculty log-in
credentials. The dashboards, maintained by the Office of Institutional Research, also contain data
disaggregated by race, gender, first-generation college attendance, Pell Grant recipient status,
and full- or part-time status. Data included in the dashboards are six-year graduation rates,
one-year retention, and Dropped, Failed, Withdrew, or Incomplete (DFWI) rates by term.
Faculty report their review of these data in program assessment reports and in Course Learning
Outcomes (CLO) Worksheets due at the end of each term.
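As an illustration only (not the institution's actual dashboard implementation), a DFWI rate disaggregated by a demographic group can be computed from per-student grade records like this; the record format and group labels are hypothetical:

```python
from collections import defaultdict

# DFWI counts grades of D, F, W (withdrew), or I (incomplete).
DFWI_GRADES = {"D", "F", "W", "I"}

def dfwi_by_group(records):
    """Return the DFWI rate per demographic group for one term.

    records: iterable of (group_label, final_grade) pairs, one per
    student enrollment (a hypothetical format for illustration).
    """
    totals = defaultdict(int)
    dfwi = defaultdict(int)
    for group, grade in records:
        totals[group] += 1
        if grade in DFWI_GRADES:
            dfwi[group] += 1
    return {g: dfwi[g] / totals[g] for g in totals}

records = [
    ("first_gen", "A"), ("first_gen", "W"),
    ("continuing_gen", "B"), ("continuing_gen", "C"),
]
print(dfwi_by_group(records))  # first_gen: 0.5, continuing_gen: 0.0
```

Comparing the per-group rates against the overall rate is one simple way such disaggregated data can surface the equity gaps discussed elsewhere in this process.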
The CLO Worksheets were created by the Office of Academic Assessment and give faculty a
place to enter assessment data based on coursework performance, which can then be summarized
by the department chair. Using the CLO Worksheets, faculty determine which programmatic and
institutional outcomes their specific coursework pertains to and enter performance targets for
assignments and coursework. The program sets a standard of success: the number of students
who must perform acceptably on the outcome for the outcome to be considered met for the
course. Faculty determine an individual student’s success by comparing the student’s work
product with the rubric for the outcome on the assignment. The Program Assessment Handbook,
expected to be published in the 2022-23 academic year, clarifies the definitions of each of these
measures of success for faculty.
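The two-level judgment described above (individual student success against a rubric, then course-level success against the program's standard) can be sketched as follows; this is a minimal illustration, and the cutoff and threshold values are hypothetical, not taken from the Handbook:

```python
def outcome_met(scores, rubric_cutoff, success_threshold):
    """Decide whether a course outcome is met.

    scores: per-student rubric scores on the assignment tied to the outcome.
    rubric_cutoff: minimum score counting as acceptable (faculty-set).
    success_threshold: fraction of students who must perform acceptably
        for the outcome to count as met (the program's standard of success).
    """
    acceptable = sum(1 for s in scores if s >= rubric_cutoff)
    return acceptable / len(scores) >= success_threshold

# Example: 7 of 10 students score at or above the cutoff of 3; the
# program requires 70%, so the outcome is met for the course.
scores = [4, 3, 3, 2, 4, 3, 1, 3, 2, 4]
print(outcome_met(scores, rubric_cutoff=3, success_threshold=0.7))  # True
```

Keeping the rubric cutoff and the success threshold as separate parameters mirrors the division of responsibility in the text: faculty set the former, programs set the latter.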
Outcomes
The institution’s strategic plan is published on the university website at
https://www.oit.edu/about/strategic-plan and reads as follows:
“Oregon Institute of Technology (“Oregon Tech”), Oregon’s public polytechnic university, offers innovative, professionally-focused
undergraduate and graduate degree programs in the areas of engineering, health, business, technology, and applied arts and sciences.
To foster student and graduate success, the university provides a hands-on, project-based learning environment and emphasizes
innovation, scholarship, and applied research. With a commitment to diversity and leadership development, Oregon Tech offers
statewide educational opportunities and technical expertise to meet current and emerging needs of Oregonians as well as other national
and international constituents.”
The plan is structured around the guiding values of student success, respect, service, excellence,
integrity, diversity/equity/inclusion, accountability, and confidence, organized into four pillars:
• Commitment to Student Success
• Commitment to Innovation
• Commitment to Community
• Commitment to Institutional Excellence
This strategic plan informs the Academic Master Plan, published on the university website at
https://www.oit.edu/provost, whose mission reads as follows:
“Through a sense of community, collaboration and innovative degree programs, Oregon Tech Academic Affairs provides applied
hands-on learning from teacher-scholars who develop life-long learners and tomorrow’s leaders.”
The success of these initiatives is measured through student performance on both the
institutional success indicators (retention and persistence, graduation rates, employment rates,
DFWI rates, and closure of equity gaps) and the academic learning outcomes.
While CLOs are set by faculty and PSLOs are set by programs, Oregon Tech’s Institutional
Student Learning Outcomes (ISLOs) are set by the Office of Academic Excellence to ensure that
they support Oregon Tech’s institutional mission and strategic goals. The outcomes and their
associated criteria reflect the rigorous, applied nature of Oregon Tech’s degree programs. In-depth
definitions of acceptable performance on these outcomes are published at
https://www.oit.edu/academic-excellence/GEAC/essential-studies/Institutional-student-learning-outcome
Review Process
Each program submitting a report also designates a faculty member to review other departments’
reports. At a minimum, each program report is read by two faculty graders, who are trained on a
grading rubric updated by the Assessment Committee for this purpose. The rubric evaluates
program reports for the items specified in Fig 3. Graders return individualized feedback to the
department chair. Once feedback is received, a program may either submit changes to the report
for a second review by the Office of Academic Excellence or approve the posting of the report to
the external assessment webpage for its department.
Fig 4. Program Assessment Report Rubric
Data from submitted reports are tabulated and summarized for reviewers in the Annual
Institutional Assessment Report, and meaningful indicators are identified within the report to
assess the quality of both the reports and the process. Process items recorded in the report for the
year include changes to the structure or reporting of the Assessment Committee, actions taken to
change the process, improvements to the tools used in assessment, and trainings provided to the
campus community that support assessment work. Report-quality items may change from year to
year depending on the quality of the reports submitted; at a minimum, however, the number of
programs that submitted reports during the academic year should be reported. Other report-
quality items may include the percentage of reports that submitted a particular piece of data
previously found to be a gap, such as the percentage of programs identifying equity gaps or the
percentage of programs reporting action plans. Additionally, summarized university trend data
are recorded in the Annual Institutional Assessment Report: university-level averages and trends
in institutional indicators of success (retention, graduation, DFWI) over time and compared with
external sources, trends in gaps and actions reported in program assessment reports, faculty
interpretations of student performance on the ISLOs, and programmatic requests for university
resources.
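The report-quality tabulation described above (counting submitted reports and the percentage flagging particular items) could be sketched as follows; the dictionary keys are hypothetical reviewer-recorded flags, not fields from the institution's actual tooling:

```python
def report_indicators(reports):
    """Summarize program assessment reports for the annual institutional report.

    reports: list of dicts, one per submitted program report; the boolean
    keys below are hypothetical flags a reviewer might record.
    """
    n = len(reports)
    # Percentage of reports for which a given flag is set.
    pct = lambda key: 100 * sum(1 for r in reports if r.get(key)) / n
    return {
        "reports_submitted": n,
        "pct_identifying_equity_gaps": pct("identified_equity_gap"),
        "pct_with_action_plans": pct("has_action_plan"),
    }

reports = [
    {"identified_equity_gap": True, "has_action_plan": True},
    {"identified_equity_gap": False, "has_action_plan": True},
]
print(report_indicators(reports))
# {'reports_submitted': 2, 'pct_identifying_equity_gaps': 50.0,
#  'pct_with_action_plans': 100.0}
```

Tracking these percentages year over year is what lets the report show whether a previously identified gap (for example, missing action plans) is closing.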
Actions the university plans to take based on these data are identified throughout the annual
report under the varying topics of process improvement, faculty education, resource allocation,
or other items indicated.
December
• Faculty report Fall term data in CLO worksheets and participate in program assessment meetings.
March
• Feedback is given to programs on their submitted assessment reports, which are published to the external website.
• Faculty report Winter term data in CLO worksheets and participate in program assessment meetings.
This process is written by the Assessment Committee and sent to the Director and Provost for approval.
Review of this process should occur at regular intervals, with changes made as gaps are identified.
___________________ _____________
Author Date
______________________ ____________
_______________________ ___________
Provost Date