
SBM Assessment Process

D. THE REVISED SBM ASSESSMENT PROCESS

Purpose of the Assessment. SBM Assessment is conducted by the school to determine the depth of its SBM practice alongside the principles of ACCESs. It is conducted by the Division to determine the profile of its schools: which need assistance, and which deserve recognition for good practices that other schools may benchmark.

Basic Steps in the Assessment Process

Step 1: Organize a team of at least ten members composed of faculty members, students, and other external stakeholders. One shall be elected as team leader, another as team secretary; the rest shall obtain and validate evidence.

Step 2: Let team members select the area (Principle) they want to assess. There should be at least two members in each team.

Step 3: In the pre-assessment meeting, decide whether to use the Whole or Part method. In the Whole method, all team members work as a group to validate one Principle after another. In the Part method, at least two (2) members are assigned to every Principle. The team leader acts as coordinator and facilitator, while the secretary acts as documenter. The team should study the Assessment Manual, especially the D-O-D (document analysis, observation, and discussion) process.

Step 4: Assessment Proper (a school visit if the assessment is done by an external team). Proper courtesies must be accorded to the School Head in planning the assessment process. During the assessment proper:
* Have a schedule of activities for document analysis (1 day), observation analysis (1 day), and discussion of data (1/2 day).
* Classify documents by Principle. Note that one document, like the SIP, can be a source of evidence for several indicators across Principles.
* Gather evidence using the D-O-D process and select samples of documents using emergent saturation sampling and snowballing.
* Summarize the evidence and arrive at a consensus on what rating to give each indicator based on the documented evidence.
Step 5: Conduct process validation. The purpose is to gather process evidence to validate documented evidence through observation of classes and activities. Follow the D-O-D process.

Step 6: Discussion of Documents and Process Evidence. Summarize the data for each Principle indicator. Clarify issues, problems, opportunities, etc. The team scores the indicators.

Step 7: Closure or Exit Conference/Meeting.

Step 8: Report Writing by the team.

The Validation Procedure: DOD

The SBM Assessment Tool uses evidence to determine a school's level of practice. DOD is a means of evaluating the validity or truthfulness of the evidence. DOD is an acronym for Document Analysis, Observation, and Discussion, the three essential steps in evaluating the validity of evidence of an SBM practice. Below are the steps:

1. Conduct Document Analysis. Obtain and assemble all existing artifacts related to the indicator being assessed. Artifacts are the things used by the school community to achieve educational goals, e.g. lesson plans, annual reports, assessment tools, test results, community learning centers, computers, organization charts, development plans, things made by the learners, and the like. Evaluate the validity or truthfulness of each artifact against the RACS criteria, namely:
* Relevance. The evidence must be appropriate to the indicator being assessed. It is appropriate if the artifact or document is a tool or a product of a practice expressed in the indicator.
* Accuracy. The evidence must be correct. If it is a lesson plan, then both content and procedure must be correct.
* Currency. The evidence must be present, existing, or actual.
* Consistency. The evidence must be verifiable and must generate the same results from most of the sources.
* Sufficiency. The evidence must be adequate. If a student learning portfolio is presented as evidence of self-directed learning, its presence in only two or three classes is not adequate evidence of school-wide implementation.
Collect and analyze evidence horizontally (by subject) and vertically (by year and grade level) to ensure content validity, then synthesize the results of the document analysis.

2. Conduct observations to obtain process evidence. Documentary evidence may show the school's focus on learner-centered learning, such as cooperative, interactive, problem-solving, and decision-making approaches. Process evidence is needed to determine whether these are actually being practiced. Process evidence is obtained by scrutinizing the instructional, leadership, and management styles, methods, techniques, approaches, and activities used by the school community to achieve the SBM goal. Evidence is identified through participant or nonparticipant observations, which may be conducted formally or informally. Individual or group interviews are held to verify or clarify the evidence. Evidence is scrutinized for validity using the RACS criteria.

Determining the number of observations, interviews, and documents to be scrutinized is a sampling problem in conducting DOD. The problem is commonly addressed by using saturation sampling. The technique is described in the Attachment of the SBM Assessment Tool. Use the process evidence to cross-validate the documentary evidence. Synthesize the process evidence for group discussion.

3. Discuss the synthesized documentary and process evidence. Conduct the discussion as a friendly, non-confrontational conversation to explain, verify, clarify, and augment the evidence. Invite members of the school community who were engaged in the collection and presentation of evidence to participate in the discussion. As the team arrives at a consensus on the level of practice of the indicator being assessed, indicate it on the scale with a check mark (✓) in the appropriate box. Continue the process until all four dimensions are assessed.

Practices vary in establishing the level of practice of an indicator.
The most common is the integrative approach, in which the entire body of evidence for all indicators of a standard is assembled first, scrutinized for internal consistency, and finally used as a guide in making a consensual decision on which level of practice an indicator belongs to.

The other practice is non-integrative. Indicators of a standard are scrutinized one by one for evidence and also classified one by one for level of practice. Relationships among indicators are given less attention.

Who conducts the DOD? A school assessment committee conducts the DOD if the assessment is school-initiated. A Division assessment committee conducts the DOD if the assessment is Division-initiated, or if the assessment is requested by a school.

Who constitutes the assessment committee? A leader, assisted by a secretary, heads the assessment committee. Four subcommittees are organized, and each one is assigned to assess an SBM standard. Four to five members may compose one subcommittee.

What operational principles guide the DOD process?
* Collaboration. The assessors work as a team. Leadership is shared. Decisions are made by consensus, and every member is accountable for the performance of the team.
* Transparency. The validation of evidence is open to stakeholders' view and review.
* Confidentiality. Information obtained from the DOD process that may prejudice individuals, groups, or the school is handled judiciously.
* Validity. Documentary analyses and observations are rigorous in procedure and demanding in quality of results.
* Reform-oriented. DOD comes up with informed recommendations and action programs that continuously move the school to higher levels of practice.
* Principle-oriented. DOD is guided by the ACCESs principles.
* Stakeholders' satisfaction. DOD is an exciting growth experience.
Analysis of documents, artifacts, and processes unfolds the progress made, objectives achieved, new techniques developed, best practices mainstreamed, and prizes won, despite limited resources and physical, social, and political constraints.

E. THE REVISED SBM ASSESSMENT TOOL

The Revised School-Based Management (SBM) Assessment Tool is guided by the four principles of ACCESs (A Child (Learner)- and Community-Centered Education System). The indicators of SBM practice were contextualized from the ideals of an ACCESs school system. The unit of analysis is the school system, which may be classified as developing, maturing, or advanced (accredited level). The SBM practice is ascertained by the existence of structured mechanisms, processes, and practices in all indicators. A team of practitioners and experts from the district, division, region, and central office validates the self-study/assessment before a level of SBM practice is established. The highest level, "advanced", is a candidacy for accreditation after a team of external validators has confirmed that the evidence of practices and procedures satisfies quality standards.

Characteristics and Features. The revised tool is systems-oriented, principle-guided, evidence-based, learner-centered, process-focused, non-prescriptive, user-friendly, collaborative in approach, and results/outcomes-focused.

Parts of the Tool. The tool shall contain the following parts: a) basic school/learning center information; b) principle-guided indicators; c) description of SBM practice scaled in terms of extent of community involvement; d) learner-centeredness; and e) scoring instructions.

Users. The users of the tool are the teachers, school heads, learners, parents, LGU, Private Sector, NGO/PO, and the different administrative levels of DepEd.

Scoring Instructions
1. The four (4) principles were assigned percentage weights on the basis of their relative importance to the aim of the school (improved learning outcomes and school operations):
* Leadership and Governance - 30%
* Curriculum and Learning - 30%
* Accountability and Continuous Improvement - 25%
* Management of Resources - 15%

2. Each principle has several indicators. Based on the results of the D-O-D (Document Analysis, Observation, Discussion), summarize the evidence and arrive at a consensus on the rating that will be given to each indicator.

3. Rate the items by checking the appropriate boxes. These are the points earned by the school for the specific indicator. The rating scale is:
1 - Evidence indicates early or preliminary stages of implementation
2 - Evidence indicates planned practices and procedures are fully implemented
3 - Evidence indicates practices and procedures satisfy quality standards

4. Assemble the rubrics rated by the respondents; edit them for errors like double entries or incomplete responses.

5. Count the number of check marks in each indicator and record it in the appropriate box in the summary table for the area/standard rated.

6. Multiply the number of check marks in each column by the points (1-3).

7. Get the average rating for each principle by dividing the total score by the number of indicators of the principle.

8. Record the average ratings for the principles in the Summary Table for the computation of the General Average.

9. Multiply the rating for each principle by its percentage weight to get the weighted average rating.

10. To get the total rating for the four principles, get the sum of all the weighted ratings. The value derived is the school rating based on DOD.

11. The level of practice will be computed based on the criteria below:
* 60% based on improvement of learning outcomes;
* 40% according to the validated practices using DOD.

12. The final scoring criteria as described in item No. 11 will be issued after the operational tryout.
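The weighted-average computation in steps 6-10 can be sketched in code. This is a minimal illustration, not part of the official tool: the per-indicator ratings below are hypothetical, and only the four percentage weights are taken from the instructions above.

```python
# Weights for the four ACCESs principles, as stated in the scoring instructions.
WEIGHTS = {
    "Leadership and Governance": 0.30,
    "Curriculum and Learning": 0.30,
    "Accountability and Continuous Improvement": 0.25,
    "Management of Resources": 0.15,
}

# Hypothetical per-indicator ratings on the 1-3 scale (for illustration only).
ratings = {
    "Leadership and Governance": [3, 2, 3, 2],
    "Curriculum and Learning": [2, 2, 3],
    "Accountability and Continuous Improvement": [3, 3, 2],
    "Management of Resources": [2, 1, 2],
}

def principle_average(scores):
    """Step 7: divide the total score by the number of indicators."""
    return sum(scores) / len(scores)

def dod_school_rating(ratings):
    """Steps 9-10: sum each principle's average times its percentage weight."""
    return sum(WEIGHTS[p] * principle_average(s) for p, s in ratings.items())

print(round(dod_school_rating(ratings), 2))  # school rating based on DOD
```

Note that per item 11, this DOD-based rating would then contribute only 40% of the final level of practice, with the remaining 60% based on improvement of learning outcomes.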
Description of SBM Levels of Practice. The resulting levels are described as follows:

Level I: DEVELOPING - Developing structures and mechanisms with an acceptable level and extent of community participation and impact on learning outcomes.

Level II: MATURING - Introducing and sustaining a continuous improvement process that integrates wider community participation and significantly improves performance and learning outcomes.

Level III: ADVANCED (ACCREDITED LEVEL) - Ensuring the production of intended outputs/outcomes and meeting all standards of a system that is fully integrated with the local community and is self-renewing and self-sustaining.
