Model Assessment Framework
Table of Contents
Preface
Message from the Federal Minister, Education and Professional Training
Message from the Secretary, Ministry of Federal Education & Professional Training
Message from the Executive Director of IBCC
Executive Summary
Highlights of Each Chapter
Chapter 1. Introduction
1.1. Introduction
1.2. Background
1.3. Rationale
1.4. Significance
1.5. Objectives
Chapter 2. Standards for Assessment
2.1. The Concept of Assessment
2.2. Washback Effect of Assessment on Teaching and Learning
2.3. Importance and Purpose of Assessment
2.4. Types of Assessments
2.4.1 Formative Assessment
2.4.2 Formative Assessment Based on Bloom’s Taxonomy for High-Stakes Exams (Grades IX to XII)
2.4.3 Integration of Technology in Formative Assessment
2.4.4 International Best Practices
2.4.5 General Principles for Formative Assessment
2.4.6 Summative Assessment (Board Examinations)
2.5 Minimum Quality Standards for Board Examinations
2.5.1 Structure of Quality Assessment
2.5.2 Formulating the Structure of Assessment
2.5.3 Pre-Examination Standards
2.5.4 During Examination Standards
2.5.5 Post-Examination Standards
2.5.6 Standards for Digitization of Examinations
2.6 Capacity Building and Training
Chapter 3. Requirements/Alignment with Standards of Curriculum
3.1 National Curriculum
3.2 Standards, Benchmarks, and Learning Outcomes
Chapter 4. Process of Assessment Development
4.1. Standards of Assessment
4.2. Importance of Item Writing and Review
4.3. Key Principles for Item Writing
4.4. Item Review Process
4.5. Examples of Bad Items and Good Items
Chapter 5. Conduct of Examination
5.1. Professional Training
5.2. Conducive Environment
5.3. Transparency
5.4. Appropriate Logistic Support
5.5. Appropriate Measures (SOPs for Secrecy)
5.6. Controlling Cheating and Examination Malpractices
Chapter 6. Coding, Marking, and Compilation of Results
6.1. Coding of Answer Scripts
6.2. Marking of Answer Scripts
6.3. Compilation of Results
6.4. On-Screen Marking
Chapter 7. Post-Exam Analysis
7.1 Purpose of Post-Exam Analysis
7.1.1 Improving Learning Outcomes
7.1.2 Informing Instruction and Feedback to Teachers
7.1.3 Enhancing Assessment Quality
7.1.4 Stakeholder Communication
7.2 Data Collection and Management
7.2.1 Student Performance Data
7.2.2 Importance of Accurate Data Collection
7.2.3 Item Analysis
7.3 Item Analysis Theories
7.4 Exam Results Dashboard
7.4.1 Key Features and Importance
7.4.2 Contextual Data and Demographic Information
7.4.3 Integrating Data for Comprehensive Analysis
7.5 Data Analysis Techniques
7.5.1 Descriptive Statistics
7.5.2 Item Analysis
7.5.3 Inferential Statistics
7.5.4 Subgroup Analysis
7.6 Reporting and Interpretation of Results
7.6.1 Overview of Student Performance
7.6.2 Item-wise Analysis Report
7.6.3 Subgroup Performance Report
7.7 Feedback Mechanism
7.7.1 Student Reports
7.7.2 Strengths and Areas for Improvement
7.7.3 Teacher Feedback and Aggregate Performance Data
7.7.4 Instructional Recommendations
7.7.5 School and Province/Region Reports
7.7.6 Policy Recommendations for Administrators and Policymakers
7.8 Utilizing Analysis for Continuous Improvement
7.8.1 Curriculum Adjustment
7.8.2 Professional Development
7.8.3 Refining Future Assessments Based on Item Analysis and Feedback
7.8.4 Informing Educational Policies and Resource Allocation Based on Performance Data
7.9 Communicating Results to Stakeholders
7.9.1 Students and Parents
7.9.2 Teachers: Comprehensive Data to Guide Instructional Strategies
7.9.3 Educational Authorities for Policy Briefs
7.9.4 Educational Authorities for Strategic Planning
7.10 Conclusion
Chapter 8. Policy Decisions on IBCC Forum
Chapter 9. Guidelines and SOPs
9.1 Guidelines and SOPs for Item Writers
9.1.1 Guidelines for Item Writing
9.1.2 SOPs for Item Writers
9.2 Guidelines and SOPs for Reviewers
9.2.1 Guidelines for Reviewers
9.2.2 SOPs for Reviewers
9.3 Guidelines and SOPs for Moderators
9.3.1 Guidelines for Moderators
9.3.2 SOPs for Moderators
9.4 Guidelines and SOPs for Test Assembly
9.4.1 Guidelines for Test Assembly
9.4.2 SOPs for Test Assembly
9.5 Guidelines and SOPs for Test Administration
9.5.1 Guidelines for Test Administration
9.5.2 SOPs for Test Administration
9.6 Guidelines and SOPs for Marking/Coding
9.6.1 Guidelines for Marking/Coding
9.6.2 SOPs for Marking/Coding
Resources
Lead Contributors
List of Figures
Figure 1. Difference between Formative and Summative Assessment
Figure 2. General Principles for Formative Assessment
Figure 3. Assessment Blueprint
Figure 4. Pyramid of Cognitive Domain
Figure 5. National Curriculum Framework
Figure 6. Item Review Process
Figure 7. Marking of Answer Scripts
Figure 8. Compilation of Results
Figure 9. Purpose of Post-Exam Analysis
Figure 10. Sample Data Dashboard
Figure 11. Guidelines and Standard Operating Procedures (SOPs) for Assessment Staff
Figure 12. Process and Guidelines for Item Moderators
Figure 13. Process and SOPs for Test Assembly
List of Tables
Table 1. Assessment Blueprint Example
Table 2. Curriculum-Based Question Matrix
Table 3. SLO-Based Question Matrix Example
Table 4. Validity Types, Description and Standard Values
Table 5. Matrix of Minimum Standards
Table 6. Pre-Examination Phase Matrix
Table 7. During Examination Phase Matrix
Table 8. Post-Examination Phase Matrix
Table 9. Criteria Summary for Capacity Building
Table 10. Bad Item Examples and Their Improved Items
Table 11. Comparison between CTT and IRT
Table 12. Example of Grading Points for Grade 11
Table 13. Example of Grading Points for Grade 12
Table 14. New Grading System for SSC and HSSC
Preface
The Model Assessment Framework (MAF) represents a major step in the country's educational
progress, demonstrating joint dedication to improving the quality of education and providing
equal learning opportunities for all students. As we embark on a new chapter in educational
development, this document is a detailed guide to comprehending and executing a strong
student assessment system that adheres to international standards while considering our distinct
cultural and academic settings.
The Inter Boards Coordination Commission (IBCC) serves as a forum for the chairpersons and
chief executives of member examination boards and allied organizations. Its mandate includes
exchanging information, devising policies for unified implementation, transforming the
examination system through modern techniques, and improving capacity building among
teachers and examination staff. To address the identified gaps and improve assessment
practices, the IBCC formulated a Model Assessment Framework of Pakistan for secondary and
higher secondary levels in collaboration with allied departments from Federal and Provincial
Governments.
The Model Assessment Framework document has been developed by reviewing key
frameworks and standards such as TIMSS, the Curriculum Framework, the Cambridge
Assessment Framework Standards, and the Global Proficiency Framework. The aim is to
define minimum standards for quality assessment and effective testing. By adhering to these
proposed standards, all educational boards in Pakistan can enhance the quality of their
assessments and, consequently, the overall quality of education.
The Model Assessment Framework (MAF) of Pakistan has been designed to offer a clear and
organized method for assessing students at secondary and higher secondary education levels.
This framework results from thorough research, consultation, and cooperation among experts
from different facets of the education sector. It represents a collective vision for a fair,
transparent, and accountable education system that encourages constant improvement and
excellence.
The Model Assessment Framework is structured to cover the secondary and higher secondary
levels of education. It is designed to be implemented in phases, and its key components include
assessment standards, assessment tools and techniques, implementation guidelines, reporting,
and feedback mechanisms. Successfully implementing the Model Assessment Framework
requires collaboration from all stakeholders in the education sector. It calls for a shift in how
we approach assessments, moving from a focus on testing to a broader perspective of
continuous and comprehensive evaluation.
This document is not an end but a living guide that will evolve over time in response to
emerging educational challenges and opportunities. Hopefully, the Model Assessment
Framework will catalyze positive change, inspiring a culture of learning and excellence that
will propel Pakistan towards a brighter and more prosperous future.
We would like to express our sincere gratitude to all those who have contributed to developing
this framework. Their dedication, expertise, and insights have been invaluable in shaping this
visionary document. Together, we can build an education system that truly meets the needs of
students and prepares them for the challenges of the 21st century.
Message from the Federal Minister, Education and Professional Training
I urge all educators, policymakers, and stakeholders to embrace this framework with the same
commitment and vision with which it was created. Together, let us work toward a brighter
educational future where every child in Pakistan has access to the quality education they
deserve.
Message from the Secretary, Ministry of Federal Education & Professional Training
I extend my heartfelt appreciation to the Inter Boards Coordination Commission (IBCC) and
the dedicated teams involved in developing this pioneering document. Their tireless efforts
have produced a framework that sets a new benchmark for quality in educational assessments,
ensuring that our system remains both nationally relevant and internationally competitive.
Special Secretary of the Ministry of Federal Education and Professional Training (MoFEPT)
Message from the Executive Director of IBCC
The MAF reflects our commitment to aligning assessments with the curriculum, ensuring that
every student, regardless of region or background, has an equal opportunity to succeed. By
establishing a standardized and reliable system of assessments, we are laying the foundation
for equitable educational practices that will benefit students, educators, and institutions alike.
I sincerely appreciate the dedicated team of experts and contributors who have worked
tirelessly to develop this comprehensive document. Their insights, research, and passion have
developed a framework to revolutionize how we assess and nurture talent across Pakistan.
As the Executive Director of the IBCC, I am confident that the Model Assessment Framework
will catalyze positive change in our education system. I hope this framework will guide our
educators and policymakers and inspire a new era of learning and development in Pakistan.
Together, we can build an educational future that is both inclusive and innovative, providing
every student with the tools they need to thrive.
I look forward to witnessing the positive impact of the MAF as we work together to uphold the
highest standards of education and assessment in our country.
Executive Summary
The Model Assessment Framework (MAF) is a comprehensive policy document establishing
a standardized, equitable, and quality-driven assessment system for grades 9 to 12 in Pakistan.
Developed in alignment with the National Curriculum Framework (NCF), the MAF provides
a structured approach to assessment, ensuring that it is valid, reliable, and fair across the
country. The framework outlines key assessment principles and processes designed to enhance
learners' educational outcomes and inform teaching practices.
The primary purpose of the MAF is twofold: to measure student achievement through valid, reliable, and fair assessments, and to promote educational quality and equity across Pakistan.
The framework is divided into three core phases:
o Pre-Examination Standards: These standards ensure the development of assessment tools aligned with the Student Learning Outcomes (SLOs) outlined in the National Curriculum. Item writing, validation, test blueprinting, and security protocols are key elements of this phase, designed to ensure fairness, content coverage, and cognitive balance. Bias-free content, pilot testing, and detailed test blueprints guarantee that the assessment process begins with high-quality test items.
o During Examination Standards: The framework emphasizes maintaining the integrity of the examination process by outlining clear protocols for invigilation, time management, and student accommodations. It focuses on creating an environment that ensures fairness and equal opportunities for all learners. In addition, emergency handling and security measures are established to protect the examination’s integrity.
o Post-Examination Standards: This phase includes fair and accurate marking guidelines, result dissemination, and stakeholder feedback. Moderation processes ensure consistency in grading, while feedback mechanisms offer insights to students and educators on performance and areas for improvement. The standards also include a focus on post-examination analysis, ensuring continuous improvement of the assessment system.
The MAF also addresses the washback effect of assessments, emphasizing how well-designed
assessments can positively influence teaching and learning. The framework seeks to encourage
teaching practices that promote deep learning rather than superficial test preparation by
ensuring that assessments align with the curriculum and educational goals. In addition, the
document highlights the importance of developing comprehensive Guidelines and Standard
Operating Procedures (SOPs) for item writers, reviewers, moderators, and others involved in
the assessment process, ensuring consistency and quality at every stage of the examination
lifecycle. Overall, the Model Assessment Framework is a foundation for building a robust and
uniform assessment system that measures student achievement and promotes educational
quality and equity across Pakistan. The framework aims to ensure that assessments provide
valuable insights for learners and educators, fostering continuous improvement and
accountability in the education system.
Highlights of Each Chapter
Chapter 1: Introduction
This chapter introduces the Model Assessment Framework (MAF) for Pakistan, explaining its
purpose to standardize and improve the quality of educational assessments for grades 9 to 12.
It emphasizes the importance of aligning assessments with the national curriculum to ensure
fairness, inclusivity, and consistency. The chapter also highlights the role of assessments in
evaluating student competencies beyond rote learning, focusing on critical thinking and
problem-solving.
Chapter 2: Standards for Assessment
This chapter defines what constitutes an effective assessment and discusses its various types,
including formative and summative assessments. It introduces the washback effect, describing
how assessments influence teaching and learning practices. The chapter stresses the need for
assessments to be aligned with learning outcomes, cover a range of cognitive levels, and
incorporate both formative and summative methods for comprehensive evaluation.
Chapter 3: Alignment with National Curriculum Standards
This chapter outlines how the MAF aligns with the National Curriculum. It highlights the
importance of setting standards, benchmarks, and learning outcomes that prepare students for
the challenges of the 21st century. The chapter stresses the need for assessments to be
competency-based, promoting problem-solving skills rather than relying solely on content-
based testing.
Chapter 4: Process of Assessment Development
This chapter focuses on the item writing and reviewing process, stressing the importance of
creating valid, reliable, and unbiased assessment items. It also explains the moderation process,
including pilot testing and statistical analysis, to ensure the quality and fairness of each
assessment item. Examples of poorly constructed and improved items are provided to illustrate
best practices.
Chapter 5: Conduct of Examination
This chapter discusses the procedures for administering exams, including training personnel,
ensuring a conducive environment, maintaining transparency, and providing appropriate
logistical support. It includes detailed Standard Operating Procedures (SOPs) for managing the
security of exam materials and handling unforeseen situations during the exam process.
Chapter 6: Coding, Marking, and Compilation of Results
This chapter explains the procedures for coding answer scripts, marking them according to
standardized rubrics, and compiling results. It highlights the use of On-Screen Marking (OSM)
technology to improve accuracy and efficiency in grading. The chapter also discusses measures
to ensure uniformity in marking through multi-layered checks and moderation.
Chapter 7: Post-Exam Analysis
This chapter emphasizes the importance of analyzing exam results to improve learning
outcomes and enhance the quality of future assessments. It explains data collection and
management techniques, including item analysis and using descriptive statistics to inform
educational policies and teaching strategies. The chapter also discusses how stakeholder
feedback can be used to refine the assessment process.
Chapter 8: Policy Decisions on IBCC Forum
This chapter outlines key policy decisions made by the forum of the Inter Boards Coordination
Commission (IBCC) to standardize and improve examination practices across Pakistan. Major
reforms include raising the passing marks from 33% to 40%, limiting the use of grace marks
to second attempts, and ensuring the alignment of student learning outcomes (SLOs) with the
National Curriculum. It emphasizes creating item banks, ensuring transparent paper-setting
practices, and separating theory and practical examinations. Additionally, the chapter discusses
the importance of building teacher capacity, establishing research and development units in
boards, and implementing security measures for exam materials.
Chapter 9: Guidelines and SOPs
This chapter provides comprehensive guidelines and SOPs for various personnel involved in
the assessment process, including item writers, reviewers, moderators, and those responsible
for test assembly, administration, and marking. It aims to ensure consistency, fairness, and
quality across all stages of the assessment lifecycle.
Chapter 1. Introduction
1.1. Introduction
The Model Assessment Framework (MAF) is a comprehensive guideline designed to
standardize and elevate the quality of education across secondary and higher secondary levels.
The MAF aims to ensure that the academic assessments are aligned with the national
curriculum, promoting consistency, fairness, and a high standard of education throughout the
country.
The framework focuses on curriculum alignment, ensuring that assessments accurately reflect
the learning objectives outlined in the national curriculum. It stresses evaluating students’ competencies beyond rote memorization, measuring their understanding and application of knowledge, critical thinking, and problem-solving skills. It promotes fairness and inclusivity by creating
assessments that are equitable and accessible to all students, regardless of their socio-economic
backgrounds. Teacher development is another prime objective, pursued by providing guidelines and resources that help educators prepare students for assessments effectively. Finally, the framework incorporates feedback mechanisms so that assessment practices are continuously refined and improved.
1.2. Background
The Model Assessment Framework (MAF) for Secondary and Higher Secondary Levels is a
strategic initiative aimed at standardizing and improving the quality of education across the
country. Pakistan's education system has historically faced challenges such as disparities in
quality, regional imbalances, and varying standards across different educational boards. The
assessment systems are often criticized for focusing on rote learning rather than critical
thinking and problem-solving skills.
There was a growing recognition of the need for a comprehensive assessment framework that
could provide a uniform evaluation standard. The aim was to ensure that assessments are
aligned with international best practices and measure a broad range of student competencies.
The formulation of the MAF aligns with the National Education Policy and other strategic
documents that emphasize the need for educational reforms. The framework addresses the
goals set out in the Sustainable Development Goals (SDGs), particularly Goal 4, which focuses
on quality education.
1.3. Rationale
The formulation of a Model Assessment Framework for Secondary and Higher Secondary
Levels in Pakistan is driven by several key factors.
o There have been issues relating to standardization and equity in the prevailing assessment system.
o The prevailing system does not provide a structured approach to evaluating the
effectiveness of educational programs and students' performance, resulting in unmet
educational objectives.
o The assessments are not aligned with the national curriculum, which, in turn, does not help
promote coherence in teaching and learning processes.
o The current system does not provide feedback to students, parents, and educators on academic progress and areas needing improvement; thus it does not foster a culture of continuous learning and growth.
1.4. Significance
The Model Assessment Framework ensures that assessment standards are consistent across all
regions and schools, reducing disparities and promoting equity. By having a standardized
assessment, the quality of education can be uniformly measured and improved across the
country. The MAF also provides benchmarks for student performance, allowing for the identification of strengths and weaknesses in the education system. Its implementation will also yield accurate assessment data, supporting informed decisions for improving educational practices and policies.
The MAF ensures that assessments are aligned with the national curriculum, fostering a more
coherent educational experience. Also, it focuses on assessing skills and knowledge relevant
to the curriculum, and future educational and career needs. The MAF provides valuable
feedback to students, teachers, and parents, guiding improvements in teaching methodologies
and student learning strategies, and also highlights areas where teachers may need additional
training or support, promoting continuous professional development. The MAF will also help
identify gaps in educational attainment between different regions, socio-economic groups, and
genders, leading to targeted interventions.
The MAF also aligns the national assessment system with international standards, facilitating
comparisons and improvements based on global best practices, which, in turn, will help
students' credentials be recognized internationally, aiding those who pursue education or
careers abroad. The MAF focuses on assessing critical thinking, problem-solving, and other
21st-century skills essential for success in higher education and the workforce. It helps ensure
that students are adequately prepared for the demands of higher education and the job market.
The Model Assessment Framework for Secondary and Higher Secondary Levels in Pakistan
aims to create a robust, equitable, and transparent education system. It is designed to enhance
educational quality, promote fairness, and ensure students are well-prepared for their future
academic and professional endeavors.
1.5. Objectives
The Model Assessment Framework (MAF) for Secondary and Higher Secondary levels in
Pakistan serves several crucial objectives aimed at enhancing the quality of education and
improving educational outcomes across the country. These include:
o Establishing uniform guidelines and standards to ensure consistency in assessment
practices across educational boards of different regions of the country to help maintain
fairness and transparency in evaluating student performance.
o Setting clear assessment criteria and benchmarks to uphold and monitor the quality of
education provided at secondary and higher secondary levels, ensuring that assessments
accurately reflect the learning objectives and curriculum standards.
o Aligning assessment practices with broader educational goals and objectives to ensure that assessments not only measure academic achievement but also support the development of critical thinking, problem-solving skills, and other competencies essential for students' overall growth.
o Designing assessments not only to evaluate but also to facilitate learning, providing
valuable feedback to students, teachers, and policymakers, helping them identify areas of
improvement and tailor instructional strategies accordingly.
o Supporting evidence-based decision-making in education policy by collecting and analyzing national assessment data, including identifying trends, disparities, and areas needing intervention or improvement.
o Ensuring that assessments are fair and inclusive, accommodating the diverse needs and backgrounds of students across Pakistan, to help reduce disparities in educational outcomes and promote equal opportunities for all learners.
o Providing training for teachers and educators in effective assessment practices, ensuring that they are equipped with the necessary skills to implement valid, reliable assessments aligned with educational objectives.
Chapter 2. Standards for Assessment
• Help students identify their strengths and weaknesses and target areas that need to be addressed.
• Help teachers recognize where students are struggling and address these problems.
2.4.2 Formative Assessment Based on Bloom’s Taxonomy for High-Stakes Exams (Grades IX to XII)
A wide range of educational research sheds light on the significance of assessment in the
teaching-learning process. One of the most vital benefits of assessment is that it helps evaluate
the children's performance. This can be helpful both for the students and the teachers. Effective
assessment of students’ performance requires a structured framework that comprehensively
evaluates cognitive, psychomotor, and affective domains.
Bloom's Taxonomy provides a well-established classification of educational objectives that can
be used to design fair, reliable, and valid assessments. This framework includes a detailed
breakdown of Bloom's Taxonomy and offers guidelines for implementation, tailored according
to the educational context of Pakistan. Here are the details about Bloom’s Taxonomy.
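As a minimal illustration of how the taxonomy can support item design (a Python sketch; the level names follow the widely used revised taxonomy, while the verb lists and the tagging heuristic are illustrative assumptions, not part of the framework):

```python
# Sketch only: revised Bloom's taxonomy levels with indicative verbs,
# used to roughly tag draft exam items by cognitive level.
BLOOMS_LEVELS = {
    1: ("Remember",   ["define", "list", "state", "recall"]),
    2: ("Understand", ["explain", "summarize", "classify", "compare"]),
    3: ("Apply",      ["solve", "use", "demonstrate", "calculate"]),
    4: ("Analyze",    ["differentiate", "organize", "examine", "contrast"]),
    5: ("Evaluate",   ["judge", "critique", "justify", "defend"]),
    6: ("Create",     ["design", "construct", "formulate", "compose"]),
}

def guess_level(stem: str) -> str:
    """Heuristic: return the highest taxonomy level whose verb appears in the stem."""
    words = stem.lower().replace("?", " ").replace(".", " ").split()
    for _, (name, verbs) in sorted(BLOOMS_LEVELS.items(), reverse=True):
        if any(verb in words for verb in verbs):
            return name
    return "Unclassified"

print(guess_level("Explain the role of seat belts in a sudden stop."))  # -> Understand
```

A tag produced this way is only a first pass; item writers and reviewers still assign the final cognitive level.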
o Tools: Project proposals, reflection essays, community partner feedback.
o Cultural Immersion Projects: Encourage appreciation of cultural diversity and
linguistic heritage.
o Tools: Research assignments, presentations, cultural artifacts.
o Literature Circles: Enhance engagement with literature and foster a sense of community
o Tools: Reading guides, discussion questions, and group evaluation forms.
o Reflective Essays: Encourage deep reflection on personal growth and attitudes.
o Tools: Essay prompts, reflection guidelines, assessment rubrics.
o Teacher Observations: Monitor students’ behaviors, attitudes, and interactions in
various settings.
o Tools: Observation checklists, anecdotal records, behavior rubrics.
o Project-based Assessments: Assess behavioral changes, attitudes, and interests in
learners.
o Tools: Theoretical project-based assessments, experimental project-based assessments.
o Quizzes: Focus primarily on evaluating the conceptual understanding of students.
o Tools: Scored quizzes, personality quizzes.
o Portfolio: Provide valuable insight into the learners’ work ethics and set of values.
o Tools: Research papers, creative writing pieces, artwork, graphic designs, and 3D
renderings.
Gamify Learning with Quiz-Based Platforms
o Tip: Gamify assessments to engage students and make learning more enjoyable.
o Example Tool: Kahoot! - Create quizzes and games that students can participate in using
their devices. It provides immediate feedback and encourages friendly competition.
The above-mentioned tools and tips can help integrate technology into formative assessment
strategies, making assessments more engaging, timely, and informative for teachers and
students.
o Contextual Relevance: Adapting global best practices to fit the cultural and educational
context.
o Government Policies: Ensuring assessments align with national education policies and
standards.
o Teacher Training: Providing continuous professional development on modern
assessment techniques.
o Community Involvement: Engaging parents and communities in the educational
process.
o Resource Allocation: Ensuring adequate resources and infrastructure for effective
assessment implementation.
Clear Criteria
Develop rubrics or guidelines that outline specific criteria related to attitudes, values, and
emotional responses.
Feedback Mechanisms
Provide constructive feedback that encourages students to reflect on and refine their
affective engagement.
Multiple Modalities
Use a variety of assessment formats so that students can demonstrate attitudes and learning in different ways.
Contextual Relevance
Ensure assessments are culturally and contextually relevant to the student's language and
science learning experiences.
The structure of assessments must be meticulously designed to ensure alignment with the
curriculum and learning objectives, capturing the core competencies and knowledge areas
that students are expected to master. A Table of Specifications (ToS) serves as a critical tool,
providing a blueprint to balance content coverage and cognitive levels, ensuring fairness,
transparency, and comprehensive evaluation.
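As a minimal sketch of how a blueprint can be checked mechanically (Python; the topics, cognitive levels, and mark weights below are illustrative assumptions, not prescribed values):

```python
# Sketch of a Table of Specifications (ToS) as nested mappings:
# content areas x cognitive levels -> marks allocated.
tos = {
    "Algebra":    {"Knowledge": 4, "Understanding": 6, "Application": 10},
    "Geometry":   {"Knowledge": 3, "Understanding": 5, "Application": 7},
    "Statistics": {"Knowledge": 3, "Understanding": 4, "Application": 8},
}
total_marks = 50  # the paper total this blueprint must add up to

allocated = sum(sum(levels.values()) for levels in tos.values())
assert allocated == total_marks, f"Blueprint allocates {allocated}, expected {total_marks}"

# Share of marks per cognitive level, to check cognitive balance.
by_level = {}
for row in tos.values():
    for level, marks in row.items():
        by_level[level] = by_level.get(level, 0) + marks
for level, marks in by_level.items():
    print(f"{level}: {marks} marks ({100 * marks / total_marks:.0f}%)")
```

A check like this makes imbalances visible before paper setting, rather than after results come in.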
Table 1. Assessment Blueprint Example
Criteria:
o Ensure all assessments align with the national curriculum and specific Student
Learning Outcomes (SLOs).
o Develop mapping documents that clearly link assessment items to curriculum objectives.
Table 2. Curriculum-Based Question Matrix

| Topic | Curriculum Objectives | SLOs | Assessment Type |
|---|---|---|---|
| Algebra | Apply algebraic concepts | Solve equations | Multiple-choice |
| Reading Comprehension | Analyze texts | Interpret text meaning | Short-answer |
| Science | Conduct experiments | Explain scientific methods | Practical exam |
| Historical Events | Understand history | Analyze historical impact | Essay |
examination. This standard ensures that all individuals engaged in the assessment process are
adequately trained, their roles clearly defined, and their capacities continuously developed to
maintain the integrity and quality of assessments. The roles include educators, examiners,
invigilators, and administrative staff, whose expertise and skills are critical in ensuring smooth,
fair, and accurate assessments.
2.5.4 During Examination Standards
During examination standards focus on ensuring the fair, transparent, and efficient conduct of
exams. These standards prioritize the creation of a conducive environment for students, strict
adherence to guidelines by supervisory staff, and the use of digital tools to enhance monitoring
and security. Proper implementation of these standards ensures the integrity of the examination
process and fosters trust among all stakeholders.
2.5.4.1 Standard 6: Quality of Test (Validity, Reliability)
Ensuring the quality of an assessment is crucial for producing reliable and valid results. Validity and reliability checks confirm that an assessment measures what it is intended to measure and yields consistent results over time. This process involves pilot-testing the assessment items and conducting item analysis to refine and improve them. Rigorous quality assurance processes help identify and eliminate potential biases or errors, thereby enhancing the overall credibility of the assessment.
Criteria for the Standard
Conduct validity and reliability checks:
o Validity: Ensure the assessment accurately measures the intended content and construct.
   - Content Validity: Ensure the test covers all relevant content areas.
   - Construct Validity: Ensure the test measures the intended construct, often assessed through correlations with other established measures of the same construct.
   - Criterion-Related Validity: Ensure the test predicts the outcomes it is supposed to predict, assessed through correlation with a criterion measure.
o Reliability: Ensure the assessment yields consistent results over time and across different administrations.
   - Cronbach’s Alpha: A measure of internal consistency, with a value of 0.7 or higher generally indicating acceptable reliability.
   - Test-Retest Reliability: Assess stability over time by administering the same test to the same group at different points in time. A correlation coefficient (e.g., Pearson’s r) of 0.7 or higher indicates good reliability.
   - Inter-Rater Reliability: Ensure consistency across different raters, measured by Cohen’s kappa or intraclass correlation coefficients (ICC), with values above 0.75 indicating good agreement. (A worked sketch of the reliability checks follows this list.)
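To make these checks concrete, here is a minimal sketch (Python with NumPy; the score data are hypothetical, and in practice a full statistics package would normally be used):

```python
import numpy as np

def cronbachs_alpha(scores: np.ndarray) -> float:
    """Internal consistency for a (students x items) score matrix."""
    n_items = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_var_sum / total_var)

# Hypothetical data: 6 students x 4 dichotomously scored items
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 0],
])
print(f"Cronbach's alpha = {cronbachs_alpha(scores):.2f} (acceptable if >= 0.7)")

# Test-retest reliability: Pearson's r between two administrations
test1 = np.array([55, 62, 70, 48, 80, 66])
test2 = np.array([58, 60, 72, 50, 78, 70])
r = np.corrcoef(test1, test2)[0, 1]
print(f"Test-retest r = {r:.2f} (good if >= 0.7)")
```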
| Aspect | Description | Standard Values |
|---|---|---|
| Pilot Testing | Conduct preliminary testing to identify and address issues with test items. | Pilot test sample size; representative sampling |
| Item Analysis | Use statistical techniques to refine and improve test items. | Difficulty Index: 0.3 - 0.7; Discrimination Index > 0.2 |
| International Standards | Align with ITC, AERA, APA, and NCME standards and benchmarks. | Regular review and alignment with international benchmarks |
| Minimum Values | Ensure a Cronbach’s alpha of 0.7+ for reliability and a validity coefficient of 0.3+. | α ≥ 0.7; Validity Coefficient ≥ 0.3; Test-Retest ≥ 0.7; Inter-Rater Reliability ≥ 0.75 |
2.5.5.2 Standard 9: Results Analysis (Internal and External)
Conducting thorough internal and external analysis of assessment results is crucial for
identifying trends, strengths, and areas for improvement. This involves using statistical
methods to analyze the data and generate meaningful reports. These reports provide valuable
insights that can inform instructional practices and policy decisions. By analyzing the results
comprehensively, educators and policymakers can make data-driven decisions to enhance the
quality of education.
Criteria:
o Conduct thorough internal and external analysis of assessment results to identify trends,
strengths, and areas for improvement.
o Use statistical methods to analyze data and generate meaningful reports (a brief worked sketch follows).
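As a brief worked sketch of such analysis (Python standard library; the marks are hypothetical):

```python
import statistics as stats

marks = [72, 58, 91, 45, 66, 80, 39, 74, 62, 85]  # hypothetical marks out of 100
pass_mark = 40  # per the IBCC decision to raise passing marks to 40% (Chapter 8)

print(f"Mean:      {stats.mean(marks):.1f}")
print(f"Median:    {stats.median(marks):.1f}")
print(f"Std. dev.: {stats.stdev(marks):.1f}")
pass_rate = sum(m >= pass_mark for m in marks) / len(marks)
print(f"Pass rate: {pass_rate:.0%}")
```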
2.5.6 Standards for Digitization of Examinations
The standards for digitization of examinations focus on integrating technology-driven solutions
to enhance the efficiency, security, and fairness of the assessment process. These standards
include the use of digital question paper delivery systems, automated proctoring tools, and
electronic marking platforms to ensure accuracy and minimize human errors. By adopting
innovative practices such as Question Item Banks (QIBs) and real-time monitoring systems,
digitization ensures a transparent and streamlined examination process that meets modern
educational demands.
2.5.6.1 Standard 11: Digitization Pre-Examination:
Digitizing the pre-examination processes involves developing and implementing digital tools
for curriculum mapping, SLO alignment, and creating the Table of Specifications (ToS).
Online platforms can be used for item bank development and question paper creation, ensuring
a streamlined and efficient process. These digital tools enhance accuracy and consistency,
making the pre-examination phase more effective and manageable.
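The framework does not prescribe a storage format for item banks, so the following is only a sketch of what one entry might hold; every field name here is a hypothetical illustration, not an IBCC specification:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ItemRecord:
    item_id: str                   # unique identifier within the bank
    subject: str
    slo_code: str                  # Student Learning Outcome the item targets
    cognitive_level: str           # Bloom's level, e.g., "Apply"
    item_type: str                 # "MCQ", "short-answer", "essay", ...
    stem: str                      # the question text
    options: List[str] = field(default_factory=list)  # for MCQs
    key: str = ""                  # correct answer or marking points
    difficulty_index: Optional[float] = None           # filled in after piloting
    discrimination_index: Optional[float] = None
    status: str = "draft"          # draft -> reviewed -> moderated -> approved

item = ItemRecord(
    item_id="MATH-ALG-0042",
    subject="Mathematics",
    slo_code="M9-ALG-03",
    cognitive_level="Apply",
    item_type="MCQ",
    stem="Solve for x: 2x + 3 = 11",
    options=["x = 3", "x = 4", "x = 5", "x = 7"],
    key="x = 4",
)
```

Carrying the SLO code and cognitive level on every record is what allows blueprinting and ToS checks to be automated in the phased plan below.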
| Plan Period | Objective | Actions | Outcome |
|---|---|---|---|
| Short-Term | Curriculum and SLO Mapping | Develop digital tools for aligning assessments with curriculum and SLOs; train educators | Initial digital mapping tools in use |
| Short-Term | Item Bank Development | Create initial digital item bank for key subjects; conduct workshops | Basic item bank available; educators trained |
| Short-Term | Table of Specifications (ToS) | Develop and implement digital ToS template; train examination boards | Digital ToS in use; boards trained |
| Mid-Term | Curriculum and SLO Mapping | Expand digital mapping tools; regular updates based on feedback | Comprehensive mapping tools; regular updates |
| Mid-Term | Item Bank Development | Expand item bank to more subjects; periodic reviews and updates | Expanded and updated item bank |
| Mid-Term | Table of Specifications (ToS) | Enhance the ToS system with automated checks; ongoing training | Advanced ToS features; continuous educator training |
| Long-Term | Curriculum and SLO Mapping | Integrate advanced analytics; universal tool usage | Refined alignment process; universal adoption |
| Long-Term | Item Bank Development | Maintain comprehensive item bank; implement AI-driven analysis | Extensive item bank; AI-enhanced items |
| Long-Term | Table of Specifications (ToS) | Fully integrate ToS into curriculum planning; universal use | ToS integrated into planning; universal adoption |
Digitization During Examination:

| Plan Period | Objective | Actions | Outcome |
|---|---|---|---|
| Short-Term | Digital Administration | Pilot digital exams in select schools; implement real-time monitoring and invigilation | Pilot digital exams conducted; monitoring in place |
| Short-Term | Security Protocols | Establish digital security protocols; train staff | Enhanced security measures; trained staff |
| Mid-Term | Digital Administration | Expand digital exams to more schools; improve systems based on feedback | Widespread digital exam adoption; system improvements |
| Mid-Term | Security Protocols | Implement advanced security measures; regular reviews | Advanced security in all centers; regular updates |
| Long-Term | Digital Administration | Implement fully digital exams nationwide; use AI and advanced tools for monitoring | Nationwide digital exams; AI-enhanced monitoring |
| Long-Term | Security Protocols | Continuous updates to security measures; universal implementation | Cutting-edge security measures universally applied |
Digitization Post-Examination:

| Plan Period | Objective | Actions | Outcome |
|---|---|---|---|
| Short-Term | Digital Scoring and Analysis | Implement digital scoring for MCQs; train markers on digital rubrics | Digital scoring for MCQs; trained markers |
| Short-Term | Results Dissemination | Develop an online results platform; create a feedback system | Online platform available; feedback system in place |
| Mid-Term | Digital Scoring and Analysis | Full digital scoring implementation; regular training sessions | Comprehensive digital scoring; continuous training |
| Mid-Term | Results Dissemination | Enhance the results platform with detailed reports; collect and analyze feedback | Improved platform; regular stakeholder feedback |
| Long-Term | Digital Scoring and Analysis | Use AI for scoring and analysis; continuous training | AI-driven scoring; regular marker training |
| Long-Term | Results Dissemination | Develop a comprehensive digital platform integrating all aspects; regular updates | Advanced platform updated with the latest technology |
2.6 Capacity Building and Training
The successful implementation of the Model Assessment Framework (MAF) relies heavily
on the capacity and expertise of assessment staff, board officers, and teachers involved in the
examination and assessment process. Effective capacity building and training are essential to
ensure that all stakeholders are equipped with the knowledge and skills needed to carry out
standardized and reliable assessments aligned with the framework.
The Inter Boards Coordination Commission (IBCC) will be pivotal in facilitating training
programs designed to empower examination board officers and teachers. These programs will
focus on various aspects of the assessment process, including test item development,
evaluation techniques, and adherence to the guidelines and standards outlined in the
MAF. Special attention will be given to enhancing teachers' ability to implement SLO-based
teaching and assessments, ensuring alignment between classroom practices and examination
requirements. Additionally, the IBCC will spearhead efforts to digitalize the examination
process, enabling more efficient and secure management of assessments. As part of this
initiative, the IBCC will assist boards in developing and implementing indigenous Question
Item Banks (QIBs). These QIBs will standardize the examination process by providing a
centralized repository of validated items, ensuring fair and balanced assessments across all
regions. Trainings will be conducted for each phase of the examination process.
Pre-Examination Training:
• Conduct workshops and training sessions focused on curriculum alignment, question
setting, and ethical standards. Ensure all personnel undergo at least 10 hours of training
annually on assessment design and integrity.
• Regularly assess the capacity and readiness of question setters and planners. Use
feedback from past assessments to improve training programs and address gaps in
knowledge.
Post-Examination Training
• Offer training on marking schemes, data handling, and result compilation. Require
markers and graders to participate in standardization meetings and training on unbiased
grading techniques.
• Evaluate the performance of markers and graders through double marking and
consistency checks. Implement peer reviews and moderation sessions to ensure
accuracy and fairness in grading.
Table 9. Criteria Summary for Capacity Building

| Stage | Personnel | Training and Capacity Building | Monitoring and Evaluation |
|---|---|---|---|
| Pre-Examination | Curriculum developers, question setters, exam planners, admin staff | Annual 10-hour training, workshops | Regular assessments, feedback from past assessments |
| During Examination | Examiners, invigilators, support staff, IT personnel | Certification program, invigilation techniques | Spot checks, feedback mechanisms |
| Post-Examination | Markers, graders, data analysts, result compilers | Standardization meetings, unbiased grading techniques | Peer reviews, double marking, consistency checks |
In addition, the MAF provides guiding principles for teacher-education departments to update
their teacher-education curricula. This will enable future teachers to develop the competencies
required to meet emerging educational challenges and ensure the effective implementation of
the framework. By integrating MAF-aligned strategies into teacher education, institutions can
prepare teachers to adopt innovative teaching and assessment methods that align with 21st-
century educational demands. Through structured training, capacity-building initiatives, and
curriculum updates in teacher education, the IBCC aims to ensure that all assessment personnel
are well-prepared to implement the MAF effectively, contributing to a more equitable and
transparent education system in Pakistan.
Chapter 3. Requirements/Alignment with Standards of Curriculum
in a team setting, have an acquired knowledge base, be able to extend and refine knowledge,
be able to construct new knowledge and applications and have a habit of self-assessing their
assimilation of each dimension in their everyday decision-making process. The curriculum is
built upon Standards, Benchmarks, and Learning Outcomes to benefit student growth and
progress.
STANDARDS are what students should know and be able to do. Standards are broad
descriptions of the knowledge and skills students should acquire in a subject area. The
knowledge includes important and enduring ideas, concepts, issues, and information. The
skills that characterize a subject area include thinking, working, communication, reasoning,
and investigating. Standards may emphasize interdisciplinary themes and concepts in the core
academic subjects.
Standards are based on:
o Higher Order Thinking:
It involves students manipulating information and ideas by synthesizing, generalizing,
explaining, or arriving at conclusions that produce new meaning and understanding.
o Deep Knowledge:
It addresses the central ideas of a topic or discipline with enough thoroughness to explore
connections and relationships and to produce a relatively complex understanding.
o Substantive Conversation:
Students engage in extended conversational exchanges with the teacher and/or peers about
the subject matter in a way that builds an improved and shared understanding of ideas or
topics.
o Connections to the World Beyond the Classroom:
Students make connections between substantive knowledge and either public problems or
personal experiences.
BENCHMARKS indicate what students should know and be able to do at various
developmental levels.
LEARNING OUTCOMES indicate what students should know and be able to do for each
topic in any subject area at the appropriate developmental level. The Learning Outcomes sum
up the total expectations from the student.
o Alignment with Learning Outcomes: Items should directly assess specific Student
Learning Outcomes (SLOs). Each item must measure the intended knowledge or skill.
o Clarity and Simplicity: Language used in items should be clear, concise, and appropriate
for the target grade level. Avoid complex or ambiguous wording that may confuse students.
o One Focus per Item: Each item should focus on one skill or concept to ensure accurate
measurement.
o Bias and Sensitivity: Items should be free from cultural, gender, or socio-economic biases that could disadvantage any group of students.
o Content Validity: Ensure each item covers relevant content aligned with the curriculum.
o Cognitive Level: Check that items measure the appropriate cognitive levels based on
Bloom’s Taxonomy (e.g., knowledge, application, analysis).
o Fairness and Accessibility: Items should be reviewed for fairness and accessibility,
including ensuring they are understandable for students with disabilities.
Moderation Rules
Moderation ensures consistency and fairness across different assessments. The moderation process involves:
o Expert Review: Subject matter experts should review items for alignment with curriculum
and assessment goals.
o Pilot Testing: Pilot items with a sample group of students to identify issues related to item
difficulty, discrimination, and clarity.
o Statistical Analysis: Identify problematic items using item analysis techniques (e.g.,
Difficulty Index, Discrimination Index).
o Final Approval: Only items that meet all content, cognitive, and fairness standards should
be included in the assessment.
o Bad Items: These questions either ask for rote memorization, lack context, or are too
simplistic, failing to engage students at higher cognitive levels.
o Good Items: These questions encourage deeper thinking, ask students to apply
knowledge to real-world situations, or analyze concepts in a context aligned with
Bloom’s Taxonomy.
The following is a clearer set of Bad Items and Improved Items based on common subjects for grades 9 to 12, highlighting how each Improved Item addresses the limitations of the corresponding Bad Item while aligning better with the SLOs and fostering higher-order thinking.
Table 10. Bad Item Examples and Their Improved Items

Sr. No. | Bad Item | Rationale | Improved Item
2 | State Newton's First Law. | Only asks for the law without requiring an application. | Describe how Newton's First Law applies to a car coming to a sudden stop, and explain the role of seat belts in preventing injuries.
3 | Define an acid. | Simple recall with no deeper thought or application. | Compare the properties of strong and weak acids and provide examples of how they are used in everyday life.
5 | When was the Lahore Resolution passed? | Focuses only on a date, encouraging memorization. | Analyze the significance of the Lahore Resolution of 1940 and its role in the formation of Pakistan.
7 | What is Hajj? | Asks only for a basic definition. | Explain the spiritual significance of Hajj in Islam and how it fosters unity among Muslims globally.
5.3. Transparency
It has always been imperative to maintain transparency in the Examinations, as transparency is instrumental in conducting a fair Exam. All possible measures are taken to achieve this goal, including the following.
o A team of the highest level of integrity undertakes the composing and printing of Question
Papers.
o The Sealed Question Papers are handed over to the Banks for safe custody. On each day
of the Exam, the relevant Question Paper Packets are handed over to the Superintendent of
the Examination center.
o The supervisory staff is appointed through a draw to rule out the possibility of arbitrariness.
o Independent monitoring of the proceedings during the Exam should be carried out through
Inspections.
o Online surveillance of the Exam Centers should be introduced, and cameras should be installed in every Exam center to provide a real-time view of the activities.
Cheating and examination malpractices undermine the integrity of the assessment process and
compromise the quality of education. Addressing these issues requires innovative strategies,
robust policies, and the integration of digital technology to ensure fairness and transparency.
This section provides guidelines and recommendations to control cheating and malpractices
during the conduct of examinations.
Digital tools play a transformative role in reducing cheating and ensuring secure examination
processes. The following technologies can be integrated into the examination system:
o Marking Key: Marking keys for each subject are prepared by the concerned subject experts. The marking key is shared with all the Head Examiners, and it is ensured that marking is strictly carried out in accordance with the marking key.
o Subject Coordinators: Subject coordinators are appointed in each major subject to examine the prescribed ratio of Answer Scripts, to ensure uniformity in marking across the groups, and to check whether marking is being undertaken according to the marking key. In case of any deviation from the given marking key, necessary corrective measures are taken immediately.
o Super Checking: The super checkers are appointed for each subject as a part of the policy
of multi-layered checking of the Answer Scripts to ensure that the possibility of mistakes
in totaling marks is ruled out. The super checkers also point out the unmarked questions,
and if any such case is found, the same is returned to the concerned Head Examiner for
correction.
6.3. Compilation of Results
The compilation of results is a critical step in ensuring the accuracy and reliability of
examination outcomes. This process involves decoding and entering award lists, scrutinizing
computerized entries against original records, and consolidating data into final results.
o Decoding and Entry of Award Lists: The award lists obtained after marking are handed over to the IT staff, who enter the awards into the computer database. The fictitious numbers are decoded, and marks are allotted against the original Roll numbers.
o Scrutiny of Award Lists: The computerized award lists are scrutinized against the original award lists. This process further confirms that the marks posted on scripts match those entered in the database.
o Compilation: The results are compiled through software in consolidated form. The
decisions of the UFM cases and absent cases are also included in the result.
o Use of Information Technology: Information Technology has greatly improved the compilation of results, making it more accurate and faster. One such measure is the introduction of On-Screen Marking (OSM), which enhances marking accuracy and saves time. E-marking allows markers to mark a scanned script or online response on a computer screen rather than on paper.
• Improves Test Quality: Identifies poorly performing items for revision or removal,
ensuring that assessments are fair and valid.
• Supports Equity: Reduces bias and ensures all items are accessible to students from
diverse backgrounds.
• Informs Test Development: Provides data-driven insights for creating balanced and
effective assessments in future test cycles.
• Enhances Decision-Making: Supplies examination boards with reliable metrics to
refine assessments and improve educational outcomes.
• Item Banking: Ensures that only high-quality items are added to the Question Item
Bank (QIB) for future use.
• Standardized Testing: Supports the creation of exams that are consistent, reliable,
and aligned with curriculum standards.
• Policy Decisions: Informs examination boards and policymakers about the
effectiveness of assessment practices.
Distractor Analysis is a crucial component of item analysis that evaluates the effectiveness of
incorrect answer choices (distractors) in multiple-choice questions. Its primary purpose is to
determine whether distractors are functioning as intended—plausible enough to challenge test-
takers who lack the correct knowledge while being unlikely to confuse those who do.
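To make this concrete, the sketch below tallies option choices for a single multiple-choice item and flags rarely chosen distractors. It is a minimal illustration in Python: the response data, the 5% threshold, and the helper name distractor_report are all assumptions for demonstration, not part of any board's toolchain.

```python
from collections import Counter

def distractor_report(responses, correct_option):
    """Tally how often each option was chosen and flag weak distractors.

    responses: list of option labels chosen by students, e.g. ["A", "C", ...]
    correct_option: the keyed answer, e.g. "B"
    """
    counts = Counter(responses)
    total = len(responses)
    for option in sorted(set(responses) | {correct_option}):
        share = counts[option] / total
        role = "KEY" if option == correct_option else "distractor"
        flag = ""
        # A distractor chosen by fewer than ~5% of students is likely too
        # obviously wrong and may need revision (threshold is illustrative).
        if role == "distractor" and share < 0.05:
            flag = "  <- rarely chosen, consider revising"
        print(f"{option} ({role}): {share:.0%}{flag}")

# Hypothetical responses from 25 students to one MCQ keyed "B"
distractor_report(list("BBABCBBDBBCBBABBBCBBABBCB"), "B")
```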
• Data Visualization: The dashboard presents student performance data through charts,
graphs, and heatmaps, making complex data easy to understand and interpret.
• Comparative Analysis: It enables performance comparison across different schools,
districts, and demographic categories, identifying trends and areas for improvement.
• Performance Insights: The tool highlights performance gaps, strengths, and
weaknesses, offering actionable insights to tailor interventions for students and schools.
• Transparency and Accountability: The dashboard promotes transparency and fosters
trust among stakeholders by centralizing and standardizing result data.
• Real-Time Access: Administrators can quickly access real-time data to respond to
emerging challenges and opportunities.
The Exam Result Dashboard is an important step in embracing the digital transformation within
Pakistan's education system. It empowers educational leaders with the ability to evaluate
outcomes effectively, ensuring that resources and efforts are directed toward enhancing student
achievement and fostering equitable educational opportunities nationwide.
o Holistic View: Combining student performance, item analysis, and contextual data to
create a comprehensive picture of student achievement. This holistic approach allows for
more nuanced insights and targeted interventions.
o Informed Decision-Making: Using the integrated data to inform decisions about
curriculum adjustments, teaching strategies, and resource allocation. Educators can make
more effective and equitable decisions to support student learning by considering all
relevant factors.
i. Difficulty Index: A measure of the proportion of students who answered the item correctly, indicating how easy or difficult the question is.
Calculation: Number of students who answered the item correctly divided by the total
number of students. For example, if 30 out of 50 students answered a question
correctly, the difficulty index would be 0.6.
ii. Discrimination Index: A measure of how well an item differentiates between high-
performing and low-performing students.
Purpose: Identifies the effectiveness of each item in distinguishing between students
who understand the material and those who do not. A high discrimination index means
the item is good at distinguishing between different levels of student performance.
Calculation: Typically calculated by comparing the performance of the top 27% of
students to the bottom 27%. For example, if an item is correctly answered by 80% of
the top students and 30% of the bottom students, the discrimination index would be
0.5.
iii. Distractor Analysis: Examination of the incorrect response choices (distractors) to
understand patterns in student mistakes.
Purpose: Identifies whether distractors are functioning effectively by attracting
students who do not know the correct answer. This analysis helps in improving the
quality of multiple-choice questions by ensuring that distractors are plausible and
discriminating.
Process: Reviewing the frequency with which each distractor is chosen. For example,
if a distractor is rarely chosen, it might be too obviously incorrect and need revision.
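As a minimal Python sketch of the Difficulty and Discrimination Indices defined above, the example below uses the conventional top/bottom 27% split; the student scores and item responses are invented purely for illustration.

```python
def difficulty_index(item_correct):
    """Proportion of all students answering the item correctly (0 to 1)."""
    return sum(item_correct) / len(item_correct)

def discrimination_index(item_correct, total_scores, fraction=0.27):
    """Difference in the item's facility between the top and bottom groups.

    item_correct: 1/0 per student for this item
    total_scores: each student's total test score (used for ranking)
    fraction: share of students in each tail group (27% is conventional)
    """
    n = max(1, int(len(total_scores) * fraction))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    top, bottom = ranked[:n], ranked[-n:]
    p_top = sum(item_correct[i] for i in top) / n
    p_bottom = sum(item_correct[i] for i in bottom) / n
    return p_top - p_bottom

# Hypothetical data: 10 students' total scores and one item's 1/0 results
totals = [95, 90, 88, 80, 75, 70, 60, 55, 40, 35]
item = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
print(difficulty_index(item))              # 0.5
print(discrimination_index(item, totals))  # top vs bottom 27% of 10 -> n=2
```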
ii. ANOVA (Analysis of Variance): A statistical method for comparing means across
three or more groups.
Purpose: To identify if differences exist among multiple categories, such as regions or
school types.
Application: Comparing average scores of students from different provinces in
science.
Example: Evaluating if students from Punjab, Sindh, and KPK perform differently in
biology.
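As a sketch of how such a comparison might be run, the example below applies SciPy's one-way ANOVA to invented biology-score samples for three provinces; the data and the 0.05 significance threshold are illustrative assumptions, not real results.

```python
from scipy import stats

# Hypothetical biology scores for small samples from three provinces
punjab = [72, 68, 75, 80, 66, 74, 71]
sindh = [65, 70, 62, 68, 64, 69, 66]
kpk = [70, 73, 69, 75, 68, 72, 71]

# One-way ANOVA tests whether the group means differ significantly
f_stat, p_value = stats.f_oneway(punjab, sindh, kpk)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one province's mean biology score differs significantly.")
else:
    print("No significant difference detected among the provinces.")
```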
iv. Linear Regression: A statistical technique to predict the value of a dependent variable
based on one independent variable.
Purpose: To quantify the impact of a single factor on student performance.
Application: Predicting student scores based on their attendance percentage.
Example: Estimating the likely score in English based on the number of practice tests
attempted.
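A brief sketch of this idea using SciPy's linregress, with invented attendance and score figures; the data and the 92% prediction point are purely illustrative.

```python
from scipy import stats

# Hypothetical data: attendance percentage and English scores for ten students
attendance = [60, 65, 70, 75, 80, 85, 88, 90, 95, 98]
english = [48, 52, 55, 60, 63, 70, 72, 75, 82, 85]

# Fit score = slope * attendance + intercept
result = stats.linregress(attendance, english)
print(f"slope = {result.slope:.2f}, intercept = {result.intercept:.2f}, "
      f"r^2 = {result.rvalue**2:.2f}")

# Predict the likely English score of a student with 92% attendance
predicted = result.slope * 92 + result.intercept
print(f"Predicted score at 92% attendance: {predicted:.1f}")
```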
vi. Time Series Analysis: A technique for analyzing trends over time.
Purpose: To examine performance patterns and predict future outcomes.
Application: Tracking student performance in national exams over the past five years.
Example: Analyzing yearly trends in mathematics scores to predict next year’s
performance.
vii. Predictive Modeling: The use of statistical techniques to forecast outcomes based on
historical data.
Purpose: To predict future performance or identify at-risk students.
Application: Creating models to predict dropout rates or high-performing students.
Example: Forecasting which students are likely to excel in STEM subjects based on
their grade trends.
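As an illustrative sketch of trend-based forecasting in this spirit, the example below fits a linear trend to five years of invented mean mathematics scores and extrapolates one year ahead. Real predictive models would use richer data, more sophisticated methods, and validation; the years and scores here are assumptions.

```python
import numpy as np

# Hypothetical mean mathematics scores in national exams over five years
years = np.array([2019, 2020, 2021, 2022, 2023])
mean_scores = np.array([58.0, 56.5, 60.0, 61.5, 63.0])

# Fit a linear trend (degree-1 polynomial) to the yearly means
slope, intercept = np.polyfit(years, mean_scores, 1)

# Extrapolate the trend to forecast the next year's mean score
forecast_2024 = slope * 2024 + intercept
print(f"Trend: {slope:+.2f} points per year")
print(f"Forecast mean mathematics score for 2024: {forecast_2024:.1f}")
```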
The reporting and interpretation of results is an important phase in the assessment process,
transforming raw examination data into meaningful insights for stakeholders. Effective
reporting not only highlights overall achievement levels but also identifies trends, gaps, and
areas for improvement. By combining statistical analysis with detailed interpretation, this
phase ensures that assessment results serve as a valuable resource for enhancing educational
practices and outcomes.
o Remediation Plans: Suggesting structured remediation plans for students who need extra support, including after-school programs, peer tutoring, or individualized instruction.
o Professional Development: Recommending professional development opportunities for teachers to enhance their skills in areas where students are struggling. For example, if students consistently perform poorly in geometry, teachers might benefit from training in effective geometry instruction techniques.
Effective communication of assessment results is essential for ensuring that all stakeholders—
students, parents, teachers, administrators, policymakers, and examination boards—can
understand and act on the insights derived from the data. This process involves presenting
results in a clear, concise, and accessible format, using tools like reports, dashboards, and
graphical representations to cater to diverse audiences. Tailored communication ensures that
each stakeholder receives relevant information, whether it is detailed analysis for policymakers,
performance summaries for teachers, or individual feedback for students. By bridging the gap
between data and decision-making, clear communication fosters transparency, builds trust, and
empowers stakeholders to take informed actions to improve educational outcomes.
insights that can inform policy decisions. This might include trends in student
performance, areas of concern, and potential causes of disparities.
o Evidence-Based Recommendations: Offering evidence-based recommendations for policy changes or initiatives aimed at improving educational outcomes, for example, suggesting the introduction of new instructional programs or changes to existing curriculum standards.
o Clear Communication: Ensuring that policy briefs are clear, concise, and accessible
to non-specialists, including policymakers, school administrators, and other
stakeholders.
7.9.4 Educational Authorities for Strategic Planning
o Long-Term Goals: Using assessment data to set long-term educational goals and priorities. For example, aiming to improve math proficiency rates by a certain percentage over the next five years.
o Resource Allocation: Informing decisions about resource allocation based on identified needs. For example, directing additional funding or support to schools or regions with lower performance.
o Monitoring Progress: Establishing metrics and benchmarks to monitor progress
towards strategic goals. Regularly reviewing assessment data to track progress and
make adjustments as needed.
o Stakeholder Engagement: Engaging with a broad range of stakeholders, including
teachers, parents, and community members, to gather input and build support for
strategic initiatives.
7.10 Conclusion
1. The pass bar is being raised from 33% to 40% with the intent to produce quality results. Therefore, the award of grace marks may be stopped upon implementation of the new grading scheme.
2. Up to a maximum of seven grace marks may be awarded in one subject only for those
students appearing in second attempts for improvement of grades. There will be no
indication on the result card/certificate about the award of grace marks.
3. The respective Education Departments may refine student learning outcomes (SLOs) and revise examination syllabi based on the National Curriculum 2006 for SSC and HSSC. All concerned teacher training institutions in the respective Provinces may provide training to teachers according to the SLOs, and Provincial Governments may be requested to impart the necessary training to teachers for the purpose.
4. Textbook Boards and Curriculum Bureaus of the respective Provinces may ensure minor changes in text content every year to discourage the production of Helping Books/Guides. Textbook Boards are to print textbooks annually so that the preparation of cheating material in Guides and Pockets is discouraged. Paper setters will prepare original questions and will be barred from setting papers out of exercises, guides, and papers from the previous three or four years. Provincial Governments and Textbook Boards may be requested to make these yearly textbook changes.
5. All BISEs may set uniform examination standards across the board for both SSC and
HSSC examinations based on the scheme of assessment, type of questions/items,
cognitive levels, and weighting for objective/subjective questions.
6. A gradual move is highly recommended toward higher cognitive level questions,
starting with Knowledge 30%, Understanding 50%, and Application 20%. However,
the range may vary from subject to subject and Board to Board.
7. BISEs should ensure the selection of competent professionals and build their capacity as item writers, reviewers, paper setters, markers, and other exam-related professionals. Detailed SOPs and guidelines are provided in this policy document. IBCC may offer capacity-building training to officers/teachers involved in assessment.
8. Teacher training institutions in the respective Provinces may organize intensive training courses to shift teaching and learning from single textbooks to curriculum/SLO-based classroom practices, in collaboration with the Provincial Education Departments, education institutions, and other key stakeholders.
9. Establish Item Banks at the IBCC level where reviewed and piloted items are regularly and securely stored as part of the examination development process. The Boards at the provincial level may develop one Item Bank as a Central Repository in one of the Boards having appropriate facilities, such as Lahore for Punjab, Peshawar for KP, Karachi for Sindh, Quetta for Balochistan, and the Federal Board for AJK/GB. The Central Item Bank will be established at the IBCC level to facilitate other Boards through a regular in/outflow of items/questions once all the provinces have adopted a uniform Scheme of Studies. In setting every paper, 20% of the items shall be taken from the Central Item Bank of IBCC to ensure national uniformity and standardization. BISEs, BTEs, and the IBCC Secretariat will jointly work to develop the Central Item Bank and determine the percentage of items to be taken from it by the member Boards.
10. The practical examination shall be separated from the theory. Students have to pass theory and practical separately. If a student fails in practical but has passed theory, they will appear only in practical in the next exam, and vice versa.
11. The practical examination shall be conducted in two phases.
a. The written component, based on MCQs, will be evaluated through OMR in the concerned BISEs. The written practical examination for all subjects will be conducted on the same day and shall be reflected in the date sheet of the commencing examination.
b. The concerned teacher of the respective subject shall act as the examiner of his/her class.
12. It was recommended that, irrespective of the allocation of marks for the practical examinations, the practical exam should be separated from the theory, and a student who fails the practical may retake the practical only, and vice versa. The passed portion of the exam, whether theory or practical, should remain to the student's credit, and they may retake only the failed portion. It was also recommended that Practical Notebooks marked and certified by the examiners may also be deposited with the Boards.
13. All BISEs need to share the best practices in Pre-Conduct, During Conduct, and Post-
Conduct SOPs for conducting fair and transparent examinations at the IBCC Forum.
14. All BISEs may select examination centers carefully based on defined criteria. All BISEs may select supervisory staff with good reputations and integrity and train them on their critical roles and responsibilities. They may also make examination/assessment/marking duty obligatory for teachers/principals, with no excuse accepted for refusing examination duty.
15. Supply examination material on time, ensure the safety and security of the material through checks and balances, and ensure the security of answer scripts during packing at the centers and dispatch to Board offices. Maintain security and the law and order situation at the centers in collaboration with the local administration and education institutions. Develop a secure and safe storage system for answer scripts at the Board offices and ensure their safety and security during the marking process.
16. Research & Development Units must be established at each Board, with capacity building and the provision of necessary facilities, to evaluate results, categorize institutions based on results (subject-wise), and convey the same to the institutions.
17. Prepare and send "Performance Reports" to affiliated institutions highlighting key achievements as compared to overall averages, along with the areas needing improvement, preferably subject-wise.
18. Research key issues. Each Board is expected to conduct at least one research study annually based on the analysis of the annual SSC and HSSC results. The IBCC Secretariat may urge the member Boards to undertake regular research and analysis of results and will remain in touch with them on this subject.
19. Recheck the marks of the top twenty position holders by subject/group and overall, and maintain zero tolerance in preparing examination results.
20. Organize a Heads of Institutions conference as an annual feature to discuss ways and means of improving school/college performance based on lessons learned, and organize review and follow-up meetings involving officers of the Education Department, Heads, and other key stakeholders. Boards (BISEs and BTEs) will organize the annual conference at their respective levels; however, IBCC shall monitor implementation by the Boards.
According to the new grading scheme, students will be graded based on GPA and CGPA, which are also proposed to be included in certificates along with grades.
What is GPA?
Grade point average (GPA) is a raw score average based on letter grades. Each letter grade is assigned a numerical value on a 0-4 or 0-5 point scale, depending on the institution. Unfortunately, there is no universal way to calculate GPA, since methods vary by country and by institution. However, the basic approach is to find the grading scale, translate each letter grade to its corresponding numerical value within the scale, and then average those values to find the current GPA.
How to Calculate GPA and CGPA
1. Record the point value for each grade. Using the five-point scale, write down the correct point value next to each grade. So, if you have an A+ in a class, record a 4.5.
2. Add up all the values of your grades. Say you received an A+ in Biology, a B+ in English, and a C in Chemistry; you would add up the totals as follows: 4.5 + 3.3 + 2.5 = 10.3.
3. Divide this final number by the number of courses you are taking. With a total of 10.3 on a 5-point scale for three courses, the GPA is 10.3 / 3 = 3.4.
Table 13. Example of Grading Points for Grade 12
GPA = (5.0 + 5.0 + 5.0 + 5.0 + 5.0 + 5.0 + 4.5) / 7 = 34.5 / 7 ≈ 4.93
CGPA = (GPA Grade-11 + GPA Grade-12) / 2 = (2.92 + 4.93) / 2 ≈ 3.92 out of 5
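A small Python sketch of the GPA and CGPA calculations described above. The partial letter-to-point mapping is taken from the worked example and is otherwise an assumption, since the full scale is set by the grading policy.

```python
# Point values from the worked example above (partial, illustrative mapping;
# the full scale is defined by the grading policy)
GRADE_POINTS = {"A+": 4.5, "B+": 3.3, "C": 2.5}

def gpa(points):
    """Average a list of grade-point values (simple, unweighted mean)."""
    return sum(points) / len(points)

def cgpa(gpa_grade11, gpa_grade12):
    """CGPA = (GPA Grade-11 + GPA Grade-12) / 2, as defined above."""
    return (gpa_grade11 + gpa_grade12) / 2

# Worked example: A+ in Biology, B+ in English, C in Chemistry
example = [GRADE_POINTS[g] for g in ["A+", "B+", "C"]]
print(round(gpa(example), 1))        # (4.5 + 3.3 + 2.5) / 3 = 3.4

# Table 13 example: six subjects at 5.0 points and one at 4.5
gpa_12 = gpa([5.0] * 6 + [4.5])      # ~4.93
print(round(cgpa(2.92, gpa_12), 2))  # ~3.92 over grades 11 and 12
```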
The HEC, PMC, and PEC have already been taken on board regarding the new grading policy and the new pattern of transcripts and certificates, so that admission criteria can be devised accordingly.
iv. Contextual Relevance:
o Where applicable, items should relate to real-life scenarios or culturally relevant
contexts to enhance understanding and engagement.
o Avoid culturally biased items that may disadvantage any group of students.
Step 6: Submit for Review: Once the items are finalized, submit them to the reviewers for
further evaluation, accompanied by the item’s intended SLO and cognitive level.
9.2.2 SOPs for Reviewers
The following are the SOPs and steps for reviewers, providing structured protocols to critically
evaluate and validate test items, ensuring they meet the required standards of quality, fairness,
and alignment with learning objectives.
Step 1: Review Alignment with SLOs: Check if each item aligns with a specific SLO.
Ensure that there is balanced coverage across all necessary content areas.
Step 2: Evaluate Cognitive Levels: Verify that a range of cognitive levels is represented,
ensuring a balance of recall, understanding, application, and analysis items.
Step 3: Test for Clarity and Simplicity: Review each item for clarity. Reword items that are
overly complex or confusing for students. Ensure the item instructions are clear and concise.
Step 4: Check for Bias and Inclusivity: Evaluate each item for potential cultural or gender
bias. Make necessary adjustments to ensure fairness across different student groups.
Step 5: Suggest Improvements: Provide constructive feedback to the item writer on how to
improve clarity, relevance, and cognitive level if necessary.
Step 6: Approve or Reject Items: Mark items as “approved,” “conditionally approved with
revisions,” or “rejected with reasons,” ensuring that only high-quality items are included in
the final assessment.
9.3 Guidelines and SOPs for Moderators
Moderators ensure that the overall test is fair, valid, and aligned with the curriculum.
Their role is to review the full set of items, ensuring balance in difficulty, cognitive
levels, and content coverage.
9.3.1 Guidelines for Moderators
The following are the guidelines for moderators, aimed at ensuring the quality and consistency
of assessments by overseeing the review and validation processes for test items and
examination papers.
i. Balanced Test Coverage:
o Ensure that the test adequately covers all key topics in the curriculum, with a
balanced representation of various subject areas.
o Ensure that the test has an appropriate mix of easy, moderate, and difficult items.
iv. Clarity and Structure:
o Ensure that the test follows a logical structure and that instructions for each section
are clear and easy to understand.
o Check the flow of the test, ensuring there are no abrupt shifts in difficulty or content
between sections.
v. Time Management:
o Ensure that the estimated time for completing the test is reasonable based on the
difficulty and number of items.
o Make sure that time limits for sections are clearly indicated.
Step 5: Prepare Moderation Report: Provide a detailed report on any changes made or
issues found during moderation, outlining how the test aligns with curriculum goals.
9.4 Guidelines and SOPs for Test Assembly
Test assembly involves putting together the final version of the assessment, ensuring that
items are logically ordered, visually clear, and balanced across topics and difficulty levels.
9.4.1 Guidelines for Test Assembly
The following are the guidelines for test assembly, focusing on creating balanced, fair, and
curriculum-aligned assessments by carefully selecting and organizing test items based on
cognitive levels, content coverage, and SLOs.
i. Logical Flow:
o Assemble items in a logical sequence, starting from easier items and progressing to
more challenging ones to prevent student fatigue.
o Group similar types of items (e.g., multiple-choice, short-answer) together.
ii. Clarity of Instructions:
o Ensure that instructions for each section are clear and unambiguous.
o Highlight any special instructions (e.g., for essays or problem-solving tasks) to avoid
confusion during the test.
iii. Visual Presentation:
o Ensure that the test is visually clear, with appropriate spacing between items and
enough space for students to write their answers.
o Check that diagrams, charts, and visuals are clearly printed and labeled.
iv. Consistency:
o Ensure that the formatting of all items (font, size, numbering) is consistent throughout
the test.
o Use clear and uniform numbering systems and make sure page numbers are clearly
indicated.
v. Time Allotment:
o Make sure that time allotments are indicated at the start of each section.
o Balance the time students are expected to spend on each section with the difficulty
and length of items.
v. Time Management:
o Start and end the test exactly on time. Provide warnings before the end of the test
period to help students manage their time.
vi. Handling Emergencies:
o Have a contingency plan in place to deal with emergencies (e.g., power outages,
medical issues).
o Train invigilators to handle disruptions while maintaining the integrity of the test
environment.
9.5.2 SOPs for Test Administration
The following are the SOPs for test administration, outlining standardized steps and
protocols for the smooth and secure execution of examinations, including handling unforeseen
situations and maintaining integrity.
Step 1: Distribute Test Materials: Ensure that students receive the correct test materials,
including question papers and answer sheets. Provide any required tools (e.g., calculators,
geometry sets).
Step 2: Monitor Students: Monitor students consistently throughout the test, ensuring that
no unauthorized collaboration or cheating occurs.
Step 3: Provide Time Warnings: Announce time warnings (e.g., 10 minutes remaining) to
help students manage their time.
Step 4: Handle Issues: If a student requires assistance (e.g., for clarification or technical
difficulties), handle it promptly without disrupting other students.
Step 5: Collect and Secure Test Materials: At the end of the test, collect all materials
promptly, ensuring no items are left behind. Securely store test papers for grading.
9.6 Guidelines and SOPs for Marking/Coding
The following are the guidelines and SOP for marking/coding, focusing on ensuring
consistency, accuracy, and impartiality in the evaluation process, while maintaining alignment
with predefined marking schemes.
9.6.1 Guidelines for Marking/Coding
Following are the key guidelines for marking and coding.
i. Use of Marking Schemes:
o Ensure that all markers are trained on and follow the official marking schemes or
rubrics.
o Marking schemes should provide clear guidance for each item, including
acceptable variations in responses.
iii. Confidentiality:
o Maintain confidentiality of student responses and grades during the marking
process.
iv. Timeliness:
o Ensure that marking is completed within the specified timeframe without
sacrificing quality or consistency.
v. Appeals Process:
o Establish a clear process for handling student appeals or requests for re-marking.
Resources
As part of the Model Assessment Framework’s commitment to enhancing assessment quality
and standardization, the Federal Board of Intermediate and Secondary Education (FBISE) has
developed a comprehensive subject-specific assessment framework for each core subject at the
SSC and HSSC levels, aligned with the 2022-23 Curriculum. These frameworks provide a
robust structure for educators and examiners to implement consistent and meaningful
assessments across Pakistan.
To access the detailed assessment frameworks for each subject, including SLOs and model
exam papers, please visit the IBCC’s website at https://ibcc.edu.pk/MAF/
This resource is a valuable tool for educators, examiners, and stakeholders, supporting the
successful implementation of the Model Assessment Framework by providing structured,
accessible subject-specific materials.
Lead Contributors in Developing the Model Assessment Framework Policy Document
Dr. Ghulam Ali Mallah
Executive Director, IBCC, Islamabad

Dr. Ghulam Ali Mallah is the Executive Director of the Inter Boards Coordination Commission (IBCC) and a distinguished scholar with a Post-doctorate in Educational Technology & Academic Leadership from the University of Glasgow. With over 26 years of experience in the education sector, Dr. Mallah has made significant contributions, particularly in digitalizing examination and certification processes in Pakistan. He has led a range of groundbreaking reforms in the national assessment system, focusing on digitalization, policy formulation, and improving efficiency. Under his leadership, innovative practices such as e-office systems and ISO certification have been successfully implemented at IBCC, transforming the assessment and certification landscape. His vision for incorporating technology into education has revolutionized the way assessments are conducted, ensuring transparency, inclusivity, and adherence to international standards. Dr. Mallah's forward-thinking approach continues to shape national policies, making lasting contributions to the quality and integrity of Pakistan's educational system.
Dr. Sajid Ali Yousuf Zai
Consultant, IBCC, Islamabad

Dr. Sajid Ali Yousuf Zai holds a PhD from the University of Arkansas, USA, and is an accomplished expert in educational research, data analytics, psychometric analysis, and large-scale assessments. With over 13 years of experience, he has contributed to various national policy documents for FBISE, IBCC, and HEC, including developing the Model Assessment Framework (MAF). His work digitalizes the assessment process, aligns assessments with Student Learning Outcomes (SLOs), and advances reliable, inclusive, and standardized assessments. His expertise in Artificial Intelligence and EdTech tools enabled him to develop comprehensive and interactive Exam Result Dashboards for examination boards, enhancing data visualization, analysis, and reporting to support informed decision-making. Dr. Sajid has also conducted numerous professional trainings for teachers to enhance their pedagogical skills and research capabilities and prepare them to meet the challenges of 21st-century education. Additionally, he is the author of two published books: "Quantitative Data Analysis: Simply Explained using SPSS" and "21st Century Skills Development".
Dr. Muhammad Shafique
Chairman BISE, Abbottabad

Dr. Muhammad Shafique is the Chairman of the Board of Intermediate and Secondary Education (BISE), Abbottabad, and a seasoned educational assessment and evaluation expert. He has led numerous initiatives, including developing Student Learning Outcome (SLO) based examination systems and implementing rubric-based on-screen marking. Dr. Shafique's efforts have significantly contributed to the modernization of the assessment framework across Khyber Pakhtunkhwa, ensuring that exams align with national standards and Pakistan's evolving educational needs. His work in teacher training and capacity building continues to impact the country's education system.
Mirza Ali
Director, Research & Academics, FBISE

Mirza Ali is the Director of Research and Academics at FBISE, with over 20 years of experience in academic leadership and assessment strategies. He has significantly contributed to secondary education, particularly in developing assessment frameworks and aligning examinations with modern educational standards. As a key contributor to the IBCC Assessment Framework, he has played a vital role in its conceptualization and refinement, ensuring alignment with international best practices for student evaluation. Throughout his professional career, Mirza Ali has remained dedicated to enhancing the quality of education by focusing on student-centric learning and the alignment of curriculum and assessment practices with modern educational standards. As a director, he has spearheaded several initiatives to improve the assessment processes within the federal education system. His work focuses on holistic student assessment and emphasizes the integration of cognitive levels in assessments, catering to knowledge, understanding, application, analysis, and synthesis, thus fostering critical thinking and lifelong learning and preparing students for both academic excellence and real-world challenges.
Ms. Riffat Jabeen
Director Academics, Federal Directorate of Education, Islamabad

Ms. Riffat Jabeen is the Director of Academics at the Federal Directorate of Education (FDE), where she manages academic programs for over 432 educational institutions in Islamabad territory, serving around 226,000 students. With expertise in Applied Psychology and Science Education, she has been at the forefront of curriculum development, centralized examinations, and professional development programs. Her leadership in initiatives such as the Tele-School program during the COVID-19 pandemic, which educated millions of students, has significantly impacted education access. During the development of the Model Assessment Framework, her input and valuable suggestions were instrumental in shaping the final policy document. She has pioneered several initiatives to improve Pakistan's literacy, STEM education, and blended learning.
Dr. Nasir Mahmood
Director, Punjab Examination Commission

Dr. Nasir Mahmood is a renowned figure in educational research and assessment in Pakistan, currently serving as Director of Assessment at the Punjab Examination Commission. He holds a Ph.D. in Education from the University of Education. Dr. Mahmood's expertise lies in psychometrics, educational assessment frameworks, and data analysis, using advanced methodologies like Classical Test Theory and Item Response Theory. His contributions include the development of item banks, formative and summative assessments, and large-scale evaluations that have enhanced the quality and fairness of education in Punjab. Internationally trained at London Metropolitan University, he has collaborated with leading organizations such as the World Bank and the National Foundation for Educational Research (UK). His career is defined by leadership in educational reforms, professional training for educators, and a commitment to improving student learning outcomes across Pakistan.
Dr. Muhammad Zaigham Qadeer
Director, Policy Research, Pakistan Institute of Education (PIE)

Dr. Muhammad Zaigham Qadeer is the Director of Policy Research at the Pakistan Institute of Education (PIE), where he leads key education policy initiatives. He actively developed the Draft Education Plan and the National Education Policy Framework 2024, supporting provincial governments in policy formation. Dr. Zaigham has contributed to the PIE's Three-Year Operational Plan and Policy Briefs on Girls' Education and Public Financing, and has led policy dialogues to impact the education sector. With a PhD in Curriculum Evaluation, he has extensive academic and administrative experience, collaborating with international organizations such as UNESCO, JICA, and the World Bank to drive educational reforms in Pakistan.
Dr. Shafqat Janjua
Joint Education Advisor, Ministry of Federal Education and Professional Training

Dr. Shafqat Ali Janjua is the Joint Education Advisor in the Ministry of Federal Education and Professional Training and the Lead of the National Curriculum Council. He has played a key role in developing numerous national education policies and frameworks, including the National Curriculum Framework 2024, the Adult Literacy Curriculum 2024, and the Inclusive Scheme of Studies 2024. His contributions to curriculum development, teacher professional development, and curriculum-based assessment are well-recognized. He has authored language textbooks and contributed to significant initiatives such as the SOLO-Based Assessment Manual for FBISE and the National Professional Standards for Teachers in Pakistan. He has formulated and implemented key indicators for teacher performance appraisal in educational institutions under the Federal Directorate of Education, Islamabad. Dr. Janjua has held various leadership roles in education and remains dedicated to improving Pakistan's teacher performance and educational standards.
Dr. Muhammad Idrees
Director, National Assessment Coordination, Pakistan Institute of Education (PIE)

Dr. Muhammad Idrees has more than twenty years of experience in teaching, academics, and administration in different positions. He earned his doctorate in the discipline of Education, during which he gained special expertise in curriculum-based assessment from the University of Leicester. He has teaching experience at the school, college, and university levels, and extensive experience in curriculum and textbook development, review, and approval in the Federal Ministry of Education in various positions. He has served as Director of Academics and Curriculum at the Higher Education Commission, where he developed curricula for various disciplines based on Outcome Based Education (OBE). He also has experience heading a historic and prestigious institute of higher education in Punjab. Currently, he heads the National Assessment Wing of the Pakistan Institute of Education, Ministry of Federal Education and Professional Training, Islamabad.
Syed Ammar Hassan Gilani
Director Coordination and Legal Affairs, IBCC, Islamabad

Syed Ammar Hassan Gilani is the Director of Coordination and Legal Affairs at IBCC, Islamabad. In developing the Model Assessment Framework (MAF), Mr. Gilani's contributions were pivotal. He led the coordination efforts with key stakeholders, sharing drafts, collecting feedback, and compiling suggestions to shape a truly representative framework. Through his management skills, he ensured that all voices, from provincial boards to subject experts and policy advisors, were heard, making the MAF a collaborative national initiative. His dedication to involving diverse stakeholders has been instrumental in creating a framework that reflects Pakistan's educational diversity and regional needs.