1
Software Project Management
Session 10: Integration & Testing
2
Today
• Software Quality Assurance
• Integration
• Test planning
• Types of testing
• Test metrics
• Test tools
• More MS-Project how-to
3
Session 9 Review
• Project Control
– Planning
– Measuring
– Evaluating
– Acting
• MS Project
4
Earned Value Analysis
• BCWS
• BCWP
• Earned value
• ACWP
• Variances
• CV, SV
• Ratios
• SPI, CPI, CR
• Benefits
– Consistency, forecasting, early warning
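A minimal Java sketch of how these quantities combine, using hypothetical figures (the formulas are the standard earned-value definitions):

// Earned value sketch with hypothetical figures (values are illustrative only)
public class EarnedValue {
    public static void main(String[] args) {
        double bcws = 100_000;  // Budgeted Cost of Work Scheduled (planned value)
        double bcwp = 90_000;   // Budgeted Cost of Work Performed (earned value)
        double acwp = 110_000;  // Actual Cost of Work Performed

        double cv  = bcwp - acwp;   // Cost Variance (negative = over budget)
        double sv  = bcwp - bcws;   // Schedule Variance (negative = behind schedule)
        double cpi = bcwp / acwp;   // Cost Performance Index
        double spi = bcwp / bcws;   // Schedule Performance Index
        double cr  = cpi * spi;     // Critical Ratio

        System.out.printf("CV=%.0f SV=%.0f CPI=%.2f SPI=%.2f CR=%.2f%n", cv, sv, cpi, spi, cr);
    }
}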
5
MS Project
• Continued
6
Deliverables by Phase
• Software Concept
• Requirements Analysis
• Design
• Coding and Debugging
• Systems Testing
• Deployment & Maintenance
Possible Deliverables by Phase
• Concept Document
• Statement of Work (SOW)
• Project Charter
• RFP & Proposal
• Requirements Document (Software Requirements Specification)
• Work Breakdown Structure (WBS)
• Functional Specification (Top Level Design Specification)
• Entity Relationship Diagram
• Data Flow Diagram
• Detailed Design Specification
• Object Diagrams
• Detailed Data Model
• Coding Standards
• Working Code
• Unit Tests
• Acceptance Test Procedures
• Tested Application
• Maintenance Specification
• Deployed Application
• Project Development Plan (Software Development Plan)
• Baseline Project Plan
• Quality Assurance Plan
• Configuration Management Plan
• Risk Management Plan
• Integration Plan
• Detailed SQA Test Plan
• SQA Test Cases
• User Documentation
• Training Plan
7
If 99.9% Were Good Enough
• 9,703 checks would be deducted from the
wrong bank accounts each hour
• 27,800 pieces of mail would be lost per hour
• 3,000,000 incorrect drug prescriptions per year
• 8,605 commercial aircraft takeoffs per year would end in crashes
Futrell, Shafer, Shafer, “Quality Software Project Management”, 2002
8
Development Costs
• Requirements: 7%
• Preliminary Design: 16%
• Detailed Design: 24%
• Code & Unit Test: 24%
• Integration & System Test: 29%
9
Integration & Testing
• Development/Integration/Testing
• Most common place for schedule & activity overlap
• Integration and Testing are sometimes treated as one phase
• Progressively aggregates functionality
• QA team works in parallel with dev. team
10
Integration Approaches
• Top Down
• Core or overarching system(s) implemented 1st
• Combined into minimal “shell” system
• “Stubs” are used to fill out incomplete sections
– Eventually replaced by actual modules
• Bottom Up
• Starts with individual modules and builds up
• Individual units (after unit testing) are combined into sub-systems
• Sub-systems are combined into the whole
11
Integration
• Who does integration testing?
– Can be the development team, the QA team, or both
• Staffing and budget are at peak
• “Crunch mode”
• Issues
• Pressure
• Delivery date nears
• Unexpected failures (bugs)
• Motivation issues
• User acceptance conflicts
12
Validation and Verification
• V & V
• Validation
– Are we building the right product?
• Verification
– Are we building the product right?
– Testing
– Inspection
– Static analysis
13
Quality Assurance
• QA or SQA (Software Quality Assurance)
• Good QA comes from good process
• When does SQA begin?
– During requirements
• A CMM Level 2 function
• QA is your best window into the project
14
Test Plans (SQAP)
• Software Quality Assurance Plan
– Should be complete near end of requirements
• See example
– Even use the IEEE 730 standard
15
SQAP
• Standard sections
– Purpose
– Reference documents
– Management
– Documentation
– Standards, practices, conventions, metrics
• Quality measures
• Testing practices
16
SQAP
• Standard sections continued
– Reviews and Audits
• Process and specific reviews
– Requirements Review (SRR)
– Test Plan Review
– Code reviews
– Post-mortem review
– Risk Management
• Tie QA into the overall risk management plan
– Problem Reporting and Corrective Action
– Tools, Techniques, Methodologies
– Records Collection and Retention
17
Software Quality
• Traceability
• Ability to track relationships between work products
• Ex: how well do requirements/design/test cases
match
• Formal Reviews
• Conducted at the end of each lifecycle phase
• SRR, CDR, etc.
18
Testing
• Exercising a computer program with predetermined inputs
• Comparing the actual results against the
expected results
• Testing is a form of sampling
• Cannot absolutely prove absence of defects
• All software has bugs. Period.
• Testing is not debugging.
19
Test Cases
• Key elements of a test plan
• May include scripts, data, checklists
• May map to a Requirements Coverage
Matrix
• A traceability tool
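One lightweight way to build such a coverage matrix is a simple mapping from requirement IDs to the test cases that exercise them; a minimal sketch (the IDs are hypothetical, not from the slides):

import java.util.List;
import java.util.Map;

public class CoverageMatrix {
    public static void main(String[] args) {
        // Hypothetical requirement IDs mapped to the test cases that cover them
        Map<String, List<String>> coverage = Map.of(
            "REQ-001", List.of("TC-101", "TC-102"),
            "REQ-002", List.of("TC-103"),
            "REQ-003", List.of()   // no coverage yet -- a traceability gap
        );
        coverage.forEach((req, tests) ->
            System.out.println(req + " -> " + (tests.isEmpty() ? "NOT COVERED" : tests)));
    }
}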
20
Rework
• Software equivalent of “scrap” in manufacturing
[Chart: production effort vs. rework by phase (Requirements, Detailed Design, Integration & System Test, ...); y-axis 0-30%]
21
Sources of Defects
• Requirements: 56%
• Design: 27%
• Other: 10%
• Code: 7%
22
V Process Model
• Development side: Project Requirements and Planning → Product Requirements and Specification Analysis → High-Level Design → Detailed Design → Coding
• Testing side: Unit Testing → Integration and Testing → System Testing and Acceptance Testing → Production, Operations, and Maintenance
• Each development phase is paired with the test level that verifies it
• Non-functional Requirements pair with Load & Performance Testing; User Interface Design pairs with Usability Testing
23
Project Testing Flow
• Unit Testing
• Integration Testing
• System Testing
• User Acceptance Testing
24
Black-Box Testing
• Functional Testing
• Program is a “black-box”
– Not concerned with how it works but what it
does
– Focus on inputs & outputs
• Test cases are based on SRS (specs)
25
White-Box Testing
• Accounts for the structure of the program
• Coverage
– Statements executed
– Paths followed through the code
26
Unit Testing
• a.k.a. Module Testing
• Type of white-box testing
– Sometimes treated as black-box
• Who does Unit Testing?
• Developers
• Unit tests are written in code
– Same language as the module
– a.k.a. “Test drivers”
• When do Unit Testing?
• Ongoing during development
• As individual modules are completed
27
Unit Testing
• Individual tests can be grouped
– “Test Suites”
• JUnit
• Part of the XP methodology
• “Test-first programming”
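A minimal example of a JUnit-style unit test, which a test runner can group with others into a suite (JUnit 4 annotations assumed; the Calculator class is a stand-in module defined inline so the example is self-contained):

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class CalculatorTest {
    // Trivial module under test, defined here so the example is self-contained
    static class Calculator {
        int add(int a, int b) { return a + b; }
    }

    @Test
    public void addsTwoNumbers() {
        assertEquals(5, new Calculator().add(2, 3));
    }

    @Test
    public void addsNegativeNumbers() {
        assertEquals(-1, new Calculator().add(2, -3));
    }
}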
28
Integration Testing
• Testing interfaces between components
• First step after Unit Testing
• Components may work alone but fail when
put together
• Defect may exist in one module but
manifest in another
• Black-box tests
29
System Testing
• Testing the complete system
• A type of black-box testing
30
User Acceptance Testing
• Last milestone in testing phase
• Ultimate customer test & sign-off
• Sometimes synonymous with beta tests
• Customer is satisfied software meets their
requirements
• Based on “Acceptance Criteria”
– Conditions the software must meet for customer to
accept the system
– Ideally defined before contract is signed
– Use quantifiable, measurable conditions
31
Regression Testing
– Re-running of tests after fixes or changes are
made to software or the environment
– EX: QA finds defect, developer fixes, QA runs
regression test to verify
– Automated tools very helpful for this
32
Compatibility Testing
– Testing against other “platforms”
• Ex: Testing against multiple browsers
• Does it work under Netscape/IE, Windows/Mac
33
External Testing Milestones
• Alpha 1st, Beta 2nd
• Testing by users outside the organization
• Typically done by users
• Alpha release
• Given to very limited user set
• Product is not feature-complete
• During later portions of test phase
• Beta release
• Customer testing and evaluation
• Most important feature
• Preferably after software stabilizes
34
External Testing Milestones
• Value of Beta Testing
• Testing in the real world
• Getting a software assessment
• Marketing
• Augmenting your staff
• Do not determine features based on it
• Too late!
• Beta testers must be “recruited”
• From: Existing base, marketing, tech support, site
• Requires the role of “Beta Manager”
• All this must be scheduled by PM
35
External Testing Milestones
• Release Candidate (RC)
• To be sent to manufacturing if testing successful
• Release to Manufacturing (RTM)
• Production release formally sent to manufacturing
• Aim for a “stabilization period” before each
of these milestones
• Team focus on quality, integration, stability
36
Test Scripts
• Two meanings
• 1. Set of step-by-step instructions intended
to lead test personnel through tests
– List of all actions and expected responses
• 2. Automated test script (program)
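In the second sense, the “script” is simply a program that performs each step and checks the expected response. A minimal self-contained sketch (the validateLogin method is a hypothetical stand-in for the system under test):

public class LoginTestScript {
    // Stand-in for the system under test (hypothetical)
    static boolean validateLogin(String user, String password) {
        return "admin".equals(user) && "secret".equals(password);
    }

    public static void main(String[] args) {
        // Step 1: valid credentials should be accepted
        check("valid login accepted", validateLogin("admin", "secret"));
        // Step 2: wrong password should be rejected
        check("bad password rejected", !validateLogin("admin", "wrong"));
    }

    static void check(String step, boolean passed) {
        System.out.println((passed ? "PASS: " : "FAIL: ") + step);
    }
}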
37
Static Testing
• Reviews
• Most artifacts can be reviewed
• Proposal, contract, schedule, requirements, code,
data model, test plans
– Peer Reviews
• Methodical examination of software work products
by peers to identify defects and necessary changes
• Goal: remove defects early and efficiently
• Planned by PM, performed in meetings, documented
• CMM Level 3 activity
38
Automated Testing
• Human testers = inefficient
• Pros
• Lowers overall cost of testing
• Tools can run unattended
• Tools run through ‘suites’ faster than people
• Great for regression and compatibility tests
• Tests create a body of knowledge
• Can reduce QA staff size
• Cons
• Not everything can be automated
• Learning curve or need for expertise in tools
• High-end tools cost $5K-$80K (low-end tools are still cheap)
39
Test Tools
• Capture & Playback
• Coverage Analysis
• Performance Testing
• Test Case Management
40
Load & Stress Testing
• Push system beyond capacity limits
• Often done via automated scripts
• By the QA team
• Near end of functional tests
• Can show
– Hidden functional issues
– Maximum system capacity
– Unacceptable data or service loss
– Whether “Performance Requirements” are met
• Remember, these are part of “non-functional” requirements
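A load test is often just an automated script that fires many concurrent requests and records response times; a rough, self-contained sketch using the JDK's built-in HTTP client (the URL, user count, and timing approach are assumptions, not from the slides):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class SimpleLoadTest {
    public static void main(String[] args) throws Exception {
        int users = 50;                                   // assumed concurrent-user target
        String url = "http://test.example.com/login";     // placeholder URL
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).build();
        AtomicLong totalMillis = new AtomicLong();

        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                long start = System.currentTimeMillis();
                try {
                    client.send(request, HttpResponse.BodyHandlers.ofString());
                } catch (Exception e) {
                    System.out.println("Request failed: " + e.getMessage());
                }
                totalMillis.addAndGet(System.currentTimeMillis() - start);
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        System.out.println("Average response time (ms): " + totalMillis.get() / users);
    }
}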
41
Load & Stress Testing
• Metrics
– Maximum acceptable response time
– Minimum acceptable number of concurrent users
– Maximum acceptable downtime
• Vendors: High-End
– Segue
– Mercury
– Empirix
42
Performance Metrics
Source: Athens Consulting Group
• Bad: “Must support 500 users” → Good: “Must support 500 simultaneous users”
• Bad: “10 second response time” → Good: “[Average|Maximum|90th percentile] response time must be X seconds”
• Bad: “Must handle 1M hits per day” → Good: “Must handle peak load of 28 page requests per second”
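For the last row, the arithmetic behind the “good” version: 1,000,000 hits per day averages roughly 12 requests per second (1,000,000 / 86,400), so a stated peak of 28 requests per second presumably allows for peaks of a bit more than twice the average; the useful requirement is the peak rate the system must sustain, not the daily total.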
43
Other Testing
• Installation Testing
– Very important if not a Web-based system
– Can lead to high support costs and customer
dissatisfaction
• Usability Testing
– Verification of user satisfaction
• Navigability
• User-friendliness
• Ability to accomplish primary tasks
44
Miscellaneous
• Pareto Analysis
– The 80-20 rule
• 80% of defects from 20% of code
– Identifying the problem modules
• Phase Containment
– Testing at the end of each phase
– Prevent problems moving phase-to-phase
• Burn-in
– Allowing the system to run for a longer period of time
– Variation of stress testing
45
Miscellaneous
• “Code Freeze”
– When developers stop writing new code and
only do bug fixes
– Occurs at a varying point in integration/testing
• Tester-to-Coder Ratio
– It depends
– Often 1:3 or 1:4
– QA staff size grows over time: bring on the QA manager and/or lead early
46
Stopping Testing
• When do you stop?
• Rarely are all defects “closed” by release
• Shoot for closing all Critical/High/Medium defects
• Often, testing stops when time runs out
• Final Sign-off (see also UAT)
• By: customers, engineering, product mgmt.
47
Test Metrics
• Load: Max. acceptable response time, min. # of
simultaneous users
• Disaster: Max. allowable downtime
• Compatibility: Min/Max. browsers & OS’s
supported
• Usability: Min. approval rating from focus groups
• Functional: Requirements coverage; 100% pass
rate for automated test suites
48
Defect Metrics
• These are very important to the PM
• Number of outstanding defects
– Ranked by severity
• Critical, High, Medium, Low
• Showstoppers
• Opened vs. closed
49
Defect Tracking
• Get tools to do this for you
– Bugzilla, TestTrack Pro, Rational ClearQuest
– Some good ones are free or low-cost
• Make sure all necessary team members have
access (meaning nearly all)
• Have regular ‘defect review meetings’
– Can be weekly early in test, daily in crunch
• Who can enter defects into the tracking system?
– Lots of people: QA staff, developers, analysts,
managers, (sometimes) users, PM
50
Defect Tracking
• Fields
– State: open, closed, pending
– Date created, updated, closed
– Description of problem
– Release/version number
– Person submitting
– Priority: low, medium, high, critical
– Comments: by QA, developer, other
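These fields map naturally onto a small record in whatever language the tracking tool or reporting scripts use; a sketch (field names follow the slide, types and enum values are assumptions):

import java.time.LocalDate;

public class DefectRecord {
    enum State { OPEN, CLOSED, PENDING }
    enum Priority { LOW, MEDIUM, HIGH, CRITICAL }

    // One row in the defect-tracking system (illustrative fields only)
    record Defect(String id, State state, LocalDate created, LocalDate updated,
                  LocalDate closed, String description, String releaseVersion,
                  String submittedBy, Priority priority, String comments) {}

    public static void main(String[] args) {
        Defect d = new Defect("DEF-042", State.OPEN, LocalDate.now(), LocalDate.now(),
                null, "Login fails on empty password", "1.2.0",
                "qa.tester", Priority.HIGH, "Seen on build 118");
        System.out.println(d);
    }
}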
51
Defect Metrics
• Open Rates
– How many new bugs over a period of time
• Close Rates
– How many closed over that same period
– Ex: 10 bugs/day
• Change Rate
– Number of times the same issue is updated
• Fix Failed Counts
– Fixes that didn’t really fix (still open)
– One measure of “vibration” in project
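A sketch of the open/close-rate arithmetic, with hypothetical counts for a one-week window:

public class DefectRates {
    public static void main(String[] args) {
        int opened = 70, closed = 50, days = 7;   // hypothetical one-week window
        System.out.printf("Open rate:  %.1f bugs/day%n", (double) opened / days);
        System.out.printf("Close rate: %.1f bugs/day%n", (double) closed / days);
        System.out.println("Net backlog change: " + (opened - closed));  // positive = falling behind
    }
}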
52
Defect Rates
• Microsoft Study
– 10-20 defects/KLOC during test
– 0.5 defects/KLOC after release
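As a rough illustration of those rates, a 100 KLOC product would surface on the order of 1,000-2,000 defects during test and ship with about 50 latent defects after release.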
53
Test Environments
• You need to test somewhere. Where?
• Typically separate hardware/network
environment(s)
54
Hardware Environments
• Development
• QA
• Staging (optional)
• Production
55
Hardware Environments
• Typical environments
– Development
• Where programmers work
• Unit tests happen here
– Test
• For integration, system, and regression testing
– Stage
• For burn-in and load testing
– Production
• Final deployment environment(s)
56
Web Site Testing
• Unique factors
– Distributed (N-tiers, can be many)
– Very high availability needs
– Uses public network (Internet)
– Large number of platforms (browsers + OS)
• 5 causes of most site failures (Jupiter, 1999)
– Internal network performance
– External network performance
– Hardware performance
– Unforeseeable traffic spikes
– Web application performance
57
Web Site Testing
• Commercial Tools: Load Test & Site Management
– Mercury Interactive
• SiteScope, SiteSeer
– Segue
• Commercial Subscription Services
– Keynote Systems
• Monitoring Tools
• Availability: More “Nines” = More $’s
• Must balance QA & availability costs vs. benefits
58
QA Roles
• QA Manager
• Hires QA team; creates test plans; selects tools; manages team
• Salary: $50-80K/yr, $50-100/hr
• Test Developer/Test Engineer
• Performs functional tests; develops automated scripts
• Salary: $35-70K/yr, $40-100/hr
• System Administrator
• Supports QA functions but not official QA team member
• Copy Editor/Documentation Writer
• Supports QA; also not part of official team
59
MS-Project Q&A
60
Homework
• McConnell: Ch. 16, “Project Recovery”
• Schwalbe: Ch. 16, “Closing”
• Your final MS-Project schedule is due the class after next
– Add resources and dependencies to your plan
– Add durations and costs
– Send interim versions
– Remember, this is the most important part of your grade
– Come to me with any questions
• Iterate & get feedback
• Don’t work in the dark
61
Questions?
Editor's Notes
• #3: No lab today; more lab later in the term