
UNIT-2

What are Software Processes?


Software processes in software engineering refer to the methods and techniques
used to develop and maintain software. Some examples of software processes
include:
 Waterfall: a linear, sequential approach to software development, with
distinct phases such as requirements gathering, design, implementation,
testing, and maintenance.
 Agile: a flexible, iterative approach to software development, with an
emphasis on rapid prototyping and continuous delivery.
 Scrum: a popular Agile methodology that emphasizes teamwork, iterative
development, and a flexible, adaptive approach to planning and
management.
 DevOps: a set of practices that aims to improve collaboration and
communication between development and operations teams, with an
emphasis on automating the software delivery process.
Each process has its own set of advantages and disadvantages, and the choice of
which one to use depends on the specific project and organization.
Components of Software
There are three main components of the software:
1. Program: A computer program is a list of instructions that tell a computer
what to do.
2. Documentation: Source information about the product contained in design
documents, detailed code comments, etc.
3. Operating Procedures: Set of step-by-step instructions compiled by an
organization to help workers carry out complex routine operations.
Other Software Components
Other Software Components are:
1. Code: the instructions that a computer executes in order to perform a
specific task or set of tasks.
2. Data: the information that the software uses or manipulates.
3. User interface: the means by which the user interacts with the software,
such as buttons, menus, and text fields.
4. Libraries: pre-written code that can be reused by the software to perform
common tasks.
5. Documentation: information that explains how to use and maintain the
software, such as user manuals and technical guides.
6. Test cases: a set of inputs, execution conditions, and expected outputs
that are used to test the software for correctness and reliability.
7. Configuration files: files that contain settings and parameters that are
used to configure the software to run in a specific environment.
8. Build and deployment scripts: scripts or tools that are used to build,
package, and deploy the software to different environments.
9. Metadata: information about the software, such as version numbers,
authors, and copyright information.
Key Process Activities
The four basic key process activities are:
1. Software Specification: In this activity, a detailed description of the software system to be developed is produced, covering its functional and non-functional requirements.
2. Software Development: In this process, designing, programming,
documenting, testing, and bug fixing is done.
3. Software Validation: In this activity, the software product is evaluated to ensure that it meets the business requirements as well as the end users' needs.
4. Software Evolution: It is a process of developing software initially, then
timely updating it for various reasons.
Software Crisis
The term “software crisis” refers to a set of problems that were faced by the
software industry in the 1960s and 1970s, such as:
1. Size and Cost: The complexity of software and the expectations placed on it grow day by day, making software increasingly large, complex, and expensive.
2. Quality: Ensuring good quality in delivered software products became increasingly difficult.
3. Delayed Delivery: Software takes longer than the estimated time to
develop, which in turn leads to cost shooting up.
4. High costs and long development times: software projects were taking
much longer and costing much more than expected.
5. Low quality: software was often delivered late, with bugs and other defects
that made it difficult to use.
6. Lack of standardization: there were no established best practices or
standards for software development, making it difficult to compare and
improve different approaches.
7. Lack of tools and methodologies: there were few tools and methodologies
available to help with software development, making it a difficult and time-
consuming process.
These problems led to a growing realization that the traditional approaches to
software development were not effective and needed to be improved. This led to
the development of new software development methodologies, such as the
Waterfall and Agile methodologies, as well as the creation of new tools and
technologies to support software development.
However, the software crisis can still be seen in some form even today, for example in software projects that run over budget, slip their schedules, or fail to meet requirements.
Software Process Model
A software process model is an abstraction of the actual process, which is being
described. It can also be defined as a simplified representation of a software
process. Each model represents a process from a specific perspective.
Following are some basic software process models on which different types of software process models can be implemented:
1. A workflow Model : It is the sequential series of tasks and decisions that
make up a business process.
2. The Waterfall Model: It is a sequential design process in which progress
is seen as flowing steadily downwards.
 Phases in waterfall model:
o Requirements Specification
o Software Design
o Implementation
o Testing
3. Dataflow Model: It is a diagrammatic representation of the flow and exchange of information within a system.
4. Evolutionary Development Model: Following activities are considered
in this method:
 Specification
 Development
 Validation
5. Role / Action Model: Roles of the people involved in the software
process and the activities.
Need for Process Model
The software development team must decide the process model that is to be
used for software product development and then the entire team must adhere to
it. This is necessary because the software product development can then be
done systematically. Each team member then understands what the next activity is and how to do it, so the process model brings definiteness and discipline to the overall development process. Every process model defines entry and exit criteria for each phase, so the transition of the product through the various phases is well defined.
If the process model is not followed, any team member may perform any development activity at any time, which ultimately causes chaos and leads to project failure. Without a process model it is also difficult to monitor the progress of the software product. Thus, the process model plays an important role in software engineering.
Advantages and Disadvantages of Process Models
There are several advantages and disadvantages to different software
development methodologies, such as:
Waterfall
Advantages of waterfall model are:
1. Clear and defined phases of development make it easy to plan and
manage the project.
2. It is well-suited for projects with well-defined and unchanging
requirements.
Disadvantages of waterfall model are:
1. Changes made to the requirements during the development phase can be
costly and time-consuming.
2. It can be difficult to know how long each phase will take, making it difficult
to estimate the overall time and cost of the project.
3. It does not have much room for iteration and feedback throughout the
development process.
Agile
Advantages of Agile Model are:
1. Flexible and adaptable to changing requirements.
2. Emphasizes rapid prototyping and continuous delivery, which can help to
identify and fix problems early on.
3. Encourages collaboration and communication between development
teams and stakeholders.
Disadvantages of Agile Model are:
1. It may be difficult to plan and manage a project using Agile
methodologies, as requirements and deliverables are not always well-
defined in advance.
2. It can be difficult to estimate the overall time and cost of a project, as the
process is iterative and changes are made throughout the development.
Scrum
Advantages of Scrum are:
1. Encourages teamwork and collaboration.
2. Provides a flexible and adaptive framework for planning and managing
software development projects.
3. Helps to identify and fix problems early on by using frequent testing and
inspection.
Disadvantages of Scrum are:
1. A lack of understanding of Scrum methodologies can lead to confusion and
inefficiency.
2. It can be difficult to estimate the overall time and cost of a project, as the
process is iterative and changes are made throughout the development.
DevOps
Advantages of DevOps are:
1. Improves collaboration and communication between development and
operations teams.
2. Automates software delivery process, making it faster and more efficient.
3. Enables faster recovery and response time in case of issues.
Disadvantages of DevOps are:
1. Requires a significant investment in tools and technologies.
2. Can be difficult to implement in organizations with existing silos and lack
of culture of collaboration.
3. It requires a skilled workforce to implement DevOps practices effectively.
Ultimately, the choice of which methodology to use depends on the specific project and organization, as well as the goals and requirements of the project.
Dynamic Systems Development Method (DSDM)

The Dynamic Systems Development Method (DSDM) is an agile software development approach that provides a framework for building and maintaining systems. The DSDM philosophy is borrowed from a modified version of the Pareto principle: 80% of an application can often be delivered in 20% of the time it would take to deliver the entire (100%) application.
DSDM is an iterative software process in which each iteration follows the 80% rule: just enough work is done in each increment to facilitate movement to the next increment. The remaining detail can be completed later, once more business requirements are known or changes have been requested and accommodated.
The DSDM Consortium (www.dsdm.org) is a worldwide group of member companies that collectively take on the role of "keeper" of the method. The consortium has defined an agile development model, called the DSDM life cycle, that defines three iterative cycles, preceded by two additional life-cycle activities:
1. Feasibility Study:
It establishes the essential business requirements and constraints associated with the application to be built, and then assesses whether the application is a viable candidate for the DSDM process.
2. Business Study:
It establishes the functional and information requirements that will allow the application to provide business value; in addition, it defines the basic application architecture and identifies the maintainability requirements for the application.
3. Functional Model Iteration:
It produces a set of incremental prototypes that demonstrate functionality for the customer.
(Note: All DSDM prototypes are intended to evolve into the deliverable application.) The intent during this iterative cycle is to gather additional requirements by eliciting feedback from users as they exercise the prototype.
4. Design and Build Iteration:
It revisits prototypes built during the functional model iteration to ensure that each has been engineered in a manner that will enable it to provide operational business value for end users. In some cases, the functional model iteration and the design and build iteration occur at the same time.
5. Implementation:
It places the latest software increment (an "operationalized" prototype) into the operational environment. It should be noted that:
 (a) the increment may not be 100% complete, or
 (b) changes may be requested as the increment is put into place. In either case, DSDM development work continues by returning to the functional model iteration activity.
What is Extreme Programming (XP)?
Extreme Programming (XP) is an Agile software development methodology that
focuses on delivering high-quality software through frequent and continuous
feedback, collaboration, and adaptation. XP emphasizes a close working
relationship between the development team, the customer, and stakeholders,
with an emphasis on rapid, iterative development and deployment.

Extreme Programming (XP)


Agile development approaches evolved in the 1990s as a reaction to
documentation and bureaucracy-based processes, particularly the waterfall
approach. Agile approaches are based on some common principles, some of
which are:
1. Working software is the key measure of progress in a project.
2. For progress in a project, therefore software should be developed and
delivered rapidly in small increments.
3. Even late changes in the requirements should be entertained.
4. Face-to-face communication is preferred over documentation.
5. Continuous feedback and involvement of customers are necessary for
developing good-quality software.
6. A simple design that evolves and improves with time is a better approach than doing an elaborate design up front to handle all possible scenarios.
7. The delivery dates are decided by empowered teams of talented
individuals.
Extreme programming is one of the most popular and well-known approaches in the family of agile methods. An XP project starts with user stories, which are short descriptions of the scenarios that customers and users would like the system to support. Each story is written on a separate card, so stories can be flexibly grouped.
Good Practices in Extreme Programming
Some of the good practices that have been recognized in the extreme
programming model and suggested to maximize their use are given below:

Extreme Programming Good Practices


 Code Review: Code review detects and corrects errors efficiently. XP suggests pair programming, in which coding and review of the written code are carried out by a pair of programmers who switch roles between them every hour.
 Testing: Testing code helps to remove errors and improves its reliability. XP suggests test-driven development (TDD), in which test cases are continually written and executed; in the TDD approach, test cases are written even before any code is written (a minimal sketch of this cycle appears after this list).
 Incremental development: Incremental development works well because customer feedback is gained after each iteration, and based on this feedback the development team delivers a new increment every few days.
 Simplicity: Simplicity makes it easier to develop good-quality code as well
as to test and debug it.
 Design: Good quality design is important to develop good quality software.
So, everybody should design daily.
 Integration testing: Integration Testing helps to identify bugs at the
interfaces of different functionalities. Extreme programming suggests that
the developers should achieve continuous integration by building and
performing integration testing several times a day.
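As a rough sketch of the test-first (TDD) cycle referenced in the Testing practice above (assuming Python and its standard unittest module; ShoppingCart and its methods are hypothetical names used only for illustration):

import unittest

# Step 1 (red): write the test first. When this test is written, ShoppingCart
# does not exist yet, so running the test fails.
class TestShoppingCartTotal(unittest.TestCase):
    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add_item("book", 12.50)
        cart.add_item("pen", 1.25)
        self.assertAlmostEqual(cart.total(), 13.75)

# Step 2 (green): write just enough code to make the test pass.
class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add_item(self, name, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

# Step 3 (refactor): clean up the code while keeping the test passing.
if __name__ == "__main__":
    unittest.main()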
Basic Principles of Extreme programming
XP is based on the frequent iteration through which the developers implement
User Stories. User stories are simple and informal statements of the customer
about the functionalities needed. A User Story is a conventional description by
the user of a feature of the required system. It does not mention finer details
such as the different scenarios that can occur. Based on User stories, the project
team proposes Metaphors. Metaphors are a common vision of how the system
would work. The development team may decide to build a Spike for some
features. A Spike is a very simple program that is constructed to explore the
suitability of a solution being proposed. It can be considered similar to a
prototype. Some of the basic activities that are followed during software
development by using the XP model are given below:
 Coding: The concept of coding which is used in the XP model is slightly
different from traditional coding. Here, the coding activity includes
drawing diagrams (modeling) that will be transformed into code, scripting
a web-based system, and choosing among several alternative solutions.
 Testing: The XP model gives high importance to testing and considers it to
be the primary factor in developing fault-free software.
 Listening: The developers need to listen carefully to the customers if they are to develop good-quality software. Sometimes programmers may not have in-depth knowledge of the system to be developed, so they should listen to the customers in order to properly understand the functionality of the system.
 Designing: Without a proper design, a system implementation becomes too complex and the solution very difficult to understand, which makes maintenance expensive. A good design results in the elimination of complex dependencies within a system. So, effective use of suitable design is emphasized.
 Feedback: One of the most important aspects of the XP model is to gain
feedback to understand the exact customer needs. Frequent contact with
the customer makes the development effective.
 Simplicity: The main principle of the XP model is to develop a simple
system that will work efficiently in the present time, rather than trying to
build something that would take time and may never be used. It focuses
on some specific features that are immediately needed, rather than
engaging time and effort on speculations of future requirements.
 Pair Programming: XP encourages pair programming where two
developers work together at the same workstation. This approach helps in
knowledge sharing, reduces errors, and improves code quality.
 Continuous Integration: In XP, developers integrate their code into a
shared repository several times a day. This helps to detect and resolve
integration issues early on in the development process.
 Refactoring: XP encourages refactoring, which is the process of
restructuring existing code to make it more efficient and maintainable.
Refactoring helps to keep the codebase clean, organized, and easy to
understand.
 Collective Code Ownership: In XP, there is no individual ownership of code.
Instead, the entire team is responsible for the codebase. This approach
ensures that all team members have a sense of ownership and
responsibility towards the code.
 Planning Game: XP follows a planning game, where the customer and the
development team collaborate to prioritize and plan development tasks.
This approach helps to ensure that the team is working on the most
important features and delivers value to the customer.
 On-site Customer: XP requires an on-site customer who works closely with
the development team throughout the project. This approach helps to
ensure that the customer’s needs are understood and met, and also
facilitates communication and feedback.
Applications of Extreme Programming (XP)
Some of the projects that are suitable to develop using the XP model are given
below:
 Small projects: The XP model is very useful in small projects consisting of small teams, as face-to-face meetings are easier to achieve.
 Projects involving new technology or research projects: These projects face rapidly changing requirements and technical problems, so the XP model is well suited to completing them.
 Web development projects: The XP model is well-suited for web
development projects as the development process is iterative and requires
frequent testing to ensure the system meets the requirements.
 Collaborative projects: The XP model is useful for collaborative projects
that require close collaboration between the development team and the
customer.
 Projects with tight deadlines: The XP model can be used in projects that
have a tight deadline, as it emphasizes simplicity and iterative
development.
 Projects with rapidly changing requirements: The XP model is designed to
handle rapidly changing requirements, making it suitable for projects
where requirements may change frequently.
 Projects where quality is a high priority: The XP model places a strong
emphasis on testing and quality assurance, making it a suitable approach
for projects where quality is a high priority.
Project Size Estimation Techniques – Software Engineering
In the fast-paced world of Software Engineering, accurately estimating the size of
a project is key to its success. Understanding how big a project will be helps
predict the resources, time, and cost needed, ensuring the project starts off on
the right foot.
Project Size Estimation Techniques are vital because they allow you to plan and
allocate the necessary resources effectively. This is a critical step in software
engineering that ensures projects are feasible and managed efficiently from the
start.
What is Project Size Estimation?
Project size estimation is the process of determining the scope and resources required for a project.
1. It involves assessing the various aspects of the project to estimate the
effort, time, cost, and resources needed to complete the project.
2. Accurate project size estimation is important for effective and efficient
project planning, management, and execution.
Importance of Project Size Estimation
Here are some of the reasons why project size estimation is critical in project
management:
1. Financial Planning: Project size estimation helps in planning the
financial aspects of the project, thus helping to avoid financial shortfalls.
2. Resource Planning: It ensures the necessary resources are identified
and allocated accordingly.
3. Timeline Creation: It facilitates the development of realistic timelines
and milestones for the project.
4. Identifying Risks: It helps to identify potential risks associated with
overall project execution.
5. Detailed Planning: It helps to create a detailed plan for the project
execution, ensuring all the aspects of the project are considered.
6. Planning Quality Assurance: It helps in planning quality assurance
activities and ensuring that the project outcomes meet the required
standards.
Who Estimates Projects Size?
Here are the key roles involved in estimating the project size:
1. Project Manager: Project manager is responsible for overseeing the
estimation process.
2. Subject Matter Experts (SMEs): SMEs provide detailed knowledge
related to the specific areas of the project.
3. Business Analysts: Business Analysts help in understanding and
documenting the project requirements.
4. Technical Leads: They estimate the technical aspects of the project such
as system design, development, integration, and testing.
5. Developers: They will provide detailed estimates for the tasks they will
handle.
6. Financial Analysts: They provide estimates related to the financial
aspects of the project including labor costs, material costs, and other
expenses.
7. Risk Managers: They assess the potential risks that could impact the
projects’ size and effort.
8. Clients: They provide input on project requirements, constraints, and
expectations.
Different Methods of Project Estimation
1. Expert Judgment: In this technique, a group of experts in the relevant
field estimates the project size based on their experience and expertise.
This technique is often used when there is limited information available
about the project.
2. Analogous Estimation: This technique involves estimating the project
size based on the similarities between the current project and previously
completed projects. This technique is useful when historical data is
available for similar projects.
3. Bottom-up Estimation: In this technique, the project is divided into
smaller modules or tasks, and each task is estimated separately. The
estimates are then aggregated to arrive at the overall project estimate.
4. Three-point Estimation: This technique involves estimating the project
size using three values: optimistic, pessimistic, and most likely. These
values are then used to calculate the expected project size using a formula
such as the PERT formula.
5. Function Points: This technique involves estimating the project size
based on the functionality provided by the software. Function points
consider factors such as inputs, outputs, inquiries, and files to arrive at the
project size estimate.
6. Use Case Points: This technique involves estimating the project size
based on the number of use cases that the software must support. Use
case points consider factors such as the complexity of each use case, the
number of actors involved, and the number of use cases.
7. Parametric Estimation: For precise size estimation, mathematical
models founded on project parameters and historical data are used.
8. COCOMO (Constructive Cost Model): It is an algorithmic model that
estimates effort, time, and cost in software development projects by
taking into account several different elements.
9. Wideband Delphi: A consensus-based estimation method that combines anonymous expert estimates with structured group discussion to produce balanced size estimates.
10. Monte Carlo Simulation: This technique estimates project size and analyzes risks using statistical methods and random sampling; it works especially well for complicated and unpredictable projects (a minimal sketch follows the paragraph below).
Each of these techniques has its strengths and weaknesses, and the choice of
technique depends on various factors such as the project’s complexity, available
data, and the expertise of the team.
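A minimal sketch of the Monte Carlo simulation idea mentioned in point 10 above (assuming Python; the task names and the optimistic/most-likely/pessimistic ranges are made-up illustrative values, not data from any real project):

import random

# Hypothetical tasks with (optimistic, most likely, pessimistic) size
# estimates in person-days; the numbers are purely illustrative.
TASKS = {
    "requirements": (3, 5, 10),
    "design": (5, 8, 15),
    "coding": (10, 15, 30),
    "testing": (5, 9, 20),
}

def simulate_once():
    # Sample each task's size from a triangular distribution and sum them.
    return sum(random.triangular(low, high, mode)
               for (low, mode, high) in TASKS.values())

def monte_carlo(runs=10000):
    results = sorted(simulate_once() for _ in range(runs))
    mean = sum(results) / runs
    p80 = results[int(0.8 * runs)]  # size that 80% of the runs stay under
    return mean, p80

if __name__ == "__main__":
    mean, p80 = monte_carlo()
    print(f"Expected project size:   {mean:.1f} person-days")
    print(f"80%-confidence estimate: {p80:.1f} person-days")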
Estimating the Size of the Software
Estimation of the size of the software is an essential part of Software Project
Management. It helps the project manager to further predict the effort and time
that will be needed to build the project. Here are some of the measures that are
used in project size estimation:
1. Lines of Code (LOC)
As the name suggests, LOC counts the total number of lines of source code in a
project. The units of LOC are:
1. KLOC: Thousand lines of code
2. NLOC: Non-comment lines of code
3. KDSI: Thousands of delivered source instruction
 The size is estimated by comparing it with the existing systems of the
same kind. The experts use it to predict the required size of various
components of software and then add them to get the total size.
 It’s tough to estimate LOC by analyzing the problem definition. Only after
the whole code has been developed can accurate LOC be estimated. This
statistic is of little utility to project managers because project planning
must be completed before development activity can begin.
 Two separate source files having a similar number of lines may not require
the same effort. A file with complicated logic would take longer to create
than one with simple logic. Proper estimation may not be attainable based
on LOC.
 LOC reflects how a particular programmer solved the problem rather than the problem's inherent size, so the count differs greatly from one programmer to the next; a seasoned programmer can write the same logic in fewer lines than a novice coder.
Advantages:
1. Universally accepted and is used in many models like COCOMO.
2. Estimation is closer to the developer’s perspective.
3. It is used and accepted by people and organizations throughout the world.
4. At project completion, LOC is easily quantified.
5. It has a direct connection to the delivered product.
6. Simple to use.
Disadvantages:
1. Different programming languages contain a different number of lines.
2. No proper industry standard exists for this technique.
3. It is difficult to estimate the size using this technique in the early stages of
the project.
4. LOC cannot be used to normalize across different platforms and languages.
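To make the LOC and NLOC units above concrete, here is a small sketch (assuming Python source files with '#' line comments; a real counter would also need to handle block comments and strings):

def count_loc(path):
    """Return (total_lines, non_comment_lines) for one Python source file."""
    total = nloc = 0
    with open(path, encoding="utf-8") as src:
        for line in src:
            total += 1
            stripped = line.strip()
            # Count a line toward NLOC only if it is neither blank nor a
            # full-line comment.
            if stripped and not stripped.startswith("#"):
                nloc += 1
    return total, nloc

# Example usage (hypothetical file name):
# total, nloc = count_loc("module.py")
# kloc = nloc / 1000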
2. Number of Entities in ER Diagram
ER model provides a static view of the project. It describes the entities and their
relationships. The number of entities in the ER model can be used to measure
the estimation of the size of the project. The number of entities grows with the size of the project, because more entities require more classes/structures and therefore more coding.
Advantages:
1. Size estimation can be done during the initial stages of planning.
2. The number of entities is independent of the programming technologies
used.
Disadvantages:
1. No fixed standards exist. Some entities contribute more to project size
than others.
2. Just like FPA, it is less used in the cost estimation model. Hence, it must be
converted to LOC.
3. Total Number of Processes in DFD
Data Flow Diagram(DFD) represents the functional view of software. The model
depicts the main processes/functions involved in software and the flow of data
between them. The number of processes/functions in the DFD can be used to predict software size: already existing processes of a similar type are studied and used to estimate the size of each process, and the sum of the estimated sizes of the individual processes gives the final estimated size.
Advantages:
1. It is independent of the programming language.
2. Each major process can be decomposed into smaller processes. This will
increase the accuracy of the estimation.
Disadvantages:
1. Studying similar kinds of processes to estimate size takes additional time
and effort.
2. A DFD is not constructed for every software project, so this measure is not always available.
4. Function Point Analysis
In this method, the number and type of functions supported by the software are
utilized to find FPC(function point count). The steps in function point analysis
are:
1. Count the number of functions of each proposed type.
2. Compute the Unadjusted Function Points(UFP).
3. Find the Total Degree of Influence(TDI).
4. Compute Value Adjustment Factor(VAF).
5. Find the Function Point Count(FPC).
The explanation of the above points is given below:
1. Count the number of functions of each proposed type:
Find the number of functions belonging to the following types:
 External Inputs: Functions related to data entering the system.
 External outputs: Functions related to data exiting the system.
 External Inquiries: They lead to data retrieval from the system but don’t
change the system.
 Internal Files: Logical files maintained within the system. Log files are not
included here.
 External interface Files: These are logical files for other applications which
are used by our system.
2. Compute the Unadjusted Function Points(UFP):
Categorize each of the five function types as simple, average, or complex based
on their complexity. Multiply the count of each function type with its weighting
factor and find the weighted sum. The weighting factors for each type based on
their complexity are as follows:

Function Type               Simple   Average   Complex
External Inputs                3         4         6
External Outputs               4         5         7
External Inquiries             3         4         6
Internal Logical Files         7        10        15
External Interface Files       5         7        10

3. Find the Total Degree of Influence (TDI):
Rate each of the 14 general system characteristics to find its degree of influence. The sum of all 14 degrees of influence gives the TDI, whose range is 0 to 70. The 14 general characteristics are: Data Communications, Distributed Data Processing, Performance, Heavily Used Configuration, Transaction Rate, On-Line Data Entry, End-User Efficiency, On-Line Update, Complex Processing, Reusability, Installation Ease, Operational Ease, Multiple Sites, and Facilitate Change.
Each of the above characteristics is evaluated on a scale of 0-5.
4. Compute Value Adjustment Factor(VAF):
Use the following formula to calculate VAF:
VAF = (TDI * 0.01) + 0.65
5. Find the Function Point Count:
Use the following formula to calculate FPC:
FPC = UFP * VAF
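Putting steps 2–5 together, here is a minimal sketch of the UFP, VAF, and FPC calculation (assuming Python; the function counts and degrees of influence at the bottom are made-up illustrative inputs):

# Weights for (simple, average, complex) functions, from the table above.
WEIGHTS = {
    "external_inputs": (3, 4, 6),
    "external_outputs": (4, 5, 7),
    "external_inquiries": (3, 4, 6),
    "internal_logical_files": (7, 10, 15),
    "external_interface_files": (5, 7, 10),
}

def unadjusted_function_points(counts):
    """counts maps each function type to its (simple, average, complex) counts."""
    return sum(c * w
               for ftype, type_counts in counts.items()
               for c, w in zip(type_counts, WEIGHTS[ftype]))

def function_point_count(counts, degrees_of_influence):
    ufp = unadjusted_function_points(counts)
    tdi = sum(degrees_of_influence)   # 14 ratings, each on a 0-5 scale
    vaf = (tdi * 0.01) + 0.65         # VAF = (TDI * 0.01) + 0.65
    return ufp * vaf                  # FPC = UFP * VAF

# Illustrative example (hypothetical counts for a small system):
counts = {
    "external_inputs": (2, 3, 1),
    "external_outputs": (1, 2, 0),
    "external_inquiries": (3, 0, 0),
    "internal_logical_files": (1, 1, 0),
    "external_interface_files": (0, 1, 0),
}
doi = [3] * 14                            # every characteristic rated 3
print(function_point_count(counts, doi))  # UFP = 71, VAF = 1.07, FPC = 75.97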
Advantages:
1. It can be easily used in the early stages of project planning.
2. It is independent of the programming language.
3. It can be used to compare different projects even if they use different
technologies(database, language, etc).
Disadvantages:
1. It is not good for real-time systems and embedded systems.
2. Many cost estimation models like COCOMO use LOC and hence FPC must
be converted to LOC.
When Should Estimates Take Place?
Project size estimates must take place at multiple key points throughout the
project lifecycle. It should take place during the following stages to ensure
accuracy and relevance:
1. Project Initiation: Project is assessed to determine its feasibility and
scope.
2. Project Planning: Precise estimates are done to create a realistic budget
and timeline.
3. Project Execution: Re-estimation is done when there are significant changes in scope.
4. Project Monitoring and Control: Regular reviews to make sure that the
project is on track.
5. Project Closeout: Comparing original estimates with actual outcomes
and documenting estimation accuracy.
Challenges in Project Size Estimation
Project size estimation can be challenging due to multiple factors. Here are some
factors that can affect the accuracy and reliability of estimates:
1. Unclear Requirements: Initial project requirements can be vague or
subject to change, thus making it difficult to estimate accurately.
2. Lack of Historical Data: Without access to data from similar past projects, it becomes difficult to make informed estimates; estimates may turn out overly optimistic or pessimistic, leading to inaccurate planning.
3. Interdependencies: Projects with numerous interdependent tasks are harder to estimate because of the complicated interactions between their components.
4. Productivity Variability: Estimating the productivity of resources and
their availability can be challenging due to fluctuations and uncertainties.
5. Risks: Identifying and quantifying risks and uncertainties is very difficult.
Underestimating the potential risks can lead to inadequate contingency
planning, thus causing the project to go off track.
Improving Accuracy in Project Size Estimation
Improving the accuracy of project size estimation involves a combination of
techniques and best practices. Here are some key strategies to enhance
estimation accuracy:
1. Define Clear Requirements: Ensure all project requirements are
thoroughly documented and engage all stakeholders early and frequently
to clarify and validate the requirements.
2. Use Historical Data: Use data from similar past projects to make
informed estimates.
3. Use Estimation Techniques: Use various estimation techniques like
Analogue Estimation, Parametric Estimation, Bottom-Up Estimation, and
Three-Point Estimation.
4. Break Down the Project: Use a Work Breakdown Structure (WBS) and detailed task analysis to make sure that each task is specific and measurable.
5. Incorporate Expert Judgement: Engage subject matter experts and
experienced team members to provide input on estimates.
Future of Project Size Estimation
The future of project size estimation will be shaped by the advancements in
technology and methodologies. Here are some key developments that can define
the future of project size estimation:
1. Smarter Technology: Artificial intelligence (AI) could analyze past
projects and code to give more accurate forecasts, considering how
complex the project features are.
2. Data-Driven Insights: Instead of just lines of code, estimates could
consider factors like the number of users, the type of software (mobile app
vs. web app), and how much data it handles.
3. Human-AI Collaboration: Combining human expertise with AI can
enhance the decision-making process in project size estimation.
4. Collaborative Platforms: Tools that facilitate collaboration among
geographically dispersed teams can help to enhance the project size
estimation process.
5. Agile Methodologies: The adoption of agile methodologies can promote
continuous estimation and iterative refinement.
Conclusion
In conclusion, accurate project size estimation is crucial for software project
success. Traditional techniques like lines of code have limitations. The future of
estimation lies in AI and data-driven insights for better resource allocation, risk
management, and project planning.
What is cost estimation?
Cost estimation is a process where project managers predict the amount of
money they need to fund their projects. The process entails direct and indirect
costs of the project. These costs may include utilities, materials, equipment,
vendors, and employee compensation. As managers estimate costs, they may
also consider project elements, including:
 Duration: The duration signifies how long it takes to finish the project.
 Size: The managers predict how big the project is. For example, designing
a community center may be a big project, while publishing a limited book
edition may be a small project.
 Scope: Scope refers to the extent of the project, such as what groups of
people may benefit from it and what parties are participating in its
execution.
 Complexity: The more complex a project is, the more steps it may require
for completion. Project managers may consider the cost of each step to
find an accurate estimate.
Why is cost estimation important?
As a project manager, cost estimation is important for planning because it can
help you achieve the following:
Maintain your budget
When you decide to launch a project, you may design a budget that dictates how
much money you can afford to spend on resources and equipment. Cost
estimation enables you to predict the funds needed and compare the estimation
with your budget. If the estimation exceeds your budget, you can refine your
plan before starting the project.
Prevent overspending
Without a strategic plan, you may overspend on resources. For example, if you
discover halfway through a project that purchasing more equipment is
necessary, you might spend more money than expected. To help with this
challenge, estimating all your costs before you begin a project is best.
Improve profit margins
Several factors can cause project costs to rise throughout a project's life cycle.
These may include poorly scoped work, unexpected events and inflation. These
factors can present a risk to completing the project within budget and meeting
profitability targets. Accurate estimating can help you determine expected and
unexpected costs, protecting a company's profit margins.
When to use cost estimation
The cost estimation process typically occurs in a project's planning stages. Here
are examples of occasions when it may be helpful to use cost estimation:
 The project is extensive. Extensive projects entail several elements,
such as multiple third-party vendors and technological devices. For project
managers, cost estimation may be necessary to account for the costs for
every project piece.
 You're trying a new project. If the project is something you and other
project managers have never accomplished before, then cost estimation
can allow you to research the resources needed and refine your approach.
 You have multiple options for completing the project. During the
cost estimation process, you can compare the prices of the resources
necessary to build the most efficient or cost-effective plan. For example,
you might learn that you can save money by paying your permanent
employees to handle the project's tasks instead of hiring contractors.
11 cost-estimating methods
Here are several cost-estimating methods you can use for a project:
1. Parametric estimating
The parametric estimating method involves using historical data to determine
the costs of each part of the endeavor. For example, when planning to build a
two-story house, you can review the historical costs of building a house with the
same materials and square footage, which enables you to design an accurate
budget. The parametric method includes three steps:
1. Identify the number of project units, such as square footage.
2. Identify the cost of each unit.
3. Multiply the total number of units by the cost of one unit.
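A minimal sketch of these three steps (assuming Python; the square-footage and historical-cost numbers are purely illustrative):

def parametric_estimate(project_units, historical_cost, historical_units):
    # Step 2: derive the cost of one unit from historical data.
    cost_per_unit = historical_cost / historical_units
    # Step 3: multiply the number of project units by the unit cost.
    return project_units * cost_per_unit

# Step 1: the new house is 2,400 sq ft; a similar past build of 2,000 sq ft
# cost $300,000 (illustrative numbers only).
print(parametric_estimate(2400, 300000, 2000))  # 360000.0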
2. Analogous estimating
The analogous estimating method combines historical data and expert judgment
to anticipate the costs of a project. Here are its steps:
1. Identify the project's elements, such as size, scope and duration.
2. Research similar projects that have used the same elements.
3. Base the cost estimation for the current project on a budget of past
projects.
3. Three-point estimating
When using the three-point estimating method, you can develop the following
estimations for the costs of an endeavor:
 Optimistic estimate: This prediction shows the best-case scenario,
where employees complete the project and maintain the budget.
 Pessimistic estimate: In the worst-case scenario, pessimistic estimates
entail overspending funds on resources.
 Most likely estimate: A realistic estimate is a median between
optimistic and pessimistic predictions. It refers to the actual effort
employees need to produce to complete the project and its costs.
As a project manager, you can calculate the final project cost using the program evaluation and review technique (PERT) equation:
PERT estimate = [optimistic estimate + pessimistic estimate + (4 x most likely estimate)] / 6
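A minimal sketch of the PERT calculation (assuming Python; the three input estimates are made up for illustration):

def pert_estimate(optimistic, pessimistic, most_likely):
    # PERT = [optimistic + pessimistic + (4 x most likely)] / 6
    return (optimistic + pessimistic + 4 * most_likely) / 6

# Illustrative cost estimates in dollars.
print(pert_estimate(optimistic=8000, pessimistic=20000, most_likely=11000))
# (8000 + 20000 + 4*11000) / 6 = 12000.0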
4. Top-down estimating
In the top-down estimating method, you determine the total cost of a project and
separate the cost into smaller categories. For example, a nonprofit organization
is hosting a gala with an approximate cost of $15,000 overall. The event
committee notes that decorations cost $2,000, food and drinks cost $7,000 and
entertainment costs $6,000. Top-down estimating may be most beneficial at the
beginning planning stages of a project, where you can gain insight into what
resources cost the most.
5. Bottom-up estimating
Bottom-up estimating, also known as detail or engineering estimating, is the
opposite of the top-down method. You identify the prices of elements first, then
add the costs together to determine the overall costs. For instance, when
launching a marketing campaign, you might need $200 for social media
advertisements and $500 for buy-in commercials on television, which equals a
total cost of $700 for the campaign.
6. Project management information system
The project management information system (PMIS) technique uses specialized
software to manage the steps of your plan. You can input your resources and
their costs to determine their total price. The software also organizes your
resources into a calendar, showing how your plan progresses and the necessary
resources for each step.
7. Delphi method
With the Delphi method, a group of experts submits their predictions for the
costs of a project anonymously, and a mediator analyzes the responses until
they can reach an agreement. The experts respond to a questionnaire and apply
their statistical knowledge during a series of panels. They may adjust their
answers as they learn additional information, and the figure they agree on is the
final estimate for the project.
8. Decision-making
The decision-making model considers every team member's opinion, meaning
the employees vote on the cost estimation figure. The team decides by achieving
votes in ways such as:
 Unanimous vote: Every team member agrees on the figure. For instance,
if there are five people on the team, all five need to share the same
perspective.
 Majority vote: A majority vote, or plurality, encompasses more than half
of the team. For example, if there are 10 members, at least six can vote
the same way.
 Points allocation: The team assigns 100 points across a particular
subject, which can highlight areas of interest or value. For instance, an
area that receives two points may be insignificant, while an area that
receives 89 points may require more attention.
9. Vendor bid analysis
Vendor bid analyses may be beneficial for projects requiring vendors' use
exclusively. First, you send a request for proposal (RFP) document to vendors
you're interested in hiring. In the document, the vendors list the price and quality
of their services and share their responses with you. Then, you compare the
prices to estimate how much the entire project may cost.
10. Reserve analysis
The reserve analysis technique accounts for challenges that may occur when
executing the project. It includes funds for the contingency reserve and money
for expected conflicts, such as technical difficulties or limited productivity. It also
includes the management reserve, which is funds that cover unexpected
conflicts, such as historical weather events or national health
emergencies.
11. Expert judgment
By receiving expert assistance, your team can resolve interpersonal conflict and
select the best estimating method for your endeavors. Experts examine historical
data and explain how an environment can affect the execution of a project. For
example, experts may advise you to debut a children's movie during the summer
since the target audience may be out of school and attending the movie theater
frequently. They may also suggest combining estimating methods to calculate
the most approximate cost figure.
COCOMO Model – Software Engineering
COCOMO II is the revised version of the original COCOMO (Constructive Cost Model) and was developed at the University of Southern California. It is a model that allows one to estimate the cost, effort, and schedule when planning a new software development activity.
Sub-Models of COCOMO Model

COCOMO Sub-models
1. End User Programming
Application generators are used in this sub-model; end users write the code using these application generators. Examples include spreadsheets, report generators, etc.
2. Intermediate Sector

COCOMO Intermediate Sector


 Application Generators and Composition Aids: This category creates largely prepackaged capabilities for user programming. Their products contain many reusable components. Typical firms operating in this sector are Microsoft, Lotus, Oracle, IBM, Borland, and Novell.
 Application Composition Sector: This category is too diversified to be handled by prepackaged solutions. It includes GUIs, databases, and domain-specific components such as financial, medical, or industrial process control packages.
 System Integration: This category deals with large scale and highly
embedded systems.
3. Infrastructure Sector
This category provides infrastructure for the software development like
Operating System, Database Management System, User Interface Management
System, Networking System, etc.
Stages of COCOMO II

Stages of COCOMO
1. Stage-I
It supports estimation of prototyping. For this it uses Application Composition
Estimation Model. This model is used for the prototyping stage of application
generator and system integration.
2. Stage-II
It supports estimation in the early design stage of the project, when less is known about it. For this it uses the Early Design Estimation Model. This model is used in the early design stage of application generators, infrastructure, and system integration.
3. Stage-III
It supports estimation in the post architecture stage of a project. For this it uses
Post Architecture Estimation Model. This model is used after the completion of
the detailed architecture of application generator, infrastructure, system
integration.
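The stages above describe COCOMO II's sub-models; as a rough illustration of how a COCOMO-style model converts size into effort and schedule, the sketch below uses the classic coefficients of the basic (1981) COCOMO model rather than COCOMO II (assuming Python; the 32 KLOC project is an illustrative value):

# Basic COCOMO (1981): Effort = a * KLOC**b person-months,
# Time = c * Effort**d months. Coefficients per development mode.
MODES = {
    "organic": (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded": (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b        # estimated effort in person-months
    time = c * effort ** d        # estimated development time in months
    staff = effort / time         # average number of people needed
    return effort, time, staff

# Illustrative example: a 32 KLOC organic-mode project.
effort, time, staff = basic_cocomo(32, "organic")
print(f"Effort: {effort:.1f} PM, Time: {time:.1f} months, Staff: {staff:.1f}")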
