Estimation determines the resources needed to build a system and involves estimating the software size, effort, time, and cost. It is based on past data, documents, assumptions, and risks. The main steps are estimating the software size, effort, time, and cost. Software size can be estimated in lines of code or function points. Effort estimation calculates person-hours or months based on software size using formulas like COCOMO-II. Cost estimation considers additional factors like hardware, tools, personnel skills, and travel. Techniques for estimation include decomposition and empirical models like Putnam and COCOMO, which relate size to time and effort.
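For illustration, here is a minimal sketch of how a size-based effort and schedule estimate can be computed. It uses the classic basic-COCOMO organic-mode coefficients (a = 2.4, b = 1.05, c = 2.5, d = 0.38) rather than COCOMO-II's calibrated scale factors and cost drivers, so treat it as an approximation, not the model described in the document:
```python
# Minimal sketch: size-based effort/schedule estimate in the basic COCOMO style.
# Coefficients are the classic organic-mode values; COCOMO-II instead uses
# calibrated scale factors and effort multipliers.
def estimate(kloc: float, a: float = 2.4, b: float = 1.05,
             c: float = 2.5, d: float = 0.38) -> tuple[float, float]:
    effort_pm = a * kloc ** b          # effort in person-months
    time_months = c * effort_pm ** d   # nominal development schedule in months
    return effort_pm, time_months

effort, schedule = estimate(32)        # e.g. a 32 KLOC system
print(f"effort = {effort:.0f} person-months, schedule = {schedule:.1f} months")
```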
Software project management: Improving Team Effectiveness (REHMAT ULLAH)
This document discusses improving team effectiveness for software project management. It emphasizes that managing the team is key and a well-managed team can overcome other shortcomings. Some recommendations include using top talent and fewer people, properly matching skills and motivations to jobs, allowing career progression, balancing the team's skills and personalities, and phasing out underperforming team members. Overall, the most important factors for an effective team are teamwork, balance, strong leadership that keeps the team together and recognizes both individual and group needs.
Software Engineering (Software Process: A Generic View) (ShudipPal)
This document provides an overview of software processes and engineering. It defines a software process as a series of predictable steps that lead to a timely, high-quality product. The document then discusses the generic process framework activities of communication, planning, modeling, construction, and deployment. It also covers umbrella activities like project management, reviews, and quality assurance that span the entire software process. Finally, it introduces the Capability Maturity Model Integration for assessing software processes and describes its five maturity levels from initial to optimized.
Project control and process instrumentation (Kuppusamy P)
The document discusses project control and process instrumentation for software development projects. It describes 7 core metrics that can be used to measure: 1) management indicators like work progress, budget, and staffing, and 2) quality indicators like change activity, breakage, rework, and defects over time. These metrics provide objective assessments of progress, quality, and estimates. The document also discusses automating metric collection and displaying metrics through a software project control panel to provide visibility into the project.
This document discusses software architecture from both a management and technical perspective. From a management perspective, it defines an architecture as the design concept, an architecture baseline as tangible artifacts that satisfy stakeholders, and an architecture description as a human-readable representation of the design. It also notes that mature processes, clear requirements, and a demonstrable architecture are important for predictable project planning. Technically, it describes Philippe Kruchten's model of software architecture, which includes use case, design, process, component, and deployment views that model different aspects of realizing a system's design.
This document discusses software metrics and measurement. It describes how measurement can be used throughout the software development process to assist with estimation, quality control, productivity assessment, and project control. It defines key terms like measures, metrics, and indicators and explains how they provide insight into the software process and product. The document also discusses using metrics to evaluate and improve the software process as well as track project status, risks, and quality. Finally, it covers different types of metrics like size-oriented, function-oriented, and quality metrics.
The document discusses different structures for programming teams:
- Democratic structure where all members participate in decisions and leadership rotates.
- Chief programmer structure with one lead programmer who designs work and manages others.
- Hierarchical structure that combines aspects of the democratic and chief programmer models with levels like project leader, senior programmers, and junior programmers.
The structures vary in things like communication paths, decision making, and suitability for different types and sizes of projects.
The document outlines the various workflows that make up the software development process, including management, environment, requirements, design, implementation, assessment, and deployment workflows. It describes the key activities for each workflow, such as controlling the process, evolving requirements and design artifacts, programming components, assessing product quality, and transitioning the product to users. The document also notes that iterations consist of sequential activities that vary depending on where an iteration falls in the development cycle.
Risk management involves identifying potential problems, assessing their likelihood and impacts, and developing strategies to address them. There are two main risk strategies - reactive, which addresses risks after issues arise, and proactive, which plans ahead. Key steps in proactive risk management include identifying risks through checklists, estimating their probability and impacts, developing mitigation plans, monitoring risks and mitigation effectiveness, and adjusting plans as needed. Common risk categories include project risks, technical risks, and business risks.
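As a hedged illustration of the proactive steps above, the sketch below ranks risks by exposure (probability times impact); the risk names and numbers are invented for the example and are not from the document:
```python
# Illustrative only: estimate each risk's exposure (probability x impact),
# rank the risks, and keep the top ones on a mitigation/monitoring list.
risks = [
    {"name": "key staff turnover",    "probability": 0.3, "impact_cost": 80_000},
    {"name": "unstable requirements", "probability": 0.5, "impact_cost": 50_000},
    {"name": "vendor tool delay",     "probability": 0.2, "impact_cost": 30_000},
]

for r in risks:
    r["exposure"] = r["probability"] * r["impact_cost"]   # expected cost of the risk

for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{r["name"]:<22} exposure = {r["exposure"]:>9,.0f}')
```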
The document discusses the software crisis, which occurs when the demand for software increases but development methods and tools do not keep pace. This leads to budget overruns, low-quality software, missed deadlines, and unmanageable code. The software crisis is caused by scaling problems, high costs, delays, unreliability, complexity, and duplicated effort. One solution is software engineering, which takes a systematic and disciplined approach to development through guidelines like reducing budgets, improving quality, shortening timelines, and using experienced teams.
This document discusses software project management artifacts. Artifacts are organized into management and engineering sets. The management set includes artifacts like the work breakdown structure, business case, and software development plan. The engineering set includes requirement, design, implementation, and deployment artifact sets. Each set captures information through various notations and tools to manage the software development lifecycle.
This document discusses different process models used in software development. It describes the key phases and characteristics of several common process models including waterfall, prototyping, V-model, incremental, iterative, spiral and agile development models. The waterfall model involves sequential phases from requirements to maintenance without iteration. Prototyping allows for user feedback earlier. The V-model adds verification and validation phases. Incremental and iterative models divide the work into smaller chunks to allow for iteration and user feedback throughout development.
The document discusses organization and team structures for software development organizations. It explains the differences between functional and project formats. The functional format divides teams by development phase (e.g. requirements, design), while the project format assigns teams to a single project. The document notes that advantages of the functional format include specialization, documentation, and handling staff turnover; however, it is not suitable for small organizations with few projects. The document also describes common team structures like chief programmer, democratic, and mixed control models.
This document provides an overview of quality management concepts and techniques for software engineering. It discusses quality assurance, software reviews, formal technical reviews, statistical quality assurance, software reliability, and the ISO 9000 quality standards. The document includes slides on these topics with definitions, descriptions, and examples.
The document discusses various aspects of planning and managing the software development process, including:
1) Developing a solution strategy and selecting a software life cycle model to provide a framework for the project.
2) Common software life cycle activities like planning, development, testing, and maintenance.
3) Using milestones, documents, and reviews to improve project visibility and management.
4) Organizing development tasks and teams using different structures like project, functional, and matrix formats.
Software maintenance typically requires 40-60% of the total lifecycle effort for a software product, with some cases requiring as much as 90%. A widely used rule of thumb is that maintenance activities are distributed as 60% for enhancements, 20% for adaptations, and 20% for corrections. Studies show the typical level of effort devoted to software maintenance is around 50% of the total lifecycle effort. Boehm suggests measuring maintenance effort using an activity ratio that considers the number of instructions added or modified over the total instructions. The effort required can then be estimated using programmer months based on the activity ratio and an effort adjustment factor. Emphasis on reliability during development can reduce future maintenance effort.
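A minimal sketch of the maintenance-effort estimate described above, assuming the activity ratio simply scales the original development effort via an effort adjustment factor (the function and variable names are illustrative, not taken from the source):
```python
# Sketch: maintenance effort from an activity ratio (instructions added or
# modified over total instructions), scaled by an effort adjustment factor.
def maintenance_effort(added: int, modified: int, total: int,
                       development_effort_pm: float,
                       effort_adjustment_factor: float = 1.0) -> float:
    activity_ratio = (added + modified) / total
    return activity_ratio * effort_adjustment_factor * development_effort_pm

# e.g. 5,000 of 100,000 instructions touched, 900 person-months of development effort
print(f"{maintenance_effort(3000, 2000, 100_000, 900):.0f} person-months")
```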
Architecture design in software engineering (Preeti Mishra)
The document discusses software architectural design. It defines architecture as the structure of a system's components, their relationships, and properties. An architectural design model is transferable across different systems. The architecture enables analysis of design requirements and consideration of alternatives early in development. It represents the system in an intellectually graspable way. Common architectural styles structure systems and their components in different ways, such as data-centered, data flow, and call-and-return styles.
The Delphi technique was developed to gain expert consensus on estimates without group influence. It can be adapted for software cost estimation by having estimators provide anonymous estimates in rounds. A coordinator summarizes estimates between rounds and asks outliers to justify differences, iterating until consensus. A variation allows group discussion with the coordinator but maintains anonymous estimating to focus on variances. Additional information may be needed if consensus is not reached.
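A small sketch of that iteration loop, with hard-coded example rounds and an assumed consensus criterion (spread within 20% of the mean), purely for illustration:
```python
# Illustrative Delphi-style loop: anonymous estimates are collected each round,
# summarized, and the process repeats until the spread is acceptably small.
from statistics import mean

rounds = [
    [12, 20, 9, 30, 15],   # round 1: person-month estimates from five estimators
    [14, 18, 13, 22, 16],  # round 2: after outliers justify their assumptions
    [15, 17, 15, 18, 16],  # round 3
]

for i, estimates in enumerate(rounds, start=1):
    spread = max(estimates) - min(estimates)
    print(f"round {i}: mean = {mean(estimates):.1f} PM, spread = {spread} PM")
    if spread <= 0.2 * mean(estimates):    # assumed consensus criterion
        print("consensus reached")
        break
```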
The document discusses software quality management. It covers quality fundamentals like culture, costs and models. It describes quality management processes like quality assurance, verification and validation, reviews and audits. It discusses quality requirements, defect characterization and management techniques like static, people-intensive and dynamic techniques. The document provides details on quality measurement and testing to ensure software quality.
The waterfall model segments the software development process into sequential phases: planning, requirements definition, design, implementation, system testing, and maintenance. Each phase has defined inputs, processes, and outputs. The planning phase involves understanding the problem, feasibility studies, and developing a solution. Requirements definition produces a specification describing the required software functions and constraints. Design identifies software components and their relationships. Implementation translates the design into code. System testing integrates and accepts the software. Maintenance modifies the software after release. While the phases are linear, the development process is not always perfectly sequential.
This document discusses several software design techniques: stepwise refinement, levels of abstraction, structured design, integrated top-down development, and Jackson structured programming. Stepwise refinement is a top-down technique that decomposes a system into more elementary levels. Levels of abstraction designs systems as layers with each level performing services for the next higher level. Structured design converts data flow diagrams into structure charts using design heuristics. Integrated top-down development integrates design, implementation, and testing with a hierarchical structure. Jackson structured programming maps a problem's input/output structures and operations into a program structure to solve the problem.
The document discusses the limitations of the conventional or waterfall model of software development. It identifies five major problems with the conventional approach: 1) protracted integration and late design breakage due to lack of early integration and testing, 2) late risk resolution as risks are not addressed until late in the project, 3) requirements-driven decomposition leading to suboptimal component organization, 4) adversarial stakeholder relationships due to lack of early and frequent customer involvement, and 5) excessive focus on documents and review meetings rather than engineering work. The document advocates using a modern approach that assesses projects early and continuously to avoid these problems.
The document discusses several prescriptive software process models including:
1) The waterfall model which follows sequential phases from requirements to deployment but lacks iteration.
2) The incremental model which delivers functionality in increments with each phase repeated.
3) Prototyping which focuses on visible aspects to refine requirements through iterative prototypes and feedback.
4) The RAD (Rapid Application Development) model which emphasizes very short development cycles of 60-90 days using parallel teams and automated tools. The document provides descriptions and diagrams of each model.
This document discusses the evolution of software management approaches from conventional to modern practices. It begins by describing the conventional waterfall model and its weaknesses. It then discusses how software economics and management have evolved, moving from the 1960s-1970s era of custom and unpredictable projects to the 1980s-1990s introduction of repeatable processes, off-the-shelf tools, and some reuse. Finally, it describes modern practices from 2000 onward that focus on managed processes, integrated environments, and mostly reusable component-based development, enabling more predictable delivery.
* What is Engineering?
* Who is an Engineer?
* The reasons to become an Engineer
* What is Software Engineering?
* Software Engineering: History
* The principles of Software Engineering
* Who is a Software Engineer?
* The reasons to become a Software Engineer
* Requirements of being a Software Engineer
* The Areas of Software Engineers
* The working areas of Software Engineers
* Difference between Computer Science and Software Engineering
* Pros and Cons of being a Software Engineer
* A Software Engineer's Responsibilities
* The Most Popular Software Development Methodologies (Waterfall, Rapid Application Development, Agile, and DevOps)
* Version control
* Centralized Version Control
The document discusses different software engineering process models including:
1. The waterfall model which is a linear sequential model where each phase must be completed before moving to the next.
2. Prototyping models which allow requirements to be refined through building prototypes.
3. RAD (Rapid Application Development) which emphasizes short development cycles through reuse and code generation.
4. Incremental models which deliver functionality in increments with early increments focusing on high priority requirements.
5. The spiral model which has multiple iterations of planning, risk analysis, engineering and evaluation phases.
1. The document discusses various software engineering process models including waterfall, prototyping, RAD, incremental, and spiral models. It describes the key phases and advantages/disadvantages of each.
2. It also covers system engineering and how software engineering occurs as part of developing larger systems. Business process engineering and product engineering are introduced for developing information systems and products respectively.
3. Key aspects of developing computer-based systems are outlined including the elements of software, hardware, people, databases, documentation and procedures.
Kelis King - software engineering and best practices (KelisKing)
Kelis King's offerings involve conducting system testing to ensure correct operation, and integration testing to ensure the system integrates correctly with other required systems, such as databases.
The document discusses some key issues with conventional software management approaches like the waterfall model. It notes that software development is unpredictable and that management discipline is more important for success than technology. Some problems with the waterfall model are late risk resolution, adversarial stakeholder relationships due to rigid documentation requirements, and a focus on documents over engineering work. The document also provides metrics on the relative costs of development versus maintenance and how people are a major factor in productivity.
In this Business Analysis Training session you will learn about the SDLC. Topics covered in this session are:
SDLC
• Waterfall-Sequential
• Prototyping
• Spiral-Evolutionary
• Rational Unified Process (RUP)-Iterative
To learn more about this course, visit this link: https://www.mindsmapped.com/courses/business-analysis/foundation-level-business-analyst-training/
The document provides an overview of traditional predictive and adaptive software development processes, including waterfall, iterative incremental, and spiral models. It then discusses agile software development processes like Scrum and extreme programming. Key aspects of each methodology are defined such as roles, meetings, user stories, and emphasis on rapid delivery through short iterations. Adaptive methods prioritize quickly adapting to changes while predictive methods focus on detailed long-term planning.
Lect-1: Software Project Management - Project Dimensions, Players, SDLC and P... (Mubashir Ali)
Course Synopsis:
This course gives you an overview of what Software Project Management actually is and what tools and techniques you will use to manage your project. Risk management, quality assurance activities, and project planning and scheduling activities will also be covered in this course.
Reference:
1. Software Project Management, Bob Hughes, Mike Cotterell, McGraw-Hill Higher Education, 5th Edition
2. Handouts & Research Papers
Project Management is the art of maximizing the probability that a project delivers its goals on Time, to Budget and at the required Quality.
Project management is the application of knowledge, skills, tools, and techniques to project activities to meet project requirements.
A project is an activity with specific goals which takes place over a finite period of time.
Computer programs are not project management: they are tools for project managers to use. Project management is all that mix of control, leadership, teamwork, resource management, and other components that goes into a successful project.
Temporary means that every project has a definite beginning and a definite end.
Projects involve creating something that has not been done in exactly the same way before and which is, therefore, unique and distinct.
The four P's have a substantial influence on software project management:
People must be organized into effective teams, motivated to do high-quality software work, and coordinated to achieve effective communication.
The Product requirements must be communicated from customer to developer.
The Process must be adapted to the people and the problem.
The Project must be organized in a manner that enables the software team to succeed.
The document provides information on Agile vs Waterfall methodologies for software development. It describes Agile as an iterative approach that values individuals, interactions, working software and responding to change over processes, tools, documentation and following a plan. Waterfall is described as a linear sequential process where each phase must be completed before the next can begin. The document outlines the phases and characteristics of both approaches and discusses their pros and cons for different project types.
This document outlines a student feedback system project created by group members Mayur Sandbhor, Ganesh Mali, and Atish Johare under the guidance of Mithun Mhatre. The project uses Java and Oracle 10g to allow students to provide online feedback about college staff. It has two modules - one for students to submit feedback and one for administrators to view feedback reports. The project goes through phases of analysis, design using UML diagrams, testing, and concludes with discussing benefits and potential enhancements.
This document discusses iterative software development and its benefits over traditional waterfall development. It notes that iterative development addresses risks earlier through incremental deliverables. Each iteration includes integration, testing, and assessment. This allows problems to be identified and addressed sooner. In contrast, waterfall development delays testing until late in the project and does not allow for feedback and changes between phases. The document recommends iterative development as a best practice to address common problems like changing requirements and late discovery of issues.
Balancing PM & Software Development Practices by Splunk Sr PMProduct School
Main takeaways:
- Software, Web/Mobile, Product Management and Leveraging the Cloud, AWS & Google Cloud Platform,
- Compiling Detailed Requirements and Design, UI/UX + Software Architecture & Design,
- Balancing Project Management and Software Development Practices, Agile/Scrum, and working with Engineering Teams
The document discusses the Rational Unified Process (RUP), an iterative software development process for building object-oriented systems. It is based on commonly accepted best practices like iterative development. The RUP combines requirements, analysis, design, and testing activities into a series of timed iterations. Each iteration results in an integrated and tested increment of functionality. The RUP aims to deliver early and continuous value through practices like iterative development, user involvement, and risk-driven development.
This document discusses architecture in agile projects. It covers how agile methods like Scrum incorporate architecture through iterative development and continuous delivery. It also discusses balancing upfront architecture work with flexibility through methods like Architecture Tradeoff Analysis and attribute-driven design. A case study shows how one project used agile practices like continuous experimentation, refactoring, and incremental improvements to develop a complex system architecture.
The document discusses ICT program and project management in financial industries. It provides an agenda covering general conditions of ICT in the financial sector, program management, agile program management using the Program Management Circle Agile (PMCA) method, project management, and high and low service level agreement methods. Examples of applying these methods in practice are also briefly discussed.
This document provides an overview of different models for managing technology projects, including the waterfall model, DevOps model, and spiral model. It discusses the key phases and aspects of each model. The waterfall model is a linear sequential approach, while DevOps emphasizes collaboration between development and operations teams. The spiral model is a risk-driven approach that combines elements of the waterfall and iterative processes. The document also outlines learning objectives, assessments, and additional references for each section.
Value Stream Mapping Workshops for Intelligent Continuous Security (Marc Hornbeek)
This presentation provides detailed guidance and tools for conducting Current State and Future State Value Stream Mapping workshops for Intelligent Continuous Security.
Software Process and Project Management - CS832E02 unit 3
1. MISSION
CHRIST is a nurturing ground for an individual’s
holistic development to make effective contribution to
the society in a dynamic environment
VISION
Excellence and Service
CORE VALUES
Faith in God | Moral Uprightness
Love of Fellow Beings
Social Responsibility | Pursuit of Excellence
Software Process and Project Management
(CS832E02)
Unit 3: Software Project Management
Renaissance
Mithun B N
Asst. Prof
Dept. of CSE
2. Excellence and Service
CHRIST
Deemed to be University
Unit 3: Software Project Management Renaissance
Contents:
Conventional Software Management
Evolution of Software Economics
Improving Software Economics
The old way and the new way
3. MISSION
CHRIST is a nurturing ground for an individual’s
holistic development to make effective contribution to
the society in a dynamic environment
VISION
Excellence and Service
CORE VALUES
Faith in God | Moral Uprightness
Love of Fellow Beings
Social Responsibility | Pursuit of Excellence
Unit 3: Software Project Management
Renaissance
Chapter 1: Conventional Software
Management
4. Excellence and Service
CHRIST
Deemed to be University
Conventional Software Management
● The best thing about software is its flexibility: it can be used
to program anything.
● The worst thing about software is also its flexibility. This
"anything" characteristic has made it difficult to plan, monitor,
and control software development.
● Analyses of software development show that the
success rate of software projects is very low. The following
are three main reasons for the low success rate.
5. Excellence and Service
CHRIST
Deemed to be University
● a) Software development is highly unpredictable. Only 10% of
software projects are delivered successfully within budget
and time.
● b) Management discipline is a greater discriminator of success or
failure than technology advances.
● c) The level of software scrap and rework is indicative of an
immature process.
8. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● There are three primary points:
○ There are two essential steps common to the development of computer
programs: analysis and coding.
○ Other steps are needed between analysis and coding, including system
requirements definition, software requirements definition, program design, and
testing.
○ The basic framework described in the waterfall model is risky and invites failure,
because it postpones testing until the end of software development.
9. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Five improvements to the waterfall process are as follows:
○ Program design comes first
○ Document the design
○ Do it twice
○ Plan, control and monitor testing
○ Involve the customer
10. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Program design comes first
○ Add a program design phase between the software requirements generation phase and the
analysis phase.
○ This helps the program designer assure that the software will not fail because of storage,
timing, and data flux.
○ The program designer must impose the storage, timing, and operational constraints on the analyst in
such a way that the analyst senses their consequences.
○ The steps required for adding a program design phase are as follows:
■ The design process is initiated with program designers, not analysts or programmers.
■ Design, define, and allocate the data processing modes. Allocate processing functions, design the
database, allocate execution time, define interfaces and processing modes with the OS,
describe input and output processing, and define preliminary operating procedures.
■ Write an overview document that is understandable, informative and current so that every
worker on the project can gain an elemental understanding of the system.
11. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Document the design
○ The needs for documentation are as follows:
○ Each designer must communicate with interfacing designers, managers, and
customers.
○ During the early phases, the documentation is the design.
○ The real monetary value of documentation is to support later modifications by a
separate test team, a separate maintenance team, and operations personnel
who are not software literate.
● Note: Major advances in notations, languages, browsers, tools, and
methods have rendered the need for many of the documents obsolete
12. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Do it Twice:
○ In the first version, the team must have a special broad competence so that they
can quickly sense trouble spots in the design, model them, model alternatives,
forget the straightforward aspects of the design that aren’t worth studying at this
early point, and finally arrive at an error-free program.
○ There is a need for architecture-first development, in which an architecture team is
responsible for the initial engineering. ‘Do it N times’ is a principle of modern-day
iterative development.
13. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Plan, control and monitor testing:
○ The test phase is the biggest user of project resources (manpower and computer time)
and carries the greatest risk in terms of cost and schedule.
○ The previous three recommendations were aimed at solving problems before testing.
○ Improvements in the testing phase:
■ Employ a team of test specialists who were not responsible for the original design
■ Employ visual inspections to spot the obvious errors
■ Test every logic path
■ Employ the final checkout on the target computer
14. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Theory
● Involve the customer
○ It is important to involve the customer in a formal way so that he has committed himself
at earlier points before final delivery.
○ There are three points following requirements definition where the insight, judgement, and
commitment of the customer can bolster the development effort.
○ Involve the customer in critical software design reviews during program design and in a
final software acceptance review.
○ Involving the customer with early demonstrations and planned alpha/beta releases is a
proven, valuable technique.
15. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Project destined for trouble, has following symptoms:
○ Protracted integration and late design breakage
○ Late risk resolution
○ Requirements-driven functional decomposition
○ Adversarial stakeholder relationships
○ Focus on documents and review meetings
16. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Protracted Integration and Late Design Breakage
○ Progress is defined as percent coded, demonstrable in its target form.
○ The sequence is:
■ Early success via paper designs and thorough briefings
■ Commitment to code late in the life cycle
■ Integration nightmares due to unforeseen implementation issues and interface
ambiguities
■ Heavy budget and schedule pressure to get the system working
■ Late shoe-horning of non-optimal fixes, with no time for redesign
■ A very fragile, unmaintainable product delivered late
17. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Protracted Integration and Late design breakage:
○ Conventional techniques resulted in late integration
and performance showstoppers
○ The entire system was designed on paper, then
implemented all at once, then integrated.
○ Only at the end of the process was it possible to perform
system testing to verify that the fundamental
architecture was sound.
○ Testing activities consumed 40% or more of life-cycle
resources.
○ The table provides a typical profile of cost expenditures
across the spectrum of software activities.
18. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Late risk resolution
○ A serious issue associated with the waterfall life cycle was the lack of early risk resolution
○ Risk is defined as the probability of missing a cost, schedule, feature, or quality goal.
○ Early in the life cycle, as the requirements were being specified, the actual risk exposure
was highly unpredictable.
○ As the system was coded, some of the individual component risks began to become
tangible.
○ Projects tended to have a protracted integration phase as major redesign initiatives were
implemented.
○ This tended to resolve important risks, but not without sacrificing the quality of the end product.
19. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Requirements driven functional decomposition
○ Software development process has been requirements driven.
○ This approach depends on specifying requirements completely and unambiguously
before other development activities begin
○ Specifications of requirements is a difficult and important part of the software
development process.
○ The equal treatment of all requirements drains away substantial numbers of engineering
hours from the driving requirements and wastes those efforts on paperwork associated
with traceability, testability, logistics support and so on
○ In the conventional approach, requirements were typically specified in a functional manner.
Built into the classic waterfall process was the fundamental assumption that the
software itself was decomposed into functions; requirements were then allocated to the
resulting components.
20. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Adversarial stakeholder relationships
○ The conventional process tended to result in adversarial stakeholder relationships, in
large part because of the difficulties of requirements specification and the exchange of
information through paper documents that capture engineering information in ad hoc
formats.
○ The following sequence of events was typical for most contractual s/w efforts:
■ The contractor prepared a draft contract-deliverable document that captured an
intermediate artefact and delivered it to the customer for approval
■ The customer was expected to provide comments
■ The contractor incorporated these comments and submitted a final version for
approval
■ This approach resulted in customer-contractor relationships degenerating into mutual
distrust, making it difficult to achieve a balance among requirements, schedule and cost
21. Excellence and Service
CHRIST
Deemed to be University
The Waterfall Model – In Practice
● Focus on documents and review meetings
○ The conventional process focused on producing various documents with insufficient
focus on producing tangible increments of the products themselves.
○ Contractors produce literally tons of paper to meet milestones and demonstrate progress
to stakeholders
○ A typical design review is as shown in the figure.
22. Excellence and Service
CHRIST
Deemed to be University
Conventional software management performance
● Barry Boehm top 10 list are as follows:
○ Finding and fixing a software problem after delivery costs 100 times more than finding
and fixing the problem in early design phases.
○ You can compress a software development schedule up to 25% of nominal, but no more.
○ For every $1 you spend on development, you will spend $2 on maintenance.
○ Software development and maintenance costs are primarily a function of the number of
source lines of code
○ Variations among people account for the biggest differences in software productivity
○ The overall ratio of software to hardware costs is still growing. In 1955 it was 15:85, in
1985, 85:15
○ Only about 15% of software development effort is devoted to programming
23. Excellence and Service
CHRIST
Deemed to be University
Conventional software management performance
● Software systems and products typically cost 3 times as much per SLOC as
individual software programs. Software system products cost 9 times as much
● Walkthroughs catch 60% of the errors.
● 80% of the contribution comes from 20% of the contributors.
24. MISSION
CHRIST is a nurturing ground for an individual’s
holistic development to make effective contribution to
the society in a dynamic environment
VISION
Excellence and Service
CORE VALUES
Faith in God | Moral Uprightness
Love of Fellow Beings
Social Responsibility | Pursuit of Excellence
Unit 3: Software Project Management
Renaissance
Chapter 2: Evolution of Software Economics
25. Excellence and Service
CHRIST
Deemed to be University
Evolution of software economics
● Economic results of conventional software projects reflect an industry
dominated by custom development, ad hoc processes and diseconomies of
scale.
● Today’s cost models are based primarily on empirical project databases with
very few modern iterative development success stories.
● Good software cost estimates are difficult to attain. Decision makers must
deal with highly imprecise estimates.
● A modern process framework attacks the primary sources of the inherent
diseconomy of scale in the conventional software process.
26. Excellence and Service
CHRIST
Deemed to be University
Evolution of software economics -
● Five basic parameters of software cost models are as follows:
○ Size – end product, typically quantified in terms of the number of source instructions or
the number of function points required to develop the required functionality
○ Process – the process used to produce the end product, in particular the ability of the
process to avoid non-value-adding activities
○ Personnel – capabilities of software engineering personnel and particularly their
experience with the computer science issues and the applications domain issues of the
project.
○ Environment – is made up of the tools and techniques available to support efficient
software development and to automate the process.
○ Quality – the required quality of the product, including its features, performance,
reliability and adaptability.
27. Excellence and Service
CHRIST
Deemed to be University
Software Economics
● The relationships among these parameters and the estimated cost can be
written as follows:
○ Effort = (Personnel)(Environment)(Quality)(Size^Process), where the process parameter
appears as an exponent on size
● Important aspect of software economics is that the relationship between effort
and size exhibits a diseconomy of scale
○ Ex: for a given application, a 10,000 line software solution will cost less per line than a
100,000 line software solution. How much less?
○ Assume that 100,000 line system requires 900 staff-months for development, or about
111 lines per staff-month, or 1.37 hours per line.
○ If this same system were only 10,000 lines and all other parameters were held constant,
this project would be estimated at 62 staff-months or about 175 lines per staff-month, or
0.87 hour per line.
○ The per line cost for the smaller application is much less than for the larger application
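● As a minimal illustration of this relationship, the Python sketch below evaluates the effort
equation for two sizes. The coefficient, the process exponent and the 152-hour staff-month are
assumed, illustrative values rather than figures from a calibrated cost model; the point is only
that an exponent greater than 1 makes the larger system cost more per line.

# Illustrative diseconomy-of-scale model: Effort = coefficient * (size in KSLOC) ** exponent.
# The coefficient and exponent are assumptions for demonstration, not calibrated values.
HOURS_PER_STAFF_MONTH = 152  # a common planning assumption

def effort_staff_months(size_sloc, coefficient=3.0, process_exponent=1.2):
    """Estimated effort in staff-months for a given size in SLOC."""
    return coefficient * (size_sloc / 1000) ** process_exponent

for size in (10_000, 100_000):
    effort = effort_staff_months(size)
    sloc_per_sm = size / effort
    hours_per_sloc = HOURS_PER_STAFF_MONTH / sloc_per_sm
    print(f"{size:>7} SLOC: {effort:5.0f} staff-months, "
          f"{sloc_per_sm:4.0f} SLOC/staff-month, {hours_per_sloc:.2f} h/SLOC")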
29. Excellence and Service
CHRIST
Deemed to be University
Software Economics
● Figure shows three generations of basic technology advancement in tools,
components, and processes.
● The required levels of quality and personnel are assumed to be constant
● The ordinate of the graph refers to software unit costs realized by an
organization.
● The three generations of software development are defined as follows:
○ Conventional
○ Transition
○ Modern practice
30. Excellence and Service
CHRIST
Deemed to be University
Software Economics
● Conventional - 1960s and 1970s craftsmanship
○ Organizations used custom tools, custom processes and virtually all custom components
built in primitive languages
○ Project performance was highly predictable in that cost, schedule and quality objectives
were almost always underachieved
● Transition – 1980s and 1990s software engineering
○ Organizations used more repeatable processes and off-the-shelf tools.
○ Commercial products (operating systems, DBMSs, networking and graphics) became available
○ Some organizations began to achieve economies of scale even as application
complexity grew
31. Excellence and Service
CHRIST
Deemed to be University
Software Economics
● Modern practices – 2000 and later software production
○ Typically only about 30% of the components are custom built
○ With advances in software technology and integrated production environments,
these component-based systems can be produced very rapidly
32. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● One critical problem in software cost estimation is a lack of well-documented
case studies of projects that used an iterative development approach.
● The data from actual projects are highly suspect in terms of consistency and
comparability because the software industry has inconsistently defined metrics and
atomic units of measure.
● Three questions must be answered when estimating software cost:
○ Which cost estimation model should be used?
○ Should software size be measured in source lines of code or in function points?
○ What constitutes a good estimate?
33. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● Popular cost estimation models are:
○ COCOMO
○ CHECKPOINT
○ ESTIMACS
○ KnowledgePlan
○ Price-S
○ ProQMS
○ SEER
○ SLIM
○ SOFTCOST
○ SPQR/20
34. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● Two common measures of software size are source lines of code (SLOC) and
function points.
● Many software experts have argued that SLOC is a lousy measure of size.
● Ex: when a code segment is described as a 1,000-source-line program, most
people feel comfortable with its general ‘mass’.
● If the description were 20 function points, 6 classes, 5 use cases, 4 object
points, 6 files, 2 subsystems, 1 component, or 6,000 bytes, far fewer people would
have an intuitive feel for its size.
● SLOC works well as an objective measure for custom-built software because it is
easy to automate and instrument.
● Language advances, the use of components, automatic source code
generation, and object orientation have made SLOC a more ambiguous
measure
35. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● The use of function points has a large following.
● The International Function Point Users Group (IFPUG), formed in 1984, is the
dominant software measurement association in the industry.
● Function points are independent of technology and are therefore a better primitive
unit for comparisons among projects and organizations.
● However, the primitive definitions are abstract, and measurements are not easily
derived directly from the evolving artifacts.
● Organizations making cross-project or cross-organization comparisons should use
function points as the measure of size.
● The general accuracy of conventional cost models has been described as ‘within
20% of actuals, 70% of the time’, which reflects the high unpredictability of the
conventional software development process.
36. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● As practiced in most real-world models, software cost estimation is bottom-up
(substantiating a target cost) rather than top-down.
● The software project manager defines the target cost of the software, then
manipulates the parameters and sizing until the target cost can be justified.
● It is therefore necessary to analyse the cost risks and to understand the
sensitivities and trade-offs objectively.
● This exercise provides a good vehicle for a basis of estimate and an overall
cost analysis.
● Independent cost estimates (made by parties outside the development team) are
usually inaccurate.
● The only way to produce a credible estimate is to iterate through several
estimates and sensitivity analyses, carried out jointly by the software project manager
and the software architecture, development and test managers.
37. Excellence and Service
CHRIST
Deemed to be University
Pragmatic Software Cost Estimation
● A good cost estimate has the following attributes:
○ It is conceived and supported by the project manager, architecture team, development
team and test team accountable for performing the work.
○ It is accepted by all stakeholders as ambitious but realizable
○ It is based on a well defined software cost model with a credible basis
○ It is based on a database of relevant project experience that includes similar processes,
similar technologies, similar environments, similar quality requirements and similar
people.
○ It is defined in enough detail so that its key risk areas are understood and the probability
of success is objectively assessed.
39. MISSION
CHRIST is a nurturing ground for an individual’s
holistic development to make effective contribution to
the society in a dynamic environment
VISION
Excellence and Service
CORE VALUES
Faith in God | Moral Uprightness
Love of Fellow Beings
Social Responsibility | Pursuit of Excellence
Unit 3: Software Project Management
Renaissance
Chapter 3: Improving Software Economics
40. Excellence and Service
CHRIST
Deemed to be University
Improving Software Economics
● Improvements in the economics of software development have been not only
difficult to achieve but also difficult to measure and substantiate.
● The key to substantial improvement is a balanced attack across several
interrelated dimensions.
● The five basic parameters of the software cost model suggest five dimensions of
improvement:
○ Reducing the size or complexity of what needs to be developed
○ Improving the development process
○ Using more-skilled personnel and better teams
○ Using better environments
○ Trading off or backing off on quality thresholds
42. Excellence and Service
CHRIST
Deemed to be University
Improving software economics
● Graphical User Interface (GUI) technology is a good example of tools
enabling a new and different process.
● GUI builder tools permitted engineering teams to construct an executable UI
faster and at less cost.
● The new process was geared toward taking the UI through a few realistic
versions, incorporating user feedback and achieving a stable understanding
of requirements and the design issues.
● Improvements in hardware performance have also influenced software
technology
43. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● The way to improve affordability and return on investment (ROI) is to produce
a product that achieves the design goals with the minimum amount of human
generated source material.
● Component-based development (CBD) is introduced as the general term for reducing the
source language size necessary to achieve a software solution.
● The use of newer, higher level programming languages also contributes to reducing
software product size
44. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● Languages:
○ Universal function points (UFPs) are useful estimators for language-independent, early
life-cycle estimates.
○ The basic units of function points are external user inputs, external outputs, internal
logical data groups, external data interfaces and external inquiries.
○ Observe the difference in expressiveness between Ada 83 and Ada 95
○ Observe the difference in expressiveness between C and C++
● UFP is used to indicate the relative program sizes required to implement a
given functionality.
45. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● Program sizes are as follows:
○ 1,000,000 lines of assembly language
○ 400,000 lines of C
○ 220,000 lines of Ada 83
○ 175,000 lines of Ada 95 or C++
● The difference between large and small projects has a greater than linear
impact on the life-cycle cost.
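● To make the relationship between UFPs and language-dependent size concrete, here is a
hedged Python sketch. The complexity weights are the standard IFPUG average weights, and the
SLOC-per-UFP expansion factors are commonly cited approximations chosen so that the same
functionality comes out in roughly the proportions listed above; both are illustrative
assumptions, not values taken from this text.

# Sketch: unadjusted function point (UFP) count and language-dependent program size.
AVERAGE_WEIGHTS = {                # IFPUG average-complexity weights
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

SLOC_PER_UFP = {                   # assumed, approximate expansion factors
    "Assembly": 320,
    "C": 128,
    "Ada 83": 71,
    "Ada 95 / C++": 56,
}

def unadjusted_function_points(counts):
    """Weighted sum of the five basic function types."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

counts = {                         # a hypothetical system
    "external_inputs": 40,
    "external_outputs": 30,
    "external_inquiries": 20,
    "internal_logical_files": 25,
    "external_interface_files": 10,
}

ufp = unadjusted_function_points(counts)
print(f"UFP: {ufp}")
for language, ratio in SLOC_PER_UFP.items():
    print(f"{language:>12}: ~{ufp * ratio:,} SLOC for the same functionality")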
46. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● OO Methods and Visual Modelling
○ There was a widespread movement in the 1990s towards object-oriented technology
● Three reasons for the success of the object-oriented approach are as follows:
○ An object oriented model of the problem and its solution encourages a common
vocabulary between the end users of a system and its developers, thus creating a shared
understanding of the problem being solved
○ The use of continuous integration creates opportunities to recognize risk early and make
incremental corrections without destabilizing the entire development effort.
○ An object oriented architecture provides a clear separation of concerns among disparate
elements of a system, creating firewalls that prevent a change in one part of the system
from rending the fabric of the entire architecture.
47. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● Five characteristics of a successful object-oriented project, as described by Booch, are:
○ A ruthless focus on the development of a system that provides a well understood
collection of essential minimal characteristics.
○ The existence of a culture that is centered on results, encourages communication, and
yet is not afraid to fail
○ The effective use of object-oriented modelling
○ The existence of a strong architectural vision
○ The application of a well-managed iterative and incremental development life cycle
48. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● Reuse:
○ Reusing existing components and building reusable components have been natural software
engineering activities since the earliest improvements in programming languages
○ Treating reuse as a separate discipline often gives it an undeserved level of importance
within the software engineering community.
○ Reusable components of value are transitioned to commercial products supported by
organization with the following characteristics:
■ They have an economic motivation for continued support
■ They take ownership of improving product quality, adding new features, and
transitioning to new technologies
■ They have a sufficiently broad customer base to be profitable
49. Excellence and Service
CHRIST
Deemed to be University
Reducing software product size
● Commercial Components
○ A common approach being pursued today in many domains is to maximize integration of
commercial components and off the shelf products.
51. Excellence and Service
CHRIST
Deemed to be University
Improving Team Effectiveness
● Teamwork is much more important than the sum of the individuals.
● With software teams, a project manager needs to configure a balance of solid
talent with highly skilled people in the leverage positions
● Key observations about team management include the following:
○ A well-managed project can succeed with a nominal engineering team.
○ A mismanaged project will almost never succeed, even with an expert team of engineers
○ A well-architected system can be built by a nominal team of software builders.
○ A poorly architected system will flounder even with an expert team of builders.
52. Excellence and Service
CHRIST
Deemed to be University
Improving team effectiveness
● To improve staff of software project, Boehm has offered the following staffing
principles:
○ The principle of top talent: use better and fewer people
○ The principle of job matching: fit the tasks to the skills and motivation of the people
available
○ The principle of career progression: an organization does best in the long run by helping
its people to self-actualize
○ The principle of team balance: select people who will complement and harmonize with
one another
○ The principle of phaseout: keeping a misfit on the team doesn’t benefit anyone
53. Excellence and Service
CHRIST
Deemed to be University
Improving team effectiveness
● Software development is a team sport
● Managers must nurture a culture of teamwork and results rather than
individual accomplishment.
● Team balance and job matching are the primary objectives.
● Software project managers need many leadership qualities in order to
enhance team effectiveness
● The following are crucial attributes of successful software project managers:
○ Hiring skills
○ Customer interface skills
○ Decision making skills
○ Team-building skills
○ Selling skills
54. Excellence and Service
CHRIST
Deemed to be University
Improving Automation through software environments
● The tools and environment used in the software process have a linear effect
on the productivity of the process
● Planning tools, requirements management tools, visual modelling tools,
compilers, editors, debuggers, quality assurance analysis tools, test tools,
and user interfaces provide crucial automation support for evolving the
software engineering artifacts.
● An environment that supports incremental compilation, automated system
builds, and integrated regression testing can provide rapid turnaround for iterative
development and allow development teams to iterate more freely (a small sketch of
such a build-and-regression cycle follows this list).
● In the modern approach, the development and maintenance environment is treated
as a first-class artefact of the process
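● The sketch referenced in the list above is given here: a hypothetical Python driver for an
incremental rebuild followed by a regression run. The directory names and the test command are
placeholders, not a prescribed configuration.

# Hypothetical sketch of an automated incremental build + regression cycle.
# Paths and commands are placeholders, not a real project's configuration.
import subprocess
import time
from pathlib import Path

SOURCE_DIR = Path("src")            # hypothetical source tree
STAMP_FILE = Path(".last_build")    # records the time of the last successful cycle

def changed_sources():
    """Sources modified since the last successful cycle (incremental compilation)."""
    last = STAMP_FILE.stat().st_mtime if STAMP_FILE.exists() else 0.0
    return [p for p in SOURCE_DIR.rglob("*.py") if p.stat().st_mtime > last]

def build_and_test():
    """Rebuild only what changed, then run the regression suite."""
    changed = changed_sources()
    if changed:
        print(f"Rebuilding {len(changed)} changed module(s)...")
        # a real environment would invoke compilers or generators here
    else:
        print("Nothing to rebuild.")
    result = subprocess.run(["python", "-m", "unittest", "discover", "-s", "tests"])
    if result.returncode == 0:
        STAMP_FILE.write_text(str(time.time()))
        return True
    return False

if __name__ == "__main__":
    build_and_test()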
55. Excellence and Service
CHRIST
Deemed to be University
Improving Automation through software environments
● Round-trip engineering is the term used to describe the key capability of
environments that support iterative development.
● Automation support is required to ensure efficient and error-free transition of
data from one artefact to another.
● Forward engineering is the automation of one engineering artefact from another,
more abstract representation.
● Reverse engineering is the generation or modification of a more abstract
representation from an existing artifact
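● A minimal, hypothetical Python sketch of the two directions follows: forward engineering
generates a code skeleton from an abstract model, and reverse engineering recovers a model
from existing code by introspection. The class, attribute and operation names are invented
for illustration.

# Hypothetical round-trip sketch: model -> code (forward), code -> model (reverse).
import inspect

def forward_engineer(model):
    """Generate a class skeleton (code artefact) from a more abstract model."""
    lines = [f"class {model['name']}:", "    def __init__(self):"]
    for attr in model["attributes"]:
        lines.append(f"        self.{attr} = None")
    for op in model["operations"]:
        lines.append(f"    def {op}(self):")
        lines.append("        pass")
    return "\n".join(lines)

def reverse_engineer(cls):
    """Recover an abstract model (name and operations) from an existing class."""
    ops = [name for name, member in inspect.getmembers(cls, inspect.isfunction)
           if not name.startswith("_")]
    return {"name": cls.__name__, "operations": ops}

# Forward direction: an invented 'Account' model becomes source code.
account_model = {"name": "Account",
                 "attributes": ["owner", "balance"],
                 "operations": ["deposit", "withdraw"]}
print(forward_engineer(account_model))

# Reverse direction: introspect an existing class back into a model.
class Invoice:
    def total(self): pass
    def send(self): pass

print(reverse_engineer(Invoice))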
56. Excellence and Service
CHRIST
Deemed to be University
Improving Automation through software environments
● Tool vendors make relatively accurate individual assessments of life-cycle
activities to support claims about the potential economic impact of the tools.
● Some of the claims are:
○ Requirements analysis and evolution activities consume 40% of life cycle costs
○ Software design activities have an impact on more than 50% of the resources
○ Coding and unit testing activities consume about 50% of software development effort and
schedule
○ Test activities can consume as much as 50% of a project’s resources.
○ Configuration control and change management are critical activities that can consume as
much as 25% of resources on a large-scale project
○ Documentation activities can consume more than 30% of project engineering resources
○ Project management, business administration, and progress assessment can consume
as much as 30% project budgets
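● Taken at face value these claims overlap heavily; a naive sum of the listed figures (using
the stated percentages) already exceeds a whole project's resources, which is why the
individual claims cannot all be taken literally at once:

# Naive sum of the vendor claims listed above, as percentages of total resources.
claims = {
    "requirements analysis and evolution": 40,
    "software design": 50,
    "coding and unit testing": 50,
    "test activities": 50,
    "configuration and change management": 25,
    "documentation": 30,
    "management, administration, assessment": 30,
}
print(f"Naive total of the claims: {sum(claims.values())}%")  # well over 100%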
57. Excellence and Service
CHRIST
Deemed to be University
Achieving required quality
● Key practices that improve overall software quality:
○ Focusing on driving requirements and critical use cases early in the life cycle, focusing
on requirements completeness and traceability late in the life cycle, and focusing
throughout the life cycle on a balance between requirements evolution, design evolution,
and plan evolution
○ Using metrics and indicators to measure the progress and quality of an architecture as it
evolves from a high-level prototype into a fully compliant product
○ Providing integrated life-cycle environments that support early and continuous
configuration control, change management, rigorous design methods, document
automation, and regression test automation
○ Using visual modeling and higher level languages that support architectural control,
abstraction, reliable programming, reuse, and self-documentation
○ Early and continuous insight into performance issues through demonstration-based
evaluations
59. Excellence and Service
CHRIST
Deemed to be University
Achieving required quality
● The typical chronology of events in performance assessment is as follows:
○ Project inception
○ Initial design review
○ Mid-life-cycle design review
○ Integration and test
● This sequence occurred because early performance insight was based on
naïve engineering judgement of innumerable criteria.
● Early performance issues are typical.
● They tend to expose architectural flaws or weaknesses in commercial
components.
60. Excellence and Service
CHRIST
Deemed to be University
Peer Inspections: A pragmatic view
● Peer inspections are frequently overhyped as the key aspect of a quality system.
Peer reviews are valuable as secondary mechanisms, but they are rarely significant
contributors to quality compared with the following primary quality mechanisms and
indicators:
○ Transitioning engineering information from one artifact set to another, assessing consistency,
feasibility, understandability, and technology constraints inherent in the engineering artifacts.
○ Major milestone demonstrations that force the artifacts to be assessed against tangible criteria in
the context of relevant use cases
○ Environment tools that ensure representation rigor, consistency, completeness and change
control
○ Life-cycle testing for detailed insight into critical trade-offs, acceptance criteria and requirements
compliance.
○ Change management metrics for objective insight into multiple-perspective change trends and
convergence or divergence from quality and progress goals
61. Excellence and Service
CHRIST
Deemed to be University
Peer Inspections: A pragmatic view
● Inspections are a good vehicle for holding authors accountable for quality
products.
● The coverage of inspections should be across all authors rather than across
all components.
● Junior authors need to have a random component inspected periodically, and
they can learn by inspecting the products of senior authors.
● Varying levels of informal inspection are performed continuously when
developers are reading or integrating software with another author’s software,
and during testing by independent test teams.
● A critical component deserves to be inspected by several people, preferably
those who have a stake in its quality, performance or feature set.
62. Excellence and Service
CHRIST
Deemed to be University
Peer Inspections: A pragmatic view
● Significant or substantial design errors or architecture issues are rarely
obvious.
● Random human inspections tend to degenerate into comments on style and
first-order semantic issues.
● Architectural issues are exposed through more rigorous engineering activities such as
the following:
○ Analysis, prototyping or experimentation
○ Constructing design models
○ Committing the current state of the design model to an executable implementation
○ Demonstrating the current implementation strengths and weaknesses in the context of
critical subsets of the use cases and scenarios
○ Incorporating lessons learned back into the models, use cases, implementations, and
plans.
63. MISSION
CHRIST is a nurturing ground for an individual’s
holistic development to make effective contribution to
the society in a dynamic environment
VISION
Excellence and Service
CORE VALUES
Faith in God | Moral Uprightness
Love of Fellow Beings
Social Responsibility | Pursuit of Excellence
Unit 3: Software Project Management
Renaissance
Chapter 4: The old way and the new
65. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Davis's top 30 principles of conventional software engineering are as
follows:
● Make quality #1: quality must be quantified and mechanisms put into place to
motivate its achievement
● High-quality software is possible: techniques that have been demonstrated to
increase quality include involving the customer, prototyping, simplifying
design, conducting inspections, and hiring the best people
● Give products to customers early: no matter how hard you try to learn users'
needs during the requirements phase, the most effective way to determine
real needs is to give users a product and let them play with it.
● Determine the problem before writing the requirements: when faced with what
they believe is a problem, most engineers rush to offer a solution. Before you
try to solve a problem, be sure to explore all the alternatives and don’t be
blinded by the obvious solutions
66. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Evaluate design alternatives: after the requirements are agreed upon, you must
examine a variety of architectures and algorithms. You certainly do not want to use an
‘architecture’ simply because it was used in the requirements specification.
● Use an appropriate process model: each project must select a process that
makes the most sense for that project on the basis of corporate culture,
willingness to take risks, applications area, volatility of requirements and the
extent to which requirements are well understood.
● Use different languages for different phases: our industry’s eternal thirst for
simple solutions to complex problems has driven many to declare that the best
development method is one that uses the same notation throughout the life
cycle. Why should software engineers use Ada for requirements, design and
code unless Ada were optimal for all these phases?
67. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Minimize intellectual distance: to minimize intellectual distance, the software’s
structure should be as close as possible to the real-world structure.
● Put techniques before tools: an undisciplined software engineer with a tool
becomes a dangerous, undisciplined software engineer.
● Get it right before you make it faster: it is far easier to make a working program
run faster than it is to make a fast program work. Don’t worry about optimization
during initial coding.
● Inspect code: inspecting the detailed design and code is a much better way to
find errors than testing.
● Good management is more important than good technology: the best technology
will not compensate for poor management, and a good manager can produce
great results even with meagre resources. Good management motivates people
to do their best, but there are no universal ‘right’ styles of management.
68. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● People are the key to success: highly skilled people with appropriate experience,
talent, and training are key. The right people with insufficient tools, languages and
process will succeed. The wrong people with appropriate tools, languages, and
process will probably fail.
● Follow with care: just because everybody is doing something does not make it right
for you. It may be right, but you must carefully assess its applicability to your
environment. Object orientation, measurement, reuse, process improvement, CASE,
prototyping – all these might increase quality, decrease cost, and increase user
satisfaction. The potential of such techniques is often oversold, and benefits are by
no means guaranteed or universal.
● Take responsibility: when a bridge collapses, we ask, “What did the engineers do
wrong?” Even when software fails, we rarely ask this. The fact is that in any
engineering discipline, the best methods can be used to produce awful designs, and
the most antiquated methods to produce elegant designs.
69. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Understand the customer’s priorities: It is possible the customer would tolerate
90% of the functionality delivered late if they could have 10% of it on time.
● The more they see, the more they need: the more functionality (or
performance) you provide a user, the more functionality (or performance) the
user wants.
● Plan to throw one away: One of the most important critical success factors is
whether or not a product is entirely new. Such brand new applications,
architectures, interfaces, or algorithms rarely work the first time.
● Design for change: The architectures, components and specification
techniques you use must accommodate change.
● Design without documentation is not design: I have often heard software
engineers say, “I have finished the design. all that is left is the documentation”.
70. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Use tools, but be realistic: software tools make their users more efficient.
● Avoid tricks: many programmers love to create programs with tricks –
constructs that perform a function correctly, but in an obscure way. Show the
world how smart you are by avoiding tricky code.
● Encapsulate: Information hiding is a simple, proven concept that results in
software that is easier to test and much easier to maintain.
● Use coupling and cohesion: Coupling and cohesion are the best ways to
measure software’s inherent maintainability and adaptability.
● Use the McCabe complexity measure: although there are many metrics
available to report the inherent complexity of software, none is as intuitive and
easy to use as Tom McCabe's (a small counting sketch follows this list).
● Don’t test your own software: Software developers should never be the
primary testers of their own software.
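● The counting sketch referenced above is shown here. It approximates McCabe's cyclomatic
complexity by counting decision constructs in a function's abstract syntax tree; a real tool
would compute V(G) = E - N + 2 from the control-flow graph, so treat this as an illustrative
simplification.

# Approximate cyclomatic complexity: 1 + number of decision constructs found in the
# function's AST. A simplification of McCabe's measure, for illustration only.
import ast
import inspect
import textwrap

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func):
    tree = ast.parse(textwrap.dedent(inspect.getsource(func)))
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

def classify(score):                 # example function to measure
    if score is None:
        return "unknown"
    if score >= 90 and score <= 100:
        return "excellent"
    for threshold, label in [(75, "good"), (50, "fair")]:
        if score >= threshold:
            return label
    return "poor"

print(cyclomatic_complexity(classify))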
71. Excellence and Service
CHRIST
Deemed to be University
The principles of conventional software engineering
● Analyze causes for errors: it is far more cost-effective to reduce the effect of
an error by preventing it than it is to find and fix it. One way to do this is to
analyse the causes of errors as they are detected.
● Realize the software’s entropy increases: Any software system that
undergoes continuous change will grow in complexity and will become more
and more disorganized.
● People and time are not interchangeable: Measuring a project solely by
person-months makes little sense.
● Expect excellence: Your employees will do much better if you have high
expectations for them.
72. Excellence and Service
CHRIST
Deemed to be University
The principles of modern software management
● The top 10 principles of modern software management are as follows:
● Base the process on an architecture-first approach: a demonstrable balance must be
achieved among the driving requirements, the architecturally significant
design decisions and the life-cycle plans before the resources are committed
for full-scale development.
● Establish an iterative life-cycle process that confronts risk early: in this era, it
is not possible to define the entire problem, design the entire solution, build
the software and then test the end product in sequence. Instead, an iterative
process is required.
● Transition design methods to emphasize component-based development:
moving from a line-of-code mentality to a component-based mentality is necessary. A
component is a cohesive set of pre-existing lines of code, either in source or executable
format, with a defined interface and behaviour.
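● As a minimal sketch of what ‘a defined interface and behaviour’ can mean in code, consider
the Python example below; the component and its operations are invented for illustration and
are not drawn from the text.

# Hypothetical component: a cohesive, pre-existing unit exposed only through a
# defined interface, so clients depend on the contract rather than on its lines of code.
from abc import ABC, abstractmethod

class MessageQueue(ABC):             # the defined interface (contract)
    @abstractmethod
    def publish(self, topic, payload): ...
    @abstractmethod
    def subscribe(self, topic): ...

class InMemoryQueue(MessageQueue):   # one concrete component implementing it
    def __init__(self):
        self._messages = {}

    def publish(self, topic, payload):
        self._messages.setdefault(topic, []).append(payload)

    def subscribe(self, topic):
        return list(self._messages.get(topic, []))

# Client code is written against MessageQueue, so the component can be replaced
# (for example, by a commercial implementation) without changing the client.
queue = InMemoryQueue()
queue.publish("orders", b"order-42")
print(queue.subscribe("orders"))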
73. Excellence and Service
CHRIST
Deemed to be University
The principles of modern software management
● Establish a change management environment: the dynamics of iterative
development, with different teams working concurrently on shared artifacts,
necessitate objectively controlled baselines.
● Enhance change freedom through tools that support round-trip engineering:
RTE is the environment support necessary to automate and synchronize
engineering information in different formats.
● Capture design artifacts in rigorous, model-based notation: A model based
approach supports the evolution of semantically rich graphical and textual
design notations.
● Instrument the process for objective quality control and progress assessment:
Life-cycle assessment of the progress and the quality of all intermediate
products must be integrated into the process
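● As a hedged illustration of such instrumentation, the sketch below derives two simple
indicators (progress and rework ratio) from per-iteration counts; the field names and the
numbers are invented.

# Hypothetical per-iteration measurements feeding objective progress/quality indicators.
iterations = [
    {"name": "iteration 1", "sloc_done": 12_000, "sloc_planned": 90_000, "sloc_reworked": 1_500},
    {"name": "iteration 2", "sloc_done": 31_000, "sloc_planned": 90_000, "sloc_reworked": 2_800},
    {"name": "iteration 3", "sloc_done": 55_000, "sloc_planned": 90_000, "sloc_reworked": 3_300},
]

for it in iterations:
    progress = it["sloc_done"] / it["sloc_planned"]     # work accomplished so far
    rework = it["sloc_reworked"] / it["sloc_done"]      # quality indicator
    print(f"{it['name']}: progress {progress:.1%}, rework ratio {rework:.1%}")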
74. Excellence and Service
CHRIST
Deemed to be University
The principles of modern software management
● Use a demonstration-based approach to assess intermediate artifacts:
transitioning the current state-of-the-product artifacts into an executable
demonstration of relevant scenarios stimulates earlier convergence on
integration, a more tangible understanding of design trade-offs and earlier
elimination of architectural defects.
● Plan intermediate releases in groups of usage scenarios with evolving levels
of detail: the software management process should drive toward early and continuous
demonstrations within the operational context of the system and its use
cases.
● Establish a configurable process that is economically scalable: No single
process is suitable for all software developments.
77. Excellence and Service
CHRIST
Deemed to be University
Transitioning to an iterative process
● Modern software development processes have moved away from the
conventional waterfall model, in which each stage of the development process
depends on completion of the previous stage.
● Development proceeds as a series of iterations, building on the core
architecture until the desired levels of functionality, performance, and
robustness are achieved.
● The economic benefits inherent in transitioning from the conventional
waterfall model to an iterative development process are significant but difficult
to quantify.
● The effects of the top 10 principles can be summarized in terms of the five dimensions
shown below
78. Excellence and Service
CHRIST
Deemed to be University
Transitioning to iterative process
● Application precedentedness: the modern software industry has moved to an
iterative life-cycle process in which early iterations establish
precedents from which the product, the process and the plans can be
elaborated in evolving levels of detail.
● Process flexibility: project artifacts must be supported by efficient change
management commensurate with project needs. A configurable process that
allows a common framework to be adapted across a range of projects is
necessary to achieve a software return on investment.
79. Excellence and Service
CHRIST
Deemed to be University
Transitioning to iterative process
● Architecture risk resolution: architecture-first development is a crucial theme
underlying a successful iterative development process. An architecture-first
and component-based development approach forces the infrastructure,
common mechanisms and control mechanisms to be elaborated early in the
life cycle, where the design process and its products can be verified. It also
ensures early attention to testability and provides a foundation for demonstration-
based assessment.
● Team cohesion: successful teams are cohesive, and cohesive teams are
successful. The model based formats have also enabled the round-trip
engineering support needed to establish change freedom sufficient for
evolving design representations.
80. Excellence and Service
CHRIST
Deemed to be University
Transitioning to iterative process
● Software process maturity: The Software Engineering Institute’s Capability
Maturity Model (CMM) is a well-accepted benchmark for software process
assessment. One of the key themes is that truly mature processes are
enabled through an integrated