Project Report On
Out of Pocket (Expense Tracker)
Submitted
By
Mukul Sharma
(R270529043.)
Under Supervision of
Mr. GD Makkar
SCHOOL OF COMPUTING
This is to certify that this project report "OUT OF POCKET (EXPENSE TRACKER)" is the bona fide work of
"MUKUL SHARMA", University Roll No. R170529043, who carried out the project under my supervision.
SIGNATURE SIGNATURE
This is a great opportunity to acknowledge and to thank all those people without whose support and help this
project would have been impossible. I would like to add a few heartfelt words for the people who were part of
this project.
I am obliged and thankful to my project supervisor, Mr. GD Makkar, for his continuous encouragement,
motivation and professional guidance during the work of this project, which has proven to be an integral part of
it. Without his valuable support and guidance, this project could not have reached this level of development.
I would like to thank all the faculty members of the School of Computing, S.G.R.R University,
for the valuable time they spent on requirements analysis and evaluation of the project work.
I would like to express my sincere and cordial gratitude to the people who supported me directly,
provided mental encouragement, and evaluated and criticized my work in several phases during the development
of this project.
Mukul Sharma
R170529043.
ABSTRACT
In today's busy and expensive life we are in a great rush to make money, but at the end of the month we end up
broke because we unknowingly spend money on little and unwanted things. So, we came up with the idea of
tracking our earnings and expenses. Out Of Pocket (Daily Expense Tracker) aims to help everyone who wants to
understand their expenses and save money. Out Of Pocket (Daily Expense Tracker) is a website which users can
open on their mobile phones and update with their daily expenses so that they stay aware of their spending.
Here users can manage their expenses and budget: they enter the money that has been spent and can also add
additional information to describe the expense. Users are able to see a doughnut chart of their expenses.
Although this website is aimed at new job holders, interns and teenagers, everyone who wants to track their
expenses can use this app.
The Daily Expense Tracker System is intended to monitor the income and expenses of a user on a daily basis.
The system divides the income into daily allowed expenses. If you exceed a day's allowance, the system deducts
the difference from your income and calculates a new daily allowed amount; if a day's expense is less, the
system adds the remainder to savings. The Daily Expense Tracker System will generate a report at the end of
the month to show the Income-Expense curve. It will also let you add a savings amount that you have set aside
for particular festivals or days such as a birthday or anniversary.
INDEX
1.1 INTRODUCTION
The Out of Pocket expense tracker system is designed to keep track of the income and expenses of a user on a
day-to-day basis. The system helps users keep track of their spending and savings daily. Out of Pocket will
generate a report at the end of the month to show the Income-Expense curve. It will let you add the savings
amount you have set aside for particular occasions.
The Out of Pocket expense tracker system aims to help everyone who wants to understand their expenses and
save money. Out of Pocket is a React-based web app; users can access it on their mobile phones or personal
computers and update their daily expenses so that they stay aware of their spending. Users can define their own
categories for expense types such as food, clothing, rent and bills, enter the money that has been spent, and
add additional information to describe the expense. Users are able to see a pie chart of their expenses.
Although this web app is aimed at new job holders, interns and teenagers, everyone who wants to track their
expenses can use this app.
1.2 AIM
Out of Pocket (Daily Expense Tracker) aims to help everyone who wants to understand their expenses and save
money. Out of Pocket (Daily Expense Tracker) is a web app which users can open on their mobile phones and
update with their daily expenses so that they stay aware of their spending. Users can define their own
categories for expense types such as food, clothing, rent and bills, enter the money that has been spent, and
add additional information to describe the expense. Users are able to see a doughnut chart of their expenses.
Although this app is aimed at new job holders, interns and teenagers, everyone who wants to track their
expenses can use this app.
In the existing approach, users need to maintain Excel sheets, CSV files, etc. for their daily and monthly
expenses. There is no complete solution for easily keeping track of daily expenditure: a person has to keep a
log in a diary or on a computer, and all calculations need to be done by the user, which may lead to errors.
To reduce manual calculations, we propose a web application developed on the MERN stack. This application
allows users to maintain a digital, automated diary. Each user is required to register on the system; at
registration time, the user is provided an id, which is used to maintain the record of each unique user. The
Expense Tracker application keeps track of the income and expenses of a user on a day-to-day basis. The
application takes the user's income and divides it into a daily allowed expense. If you exceed that day's
allowance, the difference is deducted from your income and a new daily allowed amount is calculated; if that
day's expense is less, the remainder is added to savings. The application will generate a report at the end of
the month to show income and expenses via multiple graphs. It will let you add the savings amount which you
have saved for particular occasions.
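As a rough illustration of this daily-allowance idea, the sketch below shows one way the calculation could be
written in JavaScript. The function name, field names and the equal-split rule are assumptions made for this
sketch only; they are not the application's actual code.

    // A rough sketch of the daily-allowance logic described above.
    function updateDailyAllowance(remainingIncome, daysLeft, spentToday) {
      const allowance = remainingIncome / daysLeft;          // today's allowed amount
      const surplus = Math.max(allowance - spentToday, 0);   // unspent part goes to savings
      const newRemaining = remainingIncome - Math.max(spentToday, allowance);
      const newAllowance = daysLeft > 1 ? newRemaining / (daysLeft - 1) : 0;
      return { savings: surplus, newAllowance };
    }

    module.exports = { updateDailyAllowance };

    // Example: with an income of 30,000 over 30 days, spending 800 on day one
    // yields { savings: 200, newAllowance: 1000 }.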
The feasibility study begins by clarifying the problem definition. Feasibility is the determination of whether
the project is worth doing. Once an acceptable problem definition has been generated, the analyst develops a
logical model of the system, and a search for alternatives is analysed carefully. There are three parts to a
feasibility study:
1) Operational Feasibility
2) Technical Feasibility
3) Economical Feasibility
Operational feasibility is the measure of how well a proposed system solves the problems, and takes advantage
of the opportunities identified during scope definition and how it satisfies the requirements identified in the
requirements analysis phase of system development. The operational feasibility assessment focuses on the
degree to which the proposed development projects fits in with the existing business environment and
objectives with regard to development schedule, delivery date, corporate culture and existing business
processes. To ensure success, desired operational outcomes must be imparted during design and development.
These include such design-dependent parameters as reliability, maintainability, supportability, usability,
producibility, disposability, sustainability, affordability and others. These parameters are required to be
considered at the early stages of design if desired operational behaviours are to be realised. A system design
and development requires appropriate and timely application of engineering and management efforts to meet
the previously mentioned parameters. A system may serve its intended purpose most effectively when its
technical and operating characteristics are engineered into the design. Therefore, operational feasibility is a
critical aspect of systems engineering that needs to be an integral part of the early design phases.
Technical feasibility is an assessment of whether the required technology is available or how difficult it will
be to build, and whether the firm has enough experience using that technology. The assessment is based on an outline
design of system requirements in terms of input, processes, output, fields, programs and procedures. This can
be qualified in terms of volume of data, trends, frequency of updating in order to give an introduction to the
technical system. The application has been developed on the Windows XP platform with a configuration of 1 GB
RAM on an Intel Pentium Dual Core processor, which makes it technically feasible. The technical
feasibility assessment is focused on gaining an understanding of the present technical resources of the
organization and their applicability to the expected needs of the proposed system. It is an evaluation of the
hardware and software and how it meets the need of the proposed system.
Establishing the cost-effectiveness of the proposed system i.e. if the benefits do not outweigh the costs then it
is not worth going ahead. In the fast paced world today there is a great need of online social networking
facilities. Thus the benefits of this project in the current scenario make it economically feasible. The purpose
of the economic feasibility assessment is to determine the positive economic benefits to the organization that
the proposed system will provide. It includes quantification and identification of all the benefits expected. This
assessment typically involves a cost/benefit analysis.
This section includes the overall view of the project i.e. the basic problem definition and the general overview
of the problem which describes the problem in layman terms. It also specifies the software used and the
technologies involved.
This section includes the Software and hardware requirements for the smooth running of the application.
This section consists of the Software Development Life Cycle model. It also contains technical diagrams like
the use case diagram and the ER diagram.
This section describes the different technologies used for the entire development process of the Front-end as
well as the Back-end.
This section has screenshots of all the implementation i.e. user interface and their description.
Number Description
1 PC with 250 GB or more of HDD.
2 PC with 2 GB RAM.
3 PC with Pentium 1 or Above.
The waterfall model was selected as the SDLC model due to the following reasons:
Requirements were very well documented, clear and fixed.
3.3 USE CASE DIAGRAM
[Use case diagram: the User actor is connected to the Add Transaction, Budget Expense, Generate Reports,
Maintain Events and Monthly View use cases.]
3.4 ER DIAGRAM
[ER diagram: the User entity (Id, Username, Email, Password) adds Expense (Name, Date, Amount), adds Budget
(Name, Amount, Date), and gets Statistics shown as a doughnut chart.]
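As a rough sketch of how the Expense entity above could be modelled in a MERN application, the listing below
uses Mongoose; Mongoose itself, the file path and the field names are assumptions for illustration, not the
project's actual schema.

    // models/Expense.js: hypothetical Mongoose schema for the Expense entity.
    const mongoose = require('mongoose');

    const expenseSchema = new mongoose.Schema({
      name:   { type: String, required: true },                      // what the money was spent on
      amount: { type: Number, required: true },                      // money spent
      date:   { type: Date, default: Date.now },                     // when it was spent
      user:   { type: mongoose.Schema.Types.ObjectId, ref: 'User' }  // owner of the expense
    });

    module.exports = mongoose.model('Expense', expenseSchema);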
[Flow diagram: the User logs in to the system, credentials are checked (with a Forgot Password option), and the
user can then manage the application modules: Manage Account, Manage Expense and Manage Budget.]
In this section we analyse the technologies used to implement the project.
4.1 FRONTEND
4.1.1 React
React (also known as React.js or ReactJS) is a JavaScript library for building user interfaces. It is maintained
by Facebook and a community of individual developers and companies. React can be used as a base in the
development of single-page or mobile applications, as it is optimal for fetching rapidly changing data that
needs to be recorded. However, fetching data is only the beginning of what happens on a web page, which is
why complex React applications usually require the use of additional libraries for state management, routing,
and interaction with an API: Next.js and Gatsby.js are examples of such libraries.
React was created by Jordan Walke, a software engineer at Facebook, who released an early prototype of
React called "FaxJS". He was influenced by XHP, an HTML component framework for PHP. It was first
deployed on Facebook's News Feed in 2011 and later on Instagram in 2012. It was open-sourced at JSConf US
in May 2013.
JSX, or JavaScript XML, is an extension to the JavaScript language syntax. Similar in appearance to HTML,
JSX provides a way to structure component rendering using syntax familiar to many developers. React
components are typically written using JSX, although they do not have to be (components may also be written
in pure JavaScript). JSX is similar to another extension syntax created by Facebook for PHP called XHP.
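As a small illustration of JSX, the component below is a hypothetical example written for this report, not code
taken from the project; the component and prop names are assumptions.

    // A simple React component written in JSX.
    // The HTML-like markup is compiled into React.createElement calls.
    function ExpenseItem({ name, amount }) {
      return (
        <div className="expense-item">
          <h2>{name}</h2>
          <span>{amount} spent</span>
        </div>
      );
    }

    // Usage inside another component:
    // <ExpenseItem name="Groceries" amount={450} />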
4.1.2 HTML
Hypertext Markup Language (HTML) is the standard markup language for documents designed to be
displayed in a web browser. It can be assisted by technologies such as Cascading Style Sheets (CSS) and
scripting languages such as JavaScript. Web browsers receive HTML documents from a web server or from
local storage and render the documents into multimedia web pages. HTML describes the structure of a web
page semantically and originally included cues for the appearance of the document.
HTML elements are the building blocks of HTML pages. With HTML constructs, images and other objects
such as interactive forms may be embedded into the rendered page. HTML provides a means to create
structured documents by denoting structural semantics for text such as headings, paragraphs, lists, links,
quotes and other items. HTML elements are delineated by tags, written using angle brackets. Tags such as
<img /> and <input /> directly introduce content into the page. Other tags such as <p> surround and provide
information about document text and may include other tags as sub-elements. Browsers do not display the
HTML tags, but use them to interpret the content of the page.
HTML can embed programs written in a scripting language such as JavaScript, which affects the behaviour
and content of web pages. Inclusion of CSS defines the look and layout of content. The World Wide Web
Consortium (W3C), former maintainer of the HTML and current maintainer of the CSS standards, has
encouraged the use of CSS over explicit presentational HTML since 1997.
4.1.3 JavaScript
JavaScript is a high-level, interpreted scripting language that conforms to the ECMAScript specification.
JavaScript has curly-bracket syntax, dynamic typing, prototype-based object orientation, and first-class
functions. Alongside HTML and CSS, JavaScript is one of the core technologies of the World Wide Web.
JavaScript enables interactive web pages and is an essential part of web applications. The vast majority of
websites use it, and major web browsers have a dedicated JavaScript engine to execute it. As a multi-paradigm
language, JavaScript supports event-driven, functional, and imperative (including object-oriented and
prototype-based) programming styles. It has APIs for working with text, arrays, dates, regular expressions, and
the DOM, but the language itself does not include any I/O, such as networking, storage, or graphics facilities.
It relies upon the host environment in which it is embedded to provide these features.
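As a brief illustration of these built-in APIs, the snippet below uses the Date, Array and DOM APIs available
in a browser; the element id is a hypothetical example.

    // Plain (vanilla) JavaScript using host-provided APIs.
    const today = new Date().toDateString();              // Date API
    const items = ['Rent', 'Food', 'Travel'];              // Array of expense names
    const list = items.map(i => `<li>${i}</li>`).join(''); // build list markup

    // Write the result into an element assumed to exist in the page.
    document.getElementById('expense-list').innerHTML =
      `<p>Expenses for ${today}</p><ul>${list}</ul>`;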
Initially only implemented client-side in web browsers, JavaScript engines are now embedded in many other
types of host software, including server-side in web servers and databases, and in non-web programs such as
word processors and PDF software, and in runtime environments that make JavaScript available for writing
server-side and desktop applications.
The terms Vanilla JavaScript and Vanilla JS refer to JavaScript not extended by any frameworks or additional
libraries. Scripts written in Vanilla JS are plain JavaScript code. Google’s Chrome extensions, Opera's
extensions, Apple's Safari 5 extensions, Apple's Dashboard Widgets, Microsoft's Gadgets, Yahoo! Widgets,
Google Desktop Gadgets, and Serence Klipfolio are implemented using JavaScript.
4.1.4 CSS
Cascading Style Sheets (CSS) is a style sheet language used for describing the presentation of a document
written in a markup language like HTML. CSS is a cornerstone technology of the World Wide Web, alongside
HTML and JavaScript. CSS is designed to enable the separation of presentation and content, including layout,
colours, and fonts. This separation can improve content accessibility, provide more flexibility and control in
the specification of presentation characteristics, enable multiple web pages to share formatting by specifying
the relevant CSS in a separate .css file, and reduce complexity and repetition in the structural content.
CSS information can be provided from various sources. These sources can be the web browser, the user and
the author. The information from the author can be further classified into inline, media type, importance,
selector specificity, rule order, inheritance and property definition. CSS style information can be in a separate
document or it can be embedded into an HTML document. Multiple style sheets can be imported. Different
styles can be applied depending on the output device being used; for example, the screen version can be quite
different from the printed version, so that authors can tailor the presentation appropriately for each medium.
The style sheet with the highest priority controls the content display. Declarations not set in the highest
priority source are passed on to a source of lower priority, such as the user agent style. The process is called
cascading.
One of the goals of CSS is to allow users greater control over presentation. Someone who finds red italic
headings difficult to read may apply a different style sheet. Depending on the browser and the web site, a user
may choose from various style sheets provided by the designers, or may remove all added styles and view the
site using the browser's default styling, or may override just the red italic heading style without altering other
attributes.
4.2 BACKEND
4.2.1 Node.js
Node.js is an open-source, cross-platform, JavaScript run-time environment that executes JavaScript code
outside of a browser. Node.js lets developers use JavaScript to write command line tools and for server-side
scripting—running scripts server-side to produce dynamic web page content before the page is sent to the
user's web browser. Consequently, Node.js represents a "JavaScript everywhere" paradigm, unifying web
application development around a single programming language, rather than different languages for server-side
and client-side scripts.
Node.js was written initially by Ryan Dahl in 2009, about thirteen years after the introduction of the first
server-side JavaScript environment, Netscape's LiveWire Pro Web. Node.js allows the creation of Web servers
and networking tools using JavaScript and a collection of "modules" that handle various core functionality.
and networking tools using JavaScript and a collection of "modules" that handle various core functionality.
Modules are provided for file system I/O, networking (DNS, HTTP, TCP, TLS/SSL, or UDP), binary data (buffers),
cryptography functions, data streams, and other core functions.
Node.js brings event-driven programming to web servers, enabling development of fast web servers in
JavaScript. Developers can create scalable servers without using threading, by using a simplified model of
event-driven programming that uses call-backs to signal the completion of a task. Node.js connects the ease of
a scripting language (JavaScript) with the power of Unix network programming.
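A minimal sketch of this callback-driven model using Node's built-in http module is shown below; the port and
response body are arbitrary examples, not the project's actual server code.

    // server.js: a tiny event-driven Node.js web server.
    const http = require('http');

    const server = http.createServer((req, res) => {
      // This callback runs for every incoming request.
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ message: 'Out of Pocket API is running' }));
    });

    server.listen(3000, () => console.log('Listening on port 3000'));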
4.2.2 MongoDB
MongoDB is a source-available, cross-platform, document-oriented database program. Classified as a NoSQL
database program, it uses JSON-like documents with optional schemas. Developed by MongoDB Inc. and licensed
under the Server Side Public License (SSPL), it is written in C++.
MongoDB is a document-oriented NoSQL database used for high volume data storage. Instead of using tables
and rows as in the traditional relational databases, MongoDB makes use of collections and documents.
Documents consist of key-value pairs which are the basic unit of data in MongoDB. Collections contain sets of
documents and function as the equivalent of relational database tables. MongoDB is a database which came into
light around the mid-2000s.
Since MongoDB is a NoSQL type database, instead of having data in a relational type format, it stores the data
in documents. This makes MongoDB very flexible and adaptable to real business world situation and
requirements. MongoDB can run over multiple servers, balancing the load and/or duplicating data to keep the
system up and running in case of hardware failure. Unlike in SQL databases, where you must have a table’s
schema declared before inserting data, MongoDB’s collections do not enforce document structure. This sort of
flexibility is what makes MongoDB so useful.
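A short sketch of storing an expense document with the official MongoDB Node.js driver is given below; the
connection string, database and collection names are placeholders, not the project's real configuration.

    // Assumes the 'mongodb' package is installed (npm install mongodb).
    const { MongoClient } = require('mongodb');

    async function saveExpense() {
      const client = new MongoClient('mongodb://localhost:27017');
      await client.connect();
      const expenses = client.db('outofpocket').collection('expenses');

      // Documents are JSON-like key-value pairs; no fixed schema is required.
      await expenses.insertOne({ name: 'Groceries', amount: 450, date: new Date() });

      await client.close();
    }

    saveExpense().catch(console.error);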
CHAPTER 5: TESTING
5.1 UNIT TESTING
5.1.1 INTRODUCTION
In computer programming, unit testing is a software testing method by which individual units of source code,
sets of one or more computer program modules together with associated control data, usage procedures, and
operating procedures, are tested to determine whether they are fit for use. Intuitively, one can view a unit as
the smallest testable part of an application. In procedural programming, a unit could be an entire module, but
it is more commonly an individual function or procedure. In object-oriented programming, a unit is often an
entire interface, such as a class, but could be an individual method. Unit tests are short code fragments created
by programmers or occasionally by white box testers during the development process. It forms the basis for
component testing. Ideally, each test case is independent from the others. Substitutes such as method stubs,
mock objects, fakes, and test harnesses can be used to assist testing a module in isolation. Unit tests are
typically written and run by software developers to ensure that code meets its design and behaves as intended.
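For instance, a unit test for the allowance helper sketched in the introduction might look like the listing
below; Jest syntax and the module path are assumptions made purely for illustration.

    // updateDailyAllowance.test.js: a minimal Jest unit test.
    const { updateDailyAllowance } = require('./updateDailyAllowance');

    test('adds the unspent part of the allowance to savings', () => {
      // 30,000 income over 30 days gives a 1,000 daily allowance; 800 is spent.
      const result = updateDailyAllowance(30000, 30, 800);
      expect(result.savings).toBe(200);
    });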
5.1.2 BENEFITS
The goal of unit testing is to isolate each part of the program and show that the individual parts are correct. A
unit test provides a strict, written contract that the piece of code must satisfy. As a result, it affords several
benefits
1) Find problems early: Unit testing finds problems early in the development cycle. In test-driven
development (TDD), which is frequently used in both extreme programming and scrum, unit tests are created
before the code itself is written. When the tests pass, that code is considered complete. The same unit tests are
run against that function frequently as the larger code base is developed either as the code is changed or via an
automated process with the build. If the unit tests fail, it is considered to be a bug either in the changed code or
the tests themselves. The unit tests then allow the location of the fault or failure to be easily traced. Since the
unit tests alert the development team of the problem before handing the code off to testers or clients, the
fault is still cheap to locate and fix early in the development cycle.
2) Facilitates Change: Unit testing allows the programmer to refactor code or upgrade system libraries at a
later date, and make sure the module still works correctly (e.g., in regression testing). The procedure is to write
test cases for all functions and methods so that whenever a change causes a fault, it can be quickly identified.
3) Simplifies Integration: Unit testing may reduce uncertainty in the units themselves and can be used in a
bottom-up testing style approach. By testing the parts of a program first and then testing the sum of its parts,
integration testing becomes much easier.
4) Documentation: Unit testing provides a sort of living documentation of the system. Developers looking to
learn what functionality is provided by a unit, and how to use it, can look at the unit tests to gain a basic
understanding of the unit's interface (API). Unit test cases embody characteristics that are critical to the
success of the unit. These characteristics can indicate appropriate/inappropriate use of a unit as well as
negative behaviours that are to be trapped by the unit.
5.2 INTEGRATION TESTING
Integration testing is the phase in software testing in which individual software modules are combined and
tested as a group. It occurs after unit testing and
before validation testing. Integration testing takes as its input modules that have been unit tested, groups them
in larger aggregates, applies tests defined in an integration test plan to those aggregates, and delivers as its
output the integrated system ready for system testing.
5.2.1 PURPOSE
The purpose of integration testing is to verify functional, performance, and reliability requirements placed on
major design items. These "design items", i.e., assemblages (or groups of units), are exercised through their
interfaces using black-box testing, success and error cases being simulated via appropriate parameter and data
inputs. Simulated usage of shared data areas and inter-process communication is tested and individual
subsystems are exercised through their input interface. Test cases are constructed to test whether all the
components within assemblages interact correctly, for example across procedure calls or process activations,
and this is done after testing individual modules, i.e., unit testing. The overall idea is a "building block"
approach, in which verified assemblages are added to a verified base which is then used to support the
integration testing of further assemblages. Software integration testing is performed according to the software
development life cycle (SDLC) after module and functional tests. The cross dependencies for software
integration testing are: schedule for integration testing, strategy and selection of the tools used for integration,
define the cyclomatic complexity of the software and software architecture, reusability of modules and life-
cycle and versioning management. Some different types of integration testing are big-bang, top-down, and
bottom-up, mixed (sandwich) and risky-hardest. Other Integration Patterns are: collaboration integration,
backbone integration, layer integration, client-server integration, distributed services integration and high-
frequency integration.
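As an illustration, an integration-style test for an Express API route could look like the sketch below;
Express, supertest and the /api/expenses route are assumptions for this example, not the project's actual code.

    // expenses.int.test.js: hypothetical integration test with supertest and Jest.
    const request = require('supertest');
    const express = require('express');

    // A small app standing in for the real server under test.
    const app = express();
    app.get('/api/expenses', (req, res) => res.json([{ name: 'Rent', amount: 8000 }]));

    test('GET /api/expenses returns the list of expenses', async () => {
      const res = await request(app).get('/api/expenses');
      expect(res.status).toBe(200);
      expect(res.body[0].name).toBe('Rent');
    });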
In big-bang integration testing, most of the developed modules are coupled together to form a complete software
system or a major part of the system, which is then used for integration testing. This method is very effective for
saving time in the integration testing process. However, if the test cases and their results are not recorded
properly, the entire integration process will be more complicated and may prevent the testing team from
achieving the goal of integration testing. A type of big-bang integration testing is called "usage model testing"
which can be used in both software and hardware integration testing. The basis behind this type of integration
testing is to run user-like workloads in integrated user-like environments. In doing the testing in this manner,
the environment is proofed, while the individual components are proofed indirectly through their use. Usage
Model testing takes an optimistic approach to testing, because it expects to have few problems with the
individual components. The strategy relies heavily on the component developers to do the isolated unit testing
for their product. The goal of the strategy is to avoid redoing the testing done by the developers, and instead
Bottom-up testing is an approach to integrated testing where the lowest level components are tested first, then
used to facilitate the testing of higher level components. The process is repeated until the component at the top
of the hierarchy is tested. All the bottom or low-level modules, procedures or functions are integrated and then
tested. After the integration testing of lower level integrated modules, the next level of modules will be formed
and can be used for integration testing. This approach is helpful only when all or most of the modules of the
same development level are ready. This method also helps to determine the levels of software developed and
makes it easier to report testing progress in the form of a percentage. Top-down testing is an approach to
integrated testing where the top integrated modules are tested and the branch of the module is tested step by
step until the end of the related module. Sandwich testing is an approach to combine top down testing with
bottom up testing.
5.3 VERIFICATION AND VALIDATION
In software project management, software testing, and software engineering, verification and validation
(V&V) is the process of checking that a software system meets specifications and that it fulfils its intended
purpose. It may also be referred to as software quality control. It is normally the responsibility of software
testers as part of the software development lifecycle. Validation checks that the product design satisfies or fits
the intended use (high-level checking), i.e., the software meets the user requirements. This is done through
dynamic testing and other forms of review. Verification and validation are not the same thing, although they
are often confused.
Software Verification: The process of evaluating software to determine whether the products of a given
development phase satisfy the conditions imposed at the start of that phase.
Software Validation: The process of evaluating software during or at the end of the development process to
determine whether it satisfies specified requirements.
In other words, software verification is ensuring that the product has been built according to the requirements
and design specifications, while software validation ensures that the product meets the user's needs, and that
the specifications were correct in the first place. Software verification ensures that "you built it right".
Software validation ensures that "you built the right thing". Software validation confirms that the product, as
provided, will fulfil its intended use. From a testing perspective, a malfunction means that, according to its
specification, the system does not meet its specified functionality.
Both verification and validation are related to the concepts of quality and of software quality assurance.
By themselves, verification and validation do not guarantee software quality; planning, traceability,
configuration management and other aspects of software engineering are required. Within the modelling
and simulation (M&S) community, the definitions of verification, validation and accreditation are similar:
M&S Verification is the process of determining that a computer model, simulation, or federation of models and
simulations implementations and their associated data accurately represent the developer's conceptual
description and specifications.
M&S Validation is the process of determining the degree to which a model, simulation, or federation of models
and simulations, and their associated data are accurate representations of the real world from the perspective
of the intended uses of the model.
In mission-critical software systems, where flawless performance is absolutely necessary, formal methods may
be used to ensure the correct operation of a system. However, often for non-mission-critical software systems,
formal methods prove to be very costly and an alternative method of software V&V must be sought out.
A test case is a tool used in the process. Test cases may be prepared for software verification and software
validation to determine if the product was built according to the requirements of the user. Other methods, such
as reviews, may be used early in the life cycle to provide for software validation.
5.4 BLACK BOX TESTING
Black-box testing is a method of software testing that examines the functionality of an application without
peering into its internal structures or workings. This method of test can be applied virtually to every level of
software testing: unit, integration, system and acceptance. It typically comprises most if not all higher-level
testing, but can also dominate unit testing.
Specific knowledge of the application's code/internal structure and programming knowledge in general is not
required. The tester is aware of what the software is supposed to do but is not aware of how it does it. For
instance, the tester is aware that a particular input returns a certain, invariable output but is not aware of
how the software produces that output.
Test cases are built around specifications and requirements, i.e., what the application is supposed to do. Test
cases are generally derived from external descriptions of the software, including specifications, requirements
and design parameters. Although the tests used are primarily functional in nature, non-functional tests may
also be used. The test designer selects both valid and invalid inputs and determines the correct output, often
with the help of an oracle or a previous result that is known to be good, without any knowledge of the test
object's internal structure.
5.5 WHITE BOX TESTING
White-box testing (also known as clear box testing, glass box testing, transparent box testing and structural
testing) is a method of testing software that tests internal structures or workings of an application, as opposed
to its functionality (i.e. black-box testing). In white-box testing an internal perspective of the system, as well
as programming skills, are used to design test cases. The tester chooses inputs to exercise paths through the
code and determine the appropriate outputs. This is analogous to testing nodes in a circuit, e.g. in-circuit
testing (ICT). White-box testing can be applied at the unit, integration and system levels of the software
testing process. Although traditional testers tended to think of white-box testing as being done at the unit level,
it is used for integration and system testing more frequently today. It can test paths within a unit, paths
between units during integration, and between subsystems during a system–level test. Though this method of
test design can uncover many errors or problems, it has the potential to miss unimplemented parts of the
specification or missing requirements.
5.5.1 Levels
1) Unit testing: White-box testing is done during unit testing to ensure that the code is working as intended,
before any integration happens with previously tested code. White-box testing during unit testing catches any
defects early on and aids in catching defects that would otherwise appear later, after the code is integrated
with the rest of the application.
2) Integration testing: White-box tests at this level are written to test the interactions of each interface with
each other. The Unit level testing made sure that each code was tested and working accordingly in an isolated
environment and integration examines the correctness of the behaviour in an open environment through the
use of white-box testing for any interactions of interfaces that are known to the programmer.
3) Regression testing: White-box testing during regression testing is the use of recycled white-box test cases
at the unit and integration testing levels.
5.5.2 Procedures
White-box testing's basic procedures involves the tester having a deep level of understanding of the source
code being tested. The programmer must have a deep understanding of the application to know what kinds of
test cases to create so that every visible path is exercised for testing. Once the source code is understood then
the source code can be analysed for test cases to be created. These are the three basic steps that white-box
testing takes:
1) Input involves different types of requirements, functional specifications, detailed design documents, proper
source code and security specifications. This is the preparation stage of white-box testing, to lay out all of
the basic information.
2) Processing involves performing risk analysis to guide the whole testing process, preparing a proper test
plan, executing test cases and communicating results. This is the phase of building test cases to make sure they
thoroughly test the application, with the given results recorded accordingly.
3) Output involves preparing the final report that encompasses all of the above preparations and results.
5.6 SYSTEM TESTING
System testing is testing conducted on a complete integrated system to evaluate the system's compliance with
its specified requirements. System testing falls within the scope of black-box testing,
and as such, should require no knowledge of the inner design of the code or logic. As a rule, system testing
takes, as its input, all of the "integrated" software components that have passed integration testing and also the
software system itself integrated with any applicable hardware system(s). The purpose of integration testing is
to detect any inconsistencies between the software units that are integrated together (called assemblages) or
between any of the assemblages and the hardware. System testing is a more limited type of testing; it seeks to
detect defects both within the "inter-assemblages" and also within the system as a whole.
System testing is performed on the entire system in the context of a Functional Requirement Specification(s)
(FRS) and/or a System Requirement Specification (SRS). System testing tests not only the design, but also the
behaviour and even the believed expectations of the customer. It is also intended to test up to and beyond the
bounds defined in the software or hardware requirements specification(s).
CHAPTER 6: RESULTS
6.1 Sign in / Sign Up
6.2 Dashboard
6.3 Reports
6.4 Expense/ Budget List
CHAPTER 7: ADVANTAGES
Easy to use.
Helps you to maintain your financial control and reduce your financial stress.
CHAPTER 8: CONCLUSION
After making this application we are confident that it will help its users to manage the cost of their daily
expenditure. It will guide them and make them aware of their daily expenses. It will prove helpful for people
who are frustrated with their daily budget management, irritated by the amount of their expenses, and wish to
manage their money and preserve a record of their daily costs, which may help change their way of spending
money. In short, this application will help its users overcome the wastage of money.
BIBLIOGRAPHY
www.stackoverflow.com