
A Technical Seminar Report

on

NORMALIZATION IN DBMS
In partial fulfilment of the requirements for the award of
BACHELOR OF TECHNOLOGY
In
Computer Science and Engineering (Data Science)
Submitted by
R. Saketh Reddy (20E51A6737)

Under the esteemed guidance of


Dr. P. MADHURI
Associate Professor

HYDERABAD INSTITUTE OF TECHNOLOGY AND MANAGEMENT


Gowdavelly (Village), Medchal, Hyderabad, Telangana, 501401

(UGC Autonomous, Affiliated to JNTUH, Accredited by NAAC (A+) and NBA)

2023-2024
HYDERABAD INSTITUTE OF TECHNOLOGY AND
MANAGEMENT

(UGC Autonomous, Affiliated to JNTUH, Accredited by NAAC (A+) and NBA)

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

CERTIFICATE

This is to certify that the Technical Seminar entitled "Normalization in DBMS" is being
submitted by R. Saketh Reddy, bearing hall ticket number 20E51A6737, in partial
fulfilment of the requirements for the degree of BACHELOR OF TECHNOLOGY in
COMPUTER SCIENCE AND ENGINEERING (DATA SCIENCE) awarded by
Jawaharlal Nehru Technological University, Hyderabad, during the academic year
2023-2024. The matter contained in this document has not been submitted to any other
university or institute for the award of any degree or diploma.

Internal Supervisor Head of the Department


Dr. P. Madhuri Dr. Ila Chandana Kumari
Associate Professor
HYDERABAD INSTITUTE OF TECHNOLOGY AND
MANAGEMENT
(UGC Autonomous, Affiliated to JNTUH, Accredited by NAAC (A+) and NBA)

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING


(DATA SCIENCE)

DECLARATION

I, R. Saketh Reddy, a student of Bachelor of Technology in CSD, session 2023-2024, at
Hyderabad Institute of Technology and Management, Gowdavelly, Hyderabad, Telangana
State, hereby declare that the work presented in this Technical Seminar entitled
'Normalization in DBMS' is the outcome of my own bonafide work and is correct to the
best of my knowledge, and that this work has been undertaken with due care for
engineering ethics. It contains no material previously published or written by another
person, nor material which has been accepted for the award of any other degree or
diploma of the university or any other institute of higher learning, except where due
acknowledgment has been made in the text.

R. Saketh Reddy (20E51A6737)


ACKNOWLEDGEMENT
An endeavor of a long period can be successful only with the advice of many
well-wishers. I would like to thank our Chairman, SRI. ARUTLA PRASHANTH, for
providing all the facilities to carry out the Technical Seminar successfully. I would like
to thank our Principal, Dr. P. RAJESH KUMAR, who has inspired us through his
speeches and provided this opportunity to carry out the Technical Seminar successfully.
I am very thankful to our Head of the Department, Dr. Ila Chandana Kumari, and our
B.Tech Technical Seminar Coordinator, Dr. P. Madhuri. I would like to specially thank
my internal supervisor, Dr. P. Madhuri, Associate Professor, for her technical guidance.
I wish to convey my gratitude and express sincere thanks to all D.C. (Departmental
Committee) and T.R.C. (Technical Review Committee) members and the non-teaching
staff for their support and cooperation rendered toward the successful submission of
this Technical Seminar. I also want to express my sincere gratitude to all my family
members and my friends for their individual care and everlasting moral support.

R. SAKETH REDDY (20E51A6737)


TABLE OF CONTENTS

ABSTRACT
LIST OF FIGURES

1. CHAPTER - 01
● INTRODUCTION TO DATABASE NORMALIZATION

2. CHAPTER - 02
● FIRST NORMAL FORM (1NF)
2.1 Definition and Requirements
2.2 Example illustrating 1NF

3. CHAPTER - 03
● SECOND NORMAL FORM (2NF)
3.1 Definition and Requirements
3.2 Example illustrating 2NF

4. CHAPTER - 04
● THIRD NORMAL FORM (3NF)
4.1 Definition and Requirements
4.2 Example illustrating 3NF

5. CHAPTER - 05
● BOYCE-CODD NORMAL FORM (BCNF)
5.1 Definition and Requirements
5.2 Example illustrating BCNF

6. CHAPTER - 06
● FOURTH NORMAL FORM (4NF)
6.1 Definition and Requirements
6.2 Example illustrating 4NF

7. CHAPTER - 07
● NORMALIZATION VS. DENORMALIZATION
7.1 Comparison of the two approaches
7.2 When to denormalize a database

8. CONCLUSION

LIST OF FIGURES

Sl. No.  CAPTION
1.       Database Normalization
2.       First Normal Form (1NF)
3.       Second Normal Form (2NF)
4.       Third Normal Form (3NF)
5.       Boyce-Codd Normal Form (BCNF)
6.       Fourth Normal Form (4NF)
7.       Normalization vs. Denormalization

ABSTRACT

Title: Normalization in DBMS

Normalization in Database Management Systems (DBMS): A Comprehensive Overview
Normalization is a fundamental principle in database design aimed at optimizing data
organization and integrity within a database management system (DBMS). It involves the
systematic process of breaking down a complex database into smaller, well-structured
tables to reduce redundancy, eliminate anomalies, and enhance overall data quality. By
adhering to a series of normal forms, such as First Normal Form (1NF), Second Normal
Form (2NF), and Third Normal Form (3NF), designers achieve a higher level of data
integrity while minimizing data duplication and inconsistencies. Through the
establishment of relationships between tables using primary and foreign keys,
normalization ensures that data retrieval remains accurate, efficient, and reliable.
However, striking the right balance between normalization and performance optimization
is crucial, as excessive normalization can lead to complex queries and potential
performance issues. Ultimately, normalization serves as a cornerstone of effective
database design, contributing to improved data management, streamlined operations, and
reliable decision-making processes within DBMS environments.
1. INTRODUCTION

Normalization in DBMS:

Normalization is a fundamental concept in Database Management Systems (DBMS) that
plays a pivotal role in designing efficient, organized, and reliable databases. It is a
systematic process of structuring data within a relational database to reduce data
redundancy and maintain data integrity. The primary goal of normalization is to ensure
that data is stored in a way that minimizes inconsistencies and anomalies while allowing
for efficient data retrieval and maintenance.

1.1 Data Modification:

In the context of a relational database, data is organized into tables, with each table
consisting of rows (records) and columns (attributes). When designing a database, it's
essential to ensure that data is stored in a manner that reflects real-world relationships
accurately and efficiently. Normalization achieves this by breaking down larger tables
into smaller, related tables and establishing rules that govern how data can be added,
updated, and deleted.

1.2 Normal Forms:

The normalization process involves dividing a large table into multiple smaller tables,
each serving a specific purpose. To accomplish this, normalization relies on a set of rules,
known as normal forms, which define the requirements that must be met for a table to be
considered normalized. These normal forms, such as First Normal Form (1NF), Second
Normal Form (2NF), Third Normal Form (3NF), and others, progressively eliminate data
redundancy and various types of anomalies.

2. First Normal Form (1NF)

The First Normal Form (1NF) is the foundational step in the normalization process
within Database Management Systems (DBMS). It defines the basic requirements that a
relation (table) in a relational database must meet to be considered in 1NF. The primary
objective of 1NF is to eliminate duplicate rows and ensure that each attribute (column)
contains atomic (indivisible) values.

Here are the key characteristics and requirements of the First Normal Form:

Atomic Values: Each attribute (column) in a 1NF table must contain atomic (indivisible)
values. This means that the values in a column should not be further decomposed into
smaller parts. For example, if you have a "Phone Numbers" column, it should contain
complete phone numbers, not a combination of area codes, prefixes, and line numbers.

No Repeating Groups: There should be no repeating groups or arrays within a single
cell or attribute. This requirement implies that you should avoid storing multiple values
in a single attribute. For instance, if you have a "Skills" column, you shouldn't store
multiple skills as a comma-separated list within the same cell.

Unique Column Names: Each column in the table should have a unique name to ensure
that attributes are easily distinguishable and to prevent ambiguity in data retrieval.

Unique Rows: Each row in the table should be unique. This implies that there should be
a way to uniquely identify each row, often achieved by having a primary key. No two
rows should be identical in all their attributes.
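
To make these rules concrete, the following minimal Python sketch (the table and
column names are hypothetical, not taken from this report) flattens a record whose
"Skills" attribute stores a comma-separated list into atomic, one-value-per-row form:

# Minimal sketch: converting a non-1NF record into 1NF rows.
# The "Skills" attribute below violates 1NF because it packs several
# values into a single cell; splitting it yields atomic values.

non_1nf_rows = [
    {"EmployeeID": 1, "Name": "Alice", "Skills": "SQL, Python"},
    {"EmployeeID": 2, "Name": "Bob", "Skills": "Java"},
]

# One row per (employee, skill): every attribute now holds a single,
# indivisible value, as 1NF requires.
rows_1nf = [
    {"EmployeeID": r["EmployeeID"], "Name": r["Name"], "Skill": s.strip()}
    for r in non_1nf_rows
    for s in r["Skills"].split(",")
]

for row in rows_1nf:
    print(row)

After the split, row uniqueness can be enforced with a composite key such as
(EmployeeID, Skill).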

3. Second Normal Form (2NF)

The Second Normal Form (2NF) is the second step in the normalization process within
Database Management Systems (DBMS). It builds upon the concepts of the First Normal
Form (1NF) and is designed to further eliminate redundancy and improve data integrity
in relational databases. 2NF is concerned with ensuring that each non-prime attribute (an
attribute that is not part of the primary key) is fully functionally dependent on the entire
primary key.

Here are the key characteristics and requirements of the Second Normal Form (2NF):

Satisfying 1NF: Before a table can be in 2NF, it must already satisfy the requirements of
1NF, which means that it should have atomic values in each attribute and no repeating
groups.

Primary Key: There should be a primary key defined for the table that uniquely
identifies each row.

Functional Dependency: Each non-prime attribute (an attribute not part of the primary
key) should be fully functionally dependent on the entire primary key. This means that
every non-prime attribute should depend on the entire primary key, not just a part of it.
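
Full functional dependency can be checked mechanically against sample data. The
sketch below is an instance-level illustration with assumed column names, not a general
dependency-inference algorithm: X -> y holds in a relation if no two rows agree on the
attributes X while differing on y.

# Sketch: testing whether attributes X functionally determine attribute y.
# X -> y holds when rows that agree on X always agree on y.

def determines(rows, X, y):
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in X)
        if key in seen and seen[key] != row[y]:
            return False  # same X-values, different y-value: FD violated
        seen[key] = row[y]
    return True

enrollments = [
    {"StudentID": 101, "CourseID": 1, "CourseName": "Mathematics"},
    {"StudentID": 101, "CourseID": 2, "CourseName": "Physics"},
    {"StudentID": 102, "CourseID": 1, "CourseName": "Mathematics"},
]

# CourseName is determined by CourseID alone, i.e. by only part of the
# composite key (StudentID, CourseID) -- exactly the partial dependency
# that 2NF removes.
print(determines(enrollments, ["CourseID"], "CourseName"))  # True

The example tables that follow show this situation and its repair.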

Consider a table that records which students take which courses:

Student ID   Course ID   Course Name   Instructor
101          1           Mathematics   Dr. Smith
101          2           Physics       Dr. Johnson
102          1           Mathematics   Dr. Smith
103          3           Biology       Dr. Williams

The composite primary key is (Student ID, Course ID), yet Course Name and Instructor
depend on Course ID alone, which is a partial dependency. Splitting off the course
details leaves an enrollment table that holds only the composite key:

Student ID   Course ID
101          1
101          2
102          1
103          3

with the course details (Course ID, Course Name, Instructor) stored once in a separate
Courses table.

1. Understanding Second Normal Form (2NF)

Definition: 2NF is a stage in the normalization process for relational databases.
Purpose: It eliminates partial dependencies and reduces data redundancy.
Building on 1NF: To achieve 2NF, a table must already satisfy the requirements of First
Normal Form (1NF), ensuring atomic values and no repeating groups.

2. Primary Key and Non-Prime Attributes

Primary Key: In 2NF, there must be a defined primary key that uniquely identifies each
row in the table.
Non-Prime Attributes: These are attributes that are not part of the primary key.

3. Functional Dependency and Partial Dependencies

Functional Dependency: 2NF addresses functional dependencies between attributes.
Partial Dependencies: The focus is on eliminating partial dependencies, where non-prime
attributes depend on only part of the primary key.
Example: The enrollment table above exhibits partial dependencies, since Course Name
and Instructor are determined by Course ID alone.
Criteria: Partial dependencies are identified by examining the relationship between each
non-prime attribute and the primary key; if a non-prime attribute is determined by a
proper subset of a composite key, the dependency is partial.

4. Achieving 2NF through Decomposition

Decomposition: The table containing partial dependencies is broken down into multiple
tables.
Result: Decomposition yields tables in which every non-prime attribute is fully
functionally dependent on the entire primary key.
Example: The enrollment example above shows a table before and after achieving 2NF
through decomposition; a schema sketch follows below.
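
A minimal schema sketch of that decomposition, using Python's built-in sqlite3 module
on an in-memory database (table and column names are illustrative assumptions):

# Sketch: the 2NF decomposition of the enrollment example as SQL DDL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Course facts are stored once, keyed on CourseID alone.
    CREATE TABLE Courses (
        CourseID   INTEGER PRIMARY KEY,
        CourseName TEXT NOT NULL,
        Instructor TEXT NOT NULL
    );

    -- Enrollments keep only the composite key; in each table every
    -- non-prime attribute now depends on that table's whole key.
    CREATE TABLE Enrollments (
        StudentID INTEGER NOT NULL,
        CourseID  INTEGER NOT NULL REFERENCES Courses(CourseID),
        PRIMARY KEY (StudentID, CourseID)
    );
""")

conn.execute("INSERT INTO Courses VALUES (1, 'Mathematics', 'Dr. Smith')")
conn.execute("INSERT INTO Enrollments VALUES (101, 1)")
print(conn.execute(
    "SELECT * FROM Enrollments NATURAL JOIN Courses").fetchall())

The join recovers the original wide view on demand, while each course's details are
stored exactly once.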

IMPORTANCE OF SECOND NORMAL FORM:

Elimination of Partial Dependencies: The primary purpose of 2NF is to eliminate


partial dependencies within a relational table. Partial dependencies occur when non-
prime attributes (attributes not part of the primary key) depend on only part of the
primary key. By enforcing the elimination of partial dependencies, 2NF helps ensure that
data anomalies, such as insertion, update, and deletion anomalies, are minimized or
avoided. This leads to a more consistent and reliable database.
Data Integrity: 2NF contributes to data integrity by structuring the database in a way
that prevents data anomalies. When non-prime attributes are fully functionally dependent
on the entire primary key, it reduces the risk of inconsistent or contradictory data being
stored in the database. This is particularly crucial in applications where data accuracy is
paramount, such as financial systems and healthcare databases.
Minimization of Data Redundancy: By breaking down tables with partial
dependencies into smaller, related tables, 2NF reduces data redundancy. Data is stored in
a more compact and efficient manner, which not only saves storage space but also makes
the database more manageable and less prone to inconsistencies.
Efficient Querying: A database in 2NF tends to perform better for querying operations.
Since related data is stored in separate tables and is linked through primary and foreign
keys, queries that involve joins and data retrieval become more efficient. This can lead to
faster query response times and improved overall system performance.
Improved Database Design: Achieving 2NF often results in a more logical and
well-structured database design. It helps organize data in a manner that accurately
represents the relationships between entities and attributes, making it easier to
understand and maintain the database schema.
Adaptability and Scalability: 2NF-compliant databases are typically more adaptable
and scalable. As the data model becomes more normalized, it becomes easier to
accommodate changes in requirements, add new features, and scale the database to
handle increasing data volumes.
Compliance with Best Practices: Following normalization principles, including 2NF,
aligns your database design with industry best practices. It ensures that your database is
structured in a way that is widely accepted and understood by database professionals and
developers.
In summary, Second Normal Form (2NF) is crucial in DBMS because it helps maintain
data integrity, reduces redundancy, improves database design, and enhances query
performance. It ensures that your database accurately represents the real-world
relationships between entities and attributes while minimizing the risk of data anomalies
and inconsistencies. By adhering to 2NF, you contribute to the overall reliability and
efficiency of your database system.

4. Third Normal Form (3NF)

The Third Normal Form (3NF) is a crucial step in the normalization process of relational
databases. It builds upon the principles of the First Normal Form (1NF) and the Second
Normal Form (2NF) while addressing transitive dependencies within the data. 3NF aims
to ensure that data remains accurate and that there are no non-prime attributes (attributes
not part of the primary key) that depend on other non-prime attributes.
Here are the key characteristics and requirements of the Third Normal Form (3NF):
Satisfying 1NF and 2NF: Before a table can be in 3NF, it must already satisfy the
requirements of 1NF and 2NF.
Primary Key and Non-Prime Attributes: There should be a defined primary key that
uniquely identifies each row in the table. Non-prime attributes are those that are not part
of the primary key.
Functional Dependency: 3NF addresses functional dependencies between attributes.
Transitive Dependencies: The primary focus of 3NF is to eliminate transitive
dependencies. A transitive dependency occurs when a non-prime attribute depends on
another non-prime attribute, which in turn depends on the primary key.

To better understand 3NF, let's consider an example:


Suppose we have a table named "Employee Info" that records information about
employees and their respective departments:

Employee ID   Employee Name   Department   Department Head
101           John Smith      HR           Mary Johnson
102           Jane Doe        Sales        Sam Williams
103           Robert Brown    HR           Mary Johnson

In this table, "Employee ID" is the primary key, and "Department Head" is a non-prime
attribute. The issue here is that "Department Head" depends on "Department," which in
turn depends on the primary key "Employee ID."
To bring this table into 3NF, we would split it into two tables:

Employees Table:

Employee ID   Employee Name   Department
101           John Smith      HR
102           Jane Doe        Sales
103           Robert Brown    HR

Note that the Department column is retained in the Employees table as a foreign key;
without it, the decomposition would lose the association between employees and their
departments.

Departments Table:

Department   Department Head
HR           Mary Johnson
Sales        Sam Williams

Now, "Department Head" is directly dependent on "Department," and the database is in


3NF. This separation eliminates redundancy and ensures that each piece of data is stored
in one place while maintaining data integrity.
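
A small Python sketch of this split (illustrative only; it mirrors the tables above) derives
the two 3NF relations from the original rows and shows that each department head is
now recorded once:

# Sketch: removing the transitive dependency
# EmployeeID -> Department -> DepartmentHead by decomposition.

employee_info = [
    (101, "John Smith", "HR", "Mary Johnson"),
    (102, "Jane Doe", "Sales", "Sam Williams"),
    (103, "Robert Brown", "HR", "Mary Johnson"),
]

# Employees table: EmployeeID determines Name and Department.
employees = [(eid, name, dept) for eid, name, dept, _head in employee_info]

# Departments table: Department determines DepartmentHead, stored once.
departments = {dept: head for _eid, _name, dept, head in employee_info}

print(employees)    # [(101, 'John Smith', 'HR'), ...]
print(departments)  # {'HR': 'Mary Johnson', 'Sales': 'Sam Williams'}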

3NF is essential in database design because it helps prevent data anomalies and
inconsistencies by removing transitive dependencies. It results in a well-structured and
efficient database schema, which is easier to manage and query effectively.

5. Boyce-Codd Normal Form (BCNF)

Boyce-Codd Normal Form (BCNF) is a higher level of database normalization that
addresses certain limitations of the Third Normal Form (3NF). It is named after two
computer scientists, Raymond Boyce and Edgar Codd, who contributed significantly to
the development of database normalization theory. BCNF focuses on eliminating
anomalies in a relational database by addressing functional dependencies between
attributes and ensuring that every determinant of a functional dependency is a super key.
Here are the key characteristics and requirements of Boyce-Codd Normal Form (BCNF):
Satisfying 1NF and 2NF: A table must first satisfy the requirements of 1NF and 2NF
before it can be considered for BCNF. This means that the table should have atomic
values and no partial dependencies.
Primary Key and Super key: In BCNF, there must be a defined primary key that
uniquely identifies each row. Additionally, every non-prime attribute (an attribute not part
of the primary key) should be functionally dependent on the super key, which is a set of
one or more attributes that can be used to uniquely identify a row.
Functional Dependency: BCNF addresses functional dependencies between attributes.
It ensures that non-prime attributes depend only on super keys, not on other non-prime
attributes.
No Partial Key Dependency: BCNF eliminates partial key dependencies, where non-
prime attributes depend on only part of the primary key. This is similar to the
requirements of 2NF and 3NF.
Primary Key Determination: If a table has multiple candidate keys (sets of attributes
that could be the primary key), BCNF ensures that each candidate key is sufficient to
determine all non-prime attributes.

To better understand BCNF, let's consider an example. Suppose we have a table named
"Teachers Courses" that records information about teachers and the courses they teach:

Teacher ID   Course ID   Course Name   Department
101          1           Mathematics   Math
101          2           Physics       Physics
102          1           Mathematics   Math
103          3           Biology       Biology
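
Reading the sample data, one plausible dependency is Course ID -> Course Name (and
Department), while the key of the table is (Teacher ID, Course ID). Since Course ID is
not a super key, such a dependency violates BCNF. The Python sketch below makes that
check explicit; note that it tests super-key status only against this sample instance, and
the dependency itself is an assumption read off the data, not stated in the original table
definition.

# Sketch: a BCNF check -- for every functional dependency X -> Y,
# the determinant X must be a super key of the relation.

def is_superkey(rows, X):
    # X is a super key (of this sample instance) if its values
    # are unique across all rows.
    keys = [tuple(r[a] for a in X) for r in rows]
    return len(keys) == len(set(keys))

def violates_bcnf(rows, determinant):
    return not is_superkey(rows, determinant)

teachers_courses = [
    {"TeacherID": 101, "CourseID": 1, "CourseName": "Mathematics", "Department": "Math"},
    {"TeacherID": 101, "CourseID": 2, "CourseName": "Physics", "Department": "Physics"},
    {"TeacherID": 102, "CourseID": 1, "CourseName": "Mathematics", "Department": "Math"},
    {"TeacherID": 103, "CourseID": 3, "CourseName": "Biology", "Department": "Biology"},
]

# Assumed dependency CourseID -> CourseName: CourseID repeats across
# rows, so it is not a super key, and the dependency violates BCNF.
print(violates_bcnf(teachers_courses, ["CourseID"]))  # True

The usual repair is to move the course facts (Course ID, Course Name, Department)
into their own table, leaving (Teacher ID, Course ID) as the teaching relation.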

BCNF is important in database design because it helps eliminate redundancy and
anomalies in data, resulting in a well-structured and efficient database schema. It ensures
that data is organized accurately and efficiently while preserving data integrity, which is
essential for database reliability and query performance.

6. Fourth Normal Form (4NF)

The Fourth Normal Form (4NF) is a level of database normalization that builds upon the
principles of the Third Normal Form (3NF). 4NF aims to address multi-valued
dependencies within a relational database, ensuring that data is organized in a way that
eliminates redundancy and preserves data integrity. Multi-valued dependencies occur
when an attribute depends on multiple, independent values of another attribute within the
same table.
Here are the key characteristics and requirements of the Fourth Normal Form (4NF):
Satisfying 1NF, 2NF, and 3NF: Before a table can be in 4NF, it must already satisfy the
requirements of 1NF, 2NF, and 3NF. This means the table should have atomic values, no
partial dependencies, and no transitive dependencies.
Primary Key and Non-Prime Attributes: There should be a defined primary key that
uniquely identifies each row in the table. Non-prime attributes are those that are not part
of the primary key.
Functional Dependency: 4NF addresses functional dependencies between attributes.
Multi-Valued Dependencies (MVDs): The primary focus of 4NF is to eliminate multi-
valued dependencies. A multi-valued dependency occurs when a non-prime attribute
depends on multiple, independent values of another non-prime attribute.

To better understand 4NF, let's consider an example:

Suppose we have a table named "Student Courses" that records information about
students and the courses they have taken, along with the textbooks they have used:

Student ID   Course ID   Textbook
101          1           Book A
101          2           Book B
102          1           Book C
102          3           Book D
103          3           Book E

In this table, "Student ID" and "Course ID" together form the composite primary key,
and "Textbook" is a non-prime attribute. The issue here is that, for a given course, the
set of textbooks is independent of which students take the course: "Course ID"
multi-determines "Textbook" (Course ID ↠ Textbook), which is a multi-valued
dependency.
To bring this table into 4NF, we would split it into two tables:

Student Courses Table:

Student ID   Course ID
101          1
101          2
102          1
102          3
103          3

Textbooks Table:

Course ID   Textbook
1           Book A
2           Book B
1           Book C
3           Book D
3           Book E

Now, "Textbook" is directly related to "Course ID," and the database is in 4NF. This
separation eliminates multi-valued dependencies and ensures that data is organized
efficiently while preserving data integrity.
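
A multi-valued dependency can also be tested mechanically. The simplified Python
sketch below (illustrative; the rows are hypothetical and chosen so that the dependency
actually holds) checks Course ID ↠ Textbook: within each course, the set of textbooks
must pair freely with the set of students.

# Sketch: testing a multi-valued dependency X ->> Y in a relation whose
# attributes split into X, Y, and Z (all remaining attributes).
# X ->> Y holds when, within each X-group, every Y-value occurs with
# every Z-value (Y and Z vary independently).

from itertools import product

def mvd_holds(rows, X, Y, Z):
    groups = {}
    for r in rows:
        k = tuple(r[a] for a in X)
        groups.setdefault(k, set()).add(
            (tuple(r[a] for a in Y), tuple(r[a] for a in Z)))
    for pairs in groups.values():
        ys = {y for y, _ in pairs}
        zs = {z for _, z in pairs}
        if set(product(ys, zs)) != pairs:
            return False  # Y and Z do not vary independently
    return True

# Hypothetical rows: every student in course 1 uses every textbook
# of course 1, so CourseID ->> Textbook holds.
student_courses = [
    {"StudentID": 101, "CourseID": 1, "Textbook": "Book A"},
    {"StudentID": 101, "CourseID": 1, "Textbook": "Book C"},
    {"StudentID": 102, "CourseID": 1, "Textbook": "Book A"},
    {"StudentID": 102, "CourseID": 1, "Textbook": "Book C"},
]

print(mvd_holds(student_courses,
                ["CourseID"], ["Textbook"], ["StudentID"]))  # True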

4NF is important in database design when dealing with multi-valued dependencies, as it
helps maintain data accuracy, consistency, and efficient storage of information. It results
in a more structured and well-organized database schema.

7. Normalization vs. Denormalization
Normalization and denormalization are two opposing database design techniques used to
organize and structure relational databases. Each approach has its advantages and
disadvantages, and the choice between them depends on specific requirements and trade-
offs in a given application. Here's a comparison of normalization and denormalization:

Normalization:
Definition: Normalization is a process of organizing data in a relational database to
reduce redundancy and improve data integrity. It involves breaking down large tables
into smaller, related tables while ensuring that each table adheres to certain normalization
forms (e.g., 1NF, 2NF, 3NF, BCNF).
Objective: The primary goal of normalization is to prevent data anomalies, such as
insertion, update, and deletion anomalies, by enforcing rules that ensure data is stored
efficiently and without redundancy.
Advantages:
Reduces data redundancy, leading to storage optimization.
Minimizes data anomalies, ensuring data consistency and accuracy.
Makes it easier to maintain and update the database.
Generally leads to more efficient queries through well-structured tables.
Use Cases: Normalization is typically preferred in scenarios where data integrity is
critical, such as financial systems, healthcare databases, and mission-critical applications.
It's also suitable when data is frequently updated.
Complex Queries: Normalized databases may require more complex queries involving
joins to retrieve data from multiple tables, which can impact query performance in some
cases.
Denormalization:
Definition: Denormalization is a process of intentionally introducing redundancy into a
relational database by combining tables or duplicating data. It's done to improve query
performance or simplify database design.

Objective: The primary goal of denormalization is to optimize query performance by
reducing the need for complex joins and allowing for faster data retrieval.
Advantages:
Speeds up query performance by minimizing the number of joins.
Simplifies database design, making it easier to understand and implement.
May be suitable for read-heavy or analytical workloads where data consistency can be
compromised to some extent.
Use Cases: Denormalization is often applied in scenarios where query performance is
critical, such as reporting systems, data warehouses, and applications with a heavy read
workload.
Data Integrity: Denormalization can compromise data integrity because it may
introduce redundancy, making it necessary to carefully manage data updates to ensure
consistency.
Storage Overhead: Denormalized databases can consume more storage space due to
redundant data.

Choosing Between Normalization and Denormalization:

The choice between normalization and denormalization depends on factors like the
specific requirements of your application, the nature of your data, and the trade-offs
you're willing to make. In practice, many databases strike a balance by normalizing the
core data for data integrity and then selectively denormalizing certain parts for query
optimization. This approach is called "controlled denormalization" and seeks to harness
the benefits of both techniques while minimizing their drawbacks.
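
The trade-off can be seen in miniature with Python's sqlite3 module (a hedged sketch;
the schema and data are invented for illustration): the normalized design answers a
question with a join, while a denormalized copy answers it from a single table at the
cost of repeating the department name in every employee row.

# Sketch: the same query against a normalized schema (join required)
# and a denormalized copy (no join, redundant department names).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Departments (DeptID INTEGER PRIMARY KEY, DeptName TEXT);
    CREATE TABLE Employees (EmpID INTEGER PRIMARY KEY, Name TEXT,
                            DeptID INTEGER REFERENCES Departments(DeptID));
    INSERT INTO Departments VALUES (1, 'HR'), (2, 'Sales');
    INSERT INTO Employees VALUES (101, 'John', 1), (102, 'Jane', 2);

    -- Denormalized copy: DeptName is duplicated into every employee row.
    CREATE TABLE EmployeesDenorm AS
        SELECT EmpID, Name, DeptName
        FROM Employees JOIN Departments USING (DeptID);
""")

# Normalized: a join is needed to recover the department name.
print(conn.execute("""
    SELECT Name, DeptName FROM Employees
    JOIN Departments USING (DeptID)""").fetchall())

# Denormalized: no join, but renaming a department now means updating
# many rows -- the redundancy and consistency cost described above.
print(conn.execute("SELECT Name, DeptName FROM EmployeesDenorm").fetchall())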

8. Conclusion
Normalization is a fundamental concept in Database Management Systems (DBMS) that
plays a pivotal role in designing efficient, organized, and reliable databases. Throughout
this discussion, we've explored the significance, principles, and various normal forms
involved in the normalization process. In conclusion, here are the key takeaways
regarding normalization in DBMS:
Data Integrity and Accuracy: Normalization is primarily about ensuring data integrity
and accuracy in relational databases. By adhering to the principles of normalization, we
can significantly reduce the chances of data anomalies, such as insertion, update, and
deletion anomalies, ensuring that the data remains consistent and trustworthy.
Structured and Efficient Data Storage: Normalization leads to well-structured
databases, where data is organized in a way that reflects real-world relationships. This
organization not only promotes data consistency but also optimizes data storage by
reducing redundancy. It ensures that data is stored in a compact and efficient manner.
Progressive Normal Forms: Normalization encompasses a series of normal forms, from
First Normal Form (1NF) to higher levels like Boyce-Codd Normal Form (BCNF) and
Fourth Normal Form (4NF). Each level builds on the previous one, refining the database
structure and eliminating specific types of data dependencies and anomalies.
Trade-Offs Between Normalization and Performance: While normalization enhances
data integrity, it can sometimes result in more complex queries involving multiple joins.
Therefore, designers must strike a balance between normalization and performance,
especially in scenarios where read-heavy workloads and query efficiency are critical.
Selective Denormalization: In practice, controlled denormalization is often employed to
optimize query performance while maintaining data integrity. This approach selectively
denormalizes specific parts of the database, striking a balance between normalized and
denormalized data structures.
Application-Specific Considerations: The decision to normalize or denormalize should
be guided by the specific requirements of the application. High-integrity applications,
such as financial systems, may prioritize normalization, while read-heavy analytics
systems may lean toward denormalization.
In conclusion, normalization is a vital concept in DBMS that helps ensure the reliability,
efficiency, and accuracy of relational databases. It offers a structured framework for
organizing data and mitigating the risk of data anomalies. The choice of the appropriate
normalization level or the introduction of controlled denormalization should be driven by
the specific needs and performance considerations of the application at hand. Ultimately,
effective database design strikes a balance between data integrity and query performance
to meet the goals of the organization.

