Normalization of the Lowe's Inventory Information System Database
As a database grows in size and complexity, it is essential to maintain order and organization in order to control that complexity and to minimize errors and redundancy in the associated data. This is accomplished through a process referred to as normalization.
Normalization permits us to design our relational database tables so that they "(1) contain all the data necessary for the purposes that the database is to serve, (2) have as little redundancy as possible, (3) accommodate multiple values for types of data that require them, (4) permit efficient updates of the data in the database, and (5) avoid the danger of losing data unknowingly" (Wyllys, 2002).
As a prelude to normalization, the database modeler researches the company and current database uses to determine the requirements for the new database. Table elements and relationships are determined, and candidate keys reviewed and established for the tables. The process of database normalization then begins.
Databases can attain varying degrees of normalization, classified as 1NF, 2NF, 3NF, 4NF, 5NF, and BCNF; however, for practicality and in keeping with the layout of our Lowe's inventory database, only the first through third normal forms (1NF–3NF) will be addressed.
First, a balance must be struck between data accessibility, in terms of performance and maintenance, and the concerns of data redundancy. To accomplish this and normalize the Lowe's database, the supply and retail sides of the database were combined and the tables were set in first normal form. In first normal form, the tables were formatted to ensure that the data within them was atomic, i.e., in its simplest form and free of repeating groups. Tables in 1NF are characterized by a concatenated primary key and may still contain partial and transitive dependencies. Decomposition from this point helps to eliminate redundancy as the modeler works toward a defined goal based on business rules and individual needs.
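As a brief illustration of the 1NF step, the sketch below shows a repeating group of order items replaced by atomic rows keyed on a concatenated primary key. The table and column names (OrderLine, item_sku, and so on) are assumptions for illustration only, not the actual Lowe's schema.

```python
# A minimal sketch of moving a repeating group into first normal form (1NF),
# assuming a hypothetical OrderLine table rather than the real Lowe's design.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Unnormalized idea: one order row holding a list such as
# items = "hammer, nails, tape measure" (a repeating group).
# In 1NF every value is atomic, so each item becomes its own row and the
# primary key is concatenated from the order and the item.
cur.execute("""
    CREATE TABLE OrderLine (
        order_id INTEGER,
        item_sku TEXT,
        quantity INTEGER,
        PRIMARY KEY (order_id, item_sku)  -- concatenated key typical of 1NF
    )
""")
cur.executemany(
    "INSERT INTO OrderLine VALUES (?, ?, ?)",
    [(1001, "HAMMER-16OZ", 1), (1001, "NAILS-2IN", 3), (1002, "TAPE-25FT", 2)],
)
conn.commit()
```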
The tables were next moved to second normal form, again undergoing a review in which efforts were taken to reduce the amount of redundant data by extracting it and placing it in new table(s). Here, each key component is written on a separate line, with the original key written on the last line. All dependent attributes then follow their respective keys. This process is used to eliminate partial dependencies, which are not allowed in 2NF.
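Continuing the same hypothetical example, the sketch below removes a partial dependency: an attribute that depends on only part of the concatenated key is extracted into its own table. The names remain illustrative assumptions.

```python
# A minimal sketch of removing a partial dependency to reach 2NF.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# In 1NF, item_description would depend only on item_sku -- one part of the
# concatenated key (order_id, item_sku) -- which is a partial dependency.
# Extracting it into an Item table eliminates that redundancy.
cur.executescript("""
    CREATE TABLE Item (
        item_sku         TEXT PRIMARY KEY,
        item_description TEXT
    );
    CREATE TABLE OrderLine (
        order_id INTEGER,
        item_sku TEXT REFERENCES Item(item_sku),
        quantity INTEGER,
        PRIMARY KEY (order_id, item_sku)
    );
""")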
Finally, the tables were set into third normal form by ensuring that no non-identifying attribute was dependent on any other non-identifying attribute, thereby removing the transitive dependencies that are not permitted in 3NF.
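The last sketch in this illustrative series removes a transitive dependency: an attribute that depends on another non-key attribute rather than on the key itself is moved to its own table. Again, the table and column names are assumptions, not the actual Lowe's design.

```python
# A minimal sketch of removing a transitive dependency to reach 3NF.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# In 2NF, Item might still carry supplier_name, which depends on supplier_id
# (a non-key attribute) rather than on item_sku directly -- a transitive
# dependency. Splitting out Supplier removes it.
cur.executescript("""
    CREATE TABLE Supplier (
        supplier_id   INTEGER PRIMARY KEY,
        supplier_name TEXT
    );
    CREATE TABLE Item (
        item_sku    TEXT PRIMARY KEY,
        description TEXT,
        supplier_id INTEGER REFERENCES Supplier(supplier_id)
    );
""")
```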
The next project deliverable is a robust, modernized database and data warehouse design. The company collects large amounts of website data and analyzes this data on behalf of its customers. This document will provide an overview of the new data warehouse along with the type of database design that has been selected for it. Included in the appendix of this document is a graphical depiction of the logical design of the
According to the task assigned to me, I visited one of my favorite food courts, Taco Bell. Every organization has unique data requirements, and this particular food court has its own requirements for data, which can be seen at several levels. The first level at Taco Bell is the hiring manager, who requires data on the employees who work at Taco Bell. At the next level is the manager who looks after the inventory; inventory management requires a database that works efficiently, so at this level the use of a database is a must. At the next level, the data requirements of the cashier differ from the others. The next level of data requirements is for the drive-through customers.
Dr. Edgar F. Codd was best known for creating the "relational" model for representing data, which led to today's database industry ("Edgar F. Codd"). He received many awards for his contributions, and he is one of the reasons we have many of the technologies we use today. As we dig deeper into his life in this research paper, we will find that Dr. Edgar F. Codd was, in fact, a self-motivated genius.
The Revolution in Database Architecture, by Jim Gray, describes the path Gray thought the evolution of database architecture would take after 2004. He argues that databases had stagnated for several years and that, beginning in 2004, the development of several technologies would pave the way for a revolution in the database world.
With the advancement of database systems and software, Eric Brewer states in his new article that:
A data warehouse built from disparate data sources enables a "single version of the truth" through shared data repositories and standards, and it also provides access to data that expands the frequency and depth of data analysis. For these reasons, the data warehouse is the foundation for business intelligence.
When developing a relational database, understanding the logical flow of information and proper planning will improve the probability of the database functioning the way it is intended and producing the desired results. In determining the proper structure of a relational database for a video rental store, one must consider what information is stored, the process for renting videos, and the information maintained on the videos in inventory. Customer, Video, and Video Type are the entity classes that will be discussed, and Customer Order is the intersection relation needed to explain the complete process, as seen in the Entity Relationship Diagram below.
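The sketch below is one possible rendering of those entity classes and the intersection relation as tables; the column names are assumptions for illustration, not the essay's actual design.

```python
# A minimal sketch of the Customer, Video, VideoType, and CustomerOrder
# entities described above, with hypothetical columns.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE VideoType (
        type_id   INTEGER PRIMARY KEY,
        type_name TEXT                 -- e.g. New Release, Classic, Family
    );
    CREATE TABLE Video (
        video_id INTEGER PRIMARY KEY,
        title    TEXT,
        type_id  INTEGER REFERENCES VideoType(type_id)
    );
    CREATE TABLE Customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        phone       TEXT
    );
    -- Intersection relation resolving the many-to-many rental relationship
    CREATE TABLE CustomerOrder (
        customer_id INTEGER REFERENCES Customer(customer_id),
        video_id    INTEGER REFERENCES Video(video_id),
        rental_date TEXT,
        due_date    TEXT,
        PRIMARY KEY (customer_id, video_id, rental_date)
    );
""")
```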
...ey constraints, contain data which describes the rows in the fact table. In the star schema design, the dimension tables are denormalized to reduce the number of JOINs necessary in queries on the fact table, while in the snowflake schema the dimension tables are normalized to reduce data duplication and allow reuse of those tables with other fact tables.
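The contrast can be sketched with a single product dimension; the table names below (DimProduct, DimCategory, FactSales, and so on) are illustrative assumptions rather than any schema from the original text.

```python
# A minimal sketch contrasting a star-schema dimension (denormalized) with a
# snowflake-schema dimension (normalized into sub-tables).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Star schema: one wide, denormalized dimension joined directly to the fact
    CREATE TABLE DimProduct_Star (
        product_key   INTEGER PRIMARY KEY,
        product_name  TEXT,
        category_name TEXT,            -- category repeated on every product row
        department    TEXT
    );

    -- Snowflake schema: the same dimension normalized to remove duplication
    CREATE TABLE DimDepartment (
        department_key INTEGER PRIMARY KEY,
        department     TEXT
    );
    CREATE TABLE DimCategory (
        category_key   INTEGER PRIMARY KEY,
        category_name  TEXT,
        department_key INTEGER REFERENCES DimDepartment(department_key)
    );
    CREATE TABLE DimProduct_Snowflake (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category_key INTEGER REFERENCES DimCategory(category_key)
    );

    -- Fact table referencing a dimension via a foreign key in either design
    CREATE TABLE FactSales (
        product_key INTEGER REFERENCES DimProduct_Star(product_key),
        sale_date   TEXT,
        amount      REAL
    );
""")
```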
This data model organizes data in a tree-like structure similar to an organizational chart. The structure represents information using parent/child relationships: a one-to-many relationship is created, because one parent can have many children but a child belongs to only one parent. This model was the first database model, created by IBM in 1966 and used in IBM's Information Management System (IMS); Microsoft later employed it in the Windows Registry. This data model replaced the flat-file database system because it was faster and simpler, but it was inflexible because relationships were restricted to one-to-many.
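A small sketch of that parent/child rule, under the assumption of a generic Record node type (not any real IMS or Registry API):

```python
# A minimal sketch of the hierarchical (tree) model: each record has at most
# one parent and possibly many children, so only one-to-many relationships
# can be represented directly.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Record:
    name: str
    parent: Optional["Record"] = None
    children: List["Record"] = field(default_factory=list)

    def add_child(self, child: "Record") -> "Record":
        child.parent = self           # a child belongs to exactly one parent
        self.children.append(child)   # a parent may have many children
        return child


# e.g. a store record that owns a department, which owns employee records
root = Record("Store")
dept = root.add_child(Record("Plumbing"))
dept.add_child(Record("Employee: Smith"))
dept.add_child(Record("Employee: Jones"))
```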
One of the most basic measures that must be examined and planned involves the smallest units within the database, the fields. The fields are derived from the simple attributes that were defined in the logical data model. A few decisions need to be made regarding each of these individual fields. First, what type of data is going to be stored in them? The data type assigned to each field should be able to accurately represent every possible valid value while excluding invalid values as much as possible. Special consideration should be given to any manipulations that will be performed on the data, as some data types make these manipulations much easier than others. When considering data manipulations, it is important to keep in mind even simple operations like addition: when finding the sum of a field's values, the data type that worked for the individual fields may not be large enough to hold the resulting sum.
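As a hedged illustration of that summation concern, the sketch below uses a 16-bit signed integer field (the "h" format in Python's struct module, chosen here purely for demonstration): each individual value fits the field's type, but their sum does not.

```python
# A minimal sketch: a field type wide enough for individual values may be
# too small for their sum.
import struct

quantities = [20000, 15000, 4000]      # each fits in a signed 16-bit field
for q in quantities:
    struct.pack("<h", q)               # packs without error

total = sum(quantities)                # 39000 exceeds the 16-bit maximum of 32767
try:
    struct.pack("<h", total)
except struct.error as exc:
    print(f"sum {total} does not fit the field's type: {exc}")
```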
Prior to the start of the Information Age in the late 20th century, businesses had to collect data from non-automated sources. Businesses then lacked the computing resources necessary to properly analyze the data, and as a result, companies often made business d...
In our world, people rely heavily on the power of technology every day. Kids are learning how to operate an iPad before they can even say their first word. School assignments have become virtual, making it possible to complete them from anywhere in the world. We can receive information from across the world in less than a second with the touch of a button. Technology is a big part of our lives, and without it life becomes a lot harder. Just as our phones are essential to our daily lives, database management systems are essential to businesses. Without this important software, it would be nearly impossible for companies to complete simple daily tasks with such ease.