Chapter 3: Computer Programming
1. Herman Hollerith was born on February 29, 1860 in Buffalo, New York. He graduated from the Columbia School of Mines in 1879 after studying engineering among various other topics. Shortly after, Herman worked as an assistant to his former teacher at the U.S. Census Bureau. Working there as a statistician, he saw firsthand the problems of dealing with large amounts of data by hand. The 1880 census took seven and a half years to complete, and the 1890 census was expected to take much longer due to the rise of immigration. While at the Census Bureau, he met his future wife, Lucia Beverley Talcott. Her father gave Herman an idea for a device to speed up the census count, an idea drawn from the Jacquard loom, an automated loom that carried out several different repetitive tasks. Not long after, Herman became an instructor of mechanical engineering at MIT, but he did not stay long because he did not like working with the students. He then assisted with experiments on an electrically activated brake system for railroads and acquired a patent for a tabulator in 1884. In 1885, Herman's machine was first used by the U.S. Navy. Shortly after his experimental work, he took a job with the U.S. Patent Office, where he remained until 1890. The 1890 census moved much more quickly, being completed in two and a half years as opposed to the roughly ten years that had been expected. Herman's invention saw expanded use in the United States, Canada, Austria, and Norway.
2. The development of the abacus is still debated to this day. Some claim that it was developed in ancient China around 3000 BC; others believe it was developed in ancient Mesopotamia around the same time. The time and place vary depending on which expert you ask. Regardless, it is a device made of wood and beads. The beads slide up and down on wooden rods held in a wooden frame. Each bead has a specific value, and each column represents a successive power of ten, starting with the ones column. Basic operations (addition, subtraction, multiplication, and division) are performed by moving the appropriate beads to the middle of the abacus. The abacus was created as a calculator for ancient people. It was small and easy to carry on one's person. It was so successful that it spread from China to many other countries across Europe and Asia. There were different variations of the abacus depending on the country. For example, the Russian abacus did not have the heaven and earth beads that the Chinese abacus had. Even the ancient Roman Empire adopted the abacus, though in a slightly different form: instead of beads on rods, counters were moved across a smooth table. The Roman abacus was very similar to the Greek abacus. The abacus was used worldwide for nearly anything that dealt with counting or calculation, and it enjoyed heavy use in trade and commerce. The abacus remained the world's main calculator for thousands of years.
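To make the place-value idea above concrete, here is a minimal sketch in Python (not part of the original essay; the column width and function names are chosen only for illustration) showing how each abacus column holds one digit of a power of ten and how addition carries beads into the next column.

    # Each abacus column holds a digit from 0-9; column i stands for 10**i,
    # exactly as described above. This is a simplified model, not a
    # simulation of any particular historical abacus.

    def to_columns(n, width=8):
        """Spread an integer across 'width' columns, ones column first."""
        cols = []
        for _ in range(width):
            cols.append(n % 10)   # beads moved to the middle of this column
            n //= 10
        return cols

    def add_on_abacus(a, b, width=8):
        """Add two numbers column by column, carrying into the next column."""
        result, carry = [], 0
        for da, db in zip(to_columns(a, width), to_columns(b, width)):
            total = da + db + carry
            result.append(total % 10)   # beads left showing in this column
            carry = total // 10         # overflow becomes one unit in the next column
        return sum(digit * 10**i for i, digit in enumerate(result))

    print(add_on_abacus(1234, 789))   # prints 2023

On a physical abacus, multiplication and division reduce to sequences of such shifted additions and subtractions, which is why the same column scheme suffices.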
3. Grace Hopper, referred to by some as the "Queen of Code," was one of the first people to program the earliest computers in the 1940s and 1950s. During World War II, Hopper left a teaching job at Vassar College to join the Navy Reserve, and she was then sent to Harvard to work on the first programmable computer in the United States: the Mark I. The Mark I was the first digital computer to be programmed sequentially, so Hopper experienced firsthand the complexities and frustration that have always been the hallmark of the programming field. The exacting code of machine language could be easily misread or incorrectly written. To reduce the number of programming errors, Hopper and her colleagues collected programs that were free of error and generated a catalogue of subroutines that could be used to develop new programs. By this time, the Mark II had been built, and Aiken's team used the two computers side by side, effectively achieving an early instance of multiprocessing. Hopper later worked on the UNIVAC, and her association with that machine resulted in several important advances in the field of programming. Still aware of the constant problems caused by programming errors, Hopper developed an innovative program that would translate the programmer's language into machine language. This first compiler, called "A-0," allowed the programmer to write in a higher-level symbolic language without having to worry about the tedious binary language of endless numbers needed to communicate with the machine itself. Hopper died on January 1, 1992, after an award-filled career.
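As a rough illustration of what "translating the programmer's language into machine language" means, here is a toy Python sketch. The mnemonics and numeric opcodes are invented for the example; this is not Hopper's A-0 and not the instruction set of any real machine.

    # Toy translator: turn symbolic instructions such as "ADD 14" into
    # numeric (opcode, address) pairs, the kind of conversion an early
    # compiler or assembler automated. Mnemonics and opcodes are made up.
    OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 9}

    def translate(program):
        """Translate a list of symbolic instructions into numeric code."""
        machine_code = []
        for line in program:
            parts = line.split()
            opcode = OPCODES[parts[0]]                        # look up the mnemonic
            address = int(parts[1]) if len(parts) > 1 else 0  # optional operand
            machine_code.append((opcode, address))
        return machine_code

    print(translate(["LOAD 10", "ADD 14", "STORE 20", "HALT"]))
    # [(1, 10), (2, 14), (3, 20), (9, 0)]

A real compiler such as A-0 went further, pulling prerecorded subroutines from the catalogue described above into the finished program, but the basic idea of mechanical translation is the same.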
4. The UNIVAC I, or Universal Automatic Computer I, was designed beginning in 1948 by J. Presper Eckert and John Mauchly, who had also designed the ENIAC. The machine was huge, weighing in at 29,000 pounds; the processor alone took up a space of 14.5 by 7.5 by 9 feet. The UNIVAC I was the first commercial computer made in the United States. Designed for business and administration, it could quickly perform simple arithmetic and data-transport operations, and it was the first computer in the United States to separate I/O from actual computation. Input came from an operator keyboard and console typewriter, and output was written to tape. The first UNIVAC was installed at the U.S. government's Census Bureau. Later, the fifth UNIVAC built famously predicted the winner of the 1952 U.S. presidential election, forecasting that Dwight Eisenhower would defeat Adlai Stevenson. Many businesses installed the UNIVAC I, including G.E., Westinghouse, and Pacific Mutual Life Insurance. It was not a cheap machine, initially priced at $159,000 and later rising to around $1,250,000 or more. The UNIVAC I was created to replace punch-card accounting machines as a data-processing computer. It soon became clear from a financial standpoint that the UNIVAC would also need to read punch cards, and so a tape-to-card converter was created. The machine could read 7,200 decimal digits per second, a vast improvement over the punch-card accounting machines of prior years.
5. MS-DOS, or Microsoft Disk Operating System, is a text-based operating system released by Microsoft in 1981. MS-DOS began as 86-DOS, written by Tim Paterson in 1980. 86-DOS took only six weeks to develop and was a near clone of an older operating system written for an earlier processor. In 1981 Microsoft hired Tim Paterson and bought 86-DOS for $75,000. 86-DOS was then renamed MS-DOS and licensed to IBM, which used the operating system for the IBM PC. Within a year it became much more widely used, as Microsoft licensed the operating system to more than 70 other companies. This, among other factors, led to MS-DOS being the dominant operating system throughout the 1980s. MS-DOS was distributed differently than operating systems today, as each computer maker shipped its own version of MS-DOS. The operating system increased the speed of programs on the PC through direct control of the computer's hardware. Over the years, Microsoft continued to improve performance and add features. The operating system itself uses a command line, also known as a text-based interface. This made it simple to run programs installed on disk storage, but it was still quite difficult for a brand-new user. The popularity of MS-DOS fueled Microsoft's rise into one of the biggest software companies worldwide. MS-DOS remained the dominant operating system until Microsoft introduced its graphical user interface (GUI) operating system, Windows, in 1985.
6. Linux is an open-source operating system based on Unix. Most distributions of Linux are free to use and are highly customizable because the source code is open. The operating system itself is not much different from other modern operating systems: it has a graphical interface and the ability to install software of your choosing (though the software differs slightly from that for Windows or Mac). Each distribution is usually described as its own "flavor" of the operating system, and distributions range from desktop operating systems to server and mobile operating systems. Linux is composed of the bootloader, the kernel, daemons, the shell, the graphical server, the desktop environment, and applications; a small sketch after item 7 shows how a few of these layers can be inspected on a running system. In 1991 Linus Torvalds created the first version of Linux as an open-source alternative to an academic operating system called MINIX, which was also Unix-like. Since its creation, Linux has been more of an enthusiast operating system on desktop computers. It first found wider success in the supercomputing community: NASA began replacing many of its expensive machines with clusters of commodity computers running Linux, and supercomputers today still run Linux-based operating systems. More recently, many netbooks have shipped with distributions of Linux, Google's Chrome OS for example. By far the widest use of Linux is the Android OS found on many mobile devices. As of September 2015, Android boasted an astonishing 1.4 billion active users, making it the most used mobile operating system worldwide.
7. A mobile operating system is an operating system meant specifically to run on a mobile device, such as a smartphone, tablet, or PDA. Modern mobile systems usually use a touchscreen interface, have GPS capability, and include a camera. A mobile operating system must account for these functions, which are not usually part of a desktop environment. It also needs to be very user friendly and easy to manage, especially if the developers want to draw in new users. The main way to acquire more programs on a mobile device is through the operating system's app (application) store, where you can find not only programs but movies and books as well. Today, the two big players in the mobile operating system market are Android and Apple's iOS. As previously mentioned, Android is the most widely used mobile operating system in the world, but iOS is also widely used and runs only on Apple products. Apple's iOS trades customization for a very smooth and polished experience, and Apple's App Store is much stricter about what it releases so that apps will not hinder the experience users have come to expect. Android is much more customization friendly, as it is based on Linux; many people have built customized versions of the operating system from the Android source. Android's app store is also less strict about what is posted, and there are many more apps available because of this.
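The short sketch referenced in item 6 follows. It is an illustrative Python snippet, assumed to be run on a Linux desktop; the SHELL and XDG_CURRENT_DESKTOP environment variables are common conventions rather than guarantees, so missing values simply print as "unknown".

    # Inspect a few of the Linux layers named in item 6 (kernel, shell,
    # desktop environment) from a running system. Assumes a Linux desktop;
    # on other systems some of these values will be absent.
    import os
    import platform

    print("Kernel:", platform.system(), platform.release())              # e.g. "Linux 6.1.0-..."
    print("Shell:", os.environ.get("SHELL", "unknown"))                  # the user's login shell
    print("Desktop:", os.environ.get("XDG_CURRENT_DESKTOP", "unknown"))  # desktop environment, if any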