Past computer technology
Computers in the 1950’s
People have been in awe of computers since they were first invented. At first, scientists said that computers would be for government use only. “Then, when the scientists saw the potential computers had, they predicted that by 1990 computers might one day invade the home of just about every citizen in the world” (“History” Internet). The scientists were slightly wrong, because by 1990 computers were only just beginning to catch on. A few years later, when scientists went to major corporations to get help with a special project, the corporations said no, believing that computers would be just a fad and that they wouldn’t make much money off of them. “By definition the abacus is the first computer (the proper definition of a computer is one who or that which computes) ever invented” (Internet).
The subject of this term paper is computers in the 1950’s. The divisions that will be covered are: the types of computers there were, the memory capacity of computers, the programming languages of that time, and the uses of the computers of that time. Information will be gathered from the Internet, from books, from magazines, and from the encyclopedia.
Ali 2
In the fifties, computers were in the experimental stage. They were extremely hard to work with and were a technician’s worst nightmare, because often enough the fuses had to be replaced (see Appendix A).
The memory capacity of that time was rather limited. “There were not many external drives; the only external drives of that time were I/O cards, I cards, and O cards” (“Whirlwind” Internet). “Computers of that time were capable of a multitude of small tasks, like data processing (i.e., IRS-related material and information storage), word processing (i.e., an extremely early model of Microsoft Word), data analysis (i.e., survey taking), complex calculations (i.e., weather prediction), and communications (i.e., the telephone system (switching))” (“Computers” Internet).
The lack of an internal drive, even a 1 KB drive, proved to be very hard on the computers of the 1950’s (“Hackers” Internet). “Without hard drives, programmers had to leave everything running all the time, or print what they typed for the day and retype it the next day, or save it to an I/O card” (“Computers” Internet). All three of these choices posed a problem for programmers of that time; to begin with, leaving the computer on would cost a great deal, as the machines required a lot of money to maintain while they were running.
In 500 B.C. the abacus was first used by the Babylonians as an aid to simple arithmetic. In 1623 Wilhelm Schickard (1592 - 1635) invented a "Calculating Clock". This mechanical machine could add and subtract up to 6-digit numbers, and warned of an overflow by ringing a bell. In 1786, J. H. Mueller came up with the idea of the "difference engine," a calculator that could tabulate values of a polynomial. Mueller's attempt to raise funds failed and the project was forgotten. Scheutz and his son Edward produced a third-order difference engine with a printer in 1843, and their government agreed to fund their next project.
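The "difference engine" idea rests on the method of finite differences: once the first few differences of a polynomial are worked out by hand, every further table entry follows by addition alone, which is what the engine's gear columns mechanized. A minimal Python sketch of that method (the function names are my own, for illustration):

```python
# Sketch of the "method of differences" a difference engine mechanized:
# after the initial differences are set up, every further value of the
# polynomial is produced by addition alone -- no multiplication needed.

def initial_differences(poly, degree):
    """Evaluate poly at 0..degree and return the leading forward differences."""
    row = [poly(x) for x in range(degree + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        # Each new row holds the differences of the previous one.
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(poly, degree, count):
    """Tabulate `count` values of a degree-`degree` polynomial using only addition."""
    diffs = initial_differences(poly, degree)
    table = []
    for _ in range(count):
        table.append(diffs[0])
        # Propagate each difference upward, as the engine's columns did each turn.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: tabulating x^2 + x + 1, the kind of table such an engine printed.
print(tabulate(lambda x: x * x + x + 1, 2, 5))  # [1, 3, 7, 13, 21]
```

The key point is the inner loop: after setup, the machine only ever adds one column into the next, which is easy to build out of gears.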
to replace the IBM machine. In the 1960s and the 1970s IBM came out quickly and built a
There is no doubt that computers are firmly implanted in our nation’s daily existence. Every day I use a computer at least once, whether writing a paper, surfing the Internet, or checking my e-mail. On a recent trip to the library to do research, I was quickly escorted by a librarian to the variety of computer databases, which are the fastest and most current sources of information. I found a survey in Statistical Abstracts of the United States comparing students’ use of computers in kindergarten through college from 1984 to 1993. A total of 27.3% of students used computers in schools in 1984, while a total of 59.0% used computers in 1993. This shows a steady rise in scholastic computer use, and these figures are probably greatly inflated by now in 1997.
Prior to the revolution in technology that was microprocessors, making a computer was a large task for any manufacturer. Computers used to be built solely from discrete, or individual, transistors soldered together; depending on how powerful the machine was intended to be, this could take weeks or even months with individual components. The microprocessor now acts as the brain of a computer, doing all the mathematics. The older, laborious approach put the cost of a computer beyond the reach of any regular person. Computers before lithographic technology were massive and were mostly used in lab scenarios (Brain 1).
The First Generation of Computers
The first generation of computers, beginning around the end of World War 2 and continuing until around the year 1957, included computers that used vacuum tubes, drum memories, and programming in machine code. Computers at that time were mammoth machines that did not have the power of our present-day desktop microcomputers. In 1950, the first real-time, interactive computer was completed by a design team at MIT. The "Whirlwind Computer," as it was called, was a revamped U.S. Navy project for developing an aircraft simulator.
The first computer, known as the abacus, was made of wood and parallel wires on which beads were strung. Arithmetic operations were performed by moving the beads along the wires according to “programming” rules that had to be memorized by the user (Soma, 14). The second earliest computer, invented by Blaise Pascal in 1642, was a “digital calculating machine.” Pascal designed this first known digital computer to help his father, who was a tax collector. Pascal’s computer could only add numbers, and they had to be entered by turning dials (Soma, 32). It required a manual process like its ancestor, the abacus. Automation was introduced in the early 1800’s by a mathematics professor named Charles Babbage. He created an automatic calculation machine that was steam powered and stored up to 1000 50-digit numbers. Unlike its two earliest ancestors, Babbage’s invention was able to perform various operations. It relied on cards with holes punched in them, called “punch cards,” which carried out the programming and storing operations for the machine. Unfortunately, Babbage’s creation flopped due to the lack of mechanical precision and the lack of demand for the product (Soma, 46); the machine could not operate efficiently because the technology of the time was not adequate. Computer interest dwindled for many years, and it wasn’t until the mid-1800’s that people became interested in computers once again.
Towards the end of the 1970's, microcomputers began to appear on the market. Machines like the Apple II, Tandy TRS-80, Commodore PET, and BBC Model B began to gain popularity. The Education Department had purchased some computers, which were loaned to schools for short periods of time. The beginning of the 1980's saw the first awakenings that computers might well have a place within schools themselves. Suddenly there was available a relatively low-cost, small, yet powerful computer which did not need the programming skills demanded by earlier computers.
Thousands of years ago, calculations were done using people’s fingers and pebbles that were found just lying around. Technology has transformed so much that today the most complicated computations are done within seconds. Human dependency on computers is increasing every day. Just think how hard it would be to live a week without a computer. We owe the advancements of computers and other such electronic devices to the intelligence of men of the past.
...EDVAC (Electronic Discrete Variable Automatic Computer), both faster than the ENIAC, began to share the Computing Laboratory's workload with the ENIAC in 1953. It became noticeable almost immediately that the ENIAC would have to be modified if it were to remain competitive, economical, and efficient. Even with these transformations, and the fact that trouble-free operating time remained at about 100 hours a week during the last 6 years of the ENIAC’s operation, its operating costs were far higher than those of the EDVAC and ORDVAC. The ENIAC was no longer competitive economically. The workload gradually shifted to the other machines, and at 11:45 p.m. on October 2, 1955, the power to the ENIAC was cut off. Although the ENIAC’s purpose was over, it still played a major role in the development of the computer industry. “Its death was a natural one; it had served its purpose.”
The history of computing goes as far back as the 1600s. However, computers didn’t start to look like the ones we know today until the late 1900s. At first, computers were too big and too expensive for personal use. They were only used by businesses and the elite part of society. However, computer manufacturers like Apple and IBM began to refine and upgrade computers until they became practical for personal use. Today’s computers are much more advanced than their predecessors. Nowadays, we use computers for everything from grocery shopping to doing homework. “Tom Forester and Perry Morrison point out that Computers are the core technology of our times. They are the new paradigm, the new ‘common sense.’ In the comparatively short space of forty years, computers have become central to the operations of industrial societies. Without computers and computer networks, much of manufacturing industry, commerce, transport and distribution, government, the military, health services, education, and research would simply grind to a halt.” (1)
...nothing like what computers are today, it still started the ball rolling for the invention of many practical and useful computers.
In 1937 the electronic computer was born. Computers were used in 1943 to break “the unbreakable” German Enigma codes. In 1951 the computer was introduced commercially. However, it wasn’t until around 1976, when the Apple II was introduced, that it was immediately adopted by high schools, colleges, and homes. This was the first time that people from all over really had an opportunity to use a computer. Since that time, microprocessing chips have been made, the World Wide Web has been invented, and by 1996 more than one out of every three people had a computer in their home, and two out of every three had one at the office.
The next major improvement was the memory of a computer, which includes the hard drive and the RAM. Hard drives and RAM existed in the same generation as the 386, but these two components did not hold much then. The hard drive was about 100 to 300 megabytes and the RAM was about 4 megabytes.