The History of Microprocessors
If there is one piece of technology in this world today that has been through thousands of revolutions and evolutions in the past several decades, it is the computer. At the heart of every computer is the microprocessor, mounted on the motherboard, which functions as the machine's nucleus or brain. The microprocessor has evolved enormously from Intel's introduction of the 4004 in 1971 to the present Pentium III class processors, and even today the speed, complexity, versatility, and efficiency of processors are improving at a lightning-fast pace.
Microprocessors serve as the brain of the computer: every cycle of data, amounting to trillions of numbers crunched at extremely high speed, is calculated inside them. The rate at which these calculations are resolved is measured in hertz (Hz), one cycle of data per second. The processor is mounted on the motherboard, which connects it to every other component of the computer, including RAM (Random Access Memory), the hard drive, and other storage devices.
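The hertz arithmetic described above can be made concrete with a small sketch (illustrative only; the `cycles_per_second` helper is invented for this example, and the 750 kHz figure for the 4004 comes from this essay):

```python
# Clock speed is measured in cycles of data per second: 1 Hz = 1 cycle/s.
KILO, MEGA, GIGA = 10**3, 10**6, 10**9

def cycles_per_second(value, unit):
    """Convert a clock rating such as (750, 'kHz') into raw cycles per second."""
    scale = {"Hz": 1, "kHz": KILO, "MHz": MEGA, "GHz": GIGA}
    return value * scale[unit]

# The Intel 4004 ran at 750 kHz; a 1 GHz chip completes more than a
# thousand times as many cycles each second.
i4004 = cycles_per_second(750, "kHz")   # 750,000 cycles/s
modern = cycles_per_second(1, "GHz")    # 1,000,000,000 cycles/s
print(round(modern / i4004))            # → 1333
```

The same conversion explains the later speed figures in this piece: 6 MHz is 6,000,000 cycles per second, and 1 GHz is 1,000 MHz.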
The first microprocessor was introduced by Intel in 1971. Called the 4004, it computed at a speed of 750 kilohertz. Intel's goal was to boost its speed to 1 megahertz (MHz), or 1,000,000 cycles of data per second. This was not accomplished for quite some time, since consumers rarely owned, let alone knew how to operate, a computer. Over the next ten years, Intel created several upgraded versions of the 4004 that slowly became faster and faster.
Personal computers first became widely commercialized when Intel created the 80286, also known as the i286, in 1982, a year after Microsoft released its revolutionary operating system, MS-DOS 1.0. This new processor could run at speeds of 6 to 8 MHz, which was revolutionary in the world of microprocessors.
Not only was the speed revolutionary, but it also had the capability of multitasking, meaning that it could calculate data for several applications at once. Before the 286, multitasking was possible only in the most advanced processors at very slow speeds.
By the late 80's, Intel's technology had advanced significantly and the company produced the i386 class microprocessor. Introduced in 1985, it was eventually clocked at a blazing fast speed of 25 MHz. As processors became more advanced, motherboards with the size and power to house them advanced as well.
Processor (CPU) – The processor, also known as the Central Processing Unit, runs the operating system and other applications. It constantly receives data from the user or from other active software, processes it, and produces an output that is either displayed on screen or stored by an application.
“In 1946, John Mauchly and J Presper Eckert developed the fastest computer at that time, the ENIAC I. It was built under the assistance of the US army, and it was used on military researches. The ENIAC I contained 17468 vacuum tubes, along with 70000 resistors, 10000 capacitors, 1500 relays, 6000 manual switches and 5 million soldered joints. It covered 1800 square feet of floor space, weighed 3 tons, consumed 160 kilowatts of electrical power.”(Bellis, Inventors of Modern Computer)
Intel is a multinational semiconductor chip maker with main headquarters in Santa Clara, California. Founded on July 18, 1968, it is the world’s largest and highest-valued semiconductor chip manufacturer (based on revenue) and the inventor of the x86 series of processors. It was founded by two men, Gordon E. Moore and Robert Noyce; the duo came from the Fairchild Semiconductor company. Intel’s first product after its founding was the 3101 Schottky TTL bipolar 64-bit static random-access memory, which was nearly twice as fast as earlier iterations from Fairchild and other competing companies. In the very same year, 1969, Intel also manufactured the 3301 Schottky bipolar 1024-bit read-only memory and the first publicly available metal–oxide–semiconductor field-effect-transistor silicon-gate SRAM chip, the 256-bit 1101.
In 1970, Intel entered the microprocessor business in collaboration with Busicom, a Japanese firm. During the late 1970s, Apple turned to Motorola for microprocessor purchases over Intel, who had sim...
to replace the IBM machine. In the 1960s and the 1970s IBM came out quickly and built a
There are many different beginnings to the origins of computers; depending on what a person means by "computer," their origins could be dated back more than two thousand years. The most primitive computers were created to run simple programs at best. (Daves Old Computers) The first 'digital' computer, however, was created for binary arithmetic, along with regenerative memory, parallel processing, and the separation of memory and computing functions. Built by John Vincent Atanasoff and Clifford Berry between 1937 and 1942, it was dubbed the Atanasoff-Berry Computer (ABC).
In 1985 the company produced (in China) the first computer of its own design (the "Turbo PC").
Many encyclopaedias and other reference works state that the first large-scale automatic digital computer was the Harvard Mark I, which was developed by Howard H. Aiken (and team) in America between 1939 and 1944. However, in the aftermath of World War II it was discovered that a program-controlled computer called the Z3 had been completed in Germany in 1941, which means that the Z3 pre-dated the Harvard Mark I. Prof. Horst Zuse (http://www.epemag.com/zuse/)
Processor speeds are measured in megahertz (MHz) and now reach 1000 MHz (1 GHz), which is very fast; this is almost ten times the speed of most home computers, which average 133 MHz to 166 MHz. Intel and AMD have been in a race to break the 1 GHz barrier, and the number of megahertz in the newest processors is not as significant as it was in earlier processors. For example, the difference between a 133 MHz processor and a 166 MHz processor is
Computers are very complex and have many different uses, which makes for a very complex system of parts working together to do what the user wants. The purpose of this paper is to explain a few main components of the computer: the system unit, the motherboard, the central processing unit, and memory. Many people are not familiar with these terms and their meaning, and the components are commonly mistaken for one another.
We have the microprocessor to thank for all of our consumer electronic devices, because without it, our devices would be much larger. Microprocessors are the feat of generations of research and development. Invented by Intel in 1971, they made it possible for computers to shrink to the sizes we know today. Before, a computer took up an entire room because its transistors or vacuum tubes were individual components. Microprocessors unified the technology on one chip while reducing costs. Microprocessor technology has been the most important revolution in the computer industry of the past forty years, as microprocessors have allowed our consumer electronics to exist.
Its primary role is to process data with speed once it has received instructions. A microprocessor is generally advertised by its speed in gigahertz. Some of the most popular chips are known as the Pentium or the Intel Core. When purchasing a computer, the microprocessor is one of the main essentials to review, since the faster the microprocessor, the faster your data will be processed as you navigate through the software.
In 1953 it was estimated that there were 100 computers in the world. Computers built between 1959 and 1964 are often regarded as the "second generation" of computers, based on transistors and printed circuits, resulting in much smaller machines. In 1964 IBM released the programming language PL/1 and launched the IBM 360, the first series of compatible computers. In 1970 Intel introduced the first RAM chip. In 1975 the IBM 5100 was released. In 1976 Apple Computer Inc. was founded to market the Apple I computer, designed by Stephen Wozniak and Steve Jobs. In 1979 the first compact disc was released, and around 1981 IBM announced the PC, whose standard model sold for $2,880.00.
In designing a computer system, architects consider five major elements that make up the system's hardware: the arithmetic/logic unit, control unit, memory, input, and output. The arithmetic/logic unit performs arithmetic and compares numerical values. The control unit directs the operation of the computer by taking the user's instructions and transforming them into electrical signals that the computer's circuitry can understand. The combination of the arithmetic/logic unit and the control unit is called the central processing unit (CPU). The memory stores instructions and data, while input and output devices move data into and out of the system.
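The division of labor among these five elements can be sketched with a toy machine (a hypothetical example only; the instruction names and the accumulator design here are invented for illustration, not taken from any real architecture):

```python
# A toy machine illustrating the five elements: memory holds instructions
# and data, the control unit fetches and decodes instructions, and the
# arithmetic/logic unit does the arithmetic and comparisons. Input is the
# initial memory image; output is what STORE leaves behind.

def alu(op, a, b):
    """Arithmetic/logic unit: performs arithmetic and compares values."""
    return {"ADD": a + b, "SUB": a - b, "CMP": int(a > b)}[op]

def run(program, data):
    """Control unit: steps through instructions and drives the ALU."""
    acc = 0  # accumulator register inside the CPU
    for op, addr in program:          # fetch the next instruction
        if op == "LOAD":              # decode and execute it
            acc = data[addr]
        elif op == "STORE":
            data[addr] = acc
        else:                         # arithmetic is delegated to the ALU
            acc = alu(op, acc, data[addr])
    return data

# Compute data[2] = data[0] + data[1]
memory = {0: 2, 1: 3, 2: 0}
print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], memory)[2])  # → 5
```

Note how the ALU and the control unit together form the CPU of this sketch, exactly as the paragraph above defines it, while the `data` dictionary plays the role of memory.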
The First Generation of Computers The first generation of computers, beginning around the end of World War 2, and continuing until around the year 1957, included computers that used vacuum tubes, drum memories, and programming in machine code. Computers at that time where mammoth machines that did not have the power our present day desktop microcomputers. In 1950, the first real-time, interactive computer was completed by a design team at MIT. The "Whirlwind Computer," as it was called, was a revamped U.S. Navy project for developing an aircraft simulator.