Research on the history of microprocessors
Introduction
As a Computer Science student in Computer Engineering/Computer Science 285, I have grown to appreciate working with a prototype board throughout the class. As a person more inclined toward software than hardware, I found the sixteen-week journey to be a gradual process. Part of my appreciation comes from my curiosity about the equipment used in this class: learning not only about the 8051 microcontroller and its board, but also about the other components involved, the potential the board had, and how to work with it.
Microcontroller vs. Microprocessor
The first thing to establish is what the 8051 is. The 8051 is a microcontroller: a single chip that integrates a fixed amount of RAM, ROM, and input/output ports. Microcontrollers are more cost-efficient than microprocessors, which require separate RAM and ROM chosen according to the needs of the user. Microcontrollers come in handy in embedded systems, where they can control specific actions of a device efficiently in terms of both processing and cost.
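To make the idea of on-chip I/O concrete, here is a minimal sketch of the kind of program one might run on such a board. It assumes an SDCC-style 8051 C toolchain, where the special-function register P1 is declared in the <8051.h> header; the port choice and delay length are illustrative placeholders, not details taken from the course.

#include <8051.h>               /* SDCC header declaring 8051 SFRs such as P1 */

/* Crude software delay; the loop count is an arbitrary placeholder,
   not a calibrated timing value. */
static void delay(unsigned int count)
{
    volatile unsigned int i;    /* volatile so the empty loop is not optimized away */
    for (i = 0; i < count; i++)
        ;                       /* burn cycles */
}

void main(void)
{
    while (1)                   /* embedded programs typically never return */
    {
        P1 = 0xFF;              /* drive all eight pins of port 1 high */
        delay(30000);
        P1 = 0x00;              /* drive them low again */
        delay(30000);
    }
}

Each write to P1 shows up directly on the port pins, which is exactly the code-to-hardware link the rest of this essay describes.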
8051 History
Like most technology, there is a constant push to make hardware more efficient, and the story of the 8051 microcontroller is no exception. In 1980, Intel introduced the 8051 as a successor to the 8048. It was not the most advanced piece of equipment, but it was efficient, it worked at the time, and it was successful: an 8-bit core, multiple I/O ports, and 128 bytes of on-chip RAM made it a standard part to design with.
From here comes the real success of the 8051 microcontroller. Intel decided it was not wise to keep the hardware strictly to itself and allowed other companies to make their own versions of the 8051: different boards and different ...
... middle of paper ...
...s fastest (and "bestest") 8051-compatible core."
Conclusion
8051 microcontrollers were an important milestone in technology and are still useful to this day. Computers and other devices still run on the 8051, and it is still used in classrooms today. Whether the microcontroller is dated or obsolete is always up for debate. In my opinion, the 8051 has proved itself to be a useful piece of hardware for teaching engineers the relationship between code and hardware.
Much like the review above, being able to see hardware with a direct correlation to software has given me more appreciation for it. Especially while building our final project, seeing the effect that a single line of code has on the entire output showed me that hardware and software go hand in hand, and that this class being titled CECS, Computer Engineering/Computer Science, is fitting.