Advancements in Si microprocessors due to changing device structure and design
1. Introduction
This paper deals with the technological and structural design changes that are taking the microprocessor to a far higher level of performance. We will see how SOI technology has revolutionized the way chips are made. Si microprocessors have made our lives far more sophisticated, and their performance has seen a thousand-fold increase since their invention. The focus nowadays is primarily on reducing heat generation, power consumption, and size while increasing output performance. The main concerns of designers are parallelism, reliability, and structural optimization, ideally working in synergy.
2. Evolution of Microprocessor
After Moore’s Law came into the picture in 1975, the case was made for continued wafer and die size growth, defect density reduction, and increased transistor density as technology scaled and manufacturing matured. Transistor count in leading microprocessors has doubled with each technology node, approximately every 2 years. Factors that drove up transistor count are increasingly complex processing cores, integration of multiple levels of caches, and inclusion of system functions. The frequency of a microprocessor has almost doubled every generation, the result of advanced circuit design and more efficient transistors. Die size has increased at 7% per year while feature size has shrunk by 30% every 2 to 3 years. Together, these fuel the transistor density growth predicted by Moore’s Law. Die size is limited by reticle size, power dissipation, and yield. Leading microprocessors typically have large die sizes that are reduced with more advanced process technology to improve frequency and yield. As feature size gets smaller, fi...
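The scaling rates described above compound over time, which a short sketch makes concrete. The starting values below are illustrative placeholders, not measured data; only the growth rates come from the text (doubling every 2 years, die area +7% per year, feature size −30% every ~2.5 years).

```python
# Illustrative sketch of the scaling trends described above.
# Base values are hypothetical; only the rates come from the text.

def transistors(year, base_year=1975, base_count=10_000):
    """Transistor count doubling every 2 years (Moore's Law)."""
    return base_count * 2 ** ((year - base_year) / 2)

def die_area(year, base_year=1975, base_mm2=30.0):
    """Die area growing roughly 7% per year."""
    return base_mm2 * 1.07 ** (year - base_year)

def feature_size(year, base_year=1975, base_um=3.0):
    """Feature size shrinking roughly 30% every ~2.5 years."""
    return base_um * 0.70 ** ((year - base_year) / 2.5)

for y in (1975, 1985, 1995):
    print(y, round(transistors(y)), round(die_area(y), 1),
          round(feature_size(y), 2))
```

Running the loop shows how a modest yearly die-area gain combines with feature shrink to yield the exponential density growth the text attributes to Moore's Law.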
... extent represented within the Intel Itanium processor: security, scalability, reliability, massive resources, parallelism, and a new memory model built on a sound microarchitectural foundation. Because of its efficiency, its small size, and its independence from out-of-order logic, the latest-generation Itanium processor delivers high performance without thermal problems. This quality makes the Itanium a simple but efficient and refined engine that supports consistent long-term improvement in code execution through small advances in software, ultimately reducing the need for significant new improvements in hardware. Microprocessor hardware advancement is becoming more and more difficult as time progresses, and even Moore believes that the exponential upward curve in microprocessor hardware will not continue forever.
For over thirty years, since the beginning of the computing age, Gordon Moore's prediction that the number of transistors on a chip doubles every eighteen months has held true (Leyden). However, by its very nature this trend cannot continue indefinitely. Although the size of the transistor has decreased drastically in the past fifty years, it cannot get much smaller, and therefore a computer cannot get much faster. The limits of the transistor are becoming more and more apparent in the processor speeds of Intel and AMD silicon chips (Moore's Law). One reason that chip speeds are now slower than they could be is the computer's internal clock. The clock organizes all of the operation processing and the memory speeds so that information arrives at the same time and the processor completes its tasks uniformly. The faster a chip runs (MHz), the faster this clock must tick. In a 1.0 GHz chip, the clock ticks a billion times a second (Ball). This becomes wasted energy, and the internal clock limits the processor. These two problems in modern computing will lead to the eventual breakdown of Moore's Law.

But are there any new areas of chip design engineering besides the conventional silicon chip? In fact, two such designs that could revolutionize the computer industry are multi-threading (Copeland) and asynchronous chip design (Old Tricks). The modern silicon processor cannot keep up with the demands that are placed on it today. With the limit of transistor size approaching and the clock-speed bottleneck growing, these two new chip designs could completely replace the old computer industry and recreate it anew.
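The multi-threading idea mentioned above can be sketched with Python's standard library: a workload is split into chunks and the chunks run on separate threads. The function and variable names are illustrative, not from any real processor API.

```python
# Minimal sketch of multi-threading: divide a workload into chunks and
# run them concurrently. Names here are illustrative only.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Work unit: sum one slice of the data."""
    return sum(chunk)

data = list(range(1_000_000))
chunks = [data[i::4] for i in range(4)]  # four interleaved slices

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # same answer as the sequential sum(data)
```

Note that in CPython the global interpreter lock limits the speedup for pure-Python arithmetic; the sketch shows the structure of thread-level parallelism, not a benchmark.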
The ubiquity of silicon chips has been driven primarily by the breadth and rate of innovation in the semiconductor field over the past fifty years. Every realm of human life has benefited from these advancements, and it is gratifying to know that each of us in the field has played a part, however infinitesimal. However, disruptive innovation in the next ten years will surpass all that we have accomplished in the last fifty.
Not long ago, computers were non-existent in many homes. When computers were first introduced to the world, they were for the sole purpose of performing business functions, and the only people who owned them were large organizations. Eventually, computers were introduced into the homes of those who could afford to buy them. Today, just about everyone owns some form of system that they use daily to help manage their day-to-day operations. What many once survived without now seems impossible to do without. As technology continues to grow, it has a greater effect on families and the education system. Companies such as Microsoft and Apple made it possible to reinvent technology in forms that would change the world. Each company had its struggles and over time had to keep up with the changes of the era and the way people communicated. From the first day of its invention, each organization has had to steadily implement new operating systems to keep up with the demands of the people while staying afloat against competitors. The ways of life for many have changed, as has the way people communicate. It is evident that the history and uses of computers have changed the world, but these computers could not perform without their operating systems. Various operating systems will be discussed: how they began and how each has changed since it was first introduced. Although they all shared a purpose, each varied in how it performed, and each changed the lives of many and will continue to do so in the near future.
Throughout its history, Intel has centered its strategy on the tenets of technological leadership and innovation (Burgelman, 1994). Intel established its reputation for taking calculated risks early on, in 1969, by pioneering the metal-oxide semiconductor (MOS) processing technology. This new process technology enabled Intel to increase the number of circuits while simultaneously reducing the cost per bit tenfold. In 1970, Intel once again led the way with the introduction of the world’s first DRAM. While other companies had designed functioning DRAMs, they had failed to develop a process technology that would allow manufacturing of the devices to be commercially viable. By 1972, unit sales for the 1103, Intel’s original DRAM, had accounted for over 90% of the company’s $23.4 million revenue (Cogan & Burgelman, 2004).
“Which is better, AMD or Intel?” is a question that is constantly debated among people involved with computers. There are many reasons to choose one side over another, as both do have their advantages and disadvantages. Intel and AMD are the most prevalent processor production companies, which in turn creates competition between the two. This question is a by-product of that competition. Only by knowing each company and what their product has to offer, can a person make a decision as to what to buy to suit their needs.
“After the integrated circuit, the only place to go was down—in size, that is. Large-scale integration (LSI) could fit hundreds of components onto one chip. By the 1980s, very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a chip. Ultra-large-scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of ...
The fourth-generation quad-core Intel Core i7 processor is notable for its strong performance and visuals. It lets the user see and feel high definition and 3D, and leaves room for multitasking and media. Its speeds are designed for smooth, seamless games, photos, and movies. The chip has a transistor count of 1.4 billion and a die size of 177 square millimeters. Additionally, it has integrated processor graphics and dual-channel DDR3 support of up to 1600 MHz (Williams & Sawyer, 2010).
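The two figures quoted above imply a transistor density that is easy to check with a line of arithmetic:

```python
# Transistor density implied by the figures above.
transistor_count = 1.4e9   # 1.4 billion transistors
die_area_mm2 = 177         # 177 square millimeters

density = transistor_count / die_area_mm2
print(f"{density / 1e6:.1f} million transistors per mm^2")
```

This works out to roughly 7.9 million transistors per square millimeter.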
The debate over whether the RISC architecture or the CISC architecture is better has been going on for several years. Whether the RISC design, with its small but efficient instruction set, or the CISC design, with its large and easy-to-use instruction set, is superior has been hard to determine. In a time when new chips are released almost monthly, companies want to make sure they have the edge over the competition. They want their chips designed with speed in mind. Many chips have used either the Reduced Instruction Set Computer or the Complex Instruction Set Computer approach since the beginning of the computer era, but whether one is better has never been a clear-cut issue. Each has strengths and weaknesses. We are going to discuss the benefits and drawbacks of each architecture and determine which is the better design.
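One way to make the trade-off concrete is the classic CPU-time equation, time = instruction count × cycles per instruction × clock period. The numbers below are hypothetical, chosen only to show how a RISC design can win on cycles per instruction even while losing on instruction count:

```python
# CPU time = instruction count x CPI x clock period.
# All numbers are hypothetical, for illustration only.

def cpu_time(instructions, cpi, clock_hz):
    """Seconds to run a program on a given core."""
    return instructions * cpi / clock_hz

# A CISC program: fewer, more complex instructions, higher CPI.
cisc = cpu_time(instructions=1_000_000, cpi=4.0, clock_hz=100e6)

# The same program compiled for RISC: more instructions, lower CPI.
risc = cpu_time(instructions=1_500_000, cpi=1.2, clock_hz=100e6)

print(f"CISC: {cisc * 1e3:.1f} ms, RISC: {risc * 1e3:.1f} ms")
```

With these placeholder numbers the RISC version finishes first, but flipping the CPI or instruction-count assumptions flips the result, which is exactly why the debate has never been clear-cut.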
does not catch up with hardware at this time. Designing programs for multi-processor computers is still
In the past few decades, one field of engineering in particular has stood out in terms of development and commercialisation: electronics and computation. In 1965, when Moore’s Law was first established (Gordon E. Moore, 1965: "Cramming more components onto integrated circuits"), it stated that the number of transistors (the electronic component by which the processing and memory capability of a microchip is measured) would double every 2 years. This prediction held true even as man ushered in the new millennium. We have gone from computers that could perform one calculation per second to a supercomputer (the one at Oak Ridge National Laboratory) that can perform 1 quadrillion (10^15) mathematical calculations per second. Thus, it is only obvious that this field would also have s...
Since the invention of the transistor in 1947, the interconnect has proven to be a critical component in the design and manufacture of integrated circuits. Various metals and manufacturing techniques have been employed, from pure aluminum to tungsten plugs to the metal used in the high-volume manufacture of integrated circuits today: copper.
A processor is the chip inside a computer that carries out the functions of the computer at various speeds. There are many processors on the market today. The two best-known companies that make processors are Intel and AMD. Intel produces the Pentium chip, the most recent version being the Pentium 3. Intel also produces the Celeron processor (Intel processors). AMD produces the Athlon processor and the Duron processor (AMD presents).
CPU stands for "Central Processing Unit." The CPU is the primary component of a computer that processes instructions. It runs the operating system and applications, constantly receiving input from the user or from active software.
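The "processes instructions" part can be sketched as a toy fetch-decode-execute loop. The three-instruction machine below is invented purely for illustration; it is not any real instruction set.

```python
# Toy fetch-decode-execute loop. The tiny "ISA" here is invented
# purely to illustrate how a CPU steps through a program.
def run(program):
    acc, pc = 0, 0                 # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]      # fetch
        if op == "LOAD":           # decode + execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)
        pc += 1                    # advance to the next instruction
    return acc

run([("LOAD", 5), ("ADD", 7), ("PRINT", None)])  # prints 12
```

A real CPU does the same fetch-decode-execute cycle in hardware, billions of times per second, with pipelining and caches layered on top.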
Computer architecture covers the design of system software, such as the operating system (the program that controls the computer), as well as the combination of hardware and basic software that links the machines on a computer network. Computer architecture refers to an entire structure and to the details needed to make it functional. Thus, it covers computer systems, microprocessors, circuits, and system programs. Typically the term does not refer to application programs, such as spreadsheets or word processors, which are required to perform a task but not to make the system run.