Moore’s law observes that the number of transistors per square inch on integrated circuits has doubled roughly every two years since the integrated circuit was invented (Intel, 2014). The term was coined after Intel co-founder Gordon Moore, who made the observation in 1965 (Webopedia, 2014). This trend has allowed computers to shrink from machines that filled entire rooms to the devices we use today. The same trend can be seen across all parts of computer technology, including the networking side of computer hardware: where once we had simple peer-to-peer networks, then client-server networks, we now have cloud computing and virtualization. With virtualization, we can use computers that are no longer in the same room, or even the same building, and work on them as if they were sitting right next to the user, just like a normal computer.
What Is Desktop Virtualization?
Desktop virtualization takes the idea of running a computer and turns it into a service that delivers a desktop to users, even over great distances. This is done by “isolating a logical operating system instance from the client that is used to access the operating system” (Rouse & Madden, Desktop Virtualization, 2011). The basic idea is that the end user receives an instance of the operating system streamed to their office location. The user can then move around and work on that instance as if it were any other normal computer they connect to and work on in their office. Instead of sitting in front of or next to the user, however, the computer they are actually working on can be in another room in the building or perhaps...
.... (2013). Cloud Computing as Evolution of Distributed Computing - A Case Study for SlapOS Distributed Cloud Computing Platform. Informatica Economica, 17(4), 109-122.
The Virtualization Practice. (2014). Presentation Virtualization. Retrieved from The Virtualization Practice, Virtualization & Cloud Computing News, Resources, and Analysis: http://www.virtualizationpractice.com/topics/presentation-virtualization/
Tittel, E., & Lindros, K. (2013, December 16). Virtual Desktop Infrastructure Offers Risks and Rewards. Retrieved from CIO: http://www.cio.com/article/744687/Virtual_Desktop_Infrastructure_Offers_Risks_and_Rewards
Webopedia. (2014). Moore's Law. Retrieved from Webopedia: http://www.webopedia.com/TERM/M/Moores_Law.html
Webopedia. (2014, January 23). The 7 Layers of the OSI Model. Retrieved from Webopedia: http://www.webopedia.com/quick_ref/OSI_Layers.asp
At its core, cloud computing is a specialized form of grid and distributed computing that varies in terms of infrastructure, deployment, service, and geographic dispersion (Veeramachanenin, September 2015). The cloud enhances scalability, collaboration, and availability; adapts to fluctuations in demand; accelerates development work; and provides options for cost reduction through efficient and optimized computing (BH Kawljeet, June 2015). Cloud computing (CC) has recently emerged as a new paradigm for the delivery and hosting of services over the Internet. There are three main service delivery models. In Software as a Service (SaaS), the customer accesses hosted software instead of installing it on a local computer, typically through a web browser (e.g., web-enabled e-mail); the user only pays for use, while the cloud service provider is responsible for managing and controlling the underlying infrastructure. Companies that provide such services include Google, Microsoft, Salesforce, and Facebook. In Infrastructure as a Service (IaaS), the cloud provider provides only hardware resources such as network and virtualization...
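The division of responsibility among these delivery models can be summarized in a small sketch. This is a simplified, illustrative mapping rather than an authoritative taxonomy, and Platform as a Service (PaaS) is included here as the commonly cited third model:

```python
# Illustrative sketch: which layers the provider manages versus the customer
# in each cloud service delivery model. Simplified teaching aid, not a standard.
service_models = {
    "SaaS": {  # e.g., web-enabled e-mail: the customer just uses the application
        "provider_manages": ["application", "runtime", "OS", "virtualization", "hardware"],
        "customer_manages": ["data and usage"],
    },
    "PaaS": {  # the customer deploys applications onto a managed platform
        "provider_manages": ["runtime", "OS", "virtualization", "hardware"],
        "customer_manages": ["application", "data"],
    },
    "IaaS": {  # the provider supplies raw infrastructure such as network and virtualization
        "provider_manages": ["virtualization", "hardware", "network"],
        "customer_manages": ["application", "runtime", "OS", "data"],
    },
}

# Moving from SaaS toward IaaS, management burden shifts to the customer.
saas_burden = len(service_models["SaaS"]["customer_manages"])
iaas_burden = len(service_models["IaaS"]["customer_manages"])
```

The general pattern: in SaaS the customer manages the least and in IaaS the most, with PaaS in between.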
Virtual machines operate based on the computer architecture and functions of a real or hypothetical computer, and their implementations may involve specialized hardware, software, or a combination of both.
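As a toy illustration of a virtual machine implemented purely in software, the sketch below interprets a tiny stack-machine “architecture” that exists only as code. The opcode names and program format are invented for this example:

```python
def run_vm(program):
    """Execute a tiny bytecode program on a hypothetical software-defined stack machine."""
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])          # place a value on the stack
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)            # replace top two values with their sum
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)            # replace top two values with their product
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack

# A small program computing (2 + 3) * 4 on the virtual machine.
result = run_vm([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)])
```

The “machine” here has no physical counterpart at all: its instruction set, stack, and execution loop are defined entirely in software, which is the essence of a process-level virtual machine.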
Virtualization is a technology that creates an abstract version of a complete operating environment including a processor, memory, storage, network links, and a display entirely in software. Because the resulting runtime environment is completely software based, the software produces what’s called a virtual computer or a virtual machine (M.O., 2012). To simplify, virtualization is the process of running multiple virtual machines on a single physical machine. The virtual machines share the resources of one physical computer, and each virtual machine is its own environment.
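The resource-sharing idea can be sketched as a minimal model. The class and method names below are invented for illustration; real hypervisors (for example, those managed through libvirt) are far more involved:

```python
class Hypervisor:
    """Minimal sketch: one physical host partitioning CPU and memory among VMs."""

    def __init__(self, total_cpus, total_mem_gb):
        self.free_cpus = total_cpus
        self.free_mem = total_mem_gb
        self.vms = {}

    def create_vm(self, name, cpus, mem_gb):
        # A VM can only be created if the physical host has capacity left.
        if cpus > self.free_cpus or mem_gb > self.free_mem:
            raise RuntimeError("insufficient physical resources")
        self.free_cpus -= cpus
        self.free_mem -= mem_gb
        self.vms[name] = {"cpus": cpus, "mem_gb": mem_gb}

    def destroy_vm(self, name):
        # Destroying a VM returns its share to the physical pool.
        vm = self.vms.pop(name)
        self.free_cpus += vm["cpus"]
        self.free_mem += vm["mem_gb"]
```

For example, a host with 8 CPUs and 32 GB of memory could carry a 2-CPU web VM and a 4-CPU database VM side by side, each seeing only its own slice of the machine.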
It is about the ability to deliver any information to any device over any network. In short, it is about computers everywhere: computers embedded into equipment, machines, furniture, or even people, along with portable devices, wireless communication, and nomadic or ubiquitous computing systems.
Goldworm, B., & Skamarock, A. (2007). Blade Servers and Virtualization. Indiana: Wiley Publishing.
As we all know, virtualization is a requirement of the future. We have evolved from the traditional environment to the virtual environment, and we have grown accustomed to almost all things virtual, from virtual memory to virtual networks to virtual storage. The most widely leveraged benefit of virtualization technology is server consolidation, enabling one server to take on the workloads of multiple servers. For example, by consolidating a branch office’s print server, fax server, Exchange server, and web server on a single Windows server, businesses reduce the costs of hardware, maintenance, and staffing.
Virtualization technologies provide isolation of operating systems from hardware. This separation enables hardware resource sharing. With virtualization, a system pretends to be two or more of the same system [23]. Most modern operating systems contain a simplified system of virtualization. Each running process is able to act as if it is the only thing running. The CPUs and memory are virtualized. If a process tries to consume all of the CPU, a modern operating system will pre-empt it and allow others their fair share. Similarly, a running process typically has its own virtual address space that the operating system maps to physical memory to give the process the illusion that it is the only user of RAM.
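The pre-emption idea can be illustrated with a simulation of round-robin time slicing. This is a deliberately simplified model; real schedulers use priorities, multiple queues, and hardware timers:

```python
from collections import deque

def run_round_robin(tasks, quantum):
    """Simulate pre-emptive round-robin CPU scheduling.

    tasks: mapping of task name -> units of CPU work it needs.
    quantum: maximum units a task may run before being pre-empted.
    Returns the execution timeline as (task, units_run) slices.
    """
    queue = deque(tasks.items())
    timeline = []
    while queue:
        name, remaining = queue.popleft()
        burst = min(quantum, remaining)   # run until finished or pre-empted
        timeline.append((name, burst))
        remaining -= burst
        if remaining:                     # pre-empted: go to the back of the queue
            queue.append((name, remaining))
    return timeline
```

Running `run_round_robin({"A": 5, "B": 2}, 2)` interleaves A and B, so a long-running task cannot monopolize the (simulated) CPU while a short one waits.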
The fundamental idea behind a virtual machine is to abstract the hardware of a single computer into a self-contained operating environment that behaves as if it were a separate computer. Essentially, the virtual machine is software that executes an application and isolates it from the actual operating system and hardware. CPU scheduling and virtual-memory techniques are used so that an operating system can create the illusion that a process has its own processor with its own (virtual) memory. The virtual machine provides the ability to share the same hardware yet run several different operating systems concurrently, as shown in Figure 2-11.
During the boom of the microcomputer industry, around the 1980s, computers began to be deployed all around the world, in many cases with little or no care for operating requirements. As information technology operations grew in diversity, companies became conscious of the need to control information technology resources. Companies needed fast Internet connections and nonstop operation to deliver systems and establish a presence on the Internet. Many companies built large facilities, called Internet data centers, which provided businesses with a range of solutions for deploying and operating systems. Data centers for cloud computing are called cloud data centers, although the distinction between these terms has largely been abandoned and both are simply called “data centers.” Business and government institutions now scrutinize data centers to a higher degree in areas like security, availability, environmental impact, and adherence to requirements documents from accredited organizations such as the Telecommunications Industry Association. Well-known operational metri...
In the past few decades, one field of engineering in particular has stood out in terms of development and commercialisation: electronics and computation. In 1965, when Moore’s Law was first established (Gordon E. Moore, 1965: "Cramming More Components onto Integrated Circuits"), it stated that the number of transistors (the electronic component by which the processing and memory capabilities of a microchip are measured) would double every two years. This prediction held true even as we ushered in the new millennium. We have gone from computers that could perform one calculation per second to a supercomputer (the one at Oak Ridge National Laboratory) that can perform 1 quadrillion (10^15) mathematical calculations per second. Thus, it is only obvious that this field would also have s...
Today’s servers are designed to last for longer periods; however, keeping up with new data center demands requires constant hardware upgrades. The main reasons for undertaking a server upgrade include the need to extend the server’s life, to get the most out of existing servers, to make use of old data center hardware, and the implications of virtualization. Server virtualization is a critical networking function because it makes one server act as many servers, each with its own unique applications and Operatin...
ErrealMedia. (2010). Network Standards: OSI Reference Model; History of OSI Model; OSI Layers in Action. Retrieved from http://www.erealmedia.com/cms125/
My interest in computers dates back to the early days of my high school. The field of CS has always fascinated me, and choosing the CS stream was not a hasty decision. My interest started developing early in my life, when I studied the invention of computers. The transformation from room-sized machines to small palmtops enticed me to learn about the factors responsible for making computers, and electronic gadgets generally, so small. I was quite impressed when I saw a small chip for the first time in my school days, especially after I learnt that it contained more than 1,000 transistors in its “integrated circuits.”
Almost every device has some type of computer in it, whether it is a cell phone, a calculator, or a vending machine. Even things we take for granted, such as most cars built since the 1980s and pacemakers, contain computers. All of the advancements in computers and technology have led up to the 21st century, in which “the greatest advances in computer technology will occur,” mainly in areas such as “hardware, software, communications and networks, mobile and wireless connectivity, and robotics.”