Chapter 1
Introduction
Grid computing is an extension of parallel and distributed computing and an emerging environment for solving large-scale, complex problems. It enables the sharing, coordination and aggregation of computational machines to fulfill user demands. The computational grid is an innovative technology for succeeding generations: a collection of geographically distributed machines owned by different organizations, which together form a heterogeneous high-performance computing environment.
Task scheduling and machine management are the essential components of a computational grid. Because resources are so widespread, such systems are highly prone to errors and failures; fault tolerance therefore plays a key role in the grid to avoid this problem.
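The fault-tolerance idea described above can be sketched in a few lines of code. This is a minimal illustration, not the scheduler the chapter describes: the `Machine` class, the `healthy` flag and the retry policy are all assumptions made for the example.

```python
class Machine:
    """Toy compute node; healthy=False simulates a faulty grid resource."""
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def run(self, task):
        if not self.healthy:
            raise RuntimeError(f"{self.name} failed while running {task}")
        return f"{task} completed on {self.name}"


def schedule_with_retry(task, machines):
    """Fault-tolerant dispatch: resubmit the task to the next machine
    whenever the current one fails."""
    for m in machines:
        try:
            return m.run(task)
        except RuntimeError:
            continue  # tolerate the fault and try the next machine
    raise RuntimeError(f"{task} failed on every machine")


grid = [Machine("node1", healthy=False), Machine("node2")]
result = schedule_with_retry("job-42", grid)
```

Here the job submitted to the failed `node1` is transparently re-run on `node2`, which is the essence of fault-tolerant scheduling in a grid.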
• Data Grid: Data grids primarily deal with providing services and infrastructure for distributed data-intensive applications that need to access, transfer and modify massive datasets stored in distributed storage resources [4].
1.1.3 Basic grid model:
The basic grid model is generally composed of a number of hosts, each with several computational resources, which may be homogeneous or heterogeneous. The four basic building blocks of the grid model are the user, the resource broker, the grid information service (GIS) and, lastly, the resources.
Figure 1.2: Basic Grid Model [5]
When a user requires high-speed execution, the job is submitted to the broker in the grid.
1. Users: The user submits the jobs to be executed on processors in the computational grid.
2. Resource Broker: Users typically do not interact with Grid services directly. The resource broker discovers computing resources with the help of the information system and provides the jobs with suitable resources for their computation. To find an appropriate resource for a job, the broker contacts the grid information service, which keeps the status of all resources currently available in the grid.
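The broker/GIS interaction just described can be sketched as follows. This is a simplified illustration under assumed names: the classes, the "free CPUs" status metric and the first-match policy are inventions for the example, not the interfaces of any real grid middleware.

```python
class GridInformationService:
    """Toy GIS: keeps the status of currently available resources."""
    def __init__(self):
        self.resources = {}  # resource name -> number of free CPUs

    def register(self, name, free_cpus):
        self.resources[name] = free_cpus

    def query(self, cpus_needed):
        # Return the resources that can satisfy the request right now.
        return [n for n, free in self.resources.items() if free >= cpus_needed]


class ResourceBroker:
    """Toy broker: matches a submitted job to a resource via the GIS."""
    def __init__(self, gis):
        self.gis = gis

    def submit(self, job_name, cpus_needed):
        candidates = self.gis.query(cpus_needed)
        if not candidates:
            raise RuntimeError(f"no resource can currently run {job_name}")
        chosen = candidates[0]  # simplest possible policy: first match
        self.gis.resources[chosen] -= cpus_needed  # update resource status
        return chosen


gis = GridInformationService()
gis.register("clusterA", 4)
gis.register("clusterB", 16)
broker = ResourceBroker(gis)
placement = broker.submit("job1", cpus_needed=8)
```

The user only talks to the broker; the broker consults the GIS for resource status and returns a placement, mirroring the flow in Figure 1.2.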
Grid was originally conceived and designed in this community to allow access to computing resources that were geographically dispersed. The notion was that underutilized resources in places other than where the researchers were physically located could be used. Also fundamental in the formative thinking was the prospect of sharing access to data, typically in the form of files that were being jointly produced and used by collaborators in disparate locations.
Before discussing Grids further, let us go back to the birth of distributed computing:
In the early 1970s, when computers were first linked by networks, the idea of harnessing unused CPU cycles was born. A few early experiments with distributed computing, including a pair of programs called Creeper and Reaper, ran on the Internet's predecessor, the ARPANET.
The project will bring several changes to the company. First, it will expand the current physical IT environment, increasing storage capacity to cover current requirements and expected data growth while establishing a new data warehouse along with business-analytics applications and user interfaces. The project will also improve security by establishing security policies, and it will leverage newer cloud-based technology to provide a highly redundant, flexible and scalable IT environment while also allowing a low-cost disaster-recovery site to be established.
... and so with more than one grid, the cost of each grid will be spread over a smaller number of subscribers, and therefore the cost per subscriber, and hence the price, will be higher (textbook, 445).
At its core, cloud computing is a specialized form of grid and distributed computing that varies in terms of infrastructure, deployment, service and geographic dispersion (Veeramachanenin, September 2015). The cloud enhances scalability, collaboration and availability; adapts to fluctuations in demand; accelerates development work; and provides options for cost reduction through efficient, optimized computing (BH Kawljeet, June 2015). Cloud computing (CC) has recently emerged as a new paradigm for the delivery and hosting of services over the Internet. There are three main service delivery models. In Software as a Service (SaaS), the required software, operating system and network are provided; the customer accesses the hosted software through a local web browser instead of installing it on a local computer (e.g., web-enabled e-mail). The user only pays for the service, while the cloud service provider is responsible for the management and control of the cloud infrastructure; companies providing such services include Google, Microsoft, Salesforce and Facebook. In Infrastructure as a Service (IaaS), the cloud provider supplies only hardware resources, such as the network and virtualization.
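The division of responsibility across the service delivery models can be made concrete with a small table in code. This is a simplified sketch: the layer names, the two-letter codes and the PaaS row (the third common model, not detailed above) are assumptions added for illustration.

```python
# Who manages each layer of the stack under each delivery model.
# "P" = provider-managed, "C" = customer-managed (simplified view).
LAYERS = ["application", "runtime", "os", "virtualization", "hardware"]

RESPONSIBILITY = {
    "SaaS": {"application": "P", "runtime": "P", "os": "P",
             "virtualization": "P", "hardware": "P"},
    "PaaS": {"application": "C", "runtime": "P", "os": "P",
             "virtualization": "P", "hardware": "P"},
    "IaaS": {"application": "C", "runtime": "C", "os": "C",
             "virtualization": "P", "hardware": "P"},
}


def customer_managed(model):
    """Return the layers the customer must manage under a given model."""
    return [layer for layer in LAYERS if RESPONSIBILITY[model][layer] == "C"]
```

Under SaaS the provider manages everything, which is why the customer can simply open hosted software in a browser; under IaaS the provider stops at virtualization and hardware, leaving the operating system and above to the customer.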
A distributed system is a collection of independent computers (nodes) that appears to its users as a single coherent system.
It also requires fast and secure communication to monitor real-time connection status in order to act as an Energy Manager [SANCHEZ].
What we know today as the Internet began as a Defense Advanced Research Projects Agency (DARPA) project in 1969, which was designed to connect several research databases across the country. However, until the end of 1991, the advances were almost completely technical, as the goals set by those responsible in its growth were beyond what the hardware was capable of providing. In 1988, the Internet began to receive attention in the popular press, when the first documented computer virus was released at Cornell University. 1991 marked the beginning of the transition of the Internet as we know it today, with the National Science Foundation’s reinterpretation of its Acceptable Use Policy to allow for commercial traffic across its network, the development of the first graphic interfaces, the formation of the Internet Society, and the formation of ECHO (East Coast Hang Out), one of the first publicly available online communities.
The United States federal government funded new developments in computer science, which resulted in the creation of ARPANET, a project that connected computer systems at five universities with the intent that if one server were destroyed, the connection would remain through the four other locations. This fundamental structure of the internet was developed as a peer-to-peer system, meaning there is no central control point in the network; the internet is arranged like a web, in which all pieces of information travel as equals. The interconnectivity of the internet resulted in the creation of the World Wide Web in the early 1990s, an internet program that developed the internet into a massive, interactive mass medium.
When designing networked applications, one key protocol stands out as the foundation that makes them possible: TCP/IP. There are many protocols that allow two applications to communicate. What makes TCP/IP a nice protocol is that it allows applications on two physically separate computers to talk; what makes it great is that it works whether the two computers are across a room or across the world. In this paper I will show how TCP/IP allows a wide array of computer hardware to work together without either machine ever having to know what the other is or how it works, and how it allows information to find its way around the world in a fraction of a second without knowing in advance how to get there.
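The point above, that the application code is the same whether the peer is across the room or across the world, can be demonstrated with a minimal TCP echo exchange using Python's standard socket API. Only the address would change for a remote peer; here both ends run locally for the sake of a self-contained sketch.

```python
import socket
import threading


def echo_once(server_sock):
    """Accept a single connection and echo back whatever arrives."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)


# Create a TCP (SOCK_STREAM) server; port 0 lets the OS pick a free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_once, args=(server,))
t.start()

# The client side: identical code would reach a machine across the
# world, only the ("host", port) address would differ.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello over TCP/IP")
reply = client.recv(1024)
client.close()
t.join()
server.close()
```

Neither endpoint needs to know anything about the other's hardware or operating system; TCP/IP hides all of that behind the socket interface.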
Pivotal HD is an enterprise-ready Hadoop distribution that can harness the power of the massive data being driven by new apps, systems, machines and the torrent of customer sources. It provides an economical way to leverage the huge amounts of data produced by new sources such as social media, mobile sensors and Internet of Things (IoT) devices. 5) Real-time, low latency: Pivotal is actively working on a project aimed at integrating the in-memory data grid capabilities of GemFire with Pivotal Hadoop.
4. Increase speed at low cost: bring out new solutions and services faster using cloud-based, shared development operations.
The SDI should use open data, with collaboration between the government and the private sector. The data supports interoperability by adhering to standards. Some of the databases will be deployed on cloud infrastructure using Infrastructure as a Service (IaaS), Data Storage as a Service (dSaaS), etc.
A cloud service is any resource that is provided over the internet. Service delivery in cloud computing comprises three different service models.
The first instance of networking was in August of 1962 when J.C.R. Licklider of MIT described what was called a “Universal Network” concept (Cerf). The Universal Network was to be a globally interconnected set of computers through which everyone could quickly access data and programs from any location. In essence, the concept was very much like the internet of today. The Defense Advanced Research Projects Agency (DARPA) expanded the Universal Network con...
Computers have undoubtedly improved life in many respects, but they have not always been so easy to work with, nor so widespread. Changes in the computer came about from technological advancement, as well as necessity of performance. With a world changing so quickly and getting bigger each day, it’s no wonder systems needed to be implemented to keep up with the fast pace of an evolving society.