BASICS OF CLOUD COMPUTING
SOUVIK PAL1, DAC-NHUONG LE2, PRASANT KUMAR PATTNAIK3
1 Sister Nivedita University, Kolkata, India
2 Haiphong University, Haiphong, Vietnam
3 KIIT, Deemed to be University, India
Email: [email protected], [email protected], [email protected]
Abstract
Consider an analogy in which personal computer users are no longer required to run, install, or store applications on their own machines, and in which every piece of their information and data may instead be stored on the cloud (the Internet). As a metaphor for the Internet, "cloud" is a very popular word, but combined with the word "computing" it becomes both more significant and more ambiguous. Cloud computing comes into focus when users think about what they actually need on demand, which leads to the concept of an updated version of utility computing. The advancement of cloud computing came about because of the rapidly growing use of the Internet among the general population. Cloud computing is not an entirely new technology; rather, it is a journey through distributed, cluster, and grid computing to today's cloud computing. Even before the widespread use of the Internet across the globe, cloud-like computing had already been used in the IT industry. Cloud computing is transforming the computing landscape. The cloud concept and its computing model are emerging topics in Internet-centric and IT-market-oriented businesses. The IT industry needs a direct conversation about how this new computing paradigm will affect organizations and how it can be used alongside existing technologies. Cloud computing requires a third-party vendor: a client, end user, or customer consumes, on demand, the cloud services offered by a cloud service provider (CSP).
Keywords: Cloud service provider, cloud computing
1.1 Evolution of Cloud Computing
Cloud computing isn't new. In fact, much of what we do on our personal computers today already depends on it. What is changing is how we look at what cloud computing can do for us today. The idea of cloud computing emerged after the mainframe era of the 1960s, when the possibility of utility computing had already been proposed by MIT computer scientist John McCarthy, who opined that "computation may someday be organized as a public utility."
In 1961, John McCarthy proposed [1]: "If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry." Utility computing is a service-provisioning model in which the service provider makes computing resources and infrastructure management available to the client as required. It works on a pay-per-use, metered basis, meaning that clients pay according to their use of network access, file sharing, and other applications. In 1966, Douglas F. Parkhill published the book The Challenge of the Computer Utility, in which he investigated the ideas of elastic provisioning and resource sharing.
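The pay-per-use, metered model described above can be sketched in a few lines of code. The resource names and rates below are purely illustrative assumptions, not taken from any real provider's price list:

```python
# A minimal sketch of utility-style (pay-per-use) metered billing.
# Resource names and rates are illustrative assumptions only.

RATES = {
    "cpu_hours": 0.05,          # dollars per CPU-hour consumed
    "storage_gb_month": 0.02,   # dollars per GB stored per month
    "network_gb": 0.01,         # dollars per GB transferred
}

def metered_cost(usage):
    """Bill the client only for what was consumed, like a utility meter."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A client that used 100 CPU-hours, 50 GB of storage, and 20 GB of traffic:
bill = metered_cost({"cpu_hours": 100, "storage_gb_month": 50, "network_gb": 20})
print(f"${bill:.2f}")  # 100*0.05 + 50*0.02 + 20*0.01
```

The point of the model is visible in the code: there is no fixed fee, so a client with an empty usage record pays nothing, exactly as with a metered telephone or electricity service.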
The sequential development of the computing environment can be arranged year-wise as shown in Figure 1.1. The IBM System/360 entered the global market in 1964. This model and the other machines of the same family attracted attention from the business community because peripheral components were interchangeable and the same instruction set was implemented across all systems of the family [1]. The scaling down of mainframe systems and further improvements over time led to standalone machines, the so-called minicomputers; for example, DEC introduced the PDP-8 minicomputer in 1965 and Xerox introduced the Alto in 1973 [2].
The microprocessor era began in the early 1970s with the release of the first Intel 4004 microprocessor (MP) in 1971, followed by the Intel 8008 MP in 1972. The first personal home computer, the Micral, was created by André Truong Trong Thi [2] on the basis of the Intel 8008 MP. The Mark-8, like the TV Typewriter before it, was an early build-it-yourself project for microcomputer hobbyists. The MITS Altair 8800 microcomputer kit of 1975, advertised in several scientific and hobby magazines, is credited with having popularized the microcomputer and is regarded as the seed of the home computer idea. The first programming language implementation for the machine was Microsoft's founding product, Altair BASIC. Subsequently, Apple, Commodore, Atari, and others entered the personal home computer market. In 1981 IBM introduced its first personal computer, commonly known as the IBM PC. Microsoft supplied the operating system (OS) for the IBM PC, which became a de facto standard adopted by numerous PC makers. Successive waves of improvement followed, with technological advances stirring up the market, and the creation of the graphical user interface (GUI) prompted the next stage of development.
While researchers were considering how to significantly improve interaction among numerous computers, another milestone emerged in the sector: the Internet. The Advanced Research Projects Agency (ARPA)1 introduced the idea of the Internet as an experimental project. Each connection point in such a network is known as a node. With the backing of the U.S. Department of Defense, a communication framework was built so that if any node failed, the rest of the network remained connected. From this effort the ARPANET eventually emerged, and nearly 200 institutions were connected to it. The TCP/IP protocol suite was introduced, and in 1983 the ARPANET switched over to TCP/IP, tying the constituent subnets together; since then the Internet has been known as a network of networks. With the development of the World Wide Web (WWW) by British computer scientist Sir Timothy John Berners-Lee in 1989, the web achieved its decisive breakthrough. Berners-Lee proposed a hyperlink-based information management system for CERN (the European Organization for Nuclear Research)2. In the end, end users needed web browsers, and the WWW became ubiquitous when the Mosaic browser was introduced to the market.
Figure 1.1: Evolution of cloud computing.
Today, the entire information technology sector is putting effort into improving the quality of web applications by increasing bandwidth and also by using