In fact, the technology behind cloud computing is by and large the easy part. Frankly, the hardest part of cloud computing is the people. The politics of migrating from legacy platforms to the cloud is inherently complicated because the adoption of cloud computing affects the way many people—not just IT professionals—do their jobs. Over time, cloud computing might change some roles so drastically that they are no longer recognizable, and it might eliminate some jobs entirely. Thus, the human-economic implications of adopting and migrating to cloud computing platforms and processes should not be taken lightly.
There are also, of course, countless benefits stemming from the adoption of cloud computing, in both the short term and the long term. Many benefits of cloud computing in the corporate arena are purely financial, while the network externalities of widespread adoption will have much broader positive effects. The ubiquity of free or inexpensive computing accessed through the cloud is already affecting communications in First World and established economies, as well as research and development, agriculture, and banking in Third World and emerging economies.
Therefore, it is important for decision makers to understand the impact of cloud computing from both a financial and a sociological standpoint. This understanding begins with a clear definition of cloud computing.
Cloud Computing Defined
Cloud computing is not one single technology, nor is it one single architecture. Cloud computing is essentially the next phase of innovation and adoption of a platform for computing, networking, and storage technologies designed to provide rapid time to market and drastic cost reductions. (We talk more about adoption and innovation cycles in the context of economic development in Chapter 4, “The Cloud Economy—The Human-Economic Impact of Cloud Computing.”)
There have been both incremental and exponential advances in computing, networking, and storage over the last several years, but only recently have these advancements—coupled with the financial drivers of economic contraction and recession—reached a tipping point, creating a major market shift toward cloud adoption.
The business workflows (the rules and processes behind business functions such as accounts payable and accounts receivable) in use in corporations today are highly standardized. With the exception of relatively recent changes required to support regulatory compliance—Sarbanes-Oxley (SOX), the Payment Card Industry Data Security Standard (PCI DSS), and the Health Insurance Portability and Accountability Act (HIPAA), for example—most software functions required to pay bills, make payroll, process purchase orders, and so on have remained largely unchanged for many years.
Similarly, the underlying technologies of cloud computing have been in use in some form or another for decades. Virtualization—the logical abstraction of hardware through a layer of software, and arguably the biggest technology driver behind cloud computing—is almost 40 years old, dating back to the mainframe era.1 Just as server and storage vendors have used various forms of virtualization for nearly four decades, virtualization has become equally commonplace in the corporate network: It would be almost impossible to find a LAN today that does not use VLAN functionality.
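To make network virtualization concrete, consider a minimal Python sketch of 802.1Q VLAN tagging, the mechanism that lets one physical LAN carry many logical LANs. The frame contents here are invented for illustration, and real switches perform this operation in hardware:

import struct

def tag_frame(frame: bytes, vlan_id: int) -> bytes:
    # 802.1Q inserts a 4-byte tag between the source MAC and the EtherType:
    # TPID 0x8100 identifies the tag; the low 12 bits of the TCI carry the VLAN ID.
    tag = struct.pack("!HH", 0x8100, vlan_id & 0x0FFF)
    return frame[:12] + tag + frame[12:]

# A toy Ethernet frame: destination MAC, source MAC, EtherType, payload.
frame = b"\xaa" * 6 + b"\xbb" * 6 + b"\x08\x00" + b"payload"
print(tag_frame(frame, vlan_id=10).hex())

Frames carrying different VLAN IDs remain logically isolated even though they share the same physical wire. That idea of drawing boundaries in software rather than hardware is exactly what server virtualization later applied to computing.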
In the same way that memory and network virtualization have standardized over time, server virtualization solutions—such as those offered by Microsoft, VMware, Parallels, and Xen—and the virtual machine, or VM, have become the fundamental building blocks of the cloud.
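The following toy Python sketch models the essence of that building block: a hypervisor carving one physical server into several isolated virtual machines. The class and method names are invented for this example and do not correspond to any vendor's hypervisor API; real hypervisors schedule CPU time and map memory pages in hardware, not with Python objects:

class PhysicalServer:
    def __init__(self, cpus, ram_gb):
        self.free_cpus = cpus
        self.free_ram_gb = ram_gb

class Hypervisor:
    def __init__(self, host):
        self.host = host
        self.vms = []

    def create_vm(self, name, cpus, ram_gb):
        # Refuse to overcommit in this simple model; many real
        # hypervisors do allow overcommitment of resources.
        if cpus > self.host.free_cpus or ram_gb > self.host.free_ram_gb:
            raise RuntimeError("insufficient physical resources")
        self.host.free_cpus -= cpus
        self.host.free_ram_gb -= ram_gb
        vm = {"name": name, "cpus": cpus, "ram_gb": ram_gb}
        self.vms.append(vm)
        return vm

hv = Hypervisor(PhysicalServer(cpus=16, ram_gb=64))
hv.create_vm("web01", cpus=4, ram_gb=8)
hv.create_vm("db01", cpus=8, ram_gb=32)
print(len(hv.vms), "VMs sharing one physical server")

The essential point is that machine boundaries are now defined in software: provisioning another server becomes a bookkeeping operation rather than a hardware purchase.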
Over the last few decades, the concept of a computer and its role in corporate and academic environments have changed very little, while the physical, tangible reality of the computer has changed greatly: Processing power has more than doubled every two years while the physical footprint of a computer has dramatically decreased (think mainframe versus handheld).2
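A quick back-of-the-envelope calculation shows what that doubling compounds to. The sketch below assumes a clean doubling of processing power every two years, a simplification of the cited trend:

# Compound growth from one doubling every two years (a simplification).
years = 40
doublings = years / 2            # one doubling per two-year period
growth_factor = 2 ** doublings   # 2^20
print(f"{growth_factor:,.0f}x over {years} years")  # roughly 1,048,576x

In other words, four decades of this trend yields roughly a million-fold increase in processing power, even as the machine itself shrinks from a mainframe to a handheld.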
Moore’s Law aside, at its most basic level, the CPU takes input and writes it to RAM and/or to a hard drive. This simple function allows applications to create, process, and save mission-critical data. Radically increased speed and performance, however, mean that this function can be performed faster than ever before and at massive scale. Additionally, new innovations and enhancements to these existing technology paradigms (hypervisor-bypass and Cisco Extended Memory Technology, for example) are changing our concepts of what a computer is and does. (Where should massive amounts of data reside during processing? What functions should the network interface card perform?) This material and functional evolution, coupled with economic and business drivers, is spurring a dramatic market shift toward the cloud and the anticipated creation and growth of many new markets.
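Reduced to code, that create-process-save cycle looks like the short sketch below; the file name and record contents are invented for illustration. Data enters the program, is transformed in memory, and is persisted to disk:

import json

# Create: data enters the program (input) and lands in RAM.
records = [{"order_id": 1, "amount": 125.00},
           {"order_id": 2, "amount": 89.50}]

# Process: the CPU transforms the in-memory data.
total = sum(r["amount"] for r in records)

# Save: results are persisted from RAM to the hard drive.
with open("orders_summary.json", "w") as f:
    json.dump({"records": records, "total": total}, f)

print(f"processed {len(records)} records, total {total}")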
While it is fair to say that what is truly new about the cloud is the use of innovative and interrelated technologies to solve complex business problems in novel ways, that is not the whole story. Perhaps what is most promising about cloud computing, aside from the breadth of solutions currently available and the functionality and scalability of new and emerging platforms, is the massive potential for future products and solutions developed in and for the cloud. The untapped potential of the cloud and the externalities stemming from consumer and corporate adoption of cloud computing can create significant benefits for both developed and underdeveloped economies.
With a basic understanding of the technology and market drivers behind cloud computing, we can move on to a deeper discussion of what cloud computing means in practice. To do this, we turn to the National Institute of Standards and Technology (NIST).