Cloud Computing and Software as a Service (SaaS)
Imagine thousands of huge hotels, each a thousand rooms in size, situated on every street. Imagine those hotels offering full service and every amenity, yet each with an occupancy rate of only ten percent. Every night, nine hundred rooms would sit dark, while overhead and personnel costs continued at a level needed to support a thousand rooms in each location. If you were a hotelier, what would you do?
Such was the plight of data centers at the end of the dot-com bust. Huge facilities, each with thousands of servers, disks spinning and air conditioning blasting to cool the long racks, ran at less than ten percent of capacity, while other machines sat off-line, waiting for a workload that was never to come. If you were in the data center business, what would you do?
During the run-up to the peak days before the dot-com decline, servers and storage were commissioned at a feverish rate, in anticipation of data loads that would consume all the capacity and more. The promise of on-line shopping, from books to boots, was the dream of every purveyor. Applications to support the shopper systems were being dreamed up at an amazing rate: tracking, couponing, award systems, databases of product-hungry consumers, and many not-so-obvious back-room support elements, all of which would consume server cycles and disk storage, with wild optimism propelling the continuous construction of this chain of "data hotels". Then came the crash. What would you do?
Some turned to a 1966 book by Douglas Parkhill titled "The Challenge of the Computer Utility". This book, written more than forty years ago, and more than thirty years before the advent of a consumer Internet, described the business opportunity and the technical challenge of sharing computing cycles, to be sold in the same fashion as kilowatts of electricity or thousands of gallons of municipal water, served by public or private interests. It is now ten years after the dot-com demise and almost twenty years since the public learned that there is an Internet, and many aspects of Parkhill's model have become reality.
Remember that in 1966, and until the mid-1990's, computer-to-computer communication was usually made possible through high-cost leased private lines provided by a major telecom company. ARPA, the Advanced Research Projects Agency (later DARPA), founded to counter the 1957 launch of Sputnik, supported the development of the ARPANET, connecting the research labs and universities engaged in advanced projects. Vint Cerf and Bob Kahn described the TCP/IP protocols and the concept of IP addressing, allowing a unique address for every computer expected to ever exist, and Tim Berners-Lee, while at CERN, the European Organization for Nuclear Research, invented the World Wide Web atop that foundation. Thus the Internet as the public knows it was born. The Mosaic browser came on the scene in 1993 and was licensed in 1994 to Spyglass, Inc., which later licensed it to several companies, including Microsoft, seeding the development of Internet Explorer. The browser wars ensued, with Netscape eventually crumbling under the installed weight of Internet Explorer.
These elements all contributed to the fevered development of the public Internet, and led to the dot-com euphoria. Technology stocks, it seemed, could not lose money; buying and selling on the same day provided huge profits for the many who became known as day-traders. Then, suddenly, the euphoria ended. Stocks plummeted, erasing lifetimes of savings and perceived wealth, while servers began to be turned off and storage disks spun to a halt. The bubble had burst.
We will return to this point later, but for now, fast forward to 2035. Place your water glass under the kitchen faucet and a stream of pure water pours forth, controlled by the sensor in the faucet, which logs you, the thirsty drinker, through the RF chip that is your personal identifier. Water is precious in 2035, so it is monitored and dispensed with care. Gasoline, once used to fuel vehicles, costs less than water, but is not good for drinking. Solid-state lamps have reduced the amount of electricity used, and it is a good thing, since power plants are now so costly to operate.
The best utility bargain of all is the 24/7 connectedness that was known as "cloud computing" in 2010 but is now just an immersive utility, wirelessly delivered, lightning fast, bridging all forms of communication, computing, and knowledge gathering. IPv6, still in its initial stages of adoption in 2010, has served us well these past twenty-five years. With every person and every device having an address, seamless connectivity has become invisible and ubiquitous. We had some glimpses of this when unified communication was introduced, but at that time, forwarding between devices was required. Now, from the initial setup, devices are known to each other and to their owner, and all forms of communication, from video phone calls to the use of applications on personal devices, just happen, without requiring the user to be a "techie".
Just as a matter of interest, the initial Internet Protocol, IPv4, had address capability for about four billion addresses. These were expected to be depleted by 2010 or 2011, with work-arounds to bridge the gap until full adoption of IPv6 under discussion. This IPv4 exhaustion had also been predicted to occur in 2003 and 2004, so the accuracy of such predictions was open to question. IPv6, with 128-bit addressing, provides at least 5 x 10^28 addresses for each of the 6.8 billion people who were alive on the earth in 2010. IPv6 has what is known as Stateless Address Autoconfiguration, meaning that any device is able to configure its own address automatically when first connected to a network. It is this functionality that provides for the ubiquitous connectivity of any device from any location where Internet service is available.
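As a quick sanity check on that per-person figure, a few lines of Python reproduce the arithmetic; the population figure is simply the rough 2010 estimate quoted above:

    # IPv6 uses 128-bit addresses, so the total address space is 2**128.
    total_addresses = 2 ** 128             # about 3.4 x 10^38 addresses
    world_population_2010 = 6_800_000_000  # rough 2010 world population

    # Addresses available for each person alive in 2010.
    per_person = total_addresses // world_population_2010
    print(f"total IPv6 addresses: {total_addresses:.3e}")   # ~3.403e+38
    print(f"addresses per person: {per_person:.3e}")        # ~5.004e+28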
Our daily activities, from shopping and dining to payments, personal communication, and tracking, are ubiquitous, seamless, and simply extensions of our being. The personal device, once known as a cellular phone, now serves as wallet, personal communicator, calendar, reminder, medical record access, and personal valet/shopper. Many of the pre-computer generation have passed on, and a new generation of younger people, never having known a life without technological assistance, do not require training or adaptation to new technologies. This natural evolution has occurred quietly.
As we flash back to 2010, we will examine how this point was reached.
With all of the excess server capacity from the dot-com days, knowledge deployment in the form of server-based information proliferated. Much of it is of high value, while some is pure drivel, and determining which is which is one of the challenges facing users of Internet technology. It is a simple task to post information and deploy it to hundreds of millions of people; ensuring that it is accurate and truthful is nominally the responsibility of the poster, but really falls to the one accessing the knowledge base. Reputable companies put their reputations on the line daily, where a single rogue associate can damage credibility with lasting negative results.

Early visionaries such as the founders of Salesforce.com helped create an environment where applications that were once closely guarded, internal, and proprietary could be safely housed on remote servers, with adequate security to ensure that corporate data remains safe yet accessible from anywhere there is Internet access. Intuit quietly reached out to taxpayers who had once purchased federal and state tax preparation programs for installation on home computers, offering on-line versions that accomplished the same tax filings without installing anything on the home or office PC. The consumer was thus drawn into using software supplied as a service. Banks and financial institutions also contributed to the growing confidence in on-line services. Banking, bill paying, and brokerage accounts were served by data centers hosting the most personal of data, with safeguards in place to minimize theft or misappropriation.

Of course, along with the public's use of the Internet came a darker side: the professional thieves who also wanted access to the personal data and bank accounts of the on-line user. Social engineering is harder to defend against than brute-force attacks; security systems can foil most technical attacks, while human gullibility is often the cause of compromised data.
Google must be credited with increasing consumers' comfort with having personal data stored remotely, while at the same time teaching them to keep a wary eye on the storage purveyors to make sure that personal data is not misused. The love-hate relationship with Google shows that access to information from a myriad of sources, although free, has a cost: the forfeiture of privacy in individual transactions. For most, the trade-off is an acceptable one. For a few, who may wish to access information or transact business on-line unobserved, Google's ability to capture data about one's on-line habits is disconcerting. The legality has not yet been challenged, since it is the voluntary use of the on-line search agent that compromises privacy.
Amazon.com, originally a seller of books, has expanded into many lines of consumer and tradesman goods, many of which it neither stocks, ships, nor warrants, instead earning a commission on sales made through its huge on-line processing capability. The "back end" that handles these transactions is a collection of server farms distributed around the globe, with capacity many times what is required for the sale of all the items Amazon offers. This brings Amazon back to the hotelier's dilemma: what do we do with the unused computing cycles and vast unused storage? In Amazon's case, the answer is to sell CPU cycles to those who need massive data manipulation, and storage to those who need database administration but do not have, or do not wish to have, the headaches associated with data center operation. Thus, Amazon the sales outlet is connected to the cloud, while Amazon the seller of computing cycles and storage is the cloud.
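To make that concrete, here is a minimal Python sketch of renting Amazon's storage using the boto library available today; the bucket name and file contents are invented for illustration, and real use requires an AWS account and credentials:

    import boto  # third-party AWS library; credentials come from environment/config

    # Connect to Amazon's storage service (S3) and claim a bucket.
    conn = boto.connect_s3()
    bucket = conn.create_bucket('example-hoteliers-dilemma')  # hypothetical name

    # Store a piece of data on Amazon's disks instead of our own.
    key = bucket.new_key('notes/first-upload.txt')
    key.set_contents_from_string("Our data now lives in someone else's data center.")

    # Read it back from anywhere with Internet access.
    print(key.get_contents_as_string())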
Software as a Service, or SaaS, is an extension of the use of the public network to deliver applications, services, or computing cycles on a metered basis. The ways in which charges are administered vary widely. Subscriptions are the most lucrative for the provider, as the charges continue even during periods of light or no use. The per-use model, in which time of use is billed, may carry a higher rate than a subscription, but the client is billed only for what is used. Of course, there are many add-ons to enhance revenue. Redundancy and reliability are highly desirable in a data service, so Service Level Agreements, or SLAs, define what is expected of the provider and what penalties will be incurred if the provider fails to meet the terms of the SLA. There may also be a cost for the use of an application provided by the cloud host. It may be an application of the host's own design, or a licensed application written by others and provided to many clients by the host. Royalties flow back to the application developer, but the license costs are not seen by the end user; they are instead rolled into the cost of the service.
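The subscription-versus-per-use trade-off comes down to simple arithmetic. The sketch below, with entirely hypothetical prices, finds the break-even point: below a certain number of hours per month the metered model wins, and above it the flat subscription does:

    # Hypothetical SaaS pricing, for illustration only.
    SUBSCRIPTION_PER_MONTH = 100.00  # flat fee, billed whether used or not
    METERED_RATE_PER_HOUR = 0.50     # higher unit rate, billed only for use

    def monthly_cost(hours_used: float, metered: bool) -> float:
        """Return the month's bill under the chosen billing model."""
        if metered:
            return hours_used * METERED_RATE_PER_HOUR
        return SUBSCRIPTION_PER_MONTH

    break_even = SUBSCRIPTION_PER_MONTH / METERED_RATE_PER_HOUR  # 200 hours
    for hours in (50, 200, 500):
        print(f"{hours:>3} h: metered ${monthly_cost(hours, True):7.2f}"
              f" vs subscription ${monthly_cost(hours, False):7.2f}")
    print(f"break-even at {break_even:.0f} hours per month")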
Today, one can find on-line applications for accounting, customer relationship management, human resource management, legal libraries and case law, medical record keeping, relational databases for transaction processing, and many technical, scientific, business, and governmental uses. Where an application does not exist, one may be assembled from components of other applications in what has become known as a "mashup", as sketched below. These hybrid applications must be thoroughly tested before deployment, and may require many months of fine tuning before they meet the requirements of the user. Caveat emptor.
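A mashup is nothing more than stitching the output of one service into the input of another. The sketch below, with made-up endpoint URLs standing in for two real web services, pulls a customer's address from one service and feeds it to a mapping service:

    import json
    from urllib.request import urlopen

    # Both URLs are hypothetical placeholders for real service endpoints.
    CRM_API = "https://crm.example.com/api/customers/42"
    GEO_API = "https://maps.example.com/api/geocode?address={}"

    def locate_customer() -> dict:
        """Combine two independent services into one 'mashup' result."""
        # Step 1: pull the customer record from the CRM service.
        customer = json.load(urlopen(CRM_API))
        # Step 2: hand the address to the mapping service for coordinates.
        query = GEO_API.format(customer["address"].replace(" ", "+"))
        location = json.load(urlopen(query))
        # Step 3: return a merged record neither service offers on its own.
        return {"name": customer["name"], "coordinates": location["latlng"]}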
Private clouds are emerging as interest in cloud computing reaches the executive suite. Before we open the private-cloud Pandora's box, let's examine some of the functional aspects in play today, in the year 2010. Server virtualization has reached the level at which server capacity can be rented much as folding chairs for a wedding on the lawn. An initial server space is contracted, but as demand for more capacity becomes known (in similar fashion to more RSVPs arriving for the wedding), more chairs are ordered, or more server space is contracted. Only a few years ago this would have taken some time, as servers would have had to be provisioned, set up, secured, and loaded with applications or storage. Today, the scaling is automated. When more capacity is needed, a new virtual machine is commissioned within the virtual server farm, and capacity is seamlessly added as needed. When the peak passes and demand declines, the virtual machines are gradually retired, and the client is billed only for the capacity used. The flexibility thus gained allows on-demand use; because it is not on-site at the user's location, it amounts to an automated, self-commissioning cloud.
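The logic behind that elasticity fits in a few lines. The following Python sketch is a simplified control loop with invented thresholds and stub provisioning calls, standing in for a real cloud provider's interface rather than reproducing any vendor's actual API:

    import itertools

    _vm_ids = itertools.count(1)

    def provision_vm() -> str:
        """Stand-in for a provider's 'create virtual machine' call."""
        return f"vm-{next(_vm_ids)}"

    def retire_vm(vm: str) -> None:
        """Stand-in for a provider's 'terminate virtual machine' call."""
        print(f"retired {vm}")

    SCALE_UP_AT = 0.80    # add a VM when average utilization exceeds 80%
    SCALE_DOWN_AT = 0.30  # retire one when it falls below 30%

    def rebalance(vms: list, avg_utilization: float) -> None:
        """Commission or retire virtual machines to track demand."""
        if avg_utilization > SCALE_UP_AT:
            vms.append(provision_vm())   # more RSVPs arrive: order more chairs
        elif avg_utilization < SCALE_DOWN_AT and len(vms) > 1:
            retire_vm(vms.pop())         # the party winds down: return the chairs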
Business continuity and disaster recovery are areas of growth, offering the benefit of unattended backup into a private cloud. In a normal environment, a primary server contains the application, database, or storage needed to maintain daily business operations. As transactions occur and are processed, enterprise-level backup software keeps a safe copy of the operations server on a backup server. Should disaster strike, the backup server can be used to commission a new operations server at a different location, and the business continues uninterrupted. For a third level of safety, a virtual third server at yet another location mirrors the data saved to the backup server. Because this third tier is a self-configuring virtual machine, it is less costly and requires no administration; costs for such a cloud-based service run at approximately half those incurred with a traditional backup server. This "belt-and-suspenders" approach to business continuity is priceless when disaster calls for it.
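In outline, that three-tier arrangement is just a fan-out of every committed change to two extra destinations. The sketch below is a bare-bones illustration; the Server class and its apply() method are invented for this example and are no substitute for real enterprise backup software:

    # A toy picture of three-tier business continuity.
    class Server:
        def __init__(self, name: str):
            self.name = name
            self.records = []          # stand-in for application data

        def apply(self, transaction: str) -> None:
            self.records.append(transaction)

    primary = Server("operations")     # runs the daily business
    backup = Server("backup-site")     # enterprise backup copy
    mirror = Server("cloud-mirror")    # self-configuring virtual third tier

    def commit(transaction: str) -> None:
        """Process a transaction, then fan it out to both safety tiers."""
        primary.apply(transaction)
        backup.apply(transaction)      # tier two: conventional backup
        mirror.apply(transaction)      # tier three: cloud-hosted mirror

    commit("invoice #1001 posted")
    assert primary.records == backup.records == mirror.records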
In the early days of the public Internet, a cloud symbol was used in diagramming network architecture. Client computers or servers were shown connected to a "cloud", a metaphor for all the servers, routers, and storage that make up the high-speed backbone of the Internet. How the interconnection occurred was not within the interest or view of the end user; attached to the "cloud", the end user could connect with another end user or a service provider. The "cloud" is not some mysterious place, nor is it to be feared. It is simply a metaphor for how computer-to-computer interconnection occurs over a public network. It is no different today, in 2035, than it was in 2010. The hardware has changed, and the ability to obtain and supply data from anywhere has come about in a natural way. Along the way, companies have died, only to be replaced by other companies with a better idea, a better service level, or a more accommodating business model.
Be sure to follow this blog as more of the vision unfolds.