Cloud computing is definitely hitting the news today. In many respects, it is a pretty tempting scenario. As I heard Jeff Bezos of Amazon describe it, it's rather analogous to generating your own electricity. You might be a baker, a butcher, or a brewer - and yes, you surely use electricity, but why would you want to generate it yourself? Why not let some experts do that? In fact, let the experts do it en masse; by doing so, they can do it cheaply and efficiently.
You get the wondrous luxury of simply using computing resources by the drink. That is, you only pay for precisely what you use. Sounds like a no-brainer, and it's also a no-brainer why every large computing company is (or soon will be) jumping into the mix with its own cloud.
Sun Microsystems is always a fun example. They own Java (for the moment; Oracle soon will, of course) but sadly failed to own the development tool, application server, or primary Java-hosted architecture markets. Even though a "Java cloud" makes fantastic sense, they apparently came to the game very late. In their defense, Sun might actually have been too early to the game: they had the Sun Grid all through the birth of the cloud. The failure was partly in the name ("grid") and partly in how you used it. But at its core, it was a cloud.
Amazon, which surely isn't given enough credit as a technology company, was really the one to popularize the cloud. Even now, theirs is the most mature and usable system. They offer an entire computing solution, from CPU to database - all, as Jeff Bezos is wont to say, by the drink. Microsoft is wisely creating its cloud to run with its own technology.
Applications in the cloud need to be scalable. That's not to be confused with being performant. Performance, in the web sense, is how fast a user gets a response. Scalability is more akin to how performance changes as you add more users and more resources. You could, in theory, have a feeble server that could only serve one user at a time. That's clearly not very performant. However, if you can keep adding one more server to support each additional user, you are actually scalable.
In fact, that's basically perfect scalability, which isn't common. Usually some component (for which "database" is often a synonym) becomes a bottleneck. After that point, adding more servers just doesn't help.
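To make that concrete, here's a little back-of-the-envelope model - my own sketch, not anything from a vendor, and the 5% "serial fraction" is purely an assumption - of what a shared bottleneck does to scaling:

```python
# A toy model (my own illustration) of how a shared bottleneck caps scaling.
# Each server adds capacity, but some fraction of every request serializes
# on one shared resource (think: "the database"). Amdahl's-law-style math.

def throughput(servers, serial_fraction):
    """Relative throughput with N servers when `serial_fraction` of the
    work must pass through a single shared bottleneck."""
    return servers / (1 + serial_fraction * (servers - 1))

for n in (1, 2, 4, 8, 16, 64, 256):
    print(f"{n:>4} servers -> {throughput(n, 0.05):6.1f}x throughput")

# With even 5% of the work serialized, 256 servers buy you only ~18x
# the throughput of one - and each additional server helps less and less.
```

Run the numbers and you see why "adding more servers just doesn't help" past a certain point: the curve flattens out no matter how much hardware you pile on.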
Scalability is a deep and interesting topic, but needless to say, at some level you need it in order to have any sort of web application. On the other hand, it's surprising how poor a level of performance you can actually get away with in many applications. In some sense it's sad, but in the true business sense, "throwing more hardware" at a problem is often a good business decision (as opposed to spending expensive developer time ripping out a bad architecture). And the cloud vendors have no problem with you giving them poorly performing applications. Sure - use up all the CPU you want; they'll send you the bill.
Given all that, the cloud is quite a compelling story. Let's say you develop your nice, scalable, semi-performant website and deploy it to the cloud. Then you get some press - and overnight, your 100 users become 100,000. Great work if you can get it. Unfortunately, if your architecture isn't ready for it, only the first few thousand of those users will actually get to see your site, because after that it will come crashing down.
In the non-cloud world, you can then scramble to the store, buy some servers, head over to the colo, install them, configure them, and bring them online. And of course by then, the users are gone anyway. Alternatively, you can buy plenty of hardware up front. That hardware will be sitting idle most of the time, wasting power and rented space - not to mention the time you take to install it, configure it, and maintain it.
The cloud does all this for you. If you get 99,000 new users, you simply hit the control panel of your cloud and add a few servers. Or add a few hundred servers - whatever you need. And when the onslaught is over, you drop those servers. Not bad, eh?
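To give a flavor of what "hit the control panel" looks like when scripted, here's a rough Python sketch against Amazon's EC2 API via the boto3 library (the AMI ID, instance type, and counts are all placeholders, not anything from a real deployment):

```python
# A rough sketch of scaling up and back down on EC2 using boto3.
# The AMI ID, instance type, and server counts are placeholders.
import boto3

ec2 = boto3.client("ec2")

# The onslaught arrives: launch 100 more servers from your app's image.
result = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder: your application's image
    InstanceType="m1.small",
    MinCount=100,
    MaxCount=100,
)
instance_ids = [i["InstanceId"] for i in result["Instances"]]

# ... serve your 100,000 new users ...

# The onslaught is over: drop those servers and stop paying for them.
ec2.terminate_instances(InstanceIds=instance_ids)
```

The whole point is that the second half of the script is as easy as the first: capacity you stop paying for the moment you no longer need it.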
Looking at it this way - who wouldn't use the cloud? Assuming it's reasonably priced (and so far, they seem to be) - why not?
There are several reasons, actually. One is that the tools for cloud computing are still catching up. As you can imagine, traditional systems for monitoring applications fall apart in many cases: you simply don't know where your application will "truly" end up running. And it's not uncommon for an application to perform differently in the cloud than it did in your testing. The cloud is a wonderful tool for throwing your code over a wall and letting it just run - until you actually need to know more detail. (This gap was one impetus for PreEmptive's Runtime Intelligence software: let your applications phone home and tell you how they're doing, from anywhere.)
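Just to illustrate the idea - and to be clear, this is a generic sketch of my own using only the standard library, not PreEmptive's actual API, and the endpoint URL is made up - a phone-home report can be as simple as:

```python
# A generic "phone home" sketch (not PreEmptive's actual API): the app
# periodically reports where it's running and how it's doing, since you
# can't point a traditional monitor at a machine you can't locate.
import json
import platform
import time
import urllib.request

TELEMETRY_URL = "https://example.com/telemetry"  # placeholder endpoint

def report(started_at, requests_served):
    payload = json.dumps({
        "host": platform.node(),   # whatever machine the cloud handed us
        "uptime_sec": time.time() - started_at,
        "requests_served": requests_served,
    }).encode("utf-8")
    req = urllib.request.Request(
        TELEMETRY_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)
```

The application becomes its own monitoring agent: instead of you finding it, it finds you.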
Another concern is data privacy. If your business pulls credit reports, it's not likely you'll be able to store them in the cloud - if only for procedural reasons, the cloud just won't meet the requirements set forth by many security standards. Quite often they simply dictate that you must own the key to the machine holding the data.
Notably, Sun is introducing its cloud with the ability for you to run it on your own machines. We'll see how this pans out, but of course that puts all the maintenance and configuration back on you. In addition, open source solutions are appearing that come close to letting you set up your own cloud if you want to.
Without a doubt, a cloud is a great way to start a small business or new project: it effectively outsources all the computing details, with (nearly infinite) room to grow. You're accepting some degree of vendor lock-in, but that's not necessarily bad - or even true, depending on the cloud system. Also, the tools to support your cloud are trailing the cloud itself, so make sure the system you choose has all the pieces you'll need to maintain your applications, even though they're somewhere out in the cloud.