Using cloud servers for compiling applications


If you are a programmer, you are well aware of the process called “compilation”, which basically takes the source code and converts it into an executable file or, in some cases, into byte code that is then interpreted by another program.

Thanks to the increase in computing power, this process can be swift even on an inexpensive home PC. But in some cases it can take a few minutes or more. That may not seem like much, but for a programmer testing new code several times a day, the idle time quickly adds up. And sometimes the process takes hours, grinding your computer to a screeching halt.

In this article, we’ll talk about a potential way of speeding up the process and some goodies that come along with it.

Why compile remotely?

Even if you have a fast computer, chances are you could still use more computing power. Cloud computing offers exactly that: computing power at your fingertips, and, better yet, you decide how much of it to use and when.

Compiling remotely keeps your workstation free for other compute-intensive tasks, like encoding/decoding, testing applications, rendering or number crunching of any kind, or simply responsive enough for your everyday applications.

Additionally, you sometimes need to make sure your program runs on different versions of the same platform, or on different platforms and operating systems altogether. Sure, virtualization can be done locally with a slew of software applications, but this way you don’t have to worry about setting up anything but the servers themselves. And let’s not forget, they can be used for other tasks as well.

There are arguably many cases where remote compilation pays off, so let’s see how one would usually go about it.

Okay, but how?

Basically, all you need to do is create new servers and set up the compilers accordingly. With a little organization you can keep several servers sleeping, each with different compiler versions and platforms. You can then fire them up, compile, and put them back to sleep.

When we talk about setting up compilers accordingly, we mean installing your compiler of choice plus some tool for distributed compiling. These tools spread the compilation across several machines and then link everything back together. Well-known examples are DistCC (usually under Linux) and Xoreax’s IncrediBuild (for Windows, with a license fee).
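
As a rough illustration, here is a minimal Python sketch of driving a distcc-based build from one machine. The project path, host addresses and job count are placeholders you would replace with your own servers; it assumes a make-based project with distcc installed on every machine involved.

```python
# Minimal sketch: run a make build with distcc spread across several servers.
# The project path, host addresses and job count below are placeholders.
import os
import subprocess

def distributed_build(source_dir, hosts, jobs=8):
    """Run `make` with the compiler calls farmed out to the given hosts via distcc."""
    env = os.environ.copy()
    # distcc reads the list of compile hosts from the DISTCC_HOSTS variable.
    env["DISTCC_HOSTS"] = " ".join(hosts)
    # Tell make to invoke the compiler through distcc and run jobs in parallel.
    subprocess.run(
        ["make", "-j", str(jobs), "CC=distcc gcc"],
        cwd=source_dir,
        env=env,
        check=True,
    )

if __name__ == "__main__":
    distributed_build("/home/dev/myproject", ["localhost", "10.0.0.5", "10.0.0.6"])
```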

One disadvantage of this approach is that you have to upload your code changes and then download the executable to test it locally. Thanks to version control, though, the upload side is easy: it generally transfers only small changes and lets you keep track of modifications. Not to mention, it is a great way to back up your code, which is a welcome bonus. For downloading, you can set up a quick web server that serves the build output, or simply use FTP.
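
The download step can be as small as the following Python sketch; the URL is a made-up placeholder for wherever you decide to expose the build output on the build server.

```python
# Minimal sketch: fetch the freshly built executable from a web server
# running on the build machine. The URL below is a hypothetical placeholder.
import urllib.request

BUILD_URL = "http://build.example.com/output/myapp"  # placeholder

def download_build(url=BUILD_URL, destination="myapp"):
    """Download the compiled binary so it can be tested locally."""
    with urllib.request.urlopen(url) as response, open(destination, "wb") as out:
        out.write(response.read())
    print(f"Saved build to {destination}")

if __name__ == "__main__":
    download_build()
```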

Another benefit of remote servers is that you can use a great number of tools to automate the process. You can even instruct the version control software to compile a new version the minute changes are uploaded. Or, as many teams already do, have a script that fetches the latest version late at night (the so-called nightly build) and have a fresh version ready for your next morning coffee, along with all kinds of error reports. In this regard, the possibilities are endless.
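
A nightly build script can be as simple as the sketch below. It assumes a Git repository and a make-based build, and the paths are placeholders; scheduled with cron late at night, it leaves a fresh build and its error report waiting for you in the morning.

```python
# Minimal sketch of a nightly build: pull the latest changes, build,
# and write an error report. Assumes Git and make; paths are placeholders.
import datetime
import subprocess

REPO_DIR = "/home/build/myproject"         # placeholder
REPORT = "/home/build/nightly-report.txt"  # placeholder

def nightly_build():
    with open(REPORT, "w") as report:
        report.write(f"Nightly build {datetime.datetime.now():%Y-%m-%d %H:%M}\n")
        # Fetch the latest version of the code.
        subprocess.run(["git", "pull"], cwd=REPO_DIR, check=True)
        # Compile, capturing compiler output (including errors) in the report.
        result = subprocess.run(
            ["make", "-j", "8"],
            cwd=REPO_DIR,
            stdout=report,
            stderr=subprocess.STDOUT,
        )
        report.write("\nBUILD OK\n" if result.returncode == 0 else "\nBUILD FAILED\n")

if __name__ == "__main__":
    nightly_build()
```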

Finally, thanks to our City Cloud API, you can automate starting and shutting down servers according to your computing needs. This ability to manage your servers manually or automatically ultimately ends up saving you money.
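
To illustrate the pattern, the Python sketch below wraps a build in start and stop calls to a cloud API. The base URL, endpoints and parameters are hypothetical placeholders, not the actual City Cloud API; consult the API documentation for the real calls.

```python
# Sketch of the start-build-stop pattern around a cloud API.
# NOTE: the base URL, endpoints and parameters are hypothetical placeholders,
# not the real City Cloud API -- they only illustrate powering servers on
# for a build and off again afterwards.
import urllib.request

API_BASE = "https://api.example-cloud.com/v1"  # placeholder
API_KEY = "your-api-key"                       # placeholder

def server_action(server_id, action):
    """Send a power action (e.g. 'start' or 'stop') for one server."""
    req = urllib.request.Request(
        f"{API_BASE}/servers/{server_id}/{action}",
        method="POST",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    with urllib.request.urlopen(req) as response:
        return response.status

def build_with_on_demand_servers(server_ids, run_build):
    """Start the build servers, run the build, then shut them down again."""
    for sid in server_ids:
        server_action(sid, "start")
    try:
        run_build()  # e.g. the distributed_build() sketch shown earlier
    finally:
        for sid in server_ids:
            server_action(sid, "stop")
```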

Cost Effective

Of course, cloud servers are not free, but the cost only grows when you run many of them for long stretches of time. In this scenario, the servers are short-lived, and even if you do need some of them running at all times, chances are you are still saving money on local hardware and monthly electricity bills.

The point here is that you ultimately have control. Projects usually grow from a quick compilation at the beginning, when there are few features, to builds of several minutes or even hours at the end. Similarly, you could start with just one server and then scale up as the code complexity increases.

All of this makes your project future-proof: you don’t have to budget for a great deal of hardware at the start, and you can rest assured that you will have more power when you need it.

Conclusions and beyond

Cloud computing has made its way not only into the server space but also into the private enterprise. Locally or remotely, it doesn’t matter: virtualization works the same way. Several teams are already enjoying the benefits of this approach, be it for stress testing, nightly builds, compatibility testing or platform development.

Next time we come back to this subject, we’ll show you a basic example of how to use DistCC to compile your project with several servers.