We use IncrediBuild for our own personal builds; I don't know if it's used for the daily builds (I'm not in charge of those), since it doesn't really matter how long they take.
(edit)
It works very well, and will save a lot of time on a full rebuild. In our case, we can go from 45 minutes (on my slow machine) to 6 minutes with all CPUs in use. When the build is dispatched, it looks at the load on each machine and decides accordingly whether or not to use it.
Having multiple cores also really helps.
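For anyone curious, kicking off a distributed rebuild from the command line is a one-liner (a sketch; BuildConsole is IncrediBuild's command-line front end, and the solution name and configuration here are placeholders):

    BuildConsole MySolution.sln /rebuild /cfg="Release|Win32"

Everything after that - spotting idle machines and farming out the compile jobs - happens automatically.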
|
Is the reduction in compilation times directly proportional to the number of cores running the build?
|
No, because (using non-dedicated computers) you never know whether people are working on their PCs or not.
Like everything parallel, it's never directly proportional.
|
Yep - some 5 years ago I was involved in making a build system (with Perl) that used multiple machines to build our machine translation system.
As for my current job, I'm not even sure how it works - there are full-time build engineers who take care of it, but I bet they do use clustered builds.
|
On Linux, all the time, since it is very easy to do and the required software is open source and free. On Windows (using Microsoft compilers) the answer is no, because the required software is too expensive.
John
|
I was actually amazed that the Xcode IDE, used to build Mac and iPhone applications, has an option for clustered builds built in. I mean, why don't we just have this 'by default' in Visual Studio?
|
Visual Studio 2008 does, in a way. If you have a multi-proc/multi-core system, it will compile multiple projects at once when building a solution.
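If you drive the build from the command line, the same thing can be requested explicitly (a sketch; the solution name is a placeholder). MSBuild's /m switch builds independent projects in parallel, and for C++ the /MP compiler option additionally compiles multiple source files at once within one project:

    msbuild MySolution.sln /m:4    (build up to 4 projects concurrently)
    cl /MP4 /c *.cpp               (compile up to 4 translation units at once)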
Software Zen: delete this;
|
Yep. But unfortunately I can't just stick extra CPUs into my machine whenever I want.
|
Indeed.
It's nice in that the multi-compile feature seems to speed up individual builds, but it doesn't help with an overall build process unless you structure your product into a small number of solutions that include a large number of independent projects. Most of the components in our product have a single project within a single solution. The solutions with multiple projects usually have dependencies that require a specific build order (project B requires the build results from project A, project C requires B, and so on), and therefore running the builds in parallel doesn't get you anything (in fact, it wouldn't work).
I'm considering a home-grown solution for ours. We recently switched our build machine to a four-processor server box, and it mostly loafs along during a build.
Software Zen: delete this;
|
But surely there is some mechanism for determining build order? By that I mean that you could probably (haven't tried, of course) write a VS add-in that does it - see the solution-file fragment below.
My concern is somewhat different: I'm generally in favour of having one build machine, so my thinking is: how can we make a machine that compiles things quickly enough? I'm thinking along the lines of system-on-a-chip and all that, but admittedly it doesn't look technically feasible right now.
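For what it's worth, Visual Studio already records manual project dependencies as plain text in the .sln file, so an add-in (or an external tool) could read them directly. A fragment looks roughly like this (GUIDs shortened to placeholders):

    Project("{...}") = "ProjectB", "ProjectB\ProjectB.vcproj", "{GUID-B}"
        ProjectSection(ProjectDependencies) = postProject
            {GUID-A} = {GUID-A}
        EndProjectSection
    EndProject

Here ProjectB declares a dependency on the project with GUID-A, which is what forces the build order when the solution is built.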
|
Some of our dependencies can be determined automatically; others cannot. We use some external tools during the build process whose outputs are cumbersome to get Visual Studio to consider in its dependency evaluation. We've found it's a lot more reliable to set the project dependencies manually.
We are like you, in that we have a single build machine. I'm interested in finding a solution that uses that machine more efficiently, and hopefully completes the build process in a shorter amount of time. The only part of the process that may potentially be reduced by making better use of multiple processors is the compile step (the longest step of the process). Retrieving sources from source control, constructing the installation, and publishing files to the public areas, while time-consuming, are all unfortunately 'single-threaded' due to the nature of the operations.
Software Zen: delete this;
|
I have been doing this with VS2005 as well, but it does not use other computers in a build farm the way gcc (with distcc and ccache) has for many years.
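For reference, the gcc-side setup is tiny - a couple of environment variables and a parallel make (the host names below are placeholders):

    export DISTCC_HOSTS='localhost box1 box2'   # machines running the distcc daemon
    export CCACHE_PREFIX=distcc                 # ccache hands cache misses to distcc
    export CC='ccache gcc'
    make -j8                                    # keep enough jobs in flight for all hosts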
John
|
Interesting. We're skipping directly from VS2003 to VS2008, and I didn't know VS2005 provided that.
John M. Drescher wrote: it does not use other computers in a build farm the way gcc (with distcc and ccache) has for many years
We're not up to a 'build farm' yet, fortunately. In addition to being a developer, I'm also the D.S.J.B. (Departmental Sh!t-Job Boy), which means I'm the admin for our source control system and our build machine. Moving our build stuff to our new server box was enough of a pain. I don't like to think about keeping track of a bunch of boxes, licenses, backups, etc.
Software Zen: delete this;
|
Gary Wheeler wrote: I didn't know VS2005 provided that.
http://vagus.wordpress.com/2008/02/15/source-level-parallel-build-in-visual-studio-2005/[^]
Gary Wheeler wrote: We're not up to a 'build farm' yet, fortunately.
Since I use Gentoo Linux (which installs everything from source code), I have invested time in getting this working because it saves a lot of time when upgrading. I have over 20 cores available for building. Recently I moved from distcc to Icecream, and this greatly simplifies the setup, especially with x86 and x86_64 clients on the network. On top of not needing any specific setup for different architectures, it also allows me to use different distributions in the build. I mean I can have a dozen Gentoo boxes participate in a build farm with an 8-core SUSE box, and the setup is simple. Icecream sends a small compiler environment from the client to all of the hosts on the first build, and after that this build environment does the building independently of what compilers are on each host. And finally, Icecream schedules the builds to the most powerful (and least loaded) systems first.
http://en.opensuse.org/Icecream[^]
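For anyone wanting to try it, the setup really is minimal (a sketch, assuming the icecream packages are installed; daemon and scheduler binary names may vary slightly by distribution):

    icecc-scheduler -d            # run on exactly one machine on the network
    iceccd -d                     # run on every machine that should accept jobs
    make -j20 CC=icecc CXX=icecc  # build through the icecc compiler wrapper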
John
|
It depends on what you mean by clustered builds.
I run my builds on a farm that has two build agents and two database servers that are required to run tests.
However, I build two versions of the same program in parallel (each agent builds a single version), so it's actually not that clustered.
|
Our build process is automated, but we don't initiate that process automatically. We perform builds on an ad hoc basis, whenever we need one for testing or release purposes.
Given that our build process takes from 40 minutes to over an hour, doing a build each time someone checked something in would have us doing overlapping, continual builds.
Software Zen: delete this;
|
Yes.
We have a dedicated machine set up for builds too, and a build is performed only when we need a new release.
|
Gary Wheeler wrote: doing a build each time someone checked something in would have us doing overlapping, continual builds.
How about nightly builds (plus automated tests)?
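Even a plain cron entry (or a Scheduled Task on Windows) covers the nightly case; nightly-build.sh below is a hypothetical stand-in for whatever script drives the build and tests:

    # run the full build and test suite every night at 2:00 AM
    0 2 * * * /opt/build/nightly-build.sh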
|
The only benefit I can see that nightly builds would give us over our ad hoc builds would be validating that the checked-in code is buildable. Our group 'culture' is such that builds, when broken, don't stay broken long.
Our rules are simple:
1. Don't check in breaking changes unless you know what will be broken.
2. If you check something in that other team members must deal with, it's your job to let them know and make sure they make the necessary changes.
Automated testing on a build basis is a nice idea, but not terribly applicable to our products, which are process control applications that run machinery. It would be very difficult to automate the test environment sufficiently to perfectly simulate the hardware. I can easily see us having more code in the test bed than in the actual application (somewhere around 500,000 lines of C++). We currently do manual 'unit testing' with a variety of stand-ins, test bench and simulator applications.
Software Zen: delete this;
|
I use automated configuration, building, and testing for all my projects, even small ones, and I very much appreciate its benefits, so I was perplexed when I saw the number of "We do not use automated builds" responses. Why not use automated builds? What are your reasons (if any)? An intrigued mind asks.
Regards.
|
Our software isn't big enough to need an automated build. The build in VS takes less than a minute. The biggest project is only 20,000 lines.
|
hopingToCode wrote: The build in VS takes less than a minute.
Automation's purpose is not to improve build speed but to automate the build process. If you have several people working on it, a testing framework, or a code repository, then automation is very useful.
Regards.
|
PedroMC wrote: If you have several people working on it, a testing framework, or a code repository, then automation is very useful.
How I would wish... out of interest, do you automate builds for websites as well, or just other types of apps?
|
hopingToCode wrote: How I would wish... out of interest, do you automate builds for websites as well, or just other types of apps?
Yes, most of the time. It can automate (some) testing, documentation creation, compilation, and deployment to a test server. There is not much difference in automation between web app projects and other app projects. One thing that I don't know how to automate is testing the UI part of web apps (it's not much easier for GUI apps). Someone must always go through every page in several browsers - boring, repetitive work. Automation can help in collecting a list of pages that need testing, but that is about it.
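As a rough sketch, the whole pipeline can be one short script; every name below is a placeholder for whatever tools the project actually uses:

    rem nightly/CI build sketch - all paths and names are placeholders
    svn update
    msbuild WebApp.sln /m /p:Configuration=Release
    mstest /testcontainer:WebApp.Tests.dll
    xcopy /e /y WebApp\bin \\testserver\webapp\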
Regards.
|
For testing web UIs, have a look at Selenium.
http://selenium.seleniumhq.org/