|
williamnw wrote: So as I write about our lovely automatic process, I'm running manual builds of all the code lines.
Welcome, brother.
I'm the keeper/inventor/curser-atter of our build process as well. Ours is a combination of a Windows application, VBScript, and a couple of batch files that run the various processes to build our products. Debugging it is a PITA, since our builds take an hour. Given that we may do several builds in a day, if I need to make a change to the build process, I often have to get it right the very first time. If I don't, I end up with whiny, pissed-off people in my cube, and that's just annoying.
Software Zen: delete this;
|
|
|
|
|
Gary Wheeler wrote: I end up with whiny, pissed-off people in my cube, and that's just annoying
My solution to this is that I have a mug on which is written 'Grumpy Old Man'. I have nurtured my reputation as a bad-tempered SOB. The code monkeys fear my wrath should they become 'wingeing-whiny-snot-noses'. I'm also the one who goes over to them and informs them, diplomatically of course, that they've broken the build with their kack-handed attempts at writing code.
'Part from that, I'm a sweetie.
Panic, Chaos, Destruction.
My work here is done.
|
|
|
|
|
Hmm. My mug, presented to me by my daughter three years ago, reads "Professional Smartass".
Software Zen: delete this;
|
|
|
|
|
Mine came from darling Mrs Wife.
Panic, Chaos, Destruction.
My work here is done.
|
|
|
|
|
|
I should have added that builds that leave the developers for alpha/beta test are automated to assure consistency and all components are versioned accordingly.
|
|
|
|
|
Yes it does. We have HTML/CSS/JavaScript projects that get "built." The point of the build is to run tests and produce a deployable package. And builds lead to build numbers which make issue tracking much easier than "oh, that release with the button that looks a bit squiff in IE6."
(Possibly not a technically correct usage of the term "build" but in practice it is what we need.)
|
|
|
|
|
Just curious - does anyone have experience with multi-computer (clustered) builds?
|
|
|
|
|
We use IncrediBuild for our own personal builds; I don't know if it's used for the daily builds (I'm not in charge of that), since for those it doesn't really matter how long they take.
(edit)
It works very well, and will save a lot of time on a full rebuild. In our case, we can go from 45 minutes (on my slow machine) to 6 minutes with all CPUs in use. When a build is dispatched, it looks at the load on each machine and decides whether to use it accordingly.
Having multiple cores also really helps.
|
|
|
|
|
Is the reduction in compilation time directly proportional to the number of cores running the build?
|
|
|
|
|
No, because with non-dedicated computers you never know whether people are using their PCs at the time.
Like everything parallel, the speedup is never directly proportional.
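The "never directly proportional" observation is Amdahl's law: if a fraction s of the build is inherently serial (linking, dispatch overhead, busy machines), n cores can speed it up by at most 1 / (s + (1 - s) / n). A quick sketch; the serial fraction s = 0.1 here is an assumed value, not a measurement:

```shell
# Amdahl's law: best-case speedup on n cores with serial fraction s.
s=0.1
for n in 1 2 4 8 16; do
  awk -v s="$s" -v n="$n" 'BEGIN { printf "%2d cores: %.2fx\n", n, 1/(s+(1-s)/n) }'
done
```

Even with only 10% serial work, 16 cores buy barely a 6.4x speedup.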
|
|
|
|
|
Yep - some 5 years ago I was involved in making a build system (in Perl) that used multiple machines to build our machine translation system.
As for my current job, I am not even sure how it works - there are full-time build engineers that take care of it, but I bet they do use clustered builds.
|
|
|
|
|
On Linux, all the time, since this is very easy to do and the required software is open source and free. On Windows (using Microsoft compilers) the answer is no, because the required software is too expensive.
John
|
|
|
|
|
I was actually amazed that the Xcode IDE, used to build Mac and iPhone applications, has an option for clustered builds built in. I mean, why don't we just have this by default in Visual Studio?
|
|
|
|
|
Visual Studio 2008 does, in a way. If you have a multi-proc/multi-core system, it will compile multiple projects at once when building a solution.
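The same behavior is available from the command line via MSBuild's /m (maxcpucount) switch, which ships with .NET 3.5 / VS2008. A sketch; the solution name and configuration here are hypothetical:

```shell
# Build independent projects in parallel, one worker per core by default.
msbuild MySolution.sln /m /p:Configuration=Release

# Or cap the number of parallel project builds explicitly:
msbuild MySolution.sln /m:4 /p:Configuration=Release
```

Note this parallelizes across projects, not within one, so a solution of serially dependent projects still builds one project at a time.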
Software Zen: delete this;
|
|
|
|
|
Yep. But unfortunately I can't just stick extra CPUs into my machine whenever I want.
|
|
|
|
|
Indeed.
It's nice in that the multi-compile feature seems to speed up individual builds, but it doesn't help with an overall build process unless you structure your product into a small number of solutions that include a large number of independent projects. Most of the components in our product have a single project within a single solution. The solutions with multiple projects usually have dependencies that require a specific build order (project B requires the build results from project A, project C requires B, and so on), and therefore running the builds in parallel doesn't get you anything (in fact, it wouldn't work).
I'm considering a home-grown solution for ours. We recently switched our build machine to a four-processor server box, and it mostly loafs along during a build.
Software Zen: delete this;
|
|
|
|
|
But surely there is some mechanism for determining build order? By that I mean that you could probably (haven't tried, of course) write a VS add-in that does it.
My concern is somewhat different: I'm generally in favour of having one build machine, so my thinking is how can we make a machine that compiles things quickly enough? Thinking along the lines of system-on-a-chip and all that but, admittedly, it doesn't look technically feasible right now.
|
|
|
|
|
Some of our dependencies can be determined automatically; others cannot. We use some external tools during the build process whose outputs are cumbersome to get Visual Studio to consider in its dependency evaluation. We've found it's a lot more reliable to set the project dependencies manually.
We are like you, in that we have a single build machine. I'm interested in finding a solution that uses that machine more efficiently, and hopefully completes the build process in a shorter amount of time. The only part of the process that may potentially be reduced by making better use of multiple processors is the compile step (the longest step of the process). Retrieving sources from source control, constructing the installation, and publishing files to the public areas, while time-consuming, are all unfortunately 'single-threaded' due to the nature of the operations.
Software Zen: delete this;
|
|
|
|
|
I have been doing this with 2005 as well. But it does not use other computers in a build farm the way gcc (with distcc and ccache) has done for many years.
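For reference, the distcc + ccache combination on the gcc side needs little more than a couple of environment variables. A sketch, assuming both tools are installed, distccd is running on the remote boxes, and the hostnames are hypothetical:

```shell
# Machines allowed to accept compile jobs (each runs the distccd daemon).
export DISTCC_HOSTS="localhost build1 build2 build3"
# On a cache miss, ccache hands the compile off to distcc for remote execution.
export CCACHE_PREFIX="distcc"
# Run enough jobs to keep the remote cores busy.
make -j16 CC="ccache gcc" CXX="ccache g++"
```

Cache hits never leave the local machine, so repeated rebuilds stay fast even without the farm.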
John
|
|
|
|
|
Interesting. We're skipping directly from VS2003 to VS2008, and I didn't know VS2005 provided that.
John M. Drescher wrote: it does not use other computers in the build farm like gcc (with distcc and ccache) has done for many years
We're not up to a 'build farm' yet, fortunately. In addition to being a developer, I'm also the D.S.J.B. (Departmental Sh!t-Job Boy), which means I'm the admin for our source control system and our build machine. Moving our build stuff to our new server box was enough of a pain. I don't like to think about keeping track of a bunch of boxes, licenses, backups, etc.
Software Zen: delete this;
|
|
|
|
|
Gary Wheeler wrote: I didn't know VS2005 provided that.
http://vagus.wordpress.com/2008/02/15/source-level-parallel-build-in-visual-studio-2005/[^]
Gary Wheeler wrote: We're not up to a 'build farm' yet, fortunately.
Since I use Gentoo Linux (which installs everything from source code), I have invested time in getting this working because it saves a lot of time when upgrading. I have over 20 cores available for building. Recently I have moved from distcc to icecream, and this greatly simplifies the setup, especially with x86 and x86_64 clients on the network. On top of not needing any specific setup for different architectures, it also allows me to use different distributions in the build. I mean I can have a dozen Gentoo boxes participate in a build farm with an 8-core SUSE box, and the setup is simple. icecream sends a small compiler environment from the client to all of the hosts on the first build, and after that this build environment does the building independently of what compilers are on each host. And finally, icecream schedules jobs to the most powerful systems (with the lowest CPU load) first.
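For anyone curious, icecream usage is similarly lightweight once the daemons are up. A sketch, assuming iceccd runs on each participating box, one icecc-scheduler runs somewhere on the network, and the wrapper path matches your distribution's packaging:

```shell
# Put the icecc compiler wrappers ahead of the real gcc/g++.
export PATH=/usr/lib/icecc/bin:$PATH
# The scheduler farms jobs out to the least-loaded hosts automatically.
make -j24
```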
http://en.opensuse.org/Icecream[^]
John
modified on Tuesday, November 18, 2008 11:36 AM
|
|
|
|
|
It depends on what you mean by clustered builds.
I run my builds on a farm that has two build agents and two database servers that are required to run tests.
However, I build two versions of the same program in parallel (each agent runs a single version), so it's actually not all that clustered.
|
|
|
|
|
Our build process is automated, but we don't initiate that process automatically. We perform builds on an ad hoc basis, whenever we need one for testing or release purposes.
Given that our build process takes from 40 minutes to over an hour, doing a build each time someone checked something in would have us doing overlapping, continual builds.
Software Zen: delete this;
|
|
|
|
|
Yes.
We have a dedicated machine set up for build too and it is performed only when we need to build a new release.
|
|
|
|