|
Some of our dependencies can be determined automatically; others cannot. We use some external tools during the build process whose outputs are cumbersome to get Visual Studio to account for in its dependency evaluation. We've found it a lot more reliable to set the project dependencies manually.
We are like you, in that we have a single build machine. I'm interested in finding a solution that uses that machine more efficiently, and hopefully completes the build process in a shorter amount of time. The only part of the process that may potentially be reduced by making better use of multiple processors is the compile step (the longest step of the process). Retrieving sources from source control, constructing the installation, and publishing files to the public areas, while time-consuming, are all unfortunately 'single-threaded' due to the nature of the operations.
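The split described above is essentially Amdahl's law in miniature: the serial steps (source retrieval, building the installation, publishing) set a floor on build time no matter how many processors attack the compile step. A rough back-of-the-envelope sketch (the 20-minute and 40-minute figures are invented for illustration, not taken from the post):

```python
def total_build_time(serial_minutes, compile_minutes, cores):
    """Serial steps (source retrieval, installer construction,
    publishing) don't scale with cores; only the compile step
    is divided across them."""
    return serial_minutes + compile_minutes / cores

# Hypothetical one-hour build: 20 min of serial work, 40 min compiling.
one_core = total_build_time(20, 40, 1)    # 60.0 minutes
four_cores = total_build_time(20, 40, 4)  # 30.0 minutes
print(one_core, four_cores)
```

Even with unlimited cores, this hypothetical build never drops below the 20 serial minutes, which is why only the compile step is worth parallelizing.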
Software Zen: delete this;
|
|
|
|
|
I have been doing this with 2005 as well, but it doesn't use other computers in a build farm the way gcc (with distcc and ccache) has for many years.
John
|
|
|
|
|
Interesting. We're skipping directly from VS2003 to VS2008, and I didn't know VS2005 provided that.
John M. Drescher wrote: it does not use other computers in the build farm like gcc (with distcc and ccache) has done for many years
We're not up to a 'build farm' yet, fortunately. In addition to being a developer, I'm also the D.S.J.B. (Departmental Sh!t-Job Boy), which means I'm the admin for our source control system and our build machine. Moving our build stuff to our new server box was enough of a pain. I don't like to think about keeping track of a bunch of boxes, licenses, backups, etc.
Software Zen: delete this;
|
|
|
|
|
Gary Wheeler wrote: I didn't know VS2005 provided that.
http://vagus.wordpress.com/2008/02/15/source-level-parallel-build-in-visual-studio-2005/[^]
Gary Wheeler wrote: We're not up to a 'build farm' yet, fortunately.
Since I use Gentoo Linux (which installs everything from source code), I have invested time in getting this working because it saves a lot of time when upgrading. I have over 20 cores available for building. Recently I moved from distcc to icecream, which greatly simplifies the setup, especially with x86 and x86_64 clients on the network. Besides not needing any architecture-specific setup, it also lets me mix distributions in the build: I can have a dozen Gentoo boxes participate in a build farm alongside an 8-core SUSE box, and the setup is simple. icecream sends a small compiler environment from the client to all of the hosts on the first build, and after that this build environment does the compiling independently of whatever compilers are installed on each host. Finally, icecream schedules jobs to the most powerful (and least loaded) systems first.
http://en.opensuse.org/Icecream[^]
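The "most powerful, least loaded host first" policy described above can be illustrated with a toy model. This is not icecream's actual scheduler code, and the host names and numbers are invented; it just shows the shape of the heuristic:

```python
def pick_host(hosts):
    """Toy scheduling heuristic: prefer the host with the most
    free capacity, i.e. core count discounted by current load."""
    return max(hosts, key=lambda h: h["cores"] * (1.0 - h["load"]))

# Hypothetical farm: a lightly loaded 4-core box and a busier 8-core box.
hosts = [
    {"name": "gentoo-box", "cores": 4, "load": 0.10},  # 3.6 free cores
    {"name": "suse-box",   "cores": 8, "load": 0.25},  # 6.0 free cores
]
print(pick_host(hosts)["name"])  # the 8-core box wins despite higher load
```

The point is that raw core count alone isn't enough; weighting by current load keeps a busy monster box from starving jobs that a quieter machine could finish sooner.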
John
modified on Tuesday, November 18, 2008 11:36 AM
|
|
|
|
|
It depends on what you mean by clustered builds.
I run my builds on a farm that has two build agents and two database servers that are required to run tests.
However, I build two versions of the same program in parallel (each agent builds one version), so it's not really that clustered.
|
|
|
|
|
Our build process is automated, but we don't initiate that process automatically. We perform builds on an ad hoc basis, whenever we need one for testing or release purposes.
Given that our build process takes from 40 minutes to over an hour, doing a build each time someone checked something in would have us doing overlapping, continual builds.
Software Zen: delete this;
|
|
|
|
|
Yes.
We have a dedicated machine set up for builds too, and it is used only when we need to build a new release.
|
|
|
|
|
Gary Wheeler wrote: doing a build each time someone checked something in would have us doing overlapping, continual builds.
How about nightly builds (plus automated tests)?
|
|
|
|
|
The only benefit I can see that nightly builds would give us over our ad hoc builds is validating that the checked-in code is buildable. Our group 'culture' is such that builds, when broken, don't stay broken for long.
Our rules are simple:
1. Don't check in breaking changes unless you know what will be broken.
2. If you check something in that other team members must deal with, it's your job to let them know and make sure they make the necessary changes.
Automated testing on a build basis is a nice idea, but not terribly applicable to our products, which are process control applications that run machinery. It would be very difficult to automate the test environment sufficiently to perfectly simulate the hardware. I can easily see us having more code in the test bed than in the actual application (somewhere around 500,000 lines of C++). We currently do manual 'unit testing' with a variety of stand-ins, test bench and simulator applications.
Software Zen: delete this;
|
|
|
|
|
I use automated configuration, build, and testing for all my projects, even small ones, and I very much appreciate its benefits, so I was perplexed when I saw the number of "We do not use automated builds" answers. Why not use automated builds? What are your reasons (if any)? An intrigued mind asks.
Regards.
|
|
|
|
|
Our software isn't big enough to need an automated build. The build in VS takes less than a minute; the biggest project is only 20,000 lines.
|
|
|
|
|
hopingToCode wrote: The build in VS takes less then a minute.
Automation's purpose is not to improve build speed but to automate the build process. If you have several people working on it, a testing framework, or a code repository, then automation is very useful.
Regards.
|
|
|
|
|
PedroMC wrote: If you have several people working on it, a testing framework or a code repository then automation is very useful.
How I would wish... Out of interest, do you automate builds for websites as well, or just other types of apps?
|
|
|
|
|
hopingToCode wrote: Out of interest, do you automate builds for websites as well, or just other types of apps?
Yes, most of the time. It can automate (some) testing, documentation generation, compilation, and deployment to a test server. There is not much difference in automation between web app projects and other app projects. One thing that I don't know how to automate is testing the UI part of web apps (it's not much easier for GUI apps): someone must always go through every page in several browsers, which is boring, repetitive work. Automation can help in collecting a list of pages that need testing, but that is about it.
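The 'collecting a list of pages that need testing' part is cheap to automate. A minimal sketch using only the Python standard library; the inline HTML string stands in for a fetched page, and in practice you'd feed it the markup of each page you crawl:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gathers href targets from a page so a tester knows
    which pages still need a manual walkthrough."""
    def __init__(self):
        super().__init__()
        self.pages = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.pages.append(value)

collector = LinkCollector()
collector.feed('<a href="/home.aspx">Home</a><a href="/orders.aspx">Orders</a>')
print(collector.pages)  # ['/home.aspx', '/orders.aspx']
```

This only produces the to-test list; as the post says, actually exercising each page in several browsers is still a manual (or Selenium-style) job.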
Regards.
|
|
|
|
|
For testing web UIs, have a look at Selenium.
http://selenium.seleniumhq.org/
|
|
|
|
|
We're not doing automated builds, but I'm trying to lead us that way. The first step is to introduce patterns that make the UI as "dumb" as possible. This way we can do automated testing on 99% of the code. We're doing our first WPF project and I'm using it to try and change our development habits and mentality. The next step is either moving to ASP.NET MVC or using the same patterns in web forms we're using for WPF to make the UI dumb. It will be an interesting challenge.
|
|
|
|
|
If you are testing a website, it comes down to bringing in discipline around architecture. Your logic should live in separate components (assemblies in .NET) that can be invoked from unit tests. A lot has been said for and against MVC, but you don't even need to go that far: just a three-tiered approach.
If you get that right, then when a developer checks in code that introduces a bug, you will get instant notification (if using CI) that something broke.
If your UI is FUBAR, you can go as far as sending customers an aspx page/whatever instead of a dll/whatever that needs to be installed in the GAC, registered with COM, or given registry settings, preceded and followed by an IISReset. You get the idea.
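The 'logic in a separate, unit-testable component' idea can be shown in miniature. Python is used here for brevity (the class and method names are invented); the same shape applies to a .NET assembly invoked from a test project on a CI server:

```python
class OrderLogic:
    """Business rule in a plain class: no web server, no UI,
    so a unit test (or a CI build agent) can call it directly."""
    def total(self, items):
        # items is a list of (unit_price, quantity) pairs
        return sum(price * qty for price, qty in items)

# What a unit test on the build server would exercise,
# with no browser or IIS anywhere in sight:
logic = OrderLogic()
print(logic.total([(10.0, 2), (5.0, 1)]))  # 25.0
```

Because nothing here touches the UI tier, a check-in that breaks this rule fails the CI build immediately, which is exactly the 'instant notification' benefit described above.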
He who asks a question is a fool for five minutes. He who does not ask a question remains a fool forever. [Chinese Proverb]
Jonathan C Dickinson (C# Software Engineer)
|
|
|
|
|
I'm ignorant of how to do it in VS with TFS.
|
|
|
|
|
Normally, you do it using a build server like CruiseControl.NET in combination with a build tool like NAnt or MSBuild on a dedicated machine (the build server). There's no need for VS; you only need the compiler (e.g. csc.exe for C#), which can be used free of charge.
All the tools I mentioned are open source under free licenses, yet very mature and reliable, so there's no additional cost in using them apart from the learning effort, which, for a simple solution, is not that high. Since they are widely used, many examples exist on the web.
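For a rough idea of what this looks like, here is a skeletal ccnet.config for one project. The project name, path, and trigger interval are placeholders; check the CruiseControl.NET documentation for the full element set and for your MSBuild path:

```xml
<cruisecontrol>
  <project name="MyApp">
    <triggers>
      <!-- poll source control every 60 seconds -->
      <intervalTrigger seconds="60" />
    </triggers>
    <tasks>
      <msbuild>
        <executable>C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
        <projectFile>MyApp.sln</projectFile>
        <buildArgs>/p:Configuration=Release</buildArgs>
      </msbuild>
    </tasks>
  </project>
</cruisecontrol>
```

The server watches source control, and on each change (or on demand) runs the MSBuild task and publishes the results, all without Visual Studio installed.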
Regards
Thomas
Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.
Programmer - an organism that turns coffee into software.
|
|
|
|
|
Most of our projects (C++; we do not use .NET) are written by only one developer, with external code called from libraries, so there has not been a great need for automated builds. The build process is for the most part done on the developers' machines, and only when changes are made. That said, we are now moving to more cross-platform development with Qt, using CMake to generate projects. With this I can see automated testing becoming part of the build process (since it is built into CMake, via CTest).
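As a sketch of how little it takes to hook testing into a CMake-generated build (target and file names here are placeholders):

```cmake
# Minimal sketch: enable CTest and register one test executable.
cmake_minimum_required(VERSION 3.5)
project(MyApp CXX)

enable_testing()

add_executable(unit_tests tests/main.cpp)
add_test(NAME unit_tests COMMAND unit_tests)
```

After generating and building, running `ctest` in the build directory executes every registered test and reports pass/fail, which is the piece that slots naturally into an automated build.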
John
|
|
|
|
|
I have been a supporter of routinely, automatically building all of our apps, but it never rises to the top of the pile of flaming, gotta-be-done-now emergencies.
My main concern is that we don't detect software rot until there is a critical need to change the application. At which point it is too late. (Software rot shows up when the unchanged code stops working due to environmental changes like a new compiler, OS, changed DLLs, etc.)
|
|
|
|
|
Sorry, I don't fully get it.
Are you using an automated build server or not?
All things you describe can be perfectly solved with an automated build server, an appropriate suite of unit tests, and perhaps some software metrics measurement. This together acts like an automated quality assurance station.
Regards
Thomas
www.thomas-weller.de
Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning. Programmer - an organism that turns coffee into software.
|
|
|
|
|
No, we are not. I would like to but there is never enough time to set it up.
|
|
|
|
|
Harold Bamford wrote: there is never enough time to set it up.
As always...
There must be a commitment within the team and also with the management to do it. Otherwise it will never happen...
Regards
Thomas
www.thomas-weller.de
Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning. Programmer - an organism that turns coffee into software.
|
|
|
|
|
Harold Bamford wrote: My main concern is that we don't detect software rot until there is a critical need to change the application.
What?! Looking incredulously at the MANY errors and warnings. It built without a single error or warning yesterday.
Yep, I know the feeling.
Regards.
|
|
|
|