|
I suggest this article be deleted. The information it provides is inaccurate and will only cause confusion to readers.
Sun Jian
|
|
|
|
|
Gentlemen - Mr Aberglas is not the only one who has seen a similar problem. We have experienced semi-reproducible bugs on a system where the 1.1 framework was installed (via a full Windows Update download) *before* the 1.0 framework and dev environment. Uninstalling did not help; a clean reinstall of the OS and the 1.0 framework cleared the problem up.
.Net provides an excellent and very powerful framework and pretty good documentation. However, there are areas where the documentation is a little light or hard to come by, and it is natural that people using .Net in these early years don't know all that it can do, or all of its idiosyncrasies. I have been using it in anger for more than 2 years and still feel I have barely scratched the surface.
thePipe
|
|
|
|
|
Please report such issues to Microsoft. I've asked around, and we haven't heard of any problem like that.
Also, please consider asking the newsgroup for help if you run into a problem you can't resolve, or contacting MS's official support channels.
|
|
|
|
|
My point of view on dll hell is that the end-user does not have to know the versions of the dependent assemblies used by an application.
Using a version that is incremented at each build moves the DLL hell from deployment to compilation: it is now the developer who has to be aware of which assemblies he uses... that seems to be an improvement, since the developer should be aware of which assembly he wants to use.
Once an application is compiled and your integration tests pass, you can be sure that the deployed application will also work when shipped to the end user.
Why? Because your application will use only the assemblies you deployed (including your application's dependent assemblies), and not assemblies with the same name but an earlier version that already exist on the end user's workstation (shipped with a previous version of your application, for instance).
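The per-build increment described above can be expressed with the wildcard form of the AssemblyVersion attribute. A minimal sketch (the version number is illustrative; the wildcard behavior is standard .NET):

```csharp
// AssemblyInfo.cs -- sketch only. With the "*" wildcard, the C# compiler
// fills in the build and revision numbers automatically, so every build
// gets a distinct assembly version and references bind to that exact build.
using System.Reflection;

[assembly: AssemblyVersion("1.0.*")]
```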
Gilles
|
|
|
|
|
OK, so suppose that your application A refers to a third-party module X written by vendor V. And then, after you ship A, V ships X', an updated version of X. Do you want A to use X or X'?
The good news with .Net is you have a choice. The bad news is that you generally have to make it explicitly.
In fact you probably do want to start using X'. Maybe it is 50% faster. Maybe it adds support for the woopee database. Maybe it has a security fix. If V is reputable and is careful to maintain upward compatibility, as he should be, things will be fine. But the default is to use X, not X'.
So either V or you needs to provide config files to override the default behaviour. And then you can start to think about what happens when you both provide config files...
Anthony
|
|
|
|
|
aberglas wrote:
Do you want A to use X or X'?
I would want my application to continue using X until I tested it with X'. After I have tested A with X' I will either 1) recompile A or 2) provide the necessary configuration information to start using X'. If the vendor, V, dares to claim that X' is 100% backward compatible with X then he can increment the AssemblyFileVersion attribute and leave the AssemblyVersion attribute unchanged and overwrite the old assembly.
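The attribute usage described here can be sketched as follows (values illustrative; the two attributes are standard .NET):

```csharp
// AssemblyInfo.cs for X' -- sketch only. AssemblyVersion stays unchanged so
// applications compiled against X still bind to X' without any redirect;
// only AssemblyFileVersion is incremented to mark the servicing build.
using System.Reflection;

[assembly: AssemblyVersion("1.0.0.0")]       // unchanged: X' binds in place of X
[assembly: AssemblyFileVersion("1.0.1.0")]   // bumped for the new build
```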
aberglas wrote:
The bad news is that you generally have to make it explicitly.
This is good news. It is what eliminates DLL hell. DLL hell is caused by developers not being able to make an explicit choice. A developer having too many versioning choices and being unable to make the right one, which is often the case, is a different problem of its own.
aberglas wrote:
If V is reputable and is careful to maintain upward compatibility, as he should be, things will be fine.
Sometimes it's just not possible to be reputable and careful while still maintaining backward compatibility. Even if it were, you'd be naive to think the vendor is careful 100% of the time.
|
|
|
|
|
I think there are some major misconceptions here. First, Frameworks assemblies do not have the problem described here. See Binding to .NET Frameworks Assemblies for details. If you load references to the v1.0 and v1.1 System.Windows.Forms.dll, and the v1.1 CLR is running, only the v1.1 System.Windows.Forms.dll will be loaded - not both.
Second, if you have references hard-coded (static references, config files, etc.), then they will not produce unstable/unreproducible results. You'll get the same results every time, unless you cause a different CLR to be loaded, change config files, or change your references - all of which are under the machine's administrator's control.
However, it is true that you can load two versions of non-Frameworks assemblies into the same appdomain. This is good for the case where you need to both reference foo.dll v1.0 and some third party assembly which references foo.dll v2.0 (and foo.dll's types don't need to be cast between the two). If you don't want that to happen, you could choose the one version to use by using a config file redirect.
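A config file redirect of the kind mentioned above might look like this (assembly identity, token, and version numbers are all illustrative):

```xml
<!-- app.config sketch: force every reference to foo.dll onto v2.0.0.0,
     so only one version is loaded into the appdomain. -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="foo" publicKeyToken="0123456789abcdef" />
        <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```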
But you don't like config files? Then you don't have to use them! You can compile your app in such a way that only one select version is referenced. Then only it will be loaded, with no config file redirect needed.
However, it sounded like you assumed that your assembly version would be changing with every shipping build. That's possible, but not necessary. See When to Change File/Assembly Versions for details.
Plus, you said that all version dependencies are hard-coded. That's not quite true. If you were really worried about it, you could use late-binding (Assembly.Load(), Activator.CreateInstance(), etc.) to do the same thing, and you could choose the version you wanted at runtime. Of course, that's a hassle, and, IMHO, unnecessary.
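The late-binding option mentioned above could be sketched like this (the assembly display name and type name are hypothetical; the APIs themselves are standard):

```csharp
// Sketch: pick an assembly version at runtime instead of hard-coding the
// reference at compile time.
using System;
using System.Reflection;

class Loader
{
    static object CreateWidget()
    {
        // Load the specific version chosen at runtime by its full display name...
        Assembly asm = Assembly.Load(
            "Foo, Version=2.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef");

        // ...then instantiate a type from it without any compile-time reference.
        return Activator.CreateInstance(asm.GetType("Foo.Widget"));
    }
}
```

As the post says, this trades compile-time type checking for runtime flexibility, which is usually not worth the hassle.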
About your suggestion that the loader use the latest version of the assembly by default, and you'd have to opt out if you didn't want that: that is the definition of dll hell. See Avoid Partial Binds (it, in effect, would be the same thing as doing a partial bind). In theory, COM would have avoided dll hell, too, as long as people opted-in to the versioning rules. But, the truth is that people don't always opt in, and when they don't, there's pain all around. Safer to make people opt in to dll hell, rather than opt out of it.
In closing, I wonder what the problem was that you were seeing. If you post the description, I may be able to help. Also, consider viewing the Fusion logs next time before assuming it's a versioning problem. See this blog entry for instructions.
|
|
|
|
|
Hello Suzanne,
I have indeed just stumbled upon your excellent blog based on a link in these replies.
(You evidently did not use the keywords I used for my search, e.g. .Net | dotnet | c#. It ain't easy to search for Microsoft product documentation (cf. ".net" vs "java"). I'm waiting for a product named "The".)
I assert that the .Net versioning system is too subtle for technical writers to understand, so there is no useful documentation of this complex area other than blogs and newsgroups. Richter's book is the best that I found, and it is superficial. How did you find out -- because you wrote the code?
Please prove me wrong!
Maybe there comes a time when writing up existing features is more important than adding new ones? This is obviously your area...
Or maybe just a good table of contents that points to the scattered articles?
Anthony
As to DLL hell, I still assert that maintaining upward compatibility is a fact of life. It is not OK to push full responsibility for version control onto the end user, who probably does not even know what a "version" is. Remember that the "machine administrator" will often be an end user who has very little control of anything. Most PCs are used at home or by small businesses. And IT departments generally do not support non-standard applications on individual machines.
Great to provide version control options, great to provide tools that show version dependencies to users when things go wrong, great to provide more dynamic linking and type checking to avoid memory corruption. But you need to have a standard versioning policy that is clear and simple and only very occasionally needs to be overridden.
|
|
|
|
|
Mr. Aberglas: do your homework.
Yes, .Net is complex... but so are Java, C++, and programming for UNIX / Linux.
The information *IS* out there .... Microsoft has a whole slew of books and such that have this kind of stuff explained.... perhaps not all in one place, but it is there.....
Perhaps you could write an article or two on what is there and how to use it after you have found it .... that would be very useful.....
Strong Names
Binding
Versioning
App Domains
Preferred Version
use of .config files
etc....
|
|
|
|
|
When you say "Microsoft has a whole slew of books", please suggest at least one title that you have read and that really goes into at least one of these issues clearly. I mentioned Richter, which is a good start, but only a start.
It is generally easier to prove that something exists than that it doesn't. Prove me wrong, but through experience, not supposition.
Anthony
|
|
|
|
|
I have to agree with you on these assertions.
First of all, Richter assumes we are all living in a 100% managed world. The problem is, we are not, not today. The assumption falls short, and so does the rest of his explanations.
Another typical wrong assumption is about the uniqueness of the CLR. Books and articles always assume that side-by-side works so well that it's not even worth mentioning, and so they rely on freshly formatted machines where there is only one CLR, the same CLR as the developer's. That kind of assumption says a lot about deployment of managed apps in the real world. The problem is, all those articles and books came out in the early phases of .NET development and evangelization (a phase we are still in these days, by the way), and so they don't anticipate such issues. That's why, for instance, nowhere in the MSDN .NET 1.0 documentation does it say that developers are supposed to provide a .config file in order to anticipate and manage the side-by-side issues that will occur LATER.
Regarding deployment, how funny (or sad, depending on the view) is this sentence from the MS .NET framework download site[^]: "In order to install Dotnetfx.exe, the user must have administrator privileges on the computer." So while ISVs are supposed to distribute and sell applications that don't require local admin privileges to install and run, MS undermines this by explicitly forcing end users to be local admins. No comment.
I agree with you that, in order for a managed app to work on an end-user machine the same as on the developer's machine, a lot of assumptions are placed on the end user. This is simply wrong. Maybe MS expects developers and ISVs to go through $$$$ .NET app certification steps $$$$ before a managed app gets labelled as "compatible with Windows".
When you look up MSDN, in a general-to-specific manner, for articles supposed to explain how unmanaged code and managed code can live together, you actually find very few articles, and one of them is this[^]. The problem is, the topic is viewed with a candid and naive eye. Everything is made so trivial that one wonders whether the article wasn't written back in the COM days, when Don Box envisioned a successor to it. The real issues are not even mentioned.
The same holds for type marshalling. It looks like you have the tools to succeed (the marshal attribute) but man, it's such a mess! One wonders whether the blind spots you run into trying to marshal types in .NET are in any way better than the marshalling holes we had between IDL types, COM types and automation types.
-- modified at 10:01 Saturday 8th October, 2005
|
|
|
|
|
There are quite a lot of articles in MSDN that go into it. Search for "version" in a recent MSDN, for example. Microsoft does put a lot of effort into getting books, articles, etc., published. You may also want to consider taking your questions to Microsoft newsgroups, or their official customer support channels.
About Dll Hell: I didn't mean that versioning should be in the hands of the end user. I meant that it should mostly be in the hands of the assembly publisher. My comment about machine admin was regarding the development-time problem you were seeing. For that case, the machine admin would also be the assembly owner, so that's not unreasonable.
It sounded like the memory corruptions you were referring to were due to unmanaged code. .NET doesn't have control over that - it's legacy. It's up to the publisher to do the right thing. Those who want a more managed environment should use managed code.
Being able to load the files you need is a very core feature. No one simple plan is going to work for all scenarios, so it becomes necessarily more complex. But we did try hard to make it as simple as possible while remaining powerful. For this case, the simple answer is to compile against what you will run against. If you will run against something different, you can recompile. If you're not willing to live with that, the complexity of the story you need increases.
|
|
|
|
|
Yes, there are a lot of words in MSDN. That is part of the problem. Much of it is vague, and it is not well classified or searchable. Hence my suggestion for a list of links to good articles.
My app was 100% .net. The P/Invokes I refer to are presumed to be within the .net framework, which you do have control over. So if my problem was not .net hell, it was something very weird involving bugs in the core CLR.
What I do know is that reworking version numbers made the bug go away.
Anthony
|
|
|
|
|
The CLR requires that its dlls not be mixed and matched. (So, don't copy files from one version of the CLR to another.) Other than that, its P/Invokes should work. What exactly was the problem you were seeing, and what exactly fixed it (details)? If it was a problem with the core CLR, I don't see how changing your own versions would fix it, so far.
|
|
|
|
|
I was seeing a large variety of exceptions deep within the framework that made no sense. Remove a line of code and something unrelated would fail. It felt like some internal memory was very corrupted.
My application consists of five 100% managed .net solutions, each with a few projects.
What I did was simply to consistently upgrade all the solutions to VS.net 2003 and rebuild everything. The version numbers in the references were magically updated. (I rebuild often, so that should not be an issue.)
Given your description that Framework modules always default to the CLR's version, what I did should not have made any difference. But it did.
Incidentally, why is it that the Framework modules ignore version information? Is it in a vendor .config file, or is it burnt into the CLR?
Regards,
Anthony
|
|
|
|
|
Anthony,
I believe that Suzanne is expecting a step-by-step description of the problem in full details.
Dealing with versioning issues, I am not sure whether you use strong names in all referenced assemblies. I guess that would be advisable here.
Yeah, dealing with P/Invoke, it's true that from now on Microsoft will always tell you they expect versioning issues to be solved, especially when you are using pure managed code. Unfortunately, that's a show stopper. This vision does not hold, since there are tens of billions of lines of unmanaged code out there, and it won't be rewritten as managed code. At least not today.
Dealing with CLR versions, I also believe there are issues. I have faced some of them and had no tool to inspect further. There is a clear need for a trace tool that would show all the inputs and outputs of Fusion, for instance when a combination of config files is being used. A clean tabular report of all candidate load paths etc. would be extremely useful at this point.
(I haven't yet seen whether there is any instrumentation API in 1.1 for this, like the one used by Nathan[^]; maybe there is.)
My 0.5 cent.
|
|
|
|
|
Stephane's idea for tools is important. They do not have to have fancy GUIs. But when you have a complex configuration environment, you need an easy way to know what is actually going on.
Suzanne, I realize that you are probably just a programmer who generously contributes to these debates. But if you are a PM, then an annotated bibliography would be very good. Good tools badly documented are bad tools.
As to the details of my problem, it's very simple. No strong names. Just compiling under VS during development. And 100% managed. It should be very easy. But it wasn't.
|
|
|
|
|
Yes, I would need more details in order to diagnose the problem. So far, it sounds like it's unrelated to P/Invoke: there was a file loading exception due to a version mismatch.
For unmanaged dlls, as I said in another post, versioning issues can be fixed the same way as in the managed world: by adding the unmanaged dlls to an assembly, set as non-metadata-containing files. The easy way to make your compiler do this is to include the file as a ManifestResource (csc /linkresource, etc.).
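A command-line sketch of the /linkresource approach described above (file names are illustrative; the compiler option itself is a real csc switch):

```
rem Sketch: compile the app and attach native.dll to the resulting assembly
rem as a linked, non-metadata file, so it is deployed and versioned with
rem the managed assembly rather than independently.
csc /linkresource:native.dll /out:MyApp.exe Program.cs
```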
Of course, the publisher can choose to keep their old versioning scheme for unmanaged components. In that case, the versioning scheme is no worse than adding new unmanaged components to their existing ones.
There is such a tool: fuslogvw.exe. See this blog entry for details. If you need the callstacks for loads, that can be done using a debugger and/or profiler which supports the CLR.
Anthony: Fusion ignores the versions for Frameworks assemblies because it's burned into the CLR.
|
|
|
|
|
Can you provide a list of articles?
ps: your blog is great
|
|
|
|
|
Thanks!
This list comes from searching for "version" filtered by ".NET Framework" in MSDN:
- Redirecting Assembly Versions
ms-help://MS.MSDNQTR.2003JUL.1033/cpguide/html/cpconassemblyversionredirection.htm
- Assembly Versioning
ms-help://MS.MSDNQTR.2003JUL.1033/cpguide/html/cpconassemblyversioning.htm
- .NET Framework Assembly Unification
ms-help://MS.MSDNQTR.2003JUL.1033/cpguide/html/cpconnetframeworkassemblyunification.htm
- Guidelines for Creating Apps/Components for Side-by-Side Execution
ms-help://MS.MSDNQTR.2003JUL.1033/cpguide/html/cpconguidelinesforcreatingapplicationscomponentsforside-by-sideexecution.htm
- Side-by-Side Execution Overview
ms-help://MS.MSDNQTR.2003JUL.1033/cpguide/html/cpconside-by-sideexecutionoverview.htm
- Etc.
If you don't have access to MSDN, consider doing a search on Google's Microsoft search. (Try searching for ".NET Framework assembly version", for example.)
|
|
|
|
|
Thanks for the tips, especially about the 1.1 upgrade.
I hope your comment about .Net hell is overstated. It seems to me a little early in the day to be so concerned. I think that building in .Net requires a lot of new ideas and new ways of thinking, not to mention learning new stuff that we took for granted before. Most of the major problems I have had, like yours, came down to not knowing enough when we first started.
ThePipe
|
|
|
|
|
Fair point; I've tweaked the article to make it clear that I am using ".Net Hell" as a pun on Microsoft's use of "DLL Hell". But I had tried a lot of things before I found the bug and was close to giving up, so Hell is not such a bad term.
The C++ world was crude but it was also simple. No Garbage Collection etc. .Net is much more sophisticated, and so can produce much more "sophisticated" bugs. Sloppy practices that were OK in C++ are no longer acceptable.
|
|
|
|
|
It's true about 'sophisticated' bugs. An example I used to give was VB6 forms developers who move to ASP.NET pages. At first glance they think it looks easy, but they experience lots of problems once they get into it. I believe that Java people would find the transition to .Net easier than most VB developers would.
|
|
|
|
|
Hi!
I've been using .Net for 9 months now and I did come across such a problem. But isn't there a nicer way of dealing with it, instead of editing config files here and there? I think that if the needed assembly dlls are included in the directory of the application assembly, then this inconvenience will be partially solved - still not a good idea, but I can't think of anything else.
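Shipping dependent assemblies in the application's directory works because the loader probes the application base by default; a probing element extends that probing to subdirectories. A sketch (the directory name is illustrative):

```xml
<!-- app.config sketch: also probe the "libs" subdirectory of the
     application base when resolving dependent assemblies. -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="libs" />
    </assemblyBinding>
  </runtime>
</configuration>
```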
|
|
|
|
|
For more info, consider lurking around this weblog[^] (Suzanne Cook is probably one of the people behind .NET Fusion).
|
|
|
|
|