I did it because I think people who know the .NET Framework well might know the solution ...
Problem solved ...
I rebuilt it with Visual Studio 2003 instead of Visual Studio Express 2008.
I created a C# application that connects to an MS Access database.
I compiled, built, and ran the application, and everything went well.
Then I copied the *.exe file to another folder and to other computers.
I ran it many times with no problems.
But sometimes it does not work and displays the message "Unable to find a version of the runtime to run this application".
The problem is not resolved until I rebuild from the source code.
I have rebuilt and rerun it dozens of times, and the problem keeps coming back.
What is the source of such a problem, and what is the true solution?
foreach(Minute m in MyLife)
myExperience++;
modified on Monday, July 28, 2008 3:32 AM
When you are not sure whether the client computer has all the necessary DLLs, runtimes, etc. installed, you should create a setup project for your application.
In a setup project you can deal with prerequisites.
This is highly unlikely, but stranger things have happened to me:
do you have some DLLs in your bin directory? Perhaps you forgot to copy them.
Edit: I read your personal info after I posted my answer, so please excuse my patronizing tone; as I said, stranger things have happened to me and to my local Gandalfs.
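On the runtime-version angle: the "Unable to find a version of the runtime" message can also appear when the exe's configuration doesn't list a CLR version that is installed on the target machine. A sketch of an app.config using the standard <supportedRuntime> element; the version strings below are the usual 1.1/2.0 values, shown here only as examples of what you might pin:

```xml
<?xml version="1.0"?>
<configuration>
  <startup>
    <!-- List every CLR version the exe can run on, preferred first. -->
    <supportedRuntime version="v2.0.50727" />
    <supportedRuntime version="v1.1.4322" />
  </startup>
</configuration>
```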
First of all, thanks for your reply.
I posted this question a long time ago with no response.
Oshtri Deka wrote: do you have some DLLs in your bin directory? Perhaps you forgot to copy them.
You will be surprised to learn that this problem occurs even when the executable is run from the bin directory itself.
Another thing I recently noticed is that the file size changes:
when it runs properly, its size is 452 KB;
when it fails, its size becomes 484 KB.
Oshtri Deka wrote: so please excuse my patronizing tone; as I said, stranger things have happened to me and to my local Gandalfs.
Don't mention it; thanks for your thoughtfulness and help.
foreach(Minute m in MyLife)
myExperience++;
Mohammed Gouda wrote: when it runs properly its size is 452 KB
when it fails, its size becomes 484 KB
Scan the target computer with antivirus software (e.g. the free Avast[^]).
Greetings - Gajatko
Portable.NET is part of DotGNU, a project to build a complete Free Software replacement for .NET - a system that truly belongs to the developers.
It really does seem to be virus activity, even though I scanned my machine and found no infected files.
But I discovered the source of the problem:
the executable file is somehow opened and some bytes are appended, making it unusable.
So the simple fix I applied was ....
... I made the file read-only.
Now everything works well.
Thanks to all participants
foreach(Minute m in MyLife)
myExperience++;
This doesn't concern you?
Wow.
Mark Salsbery
Microsoft MVP - Visual C++
??????!!!!!!
foreach(Minute m in MyLife)
myExperience++;
Hi everyone!
I want to reduce the number of colors in images dynamically in ASP.NET. For that I use this library (source code). I use the PalettQuantizer class to reduce an image to the specific colors specified in an ArrayList. But if I make the palette bigger than 256 entries, I get an IndexOutOfRangeException. I get the exception because the Bitmap.Palette.Entries array can only hold 256 entries. If I want a bigger palette, what should I do?
Regards Tobias
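Since the 256-entry limit comes from indexed pixel formats, one workaround is to skip Bitmap.Palette entirely: keep the palette as an ordinary array of any size, map each pixel to its nearest entry, and write the result into a 24bpp bitmap. A minimal nearest-entry sketch (class and method names are mine, not from the linked library):

```csharp
using System;

static class LargePaletteQuantizer
{
    // Index of the palette entry nearest to (r, g, b) by squared RGB
    // distance. 'palette' is an N x 3 array of RGB components; N may
    // exceed 256 because nothing here relies on an indexed pixel format.
    public static int Nearest(int r, int g, int b, int[][] palette)
    {
        int best = 0, bestDist = int.MaxValue;
        for (int i = 0; i < palette.Length; i++)
        {
            int dr = r - palette[i][0];
            int dg = g - palette[i][1];
            int db = b - palette[i][2];
            int dist = dr * dr + dg * dg + db * db;
            if (dist < bestDist) { bestDist = dist; best = i; }
        }
        return best;
    }
}
```

To apply it, loop over the source pixels (GetPixel/SetPixel, or LockBits for speed) and write each nearest palette color into a 24bpp output bitmap rather than an indexed one.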
Microsoft is retiring the MCAD/MCSD in March next year. I remember a poll on here about which version of the .NET Framework developers are targeting. The majority said 2.0.
I have an MCAD and wish to upgrade it to a higher qualification, and I feel that going for an MCSD would be a bit of a wasted effort as it targets 1.1 and will soon be obsolete. Many jobs I have seen require experience with 2.0 or higher.
If I go with the MCSD, I fear that version 4.0 of the .NET framework will come out by the time I've done it. So I'm thinking I will upgrade to an MCPD to catch up a bit. Any thoughts on this?
I would say go for the upgrade to MCPD, especially since the MCSD is also being retired.
Scott Dorman Microsoft® MVP - Visual C# | MCPD
President - Tampa Bay IASA
[ Blog][ Articles][ Forum Guidelines] Hey, hey, hey. Don't be mean. We don't have to be mean because, remember, no matter where you go, there you are. - Buckaroo Banzai
What's the upper limit on the number of projects in a solution in VS2005?
Not sure if there is one. Hard disk, available RAM, processor power, and human sanity when dealing with a slow load are probably the deciding factors. Have you seriously tried googling around? Maybe MS has the actual number somewhere in their docs?
--- modified
After a quick google around, nothing much turned up. It probably depends on your hardware, number of developers, and available resources.
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
I have one with fifty-six projects so far.
Hi all,
I am developing an application that plots time-series data.
Currently I use SQL Server to store the data and ChartDirector to plot the chart.
But the application is slow: it takes a long time (40-50 seconds) just to plot 1 million data points. What I do is fire a SELECT query at the database and use the result to draw the chart with ChartDirector.
Please suggest how I can improve the speed of this application.
I am thinking of sampling (when the data range is large, e.g. from 2000 to 2005),
but I must not miss the outliers in the process.
I would be grateful for your help.
Thanking you!
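The sampling idea can be sketched as min/max bucket decimation: split the series into buckets and keep only each bucket's extremes, so spikes (the outliers) survive the downsampling while the point count drops by orders of magnitude. A minimal illustration (names are mine):

```csharp
using System;
using System.Collections.Generic;

static class Decimator
{
    // Splits 'values' into roughly 'buckets' equal ranges and keeps each
    // bucket's minimum and maximum, so outliers survive the downsampling.
    public static List<double> MinMaxSample(double[] values, int buckets)
    {
        var result = new List<double>(buckets * 2);
        int size = Math.Max(1, values.Length / buckets);
        for (int start = 0; start < values.Length; start += size)
        {
            double min = double.MaxValue, max = double.MinValue;
            int end = Math.Min(start + size, values.Length);
            for (int i = start; i < end; i++)
            {
                if (values[i] < min) min = values[i];
                if (values[i] > max) max = values[i];
            }
            result.Add(min);
            result.Add(max);
        }
        return result;
    }
}
```

With a chart that is, say, 1000 pixels wide, more than two points per pixel column is wasted work, so 1 million points reduce to a few thousand. The same min/max grouping can often be pushed into the SQL query itself so the full million rows never leave the server.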
Have you found any solution to your problem?
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
I have code along the lines of:
try
{
    MarshalByRefObject obj = (MarshalByRefObject)RemotingServices.Connect(
        typeof(MyClass),
        "http://MyServer:8080/MyClass");
    MyClass proxy = (MyClass)obj;
    proxy.CallSomeMethod();
    return;
}
catch
{
    // Swallow the exception; failure to connect is expected sometimes.
    return;
}
In the above code, if for whatever reason I can't connect to my server, an exception is thrown, I eat it, and move on. That's all expected.
What I'd like to know is: Is it possible to set how long the timeout is before the remote request gives up? If the server is down, it can take a while before the exception is thrown - sometimes a minute, and I'd like to limit it to something low, like 10 seconds.
Thanks,
Jeff
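A client-side timeout can be supplied when the channel is registered, before calling RemotingServices.Connect. A configuration sketch, assuming the stock HttpClientChannel and its documented "timeout" channel property (milliseconds); the channel name here is an arbitrary example:

```csharp
using System.Collections;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Http;

static class TimedChannel
{
    // Registers an HttpClientChannel whose outgoing requests give up
    // after 10 seconds instead of the default. Call this once at
    // startup; subsequent remoting calls through this channel will
    // throw after the timeout rather than hanging for a minute.
    public static void Register()
    {
        IDictionary props = new Hashtable();
        props["name"] = "timedClient";
        props["timeout"] = 10000; // milliseconds
        ChannelServices.RegisterChannel(
            new HttpClientChannel(props, new BinaryClientFormatterSinkProvider()),
            false);
    }
}
```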
Thanks! I wasn't even thinking about HttpClientChannel (and I wasn't aware of the configuration dictionary that's passed in), just MarshalByRefObject.
Thanks,
Jeff
If an object reference were stored adjacent to an integer in the same oct-word, or if two object references were stored likewise, is there any reasonable way to do a single CompareExchange that will update both if neither has changed, or leave both alone if either has changed? Many algorithms for non-blocking updates seem to want a 'CompareExchange-plus'. From a hardware perspective there should be no difficulty(*), but since .NET needs to "know" about objects, storing two object references in a long would seem problematic.
The closest approaches I can figure would be to either (1) when it's necessary to have one 'extra bit' with the object reference, create two objects that the object reference can point to, both of which have pointers to each other. One of them is the real object, and one is a spare. If the object reference points directly at the "real" object, the "extra" bit is "zero"; if it points to the spare, it's "one". This avoids any need to create new objects while doing the update; (2) have the object reference point to an intermediate object which holds the real object reference and the other data; to do an update, copy the data from the old intermediate object to a new one and then CompareExchange the reference to the old to point to the new. That approach allows an arbitrary amount of supplemental data, but is more apt to tax the garbage collector. Further, even if the rest of the algorithm is interference-free, the memory allocation isn't.
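Approach (2) can be sketched with the generic Interlocked.CompareExchange overload: bundle the reference and the extra data into an immutable holder, and CAS the single reference to the holder. All names here (Node, PairCas) are illustrative, not from any library:

```csharp
using System.Threading;

// Immutable pair: a CAS on the single 'current' reference updates the
// object reference and the extra integer atomically (approach 2 above).
sealed class Node<T> where T : class
{
    public readonly T Value;
    public readonly int Tag;
    public Node(T value, int tag) { Value = value; Tag = tag; }
}

static class PairCas<T> where T : class
{
    static Node<T> current = new Node<T>(null, 0);

    // Retry loop: build a replacement from a snapshot, then CAS it in.
    // The CAS fails (and we retry) if another thread changed 'current'
    // between the snapshot and the exchange.
    public static void Update(T newValue)
    {
        while (true)
        {
            Node<T> snap = current;
            Node<T> next = new Node<T>(newValue, snap.Tag + 1);
            if (Interlocked.CompareExchange(ref current, next, snap) == snap)
                return;
        }
    }

    public static Node<T> Read() { return current; }
}
```

As the post notes, this allocates a holder per update, which taxes the garbage collector, and the allocation itself is not interference-free; it trades that cost for atomicity over arbitrarily much supplemental data.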
Anyone have any brilliant insights?
(*) I wouldn't be surprised if object references in IA64 fill up the largest available CompareExchange type, but I can't see any real reason they should. If an object reference stores an index into a system-maintained array of objects, then 32 bits should be plenty for that purpose. I don't want to sound like Bill Gates' "640K is enough for everyone", but I can't see much use for having four billion live objects. If an application would need anything near that number, I would think it would be more efficient to use fewer objects and store more information in each.
32-bit processors support 32-bit interlocked operations; 64-bit processors in 64-bit mode support 64-bit interlocked operations and that's it.
Lock-free programming is very, very hard and requires you to fully understand the memory model. It's much easier to use actual locks until you can prove that the lock is the bottleneck. I suggest you read Herb Sutter's DDJ article Use Critical Sections (Preferably Locks) to Eliminate Races[^].
The .NET Monitor is a very low cost lock: it spins on a condition variable for a while, before eventually waiting on a Windows event object (allocating one if there wasn't one). This is much the same as the Windows CRITICAL_SECTION structure, if initialized with the InitializeCriticalSectionAndSpinCount function.
DoEvents: Generating unexpected recursion since 1991
Mike Dimmick wrote: 32-bit processors support 32-bit interlocked operations; 64-bit processors in 64-bit mode support 64-bit interlocked operations and that's it.
The 32-bit processors support a CMPXCHG8B instruction for 64-bit compare-exchange; that's what's used when performing CompareExchange on a long integer. So there is no hardware impediment to providing an "object-plus" compare-exchange in .NET.
As for 64-bit architectures, the question is whether they need to use a full 64 bits for an object reference. I can't really see any reason that should be necessary.
Lock-free programming in general is difficult, though some cases are pretty easy (e.g. a singly-linked list in which either insertions or deletions are restricted to the start of the list). Allowing a CompareExchange on an "Object-plus" would allow convenient handling of more data structures.
The problem with locks isn't just one of performance, but also one of stability. While it's certainly possible to use locks safely, the more widely they're used, the greater the likelihood of deadlock, priority inversion, or other such problems. Certainly there are places where locks are more practical than non-locking methods, but using locks within a class can create tricky behavioral dependencies which need to be documented and dealt with. If a non-blocking approach can yield the same result, such dependencies are eliminated.
Hi,
although you only just entered this, google has already picked it up.
But their link is wrong, so I entered this message[^].
about your subject:
- I am in favor of lock-free stuff too, if the environment lets me.
- a Win64 system would need pointers larger than 32-bit; I expect them to use the full 64 bits, even though they already said virtual address space would be limited to 48 bits IIRC.
- I don't think you can rely on the CMPXCHG8B instruction being present on every machine running .NET, hence you should use an API function that hides those hardware details. I don't know if that is available, and if it is, it should be offered in a .NET class!
- a Win64 system would need pointers larger than 32-bit; I expect them to use the full 64 bits, even though they already said virtual address space would be limited to 48 bits IIRC.
I don't see why object references would need to be as long as pointers. I would expect the system to have a table of object references; if each object reference is 32 bits, that would allow for four billion object references. If each entry in the object table is 16 bytes (more likely they'd be 32 bytes each) that would mean that the limit of four billion object references wouldn't be a factor until there were 64 gigs in use for the object table alone.
How could one practically use more than four billion objects in a process space? I ask that not as a "why would anyone ever need more than 640K" question, but rather "How could one have billions of objects in a process space and not have garbage collection become totally unmanageable."
- I don't think you can rely on the CMPXCHG8B instruction being present on every machine running .NET, hence you should use an API function that hides those hardware details. I don't know if that is available, and if it is, it should be offered in a .NET class!
The System.Threading.Interlocked class offers methods to perform a compare-exchange on an integer, object reference, or long. According to Wikipedia, the 64-bit processors offer a 16-byte compare-exchange which could operate on two side-by-side 64-bit values. The only difficulty with doing an object-pair compare-and-swap would be making sure that the .NET memory manager knew what it was doing. Unfortunately, I don't know how to accomplish that, but I wouldn't think it should be overly difficult.
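As an illustration of the packing idea: two 32-bit values can share one long and be compare-exchanged as a single unit via Interlocked.CompareExchange on a 64-bit field, which is what CMPXCHG8B provides on 32-bit x86. This works for plain integers; as the posts above note, doing it with object references is the part .NET doesn't expose. Names here are illustrative:

```csharp
using System.Threading;

static class PackedCas
{
    static long cell; // two 32-bit halves, always updated as one unit

    // Packs (hi, lo) into one long; the uint cast on 'lo' avoids sign
    // extension clobbering the high half.
    public static long Pack(int hi, int lo)
    {
        return ((long)hi << 32) | (uint)lo;
    }

    // Atomically replaces the (hi, lo) pair only if BOTH halves still
    // match the expected pair; a single 64-bit CompareExchange covers
    // both, so no torn update is possible.
    public static bool TryUpdate(int expectedHi, int expectedLo, int newHi, int newLo)
    {
        long expected = Pack(expectedHi, expectedLo);
        long desired = Pack(newHi, newLo);
        return Interlocked.CompareExchange(ref cell, desired, expected) == expected;
    }
}
```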