|
Agreed, it's the only subscription email that I actually wait for
|
Doesn't seem like the best forum.
You'll never get very far if all you do is follow instructions.
|
I agree with Piebald: try posting it here: http://www.codeproject.com/suggestions.aspx[^] - the hamsters don't get as many "attaboy!"s as they sometimes deserve.
We will remember it is your fault if they do start charging for it though.
Those who fail to learn history are doomed to repeat it. --- George Santayana (December 16, 1863 – September 26, 1952)
Those who fail to clear history are doomed to explain it. --- OriginalGriff (February 24, 1959 – ∞)
|
The other day, a customer complained that we - sometimes, but not always - wrote the password of their Hospital Information System (HIS) in our log files in clear text.
Heh? Just another customer telling us bullshit!?
Alas, he was right.
When our application starts, it logs some general information, e.g. hardware, OS, and Environment Variables.
And in the section of the Environment Variables, sometimes there was an entry like
HIS_PWD=CUSTOMERS_HIS_PASSWORD
The customer then found out that it did not happen when he started our application from the start menu or from its desktop shortcut. It only happened when he started it from the HIS (as the doctors normally do: the HIS can provide us with context information, like the patient the doctor is working on).
Well, a process inherits the environment from the process it was started from, including all its Environment Variables.
Do you see what happened here?
It's really a great idea to store the clear-text password as an environment variable, it is absolutely safe there.
|
What a great idea! I'm on my way to a meeting - I will drop it in as my idea of safe development and then go to sleep. It will take them hours to figure out how to eat it...
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
It's already wrong that you know the password; it should have been a hash. And yes, this is the reason why I oppose linking medical systems.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
Ugh. I don't know what software this is, but as far as I can tell, the way they design hospital software is to take the biggest, worst, most horrifying Microsoft Access application you've ever encountered - the sort that happens when someone who wasn't a programmer discovered Access, built a giant, awful system on it, and kept at it for a decade - and then model your new medical records application on that.
|
Trajan McGill wrote: Microsoft Access
Worse, some use Caché.
You'll never get very far if all you do is follow instructions.
|
Hee hee!! I just stored a plain text password in an environment variable this morning! Only temporarily and I have since rebooted.
The password in question is generally protected inside an SSIS parameter file, but I wanted it closer to hand.
P.S. I had to keep it close to hand again today, but instead of doing SET PWD=pa$$w0rd I did SET pa$$w0rd=PWD for greater security!
You'll never get very far if all you do is follow instructions.
modified 12-Jun-14 18:48pm.
|
Working with date and time is known to be a source of beautiful code...
System.DateTime.Now gets the current system time, but you cannot set it. When you want to be able to set the system time, you can P/Invoke SetSystemTime(ref SYSTEMTIME lpSystemTime) from kernel32.dll; there is also a matching GetSystemTime(ref SYSTEMTIME lpSystemTime) function in that DLL.
When I played with these functions, I wrote some tests to make sure I got the conversion from struct SYSTEMTIME to System.DateTime and vice versa right.
[Test]
public void GetSystemTime()
{
    DateTime expected = System.DateTime.Now;
    DateTime actual = SystemTime.GetSystemTime();
    Assert.AreEqual(expected, actual);
}
Well, that always failed: it takes a tick or two to do all the calculations.
I had to update it to allow for some delay:
[Test]
public void GetSystemTime()
{
    DateTime expected = System.DateTime.Now;
    DateTime actual = SystemTime.GetSystemTime();
    int delta = Math.Abs(Convert.ToInt32((actual - expected).Ticks));
    int maxDuration = Convert.ToInt32(2 * TimeSpan.TicksPerMillisecond);
    Assert.Less(delta, maxDuration);
}
That's 2 milliseconds (20,000 ticks) for that simple code. But still it fails rather often:
Errors and Failures:
1) Test Failure : XY.SystemTimeTests.GetSystemTime
Expected: less than 20000
But was: 147763
What does the computer do during those almost 15 ms? Is this a test for catching the computer asleep?
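The underlying lesson generalizes beyond the kernel32 calls: two back-to-back reads of the same clock are never guaranteed to be "equal", so any such test needs a tolerance. A minimal C++ analogue of the corrected test, using std::chrono in place of DateTime.Now and GetSystemTime (the second now() stands in for the round-trip through the conversion code):

```cpp
#include <chrono>

// Read the wall clock twice in a row and return the gap in nanoseconds.
// On a loaded machine (or a VM) this gap can occasionally be surprisingly
// large - which is exactly what the failing test above ran into.
long long delta_ns() {
    auto expected = std::chrono::system_clock::now();
    auto actual   = std::chrono::system_clock::now(); // stand-in for GetSystemTime()
    return std::chrono::duration_cast<std::chrono::nanoseconds>(actual - expected).count();
}
```

A robust test asserts the delta is below a deliberately generous bound (say, one second) rather than a few milliseconds, because the scheduler can preempt the test between the two reads for an arbitrary amount of time.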
|
Well, if yours was absolutely the only thing that ran on the computer, you might expect this to work consistently. Sadly, in a multi-tasking environment, you cannot guarantee that it's going to be accurate.
|
Pete O'Hanlon wrote: cannot guarantee that it's going to be accurate
But 15 milliseconds?!
|
Yup. You just need a couple of expensive processes running and that would easily be the case.
|
There are a few things you can try. First, prevent switching of the CPU core by setting which core to use:
Process.GetCurrentProcess().ProcessorAffinity = new IntPtr(1);
Then make sure your process has more exclusive access to that core:
Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.High;
Thread.CurrentThread.Priority = ThreadPriority.Highest;
Just don't blame me if you're messing up your system.
|
Well, thanks for the hints...
When I posted it here in the "Weird and Wonderful" - or the "Coding Horror" section as it was called previously - I chose this forum because the outcome of these simple lines of code is so "weird and wonderful".
The little hints won't have much effect: the tests are running in Virtual Machines, and there are several more VMs on that host, though almost all of them are "idle" - but you know what Windows can do while the machine is idle...
|
Ah well, I don't have a clue how ProcessorAffinity is handled on a VM, but probably not well.
I just figured that your 15 ms was because of context switching and was simply curious.
|
Setting the system time is something you should almost never do. I hope you had a good reason for that.
The system clock on a lot of machines only updates every ~15 ms, so any time-based test that expects a better resolution than that needs to use a dedicated high-resolution timer, e.g. QueryPerformanceCounter.
|
Exactly. That was my thinking too, resolution.
Regards,
Rob Philpott.
|
If you need a higher resolution, use timeGetTime. GetTickCount has a resolution of roughly 10 ms, whereas the timeGetTime function is more accurate.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
Had to debug a crash today where the 32 bit build of an application was crashing, but the 64 bit build wasn't. There was a method that takes a string (char * ) buffer as an in / out argument as follows.
void SomeClass::someMethod(char * arg1 /* ...other parameters elided... */)
{
arg1[i] = '\0';
return;
}
The caller was supposed to call it with a variable, but one caller was passing a string literal, so naturally it caused an access violation.
obj.someMethod("string_literal");
However, the 64 bit build was working fine. After some fiddling around, I discovered that it was due to the project's optimization settings. The assignment apparently has no visible effect, so the compiler was removing it as a dead store in the 64 bit build. Disabling optimization caused the 64 bit build to crash too.
Of course I had to come up with all kinds of crazy reasons first, thinking it was something to do with how 64 bit applications pass arguments, etc.
No more in/outs. const char * is much safer, and will keep you sane (until you cast away the const , of course).
|
The assignment arg1[i] = '\0'; wouldn't work with a const char * (at least it shouldn't).
Also, shouldn't a string literal be implicitly const char * and cause a compiler error when being passed to a function that expects a char * as a parameter?
The good thing about pessimism is that you are always either right or pleasantly surprised.
|
It only works the other way, you can't put a non-const pointer in a const (without casting).
Ignore that, it is clearly incorrect, I'm the one who got it backwards. See below instead.
const char * const_ptr;
char * non_const_ptr;
const_ptr = non_const_ptr;   // OK: adding const is always allowed
const_ptr = "literal";       // OK: a literal is const char[]
non_const_ptr = "literal";   // error in C++11 (deprecated conversion before that)
non_const_ptr = const_ptr;   // error: discards the const qualifier
modified 19-May-14 22:35pm.
|
That is really interesting and a great write-up. Very clear too.
It seems crazy to me, however, that the char * is considered "safer / saner", because it is somewhat more dangerous.
The in/out should force more control, like requiring the user to create a variable on the stack first, right?
Also, you said,
Indivara wrote: so the compiler was removing it in the 64 bit build
What do you mean "removing it"? Do you mean it actually just ignored the value sent in?
That could drive you crazy, when you're trying to figure out why it isn't working.
|
I meant using const char * arg1 would be a better choice in this case, not a generalization.
The compiler was removing the last assignment arg1[i] = '\0' because it had no effect. The thing was thrown together in a hurry, and the assignment was 'insurance' against the caller forgetting to add it. Ironically, it had the reverse effect.
|
Ah, okay. I see what you mean now. Thanks for clarifying.
|