|
leonej_dt wrote: cpkilekofp wrote:
is a C# example, not a C example.
Excuse me, sir. You are wrong. The IS_NOT_ZERO macro is a C example, not a C# one. You can't apply the ! operator to an integer in C#. You can do that in C.
Excuse me, sir, but the original coding horror in this thread is a C# coding horror, not a C coding horror. Thus my comments about using booleans where you MEAN either true or false are appropriate, while your comments that it makes no difference apply to C, where booleans (for most of the language's lifespan) were not supported. Is that, finally, clear?
Additionally, I based my comments regarding your coding standards on exactly what you wrote; getting upset because I responded to exactly what you wrote, rather than what you actually do, would seem to me to be a problem with your description. Further, I wasn't referring to your code when I commented about "indentation" - how could I, when the only piece of your code I had seen was a single line? However, I've had to read, debug, and modify more than one piece of obscure C that made it into production in the days before code reviews became commonplace.
In general, the comments you've made on this topic are in line with what I call "C bigotry", an attitude that comes naturally to most good C programmers (I was most definitely guilty of it when C was my primary development language). The elegance and concision of C create in the C developer an automatic contempt for languages which lack those qualities, and for anyone who finds it difficult or impossible to read C as used by one well versed in all its subtleties. Please note: back when I was using C regularly, we had macros almost identical to yours to convert integers to true/false when we were building files for processing by, say, COBOL mainframe programs that looked for 0 or 1 as false or true (about 17-18 years ago), so I don't see your macro as a joke. However, I do view your comment as an indicator that, perhaps, your experience in different arenas of development seems not to be as widespread as mine (this is NOT a comment on your skill, just on the limited number and type of programming environments to which you have been exposed - once you reach the point where you are working with junior programmers with much less skill than you have, some of these points will not only be clearer to you, they'll seem like "common sense").
|
|
|
|
|
cpkilekofp wrote: Excuse me, sir, but the original coding horror in this thread is a C# coding horror, not a C coding horror,
The original post was a C coding horror:
geoffs wrote: So, in reviewing a coworker's code I come across the following line:
m_boolVar = (intVar == 0 ? false : true) ;
Yes, parenthesization and spacing exactly as shown above. Were it my code, it would have been written as:
m_boolVar = intVar != 0; // (corrected from == 0 by GDavy -- I typed too fast!)
...or if I was feeling in a bit more perverse mood:
m_boolVar = !!intVar;
There were much bigger fish to fry in this code, but there are times when I just can't let things like this go by. These things are like misspelled words that shout out at me from among the surrounding text.
Although the first two lines of code could also be valid C#, the third one can't be valid C#.
cpkilekofp wrote: Additionally, I based my comments regarding your statements about coding standards on what exactly you wrote
My coding standards are the ones I wrote a post or two ago.
cpkilekofp wrote: In general, the comments you've made on this topic are in line with what I call "C bigotry", an attitude that naturally comes to most good C programmers (I was most definitely guilty of it when C was my primary development language).
I'm actually more guilty of C++ bigotry. I seldom use classes (90% of my classes are Class-Wizard-generated GUI classes), but I'm very fond of function pointers and templates.
cpkilekofp wrote: back when I was using C regularly, we had macros almost identical to yours to convert integers to true/false when we were building files for processing by, say, COBOL mainframe programs that looked for 0 or 1 as false or true (about 17-18 years ago)
Wow. I was 2 years old back then.
cpkilekofp wrote: perhaps, your experience in different arenas of development seems to not be as widespread as mine
Agreed. I'm just a college student.
|
|
|
|
|
LMAO...it had never occurred to me to compile "m_boolVar = !!intVar;", but you're right, C# rejected it because the bang operator is not valid for integers. So it is a C coding horror.
Your coding standards as originally stated,
1. the program has to be as fast as possible
2. the compiled executable has to be as small as possible
3. the source code has to be as small as possible
are, as I said earlier, quite valid and sensible in standalone programming, where no one will ever have to maintain your code...including you, ten years later. Unfortunately, even you won't necessarily remember what some code was supposed to do after that length of time, and the more "optimized" it is, the more difficult it is to read, to the point where depending on the size of the code block you are examining, it may take many hours to figure out what you meant back then.
I know you think this is an extreme example, but it is an example I pull from personal experience: code I wrote in my first programming job was code I was still maintaining as a part-time consultant two full-time jobs later, then again when my original boss founded an Internet start-up - and some of that highly efficient code I wrote in 1990 (necessary for multi-megabyte programs being loaded a chunk at a time as overlays in old MSDOS, well before OS/2 and Windows brought virtual memory to the PC-compatible world) was just plain obscure when I had to modify it to sit in a COM object in 2000 (if you think Y2K was a problem, understand how much trouble you can get into when code is written with explicit assumptions of 16-bit integers when it gets moved to a 32-bit integer environment).
You may not realize it, but function pointers have been part of C since its inception - they were one of the most fascinating aspects of the language for many of us, and I used them extensively. In fact, this language feature allowed overlays to be built in a language other than assembly for the first time: one (carefully!!) loaded the code from the binary executable, cast a function pointer to it, then executed it. Templates were described in the original Annotated C++ Reference Manual but didn't appear in DOS/Windows compilers until, I think, 1992, and I remember writing some pretty frightening code to achieve the same effect (historical note for you: the first C++ compilers weren't compilers at all, they were preprocessors similar to the one that provides #include and #ifdef, and their output was C code).
In order to move from being a good programmer in C++ to a great developer in any environment, you'll need to develop an appreciation for things that just seem like a waste of time right now. I still remember when my best friend, an electrical engineer who spent 80% of his time developing embedded software, was having trouble using his hardware stack for several things at once in a project he was working on in 1983; until I suggested it, it had never occurred to him that he could build a software stack and keep his hardware stack (the one the CPU used for things like storing return pointers from a function) strictly for its original purpose. Up to that point, he'd never learned to look at code as a series of logical abstractions, but afterward he never forgot to do so.
I lost track of how many languages I'd used professionally by 1997, but it was at least ten (including script languages like DOS Batch, Rexx, Awk, and Korn Shell - yes, an amazing number of production programs for business purposes have been written in, and are still running on, AWK), with another four or five that I'd used in graduate school projects, including LISP and Prolog, and at least four assembly languages, x86 being the only one I've ever used professionally. At that point, one begins to look at programming languages in a highly abstract sense, and one notes which languages should be restricted to the brightest intellects and which are safe to use in environments where one can only afford dim bulbs (this is the biggest reason that COBOL, the second oldest commonly used language, is still the most widely used language in business - and beware of having a manager who never programmed in anything BUT COBOL).
Thus, as one small example, booleans: when you need to depend on coders who don't and never will understand the hardware substrate on which their computer sits, you need abstractions they can understand, and many coders will never understand the difference between a byte used as a boolean and a bite in the arse.
Regards
|
|
|
|
|
I should add, for fairness, that many of those "dim bulbs" I referred to are, in fact, sophisticated experts in their own fields...but they are not computer scientists. FORTRAN, the first high-level programming language, was created so that scientists could write complex calculations without having to learn anything about the inner workings of the computer they were using for the calculation. COBOL was created with a similar intent for business and administrative purposes. C was created so that a portable language existed for computer experts that allowed one to program directly on the computer without having to use assembly language. In fact, some of the dialects of C I used in my early career contained primitives for directly addressing CPU registers, and most of them contained facilities for allowing in-line assembly language blocks; I may be wrong, but I doubt that you've had a use for either of these facilities in your academic career to date.
|
|
|
|
|
Hey, you two are really keeping this thread going! Wow.
Meanwhile, as an aside, modern C and C++ compilers still provide the capability for inline assembler code (the MS compiler defines __asm as the keyword for doing this, other compilers may use asm, _asm, etc). The ability can come in handy for performance critical areas if you don't care about portability across processors. I'll also note that the quality of code put out by modern compilers is making the usage of inline assembler not the necessity it used to be (depending on the application, of course).
|
|
|
|
|
I see a lot of:
if (b1)
b2 = true;
else
b2 = false;
and even some:
switch (b1)
{
case true:
b2 = true;
break;
case false:
b2 = false;
break;
}
|
|
|
|
|
David St. Hilaire wrote: I see a lot of:
if (b1)
b2 = true;
else
b2 = false;
and even some:
switch (b1)
{
case true:
b2 = true;
break;
case false:
b2 = false;
break;
}
You have my sympathies.
|
|
|
|
|
... have to work with people who wrote this GEM
if(i==0)
*outcallRecvCount = 0;
else
*outcallRecvCount = i ;
C++ where friends have access to your private members !
|
|
|
|
|
eh... ahem...
you're using plural... so there's more than one...?
(yes|no|maybe)*
|
|
|
|
|
I don't actually see anything wrong with this.
It could be that i changes between
if(i==0)
and
*outcallRecvCount = i;
But then they should have protected it with CRITICAL_SECTIONS
Learn from the mistakes of others, you may not live long enough to make them all yourself.
|
|
|
|
|
Well, IF the variable did change, the code would make a bit more sense... I believe what he meant was the pattern
if (someCondition)
doSomething();
else
doExactlyTheSameThing();
|
|
|
|
|
BadKarma wrote: But then they should have protected it with CRITICAL_SECTIONS
There is only one situation where I would expect to see code like this: when one wants to set a breakpoint at i == 0 in the debugger, and one doesn't want to slow execution within the debugger by creating a conditional breakpoint.
|
|
|
|
|
Gin for breakfast. That should sort it out.
Panic, Chaos, Destruction.
My work here is done.
|
|
|
|
|
It would have been clearer if the if ( ... ) had been expanded upon. You know something like this,
if ( i == 0 && *outcallRecvCount != 0 )
*outcallRecvCount = 0;
else
if ( i != *outcallRecvCount )
*outcallRecvCount = i;
There that's much clearer and couldn't possibly be considered a horror.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
|
|
|
|
|
That's legitimate, because the author needed
*outcallRecvCount = 0;
instead of
*outcallRecvCount = 0 ;
on
i==0
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
|
|
|
|
Oh I see it's the space before the semi-colon...
|
|
|
|
|
Exactly.
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
|
|
|
|
Lovely!
What is even more disgusting is that I've seen similar code in the product I am working on. You know, things like:
if (thisBoolVar == true)
{
thatBoolVar = true;
}
else
{
thatBoolVar = false;
}
It is quite painful to have to be in areas of this code...
|
|
|
|
|
And people ( especially non-programmers/managers ) don't understand why I say that just because it seems to work doesn't mean that it is good code.
Bill W
|
|
|
|
|
Very true. However, I don't want to make it seem like all managers are lumped into the non-programmer's category. Every so often, in some companies more than others, you'll have managers that are technically savvy and do understand the realities of the code base. It's a pleasure to work with these managers, even if they can't fix the world (because of the dictates of managers above them, etc).
|
|
|
|
|
I have to admit that is true. Some managers do understand, usually the ones that were programmers themselves, but sometimes the non-programmers do "get it".
Bill W
|
|
|
|
|
My first manager was technically savvy. He was a software engineer and had written many of the earlier versions of the software. We had lovely technical discussions. But he sucked as a manager / leader, just following orders from the top.
He would come and tell us what was going to happen and express his frustrations and disagreements. Poor fella, can't he fix the world?
Yusuf
|
|
|
|
|
CIDev wrote: And people ( especially non-programmers/managers ) don't understand why I say that just because it seems to work doesn't mean that it is good code.
Sometimes, when I tell a manager about the Obfuscated C contest, the light goes on.
|
|
|
|
|
|
We can build upon this!
try
{
if (thisBoolVar == true? true : false)
{
thatBoolVar = (thisBoolVar != false? true : false);
}
else
{
thatBoolVar = (thisBoolVar == false? false : true);
}
}
catch {}
if (thatBoolVar != thisBoolVar) ...
|
|
|
|
|