I know that in C and C++, !! is supposed to reduce the value on its right to zero or one.

What it looks like to me, is double negation which leads me to ask the following:

A) Does a single "!" similarly reduce the value on its right to zero or one (while inverting it)?

B) Related: is there a way to tell the difference between true/false and one/zero in C or C++? Maybe by checking the type or size?

What I have tried:

I haven't really tried anything, because of question (B).
Comments
PIEBALDconsult 20-Aug-24 20:59pm    
Never heard of !!, will have to watch this play out.
honey the codewitch 20-Aug-24 21:00pm    
Like so much of my C information, I can't even remember where I first heard of it.
PIEBALDconsult 20-Aug-24 21:07pm    
Well, my C was on OpenVMS and I basically haven't touched it in twenty years.
Just some small experiments with ODBC.
k5054 20-Aug-24 22:05pm    
I've never seen !! either. And it's not listed here: https://en.cppreference.com/w/c/language/operator_precedence
Can you cite an example?
honey the codewitch 20-Aug-24 23:00pm    
I could if I remembered where I picked it up. I do remember reading it somewhere because I remember being surprised at it, as you are.

1 solution

After a little experimentation:

A) !x reduces to either 0 or 1: for any x != 0, !x -> 0, and if x == 0, !x -> 1. !!x can be rewritten as !(!x), so it yields 1 for any x != 0 and 0 otherwise. Perhaps on some architectures !!x compiles to fewer instructions than x != 0?
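For instance (a minimal sketch; the values and names are just for illustration), the following prints the normalized results side by side:
C
#include <stdio.h>

int main(void)
{
    int values[] = { -5, 0, 1, 42 };
    for (int i = 0; i < 4; ++i) {
        int x = values[i];
        /* !x yields 1 when x == 0 and 0 otherwise; !!x inverts that,
           so it is equivalent to x != 0 */
        printf("x = %3d  !x = %d  !!x = %d  (x != 0) = %d\n",
               x, !x, !!x, x != 0);
    }
    return 0;
}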

B) Interestingly, in both C and C++ you can't increment a bool past true. I was under the impression that in C, bool was just a typedef for char, but clearly there's more to it than that. For example:
Shell
[k5054@localhost]$ cat bool.c
#include <stdio.h>
#include <stdbool.h>

int main()
{
    bool c = false;
    for(int i = 0; i < 10; ++i) {
        printf("c = %d\n", c);
        c += 1;
    }
}
[k5054@localhost]$ gcc bool.c -o bool
[k5054@localhost]$ ./bool
c = 0
c = 1
c = 1
c = 1
c = 1
c = 1
c = 1
c = 1
c = 1
c = 1
But otherwise, sizeof(bool) == sizeof(char) == 1 on typical implementations, so size alone won't distinguish a bool from a char. The behavior above falls out of the conversion rules: arithmetic on a bool promotes it to int, and converting any nonzero result back to bool yields 1. Recall that C didn't originally have a bool (or _Bool) type at all, and even today, in both C and C++, anything that can be compared to zero has a "truth value": if x == 0 it's false, otherwise true, for any type where x == 0 has a valid interpretation.
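A quick sketch of that last point (the variables here are hypothetical): any scalar type works in a boolean context, and converting a value to bool normalizes it to 0 or 1:
C
#include <stdio.h>
#include <stdbool.h>

int main(void)
{
    double d = 0.5;
    int *p = NULL;

    /* Any scalar compares against zero in a boolean context */
    if (d)  puts("d is \"true\"");    /* taken: d != 0.0 */
    if (!p) puts("p is \"false\"");   /* taken: p == NULL */

    /* Converting to bool normalizes any nonzero value to 1 */
    bool b = 42;
    printf("b = %d\n", b);  /* prints 1, not 42 */
    return 0;
}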
 
Comments
PIEBALDconsult 21-Aug-24 12:00pm    
"try to realize the truth. There is no bool.”

Having said that, I like bools in C#, but I won't accept that the bools which have been hacked into some implementations of C work as well, and I doubt I'd use them if I ever dabble in C again.

Rant: Besides, it always seemed backward to me that C considers zero false rather than true. I would have thought it would be based on jump-if-zero rather than jump-if-not-zero. But I'm sure that's just me.
At any rate, we may represent a bool graphically as a zero or a one, but it is neither; it's a bool, and the developer shouldn't care how it is represented in the computer.
honey the codewitch 21-Aug-24 12:07pm    
Well, I think your point is valid for business development, particularly because team development requires being careful to express intent correctly through your code.

But particularly with a mid-level language like C or C++ that may be running on constrained hardware, it helps to know what your language is actually doing, even at the low level.

I routinely run my code through godbolt.org. That doesn't tell me what the standard says, though, and the C and C++ standards docs often leave me more confused than if I just ask.

So I asked.
PIEBALDconsult 21-Aug-24 12:19pm    
Absolutely. You (as opposed to me) need to know the size. And different architectures.

But ... back in the 90s, when I was being paid to write C to run on Alpha chips, the same code had to run unchanged on VAX chips and other systems (running DOS or QNX). The gurus had long before decided to use defines for TRUE and FALSE, something like #define TRUE (0==0) and #define FALSE (!TRUE), arguing that it was device-independent, etc. (see the sketch after this comment).

They also had an odd view that using defines saved bytes, so we used them for pretty much anything constant.
And yoda-coding: *shudder*
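For reference, a minimal sketch of that define style; the two defines are copied from the comment above, and the rest is just scaffolding to show what they evaluate to:
C
#include <stdio.h>

/* Device-independent boolean defines, as described in the comment above */
#define TRUE  (0==0)
#define FALSE (!TRUE)

int main(void)
{
    printf("TRUE = %d, FALSE = %d\n", TRUE, FALSE);  /* prints TRUE = 1, FALSE = 0 */
    return 0;
}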
honey the codewitch 21-Aug-24 12:20pm    
Well, back then you had to do funny things to get the compiler to produce the code you wanted. Compilers have come quite a long way.
