Consider this program:
#include <stdio.h>

int main(void)
{
    unsigned int a;
    printf("%u %u\n", a ^ a, a - a);
    return 0;
}
Is it undefined behaviour?

On the face of it, a is an uninitialized variable, which points to undefined behaviour. But a^a and a-a are equal to 0 for all values of a, or at least I think that is the case. Is it possible to argue that the behaviour is well defined?