Why does everybody typedef over the standard C types?

If you want to use QT, you have to embrace quint8, quint16 and so on.

If you want to use GLib (https://en.wikipedia.org/wiki/GLib), you have to welcome guint8, guint16 and so on.

Linux has u32, s16 and so on.

UC/OS defines SINT32, UINT16 and so on.

And if you have to use some combination of those, you had better be prepared for trouble. Because on your machine u32 will be typedefd to long and quint32 will be typedefd to int, and the compiler will complain.
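A minimal sketch of the kind of clash described above; the two typedefs are illustrative stand-ins for what two different libraries' headers might do, not copied from the real Qt or kernel headers:

```c
typedef unsigned long u32;      /* what a kernel-style header might do */
typedef unsigned int  quint32;  /* what a Qt-style header might do */

void consume(quint32 *p) { (void)p; }

int main(void)
{
    u32 value = 42;
    consume(&value);   /* warning: incompatible pointer types passing
                          'u32 *' (aka 'unsigned long *') where
                          'quint32 *' (aka 'unsigned int *') is expected */
    return 0;
}
```

Passing the values by copy merely converts silently; it is the pointers (and any structs built from these types) that the compiler refuses to treat as interchangeable.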

If there is <stdint.h>, why does everybody do this? Is it some kind of tradition among libraries?


stdint.h didn't exist back when these libraries were being developed. So each library made its own typedefs.

For the older libraries, this is needed because the header in question (stdint.h) didn't exist.

There is still a lingering problem, however: those types (uint64_t and others) are an optional feature of the standard. So a conforming implementation might not ship with them -- which forces libraries to keep providing their own typedefs even today.
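A minimal sketch of the kind of fallback a portable library might still carry; the mylib_u64 name is hypothetical. The standard defines UINT64_MAX only when uint64_t itself is provided, so it can double as a feature test:

```c
#include <stdint.h>

#ifdef UINT64_MAX
typedef uint64_t mylib_u64;            /* exact-width type is available */
#else
typedef unsigned long long mylib_u64;  /* guaranteed at least 64 bits */
#endif
```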

stdint.h has been standardised since 1999. It is more likely that many applications define (effectively alias) types to maintain partial independence from the underlying machine architecture.

They give developers confidence that the types used in their application match their project-specific assumptions about behaviour, assumptions that may not match either the language standard or a particular compiler implementation.
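A minimal sketch of such project-level aliases, assuming the project wants a single header to adjust if it is ever ported to an unusual architecture; all the names are hypothetical:

```c
#include <stdint.h>

/* Hypothetical project-wide aliases: one place to change if the
 * project's assumptions about widths or signedness ever shift. */
typedef uint8_t  byte_t;   /* raw octet */
typedef int32_t  count_t;  /* signed counters used throughout the code base */
typedef uint32_t reg_t;    /* 32-bit peripheral registers */
```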

The practice is mirrored in the object oriented Façade design pattern and is much abused by developers invariably writing wrapper classes for all imported libraries.

When compilers were much less standardised and machine architectures could vary from 16-bit and 18-bit machines through 36-bit word-length mainframes, this was much more of a consideration. The practice is much less relevant now in a world converging on 32-bit ARM embedded systems. It remains a concern for low-end microcontrollers with odd memory maps.

So you have the power to typedef char to int.

One "coding horror" mentioned that one companies header had a point where a programmer wanted a boolean value, and a char was the logical native type for the job, and so wrote typedef bool char. Then later on someone found an integer to be the most logical choice, and wrote typedef bool int. The result, ages before Unicode, was virtually typedef char int.

Quite a lot of forward-thinking, forward compatibility, I think.