
How should NULL be #defined on a machine which uses a nonzero bit pattern as the internal representation of a null pointer?

Programmers should never need to know the internal representation(s) of null pointers, because the compiler normally takes care of them. If a machine uses a nonzero bit pattern for null pointers, it is the compiler's responsibility to generate that pattern whenever the programmer requests a null pointer by writing "0" or "NULL". Therefore, #defining NULL as 0 is just as valid on a machine whose internal null pointers are nonzero as on any other, because the compiler must (and can) still generate the machine's correct null pointers in response to unadorned 0's seen in pointer contexts.
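Here is a minimal C sketch illustrating the point: the source code uses only 0 and NULL, and it remains correct even on a hypothetical machine whose null pointers are not all-bits-zero, because the compiler translates these constants in pointer contexts into whatever bit pattern the machine actually uses.

    #include <stdio.h>

    /* A conforming header may define NULL simply as 0 (or (void *)0),
     * even on a machine whose null pointers are not all-bits-zero. */
    #ifndef NULL
    #define NULL 0
    #endif

    int main(void)
    {
        char *p = NULL;     /* compiler emits the machine's real null pointer */

        if (p == 0)         /* the unadorned 0 is likewise converted to a
                               null pointer before the comparison */
            printf("p is a null pointer\n");

        return 0;
    }

Note that the conversion happens at compile time based on context; the program never needs to know, or test, the underlying bit pattern.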
