How should NULL be defined on a machine which uses a nonzero bit pattern as the internal representation of a null pointer?
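The answer text is missing from this copy of the page, but the point of the question is a fact of the C language itself: in source code, a constant 0 in pointer context is always a null pointer constant, and it is the compiler, not the preprocessor, that translates it into whatever internal bit pattern the machine actually uses for null pointers. So the definition need not mention the machine's bit pattern at all. Below is a minimal sketch illustrating this; MY_NULL is a hypothetical name chosen here only to avoid colliding with the NULL that <stddef.h> already provides.

    #include <stdio.h>

    /* MY_NULL is a hypothetical stand-in for NULL (which <stddef.h>
     * already defines).  An unadorned source-level 0, optionally cast
     * to (void *), suffices even on a machine whose run-time null
     * pointer representation is a nonzero bit pattern. */
    #define MY_NULL ((void *)0)

    int main(void)
    {
        char *p = MY_NULL;  /* compiler emits the machine's actual null bit pattern */

        if (p == 0)         /* the constant 0 is converted to a null pointer before comparing */
            printf("p compares equal to the null pointer constant\n");

        return 0;
    }

On such a machine, p holds the nonzero internal representation after the assignment, yet both the assignment and the comparison are written with plain zeros; portable code never needs a machine-specific definition of NULL.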