Following a conversation with a colleague, I have decided to put to paper what I think is wrong with nulls.
Essentially it is this: to the compiler, null means "uninitialised". It is the default value given to reference variables.
As programmers, however, we invariably overload null with other meanings, such as "the empty list", "the empty string", "the unspecified option", and so forth.
This is bad, because I want the tools (compiler, debugger, etc.) to tell me the difference between "I have forgotten to initialise this variable" and "I have put the wrong value in this variable".
As it is, I end up with a bastardised mixture of meanings in my code, which forces a lot of null checks that would otherwise be unnecessary.
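A minimal Java sketch of the problem (the names and the `describe` helper are hypothetical, just for illustration): a null list might mean "not yet loaded" or "deliberately empty", and the type system gives us no way to tell the two apart, so every caller has to check.

```java
import java.util.List;

class NullAmbiguity {
    // When 'tags' is null, does that mean the caller forgot to load them,
    // or that the item genuinely has no tags? The compiler cannot say.
    static String describe(List<String> tags) {
        if (tags == null) {
            return "unknown";   // forced check: null is ambiguous here
        }
        return tags.isEmpty() ? "no tags" : tags.size() + " tag(s)";
    }

    public static void main(String[] args) {
        System.out.println(describe(null));            // unknown
        System.out.println(describe(List.of()));       // no tags
        System.out.println(describe(List.of("a", "b"))); // 2 tag(s)
    }
}
```

If "empty" were represented by an actual empty list and null were reserved for "uninitialised", the first branch would only ever fire on a genuine programming error, and the tools could flag it as such.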
Maybe code contracts will be able to clean up this mess...