I just define it for myself as "If it can transmit information, it could also be used, given enough technology, to send any kind of data, such as a Microsoft Word document."
Metadata. In this case, the broadcaster's general location and the timing and (I suppose) size of the messages. And perhaps known- or guessed-plaintext attacks: if you too have a weather ship in the same area, you can guess at the content.
You're just describing information theory. Every message is contextualized within the codebooks used to encode/decode it. Sometimes we call that codebook "language", sometimes it uses another representation.
It seems that the term "information" is being used in two different ways in this thread. The usual meaning of a bit of information is defined with respect to a probability distribution over messages that a user wants to send to a server. I don't think most people are used to thinking about bits in other contexts, so that's where the miscommunication is happening.
Your interpretation, which I think is correct in this context, seems to concern the entropy of a probability distribution over internet users, and the mutual information between that and the distribution over messages. The actual length of the message is irrelevant to the math once you fix the joint probability distribution.
The argument others seem to be making is that the joint probability distribution is in fact not fixed, and that you can smear out the conditional probability over users given a message by shrinking the space of possible messages. In theory that seems possible, but I don't know enough to have any idea how well that would work in practice. If you shrank the message space to be small enough to be useful for this purpose, wouldn't that get in the way of usability?
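To make the "smearing out" argument concrete, here's a toy sketch (the joint distribution, the users, and the coarsening are all invented for illustration): with a rich message space, each message pins down who sent it, so the mutual information I(U;M) between user and message is high; merging messages into buckets that mix both users' traffic drives it toward zero.

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(U;M) in bits for a joint distribution given as {(user, msg): prob}."""
    pu, pm = defaultdict(float), defaultdict(float)
    for (u, m), p in joint.items():
        pu[u] += p
        pm[m] += p
    return sum(p * math.log2(p / (pu[u] * pm[m]))
               for (u, m), p in joint.items() if p > 0)

# Two equally likely users. User A only ever sends messages 0 or 1,
# user B only 2 or 3, so observing the message identifies the user.
rich = {
    ("A", 0): 0.25, ("A", 1): 0.25,
    ("B", 2): 0.25, ("B", 3): 0.25,
}
print(mutual_information(rich))  # 1.0 bit: the message fully reveals the user

# Shrink the message space by merging {0, 2} -> "x" and {1, 3} -> "y",
# so each remaining message is sent equally often by both users.
shrunk = {
    ("A", "x"): 0.25, ("A", "y"): 0.25,
    ("B", "x"): 0.25, ("B", "y"): 0.25,
}
print(mutual_information(shrunk))  # 0.0 bits: the message reveals nothing
```

Note the usability tension in the example itself: the coarsened space has only two distinguishable messages left, which is exactly the trade-off the comment above is worried about.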