
16 bit is not enough?



16 bit should be enough for everybody.

16-bit

That's how much you can address with 16 bits.

Do you (or anyone) have some idea why anyone could possibly have thought 16 bits would be enough? Many decisions are bad in hindsight but surely no hindsight was needed for that.

> 65535

Why limit yourself to 16-bit?


If 16 bits is enough, then you should be using int16_t. If 16 bits is not enough, then you should not be using int.

16-bit bytes?

16-bit? Did they mean 4-bit?

It's 16 bytes, not 16 bits.

but 16 bits >> 8 bits? Can the AI community get by with 8 bits?

You might be right; however, 16-bit sounds really harsh to my ears, and 24-bit is the only widely used standard that's better than 16-bit.

IIRC the real rationale is that back in the early days people bet that 16 bits would be enough for a fixed-length encoding, but the bet didn't pay off and now they're stuck with the worst of both worlds.

But most of these are 16-bit...

You mean they should have stayed at 16 bits?

What would happen to all the characters that don't fit in that case, better luck in other standards?


Is there a reason it wouldn't be 16 bits like in C?

Even 16-bit.

8, you mean. 16-bit allows a big palette, near true color, but bad gradients.

> If 16 bits is enough, use an explicitly 16-bit integer.

This will often generate more code, so don't do it except where it saves a meaningful amount of memory.


Just as many as all of the 8-bit systems in use today. There is no need, in the vast majority of cases, for wide data buses in embedded applications. 16-bit is going to die out, though, like the 4-bit and bit-slice processors.
