After reading this blog post I thought a bit about endianness (big-endian is just bad), and while having a shower a theory came to mind: maybe the Arabs had little-endian integers (meaning least-significant digit first) but wrote (and still do) from right to left, so the least-significant digit ends up on the right. And when Leonardo da Pisa (Fibonacci) brought Arabic numerals to Europe, he kept the same visual order, not flipping the digits, thereby establishing big-endian. In fact I could verify that with Wikipedia. But I also noticed that this “bug” existed even earlier: Indians write from left to right (Wikipedia told me about a coin in Brahmi written from right to left, but that was before there were any numerals), and they have always used big-endian. Thus the Arabs fixed that issue (maybe not knowingly), but the stupid Europeans did not get why big-endian is stupid.

Furthermore, big-endian numerals look more like those stupid Roman numerals, and our usual way of vocalising them is like in Roman times. And because of Leonardo da Pisa there are those stupid architectures using big-endian representation (fortunately not x86/amd64), causing non-portability, byte order marks and all that stupid stuff. And left-shifts could actually be left-shifts and right-shifts could be right-shifts.
Short list of arguments for little-endian:
- The value of a digit d at position i is simply d·b**i (where b is the base). That would obviously be the most natural representation if you implemented integers as bit arrays. It does not depend on the length, and no look-ahead is required.
- You can simply add numbers from left to right (no right-alignment needed for summation).
- For radix sort you can begin from the left.
- Simple casts between longer and shorter integers, without moving any bits.
- You do not need words like “hundred”, “ten-million”, “billiard” etc., because you can interpret a sequence online, without look-ahead.
- Repeated modulo and integer division by the base gives the little-endian representation directly.
- The least-significant bits carry the more interesting number-theoretic information (parity, residues modulo small numbers).
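The first and the modulo/division points can be sketched in a few lines of Python (the function names are mine, just for illustration):

```python
def to_little_endian(n, b=10):
    """Repeated modulo and integer division by the base b
    yields the digits least-significant first."""
    digits = []
    while n:
        digits.append(n % b)
        n //= b
    return digits or [0]

def value(digits, b=10):
    """The digit d at position i contributes d * b**i --
    no look-ahead, no dependence on the total length."""
    return sum(d * b**i for i, d in enumerate(digits))

# 1203 in base 10, least-significant digit first:
assert to_little_endian(1203) == [3, 0, 2, 1]
assert value([3, 0, 2, 1]) == 1203
```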
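The addition point, as a rough sketch: both sequences are consumed in reading order, starting at the least-significant digit, so no right-alignment is needed and the carry propagates forward.

```python
def add(xs, ys, b=10):
    """Add two little-endian digit sequences from left to right;
    carries propagate in reading order."""
    out, carry = [], 0
    for i in range(max(len(xs), len(ys))):
        s = (xs[i] if i < len(xs) else 0) \
            + (ys[i] if i < len(ys) else 0) + carry
        out.append(s % b)
        carry = s // b
    if carry:
        out.append(carry)
    return out

# 476 + 58 = 534, all digits least-significant first:
assert add([6, 7, 4], [8, 5]) == [4, 3, 5]
```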
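The casting point is exactly what little-endian machines exploit: a narrower integer at the same address is just a prefix of the wider one's bytes. A quick check with Python's struct module:

```python
import struct

# Pack a 64-bit integer in little-endian byte order ('<Q').
buf = struct.pack('<Q', 0x0102030405060708)

# "Casting" to a shorter width means reading fewer bytes from
# the same starting address; no bits move.
assert struct.unpack('<I', buf[:4])[0] == 0x05060708  # low 32 bits
assert struct.unpack('<H', buf[:2])[0] == 0x0708      # low 16 bits
```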
Well, big-endian is more like lexicographic order, although I am not sure that is clearly better for natural languages. For division you have to start with the most-significant digit, but, hey, division is obviously not as important as all the other operations, where you start with the least-significant digit. Of course little-endian is not always a good notation: for measurements one should use floating-point numbers (in a decimal world called “scientific notation”), and the mantissa should start with the most-significant bit/digit and come after the exponent, to avoid look-ahead (unlike scientific notation, which puts the mantissa first).
If Leonardo da Pisa had thought a bit about what he was doing, there would not be all those drawbacks! Just my thoughts about that regression.