
Notation makes a huge difference. I mean, have you TRIED to do arithmetic with Roman numerals?

>If the idea is that the right notation will make getting insights easier, that's a futile path to go down on. What really helps is looking at objects and their relationships from multiple viewpoints. This is really what one does both in mathematics and physics.

Making the relationships between objects visible is part of why math has settled on terse notation (the other reason being that you have to write the same things over and over). This helps up to a point, but mainly IF you are writing the same things again and again. If you are not exercising your memory in that way, it is often easier to make sense of more verbose names. But there is always a tension between convenience, the visual space a formula takes up, and the load on the reader's memory.



I haven't thought about or learned a systematic way to add Roman numerals. But I would argue that the difference is not the notation but a fundamental conceptual advance: representing quantities with b (the base) distinct symbols, where each position is weighted by a successive power of b and the symbols themselves let one count up by 1 within a position. The notation itself doesn't really make a difference. We could call X=1, M=2, C=3, V=4 and so on.
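
To make that concrete (a toy sketch; the base-5 digit mapping below is invented for illustration), the value of a positional numeral depends only on which of the b symbols occupies each position, not on what the symbols look like:

    # Toy sketch: positional value with arbitrary digit glyphs (hypothetical mapping).
    # Only the position-weighted sum matters, not the shapes of the symbols.
    DIGITS = {'O': 0, 'X': 1, 'M': 2, 'C': 3, 'V': 4}   # made-up base-5 "digits"

    def positional_value(numeral, base=5):
        value = 0
        for symbol in numeral:            # leftmost symbol is most significant
            value = value * base + DIGITS[symbol]
        return value

    print(positional_value("MXV"))        # 2*25 + 1*5 + 4 = 59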

I also don't know what historically motivated the development of this system (the Indian system). Why did the Romans not think of it? What problems were the Indians solving? What was the evolution of ideas that led to the final system that still endures today?

I don't mean to underplay the importance of notation. But good notation is backed by a meaningfully different way of looking at things.


Adding and subtracting Roman numerals is pretty easy, because the numerals themselves are built out of addition and subtraction. A lot of it is just repeating symbols, like tally marks: X+X is just XX, for example. You do have to keep track of when another symbol is appropriate, but VIIII is technically equivalent to IX. It's all the other operations that get harder. If the Romans had had negative numbers, then the digits of a numeral could be viewed as a kind of polynomial with some positive and some negative coefficients. But they didn't have that either.
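
Here is a rough sketch of that tally-style addition (my own simplification: it rewrites subtractive forms additively first and returns the purely additive spelling):

    # Rough sketch: Roman addition by concatenation, like tally marks.
    # Subtractive pairs are first rewritten additively (IX -> VIIII),
    # then runs of symbols are coalesced, smallest first, so carries propagate.
    EXPAND = [("CM", "DCCCC"), ("CD", "CCCC"), ("XC", "LXXXX"),
              ("XL", "XXXX"), ("IX", "VIIII"), ("IV", "IIII")]
    GROUPS = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
              ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]
    ORDER = "MDCLXVI"  # symbols in descending value

    def add_roman(a, b):
        for pair, expanded in EXPAND:                        # IX -> VIIII, etc.
            a, b = a.replace(pair, expanded), b.replace(pair, expanded)
        combined = "".join(sorted(a + b, key=ORDER.index))   # largest symbols first
        for run, symbol in GROUPS:                           # IIIII -> V, VV -> X, ...
            combined = combined.replace(run, symbol)
        return combined

    print(add_roman("XIX", "XXIII"))   # XXXXII, the additive spelling of XLII (42)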

>The notation itself doesn't really make a difference. We could call X=1, M=2, C=3, V=4 and so on.

Technically, the positional scheme is part of the notation just as much as the symbols used. The symbols also had to evolve to be more legible. For example, you don't want to mix up 1 and 7, or some other pairs that were once easily confused.

>Why did the Romans not think of it?

I don't know. I expect that not having a symbol for zero was part of it. Place value systems would be very cumbersome without that. I think that numbers have some religious significance to the Hindus, with their so-called Vedic math, but the West had Pythagoras. I'm sure that the West would have eventually figured it out, as they figured out many impressive things even without modern numerals.

>But good notation is backed by a meaningfully different way of looking at things.

That's just one aspect of good notation. Not every different way of looking at things is equally useful. Notation should facilitate, or at least not get in the way of, the things we need to do most. The actual symbols we use matter visually. A single letter might not be self-describing, but it is exactly the right kind of symbol for expressing long formulas and equations that involve a fairly small number of quantities. You can see more "objects" in front of you at once and operate on them mechanically, without silently reading their meaning. On the other hand, a single-letter symbol in a large computer program can be confusing and also makes editing the code more complicated.
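
A toy contrast (the function and variable names are made up for this example): the same computation reads fine with terse symbols when it mirrors a formula on paper, but code that sticks around usually benefits from descriptive names.

    import math

    # Formula style: mirrors the textbook symbols for a damped oscillation.
    def x(A, g, w, t):
        return A * math.exp(-g * t) * math.cos(w * t)

    # Program style: the same computation with self-describing names.
    def damped_oscillation(amplitude, damping, angular_frequency, time):
        return amplitude * math.exp(-damping * time) * math.cos(angular_frequency * time)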


Considering that post-arithmetic math rarely uses numbers at all, and even the ancient Greeks used lots of lines and angles instead of numbers, I don't think Roman numerals would really have held math back that much.



