### math notation abuse

In the Wikipedia article Golden ratio, quote:

Two quantities a and b are said to be in the golden ratio φ if:

```
(a+b)/a = a/b = φ
```

This equation unambiguously defines φ.

This is an example of abuse of math notation.

In 「(a+b)/a = a/b = φ」, we basically have a system of equations in 3 variables. The reason someone wrote this equation to define the golden ratio is due mostly to the abuse of the equal sign as used in traditional math notation. In fact, the equation by itself does not make sense as a definition.
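To see what the chain of equal signs hides, here is a minimal sketch in Python (my own illustration, not from the Wikipedia article): read 「(a+b)/a = a/b = φ」 as two separate equations in three unknowns, fix the scale by setting b = 1, and solve the quadratic that falls out.

```python
import math

# Read 「(a+b)/a = a/b = φ」 as TWO equations: (a+b)/a = φ and a/b = φ.
# The system is scale-invariant, so set b = 1; then a = φ, and the
# first equation becomes (φ + 1)/φ = φ, i.e. φ² − φ − 1 = 0.
phi = (1 + math.sqrt(5)) / 2  # positive root of x² − x − 1 = 0

# Verify both equations of the system with a = φ, b = 1.
a, b = phi, 1.0
assert abs((a + b) / a - phi) < 1e-12
assert abs(a / b - phi) < 1e-12
print(phi)
```

Only after this unpacking — choosing which equations the chain stands for, and which variable is being defined — does the computer have something it can work with.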

This abuse does not happen just in Wikipedia, but in just about every math textbook from professional mathematicians. (e.g. 《Visual Complex Analysis》 by Tristan Needham, or my friend's 《Differential Equations, Mechanics, and Computation》.)

Typically, mathematicians will just say it doesn't matter, because the context makes it clear to humans. I disagree. I think it introduces lots of misunderstanding and garbage into our minds, especially students'. The wishy-washy, ill-defined, subconscious notions and formulas make logical analysis of math subjects difficult. Of course, mathematicians simply grew up with this and got used to it, so they don't perceive any problem. They'd rather attribute the problem to the inherent difficulty of math concepts. I think that if we actually forced all math notation into something similar to a formal language (in the technical sense), then much unnecessary confusion would go away, and math understanding and research would become easier and grow faster. (One historical example to illustrate this is the history of complex numbers and calculus, whose murky foundations took decades to sort out.)

You will realize many of these problems when you actually try to program math on computers, be it setting up a computer-based proof or writing a simple computer algebra system. For example, say you want to write a program to draw a rectangle in the golden ratio. You read about the golden ratio. You need to define it in your program. You start with the equation Wikipedia gives, and immediately you realize it does not make sense at all, because the computer won't accept your wishy-washy, subconscious, ill-defined, implicit human notions. When you actually try to program it, you'll discover lots of these issues, and in fact come away with a clear understanding of the subject, even if the subject is a basic one you thought you knew all about.
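As a concrete illustration, here is a sketch (my own; the SVG output format and the file name are arbitrary choices, not from any source) of such a program. Its very first line forces you to commit to an explicit, unambiguous definition of φ — exactly the step the chained equation glosses over:

```python
import math

# A program cannot use the chained equation directly; it needs an
# explicit value. Use the closed form φ = (1 + √5)/2.
PHI = (1 + math.sqrt(5)) / 2

def golden_rect_svg(height, filename="golden.svg"):
    """Write an SVG file containing a rectangle whose width : height
    is φ : 1. (SVG and the default filename are illustrative.)"""
    width = height * PHI
    svg = (f'<svg xmlns="http://www.w3.org/2000/svg" '
           f'width="{width:.3f}" height="{height}">\n'
           f'  <rect width="{width:.3f}" height="{height}" '
           f'fill="none" stroke="black"/>\n'
           f'</svg>\n')
    with open(filename, "w") as f:
        f.write(svg)
    return width

w = golden_rect_svg(100)
print(w / 100)  # the aspect ratio, ≈ 1.618
```

The point is not the drawing itself, but that writing even this trivial program forces the wishy-washy definition to become a single, checkable line of code.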