Can you find the hidden mistake, if there is one?
Let a and b each be equal to 1.
Since a and b are equal,
b^2 = ab (eq. 1)
Since a equals itself, it is obvious that
a^2 = a^2 (eq. 2)
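Just to sanity-check the setup before going further, here's a tiny Python snippet (a quick sketch of my own, not part of the proof) confirming that eq. 1 and eq. 2 both hold when a = b = 1:

```python
a = 1
b = 1

# eq. 1: b^2 = ab. Holds because a == b, so a*b is just b*b.
assert b ** 2 == a * b

# eq. 2: a^2 = a^2. Trivially true.
assert a ** 2 == a ** 2
```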
Subtract equation 1 from equation 2.
a^2 - b^2 = a^2 - ab (eq. 3)
We can factor both sides of the equation:
a^2 - ab equals a(a - b).
Likewise, a^2 - b^2 equals (a + b)(a - b).
(Nothing fishy is going on here. This statement is perfectly true. Plug in numbers and see for yourself!) Substituting into equation 3, we get
(a + b)(a - b) = a(a - b)
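Taking the post's own advice to plug in numbers, here's a quick Python check (the lhs/rhs names are mine) that both factorings and the substitution in eq. 3 are legitimate with a = b = 1:

```python
a = 1
b = 1

# Factored left side of eq. 3: (a + b)(a - b)
lhs = (a + b) * (a - b)    # 2 * 0 == 0

# Factored right side of eq. 3: a(a - b)
rhs = a * (a - b)          # 1 * 0 == 0

assert lhs == a ** 2 - b ** 2   # left-side factoring checks out
assert rhs == a ** 2 - a * b    # right-side factoring checks out
assert lhs == rhs               # eq. 3 holds: 0 == 0
```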
So far, so good. Now divide both sides of the equation by (a - b), and we get
a + b = a
Subtract a from both sides, and we are left with
b = 0
But we set b to 1 at the very beginning of this proof, so this means that
1 = 0
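Taking the plug-in-numbers suggestion one step further, here's what happens (again, just a sketch; the lhs name is mine) if you actually attempt the division step above with the real values:

```python
a = 1
b = 1

# Try to divide both sides of eq. 3 by (a - b) with the actual numbers.
# Note what the interpreter thinks of this step:
lhs = (a + b) * (a - b) / (a - b)   # raises ZeroDivisionError: division by zero
```

Make of that error message what you will.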
Wow! I'm lost, lol.