Proof that for two numbers a and b, any linear combination of a and b is a multiple of gcd(a,b).
I can do a proof by contradiction, but I find the proof unintuitive.
Let \(d=\gcd(a,b)=ma+nb\), where, by Bezout's identity, \(d\) is the smallest positive integer that can be written as a linear combination of \(a\) and \(b\). Suppose there exists a linear combination of \(a\) and \(b\) which is not a multiple of \(\gcd(a,b)\); call it \(x\). \[ x=ua+vb\\ x=kd+r\quad\text{by the division theorem}\\ x-kd=r\\ ua+vb-k(ma+nb)=r\\ (u-km)a+(v-kn)b=r,\quad0< r <d \] Here \(r\neq 0\) because \(x\) is not a multiple of \(d\), so \(r\) is a positive linear combination of \(a\) and \(b\) that is smaller than \(d\). This contradicts the minimality of \(ma+nb\). Thus if \(x\) is a linear combination of \(a\) and \(b\), it must be a multiple of \(\gcd(a,b)\).
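As a side note, the Bezout step \(d=ma+nb\) can be checked numerically with the extended Euclidean algorithm. Here is a rough Python sketch (the helper name extended_gcd is just my own choice) that finds \(m\) and \(n\) with \(ma+nb=\gcd(a,b)\) and spot-checks that a few linear combinations are multiples of the gcd:

```python
def extended_gcd(a, b):
    """Return (g, m, n) with g = gcd(a, b) and g == m*a + n*b."""
    if b == 0:
        return a, 1, 0
    g, m, n = extended_gcd(b, a % b)
    # g == m*b + n*(a % b) == n*a + (m - (a // b) * n)*b
    return g, n, m - (a // b) * n

a, b = 252, 198
g, m, n = extended_gcd(a, b)
print(g, m, n)            # 18 4 -5, and 4*252 + (-5)*198 == 18
assert m * a + n * b == g

# Every linear combination u*a + v*b leaves remainder 0 on division by g.
for u, v in [(3, -5), (7, 2), (-4, 9)]:
    assert (u * a + v * b) % g == 0
```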
Only after typing up the proof did I understand why I don't find it intuitive. I think it is because I don't understand Bezout's identity. Can anyone explain it to me?
That looks neat :)
Let me see if I can give an alternative proof.
Let \(d = \gcd(a,b)\). Then \(d\mid a\) and \(d\mid b\), so \(a = a'd\) and \(b = b'd\) for some integers \(a'\) and \(b'\). Now consider any linear combination of \(a\) and \(b\): \[ax+by = a'dx + b'dy = d(a'x+b'y) = \gcd(a,b)\cdot(\text{some integer}).\]
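To make this concrete with numbers of my own choosing: take \(a = 12\) and \(b = 18\), so \(d = 6\), \(a' = 2\), \(b' = 3\), and every linear combination is visibly a multiple of 6: \[12x + 18y = 6\cdot 2x + 6\cdot 3y = 6(2x+3y).\]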