if gcd(a,b)=1, prove that gcd(a+b,b^2)=1
without saying i know what to do, i'll make a guess: you know that if \(p|ab\) then \(p|a\) or \(p|b\). since \(gcd(a,b)=1\), imagine \(p|a+b\) and also \(p|b^2\), and you want to show \(p=1\). consider \((a+b)^2-b^2\), which \(p\) must divide as well
can you please give me a clearer hint?
well actually, now that i look, that doesn't quite do it, does it?
i bet @joemath314159 has a better idea
earlier we proved that gcd(a,b^2)=1 by showing there existed a linear combination of a and b^2 that equaled 1.
i also proved, using the same technique, gcd(a^2,b^2)=1
i was thinking along the lines of the proof that \(gcd(a+b,a^2+b^2)=1\)
actually that gcd is 1 or 2
the hint my teacher gave me was to prove gcd(a+b,b)=1 first
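that hint works because any combination \(ax+by=1\) rearranges as \((a+b)x+b(y-x)=1\), which is a combination of \(a+b\) and \(b\). a quick numeric spot-check of that identity (the `bezout` helper below is my own sketch, not something from the thread):

```python
from math import gcd

def bezout(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)
    if b == 0:
        return a, 1, 0
    g, x, y = bezout(b, a % b)
    return g, y, x - (a // b) * y

# If a*x + b*y = 1, then (a+b)*x + b*(y-x) = a*x + b*y = 1,
# so gcd(a+b, b) = 1 whenever gcd(a, b) = 1.
for a, b in [(3, 5), (8, 15), (7, 9)]:
    g, x, y = bezout(a, b)
    assert g == 1 and a * x + b * y == 1
    assert (a + b) * x + b * (y - x) == 1
    assert gcd(a + b, b) == 1
print("gcd(a+b, b) = 1 checks out for all sample pairs")
```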
ah, ok, i know the trick then. Since gcd(a^2,b^2)=1, there exist integers x,y such that:\[a^2x+b^2y=1\Longrightarrow (a^2-b^2+b^2)x+b^2y=1\]\[\Longrightarrow (a^2-b^2)x+b^2(x+y)=1\Longrightarrow (a+b)\left[(a-b)x\right]+b^2(x+y)=1\]
There is your linear combination of a+b and b^2 that comes out to 1.
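the chain above can be spot-checked numerically too: expanding \((a+b)(a-b)x+b^2(x+y)\) gives back \(a^2x+b^2y=1\). a small sketch of that check (the `bezout` helper is my own, just for getting the coefficients):

```python
from math import gcd

def bezout(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)
    if b == 0:
        return a, 1, 0
    g, x, y = bezout(b, a % b)
    return g, y, x - (a // b) * y

# From a^2*x + b^2*y = 1, the rearrangement gives
# (a+b)*((a-b)*x) + b^2*(x+y) = 1, i.e. gcd(a+b, b^2) = 1.
for a, b in [(3, 5), (8, 15), (7, 9)]:
    g, x, y = bezout(a * a, b * b)  # coefficients for a^2 and b^2
    assert g == 1 and a * a * x + b * b * y == 1
    assert (a + b) * ((a - b) * x) + b * b * (x + y) == 1
    assert gcd(a + b, b * b) == 1
print("linear combination of a+b and b^2 checks out")
```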
This is probably the dumbest way to do this problem though lol.
oh? is there another way to show this proof?
thanks a lot :D