Show that the series \[\sum_{n=0}^{\infty}\frac{1}{1+n^{2}x}\] converges uniformly on \([a,\infty)\) for any \(a>0\) but does not converge uniformly on \((0,\infty)\).
First of all, the series does not converge at all when \(x=0\) (every term equals \(1\)), so it cannot converge uniformly on \([0,\infty)\); we therefore consider the two intervals \([a,\infty)\) with \(a>0\) and \((0,\infty)\). The first is the easier one, using the Weierstrass M-test. For any fixed \(x>0\), \[\sum_{n=1}^{\infty}\frac{1}{1+n^2x}\leq \sum_{n=1}^{\infty}\frac{1}{n^2x}=\frac{1}{x}\sum_{n=1}^{\infty}\frac{1}{n^2}<\infty,\] so the series converges pointwise on \((0,\infty)\) by comparison.
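As a quick numerical sanity check (not part of the proof; the sample points \(x=1\), \(x=2\) and the cutoff of \(10^5\) terms are arbitrary choices of mine), the partial sums at a fixed \(x>0\) do stay below the comparison bound \(\frac{1}{x}\cdot\frac{\pi^2}{6}\):

```python
from math import pi

# Partial sum of sum_{n=1}^{terms} 1/(1 + n^2 x) at a fixed sample point x > 0.
def partial_sum(x, terms=100_000):
    return sum(1.0 / (1.0 + n * n * x) for n in range(1, terms + 1))

for x in (1.0, 2.0):
    bound = (1.0 / x) * pi**2 / 6  # comparison series (1/x) * sum 1/n^2
    print(x, partial_sum(x), bound)  # partial sum sits below the bound
```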
If \(x\geq a\), we have \[\frac{1}{1+n^2x}\leq \frac{1}{an^2},\] and since \(\sum 1/(an^2)\) converges, the convergence is uniform on \([a,\infty)\) by the Weierstrass M-test. The question is what happens on the interval \((0,\infty)\): there the convergence is not uniform, because no matter what \(N\) you pick, the interval contains the point \(x=1/N^2\), and at that point the single term \(n=N\) already equals \(\frac{1}{1+N^2\cdot(1/N^2)}=\frac{1}{2}\), so \[\sum_{n=N}^{\infty}\frac{1}{1+n^2x}\geq\frac{1}{2},\] and the uniform Cauchy criterion fails.
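A numerical illustration of that failure (again a sketch, with an arbitrary truncation at \(10^5\) terms; the true tails are even larger): at the bad point \(x=1/N^2\) the tail starting at \(n=N\) never drops below \(1/2\), since its first term alone is exactly \(1/2\).

```python
# Tail sum_{n=N}^{N+terms-1} 1/(1 + n^2 x), evaluated at the bad point x = 1/N^2.
def tail_at_bad_point(N, terms=100_000):
    x = 1.0 / (N * N)
    return sum(1.0 / (1.0 + n * n * x) for n in range(N, N + terms))

for N in (10, 100, 1000):
    print(N, tail_at_bad_point(N))  # always at least 1/2
```

So the supremum of the tail over \((0,\infty)\) stays bounded away from zero for every \(N\), which is exactly the negation of uniform convergence.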