OpenStudy (anonymous):

Can the roots of the function f(x) = x - cos(x) be determined analytically?

OpenStudy (anonymous):

I think the answer is no, but I don't know what reason to give

OpenStudy (anonymous):

As of now there is no known closed-form solution according to WolframAlpha; see "Dottie number"

OpenStudy (anonymous):

Perhaps there is some theory of transcendental equations which shows it can't be given in terms of elementary functions, similar to Galois theory or something

OpenStudy (anonymous):

When you say analytic solutions, I assume you mean in terms of elementary functions

OpenStudy (anonymous):

I actually have no idea what you're talking about hahaha! By analytical, I mean factorization or other exact methods, as opposed to numerical methods

OpenStudy (anonymous):

The term 'analytic solution' or 'closed form' usually refers to the representation of some expression in terms of elementary functions, i.e. other trigonometric functions, surds, etc.

OpenStudy (anonymous):

For example, general quintic equations can't be solved in terms of radicals the way lower-order polynomials can

OpenStudy (kainui):

I found a method, you can try this: \(x=\cos(x)\). Now just plug it into itself many times on your calculator and you'll approach the right answer, like this: cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(cos(x)))))))))))))))). Then you can use 1 or 0 for x; it doesn't really matter, since the starting value washes out as you iterate. Compare it to Wolfram Alpha.
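This calculator trick is easy to sketch in code. A minimal Python version of the iteration described above (not part of the original thread), assuming a starting guess of 1:

```python
import math

# Fixed-point iteration: repeatedly apply cos to any starting value.
# The iterates converge to the unique solution of cos(x) = x.
x = 1.0
for _ in range(100):
    x = math.cos(x)

print(x)  # approaches the Dottie number, ~0.739085
```

The starting value really doesn't matter: every initial guess is pulled toward the same fixed point.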

OpenStudy (anonymous):

If you're interested in that sort of thing, you could look up keywords like 'fixed point' or 'fractals'

OpenStudy (anonymous):

Numerically, you could try Newton's method, the Lagrange inversion formula, or a number of other things.

OpenStudy (kainui):

What are those @Jack183

OpenStudy (anonymous):

I don't really know much about either, but if you're interested in iterating the composition of functions like that, I think they are somewhat related.

OpenStudy (anonymous):

Though I think we are getting off topic from the original question lol

OpenStudy (anonymous):

$$\cos x=x$$i.e. you want the fixed-point of \(\cos x\) -- unfortunately the best you're going to get is probably polynomial approximations using Taylor series e.g.$$\cos x\approx 1-\frac{x^2}2+\frac{x^4}{24}=x$$which yields \(x\approx0.73922\), close to the exact value \(x=0.739085\dots\) -- and that's only with a fourth-order approximation. Unfortunately the gain in using higher-order approximations gradually diminishes (law of diminishing returns)
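A quick Python sketch (not part of the original thread) that finds the root of the fourth-order Taylor approximation above by bisection, so you can compare it against the exact fixed point:

```python
# Solve 1 - x^2/2 + x^4/24 = x (the 4th-order Taylor approximation of
# cos(x) = x) by bisection on [0, 1], where the function changes sign.
def f(x):
    return 1 - x**2 / 2 + x**4 / 24 - x

lo, hi = 0.0, 1.0  # f(0) = 1 > 0, f(1) = 1/24 - 1/2 < 0
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

print(lo)  # ~0.73922, vs the exact Dottie number 0.739085...
```

The quartic's root agrees with the true fixed point to about three decimal places, matching the figures quoted above.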

OpenStudy (anonymous):

@Kainui Newton's method is an iterative method to determine the zeros of a sufficiently 'nicely' behaved function. The Lagrange inversion formula is a nicer approach (like mine above) that gives the Taylor series expansion of the inverse of an analytic function (e.g. spits out a series for some \(\cos^{-1}\))

OpenStudy (anonymous):

Anyways, @Kainui what you discovered is a demonstration of the Banach fixed-point theorem (or contraction mapping theorem). As it turns out, \(\cos x\) is a contraction mapping in that it 'bunches' \(x\)s together; mathematically speaking, on the interval \([-1,1]\) (where every iterate lands after a single application of cosine) equipped with metric (distance) \(d(x,y)=|x-y|\), there is some nonnegative real constant \(M<1\) (here \(M=\sin 1\approx0.84\)) such that \(d(\cos x,\cos y)\le Md(x,y)\) for all \(x,y\), i.e. \(\cos \cdot\) is *guaranteed* to push any two points together to some degree.
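You can check the contraction bound empirically. A sketch (illustrative, not from the original thread) that samples random pairs in \([-1,1]\) and verifies \(|\cos x-\cos y|\le M|x-y|\) with \(M=\sin 1\), which follows from the mean value theorem since \(|\sin\xi|\le\sin 1\) on that interval:

```python
import math
import random

# Empirically verify |cos x - cos y| <= M |x - y| on [-1, 1],
# where M = sin(1) ~ 0.841 < 1 (mean value theorem bound).
M = math.sin(1)
random.seed(0)
ok = all(
    abs(math.cos(x) - math.cos(y)) <= M * abs(x - y) + 1e-12
    for x, y in ((random.uniform(-1, 1), random.uniform(-1, 1))
                 for _ in range(10000))
)
print(ok)  # True: cos is a contraction on [-1, 1]
```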

OpenStudy (kainui):

Wow weird, sounds fancy. I'm taking real analysis and differential equations this fall. How many classes am I away from Banach fixed-point theorem coming up in a math class of mine?

OpenStudy (anonymous):

It then follows that, given a fixed point \(x^*\) of \(\cos\cdot\), i.e. a value satisfying \(\cos x^*=x^*\), and any other point \(y\), the fact that \(\cos\cdot\) is a 'contracting' map gives \(d(\cos x^*,\cos y)=d(x^*,\cos y)<d(x^*,y)\), i.e. applying \(\cos\) to \(y\) brings it *closer* to our fixed point \(x^*\). It follows that you can keep applying \(\cos\) (as you did in your example) to get arbitrarily close to our fixed point \(x^*\) -- which is why we see (using functional power notation \(\cos^{(n)}=\underbrace{\cos\circ\cos\circ\cos\circ\cdots\cos}_{n\ \text{times}}\)):$$\cos^{(n)}y\to x^*\text{ as } n\to\infty$$

OpenStudy (anonymous):

oops that inequality should read \(<d(x^*,y)\) i.e. the distance shrinks by taking the cosine again

OpenStudy (anonymous):

Not too far @Kainui -- you will definitely learn about metric spaces in real analysis, and in differential equations (depending on the level of rigor) you will learn of the Picard-Lindelöf existence-uniqueness theorem for solutions of differential equations, most proofs of which use the Banach fixed-point theorem. In fact, some DE books will even introduce you to the notion of Picard iterations (which are themselves contraction mappings and whose fixed point is our solution!)
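To make Picard iteration concrete, here is a small Python sketch (an illustrative example, not from the original thread) for the textbook problem \(y'=y\), \(y(0)=1\): each iterate \(y_{n+1}(x)=1+\int_0^x y_n(t)\,dt\) is a polynomial, and the iterates converge to the Taylor series of \(e^x\).

```python
from fractions import Fraction

# Picard iteration for y' = y, y(0) = 1.  Polynomials are stored as
# coefficient lists [c0, c1, c2, ...] meaning c0 + c1*x + c2*x^2 + ...
def picard_step(coeffs):
    # Integrate term-by-term, then add the initial condition y(0) = 1.
    integrated = [Fraction(0)] + [c / (i + 1) for i, c in enumerate(coeffs)]
    integrated[0] += 1
    return integrated

y = [Fraction(1)]  # y_0(x) = 1
for _ in range(5):
    y = picard_step(y)

print(y)  # [1, 1, 1/2, 1/6, 1/24, 1/120]: partial sums of e^x
```

The successive iterates are exactly the partial Taylor sums of \(e^x\), the unique solution, illustrating how the contraction's fixed point *is* the solution of the ODE.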

OpenStudy (kainui):

Hmm, interesting. I'm actually a little afraid to take Real Analysis simply because I feel like it'll make me more narrow minded towards math even though it will open a lot of doors for me.

OpenStudy (dumbcow):

I found Newton's method works well here... it gives the answer in only about 5 iterations, and you can easily plug the formula into a spreadsheet program
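For completeness, a minimal Python sketch of that Newton iteration (not part of the original thread): for \(f(x)=x-\cos x\) we have \(f'(x)=1+\sin x\), so the update is \(x\mapsto x-\frac{x-\cos x}{1+\sin x}\).

```python
import math

# Newton's method for f(x) = x - cos(x), f'(x) = 1 + sin(x).
# Quadratic convergence: about 5 iterations reach machine precision.
x = 1.0
for _ in range(5):
    x -= (x - math.cos(x)) / (1 + math.sin(x))

print(x)  # ~0.7390851332151607, the Dottie number
```

The same update formula is what you would put in a spreadsheet cell, each row referencing the one above it.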
