let k be a random number between 1 and n; what is the probability that k < sqrt(n)?
I know that as n approaches infinity, it becomes more and more likely to draw a number bigger than sqrt(n): roughly n - sqrt(n) of the n elements are bigger than sqrt(n)...
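To make that count concrete, here is a small Python sketch (my own illustration, not from the thread) that computes the exact probability for a uniformly random integer k in {1, ..., n}, together with a Monte Carlo check. The function names are hypothetical:

```python
import math
import random

def prob_less_than_sqrt(n):
    """Exact P(k < sqrt(n)) for k uniform on {1, ..., n}.

    The integers with k < sqrt(n) are 1, ..., ceil(sqrt(n)) - 1,
    so the probability is (ceil(sqrt(n)) - 1) / n.
    """
    return (math.ceil(math.sqrt(n)) - 1) / n

def estimate_by_simulation(n, trials=100_000):
    """Monte Carlo estimate of the same probability."""
    hits = sum(1 for _ in range(trials)
               if random.randint(1, n) < math.sqrt(n))
    return hits / trials

for n in (10, 100, 10_000, 1_000_000):
    print(n, prob_less_than_sqrt(n))
```

The exact value is roughly 1/sqrt(n), which goes to 0 as n grows, matching the intuition above that almost all of {1, ..., n} lies above sqrt(n).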
So long as k is positive and between 1 and n, as specified, the square root of any such k will be smaller than n. Why not write out a few integers k less than n = 10 and then compare them to sqrt(10)? Are these integers larger than, equal to, or smaller than sqrt(10)? Not a particularly imaginative suggestion, granted, but doing this might help you to see more clearly what's happening. It might also be interesting for you to investigate what would happen if k were between 0 and 1, although you were NOT asked to do that.
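That suggestion is easy to carry out in a few lines of Python (a sketch of my own, assuming uniformly random integers as discussed below):

```python
import math

n = 10
root = math.sqrt(n)  # sqrt(10) is about 3.1623

# Compare each integer k in {1, ..., 10} against sqrt(10).
for k in range(1, n + 1):
    relation = "<" if k < root else ">"
    print(f"{k} {relation} sqrt({n})")
```

Only k = 1, 2, 3 fall below sqrt(10), so 3 of the 10 equally likely outcomes satisfy k < sqrt(n), giving a probability of 3/10 for this small case.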
What type of random variable is k? Is it uniformly distributed?
Good questions, wio. Without giving the matter much thought, I simplified the problem for myself by first considering k to be a random INTEGER. I did not consider what kind of distribution k might have.
I'm pretty sure you can assume it's uniformly distributed if it's truly random. As for whether it's a real number or an integer, I'd guess the latter, but who knows.