Please, I need urgent help with this question: for a certain first-order reaction, it takes 156 seconds for the concentration of reactant to fall from 0.100 M to 0.0500 M. How much time would it take for the concentration of reactant to fall from 0.0500 M to 0.0250 M?
Here is an attempt. For a first-order reaction the rate law is \[r = k[A]\] where \(r\) is the rate, \(k\) the rate constant, and \([A]\) the concentration of A. My first instinct was to compute an average rate from the first interval: \[\frac{ [A]_{0}-[A] }{ \Delta t } = \frac{ 0.100~\text{M}-0.0500~\text{M} }{ 156~\text{s} } = 3.2\times10^{-4}~\text{M s}^{-1}\] and then solve for the time needed for the concentration to fall from 0.0500 M to 0.0250 M: \[\Delta t = \frac{ [A]_{0}-[A] }{ \text{rate} } = \frac{ 0.0500~\text{M}-0.0250~\text{M} }{ 3.2\times10^{-4}~\text{M s}^{-1} } \approx 78~\text{s}\] But that treats the rate as constant, which is only true for a zero-order reaction. For a first-order reaction the rate drops as \([A]\) drops, so we need the integrated rate law: \[\ln\frac{ [A]_{0} }{ [A] } = kt\] Both intervals involve the concentration falling by the same factor of two \(([A]_{0}/[A] = 2)\), so both take the same time, the half-life: \[t_{1/2} = \frac{ \ln 2 }{ k } = 156~\text{s}\] So the answer is 156 seconds, not the ~78 seconds the constant-rate estimate gives.
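A quick numerical check of the first-order treatment (a minimal sketch; the variable names are my own, and the only input is the 156 s halving time from the problem):

```python
import math

# Given: first-order reaction, [A] falls from 0.100 M to 0.0500 M in 156 s.
t_first_interval = 156.0  # seconds

# Integrated first-order rate law: ln([A]0 / [A]) = k * t
# Solve for k using the first interval, where [A]0/[A] = 0.100/0.0500 = 2.
k = math.log(0.100 / 0.0500) / t_first_interval  # rate constant, s^-1

# Time for the second interval, 0.0500 M -> 0.0250 M (ratio is again 2):
t_second_interval = math.log(0.0500 / 0.0250) / k

print(f"k = {k:.5f} s^-1")
print(f"time for 0.0500 M -> 0.0250 M: {t_second_interval:.0f} s")  # 156 s
```

Because the concentration ratio is 2 in both intervals, the computed time comes out identical to the first interval, which is exactly the constant-half-life property of first-order kinetics.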