A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.
- The distance between interference fringes increases.
- The distance between interference fringes remains the same.
- The effect cannot be determined unless the distance between the slits and the screen is known.
- The distance between interference fringes also decreases.
I used d sin(theta) = m(lambda), so I figured d would decrease, but I got it wrong. Am I misunderstanding what the terms mean?
d = slit width, theta = angle at which you are finding the interference, m = order number, lambda = wavelength
There is no term for the separation between the slits in this, so do I have the wrong formula?
oh no.. d = distance of separation.. sorry :P
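With d read correctly as the slit separation, the bright-fringe condition d sin(theta) = m(lambda) can be rearranged to sin(theta) = m(lambda)/d, so shrinking d forces the angle to grow. A minimal numeric sketch (the wavelength and slit separations below are made-up illustrative values, not from the problem):

```python
import math

def bright_fringe_angle(m, wavelength, slit_separation):
    """Angle (radians) of the m-th bright fringe, from d*sin(theta) = m*lambda."""
    return math.asin(m * wavelength / slit_separation)

# Illustrative values (assumptions): 500 nm light, first-order fringe (m = 1)
lam = 500e-9
theta_wide = bright_fringe_angle(1, lam, 0.5e-3)     # d = 0.5 mm
theta_narrow = bright_fringe_angle(1, lam, 0.25e-3)  # halving d...

print(theta_wide, theta_narrow)  # ...roughly doubles the (small) angle
```

For small angles sin(theta) ≈ theta, which is why halving d almost exactly doubles the first-order angle here.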
What would be the distance between the interference fringes?
Don't you have the expression derived in the book?
Nope, the book doesn't show it derived since it's general physics.
They have it derived for some problems but not for all. Would it help if I have the derived part?
Yes.. see the derivation.. I don't want to draw and derive it here :-/ it takes too much time
Well, I think the book states that sin(theta) would increase to keep the equation balanced when the distance between the slits decreases.
That's correct.. so what's your conclusion from that?
So the distance between the fringes will also increase?
Yes.. in our day-to-day lives, slits are so large that the fringe spacings are so very small you cannot see them. Only when you make the slits closer together, and narrower, can you make the interference visible!
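The book's derivation, in the small-angle limit, gives the fringe spacing as Δy = λL/d, where L is the slit-to-screen distance. A quick sketch of that inverse relationship (the numbers below are made-up illustrative values, not from the problem):

```python
def fringe_spacing(wavelength, screen_distance, slit_separation):
    """Small-angle double-slit fringe spacing: delta_y = lambda * L / d."""
    return wavelength * screen_distance / slit_separation

# Illustrative values (assumptions): 500 nm light, screen 1 m away
lam = 500e-9
L = 1.0

spacing_wide = fringe_spacing(lam, L, 0.5e-3)     # d = 0.5 mm
spacing_narrow = fringe_spacing(lam, L, 0.25e-3)  # halving d...

print(spacing_wide, spacing_narrow)  # ...doubles the fringe spacing
```

This is why the answer is that the distance between fringes increases as d decreases, and why no separate knowledge of L is needed for the direction of the effect: L only scales the spacing, while d sits in the denominator.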
ohhh that kinda makes sense
thank you =)
yes.. !