OpenStudy (caozeyuan):

is there a simple way to prove that the eigenfunctions of a Hermitian operator form a basis for a vector space?

OpenStudy (osprey):

@IrishBoy I recognise the thrust of the q but wonder if you can .... I don't know the details ... boooo hoooooo.

OpenStudy (osprey):

Does this have something to do with ORTHOGONALITY as a requirement for the basis for a vector space (vector components being orthogonal, any function which is going to be a component is orthogonal; cos theta and sin theta are orthogonal in cartesian/plane polar vector spaces)? If this is right I'd probably faint ...

OpenStudy (irishboy123):

@osprey this needs a Ninja, mate @Kainui

OpenStudy (osprey):

@IrishBoy123 You've got me on the tech word "ninja" ... some sort of heavy duty theoretical general relativist who knows about qm ????

OpenStudy (caozeyuan):

My thinking is that linear independence is easy, since non-degenerate eigenvalues yield orthogonal eigenfunctions. Then I want to prove that the dimension of the Hilbert space equals the cardinality of my basis set, which I don't know how to do.

OpenStudy (osprey):

This looks like a theoretical quantum mechanics problem ... is it undergraduate or postgraduate? By which I'm asking: is it a project, or one question of many, so to speak. I've hauled out a book on Hermitians and Hilbert spaces and orthogonality and Schrödinger's equation ...

OpenStudy (caozeyuan):

undergraduate question

OpenStudy (kainui):

Depends on how rigorous you want to be and what sort of notation you're comfortable with. Orthogonality of vectors is not a requirement for being a basis; what matters is that the vectors in your set are linearly independent. That said, if all the vectors are mutually orthogonal, then they are linearly independent, and they form a basis if the number of vectors in your set equals the dimension of your space.

You can show that eigenfunctions of a Hermitian operator with distinct eigenvalues are orthogonal pretty easily; it's a fun proof to work through. I'll skip some details to let you fill in the gaps. If you're not comfortable with Dirac notation you can easily do this with either integrals or linear algebra too; I'll show you if you can't figure it out. Start with two eigenfunctions with different eigenvalues:\[\hat H |\psi_i\rangle = E_i|\psi_i\rangle\]\[\hat H |\psi_j\rangle = E_j|\psi_j\rangle\] Now calculate \(\langle \psi_i |\hat H |\psi_j\rangle\). Acting \(\hat H\) to the left gives \(E_i\) (the eigenvalues of a Hermitian operator are real), while acting to the right gives \(E_j\): \[\langle \psi_i |\hat H |\psi_j\rangle = E_i \langle \psi_i |\psi_j\rangle =E_j \langle \psi_i |\psi_j\rangle\] You might be tempted to divide both sides by \( \langle \psi_i |\psi_j\rangle\) and get \[E_i=E_j\] but this contradicts our assumption that the eigenvalues are different! So what's the problem? Well, it's because you were dividing by zero; that's the only alternative: \[ \langle \psi_i |\psi_j\rangle=0\] which means they're orthogonal!
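As a finite-dimensional sanity check of that proof, here is a Python/NumPy sketch (the 4x4 matrix is an arbitrary example, not anything from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary Hermitian matrix H = A + A^dagger, standing in for \hat H.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = A + A.conj().T

# eigh is NumPy's solver for Hermitian matrices: it returns real
# eigenvalues E and orthonormal eigenvectors (the columns of V).
E, V = np.linalg.eigh(H)

# <psi_i|psi_j> = 0 for i != j, so the overlap matrix V^dagger V
# is the identity.
overlaps = V.conj().T @ V
assert np.allclose(E.imag, 0)            # eigenvalues are real
assert np.allclose(overlaps, np.eye(4))  # eigenvectors are orthonormal
```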

OpenStudy (kainui):

I think I mixed up the question asker and the other person. This doesn't entirely prove it; it only shows that eigenfunctions with different eigenvalues are orthogonal. There's more to be shown: if you have eigenfunctions with the same eigenvalue, they can always be orthogonalized by the Gram-Schmidt procedure. And furthermore, I guess this doesn't really answer the question... Do you want to know that the basis vectors you get span the entire space? Or do you want to know, more simply, how to show eigenfunctions span a vector space in a generic sense? The latter I can answer right now: you look at the definition of a vector space and verify that they obey all the vector space axioms.
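For the degenerate case, a minimal Gram-Schmidt sketch in Python/NumPy (the two vectors here are hypothetical eigenvectors sharing one eigenvalue):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent (complex) vectors."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto every vector already kept.
        for u in basis:
            v = v - np.vdot(u, v) * u  # np.vdot conjugates its first argument
        basis.append(v / np.linalg.norm(v))
    return basis

# Two non-orthogonal vectors, e.g. from a shared (degenerate) eigenspace.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
u1, u2 = gram_schmidt([v1, v2])

# Any linear combination of vectors in the eigenspace is still an
# eigenvector with the same eigenvalue, but now <u1|u2> = 0.
assert abs(np.vdot(u1, u2)) < 1e-12
```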

OpenStudy (caozeyuan):

The first one. Linear independence is easy to prove because our teacher does not require us to know the degenerate case, but how about the span?

OpenStudy (kainui):

I think the span comes down to definition. The eigenvalues of the Hermitian operator are all the observables of your system, and each eigenvalue corresponds to an eigenfunction in your space. So suppose there's some vector in your space which can't be reached by your eigenfunctions, i.e. it is not a linear combination of them. Since those eigenfunctions together account for all the eigenvalues, which are all the observables of your system, that vector isn't an actual vector in your vector space at all; it's a contradiction. Does that seem ok, or is that cheating? Sometimes proofs like this feel like cheating, so I can't tell lol
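In the finite-dimensional picture, "reached by your eigenfunctions" can be checked directly: the expansion coefficients of any vector in the eigenbasis are just the overlaps \(c_i = \langle\psi_i|v\rangle\). A NumPy sketch with an arbitrary example matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
H = A + A.conj().T        # arbitrary Hermitian "operator"
E, V = np.linalg.eigh(H)  # columns of V: orthonormal eigenvectors

# Pick any vector in the space and expand it in the eigenbasis:
# c_i = <psi_i|v>, then rebuild v as sum_i c_i |psi_i>.
v = rng.normal(size=3) + 1j * rng.normal(size=3)
c = V.conj().T @ v
v_rebuilt = V @ c
assert np.allclose(v_rebuilt, v)  # every vector is reachable: they span
```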

OpenStudy (caozeyuan):

What if we prove the third condition, that the dimension of this vector space is the same as the number of vectors in our basis set? From what I learned in linear algebra, this plus linear independence would also be enough.
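That count is exactly what the spectral theorem guarantees in finite dimensions: an n x n Hermitian matrix always has n orthonormal eigenvectors, so linear independence plus (count = dimension) makes them a basis. A quick NumPy check on an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = A + A.conj().T
E, V = np.linalg.eigh(H)

# n eigenvectors for an n x n Hermitian matrix, and they are linearly
# independent (V has full rank), so they form a basis of the space.
assert V.shape[1] == n
assert np.linalg.matrix_rank(V) == n
```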
