Can we represent a matrix in a matrix?
For instance, if A is a 2x2 matrix,\[\LARGE AI = \left[\begin{matrix}A & 0 \\ 0 & A\end{matrix}\right]\]
Hmmmm, not sure. Haven't made it that far XD
Suppose \[A=\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\] Then we have: \[\left[\begin{matrix}a & b\\ c & d\end{matrix}\right] I = \left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right]\] Which is really equivalent to saying we can represent any nxn matrix as a (kn)x(kn) matrix with k copies banded down the diagonal... which seems wrong, since it implies the determinant gets raised to the kth power (squared here, with k=2), meaning it's a different matrix... But that's not really enough for me to say this is completely wrong yet.
Let \(A=\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\), then \[\left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right]=\left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right] = AI\]
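To see the 4x4 object concretely, here is a quick numerical sketch (assuming numpy; the entries 1, 2, 3, 4 are arbitrary stand-ins for a, b, c, d). The block-diagonal copy is exactly the Kronecker product \(I_2 \otimes A\):

```python
import numpy as np

# Concrete 4x4 form of [[A, 0], [0, A]]; the entries 1, 2, 3, 4
# are arbitrary stand-ins for a, b, c, d.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# The block-diagonal copy is the Kronecker product I_2 (x) A.
A_block = np.kron(np.eye(2), A)
print(A_block)
# [[1. 2. 0. 0.]
#  [3. 4. 0. 0.]
#  [0. 0. 1. 2.]
#  [0. 0. 3. 4.]]
```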
Sure, but this seems to violate the rule: since AI = A, they should have the same determinant. \[\Large \det(AI)=\det(A) \\ \Large \det(A)^2 \ne \det(A)\] You see what I mean when we take the determinant of the block matrix? However, I am not opposed to the idea of det(A) being the "principal" determinant or something.
Here is another interesting "weird" use I found, using the Cayley-Hamilton theorem, where the dimensions don't quite line up. Let A be a 2x2 matrix with determinant d and trace t. \[\left[\begin{matrix}t & -d\\ 1 & 0\end{matrix}\right] \left(\begin{matrix}A \\ I\end{matrix}\right) = \left(\begin{matrix}A^2 \\ A\end{matrix}\right)\] You see, if we expand the "vectors" as 2x2 matrices, we are multiplying a 2x2 matrix by a 4x2 matrix, which is technically not allowed. But it works out, since the blocks are just scalar linear combinations of each other: Cayley-Hamilton gives \(A^2 = tA - dI\).
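A quick numerical check of that identity (a numpy sketch; the example matrix is arbitrary). The top "row" of the block product is \(tA - dI\), which Cayley-Hamilton says equals \(A^2\), and the bottom row is just \(1\cdot A + 0\cdot I = A\):

```python
import numpy as np

# For any 2x2 matrix A with trace t and determinant d,
# Cayley-Hamilton gives A^2 = t*A - d*I. Example entries arbitrary.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)
t = np.trace(A)        # 5.0
d = np.linalg.det(A)   # -2.0

# Top block of the product: t*A + (-d)*I; bottom block: 1*A + 0*I.
assert np.allclose(t * A - d * I, A @ A)
assert np.allclose(1 * A + 0 * I, A)
```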
This makes me wonder whether regular matrix multiplication is really just a special case of a more general matrix multiplication, where you simply resize whatever you are multiplying to match.
\[\Large A_{ij}B_{jk}=C_{ik}\] What I'm saying is to change this to do something like we had in the last example I showed, with matrices inside the vectors. Perhaps I could write it with just their dimensions as indices: \[\Large A_{22}B_{42}=A_{22}[B_{21}(b_{22})]=[C_{21}(c_{22})]=C_{42}\] by exchanging common factors and factoring, maybe.
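One hedged way to make this "resize to match" idea precise is to promote the 2x2 coefficient matrix with a Kronecker product so the shapes become legal; this is only a sketch of one possible formalization, using numpy and an arbitrary example matrix:

```python
import numpy as np

# Sketch: promote the 2x2 coefficient matrix via a Kronecker product
# so the 4x2 stack [A; I] can legally be multiplied.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)
t, d = np.trace(A), np.linalg.det(A)

C = np.array([[t, -d],
              [1.0, 0.0]])

stacked = np.vstack([A, I])        # the 4x2 "vector" (A; I)
result = np.kron(C, I) @ stacked   # a legal (4x4) @ (4x2) product

# Top block is A^2, bottom block is A, as in the post above.
assert np.allclose(result, np.vstack([A @ A, A]))
```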
I don't understand your claim: "Sure, but this seems to violate the rule: since AI = A, they should have the same determinant. \(\det(AI)=\det(A)\), \(\det(A)^2\ne\det(A)\)"
To make it clear, \[\left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right]=\left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right] = \left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right]\times \left[\begin{matrix}I_2 & 0\\ 0 & I_2\end{matrix}\right] = \left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right] I_4\]
Ahh ok, so what I'm saying is I am pulling the matrix A inside the other matrix like you would commonly pull in a scalar: \[ AI =\left[\begin{matrix}a & b\\ c & d\end{matrix}\right] \left[\begin{matrix}1& 0\\ 0 & 1\end{matrix}\right] = \left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right]\]
This may be better or worse... Let \(A=\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\), \(A' = \left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right] =\left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right] \) \[\det(AI) = \det( \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\times \left[\begin{matrix}1 & 0\\ 0 & 1\end{matrix}\right]) =\det ( \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]) = \det(A)\] \[\det(A'I) = \det( \left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right] \times \left[\begin{matrix}I_2 & 0\\ 0 & I_2\end{matrix}\right]) =\det ( \left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right]) = \det(A)\det(A) = (\det(A))^2\]
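Both determinant computations can be checked numerically; here is a numpy sketch with arbitrary entries standing in for a, b, c, d:

```python
import numpy as np

# Numerical check of det(A) versus det(A'), where A' is the
# 4x4 block-diagonal copy of A. Entries are arbitrary.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_prime = np.kron(np.eye(2), A)

print(np.linalg.det(A))        # -2.0          = det(A)
print(np.linalg.det(A_prime))  #  4.0 = (-2)^2 = (det(A))^2
```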
I think no, because if the matrix on the left side is nxn, then the matrix on the right side is (2n)x(2n).
For example, from physics: if your thesis were true, then we could say that the Pauli matrices and the Dirac matrices are the same matrices, which would be the n=2 case.
@Michele_Laino What are you referring to when you say "I think no, ..."?
to this equality: \[A=\left[\begin{matrix}A & 0 \\ 0 & A\end{matrix}\right]\]
Essentially what I'm saying is that perhaps we should consider matrices to be representable periodically, and that \[A=\left[\begin{matrix}A & 0 \\ 0 & A\end{matrix}\right]\] is in fact a correct representation.
No, it is not. That's why I needed to define A'.
You say it is not, but there is no reason why it can't be. I can define it this way if I choose.
\[A=\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\] \[A' = \left[\begin{matrix} \left[\begin{matrix}a & b\\ c & d\end{matrix}\right]& \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] \\ \left[\begin{matrix}0 & 0\\ 0 & 0\end{matrix}\right] &\left[\begin{matrix}a & b\\ c & d\end{matrix}\right]\end{matrix}\right] =\left[\begin{matrix}A & 0\\ 0 & A\end{matrix}\right]\] Two matrices can only be identical when 1) they are of the same size 2) the corresponding entries are identical. This is the definition.
Sorry @Kainui, but if your equality were true, then the Pauli matrices would be able to act on a space of four-spinors, and of course that is false.
When you define it that way, you are just abusing the notation and making it unclear.
Now to account for it, all I have to say is that det(A) is the "principal determinant", and the version of this matrix built as the direct sum of n copies has determinant equal to the principal determinant to the nth power. Arguing with this definition as you currently are is like saying there is only a single square root of 1. You can say the square root of 1 is defined to be 1, but I am saying it can be both 1 and -1, depending on what you need.
So, what is your question?
The question in the question box, "Can we represent a matrix in a matrix?" --> Yes; a block matrix is an example. The question about the determinant, "\(\det(AI)=\det(A)\), \(\det(A)^2\ne \det(A)\)" --> I can't say, since I don't know whether your A is the 2x2 matrix or the 4x4 matrix; I have shown you both cases six posts above. Just to make my presentation clearer, I chose to define A' instead of reusing A, since A' and A are not the same.
I am not disagreeing with you. I just think it might be time to depart from the standard definition, and in doing so, ask: does this work? If so, how? Or are there holes in it that make it irreconcilable?
One more thing, when two matrices are the same, no matter how you name them, they are still the same. Example: If \(A=\left[\begin{matrix}1 & 2\\ 3 & 4\end{matrix}\right]\), \(B=\left[\begin{matrix}1 & 2\\ 3 & 4\end{matrix}\right]\), then A=B BUT if \(A=\left[\begin{matrix}1 & 2\\ 3 & 4\end{matrix}\right]\), \(B=\left[\begin{matrix}1 & 2\\ 4 & 3\end{matrix}\right]\), then A≠B Also, if \(A=\left[\begin{matrix}1 & 2\\ 3 & 4\end{matrix}\right]\), \(A=\left[\begin{matrix}1 & 2 & 5\\ 3 & 4 & 6\end{matrix}\right]\), then the first A is not the same as the second A
Yeah, I am not really worried about basic things like this. From the traditional standpoint these are obvious statements of fact.
So far, I know we can compute ordinary multiplication using blocks as entries, given that the blocks are compatible.
If the blocks are not compatible, you'd better re-partition them.
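For instance, here is a small numpy sketch showing that block-wise multiplication with compatible 2x2 blocks reproduces ordinary 4x4 multiplication (random test matrices, arbitrary seed):

```python
import numpy as np

# Partition two 4x4 matrices into compatible 2x2 blocks, multiply
# treating blocks as entries, and compare with the ordinary product.
rng = np.random.default_rng(0)
M = rng.random((4, 4))
N = rng.random((4, 4))

def blocks(X):
    """Split a 4x4 matrix into a 2x2 grid of 2x2 blocks."""
    return [[X[:2, :2], X[:2, 2:]],
            [X[2:, :2], X[2:, 2:]]]

Mb, Nb = blocks(M), blocks(N)
# C_ik = sum_j M_ij @ N_jk, with each "entry" a 2x2 block.
Cb = [[sum(Mb[i][j] @ Nb[j][k] for j in range(2)) for k in range(2)]
      for i in range(2)]

assert np.allclose(np.block(Cb), M @ N)
```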
Search up the Kronecker product.
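The Kronecker product does make the embedding above precise: \(I_2 \otimes A\) is exactly the block-diagonal copy of A, and the mixed-product property \((A\otimes B)(C\otimes D) = (AC)\otimes(BD)\) is what makes block arithmetic consistent. A quick numpy sketch with arbitrary example matrices:

```python
import numpy as np

# Check the mixed-product property of the Kronecker product:
# (A (x) B)(A (x) B) = (A A) (x) (B B). Example matrices arbitrary.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

lhs = np.kron(A, B) @ np.kron(A, B)
rhs = np.kron(A @ A, B @ B)
assert np.allclose(lhs, rhs)
```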
WOW! This seems to fill in some of the missing parts in my book about block matrices! Thanks a lot!!!
@watchmath Ahh this is interesting, thanks.