Whether the Columns of Matrices have a Natural Order

This article is meant for readers who, like me, have studied Linear Algebra and who, like me, are curious about Quantum Mechanics.

Are the columns of a matrix in a given, natural order, as we write them? Well, if we are using the matrix as a rotation matrix in CGI (i.e., its elements are derived from the trig functions of Euler Angles), then the column order depends on the order in which we have labelled the coordinates X, Y and Z. We are not free to change this order in the middle of our calculations, but if we decide that X, Y and Z are supposed to form a different sequence, then we need to use different matrices as well.
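
As a small illustration of that point, here is a minimal sketch in Python (numpy being my choice of tool, and the 30° angle and cyclic relabelling being hypothetical examples): conjugating the rotation matrix by a permutation matrix restates the same rotation with the axes listed in a different order, and its entries reappear in permuted rows and columns.

```python
import numpy as np

theta = np.radians(30.0)
c, s = np.cos(theta), np.sin(theta)

# A rotation about the axis we have labelled 'Z', with the columns written
# in the order (X, Y, Z):
Rz = np.array([[c,  -s,  0.0],
               [s,   c,  0.0],
               [0.0, 0.0, 1.0]])

# A permutation matrix expressing a cyclic relabelling of the three axes:
P = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# The same rotation, restated with the axes listed in the permuted order:
print(np.round(P.T @ Rz @ P, 6))
# The entries of Rz reappear, but in different rows and columns; here the
# result happens to look like a rotation about the axis now listed second.
```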

OTOH, we also know that a matrix can be an expression of a system of simultaneous equations, which can be solved manually through Gauss-Jordan Elimination on the matrix. If we have found that our system has infinitely many solutions, then we are inclined to say that certain variables are the “Leading Variables” while the others are the “Free Variables”. It is taught today that the Free Variables can also be made our parameters, so that the set of values for the Leading Variables follows from those parameters. But wait: should it not be arbitrary, for certain combinations of variables, which follows from which?

The answer is that if we simply use Gauss-Jordan Elimination, and if two variables are connected as having infinitely many possible combinations of values, then it will always be the variables stated earlier in the equations which become the Leading Variables, and the ones stated later in the equations which become the Free Variables. We could restate the same equations with the variables in some other order, and then, sure enough, the variable that used to be a Free one will have become a Leading one, and vice versa. (And if we do so, the parametric equations for the other Leading Variables will generally also change.)
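
Here is a minimal sketch of that behaviour in Python (sympy being my choice of tool, and the two-equation system being a hypothetical example): the only change between the two augmented matrices is the order of the variable columns, yet that alone decides which variable ends up Free.

```python
from sympy import Matrix

# x + y + z = 6  and  x + 2y + 3z = 14, with the columns in the order (x, y, z):
A_xyz = Matrix([[1, 1, 1, 6],
                [1, 2, 3, 14]])
print(A_xyz.rref())   # pivots in columns 0 and 1: x and y Leading, z Free

# The same two equations, restated with the columns in the order (z, y, x):
A_zyx = Matrix([[1, 1, 1, 6],
                [3, 2, 1, 14]])
print(A_zyx.rref())   # pivots in columns 0 and 1: z and y Leading, x Free
```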

The order of the columns has become the order of discovery.

This could also have ramifications for Quantum Mechanics, where matrices are sometimes used. QM used matrices at first in an effort to be empirical, and to acknowledge that we as Humans can only observe a subset of the properties which particles may have. What happens in QM is that some of the matrices used are computed to have Eigenvalues, and if those turn out to be real numbers, they are thought to correspond to observable properties of particles, while complex Eigenvalues are stated (modestly enough) not to correspond to observable properties of the particle.
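
To make that distinction concrete, here is a minimal sketch in Python (numpy being my choice of tool): one matrix whose Eigenvalues come out Real, which happens to be the Pauli ‘Y’ matrix from QM, and one whose Eigenvalues come out Complex, a plain 90° rotation of the plane.

```python
import numpy as np

# Hermitian, with complex elements: its Eigenvalues are +1 and -1 (Real).
pauli_y = np.array([[0.0, -1.0j],
                    [1.0j,  0.0]])
print(np.linalg.eigvals(pauli_y))

# Real elements, but not symmetrical: its Eigenvalues are +i and -i (Complex).
rotation_90 = np.array([[0.0, -1.0],
                        [1.0,  0.0]])
print(np.linalg.eigvals(rotation_90))
```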

Even though this system seems straightforward, it is not foolproof. According to Classical Principles, a Magnetic North Pole corresponds to a direction from which an assumed current is seen to be flowing, arbitrarily, either clockwise or counter-clockwise. It should follow, then, that from the opposite perspective, a current which was flowing clockwise before should always be flowing counter-clockwise. And yet, according to QM, magnetic monopoles should exist.


Self-Educating about Perpendicular Matrices with Complex Elements

One of the key reasons for which my class was taught Linear Algebra, including how to compute the Eigenvalues and Eigenvectors of Matrices, was so that we could Diagonalize Symmetrical Matrices, in Real Numbers. What this involved was computing the ‘Perpendicular Matrix’ of a given matrix, in which each column was one of its Eigenvectors, and which was an example of an Orthogonal Matrix. (It might be the case that what was once referred to as a Perpendicular Matrix would now be referred to as an Orthogonal Basis of the given matrix?)

Having computed the perpendicular matrix P of M, it was known that the matrix product

Pᵀ M P = D,

which gives a Diagonal Matrix ‘D’. But a key problem my Elementary Linear class was not taught to solve was what to do if ‘M’ had complex Eigenvalues. In order to be taught that, we would first have needed to be taught, in general, how to combine Linear Algebra with Complex Numbers. After that, the Eigenvectors could have been computed as easily as before, using Gauss-Jordan Elimination.
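
For the Real-number case that was taught, here is a minimal sketch in Python (numpy being my choice of tool, and the 2×2 matrix a hypothetical example): the columns of ‘P’ are the normalized Eigenvectors of the Symmetrical matrix ‘M’, and Pᵀ M P comes out Diagonal.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # Symmetrical, with Real elements

# 'eigh' assumes a symmetric (or Hermitian) input and returns orthonormal
# Eigenvectors as the columns of P:
eigenvalues, P = np.linalg.eigh(M)

D = P.T @ M @ P
print(np.round(D, 10))               # Diagonal, with the Eigenvalues 1 and 3
```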

I have brushed up on this in my old Linear Algebra textbook, whose last chapter covers Complex Numbers. Key facts which need to be understood about Complex Vector Spaces are the following (a short sketch checking them in code follows the list):

  • The Inner Product needs to be computed differently from before, in a way that borrows from the fact that complex numbers naturally have conjugates. It is now the sum, over the elements, of each element of one vector multiplied by the conjugate of the corresponding element of the other vector.
  • Orthogonal and Symmetrical Matrices are relatively unimportant with Complex Elements.
  • A special operation is defined for matrices, called the Conjugate Transpose, A* .
  • A Unitary Matrix now replaces the Orthogonal Matrix, such that A⁻¹ = A*.
  • A Hermitian Matrix now replaces the Symmetrical Matrix, such that A = A* , and the elements along the main diagonal are Real. Hermitian Matrices are also easy to recognize by inspection.
  • Not only Hermitian Matrices can be diagonalized: they belong to a superset, known as Normal Matrices, such that A A* = A* A, and all Normal Matrices can be diagonalized.
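
Here is the short sketch promised above, in Python (numpy being my choice of tool, and the 2×2 Hermitian matrix a hypothetical example), checking each of these facts in turn.

```python
import numpy as np

A = np.array([[2.0,        1.0 - 1.0j],
              [1.0 + 1.0j, 3.0       ]])

# The Conjugate Transpose, A*:
A_star = A.conj().T
print(np.allclose(A, A_star))                # True: A is Hermitian
print(np.allclose(A @ A_star, A_star @ A))   # True: therefore A is also Normal

# The complex Inner Product pairs each element with a conjugate:
u = np.array([1.0 + 2.0j, 3.0 + 0.0j])
v = np.array([0.0 + 2.0j, 1.0 - 1.0j])
print(np.sum(u * v.conj()))                  # equivalently np.vdot(v, u)

# A Unitary matrix of Eigenvectors now plays the role that P played above:
eigenvalues, U = np.linalg.eigh(A)           # 'eigh' also accepts Hermitian input
print(np.allclose(np.linalg.inv(U), U.conj().T))   # True: U is Unitary
print(np.round(U.conj().T @ A @ U, 10))      # Diagonal, with Real Eigenvalues 1 and 4
```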

This could all become important in Quantum Mechanics, considering the general issue known to exist, by which the bases that define how particles can interact somehow need to be multiplied by complex numbers, in order to describe accurately how particles do interact.
