In an earlier posting, I suggested a recipe for ‘perpendicularizing’ a matrix that was to represent a quadric equation, according to methods which I learned in “Linear 1”. That approach used the application ‘wxMaxima’, which is actually a fancy front-end for the application ‘Maxima’. But the main drawback of the direct approach I had suggested was that it depended on the package ‘lapack’, which, as I had written, takes a long time to compile.
Since writing that posting, I have discovered that some users cannot even get ‘lapack’ to compile, making it a broken, unusable package for them. Yet the desire could still exist to carry out the same project. Therefore, I have now expanded on the subject by using the package ‘eigen’, which is built into Maxima, and which should work for more users, assuming there is no bug in the way Maxima was built.
The following work-sheet explains what initially goes wrong when using the package ‘eigen’, and how to remedy that initial problem…
(Updated 6/17/2020, 14h35… )
(As of 1/14/2019, 12h00: )
I suppose there’s an observation I should add. Using just a matrix of unit eigenvectors carries a caveat: the eigenvectors may still fail to be orthogonal. If that’s the case, then using the transpose in place of the inverse is not acceptable.
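A short numpy sketch can show why. The matrix below is a hypothetical non-symmetric example chosen only for illustration; its unit eigenvectors are not mutually orthogonal, so the transpose of the eigenvector matrix fails where the true inverse still works:

```python
import numpy as np

# A hypothetical non-symmetric matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Columns of V are unit eigenvectors (numpy normalizes them).
w, V = np.linalg.eig(A)

# For this matrix the eigenvectors are not orthogonal,
# so V.T @ V is not the identity ...
print(np.allclose(V.T @ V, np.eye(2)))                # False
# ... and the transpose does not diagonalize A:
print(np.allclose(V.T @ A @ V, np.diag(w)))           # False
# The true inverse still does:
print(np.allclose(np.linalg.inv(V) @ A @ V, np.diag(w)))  # True
```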
If the reader is familiar with the exercise which I linked to at the top of this posting, he or she will notice that the matrix which I’m diagonalizing is symmetric. This is because each coefficient belonging to the quadric it represents has either been given to one diagonal element, or distributed equally between two off-diagonal elements of the matrix.
In that case, the matrix of eigenvectors will be orthogonal.
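The construction and its consequence can be sketched in numpy. The quadric coefficients below are hypothetical, chosen only to demonstrate the split of each mixed-term coefficient between two symmetric positions; because the resulting matrix is symmetric, its unit eigenvectors form an orthogonal matrix, and the transpose can legitimately replace the inverse:

```python
import numpy as np

# Hypothetical quadric:  x^2 + 2y^2 + 3z^2 + 4xy + 2xz + 6yz
# Each squared-term coefficient goes on the diagonal; each mixed-term
# coefficient is split equally between two symmetric positions.
a, b, c = 1.0, 2.0, 3.0     # coefficients of x^2, y^2, z^2
d, e, f = 4.0, 2.0, 6.0     # coefficients of xy, xz, yz
A = np.array([[a,   d/2, e/2],
              [d/2, b,   f/2],
              [e/2, f/2, c  ]])

# eigh is intended for symmetric matrices; V's columns are unit eigenvectors.
w, V = np.linalg.eigh(A)

# Because A is symmetric, the eigenvector matrix is orthogonal:
print(np.allclose(V.T @ V, np.eye(3)))        # True
# So the transpose may stand in for the inverse when diagonalizing:
print(np.allclose(V.T @ A @ V, np.diag(w)))   # True
```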
(Update 6/17/2020, 14h35: )
Upon reading other articles available on the Internet, I have learned that using the built-in capabilities of Maxima runs into shortcomings that go beyond simply having to reformat the output lists. Apparently, when we use the native packages, Maxima will try to compute the eigenvalues of a matrix, which are in fact the roots of its characteristic polynomial, as ‘exact, analytical solutions’, also referred to as ‘symbolic solutions’. The problem here is that for polynomials of degree 5 or higher, no general solution in radicals exists, and even for polynomials of degree 4, the general solution is too complex for consumer-grade Computer Algebra Systems (‘CAS’) to have in their database.
For that reason, what this posting suggests will work well for 3×3 matrices, but will often fail to produce results for larger matrices, unless those lead to characteristic polynomials that happen to be easy for Maxima to solve, just as many exercises in Math courses have specifically been designed to have rational roots, where none are really guaranteed.
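The same degree barrier can be demonstrated in sympy, a CAS whose symbolic solver behaves analogously to Maxima’s on this point (the specific polynomials below are my own illustrative choices). A cubic always yields explicit radical roots, which is why the 3×3 case generally works, while a generic quintic can only be answered with implicit root placeholders:

```python
import sympy as sp

x = sp.symbols('x')

# Degree 3: a closed-form (radical) solution always exists,
# which is why the 3x3 eigenvalue case generally succeeds.
cubic_roots = sp.solve(x**3 - 2*x - 5, x)
print(len(cubic_roots))   # 3 explicit radical expressions

# Degree 5: this quintic has no solution in radicals (Abel-Ruffini),
# so the CAS can only return implicit 'CRootOf' placeholders,
# not the explicit symbolic eigenvalues the eigen package needs.
quintic_roots = sp.solve(x**5 - x + 1, x)
print(all(isinstance(r, sp.CRootOf) for r in quintic_roots))   # True
```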
And then, once the eigenvalues cannot be found, the eigenvectors cannot be found either, and the situation described in the question linked to above results.
Apparently, when using Maxima, the only way to circumvent this eventual problem is to rely on ‘lapack’.
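To illustrate why a LAPACK-based route escapes the degree barrier, here is a numpy sketch: numpy.linalg.eigh calls compiled LAPACK routines, the same family of numerical code that Maxima’s ‘lapack’ package wraps. The random 6×6 symmetric matrix is arbitrary; its degree-6 characteristic polynomial would generally defeat a symbolic solver, yet the floating-point factorization succeeds:

```python
import numpy as np

# Numeric eigendecomposition sidesteps the symbolic-roots problem:
# numpy.linalg.eigh dispatches to compiled LAPACK routines.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = (M + M.T) / 2          # a random 6x6 symmetric matrix

w, V = np.linalg.eigh(A)   # floating-point eigenvalues and eigenvectors

# The degree-6 characteristic polynomial has no symbolic solution in
# general, but the numeric factorization reconstructs A exactly
# (to machine precision):
print(np.allclose(V @ np.diag(w) @ V.T, A))   # True
```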