Matrix equivalence: two matrices are equivalent if and only if they have the same size and the same rank. In particular, any matrix obtained from another by elementary transformations is equivalent to it. Matrix equivalence cannot be deduced from vector-group equivalence: two equivalent vector groups may contain different numbers of column vectors, so the matrices they form need not even have the same size.
Equivalence of vector groups
Two vector groups are equivalent when each can be linearly expressed by the other; it follows that r(A) = r(B). However, two equivalent vector groups can differ in their linear dependence.
Obviously, a linearly dependent column vector group (rank less than the number of vectors) is equivalent to its maximal independent subgroup. But the matrices formed by these two vector groups are not equivalent, because their sizes differ. The two groups also differ in linear dependence: the maximal independent subgroup is linearly independent, while the full group is linearly dependent.
Final conclusion: matrix equivalence and vector-group equivalence cannot be deduced from each other.
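A quick numerical illustration of the conclusion above (a sketch using numpy; the specific vectors are my own example): a dependent vector group and its maximal independent subgroup have the same rank, i.e. they are equivalent as vector groups, yet the matrices they form have different sizes, so they cannot be equivalent as matrices.

```python
import numpy as np

# A: three 3-dimensional columns with rank 2 (third column = first + second)
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
# B: a maximal independent subgroup (the first two columns of A)
B = A[:, :2]

# Equivalent as vector groups: each spans the other's column space, ranks agree
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 2 2

# Not equivalent as matrices: the sizes differ (3x3 vs 3x2)
print(A.shape, B.shape)  # (3, 3) (3, 2)
```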
2. A vs. the adjoint matrix A*
(1) When r(A) = n, r(A*) = n.
(2) When r(A) = n-1, r(A*) = 1.
(3) When r(A) ≤ n-2, r(A*) = 0.
Proof:
(1) AA* = |A|E. Since r(A) = n, A is invertible and |A| ≠ 0, so n = r(|A|E) = r(AA*) ≤ r(A*) ≤ n, hence r(A*) = n.
(2) r(A) = n-1 implies |A| = 0 and that some minor of order n-1 is nonzero, so A* ≠ 0 and r(A*) ≥ 1.
Also AA* = |A|E = O, so r(A) + r(A*) ≤ n, which gives r(A*) ≤ 1.
Therefore r(A*) = 1.
(3) When r(A) ≤ n-2, every minor of A of order n-1 is 0, so A* = O.
Therefore r(A*) = 0.
PS: Each of the above implications is in fact an equivalence; the converses also hold.
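The three cases can be checked numerically (a sketch using numpy; the helper `adjugate` is my own, since numpy has no built-in adjugate, and the diagonal test matrices are arbitrary examples of each rank):

```python
import numpy as np

def adjugate(A):
    """Adjugate A*: the transpose of the cofactor matrix of A."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # Delete row i and column j, take the signed determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

rank = np.linalg.matrix_rank

A_full = np.diag([1.0, 2.0, 3.0])  # r(A) = 3 = n
A_one  = np.diag([1.0, 2.0, 0.0])  # r(A) = 2 = n - 1
A_low  = np.diag([1.0, 0.0, 0.0])  # r(A) = 1 <= n - 2

print(rank(adjugate(A_full)))  # 3
print(rank(adjugate(A_one)))   # 1
print(rank(adjugate(A_low)))   # 0
```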
3. Eigenvalues and eigenvectors
(1) For an n-order matrix A, eigenvectors belonging to distinct eigenvalues are linearly independent.
(2) When an eigenvalue is a repeated root, suppose the eigenvectors corresponding to it are x1 and x2. Any linear combination k1·x1 + k2·x2 (with k1, k2 not both 0) is still an eigenvector of A.
(3) The sum of eigenvectors belonging to different eigenvalues is never an eigenvector of A (provable by contradiction).
(4) Each eigenvalue has infinitely many eigenvectors, but when constructing the matrix P we use only one per basis direction (usually taken from the fundamental system of solutions).
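Facts (1) and (3) can be seen concretely (a sketch using numpy; the diagonal matrix is my own minimal example): eigenvectors of distinct eigenvalues stack into a full-rank matrix, and their sum is not mapped to a scalar multiple of itself.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 5.0]])
x1 = np.array([1.0, 0.0])  # eigenvector for eigenvalue 2
x2 = np.array([0.0, 1.0])  # eigenvector for eigenvalue 5

# (1) eigenvectors of distinct eigenvalues are linearly independent
print(np.linalg.matrix_rank(np.column_stack([x1, x2])))  # 2

# (3) their sum is NOT an eigenvector: A(x1+x2) is not proportional to x1+x2
s = x1 + x2
print(A @ s)  # [2. 5.] -- not a scalar multiple of [1. 1.]
```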
Geometric space properties
Geometric meaning of the relationships between vectors
1. If vectors a1 and a2 are linearly dependent, then a1 ∥ a2.
2. If vectors a1 and a2 are linearly independent, they intersect or are skew.
3. If vectors a1, a2, a3 are linearly dependent, then a1 ∥ a2 ∥ a3 or they are coplanar.
4. If vectors a1, a2, a3 are linearly independent, then a1, a2, a3 are not coplanar.
PS: Not required for my Math III exam; included here just in case.
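Points 3 and 4 above have a simple determinant test (a sketch using numpy; the vectors are my own example): three 3-dimensional vectors are coplanar exactly when the matrix they form is singular.

```python
import numpy as np

# Three 3-dimensional vectors with a3 = a1 + a2, so they are linearly dependent
a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = a1 + a2

# Linearly dependent <=> determinant is 0 <=> the three vectors are coplanar
M = np.column_stack([a1, a2, a3])
print(np.linalg.det(M))          # ~0: coplanar
print(np.linalg.matrix_rank(M))  # 2
```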
Algebraic cofactors
(1) Algebraic cofactors are signed: the sign of A_ij is (-1)^(i+j).
(2) When building the adjoint matrix from algebraic cofactors, remember to transpose: the cofactors of a row become a column, and vice versa.
(3) The sum of products of the algebraic cofactors of one row (or column) with the corresponding elements of another row (or column) is 0.
(4) An algebraic cofactor does not depend on the element it corresponds to. For example, A11 does not involve a11: even if the value of a11 changes, A11 stays the same.
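Points (3) and (4) can be verified directly (a sketch using numpy; the `cofactor` helper and the 2x2 matrix are my own illustration):

```python
import numpy as np

def cofactor(A, i, j):
    """Algebraic cofactor A_ij: signed determinant of the minor."""
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# (4) A11 does not involve a11: changing a11 leaves the cofactor unchanged
print(cofactor(A, 0, 0))  # 4.0
B = A.copy()
B[0, 0] = 100.0
print(cofactor(B, 0, 0))  # still 4.0

# (3) elements of row 1 times the cofactors of row 0 sum to 0
print(A[1, 0] * cofactor(A, 0, 0) + A[1, 1] * cofactor(A, 0, 1))
```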
Congruent and similar matrices
First of all, the following conclusions all assume real symmetric matrices.
(1) If A ~ B, then A and B have the same eigenvalues. By orthogonal transformation each can be brought to the same standard quadratic form, so they have the same positive and negative indices of inertia, and therefore A and B are congruent.
Conclusion: if two real symmetric matrices are similar, they are congruent.
(2) A real symmetric matrix can always be diagonalized: there is an orthogonal matrix P such that P^T A P = Λ.
By the definition of congruence, every real symmetric matrix is congruent to a diagonal matrix.
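Conclusion (2) can be demonstrated numerically (a sketch using numpy; the symmetric matrix is my own example): `np.linalg.eigh` returns an orthogonal matrix P of eigenvectors, and P^T A P comes out diagonal, exhibiting the congruence.

```python
import numpy as np

# A real symmetric matrix (eigenvalues 1 and 3)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues and an orthogonal matrix P of eigenvectors
vals, P = np.linalg.eigh(A)

# P^T A P is diagonal: A is congruent (and similar) to diag(eigenvalues)
D = P.T @ A @ P
print(np.round(D, 10))
print(np.round(P.T @ P, 10))  # identity: P is orthogonal, so P^T = P^{-1}
```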