Algebra
Introduction
Algebra is a branch of mathematics that studies mathematical symbols and the rules for manipulating them. It is a broad area that covers topics such as equations, functions, polynomials, matrices, and vectors, and it is used extensively in fields such as science, engineering, economics, and computer science. The fundamental concepts of algebra include operations such as addition, subtraction, multiplication, and division, as well as properties of numbers and variables.
The basics
Linear algebra lets us write a system of linear equations compactly in matrix notation. The system

$a_{11}x_1 + \dots + a_{1n}x_n = b_1,\quad \dots,\quad a_{m1}x_1 + \dots + a_{mn}x_n = b_m$

becomes $Ax = b$, with $A \in \mathbb{R}^{m \times n}$ the coefficient matrix, $x \in \mathbb{R}^n$ the vector of unknowns, and $b \in \mathbb{R}^m$ the vector of constants. We can denote by $a_{ij}$ the entry of $A$ in row $i$ and column $j$, by $a_j$ its $j$-th column, and by $a_i^\top$ its $i$-th row.
With matrix notation in place, the product of two matrices $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$ is the matrix $C = AB \in \mathbb{R}^{m \times p}$ with entries $C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$.

Similarly, the inner product or dot product of two vectors $x, y \in \mathbb{R}^n$ is the scalar $x^\top y = \sum_{i=1}^{n} x_i y_i$. We have $x^\top y = y^\top x$.

For the outer product of two vectors $x \in \mathbb{R}^m$ and $y \in \mathbb{R}^n$, the result $xy^\top \in \mathbb{R}^{m \times n}$ is the matrix with entries $(xy^\top)_{ij} = x_i y_j$.
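These three products can be checked numerically. A quick NumPy sketch (the matrices and vectors here are my own example values):

```python
import numpy as np

# Matrix product: (2x3) @ (3x2) -> (2x2), C_ij = sum_k A_ik * B_kj
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
C = A @ B  # [[4, 5], [10, 11]]

# Inner (dot) product of two vectors: a scalar, x^T y = sum_i x_i y_i
x = np.array([1., 2., 3.])
y = np.array([4., 5., 6.])
inner = x @ y  # 1*4 + 2*5 + 3*6 = 32.0

# Outer product: entry (i, j) is x_i * y_j, a (3, 3) matrix here
outer = np.outer(x, y)
```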
There are some properties we need to remember:

- Matrix multiplication is associative: $(AB)C = A(BC)$
- Matrix multiplication is distributive: $A(B + C) = AB + AC$
- Matrix multiplication is generally not commutative: $AB \neq BA$
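These three properties can be spot-checked on small random matrices (a numerical sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))

# Associative: (AB)C == A(BC)
assoc = np.allclose((A @ B) @ C, A @ (B @ C))

# Distributive: A(B + C) == AB + AC
dist = np.allclose(A @ (B + C), A @ B + A @ C)

# Generally NOT commutative: AB != BA for typical A, B
comm = np.allclose(A @ B, B @ A)
```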
Definitions
The identity matrix, denoted $I \in \mathbb{R}^{n \times n}$, is the square matrix with ones on the diagonal and zeros everywhere else. It satisfies $AI = A = IA$ for all $A \in \mathbb{R}^{m \times n}$ (with identity matrices of the appropriate sizes).
A diagonal matrix is a matrix where all off-diagonal elements are 0. This is denoted $D = \mathrm{diag}(d_1, d_2, \dots, d_n)$, with $D_{ii} = d_i$.
In matrix algebra, transposing flips the rows and columns of a matrix. Given a matrix $A \in \mathbb{R}^{m \times n}$, its transpose $A^\top \in \mathbb{R}^{n \times m}$ has entries $(A^\top)_{ij} = A_{ji}$. Useful properties: $(A^\top)^\top = A$, $(AB)^\top = B^\top A^\top$, and $(A + B)^\top = A^\top + B^\top$.

A square matrix $A \in \mathbb{R}^{n \times n}$ is symmetric if $A = A^\top$ and anti-symmetric if $A = -A^\top$. Let's denote the set of all symmetric matrices of size $n$ as $\mathbb{S}^n$.
The trace of a square matrix $A \in \mathbb{R}^{n \times n}$, denoted $\mathrm{tr}(A)$, is the sum of its diagonal entries: $\mathrm{tr}(A) = \sum_{i=1}^{n} A_{ii}$. Some properties:

- For $A \in \mathbb{R}^{n \times n}$, $\mathrm{tr}(A) = \mathrm{tr}(A^\top)$
- For $A, B \in \mathbb{R}^{n \times n}$, $\mathrm{tr}(A + B) = \mathrm{tr}(A) + \mathrm{tr}(B)$
- For $A \in \mathbb{R}^{n \times n}$ and $t \in \mathbb{R}$, $\mathrm{tr}(tA) = t\,\mathrm{tr}(A)$
- For $A, B$ such that $AB$ is square, $\mathrm{tr}(AB) = \mathrm{tr}(BA)$
- For $A, B, C$ such that $ABC$ is square, $\mathrm{tr}(ABC) = \mathrm{tr}(BCA) = \mathrm{tr}(CAB)$
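The cyclic property is easy to verify numerically, even when $AB$ and $BA$ have different sizes (the shapes below are my own choice):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2))

# tr(AB) = tr(BA), even though AB is 2x2 while BA is 3x3
tr_ab = np.trace(A @ B)
tr_ba = np.trace(B @ A)

# tr(S) = tr(S^T) for any square matrix
S = rng.standard_normal((4, 4))
same_transpose = np.isclose(np.trace(S), np.trace(S.T))
```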
A norm of a vector $x$ is an informal measure of its length, such as the Euclidean or $\ell_2$ norm $\|x\|_2 = \sqrt{\sum_{i=1}^{n} x_i^2}$. Mathematically, a norm is a function $f : \mathbb{R}^n \to \mathbb{R}$ satisfying:

- (non-negativity) $f(x) \ge 0$ for all $x \in \mathbb{R}^n$
- (definiteness) $f(x) = 0$ if and only if $x = 0$
- (homogeneity) $f(tx) = |t|\,f(x)$ for all $x \in \mathbb{R}^n$, $t \in \mathbb{R}$
- (triangle inequality) $f(x + y) \le f(x) + f(y)$ for all $x, y \in \mathbb{R}^n$

Apart from the Euclidean norm, we have the $\ell_1$ norm $\|x\|_1 = \sum_{i=1}^{n} |x_i|$ and the $\ell_\infty$ norm $\|x\|_\infty = \max_i |x_i|$.

The generalization of this family is the $\ell_p$ norm, $\|x\|_p = \left(\sum_{i=1}^{n} |x_i|^p\right)^{1/p}$, for $p \ge 1$.

For matrices, we have the Frobenius norm: $\|A\|_F = \sqrt{\sum_{i}\sum_{j} A_{ij}^2} = \sqrt{\mathrm{tr}(A^\top A)}$.
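All of these norms are available through `np.linalg.norm` via its `ord` parameter; a small sketch with hand-picked values:

```python
import numpy as np

x = np.array([3., -4.])

l2 = np.linalg.norm(x)                # sqrt(9 + 16) = 5.0
l1 = np.linalg.norm(x, ord=1)         # |3| + |-4| = 7.0
linf = np.linalg.norm(x, ord=np.inf)  # max(|3|, |-4|) = 4.0

A = np.array([[1., 2.],
              [3., 4.]])
fro = np.linalg.norm(A, ord='fro')     # sqrt(1 + 4 + 9 + 16)
fro_trace = np.sqrt(np.trace(A.T @ A))  # same value via tr(A^T A)
```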
Next we study the linear independence of vectors. A set of vectors $\{x_1, \dots, x_k\} \subset \mathbb{R}^n$ is linearly independent if no vector in the set can be written as a linear combination of the others. The rank of a matrix $A \in \mathbb{R}^{m \times n}$, denoted $\mathrm{rank}(A)$, is the largest number of linearly independent columns of $A$ (which equals the largest number of linearly independent rows). Some properties:

- For $A \in \mathbb{R}^{m \times n}$, $\mathrm{rank}(A) \le \min(m, n)$. At equality, $A$ is full rank.
- For $A \in \mathbb{R}^{m \times n}$, $\mathrm{rank}(A) = \mathrm{rank}(A^\top)$
- For $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, $\mathrm{rank}(AB) \le \min(\mathrm{rank}(A), \mathrm{rank}(B))$
- For $A, B \in \mathbb{R}^{m \times n}$, $\mathrm{rank}(A + B) \le \mathrm{rank}(A) + \mathrm{rank}(B)$
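A concrete rank computation, using a matrix I constructed so that its third column is the sum of the first two:

```python
import numpy as np

# Third column = first column + second column, so rank is 2, not 3
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [2., 3., 5.]])

rank_A = np.linalg.matrix_rank(A)     # 2
rank_At = np.linalg.matrix_rank(A.T)  # rank(A) == rank(A^T)

# rank(AB) <= min(rank(A), rank(B))
B = np.eye(3)
rank_AB = np.linalg.matrix_rank(A @ B)
```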
The inverse of a square matrix $A \in \mathbb{R}^{n \times n}$ is denoted $A^{-1}$; it is the unique matrix such that $A^{-1}A = I = AA^{-1}$. It exists if and only if $A$ is full rank; such an $A$ is called invertible or non-singular. Useful properties: $(A^{-1})^{-1} = A$, $(AB)^{-1} = B^{-1}A^{-1}$, and $(A^{-1})^\top = (A^\top)^{-1}$.

If $A$ is invertible, then the solution $x$ of the system $Ax = b$ is $x = A^{-1}b$.
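In code, the system is best solved with `solve` rather than by forming $A^{-1}$ explicitly, which is both cheaper and numerically more stable. A sketch with a small hand-made system:

```python
import numpy as np

# The system 3x + y = 9, x + 2y = 8 has solution x = 2, y = 3
A = np.array([[3., 1.],
              [1., 2.]])
b = np.array([9., 8.])

x = np.linalg.solve(A, b)          # preferred
x_via_inv = np.linalg.inv(A) @ b   # same answer, worse practice
```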
Two vectors $x, y \in \mathbb{R}^n$ are orthogonal if $x^\top y = 0$; a vector is normalized if $\|x\|_2 = 1$. A square matrix $U \in \mathbb{R}^{n \times n}$ is orthogonal if its columns are mutually orthogonal and normalized, i.e. $U^\top U = I = UU^\top$.

The span of a set of vectors $\{x_1, \dots, x_k\}$ is the set of all their linear combinations: $\mathrm{span}(\{x_1, \dots, x_k\}) = \{v : v = \sum_{i=1}^{k} \alpha_i x_i,\ \alpha_i \in \mathbb{R}\}$.

The range of a matrix $A \in \mathbb{R}^{m \times n}$, denoted $\mathcal{R}(A)$, is the span of its columns: $\mathcal{R}(A) = \{v \in \mathbb{R}^m : v = Ax,\ x \in \mathbb{R}^n\}$.

The nullspace of a matrix $A$, denoted $\mathcal{N}(A)$, is the set of vectors mapped to zero: $\mathcal{N}(A) = \{x \in \mathbb{R}^n : Ax = 0\}$.
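Numerically, a nullspace basis can be read off the SVD: the right-singular vectors whose singular values are (near) zero span $\mathcal{N}(A)$. A sketch with a rank-1 example matrix of my own:

```python
import numpy as np

# Rank-1 matrix: second row is twice the first, so the nullspace
# of this 2x3 matrix is 2-dimensional.
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])

_, s, Vt = np.linalg.svd(A)
tol = 1e-10
# svd returns min(m, n) singular values; pad with zeros so every
# right-singular vector (row of Vt) gets a value.
s_full = np.concatenate([s, np.zeros(A.shape[1] - len(s))])
null_basis = Vt[s_full < tol].T   # columns span the nullspace

residual = A @ null_basis          # should be (near) zero
```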
The determinant of a square matrix $A \in \mathbb{R}^{n \times n}$ is a function $\det : \mathbb{R}^{n \times n} \to \mathbb{R}$, denoted $\det(A)$ or $|A|$. For a 2x2 matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ it equals $ad - bc$. For example, for

$A = \begin{bmatrix} 1 & 3 \\ 3 & 2 \end{bmatrix}$

the rows are $[1, 3]$ and $[3, 2]$, and $|A| = 1 \cdot 2 - 3 \cdot 3 = -7$. The absolute value of the determinant, $7$, is the area of the parallelogram spanned by the two row vectors.
Here are some properties of the determinant:

- The determinant of the identity is 1: $|I| = 1$
- If we multiply a single row of $A$ by a scalar $t \in \mathbb{R}$, then the determinant is multiplied by $t$.
- If we exchange any two rows of $A$, then the determinant flips sign: $|A'| = -|A|$
- For $A \in \mathbb{R}^{n \times n}$, $|A| = |A^\top|$
- For $A, B \in \mathbb{R}^{n \times n}$, $|AB| = |A|\,|B|$
- For $A \in \mathbb{R}^{n \times n}$, $|A| = 0$ if and only if $A$ is singular (it does not have full rank, and its columns are linearly dependent).
- For $A \in \mathbb{R}^{n \times n}$ non-singular, $|A^{-1}| = 1/|A|$
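Several of these properties, checked on the 2x2 example above (the second matrix `B` is my own pick):

```python
import numpy as np

A = np.array([[1., 3.],
              [3., 2.]])

det_A = np.linalg.det(A)   # 1*2 - 3*3 = -7
area = abs(det_A)          # parallelogram area = 7

# |AB| = |A| |B|
B = np.array([[2., 0.],
              [1., 1.]])
det_prod = np.linalg.det(A @ B)

# Swapping the two rows flips the sign of the determinant
det_swapped = np.linalg.det(A[::-1])
```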
The (classical) adjoint of a matrix $A \in \mathbb{R}^{n \times n}$, denoted $\mathrm{adj}(A)$, has entries $\mathrm{adj}(A)_{ij} = (-1)^{i+j}\,|A_{\setminus j, \setminus i}|$, where $A_{\setminus j, \setminus i}$ is $A$ with row $j$ and column $i$ deleted. For non-singular $A$ it gives an explicit formula for the inverse: $A^{-1} = \frac{1}{|A|}\,\mathrm{adj}(A)$.
Given a square matrix $A \in \mathbb{R}^{n \times n}$ and a vector $x \in \mathbb{R}^n$, the scalar $x^\top A x$ is called a quadratic form. We classify symmetric matrices by the sign of their quadratic forms:

- A symmetric matrix $A \in \mathbb{S}^n$ is positive definite (PD) if for all non-zero vectors $x \in \mathbb{R}^n$, $x^\top A x > 0$.
- A symmetric matrix $A \in \mathbb{S}^n$ is positive semidefinite (PSD) if for all vectors $x \in \mathbb{R}^n$, $x^\top A x \ge 0$.
- A symmetric matrix $A \in \mathbb{S}^n$ is negative definite (ND) if for all non-zero $x \in \mathbb{R}^n$, $x^\top A x < 0$.
- A symmetric matrix $A \in \mathbb{S}^n$ is negative semidefinite (NSD) if for all $x \in \mathbb{R}^n$, $x^\top A x \le 0$.
- A symmetric matrix $A \in \mathbb{S}^n$ is indefinite (neither positive semidefinite nor negative semidefinite) if there exist $x_1, x_2 \in \mathbb{R}^n$ such that $x_1^\top A x_1 > 0$ and $x_2^\top A x_2 < 0$.
If $A$ is positive definite then $-A$ is negative definite and vice versa; if $A$ is positive semidefinite then $-A$ is negative semidefinite and vice versa; and if $A$ is indefinite then so is $-A$. Positive definite and negative definite matrices are always full rank, hence invertible.
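For symmetric matrices, definiteness can be read off the signs of the eigenvalues (all positive means PD, all non-negative means PSD, and so on). A small classifier sketch; the example matrices and the helper name `classify` are my own:

```python
import numpy as np

def classify(A, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(A)  # real eigenvalues of a symmetric matrix
    if np.all(w > tol):
        return "PD"
    if np.all(w >= -tol):
        return "PSD"
    if np.all(w < -tol):
        return "ND"
    if np.all(w <= tol):
        return "NSD"
    return "indefinite"

pd_kind = classify(np.array([[2., 0.], [0., 3.]]))      # eigenvalues 2, 3
indef_kind = classify(np.array([[1., 0.], [0., -1.]]))  # eigenvalues 1, -1
```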
Eigen decomposition
For a square matrix $A \in \mathbb{R}^{n \times n}$, we say that $\lambda \in \mathbb{C}$ is an eigenvalue of $A$ and $x \in \mathbb{C}^n$ the corresponding eigenvector if $Ax = \lambda x$, $x \ne 0$. Some properties:

- The trace of $A$ is the sum of its eigenvalues: $\mathrm{tr}(A) = \sum_{i=1}^{n} \lambda_i$
- The determinant of $A$ is the product of its eigenvalues: $|A| = \prod_{i=1}^{n} \lambda_i$
- The rank of $A$ is the number of its non-zero eigenvalues.
- If $A$ is non-singular, then $1/\lambda_i$ is an eigenvalue of $A^{-1}$ with the associated eigenvector $x_i$.
- The eigenvalues of a diagonal matrix $D$ are its diagonal entries.

For the matrix $A$ with eigenvector matrix $V$ (eigenvectors as columns) and eigenvalue vector $\lambda$, we can write $AV = V\,\mathrm{diag}(\lambda)$. If the eigenvectors are linearly independent, $V$ is invertible and $A$ is diagonalizable: $A = V\,\mathrm{diag}(\lambda)\,V^{-1}$.
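The decomposition and the trace/determinant properties can be verified with `np.linalg.eig`; the 2x2 matrix below is my own example (its eigenvalues are 5 and 2):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

lam, V = np.linalg.eig(A)  # eigenvalue vector and eigenvector matrix

# A V = V diag(lambda)
av_ok = np.allclose(A @ V, V @ np.diag(lam))

# tr(A) = sum of eigenvalues, |A| = product of eigenvalues
trace_ok = np.isclose(np.trace(A), lam.sum())
det_ok = np.isclose(np.linalg.det(A), lam.prod())

# Diagonalizable here, so A = V diag(lambda) V^{-1}
A_rebuilt = V @ np.diag(lam) @ np.linalg.inv(V)
```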
For matrices that are not invertible (or not even square), there is the Moore-Penrose pseudoinverse $A^{+}$. When $A$ has full column rank, $A^{+} = (A^\top A)^{-1} A^\top$; in general it can be computed from the singular value decomposition $A = U \Sigma V^\top$ as $A^{+} = V \Sigma^{+} U^\top$.
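A typical use is solving an overdetermined system in the least-squares sense; the 3-equation, 2-unknown system below is my own example:

```python
import numpy as np

# Overdetermined system (3 equations, 2 unknowns): no exact inverse,
# but the pseudoinverse gives the least-squares solution.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 1., 3.])

x = np.linalg.pinv(A) @ b

# Same as (A^T A)^{-1} A^T b, since A has full column rank
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
```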
Matrix calculus
Let $f : \mathbb{R}^n \to \mathbb{R}$ be a function taking a vector and returning a scalar. The gradient of $f$ is the vector of partial derivatives, $\nabla_x f(x) \in \mathbb{R}^n$ with $(\nabla_x f(x))_i = \frac{\partial f(x)}{\partial x_i}$.

The Hessian is the matrix of second partial derivatives: $\nabla_x^2 f(x) \in \mathbb{R}^{n \times n}$ with $(\nabla_x^2 f(x))_{ij} = \frac{\partial^2 f(x)}{\partial x_i \partial x_j}$.
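As a worked example of my own (not from the text above): for the quadratic form $f(x) = x^\top A x$, the gradient is $(A + A^\top)x$ and the Hessian is $A + A^\top$, which can be checked against central finite differences:

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])

def f(x):
    return x @ A @ x  # quadratic form x^T A x

x0 = np.array([1., 2.])
grad_analytic = (A + A.T) @ x0  # gradient of x^T A x
hess_analytic = A + A.T         # Hessian of x^T A x (symmetric)

# Central finite differences for the gradient
eps = 1e-6
grad_fd = np.zeros(2)
for i in range(2):
    e = np.zeros(2)
    e[i] = eps
    grad_fd[i] = (f(x0 + e) - f(x0 - e)) / (2 * eps)
```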