Introduction
Let's make sure we understand the foundational concepts before going further. Here is a brief introduction in case you are unfamiliar with Eigen.

Eigen is a C++ template library for matrix and linear algebra operations. It supports all matrix sizes, from small fixed-size matrices to arbitrarily large dense matrices, as well as sparse matrices. The Eigen project runs its test suite against several compilers to ensure reliability and to catch compiler bugs, so Eigen is well supported across toolchains.
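To make this concrete, here is a minimal sketch of the two dense matrix flavours just mentioned: a compile-time fixed-size matrix and a run-time dynamically sized one (sparse matrices live in the separate Eigen/Sparse module). The sizes and values are arbitrary, chosen only for demonstration.

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    // Small fixed-size matrix: dimensions known at compile time.
    Eigen::Matrix3d fixed = Eigen::Matrix3d::Identity();

    // Dense matrix of arbitrary size: dimensions chosen at run time.
    Eigen::MatrixXd dynamic = Eigen::MatrixXd::Random(4, 4);

    std::cout << "3x3 identity:\n" << fixed << "\n";
    std::cout << "4x4 random:\n" << dynamic << "\n";
}
```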
In this blog, we will walk through the catalogue of dense matrix decompositions offered by Eigen, along with the terminology used to describe them.
Without further ado, let's get started.
Catalogue of decompositions offered by Eigen
In this section, we go through the catalogue of dense matrix decompositions offered by Eigen. First, let's recall what an eigendecomposition is for: it breaks a matrix down into its eigenvalues and eigenvectors. This operation is useful because it simplifies certain matrix operations and reveals important information about the matrix itself.
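As a quick illustration, here is a minimal sketch using Eigen's EigenSolver to compute the eigenvalues and eigenvectors of a small real matrix; the matrix entries are arbitrary example values.

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::Matrix2d A;
    A << 2, 1,
         1, 2;

    // EigenSolver handles general (possibly non-symmetric) real matrices,
    // so its eigenvalues and eigenvectors are complex-valued in general.
    Eigen::EigenSolver<Eigen::Matrix2d> solver(A);
    std::cout << "Eigenvalues:\n" << solver.eigenvalues() << "\n";
    std::cout << "Eigenvectors:\n" << solver.eigenvectors() << "\n";
}
```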
💁 The first table lists generic information that is not Eigen-specific.

💁 The second table lists Eigen-specific information.

Note:
🎯 1: The LDLT decomposition comes in two variants. Eigen's produces a pure diagonal D matrix, so it cannot handle indefinite matrices, unlike LAPACK's variant, which produces a block-diagonal D matrix (see the first sketch after these notes).
🎯 2: The Schur, SVD, and eigenvalue decompositions rely on iterative techniques. How quickly they converge depends on how well separated the eigenvalues are.
🎯 3: Eigen's JacobiSVD is two-sided, which gives the most accurate and reliable results for square matrices. For non-square matrices, a QR preconditioner must be applied first. The default, ColPivHouseholderQR, is already very reliable, but use FullPivHouseholderQR if you want proven reliability (see the second sketch below).
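To make note 1 concrete, here is a minimal sketch of Eigen's LDLT solving a small positive-definite system (the matrix entries are arbitrary example values):

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    // A symmetric positive-definite matrix (arbitrary example values).
    Eigen::Matrix2d A;
    A << 4, 2,
         2, 3;
    Eigen::Vector2d b(1, 2);

    // Eigen's LDLT is a robust Cholesky decomposition with a pure diagonal D;
    // per note 1, it requires A to be positive or negative semidefinite.
    Eigen::LDLT<Eigen::Matrix2d> ldlt(A);
    Eigen::Vector2d x = ldlt.solve(b);
    std::cout << "x =\n" << x << "\n";
}
```

And for note 3, a sketch of choosing the QR preconditioner for JacobiSVD on a non-square matrix; note that the FullPivHouseholderQR preconditioner only supports computing the full U and V factors:

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(5, 3); // non-square

    // The default preconditioner is ColPivHouseholderQRPreconditioner;
    // FullPivHouseholderQRPreconditioner trades speed for proven reliability.
    Eigen::JacobiSVD<Eigen::MatrixXd, Eigen::FullPivHouseholderQRPreconditioner>
        svd(A, Eigen::ComputeFullU | Eigen::ComputeFullV);

    std::cout << "Singular values:\n" << svd.singularValues() << "\n";
}
```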
Terminology
✅ Self-adjoint.
For real matrices, self-adjoint and symmetric mean the same thing; for complex matrices, self-adjoint and Hermitian are equivalent. Put simply, a matrix A is self-adjoint if and only if it equals its adjoint, where the adjoint is the conjugate transpose.
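A small sketch of this definition in Eigen, checking whether a complex matrix equals its adjoint (the entries are arbitrary, chosen so that the matrix is Hermitian):

```cpp
#include <complex>
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::Matrix2cd A;
    A << std::complex<double>(1, 0), std::complex<double>(2, -1),
         std::complex<double>(2, 1), std::complex<double>(3, 0);

    // A is self-adjoint (Hermitian): it equals its conjugate transpose.
    std::cout << "Self-adjoint? "
              << (A.isApprox(A.adjoint()) ? "yes" : "no") << "\n";
}
```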
✅ Positive/negative definite.
If v*Av > 0 for any non-zero vector v, then a self-adjoint matrix A is positive definite. Similarly, if v*Av < 0 for any non-zero vector v, it is negative definite.
✅ Positive/negative semidefinite.
If v*Av ≥ 0 for any non-zero vector v, a self-adjoint matrix A is positive semidefinite. Similarly, if v*Av ≤ 0 for any non-zero vector v, it is negative semidefinite.
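Since, for a self-adjoint matrix, definiteness is determined by the signs of the eigenvalues, a quick way to check it in Eigen is via SelfAdjointEigenSolver. A minimal sketch, with arbitrary example values:

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::Matrix2d A;
    A << 2, -1,
        -1,  2;

    // For a self-adjoint matrix: all eigenvalues > 0 means positive
    // definite, all >= 0 means positive semidefinite, and so on.
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix2d> es(A);
    std::cout << "Eigenvalues: " << es.eigenvalues().transpose() << "\n";
    std::cout << "Positive definite? "
              << (es.eigenvalues().minCoeff() > 0 ? "yes" : "no") << "\n";
}
```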
✅ Blocking.
This means the algorithm can work block by block, which gives good performance scaling for large matrices.
✅ Implicit Multi Threading (MT).
This means the algorithm uses OpenMP to take advantage of multicore processors. "Implicit" means that the algorithm itself is not parallelized, but that it relies on parallelized matrix-matrix product routines.
✅ Explicit Multi Threading (MT).
This means the algorithm is explicitly parallelized to take advantage of multicore processors via OpenMP.
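As a sketch: if Eigen was built with OpenMP support (e.g. compiled with -fopenmp), the number of threads its parallelized kernels use can be controlled at run time via Eigen::setNbThreads. Without OpenMP, nbThreads() simply reports 1.

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    // Request four threads for Eigen's parallelized kernels.
    Eigen::setNbThreads(4);
    std::cout << "Eigen is using " << Eigen::nbThreads() << " thread(s)\n";

    Eigen::MatrixXd A = Eigen::MatrixXd::Random(512, 512);
    Eigen::MatrixXd B = Eigen::MatrixXd::Random(512, 512);
    Eigen::MatrixXd C = A * B; // matrix-matrix product may run in parallel
    std::cout << "C(0,0) = " << C(0, 0) << "\n";
}
```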
✅ Meta-unroller.
This means the algorithm is automatically and explicitly unrolled for very small fixed-size matrices.
Frequently Asked Questions
Why do we use eigendecomposition?
A matrix can be broken down into its eigenvalues and eigenvectors using the eigendecomposition method. This operation is helpful because it simplifies certain matrix operations and reveals important information about the matrix itself.
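To see why this simplifies matrix operations, here is a sketch that uses the eigendecomposition A = V D V^T of a symmetric matrix to compute a matrix power: A^5 = V D^5 V^T, where D^5 only raises the diagonal entries to the fifth power. The matrix values are arbitrary.

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::Matrix2d A;
    A << 2, 1,
         1, 2;

    // A is symmetric, so its eigendecomposition A = V * D * V^T is real.
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix2d> es(A);
    Eigen::Matrix2d V = es.eigenvectors();
    Eigen::Vector2d d = es.eigenvalues();

    // A^5 = V * D^5 * V^T: only the diagonal entries need to be powered.
    Eigen::Matrix2d A5 =
        V * d.array().pow(5).matrix().asDiagonal() * V.transpose();
    std::cout << "A^5 =\n" << A5 << "\n";
}
```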
What do eigenvalues mean?
Eigenvalues are the special set of scalar values associated with a linear system of equations, most often encountered in matrix equations. Eigenvalues are also known as "characteristic roots". An eigenvector, by contrast, is a non-zero vector that a linear transformation changes only by a scalar factor; that scalar is the eigenvalue.
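The defining relation is Av = λv. Here is a minimal sketch verifying it numerically with Eigen: for each eigenpair, the residual ||Av − λv|| should be close to zero. The example matrix is arbitrary.

```cpp
#include <complex>
#include <iostream>
#include <Eigen/Dense>

int main() {
    Eigen::Matrix2d A;
    A << 3, 1,
         0, 2;

    Eigen::EigenSolver<Eigen::Matrix2d> es(A);
    // Cast A to complex so it can multiply the complex eigenvectors.
    Eigen::Matrix2cd Ac = A.cast<std::complex<double>>();

    // For each eigenpair (lambda, v), A*v should equal lambda*v.
    for (int i = 0; i < 2; ++i) {
        std::complex<double> lambda = es.eigenvalues()[i];
        Eigen::Vector2cd v = es.eigenvectors().col(i);
        std::cout << "residual " << i << ": "
                  << (Ac * v - lambda * v).norm() << "\n";
    }
}
```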
Does every matrix have an eigendecomposition?
No; only diagonalizable matrices have an eigendecomposition. However, every real matrix has an eigenvalue, though it may be complex. In fact, a field K is algebraically closed if and only if every square matrix with entries in K has an eigenvalue; one direction of this equivalence can be shown using the companion matrix.