LOBPCG
Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) is a matrix-free method for finding the largest (or smallest) eigenvalues and the corresponding eigenvectors of a symmetric generalized eigenvalue problem

$$A x = \lambda B x,$$

for a given pair $(A, B)$ of complex Hermitian or real symmetric matrices, where the matrix $B$ is also assumed positive-definite.
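SciPy ships an implementation of the method as scipy.sparse.linalg.lobpcg. The sketch below applies it to a small generalized problem; the 1-D Laplacian test matrix $A$, the identity mass matrix $B$, and the inverse-diagonal preconditioner are illustrative assumptions, not part of LOBPCG itself.

```python
import numpy as np
from scipy.sparse import diags, eye
from scipy.sparse.linalg import lobpcg

n = 1000
# A: 1-D Laplacian (real symmetric), B: identity mass matrix (SPD).
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
B = eye(n, format="csr")

# Preconditioner approximating A^{-1}; the inverse diagonal is the
# simplest illustrative choice, not a recommendation.
M = diags(1.0 / A.diagonal())

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 3))   # initial block of 3 guesses
# largest=False requests the smallest eigenpairs of A x = lambda B x.
vals, vecs = lobpcg(A, X, B=B, M=M, largest=False, tol=1e-8, maxiter=200)
print(vals)  # approximations to the 3 smallest eigenvalues
```

Because LOBPCG only needs matrix-vector products, A, B, and M may equally be supplied as LinearOperator callbacks rather than stored matrices, which is the sense in which the method is matrix-free.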
Kantorovich in 1948 proposed calculating the smallest eigenvalue $\lambda_1$ of a symmetric matrix $A$ by steepest descent using a direction $r = Ax - \lambda(x)\,x$ of a scaled gradient of the Rayleigh quotient $\lambda(x) = (x, Ax)/(x, x)$ in the scalar product $(x, y) = x^*y$, with the step size computed by minimizing the Rayleigh quotient in the linear span of the vectors $x$ and $r$, i.e. in a locally optimal manner. Samokish proposed applying a preconditioner $T$ to the residual vector $r$ to generate the preconditioned direction $w = Tr$, and derived asymptotic convergence rate bounds as $x$ approaches the eigenvector. D'yakonov suggested spectrally equivalent preconditioning and derived non-asymptotic convergence rate bounds. Block locally optimal multi-step steepest descent for eigenvalue problems was described in later work. Local minimization of the Rayleigh quotient on the subspace spanned by the current approximation, the current residual, and the previous approximation, as well as its block version, appeared subsequently, and the preconditioned version was analyzed in later studies.
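The locally optimal three-term recurrence sketched above (current iterate $x$, preconditioned residual $w = Tr$, previous direction $p$) is short enough to write out. A minimal single-vector NumPy sketch follows; the helper name lobpcg_single is hypothetical, and the trial basis is used without the re-orthogonalization that robust implementations add for stability.

```python
import numpy as np
from scipy.linalg import eigh

def lobpcg_single(A, B, T, x, maxiter=100, tol=1e-10):
    """Smallest eigenpair of A x = lam B x; T approximates A^{-1}.

    Illustrative sketch only, not SciPy's API.
    """
    x = x / np.sqrt(x @ (B @ x))             # B-normalize the iterate
    p = None                                 # previous-step direction
    lam = (x @ (A @ x)) / (x @ (B @ x))      # Rayleigh quotient
    for _ in range(maxiter):
        r = A @ x - lam * (B @ x)            # residual = scaled gradient
        if np.linalg.norm(r) < tol:
            break
        w = T @ r                            # preconditioned direction
        # Trial subspace: span{x, w} on the first step, span{x, w, p} after.
        S = np.column_stack([x, w] if p is None else [x, w, p])
        # Rayleigh-Ritz: solve the small projected generalized problem
        # and keep the smallest Ritz pair (eigh returns ascending order).
        vals, vecs = eigh(S.T @ (A @ S), S.T @ (B @ S))
        lam, c = vals[0], vecs[:, 0]
        x_new = S @ c
        p = x_new - c[0] * x                 # the w (and p) contribution
        x = x_new / np.sqrt(x_new @ (B @ x_new))
    return lam, x
```

Near convergence $x$, $w$, and $p$ become nearly parallel, so the projected problem grows ill-conditioned; handling this, typically by orthonormalizing the trial basis, is the main numerical concern that production implementations address.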
The method performs an iterative maximization (or minimization) of the generalized Rayleigh quotient

$$\rho(x) := \rho(A, B; x) := \frac{x^T A x}{x^T B x},$$

which results in finding the largest (or smallest) eigenpairs of $A x = \lambda B x$.
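A standard one-line gradient computation, not stated above but supplying the link between the Rayleigh quotient and the eigenproblem, shows why its extrema are eigenpairs: for real symmetric $A$ and positive-definite $B$,

$$\nabla \rho(x) = \frac{2}{x^T B x}\,\bigl(A x - \rho(x)\, B x\bigr),$$

so $\nabla \rho(x) = 0$ exactly when $A x = \rho(x)\, B x$, i.e. when $x$ is an eigenvector with eigenvalue $\rho(x)$. The residual $r = A x - \rho(x)\, B x$ driven to zero by the iteration is therefore a scaled gradient of $\rho$, which is the sense in which Kantorovich's steepest descent applies.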