One of my favorite areas of mathematics. I am especially fond of numerical (multi)linear algebra.
Projection onto a vector subspace is a common task in linear algebra courses. Affine subspaces, sets of the form $ \{ \mathbf{v} \in \mathbb{R}^m : \mathbf{v} = \mathbf{Ac} + \mathbf{b} \text{ for some } \mathbf{c} \} $ for a fixed matrix $\mathbf{A}$ and ...
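As a minimal sketch (my own, not necessarily the post's derivation): projecting onto such an affine set reduces to a least-squares problem, since one can shift by $\mathbf{b}$, project onto the column space of $\mathbf{A}$, and shift back. The helper name `project_affine` is mine.

```python
import numpy as np

def project_affine(v, A, b):
    """Orthogonal projection of v onto the affine set {A c + b : c arbitrary}.

    Shift by b, solve the least-squares problem min_c ||A c - (v - b)||,
    and shift back; lstsq also handles rank-deficient A gracefully.
    """
    c, *_ = np.linalg.lstsq(A, v - b, rcond=None)
    return A @ c + b

rng = np.random.default_rng(0)
A, b, v = rng.standard_normal((5, 2)), rng.standard_normal(5), rng.standard_normal(5)
p = project_affine(v, A, b)
print(A.T @ (v - p))  # ~ 0: the residual v - p is orthogonal to range(A)
```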
An often-stressed property of the conjugate gradient method (CG) for solving linear systems is the monotonic decrease in the A-norm of the error. When CG is applied in practice, the exact solution is unknown and the error cannot be computed or tracked, so ...
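For a flavor of the property (a sketch on a synthetic SPD system, not code from the post): when the true solution happens to be known, we can evaluate the A-norm of the error at every iterate and watch it decrease.

```python
import numpy as np

def cg(A, b, x0, iters):
    """Textbook conjugate gradient, returning all iterates for inspection."""
    x, xs = x0.copy(), [x0.copy()]
    r = b - A @ x
    p = r.copy()
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
        xs.append(x.copy())
    return xs

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite by construction
x_true = rng.standard_normal(n)
b = A @ x_true
errs = [np.sqrt((x - x_true) @ A @ (x - x_true)) for x in cg(A, b, np.zeros(n), 20)]
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(errs, errs[1:]))  # monotone in the A-norm
```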
Recently I needed to solve the following problem: given a square matrix $\mathbf{A}$, find another square matrix $\mathbf{B}$ satisfying $\text{vec}(\mathbf{A})^T \text{vec}(\mathbf{B}) = 0$. This is the sense in which I use the word “orthogonal” ...
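Note that $\text{vec}(\mathbf{A})^T \text{vec}(\mathbf{B}) = \text{trace}(\mathbf{A}^T \mathbf{B})$, so one easy construction (a sketch of mine, not necessarily the post's) is a single Gram-Schmidt step against $\mathbf{A}$ in the trace inner product:

```python
import numpy as np

def trace_orthogonal(A, rng=None):
    """Return a matrix B with vec(A)^T vec(B) = trace(A^T B) = 0,
    by orthogonalizing a random matrix against A in the trace inner product."""
    rng = np.random.default_rng(0) if rng is None else rng
    C = rng.standard_normal(A.shape)
    # np.vdot flattens its arguments, so it computes vec(A)^T vec(C) directly
    return C - (np.vdot(A, C) / np.vdot(A, A)) * A

A = np.arange(9.0).reshape(3, 3)
B = trace_orthogonal(A)
print(np.vdot(A, B))  # ~ 0, i.e. vec(A)^T vec(B) = 0
```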
Here we are concerned with “non-negative” linear systems $\mathbf{Ax} = \mathbf{b}$, that is, systems where $\mathbf{A}, \mathbf{b} \geq \mathbf{0}$ (elementwise). In particular, we give a sufficient condition for ...
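As a small concrete reminder of why the setting is delicate (my own toy example, not from the post): elementwise nonnegativity of $\mathbf{A}$ and $\mathbf{b}$ alone does not force the solution $\mathbf{x}$ to be nonnegative.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])   # A >= 0 elementwise
b = np.array([0.0, 1.0])     # b >= 0 elementwise
x = np.linalg.solve(A, b)
print(x)                     # [ 1. -1.] -- the solution still has a negative entry
```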
The pivoted LU factorization of square matrices is the key routine for the direct solution of linear systems. All square, invertible matrices have a pivoted LU factorization of the form $\mathbf{PA} = \mathbf{LU}$, ...
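A quick sketch with SciPy (assuming `scipy.linalg` is available): note that SciPy's `lu` returns the factorization in the form $\mathbf{A} = \mathbf{PLU}$, so its $\mathbf{P}$ is the transpose of the one in the $\mathbf{PA} = \mathbf{LU}$ convention above.

```python
import numpy as np
from scipy.linalg import lu, lu_factor, lu_solve

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

P, L, U = lu(A)                      # SciPy's convention: A = P @ L @ U
assert np.allclose(P.T @ A, L @ U)   # so P^T A = L U matches PA = LU

# Direct solution of A x = b reuses the factorization:
b = rng.standard_normal(4)
x = lu_solve(lu_factor(A), b)
assert np.allclose(A @ x, b)
```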
Like all fields of mathematics, linear algebra has many prominent figures whose names are non-trivial to pronounce in English (the lingua franca of science and mathematics).
Eigenpairs of symmetric matrices are intimately related to optimization and critical points, with the eigenvectors being critical points of the Rayleigh quotient. In optimization settings, the type of critical point (minimum, maximum, ...
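A numerical illustration (a sketch, with names of my choosing): for symmetric $\mathbf{A}$, the Rayleigh quotient $R(\mathbf{x}) = \mathbf{x}^T \mathbf{A} \mathbf{x} / \mathbf{x}^T \mathbf{x}$ has gradient $\nabla R(\mathbf{x}) = \tfrac{2}{\mathbf{x}^T \mathbf{x}} (\mathbf{A}\mathbf{x} - R(\mathbf{x})\,\mathbf{x})$, which vanishes exactly when $\mathbf{x}$ is an eigenvector.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                        # symmetric

def rayleigh(x):
    return (x @ A @ x) / (x @ x)

def grad_rayleigh(x):
    return (2.0 / (x @ x)) * (A @ x - rayleigh(x) * x)

w, V = np.linalg.eigh(A)
v = V[:, 2]                              # any eigenvector of A
print(np.linalg.norm(grad_rayleigh(v)))  # ~ 0: eigenvectors are critical points
print(rayleigh(v) - w[2])                # ~ 0: the critical value is the eigenvalue
```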
Although the majority of successful algorithms for the symmetric tensor eigenvalue problem use optimization techniques directly, there are a few notable algorithms that do not appear to be based on optimization. Rather, they more closely ...
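For context on the problem itself (a sketch of mine, not necessarily one of the algorithms the post discusses): the shifted symmetric higher-order power method of Kolda and Mayo generalizes matrix power iteration $\mathbf{x} \leftarrow \mathbf{Ax} / \|\mathbf{Ax}\|$ to symmetric tensors.

```python
import numpy as np

def sshopm(T, x0, alpha, iters=500):
    """Shifted symmetric higher-order power method for a symmetric
    3rd-order tensor T: iterate x <- normalize(T x^2 + alpha x)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        Tx2 = np.einsum('ijk,j,k->i', T, x, x)   # (T x^2)_i = sum_jk T_ijk x_j x_k
        y = Tx2 + alpha * x                      # a large enough shift makes the
        x = y / np.linalg.norm(y)                # iteration monotone (Kolda & Mayo)
    lam = x @ np.einsum('ijk,j,k->i', T, x, x)   # eigenvalue: lambda = x^T (T x^2)
    return lam, x

# Symmetrize a random tensor and run the iteration:
rng = np.random.default_rng(0)
G = rng.standard_normal((4, 4, 4))
T = sum(np.transpose(G, p) for p in
        [(0,1,2), (0,2,1), (1,0,2), (1,2,0), (2,0,1), (2,1,0)]) / 6
lam, x = sshopm(T, rng.standard_normal(4), alpha=2 * np.abs(T).max())  # heuristic shift
print(np.linalg.norm(np.einsum('ijk,j,k->i', T, x, x) - lam * x))      # small once converged
```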