Integration
primate supports a variety of matrix types out of the box, including NumPy ndarrays, compressed sparse matrices (à la SciPy), and LinearOperators. The basic requirements for any operator A to be used with any of primate's spectral methods are:
- A method A.__matmul__(v: ndarray) -> ndarray OR A.matvec(v: ndarray) -> ndarray implementing the map $v \mapsto Av$
- An attribute A.shape -> tuple[int, int] giving the output/input dimensions of A
- An attribute A.dtype -> np.dtype giving the floating point precision of A (32- or 64-bit)
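For instance, a plain 2-d NumPy array already satisfies all three of these requirements. Below is a minimal sanity check sketching this (the array A and vector v are arbitrary stand-ins, not part of primate's API):

import numpy as np

A = np.random.uniform(size=(10, 10))  # any square 2-d array acts as an operator
v = np.random.uniform(size=10)

assert hasattr(A, '__matmul__')       # v -> A v is available via A @ v
assert A.shape == (10, 10)            # (output dimension, input dimension)
assert A.dtype == np.float64          # 64-bit floating point precision
print(A @ v)                          # the matrix-vector product itself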
Here’s an example of a simple operator representing a Diagonal matrix:
import numpy as np
from numpy.typing import ArrayLike
from scipy.sparse.linalg import LinearOperator

class DiagonalOp(LinearOperator):
  diag: np.ndarray = None

  def __init__(self, d: ArrayLike, dtype = None):
    self.diag = np.array(d)                # diagonal entries
    self.shape = (len(d), len(d))          # output/input dimensions
    self.dtype = np.dtype('float64') if dtype is None else dtype

  def _matvec(self, x: ArrayLike) -> np.ndarray:
    out = self.diag * np.ravel(x)          # elementwise product == D @ x
    return out.reshape(x.shape)
Note that by following the subclassing rules of SciPy’s LinearOperator, this class inherits a .matvec()
method and thus satisfies the constraints above.
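As a quick usage sketch (the diagonal values below are arbitrary), the class above can be constructed and applied like any other LinearOperator:

import numpy as np

op = DiagonalOp([1.0, 2.0, 3.0])
v = np.ones(3)

print(op.matvec(v))        # inherited from LinearOperator -> [1. 2. 3.]
print(op @ v)              # __matmul__ is also provided   -> [1. 2. 3.]
print(op.shape, op.dtype)  # (3, 3) float64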
Alternatively, using SciPy’s aslinearoperator, this can also be done more succinctly:
from scipy.sparse.linalg import aslinearoperator

# d is a 1-d array of diagonal entries, as above
aslinearoperator(type('DiagonalOp', (object,), {
  'shape': (len(d),)*2,
  'dtype': np.dtype('float64'),  # so the wrapper reports a float64 dtype
  'matvec': lambda v: np.atleast_2d(d).T * v
}))
# <_CustomLinearOperator with dtype=float64>
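As a quick check that the resulting wrapper exposes the interface required above (a small sketch; d is again an arbitrary diagonal, and the input is passed as a column vector to match the broadcasting in the lambda):

import numpy as np
from scipy.sparse.linalg import aslinearoperator

d = np.arange(1, 4, dtype=np.float64)
op = aslinearoperator(type('DiagonalOp', (object,), {
  'shape': (len(d),)*2,
  'dtype': np.dtype('float64'),
  'matvec': lambda v: np.atleast_2d(d).T * v
}))

v = np.ones((len(d), 1))
print(op.matvec(v).ravel())  # [1. 2. 3.]
print(op.shape, op.dtype)    # (3, 3) float64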
C++ usage
On the C++ side, it is similar: simply pass any type with .shape() and .matvec() member functions:
#include <utility>  // std::pair

class LinOp {
public:
  int nrow, ncol;

  LinOp(int nr, int nc) : nrow(nr), ncol(nc) {}

  // 'input' holds ncol entries; 'output' has room for nrow entries
  void matvec(const float* input, float* output) const {
    ... // implementation details
  }

  std::pair<int, int> shape() const { return std::make_pair(nrow, ncol); }
};
It’s up to you to ensure shape() yields the correct size: primate will supply input vectors of length .shape().second (the number of columns), and it guarantees that the output buffer has room for at least .shape().first (the number of rows) entries, but no more.