# skerch: Sketched Linear Operations for PyTorch

GitHub | PyPI | Docs | CI | Tests
In many computational fields we have to operate on large or slow linear objects (such as matrices). Very often, those objects admit a much smaller representation (such as low-rank).
Through the magic of randomized linear algebra, sketched methods let us obtain that smaller representation directly, without ever having to look at the whole object! This can lead to very substantial gains in speed and scale at minimal or no cost in accuracy: for example, an exact eigendecomposition of a 40000x40000 deep learning Hessian runs in ~30 seconds on a CPU.
And here is a head-to-head comparison between skerch.algorithms.ssvd and its PyTorch counterparts, torch.linalg.svd and torch.svd_lowrank.
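For reference, the two PyTorch baselines can be called as follows (a minimal sketch using only PyTorch, with a plain random test matrix assumed for illustration; see the skerch docs for the corresponding skerch.algorithms.ssvd call):

```python
import torch

# an illustrative dense test matrix
A = torch.randn(2000, 1000)

# full dense SVD: accurate, but with full memory footprint and cubic-ish cost
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

# PyTorch's built-in randomized low-rank SVD: rank-q approximation
U_lr, S_lr, V_lr = torch.svd_lowrank(A, q=100, niter=2)
```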
skerch delivers sketched methods to your doorstep with full power and flexibility: SVD/EIGH, diagonal/triangular approximations, and norm estimations.
All you need to do is provide an object that satisfies this simple interface, and skerch will do the rest:
```python
class MyLinOp:
    def __init__(self, shape):
        self.shape = shape  # (height, width)

    def __matmul__(self, x):
        # implement A @ x (matrix-vector or matrix-matrix product)
        raise NotImplementedError

    def __rmatmul__(self, x):
        # implement x @ A (the adjoint-side product)
        raise NotImplementedError
```
Version 1.0 is the first major release, and it comes with lots of good stuff.
Check the example gallery and tutorials for quick and direct ways to get started. Installing is a one-liner:

```bash
pip install skerch
```
Here’s to a skerched earth! 🍾 🌍