RandLAPACK.
For more details, see the
RandLAPACK book.
Recent years have even seen the incorporation of RandNLA methods into MATLAB, the NAG Library, NVIDIA's cuSOLVER, and scikit-learn.
We are developing the "RandBLAS" and "RandLAPACK" libraries to serve as standards for RandNLA (and related methods), conceptually analogous to BLAS and LAPACK.
This is part of the larger BALLISTIC project (with J. Demmel, J. Dongarra, J. Langou, J. Langou, and P. Luszczek).
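The basic primitive such a standard exposes is the sketching operator: multiply a tall matrix by a short random matrix to compress it while approximately preserving its geometry. A minimal NumPy illustration of the idea (illustrative only; this is not the RandBLAS API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall data matrix: 10000 rows, 20 columns.
A = rng.standard_normal((10000, 20))

# Gaussian sketching operator, scaled so that E[||S @ x||^2] = ||x||^2
# for any fixed vector x.
m = 200
S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)

SA = S @ A  # compressed 200 x 20 matrix

# Column norms (and, more generally, inner products) are approximately
# preserved, with error controlled by the sketch size m.
rel_err = np.abs(
    np.linalg.norm(SA, axis=0) - np.linalg.norm(A, axis=0)
) / np.linalg.norm(A, axis=0)
```

Downstream RandNLA algorithms then operate on the small matrix `SA` instead of `A`.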
Python Algorithms for Randomized Linear Algebra (PARLA), a Python package for prototyping algorithms destined for a future C++ library for RandNLA.
Matlab Algorithms for Randomized Linear Algebra (MARLA), a Matlab library for prototyping algorithms destined for a future C++ library for RandNLA.
SuperBench, a benchmark dataset and evaluation framework for super-resolution tasks in scientific machine learning (so more of a benchmark dataset/framework than software, but I'll include it here).
For more details, see the
arXiv paper.
imate,
developed by Siavash Ameli,
a modular high-performance C++/CUDA library distributed as a Python package that provides scalable
randomized algorithms for computationally expensive matrix functions in machine learning.
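The flavor of randomized algorithm involved here can be illustrated with Hutchinson's trace estimator: for Rademacher vectors z, E[zᵀAz] = tr(A), so the trace of an expensive matrix function can be estimated from a handful of matrix-vector products. A toy NumPy version (illustrative only; not imate's API):

```python
import numpy as np

rng = np.random.default_rng(42)

# Symmetric positive-definite test matrix.
n = 200
B = rng.standard_normal((n, n))
A = B @ B.T / n + np.eye(n)

def hutchinson_trace(matvec, n, num_samples=100, rng=rng):
    """Estimate tr(A) using only matrix-vector products with A."""
    total = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        total += z @ matvec(z)               # E[z^T A z] = tr(A)
    return total / num_samples

est = hutchinson_trace(lambda v: A @ v, n)
exact = np.trace(A)
```

The same estimator applies to tr(f(A)) for matrix functions f (log-determinant, trace of the inverse, etc.) whenever f(A)·z can be computed without forming f(A) explicitly.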
Glearn,
developed by Siavash Ameli,
a modular and high-performance Python package
for machine learning using Gaussian process regression
with novel algorithms capable of petascale computation on multi-GPU devices.
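At its core, Gaussian process regression reduces to a kernel linear solve: the posterior mean at a test point is k(x*, X) K⁻¹ y. A tiny self-contained NumPy sketch of that computation (illustrative only; not Glearn's API, and the kernel and hyperparameters here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy 1-D regression data: noisy samples of sin(x).
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(20)

def rbf(A, B, length=1.0):
    """Squared-exponential (RBF) kernel matrix between point sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

noise = 0.1**2
K = rbf(X, X) + noise * np.eye(len(X))     # regularized kernel matrix

Xstar = np.array([[2.5]])                  # test location
mean = rbf(Xstar, X) @ np.linalg.solve(K, y)  # GP posterior mean at x = 2.5
```

The O(n³) solve against K is exactly the step that packages in this space attack with scalable and randomized methods.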
ADAHESSIAN, a second-order optimizer for neural network training, built on PyTorch.
For more details, see the
arXiv paper
as well as
the repo.
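The key trick behind second-order optimizers of this kind is estimating the Hessian diagonal cheaply from Hessian-vector products: for Rademacher vectors z, E[z ⊙ (Hz)] = diag(H). A hedged NumPy illustration on an explicit symmetric matrix (in a real optimizer, Hz comes from automatic differentiation rather than from forming H):

```python
import numpy as np

rng = np.random.default_rng(1)

# Explicit symmetric matrix standing in for a Hessian.
d = 50
M = rng.standard_normal((d, d))
H = (M + M.T) / 2

def hutchinson_diagonal(hvp, d, num_samples=500, rng=rng):
    """Estimate diag(H) from Hessian-vector products hvp(z) = H @ z."""
    acc = np.zeros(d)
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=d)  # Rademacher probe
        acc += z * hvp(z)                    # E[z * (H z)] = diag(H)
    return acc / num_samples

diag_est = hutchinson_diagonal(lambda v: H @ v, d)
```

The estimated diagonal can then be used as a curvature-aware per-parameter scaling of the gradient step.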
ZeroQ, a zero-shot quantization framework.
For more details, see the
arXiv paper.
PyHessian, a PyTorch library for Hessian-based analysis of neural network models.
For more details, see the
arXiv paper.
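One quantity such Hessian analysis reports is the top Hessian eigenvalue, computed by power iteration using only Hessian-vector products. A minimal NumPy sketch on a synthetic matrix with a known top eigenvalue (illustrative only; in PyHessian the products Hv are obtained via backpropagation, not by forming H):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic symmetric "Hessian" with a deliberate top eigenvalue of 2.0.
d = 100
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
eigs = np.linspace(0.1, 1.0, d)
eigs[-1] = 2.0                      # clear spectral gap at the top
H = (Q * eigs) @ Q.T

def top_eigenvalue(hvp, d, iters=100, rng=rng):
    """Power iteration using only Hessian-vector products hvp(v) = H @ v."""
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = hvp(v)
        v = w / np.linalg.norm(w)
    return v @ hvp(v)               # Rayleigh quotient at convergence

lam_max = top_eigenvalue(lambda x: H @ x, d)
```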
WeightWatcher, an open-source diagnostic tool for analyzing neural networks without needing access to training or even test data.
For more details, see the
arXiv paper
as well as
the repos.
LocalGraphClustering.
For more details, see the
arXiv paper.
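A workhorse primitive in local graph clustering is seeded (personalized) PageRank: run a random walk that teleports back to a seed node, and the stationary probability mass concentrates on the seed's cluster. A self-contained NumPy toy example on a graph of two cliques joined by a bridge (an illustrative sketch, not the package's push-based implementation):

```python
import numpy as np

# Toy graph: two 5-node cliques joined by a single bridge edge (4--5).
n = 10
A = np.zeros((n, n))
for block in (range(0, 5), range(5, 10)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[4, 5] = A[5, 4] = 1.0

P = A / A.sum(axis=1, keepdims=True)   # random-walk transition matrix
alpha = 0.15                           # teleportation probability
s = np.zeros(n)
s[0] = 1.0                             # seed node, in the first clique

# Personalized PageRank: fixed point of p = alpha*s + (1-alpha)*P^T p.
p = s.copy()
for _ in range(200):
    p = alpha * s + (1 - alpha) * (P.T @ p)

# Probability mass concentrates on the seed's clique: the local cluster.
cluster = set(np.argsort(-p)[:5].tolist())
```

Production implementations avoid the dense iteration above and instead "push" residual mass only along edges near the seed, touching a small fraction of the graph.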
Hessian Flow.
For more details, see the
arXiv paper.
Alchemist project.
For more details, see the
RISE project page on Alchemist
or the KDD 2018 paper or the CUG 2018 paper.
See also
the repo.
ANODEV2.
For more details, see the
arXiv paper.
Distributed Second-order Convex Optimization.
For more details, see the
arXiv paper.
GPU-accelerated Sub-sampled Newton's Method.
For more details, see the
arXiv paper.
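The idea behind sub-sampled Newton methods: in a finite-sum problem, approximate the Hessian from a random subset of data points while keeping the exact gradient, then solve the resulting small Newton system. A NumPy sketch for l2-regularized logistic regression (an illustrative toy under assumed synthetic data, not the paper's GPU implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Synthetic data for l2-regularized logistic regression.
n, d = 5000, 10
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = (rng.random(n) < sigmoid(X @ w_true)).astype(float)
lam = 1e-3

def gradient(w):
    """Exact full-data gradient of the regularized logistic loss."""
    return X.T @ (sigmoid(X @ w) - y) / n + lam * w

def subsampled_hessian(w, batch=500):
    """Hessian approximated from a random 10% subsample of the data."""
    idx = rng.choice(n, size=batch, replace=False)
    Xs = X[idx]
    p = sigmoid(Xs @ w)
    D = p * (1 - p)                  # per-sample curvature weights
    return (Xs * D[:, None]).T @ Xs / batch + lam * np.eye(d)

# Newton iterations: exact gradient, cheap sampled Hessian.
w = np.zeros(d)
for _ in range(20):
    w -= np.linalg.solve(subsampled_hessian(w), gradient(w))
```

Because the gradient is exact, the iteration still converges to the true optimum; the sampled Hessian only affects the (linear) convergence rate.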
Second-Order Optimization for Non-Convex Machine Learning.
For more details, see the
arXiv paper.
Local Graph Clustering.
For more details, see the
PIEEE paper.
Performance of linear algebra in Spark.
For more details, see the arXiv paper,
or the talk at the 2016 Dato Data Science Summit,
or the blog post by Alex Gittens.
LSRN: the randomized least-squares solver for parallel environments.
For more details, see the
LSRN paper.
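The strategy behind randomized solvers of this kind can be shown in a few lines of NumPy: compress the tall matrix with a random projection, factor the small sketch, and use that factorization as a preconditioner so an iterative solver converges in a few steps. An illustrative sketch-and-precondition example (not LSRN's actual parallel implementation, which uses LSQR rather than a dense solve):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall, ill-conditioned least-squares problem min_x ||A x - b||.
m, n = 20000, 50
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = (U * np.logspace(0, 6, n)) @ V.T       # condition number ~ 1e6
b = rng.standard_normal(m)

# Sketch A down to s = 4n rows with a Gaussian projection.
s = 4 * n
S = rng.standard_normal((s, m)) / np.sqrt(s)
SA = S @ A

# Preconditioner from the small sketch: SA = U_s diag(sig) Vt,
# N = Vt.T diag(1/sig).  Then A @ N is well conditioned.
_, sig, Vt = np.linalg.svd(SA, full_matrices=False)
N = Vt.T / sig

cond_before = np.linalg.cond(A)
cond_after = np.linalg.cond(A @ N)

# Finish with any solver on the well-conditioned system, then map back.
y, *_ = np.linalg.lstsq(A @ N, b, rcond=None)
x = N @ y
```

With the sketch size a small constant multiple of n, the preconditioned matrix has condition number O(1) independent of how ill-conditioned A is, which is what makes the iterative phase fast and predictable.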