Commit 2b66fac

cleaning up the readme

1 parent 8dcaa44 commit 2b66fac

File tree: 1 file changed, +76 −7 lines

README.md

Lines changed: 76 additions & 7 deletions

<div align="center">

# VariationalSparseBayes

#### A PyTorch library for stochastic variational inference with sparsity-inducing priors.

</div>

<p align="center">
<img align="middle" src="examples/well-tuned-example.png" alt="comparison-plot"/>
</p>

_A comparison plot for sparse Bayesian regression with a half-Cauchy prior (left),
support vector regression (center), and the relevance vector machine (right). The
half-Cauchy prior provides a sparser solution and better error bars._

## What is VariationalSparseBayes?

This package provides an implementation of the algorithm described in
[Louizos et al. (2017)](https://arxiv.org/pdf/1705.08665.pdf) for use on a
broad class of machine learning problems.

## Installation

```bash
pip install variationalsparsebayes
```

## Usage

The library provides a high-level interface with some prebuilt sparse Bayesian
models and a low-level interface for building custom sparse Bayesian models.

### High-level interface

The library provides a few sparse Bayesian models:

- [sparse polynomial regression](https://github.com/coursekevin/variationalsparsebayes/blob/main/examples/sparse_poly_regression.py)
- [sparse Bayesian neural networks](https://github.com/coursekevin/variationalsparsebayes/blob/main/examples/sparse_bnn_regression.py)
- sparse learning with [precomputed features](https://github.com/coursekevin/variationalsparsebayes/blob/main/examples/support_vectors.py)

To implement your own linear model, you can inherit from the [SparseFeaturesLibrary](https://github.com/coursekevin/variationalsparsebayes/blob/main/variationalsparsebayes/sparse_glm.py) class (see the sketch below).
Note that I haven't implemented the "group" sparsity idea presented in [Louizos et al. (2017)](https://arxiv.org/pdf/1705.08665.pdf):
sparsification is performed at the parameter level, meaning far smaller computational savings.
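
As a rough, hypothetical illustration of the kind of feature library you might write (the base class's abstract methods aren't shown here; check [sparse_glm.py](https://github.com/coursekevin/variationalsparsebayes/blob/main/variationalsparsebayes/sparse_glm.py) for the actual interface to inherit from):

```python
import torch
from torch import nn

# Hypothetical sketch only: this subclasses nn.Module rather than
# SparseFeaturesLibrary, and num_features/forward are assumed names;
# see sparse_glm.py for the real methods to override.
class FourierFeatures(nn.Module):
    """Maps x of shape (batch, 1) to [sin(kx), cos(kx)] for k = 1..K."""

    def __init__(self, num_frequencies: int):
        super().__init__()
        self.register_buffer("freqs", torch.arange(1.0, num_frequencies + 1.0))

    @property
    def num_features(self) -> int:
        # one sine and one cosine feature per frequency
        return 2 * self.freqs.numel()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        angles = x * self.freqs  # (batch, K) by broadcasting
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
```
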
### Low-level interface

The most important class provided by the library is the [SVIHalfCauchyPrior](https://github.com/coursekevin/variationalsparsebayes/blob/main/variationalsparsebayes/svi_half_cauchy.py#L96).
The class inherits from [nn.Module](https://pytorch.org/docs/stable/generated/torch.nn.Module.html).
The user is responsible for (i) transforming a batch of weights from the variational
posterior into a batch of predictions and (ii) adding the KL-divergence provided
by the prior onto the negative ELBO.

```python
from torch import nn
from variationalsparsebayes import SVIHalfCauchyPrior


class MyModel(nn.Module):
    def __init__(self, num_params: int):
        super().__init__()
        # we initialize the prior with tau = 1e-5 (see https://arxiv.org/pdf/1705.08665.pdf)
        self.prior = SVIHalfCauchyPrior(num_params, 1e-5)
        ...

    def forward(self, x, num_reparam_samples):
        w_samples = self.prior.get_reparam_weights(num_reparam_samples)
        sparse_index = self.prior.sparse_index
        # user transforms weights and inputs into predictions
        ...

    def elbo(self, x, y):
        return log_like(x, y) - self.prior.kl_divergence()


model = MyModel(num_params)
...
```
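
A minimal training-loop sketch under the same assumptions as the snippet above (`x`, `y`, `num_params`, and the log-likelihood inside `elbo` are placeholders you must supply):

```python
import torch

# minimal sketch: x, y, and num_params are placeholders, and MyModel is
# the class sketched above with its elided pieces filled in
model = MyModel(num_params)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(5000):
    optimizer.zero_grad()
    loss = -model.elbo(x, y)  # minimize the negative ELBO
    loss.backward()
    optimizer.step()
```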

When it comes time to sparsify the approximate posterior, run:

```python
model.prior.update_sparse_index()
# get the index of all weights which remain after sparsification
model.prior.sparse_index
```
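
Only the weights selected by `sparse_index` remain active after sparsification; downstream code, like the `forward` method sketched above, uses it to index into the reparameterized weight samples.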
