Official code repository for *Distributional Autoencoders Know the Score*, NeurIPS 2025.
Besides software from PyPI installed via `requirements.txt`, this repository depends on the following packages:

- [DistributionalPrincipalAutoencoder](https://github.com/xwshen51/DistributionalPrincipalAutoencoder)
- [mlcolvar](https://github.com/luigibonati/mlcolvar)
- [PyTorch-VAE](https://github.com/AntixK/PyTorch-VAE)

For code that was modified, the respective licenses are reproduced in the `third_party_licenses` folder.
- Clone the repository:

  ```shell
  git clone https://github.com/andleb/DistributionalAutoencodersScore
  cd DistributionalAutoencodersScore
  ```

- Install general dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Install the Distributional Principal Autoencoder, mlcolvar, and PyTorch-VAE dependencies.

  Option A: Install from PyPI:

  ```shell
  pip install DistributionalPrincipalAutoencoder
  pip install mlcolvar
  # NOTE: PyTorch-VAE is not on PyPI; clone it into src/ as shown in Option B!
  ```

  Option B: Clone them locally:

  ```shell
  mkdir -p src
  cd src
  git clone https://github.com/xwshen51/DistributionalPrincipalAutoencoder.git
  git clone https://github.com/luigibonati/mlcolvar.git
  git clone https://github.com/AntixK/PyTorch-VAE.git
  cd ..
  ```

  If you are familiar with git submodules, you could also add these repositories as submodules instead.
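After either option, a quick way to sanity-check that the dependencies are importable is a small script like the one below. The module names passed at the bottom are assumptions based on the repository names; verify them against each project's packaging.

```python
import importlib


def check_imports(modules):
    """Return the subset of the given module names that import successfully."""
    available = []
    for mod in modules:
        try:
            importlib.import_module(mod)
            available.append(mod)
        except ImportError:
            pass  # missing, or not on the path yet
    return available


# Module names below are guesses based on the repository names; check each
# project's setup.py/pyproject.toml for the actual importable package name.
print(check_imports(["mlcolvar", "torch"]))
```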
- If needed, add the root of this repository to your `PYTHONPATH`:

  ```shell
  export PYTHONPATH="${PYTHONPATH}:$(pwd)"
  ```
All paths below are relative to the repository root. Unless otherwise noted, we assume the repository root is on your PYTHONPATH.
The scripts assume Option B from the installation instructions above was used, and they append the `src/` folder to the `PYTHONPATH` at runtime.
The results can be reproduced by running the files in the `exp/` folder; see the structure listing below for which file reproduces each result.
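If you prefer to handle the path setup from within Python rather than via the shell, a minimal sketch (assuming the Option B layout, with the three repositories cloned under `src/`) would be:

```python
import sys
from pathlib import Path

# Minimal sketch, assuming Option B: the three repositories are cloned
# under <repo_root>/src and should be importable from scripts in exp/.
repo_root = Path.cwd()  # adjust if not running from the repository root
src = repo_root / "src"
for pkg in ("DistributionalPrincipalAutoencoder", "mlcolvar", "PyTorch-VAE"):
    candidate = src / pkg
    if candidate.is_dir() and str(candidate) not in sys.path:
        sys.path.insert(0, str(candidate))
```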
For a quickstart, the exp/Gaussian_score.ipynb notebook is probably the best self-contained example to start with.
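As background for that notebook: for a Gaussian N(μ, σ²), the score is ∇ₓ log p(x) = −(x − μ)/σ². A quick numerical check of this identity (illustrative only, not code taken from the notebook):

```python
import numpy as np

# Illustrative background, not taken from the repository: the score of a
# Gaussian N(mu, sigma^2) is d/dx log p(x) = -(x - mu) / sigma**2.
def log_density(x, mu=0.0, sigma=1.0):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)


def gaussian_score(x, mu=0.0, sigma=1.0):
    return -(x - mu) / sigma**2


# Central finite difference of the log-density matches the analytic score.
x, h = 0.7, 1e-6
numeric = (log_density(x + h) - log_density(x - h)) / (2 * h)
assert abs(numeric - gaussian_score(x)) < 1e-5
```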
The structure of the repository is as follows:
- `data` - datasets used in the experiments
- `exp` - the experiment scripts and notebooks
  - `Gaussian_score.ipynb` - reproduces Figure 1
  - `score_alignment.py` - reproduces Table 1
  - `MB.ipynb` - reproduces Figure 2
  - `MFEP_comparisons.py` - reproduces Table 2 and Figures 6, 7
  - `train_indep.py` - trains the basic models for Table 3
  - `run_train_simple.sh` - bash script to train multiple models in parallel
  - `train_swiss.py` - trains the Swiss-roll models for Table 3
  - `train_scurve.py` - trains the S-curve models for Table 3
  - `train_scurve.sh`, `run_train_simple.sh`, `train_swiss.sh` - bash scripts to train the models for Table 3
  - `Indep-deterministic.ipynb` - reproduces Table 3
  - `run_CRT_linear.py` - performs the CRT experiment in Section 4.2
  - `Indep-extra.ipynb` - reproduces Table 6
- `utils` - utility functions (load the module onto your path)
  - `mfep_utils.py` - utility functions for the MFEP experiments
  - `plot_utils.py` - plotting utilities (some adapted from `mlcolvar`)
If you find this work useful in your research, please consider citing the paper:
```bibtex
@inproceedings{leban2025distributionalautoencodersknowscore,
  title={Distributional Autoencoders Know the Score},
  author={Andrej Leban},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025},
  url={https://neurips.cc/virtual/2025/poster/119870}
}
```