# Differentiable Random Partition Models

This is the official codebase for the NeurIPS 2023 paper *Differentiable Random Partition Models*.

The code and repository are still a work in progress.

The code for the experiment "partitioning of generative factors" is not yet ready to be released, but we are working on it. Apologies for that!

## Two-stage Process

In this paper, we introduce the Differentiable Random Partition Model (DRPM), a differentiable relaxation of random partition models (RPMs) that allows incorporating them into gradient-based learning frameworks.

The DRPM is based on the following two-stage process:

*Figure: the two-stage process underlying the DRPM.*
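As a rough illustration of this two-stage idea (not the repository's API), the sketch below samples a hard, non-differentiable partition: subset sizes are drawn from a multivariate hypergeometric distribution, standing in for the differentiable MVHG, and a random ordering of the elements then fills the subsets block by block. The names `sample_partition` and `importance` are placeholders; the actual DRPM replaces both sampling steps with differentiable relaxations.

```python
import numpy as np

def sample_partition(n, importance, rng=None):
    """Conceptual, non-differentiable sketch of the two-stage process.

    Stage 1: draw subset sizes (n_1, ..., n_K) that sum to n, here from a
    multivariate hypergeometric urn as a stand-in for the differentiable MVHG.
    Stage 2: draw a random ordering of the n elements and assign consecutive
    blocks of that ordering to the K subsets.
    """
    if rng is None:
        rng = np.random.default_rng()

    # Stage 1: subset sizes; the urn's group sizes ("importance") control how
    # many elements each subset tends to receive.
    sizes = rng.multivariate_hypergeometric(importance, n)

    # Stage 2: random ordering of the elements, split into consecutive blocks.
    order = rng.permutation(n)
    boundaries = np.cumsum(sizes)[:-1]
    return np.split(order, boundaries)  # list of K index arrays, one per subset

# Example: partition 10 elements into 3 subsets.
print(sample_partition(10, importance=[5, 3, 7]))
```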

## Installation

To install the differentiable random partition module and run the experiments from the paper *Differentiable Random Partition Models*, first add the path to your local clone of the MVHG repository on line 4 of `setup.py`.

The code for the MVHG model can be found here.
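For illustration only, the snippet below sketches what such a local-path dependency entry in `setup.py` could look like; the package name `mvhg` and the path are hypothetical placeholders, so adapt them to your clone.

```python
# Hypothetical excerpt from setup.py (package name and path are placeholders).
# Point the dependency at your local clone of the MVHG repository.
install_requires = [
    "mvhg @ file:///absolute/path/to/mvhg",
]
```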

After that, you can install `drpm` using

```bash
pip install .[pt]
```
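A quick way to check the installation, assuming the installed module is importable as `drpm` (an assumption based on the package name, not confirmed by the repository):

```python
# Sanity check: the module name "drpm" is assumed from the package name above.
import drpm
print(drpm.__file__)
```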

To run the multi-task experiment or the clustering experiment, please consult the `README.md` files in the respective experiment folders under `experiments/`.

## Disclaimer

The release of the code for the experiment *Partitioning of generative factors* is delayed, as we are porting everything from TensorFlow v1 to PyTorch. Initially, we based our experiments on `disentanglement_lib`, which is written in TensorFlow v1. To improve the usability of the released code and make it more future-proof, we are rewriting the experiment in PyTorch. Unfortunately, this is still a work in progress.

## Citation

If you use our model in your work, please cite us with the following citation:

```bibtex
@inproceedings{sutterryser2023drpm,
  title = {Differentiable Random Partition Models},
  author = {Sutter, Thomas M and Ryser, Alain and Liebeskind, Joram and Vogt, Julia E},
  year = {2023},
  booktitle = {Advances in Neural Information Processing Systems},
}
```

and also the paper introducing the differentiable MVHG distribution:

```bibtex
@inproceedings{sutter2023mvhg,
  title = {Learning Group Importance using the Differentiable Hypergeometric Distribution},
  author = {Sutter, Thomas M and Manduchi, Laura and Ryser, Alain and Vogt, Julia E},
  year = {2023},
  booktitle = {International Conference on Learning Representations},
}
```

## Questions

For any questions or requests, please reach out to:

- Thomas Sutter ([email protected])
- Alain Ryser ([email protected])