This repository aims to reproduce the results of "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization" as part of the NeurIPS 2019 reproducibility challenge. We have implemented the paper's binary optimizer (Bop) in PyTorch and used it to train a binarized neural network on CIFAR-10. See the reproducibility report for details.
First, install the dependencies:

```bash
# clone project
git clone https://github.com/nikvaessen/Rethinking-Binarized-Neural-Network-Optimization

# install project
cd Rethinking-Binarized-Neural-Network-Optimization
pip install -e .
pip install -r requirements.txt
```
If you are interested in training a BNN on CIFAR-10, navigate to `research_seed/cifar` and run `cifar_trainer.py`:

```bash
# module folder
cd research_seed/cifar/

# run module
python cifar_trainer.py
```
To reproduce the original paper, we have implemented the following:
- `bytorch` implements the binary optimiser and binary layers in PyTorch
- `cifar` implements BinaryNet (from this paper) for CIFAR-10
- `theoretical` implements experiments to disprove the approximation viewpoint, as well as the behaviour of learning rates under latent-weight optimisation
- `experiments` contains convenience scripts to reproduce the experiments of sections 5.1 and 5.2 of the original paper
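For orientation, the core of the binary optimizer is a simple flipping rule: keep an exponential moving average of the gradients and flip a binary weight only when that average is strong enough (above a threshold τ) and agrees in sign with the current weight. The sketch below is an illustrative NumPy version of one such update step, not the repository's actual implementation; the function name `bop_step` and the default hyperparameter values are our own choices.

```python
import numpy as np

def bop_step(w, grad, m, gamma=1e-3, tau=1e-6):
    """One illustrative Bop-style update on binary weights.

    w     : array of +1/-1 binary weights
    grad  : gradient w.r.t. the binary weights
    m     : exponential moving average of gradients (optimizer state)
    gamma : adaptivity rate of the moving average
    tau   : flip threshold
    (names follow the paper's notation; defaults are illustrative)
    """
    # update the gradient moving average
    m = (1 - gamma) * m + gamma * grad
    # flip where the averaged gradient is strong and agrees with the weight's sign
    flip = (np.abs(m) > tau) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)
    return w, m
```

Because the update only ever flips signs, the weights stay exactly binary throughout training, with no latent real-valued weights involved.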