cmasch/densenet

Implementation of Densely Connected Convolutional Network with Keras and TensorFlow.


DenseNet

Densely Connected Convolutional Network (DenseNet) is a network architecture in which each layer is directly connected to every other layer in a feed-forward fashion. It is quite similar to ResNet, but in contrast to ResNet's summation, DenseNet concatenates the outputs of preceding layers. If you need a quick introduction to how DenseNet works, please read the original paper [1]. It's well written and easy to understand.
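The concatenation pattern can be sketched in a few lines of NumPy. This is a toy illustration of the connectivity only, not the repo's Keras code: `dense_block` is a hypothetical stand-in in which each "layer" receives the concatenation of all previous feature maps and contributes `growth_rate` new channels.

```python
import numpy as np

def dense_block(x, num_layers, growth_rate):
    # Toy stand-in for a dense block (illustration only, not the repo's
    # Keras implementation): each "layer" sees the concatenation of ALL
    # previous feature maps and adds growth_rate new channels.
    features = [x]
    for _ in range(num_layers):
        layer_input = np.concatenate(features, axis=-1)  # dense connectivity
        # stand-in for BN-ReLU-Conv producing growth_rate channels
        new_features = np.ones(layer_input.shape[:-1] + (growth_rate,))
        features.append(new_features)
    return np.concatenate(features, axis=-1)

x = np.zeros((1, 8, 8, 16))                        # 16 input channels
out = dense_block(x, num_layers=4, growth_rate=12)
print(out.shape)                                   # channels: 16 + 4 * 12 = 64
```

Because everything is concatenated rather than summed, the channel count grows linearly with the number of layers in a block, which is why DenseNet keeps the growth rate small.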

I implemented DenseNet in Python using Keras with TensorFlow as the backend. Because of this, I can't guarantee that this implementation works with Theano or CNTK. In the coming months I will update the code to TensorFlow 2.x. Beyond that, I will try to optimize the architecture in my own way with some modifications. You can find several other implementations on GitHub.

Results

Fashion-MNIST

I used this notebook to evaluate the model on Fashion-MNIST with the following parameters:

| Dense Blocks | Depth | Growth Rate | Dropout | Bottleneck | Compression | BatchSize / Epochs | Training (loss / acc) | Validation (loss / acc) | Test (loss / acc) |
|---|---|---|---|---|---|---|---|---|---|
| 5 | 35 | 20 | 0.4 | False | 0.9 | 100 / 80 | 0.1366 / 0.9681 | 0.1675 / 0.9643 | 0.2739 / 0.9459 |

Feel free to try it on your own with other parameters.
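To get a feel for how the compression parameter affects model size: in DenseNet-BC, each transition layer keeps only a fraction of the channels emitted by the preceding dense block, floor(compression * m). A minimal sketch of that arithmetic (the helper name is hypothetical, not part of this repository):

```python
import math

def transition_channels(in_channels, compression):
    # DenseNet-BC transition layers keep floor(compression * m) channels.
    # Hypothetical helper for illustration, not part of this repository.
    return math.floor(compression * in_channels)

# With compression = 0.9 (as in the table above), a dense block emitting
# 120 channels is reduced to 108 channels before the next block.
print(transition_channels(120, 0.9))  # → 108
```

Lower compression values (e.g. 0.5, as in the usage example below) shrink the network more aggressively between blocks.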

Requirements

  • TensorFlow 2.x
  • Python 3.x

Usage

Feel free to use this implementation:

```python
import densenet

model = densenet.DenseNet(input_shape=(28,28,1), nb_classes=10, depth=10, growth_rate=25,
                          dropout_rate=0.1, bottleneck=False, compression=0.5).build_model()
model.summary()
```

This will build the model; `model.summary()` prints the full architecture.

References

[1] Densely Connected Convolutional Networks
[2] DenseNet - Lua implementation

Author

Christopher Masch
