Examples
Examples may be used as templates for new projects. All examples are at GitHub/examples:
Simple MLP
: A simple multi-layer perceptron for MNIST classification, built with Knet and Helferlein types in just one line of code (or so).
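In the documented Helferlein style this boils down to something like the following sketch (constructor names `Classifier` and `Dense` as in the NNHelferlein docs; layer sizes are illustrative, not necessarily those of the notebook):

```julia
using Knet, NNHelferlein

# MLP for flattened 28x28 MNIST digits; a Classifier computes the
# cross-entropy loss when called with (x, y) and raw scores with x only:
mlp = Classifier(Dense(784, 256),
                 Dense(256, 64),
                 Dense(64, 10, actf=identity))

x = randn(Float32, 784, 8)   # dummy minibatch of 8 flattened images
scores = mlp(x)              # 10 class scores per image
```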
Simple LeNet
: A simple LeNet for MNIST classification, built with help of the Helferlein layers in just two (admittedly long) lines of code.
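Schematically, the two lines might look like this (a sketch assuming the Helferlein layers `Conv`, `Pool`, `Flat` and `Dense`; sizes fit 28x28 MNIST input):

```julia
using Knet, NNHelferlein

lenet = Classifier(Conv(5, 5, 1, 20),  Pool(),   # -> 12x12x20
                   Conv(5, 5, 20, 50), Pool(),   # ->  4x4x50
                   Flat(),                       # ->  800
                   Dense(800, 512),
                   Dense(512, 10, actf=identity))
```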
Training unbalanced data with help of a focal loss function
: A simple MLP with focal loss demonstrates classification of highly unbalanced data.
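For reference, the focal loss (Lin et al., 2017) down-weights well-classified examples so that the rare class contributes more to the gradient; a minimal framework-agnostic sketch (the parameter defaults are common choices, not taken from the notebook):

```julia
# focal loss for the predicted probability p of the true class:
# FL(p) = -α * (1 - p)^γ * log(p)
focal(p; γ=2.0f0, α=0.25f0) = -α * (1 - p)^γ * log(p)

focal(0.9f0)    # well-classified example: tiny loss
focal(0.1f0)    # misclassified example: large loss
```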
Vanilla Autoencoder
: A simple autoencoder designed with help of Knet in Helferlein style.
Convolutional Autoencoder
: A convolutional autoencoder designed with help of Knet in Helferlein style.
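Both autoencoder examples follow the same pattern: an encoder chain compresses the input, a decoder chain reconstructs it, and the loss is the reconstruction error. A minimal dense sketch (layer sizes are illustrative, assuming the Helferlein `Chain` and `Dense` layers):

```julia
using Knet, NNHelferlein

# encoder compresses to a small code, decoder reconstructs:
struct AE; encoder; decoder; end
(m::AE)(x)    = m.decoder(m.encoder(x))              # reconstruction
(m::AE)(x, y) = sum(abs2, m(x) .- y) / size(y, 2)    # per-sample mse loss

ae = AE(Chain(Dense(784, 64), Dense(64, 16)),
        Chain(Dense(16, 64), Dense(64, 784)))

x = rand(Float32, 784, 8)    # dummy minibatch of 8 flattened images
loss = ae(x, x)              # trained to reproduce its own input
```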
Variational Autoencoder
: Example of a simple VAE utilising the NNHelferlein type VAE and demonstrating the fascinating regularisation behaviour of a VAE.
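At the heart of every VAE is the reparameterisation trick: the encoder predicts mean and (log-)variance of the latent code, a sample is drawn as z = μ + σ·ε, and a KL penalty regularises the code towards a standard normal. A framework-agnostic sketch of these two ingredients (names are illustrative):

```julia
# reparameterisation: sample z from N(μ, σ²) in a differentiable way
function sample_z(μ, logσ²)
    σ = exp.(logσ² ./ 2)
    μ .+ σ .* randn(Float32, size(μ))
end

# KL divergence of N(μ, σ²) from the standard normal prior:
kl(μ, logσ²) = -0.5f0 * sum(1 .+ logσ² .- μ.^2 .- exp.(logσ²))
```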
Simple sequence-to-sequence network
: Simple s2s network to demonstrate how to set up machine translation with an RNN.
Sequence-to-sequence RNN for machine translation
: RNN to demonstrate how to set up machine translation with a bidirectional encoder RNN and attention.
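The attention mechanism scores every encoder state against the current decoder state and mixes the encoder states with the resulting weights; a minimal dot-product variant in plain Julia (the notebook may use a different scoring function):

```julia
# h_enc: (units, T) matrix of encoder states; h_dec: (units,) decoder state
function attend(h_enc, h_dec)
    scores = h_enc' * h_dec                  # (T,) similarity per position
    α = exp.(scores) ./ sum(exp.(scores))    # softmax attention weights
    c = h_enc * α                            # (units,) context vector
    return c, α
end
```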
RNN Sequence tagger for annotation of ECGs
: RNN to demonstrate how to set up a sequence tagger to detect heart beats. Only one layer with 8 units is necessary to achieve almost 100% correct predictions. The example includes the definition of peephole LSTMs to show how to integrate non-standard RNN units with the NNHelferlein framework.
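A peephole LSTM (Gers & Schmidhuber, 2000) lets the gates look at the cell state directly. A self-contained cell in the spirit of such a non-standard unit (a sketch only; the notebook's integration with the Helferlein framework differs in detail):

```julia
using Knet

# gate and cell-update weights; the peep_* vectors are the
# peephole connections into the cell state:
struct PeepholeLSTM
    Wi; Wf; Wo; Wc
    peep_i; peep_f; peep_o
    bi; bf; bo; bc
end

function PeepholeLSTM(n_in, n_units)
    w() = param(n_units, n_in + n_units)
    PeepholeLSTM(w(), w(), w(), w(),
                 param(n_units), param(n_units), param(n_units),
                 param0(n_units), param0(n_units), param0(n_units), param0(n_units))
end

function (m::PeepholeLSTM)(x, h, c)
    xh = vcat(x, h)
    i = sigm.(m.Wi * xh .+ m.peep_i .* c .+ m.bi)   # input gate peeks at c
    f = sigm.(m.Wf * xh .+ m.peep_f .* c .+ m.bf)   # forget gate peeks at c
    c = f .* c .+ i .* tanh.(m.Wc * xh .+ m.bc)     # new cell state
    o = sigm.(m.Wo * xh .+ m.peep_o .* c .+ m.bo)   # output gate peeks at new c
    h = o .* tanh.(c)
    return h, c
end
```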
Import a Keras model
: The notebook shows the import of a pretrained VGG16 model from TensorFlow/Keras into a Knet-style CNN and its application to example images, utilising the Helferlein imagenet utilities.
Transformer for machine translation
: A simple transformer architecture is set up according to the 2017 paper of Vaswani et al., "Attention Is All You Need", with help of the NNHelferlein utilities.
Simple Transformer API for BERT-like architectures
: A simple transformer architecture is set up with the NNHelferlein transformer API.
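Both transformer examples are built around the scaled dot-product attention of the Vaswani paper, Attention(Q, K, V) = softmax(QKᵀ/√d)·V; a minimal sketch in plain Julia (batching, masking and multiple heads omitted):

```julia
# q, k, v: (d, T) matrices of queries, keys and values
function scaled_dot_attention(q, k, v)
    d = size(q, 1)
    scores = (k' * q) ./ sqrt(Float32(d))   # (T_k, T_q) similarities
    α = exp.(scores)
    α = α ./ sum(α; dims=1)                 # softmax over the keys
    v * α                                   # weighted mix of the values
end
```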
Pretrained Nets
Based on the Keras import constructors, it is easy to import pretrained models from the TF/Keras ecosystem.
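Schematically, such a constructor reads a layer's weights from the Keras HDF5 file and wraps them in a Knet-style layer; the sketch below shows the idea for a single convolution layer (file name and weight paths are illustrative, and the real NNHelferlein constructors handle these details for all supported layer types):

```julia
using Knet, HDF5

# read one Keras conv layer's kernel and bias from the HDF5 file
# (paths follow the usual Keras weight layout; file name is a placeholder):
h5 = h5open("vgg16_weights.h5", "r")
k  = read(h5, "model_weights/block1_conv1/block1_conv1/kernel:0")
b  = read(h5, "model_weights/block1_conv1/block1_conv1/bias:0")

# reorder to Knet's (height, width, in, out) layout and wrap as a layer;
# mode=1 selects cross-correlation, as used by Keras:
w = permutedims(k, (4, 3, 2, 1))
conv1(x) = relu.(conv4(w, x; padding=1, mode=1) .+ reshape(b, 1, 1, :, 1))
```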