
Self-Normalizing Networks

Note: The code has been modified to work with Python 3.6 and TensorFlow 1.4.

Original repository: bioinf-jku/SNNs

Tutorials and implementations for "Self-Normalizing Networks" (SNNs) as proposed by Klambauer et al.

Versions

  • Python 3.6 and TensorFlow 1.4

Note for TensorFlow 1.4 users

TensorFlow 1.4 already provides the functions `tf.nn.selu` and `tf.contrib.nn.alpha_dropout`, which implement the SELU activation function and the corresponding dropout variant.
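For readers on other frameworks (or newer TensorFlow versions where `tf.contrib` no longer exists), the activation itself is a one-liner. A minimal pure-Python sketch using the fixed-point constants from the paper:

```python
import math

# SELU constants for the fixed point (mean 0, variance 1),
# as derived in the SNN paper by Klambauer et al.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit for a single float:
    lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise."""
    if x > 0.0:
        return LAMBDA * x
    return LAMBDA * ALPHA * (math.exp(x) - 1.0)
```

For large negative inputs the output saturates at `-LAMBDA * ALPHA` (about -1.758), which is what bounds the variance from below.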

Tutorials

  • Multilayer Perceptron (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

Keras CNN scripts:

Design novel SELU functions

  • How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (Python code)
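The idea behind that notebook can be sketched without SymPy: at the fixed point, SELU applied to a standard-normal pre-activation must reproduce the target mean and second moment, so (alpha, lambda) can be found by solving two moment equations numerically. Below is a hedged stdlib-only sketch (the original repository solves the same equations symbolically/with SciPy; `solve_selu_params` and `_moments` are illustrative names, not the repo's API):

```python
import math

def _moments(alpha, lam, lo=-12.0, hi=12.0, n=24000):
    """Mean and second moment of selu(Z) for Z ~ N(0, 1),
    computed by trapezoidal quadrature on [lo, hi]."""
    h = (hi - lo) / n
    m1 = m2 = 0.0
    for i in range(n + 1):
        z = lo + i * h
        y = lam * z if z > 0 else lam * alpha * (math.exp(z) - 1.0)
        w = h if 0 < i < n else h / 2.0          # trapezoid end-point weights
        p = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
        m1 += w * y * p
        m2 += w * y * y * p
    return m1, m2

def solve_selu_params(target_mean=0.0, target_second=1.0, iters=50):
    """Newton iteration with a numeric Jacobian: find (alpha, lambda) such
    that selu maps N(0,1) inputs to the target mean / second moment."""
    a, l = 1.5, 1.0
    eps = 1e-6
    for _ in range(iters):
        f1, f2 = _moments(a, l)
        r1, r2 = f1 - target_mean, f2 - target_second
        if abs(r1) + abs(r2) < 1e-10:
            break
        g1a, g2a = _moments(a + eps, l)          # partials w.r.t. alpha
        g1l, g2l = _moments(a, l + eps)          # partials w.r.t. lambda
        j11, j12 = (g1a - f1) / eps, (g1l - f1) / eps
        j21, j22 = (g2a - f2) / eps, (g2l - f2) / eps
        det = j11 * j22 - j12 * j21
        a -= (r1 * j22 - r2 * j12) / det
        l -= (r2 * j11 - r1 * j21) / det
    return a, l
```

With the default targets (mean 0, second moment 1) this recovers the canonical constants alpha ≈ 1.6733 and lambda ≈ 1.0507; other fixed points simply change the two targets.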

Basic Python functions to implement SNNs

are provided as code chunks here: selu.py
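Besides the activation, selu.py also covers the matching dropout. A minimal stdlib sketch of alpha dropout under the assumption of normalized (mean 0, variance 1) inputs: dropped units are set to the SELU saturation value alpha' = -lambda * alpha, and an affine correction restores zero mean and unit variance (`alpha_dropout` is an illustrative name, not the repo's function):

```python
import random

# SELU constants for the fixed point (mean 0, variance 1).
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805
ALPHA_PRIME = -LAMBDA * ALPHA  # saturation value dropped units are set to (~ -1.758)

def alpha_dropout(xs, keep_prob, rng=random):
    """Alpha dropout sketch: replace dropped units by ALPHA_PRIME, then apply
    an affine transform a*x + b that restores mean 0 / variance 1,
    assuming the inputs already have mean 0 and variance 1."""
    q = keep_prob
    if q == 1.0:
        return list(xs)
    a = (q + ALPHA_PRIME ** 2 * q * (1.0 - q)) ** -0.5
    b = -a * (1.0 - q) * ALPHA_PRIME
    return [a * (x if rng.random() < q else ALPHA_PRIME) + b for x in xs]
```

Unlike standard dropout (which preserves the mean but not the variance), this keeps both moments at the self-normalizing fixed point, which is why it is the variant recommended for SNNs.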

Notebooks and code to reproduce Figure 1 of the paper

are provided here: Figure1

Calculations and numeric checks of the theorems

UCI, Tox21 and HTRU2 data sets

Models and architectures built on Self-Normalizing Networks

GANs

Convolutional neural networks

FNNs are finally deep

Reinforcement Learning

Autoencoders

Recurrent Neural Networks