Note: The code has been modified for Python 3.6 and Tensorflow 1.4.
Original repository: bioinf-jku/SNNs
Tutorials and implementations for "Self-normalizing networks" (SNNs) as suggested by Klambauer et al.
- Python 3.6 and Tensorflow 1.4
Tensorflow 1.4 already provides the functions "tf.nn.selu" and "tf.contrib.nn.alpha_dropout", which implement the SELU activation function and the suggested dropout variant.
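A minimal sketch of combining these built-in Tensorflow 1.4 ops in a graph (the layer sizes, placeholder shapes, and keep_prob value below are illustrative, not taken from the tutorials):

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
keep_prob = tf.placeholder(tf.float32)  # e.g. 0.95 during training, 1.0 at test time

# SNNs assume zero-mean weights with variance 1/fan_in ("lecun normal"-style initialization)
init = tf.variance_scaling_initializer(scale=1.0, mode="fan_in", distribution="normal")

h = tf.layers.dense(x, 256, activation=tf.nn.selu, kernel_initializer=init)
h = tf.contrib.nn.alpha_dropout(h, keep_prob)  # dropout variant suited to SELUs
logits = tf.layers.dense(h, 10, kernel_initializer=init)
```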
- Multilayer Perceptron (notebook)
- Convolutional Neural Network on MNIST (notebook)
- Convolutional Neural Network on CIFAR10 (notebook)
- KERAS: Convolutional Neural Network on MNIST (python script)
- KERAS: Convolutional Neural Network on CIFAR10 (python script)
- How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (python codes; a minimal numerical sketch is given after this list)
- Basic Python functions to implement SNNs are provided as code chunks here: selu.py
- Notebooks and code to produce Figure 1 are provided here: Figure1
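The following is a minimal numerical sketch of the fixed-point design mentioned above (it is not the repository's notebook, and the function names are illustrative). Assuming a zero-mean fixed point with a chosen variance, it solves for alpha and lambda such that the SELU mapping preserves that mean and variance under a Gaussian input distribution:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve


def selu(x, alpha, lam):
    # lam * x for x > 0, lam * alpha * (exp(x) - 1) otherwise
    return lam * np.where(x > 0.0, x, alpha * np.expm1(np.minimum(x, 0.0)))


def get_selu_parameters(var=1.0):
    """Solve for (alpha, lambda) so that E[selu(z)] = 0 and Var[selu(z)] = var
    when z ~ N(0, var), i.e. the zero-mean fixed point with variance `var`."""
    pdf = lambda z: np.exp(-z ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

    def residual(params):
        alpha, lam = params
        m1 = quad(lambda z: selu(z, alpha, lam) * pdf(z), -np.inf, np.inf)[0]
        m2 = quad(lambda z: selu(z, alpha, lam) ** 2 * pdf(z), -np.inf, np.inf)[0]
        return [m1, (m2 - m1 ** 2) - var]

    return fsolve(residual, x0=[1.7, 1.05])


if __name__ == "__main__":
    alpha, lam = get_selu_parameters(1.0)
    print(alpha, lam)  # approx. 1.6733 and 1.0507 for the standard (0, 1) fixed point
```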
- THINKING LIKE A MACHINE - GENERATING VISUAL RATIONALES WITH WASSERSTEIN GANS: Both discriminator and generator trained without batch normalization.
- Deformable Deep Convolutional Generative Adversarial Network in Microwave Based Hand Gesture Recognition System: The comparison between SELU and SELU+BN shows that SELU itself has the convergence quality of BN.
- Solving internal covariate shift in deep learning with linked neurons: Shows that ultra-deep CNNs without batch normalization can only be trained with SELUs (apart from the method proposed by the authors).
- DCASE 2017 ACOUSTIC SCENE CLASSIFICATION USING CONVOLUTIONAL NEURAL NETWORK IN TIME SERIES: Deep CNN trained without batch normalization.
- Point-wise Convolutional Neural Network: Training with SELU converges faster than training with ReLU; improved accuracy with SELU.
- Over the Air Deep Learning Based Radio Signal Classification: Slight performance improvement over ReLU.
- Convolutional neural networks for structured omics: OmicsCNN and the OmicsConv layer: Deep CNN trained without batch normalization.
- Searching for Activation Functions: ResNet architectures trained with SELUs, probably together with batch normalization.
- EddyNet: A Deep Neural Network For Pixel-Wise Classification of Oceanic Eddies: Fast CNN training with SELUs. ReLU with BN achieves better final performance, but skip connections were not handled appropriately.
- SMILES2Vec: An Interpretable General-Purpose Deep Neural Network for Predicting Chemical Properties: 20-layer ResNet trained with SELUs.
- Sentiment Analysis of Tweets in Malayalam Using Long Short-Term Memory Units and Convolutional Neural Nets
- RETUYT in TASS 2017: Sentiment Analysis for Spanish Tweets using SVM and CNN
- Predicting Adolescent Suicide Attempts with Neural Networks: The use of the SELU activation renders batch normalization unnecessary.
- Improving Palliative Care with Deep Learning: An 18-layer neural network with SELUs performed best.
- An Iterative Closest Points Approach to Neural Generative Models
- Retrieval of Surface Ozone from UV-MFRSR Irradiances using Deep Learning: 6-10 layer networks perform best.
- Automated Cloud Provisioning on AWS using Deep Reinforcement Learning: Deep CNN architecture trained with SELUs.
- Learning to Run with Actor-Critic Ensemble: Second-best method (actor-critic ensemble) at the NIPS2017 "Learning to Run" competition. The authors tried several activation functions and found that Scaled Exponential Linear Units (SELU) are superior to ReLU, Leaky ReLU, Tanh and Sigmoid.
- Replacement AutoEncoder: A Privacy-Preserving Algorithm for Sensory Data Analysis: Deep autoencoder trained with SELUs.
- Application of generative autoencoder in de novo molecular design: Faster convergence with SELUs.
- Sentiment extraction from Consumer-generated noisy short texts: SNNs used in FC layers.