Activation Functions
Zhipeng Ding
Nov. 2, 2016
Outline
- What is the activation function?
- Why use activation functions?
- Some common activation functions
- Comparison
- Revise Saturated Activation Functions
What is the activation function?
https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network
Biological motivation and connections
http://cs231n.github.io/neural-networks-1/
Why use activation functions?
https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network
Some common activation functions
- Sigmoid
- Tanh
- Rectified Linear Unit (ReLU)
- Leaky ReLU
- Parametric ReLU (PReLU)
- Randomized Leaky ReLU (RReLU)
- Exponential Linear Unit (ELU)
- Maxout
Sigmoid/Tanh
http://cs231n.github.io/neural-networks-1/
http://kaiminghe.com/icml16tutorial/icml2016_tutorial_deep_residual_networks_kaiminghe.pdf
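The slides here are figure-only in the transcript, so as an illustration: a minimal NumPy sketch of the two saturating activations, plus the sigmoid gradient, which shows why deep sigmoid networks suffer from vanishing gradients (the slope peaks at 0.25 at the origin and decays to zero in both tails):

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^-x); output in (0, 1), not zero-centered
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh(x) = 2*sigmoid(2x) - 1; output in (-1, 1), zero-centered
    return np.tanh(x)

def sigmoid_grad(x):
    # derivative s*(1 - s): at most 0.25, and near zero in both tails,
    # which is the source of the vanishing-gradient problem
    s = sigmoid(x)
    return s * (1.0 - s)
```

Multiplying many such sub-one slopes across layers is what shrinks gradients during backpropagation.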
ReLU/LReLU/PReLU/RReLU
[Xu et al., 2015]
ReLU
http://blog.csdn.net/cyh_24/article/details/50593400
LReLU/PReLU/RReLU
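To make the family on this slide concrete, a minimal NumPy sketch of the four rectified variants. Note one simplification: [Xu et al., 2015] sample the reciprocal slope of RReLU from U(3, 8) during training, while this sketch samples the slope itself uniformly from [1/8, 1/3], which keeps the same range but not the same distribution:

```python
import numpy as np

def relu(x):
    # zero for x < 0, identity for x > 0
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # fixed small negative slope
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # same shape as Leaky ReLU, but alpha is learned (He et al., 2015);
    # a scalar alpha is used here for simplicity
    return np.where(x > 0, x, alpha * x)

def rrelu(x, lower=1.0 / 8, upper=1.0 / 3, rng=None, training=True):
    # training: random negative slope per element; test: fixed average slope
    if training:
        rng = rng or np.random.default_rng()
        alpha = rng.uniform(lower, upper, size=np.shape(x))
    else:
        alpha = (lower + upper) / 2.0
    return np.where(x > 0, x, alpha * x)
```

The randomness in RReLU acts as a regularizer; at test time the deterministic average slope is used.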
ELU
[Clevert et al., 2015]
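A minimal NumPy sketch of the ELU from [Clevert et al., 2015]: identity for positive inputs, and a smooth exponential negative part that saturates at -alpha, pushing mean activations toward zero:

```python
import numpy as np

def elu(x, alpha=1.0):
    # x for x > 0, alpha*(e^x - 1) otherwise (Clevert et al., 2015);
    # np.minimum clamps the exp argument so large positive x cannot overflow
    # in the branch that np.where still evaluates
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))
```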
Maxout
[Goodfellow et al., 2013]
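Maxout [Goodfellow et al., 2013] takes the elementwise maximum over k affine pieces, so it learns its own activation shape; with two pieces it can recover ReLU or the absolute value. A minimal NumPy sketch (the tensor shapes are my own convention, not from the paper):

```python
import numpy as np

def maxout(x, W, b):
    # x: (d,) input, W: (k, m, d) weights, b: (k, m) biases
    # output_i = max over the k affine pieces (Goodfellow et al., 2013)
    z = np.einsum('kmd,d->km', W, x) + b  # k affine maps, each of width m
    return z.max(axis=0)
```

Usage: with two pieces W = (x, -x) and zero biases, maxout computes |x|, illustrating how it subsumes simpler rectifiers.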
Comparison
[Clevert et al., 2016]
Comparison
“Empirical Evaluation of Rectified Activations in Convolutional Network” [Xu et al., 2015]: ReLU/LReLU/PReLU/RReLU compared with CXXNET on CIFAR-10, CIFAR-100, and the National Data Science Bowl (NDSB) competition.
Revise Saturated Activation Functions
Scaled Sigmoid
[Xu et al., 2016]
Penalized Tanh
[Xu et al., 2016]
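These two slides keep only the citations in the transcript, so as an illustration: [Xu et al., 2016] revise the saturating activations so they behave better near the origin and on the negative side. The constants below are my assumptions, not taken from the paper: the sigmoid scaling is chosen so the result is zero-centered with unit slope at 0, and a = 0.25 is an assumed penalty slope for the negative part of tanh (analogous to Leaky ReLU); check the paper for the exact definitions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scaled_sigmoid(x):
    # 4*sigmoid(x) - 2: zero-centered with unit slope at the origin
    # (assumed scaling; the exact constants in Xu et al., 2016 may differ)
    return 4.0 * sigmoid(x) - 2.0

def penalized_tanh(x, a=0.25):
    # tanh with a shrunk negative part; a = 0.25 is an assumed slope
    t = np.tanh(x)
    return np.where(x > 0, t, a * t)
```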
Work cited
Xu, B., Wang, N., Chen, T., & Li, M. (2015). Empirical evaluation of rectified activations in convolutional network. arXiv preprint arXiv:1505.00853.
Clevert, D. A., Unterthiner, T., & Hochreiter, S. (2015). Fast and accurate deep network learning by exponential linear units (ELUs). arXiv preprint arXiv:1511.07289.
Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A. C., & Bengio, Y. (2013). Maxout networks. ICML (3), 28, 1319-1327.
He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on imagenet classification. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1026-1034).
Xu, B., Huang, R., & Li, M. (2016). Revise saturated activation functions. arXiv preprint arXiv:1602.05980.
Thank you!