Activation Functions
Zhipeng Ding
Nov. 2, 2016

Outline
›  What is the activation function?
›  Why use activation functions?
›  Some common activation functions
›  Comparison
›  Revise Saturated Activation Functions

What is an activation function?

https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network

Biological motivation and connections

http://cs231n.github.io/neural-networks-1/

Why use activation functions?

https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network
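
To make the point behind this slide concrete, here is a minimal NumPy sketch (not from the slides): stacking linear layers without an activation function collapses into a single linear map, while inserting a nonlinearity breaks that collapse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" without any activation: W2 @ (W1 @ x) is still linear,
# so the stack is equivalent to one layer with weights W2 @ W1.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

no_activation = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(no_activation, collapsed))  # True: depth adds no expressive power

# Inserting a nonlinearity (here tanh) between the layers breaks the collapse,
# which is why activation functions are needed in the first place.
with_activation = W2 @ np.tanh(W1 @ x)
print(np.allclose(with_activation, collapsed))  # False
```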

Some common activation functions
›  Sigmoid
›  Tanh
›  Rectified Linear Unit (ReLU)
›  Leaky ReLU
›  Parametric ReLU (PReLU)
›  Randomized Leaky ReLU (RReLU)
›  Exponential Linear Unit (ELU)
›  Maxout

Sigmoid/Tanh

http://cs231n.github.io/neural-networks-1/

http://kaiminghe.com/icml16tutorial/icml2016_tutorial_deep_residual_networks_kaiminghe.pdf

http://kaiminghe.com/icml16tutorial/icml2016_tutorial_deep_residual_networks_kaiminghe.pdf

Sigmoid/Tanh

http://cs231n.github.io/neural-networks-1/
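
A minimal NumPy sketch (my illustration, not from the slides) of the two saturating functions and their gradients: both derivatives vanish for large |x|, and the sigmoid's gradient never exceeds 0.25, which is the vanishing-gradient issue these slides point to.

```python
import numpy as np

def sigmoid(x):
    """sigma(x) = 1 / (1 + exp(-x)), output in (0, 1), not zero-centered."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """sigma'(x) = sigma(x) * (1 - sigma(x)), at most 0.25 (at x = 0)."""
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    """tanh'(x) = 1 - tanh(x)^2, at most 1 (at x = 0)."""
    return 1.0 - np.tanh(x) ** 2

# Both gradients shrink toward zero as |x| grows: the saturation problem.
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x={x:5.1f}  sigmoid'={sigmoid_grad(x):.6f}  tanh'={tanh_grad(x):.6f}")
```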

ReLU/LReLU/PReLU/RReLU

[Xu et al., 2015]

ReLU/LReLU/PReLU/RReLU

[Xu et al., 2015]

ReLU/LReLU/PReLU/RReLU

[Xu et al., 2015]

ReLU

http://blog.csdn.net/cyh_24/article/details/50593400

LReLU/PReLU/RReLU
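
A hedged sketch of the four rectified variants compared in [Xu et al., 2015]; the function names are mine, and the RReLU sampling range below is an assumption based on the U(1/8, 1/3) NDSB setting reported in that paper.

```python
import numpy as np

def relu(x):
    """ReLU: max(0, x)."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: a small fixed slope alpha on the negative side."""
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    """PReLU: same form as Leaky ReLU, but alpha is a learned parameter,
    updated by backprop along with the weights (He et al., 2015)."""
    return np.where(x > 0, x, alpha * x)

def rrelu(x, lower=1.0 / 8, upper=1.0 / 3, training=True, rng=None):
    """RReLU: the negative slope is sampled uniformly from [lower, upper]
    during training; at test time the fixed mean slope is used (Xu et al., 2015)."""
    rng = rng or np.random.default_rng()
    if training:
        alpha = rng.uniform(lower, upper, size=np.shape(x))
    else:
        alpha = (lower + upper) / 2.0
    return np.where(x > 0, x, alpha * x)
```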

ELU

[Clevert et al., 2016]
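
A minimal sketch of the ELU; alpha = 1 is the common default, but the exact setting should be checked against the paper.

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0. Unlike ReLU it has
    negative outputs (pushing mean activations toward zero) and saturates
    smoothly to -alpha instead of cutting off hard at 0."""
    x = np.asarray(x, dtype=float)
    # np.minimum avoids evaluating exp of large positive values in the unused branch.
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))
```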

Maxout

[Goodfellow et al., 2013]
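
A small NumPy sketch of a single maxout layer; the shapes and the einsum layout are my illustration choices, not from the paper.

```python
import numpy as np

def maxout(x, W, b):
    """Maxout (Goodfellow et al., 2013): each output unit is the maximum over
    k affine pieces, max_j (W[j] @ x + b[j]). W has shape (k, out, in) and
    b has shape (k, out). ReLU is the special case of two pieces with one fixed at 0."""
    return np.max(np.einsum("koi,i->ko", W, x) + b, axis=0)

# Tiny usage example: 3 inputs, 2 output units, k = 4 pieces per unit.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2, 3))
b = rng.standard_normal((4, 2))
print(maxout(rng.standard_normal(3), W, b).shape)  # (2,)
```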

Comparison

[Clevert et al., 2016]

Comparison
›  “Empirical Evaluation of Rectified Activations in Convolutional Network” [Xu et al., 2015]
›  ReLU/LReLU/PReLU/RReLU
›  CXXNET
›  CIFAR-10/CIFAR-100
›  National Data Science Bowl (NDSB) Competition

Revise Saturated Activation Functions

Scaled Sigmoid

[Xu et al., 2016]
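
A hedged sketch of the scaled sigmoid; my recollection of [Xu et al., 2016] is the form 4·σ(x) − 2, which is zero-centered with unit slope at the origin, but the constants should be checked against the paper.

```python
import numpy as np

def scaled_sigmoid(x):
    # Assumed form 4 * sigmoid(x) - 2 (verify against Xu et al., 2016):
    # zero at x = 0, slope 1 at the origin, output range (-2, 2).
    return 4.0 / (1.0 + np.exp(-x)) - 2.0
```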

Penalized Tanh

[Xu et al., 2016]
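
A hedged sketch of the penalized tanh: tanh on the positive side and a "leaky" scaled tanh on the negative side; the slope value a = 0.25 is my assumption of the paper's typical setting.

```python
import numpy as np

def penalized_tanh(x, a=0.25):
    # Assumed form from Xu et al., 2016: tanh(x) for x > 0, a * tanh(x) for x <= 0,
    # with a in (0, 1); equivalently max(tanh(x), a * tanh(x)).
    t = np.tanh(x)
    return np.where(x > 0, t, a * t)
```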

Work cited
›  Xu, B., Wang, N., Chen, T., & Li, M. (2015). Empirical evaluation of rectified activations in convolutional network. arXiv preprint arXiv:1505.00853.
›  Clevert, D. A., Unterthiner, T., & Hochreiter, S. (2015). Fast and accurate deep network learning by exponential linear units (ELUs). arXiv preprint arXiv:1511.07289.
›  Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A. C., & Bengio, Y. (2013). Maxout networks. ICML (3), 28, 1319-1327.
›  He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1026-1034).
›  Xu, B., Huang, R., & Li, M. (2016). Revise saturated activation functions. arXiv preprint arXiv:1602.05980.

Thank you!
