relu: Rectified Linear Unit, y = max(x, 0)
sigmoid: y = 1 / (1 + exp(-x))
tanh: Hyperbolic tangent, y = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
softrelu: Soft ReLU, or SoftPlus, y = log(1 + exp(x))
softsign: y = x / (1 + abs(x))
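For reference, here is a minimal NumPy sketch of the five formulas above. It is only a plain-Python illustration of the math, not MXNet's own implementation:

import numpy as np

# Plain-NumPy versions of the five activation formulas (sketch only).
def relu(x):     return np.maximum(x, 0)
def sigmoid(x):  return 1.0 / (1.0 + np.exp(-x))
def tanh(x):     return np.tanh(x)            # same as (exp(x)-exp(-x))/(exp(x)+exp(-x))
def softrelu(x): return np.log1p(np.exp(x))   # log(1 + exp(x)), i.e. SoftPlus
def softsign(x): return x / (1.0 + np.abs(x))

x = np.array([-2.0, 0.0, 3.0])
print(relu(x), sigmoid(x), softsign(x))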
mxnet.ndarray.Activation(data=None, act_type=_Null, out=None, name=None, **kwargs)
data (NDArray) – The input array.
act_type ({'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required) – Activation function to be applied.
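The signature also lists an optional out parameter. A small sketch of how it might be used, assuming the usual MXNet convention that out receives the result in a preallocated NDArray (buf is a name chosen here for illustration):

import mxnet as mx

x = mx.nd.array([[-1.0, 0.0, 2.0]])
buf = mx.nd.empty(x.shape)                           # preallocated output buffer
mx.nd.Activation(data=x, act_type='tanh', out=buf)   # result is written into buf
print(buf)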
# -*- coding: utf-8 -*-
"""
Spyder Editor

This is a temporary script file.
"""
import mxnet as mx

x = mx.nd.array([[1.085, -2.75, -5.6, 9.9], [3.087, 5.32, 3.75, 11.865]])

y = mx.nd.Activation(x, act_type='relu')      # max(x, 0), element-wise
print(y)

y = mx.nd.Activation(x, act_type='softrelu')  # log(1 + exp(x)), element-wise
print(y)
[[ 1.085  0.     0.     9.9  ]
 [ 3.087  5.32   3.75  11.865]]
<NDArray 2x4 @cpu(0)>

[[1.3761026e+00 6.1967585e-02 3.6910437e-03 9.9000502e+00]
 [3.1316278e+00 5.3248811e+00 3.7732456e+00 1.1865006e+01]]
<NDArray 2x4 @cpu(0)>
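As a sanity check, the softrelu output above can be reproduced with NumPy, under the assumption (stated in the formula list) that softrelu is exactly log(1 + exp(x)):

import mxnet as mx
import numpy as np

x = mx.nd.array([[1.085, -2.75, -5.6, 9.9], [3.087, 5.32, 3.75, 11.865]])
mx_out = mx.nd.Activation(x, act_type='softrelu').asnumpy()
np_out = np.log1p(np.exp(x.asnumpy()))           # log(1 + exp(x)), the softrelu formula
print(np.allclose(mx_out, np_out, atol=1e-5))    # expected: True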
mxnet Activation (activation function)
Original article: http://blog.51cto.com/13959448/2316472