create_dl_layer_activation (Operator)
Name
create_dl_layer_activation — Create an activation layer.
Signature
void CreateDlLayerActivation(const HTuple& DLLayerInput, const HTuple& LayerName, const HTuple& ActivationType, const HTuple& GenParamName, const HTuple& GenParamValue, HTuple* DLLayerActivation)
HDlLayer HDlLayer::CreateDlLayerActivation(const HString& LayerName, const HString& ActivationType, const HTuple& GenParamName, const HTuple& GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerActivation(const HString& LayerName, const HString& ActivationType, const HString& GenParamName, const HString& GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerActivation(const char* LayerName, const char* ActivationType, const char* GenParamName, const char* GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerActivation(const wchar_t* LayerName, const wchar_t* ActivationType, const wchar_t* GenParamName, const wchar_t* GenParamValue) const   (Windows only)
static void HOperatorSet.CreateDlLayerActivation(HTuple DLLayerInput, HTuple layerName, HTuple activationType, HTuple genParamName, HTuple genParamValue, out HTuple DLLayerActivation)
HDlLayer HDlLayer.CreateDlLayerActivation(string layerName, string activationType, HTuple genParamName, HTuple genParamValue)
HDlLayer HDlLayer.CreateDlLayerActivation(string layerName, string activationType, string genParamName, string genParamValue)
Description
The operator create_dl_layer_activation creates an activation layer whose handle is returned in DLLayerActivation.
The parameter DLLayerInput determines the feeding input layer and expects the layer handle as value.
The parameter LayerName sets an individual layer name.
Note that if creating a model using create_dl_model, each layer of the created network must have a unique name.
The parameter ActivationType sets the type of the activation.
Every activation type defines a pointwise function.
Supported activation types are:
'abs':
Absolute value: f(x) = |x|.
'acos':
Arccosine activation: f(x) = acos(x).
'asin':
Arcsine activation: f(x) = asin(x).
'atan':
Arctangent activation: f(x) = atan(x).
'ceil':
Rounds the input up to the nearest integer: f(x) = ceil(x).
'celu':
Continuously differentiable exponential linear unit (CELU) activation, which is defined as follows:
f(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
Setting the generic parameter 'alpha' determines the value alpha (default: 1.0). For alpha = 1.0, a CELU activation is identical to an ELU.
'clip':
Clip the input to a given interval:
f(x) = min(max(x, min), max)
Setting the generic parameters 'min' and 'max' determines the values min and max, respectively.
'cos':
Cosine activation: f(x) = cos(x).
'cosh':
Hyperbolic cosine activation: f(x) = cosh(x).
'elu':
Exponential linear unit (ELU) activation, which is defined as follows:
f(x) = x for x > 0, f(x) = alpha * (exp(x) - 1) for x <= 0
Setting the generic parameter 'alpha' determines the value alpha (default: 1.0).
'erf':
Gauss error function, which is defined as follows:
f(x) = erf(x) = (2 / sqrt(pi)) * integral from 0 to x of exp(-t^2) dt
'exp':
Exponential function: f(x) = exp(x).
'floor':
Rounds the input down to the nearest integer: f(x) = floor(x).
'gelu':
Gaussian error linear unit (GELU) activation, which is defined as follows:
f(x) = x * Phi(x), where Phi is the cumulative distribution function of the standard normal distribution.
Setting the generic parameter 'approximate' to 'tanh' determines whether an approximate function estimation is used:
f(x) = 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
'hard_sigmoid':
HardSigmoid activation:
f(x) = max(0, min(1, alpha * x + beta))
Setting the generic parameters 'alpha' and 'beta' determines the values alpha (default: 0.2) and beta (default: 0.5).
'hard_swish':
HardSwish activation:
f(x) = x * max(0, min(1, alpha * x + beta))
with fixed alpha = 1/6 and beta = 0.5.
'log':
Natural logarithm: f(x) = ln(x).
'mish':
Mish activation:
f(x) = x * tanh(ln(1 + exp(x)))
'neg':
Negative of input: f(x) = -x.
'pow':
Power function: f(x) = x^exponent
Setting the generic parameter 'exponent' determines the value exponent.
'reciprocal':
Reciprocal function: f(x) = 1 / x.
'relu':
Rectified linear unit (ReLU) activation.
By setting a specific ReLU parameter, another type can be specified instead of the standard ReLU:
Standard ReLU, defined as follows:
f(x) = max(0, x)
Bounded ReLU, defined as follows:
f(x) = min(max(0, x), upper_bound)
Setting the generic parameter 'upper_bound' will result in a bounded ReLU and determines the value of upper_bound.
Leaky ReLU, defined as follows:
f(x) = x for x > 0, f(x) = alpha * x for x <= 0
Setting the generic parameter 'alpha' results in a leaky ReLU and determines the value alpha. When the generic parameter 'upper_bound' is set, 'alpha' will be ignored and the result will be a bounded ReLU.
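The precedence rules for the three ReLU variants can be sketched in plain Python; 'upper_bound' and 'alpha' mirror the generic parameters of the same names, and, as described above, a set upper bound takes precedence over alpha. This is an illustration, not the operator's implementation.

```python
def relu(x, alpha=None, upper_bound=None):
    if upper_bound is not None:   # bounded ReLU: min(max(0, x), upper_bound); 'alpha' is ignored
        return min(max(0.0, x), upper_bound)
    if alpha is not None:         # leaky ReLU: alpha * x for x <= 0
        return x if x > 0 else alpha * x
    return max(0.0, x)            # standard ReLU

print(relu(-2.0))                              # 0.0
print(relu(8.0, upper_bound=6.0))              # 6.0
print(relu(-2.0, alpha=0.1))                   # -0.2
print(relu(-2.0, alpha=0.1, upper_bound=6.0))  # 0.0 (alpha ignored)
```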
'round':
Rounds the input to the nearest integer.
Values with fractional part '.5' are rounded to the nearest even integer, e.g.:
f(0.5) = 0.0,
f(1.5) = 2.0.
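This round-half-to-even rule ("banker's rounding") is also what Python's built-in round() implements, which makes for a quick plain-Python illustration:

```python
# Round-half-to-even: .5 cases go to the nearest even integer.
values = [0.5, 1.5, 2.5, -0.5, -1.5]
rounded = [round(v) for v in values]
print(rounded)  # [0, 2, 2, 0, -2]
```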
'sigmoid':
Sigmoid activation, which is defined as follows:
f(x) = 1 / (1 + exp(-x))
'sin':
Sine activation: f(x) = sin(x).
'sinh':
Hyperbolic sine activation: f(x) = sinh(x).
'softplus':
Softplus activation function, which is defined as follows:
f(x) = ln(1 + exp(x))
'softsign':
Softsign activation function, which is defined as follows:
f(x) = x / (1 + |x|)
'sqrt':
Square root of the input: f(x) = sqrt(x).
'swish':
Swish activation function, which is defined as follows:
f(x) = x / (1 + exp(-alpha * x))
Setting the generic parameter 'alpha' determines the value alpha (default: 1.0).
'tan':
Tangent activation: f(x) = tan(x).
'tanh':
Tanh activation, which is defined as follows:
f(x) = tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
'thresholded_relu':
Thresholded ReLU, defined as follows:
f(x) = x for x > alpha, f(x) = 0 for x <= alpha
Setting the generic parameter 'alpha' determines the value alpha (default: 1.0).
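Several of the parameterized pointwise functions listed above can be sketched in plain Python; the names and default values follow the text, but these sketches are illustrations, not the operator's implementation.

```python
import math

def celu(x, alpha=1.0):
    # max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

def hard_sigmoid(x, alpha=0.2, beta=0.5):
    return max(0.0, min(1.0, alpha * x + beta))

def swish(x, alpha=1.0):
    return x / (1.0 + math.exp(-alpha * x))

def thresholded_relu(x, alpha=1.0):
    return x if x > alpha else 0.0

# With alpha = 1.0, CELU coincides with ELU: zero at zero,
# saturating towards -alpha for large negative inputs.
print(celu(0.0))              # 0.0
print(hard_sigmoid(10.0))     # 1.0 (clipped)
print(thresholded_relu(0.5))  # 0.0 (below the default threshold 1.0)
```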
The following generic parameters GenParamName and the corresponding values GenParamValue are supported:
'is_inference_output':
Determines whether apply_dl_model will include the output of this layer in the dictionary DLResultBatch even without specifying this layer in Outputs ('true') or not ('false').
Default: 'false'
'upper_bound' ('relu'):
Float value defining an upper bound for a rectified linear unit.
If the activation layer is part of a model which has been created using create_dl_model, the upper bound can be unset. To do so, use set_dl_model_layer_param and set an empty tuple for 'upper_bound'.
Default: []
'alpha' ('relu', 'elu', 'celu', 'thresholded_relu', 'swish', 'hard_sigmoid'):
Float value defining the alpha parameter of a leaky ReLU, thresholded ReLU, ELU, CELU, Swish, or HardSigmoid activation.
Restriction: The value of 'alpha' must be positive or zero for all activations except for ActivationType 'thresholded_relu'. This parameter is incompatible with and overridden by 'upper_bound' for ActivationType 'relu'.
Default: 0.2 for 'hard_sigmoid', else 1.0
'beta' ('hard_sigmoid'):
Float value defining the beta parameter of a HardSigmoid activation.
Default: 0.5
'min' ('clip'):
Float value defining the min parameter of a clip activation.
'max' ('clip'):
Float value defining the max parameter of a clip activation.
'approximate' ('gelu'):
Determines whether an approximate function estimation is used.
List of values: 'tanh', 'false'
Default: 'false'
'exponent' ('pow'):
Float value defining the exponent parameter of a pow activation.
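The activation-dependent default and restriction for 'alpha' described above can be made concrete with a small plain-Python helper; both function names are hypothetical illustrations, not part of the HALCON API.

```python
def default_alpha(activation_type):
    # Default documented above: 0.2 for 'hard_sigmoid', else 1.0.
    return 0.2 if activation_type == 'hard_sigmoid' else 1.0

def check_alpha(activation_type, alpha):
    # 'alpha' must be positive or zero for all activations
    # except 'thresholded_relu'.
    if activation_type != 'thresholded_relu' and alpha < 0:
        raise ValueError("'alpha' must be positive or zero")
    return alpha

print(default_alpha('hard_sigmoid'))  # 0.2
print(default_alpha('elu'))           # 1.0
```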
Certain parameters of layers created using the operator create_dl_layer_activation can be set and retrieved using further operators.
The following table gives an overview of which parameters can be set using set_dl_model_layer_param and which ones can be retrieved using get_dl_model_layer_param or get_dl_layer_param. Note that the operators set_dl_model_layer_param and get_dl_model_layer_param require a model created by create_dl_model.
Generic Layer Parameters   set  get
'is_inference_output'       x    x
'num_trainable_params'           x
'alpha'                     x    x
'beta'                      x    x
'upper_bound'               x    x
'min'                       x    x
'max'                       x    x
'approximate'               x    x
'exponent'                  x    x
Execution Information
Multithreading type: reentrant (runs in parallel with non-exclusive operators).
Multithreading scope: global (may be called from any thread).
Processed without parallelization.
Parameters
DLLayerInput (input_control) dl_layer → HDlLayer, HTuple (handle)
Feeding layer.
LayerName (input_control) string → HTuple (string)
Name of the output layer.
ActivationType (input_control) string → HTuple (string)
Activation type.
Default: 'relu'
List of values: 'abs', 'acos', 'asin', 'atan', 'ceil', 'celu', 'clip', 'cos', 'cosh', 'elu', 'erf', 'exp', 'floor', 'gelu', 'hard_sigmoid', 'hard_swish', 'log', 'mish', 'neg', 'pow', 'reciprocal', 'relu', 'round', 'sigmoid', 'sin', 'sinh', 'softplus', 'softsign', 'sqrt', 'swish', 'tan', 'tanh', 'thresholded_relu'
GenParamName (input_control) attribute.name(-array) → HTuple (string)
Generic input parameter names.
Default: []
List of values: 'alpha', 'approximate', 'beta', 'exponent', 'is_inference_output', 'max', 'min', 'upper_bound'
GenParamValue (input_control) attribute.value(-array) → HTuple (string / integer / real)
Generic input parameter values.
Default: []
Suggested values: 'true', 'false', 'tanh'
DLLayerActivation (output_control) dl_layer → HDlLayer, HTuple (handle)
Activation layer.
Module
Deep Learning Professional