ReLU() returns an error, but Activation('relu') works fine in the TensorFlow Functional API



According to the documentation, Activation('relu') and ReLU() should produce similar results, apart from the additional arguments that ReLU() accepts.

However,

X = Activation('relu')(X)

works fine. But

X = ReLU()(X)

gives the following error:

NameError: name 'ReLU' is not defined

Why does this happen? Shouldn't ReLU() work with the Functional API?
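For reference, a trimmed-down reproduction (the surrounding model is larger; the input shape and layer sizes here are placeholders):

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Activation

inputs = Input(shape=(4,))
X = Dense(8)(inputs)
X = Activation('relu')(X)  # works
X = ReLU()(X)              # NameError: name 'ReLU' is not defined
model = tf.keras.Model(inputs, X)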

You need to look at what Activation and ReLU each do.

They do not always return the same values. ReLU is the rectified linear unit as a standalone layer, while activation='relu' (or Activation('relu')) applies the relu function to the output of the target layer, so that layer's own transformation runs first. As for the NameError itself, it simply means the name ReLU is not in scope: import it with from tensorflow.keras.layers import ReLU and it works in the Functional API like any other layer.

[Sample]:

import tensorflow as tf

# ReLU as a standalone layer: applies max(x, 0) element-wise.
layer = tf.keras.layers.ReLU()
output = layer([-3.0, -1.0, 0.0, 2.0])
print(output.numpy())
print("================")

# relu as a layer activation: Dense computes w*x + b first,
# then passes the result through relu.
x = tf.constant([-3.0, -1.0, 0.0, 2.0], shape=(4, 1))
print(tf.keras.layers.Dense(1, activation='relu')(x).numpy())

[Output]:

F:tempPython>python test_tf_ReLU.py
2022-05-10 12:38:02.190099: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-05-10 12:38:02.770833: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1525] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 4634 MB memory:  -> device: 0, name: NVIDIA GeForce GTX 1060 6GB, pci bus id: 0000:01:00.0, compute capability: 6.1
[0. 0. 0. 2.]
================
(None, 4, 1)
[[[0.       ]
  [0.       ]
  [0.       ]
  [2.0980666]]]
F:tempPython>
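On the "additional arguments" mentioned above: the ReLU layer exposes parameters that the plain 'relu' activation string does not. A small sketch, assuming TensorFlow 2.x (the values are arbitrary):

import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])

# Extra arguments available on the ReLU layer but not on Activation('relu'):
leaky = tf.keras.layers.ReLU(negative_slope=0.1)  # scale negatives instead of zeroing them
capped = tf.keras.layers.ReLU(max_value=3.0)      # clip the output at 3.0

print(leaky(x).numpy())   # [-0.3 -0.1  0.   2.   5. ]
print(capped(x).numpy())  # [0. 0. 0. 2. 3.]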
