tf.hessians() - ValueError: None values not supported



Here is the traceback:

Traceback (most recent call last):
  File "test.py", line 39, in <module>
    hess = tf.hessians(loss, wrt_variables)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/gradients_impl.py", line 970, in hessians
    _gradients = array_ops.unstack(_gradients)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/array_ops.py", line 952, in unstack
    value = ops.convert_to_tensor(value)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 639, in convert_to_tensor
    as_ref=False)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 704, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/constant_op.py", line 113, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/constant_op.py", line 102, in constant
    tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/tensor_util.py", line 360, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.

The variables:

import tensorflow as tf
data_x = [0., 1., 2.]
data_y = [-1., 1., 3.]
batch_size = len(data_x)
x = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="x")
y = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="y")
W = tf.Variable(tf.ones(shape=[1]), dtype=tf.float32, name="W")
b = tf.Variable(tf.zeros(shape=[1]), dtype=tf.float32, name="b")
pred = x * W + b
loss = tf.reduce_mean(0.5 * (y - pred)**2)

Then, following up with this code will work:

wrt_variables = [W, b]
hess = tf.hessians(loss, wrt_variables)
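
For reference, evaluating this working case (a minimal sketch, assuming the setup above) yields one Hessian block per variable, each of shape (1, 1):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # One (1, 1) Hessian per entry of wrt_variables, i.e. [d2L/dW2, d2L/db2]
    print(sess.run(hess, feed_dict={x: data_x, y: data_y}))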

But this fails:

wrt_variables = tf.concat([W, b], axis=0)
hess = tf.hessians(loss, wrt_variables)

This will also fail:

wrt_variables = [tf.concat([W, b], axis=0)]
hess = tf.hessians(loss, wrt_variables)

It also fails for a reshape operation.
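
For illustration, a hypothetical version of such a reshape variant (not necessarily the exact code used) raises the same error:

wrt_variables = tf.reshape(tf.concat([W, b], axis=0), [2])
hess = tf.hessians(loss, wrt_variables)  # ValueError: None values not supported.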

A complete, commented version of this code can be viewed here: https://gist.github.com/guillaume-chevalier/6b01c4e43a123abf8db69fa97532993f

Thanks!

This is because in your graph, the node loss does not depend on the node tf.concat([W, b], axis=0). There is no backpropagation of one onto the other, and therefore no derivative.

TensorFlow is not a formal calculus engine; it can only estimate the derivative of one node with respect to another node if the former is downstream of the latter. So, for example, even

tf.hessians(loss, 2*W)

will fail for the same reason (2*W is a new node and loss does not depend on it), even though the relationship to tf.hessians(loss, W) is straightforward.

Note that the situation is the same with tf.gradients, even though it fails differently: it returns Nones instead of raising an exception.
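
A quick way to see this (a sketch, assuming the graph defined in the question): ask tf.gradients for the derivative directly and compare the combined tensor with the original variables.

combined = tf.concat([W, b], axis=0)
print(tf.gradients(loss, combined))   # [None]: no path from combined back to loss
print(tf.gradients(loss, [W, b]))     # two real gradient tensors

One common workaround (a sketch under the assumption that packing the parameters into a single variable is acceptable; this is not the question's original code) is to build the model from one flat variable and slice it, so that loss is downstream of the tensor you differentiate:

import tensorflow as tf

data_x = [0., 1., 2.]
data_y = [-1., 1., 3.]
batch_size = len(data_x)

x = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="x")
y = tf.placeholder(shape=[batch_size], dtype=tf.float32, name="y")

# Single flat parameter vector: params[0] plays the role of W, params[1] of b.
params = tf.Variable(tf.concat([tf.ones([1]), tf.zeros([1])], axis=0), name="params")
W = params[0]
b = params[1]

pred = x * W + b
loss = tf.reduce_mean(0.5 * (y - pred) ** 2)

# loss is now downstream of params, so the full 2x2 Hessian exists.
hess = tf.hessians(loss, params)[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(hess, feed_dict={x: data_x, y: data_y}))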
