The following code produces an "UnreadVariable" error in TensorFlow:


# importing libraries
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.losses import MSE
# creating keras model
model = tf.keras.Sequential()
model.add(Dense(8, input_shape=(1,), activation='tanh'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1))
# defining optimization parameters
opt=Adam(learning_rate = 1e-3)
# creating input and output variables for training
X = np.arange(1.,10.,1).reshape(-1,1)
Y = np.arange(2.,20.,2).reshape(-1,1)
# defining loss - only one iteration is performed
with tf.GradientTape() as tape:
    pred = model(X)
    loss = MSE(pred, Y)
# calculating gradients for loss w.r.t model parameters 
grads = tape.gradient(loss, model.trainable_variables)
# updating model parameters with above calculated gradients
opt.apply_gradients(zip(grads, model.trainable_weights))

I get the following error:

<tf.Variable 'UnreadVariable' shape=() dtype=int64, numpy=1>

I have tried tf.compat.v1.disable_eager_execution() to disable eager execution, but then I can no longer extract the value of any tensor. Moreover, I don't know whether disabling eager execution actually fixes the problem, because I cannot print the gradients or the loss.

If you print the trainable weights before and after applying the gradients, you will clearly see that they have been updated, even if each one was only nudged slightly one way or the other:

grads = tape.gradient(loss, model.trainable_variables)
tf.print(model.trainable_variables)
opt.apply_gradients(zip(grads, model.trainable_weights))
tf.print(model.trainable_variables)
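The "UnreadVariable" text is not an exception at all: in eager mode, `Optimizer.apply_gradients` returns the optimizer's step counter after incrementing it, and when that return value is the last expression in a notebook cell, its repr (`<tf.Variable 'UnreadVariable' ...>`) gets printed. A minimal, self-contained sketch of the same training step, assuming TF 2.x with eager execution (using `mean_squared_error`, which is what the `MSE` alias above points to):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense
from tensorflow.keras.losses import mean_squared_error
from tensorflow.keras.optimizers import Adam

# Tiny model, built lazily on the first call
model = tf.keras.Sequential([Dense(1)])
opt = Adam(learning_rate=1e-3)

X = np.arange(1., 10., 1.).reshape(-1, 1)
Y = np.arange(2., 20., 2.).reshape(-1, 1)

with tf.GradientTape() as tape:
    pred = model(X)
    loss = mean_squared_error(Y, pred)

grads = tape.gradient(loss, model.trainable_variables)

# The return value is the optimizer's incremented step counter,
# not an error; assigning it (or ignoring it) silences the output.
result = opt.apply_gradients(zip(grads, model.trainable_variables))
print(int(opt.iterations))  # step counter after one update
```

Assigning the result to a variable (or simply not making `apply_gradients` the last expression in the cell) makes the "UnreadVariable" output disappear, while the weight update still happens exactly as before.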

Latest update