"ValueError: Graph disconnected" error in Keras/TensorFlow



I am running into an error in TensorFlow 2. How can I fix it?

Here is my code (assume all relevant Keras modules/objects have already been imported):

dense1 = 2**7
dense2 = 2**8
dense3 = 2**9
dropout = 0.8
price_loss = 1
cut_loss = 1
activation= LeakyReLU()
#====================================================================
# INPUTS
#====================================================================

#----------------------------------------------------------------
carat = Input(
    shape=(1,),
    batch_size=batch_size,
    name='carat'
)
#----------------------------------------------------------------
color = Input(
    shape=(1,),
    batch_size=batch_size,
    name='color'
)
#----------------------------------------------------------------
clarity = Input(
    shape=(1,),
    batch_size=batch_size,
    name='clarity'
)
#----------------------------------------------------------------
depth = Input(
    shape=(1,),
    batch_size=batch_size,
    name='depth'
)
#----------------------------------------------------------------
table = Input(
    shape=(1,),
    batch_size=batch_size,
    name='table'
)
#----------------------------------------------------------------
x = Input(
    shape=(1,),
    batch_size=batch_size,
    name='x'
)
#----------------------------------------------------------------
y = Input(
    shape=(1,),
    batch_size=batch_size,
    name='y'
)
#----------------------------------------------------------------
z = Input(
    shape=(1,),
    batch_size=batch_size,
    name='z'
)
#----------------------------------------------------------------
#====================================================================
# CREATE EMBEDDINGS FOR CATEGORICAL FEATURES "COLOR" AND "CLARITY"
#====================================================================

color = Embedding(input_dim = 7, output_dim = 1, name = 'color_emb')(color)
clarity = Embedding(input_dim = 8, output_dim = 1, name = 'clarity_emb')(clarity)
color = Flatten()(color)
clarity = Flatten()(clarity)

#====================================================================
# CONCATENATE FEATURES
#====================================================================

x = Concatenate()([color, clarity, carat, depth, table, x, y, z])

#====================================================================
# DENSE NETWORK
#====================================================================

x = Dense(dense1, activation = activation)(x)
x = BatchNormalization()(x)
x = Dense(dense2, activation = activation)(x)
x = BatchNormalization()(x)
x = Dense(dense3, activation = activation)(x)
x = BatchNormalization()(x)
x = Dropout(dropout)(x)
#====================================================================
# PREDICTIONS
# ====================================================================
cut = Dense(1, activation = 'sigmoid')(x)
price = Dense(1)(x)
#====================================================================
# DEFINE THE MODEL
# ====================================================================
model = Model(inputs = [carat, color, clarity, depth, table, x, y, z] , outputs = [cut , price])
#====================================================================
# COMPILE THE MODEL
# ====================================================================
model.compile(
    optimizer='Adam',
    loss={
        "price": "huber_loss",
        "cut": "binary_crossentropy",
    },
    loss_weights=[price_loss, cut_loss],
    metrics={
        "price": ["mean_absolute_percentage_error"],
        "cut": [tf.keras.metrics.AUC(), tf.keras.metrics.Precision(thresholds=thresholds)],
    }
)

Stack trace:

WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer flatten_8.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: flatten_8/Reshape:0
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer flatten_9.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: flatten_9/Reshape:0
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer dropout_2.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: dropout_2/cond/Identity:0
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-64-132a2d8458b9> in <module>
135 # ====================================================================
136 
--> 137 model = Model(inputs = [carat, color, clarity, depth, table, x, y, z] , outputs = [cut , price])
138 
139 #====================================================================
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\training.py in __new__(cls, *args, **kwargs)
240       # Functional model
241       from tensorflow.python.keras.engine import functional  # pylint: disable=g-import-not-at-top
--> 242       return functional.Functional(*args, **kwargs)
243     else:
244       return super(Model, cls).__new__(cls, *args, **kwargs)
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
455     self._self_setattr_tracking = False  # pylint: disable=protected-access
456     try:
--> 457       result = method(self, *args, **kwargs)
458     finally:
459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in __init__(self, inputs, outputs, name, trainable)
113     #     'arguments during initialization. Got an unexpected argument:')
114     super(Functional, self).__init__(name=name, trainable=trainable)
--> 115     self._init_graph_network(inputs, outputs)
116 
117   @trackable.no_automatic_dependency_tracking
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
455     self._self_setattr_tracking = False  # pylint: disable=protected-access
456     try:
--> 457       result = method(self, *args, **kwargs)
458     finally:
459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in _init_graph_network(self, inputs, outputs)
189     # Keep track of the network's nodes and layers.
190     nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 191         self.inputs, self.outputs)
192     self._network_nodes = nodes
193     self._nodes_by_depth = nodes_by_depth
~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in _map_graph_network(inputs, outputs)
929                              'The following previous layers '
930                              'were accessed without issue: ' +
--> 931                              str(layers_with_complete_input))
932         for x in nest.flatten(node.outputs):
933           computable_tensors.add(id(x))
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("clarity_8:0", shape=(20, 1), dtype=float32) at layer "clarity_emb". The following previous layers were accessed without issue: []

Be careful not to overwrite your input variables. You overwrite the `color`, `clarity`, and `x` inputs inside the network.

Here is a possible solution:

dense1 = 2**7
dense2 = 2**8
dense3 = 2**9
dropout = 0.8
price_loss = 1
cut_loss = 1
activation= LeakyReLU()
batch_size = 32
#====================================================================
# INPUTS
#====================================================================
carat = Input(shape= (1,), batch_size= batch_size, name= 'carat')
Color = Input(shape= (1,), batch_size= batch_size, name= 'color')
Clarity = Input(shape= (1,), batch_size= batch_size, name= 'clarity')
depth = Input(shape= (1,), batch_size= batch_size, name= 'depth')
table = Input(shape= (1,), batch_size= batch_size, name= 'table')
X = Input(shape= (1,), batch_size= batch_size, name= 'x')
y = Input(shape= (1,), batch_size= batch_size, name= 'y')
z = Input(shape= (1,), batch_size= batch_size, name= 'z')
#====================================================================
# CREATE EMBEDDINGS FOR CATEGORICAL FEATURES "COLOR" AND "CLARITY"
#====================================================================
color = Embedding(input_dim = 7, output_dim = 1, name = 'color_emb')(Color)
clarity = Embedding(input_dim = 8, output_dim = 1, name = 'clarity_emb')(Clarity)
color = Flatten()(color)
clarity = Flatten()(clarity)
#====================================================================
# CONCATENATE FEATURES
#====================================================================
x = Concatenate()([color, clarity, carat, depth, table, X, y, z])
#====================================================================
# DENSE NETWORK
#====================================================================
x = Dense(dense1, activation = activation)(x)
x = BatchNormalization()(x)
x = Dense(dense2, activation = activation)(x)
x = BatchNormalization()(x)
x = Dense(dense3, activation = activation)(x)
x = BatchNormalization()(x)
x = Dropout(dropout)(x)
#====================================================================
# PREDICTIONS
# ====================================================================
cut = Dense(1, activation = 'sigmoid')(x)
price = Dense(1)(x)
#====================================================================
# DEFINE THE MODEL
# ====================================================================
model = Model(inputs = [carat, Color, Clarity, depth, table, X, y, z],
              outputs = [cut, price])
model.compile('adam', 'mse')
model.summary()
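To sanity-check the fixed wiring end to end, here is a condensed, self-contained sketch of the same pattern (a hypothetical two-feature subset with synthetic data, not the asker's actual dataset) that builds, compiles, and trains the model, feeding arrays keyed by the `Input` names so Keras routes each array to the right input:

```python
import numpy as np
from tensorflow.keras.layers import Input, Embedding, Flatten, Dense, Concatenate
from tensorflow.keras.models import Model

batch_size = 32  # assumed, matching the batch_size in the answer above

# Condensed version of the model: one categorical feature (color)
# and one numeric feature (carat) instead of all eight.
# The Input handles keep their own names and are never overwritten.
Color = Input(shape=(1,), name='color')
carat = Input(shape=(1,), name='carat')

color = Flatten()(Embedding(input_dim=7, output_dim=1, name='color_emb')(Color))
h = Concatenate()([color, carat])
h = Dense(16, activation='relu')(h)
cut = Dense(1, activation='sigmoid', name='cut')(h)
price = Dense(1, name='price')(h)

model = Model(inputs=[Color, carat], outputs=[cut, price])
model.compile(optimizer='adam',
              loss={'cut': 'binary_crossentropy', 'price': 'mse'})

# Synthetic data: the dict keys match the Input names.
rng = np.random.default_rng(0)
data = {'color': rng.integers(0, 7, size=(batch_size, 1)).astype('float32'),
        'carat': rng.random((batch_size, 1), dtype=np.float32)}
targets = {'cut': rng.integers(0, 2, size=(batch_size, 1)).astype('float32'),
           'price': rng.random((batch_size, 1), dtype=np.float32)}
history = model.fit(data, targets, batch_size=batch_size, epochs=1, verbose=0)
```

Feeding dicts keyed by input/output names (rather than positional lists) is less error-prone with many named inputs, since order no longer matters.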

And here is a runnable notebook: https://colab.research.google.com/drive/1akpvuGKHXt6Frrec692zvCMAfRzZbIaM?usp=sharing

Well, as is often the case with errors like this, the answer is right there in the error message. Note that although you were able to build the model structure while overwriting the variables `color`, `clarity`, and so on, those variables changed type along the way. You initialized them as `tf.keras.Input`, then overwrote them with the output of an `Embedding` and a `Flatten` layer. So when it came time to build the model, you were asking it to use inputs that were no longer `tf.keras.Input` tensors but something else.

The solution is simply not to overwrite those variables.
