Can a principal component analysis (PCA) network replace the pooling layers of a CNN?



Is it possible to replace the pooling layers in a CNN with a principal component analysis (PCA) network? Please explain in detail.

# Code from the question (imports added for completeness):
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Conv2D
from sklearn.decomposition import PCA

input_shape = keras.Input(shape=(224, 224, 1))
tower_1 = Conv2D(16, (3, 3), padding='same', activation='relu')(input_shape)
# Flatten the spatial dimensions and transpose to (channels, pixels)
reshape_tower1 = tf.reshape(tower_1, [224 * 224, 16])
Trans_tower1 = tf.transpose(reshape_tower1)
# Fitting sklearn's PCA on the symbolic Keras tensor raises the error below
pca_tower1 = PCA(n_components=10)
pca_tower1.fit(Trans_tower1)
result = pca_tower1.transform(Trans_tower1)

Error:

You are passing KerasTensor(type_spec=TensorSpec(shape=(16, 50176), 
dtype=tf.float32, name=None), name='tf.compat.v1.transpose_1/transpose:0', 
description="created by layer 'tf.compat.v1.transpose_1'"), an intermediate 
Keras symbolic input/output, to a TF API that does not allow registering 
custom dispatchers, such as `tf.cond`, `tf.function`, gradient tapes, or 
`tf.map_fn`. Keras Functional model construction only supports TF API calls 
that *do* support dispatching, such as `tf.math.add` or `tf.reshape`. Other 
APIs cannot be called directly on symbolic Keras inputs/outputs. You can work 
around this limitation by putting the operation in a custom Keras layer `call` 
and calling that layer on this symbolic input/output.
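
For reference, the workaround the message points to is moving the tensor manipulation into a custom Keras layer's call. Below is a minimal sketch (not from the original post) of that idea for the reshape/transpose part only:

import tensorflow as tf
from tensorflow import keras

class ReshapeTranspose(keras.layers.Layer):
    # Flatten the spatial dimensions and transpose to (channels, pixels) inside call.
    def call(self, inputs):
        channels = tf.shape(inputs)[-1]
        flat = tf.reshape(inputs, [-1, channels])  # (batch*H*W, C)
        return tf.transpose(flat)                  # (C, batch*H*W)

inputs = keras.Input(shape=(224, 224, 1))
tower_1 = keras.layers.Conv2D(16, (3, 3), padding='same', activation='relu')(inputs)
trans_tower1 = ReshapeTranspose()(tower_1)

Note that this only covers the TF ops; it does not make sklearn's PCA callable on the symbolic output, which is the point of the answer below.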

You seem to be using sklearn's PCA, which only works on numpy arrays. You could try converting your tensor to a numpy array before the PCA, but then you lose the gradients, so you will not be able to train the parameters of the first Conv layer. By the way, even if you found a PCA that accepts tf tensors, you would face a bigger problem: PCA is not differentiable, so you would not be able to train the parameters anyway.
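
If you do go the numpy route mentioned above, a minimal sketch (assumed setup, not from the original post) is to run the Conv layer as a frozen feature extractor in eager mode and fit sklearn's PCA on the resulting arrays:

import numpy as np
from tensorflow import keras
from sklearn.decomposition import PCA

# Feature extractor up to the Conv layer from the question.
inputs = keras.Input(shape=(224, 224, 1))
features = keras.layers.Conv2D(16, (3, 3), padding='same', activation='relu')(inputs)
extractor = keras.Model(inputs, features)

# Hypothetical batch of images standing in for real data.
images = np.random.rand(4, 224, 224, 1).astype('float32')

# Calling .numpy() leaves the TF graph, so no gradients flow back to the Conv2D.
feature_maps = extractor(images).numpy()   # (4, 224, 224, 16)
flat = feature_maps.reshape(-1, 16).T      # (16, 4*224*224), mirroring the question
pca = PCA(n_components=10)
reduced = pca.fit_transform(flat)          # (16, 10)

This treats PCA as a fixed, offline step on frozen features: you keep the PCA, but you give up end-to-end training of the Conv weights, which is exactly the trade-off described above.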
