numpy.dot - shape error - neural network



I am trying to multiply the matrices A1 and W2 (Z2 = W2.dot(A1)):

A1 : [[0.42940542]
[0.55013895]]
W2 : [[-0.4734037  -0.39642393 -0.05440914 -0.24011293 -0.03670913 -0.37523234]
[-0.45501004  0.23881832  0.21831658  0.32237388  0.25674681  0.27956714]]

But I get this error: shapes (2,6) and (2,1) not aligned: 6 (dim 1) != 2 (dim 0). Why? Isn't multiplying a (2,6) matrix by a (2,1) matrix perfectly normal?

This is because I have a hidden layer with 2 nodes and an output layer with 6 nodes.

This is mathematically impossible, because you are multiplying a (2,6) matrix by a (2,1) one: the inner dimensions (6 and 2) do not match. All you need to do is transpose W2.

P.S.: Note that, as in linear algebra, np.dot(W2.T, A1) is not the same as np.dot(A1.T, W2).

import numpy as np
A1 = np.asarray([[0.42940542], [0.55013895]])
W2 = np.asarray([[
-0.4734037, -0.39642393, -0.05440914, -0.24011293, -0.03670913, -0.37523234
], [-0.45501004, 0.23881832, 0.21831658, 0.32237388, 0.25674681, 0.27956714]])
print(W2.shape, A1.shape)  # (2, 6), (2, 1)
Z2 = W2.T @ A1
print(Z2)

The result is:

[[-0.45360086]
 [-0.03884332]
 [ 0.09674087]
 [ 0.07424463]
 [ 0.12548332]
 [-0.00732603]]
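To illustrate the P.S. above, here is a small sketch (reusing the values from the question) showing that the two orderings produce results of different shapes, and that one is the transpose of the other:

```python
import numpy as np

A1 = np.asarray([[0.42940542], [0.55013895]])            # shape (2, 1)
W2 = np.asarray([[-0.4734037, -0.39642393, -0.05440914,
                  -0.24011293, -0.03670913, -0.37523234],
                 [-0.45501004, 0.23881832, 0.21831658,
                  0.32237388, 0.25674681, 0.27956714]])  # shape (2, 6)

a = np.dot(W2.T, A1)   # (6, 2) @ (2, 1) -> (6, 1), a column vector
b = np.dot(A1.T, W2)   # (1, 2) @ (2, 6) -> (1, 6), a row vector

print(a.shape, b.shape)     # (6, 1) (1, 6)
print(np.allclose(a, b.T))  # True: the two results are transposes of each other
```

So the entries are the same, but the shapes differ, and in a network forward pass the shape matters for the next layer's multiplication.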
