I am trying to compute the Kullback-Leibler divergence from Gaussian #1 to Gaussian #2. I have the mean and standard deviation of both Gaussians. I tried this code from http://www.cs.cmu.edu/~chanwook/MySoftware/rm1_Spk-by-PK_MLLR/rm1_PNCC_MLLR_1/rm1/python/sphinx/diversience.py
import numpy

def gau_kl(pm, pv, qm, qv):
    """
    Kullback-Leibler divergence from Gaussian pm,pv to Gaussian qm,qv.
    Also computes KL divergence from a single Gaussian pm,pv to a set
    of Gaussians qm,qv.
    Diagonal covariances are assumed.  Divergence is expressed in nats.
    """
    if (len(qm.shape) == 2):
        axis = 1
    else:
        axis = 0
    # Determinants of diagonal covariances pv, qv
    dpv = pv.prod()
    dqv = qv.prod(axis)
    # Inverse of diagonal covariance qv
    iqv = 1./qv
    # Difference between means pm, qm
    diff = qm - pm
    return (0.5 *
            (numpy.log(dqv / dpv)            # log |Sigma_q| / |Sigma_p|
             + (iqv * pv).sum(axis)          # + tr(Sigma_q^{-1} * Sigma_p)
             + (diff * iqv * diff).sum(axis) # + (mu_q-mu_p)^T Sigma_q^{-1} (mu_q-mu_p)
             - len(pm)))                     # - N
I use the means and standard deviations as inputs, but the last line of the code, len(pm), raises an error because my mean is a single number, and I don't understand what the len function is doing here.
Note: the two sets (i.e. the Gaussians) are not equal, which is why I can't use scipy.stats.entropy.
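The failure is easy to reproduce in isolation. A minimal sketch (with hypothetical values) showing that len is undefined on a Python scalar but works on a length-1 numpy array:

```python
import numpy as np

pm = 0.0          # mean passed as a plain float, as in the question
try:
    len(pm)       # raises TypeError: object of type 'float' has no len()
except TypeError as e:
    print("error:", e)

# Wrapping the parameter in a length-1 array avoids the error;
# len(pm) is then the dimensionality of the Gaussian (1 here).
pm = np.array([0.0])
print(len(pm))    # 1
```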
The following function computes the KL divergence between any two multivariate normal distributions (the covariance matrices are not required to be diagonal); numpy is imported as np:
import numpy as np

def kl_mvn(m0, S0, m1, S1):
    """
    Kullback-Leibler divergence from Gaussian (m0, S0) to Gaussian (m1, S1),
    for full (not necessarily diagonal) covariance matrices.
    From Wikipedia:
    KL( (m0, S0) || (m1, S1))
         = .5 * ( tr(S1^{-1} S0) + log |S1|/|S0| +
                  (m1 - m0)^T S1^{-1} (m1 - m0) - N )
    """
    # store the inverse of S1 and the difference between the means
    N = m0.shape[0]
    iS1 = np.linalg.inv(S1)
    diff = m1 - m0
    # the KL divergence is made of three terms
    tr_term   = np.trace(iS1 @ S0)
    det_term  = np.log(np.linalg.det(S1) / np.linalg.det(S0))
    quad_term = diff.T @ iS1 @ diff   # reuse iS1 instead of inverting S1 again
    return .5 * (tr_term + det_term + quad_term - N)
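A quick sanity check (hypothetical numbers; the function is repeated so the snippet runs standalone): the divergence from a Gaussian to itself should be zero, and for 1x1 covariance matrices the result should match the univariate closed form log(s1/s0) + (s0^2 + (m0 - m1)^2) / (2 s1^2) - 1/2:

```python
import numpy as np

def kl_mvn(m0, S0, m1, S1):
    # KL( N(m0, S0) || N(m1, S1) ) for full covariance matrices
    N = m0.shape[0]
    iS1 = np.linalg.inv(S1)
    diff = m1 - m0
    tr_term   = np.trace(iS1 @ S0)
    det_term  = np.log(np.linalg.det(S1) / np.linalg.det(S0))
    quad_term = diff.T @ iS1 @ diff
    return .5 * (tr_term + det_term + quad_term - N)

# KL from a Gaussian to itself is zero
m = np.array([1.0, -2.0])
S = np.array([[2.0, 0.3], [0.3, 1.0]])
print(kl_mvn(m, S, m, S))   # ~0.0

# Univariate case as 1x1 matrices vs. the closed form
mu0, s0, mu1, s1 = 0.0, 1.0, 1.0, 2.0
kl = kl_mvn(np.array([mu0]), np.array([[s0**2]]),
            np.array([mu1]), np.array([[s1**2]]))
closed = np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5
print(kl, closed)           # both ~0.4431
```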
In case you are still interested...
The function expects the diagonal entries of the covariance matrices of the multivariate Gaussians, not the standard deviations you mention. If your inputs are univariate Gaussians, then pv and qv are each length-1 vectors holding the variance of the corresponding Gaussian.
Also, len(pm) corresponds to the dimensionality of the mean vector. It is indeed the k in the multivariate normal distribution section. For a univariate Gaussian k is 1, for a bivariate Gaussian k is 2, and so on.
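Putting both points together, a sketch (hypothetical numbers; gau_kl repeated so the snippet runs standalone) of calling gau_kl for two univariate Gaussians: wrap each mean in a length-1 array, and pass the variance (the standard deviation squared), not the standard deviation itself:

```python
import numpy

def gau_kl(pm, pv, qm, qv):
    # KL divergence between diagonal-covariance Gaussians (from the question)
    axis = 1 if len(qm.shape) == 2 else 0
    dpv = pv.prod()              # determinant of diagonal covariance pv
    dqv = qv.prod(axis)          # determinant of diagonal covariance qv
    iqv = 1. / qv                # inverse of diagonal covariance qv
    diff = qm - pm               # difference between the means
    return 0.5 * (numpy.log(dqv / dpv) + (iqv * pv).sum(axis)
                  + (diff * iqv * diff).sum(axis) - len(pm))

mu_p, std_p = 0.0, 1.0
mu_q, std_q = 1.0, 2.0
pm, pv = numpy.array([mu_p]), numpy.array([std_p**2])  # variance, not std
qm, qv = numpy.array([mu_q]), numpy.array([std_q**2])
print(gau_kl(pm, pv, qm, qv))   # ~0.4431, matching the univariate closed form
```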