In a seq2seq model with an encoder and a decoder, at each generation step a softmax layer outputs a distribution over the entire vocabulary. In CNTK, a greedy decoder can be implemented easily using the C.hardmax function; it looks like this:
def create_model_greedy(s2smodel):
    # model used in (greedy) decoding (history is decoder's own output)
    @C.Function
    @C.layers.Signature(InputSequence[C.layers.Tensor[input_vocab_dim]])
    def model_greedy(input):  # (input*) --> (word_sequence*)
        # Decoding is an unfold() operation starting from sentence_start.
        # We must transform s2smodel (history*, input* -> word_logp*) into a
        # generator (history* -> output*) which holds 'input' in its closure.
        unfold = C.layers.UnfoldFrom(lambda history: s2smodel(history, input) >> C.hardmax,
                                     # stop once sentence_end_index was max-scoring output
                                     until_predicate=lambda w: w[..., sentence_end_index],
                                     length_increase=length_increase)
        return unfold(initial_state=sentence_start, dynamic_axes_like=input)
    return model_greedy
However, I don't want to output the maximum-probability token at each step. Instead, I want a stochastic decoder that samples each token according to the probability distribution over the vocabulary.
How can I do this? Any help is appreciated. Thanks.
You can add noise to the output before taking the hardmax. In particular, you can use C.random.gumbel or C.random.gumbel_like to sample proportionally to exp(output). This is known as the Gumbel-max trick. The cntk.random module contains other distributions as well, but if you have log-probabilities you most likely want to add Gumbel noise before the hardmax. Some code:
@C.Function
def randomized_hardmax(x):
    noisy_x = x + C.random.gumbel_like(x)
    return C.hardmax(noisy_x)
Then replace your hardmax with randomized_hardmax.
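As a sanity check (not part of the original answer), here is a minimal NumPy sketch of the Gumbel-max trick itself: taking the argmax of logits plus standard Gumbel noise selects index i with probability proportional to exp(logits[i]), i.e. exactly the softmax distribution.

import numpy as np

# Minimal sketch of the Gumbel-max trick, independent of CNTK:
# argmax(logits + Gumbel noise) picks index i with probability
# proportional to exp(logits[i]), i.e. softmax(logits)[i].
rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])

n_trials = 100000
noise = rng.gumbel(size=(n_trials, logits.size))   # standard Gumbel(0, 1) noise
picks = np.argmax(logits + noise, axis=1)          # noisy argmax per trial
empirical = np.bincount(picks, minlength=logits.size) / n_trials

softmax = np.exp(logits) / np.exp(logits).sum()    # exact target probabilities
print(empirical)  # approximately [0.659, 0.242, 0.099]
print(softmax)    # [0.659, 0.242, 0.099] (up to rounding)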
Thank you very much, Nikos Karampatziakis.
If you want a stochastic sampling decoder that generates sequences of the same length as the target sequence, the following code works:
@C.Function
def sampling(x):
    noisy_x = x + C.random.gumbel_like(x)
    return C.hardmax(noisy_x)

def create_model_sampling(s2smodel):
    @C.Function
    @C.layers.Signature(input=InputSequence[C.layers.Tensor[input_vocab_dim]],
                        labels=LabelSequence[C.layers.Tensor[label_vocab_dim]])
    def model_sampling(input, labels):  # (input*) --> (word_sequence*)
        unfold = C.layers.UnfoldFrom(lambda history: s2smodel(history, input) >> sampling,
                                     length_increase=1)
        return unfold(initial_state=sentence_start, dynamic_axes_like=labels)
    return model_sampling
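Note the design choice here: dynamic_axes_like=labels ties the output's dynamic axis to the target sequence, and with length_increase=1 and no until_predicate the unfold runs for exactly as many steps as labels has tokens. A hedged usage sketch (input_data and label_data are assumptions, standing in for minibatch data from your reader):

# Hypothetical usage, assuming s2smodel and minibatch data bound to the
# 'input' and 'labels' arguments as in the CNTK seq2seq tutorial.
model_sampling = create_model_sampling(s2smodel)
sampled = model_sampling(input_data, label_data)  # one sampled one-hot token per target step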