dropout, recurrent_dropout in LSTM layer



I am training a GRU neural network and added dropout and recurrent dropout to my GRU layer, but since then I have not been able to get reproducible results from one run to the next. Even adding:

recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)

to the same layer did not solve the problem.
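
Seeding the initializers only pins down the initial weights; the dropout and recurrent-dropout masks are drawn from a separate random stream on every run. Newer Keras releases (Keras 3, the standalone keras package) expose a seed argument on recurrent layers for exactly these masks; older tf.keras versions have no such argument, and the masks follow the global seeds only. A minimal sketch, assuming Keras 3 is available:

import keras  # assumes Keras 3, where recurrent layers accept a `seed` argument

gru = keras.layers.GRU(
    20,
    dropout=0.1,
    recurrent_dropout=0.2,
    seed=42,  # fixes the RNG behind the dropout/recurrent_dropout masks
    recurrent_initializer=keras.initializers.Orthogonal(seed=42),
    kernel_initializer=keras.initializers.GlorotUniform(seed=42),
)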

Here is my model:

model = tf.keras.models.Sequential()
model.add(tf.keras.layers.GRU(
    20,
    activation='tanh',
    recurrent_activation='sigmoid',
    dropout=0.1,
    recurrent_dropout=0.2,
    return_sequences=False,
    input_shape=(train_XX.shape[1], train_XX.shape[2]),
    recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))
model.add(tf.keras.layers.Dense(
    1,
    activation='sigmoid',
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))
model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=False, name='binary_crossentropy'),
    optimizer='adam',
    metrics=[tf.keras.metrics.PrecisionAtRecall(0.75)])

I had already set the seeds at the beginning of the program:

import numpy as np
import tensorflow as tf
import random as rn

np.random.seed(1)      # NumPy RNG
tf.random.set_seed(2)  # TensorFlow global RNG
rn.seed(3)             # Python's built-in RNG
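
(As a side note, TensorFlow 2.7 and later ship a utility that collapses these three calls into one; a minimal sketch, assuming such a version:)

import tensorflow as tf

# Seeds Python's `random`, NumPy and TensorFlow in a single call (TF >= 2.7).
tf.keras.utils.set_random_seed(42)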

What actually fixed it was adding the following before those three seed lines:

import os
os.environ['PYTHONHASHSEED'] = '0'
os.environ['CUDA_VISIBLE_DEVICES'] = ''

This solved my problem. (Setting CUDA_VISIBLE_DEVICES to an empty string hides the GPU, so TensorFlow falls back to the CPU and avoids its non-deterministic GPU kernels.)
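
For completeness, a consolidated sketch of the whole recipe. One caveat worth hedging on: PYTHONHASHSEED only reliably affects Python's hash randomization when it is exported before the interpreter starts, so setting it inside the script may not be enough on its own. Recent TensorFlow releases also offer tf.config.experimental.enable_op_determinism() as an alternative that keeps the GPU usable:

import os

# Set before TensorFlow is imported, so the GPU context is never created.
os.environ['PYTHONHASHSEED'] = '0'       # strictly reliable only if set before Python starts
os.environ['CUDA_VISIBLE_DEVICES'] = ''  # hide the GPU -> run on the deterministic CPU path

import numpy as np
import tensorflow as tf
import random as rn

np.random.seed(1)
tf.random.set_seed(2)
rn.seed(3)

# Alternative on recent TF versions: keep the GPU but force deterministic ops.
# tf.config.experimental.enable_op_determinism()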
