Filling up shuffle buffer (this may take a while)



I have a dataset consisting of video frames taken from 1000 real videos and 1000 deepfake videos. After each video is converted to 300 frames in the preprocessing stage, I end up with 300,000 images labeled Real (0) and 300,000 images labeled Fake (1). I want to train MesoNet on this data. I use a custom DataGenerator class to handle the training, validation, and test data with a 0.8 / 0.1 / 0.1 split, but when I run the project, the following message is shown:

Filling up shuffle buffer (this may take a while):

What can I do to solve this problem?

You can see the DataGenerator class below.

import os

import cv2
import numpy as np
from tensorflow import keras


class DataGenerator(keras.utils.Sequence):
    'Generates data for Keras'

    def __init__(self, df, labels, batch_size=32, img_size=(224, 224),
                 n_classes=2, shuffle=True):
        'Initialization'
        self.batch_size = batch_size
        self.labels = labels
        self.df = df
        self.img_size = img_size
        self.n_classes = n_classes
        self.shuffle = shuffle
        self.batch_labels = []
        self.batch_names = []
        self.on_epoch_end()

    def __len__(self):
        'Denotes the number of batches per epoch'
        return int(np.floor(len(self.df) / self.batch_size))

    def __getitem__(self, index):
        # Select the (shuffled) row indices for this batch
        batch_index = self.indexes[index * self.batch_size:(index + 1) * self.batch_size]
        frame_paths = self.df.iloc[batch_index]["framePath"].values
        frame_label = self.df.iloc[batch_index]["label"].values

        # Load the frames, convert BGR -> RGB and resize to the model input size
        imgs = [cv2.imread(frame) for frame in frame_paths]
        imgs = [cv2.cvtColor(img, cv2.COLOR_BGR2RGB) for img in imgs]
        imgs = [
            cv2.resize(img, self.img_size) if img.shape[:2] != self.img_size else img
            for img in imgs
        ]
        batch_imgs = np.asarray(imgs)

        labels = list(map(int, frame_label))
        y = np.array(labels)
        self.batch_labels.extend(labels)
        self.batch_names.extend([os.path.basename(str(frame)) for frame in frame_paths])
        return batch_imgs, y

    def on_epoch_end(self):
        'Updates indexes after each epoch'
        self.indexes = np.arange(len(self.df))
        if self.shuffle:
            np.random.shuffle(self.indexes)

Note that this is not an error, but a log message: https://github.com/tensorflow/tensorflow/blob/42b5da6659a75bfac77fa81e7242ddb5be1a576a/tensorflow/core/kernels/data/shuffle_dataset_op.cc#L138

If it is taking too long, you may have chosen a dataset that is too large for the shuffle buffer: https://github.com/tensorflow/tensorflow/issues/30646

You can address this by lowering the buffer size: https://support.huawei.com/enterprise/en/doc/EDOC1100164821/2610406b/what-do-i-do-if-training-times-out-due-to-too-many-dataset-shuffle-operations
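The keras.utils.Sequence above shuffles indices itself in on_epoch_end() and does not create a TensorFlow shuffle buffer, so the message most likely comes from a tf.data pipeline somewhere else in the project (for example, a tf.data.Dataset built from the same frames with a .shuffle() call). A minimal sketch of lowering the buffer size in such a pipeline, assuming hypothetical frame_paths and frame_labels lists, might look like this:

import tensorflow as tf

# Hypothetical inputs: the same frame paths and 0/1 labels used by DataGenerator.
frame_paths = ["data/real/video001/frame000.jpg", "data/fake/video001/frame000.jpg"]
frame_labels = [0, 1]

dataset = tf.data.Dataset.from_tensor_slices((frame_paths, frame_labels))

# A buffer as large as the full dataset (600,000 frames here) forces TensorFlow
# to load that many elements before training starts, which is what makes the
# "Filling up shuffle buffer" phase so slow. A buffer of a few thousand elements
# is usually a reasonable trade-off between shuffling quality and startup time.
dataset = dataset.shuffle(buffer_size=1000)
dataset = dataset.batch(32).prefetch(tf.data.AUTOTUNE)

With a smaller buffer the message may still appear, but filling the buffer only takes a moment instead of stalling the start of training.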
