TensorFlow: what is the accepted "clean" way to organize a large network?



I have a fairly large fully-connected network, and I have resorted to storing my weights and biases in dictionaries and then computing each layer as

layer_i+1 = relu(add(matmul(layer_i, weights['i']), biases['i']))

Surely there must be some "cleaner" way to do this? Or am I overthinking things?
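For reference, the dictionary-based pattern described above looks roughly like this. This is a minimal NumPy sketch rather than TensorFlow, and the layer sizes are made up purely for illustration:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

# Hypothetical layer sizes for illustration: 4 -> 8 -> 8 -> 2
sizes = [4, 8, 8, 2]
rng = np.random.default_rng(0)

# Weights and biases stored in dictionaries keyed by layer index
weights = {str(i): rng.standard_normal((sizes[i], sizes[i + 1]))
           for i in range(len(sizes) - 1)}
biases = {str(i): np.zeros(sizes[i + 1]) for i in range(len(sizes) - 1)}

def forward(x):
    layer = x
    for i in range(len(sizes) - 1):
        # layer_{i+1} = relu(matmul(layer_i, weights['i']) + biases['i'])
        layer = relu(layer @ weights[str(i)] + biases[str(i)])
    return layer

out = forward(np.ones((3, 4)))
print(out.shape)  # (3, 2)
```

As the network grows, every new layer means another pair of dictionary entries and another line in the forward pass, which is exactly the boilerplate the question is trying to eliminate.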

I manage my networks in the following way:

layers.py

vision = [
    ('conv', [5,5, 3,32], [32]),
    ('conv', [3,3,32,32], [32]),
    ('conv', [3,3,32,32], [32]),
    ('pool', 2),
    ('conv', [3,3,32,64], [64]),
    ('conv', [3,3,64,64], [64]),
    ('pool', 2),
    ('conv', [3,3,64,128], [128]),
    ('pool', 2),
    ('reshape', [-1,6*128]),
    ('dense', [6*128, 512], [512])
]

counter = [
    ('dense', [512, 256], [256]),
    ('dense', [256, max_digits], [max_digits])
]

tfmodel.py

def conv2d(x, W, b, strides=1, act='relu', name='convolution'):
    x = tf.nn.conv2d(x, W, strides=[1,strides,strides,1], padding="VALID", name=name)
    x = tf.nn.bias_add(x, b)
    if act == 'relu':
        return tf.nn.relu(x)
    elif act == 'tanh':
        return tf.nn.tanh(x)
    elif act == 'softmax':
        return tf.nn.softmax(x)
    return x  # no (or unknown) activation: return the linear output

def maxpool2d(x, k=2):
    return tf.nn.max_pool(x, ksize=[1,k,k,1], strides=[1,k,k,1], padding="VALID")
def process_network(X, layers, dropout, scope):
    # dropout is accepted but not wired in yet; apply it per layer as needed
    with tf.variable_scope(scope):
        h = X
        for i, layer in enumerate(layers):
            if layer[0] == 'conv':
                W = tf.get_variable('conv{}W'.format(i), layer[1],
                                    initializer=tf.random_normal_initializer())
                b = tf.get_variable('conv{}b'.format(i), layer[2],
                                    initializer=tf.random_normal_initializer())
                h = conv2d(h, W, b)
            elif layer[0] == 'pool':
                h = maxpool2d(h, layer[1])
            elif layer[0] == 'dense':
                W = tf.get_variable('dense{}W'.format(i), layer[1],
                                    initializer=tf.random_normal_initializer())
                b = tf.get_variable('dense{}b'.format(i), layer[2],
                                    initializer=tf.random_normal_initializer())
                h = tf.add(tf.matmul(h, W), b)
            elif layer[0] == 'reshape':
                h = tf.reshape(h, layer[1])
        h = tf.identity(h, 'out')
        return h

Then, when building the graph, you simply call:

h = tfmodel.process_network(image, layers.vision, 0.1, 'vision')
c_ = tfmodel.process_network(h, layers.counter, 0.1, 'counter')

This also produces a clean graph in TensorBoard. It's not complete, but I'm sure you get the idea.

Another clean approach is to define the layers or the model with Keras. See "Keras as a simplified interface to TensorFlow": tutorial
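As an illustration, the `counter` head from the config above could be expressed with Keras in a few lines. This is a minimal sketch using the modern `tf.keras` API, with `max_digits` assumed to be 10 here for illustration:

```python
import tensorflow as tf

max_digits = 10  # assumed value for illustration

# Equivalent of the 'counter' list: 512 -> 256 -> max_digits
counter = tf.keras.Sequential([
    tf.keras.Input(shape=(512,)),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(max_digits, activation='softmax'),
])

out = counter(tf.ones((2, 512)))
print(out.shape)  # (2, 10)
```

Keras creates and scopes the variables for you, so the weight/bias bookkeeping disappears entirely.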
