Fine-tuning specified op layers in TensorFlow

When fine-tuning a model with TensorFlow, if you only want to train specified ops, you need to collect the variables under those op names and then pass them to the var_list parameter of tf.train.AdamOptimizer.

Part of the code is shown below.

import tensorflow as tf  # TensorFlow 1.x API


def _get_variables_to_train(trainable_scopes=None):
  """Returns a list of variables to train.

  Args:
    trainable_scopes: comma-separated string of variable scope names,
      or None to train all trainable variables.

  Returns:
    A list of variables to train by the optimizer.
  """
  if trainable_scopes is None:
    return tf.trainable_variables()
  scopes = [scope.strip() for scope in trainable_scopes.split(',')]

  variables_to_train = []
  for scope in scopes:
    # The scope argument filters the collection by variable-name prefix.
    variables = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope)
    variables_to_train.extend(variables)
  return variables_to_train

# Example value held in Config.trainable_scopes:
trainable_scopes = 'HCnet/Bottle_neck5,HCnet/Bottle_neck5_1,HCnet/Bottle_neck6_2,HCnet/Conv7'

output_vars = _get_variables_to_train(Config.trainable_scopes)
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = tf.train.AdamOptimizer(learning_rate=lr).minimize(
        loss, global_step=global_step, var_list=output_vars)
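Under the hood, passing a scope to tf.get_collection filters variables whose names match the scope at the start of the name. A minimal pure-Python sketch of that filtering logic, using hypothetical variable names (not from the actual HCnet graph):

```python
import re

def filter_by_scopes(variable_names, trainable_scopes):
    """Keep names matching any of the comma-separated scope prefixes."""
    scopes = [s.strip() for s in trainable_scopes.split(',')]
    selected = []
    for scope in scopes:
        pattern = re.compile(scope)
        # re.match anchors at the start of the string, like scope filtering.
        selected.extend(n for n in variable_names if pattern.match(n))
    return selected

# Hypothetical variable names from a graph.
names = [
    'HCnet/Conv1/weights',
    'HCnet/Bottle_neck5/weights',
    'HCnet/Conv7/bias',
]
print(filter_by_scopes(names, 'HCnet/Bottle_neck5,HCnet/Conv7'))
# ['HCnet/Bottle_neck5/weights', 'HCnet/Conv7/bias']
```

Only variables returned by this kind of filtering end up in var_list, so the optimizer leaves every other layer's weights untouched.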

trainable_scopes lists the ops I need to train. The function _get_variables_to_train collects the corresponding variables, and the resulting output_vars is passed as var_list to tf.train.AdamOptimizer; with this approach only the specified ops are trained. The model pre-loading step is omitted from the process above.
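The omitted pre-loading step is usually a checkpoint restore before the training loop starts. A hedged sketch of one common TF1-style pattern (the checkpoint path and the excluded scope are placeholders, not from the original post); it is a fragment that assumes a built graph and an existing checkpoint:

```python
# Restore pretrained weights before fine-tuning (sketch; 'pretrained.ckpt'
# is a placeholder path). Newly trained scopes can be excluded from the
# restore if their shapes changed, e.g. a replaced output layer.
restore_vars = [v for v in tf.global_variables()
                if not v.op.name.startswith('HCnet/Conv7')]
saver = tf.train.Saver(var_list=restore_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.restore(sess, 'pretrained.ckpt')
    # ... run the training loop with train_op ...
```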