
Usage of Leaky ReLU and Identity - Cupoy




2020/01/15 09:31 AM
Computer Vision Deep Learning Forum
張凱維
Views: 0
Answers: 1
Favorites: 0
cvdl-1
cvdl-1-d37

In the code below, why does the final step of the fully connected layer branch into two cases, Leaky ReLU and Identity, and when should each be used?


import tensorflow as tf

def local(self, scope, input, in_dimension, out_dimension, leaky=True, pretrain=True, train=True):
   """Fully connected layer

   Args:
     scope: variable_scope name
     input: [batch_size, ???]
     in_dimension: int32
     out_dimension: int32
   Return:
     output: 2-D tensor [batch_size, out_dimension]
   """
   with tf.variable_scope(scope) as scope:
     # Flatten the input to [batch_size, in_dimension]
     reshape = tf.reshape(input, [tf.shape(input)[0], -1])

     weights = self._variable_with_weight_decay('weights', shape=[in_dimension, out_dimension],
                         stddev=0.04, wd=self.weight_decay, pretrain=pretrain, train=train)
     biases = self._variable_on_cpu('biases', [out_dimension], tf.constant_initializer(0.0), pretrain, train)
     local = tf.matmul(reshape, weights) + biases

     if leaky:
       # Apply the Leaky ReLU non-linearity to the pre-activation
       local = self.leaky_relu(local)
     else:
       # Pass the raw logits through unchanged (identity)
       local = tf.identity(local, name=scope.name)
   return local
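A minimal NumPy sketch of what the two branches compute (the slope `alpha=0.1` is an assumption; the class's own `leaky_relu` may use a different value). The usual pattern is: hidden layers take the Leaky ReLU branch so the network stays non-linear while negative inputs still pass a small gradient, and the last layer takes the Identity branch so its raw logits can be fed directly into a loss (or used as regression outputs) without being squashed by an activation.

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    # Leaky ReLU: x for positive inputs, alpha * x for negative inputs
    return np.where(x > 0, x, alpha * x)

def identity(x):
    # Identity: returns the raw logits unchanged
    return x

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
hidden = leaky_relu(x)  # negatives scaled by alpha: -0.2, -0.05, 0.0, 1.0, 3.0
logits = identity(x)    # unchanged: -2.0, -0.5, 0.0, 1.0, 3.0
```

In short, `leaky=True` would be used for every intermediate call to `local`, and `leaky=False` for the final call that produces the model's output.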