SCE Loss

Posted by Packy on July 30, 2020

Symmetric Cross Entropy Learning (SL) for Segmentation

Code for the ICCV 2019 paper "Symmetric Cross Entropy for Robust Learning with Noisy Labels": https://arxiv.org/abs/1908.06112
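The loss itself is a weighted sum of the standard cross entropy (CE) and a reverse cross entropy (RCE) term in which the label and prediction distributions swap roles. In the paper's notation (paraphrased), with q the one-hot label distribution, p the predicted distribution, and K classes:

\ell_{sce} = \alpha \, \ell_{ce} + \beta \, \ell_{rce}, \quad
\ell_{ce} = -\sum_{k=1}^{K} q(k \mid \mathbf{x}) \log p(k \mid \mathbf{x}), \quad
\ell_{rce} = -\sum_{k=1}^{K} p(k \mid \mathbf{x}) \log q(k \mid \mathbf{x})

Because q(k | x) is zero for every class except the ground-truth one, log q is undefined there; in practice those zeros are clipped to a small constant, which is exactly what the implementation below does.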

Official

https://github.com/YisenWang/symmetric_cross_entropy_for_noisy_labels

Another reproduction

The results reproduced by Hanxun Huang are slightly better than the original for all methods. The code can be found here: https://github.com/HanxunHuangLemonBear/SCELoss-Reproduce

The inputs follow the tf.nn.sparse_softmax_cross_entropy_with_logits convention: y_pred holds the model's raw output logits with shape B*N*C (batch, pixels, classes) and y_true is an integer label array with shape B*N. For example, with C = 3 classes: y_pred = [[[0.3, 0.5, 0.3], [0.3, 0.4, 0.4]], [[0.3, 0.5, 0.3], [0.3, 0.4, 0.4]]] and y_true = [[1, 2], [0, 1]].

import tensorflow as tf

def symmetric_cross_entropy_loss(y_true, y_pred, alpha, beta, seg_class):
    # Flatten the integer labels to [B*N] and one-hot encode to [B*N, C].
    y_true = tf.one_hot(tf.reshape(y_true, [-1]), seg_class, dtype=tf.float32)

    # Flatten the logits to [B*N, C] so they line up with the labels,
    # then turn them into class probabilities.
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1, seg_class])
    y_pred = tf.nn.softmax(y_pred)

    # Clip away exact zeros before taking logs: the predictions for the
    # CE term, the one-hot labels for the RCE term (zeros become 1e-4).
    y_pred_clipped = tf.clip_by_value(y_pred, 1e-7, 1.0)
    y_true_clipped = tf.clip_by_value(y_true, 1e-4, 1.0)

    # CE = H(labels, predictions); RCE = H(predictions, labels).
    ce_loss = tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred_clipped), axis=-1))
    rce_loss = tf.reduce_mean(-tf.reduce_sum(y_pred * tf.math.log(y_true_clipped), axis=-1))
    return alpha * ce_loss + beta * rce_loss
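A minimal smoke test of the function above, assuming TensorFlow 2.x eager execution. The shapes and the alpha/beta values are made-up placeholders for illustration (the paper tunes alpha and beta per dataset):

# Hypothetical shapes: batch of 2 images, 4 pixels each, 3 classes.
B, N, C = 2, 4, 3
logits = tf.random.normal([B, N, C])                      # y_pred: raw logits, B*N*C
labels = tf.random.uniform([B, N], 0, C, dtype=tf.int32)  # y_true: class ids, B*N

loss = symmetric_cross_entropy_loss(labels, logits, alpha=0.1, beta=1.0, seg_class=C)
print(float(loss))  # single scalar: alpha*CE + beta*RCE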