Sigmoid Cross Entropy function of TensorFlow

TensorFlow is an open-source machine learning framework used to develop models. While developing models, we use many functions to check the model's accuracy and loss. Loss acts like a penalty, so loss calculation is a must for a model to be in good condition: the lower the loss, the better the model works.

There are many kinds of loss functions. One such function is the sigmoid cross entropy function of TensorFlow. The sigmoid function, or logistic function, is the function that generates an S-shaped curve. It is used to predict probabilities; therefore, its range lies between 0 and 1.
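For instance, a minimal sketch (the example values here are arbitrary, chosen for illustration) shows how tf.math.sigmoid squashes logits of any magnitude into the (0, 1) range:

Python3

import tensorflow as tf

# arbitrary example logits, negative and positive
logits = tf.constant([-4.0, -1.0, 0.0, 1.0, 4.0])

# sigmoid maps every value into the open interval (0, 1)
probs = tf.math.sigmoid(logits)

print(probs.numpy())
# approximately [0.018 0.269 0.5 0.731 0.982]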

Cross entropy loss measures the difference between the actual and the predicted outputs. It is also known as the log loss function and is one of the most valuable techniques in the field of machine learning.
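As a worked example (plain Python, with values chosen for illustration): for a true label of 1, a confident correct prediction of 0.8 costs -log(0.8) ≈ 0.223, while a confident wrong prediction of 0.1 costs -log(0.1) ≈ 2.303:

Python3

import math

def log_loss(y_true, p):
    # binary cross entropy (log loss) for a single prediction
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

print(log_loss(1, 0.8))  # ~0.223: small penalty for a good prediction
print(log_loss(1, 0.1))  # ~2.303: large penalty for a confident miss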

sigmoid_cross_entropy_with_logits

This is a function of TensorFlow version 2 that is used for soft binary labels. A soft label is one that carries a measure of likelihood rather than a hard 0 or 1; the function can also be used for hard labels. It measures the element-wise probability error in tasks that have two outcomes.

Python3
import tensorflow as tf

# logits (raw model predictions) and target labels as lists
input = [1., 2., 3., 4., 5.89]
output = [2, 1, 3, 4, 5.9]

# conversion to tensors
# input is passed as the logits (predictions)
input = tf.convert_to_tensor(input,
                             dtype=tf.float32)
# output is passed as the labels (targets)
output = tf.convert_to_tensor(output,
                              dtype=tf.float32)

# element-wise sigmoid cross entropy between
# the labels and the logits
x = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=output, logits=input).numpy()

print(x)


Output:

[ -0.68673825   0.12692802  -5.9514127  -11.98185    -28.858236  ]
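Note that most of these losses are negative. That happens because the labels here (2, 3, 4, 5.9) lie outside the [0, 1] range that the formula assumes for probabilities. As a sketch, the same numbers can be reproduced with the numerically stable closed form that the TensorFlow documentation gives for this function, max(x, 0) - x * z + log(1 + exp(-|x|)), where x is the logits and z is the labels:

Python3

import tensorflow as tf

x = tf.constant([1., 2., 3., 4., 5.89])  # logits
z = tf.constant([2., 1., 3., 4., 5.9])   # labels

# numerically stable form of sigmoid cross entropy
manual = tf.maximum(x, 0.) - x * z + tf.math.log(1. + tf.exp(-tf.abs(x)))

print(manual.numpy())  # matches the output above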

We can also compute sigmoid cross entropy loss between two 4D tensors using the sigmoid_cross_entropy_with_logits() function.

Python3
import tensorflow as tf

# 4D lists: logits and target labels
input = [[[[9], [8]], [[7], [5]]]]
output = [[[[1], [2]], [[3], [4]]]]

# conversion to tensors
input = tf.convert_to_tensor(input,
                             dtype=tf.float32)
output = tf.convert_to_tensor(output,
                              dtype=tf.float32)

# element-wise sigmoid cross entropy between labels and logits
x = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=output, logits=input).numpy()

print(x)


Output:

[[[[ 1.2340219e-04]
   [-7.9996648e+00]]
   
  [[-1.3999088e+01]
   [-1.4993284e+01]]]]
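Because the loss is computed element-wise, the returned tensor keeps the exact shape of the logits, whatever their rank. A quick sketch confirming this (the 4D shape is an arbitrary choice for illustration):

Python3

import tensorflow as tf

logits = tf.random.normal([1, 2, 2, 1])   # arbitrary 4D logits
labels = tf.random.uniform([1, 2, 2, 1])  # labels drawn from [0, 1)

loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels,
                                               logits=logits)

print(loss.shape)  # (1, 2, 2, 1) -- same shape as the inputs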

Sigmoid Cross Entropy

This is the class-style version of the loss, SigmoidCrossEntropyLoss, which is strongly tied to Keras. It calculates the loss between the actual output and the predicted output and can be used for binary outcomes.

Suppose there are two tensors, y_pred (the raw predictions, i.e. the logits) and y_true (the true labels). The loss is calculated as

loss = -(y_true * log(sigmoid(y_pred)) + (1 - y_true) * log(1 - sigmoid(y_pred)))

However, SigmoidCrossEntropyLoss itself is not available in either version of TensorFlow in Python. The equivalent sigmoid_cross_entropy function from tf.compat.v1.losses can be used instead:

Python3
import tensorflow as tf
from tensorflow.compat.v1.losses import sigmoid_cross_entropy

# raw predictions (logits) and the corresponding labels
logits = tf.constant([[0, 1],
                      [1, 5],
                      [2, -4]], dtype=tf.float32)
y_true = tf.constant([[1, 1],
                      [1, 0],
                      [2, 0]], dtype=tf.float32)

# computes the element-wise losses and reduces them to a single value
loss = sigmoid_cross_entropy(multi_class_labels=y_true,
                             logits=logits).numpy()

print(loss)


Output:

0.74524397
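With the default weights, this single value is the mean of the element-wise losses. A sketch verifying that against the element-wise tf.nn.sigmoid_cross_entropy_with_logits from the first section:

Python3

import tensorflow as tf

logits = tf.constant([[0, 1], [1, 5], [2, -4]], dtype=tf.float32)
y_true = tf.constant([[1, 1], [1, 0], [2, 0]], dtype=tf.float32)

# element-wise losses, then averaged
elementwise = tf.nn.sigmoid_cross_entropy_with_logits(labels=y_true,
                                                      logits=logits)

print(tf.reduce_mean(elementwise).numpy())  # ~0.745, same value as above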

In the above code, logits is the input tensor and y_true holds the labels (passed through the multi_class_labels parameter). The loss function is then called to calculate the loss. Since the function returns a tensor, .numpy() converts the result to a plain NumPy value.

Comparison between the three Sigmoid Cross Entropy Functions

Now that we have seen multiple versions of the cross-entropy function, it can become mind-boggling to remember which function is suited to which kind of task. So, let's summarise this in tabular format.

| sigmoid_cross_entropy_with_logits | SigmoidCrossEntropyLoss | sigmoid_cross_entropy |
| --- | --- | --- |
| Used for soft binary labels. | Calculates the difference between the actual and the predicted output. | Used for multiclass labels. |
| Supported in TensorFlow version 2. | Currently not supported in any of the TensorFlow versions. | Compatible with TensorFlow version 1 (available via tf.compat.v1 in version 2). |
| Measures the element-wise probability error in classification, and therefore returns an array of element-wise losses. | Returns a single value. | Returns a single value. |
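For completeness: in current TensorFlow 2 code, the Keras loss tf.keras.losses.BinaryCrossentropy(from_logits=True) covers the same binary, logits-based use case and also reduces to a single mean value. A minimal sketch with illustrative values:

Python3

import tensorflow as tf

# Keras loss object that applies sigmoid internally (from_logits=True)
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

y_true = tf.constant([[0.], [1.], [1.]])
logits = tf.constant([[-2.], [3.], [0.5]])

print(bce(y_true, logits).numpy())  # scalar mean loss over the batch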