
Tensorflow.js tf.losses.softmaxCrossEntropy() Function

Tensorflow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or Node.js environment.

The Tensorflow.js tf.losses.softmaxCrossEntropy() function computes the softmax cross-entropy loss between two tensors and returns a new tensor.

Syntax:

tf.losses.softmaxCrossEntropy(onehotLabels, 
    logits, weights, labelSmoothing, reduction)

Parameters: This function accepts five parameters, of which the last three are optional. They are illustrated below:

  • onehotLabels: The one-hot encoded labels, with the same dimensions as the predictions.
  • logits: The predicted (unnormalized) outputs.
  • weights: A tensor of rank either 0 or 1 that must be broadcastable to the shape of the loss.
  • labelSmoothing: If greater than 0, the labels are smoothed toward the uniform distribution.
  • reduction: The type of reduction to apply to the loss. It must be of type Reduction.

Note: Parameters like weights, labelSmoothing, and reduction are optional.

Return Value: It returns a tensor containing the softmax cross-entropy loss between the two tensors.

Example 1: In this example, we compute the loss using only the two required parameters.

Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"
 
// Creating the onehotLabels tensor
const a = tf.tensor2d([[1, 4, 5], [5, 5, 7]]);
 
// Creating the logits tensor
const b = tf.tensor2d([[3, 2, 5], [3, 2, 7]]);
 
// Computing the softmax cross-entropy loss
const softmaxCrossEntropy = tf.losses.softmaxCrossEntropy(a, b);
softmaxCrossEntropy.print();


Output:

Tensor
   30.55956268310547

Example 2: In this example, we also pass the third, optional weights parameter. Each per-example loss is multiplied by the weight before the reduction is applied.

Javascript

// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"
 
// Creating the onehotLabels tensor
const a = tf.tensor2d([[1, 2, 3, 4, 5], [7, 8, 9, 10, 11]]);
 
// Creating the logits tensor
const b = tf.tensor2d([[6, 735, 8, 59, 10], [45, 34, 322, 2, 3]]);
 
// Computing the loss with the optional weights parameter set to 5
const softmaxCrossEntropy = tf.losses.softmaxCrossEntropy(a, b, 5);
softmaxCrossEntropy.print();



Output:

Tensor
    50477.5

Reference: https://js.tensorflow.org/api/latest/#losses.softmaxCrossEntropy
 
