
Tensorflow.js tf.layers.thresholdedReLU() Function

Tensorflow.js is a Google-developed open-source toolkit for executing machine learning models and deep learning neural networks in the browser or on the node platform. It also enables developers to create machine learning models in JavaScript and utilize them directly in the browser or with Node.js.

The tf.layers.thresholdedReLU() function applies the thresholded rectified linear unit activation to its input: each element x is passed through unchanged if x > theta, and set to 0 otherwise.

Syntax:

 tf.layers.thresholdedReLU(args?)

Input Shape: Arbitrary. When using this layer as the first layer in a model, provide the inputShape configuration option.

Output Shape: The output has the same shape as the input.

Parameters: It accepts the args object which can have the following properties:

  • theta (number): The threshold location of the activation; input values less than or equal to theta are zeroed out. Defaults to 1.0.
  • inputShape: If this property is set, it will be utilized to construct an input layer that will be inserted before this layer. 
  • batchInputShape: If this property is set, an input layer will be created and inserted before this layer. 
  • batchSize: If inputShape is specified and batchInputShape is not, batchSize is used to construct the batchInputShape as [batchSize, ...inputShape].
  • dtype: The data type for this layer. Defaults to float32. This option applies only to input layers.
  • name: This is the layer’s name and is of string type.
  • trainable: Whether the weights of this layer may be updated by fit(). Defaults to true.
  • weights: The layer’s initial weight values.

Returns: It returns a ThresholdedReLU layer object.
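Conceptually, the layer applies a simple element-wise rule: keep any value strictly greater than theta, replace everything else with 0. The helper below is an illustrative plain-JavaScript sketch of that rule (it is not part of the TensorFlow.js API, just a way to see what the layer computes):

```javascript
// Element-wise thresholded ReLU: values strictly greater than theta
// pass through unchanged; all other values become 0.
function thresholdedReLU(values, theta = 1.0) {
  return values.map((v) => (v > theta ? v : 0));
}

// With theta = 10, only 11 and 12 exceed the threshold.
console.log(thresholdedReLU([11, 8, 9, 12], 10)); // [ 11, 0, 0, 12 ]
```

This mirrors the behavior of the tfjs layer in the examples below, minus tensor shapes and dtypes.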

Example 1:

Javascript

import * as tf from "@tensorflow/tfjs";

// Create a thresholded ReLU layer that zeroes out values <= 10.
const thresholdReLULayer =
    tf.layers.thresholdedReLU({ theta: 10 });

// Only 11 and 12 exceed the threshold; 8 and 9 become 0.
const x = tf.tensor([11, 8, 9, 12]);

thresholdReLULayer.apply(x).print();


Output:

Tensor
   [11, 0, 0, 12]

Example 2:

Javascript

import * as tf from "@tensorflow/tfjs";

// Here the threshold is 0.9, applied element-wise to a 2x3 tensor.
const thresholdReLULayer =
    tf.layers.thresholdedReLU({ theta: 0.9 });

const x = tf.tensor([1.12, 0.8,
    1.9, 0.12, 0.25, 3.4], [2, 3]);

// Values <= 0.9 are replaced with 0.
thresholdReLULayer.apply(x).print();


Output:

Tensor
   [[1.12, 0, 1.9],
    [0, 0, 3.4000001]]

Reference: https://js.tensorflow.org/api/latest/#layers.thresholdedReLU

