
Tensorflow.js tf.layers.batchNormalization() Function

Tensorflow.js is a Google-developed open-source library for running machine learning models and deep learning neural networks in the browser or on the Node.js platform. It also enables developers to create machine learning models in JavaScript and use them directly in the browser or with Node.js.

The tf.layers.batchNormalization() function is used to apply the batch normalization operation on data. Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch. This stabilizes the learning process and significantly reduces the number of training epochs needed to train deep networks.

Syntax:

tf.layers.batchNormalization(args?)

Input Shape: Arbitrary. When utilizing this layer as the initial layer in a model, use the inputShape configuration.

Output Shape: The output has the same shape as the input.
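
For instance, the following sketch uses the layer as the first layer of a tf.sequential() model via the inputShape option; the input shape and the dense layer's size here are arbitrary values chosen only for illustration.

Javascript

import * as tf from "@tensorflow/tfjs";

// Sketch: batch normalization as the first layer of a model.
// The inputShape and the dense layer's size are arbitrary example values.
const model = tf.sequential();
model.add(tf.layers.batchNormalization({ inputShape: [3] }));
model.add(tf.layers.dense({ units: 1 }));

// The batch-normalization layer's output shape equals its input shape.
model.summary();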

Parameters: It accepts the args object which can have the following properties:

  • axis (number): The integer axis that should be normalized (typically the features axis). -1 is the default value.
  • momentum (number): The moving average’s momentum. The default value is 0.99.
  • epsilon (number): A small float added to the variance to avoid division by zero. Defaults to 1e-3.
  • center (boolean): If this is true, add the offset of beta to the normalized tensor. If false, beta isn’t taken into account. The value is set to true by default.
  • scale (boolean): If true, multiply the normalized tensor by gamma. If false, gamma is not used. True is the default value.
  • betaInitializer: This is the beta weight’s initializer. ‘zeros’ is the default value.
  • gammaInitializer: This is the gamma weight’s initializer. ‘ones’ is the default value.
  • movingMeanInitializer: This is the moving mean’s initializer. ‘zeros’ is the default value.
  • movingVarianceInitializer: This is the moving variance’s initializer. ‘ones’ is the default value.
  • betaConstraint: The constraint for the beta weight.
  • gammaConstraint: The constraint for the gamma weight.
  • betaRegularizer: The regularizer for the beta weight.
  • gammaRegularizer: The regularizer for the gamma weight.

Return Value: It returns a BatchNormalization object.
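
As an illustration of the configuration object, the sketch below creates a layer with several of these properties set explicitly; the specific values are arbitrary and chosen only for demonstration.

Javascript

import * as tf from "@tensorflow/tfjs";

// Sketch: a batch-normalization layer with an explicit configuration.
// The values below are arbitrary and only meant to illustrate the options.
const layer = tf.layers.batchNormalization({
    axis: -1,          // normalize along the last (features) axis
    momentum: 0.9,     // momentum of the moving statistics
    epsilon: 1e-5,     // added to the variance to avoid division by zero
    center: true,      // learn and add the beta offset
    scale: true,       // learn and multiply by gamma
    betaInitializer: "zeros",
    gammaInitializer: "ones"
});

// Apply the configured layer to a 2x2 tensor and print the result.
layer.apply(tf.tensor([1, 2, 3, 4], [2, 2])).print();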

Example 1:

Javascript




import * as tf from "@tensorflow/tfjs";

// Create a batch-normalization layer with the default configuration.
const batchNormalizationLayer = tf.layers.batchNormalization();

// A 2x3 input tensor.
const x = tf.tensor([1.12, -0.8, 1.9, 0.12, 0.25, -3.4], [2, 3]);

// Apply the layer to the input and print the normalized output.
batchNormalizationLayer.apply(x).print();


Output:

Tensor
   [[1.1194404, -0.7996003, 1.8990507 ],
    [0.11994  , 0.2498751 , -3.3983014]]
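
Note that apply() here runs a freshly created layer in inference mode, so it normalizes with the initial moving mean (0) and moving variance (1); each output value is therefore approximately the input divided by √(1 + epsilon) ≈ 0.9995, which is why the output is very close to the input.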

Example 2:

Javascript




import * as tf from "@tensorflow/tfjs";

// Create a batch-normalization layer with the default configuration.
const batchNormalizationLayer = tf.layers.batchNormalization();

// A 2x3x2 input tensor.
const x = tf.tensor([12, 3.2, 4.8, 9, 10, 2.5,
    8, 11, 9.4, 25, 24.9, 98.7], [2, 3, 2]);

// Apply the layer to the input and print the normalized output.
batchNormalizationLayer.apply(x).print();


Output:

Tensor
   [[[11.9940042, 3.1984012 ],
     [4.7976022 , 8.9955034 ],
     [9.9950037 , 2.4987509 ]],
    [[7.9960032 , 10.994504 ],
     [9.3953028 , 24.9875088],
     [24.8875599, 98.6506805]]]

Reference: https://js.tensorflow.org/api/latest/#layers.batchNormalization

