TensorFlow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in Node.js. The tf.initializers.Initializer class extends the serialization.Serializable class and is the base class of all initializers.
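In practice, an Initializer instance is not used on its own: it is handed to a layer (for example through the kernelInitializer and biasInitializer options), and the layer calls it to fill its weight tensors when the model is built. A minimal sketch of this, assuming the @tensorflow/tfjs package is installed:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// A dense layer whose kernel starts as all ones and whose bias starts as all zeros
const model = tf.sequential();
model.add(tf.layers.dense({
    units: 4,
    inputShape: [3],
    kernelInitializer: tf.initializers.ones(),
    biasInitializer: tf.initializers.zeros()
}));

// The initializers have already filled the weights; this prints the 3x4 kernel of ones
model.getWeights()[0].print();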
The tf.initializers namespace provides fifteen built-in factory functions, each returning an Initializer instance; they are illustrated below:
- tf.initializers.Initializer class .constant() function
- tf.initializers.Initializer class .glorotNormal() function
- tf.initializers.Initializer class .glorotUniform() function
- tf.initializers.Initializer class .heNormal() function
- tf.initializers.Initializer class .heUniform() function
- tf.initializers.Initializer class .identity() function
- tf.initializers.Initializer class .leCunNormal() function
- tf.initializers.Initializer class .leCunUniform() function
- tf.initializers.Initializer class .ones() function
- tf.initializers.Initializer class .orthogonal() function
- tf.initializers.Initializer class .randomNormal() function
- tf.initializers.Initializer class .randomUniform() function
- tf.initializers.Initializer class .truncatedNormal() function
- tf.initializers.Initializer class .varianceScaling() function
- tf.initializers.Initializer class .zeros() function
1. tf.initializers.Initializer class .constant() function: It creates an initializer that generates values set to a given constant (a short sketch of materializing these values follows the example below).
Example:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// Use tf.initializers.constant() function
var initializer = tf.initializers.constant({ value: 7 });

// Print the value of constant
console.log(initializer);
Output:
Constant { value: 7 }
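The Initializer base class also exposes an apply(shape) method that materializes an actual tensor filled by the initializer. A minimal sketch using it to see the constant values directly (the [2, 3] shape is an arbitrary choice):
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// Materialize a 2x3 tensor from the constant initializer
const initializer = tf.initializers.constant({ value: 7 });
initializer.apply([2, 3]).print();
// Prints a 2x3 tensor in which every entry is 7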
2. tf.initializers.Initializer class .glorotNormal() function: It draws samples from a truncated normal distribution centered at 0 with stddev = sqrt(2 / (fan_in + fan_out)), where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor (a numerical check of this formula follows the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.glorotNormal() function
const initializer = tf.initializers.glorotNormal({ seed: 9 });
console.log(initializer);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(initializer.scale);
console.log(initializer.mode);
console.log(initializer.distribution);
Output:
{
"scale": 1,
"mode": "fanAvg",
"distribution": "normal"
}
Individual values:
1
fanAvg
normal
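To relate the printed configuration to the formula above, the sketch below draws a weight matrix with apply() and compares its sample spread with sqrt(2 / (fan_in + fan_out)); the measured value should land near the nominal one (truncation and implementation details may shift it slightly). The fan sizes 200 and 300 are arbitrary:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

const fanIn = 200, fanOut = 300;

// Nominal stddev from the formula: sqrt(2 / (200 + 300)) ~ 0.0632
console.log(Math.sqrt(2 / (fanIn + fanOut)));

// Sample stddev of an actual draw
const w = tf.initializers.glorotNormal({ seed: 1 }).apply([fanIn, fanOut]);
tf.moments(w).variance.sqrt().print();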
3. tf.initializers.Initializer class .glorotUniform() function: It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor, and fan_out is the number of output units in the weight tensor (a check of the limit follows the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.glorotUniform() function
const geek = tf.initializers.glorotUniform({ seed: 7 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.scale);
console.log(geek.mode);
console.log(geek.distribution);
Output:
{
"scale": 1,
"mode": "fanAvg",
"distribution": "uniform"
}
Individual values:
1
fanAvg
uniform
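The limit can be checked the same way: every sampled value should fall inside [-sqrt(6 / (fan_in + fan_out)), +sqrt(6 / (fan_in + fan_out))]. A small sketch with arbitrary fan sizes:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

const fanIn = 100, fanOut = 200;

// Limit from the formula: sqrt(6 / (100 + 200)) ~ 0.1414
console.log(Math.sqrt(6 / (fanIn + fanOut)));

// Largest absolute value in an actual draw; it stays within the limit
const w = tf.initializers.glorotUniform({ seed: 1 }).apply([fanIn, fanOut]);
w.abs().max().print();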
4. tf.initializers.Initializer class .heNormal() function: It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in), where fan_in is the number of input units in the weight tensor.
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.heNormal() function
const geek = tf.initializers.heNormal({ seed: 7 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.scale);
console.log(geek.mode);
console.log(geek.distribution);
Output:
{
"scale": 2,
"mode": "fanIn",
"distribution": "normal"
}
Individual values:
2
fanIn
normal
5. tf.initializers.Initializer class .heUniform() function: It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / fan_in) and fan_in is the number of input units in the weight tensor.
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.heUniform() function
const geek = tf.initializers.heUniform({ seed: 7 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.scale);
console.log(geek.mode);
console.log(geek.distribution);
Output:
{
"scale": 2,
"mode": "fanIn",
"distribution": "uniform"
}
Individual values:
2
fanIn
uniform
6. tf.initializers.Initializer class .identity() function: It generates the identity matrix, scaled by an optional gain. It should only be used for 2D square matrices (the sketch after the example applies it to a 3x3 shape).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Generates the identity-matrix initializer with gain 1
const value = tf.initializers.identity({ gain: 1.0 });

// Print gain
console.log(value);
Output:
{
"gain": 1
}
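Applying the initializer to a square 2D shape makes the gain visible; a minimal sketch (the 3x3 shape and gain of 2 are arbitrary choices):
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// A 3x3 identity matrix scaled by the gain
tf.initializers.identity({ gain: 2 }).apply([3, 3]).print();
// Prints 2 on the diagonal and 0 everywhere else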
7. tf.initializers.Initializer class .leCunNormal() function: It draws samples from a truncated normal distribution centered at 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.leCunNormal() function
const geek = tf.initializers.leCunNormal({ seed: 3 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.scale);
console.log(geek.mode);
console.log(geek.distribution);
Output:
{
"scale": 1,
"mode": "fanIn",
"distribution": "normal"
}
Individual values:
1
fanIn
normal
8. tf.initializers.Initializer class .leCunUniform() function: It draws samples from a uniform distribution in the interval [-limit, limit], where limit = sqrt(3 / fan_in) and fan_in is the number of input units in the weight tensor.
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.leCunUniform() function
const initializer = tf.initializers.leCunUniform({ seed: 4 });
console.log(initializer);

// Printing individual configuration values
console.log("\nIndividual Values\n");
console.log(initializer.scale);
console.log(initializer.mode);
console.log(initializer.distribution);
Output:
{
"scale": 1,
"mode": "fanIn",
"distribution": "uniform"
}
Individual Values
1
fanIn
uniform
9. tf.initializers.Initializer class .ones() function: It creates an initializer that generates tensors with every element set to 1.
Example:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// Create a ones initializer and apply it to a 3x4 shape
var GFG = tf.initializers.ones().apply([3, 4]);

// Print the resulting tensor
GFG.print();
Output:
Tensor
[[1, 1, 1, 1],
[1, 1, 1, 1],
[1, 1, 1, 1]]
10. tf.initializers.Initializer class .orthogonal() function: It generates a random orthogonal matrix, optionally scaled by a gain factor (an orthogonality check follows the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.orthogonal() function
let geek = tf.initializers.orthogonal({ gain: 1 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.DEFAULT_GAIN);
console.log(geek.gain);
Output:
{
"DEFAULT_GAIN": 1,
"gain": 1
}
Individual values:
1
1
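A quick way to see that the generated matrix is orthogonal is to multiply its transpose by itself, which should give (approximately) the identity matrix. A minimal sketch with an arbitrary 4x4 shape:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// Draw a 4x4 orthogonal matrix and check that Q^T * Q is (close to) the identity
const q = tf.initializers.orthogonal({ gain: 1 }).apply([4, 4]);
q.transpose().matMul(q).print();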
11. tf.initializers.Initializer class .randomNormal() function: It generates random values drawn from a normal distribution (overriding the default mean and stddev is sketched after the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.randomNormal() function
let geek = tf.initializers.randomNormal({ seed: 3 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.DEFAULT_MEAN);
console.log(geek.DEFAULT_STDDEV);
console.log(geek.mean);
console.log(geek.stddev);
Output:
{
"DEFAULT_MEAN": 0,
"DEFAULT_STDDEV": 0.05,
"mean": 0,
"stddev": 0.05
}
Individual values:
0
0.05
0
0.05
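The defaults above (mean 0, stddev 0.05) can be overridden in the configuration object; a minimal sketch that draws values scattered around 1 instead:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// A normal initializer centered at 1 with a wider spread
const init = tf.initializers.randomNormal({ mean: 1, stddev: 0.2, seed: 3 });
init.apply([2, 3]).print(); // values scattered around 1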
12. tf.initializers.Initializer class .randomUniform() function: It generates random values drawn from a uniform distribution; the values are distributed uniformly between the configured minval and maxval (setting these explicitly is sketched after the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.randomUniform() function
let geek = tf.initializers.randomUniform({ seed: 5 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.DEFAULT_MINVAL);
console.log(geek.DEFAULT_MAXVAL);
console.log(geek.minval);
console.log(geek.maxval);
Output:
{
"DEFAULT_MINVAL": -0.05,
"DEFAULT_MAXVAL": 0.05,
"minval": -0.05,
"maxval": 0.05
}
Individual values:
-0.05
0.05
-0.05
0.05
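As with the normal variant, the minval and maxval can be set explicitly in the configuration object; a minimal sketch drawing values between -1 and 1:
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// A uniform initializer over [-1, 1]
const init = tf.initializers.randomUniform({ minval: -1, maxval: 1, seed: 5 });
init.apply([2, 3]).print(); // every value lies between -1 and 1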
13. tf.initializers.Initializer class .truncatedNormal() function: It generates random values drawn from a truncated normal distribution; values more than two standard deviations from the mean are discarded and redrawn (this property is checked after the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.truncatedNormal() function
let geek = tf.initializers.truncatedNormal({ seed: 13 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.DEFAULT_MEAN);
console.log(geek.DEFAULT_STDDEV);
console.log(geek.mean);
console.log(geek.stddev);
Output:
{
"DEFAULT_MEAN": 0,
"DEFAULT_STDDEV": 0.05,
"mean": 0,
"stddev": 0.05
}
Individual values:
0
0.05
0
0.05
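Because out-of-range values are redrawn, no sample can be more than two standard deviations from the mean. A small sketch that checks this on a large draw (the shape and stddev are arbitrary):
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// With mean 0 and stddev 1, every value must stay inside [-2, 2]
const init = tf.initializers.truncatedNormal({ mean: 0, stddev: 1, seed: 13 });
init.apply([1000]).abs().max().print(); // at most 2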
14. tf.initializers.Initializer class .varianceScaling() function: It adapts its scale to the shape of the weights. With distribution = 'normal', samples are drawn from a truncated normal distribution centered at 0 with stddev = sqrt(scale / n), where n is the number of input units, output units, or their average, depending on the mode (an explicit configuration is sketched after the example below).
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Initializing the .initializers.varianceScaling() function
let geek = tf.initializers.varianceScaling({ seed: 33 });

// Printing the initializer object
console.log(geek);

// Printing individual configuration values
console.log('\nIndividual values:\n');
console.log(geek.scale);
console.log(geek.mode);
console.log(geek.distribution);
Output:
{
"scale": 1,
"mode": "fanIn",
"distribution": "normal"
}
Individual values:
1
fanIn
normal
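The Glorot, He, and LeCun initializers above are all special cases of varianceScaling with particular scale / mode / distribution settings, and those settings can be passed explicitly. A minimal sketch that reproduces the He-normal configuration shown earlier (scale 2, mode 'fanIn', normal distribution):
Javascript
// Importing the tensorflow.js library
const tf = require("@tensorflow/tfjs");

// The same configuration the article's heNormal() example printed
const init = tf.initializers.varianceScaling({
    scale: 2,
    mode: 'fanIn',
    distribution: 'normal'
});
console.log(init.scale, init.mode, init.distribution); // 2 fanIn normal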
15. tf.initializers.Initializer class .zeros() function: It is an initializer that generates tensors initialized to 0.
Example:
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs"

// Calling tf.initializers.zeros() function
const initializer = tf.initializers.zeros();

// Printing the initializer object
console.log(initializer);
Output:
Zeros {}
Reference: https://js.tensorflow.org/api/latest/#class:initializers.Initializer
