
Tensorflow.js tf.train.adagrad() Function

Tensorflow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.

The tf.train.adagrad() function is used to create a tf.AdagradOptimizer that uses the Adaptive Gradient Algorithm (Adagrad).

Syntax:

tf.train.adagrad(learningRate, initialAccumulatorValue?)

Parameters:

  • learningRate: It specifies the learning rate to be used by the adaptive gradient descent algorithm.
  • initialAccumulatorValue (optional): It specifies the initial value of the accumulators. It must be positive.

Return value: It returns a tf.AdagradOptimizer.
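
The optional second argument can be passed directly when constructing the optimizer. Below is a minimal sketch of both forms; the value 0.1 used for initialAccumulatorValue is only an illustrative choice, not a required setting:

Javascript

// importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";

// Optimizer with only a learning rate
const opt1 = tf.train.adagrad(0.05);

// Optimizer with an explicit initial accumulator value
const opt2 = tf.train.adagrad(0.05, 0.1);

console.log(opt1, opt2);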

Example 1: Fit the function f(x) = x + y by learning the coefficients x and y.

Javascript

// importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";
 
const xs = tf.tensor1d([0, 1, 2]);
const ys = tf.tensor1d([1.3, 2.5, 3.7]);
 
const x = tf.scalar(Math.random()).variable();
const y = tf.scalar(Math.random()).variable();
 
// Define a function f(x) = x + y.
const f = x => x.add(y);
const loss = (pred, label) =>
    pred.sub(label).square().mean();
 
const learningRate = 0.05;
 
// Create adagrad optimizer
const optimizer =
  tf.train.adagrad(learningRate);
 
// Train the model.
for (let i = 0; i < 5; i++) {
   optimizer.minimize(() => loss(f(xs), ys));
}
 
// Make predictions.
console.log(
    `x: ${x.dataSync()}, y: ${y.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
    console.log(`x: ${i}, pred: ${pred}`);
});


Output

x: 0.8561810255050659, y: 0.6922483444213867
x: 0, pred: 0.6922483444213867
x: 1, pred: 1.6922483444213867
x: 2, pred: 2.6922483444213867
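
Note that the training loop above discards the loss value. The optimizer's minimize() method accepts an optional second argument (returnCost); when it is true, the call returns the cost as a scalar tensor, which is handy for watching convergence. A minimal sketch of that variant, assuming the same xs, ys, f, loss, and optimizer as in Example 1:

Javascript

// Train while logging the loss at each step
for (let i = 0; i < 5; i++) {
    const cost = optimizer.minimize(
        () => loss(f(xs), ys), /* returnCost= */ true);
    console.log(`step ${i}, loss: ${cost.dataSync()}`);
    cost.dispose();
}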

Example 2: Fit a quadratic function by learning the coefficients a, b, c.

Javascript

// importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";
 
const xs = tf.tensor1d([0, 1, 2, 3]);
const ys = tf.tensor1d([1.1, 5.9, 16.8, 33.9]);
 
const a = tf.scalar(Math.random()).variable();
const b = tf.scalar(Math.random()).variable();
const c = tf.scalar(Math.random()).variable();
 
const f = x => a.mul(
  x.square()).add(b.mul(x)).add(c);
const loss = (pred, label) =>
         pred.sub(label).square().mean();
 
const learningRate = 0.01;
const optimizer =
      tf.train.adagrad(learningRate);
 
// Train the model.
for (let i = 0; i < 10; i++) {
   optimizer.minimize(() => loss(f(xs), ys));
}
 
// Make predictions.
console.log(
    `a: ${a.dataSync()}, b: ${b.dataSync()}, c: ${c.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
   console.log(`x: ${i}, pred: ${pred}`);
});


Output

a: 0.3611285388469696, b: 0.6980878114700317, c: 0.8787991404533386
x: 0, pred: 0.8787991404533386
x: 1, pred: 1.9380154609680176
x: 2, pred: 3.7194888591766357
x: 3, pred: 6.223219394683838
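
Besides minimize(), the returned optimizer also exposes applyGradients(), so gradients can be computed and applied manually. A minimal sketch of one hand-rolled training step, assuming the same variables, f, loss, and optimizer as in Example 2:

Javascript

// Compute gradients of the loss w.r.t. all trainable
// variables, then let the Adagrad optimizer apply them
const {value, grads} = tf.variableGrads(() => loss(f(xs), ys));
optimizer.applyGradients(grads);
console.log(`loss: ${value.dataSync()}`);
tf.dispose([value, grads]);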

Reference: https://js.tensorflow.org/api/1.0.0/#train.adagrad

