
Tensorflow.js tf.layers.activation() Function

Introduction: TensorFlow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment. The tf.layers.activation() function creates a layer that applies an activation function element-wise to every element of its input. An activation function can also be applied to input data through a dense layer.

Syntax: 

tf.layers.activation(args);    

Parameters: Below are the parameters accepted by this function:

  • args: It is an object with the following fields:
    • activation: The name of the activation function that is applied to every input element.
    • inputShape: The shape of the model's input layer. It is used when creating the input layer.
    • batchInputShape: Used when creating the input layer. It defines the batch shape of the samples in the input layer.
    • batchSize: Used when creating the input layer. It supplements batchInputShape when constructing the input layer.
    • dtype: The data-type of the layer. It is only relevant for the first layer of a model.
    • name: A string that names the layer.
    • trainable: A boolean indicating whether the layer's weights can be updated during training.
    • weights: The tensors holding the initial weight data for the layer.
    • inputDType: The data-type of the input data for the layer.

Returns: Activation

Below are some examples for this function:

Example 1: In this example, we will create an activation layer and inspect the object it returns.

Javascript

import * as tf from "@tensorflow/tfjs"
 
// Creating config for the activation layer
const config = {
    activation: 'sigmoid',
    inputShape: [5],
    dtype: 'int32',
    name: 'activationLayer'
};
 
// Defining the activation layer
const activationLayer = tf.layers.activation(config);
 
// printing return of activation layer
console.log(activationLayer);


Output: 

{
  "_callHook": null,
  "_addedWeightNames": [],
  "_stateful": false,
  "id": 38,
  "activityRegularizer": null,
  "inputSpec": null,
  "supportsMasking": true,
  "_trainableWeights": [],
  "_nonTrainableWeights": [],
  "_losses": [],
  "_updates": [],
  "_built": false,
  "inboundNodes": [],
  "outboundNodes": [],
  "name": "activationLayer",
  "trainable_": true,
  "initialWeights": null,
  "_refCount": null,
  "fastWeightInitDuringBuild": false,
  "activation": {}
}

Example 2: In this example, we will create an activation layer with some configuration and apply it to input data that has passed through a dense layer.

Javascript

import * as tf from "@tensorflow/tfjs"
 
// Configuration file for the activation layer
const geek_config = {
    activation: 'sigmoid',
    inputShape: [5],
    dtype: 'int32',
    name: 'activationLayer'
};
 
const geek_activation = tf.layers.activation(geek_config);
const geek_inputLayer = tf.layers.dense({units: 1});
 
// Our Input layer for the model
const geek_input = tf.input({shape: [7]});
 
// Making structure for the model
const geek_output = geek_inputLayer.apply(geek_input);
const geek_result = geek_activation.apply(geek_output);
 
// Making Model from structure 
const config2 = {inputs: geek_input, outputs: geek_result}
const model = tf.model(config2);
 
// Random test input for the model
const geek_testInput = tf.randomUniform([4, 7]);
const geek_activationResult = model.predict(geek_testInput);
geek_activationResult.print();


Output: 

Tensor
    [[0.4178988],
     [0.2027801],
     [0.2813435],
     [0.2546847]]

Reference: https://js.tensorflow.org/api/latest/#layers.activation

