TensorFlow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.
The .executeAsync() method executes inference for the given model on the provided input tensors in an asynchronous manner. Use this method when your model contains control flow operations.
Syntax:
executeAsync(inputs, outputs?)
Parameters:
- inputs: The input tensor, array of tensors, or map of tensors for the model, keyed by the model's input node names. It is of type (tf.Tensor|tf.Tensor[]|{[name: string]: tf.Tensor}).
- outputs: The output node name(s) of the given TensorFlow model. If outputs are not specified, the model's default outputs are used. You can also inspect intermediate nodes of the model by adding their names to the outputs array, as shown in the sketch below. It is of type string or string[].
Return Value: It returns a Promise that resolves to a tf.Tensor or tf.Tensor[].
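Below is a minimal sketch of the different input and output forms described above. The model URL and the node names are placeholders for illustration only, not taken from a real model.
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs";

// Placeholder URL of a hosted graph model (illustrative only)
const model = await tf.loadGraphModel("https://example.com/model/model.json");

// 1. Single input tensor, default outputs: resolves to a tf.Tensor
const out1 = await model.executeAsync(tf.zeros([1, 224, 224, 3]));

// 2. Tensor map keyed by input node name, with explicit output node names
//    (placeholder names): resolves to a tf.Tensor[] in the requested order
const out2 = await model.executeAsync(
    { "input_node": tf.zeros([1, 224, 224, 3]) },
    ["output_node", "some/intermediate/node"]);

console.log(out1, out2);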
Example 1: In this example, we load a MobileNetV2 graph model from a URL and run inference with executeAsync().
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs";

// URL of the hosted MobileNetV2 model.json (not shown here)
const model_Url = "";

// Calling the loadGraphModel() method
const mymodel = await tf.loadGraphModel(model_Url);

// Defining inputs
const inputs = tf.zeros([1, 224, 224, 3]);

// Calling executeAsync() method
const res = await mymodel.executeAsync(inputs);

// Printing output
console.log(res);
Output:
Tensor [[-0.1800361, -0.4059965, 0.8190175, ..., -0.8953396, -1.0841646, 1.2912753],]
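The printed result is a tf.Tensor handle. A small follow-up sketch, reusing res and inputs from the example above, shows how to read the raw values and free the memory afterwards:
Javascript
// Reading the raw output values as a TypedArray
const values = await res.data();
console.log(values);

// Disposing of the tensors once they are no longer needed
res.dispose();
inputs.dispose();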
Example 2: In this example, we load MobileNetV2 from a TF Hub URL and request a specific output node.
Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs";

// URL of the MobileNetV2 model on TF Hub (not shown here)
const model_Url = "";

// Calling the loadGraphModel() method
const mymodel = await tf.loadGraphModel(
    model_Url, { fromTFHub: true });

// Defining inputs
const inputs = tf.zeros([1, 224, 224, 3]);

// Defining outputs
const outputs = "module_apply_default/MobilenetV2/Logits/output";

// Calling executeAsync() method
const res = await mymodel.executeAsync(inputs, outputs);

// Printing output
console.log(res);
Output:
Tensor [[-1.1690605, 0.0195426, 1.1962479, ..., -0.4825858, -0.0055641, 1.1937635],]
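As a small follow-up, assuming res above holds the classification logits, one way to turn them into a predicted class index is with tf.softmax() and argMax():
Javascript
// Convert the logits into probabilities and pick the most likely class index
const probabilities = tf.softmax(res);
const classIndex = (await probabilities.argMax(-1).data())[0];
console.log("Predicted class index:", classIndex);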
Reference: https://js.tensorflow.org/api/latest/#tf.GraphModel.executeAsync