• Load a model composed of Layer objects, including its topology and optionally weights. See the tutorial "How to import a Keras Model" for usage examples.

    This method is applicable to:

    1. Models created with the tf.layers.*, tf.sequential, and tf.model APIs of TensorFlow.js and later saved with the tf.LayersModel.save method.
    2. Models converted from Keras or TensorFlow tf.keras using the tensorflowjs_converter.

    This method is not applicable to TensorFlow SavedModels or their converted forms. For those models, use tf.loadGraphModel instead.
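
    For contrast, here is a minimal sketch of loading a converted SavedModel as a graph model. The URL is a placeholder, and the [1, 3] input shape is assumed purely for illustration:

    // Sketch only: GraphModels support predict() but not the Layers API
    // (e.g. no summary() or fit()). The URL below is a placeholder.
    const graphModel = await tf.loadGraphModel(
        'https://example.com/my-saved-model/model.json');
    // Assumes a hypothetical model that takes a [1, 3] input.
    graphModel.predict(tf.ones([1, 3])).print();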

    Example 1. Load a model from an HTTP server.

    const model = await tf.loadLayersModel(
        'https://storage.googleapis.com/tfjs-models/tfjs/iris_v1/model.json');
    model.summary();

    Example 2. Save a model's topology and weights to browser local storage; then load it back.

    const model = tf.sequential(
        {layers: [tf.layers.dense({units: 1, inputShape: [3]})]});
    console.log('Prediction from original model:');
    model.predict(tf.ones([1, 3])).print();

    const saveResults = await model.save('localstorage://my-model-1');

    const loadedModel = await tf.loadLayersModel('localstorage://my-model-1');
    console.log('Prediction from loaded model:');
    loadedModel.predict(tf.ones([1, 3])).print();

    Example 3. Save a model's topology and weights to browser IndexedDB; then load it back.

    const model = tf.sequential(
        {layers: [tf.layers.dense({units: 1, inputShape: [3]})]});
    console.log('Prediction from original model:');
    model.predict(tf.ones([1, 3])).print();

    const saveResults = await model.save('indexeddb://my-model-1');

    const loadedModel = await tf.loadLayersModel('indexeddb://my-model-1');
    console.log('Prediction from loaded model:');
    loadedModel.predict(tf.ones([1, 3])).print();

    Example 4. Load a model from user-selected files via HTML file input elements.

    // Note: this code snippet will not work without the HTML elements in the
    // page
    const jsonUpload = document.getElementById('json-upload');
    const weightsUpload = document.getElementById('weights-upload');

    const model = await tf.loadLayersModel(
        tf.io.browserFiles([jsonUpload.files[0], weightsUpload.files[0]]));
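
    Other tf.io handler factories can be passed the same way. As a minimal sketch, assuming tf.io.http and a placeholder URL, an explicitly constructed HTTP IOHandler:

    // Sketch only: the URL is a placeholder.
    const handler = tf.io.http('https://example.com/my-model/model.json');
    const httpModel = await tf.loadLayersModel(handler);
    httpModel.summary();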

    Parameters

    • pathOrIOHandler: string | IOHandler

      Can be either of the following two formats:

      1. A string path to the ModelAndWeightsConfig JSON describing the model in the canonical TensorFlow.js format. For file:// (tfjs-node-only), http://, and https:// schemes, the path can be either absolute or relative. The content of the JSON file is assumed to be a JSON object with the following fields and values (a minimal sketch of such a file appears after the Returns section below):
        • 'modelTopology': A JSON object that can be either of:
          1. a model architecture JSON consistent with the format of the return value of keras.Model.to_json()
          2. a full model JSON in the format of keras.models.save_model().
        • 'weightsManifest': A TensorFlow.js weights manifest. See the Python converter function save_model() for more details. It is also assumed that model weights can be accessed from relative paths described by the paths fields in the weights manifest.
      2. A tf.io.IOHandler object that loads model artifacts with its load method.
    • Optional options: LoadOptions

      Optional configuration arguments for the model loading, including:

      • strict: Require that the provided weights exactly match those required by the layers. Default true. Passing false means that both extra weights and missing weights will be silently ignored.
      • onProgress: A progress callback of the form: (fraction: number) => void. This callback can be used to monitor the model-loading process, as in the sketch below.
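
      A minimal sketch of passing these options; the URL is a placeholder:

      // Sketch only: strict: false tolerates extra or missing weights;
      // onProgress receives the load fraction in [0, 1].
      const model = await tf.loadLayersModel(
          'https://example.com/my-model/model.json', {
            strict: false,
            onProgress: (fraction) =>
                console.log(`Loaded ${Math.round(fraction * 100)}%`)
          });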

    Returns Promise<LayersModel>

    A Promise of tf.LayersModel, with the topology and weights loaded.
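
    For reference, a minimal sketch of a model.json in the canonical format described under pathOrIOHandler above. The modelTopology content is omitted here, and the shard file name and weight entries are illustrative only:

    {
      "modelTopology": {},
      "weightsManifest": [{
        "paths": ["group1-shard1of1.bin"],
        "weights": [
          {"name": "dense_Dense1/kernel", "shape": [3, 1], "dtype": "float32"},
          {"name": "dense_Dense1/bias", "shape": [1], "dtype": "float32"}
        ]
      }]
    }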
