Google Machine Learning Engine Extension

Version 1.0.0

Request predictions from Cloud Machine Learning Engine. Through the predict action, you can request and serve predictions from an existing trained model.

After configuring this extension with your Google Cloud project ID and credentials, you make calls from an API proxy using the ExtensionCallout policy. In your policy configuration, you specify the model and model version to use, as well as the instance data you want the model to use for predicting.
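For example, an ExtensionCallout policy configuration might look like the following sketch. The policy name, the Connector value, and the output variable name are all hypothetical; the Connector element must match the name you gave the configured extension:

```xml
<ConnectorCallout async="false" continueOnError="false" enabled="true" name="EC-ML-Predict">
  <DisplayName>EC-ML-Predict</DisplayName>
  <!-- Name of the configured extension to call -->
  <Connector>ml-engine-extension</Connector>
  <Action>predict</Action>
  <!-- The Input is a JSON payload; see the predict action reference below -->
  <Input><![CDATA[{
    "model" : "mymodel",
    "version" : "version4",
    "instances" : ["the quick brown fox"]
  }]]></Input>
  <!-- Flow variable that receives the extension's response -->
  <Output>mlengine.response</Output>
</ConnectorCallout>
```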

To get started with Cloud ML Engine, see Introduction to Cloud ML Engine.

Prerequisites

This content provides reference for configuring and using this extension. Before using it from an API proxy with the ExtensionCallout policy, you must:

  1. Ensure that you have a model.

    For more about Cloud ML Engine models and other concepts, see Projects, Models, Versions, and Jobs.

  2. Use the GCP Console to generate a key for the service account the extension will use.

  3. Use the contents of the resulting key JSON file when adding and configuring the extension using the configuration reference.

About Google Cloud Machine Learning Engine

You can use Cloud Machine Learning Engine to train machine learning models using the resources of Google Cloud Platform. You can host your trained models on Cloud ML Engine so that you can send them prediction requests, and manage your models and jobs using GCP services.

Actions

predict

Perform predictions on the specified instance data using the specified model.

Syntax

<Action>predict</Action>
<Input><![CDATA[{
  "model" : model-for-prediction,
  "version" : model-version,
  "instances" : data-to-use-for-making-prediction
}]]></Input>

Example

<Action>predict</Action>
<Input><![CDATA[{
  "model" : "mymodel",
  "version" : "version4",
  "instances" : ["the quick brown fox", "la bruja le dio"]
}]]></Input>

Request parameters

| Parameter | Description | Type | Default | Required |
| --- | --- | --- | --- | --- |
| model | The model to use for predictions. | String | None. | Yes. |
| version | The version of the model to use for predictions. | String | None. | Yes. |
| instances | The instances to get predictions for. The shape of items in this value depends on the expectations of the model you're using to predict. For more, see Predict Request Details. | Array | None. | Yes. |

Response

A predictions array containing the prediction data returned by the model specified in the ExtensionCallout policy configuration.

{
  "predictions": [
    {
      "probabilities": [
        0.9435398578643799,
        0.05646015331149101
      ],
      "logits": [
        -2.816103458404541
      ],
      "classes": [
        "0"
      ],
      "class_ids": [
        0
      ],
      "logistic": [
        0.056460149586200714
      ]
    },
    {
      "probabilities": [
        0.9271764755249023,
        0.07282354682683945
      ],
      "logits": [
        -2.54410457611084
      ],
      "classes": [
        "0"
      ],
      "class_ids": [
        0
      ],
      "logistic": [
        0.07282353937625885
      ]
    }
  ]
}
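If the ExtensionCallout policy's Output element names a flow variable (mlengine.response is a hypothetical name), a downstream policy can return the prediction data to the client. A minimal AssignMessage sketch:

```xml
<AssignMessage async="false" continueOnError="false" enabled="true" name="AM-SetPredictions">
  <Set>
    <!-- mlengine.response holds the JSON returned by the predict action -->
    <Payload contentType="application/json">{mlengine.response}</Payload>
  </Set>
  <!-- Write the payload into the response returned to the client -->
  <AssignTo createNew="false" type="response"/>
</AssignMessage>
```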

Configuration Reference

Use the following when you're configuring and deploying this extension for use in API proxies. For steps to configure an extension using the Apigee console, see Adding and configuring an extension.

Common extension properties

The following properties are present for every extension.

| Property | Description | Default | Required |
| --- | --- | --- | --- |
| name | Name you're giving this configuration of the extension. | None. | Yes. |
| packageName | Name of the extension package as given by Apigee Edge. | None. | Yes. |
| version | Version number for the extension package from which you're configuring an extension. | None. | Yes. |
| configuration | Configuration value specific to the extension you're adding. See Properties for this extension package. | None. | Yes. |

Properties for this extension package

Specify values for the following configuration properties specific to this extension.

| Property | Description | Default | Required |
| --- | --- | --- | --- |
| projectId | ID of the GCP project containing the trained models used by this extension. | None. | Yes. |
| credentials | When entered in the Apigee Edge console, this is the contents of your service account key file. When sent via the management API, it is a base64-encoded value generated from the service account key file. | None. | Yes. |
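Putting the common and extension-specific properties together, a configuration might take roughly the following shape. This is a sketch only: the name, packageName, and projectId values are hypothetical, and the credentials placeholder stands in for the base64-encoded key file; see Adding and configuring an extension for the exact request format.

```json
{
  "name": "ml-engine-extension",
  "packageName": "google-cloud-ml",
  "version": "1.0.0",
  "configuration": {
    "projectId": "my-gcp-project",
    "credentials": "<base64-encoded contents of the service account key file>"
  }
}
```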