Using Tecton features for inference
You can connect Tecton with Modelbit and use features from your Tecton feature stores in Modelbit deployments.
Create a Tecton API key
Modelbit connects to Tecton using a Tecton API key. If you don't already have one, you can create a Tecton API key by following the steps below:
From a terminal that's authenticated with Tecton, create a Service Account for Modelbit:
tecton service-account create --name modelbit
That'll create an API key:
Save this API Key - you will not be able to get it again.
API Key: de064...
Service Account ID: 3721a...
You also need to grant the new Service Account access to call the Tecton API in your workspace:
tecton access-control assign-role --service-account 3721a... --role consumer --workspace my-workspace
Your Tecton API key is now ready to be added to Modelbit!
Adding a Tecton API key to Modelbit
In your Modelbit workspace, open Settings and then Integrations.
Click the Tecton tile and paste in your Tecton API key, then click Save. Your Tecton API key has been added to Modelbit!
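The key is stored as a secret that your code can read with mb.get_secret. To confirm it's accessible, you can run a quick check in a notebook that's authenticated with Modelbit (this assumes the secret is named TECTON_API_KEY, the name used in the deployment code below):

import modelbit as mb

# Prints True if the Tecton API key is retrievable from Modelbit's secrets
print(mb.get_secret("TECTON_API_KEY") is not None)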
Using your Tecton feature store
In this example we assume you're already familiar with Tecton, so we'll deploy the example model and use the example Tecton feature store fraud_detection_feature_service from the Tecton Quickstart Tutorial.
import requests, json
import pandas as pd
import modelbit as mb

TECTON_WORKSPACE = "my-workspace"
TECTON_ACCOUNT = "my-account"

# Fetch online features for this user from the Tecton feature service
def get_tecton_feature_data(user_id: str):
    online_feature_data = requests.post(
        headers={"Authorization": "Tecton-key " + mb.get_secret("TECTON_API_KEY")},
        url=f"https://{TECTON_ACCOUNT}.tecton.ai/api/v1/feature-service/get-features",
        data=json.dumps({
            "params": {
                "feature_service_name": "fraud_detection_feature_service",
                "join_key_map": {"user_id": user_id},
                "metadata_options": {"include_names": True},
                "workspace_name": TECTON_WORKSPACE
            }
        }),
    )
    return online_feature_data.json()

# main function
def predict_fraud(user_id: str) -> float:
    feature_data = get_tecton_feature_data(user_id)
    # Build DataFrame columns from the feature names returned by Tecton
    columns = [f["name"].replace(".", "__") for f in feature_data["metadata"]["features"]]
    data = [feature_data["result"]["features"]]
    features = pd.DataFrame(data, columns=columns)
    # `model` is the classifier trained in the Tecton Quickstart Tutorial
    return model.predict(features)[0]
You'll notice that we're calling mb.get_secret("TECTON_API_KEY") to fetch our Tecton API key so we can authenticate with Tecton. This will work both in your local Python environment and in Modelbit deployments at inference time.
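For reference, the parsing in predict_fraud expects the get-features response to contain parallel lists of feature values and feature names, roughly like this (an illustrative, trimmed shape with placeholder names and values, not real output):

{
  "result": {
    "features": [0, 2, 14.64]
  },
  "metadata": {
    "features": [
      {"name": "user_transaction_counts.transaction_count_1d"},
      {"name": "user_transaction_counts.transaction_count_30d"},
      {"name": "user_transaction_amounts.amount_mean_1d"}
    ]
  }
}

Each name in metadata.features corresponds to the value at the same position in result.features, and the dots in the names are replaced with __ to make clean DataFrame column names.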
We're using the features returned by get_tecton_feature_data, and the model trained in the quickstart, to compute inferences in predict_fraud.
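The snippet above assumes model, the classifier trained in the quickstart, is already defined in your notebook session. If you saved that model to disk, a minimal sketch for loading it might look like this (the joblib format and file name are assumptions; adjust them to however you saved your model):

import joblib

# Hypothetical: load the quickstart-trained classifier from a local file
model = joblib.load("fraud_detection_model.joblib")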
We can call predict_fraud to test that everything is working as expected:
predict_fraud("user_502567604689")
Which correctly returns 0, not fraud!
Deploying to Modelbit
Finally, use mb.deploy or Git to deploy predict_fraud to Modelbit.
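For example, deploying from the same notebook takes a single call:

mb.deploy(predict_fraud)

Modelbit captures predict_fraud along with its dependencies, such as model and get_tecton_feature_data, and creates a REST endpoint you can call for inferences.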