Deploying models built with AWS SageMaker

You can deploy models built and/or trained with AWS SageMaker to Modelbit.

To begin, retrieve your model's name from SageMaker. You can find it in the SageMaker section of the AWS console. Click "Models" in the left-hand navigation. You will see a list of your models, with "name" as the first column in the list.

You can also retrieve your model's name programmatically in Python:

import boto3
client = boto3.client('sagemaker')

model_list = client.list_models()

# E.g. retrieve the first model's name
your_model_name = model_list['Models'][0]['ModelName']
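
If your account has many models, list_models also supports server-side filtering. Here is a minimal sketch, where 'churn' is a hypothetical placeholder for part of your model's name:

import boto3
client = boto3.client('sagemaker')

# 'churn' is a hypothetical substring of your model's name; NameContains filters server-side
for summary in client.list_models(NameContains = 'churn', MaxResults = 50)['Models']:
    print(summary['ModelName'], summary['CreationTime'])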

Now we need the model's data location, which will be an S3 URL. This is also available in the "Models" tab of the SageMaker section of the AWS console. Simply click your model in the list, and then look for "Model data location".

Or to retrieve it programmatically, given the model name:

import boto3
client = boto3.client('sagemaker')

model = client.describe_model(ModelName = your_model_name)

# The model dictionary's structure can vary a bit from model to model. Look for a PrimaryContainer or a list of Containers.
# 'ModelDataUrl' will be an attribute on the container.
if 'PrimaryContainer' in model:
    model_data_location = model['PrimaryContainer']['ModelDataUrl']
else:
    model_data_location = model['Containers'][0]['ModelDataUrl']

Now download the model file from S3 and untar it. This may be easiest using your computer's web browser and file manager. However, you can also do it in code:

import boto3
from urllib.parse import urlparse
import tarfile

# Split the S3 URL into its bucket and key
model_data_s3_url = urlparse(model_data_location)
s3bucket = model_data_s3_url.netloc
s3filename = model_data_s3_url.path[1:]

# Download the tarball from S3, then extract it into the current directory
boto3.resource('s3').Bucket(s3bucket).Object(s3filename).download_file('model.tar.gz')
with tarfile.open('model.tar.gz') as tar:
    tar.extractall()

Inside the tarfile you just extracted will be your model file, as well as any Python scripts you used in SageMaker for feature engineering. These can be deployed to Modelbit just like any other Python model!
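
To see exactly what was extracted, you can list the archive's contents. This is just a quick inspection sketch; the files printed will be whatever SageMaker packaged for your particular model:

import tarfile

# Print every file packed into the SageMaker model artifact
with tarfile.open('model.tar.gz') as tar:
    for name in tar.getnames():
        print(name)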

For example, if you trained a model using XGBoost 0.90 in SageMaker, your tarfile will contain an xgboost-model file you can deploy. In this example, we open the file, unpickle it, call it from a deploy function, and deploy it with a custom XGBoost 0.90 environment.

import pickle
from xgboost import DMatrix

# Unpickle the XGBoost model that SageMaker saved in the tarball
with open('xgboost-model', 'rb') as xgb_file:
    xgb = pickle.load(xgb_file)

def inferFromSageMakerModel(feature_a: int, feature_b: float) -> float:
    # Some XGBoost `predict` calls require casting input rows to DMatrix
    return float(xgb.predict(DMatrix([[feature_a, feature_b]]))[0])

# Assumes `mb` is your Modelbit session, e.g. from `import modelbit; mb = modelbit.login()`
mb.deploy(
    inferFromSageMakerModel,
    name = 'sagemaker_predictor',
    python_packages = ['xgboost==0.90']) # Match the XGBoost version from SageMaker
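
As a quick smoke test before (or after) deploying, you can call the deploy function locally. The feature values below are placeholders, not values from the original example:

# Hypothetical feature values, purely for a local smoke test
print(inferFromSageMakerModel(3, 1.5))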