We know data scientists like to use a variety of tools during the model development process. That’s why Modzy was designed with flexibility top of mind. You can deploy any model to Modzy—as long as you’re able to containerize it according to Modzy’s specification. We’ve made that process straightforward through documentation, template code, and complete examples.

There may also be situations where data scientists are willing to forego a bit of granular control over their model containers in favor of automation and ease of use. To that end, we’ve built direct integrations, and are constantly adding more, through which users can easily deploy models trained using AWS SageMaker, MLflow, and Azure ML (coming soon) to Modzy.

Finally, we recently joined two open-source projects that make the containerization process easier for data scientists: the Open Model Interface (OMI), an open specification for model containers, and chassis.ml, a Kubernetes-based service that builds MLflow models into OMI-compatible container images. We're so excited by this project that we built an integration for streamlined deployment to both Modzy and KFServing. Let's dive into how these integrations and auto-containerization tools work, and how easy they make it for model developers to build Modzy-compatible container images.

AWS SageMaker Integration

AWS SageMaker provides pre-built algorithms for machine learning tasks such as image classification, object detection, and semantic segmentation. Users provide their own data and kick off a training job that executes on AWS EC2 instances. Once the SageMaker training job completes, Modzy's integration asks users to point to the output model artifacts and provide a bit of additional metadata. From there, the platform takes over, carrying the model all the way through to a production-ready deployment on Modzy.
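Concretely, the hand-off amounts to a pointer at the training job's S3 output plus some basic metadata. The sketch below models that request as plain data; the function and field names are illustrative, not Modzy's actual API.

```python
# Hypothetical sketch of the hand-off to Modzy's SageMaker integration.
# The helper and field names below are illustrative, not Modzy's actual API;
# the real integration just needs the artifact location plus basic metadata.

def build_deployment_request(training_job_name, artifact_s3_uri, model_name, version):
    """Bundle a pointer to SageMaker's output artifacts with model metadata."""
    # SageMaker training jobs write their output to S3 (typically model.tar.gz).
    if not artifact_s3_uri.startswith("s3://"):
        raise ValueError("expected an S3 URI pointing at the training job output")
    return {
        "source": {
            "trainingJob": training_job_name,
            "modelArtifacts": artifact_s3_uri,
        },
        "metadata": {
            "name": model_name,
            "version": version,
        },
    }

request = build_deployment_request(
    training_job_name="image-classification-2021-07-01",
    artifact_s3_uri="s3://my-bucket/output/model.tar.gz",
    model_name="Image Classifier",
    version="0.0.1",
)
```

Everything downstream of this request (container build, testing, deployment) happens inside the platform.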

MLflow Integration

MLflow offers a variety of useful features. One of them, "MLflow Models," lets you package machine learning models from a variety of "flavors" (machine learning frameworks) into a standard format, with built-in support for many popular frameworks such as scikit-learn, XGBoost, and PyTorch. If you train a model using a supported flavor, you can use MLflow's built-in functionality to save it in this standardized format, along with a "Model Signature" defining the model's inputs and outputs. The user then provides a bit of additional metadata, and the integration takes it from there, all the way to a production-ready model deployed to Modzy.


chassis.ml

The integrations we've covered so far run as a service that ships as part of a Modzy instance. chassis.ml, however, is a recent open-source project that can run outside of Modzy, including on your laptop using minikube. Currently, Chassis packages MLflow-supported models into either Open Model Interface-compatible or KFServing-compatible Docker images, and the supported input models and deployment targets will continue to expand, making Chassis a useful tool for model developers to package their models for deployment without worrying about managing dependencies or building web services. If you would like to learn more about Chassis or are interested in contributing to the project, please visit https://chassis.ml/ for more information.
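At a high level, a Chassis build request boils down to: here is a saved MLflow model, here is the image to build, and here is the target it should be compatible with. The sketch below models that request as plain data; the field names and helper are hypothetical, not Chassis's actual API.

```python
# Hypothetical sketch of a Chassis-style build request: point the service at a
# saved MLflow model and say which image to build and which format to target.
# Field names and the helper are illustrative, not Chassis's real API.

VALID_TARGETS = {"omi", "kfserving"}  # the two image formats Chassis supports today

def build_job_spec(mlflow_model_dir, image_name, target):
    """Assemble a container-build job spec for a Chassis-like service."""
    if target not in VALID_TARGETS:
        raise ValueError(
            f"unsupported target {target!r}; expected one of {sorted(VALID_TARGETS)}"
        )
    return {
        "model": {"format": "mlflow", "path": mlflow_model_dir},
        "image": image_name,  # e.g. registry/repo:tag for the built image
        "target": target,     # OMI-compatible or KFServing-compatible image
    }

spec = build_job_spec("./iris_model", "registry.example.com/iris:0.0.1", "omi")
```

The appeal is that the service owns dependency management and the model-serving web layer, so the data scientist only supplies the saved model and a destination.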

We'll continue to build integrations with other popular model training tools and frameworks, and to expand the ways Modzy makes it easy for data scientists to quickly containerize and deploy models into production.