Modzy, the ModelOps platform for responsibly deploying, managing, and getting value from artificial intelligence (AI) at scale, has joined the recently launched Open Model Interface (OMI) and chassis.ml open source projects, which together provide a standard, interoperable, and secure framework for containerizing AI models. OMI is a standard specification for containerizing machine learning models; chassis.ml converts models built with a range of training tools and frameworks into OMI-compatible containers that can run anywhere. By adopting OMI and chassis.ml, data scientists and developers gain a standard container specification that delivers the interoperability, portability, and security needed to integrate models seamlessly into production applications.

Given the diversity of model training tools and frameworks, and the fast-evolving nature of the space, there has been no standard way to containerize models built with different tools, nor any way to make those containers compatible with different runtime engines. This fragmentation increases deployment times, reduces model portability, and is a contributing factor to the estimated 50-90 percent of AI models that never make it into production applications.

“This open source collaboration has been one I’ve highly anticipated. There is now a means for data scientists to ‘auto-magically’ turn their models into secure, scalable machine learning containers that can run almost anywhere. Modzy is excited to join in these projects as we continue to find ways to support the AI community and its advancements,” said Seth Clark, Head of Product, Modzy.

Why OMI and Chassis?

The OMI is a specification for wrapping models in OCI-compliant containers with a simple yet powerful interface that makes it easier to move models into production. The OMI:

    • Creates a uniform way to convert models into portable, containerized applications that can run anywhere – in the cloud, on premises, or at the edge
    • Lets teams keep using their existing training tools, frameworks, and languages while adopting a standardized, DISA-compliant container for packaging models, ensuring a high level of security
    • Eliminates the need to build new integrations for each model type, since OMI provides a common, well-documented, supported interface
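To make the idea of a common container interface concrete, here is a minimal Python sketch of the kind of surface an OMI-style container exposes to a runtime engine: a status check plus a run call that takes raw bytes and returns structured results. All names here (`ToyModelContainer`, `status`, `run`) are illustrative assumptions, not the actual OMI service definition — consult the OMI specification for the real interface.

```python
class ToyModelContainer:
    """Hypothetical sketch of an OMI-style container interface.

    Method and field names are illustrative only; the real OMI
    specification defines its own interface.
    """

    def status(self) -> dict:
        # A runtime engine would poll something like this to confirm
        # the model inside the container is loaded and ready.
        return {"status": "OK", "model": "toy-sentiment", "version": "0.0.1"}

    def run(self, payload: bytes) -> dict:
        # Decode the raw input, apply a trivial stand-in "model",
        # and return a JSON-serializable result structure.
        text = payload.decode("utf-8")
        score = sum(1 for w in text.split() if w.lower() in {"good", "great"})
        return {"results": [{"data": {"score": score}}]}


container = ToyModelContainer()
print(container.status()["status"])
print(container.run(b"good great day")["results"][0]["data"]["score"])
```

Because every container presents the same small surface, a runtime platform can host models from any training framework without per-model integration work — which is the interoperability point the bullets above describe.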

OMI and chassis.ml let data scientists turn their models into containerized applications without having to learn dozens of new technologies. Chassis converts models from a wide range of training tools and frameworks into OMI-compliant containers that run on a variety of AI runtime platforms. Chassis.ml can also be integrated into existing MLOps pipelines, so models can be automatically containerized, scanned, and pushed to a secure container registry.
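In a pipeline like the one described above, the data scientist's main contribution is typically a single inference entry point that the containerization step wraps. The sketch below shows what such a function might look like; the function name `process` and the surrounding workflow are assumptions for illustration, and the exact chassis.ml client calls for registering it are deliberately omitted — see the chassis.ml documentation for the real API.

```python
import json


def process(input_bytes: bytes) -> dict:
    """Hypothetical inference entry point of the kind a tool like
    chassis.ml could wrap into an OMI-compliant container.

    Takes raw request bytes, runs "inference", and returns a
    JSON-serializable result. The trivial body stands in for a
    real model call.
    """
    record = json.loads(input_bytes)
    prediction = {"n_features": len(record.get("features", []))}
    return {"data": {"result": prediction}}


# Local smoke test before handing the function to a containerization
# step in a CI/CD or MLOps pipeline:
sample = json.dumps({"features": [0.1, 0.2, 0.3]}).encode("utf-8")
print(process(sample))
```

Keeping the entry point this small is what makes automatic containerization practical: the pipeline only needs to package the function and its dependencies, then scan and publish the resulting image to a registry.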

Next Steps and Resources

If you’re interested in learning more, please check out the following resources: