Artificial intelligence (AI) models require constant monitoring to ensure that their performance holds up over time. At Modzy, when we become aware that a model’s performance is faltering, we take action and refine the model to bring it back in line with its original performance.
What You Need to Know
The Modzy model marketplace contains a variety of models, some written at Modzy and others by our partner organizations. Many carry objective credentials and can be counted among the best in class or the commercial gold standard. These accolades allow Modzy to deliver the world’s best AI to our end users, but sustaining that level of performance requires constant work and vigilance.
Models function at a high level as long as the data they are presented with resembles the data on which they were trained. If the incoming data starts to differ from the training data, or if the signatures within the data change, the model’s performance deteriorates. These two phenomena are collectively referred to as model drift, and if models are to remain relevant in spite of them, they must evolve to handle deviations from the original training set.
One can understand the first cause of model drift by considering a transcription model trained on clean studio audio. It might transcribe the evening news well, but if the studio’s aging microphone begins overlaying a noisy hiss on the recordings, the model will get confused and need new training data to learn how to compensate for the noise in its input. This kind of drift is related to the data being fed into the model.
The second cause of model drift occurs when the underlying patterns being detected change. A model designed for catching fraud is only as good as the last fraud scheme it was trained on: it won’t perform well on emergent schemes until it is retrained to identify them. This kind of drift is related to the concept being identified by the model.
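Both kinds of drift are typically surfaced by comparing statistics of live data against the training baseline. As a minimal illustrative sketch (not Modzy’s actual monitoring implementation), the check below flags drift when the mean of a live feature stream shifts by more than a few training-set standard deviations:

```python
import statistics

def detect_data_drift(train_values, live_values, z_threshold=3.0):
    """Flag drift when the live mean moves more than z_threshold
    training standard deviations away from the training mean.
    Illustrative only; production monitors use richer statistical
    tests (e.g. Kolmogorov-Smirnov, population stability index)."""
    train_mean = statistics.fmean(train_values)
    train_std = statistics.stdev(train_values)
    live_mean = statistics.fmean(live_values)
    z_shift = abs(live_mean - train_mean) / train_std
    return z_shift > z_threshold

# Hypothetical audio loudness values: clean studio baseline vs.
# a feed from an aging microphone with added hiss.
baseline = [0.50, 0.52, 0.48, 0.51, 0.49, 0.50, 0.53, 0.47]
noisy    = [0.80, 0.82, 0.79, 0.81, 0.83, 0.78, 0.80, 0.81]

print(detect_data_drift(baseline, noisy))     # drift detected
print(detect_data_drift(baseline, baseline))  # no drift
```

When a check like this fires, the remedy differs by drift type: data drift calls for retraining on examples of the new inputs, while concept drift calls for relabeling and retraining on the new pattern itself.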
The Modzy platform was designed with model drift in mind. It includes infrastructure pathways that allow models to be corrected using techniques developed at Modzy Labs, as well as the ability to freeze a highly performant model on deployment.
At Modzy, we realize that one size rarely fits all, so our platform is built to be both updatable and backwards compatible. If a model update would help our global end user base, we implement it and make it available; however, accessing any model in Modzy requires providing a version number, so updating the platform won’t break any existing API calls. Additionally, some end users may not want updates to a model, so we never remove a model from the platform that end users are actively running on a routine basis.
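In practice, version pinning means every inference request names an explicit model version, so a platform-side update can never silently change which model serves an existing caller. The sketch below shows the idea; the field names and model identifier are illustrative assumptions, not Modzy’s documented API schema:

```python
def build_job_request(model_id, model_version, input_data):
    """Build an inference job payload that pins an explicit model
    version. Because the version is required, deploying model 1.0.2
    cannot break a caller pinned to 1.0.1. Field names are
    illustrative, not Modzy's documented schema."""
    if not model_version:
        raise ValueError("a model version must always be provided")
    return {
        "model": {"identifier": model_id, "version": model_version},
        "input": input_data,
    }

# Hypothetical model identifier and version for illustration.
job = build_job_request("sentiment-analysis", "1.0.1",
                        {"text": "Great product!"})
print(job["model"]["version"])  # "1.0.1"
```

Requiring the version in the request is what makes backwards compatibility cheap: old callers keep resolving to the exact model they validated against, while new callers opt in to updates by changing a single string.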
What This Means for You
It’s this attentiveness to end user needs that marks Modzy’s commitment to model maintenance and drift correction. If you’d like more information, feel free to contact [email protected].