
Domain Adaptation and Retraining

Retraining, based on domain adaptation and transfer learning, is emerging as one way to overcome the challenges of applying artificial intelligence (AI) models to certain real-world scenarios. The field of AI has achieved great success in recent years, and machine learning models are increasingly being applied in practical applications across many domains. However, the methods used to train these models still have limitations in certain real-world scenarios. A machine learning model typically needs to be trained on a large dataset of labeled data points in order to generalize well in practice, yet collecting a sufficient amount of labeled training data is expensive, time-consuming, or impractical in many settings. At Modzy, we have developed a solution to this problem based on transfer learning and domain adaptation:

  • For a specific task, we train our models on large datasets of labeled data points sampled from a wide range of distributions. As a result, our models are exposed to many different scenarios during training and can make correct predictions in a wide variety of scenarios at test time.
  • Additionally, we have developed a novel retraining solution that requires only a limited amount of computational resources and time. This quick retraining solution, built on transfer learning and domain adaptation, lets users quickly retrain our models on their own datasets, so the models can efficiently transfer their knowledge of a specific task to the users’ data and maintain strong performance.

Transfer learning and domain adaptation aim to improve the performance of a machine learning model on a user’s dataset by transferring the knowledge the model acquired during training on a different but related dataset and applying it to the user’s specific use case.
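
To make the idea concrete, here is a minimal sketch of transfer learning in PyTorch. This is a generic illustration rather than Modzy’s implementation: a model pretrained on a large source dataset is adapted to a small user dataset by freezing the pretrained backbone and retraining only a new classification head. The dataset path and the number of target classes are placeholders.

```python
# Minimal transfer-learning sketch (illustrative, not Modzy's implementation):
# adapt an ImageNet-pretrained ResNet-18 to a small user dataset by freezing
# the backbone and training only a new classification head.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Load a backbone pretrained on a large source dataset (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained weights so only the new head is updated.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer to match the user's task (class count is a placeholder).
num_user_classes = 5
model.fc = nn.Linear(model.fc.in_features, num_user_classes)

# Small labeled dataset supplied by the user (path is a placeholder).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
user_data = datasets.ImageFolder("path/to/user_dataset", transform=preprocess)
loader = DataLoader(user_data, batch_size=16, shuffle=True)

# Only the new head's parameters are optimized, which keeps retraining fast.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few epochs is often enough when the domains are related
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

Because only the small head is updated, this kind of retraining needs far less labeled data and compute than training the full model from scratch, which is what makes quick retraining practical.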

Modzy’s Approach to Retraining with Domain Adaptation and Transfer Learning

Let’s now take a closer look at Modzy’s quick retraining solution based on domain adaptation and transfer learning. Humans have a natural ability to transfer knowledge and experience across different tasks and domains. Our retraining solution imitates this behavior for machine learning models by using transfer learning. It is important to note that the similarity between the training dataset and the user’s dataset affects how well the retraining solution performs. Intuitively, a person who has learned the viola usually picks up the violin faster than someone who only knows how to play the piano. Our quick retraining solution improves performance while minimizing the need for users to create large datasets of labeled data points, which may be impractical in many real-world scenarios.

The input data for a machine learning model is usually sampled from a specific domain. For example, if we wanted to design and train a machine learning model that detects buildings in overhead imagery, the domain would be the set of all overhead imagery taken by satellites. A machine learning model is trained to perform a specific task on data sampled from that domain; in this example, the task is detecting buildings in satellite imagery.

Modzy’s quick retraining solution takes things one step further:

  • Our machine learning model is trained to detect buildings in our training dataset
  • The retraining approach then takes the knowledge gained from performing the task on the training domain and transfers it to your specific dataset for the same task, i.e., the model learns how to detect buildings in your dataset of overhead imagery captured by different types of sensors
  • By enabling the retraining feature for a specific model, you can retrain that model on your smaller dataset of labeled images in a short amount of time and expect strong performance on your specific dataset and use case
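
As an illustration of these steps, the sketch below retrains a pretrained detector on a small, user-supplied dataset of overhead imagery. It is a hedged, generic example rather than Modzy’s production pipeline: torchvision’s Faster R-CNN stands in for the model, and the images and annotations are random placeholders for the user’s labeled data.

```python
# Illustrative sketch of detector retraining (not Modzy's production pipeline):
# start from a COCO-pretrained Faster R-CNN, swap in a head for the user's
# classes, and briefly fine-tune on a small labeled dataset.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Detector pretrained on a large, broad dataset (COCO).
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box predictor so the detector outputs the user's classes
# (here: background + "building").
num_classes = 2
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# Placeholder data standing in for the user's labeled overhead imagery:
# each image is a [3, H, W] float tensor; each target holds boxes and labels.
user_images = [torch.rand(3, 512, 512) for _ in range(4)]
user_annotations = [
    {"boxes": torch.tensor([[100.0, 100.0, 200.0, 200.0]]),
     "labels": torch.tensor([1], dtype=torch.int64)}
    for _ in range(4)
]
user_dataset = list(zip(user_images, user_annotations))
loader = torch.utils.data.DataLoader(
    user_dataset, batch_size=2, shuffle=True,
    collate_fn=lambda batch: tuple(zip(*batch)),
)

optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.005, momentum=0.9
)

model.train()
for epoch in range(5):  # brief retraining on the small user dataset
    for images, targets in loader:
        loss_dict = model(list(images), list(targets))  # returns per-task losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In practice the random tensors would be replaced by the user’s labeled overhead imagery, and the number of epochs and learning rate would be tuned to the size of that dataset.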

At Modzy, we have developed retraining solutions using domain adaptation and transfer learning for the following use cases:

  • All types of object detection models created with the YOLO or RCNN families of architectures
  • All types of image classification models created with the ResNet family of architectures
  • Transformer-based language models designed to perform tasks such as sentiment analysis or language classification on large text documents
  • Transformer-based language models designed to translate from one language to another. For example, our retraining solution for our translation models helps transfer the translation knowledge of a model trained on academic text corpora to more colloquial parallel text documents in the same language pair
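
For the language use cases, the approach is analogous: start from a pretrained transformer and briefly fine-tune it on the user’s labeled text. The sketch below is a generic illustration using the Hugging Face Trainer API, not Modzy’s production retraining pipeline; the base model name, example texts, and labels are placeholders.

```python
# Illustrative sketch of adapting a pretrained transformer to a user's text data
# for sentiment analysis (generic example, not Modzy's production pipeline).
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Small labeled dataset supplied by the user (placeholder examples).
texts = ["The delivery was quick and the product works great.",
         "Support never answered my ticket."]
labels = [1, 0]  # 1 = positive, 0 = negative

class UserTextDataset(torch.utils.data.Dataset):
    """Wraps the user's labeled texts in the format Trainer expects."""
    def __init__(self, texts, labels):
        self.encodings = tokenizer(texts, truncation=True, padding=True)
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

# A brief fine-tuning run adapts the pretrained weights to the user's domain
# without training a language model from scratch.
args = TrainingArguments(output_dir="retrained-model", num_train_epochs=3,
                         per_device_train_batch_size=8, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=UserTextDataset(texts, labels)).train()
```

The same pattern carries over to translation models, where the fine-tuning data would be a small set of parallel sentences drawn from the user’s more colloquial text instead of labeled sentiment examples.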

Being able to retrain AI models over time is a crucial step in the continuous development and deployment of AI models for mission-critical applications. Modzy’s novel retraining solution gives users the ability to retrain our models on their specific datasets for their specific applications.
