Deploying ML Models in Production with Dataiku

  • Deploy models in one click on the cloud with Kubernetes
  • Scalable and highly available API engine
  • Feedback loop ready

Deploy to production in one click

  • Empower analysts and data scientists to deploy models into production in a few clicks.
  • Data cleaning, enrichment, and preprocessing steps are bundled with the model into a single, simplified scoring pipeline (see the sketch after this list).
  • Deployed models are versioned, so users can deploy new versions, compare them, and roll back at any time.
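
Conceptually, the bundled scoring pipeline behaves like a scikit-learn Pipeline: preprocessing and the model travel together, so callers only ever send raw feature values. A minimal sketch of that pattern (the column names, estimator, and training data are illustrative assumptions, not Dataiku's internal representation):

```python
# Illustrative only: Dataiku bundles preprocessing and the model for you;
# this sketch shows the equivalent pattern with scikit-learn.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical raw columns; real feature names come from your dataset.
numeric_cols = ["age", "income"]
categorical_cols = ["plan_type"]

preprocessing = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer()), ("scale", StandardScaler())]), numeric_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_cols),
])

# Preprocessing and model are versioned and deployed as one unit.
scoring_pipeline = Pipeline([
    ("prep", preprocessing),
    ("model", RandomForestClassifier(n_estimators=100)),
])

train = pd.DataFrame({
    "age": [34, 51, 27],
    "income": [52_000, 88_000, 31_000],
    "plan_type": ["basic", "premium", "basic"],
})
scoring_pipeline.fit(train, [0, 1, 0])

# At scoring time the caller sends raw values; the pipeline handles the rest.
print(scoring_pipeline.predict(train.head(1)))
```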

Scalability & high availability

  • Handle high volumes of real-time prediction requests with queuing, parallelism, and load balancing.
  • Run multiple scoring nodes for high availability.
  • Automatic elastic scaling to handle unexpected traffic surges.

Deploy on the cloud with Kubernetes

  • Deploy your API on-premises or in the cloud.
  • Native Kubernetes integration for elastic, reproducible deployments.
  • Full GPU support for deep-learning models.

Powerful API engine for your applications

  • Deploy as an API: visual models, custom Python or R models, custom Python or R functions, or SQL queries.
  • Easy-to-use REST API that embeds in your applications in a few lines of code (see the sketch after this list).
  • Automatic generation of ready-to-use code samples.
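
For illustration, a scoring call against a deployed endpoint might look like the sketch below. The API node URL, service and endpoint identifiers, authentication scheme, and payload fields are placeholder assumptions; in practice, use the ready-made code sample generated for your endpoint.

```python
# Illustrative only: URL, service/endpoint names, auth, and payload schema
# below are placeholders, not a documented Dataiku contract.
import requests

API_NODE_URL = "https://apinode.example.com"                      # hypothetical API node
ENDPOINT = f"{API_NODE_URL}/services/churn/predict"               # hypothetical service path

payload = {
    "features": {            # raw feature values; preprocessing happens server-side
        "age": 34,
        "income": 52000,
        "plan_type": "basic",
    }
}

# Authentication shown as an API key over basic auth is an assumption;
# follow your own deployment's security settings.
response = requests.post(ENDPOINT, json=payload, auth=("my-api-key", ""), timeout=10)
response.raise_for_status()
print(response.json())       # prediction and probabilities returned by the endpoint
```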

Avoid model drift with a feedback loop

  • Run multiple versions of the same model at the same time for automated A/B testing.
  • Monitor how input data changes over time.
  • Access the full history of logged queries and predictions at any time to verify that model performance is not drifting (see the sketch after this list).
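
As an illustration of the kind of drift check a feedback loop enables, the sketch below computes a population stability index (PSI) on one numeric feature, comparing the training distribution with values pulled from recent query logs. The feature, data, and 0.2 threshold are illustrative assumptions, not Dataiku defaults.

```python
# Illustrative only: a simple PSI check on one feature, comparing the training
# distribution against recently logged scoring queries.
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Population stability index between a reference and a recent sample."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Avoid log(0) for empty bins.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

training_income = np.random.default_rng(0).normal(55_000, 12_000, 10_000)  # reference data
recent_income = np.random.default_rng(1).normal(61_000, 12_000, 2_000)     # from query logs

score = psi(training_income, recent_income)
if score > 0.2:  # common rule of thumb; tune the threshold for your use case
    print(f"PSI={score:.2f}: significant drift, consider retraining or rolling back")
else:
    print(f"PSI={score:.2f}: distribution stable")
```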