Neptune.ai and ClearML are both ML platforms that help with experiment tracking, model management, and collaboration among data scientists and MLOps engineers.
This comparison reflects the state of both platforms as of Q1 2023.
While there are some similarities between these two platforms, there is a key difference: Neptune.ai focuses primarily on experiment tracking (experiment logging, sharing, and collaboration), while ClearML has a broader scope and a wider range of features, focused on automating and scaling the ML workflow: automated hyperparameter tuning/optimization, ML pipeline management, versioning, data management, model orchestration, model serving, and more.
Both have experiment tracking features, but ClearML also provides:
- Hyperparameter Optimization: ClearML provides hyperparameter optimization tools that let users tune their models automatically. ClearML can automatically optimize a given task while monitoring and recording the execution details of each run (see the first sketch after this list).
https://clear.ml/docs/latest/docs/fundamentals/hpo/
- Pipelines: ClearML provides several features for building and managing machine learning pipelines. In a nutshell, ClearML lets you create code “blocks” (i.e. tasks) and combine them into advanced automated pipelines (task orchestration, for example: run task 2 only if task 1 passed, or only if the model's accuracy exceeded a given threshold). The second sketch after this list shows the decorator-based API.
https://clear.ml/docs/latest/docs/pipelines/
- Serving/Production and Monitoring: ClearML provides a model-serving command-line utility for model deployment and orchestration. It enables model deployment, including serving and preprocessing code, to a Kubernetes cluster or a custom container-based solution. Key features are automatic CI/CD deployment and real-time model monitoring.
https://clear.ml/docs/latest/docs/clearml_serving/
- Built-in report tool: a markdown environment for creating reports, with built-in features to quickly include results from ClearML modules.
https://clear.ml/blog/introducing-clearml-reports/
- Auto-Scaler: you can define rules and thresholds for automatically provisioning and de-provisioning resources such as virtual machines or Kubernetes clusters based on task workload demand. The autoscaler integrates with popular cloud providers (AWS, Azure, and GCP) as well as on-premises infrastructure.
https://clear.ml/docs/latest/docs/guides/services/aws_autoscaler/
- Hyper-Datasets (paid only, not open source): essentially collections of data that can be versioned, tracked, and shared across multiple experiments; an abstraction over datasets with useful features.
https://clear.ml/docs/latest/docs/hyperdatasets/overview/
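For illustration, here is a minimal sketch of ClearML's HPO module; the task ID, parameter names, and metric names are placeholders you would replace with values from your own tracked training task:

from clearml import Task
from clearml.automation import DiscreteParameterRange, HyperParameterOptimizer, UniformParameterRange
from clearml.automation.optuna import OptimizerOptuna
# Register the optimization itself as a task, so it is recorded like any other run
task = Task.init(project_name='project', task_name='hpo', task_type=Task.TaskTypes.optimizer)
optimizer = HyperParameterOptimizer(
    base_task_id='<training-task-id>',  # placeholder: the tracked task to optimize
    hyper_parameters=[
        UniformParameterRange('General/learning_rate', min_value=1e-4, max_value=1e-1),
        DiscreteParameterRange('General/batch_size', values=[32, 64, 128]),
    ],
    objective_metric_title='accuracy',    # placeholder: a metric the task reports
    objective_metric_series='validation',
    objective_metric_sign='max',          # maximize validation accuracy
    optimizer_class=OptimizerOptuna,
    max_number_of_concurrent_tasks=2,
    total_max_jobs=10,
)
optimizer.start_locally()  # or start() to dispatch runs to a remote agent queue
optimizer.wait()
optimizer.stop()

Each generated run shows up in the WebUI as its own task, with its parameter combination and metrics recorded.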
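And a minimal sketch of the decorator-based pipeline API, under the same caveat: the step names and bodies here are hypothetical.

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['data'], cache=True)
def load_data(source_url):
    # hypothetical step: fetch and return the training data
    return [1, 2, 3]

@PipelineDecorator.component(return_values=['accuracy'])
def train(data):
    # hypothetical step: train a model and return its accuracy
    return 0.9

@PipelineDecorator.pipeline(name='demo pipeline', project='project', version='0.1')
def run_pipeline(source_url):
    data = load_data(source_url)
    return train(data)  # scheduled only after load_data completes

PipelineDecorator.run_locally()  # debug mode; remove to execute the steps on agents
run_pipeline(source_url='https://example.com/data.csv')

Each decorated function becomes a tracked ClearML task, and the controller handles the ordering between them.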
Main Aspects Comparison
Setup/Requirements
There are no special requirements for setup other than having the clearml Python package (if using the managed hosting) or the neptune-client package installed.
For a self-hosted ClearML server, the pre-built options are:
- pre-built AWS EC2 AMIs
- pre-built GCP custom images
- pre-built Docker images
- Kubernetes + Helm
Tracked metadata can be accessed via the CLI, a custom API, or the Python SDK.
- ClearML also supports REST API access
- Neptune.ai also has an R SDK
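Both Python read paths are small. An illustrative sketch (project names, run IDs, and field names are placeholders; the Neptune calls assume the 1.x client):

# ClearML: fetch a tracked task and read its metadata back
from clearml import Task
task = Task.get_task(project_name='project', task_name='experiment')
params = task.get_parameters()         # dict of tracked hyperparameters
scalars = task.get_reported_scalars()  # logged scalar series

# Neptune.ai: reopen a run in read-only mode and fetch fields
import neptune
run = neptune.init_run(project='workspace/project', with_id='RUN-1', mode='read-only')
lr = run['parameters/lr'].fetch()        # a single value
loss = run['train/loss'].fetch_values()  # a series, as a pandas DataFrame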
Tracking and other features require minimal code changes: adding decorators or API calls to start tracking, create tasks, log data, and so on. You don't need to restructure your code; you just add a few lines.
Example in ClearML: creating a task from a .py script
from clearml import Task
task = Task.init(project_name='project', task_name='experiment')
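The Neptune.ai equivalent is just as small; a minimal sketch assuming the 1.x neptune client (the project and field names are placeholders):

import neptune
run = neptune.init_run(project='workspace/project', name='experiment')
run['parameters'] = {'lr': 0.01, 'batch_size': 64}  # log hyperparameters
run['train/accuracy'].append(0.87)                  # log a metric series
run.stop()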
Experiment Tracking Features Comparison
Both can perform experiment tracking: the process of saving all experiment-related information you care about, for every experiment you run. This metadata will strongly depend on your project, but it may include: scripts used to run the experiment, environment config files, data versions used for training and evaluation, parameters, evaluation metrics, model weights, performance, and so on.
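In ClearML, for instance, most of this metadata hangs off the task created earlier; a minimal sketch with illustrative names and values:

from clearml import Task
task = Task.init(project_name='project', task_name='experiment')
params = task.connect({'lr': 0.01, 'epochs': 10})  # tracked, and editable from the UI
logger = task.get_logger()
logger.report_scalar(title='accuracy', series='validation', value=0.87, iteration=1)
task.upload_artifact(name='eval-report', artifact_object='report.csv')  # file or object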
Some key differences are listed below:
- Both Neptune.ai and ClearML can track notebooks (code and output cells) with snapshots and versioning, but in general Neptune.ai has stronger notebook integration/support:
→ Log and display notebook checkpoints during model training
→ Connect notebook checkpoints with model training runs in Neptune
→ Organize and browse checkpoints
→ Compare notebooks side-by-side
- In general, Neptune.ai has a slightly more advanced and interactive WebUI for experiment tracking compared to ClearML (even though ClearML has a few small unique features).
- ClearML supports Prediction Visualization (paid version only, and in a limited way); Neptune.ai does not support this kind of visualization:
→ Interactive confusion matrix for image classification
→ Overlayed prediction masks for image segmentation, bounding boxes, and more
Comparing Experiments: both can compare different experiments (runs) with side-by-side comparisons of parameters, values, metrics, and more.
- There are some differences: ClearML can also compare videos and audio clips, Git/source files, pip requirements.txt files (environment comparison), models, and CSV files. Neptune.ai does not provide these.
- Neptune.ai allows system information comparison and tracking: error stack traces, system details (host, owner), and console logs. This is less detailed in ClearML.
Organizing and searching experiments and metadata is slightly more efficient/powerful in Neptune.ai, since it is based on a query language and offers more features, such as advanced filters, the ability to save filters, search history, and custom experiment table views.
Experiment reproducibility and traceability: both tools allow experiments to be traced and re-run under the same conditions (the environment is versioned and reproducible). ClearML also manages a cache of all downloaded content, so nothing is duplicated and the same piece is never downloaded twice.
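Re-running under the same conditions is a two-call affair in ClearML; a minimal sketch (the task ID and queue name are placeholders):

from clearml import Task
original = Task.get_task(task_id='<experiment-task-id>')
clone = Task.clone(source_task=original, name='experiment (re-run)')
Task.enqueue(clone, queue_name='default')  # a clearml-agent on this queue re-executes it

The clone carries the original's versioned code, environment, and parameters, which is what lets the agent reproduce the run.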
Project Overview/Report: ClearML has an additional feature that presents a general picture of the project. The page consists of a graph showing a snapshot of a specific metric's value across the project's experiments, plus a space to enter and edit a project description in markdown.
https://clear.ml/docs/latest/docs/webapp/webapp_project_overview/
A detailed table comparing Neptune.ai and ClearML, characteristic by characteristic: https://neptune.ai/vs/clearml
Model Registry/Versioning
Model versioning is essentially the practice of tracking the changes made to a previously built ML model: a record of the changes you make by tweaking hyperparameters, retraining the model with more data, and so on.
Both ClearML and Neptune.ai support this feature. While tracking a task/experiment, they detect models (from known libraries) and automatically log/track them.
Code versions (used for training), parameters, dataset versions, results (metrics, visualizations) and model files (packaged models and weights) are versioned by both tools.
- ClearML also versions the environment (the Python environment used for the run); Neptune.ai does not.
- Neptune.ai also provides limited explanation versioning (SHAP, DALEX); ClearML does not.
- Neptune.ai lets you assign each versioned model a transition tag (develop, staging, production) and add annotations/comments from the WebUI (see the sketch after this list).
- While exploring registered models, Neptune.ai provides more advanced filtering/searching features.
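The transition tags mentioned above map to Neptune.ai's model registry API; a minimal sketch with the 1.x client (the model key, project, and file names are placeholders):

import neptune
model_version = neptune.init_model_version(model='PROJ-MOD', project='workspace/project')
model_version['model/binary'].upload('model.pt')  # attach the packaged weights
model_version['validation/accuracy'] = 0.92
model_version.change_stage('staging')             # e.g. 'staging' or 'production'
model_version.stop()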
Support and Integrations
The lists of supported external libraries differ:
ClearML: https://clear.ml/docs/latest/docs/integrations/libraries/
Neptune: https://docs.neptune.ai/integrations/#__tabbed_1_1
Model Training:
- Supported by both: fastai, LightGBM, PyTorch (basic, Ignite, and Lightning), scikit-learn, TensorFlow, Keras, XGBoost
- Only ClearML: CatBoost
- Only Neptune.ai: Catalyst, FBProphet, HuggingFace, Skorch
Hyperparameter Optimization:
- Supported by both: Keras Tuner, Optuna
- Only Neptune.ai: Scikit-Optimize
IDEs and Notebooks
JupyterLab, Jupyter Notebook, Google Colab and AWS Sagemaker are supported by both tools. Neptune.ai also supports Deepnote.
Other Libraries:
Neptune.ai also supports Kedro, ZenML, MLflow, Sacred, and TensorBoard.
Neptune.ai also has limited support for GitHub Actions, while ClearML supports Jenkins.
ClearML Pricing
Prices listed here are from Q1 2023.
ClearML has a credit-based, pay-for-what-you-use pricing model.
- 💲FREE: up to 3 users; limited features → Limited usage (usage threshold limits)
- 💲PRO: up to 10 users; $15 per user/month + usage cost. More enabled features. → Usage prices (after reaching the FREE threshold limits): you pay for artifact storage, metric events, API calls, and ClearML application usage hours.
Upgrade FREE ➡️ PRO: Pro has more enabled features, such as:
→ HyperParameter Optimization
→ Task Scheduling
→ Pipeline Triggers
→ Project Dashboards
→ Task Monitoring
→ ClearML Remote Development
- 💲SCALE: custom price/contract but unlimited users and usage
Upgrade PRO ➡️ SCALE (and ENTERPRISE): key features added over PRO are:
→ Hyper-Datasets (Data access abstraction layer fully separating code from data, along with DB query capabilities and version control built-in)
→ ClearML Deploy
→ Kubernetes Integration
→ Service-level agreement
- 💲ENTERPRISE: custom price/contract but unlimited users and usage
Upgrade SCALE ➡️ ENTERPRISE: key features added over SCALE are:
→ Configuration Vault (Store configuration vault per user / group / company, supporting environment variables, access keys, storage credentials etc.)
→ ClearML Custom Application (build your own app on top of ClearML)
→ Role-Based Access Control
→ Dedicated Support
Detailed pricing table: https://clear.ml/pricing/
A set of short videos describing each of ClearML's main modules/features:
https://clear.ml/docs/latest/docs/getting_started/video_tutorials/quick_introduction
Neptune.ai Pricing
Prices listed here are from Q1 2023.
Neptune.ai has a per user and per feature pricing model.
- 💲FREE: 1 user, 200 logging hours/month
- 💲TEAM: $150/month → unlimited users, 1,500 logging hours/month, standard email and chat support
- 💲ORGANIZATION: $600/month → unlimited users, 6,000 logging hours/month, priority email and chat support, user access management
- 💲CUSTOM: custom $/month → unlimited users, unlimited usage, SSO, customizable contract and SLA, dedicated support and onboarding
Detailed pricing table: https://neptune.ai/pricing
Personal (HPA) Evaluation:
Our requirements/needs: only experiment tracking is needed (track, compare, save, and reproduce manually performed tasks, i.e. data analysis, training, pre-processing, and so on) to improve our modeling workflow.
Neptune.ai has a strong focus on exactly this, and it is a bit more advanced and polished than ClearML here. It is a state-of-the-art platform for tracking machine learning experiments, logging metrics, analyzing performance charts, and much more.
For a broader objective, i.e. automating and improving the entire ML project life cycle from experiment tracking to deployment, serving, and monitoring:
→ ClearML offers a “stack” of functionalities, each of which can be used independently according to needs.
→ ClearML is more complex and extensive, with a steeper learning curve.
→ ClearML can be more difficult to set up compared to other experiment tracking tools.
→ The documentation is huge and can be challenging to navigate, especially for new users.