
ML Model Integration & Visualization

Integrating trained Machine Learning (ML) models directly into your dashboards transforms them from static reports into dynamic analytical tools. ML Clever allows you to connect your models and visualize their performance, understand their inner workings, and even use them for real-time predictions directly within the dashboard interface.

This guide explores how to add, configure, and interact with specialized **Model Components**. These components are designed to display model-specific information like performance metrics (e.g., Accuracy, R², Confusion Matrices), feature importances, coefficient values, prediction results, and more. Mastering these components enables you to monitor model health, explain outcomes, and build interactive prediction workflows.

Example ML Clever dashboard showing model performance and prediction components

Understanding Model Components

Model components are specialized dashboard elements designed to interact with and display information derived from your trained ML models. They share characteristics with standard data visualization components but have unique features:

Model-Centric Display

Each component connects to a specific trained model and displays relevant information like performance metrics, feature importance, prediction interfaces, or actual vs. predicted plots.

Configurable Model Settings

Settings allow you to select the specific model, choose which metrics or features to display, configure prediction inputs, and customize visual aspects like chart types or color palettes.

Data & Model Driven

Components fetch data related to the model's performance (e.g., metrics from evaluation, prediction results) or structure (e.g., feature importances, coefficients) based on the linked model ID; see the sketch below for a conceptual illustration.

Interactive Prediction (Optional)

Some components, like the Prediction Calculator, provide interactive forms allowing users to input new data and receive real-time predictions from the integrated model.
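
For a conceptual picture of what "data and model driven" means here, the sketch below shows a component resolving its content from the linked model's artifacts. It is a minimal illustration only: the base URL, endpoint path, parameters, and response shape are assumptions, not the documented ML Clever API.

```python
import requests

API_BASE = "https://app.mlclever.example/api"  # hypothetical base URL
MODEL_ID = "model_12345"                       # ID of the linked trained model

# Hypothetical request: fetch the artifact a component needs (e.g., evaluation
# metrics or feature importances), keyed by the linked model ID.
response = requests.get(
    f"{API_BASE}/models/{MODEL_ID}/artifacts",
    params={"artifact": "feature_importances"},
    timeout=10,
)
response.raise_for_status()
artifact = response.json()

# A feature-importance component would render this mapping as a bar chart.
for feature, importance in artifact.get("importances", {}).items():
    print(f"{feature}: {importance:.3f}")
```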

Model Component Anatomy

Model components often include:

  • Content Area: Displays the model-specific visualization (metric report, chart, prediction form).
  • Settings Icon: Opens the configuration panel for the component, including model selection.
  • Info Icon: May show details about the connected model (name, type, score) or allow navigation to the full model details page.
  • Lock Indicator: Shows if the component's position/size is locked.
  • Prediction Interface (if applicable): Input fields, buttons, and result displays for making predictions.

Adding Model Components

Integrate your ML models by adding specialized components to your dashboard. The process is similar to adding data visualization components but involves selecting model-specific types and linking them to a trained model.

Workflow diagram for adding a model component
1. Open the 'Add Component' Modal

Click the "Add Component" () button in the dashboard toolbar.

2. Select the Model Component Type

Browse the component library. Look for components specifically designed for ML models (e.g., 'Model Summary', 'Feature Importance Chart', 'Prediction Calculator'). Select the one that meets your needs. The available components may depend on the type of model you intend to connect (see 'Model Component Library' below).

UI showing selection of model-specific component types
3. Connect Model & Set Initial Options

The crucial step is **connecting the component to a trained model**. You'll typically select the desired model from a list or provide its ID. Depending on the component, you might also set initial options like the primary metric to display or the features for an importance chart.

Form showing initial configuration options, including model selection
4. Add to Dashboard

Click "Add" or "Create". The model component will appear on your dashboard, linked to the selected model and displaying the relevant information. You can then reposition and resize it.

Configuring Model Component Settings

After adding a model component, use the Component Settings modal to fine-tune its behavior and appearance. This includes adjusting model-specific parameters, visualization options, and connecting to a different model if needed.

Model Component Settings modal showing live preview and configuration form

Settings Panel & Configuration

The settings panel provides powerful customization options and visual controls:

Accessing Settings

Settings Icon

Hover over the component on your dashboard to reveal a settings or cog icon, typically located in one of the corners. Clicking this icon opens the settings panel.

Modal Layout

The settings are presented within a modal window. This layout typically includes a live preview area alongside the configuration form, allowing you to see changes instantly.

Live Preview

The integrated preview pane shows an interactive representation of your component. It updates in real-time as you adjust settings in the form, providing immediate visual feedback on model visualizations or prediction interfaces.

Common Model Option Types

Model Selector (Connect Model)

A dropdown or search field to select the specific trained ML model instance (by ID or name) that the component should use.

Metric/Feature Selectors (Metrics, Features)

Dropdowns or multi-select lists to choose which performance metrics (e.g., Accuracy, MSE), features (for importance/coefficients), or prediction outputs (e.g., class probabilities) to display.

Text & Number Inputs (Title, Thresholds)

Used for component titles, descriptions, setting numerical thresholds (e.g., for highlighting metrics), or specifying chart parameters (e.g., number of features to show).

Visualization Controls (Color, Chart Type)

Options like color pickers or palette selectors for charts, toggles for showing/hiding elements (e.g., confidence intervals), or selectors for different chart representations (e.g., bar vs. horizontal bar for importance).

Other Actions

Lock: Prevents accidental modification of the component's position or size.

Delete: Permanently removes the component from the dashboard.

Instant Save: Changes made in the settings panel are typically saved automatically moments after modification.
Model Details Link: Look for a link or button within the settings to navigate to the full details page of the connected model for deeper analysis.
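
To make the option types above concrete, the snippet below sketches what a component's saved settings might look like once the model selector, metric selectors, thresholds, and visualization controls are filled in. The keys and values are illustrative assumptions, not the platform's actual settings schema.

```python
# Hypothetical settings object for a regression 'Performance Metrics' component.
# Each key corresponds to one of the option types described above.
component_settings = {
    "model_id": "model_12345",                  # Model Selector
    "metrics": ["r2", "mae", "rmse"],           # Metric/Feature Selectors
    "title": "House Price Model (Test Set)",    # Text input
    "highlight_threshold": {"r2": 0.80},        # Number input: flag R² below 0.80
    "visualization": {                          # Visualization Controls
        "layout": "cards",
        "palette": "viridis",
        "show_descriptions": True,
    },
}

# Changes made in the settings panel save automatically; this sketch only shows
# how the different option types fit together in one configuration.
```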

Interacting with Model Components

Model components offer specific interactions beyond standard layout manipulation:

Interacting with model components - using prediction calculator, hovering on charts

Using Prediction Calculators

Enter values into the input fields (which correspond to model features). Click 'Predict' to send the data to the model API and see the resulting prediction (e.g., class label, regression value, probabilities) displayed directly in the component.
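
Behind the 'Predict' button, the component sends the entered feature values to the model's prediction endpoint and renders the response. The sketch below assumes a `/predict`-style endpoint (mentioned in the troubleshooting section), but the exact URL, payload shape, and response fields are assumptions for illustration.

```python
import requests

# Hypothetical prediction endpoint for the linked model.
PREDICT_URL = "https://app.mlclever.example/api/models/model_12345/predict"

# Feature values as a user would type them into the calculator's input fields.
payload = {
    "inputs": {
        "tenure_months": 26,
        "monthly_charges": 74.50,
        "contract_type": "month-to-month",
    }
}

response = requests.post(PREDICT_URL, json=payload, timeout=10)
response.raise_for_status()
result = response.json()

# A classification model might return a label plus class probabilities,
# while a regression model would return a single numeric value.
print("Prediction:", result.get("prediction"))
print("Probabilities:", result.get("probabilities"))
```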

Interpreting Visualizations

Hover over elements in model charts (bars in feature importance, points in scatter plots, cells in confusion matrices) to view detailed tooltips with exact values, feature names, or class counts.

Accessing Model Info

Click the info icon or dedicated links within the component or its settings to view metadata about the connected model or navigate to its full details page for comprehensive analysis.

Layout & Locking

Standard dashboard interactions like moving, resizing, and locking components apply to model components as well, allowing you to organize your model analysis layout.

Model Component Library: Available Types

ML Clever provides a range of components tailored for different ML models and analysis tasks. Availability may depend on the specific model type (e.g., classification vs. regression). Common components include:

Model Summary / Details

Displays key information: model name, type, primary score (e.g., Accuracy, R²), training status, and links to full details.

Prediction Calculator

Provides an input form based on model features to make and display single predictions.

Model Parameters

Shows the hyperparameters and settings used during the model's training.

Predictions Table

Displays a table of predictions, often from a batch prediction job or historical test data.

Text Area

Standard text component to add notes, explanations, or context related to the model components.

Confusion Matrix

Visualizes the performance of a classification model, showing true/false positives/negatives.

ROC Curve

Plots the true positive rate against the false positive rate at various threshold settings.

Precision-Recall Curve

Plots precision against recall for different thresholds, useful for imbalanced datasets.

Classification Report

Table displaying key metrics per class (precision, recall, F1-score, support).

Actual vs. Predicted Plot

Scatter plot comparing the model's predicted values against the actual target values.

Performance Metrics (Regression)

Displays key regression metrics like R², MAE, MSE, RMSE in a table or card format.

Feature Importance / Coefficients

Displays the relative importance or learned coefficients of the model's features, often as a bar chart or table.

Importance/Coefficients Chart

A dedicated bar chart visualizing feature importances or coefficients, often configurable (top N, colors).

Prophet Actual vs. Predicted

Line chart comparing actual time series data against Prophet's forecast (yhat), often including confidence intervals.

Trend Components Analysis

Visualizes the decomposed components of a time series model (trend, seasonality, holidays).

Note: The exact components available depend on the type of model selected (e.g., 'Confusion Matrix' for classification, 'Actual vs. Predicted Plot' for regression) and may vary based on platform updates.
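
Most of the artifacts these components visualize are standard ML evaluation outputs. As a point of reference, the snippet below reproduces a few of them with scikit-learn; it illustrates what the Confusion Matrix, Classification Report, Feature Importance, and regression Performance Metrics components display, not how ML Clever computes them internally.

```python
from sklearn.datasets import load_breast_cancer, load_diabetes
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import (classification_report, confusion_matrix,
                             mean_absolute_error, r2_score)
from sklearn.model_selection import train_test_split

# Classification: what Confusion Matrix / Classification Report components show.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(confusion_matrix(y_te, clf.predict(X_te)))
print(classification_report(y_te, clf.predict(X_te)))

# Feature importances: the values behind an Importance/Coefficients Chart (top 5).
top5 = sorted(zip(load_breast_cancer().feature_names, clf.feature_importances_),
              key=lambda pair: pair[1], reverse=True)[:5]
print(top5)

# Regression: what a Performance Metrics (Regression) component shows.
Xr, yr = load_diabetes(return_X_y=True)
Xr_tr, Xr_te, yr_tr, yr_te = train_test_split(Xr, yr, random_state=0)
reg = RandomForestRegressor(random_state=0).fit(Xr_tr, yr_tr)
yr_pred = reg.predict(Xr_te)
print("R²:", r2_score(yr_te, yr_pred))
print("MAE:", mean_absolute_error(yr_te, yr_pred))
```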

Best Practices for Model Visualization

Effectively presenting model insights on a dashboard requires careful consideration:

Select Appropriate Model Visuals

Choose components relevant to the model type and the questions you want to answer (e.g., use Confusion Matrix for classification accuracy, Feature Importance for interpretability, Actual vs. Predicted for regression fit).

Clarity in Metrics and Labels

Ensure all metrics (Accuracy, R², MAE) are clearly labeled. Use descriptive titles for components and charts. Define axes clearly, especially for feature importance or coefficient plots.

Provide Context

Use Text Area components to explain what the model does, its primary use case, limitations, or the significance of certain metrics. Link to the full model details page for deeper dives.

Focus on Key Insights

Avoid overwhelming the dashboard with every possible metric or plot. Highlight the most important performance indicators and interpretable visualizations for the intended audience.

Consistent Layout for Comparison

If comparing multiple models, use a consistent layout and set of components for each to make comparisons easier.

Troubleshooting Model Component Issues

If model components aren't displaying correctly, consider these points:

Component Shows Error or "No Data"

• Verify Model Connection: Open Settings and ensure the correct trained model is selected and accessible.

• Check Model Status: Confirm the selected model has completed training (`training_status: complete`) and necessary artifacts (metrics, predictions) exist.

• Check Required Data: Ensure the model has the data needed for the component (e.g., feature importances calculated, evaluation metrics saved).

• API/Backend Issues: Check browser console for API errors related to fetching model data (`/components/.../data` or model-specific endpoints).

Prediction Calculator Fails

• Check Input Format: Ensure the data entered matches the expected types (numeric, categorical) and ranges for the model's features; see the sketch after these steps.

• Model API Endpoint: Verify the prediction API endpoint (`/predict`) is functioning correctly and the model is loaded/available for inference.

• Permissions: Ensure the user has permissions to make predictions with the selected model.

• Review Input Fields: Check if the input fields generated match the actual features the model was trained on.
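
A common cause of such failures is sending values with the wrong types, for example numbers arriving as strings from a web form. The sketch below shows a generic way to coerce inputs against a simple feature specification before calling the prediction endpoint; the specification format and feature names are assumptions for illustration.

```python
# Hypothetical feature specification: feature name -> expected Python type,
# mirroring the features the model was trained on.
FEATURE_SPEC = {
    "tenure_months": int,
    "monthly_charges": float,
    "contract_type": str,
}

def coerce_inputs(raw: dict) -> dict:
    """Cast raw form values to the types the model expects, failing loudly."""
    coerced = {}
    for name, expected_type in FEATURE_SPEC.items():
        if name not in raw:
            raise ValueError(f"Missing required feature: {name}")
        try:
            coerced[name] = expected_type(raw[name])
        except (TypeError, ValueError) as exc:
            raise ValueError(f"Feature '{name}' should be {expected_type.__name__}") from exc
    return coerced

# Example: values from a web form usually arrive as strings.
payload = coerce_inputs({"tenure_months": "26", "monthly_charges": "74.50",
                         "contract_type": "month-to-month"})
print(payload)  # {'tenure_months': 26, 'monthly_charges': 74.5, 'contract_type': 'month-to-month'}
```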

Visualizations Look Incorrect

• Re-check Component Settings: Verify the correct metrics, features, or axes are selected in the component settings.

• Data Scaling/Normalization: Check if features displayed (e.g., importance) are on vastly different scales, potentially requiring normalization or logarithmic scale options if available.

• Data Integrity: Ensure the underlying model artifacts (e.g., saved metrics, coefficients) are correct. Re-evaluate the model if necessary.

• Component Type Choice: Confirm you're using the most appropriate component for the data (e.g., don't use a classification metric component for a regression model).

Component Missing for Model Type

• Check Model Category: Confirm the model's category (e.g., 'classification', 'regression', 'prophet') is correctly set, as component availability depends on this.

• Verify Component Implementation: Ensure the frontend component and backend data retrieval logic exist for that specific model type and visualization.

• Platform Version: Check if the component was introduced in a later version of ML Clever.

Further Exploration

Leverage your understanding of model integration by exploring these related areas:

Model Details & Evaluation

Dive deeper into individual model performance using the dedicated Model Details page, which often provides more extensive metrics and visualizations.

Explore Model Details

Model Deployment

Learn how to deploy your trained models as APIs for real-time prediction use cases beyond the dashboard calculator.

Learn about Deployment

Advanced Data Visualization

Combine model insights with general data visualizations (charts, tables, KPIs) on the same dashboard for a holistic view.

Master Data Visualization

Dashboard Layout & Design

Refine the presentation of your model analysis by mastering dashboard layout, pages, and design settings.

Optimize Dashboard Layout


Last updated: 5/6/2025
