Visualizing Isolation Forest Boundaries
Understanding how an outlier detection model separates normal data from anomalies is crucial for trusting and improving your results. Visualizing decision boundaries shows where the model places the edge between typical and unusual observations. With models like Isolation Forest, these boundaries reveal the regions where the algorithm expects data to cluster and the regions where it flags points as outliers. Such visualizations help you interpret why certain points are labeled as anomalies, diagnose overfitting or underfitting, and communicate model behavior to stakeholders.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import IsolationForest

# Generate synthetic 2D data: two tight clusters plus uniform noise
rng = np.random.RandomState(42)
X_inliers = 0.3 * rng.randn(100, 2)
X_inliers = np.r_[X_inliers + 2, X_inliers - 2]
X_outliers = rng.uniform(low=-4, high=4, size=(20, 2))
X = np.r_[X_inliers, X_outliers]

# Fit Isolation Forest
clf = IsolationForest(contamination=0.1, random_state=42)
clf.fit(X)
scores = clf.decision_function(X)
y_pred = clf.predict(X)

# Create a meshgrid and evaluate the decision function on it
xx, yy = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
Z = clf.decision_function(np.c_[xx.ravel(), yy.ravel()])
Z = Z.reshape(xx.shape)

plt.figure(figsize=(8, 6))

# Plot the decision function as filled contours
plt.contourf(xx, yy, Z, levels=np.linspace(Z.min(), Z.max(), 10),
             cmap=plt.cm.Blues_r, alpha=0.6)

# Draw the decision boundary at threshold 0
plt.contour(xx, yy, Z, levels=[0], linewidths=2, colors='red')

# Plot inliers and outliers
plt.scatter(X[y_pred == 1, 0], X[y_pred == 1, 1], c='white', s=40,
            edgecolor='k', label='Inliers')
plt.scatter(X[y_pred == -1, 0], X[y_pred == -1, 1], c='orange', s=40,
            edgecolor='k', label='Outliers')

plt.legend()
plt.title("Isolation Forest Decision Function and Outlier Detection")
plt.xlabel("Feature 1")
plt.ylabel("Feature 2")
plt.show()
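If you want to verify numerically where the zero threshold comes from, you can inspect the model's raw anomaly scores. The snippet below is a minimal sketch that reuses clf and X from the example above; it relies on scikit-learn's documented relationship that decision_function equals score_samples shifted by the fitted offset_, so negative values correspond to points outside the red contour.

# How the score APIs relate (current scikit-learn behavior):
# decision_function(X) = score_samples(X) - offset_, and predict() returns
# -1 wherever decision_function(X) is negative.
raw_scores = clf.score_samples(X)       # higher means more "normal"
shifted = raw_scores - clf.offset_
print(np.allclose(shifted, clf.decision_function(X)))            # expected: True
print(np.array_equal(np.where(clf.decision_function(X) < 0, -1, 1),
                     clf.predict(X)))                            # expected: True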
The decision boundaries drawn by Isolation Forest show which regions the model considers "normal" versus "anomalous." Points in regions where the decision function is negative, outside the red zero-level contour, are flagged as outliers. These boundaries are not always smooth or circular: they reflect the axis-aligned partitioning logic of the underlying isolation trees, which often produces piecewise, jagged shapes. Observing these boundaries can help you judge whether the model is too strict (flagging too many outliers) or too loose (missing anomalies), and guide you in tuning parameters or interpreting results.
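One practical way to check strictness is to refit the model with a few different contamination values and compare the resulting zero-level boundaries side by side. The sketch below is illustrative rather than prescriptive: it reuses X, xx, and yy from the example above, and the contamination values 0.05, 0.1, and 0.2 are arbitrary choices for demonstration.

# Compare how the contamination setting moves the decision boundary.
# The values below are arbitrary and chosen only for illustration.
fig, axes = plt.subplots(1, 3, figsize=(15, 4), sharex=True, sharey=True)
for ax, contamination in zip(axes, [0.05, 0.1, 0.2]):
    model = IsolationForest(contamination=contamination, random_state=42).fit(X)
    Z_c = model.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
    labels = model.predict(X)
    ax.contourf(xx, yy, Z_c, cmap=plt.cm.Blues_r, alpha=0.6)
    ax.contour(xx, yy, Z_c, levels=[0], linewidths=2, colors='red')
    ax.scatter(X[:, 0], X[:, 1],
               c=np.where(labels == 1, 'white', 'orange'),
               s=40, edgecolor='k')
    ax.set_title(f"contamination={contamination}")
plt.tight_layout()
plt.show()

A higher contamination value shrinks the region treated as normal, so more points are flagged as outliers; a lower value does the opposite.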