Peculiarity of Spectral Clustering
The result of the last chapter was great! Spectral clustering correctly figured out the structure of the clusters, unlike the K-Means and K-Medoids algorithms.
Thus, spectral clustering is very useful when clusters intersect or overlap, or when cluster means and centers are not meaningful.
For example, let's explore such a case. Consider the 2-D training set of points shown in the scatter plot below.
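The original dataset itself is not reproduced here, but a similar set of four concentric rings can be generated and plotted with the following minimal sketch. The ring radii, point counts, and noise level are assumptions for illustration only; the column names 'x' and 'y' match the ones used in the task later on.

```python
# Illustrative only: the course supplies its own 'data',
# but a comparable four-ring dataset can be built like this.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
rings = []
for radius in (1, 2, 3, 4):                       # four concentric rings
    angles = rng.uniform(0, 2 * np.pi, 200)       # 200 points per ring
    noise = rng.normal(scale=0.05, size=(200, 2)) # small radial jitter
    ring = np.column_stack((radius * np.cos(angles),
                            radius * np.sin(angles))) + noise
    rings.append(ring)

data = pd.DataFrame(np.vstack(rings), columns=['x', 'y'])

plt.scatter(data['x'], data['y'], s=10)
plt.show()
```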
It looks like 4 circles, and therefore 4 clusters, doesn't it? But here is what K-Means will show us.
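As a rough illustration (reusing the hypothetical data DataFrame from the sketch above), this is how K-Means could be applied to the same points. Because K-Means assigns each point to its nearest centroid, its clusters typically cut across the rings instead of recovering them.

```python
# Hypothetical illustration of K-Means on the ring-shaped data.
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

kmeans = KMeans(n_clusters=4, random_state=0, n_init=10)
data['kmeans_label'] = kmeans.fit_predict(data[['x', 'y']])

# One color per K-Means cluster
for label in sorted(data['kmeans_label'].unique()):
    subset = data[data['kmeans_label'] == label]
    plt.scatter(subset['x'], subset['y'], s=10)
plt.show()
```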
Not what we expected to see. Let's see how spectral clustering will deal with this data.
Please note that the spectral clustering algorithm may take a long time to run, since it builds an affinity matrix over the points and computes an eigendecomposition of the resulting graph Laplacian.
For the given set of 2-D points data, perform spectral clustering. Follow these steps:
- Import SpectralClustering from sklearn.cluster.
- Create a SpectralClustering model with 4 clusters.
- Fit the data and predict the labels. Save the predicted labels in the 'prediction' column of data.
- Build a scatter plot with the 'x' column on the x-axis and the 'y' column on the y-axis, using a separate color for each value of 'prediction'. Do not forget to display the plot.
Solution
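Below is one possible solution sketch, not necessarily the platform's official answer. It assumes data is a pandas DataFrame with 'x' and 'y' columns and that matplotlib is available.

```python
# One possible solution sketch for the task above.
import matplotlib.pyplot as plt
from sklearn.cluster import SpectralClustering

# Create a SpectralClustering model with 4 clusters,
# then fit the data and predict the labels in one step.
model = SpectralClustering(n_clusters=4)
data['prediction'] = model.fit_predict(data[['x', 'y']])

# Scatter plot: a separate color for each predicted cluster.
for label in sorted(data['prediction'].unique()):
    subset = data[data['prediction'] == label]
    plt.scatter(subset['x'], subset['y'], s=10, label=label)
plt.legend()
plt.show()
```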