Add wrappers for clustering to get uniform interface #982
It is not possible at present, as 1, 2, and 3 are all different: 2 does not generalize to new observations, and 3 outputs probabilistic labels. There are natural differences in the API for this reason. For a uniform interface, we would need a wrapper to convert non-generalizing clusterers into generalizing ones, and a wrapper to convert probabilistic-predicting clusterers into deterministic ones. These do not exist at present. They could be created using MLJ learning networks.
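For the probabilistic-to-deterministic direction, MLJ already exposes the underlying idiom: taking the mode of each predicted distribution. A minimal sketch, assuming `probabilistic_model` is any MLJ clusterer whose `predict` returns distributions (the name is a hypothetical placeholder, not a registered model):

```julia
using MLJ

# Hedged sketch: collapse probabilistic cluster assignments into
# deterministic labels by taking the mode of each predicted
# distribution. This is essentially what MLJ.predict_mode does.
function deterministic_labels(probabilistic_model, fitresult, Xnew)
    # For a probabilistic clusterer, predict returns a vector of
    # finite distributions over cluster labels; mode picks the
    # most likely label from each.
    return mode.(MLJ.predict(probabilistic_model, fitresult, Xnew))
end
```

A full wrapper would additionally need to re-declare the wrapped model as `Deterministic` so that trait-based tooling treats it accordingly; that is the part a learning-network-based wrapper would supply.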
No longer needed.
Can you please provide an example where we call the same MLJ interface functions to "train" the clustering models on a random data set

```julia
X1 = rand(3, 100)
```

and then "predict" the labels of new points

```julia
X2 = rand(3, 10)
```

Every time I try to accomplish this task, I get stuck on various issues (e.g. the outputs of the models follow different formats and conventions). I am looking for a simple example using only `MLJ.fit`, `MLJ.predict`, `MLJ.predict_mode`, and `MLJ.transform`. I don't want to wrap the models in a machine in this example.
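For a deterministic clusterer, the low-level model API (no machines) can be sketched as follows. This assumes MLJ and the Clustering.jl interface package are installed, and uses `KMeans` with `k=3` as an illustrative choice:

```julia
using MLJ

# Load the KMeans model type from Clustering.jl via the MLJ registry.
KMeans = @load KMeans pkg=Clustering verbosity=0
model = KMeans(k=3)

# MLJ expects observations as rows of a table; rand(3, 100) has
# observations as columns, so transpose before wrapping in a table.
X1 = MLJ.table(rand(3, 100)')
X2 = MLJ.table(rand(3, 10)')

# Low-level API: fit returns (fitresult, cache, report) — no machine.
fitresult, cache, report = MLJ.fit(model, 0, X1)

# predict assigns each new point to its nearest learned center
# (deterministic labels); transform returns the distances from each
# point to each of the k centers.
labels = MLJ.predict(model, fitresult, X2)
dists  = MLJ.transform(model, fitresult, X2)
```

For a probabilistic clusterer, `MLJ.predict` would instead return distributions over labels, and `MLJ.predict_mode(model, fitresult, X2)` gives the corresponding point predictions.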