VFDT probabilities do not sum up to one #915
vinodkraman started this conversation in General
-
Hi @vinodkraman, I did not observe the same problem. The majority-class leaf predictor is indeed expected to perform worse than the (adaptive) Naive Bayes leaf predictors. This is especially true if no splits are performed. The tree does not split because either:

- the grace period has not elapsed, i.e., the leaf has not yet accumulated enough instances since the last split attempt; or
- the difference in merit between the two best split candidates does not exceed the Hoeffding bound, so the tree cannot confidently pick a winner.

I suggest you take a look at this book chapter as a starting point on decision trees. You can also refer to the guide on Hoeffding Trees that is available on the River website.
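For intuition on the second point, here is a minimal sketch of the standard Hoeffding bound used for split decisions (the formula is the usual one; the choice of delta below merely assumes River's default split confidence is on this order):

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Epsilon such that, with probability 1 - delta, the observed mean
    of n samples is within epsilon of the true mean."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

# balance-scale has 3 classes, so information gain lies in [0, log2(3)].
R = math.log2(3)
delta = 1e-7  # assumed split confidence, on the order of River's default
for n in (100, 625, 10_000):
    eps = hoeffding_bound(R, delta, n)
    print(f"n={n:>6}: epsilon={eps:.3f}")
```

With the whole balance-scale dataset (625 instances) the bound is still roughly 0.18 bits, a sizeable fraction of the maximum possible gain, so two closely matched attributes will not be separated and the leaf keeps waiting for more data.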
-
I am trying to run VFDT on the balance-scale dataset. However, when I print the model at the end, the probabilities do not sum to 1. I am also noticing that with majority-class ('mc') leaf prediction the accuracy is significantly lower than with the other leaf predictors, and I'm not sure why that is the case. I've attached the code below. More specifically, it seems like the tree never splits...
code.zip
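On the probabilities: the values shown when printing the model appear to be raw per-class weights at the leaves rather than normalized probabilities. Normalizing them is a one-liner; the sketch below uses made-up counts for balance-scale's three classes and is not River's internal code:

```python
def normalize(class_weights: dict) -> dict:
    """Turn raw per-class weights/counts into probabilities summing to 1."""
    total = sum(class_weights.values())
    if total == 0:
        return {c: 0.0 for c in class_weights}
    return {c: w / total for c, w in class_weights.items()}

# Hypothetical leaf statistics for the classes L, B, R.
stats = {"L": 288.0, "B": 49.0, "R": 288.0}
probs = normalize(stats)
print(probs)
```

The per-example prediction methods normalize for you; only the printed leaf statistics are raw.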