
Building A Decision Tree

When building a decision tree, at each node we select the best feature, and then the best splitting position for that feature. However, what should happen when all values for the best feature are 0, so that any threshold would send every instance into the same branch?

Solution 1:

This is simply impossible. You are supposed to select the threshold that leads to the biggest increase in model certainty. A threshold that puts every single instance into the same branch gives a 0 increase in the model's certainty, so it is not the best split. This could happen if and only if the impurity/entropy is already 0 for this feature, but in that case it is a stopping criterion for creating a leaf in the decision tree.
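To make the argument concrete, here is a minimal sketch (the `entropy` and `information_gain` helpers are illustrative, not from the original post) showing that a degenerate split, one that sends every instance into the same branch, always has zero information gain, while a split that actually separates the classes has positive gain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, mask):
    """Gain of splitting `labels` by `mask` (True -> left branch)."""
    left = [y for y, m in zip(labels, mask) if m]
    right = [y for y, m in zip(labels, mask) if not m]
    n = len(labels)
    # Weighted child entropy; skip empty branches to avoid division by zero.
    weighted = sum(len(b) / n * entropy(b) for b in (left, right) if b)
    return entropy(labels) - weighted

labels = [0, 0, 1, 1]

# Degenerate split: every instance in one branch -> child entropy equals
# parent entropy, so the gain is exactly 0.
print(information_gain(labels, [True, True, True, True]))    # 0.0

# Informative split: each branch is pure -> gain is the full 1 bit.
print(information_gain(labels, [True, True, False, False]))  # 1.0
```

Since the degenerate split scores 0, any split with positive gain beats it; it can only be "best" when no split improves purity at all, which is exactly the leaf-creation case described above.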
