Provable guarantees for decision tree induction: the agnostic setting

06/01/2020
by Guy Blanc et al.

We give strengthened provable guarantees on the performance of widely employed and empirically successful top-down decision tree learning heuristics. While prior works have focused on the realizable setting, we consider the more realistic and challenging agnostic setting. We show that for all monotone functions f and parameters s ∈ ℕ, these heuristics construct a decision tree of size s^Õ((log s)/ε^2) that achieves error ≤ opt_s + ε, where opt_s denotes the error of the optimal size-s decision tree for f. Previously, such a guarantee was not known to be achievable by any algorithm, even one that is not based on top-down heuristics. We complement our algorithmic guarantee with a near-matching s^Ω̃(log s) lower bound.
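The top-down heuristics referred to here (in the spirit of ID3/CART) grow a tree greedily, at each node splitting on the coordinate that most reduces label impurity. A minimal sketch for boolean features and labels, using information gain as the (assumed, illustrative) splitting criterion:

```python
import math

def entropy(labels):
    # Empirical binary entropy of a list of 0/1 labels.
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def best_split(X, y, features):
    # Pick the feature whose split yields the largest information gain.
    base = entropy(y)
    best, best_gain = None, 0.0
    for j in features:
        left = [yi for xi, yi in zip(X, y) if xi[j] == 0]
        right = [yi for xi, yi in zip(X, y) if xi[j] == 1]
        n = len(y)
        gain = base - (len(left) / n) * entropy(left) \
                    - (len(right) / n) * entropy(right)
        if gain > best_gain:
            best, best_gain = j, gain
    return best

def grow(X, y, features, depth):
    # Return a leaf (majority label) when pure, out of depth,
    # or when no split improves impurity; otherwise recurse.
    maj = int(sum(y) * 2 >= len(y))
    if depth == 0 or len(set(y)) == 1:
        return maj
    j = best_split(X, y, features)
    if j is None:
        return maj
    rest = [f for f in features if f != j]
    L = [(xi, yi) for xi, yi in zip(X, y) if xi[j] == 0]
    R = [(xi, yi) for xi, yi in zip(X, y) if xi[j] == 1]
    return (j,
            grow([x for x, _ in L], [yl for _, yl in L], rest, depth - 1),
            grow([x for x, _ in R], [yr for _, yr in R], rest, depth - 1))

def predict(tree, x):
    # Internal nodes are (feature, left-subtree, right-subtree) tuples.
    while isinstance(tree, tuple):
        j, lo, hi = tree
        tree = hi if x[j] else lo
    return tree

# Example: learn f(x) = x0 AND x1 from its truth table.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
tree = grow(X, y, [0, 1], depth=2)
print([predict(tree, x) for x in X])  # → [0, 0, 0, 1]
```

The paper's guarantees concern this greedy template in general; the gain function and stopping rule above are one common instantiation, not the specific impurity measure analyzed in the paper.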
