
Every decision tree has an influential variable

Journal contribution authored by Ryan O'Donnell, Michael Saks, Oded Schramm, and Rocco A. Servedio.
We prove that for any decision tree calculating a boolean function f : {-1,1}^n \to {-1,1}, Var[f] \le \sum_{i=1}^{n} \delta_i Inf_i(f), where \delta_i is the probability that the ith input variable is read and Inf_i(f) is the influence of the ith variable on f. The variance, influence and probability are taken with respect to an arbitrary product measure on {-1,1}^n. It follows that the minimum depth of a decision tree calculating a given balanced function is at least the reciprocal of the largest influence of any input variable. Likewise, any balanced boolean function with a decision tree of depth d has a variable with influence at least \frac{1}{d}. The only previous nontrivial lower bound known was \Omega(d 3^{-d}). Our inequality has many generalizations, allowing us to prove influence lower bounds for randomized decision trees, decision trees on arbitrary product probability spaces, and decision trees with non-boolean outputs. As an application of our results we give a very easy proof that the randomized query complexity of nontrivial monotone graph properties is at least \Omega(v^{4/3}/p^{1/3}), where v is the number of vertices and p \le \frac{1}{2} is the critical threshold probability. This supersedes the milestone \Omega(v^{4/3}) bound of Hajnal [13] and is sometimes superior to the best known lower bounds of Chakrabarti-Khot [9] and Friedgut-Kahn-Wigderson.
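
As an informal illustration of the main inequality (not part of the original abstract), the short Python sketch below numerically checks Var[f] \le \sum_i \delta_i Inf_i(f) on one small example: the majority of three ±1 bits, computed by the natural decision tree that reads x1 and x2 and consults x3 only when they disagree, under the uniform measure. The tree, the function, and all variable names are illustrative assumptions, not taken from the paper.

from itertools import product

def maj3_tree(x):
    """Evaluate MAJ_3 with a simple decision tree.

    Returns (output, indices read). The tree queries x[0] and x[1]; if they
    agree, their common value is the majority, otherwise x[2] decides.
    """
    read = [0, 1]
    if x[0] == x[1]:
        return x[0], read
    read.append(2)
    return x[2], read

n = 3
inputs = list(product([-1, 1], repeat=n))  # uniform product measure on {-1,1}^3
N = len(inputs)
outputs = {x: maj3_tree(x)[0] for x in inputs}

# Var[f] with respect to the uniform measure (f takes values in {-1,1}).
mean = sum(outputs[x] for x in inputs) / N
var = sum((outputs[x] - mean) ** 2 for x in inputs) / N

# delta_i: probability that the tree reads variable i.
delta = [sum(1 for x in inputs if i in maj3_tree(x)[1]) / N for i in range(n)]

# Inf_i(f): probability that flipping bit i changes the output.
def flip(x, i):
    y = list(x)
    y[i] = -y[i]
    return tuple(y)

inf = [sum(1 for x in inputs if outputs[x] != outputs[flip(x, i)]) / N
       for i in range(n)]

bound = sum(d * I for d, I in zip(delta, inf))
print(f"Var[f] = {var:.3f}, sum_i delta_i * Inf_i(f) = {bound:.3f}")
assert var <= bound + 1e-12  # the inequality holds for this tree

For this example the check gives Var[f] = 1 and \sum_i \delta_i Inf_i(f) = 1 * 1/2 + 1 * 1/2 + 1/2 * 1/2 = 1.25, so the inequality holds with room to spare.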


Publisher Statement

All Rights Reserved
