Abstract
Given a function \(f : {\{0,1\}}^n\rightarrow \mathbb {R}\), its Fourier Entropy is defined to be \(-\sum _S {\widehat{f}}^2(S) \log {\widehat{f}}^2(S)\), where \(\hat{f}\) denotes the Fourier transform of f. This quantity arises in a number of applications, especially in the study of Boolean functions. An outstanding open question is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy-Influence (FEI) Conjecture, which asserts that the Fourier Entropy of any Boolean function f is bounded above, up to a constant factor, by the total influence (= average sensitivity) of f.
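For small n, both quantities in the conjecture can be checked by brute-force enumeration. The following sketch (function names are illustrative, logarithms taken base 2, and f is assumed to be ±1-valued so that its squared Fourier coefficients sum to 1) computes the Fourier Entropy and the total influence \(\sum _S |S|\,{\widehat{f}}^2(S)\) of AND on 3 bits.

```python
import math
from itertools import product

def fourier_entropy_and_influence(f, n):
    """Fourier entropy and total influence of a +/-1-valued Boolean
    function f on {0,1}^n, by brute-force enumeration of all 2^n
    Fourier coefficients."""
    N = 2 ** n
    points = list(product([0, 1], repeat=n))
    entropy = 0.0
    influence = 0.0
    # Subsets S are represented by 0/1 indicator vectors.
    for S in product([0, 1], repeat=n):
        # f_hat(S) = E_x[ f(x) * (-1)^{sum_{i in S} x_i} ]
        coeff = sum(f(x) * (-1) ** sum(xi for xi, si in zip(x, S) if si)
                    for x in points) / N
        w = coeff ** 2  # squared Fourier weight on S
        if w > 0:
            entropy -= w * math.log2(w)
        influence += sum(S) * w  # sum(S) = |S|
    return entropy, influence

# Example: AND on 3 bits, with outputs encoded as +1/-1.
f_and = lambda x: -1 if all(x) else 1
H, I = fourier_entropy_and_influence(f_and, 3)
```

For AND on 3 bits the total influence is 3/4 (each variable flips the output only when the other two are 1), and the conjecture asks whether the entropy exceeds it by at most a constant factor.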
In this paper we give several upper bounds on the Fourier Entropy of Boolean as well as real-valued functions. We first bound the Fourier Entropy of Boolean functions in terms of several complexity measures that are known to be larger than the total influence; these measures include, among others, the logarithm of the number of leaves and the average depth of a parity decision tree. We then show that for the class of Linear Threshold Functions (LTFs), the Fourier Entropy is at most \(O(\sqrt{n})\), matching, up to a constant factor, the known \(\Theta (\sqrt{n})\) bound on the average sensitivity of LTFs. We also establish a bound of \(O_d(n^{1-\frac{1}{4d+6}})\) for general degree-d polynomial threshold functions; the proof is based on a new upper bound on the derivative of noise sensitivity. Next we show that the FEI Conjecture holds for read-once formulas that use \({{\mathrm{\mathsf {AND}}}}\), \({{\mathrm{\mathsf {OR}}}}\), \({{\mathrm{\mathsf {XOR}}}}\), and \(\mathsf {NOT}\) gates. This result is independent of a recent result due to O’Donnell and Tan [14] for read-once formulas with arbitrary gates of bounded fan-in, but our proof is completely elementary and very different from theirs. Finally, we give a general bound involving the first and second moments of the sensitivities of a function (the average sensitivity being the first moment), which holds for real-valued functions as well.
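As a small sanity check of the \(\Theta (\sqrt{n})\) average-sensitivity bound for LTFs mentioned above, one can compute the average sensitivity of the majority function (the canonical LTF) directly from its definition. This is an illustrative sketch, not part of the paper's proofs; the function names are ours.

```python
from itertools import product

def average_sensitivity(f, n):
    """Average sensitivity (= total influence): the expected number of
    coordinates whose flip changes f(x), over a uniform random x."""
    total = 0
    for x in product([0, 1], repeat=n):
        for i in range(n):
            y = list(x)
            y[i] ^= 1  # flip coordinate i
            if f(x) != f(tuple(y)):
                total += 1
    return total / 2 ** n

# Majority on an odd number of bits, a canonical LTF.
maj = lambda x: 1 if 2 * sum(x) > len(x) else -1
vals = {n: average_sensitivity(maj, n) for n in (3, 5, 7, 9)}
```

The computed values grow like \(\sqrt{n}\) (exactly \(n\binom{n-1}{(n-1)/2}/2^{n-1}\) for odd n), consistent with the \(\Theta (\sqrt{n})\) bound.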
References
Bourgain, J., Kalai, G.: Influences of variables and threshold intervals under group symmetries. Geometric and Functional Analysis GAFA 7(3), 438–461 (1997)
Chakraborty, S., Kulkarni, R., Lokam, S.V., Saurabh, N.: Upper bounds on Fourier entropy. Tech. rep., Electronic Colloquium on Computational Complexity (ECCC), TR13-052 (2013). http://eccc.hpi-web.de/report/2013/052
Das, B., Pal, M., Visavaliya, V.: The entropy influence conjecture revisited. Tech. rep. (2011). arXiv:1110.4301
Diakonikolas, I., Raghavendra, P., Servedio, R., Tan, L.: Average sensitivity and noise sensitivity of polynomial threshold functions. SIAM Journal on Computing 43(1), 231–253 (2014)
Friedgut, E., Kalai, G.: Every monotone graph property has a sharp threshold. Proceedings of the American Mathematical Society 124(10), 2993–3002 (1996)
Gopalan, P., Kalai, A.T., Klivans, A.R.: Agnostically learning decision trees. In: Proceedings of the 40th Annual ACM Symposium on Theory of Computing, STOC 2008, pp. 527–536 (2008)
Harsha, P., Klivans, A., Meka, R.: Bounding the sensitivity of polynomial threshold functions. Theory of Computing 10(1), 1–26 (2014)
Kahn, J., Kalai, G., Linial, N.: The influence of variables on Boolean functions. In: Proceedings of the 29th Annual IEEE Symposium on Foundations of Computer Science, pp. 68–80 (1988)
Keller, N., Mossel, E., Schlank, T.: A note on the entropy/influence conjecture. Tech. rep. (2011). arXiv:1105.2651
Klivans, A., Lee, H., Wan, A.: Mansour’s conjecture is true for random DNF formulas. In: Proceedings of the 23rd Conference on Learning Theory, pp. 368–380 (2010)
Mansour, Y.: An O(n\(^{\log \log n}\)) learning algorithm for DNF under the uniform distribution. Journal of Computer and System Sciences 50(3), 543–550 (1995)
O’Donnell, R.: Computational applications of noise sensitivity. Ph.D. thesis. MIT (2003)
O’Donnell, R.: Analysis of Boolean Functions. Cambridge University Press (2014)
O’Donnell, R., Tan, L.Y.: A composition theorem for the Fourier entropy-influence conjecture. In: Proceedings of Automata, Languages and Programming - 40th International Colloquium (2013)
O’Donnell, R., Wright, J., Zhou, Y.: The Fourier entropy-influence conjecture for certain classes of Boolean functions. In: Proceedings of Automata, Languages and Programming - 38th International Colloquium, pp. 330–341 (2011)
Wan, A., Wright, J., Wu, C.: Decision trees, protocols and the entropy-influence conjecture. In: Innovations in Theoretical Computer Science, ITCS 2014, pp. 67–80 (2014)
de Wolf, R.: A brief introduction to Fourier analysis on the Boolean cube. Theory of Computing, Graduate Surveys 1, 1–20 (2008)
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Chakraborty, S., Kulkarni, R., Lokam, S.V., Saurabh, N. (2015). Upper Bounds on Fourier Entropy. In: Xu, D., Du, D., Du, D. (eds) Computing and Combinatorics. COCOON 2015. Lecture Notes in Computer Science(), vol 9198. Springer, Cham. https://doi.org/10.1007/978-3-319-21398-9_60
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-21397-2
Online ISBN: 978-3-319-21398-9