Bounding the average sensitivity and noise sensitivity of polynomial threshold functions

Published: 05 June 2010 Publication History

Abstract

We give the first non-trivial upper bounds on the average sensitivity and noise sensitivity of degree-d polynomial threshold functions (PTFs). These bounds hold both for PTFs over the Boolean hypercube {-1,1}^n and for PTFs over R^n under the standard n-dimensional Gaussian distribution N(0,I_n). Our bound on the Boolean average sensitivity of PTFs represents progress towards the resolution of a conjecture of Gotsman and Linial [17], which states that the symmetric function slicing the middle d layers of the Boolean hypercube has the highest average sensitivity of all degree-d PTFs. Via the L_1 polynomial regression algorithm of Kalai et al. [22], our bounds on Gaussian and Boolean noise sensitivity yield polynomial-time agnostic learning algorithms for the broad class of constant-degree PTFs under these input distributions.
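As a concrete reference point for the first of these quantities: the average sensitivity of a Boolean function f on {-1,1}^n is AS(f) = Σ_i Pr_x[f(x) ≠ f(x^{⊕i})], the expected number of coordinates whose flip changes the value of f at a uniformly random point. A brute-force sketch for small n (standard definition; the function names are illustrative, not from the paper):

```python
import itertools

def avg_sensitivity(f, n):
    """Expected number of sensitive coordinates of f at a uniform x in {-1,1}^n."""
    total = 0
    for x in itertools.product([-1, 1], repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] = -y[i]          # flip coordinate i
            if f(tuple(y)) != fx:
                total += 1
    return total / 2 ** n

# A degree-1 PTF (halfspace): majority of 3 bits.
maj3 = lambda x: 1 if sum(x) > 0 else -1
print(avg_sensitivity(maj3, 3))  # 1.5
```

For MAJ_3 each coordinate is sensitive exactly when the other two coordinates disagree (probability 1/2), so AS(MAJ_3) = 3 · 1/2 = 3/2, matching the brute-force value.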
The main ingredients used to obtain our bounds on both average and noise sensitivity of PTFs in the Gaussian setting are tail bounds and anti-concentration bounds on low-degree polynomials in Gaussian random variables [20, 7]. To obtain our bound on the Boolean average sensitivity of PTFs, we generalize the "critical-index" machinery of [37] (which in that work applies to halfspaces, i.e. degree-1 PTFs) to general PTFs. Together with the "invariance principle" of [30], this lets us extend our techniques from the Gaussian setting to the Boolean setting. Our bound on Boolean noise sensitivity is achieved via a simple reduction from upper bounds on average sensitivity of Boolean PTFs to corresponding bounds on noise sensitivity.
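The noise sensitivity targeted by the reduction above is NS_ε(f) = Pr[f(x) ≠ f(y)], where x is uniform on {-1,1}^n and y is obtained from x by flipping each coordinate independently with probability ε. It can be computed exactly for small n; a minimal sketch under the standard definition (names are illustrative, not from the paper):

```python
import itertools

def noise_sensitivity(f, n, eps):
    """Pr[f(x) != f(y)] for uniform x in {-1,1}^n and y obtained from x by
    flipping each coordinate independently with probability eps (exact, small n)."""
    disagree = 0.0
    for x in itertools.product([-1, 1], repeat=n):
        fx = f(x)
        for flips in itertools.product([0, 1], repeat=n):
            p = 1.0
            y = []
            for xi, fl in zip(x, flips):
                p *= eps if fl else 1.0 - eps
                y.append(-xi if fl else xi)
            if f(tuple(y)) != fx:
                disagree += p
    return disagree / 2 ** n

# A dictator function f(x) = x_1 has NS_eps(f) = eps exactly:
# f(x) != f(y) iff the first coordinate was flipped.
print(noise_sensitivity(lambda x: x[0], 2, 0.1))
```

A dictator flips its value precisely when its single relevant coordinate is flipped, so the computed value is ε up to floating-point rounding.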

References

[1]
P. Austrin and J. Håstad. Randomly supported independence and resistance. In Proc. 41st Annual ACM Symposium on Theory of Computing (STOC), pages 483--492. ACM, 2009.
[2]
I. Benjamini, G. Kalai, and O. Schramm. Noise sensitivity of Boolean functions and applications to percolation. Inst. Hautes Études Sci. Publ. Math., 90:5--43, 1999.
[3]
E. Blais, R. O'Donnell, and K. Wimmer. Polynomial regression under arbitrary product distributions. In Proc. 21st Annual Conference on Learning Theory (COLT), pages 193--204, 2008.
[4]
V. Bogachev. Gaussian measures. Mathematical surveys and monographs, vol. 62, 1998.
[5]
J. Bourgain and G. Kalai. Influences of variables and threshold intervals under group symmetries. GAFA, 7:438--461, 1997.
[6]
N. Bshouty and C. Tamon. On the Fourier spectrum of monotone functions. Journal of the ACM, 43(4):747--770, 1996.
[7]
A. Carbery and J. Wright. Distributional and L^q norm inequalities for polynomials over convex bodies in R^n. Mathematical Research Letters, 8(3):233--248, 2001.
[8]
I. Diakonikolas, P. Raghavendra, R. Servedio, and L.-Y. Tan. Average sensitivity and noise sensitivity of polynomial threshold functions, 2009. Available at http://arxiv.org/abs/0909.5011.
[9]
I. Diakonikolas and R. Servedio. Improved approximation of linear threshold functions. In Proc. 24th Annual IEEE Conference on Computational Complexity (CCC), pages 161--172, 2009.
[10]
I. Diakonikolas, R. Servedio, L.-Y. Tan, and A. Wan. A regularity lemma, and low-weight approximators, for low-degree polynomial threshold functions. Manuscript, 2009.
[11]
I. Diakonikolas, P. Gopalan, R. Jaiswal, R. Servedio, and E. Viola. Bounded independence fools halfspaces. In Proc. 50th IEEE Symposium on Foundations of Computer Science (FOCS), pages 171--180, 2009.
[12]
I. Dinur, E. Friedgut, G. Kindler, and R. O'Donnell. On the Fourier tails of bounded functions over the discrete cube. In Proc. 38th ACM Symp. on Theory of Computing, pages 437--446, 2006.
[13]
W. Feller. An introduction to probability theory and its applications. John Wiley & Sons, 1968.
[14]
E. Friedgut. Boolean functions with low average sensitivity depend on few coordinates. Combinatorica, 18(1):474--483, 1998.
[15]
P. Gopalan, A. Kalai, and A. Klivans. Agnostically learning decision trees. In Proc. 40th Annual ACM Symposium on Theory of Computing (STOC), pages 527--536, 2008.
[16]
P. Gopalan and R. Servedio. Learning threshold-of-AC^0 circuits. Manuscript, 2009.
[17]
C. Gotsman and N. Linial. Spectral properties of threshold functions. Combinatorica, 14(1):35--50, 1994.
[18]
P. Harsha, A. Klivans, and R. Meka. Bounding the sensitivity of polynomial threshold functions. Available at http://arxiv.org/abs/0909.5175, 2009.
[19]
J. Jackson, A. Klivans, and R. Servedio. Learnability beyond AC^0. In Proc. 34th Annual ACM Symposium on Theory of Computing (STOC), pages 776--784, 2002.
[20]
S. Janson. Gaussian Hilbert Spaces. Cambridge University Press, Cambridge, UK, 1997.
[21]
J. Kahn, G. Kalai, and N. Linial. The influence of variables on Boolean functions. In Proc. 29th Annual Symposium on Foundations of Computer Science (FOCS), pages 68--80, 1988.
[22]
A. Kalai, A. Klivans, Y. Mansour, and R. Servedio. Agnostically learning halfspaces. SIAM Journal on Computing, 37(6):1777--1805, 2008.
[23]
A. Kalai, Y. Mansour, and E. Verbin. On agnostic boosting and parity learning. In Proc. 40th Annual ACM Symposium on Theory of Computing (STOC), pages 629--638, 2008.
[24]
D. Kane. The Gaussian surface area and noise sensitivity of degree-d polynomial threshold functions. In Proc. 25th Annual IEEE Conference on Computational Complexity (CCC), to appear, 2010.
[25]
A. Klivans, R. O'Donnell, and R. Servedio. Learning intersections and thresholds of halfspaces. Journal of Computer & System Sciences, 68(4):808--840, 2004.
[26]
A. Klivans, R. O'Donnell, and R. Servedio. Learning geometric concepts via Gaussian surface area. In Proc. 49th IEEE Symposium on Foundations of Computer Science (FOCS), pages 541--550, 2008.
[27]
N. Linial, Y. Mansour, and N. Nisan. Constant depth circuits, Fourier transform and learnability. Journal of the ACM, 40(3):607--620, 1993.
[28]
E. Mossel. Lecture 4. Available at http://www.stat.berkeley.edu/mossel/teach/206af05/scribes/sep8.pdf, 2005.
[29]
E. Mossel and R. O'Donnell. On the noise sensitivity of monotone functions. Random Structures and Algorithms, 23(3):333--350, 2003.
[30]
E. Mossel, R. O'Donnell, and K. Oleszkiewicz. Noise stability of functions with low influences: invariance and optimality. In Proc. 46th Symposium on Foundations of Computer Science (FOCS), pages 21--30, 2005.
[31]
R. O'Donnell. Lecture 16: The hypercontractivity theorem. Available at http://www.cs.cmu.edu/odonnell/boolean-analysis/lecture16.pdf, 2007.
[32]
R. O'Donnell, M. Saks, O. Schramm, and R. Servedio. Every decision tree has an influential variable. In Proc. 46th Symposium on Foundations of Computer Science (FOCS), pages 31--39, 2005.
[33]
R. O'Donnell and R. Servedio. Learning monotone decision trees in polynomial time. SIAM J. Comput., 37(3):827--844, 2007.
[34]
R. O'Donnell and R. Servedio. The Chow Parameters Problem. In Proc. 40th Annual ACM Symposium on Theory of Computing (STOC), pages 517--526, 2008.
[35]
Y. Peres. Noise stability of weighted majority, 2004.
[36]
O. Schramm and J. Steif. Quantitative noise sensitivity and exceptional times for percolation. Ann. Math., to appear.
[37]
R. Servedio. Every linear threshold function has a low-weight approximator. Computational Complexity, 16(2):180--209, 2007.
[38]
Y. Shi. Lower bounds of quantum black-box complexity and degree of approximating polynomials by influence of Boolean variables. Inform. Process. Lett., 75(1-2):79--83, 2000.
[39]
S. Shalev-Shwartz, O. Shamir, and K. Sridharan. Agnostically learning halfspaces with margin errors. TTI Technical Report, 2009.


      Published In

      STOC '10: Proceedings of the forty-second ACM symposium on Theory of computing
      June 2010
      812 pages
      ISBN:9781450300506
      DOI:10.1145/1806689

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. average sensitivity
      2. boolean function
      3. fourier analysis
      4. noise sensitivity
      5. polynomial threshold function

      Qualifiers

      • Research-article

      Conference

      STOC'10: Symposium on Theory of Computing
      June 5 - 8, 2010
      Cambridge, Massachusetts, USA

      Acceptance Rates

      Overall Acceptance Rate 1,469 of 4,586 submissions, 32%


      Article Metrics

      • Downloads (last 12 months): 13
      • Downloads (last 6 weeks): 1
      Reflects downloads up to 09 Nov 2024


      Cited By

      • (2023) Testing Distributional Assumptions of Learning Algorithms. In Proc. 55th Annual ACM Symposium on Theory of Computing, pages 1643--1656. DOI: 10.1145/3564246.3585117
      • (2022) Positive spectrahedra: invariance principles and pseudorandom generators. In Proc. 54th Annual ACM SIGACT Symposium on Theory of Computing, pages 208--221. DOI: 10.1145/3519935.3519965
      • (2021) Provably efficient, succinct, and precise explanations. In Proc. 35th International Conference on Neural Information Processing Systems, pages 6129--6141. DOI: 10.5555/3540261.3540730
      • (2021) VC dimension and distribution-free sample-based testing. In Proc. 53rd Annual ACM SIGACT Symposium on Theory of Computing, pages 504--517. DOI: 10.1145/3406325.3451104
      • (2021) Hardness of learning DNFs using halfspaces. In Proc. 53rd Annual ACM SIGACT Symposium on Theory of Computing, pages 467--480. DOI: 10.1145/3406325.3451067
      • (2019) Degree-d Chow parameters robustly determine degree-d PTFs (and algorithmic applications). In Proc. 51st Annual ACM SIGACT Symposium on Theory of Computing, pages 804--815. DOI: 10.1145/3313276.3316301
      • (2018) The Gotsman-Linial conjecture is false. In Proc. 29th Annual ACM-SIAM Symposium on Discrete Algorithms, pages 692--699. DOI: 10.5555/3174304.3175315
      • (2018) Learning geometric concepts with nasty noise. In Proc. 50th Annual ACM SIGACT Symposium on Theory of Computing, pages 1061--1073. DOI: 10.1145/3188745.3188754
      • (2017) A structure theorem for poorly anticoncentrated polynomials of Gaussians and applications to the study of polynomial threshold functions. The Annals of Probability, 45(3). DOI: 10.1214/16-AOP1097
      • (2017) A new central limit theorem and decomposition for Gaussian polynomials, with an application to deterministic approximate counting. Probability Theory and Related Fields, 171(3-4):981--1044. DOI: 10.1007/s00440-017-0804-y
