DOI: 10.1145/237814.237849
Article
Free access

Noise-tolerant learning near the information-theoretic bound

Published: 01 July 1996


Cited By

  • (2005)Randomized hypotheses and minimum disagreement hypotheses for learning with noiseComputational Learning Theory10.1007/3-540-62685-9_11(119-133)Online publication date: 3-Jun-2005
  • (2000)PAC Learning with Nasty NoiseAlgorithmic Learning Theory10.1007/3-540-46769-6_17(206-218)Online publication date: 19-May-2000


Published In

STOC '96: Proceedings of the twenty-eighth annual ACM symposium on Theory of Computing
July 1996
661 pages
ISBN: 0897917855
DOI: 10.1145/237814
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Article

Conference

STOC '96: ACM Symposium on Theory of Computing
May 22-24, 1996
Philadelphia, Pennsylvania, USA

Acceptance Rates

STOC '96 Paper Acceptance Rate: 74 of 201 submissions, 37%
Overall Acceptance Rate: 1,469 of 4,586 submissions, 32%

