Abstract
This paper investigates some delicate tradeoffs between the generality of an algorithmic learning device and the quality of the programs it learns successfully. There are results to the effect that, thanks to small increases in generality of a learning device, the computational complexity of some successfully learned programs is provably unalterably suboptimal. There are also results in which the complexity of successfully learned programs is asymptotically optimal and the learning device is general, but, still thanks to the generality, some of those optimal, learned programs are provably unalterably information deficient — in some cases, deficient as to safe, algorithmic extractability/provability of the fact that they are even approximately optimal. The paper is on the borderline between learning theory, complexity theory and logic.
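The setting behind these results is Gold-style inductive inference: a learning device is fed ever-longer finite portions of the graph of a target function and, after each portion, conjectures a program; success means the conjectures converge to a correct program. The following minimal Python sketch, which is not from the paper and uses a toy hypothesis class chosen purely for illustration, shows learning by enumeration, the simplest such device: conjecture the least hypothesis consistent with the data seen so far.

```python
# Illustrative sketch (assumption, not the paper's construction) of
# Gold-style "identification in the limit" via learning by enumeration.
from typing import Callable, List, Tuple

# Toy hypothesis enumeration: programs computing f(x) = a*x + b.
HYPOTHESES: List[Callable[[int], int]] = [
    (lambda x, a=a, b=b: a * x + b)
    for a in range(5) for b in range(5)
]

def learner(data: List[Tuple[int, int]]) -> int:
    """Conjecture the least index of a hypothesis consistent with all
    data points seen so far (learning by enumeration)."""
    for i, h in enumerate(HYPOTHESES):
        if all(h(x) == y for x, y in data):
            return i
    return -1  # no hypothesis in the class fits the data

def target(x: int) -> int:
    # The function to be learned; it occurs in HYPOTHESES (a=3, b=2).
    return 3 * x + 2

# Feed the learner growing prefixes of the target's graph; its
# conjectures stabilize, in the limit, on a fixed correct index.
data: List[Tuple[int, int]] = []
for x in range(6):
    data.append((x, target(x)))
    print(f"after {x + 1} examples, conjecture index {learner(data)}")
```

The paper's tradeoffs concern what such a device gives up as its hypothesis space is made more general: the programs it converges to may be forced to be computationally suboptimal, or optimal but provably missing extractable information about their own optimality.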
Cite this paper
Case, J., Chen, K.-J., Jain, S., Merkle, W., Royer, J.S.: Generality’s Price. In: Schölkopf, B., Warmuth, M.K. (eds.) Learning Theory and Kernel Machines. Lecture Notes in Computer Science, vol. 2777. Springer, Berlin, Heidelberg (2003). https://doi.org/10.1007/978-3-540-45167-9_50
Print ISBN: 978-3-540-40720-1
Online ISBN: 978-3-540-45167-9