Abstract
In learning functions in the limit, an algorithmic learner obtains successively more data about a function and conducts a sequence of trials, each resulting in the output of a corresponding program; hopefully, these programs eventually converge to a correct program for the function. We sought a feasible version of this learning in the limit: one in which each trial is conducted feasibly and there is some feasible limit on the number of trials allowed. We employed basic feasible functionals, which query an input function as to its values, to provide each trial. An additional tally argument 0^i is supplied to the functionals for their execution of the i-th trial; in this way, more time resource is available for each successive trial. The mechanism employed to feasibly limit the number of trials is to feasibly count them down from some feasible notation for a constructive ordinal. Since all processes are feasible, their termination is feasibly detectable, and so it is possible to wait for the trials to terminate and suppress all the output programs but the last. Hence, although there is still an iteration of trials, the learning is a special case of what has long been known as total Fin-learning, i.e., learning in the limit where, on each function, the learner always outputs exactly one conjectured program. Our general main results provide strict learning hierarchies in which the trial count-down involves all and only notations for infinite limit ordinals. For our hierarchies featuring finitely many limit-ordinal jumps, we have upper and lower bounds on the total run time of our feasible Fin-learners in terms of finite stacks of exponentials. We provide, though, an example of how to regain feasibility by a suitable parameterized complexity analysis.
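The trial structure described above can be illustrated with a toy sketch (not the authors' construction, and with a plain integer counter standing in for an ordinal notation): a learner sees successively more data points of a target function, each trial produces a conjectured program, the trials are counted down, and only the last conjecture is output, mirroring total Fin-learning. The target class here, linear functions over the integers, is a hypothetical choice for illustration only.

```python
def fin_learn(f, num_trials=5):
    """Toy 'Fin-learner' for linear functions f(x) = a*x + b.

    Trial i sees the data points f(0), ..., f(i) and conjectures a
    program (here, a Python closure).  The counter num_trials plays
    the role of the count-down from an ordinal notation; since every
    trial terminates, we can wait them all out and suppress every
    output program but the last -- the single conjecture of a total
    Fin-learner.
    """
    conjecture = None
    countdown = num_trials                  # stand-in for an ordinal notation
    i = 0
    while countdown > 0:
        data = [(x, f(x)) for x in range(i + 1)]   # successively more data
        if len(data) >= 2:                          # enough points to fit a line
            (x0, y0), (x1, y1) = data[0], data[1]
            a = (y1 - y0) / (x1 - x0)
            b = y0 - a * x0
            conjecture = lambda x, a=a, b=b: a * x + b
        else:                                       # one point: constant guess
            conjecture = lambda x, y=data[0][1]: y
        countdown -= 1                              # count the trial down
        i += 1
    return conjecture                               # output only the last program

# Example: learn the function f(x) = 3x + 1.
g = fin_learn(lambda x: 3 * x + 1)
```

In the paper's setting each trial is a basic feasible functional given the extra tally argument 0^i, so later trials get more time; the sketch abstracts all of that away and keeps only the trial/count-down/suppress shape.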
Case and Paddock were supported in part by NSF grant number NSF CCR-0208616. We are also grateful to anonymous referees for many helpful suggestions. One such referee provided hints about the truth of, and the truth and proof of, respectively, what became Lemmas 6 and 7; hence, these results are joint work with that referee. This same referee suggested, for the future, team learning as an approach to studying some probabilistic variants of our learning criteria.
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Case, J., Kötzing, T., Paddock, T. (2007). Feasible Iteration of Feasible Learning Functionals. In: Hutter, M., Servedio, R.A., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2007. Lecture Notes in Computer Science(), vol 4754. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75225-7_7
Print ISBN: 978-3-540-75224-0
Online ISBN: 978-3-540-75225-7