Carnegie Mellon University
eplatani_MachineLearning_2020.pdf (10.53 MB)

Learning Collections of Functions

posted on 2021-01-25, 22:30, authored by Emmanouil Platanios
Human intelligence is magnificent. One of its most impressive aspects is how humans always seem able to learn new skills quickly and without much supervision by utilizing previously learned skills and forming connections between them. More specifically, human learning is often not about learning a single skill in isolation, but rather about learning collections of skills and utilizing relationships between them to learn more efficiently. Furthermore, these relationships may either be explicitly provided or implicitly learned, indicating high levels of abstraction in the learned abilities. On the other hand, even though machine learning has witnessed growing success across a multitude of applications over the past years, current systems are each highly specialized to solve one or just a handful of problems.
In this thesis, we argue that a computer system that learns to perform multiple tasks jointly, and that is aware of the relationships between these tasks, will learn more efficiently and effectively than a system that learns each task in isolation. These relationships may either be explicitly provided through supervision or implicitly learned by the system itself, and they allow the system to self-reflect and evaluate itself without any task-specific supervision. This includes learning relationships in the form of higher-order functions (namely, functions that compose, transform, or otherwise manipulate other functions) that can enable truly multi-task and zero-shot learning.

In the first part, we present a method that allows learning systems to evaluate themselves in an unsupervised manner by leveraging explicitly provided relationships between multiple learned functions. We refer to this ability as self-reflection and show how it addresses an important limitation of existing never-ending learning systems, such as the never-ending language learner (Mitchell et al., 2018). We then propose multiple extensions that improve upon this method, resulting in several robust algorithms for estimating the accuracy of classifiers from unlabeled data.

In the second part, we consider more general multi-task learning settings and propose an abstract framework called contextual parameter generation (CPG), which allows systems to generate functions for solving different kinds of tasks without necessarily having been shown any training data for those tasks. This framework generalizes existing approaches in multi-task learning, transfer learning, and meta-learning, and it further allows for learning arbitrary higher-order functions. It does so by formalizing the notion of a function representation and what it means for functions to operate on other functions, or even on themselves. This new type of learning, which we refer to as higher-order learning, enables learning relationships between multiple functions in the form of higher-order functions, and is inspired by functional programming and category theory. Finally, we propose the jelly bean world (JBW), a novel evaluation framework for never-ending learning systems.
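The idea of estimating classifier accuracy from unlabeled data can be illustrated with the classical agreement-rate argument: if several binary classifiers make independent errors, their observable pairwise agreement rates pin down their individual error rates. The sketch below is a minimal, hypothetical illustration of that idea for exactly three classifiers under an independent-errors assumption, not the thesis's own (more general and more robust) estimators; the error rates and sample sizes are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three classifiers with unknown error rates that make
# independent errors on the same pool of unlabeled inputs.
true_errors = np.array([0.10, 0.20, 0.30])
n = 200_000
y = rng.integers(0, 2, size=n)                # latent ground truth, never observed below
flips = rng.random((3, n)) < true_errors[:, None]
preds = np.where(flips, 1 - y, y)             # each classifier's predictions, shape (3, n)

# The only observable quantity: pairwise agreement rates on unlabeled data.
a = np.array([[np.mean(preds[i] == preds[j]) for j in range(3)] for i in range(3)])

def estimate_errors(a):
    # Under independent errors, agreement(i, j) = e_i e_j + (1 - e_i)(1 - e_j),
    # equivalently (1 - 2 e_i)(1 - 2 e_j) = 2 a_ij - 1.  With three classifiers
    # this system has a closed-form solution for each x_i = 1 - 2 e_i.
    x = np.empty(3)
    for i in range(3):
        j, k = [m for m in range(3) if m != i]
        x[i] = np.sqrt((2 * a[i, j] - 1) * (2 * a[i, k] - 1) / (2 * a[j, k] - 1))
    return (1 - x) / 2  # error rates, assuming every e_i < 0.5

est = estimate_errors(a)  # close to true_errors, recovered without any labels
```

The key design point, echoed in the abstract, is that agreement is measurable without supervision; the independence assumption is what the thesis's extensions relax to obtain robust estimators.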

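The CPG idea described above can likewise be sketched in a few lines: a generator maps a task's context representation to the parameters of a task-specific function, so the same generator produces a different function for every context, including contexts it was never trained on. The sketch below is an illustrative toy with made-up dimensions and a plain linear generator, not the neural architectures used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the thesis): each task is described by a
# context vector; a generator maps contexts to flattened task-network weights.
d_ctx, d_in, d_out = 4, 8, 3

# The generator's own (learnable) parameters: a linear map from context space
# to the parameter space of the task function.
G = rng.normal(size=(d_ctx, d_in * d_out)) * 0.1

def generate_function(context):
    """Contextual parameter generation: context -> parameters -> function."""
    W = (context @ G).reshape(d_in, d_out)  # generated task-specific weights
    return lambda x: x @ W                   # the generated task function

# The same generator yields different functions for different task contexts;
# feeding it an unseen context yields a new function zero-shot.
f_task_a = generate_function(rng.normal(size=d_ctx))
f_task_b = generate_function(rng.normal(size=d_ctx))
x = rng.normal(size=(5, d_in))
y_a, y_b = f_task_a(x), f_task_b(x)
```

Note that `generate_function` is itself a function that produces functions, a small instance of the higher-order view the abstract attributes to CPG.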



Degree Type

  • Dissertation


Department

  • Machine Learning

Degree Name

  • Doctor of Philosophy (PhD)


Advisor(s)

  • Tom Mitchell
