This book discusses the relevance of probabilistic supervised learning to the automated and reliable prediction of an unknown variable that is related to another variable. It provides methods for the reliable learning of the function that represents this relationship between the output and input variables, where such learning is undertaken on real-world data that can be messy in various ways. The book also addresses the inverse problem that arises when one seeks the values of the input at which a new output value is recorded.
The generic solution to the problem of reliable supervised learning amidst real-world messiness lies in treating the sought inter-variable relationship as a (function-valued) random variable which, being random, is ascribed a probability distribution. Recalling that distributions on spaces of functions are given by stochastic processes, the sought function is then modelled as a sample function of a stochastic process. This process is chosen as one that imposes minimal constraints on the sought function, and is identified in the book as a Gaussian Process (GP). The sought function can thus be inferred, as long as the covariance function of the underlying GP is learnt from the available training set. The book presents probabilistic techniques for undertaking such learning in spite of the challenges borne by the data, and illustrates these techniques on real data. The learning of a function is always followed by closed-form prediction of the mean and dispersion of the output variable realised at a test input.
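To make the closed-form prediction step concrete, the following is a minimal sketch of standard GP regression, assuming a squared-exponential covariance with illustrative, hand-picked hyperparameters (amplitude, length_scale, noise_var are all hypothetical choices); it does not reproduce the book's own techniques for learning the covariance function from messy data.

```python
# A minimal sketch of closed-form GP prediction under an assumed
# squared-exponential covariance; hyperparameters are illustrative,
# not learnt as in the book's methods.
import numpy as np

def sq_exp_kernel(x1, x2, amplitude=1.0, length_scale=1.0):
    """Squared-exponential covariance between two sets of 1-d inputs."""
    d = x1[:, None] - x2[None, :]
    return amplitude**2 * np.exp(-0.5 * (d / length_scale)**2)

def gp_predict(x_train, y_train, x_test, noise_var=1e-2):
    """Closed-form posterior mean and variance of the output at test inputs."""
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_test)
    K_ss = sq_exp_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                    # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                         # posterior mean at test inputs
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)   # posterior predictive variance
    return mean, var

# Toy usage: noisy observations of a smooth function.
rng = np.random.default_rng(0)
x_tr = np.linspace(0.0, 5.0, 20)
y_tr = np.sin(x_tr) + 0.1 * rng.standard_normal(20)
mu, var = gp_predict(x_tr, y_tr, np.array([2.5]))
print(f"predicted mean {mu[0]:.3f}, dispersion (sd) {np.sqrt(var[0]):.3f}")
```

Once the covariance function is fixed, both the mean and the dispersion at a test input are available in closed form, as above; the substance of the book lies in learning that covariance function given real, possibly messy, training data.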
To help with the background, the book includes reviews of stochastic processes and basic probability theory. This makes it valuable for students across disciplines, including students of the computational sciences, statistics, and mathematics.