The purpose of these lecture notes is to provide an introduction to the general theory of
empirical risk minimization with an emphasis on excess risk bounds and oracle inequalities in
penalized problems. In recent years, there have been new developments in this area motivated by
the study of new classes of methods in machine learning, such as large margin classification
methods (boosting, kernel machines). The main probabilistic tools involved in the analysis of
these problems are concentration and deviation inequalities by Talagrand, along with other
methods of empirical processes theory (symmetrization inequalities, the contraction inequality
for Rademacher sums, entropy and generic chaining bounds). Sparse recovery based on l_1-type
penalization and low rank matrix recovery based on nuclear norm penalization are other
active areas of research in which the main problems can be stated in the framework of penalized
empirical risk minimization, and in which concentration inequalities and empirical processes
tools have proved to be very useful.
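Schematically, each of these problems can be viewed as an instance of a common template (the notation here is only illustrative and does not follow the conventions fixed later in the notes): given i.i.d. observations $(X_1,Y_1),\dots,(X_n,Y_n)$, a loss function $\ell$, a class $\mathcal{F}$ of candidate functions and a penalty $\mathrm{pen}(\cdot)$ with regularization parameter $\varepsilon>0$, one studies estimators of the form
$$
\hat f_{\varepsilon} := \operatorname*{argmin}_{f\in\mathcal{F}}
\Big\{ \frac{1}{n}\sum_{i=1}^{n} \ell\big(Y_i, f(X_i)\big) + \varepsilon\,\mathrm{pen}(f) \Big\},
$$
where $\mathrm{pen}(f)$ is, for instance, the l_1-norm of a coefficient vector in sparse recovery, or the nuclear norm of a matrix in low rank matrix recovery, and the goal is to bound the excess risk of $\hat f_{\varepsilon}$ in the form of an oracle inequality.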