Continuous Vector Quantile Regression
Sanketh Vedula, Irene Tallini,
Aviv A. Rosenberg, Marco Pegoraro, Emanuele Rodolà, and
2 more authors
In ICML Workshop on New Frontiers in Learning, Control, and Dynamical Systems, 2023
Vector quantile regression (VQR) estimates the conditional vector quantile function (CVQF), a fundamental quantity which fully represents the conditional distribution of Y|X. VQR is formulated as an optimal transport (OT) problem between a uniform U ∼ μ and the target (X, Y) ∼ ν, the solution of which is a unique transport map, co-monotonic with U. Recently, non-linear VQR (NL-VQR) was proposed to estimate non-linear CVQFs, together with fast solvers which enabled the use of this tool in practical applications. Despite its utility, the scalability and estimation quality of NL-VQR are limited due to a discretization of the OT problem onto a grid of quantile levels. We propose a novel continuous formulation and parametrization of VQR using partial input-convex neural networks (PICNNs). Our approach allows for accurate, scalable, differentiable and invertible estimation of non-linear CVQFs. We further demonstrate, theoretically and experimentally, how continuous CVQFs can be used for general statistical inference tasks: estimation of likelihoods, CDFs, confidence sets, coverage, sampling, and more. This work is an important step towards unlocking the full potential of VQR.
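The key structural requirement behind the PICNN parametrization is partial convexity: a scalar potential g(u; x) that is convex in the quantile level u for every covariate x (its u-gradient then gives a co-monotonic transport map). A minimal numpy sketch of this idea, with hypothetical toy dimensions and randomly drawn weights (not the paper's architecture), enforces non-negativity of the inner-path weights via a softplus reparametrization and uses a convex, non-decreasing activation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(z):
    # convex, non-decreasing activation; also used to keep weights >= 0
    return np.logaddexp(0.0, z)

# Hypothetical toy sizes: 2-D quantile level u, 3-D covariate x, hidden width 8.
d_u, d_x, h = 2, 3, 8
Wu1 = rng.normal(size=(h, d_u))    # first layer: affine in u (unconstrained)
Wx1 = rng.normal(size=(h, d_x))    # x-path: unconstrained (no convexity needed in x)
Wz2_raw = rng.normal(size=(1, h))  # raw weights; mapped through softplus to be >= 0
Wu2 = rng.normal(size=(1, d_u))
Wx2 = rng.normal(size=(1, d_x))

def picnn(u, x):
    """Scalar potential g(u; x), convex in u for every fixed x."""
    z1 = softplus(Wu1 @ u + Wx1 @ x)      # convex in u (convex fn of affine map)
    Wz2 = softplus(Wz2_raw)               # non-negative combination preserves convexity
    out = Wz2 @ z1 + Wu2 @ u + Wx2 @ x    # + affine-in-u and x-only terms
    return out.item()

# Numerical sanity check of convexity in u (midpoint inequality):
for _ in range(100):
    u1, u2 = rng.normal(size=d_u), rng.normal(size=d_u)
    x = rng.normal(size=d_x)
    mid = picnn(0.5 * (u1 + u2), x)
    assert mid <= 0.5 * (picnn(u1, x) + picnn(u2, x)) + 1e-9
```

In the continuous formulation described above, such a potential would be trained on the OT objective and differentiated with respect to u (e.g. by autodiff) to evaluate the CVQF at arbitrary quantile levels, rather than on a fixed grid.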