

Poster

Convergence of Online Learning Algorithm for a Mixture of Multiple Linear Regressions

Yujing Liu · Zhixin Liu · Lei Guo


Abstract:

Mixed linear regression (MLR) is a powerful model for characterizing nonlinear relationships among observed data while keeping the model simple and computationally efficient. This paper investigates the online learning and data clustering problem for the MLR model with an arbitrary number of sub-models and arbitrary mixing weights. Most previous investigations focus on offline learning algorithms, and their convergence results are established under the assumption of independent and identically distributed (i.i.d.) input data. To overcome this fundamental limitation, we propose a novel online learning algorithm for parameter estimation based on the EM principle. Using Ljung's ODE method and the Lyapunov stability theorem, we first establish almost sure convergence results for the MLR problem without the traditional i.i.d. assumption on the input data. In addition, we analyze the performance of online data clustering based on the parameter estimates, which is asymptotically the same as in the case of known parameters. Finally, a simulation example is given to verify the effectiveness of the proposed algorithm.
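To make the kind of recursion described above concrete, here is a minimal sketch of an online EM-style update for mixed linear regression. It is an illustrative, generic implementation rather than the paper's exact algorithm: the number of sub-models `K`, the noise variance `sigma2`, the decreasing step-size schedule `gamma = 1/t`, and the synthetic autoregressive input stream are all assumptions made for the example.

```python
# A minimal sketch of an online EM-style recursion for mixed linear regression.
# Illustrative only: K, sigma2, and the step-size schedule are assumptions,
# not the algorithm or constants from the paper.
import numpy as np

def online_em_mlr(data_stream, dim, K=2, sigma2=1.0, seed=0):
    """Run a stochastic-approximation EM recursion over a stream of (x, y) pairs."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(K, dim))      # regression coefficients of the K sub-models
    pi = np.full(K, 1.0 / K)               # mixing weights
    for t, (x, y) in enumerate(data_stream, start=1):
        gamma = 1.0 / t                    # decreasing step size (assumed schedule)
        # E-step: responsibility of each sub-model for the current sample
        resid = y - theta @ x              # shape (K,)
        logw = np.log(pi) - 0.5 * resid**2 / sigma2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # M-step (stochastic approximation): move each sub-model toward the sample,
        # weighted by its responsibility, and track the mixing weights
        theta += gamma * (w * resid)[:, None] * x[None, :]
        pi += gamma * (w - pi)
        pi = np.clip(pi, 1e-6, None)
        pi /= pi.sum()
    return theta, pi

# Usage on a synthetic, non-i.i.d. (autoregressive) input stream with two sub-models
def synthetic_stream(n=20000, dim=3, seed=1):
    rng = np.random.default_rng(seed)
    true_theta = np.array([[2.0, -1.0, 0.5], [-1.5, 0.5, 1.0]])
    x = np.zeros(dim)
    for _ in range(n):
        x = 0.5 * x + rng.normal(size=dim)          # correlated (non-i.i.d.) inputs
        k = rng.integers(2)                         # latent sub-model label
        y = true_theta[k] @ x + 0.1 * rng.normal()  # noisy observation
        yield x.copy(), y

theta_hat, pi_hat = online_em_mlr(synthetic_stream(), dim=3)
print(theta_hat, pi_hat)
```

Note that, as with any mixture estimator, the recovered sub-models may converge to a permutation of the generating parameters; clustering a new sample then amounts to assigning it to the sub-model with the highest responsibility.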
