
Acceleration of Sparse Vector Autoregressive Modeling Using GPUs

Abstract

Autoregressive modeling is a standard approach to mathematically describing the behavior of a time series, and the vector autoregressive (VAR) model describes the joint behavior of multiple time series. VAR modeling is a fundamental technique with applications in multiple domains, such as time series forecasting, Granger causality analysis, system identification, and stochastic control. Solving high-dimensional VAR models requires sparse regression techniques from machine learning, yet even efficient algorithms for sparse regression are too slow to be useful on large, high-dimensional sparse VAR modeling problems. An earlier application of sparse VAR modeling in the neuroimaging domain required IBM's Blue Gene supercomputers. In this paper we describe an approach to accelerating large-scale sparse VAR problems, solved with the lasso regression algorithm, on state-of-the-art GPUs. Our accelerated implementation on an NVIDIA GTX 1080 GPU solves the problem in a few seconds, reaching up to 4 TFLOPS of single-precision performance, which is close to 55% of the GPU's peak matrix-multiply (GEMM) performance.
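
For readers unfamiliar with the sparse regression step the abstract refers to, the sketch below fits a VAR(p) model by solving one independent lasso problem per output series. This is only a minimal CPU illustration of the general technique, not the authors' GPU implementation; the function name fit_sparse_var, the parameters p and lam, and the use of scikit-learn's Lasso solver are our own illustrative assumptions.

# Minimal sketch of sparse VAR estimation via per-series lasso regression
# (illustrative only; not the paper's GPU-accelerated implementation).
import numpy as np
from sklearn.linear_model import Lasso

def fit_sparse_var(X, p=2, lam=0.1):
    """X: (T, n) array holding T observations of n time series.
    Returns the coefficient matrix A of shape (n, n * p)."""
    T, n = X.shape
    # Lagged design matrix: row for time t holds [x_{t-1}, ..., x_{t-p}].
    Z = np.hstack([X[p - k:T - k] for k in range(1, p + 1)])  # (T-p, n*p)
    Y = X[p:]                                                 # (T-p, n)
    A = np.empty((n, n * p))
    for i in range(n):  # one independent lasso problem per series
        A[i] = Lasso(alpha=lam, fit_intercept=False).fit(Z, Y[:, i]).coef_
    return A

# Usage: recover a sparse VAR(1) from simulated data.
rng = np.random.default_rng(0)
A_true = np.diag([0.9, -0.5, 0.7])
X = np.zeros((500, 3))
for t in range(1, 500):
    X[t] = A_true @ X[t - 1] + 0.1 * rng.standard_normal(3)
print(fit_sparse_var(X, p=1, lam=0.01).round(2))

Because each of the n lasso problems is independent and shares the same design matrix, the workload maps naturally onto batched dense linear algebra, which is why a GEMM-dominated GPU implementation is the natural acceleration target.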

Date

01 Dec 2019

Publication

HiPC 2019 (conference paper)
