Dilated Convolution for Time Series Learning
Abstract
State-of-the-art (SOTA) deep-learning models for time series are inspired by convolutional neural networks (CNNs), recurrent neural networks (RNNs), or transformers, architectures that have proven successful in domains such as vision and text. However, no gold-standard architecture for time series modeling has yet been established. In this paper, we propose a new neural network structure that can serve as a strong baseline for time series problems by combining dilated kernels with fully convolutional networks (FCNs). The proposed model, the dilated multi-kernel fully convolutional network (DM-FCN), is a composite model with a large receptive field designed to capture long-range interactions in multivariate time series data. We evaluate DM-FCN on a variety of time series benchmarks and find that it outperforms state-of-the-art models on many of them by a large margin. Drawing on statistical insights, we also evaluate several variants of DM-FCN and discuss model selection across diverse time series data.
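To make the idea of combining dilated kernels with an FCN concrete, the following is a minimal sketch in PyTorch. The abstract does not specify the DM-FCN configuration, so the branch kernel sizes, dilation rates, channel widths, and class names here (DilatedMultiKernelBlock, DMFCNClassifier) are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of a dilated multi-kernel convolutional block for
# multivariate time series, assuming a PyTorch implementation. All
# hyperparameters below are illustrative placeholders.
import torch
import torch.nn as nn


class DilatedMultiKernelBlock(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes and dilations.

    Each branch sees the input at a different effective receptive field;
    concatenating the branches lets one layer capture both short- and
    long-range temporal interactions.
    """

    def __init__(self, in_channels: int, branch_channels: int = 32,
                 kernel_sizes=(3, 5, 7), dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList()
        for k, d in zip(kernel_sizes, dilations):
            # "same" padding so every branch preserves the input length
            pad = d * (k - 1) // 2
            self.branches.append(nn.Sequential(
                nn.Conv1d(in_channels, branch_channels, kernel_size=k,
                          dilation=d, padding=pad),
                nn.BatchNorm1d(branch_channels),
                nn.ReLU(),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); branch outputs are concatenated
        # along the channel dimension
        return torch.cat([branch(x) for branch in self.branches], dim=1)


class DMFCNClassifier(nn.Module):
    """FCN-style classifier: stacked blocks + global average pooling."""

    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.block1 = DilatedMultiKernelBlock(in_channels)
        self.block2 = DilatedMultiKernelBlock(3 * 32)  # 3 branches x 32 ch
        self.head = nn.Linear(3 * 32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.block2(self.block1(x))
        return self.head(x.mean(dim=-1))  # global average pooling over time


# Example: a batch of 8 multivariate series, 6 channels, 128 time steps.
model = DMFCNClassifier(in_channels=6, num_classes=5)
logits = model(torch.randn(8, 6, 128))
print(logits.shape)  # torch.Size([8, 5])
```

The dilated branches widen the receptive field without extra parameters (a kernel of size 7 with dilation 4 spans 25 time steps), and the global-average-pooling head, a standard choice in FCN baselines for time series, keeps the model independent of input length.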