In this paper, we propose a neuro-symbolic framework called the signal temporal logic neural network (STONE) that combines the strengths of neural networks and temporal logics. Weighted Signal Temporal Logic (wSTL) formulas are recursively composed of subformulas connected by logical and temporal operators. The quantitative semantics of wSTL is defined such that the quantitative satisfaction of subformulas with higher weights has a greater influence on the quantitative satisfaction of the overall wSTL formula. In STONE, each neuron represents a component of a wSTL formula, and the output of the network corresponds to the quantitative satisfaction of that formula. We use STONE to represent wSTL formulas and to classify time-series data. wSTL formulas are more interpretable and human-readable than classical time-series classification models. STONE is end-to-end differentiable, which allows wSTL formulas to be learned via back-propagation. Experiments on benchmark time-series datasets show that STONE is comparable to state-of-the-art time-series classification models, and that the wSTL learning algorithm is faster than traditional STL learning algorithms.
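To make the weighted quantitative semantics concrete, the sketch below shows one plausible way a weighted conjunction of predicate robustness values could be made differentiable via a weighted soft-minimum. The function name `soft_min`, the `beta` sharpness parameter, and the specific weighting scheme are illustrative assumptions, not necessarily the paper's exact wSTL definition:

```python
import numpy as np

def soft_min(r, w, beta=10.0):
    """Differentiable weighted minimum of robustness values.

    r: robustness of each subformula (positive means satisfied).
    w: positive importance weights; larger weight -> more influence.
    beta: sharpness; larger beta approaches the hard minimum.

    Illustrative semantics sketch only, not the paper's exact definition.
    """
    r = np.asarray(r, dtype=float)
    w = np.asarray(w, dtype=float)
    a = w * np.exp(-beta * r)          # weight terms near the minimum most heavily
    return float(np.sum(r * a) / np.sum(a))

# Weighted conjunction of two predicates, p1: x > 0.2 and p2: x < 0.8,
# evaluated on a single signal sample x = 0.5.
x = 0.5
r = [x - 0.2, 0.8 - x]                 # per-predicate robustness: [0.3, 0.6]
rho = soft_min(r, w=[1.0, 2.0])        # weighted conjunction robustness
```

Because the expression is smooth in both the robustness values and the weights, gradients can flow through it during back-propagation, which is what enables end-to-end learning of the formula weights.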