In recent years, decentralized bilevel optimization problems have received increasing attention in the networking and machine learning communities. However, for decentralized bilevel optimization over networks with limited computation and communication capabilities, achieving low sample complexity and low communication complexity are two fundamental challenges. In this paper, we make the first attempt to investigate the class of decentralized bilevel optimization problems whose outer and inner subproblems are nonconvex and strongly convex, respectively. Our main contributions in this paper are two-fold: i) We first propose a deterministic algorithm called INTERACT (inner-gradient-descent-outer-tracked-gradient), which requires a sample complexity of O(nϵ^{-1}) and a communication complexity of O(ϵ^{-1}) to solve the bilevel optimization problem, where n is the number of samples at each agent and ϵ > 0 is the desired stationarity gap. ii) To relax the need for full gradient evaluations in each iteration, we propose a stochastic variance-reduced version of INTERACT (SVR-INTERACT), which improves the sample complexity to O(√n ϵ^{-1}) while achieving the same communication complexity as the deterministic algorithm. Our numerical experiments also corroborate our theoretical findings.