Maximum-A-Posteriori (MAP) inference is a fundamental task in probabilistic inference, and belief propagation (BP) is a widely used algorithm for MAP inference. Although BP has been applied successfully in many fields, it offers no performance guarantee and often performs poorly on loopy graphs. To improve performance on loopy graphs and to scale to large graphs, we propose a variational message passing neural network (V-MPNN) that leverages both the power of neural networks in modeling complex functions and the well-established algorithmic theory of variational belief propagation. Instead of relying on a hand-crafted variational assumption, we propose a neural-augmented free energy in which a general variational distribution is parameterized by a neural network. A message passing neural network is used to minimize the neural-augmented free energy. Training of the MPNN is thus guided by the neural-augmented free energy, without requiring exact MAP configurations as annotations. We empirically demonstrate the effectiveness of the proposed V-MPNN by comparing it against both state-of-the-art training-free and training-based methods.