Learning the underlying patterns of data with unsupervised generative models is a challenging task. Inspired by the probabilistic nature of quantum physics, Born Machines have shown great success as generative models in learning the joint probability distribution of a given dataset. Leveraging the expressivity and trainability of Projected Entangled Pair State (PEPS) networks, we study the capability of a Born Machine with a PEPS structure to learn the underlying patterns of the classical Ising model and of two-dimensional Rydberg atom arrays. Since exact PEPS contraction does not scale easily to larger systems and higher bond dimensions, we also investigate the effect of more efficient approximate contraction schemes, such as snake-like MPS, MPS-MPO, and random contractions, on the performance of the Born Machine. Our preliminary results indicate that the PEPS model significantly outperforms the snake-like MPS model on 2D Ising configurations. We also observe that, in the vicinity of the Ising critical point, the energy and magnetization profiles of configurations sampled from the PEPS model deviate slightly from those of the Monte Carlo samples. We argue that this is due to the emergence of long-range spin correlations close to the Ising critical point.
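As a minimal illustration of the Born-rule sampling idea behind these models, the sketch below builds a toy Born Machine from a random MPS and computes normalized configuration probabilities as p(x) ∝ |Ψ(x)|², where Ψ(x) is obtained by contracting the network along a one-dimensional ordering (the 1D analogue of the snake-like contraction mentioned above). All names, shapes, and the tiny system size are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy setup (assumed, not from the paper): an MPS over
# n binary spins with bond dimension D; boundary bonds have size 1.
n, D = 6, 4
cores = [rng.normal(size=(1 if i == 0 else D, 2, 1 if i == n - 1 else D))
         for i in range(n)]

def amplitude(bits):
    """Contract the MPS left-to-right for one spin configuration."""
    v = np.ones((1,))
    for core, b in zip(cores, bits):
        # Fix the physical index to the spin value, absorb the core
        # into the running boundary vector.
        v = v @ core[:, b, :]
    return v.item()

# Born rule: p(x) = |Psi(x)|^2 / Z. For this tiny system we can
# enumerate all 2**n configurations to normalize exactly.
configs = [tuple((k >> i) & 1 for i in range(n)) for k in range(2 ** n)]
amps = np.array([amplitude(c) for c in configs])
probs = amps ** 2 / np.sum(amps ** 2)
```

In a trained model the cores would be optimized (e.g. by minimizing the negative log-likelihood of the data), and for 2D systems the single boundary vector would be replaced by a boundary MPS that is contracted row by row, which is where the snake-like MPS, MPS-MPO, and random contraction schemes differ.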