Containerized applications have become widely used in modern software development because of their flexibility and lightweight deployment. A large number of publicly available container images now exist, built on many different operating systems (OSs) and versions. As a result, developers typically select images carefully, trading off memory footprint against the flexibility of available libraries. However, information about image performance remains insufficient. We have verified that images based on different OSs and versions yield different performance for certain applications running in them. Additionally, minor updates to container images, even without version changes, also affect application performance. Therefore, understanding application performance requires determining the performance impact of different container images, together with continuous monitoring of minor image changes. Since existing performance test frameworks lack the features required to analyze performance regressions in container images, in this paper we introduce ImageJockey, an original test framework that continuously evaluates the performance of a broad range of container images. Our framework enables container benchmark experiments with periodic image builds, simple container orchestration, metrics collection, and result visualization. We demonstrate the usefulness of our framework through case studies that analyze the performance characteristics of sixteen container images and nine popular benchmarks. The experimental results show noticeable performance variation caused by the deployment environments and the characteristics of Alpine and JDK images.