Motivated by applications in areas such as cloud computing and information technology services, we consider GI/GI/1 queueing systems in which the workload (arrival and service processes) varies on one discrete time scale and the control (server capacity) varies on another. We take a stochastic optimal control approach and formulate the corresponding dynamic control problem as a stochastic dynamic program. Under general assumptions on the queueing system, we derive structural properties of the optimal dynamic control policy, establishing that it can be obtained through a sequence of convex programs. We also derive fluid and diffusion approximations for the problem and propose analytical and computational approaches in these settings. Computational experiments demonstrate the benefits of our theoretical results over standard heuristics.
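To make the setting concrete, the following is a minimal sketch of a GI/GI/1 queue under a fixed server-capacity control, simulated via the Lindley recursion for waiting times. This is an illustration only, not the paper's model or method: the exponential distributions, rates, and the `capacity` scaling of service times are assumptions chosen for the example (the GI/GI/1 framework allows general interarrival and service distributions).

```python
import random

def simulate_gi_gi_1(num_jobs, capacity, seed=0):
    """Waiting times in a GI/GI/1 queue via the Lindley recursion,
    with service times scaled by a server-capacity factor c:
        W_{n+1} = max(0, W_n + S_n / c - A_{n+1}).
    Exponential interarrival and service times are used here purely
    for illustration; the GI/GI/1 model permits general distributions.
    """
    rng = random.Random(seed)
    waits = [0.0]  # the first job arrives to an empty system
    for _ in range(num_jobs - 1):
        service = rng.expovariate(1.0)       # mean-1 service requirement
        interarrival = rng.expovariate(0.9)  # arrival rate 0.9
        w_next = max(0.0, waits[-1] + service / capacity - interarrival)
        waits.append(w_next)
    return waits

# On a common sample path, raising capacity reduces congestion,
# the basic trade-off a dynamic capacity control must balance.
low = simulate_gi_gi_1(10_000, capacity=1.0, seed=42)
high = simulate_gi_gi_1(10_000, capacity=2.0, seed=42)
avg_low = sum(low) / len(low)
avg_high = sum(high) / len(high)
```

Because the Lindley recursion is monotone in the service times, doubling capacity on the same random draws can only reduce each waiting time, so `avg_high < avg_low` on any sample path.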