Performance analysis via stochastic simulation is often subject to input model uncertainty, meaning that the input model is unknown and must be inferred from data. Motivated especially by situations with limited data, we consider a worst-case analysis to handle input uncertainty: we represent the partially available input information as constraints and solve a worst-case optimization problem to obtain a conservative bound on the output. In the context of i.i.d. input processes, such an approach involves simulation-based nonlinear optimization whose decision variables are probability distributions. We explore the use of a specialized class of mirror descent stochastic approximation (MDSA), known as the entropic descent algorithm, which is particularly effective for handling probability simplex constraints, to iteratively solve for local optima. We show how the mathematical program associated with each iteration of the MDSA algorithm can be computed efficiently, and we carry out numerical experiments to illustrate the performance of the algorithm.
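To make the entropic descent update concrete, the following is a minimal sketch of the standard exponentiated-gradient step on the probability simplex: multiply each component by the exponential of the negated (estimated) gradient and renormalize, so iterates remain valid distributions by construction. The objective, cost vector `c`, noise level, and step-size schedule below are illustrative assumptions standing in for the simulation-based gradient estimates discussed in the abstract; they are not taken from the paper.

```python
import numpy as np

def entropic_descent_step(p, grad_est, step_size):
    """One entropic descent (exponentiated-gradient) update on the simplex:
    p_i <- p_i * exp(-step_size * grad_i), then renormalize to sum to 1."""
    w = p * np.exp(-step_size * grad_est)
    return w / w.sum()

# Hypothetical toy problem: maximize the linear objective c^T p over the
# simplex, i.e. descend on -c^T p, using noisy gradient estimates as a
# stand-in for simulation-based gradients.
rng = np.random.default_rng(0)
c = np.array([1.0, 2.0, 3.0, 4.0])    # assumed cost vector (not from the paper)
p = np.full(4, 0.25)                  # start at the uniform distribution
for k in range(1, 201):
    grad_est = -c + rng.normal(scale=0.5, size=4)   # noisy gradient of -c^T p
    p = entropic_descent_step(p, grad_est, step_size=1.0 / np.sqrt(k))
# Every iterate stays on the probability simplex, and the mass concentrates
# on the coordinate with the largest objective coefficient.
```

Note that the multiplicative update makes the simplex constraint free: no projection step is needed, which is the practical appeal of entropic descent over Euclidean projected stochastic approximation in this setting.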