Several candidate quantum algorithms that may be implementable on near-term devices have recently been proposed for estimating the amplitude of a given quantum state, which is a core subroutine in various computing tasks such as Monte Carlo methods. One of these algorithms is based on maximum likelihood estimation with parallelized quantum circuits. In this paper, we extend this method to incorporate the effect of realistic noise, and then give an experimental demonstration on a superconducting IBM Quantum device. The maximum likelihood estimator is constructed from a model that assumes depolarizing noise. We then formulate the problem as a two-parameter estimation problem with respect to the target amplitude parameter and the noise parameter. In particular, we show that there exist anomalous target values at which the Fisher information matrix becomes degenerate, and consequently the estimation error cannot be improved even by increasing the number of amplitude amplifications. The experimental demonstration shows that the proposed maximum likelihood estimator achieves a quantum speedup in the number of queries, although the estimation error saturates due to the noise. This saturated value of the estimation error is consistent with the theory, which supports the validity of the depolarizing noise model and thereby enables us to predict the basic requirements on hardware components (particularly the gate error) for quantum computers to realize the quantum speedup in the amplitude estimation task.
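To make the two-parameter estimation concrete, the following is a minimal numerical sketch, not the paper's actual implementation. It assumes a depolarizing-noise likelihood of the form P_m(θ, r) = 1/2 − (1/2) r^(2m+1) cos(2(2m+1)θ), where m is the number of amplitude amplifications and r ∈ (0, 1] is the noise parameter (r = 1 recovers the noiseless sin²((2m+1)θ) law); the measurement schedule, shot counts, and "true" parameter values below are all illustrative.

```python
import numpy as np

# Hypothetical measurement schedule: m_k amplification rounds, n_shots each.
ms = np.array([0, 1, 2, 4, 8, 16])
n_shots = 100
rng = np.random.default_rng(0)

def hit_prob(theta, r, m):
    # Assumed depolarizing-noise model: the Grover oscillation is damped
    # by r**(2m+1); r = 1 gives the noiseless sin^2((2m+1)*theta) law.
    return 0.5 - 0.5 * r ** (2 * m + 1) * np.cos(2 * (2 * m + 1) * theta)

# Simulate measurement outcomes at illustrative "true" parameter values.
theta_true, r_true = 0.3, 0.95
hits = rng.binomial(n_shots, hit_prob(theta_true, r_true, ms))

# Two-parameter maximum likelihood by brute-force grid search over (theta, r).
thetas = np.linspace(1e-3, np.pi / 2 - 1e-3, 1000)
rs = np.linspace(0.5, 1.0, 100)
T = thetas[:, None, None]   # broadcast: theta grid
R = rs[None, :, None]       # broadcast: noise grid
M = ms[None, None, :]       # broadcast: schedule
P = np.clip(hit_prob(T, R, M), 1e-12, 1 - 1e-12)
ll = (hits * np.log(P) + (n_shots - hits) * np.log(1 - P)).sum(axis=-1)
i, j = np.unravel_index(np.argmax(ll), ll.shape)
theta_hat, r_hat = thetas[i], rs[j]
print(f"theta_hat={theta_hat:.3f}, r_hat={r_hat:.3f}")
```

The anomalous target values mentioned above correspond to points where the log-likelihood surface over (θ, r) flattens along a direction mixing the two parameters, so increasing m no longer sharpens the maximum.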