In this paper we examine the Gaussian and deterministic ML DOA estimates for several cases of a priori noise statistics. We compare the structures of the Gaussian and the deterministic log-likelihood functions and show that, for an unknown noise spectrum, both estimators minimize the entropy of the measurements. We also show that the Gaussian log-likelihood function includes an additional term, which is the entropy of the projection of the measurements onto the signal subspace. We derive general conditions under which the deterministic ML estimate is also an extremum of the Gaussian likelihood function, and show that the asymptotic (large K) limit of these conditions guarantees equal asymptotic performance. We also show that the asymptotic performance is not affected by prior knowledge of the noise spectrum.