A minimax converse has been suggested for the general channel coding problem . This converse comes in two flavors. The first flavor is generally used in the analysis of coding with non-vanishing error probability and provides an upper bound on the rate for a given error probability. The second flavor fixes the rate and provides a lower bound on the error probability. Both converses are stated as min-max optimization problems over an appropriate binary hypothesis testing problem. The properties of the first flavor were studied in , where a saddle point was established. The minimax solution can also be used in conjunction with random coding to achieve 'optimal' coding performance. In this paper we study the properties of the second flavor, i.e., when the rate is fixed. Necessary and sufficient conditions on the saddle-point solution are proved. Moreover, an algorithm for the computation of the saddle point, and hence the bound, is developed. In the DMC case, the algorithm runs in polynomial time.
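As a loose illustration of what computing a min-max saddle point involves (this is a generic sketch, not the algorithm developed in the paper), one can approximate the saddle point of a bilinear game over probability simplices with multiplicative-weights updates; the function name and all parameters below are hypothetical:

```python
import numpy as np

def saddle_point_bilinear(A, iters=5000, eta=0.05):
    """Approximate the saddle point of min_x max_y x^T A y, where x and y
    range over probability simplices, via multiplicative-weights updates.
    Illustrative only; not the saddle-point algorithm from the paper."""
    m, n = A.shape
    x = np.ones(m) / m          # min player's mixed strategy
    y = np.ones(n) / n          # max player's mixed strategy
    x_avg = np.zeros(m)
    y_avg = np.zeros(n)
    for _ in range(iters):
        # Min player shifts weight away from coordinates with high payoff A @ y.
        x *= np.exp(-eta * (A @ y))
        x /= x.sum()
        # Max player shifts weight toward coordinates with high payoff A.T @ x.
        y *= np.exp(eta * (A.T @ x))
        y /= y.sum()
        x_avg += x
        y_avg += y
    # Averaged iterates converge to an approximate equilibrium.
    x_avg /= iters
    y_avg /= iters
    return x_avg, y_avg, x_avg @ A @ y_avg

# Matching pennies: the game value is 0 at the saddle point (0.5, 0.5).
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
x_star, y_star, value = saddle_point_bilinear(A)
```

The duality gap of the averaged strategies shrinks with the number of iterations, which is one standard certificate that an approximate saddle point has been reached.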