When the mean vectors and covariance matrices of the two classes in a binary classification problem are available, Lanckriet et al. propose a minimax approach for finding a linear classifier that minimizes the worst-case (maximum) misclassification probability. In this paper, we extend the minimax approach to multiclass classification, where the number m of classes may be greater than two. We assume that the mean vectors and covariance matrices of all classes are available, but make no further assumptions about the class-conditional distributions. We then formulate the problem of finding linear classifiers that minimize the worst-case misclassification probability α. Unfortunately, no efficient algorithm for solving this problem is known, so we introduce the maximum pairwise misclassification probability β in place of α. We show that β is a lower bound on α and a good approximation of α when m or α is small. We formulate the problem of finding linear classifiers that minimize β, establish some of its basic properties, and then transform it into a parametric second-order cone programming (SOCP) problem. We propose an algorithm that exploits these properties to solve the problem. Preliminary numerical experiments confirm that the classifiers computed by our method perform very well on benchmark problems.
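
For context, the key ingredient behind both the binary formulation of Lanckriet et al. and the pairwise formulation considered here is a distribution-free Chebyshev-type bound (due to Marshall and Olkin): for a class with mean μ and covariance Σ, requiring the worst-case misclassification probability of a half-space classifier to be at most α is equivalent to a second-order cone constraint. A minimal sketch in our own notation, where a is the normal vector of the separating hyperplane and b its threshold (the paper's precise formulation may differ):
\[
\inf_{x \sim (\mu,\,\Sigma)} \Pr\!\left(a^{\top} x \ge b\right) \;\ge\; 1-\alpha
\quad \Longleftrightarrow \quad
a^{\top}\mu - b \;\ge\; \sqrt{\tfrac{1-\alpha}{\alpha}}\,\sqrt{a^{\top}\Sigma\,a},
\]
where the infimum is taken over all distributions with the given mean and covariance. Imposing a constraint of this form for every pair of classes is what leads to the parametric SOCP mentioned above.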