## Hiroshima Mathematical Journal

### Asymptotic cut-off point in linear discriminant rule to adjust the misclassification probability for large dimensions

#### Abstract

This paper is concerned with the problem of classifying an observation vector into one of two populations $\mathit{\Pi}_{1} : N_{p}(\mu_{1},\Sigma)$ and $\mathit{\Pi}_{2} : N_{p}(\mu_{2},\Sigma)$. Anderson (1973, Ann. Statist.) provided an asymptotic expansion of the distribution of a Studentized linear discriminant function and proposed a cut-off point in the linear discriminant rule to control one of the two misclassification probabilities. However, as the dimension $p$ becomes larger, the precision worsens, as confirmed by simulation. Therefore, in this paper we derive an asymptotic expansion of the distribution of a linear discriminant function up to the order $p^{-1}$ as $N_1$, $N_2$, and $p$ tend to infinity together, under the conditions that $p/(N_{1}+N_{2}-2)$ converges to a constant in $(0, 1)$ and $N_{1}/N_{2}$ converges to a constant in $(0, \infty)$, where $N_i$ denotes the size of the sample drawn from $\mathit{\Pi}_i$ $(i=1, 2)$. Using the expansion, we provide a cut-off point. A small-scale simulation shows that the proposed cut-off point has good accuracy.
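To fix ideas, the classical linear discriminant rule underlying the abstract can be sketched as follows. This is a minimal illustration with NumPy, not the paper's adjusted procedure: the variable names (`W`, `S`, `c`) and the parameter values are illustrative assumptions, and the cut-off shown is the naive choice $c = 0$ rather than the asymptotically adjusted cut-off derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p, N1, N2 = 5, 50, 60  # dimension and sample sizes (illustrative values only)
mu1, mu2 = np.zeros(p), np.full(p, 2.0)  # population means of Pi_1 and Pi_2

# Training samples from Pi_1 : N_p(mu1, Sigma) and Pi_2 : N_p(mu2, Sigma)
X1 = rng.multivariate_normal(mu1, np.eye(p), size=N1)
X2 = rng.multivariate_normal(mu2, np.eye(p), size=N2)

xbar1, xbar2 = X1.mean(axis=0), X2.mean(axis=0)
# Pooled sample covariance on N1 + N2 - 2 degrees of freedom
S = ((X1 - xbar1).T @ (X1 - xbar1) + (X2 - xbar2).T @ (X2 - xbar2)) / (N1 + N2 - 2)

def W(x):
    """Sample linear discriminant function evaluated at x."""
    return (x - 0.5 * (xbar1 + xbar2)) @ np.linalg.solve(S, xbar1 - xbar2)

# Classify x into Pi_1 when W(x) > c; here c = 0, whereas the paper's point
# is to replace c by an asymptotic adjustment that controls one
# misclassification probability when p is large relative to N1 + N2 - 2.
c = 0.0
print(W(mu1) > c, W(mu2) > c)
```

With well-separated means, as here, the rule assigns a point near $\mu_1$ to $\mathit{\Pi}_1$ and a point near $\mu_2$ to $\mathit{\Pi}_2$; the difficulty the paper addresses arises when $p/(N_1+N_2-2)$ is not small, so the naive cut-off no longer achieves the intended misclassification probability.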

#### Article information

Source
Hiroshima Math. J., Volume 47, Number 3 (2017), 319-348.

Dates
Revised: 19 December 2016
First available in Project Euclid: 3 November 2017

https://projecteuclid.org/euclid.hmj/1509674450

Digital Object Identifier
doi:10.32917/hmj/1509674450

Mathematical Reviews number (MathSciNet)
MR3719447

Zentralblatt MATH identifier
1381.62198

#### Citation

Yamada, Takayuki; Himeno, Tetsuto; Sakurai, Tetsuro. Asymptotic cut-off point in linear discriminant rule to adjust the misclassification probability for large dimensions. Hiroshima Math. J. 47 (2017), no. 3, 319--348. doi:10.32917/hmj/1509674450. https://projecteuclid.org/euclid.hmj/1509674450