Jason C. Parcon
Doctor of Philosophy
Title: Spearman Rank Regression
Dr. Joshua D. Naranjo, Chair
Dr. Joseph W. Mckean
Dr. Gerald L. Sievers
Dr. Bradley E. Huitema
Date: Friday, May 23, 2003, 3:00 p.m. - 5:00 p.m.
Everett Tower - Alavi Commons
The least squares estimator of a regression coefficient
β is known to be optimal when the errors in a regression model have
a normal distribution. In the presence of extreme or outlying
values, however, the resulting least squares estimates have inflated
mean squared errors, since they are strongly pulled toward these
extreme values.
Rank-based estimates proposed by Jaeckel (1972) and Jureckova (1971)
achieve some robustness against outliers and have good efficiency for
normal error distributions. This approach, however, remains robust only
when the x values are fixed. If the x values are a random sample from
some underlying distribution, then the possibility of gross errors in x is
introduced and the method loses its robustness.
This dissertation proposes an estimator obtained from an estimating
function that is fundamentally the Spearman's correlation of the x values
and their corresponding residuals. For simple linear regression, the
solution is the weighted median of the pairwise slopes
(Yj - Yi)/(xj - xi), with weights proportional to |Rank(xj) - Rank(xi)|.
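The weighted-median estimator described above can be sketched in a few lines of code. This is an illustrative sketch, not the author's implementation: the function names (`spearman_slope`, `weighted_median`) and the example data are hypothetical, and ties in x are handled naively by skipping zero-denominator pairs.

```python
# Sketch of the simple Spearman regression slope: the weighted median of
# pairwise slopes (y_j - y_i)/(x_j - x_i), with each pair weighted by
# |rank(x_j) - rank(x_i)|.  Function names and data are illustrative.
from itertools import combinations

def rank(values):
    # Ordinary 1-based ranks; ties broken by order of appearance.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def weighted_median(points, weights):
    # Smallest point at which cumulative weight reaches half the total.
    pairs = sorted(zip(points, weights))
    total = sum(weights)
    cum = 0.0
    for p, w in pairs:
        cum += w
        if cum >= total / 2:
            return p
    return pairs[-1][0]

def spearman_slope(x, y):
    rx = rank(x)
    slopes, weights = [], []
    for i, j in combinations(range(len(x)), 2):
        if x[j] != x[i]:  # skip tied x values (zero denominator)
            slopes.append((y[j] - y[i]) / (x[j] - x[i]))
            weights.append(abs(rx[j] - rx[i]))
    return weighted_median(slopes, weights)

# Example: data on the line y = 2x + 1 with one gross outlier in y.
x = [1, 2, 3, 4, 5, 6]
y = [3, 5, 7, 9, 11, 100]   # last response is an outlier
print(spearman_slope(x, y))  # prints 2.0 -- the outlier is resisted
```

The example illustrates the robustness claim: the five clean points contribute slope 2 with enough combined rank-difference weight that the weighted median ignores the wild pairwise slopes induced by the outlier.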
Specifically, this dissertation investigates the efficiency and robustness
properties of the simple Spearman regression estimate, develops Spearman
multiple regression estimators (for two independent variables) that
reduce to simple Spearman rank regression, derives consistency
and asymptotic normality properties of the proposed multiple regression
estimates, investigates small-sample performance of the proposed estimates
by simulation methods, and compares the performance of the proposed
multiple regression method with two alternative methods for multiple
Spearman rank regression.