This paper introduces a methodology for improving supervised classification when predictor variables are multicollinear, based on partial least squares (PLS) logistic regression. The authors show that replacing the original predictors with a small set of orthogonal PLS components reduces classification error rates, and they propose strategies for selecting components and for evaluating classifier performance. The findings support sounder analysis practice in settings with many correlated predictors, where ignoring multicollinearity can degrade classification accuracy.
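As a rough illustration of the general idea (not the authors' exact algorithm), the sketch below extracts orthogonal PLS components from correlated predictors and then fits a logistic classifier on those components. It uses scikit-learn's least-squares `PLSRegression` as a stand-in for the paper's PLS logistic regression step; the synthetic data, the choice of three components, and all variable names are assumptions made for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic data with strongly multicollinear predictors:
# 30 observed variables driven by only 3 latent factors.
rng = np.random.default_rng(0)
n, p = 200, 30
latent = rng.normal(size=(n, 3))
X = latent @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))
y = (latent[:, 0] + 0.5 * latent[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: compress the correlated predictors into a few orthogonal PLS scores.
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
T_train = pls.transform(X_train)  # orthogonal components replace the raw predictors
T_test = pls.transform(X_test)

# Step 2: fit a logistic classifier on the components and check the error rate.
clf = LogisticRegression().fit(T_train, y_train)
print("test accuracy:", clf.score(T_test, y_test))
```

In practice the number of components would be chosen by cross-validation rather than fixed in advance, which is one of the selection questions the paper addresses.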