Abstract
The maximum likelihood estimator (MLE) suffers from instability in the presence of multicollinearity in the Poisson regression model (PRM). In this study, we propose a new estimator with some biasing parameters to estimate the regression coefficients for the PRM when multicollinearity is present. Simulation experiments are conducted to compare the estimators' performance using the mean squared error (MSE) criterion. For illustration, the aircraft damage data are analyzed. Both the simulation results and the real-life application show that the proposed estimator outperforms the other estimators considered.
Introduction
The Poisson regression model (PRM) is often adopted for modelling count data. The PRM models the relationship between a response variable and one or more regressors, where the response is a count variable taking non-negative integer values, such as the number of defects in a unit of manufactured product, errors or bugs in software, road accidents, times a machine fails in a month, occurrences of a viral disease, or counts of particulate matter or other pollutants in the environment. The regression coefficients in the PRM are estimated using the maximum likelihood estimator (MLE).
In the linear regression model (LRM), estimator performance suffers from high instability when the regressors are correlated, i.e. under multicollinearity (see, for example,1,2). The effects of multicollinearity include inflated variances and covariances of the regression coefficients, wider confidence intervals, insignificant t-ratios and a high R-square. Multicollinearity also negatively influences the performance of the MLE in the PRM3,4. Alternative estimators to the MLE in the LRM include the ridge regression estimator by Hoerl and Kennard5, the Liu estimator by Liu6, the Liu-type estimator by Liu7, the two-parameter estimator by Ozkale and Kaciranlar8, the k-d class estimator by Sakallioglu and Kaciranlar9, a two-parameter estimator by Yang and Chang10, the modified two-parameter estimator by Dorugade11 and, more recently, the modified ridge-type estimator by Lukman et al.12, the modified new two-parameter estimator by Lukman et al.13, a further proposal on the new two-parameter estimator by Ahmad and Aslam14, and the K–L estimator by Kibria and Lukman15.
Researchers have applied some of these estimators to the Poisson regression model. Månsson and Shukur3 proposed the Poisson ridge regression estimator (PRRE), and Månsson et al.16 developed the Poisson Liu estimator (PLE) to mitigate the problem of multicollinearity in the PRM. Batah et al.17 proposed the modified jackknifed ridge regression estimator (MJRE) for the LRM, while Türkan and Özel18 adapted the MJRE to the Poisson regression model as a remedy for multicollinearity. Özkale and Kaciranlar8 combined the Liu estimator and ridge regression to form the two-parameter estimator in the LRM, and Asar and Genç19 implemented the two-parameter estimator in the Poisson regression model. Rashad and Algamal20 developed a new ridge estimator for the Poisson regression model by modifying the Poisson modified jackknifed ridge regression. Qasim et al.4 suggested some new shrinkage estimators for the PLE. These estimators can be classified into Poisson regression estimators with a single shrinkage parameter and with two parameters, respectively. Recently, Kibria and Lukman15 proposed another ridge-type estimator, the K–L estimator, with a single shrinkage parameter.
This study aims to propose an estimator that can handle multicollinearity in the Poisson regression model. We adapt the K–L estimator to the PRM and suggest some shrinkage parameter estimators for it. We also compare the proposed estimator's performance with those of the MLE, PRRE and PLE in terms of the matrix mean squared error (MSEM) and the mean squared error (MSE). The small-sample properties are investigated through a simulation experiment. Finally, the new method's benefit is illustrated using the aircraft damage data initially analyzed by Myers et al.21.
The rest of this paper is organized as follows: the Poisson regression model, some existing estimators, and the MSEM and MSE properties of the estimators are discussed in Sect. 2. A Monte Carlo simulation experiment is conducted in Sect. 3. To illustrate the findings, the aircraft damage data are analyzed in Sect. 4. Some concluding remarks are given in Sect. 5.
Statistical methodology
Poisson regression model and maximum likelihood estimator
Suppose that the response variable \(y_{i}\) takes non-negative integer values (count data); then its probability function is given as follows
where \(\mu_{i} > 0.\) The mean and variance of the Poisson distribution in Eq. (2.1) are the same (i.e.\(E(y) = Var\left( y \right) = \mu\)). The model is written in terms of the mean of the response. According to Myers et al.21, we assume that there exists a function, g, that relates the mean of the response to a linear predictor such that
where \(g(.)\) is a monotone differentiable link function. The log link, \(g\left( {\mu_{i} } \right) = \ln \left( {\mu_{i} } \right) = x^{\prime}_{i} \beta\), equivalently \(\mu_{i} = \exp \left( {x^{\prime}_{i} \beta } \right)\), is the most popular choice; it is generally adopted for the Poisson regression model because it ensures that all fitted values of the response variable are positive. The maximum likelihood estimator is commonly used to estimate the coefficients of the PRM, where the likelihood function is defined as:
where \(\mu_{i} = g^{ - 1} \left( {x^{\prime}_{i} \beta } \right).\) The log-likelihood function is used to estimate the parameter vector \(\beta\)
Since Eq. (2.4) is nonlinear in \(\beta\), the solution is obtained by iterative methods. A common procedure is the Fisher scoring method, defined as:
where \(S(\beta ) = \frac{\partial l\left( \beta \right)}{{\partial \beta }}\) and \(I^{ - 1} \left( \beta \right) = \left( { - E(\partial^{2} l(\beta )/\partial \beta \partial \beta^{\prime})} \right)^{ - 1}\). The final step of the estimated coefficients corresponds to:
where \(\hat{W} = diag(\hat{\mu }_{i} )\) and \(\hat{z}\) is the adjusted response variable with elements \(\hat{z}_{i} = x^{\prime}_{i} \hat{\beta }^{PMLE} + \frac{{y_{i} - \hat{\mu }_{i} }}{{\hat{\mu }_{i} }}.\) Both \(\hat{W}\) and \(\hat{z}\) are obtained from the Fisher scoring iterative procedure (see Hardin and Hilbe22). The covariance matrix and the mean squared error are given, respectively, as follows:
and
where \(\lambda_{i}\) is the ith eigenvalue of the matrix \(X^{\prime}\hat{W}X\).
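The final Fisher scoring step above amounts to iteratively reweighted least squares (IRLS). The following is a minimal sketch of that loop, ours rather than the paper's (whose computations were done in R), assuming the log link; the function name is illustrative:

```python
import numpy as np

def poisson_irls(X, y, tol=1e-8, max_iter=100):
    """Fisher scoring (IRLS) for the Poisson log-link model.

    X: (n, p) design matrix (include a column of ones for the intercept).
    y: (n,) vector of counts.
    Returns the maximum likelihood estimate of beta.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mu = np.exp(X @ beta)            # fitted means mu_i = exp(x_i' beta)
        z = X @ beta + (y - mu) / mu     # adjusted (working) response z_i
        XtW = X.T * mu                   # X' W with W = diag(mu_i), the log-link weights
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)  # weighted least squares step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Each pass solves a weighted least squares problem; at convergence the result is the Poisson MLE, \(\hat{\beta }^{PMLE}\).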
Poisson K–L estimator
Månsson and Shukur3 developed the Poisson ridge regression estimator (PRRE) to mitigate the problem of multicollinearity, which is defined as follows:
where \(k > 0\) is the biasing parameter, \(I\) is a \(p \times p\) identity matrix and the optimal value of k is defined as:
where \(\hat{\alpha }_{i}\) is the ith component of \(\alpha = Q^{\prime}\beta\), Q is the matrix whose columns are the eigenvectors of \(X^{\prime}\hat{W}X.\)
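In matrix form, the PRRE is simply a shrinkage applied to the fitted MLE. A minimal sketch (our own function name; \(X^{\prime}\hat{W}X\) and \(\hat{\beta }^{PMLE}\) are assumed to come from a converged Fisher scoring fit, and k from Eq. (2.10)):

```python
import numpy as np

def poisson_ridge(XtWX, beta_mle, k):
    """Poisson ridge estimator (PRRE): (X'WX + kI)^{-1} X'WX beta_MLE, k > 0."""
    p = XtWX.shape[0]
    return np.linalg.solve(XtWX + k * np.eye(p), XtWX @ beta_mle)
```

In the canonical form \(\alpha = Q^{\prime}\beta\), this multiplies each component by \(\lambda_{i} /(\lambda_{i} + k)\), so a larger k gives stronger shrinkage toward zero.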
Månsson et al.16 introduced the Poisson Liu estimator (PLE) as follows:
where \(d\) according to Månsson et al.16 may be estimated by the following formula:
Kibria and Lukman15 proposed a new single parameter ridge-type estimator for the linear regression model, which is defined as follows:
Following Kibria and Lukman15, we propose the following new estimator for the Poisson regression model:
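Concretely, the proposed Poisson K–L estimator replaces the ridge shrinkage matrix with the K–L form \((X^{\prime}\hat{W}X + kI)^{ - 1} (X^{\prime}\hat{W}X - kI)\). A minimal sketch under the same assumptions as above (function name ours; k > 0 is supplied by the biasing-parameter estimators discussed later in this section):

```python
import numpy as np

def poisson_kl(XtWX, beta_mle, k):
    """Proposed Poisson K-L estimator (PKLE):
    (X'WX + kI)^{-1} (X'WX - kI) beta_MLE, with biasing parameter k > 0."""
    p = XtWX.shape[0]
    I = np.eye(p)
    return np.linalg.solve(XtWX + k * I, (XtWX - k * I) @ beta_mle)
```

In the canonical form, each component of \(\hat{\alpha }^{PMLE}\) is multiplied by \((\lambda_{i} - k)/(\lambda_{i} + k)\), which shrinks more aggressively than the ridge factor \(\lambda_{i} /(\lambda_{i} + k)\) for the same k.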
Suppose \(\alpha = Q^{\prime}\beta\) and \(Q^{\prime}X^{T} \hat{W}XQ = \Lambda = diag\left( {\lambda_{1} ,...,\lambda_{p} } \right)\) where \(\lambda_{1} \ge \lambda_{2} \ge ... \ge \lambda_{p} ,\Lambda\) is the matrix of eigenvalues of \(X^{T} \hat{W}X\) and Q is the matrix whose columns are the eigenvectors of \(X^{T} \hat{W}X.\) The matrix mean square error and the mean square error of the estimators PMLE, PRRE, PLE and PKLE are provided in Eqs. (2.15) to (2.21) respectively as follows:
where \(\Lambda_{d} = \left( {\Lambda + I} \right)^{ - 1} \left( {\Lambda + dI} \right).\)
where \(\Lambda^{k} = \left( {\Lambda + kI_{p} } \right)^{ - 1} .\)
where \(\lambda_{i}\) is the ith eigenvalue of \(X^{\prime}\hat{W}X\) and \(\alpha_{j}\) is the jth element of \(\alpha .\) For the purpose of theoretical comparisons, we adopt the following lemmas.
Lemma 2.1
Let A be a positive definite (pd) matrix, that is A > 0, and \(a\) be some vector; then \(A - aa^{\prime} \ge 0\) if and only if (iff) \(a^{\prime}A^{ - 1} a \le 1\)23.
Lemma 2.2
\(MSEM(\hat{\beta }_{1} ) - MSEM(\hat{\beta }_{2} ) = \sigma^{2} D + b_{1} b_{1}^{T} - b_{2} b_{2}^{T} > 0\) if and only if \(b_{2}^{T} \left[ {\sigma^{2} D + b_{1} b_{1}^{T} } \right]^{ - 1} b_{2} < 1\), where \(MSEM\left( {\hat{\beta }_{j} } \right) = Cov\left( {\hat{\beta }_{j} } \right) + b_{j} b_{j}^{T}\) and \(b_{j}\) is the bias vector of \(\hat{\beta }_{j}\)24.
Theorem 2.1
\(\hat{\alpha }^{PKLE}\) is better than \(\hat{\alpha }^{PMLE}\) iff, \(MSEM\;[\hat{\alpha }^{PMLE} ] - MSEM\;[\hat{\alpha }^{PKLE} ] > 0\) provided k > 0.
Proof.
The matrix \(\Lambda^{ - 1} - \Lambda^{k} (\Lambda - kI_{p} )\Lambda^{ - 1} \Lambda^{k} (\Lambda - kI_{p} )\) is pd since \(\lambda_{i} \left( {\lambda_{i} + k} \right)^{2} - \lambda_{i} \left( {\lambda_{i} - k} \right)^{2} > 0.\)
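Written out for the ith diagonal element (with \(\Lambda^{k} = (\Lambda + kI_{p} )^{ - 1}\)), the difference reduces to a single, manifestly positive term:

```latex
\frac{1}{\lambda_i}-\frac{(\lambda_i-k)^2}{\lambda_i(\lambda_i+k)^2}
=\frac{(\lambda_i+k)^2-(\lambda_i-k)^2}{\lambda_i(\lambda_i+k)^2}
=\frac{4k\lambda_i}{\lambda_i(\lambda_i+k)^2}
=\frac{4k}{(\lambda_i+k)^2}>0,\qquad k>0.
```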
Theorem 2.2
\(\hat{\alpha }^{PKLE}\) is better than \(\hat{\alpha }^{PRRE}\) iff, \(MSEM\;[\hat{\alpha }^{PRRE} ] - MSEM\;[\hat{\alpha }^{PKLE} ] > 0\) provided k > 0.
Proof.
The matrix \(\Lambda^{k} \Lambda \Lambda^{k} - \Lambda^{k} (\Lambda - kI_{p} )\Lambda^{ - 1} \Lambda^{k} (\Lambda - kI_{p} )\) is pd since \(\lambda_{i}^{2} \left( {\lambda_{i} + k} \right)^{2} - \left( {\lambda_{i} + k} \right)^{2} \left( {\lambda_{i} - k} \right)^{2} = k\left( {2\lambda_{i} - k} \right)\left( {\lambda_{i} + k} \right)^{2} > 0\) for \(2\lambda_{i} - k > 0.\)
Theorem 2.3
\(\hat{\alpha }^{PKLE}\) is better than \(\hat{\alpha }^{PLE}\) iff, \(MSEM\;[\hat{\alpha }^{PLE} ] - MSEM\;[\hat{\alpha }^{PKLE} ] > 0\) provided k > 0.
Proof.
The matrix \(\Lambda_{d} \Lambda^{ - 1} \Lambda_{d}^{T} - \Lambda^{k} (\Lambda - kI_{p} )\Lambda^{ - 1} \Lambda^{k} (\Lambda - kI_{p} )\) is pd since \(\lambda_{i} \left( {\lambda_{i} + k} \right)^{2} \left( {\lambda_{i} + d} \right)^{2} - \lambda_{i} \left( {\lambda_{i} + 1} \right)^{2} \left( {\lambda_{i} - k} \right)^{2} > 0.\)
Selection of the biasing parameter
The parameter k is estimated by taking the first derivative of the MSE function of \(\hat{\alpha }^{PKLE}\) with respect to k and equating it to zero. We obtain the following estimates of k:
Following Månsson et al.16 and Lukman and Ayinde25, we propose the following forms of the shrinkage parameters in Eq. (2.26).
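The per-component minimizer can also be written down directly: differentiating the scalar MSE of \(\hat{\alpha }^{PKLE}\) with \(\sigma^{2} = 1\) (as in the Poisson model) gives \(k_{i} = 1/\left( {2\hat{\alpha }_{i}^{2} + 1/\lambda_{i} } \right)\). The sketch below computes these values and two illustrative single-parameter aggregations; the correspondence of min/max to the paper's exact forms in Eqs. (2.26)–(2.28) is our assumption:

```python
import numpy as np

def kl_biasing_parameters(lam, alpha):
    """Per-component MSE-minimizing k for the K-L estimator with sigma^2 = 1:
        k_i = 1 / (2 * alpha_i^2 + 1 / lambda_i).
    lam:   eigenvalues of X'WX (canonical form).
    alpha: components of alpha = Q' beta_MLE.
    Returns illustrative single-parameter aggregations (min and max of k_i).
    """
    lam = np.asarray(lam, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    k = 1.0 / (2.0 * alpha**2 + 1.0 / lam)
    return {"k_min": k.min(), "k_max": k.max()}
```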
Simulation experiment
Simulation design
Since a theoretical comparison among the estimators is not sufficient, a simulation experiment is carried out in this section. We generate the response variable of the PRM from the Poisson distribution \(P_{0} \left( {\mu_{i} } \right)\), where \(\mu_{i} = \exp \left( {x_{i} \beta } \right),\;i = 1,2, \ldots ,n\) and \(\beta = \left( {\beta_{0} ,\beta_{1} ,\beta_{2} , \ldots ,\beta_{p} } \right)^{\prime }\), such that \(x_{i}\) is the ith row of the design matrix X. Following Kibria1, we generated the X matrix as follows:
where \(\rho^{2}\) is the correlation between any two explanatory variables. The values of \(\rho\) are chosen to be 0.85, 0.9, 0.95 and 0.99. The mean function is obtained for p = 4 and 7 regressors, respectively. Following Kibria et al.26, the intercept values are chosen to be − 1, 0 and 1 to vary the average intensity of the Poisson process. The slope coefficients are chosen so that \(\sum\nolimits_{j = 1}^{p} {\beta_{j}^{2} = 1}\) and \(\beta_{1} = \beta_{2} = \cdots = \beta_{p}\), for sample sizes 50, 75, 100 and 200. The simulation experiment is conducted in the R programming language27. The estimated MSE is calculated as
where \(\hat{\beta }_{ij}\) denotes the estimate of the ith parameter in the jth replication and \(\beta_{i}\) is the true parameter value. The experiment is replicated 1000 times. The simulated MSE values of the estimators for p = 4 with intercepts − 1, 0 and 1 are presented in Tables 1, 2 and 3, respectively, and for p = 7 with intercepts − 1, 0 and 1 in Tables 4, 5 and 6, respectively.
Simulation results discussion
The simulation results in Tables 1, 2, 3, 4, 5 and 6 show that the following factors affect the estimators' performance: the degree of correlation, the number of explanatory variables, the sample size and the value of the intercept. Increasing the sample size decreases the MSE values of all the estimators, as expected of any well-behaved estimator. The proposed estimator PKLE2 consistently possesses the minimum MSE. Increasing the degree of correlation increases the simulated MSE values for each of the estimators. The Poisson ridge (PRRE) and Liu (PLE) estimators compete favorably with the proposed estimator; for instance, their MSEs are very similar to that of the proposed estimator when multicollinearity is moderate (ρ = 0.85–0.95). The performance of the PMLE is the worst, especially when the correlation among regressors is 0.90 or higher. Increasing the number of explanatory variables from 4 to 7 raises the MSE of all the estimators. The MSE of all the estimators decreases when the intercept value changes from − 1 to + 1. We also plotted MSE against sample size for different values of ρ and the intercept in Figs. 1, 2, 3, 4 and 5. These figures show that PKLE2 consistently attains the minimum MSE at each sample size n, followed by PKLE1, while the PMLE performs worst. They also reveal that the estimators' performances become similar for large n (200) or small correlation (0.85); even then, the proposed estimator PKLE2 performs best.
Real life application
In this section, we examine the effectiveness of the new estimator using real-life data. We adopt the aircraft damage data to evaluate the performance of the proposed estimator and the other estimators considered in this study. The dataset was initially used by Myers et al.21 and recently by Asar and Genç19, among others. It provides information about two types of aircraft, the McDonnell Douglas A-4 Skyhawk and the Grumman A-6 Intruder, and describes 30 strike missions flown by these two aircraft. The explanatory variables are as follows: x1 is a binary variable representing the aircraft type (A-4 coded as 0 and A-6 coded as 1), while x2 and x3 denote the bomb load in tons and the total months of aircrew experience, respectively. The response variable y represents the number of locations with damage on the aircraft and follows a Poisson distribution19,21. Amin et al.28 examined whether the model follows a Poisson regression model using the Pearson chi-square goodness-of-fit test; the test confirms that the response variable is well fitted by the Poisson distribution, with a test statistic (p-value) of 6.89812 (0.07521).
According to Myers et al.21, there is evidence of a multicollinearity problem in the data. The eigenvalues of the \(X^{\prime}\hat{W}X\) matrix are 0.0433, 374.8961 and 2085.2251. The condition number, \(CN = \sqrt {\frac{{\max \left( {eigenvalue} \right)}}{{\min \left( {eigenvalue} \right)}}} = 219.365\), also indicates multicollinearity in the dataset2,12. The estimators' performances are assessed through the mean squared error (MSE). The MSEs of the estimators are computed using Eqs. (2.15), (2.17), (2.19) and (2.21), respectively, and the biasing parameters are determined using Eqs. (2.10), (2.12), (2.27) and (2.28), respectively. The regression coefficients and the MSE values are provided in Table 7, from which we observe that all the coefficients have the same sign. The PMLE has the highest mean squared error, owing to the presence of multicollinearity, while the proposed estimator PKLE2 has the lowest MSE, which establishes its superiority. The ridge and Liu estimators also perform well under multicollinearity. We observe that the performance of the proposed estimator depends on the biasing parameter k.
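The multicollinearity diagnostic used above is straightforward to compute. A small helper (ours), using the eigenvalue-ratio definition of CN from the text; for the aircraft data one would pass the fitted \(X^{\prime}\hat{W}X\):

```python
import numpy as np

def condition_number(XtWX):
    """Condition number sqrt(lambda_max / lambda_min) of the symmetric
    matrix X'WX; values far above ~30 are usually read as evidence of
    severe multicollinearity."""
    eig = np.linalg.eigvalsh(XtWX)          # eigenvalues in ascending order
    return float(np.sqrt(eig[-1] / eig[0]))
```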
Some concluding remarks
The K–L estimator is an estimator with a single biasing parameter k, which avoids the computational burden of choosing the biasing parameters in some of the two-parameter estimators. It belongs to the ridge and Liu class of estimators for mitigating multicollinearity in the linear regression model. According to Kibria and Lukman15, the K–L estimator outclasses the ordinary least squares estimator, the ridge estimator and the Liu estimator in the linear regression model. As stated earlier, multicollinearity affects the performance of the maximum likelihood estimator (MLE) in both the linear regression model and the Poisson regression model (PRM). The ridge and Liu estimators were, at different times, adapted to the PRM to address multicollinearity. In this study, we developed a new estimator, established its statistical properties and carried out theoretical comparisons with the estimators mentioned above. Furthermore, we conducted a simulation experiment and analyzed a real-life application to show the proposed estimator's effectiveness. Both the simulation and the application results show that the proposed estimator outperforms the existing estimators, while the PMLE performs worst.
References
Kibria, B. M. G. Performance of some new ridge regression estimators. Commun. Stat. Simul. Comput. 32(2), 419–435 (2003).
Lukman, A. F., Ayinde, K., Aladeitan, B. B. & Rasak, B. An unbiased estimator with prior information. Arab J. Basic Appl. Sci. 27(1), 45–55 (2020).
Månsson, K. & Shukur, G. A Poisson ridge regression estimator. Econ. Model. 28, 1475–1481 (2011).
Qasim, M., Kibria, B. M. G., Månsson, K. & Sjölander, P. A new Poisson Liu regression estimator: method and application. J. Appl. Stat. https://doi.org/10.1080/02664763.2019.1707485 (2019).
Hoerl, A. E. & Kennard, R. W. Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970).
Liu, K. A new class of biased estimate in linear regression. Commun. Stat. 22(2), 393–402 (1993).
Liu, K. Using Liu-type estimator to combat collinearity. Commun. Stat. 32(5), 1009–1020 (2003).
Ozkale, M. R. & Kaciranlar, S. The restricted and unrestricted two-parameter estimators. Commun. Statist. Theor. Meth. 36, 2707–2725 (2007).
Sakallıoğlu, S. & Kaçıranlar, S. A new biased estimator based on ridge estimation. Statist. Papers 49(4), 669–689 (2008).
Yang, H. & Chang, X. A new two-parameter estimator in linear regression. Commun. Stat. Theory Methods 39(6), 923–934 (2010).
Dorugade, A. V. Modified two parameter estimator in linear regression. J. Stat. Trans. New Ser. 15(1), 23–36 (2014).
Lukman, A. F., Ayinde, K., Binuomote, S. & Onate, A. C. Modified ridge-type estimator to combat multicollinearity: application to chemical data. J. Chemomet. https://doi.org/10.1002/cem.3125 (2019).
Lukman, A. F., Ayinde, K., Sek, S. K. & Adewuyi, E. A modified new two-parameter estimator in a linear regression model. Model. Simul. Eng. https://doi.org/10.1155/2019/6342702 (2019).
Ahmad, S. & Aslam, M. Another proposal about the new two-parameter estimator for linear regression model with correlated regressors. Commun. Stat. Simul. Comput. https://doi.org/10.1080/03610918.2019.1705975 (2020).
Kibria, B. M. G. & Lukman, A. F. A new ridge-type estimator for the linear regression model: simulations and applications. Scientifica https://doi.org/10.1155/2020/9758378 (2020).
Månsson, K., Kibria, B. M. G., Sjölander, P. & Shukur, G. Improved Liu estimators for the Poisson regression model. Int. J. Stat. Prob. 1(1) (2012).
Batah, F. S. M., Ramanathan, T. V. & Gore, S. D. The efficiency of modified Jackknife and ridge type regression estimators: a comparison. Surv. Math. Appl. 3, 111–122 (2008).
Türkan, S. & Özel, G. A new modified Jackknifed estimator for the Poisson regression model. J. Appl. Stat. 43, 1892–1905. https://doi.org/10.1080/02664763.2015.1125861 (2016).
Asar, Y. & Genc, A. A new two-parameter estimator for the Poisson regression model. Iran J. Sci. Technol. Trans. Sci. https://doi.org/10.1007/s40995-017-0174-4 (2017).
Rashad, N. K. & Algamal, Z. Y. A new ridge estimator for the Poisson regression model. Iran J. Sci. Technol. Trans. Sci. https://doi.org/10.1007/s40995-019-00769-3 (2019).
Myers, R. H., Montgomery, D. C., Vining, G. G. & Robinson, T. J. Generalized linear models: With applications in engineering and the sciences, 791 (Wiley, New York, 2012).
Hardin, J. W. & Hilbe, J. M. Generalized linear models and extensions (Stata Press, College Station, 2012).
Farebrother, R. W. Further results on the mean square error of ridge regression. J. Roy. Statist. Soc. B 38, 248–250 (1976).
Trenkler, G. & Toutenburg, H. Mean squared error matrix comparisons between biased estimators—an overview of recent results. Stat Pap. 31(1), 165–179 (1990).
Lukman, A. F. & Ayinde, K. Review and classifications of the ridge parameter estimation techniques. Hacettepe J. Math. Stat. 46(5), 953–967 (2017).
Kibria, B. M. G., Månsson, K. & Shukur, G. A simulation study of some biasing parameters for the ridge type estimation of Poisson regression. Commun. Stat. Simul. Comput. 44, 943–957 (2015).
R Core Team. R: A Language and Environment for Statistical Computing (R Foundation for Statistical Computing, Vienna, 2020). https://www.R-project.org.
Amin, M., Akram, M. N. & Amanullah, M. On the James-Stein estimator for the Poisson regression model. Commun. Stat. Simul. Comput. https://doi.org/10.1080/03610918.2020.1775851 (2020).
Author information
Authors and Affiliations
Contributions
A.F.L.: Conceptualization, Methodology, Writing—original draft. E.A.: Conceptualization, Software. K.M.: Supervision, Editing, Review. G.B.M.K.: Conceptualization, Supervision, Editing, Review.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
About this article
Cite this article
Lukman, A.F., Adewuyi, E., Månsson, K. et al. A new estimator for the multicollinear Poisson regression model: simulation and application. Sci Rep 11, 3732 (2021). https://doi.org/10.1038/s41598-021-82582-w