Estimating the Lifetime Parameters of the Odd-Generalized-Exponential–Inverse-Weibull Distribution Using Progressive First-Failure Censoring: A Methodology with an Application
Abstract
1. Introduction
- i. The importance of the OGE-IWD in capturing failure characteristics: the OGE-IWD offers flexibility in modeling skewed lifetime data and can capture the heavy tails or asymmetry often observed in failure times. This makes it particularly useful for datasets with progressive first-failure censoring, where failures may occur early or late in a product's lifetime depending on external stressors or quality variation.
- ii. Modeling both early and late failures: considering cases of both early and late failures illustrates how the OGE-IWD accommodates a wide range of failure behaviors. Early failures are common in systems with a "burn-in" period, while late failures typically indicate wear-out phenomena. The OGE-IWD's structure allows it to adapt to these different modes, improving model accuracy across varying failure patterns.
- iii. The OGE-IWD's flexibility compared with traditional distributions: traditional lifetime models, such as the Weibull or exponential, may fall short in modeling the dual nature of early and late failures because of their more restrictive shapes. The OGE-IWD adds flexibility through additional shape parameters, enabling it to fit a broader spectrum of real-world lifetime data, particularly complex failure patterns (a sketch of the construction is given after this list).
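For orientation, combining the odd generalized exponential family with an inverse-Weibull baseline leads to a CDF of the following general form. This is a hedged sketch of the construction only: the symbols α, λ, θ, and β are placeholders, and the paper's exact parametrization may differ.

$$
G(x)=e^{-\theta x^{-\beta}},\qquad
F(x)=\left[1-\exp\!\left(-\lambda\,\frac{G(x)}{1-G(x)}\right)\right]^{\alpha},\qquad x>0 .
$$

Under this construction, the extra shape parameters act on top of the inverse-Weibull shape, which is what gives the model room to accommodate both early and late failure behavior.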
2. Maximum-Likelihood Estimation
- S1. Specify starting values for the three model parameters.
- S2. In each iteration, compute the score vector and the observed information matrix I for the three parameters; the components of I are detailed as follows:
- S3. Update the parameter estimates from their current values using the score vector and the inverse of I (the Newton–Raphson step).
- S4. The MLEs of the three parameters are obtained by repeating steps S2 and S3 until the chosen convergence criterion is satisfied.
- (a) Tolerance level for stopping: a threshold that determines how close the estimated parameters must be to the optimal solution before the algorithm halts. This tolerance can apply to either of the following:
  - Gradient norm tolerance: the algorithm stops when the norm (magnitude) of the gradient of the objective function falls below a specified small value, indicating that the function's slope is almost flat and a minimum has been reached.
  - Parameter change tolerance: the algorithm also stops when consecutive iterations yield parameter estimates that differ by less than a given tolerance, indicating that the estimates are no longer changing appreciably and have likely converged. This tolerance is commonly of the same order of magnitude as the gradient norm tolerance.
- (b) BFGS convergence: the BFGS method, a quasi-Newton method, maintains an approximation of the Hessian matrix of second derivatives that is updated at every iteration, so the approximation moves closer to the true Hessian as the iterations proceed. The stopping tolerance for BFGS is typically based on the following:
  - Gradient norm: as with Newton–Raphson, the algorithm stops once the gradient norm falls below the tolerance.
  - Objective function convergence: the algorithm also stops if the change in the objective function value (e.g., the likelihood or error) between iterations falls below a set threshold; this criterion is often used together with the gradient tolerance to confirm that an optimum has been reached. A minimal numerical sketch of these stopping rules is given after this list.
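To make the two stopping rules concrete, the following minimal sketch fits a toy two-parameter Weibull model by quasi-Newton (BFGS) optimization. The data, likelihood, and tolerance values are illustrative assumptions and not the paper's OGE-IW likelihood.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data and negative log-likelihood: a two-parameter Weibull, used only to
# illustrate the stopping rules; the OGE-IW likelihood would replace it.
rng = np.random.default_rng(1)
x = rng.weibull(1.5, size=200) * 2.0

def nll(theta):
    c, s = np.exp(theta)                 # log-parametrization keeps shape, scale > 0
    return -np.sum(np.log(c / s) + (c - 1.0) * np.log(x / s) - (x / s) ** c)

# S1: starting values. The quasi-Newton (BFGS) updates play the role of S2-S3.
res = minimize(nll, x0=np.log([1.0, 1.0]), method="BFGS",
               options={"gtol": 1e-6, "maxiter": 500})   # gradient-norm tolerance

shape_hat, scale_hat = np.exp(res.x)
print(shape_hat, scale_hat, res.success)

# Parameter-change check, mirroring the second stopping rule: restarting the
# optimizer at the solution should move the estimates by less than the tolerance.
res2 = minimize(nll, x0=res.x, method="BFGS", options={"gtol": 1e-6})
print(np.linalg.norm(res2.x - res.x) < 1e-6)
```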
2.1. Existence and Uniqueness of MLEs
- For (9), the left-hand side approaches limits of opposite sign at the two ends of the parameter range, so the equation admits a root (the generic form of this argument is sketched below).
- Similarly, for (10), the left-hand side approaches limits of opposite sign at the two ends of the parameter range, so a root exists.
- Similarly, for (11), the left-hand side approaches limits of opposite sign at the two ends of the parameter range, so a root exists.
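In generic form, the sign-change (intermediate value) argument used above reads as follows for a parameter θ with log-likelihood ℓ; the direction of the limits is illustrative, and the precise limits for (9)–(11) are those stated in the paper:

$$
\lim_{\theta\to 0^{+}}\frac{\partial \ell}{\partial \theta}>0
\quad\text{and}\quad
\lim_{\theta\to \infty}\frac{\partial \ell}{\partial \theta}<0
\;\Longrightarrow\;
\exists\,\hat{\theta}\in(0,\infty):\ \frac{\partial \ell}{\partial \theta}\Big|_{\theta=\hat{\theta}}=0 ,
$$

with uniqueness following when the derivative is monotone in θ over its range.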
2.2. Approximate Confidence Intervals
2.3. Delta Method
- S1. Define the three quantities of interest and their specific forms:
- S2. Utilize the formulas provided to calculate their approximate variances:
- S3. Compute the ACIs for the three quantities by applying the following formula (a numerical sketch of these steps is given after this list):
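A minimal numerical sketch of S1–S3, assuming a generic smooth function g(θ) (for instance, a survival or hazard value at a fixed t) and an already-computed asymptotic covariance matrix. The placeholder estimates, covariance, and example survival function are illustrative assumptions, not the paper's exact expressions.

```python
import numpy as np
from scipy.stats import norm

def delta_method_aci(theta_hat, cov_theta, g, level=0.95, eps=1e-6):
    """Approximate confidence interval for g(theta) via the delta method.

    theta_hat : MLE vector
    cov_theta : inverse of the observed information matrix (asymptotic covariance)
    g         : smooth scalar function of theta, e.g. S(t) or h(t) at a fixed t
    """
    k = theta_hat.size
    grad = np.zeros(k)
    for i in range(k):                                   # S1: gradient of g at the MLE
        e = np.zeros(k)
        e[i] = eps
        grad[i] = (g(theta_hat + e) - g(theta_hat - e)) / (2.0 * eps)
    var_g = float(grad @ cov_theta @ grad)               # S2: approximate variance
    z = norm.ppf(0.5 + level / 2.0)
    ghat = g(theta_hat)
    return ghat - z * np.sqrt(var_g), ghat + z * np.sqrt(var_g)   # S3: Wald-type ACI

# Hypothetical usage: survival at t = 2 under an assumed inverse-Weibull baseline
# with CDF F(t) = exp(-theta * t**(-beta)); placeholder MLEs and covariance.
theta_hat = np.array([1.2, 1.8])
cov_theta = np.array([[0.04, 0.01], [0.01, 0.09]])
t = 2.0
surv = lambda th: 1.0 - np.exp(-th[0] * t ** (-th[1]))
print(delta_method_aci(theta_hat, cov_theta, surv))
```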
2.4. Log-Normal ACIs
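The usual normal-approximation interval for the logarithm of a positive estimate, back-transformed to the original scale, takes the commonly used form below (a hedged sketch; the paper's notation may differ):

$$
\left[\hat{\theta}\exp\!\left(-\frac{z_{1-\alpha/2}\sqrt{\widehat{\operatorname{Var}}(\hat{\theta})}}{\hat{\theta}}\right),\;
\hat{\theta}\exp\!\left(\frac{z_{1-\alpha/2}\sqrt{\widehat{\operatorname{Var}}(\hat{\theta})}}{\hat{\theta}}\right)\right],
$$

which, unlike the plain Wald interval, always has a strictly positive lower limit.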
3. Bayesian Estimation
3.1. Prior and Posterior Distributions
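If independent gamma priors are placed on the positive parameters, an assumption consistent with the gamma draw in step 2 of the MCMC algorithm below, the joint prior and posterior take the following generic form (the symbols and hyperparameters are placeholders):

$$
\pi(\vartheta_1,\vartheta_2,\vartheta_3)\;\propto\;\prod_{i=1}^{3}\vartheta_i^{a_i-1}e^{-b_i\vartheta_i},
\qquad
\pi(\vartheta_1,\vartheta_2,\vartheta_3\mid \text{data})\;\propto\;L(\vartheta_1,\vartheta_2,\vartheta_3\mid \text{data})\,\pi(\vartheta_1,\vartheta_2,\vartheta_3).
$$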
3.2. MCMC Techniques
- 1. Establish the starting values for the parameters and initialize the iteration counter.
- 2. Generate the first parameter from the gamma distribution given by its full conditional.
- 3. Employ the following MH algorithm to generate the remaining two parameters from their full conditional distributions, using normal proposal distributions. Execute the following tasks:
- (a) Generate a proposal for each of the two parameters from its normal proposal distribution.
- (b) Assess the acceptance probabilities:
- (c) Obtain two values from a uniform distribution.
- (d) If the first uniform value does not exceed the corresponding acceptance probability, accept the proposal; otherwise, retain the previous value.
- (e) If the second uniform value does not exceed the corresponding acceptance probability, accept the proposal; otherwise, retain the previous value.
- 4. For a given t, compute the survival, hazard rate, and inverse hazard rate functions:
- 5. Increment the iteration counter.
- 6. Repeat steps 2–5 M times to collect the required number of samples. After removing the initial burn-in samples, use the remaining samples to compute the Bayesian estimates.
- 7. It is now feasible to compute the Bayes estimates under the SE, LINEX, and GE loss functions as follows:
- 8. Arrange the retained draws in ascending order; the credible intervals are then obtained from the appropriate order statistics. A code sketch of this sampling scheme is given after this list.
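A minimal sketch of steps 1–8, assuming a user-supplied log-posterior and normal random-walk proposals. The coordinate-wise sampler, proposal scales, loss-function constants, and toy posterior below are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np

def mh_within_gibbs(log_post, init, prop_sd, n_iter=11000, burn=1000, seed=0):
    """Metropolis-Hastings-within-Gibbs with normal proposals (steps 1-6)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(init, dtype=float)
    chain = np.empty((n_iter, theta.size))
    for it in range(n_iter):
        for j in range(theta.size):                      # update one coordinate at a time
            prop = theta.copy()
            prop[j] = rng.normal(theta[j], prop_sd[j])   # step 3(a): normal proposal
            log_ratio = log_post(prop) - log_post(theta) # step 3(b): acceptance probability
            if np.log(rng.uniform()) < log_ratio:        # steps 3(c)-(e): accept or reject
                theta = prop
        chain[it] = theta
    return chain[burn:]                                  # step 6: discard the burn-in draws

def bayes_estimates(draws, c=0.5, q=0.5):
    """Step 7: Bayes estimates under the usual SE, LINEX, and GE loss functions."""
    se = draws.mean(axis=0)
    linex = -np.log(np.mean(np.exp(-c * draws), axis=0)) / c
    ge = np.mean(draws ** (-q), axis=0) ** (-1.0 / q)
    return se, linex, ge

# Toy usage with a hypothetical two-parameter log-posterior (independent Gamma(2, 1)
# densities as a stand-in); the OGE-IW posterior would replace log_post.
def log_post(th):
    if np.any(th <= 0):
        return -np.inf
    return float(np.sum(np.log(th) - th))

draws = mh_within_gibbs(log_post, init=[1.0, 1.0], prop_sd=[0.5, 0.5])
print(bayes_estimates(draws))
# Step 8: equal-tailed credible intervals from the ordered draws, e.g.
# np.quantile(draws, [0.025, 0.975], axis=0)
```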
4. Simulation Study
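Before the findings below, a hedged sketch of how a progressive first-failure censored sample can be simulated. It uses the standard uniform-transformation algorithm for progressive Type-II samples together with the equivalence between first-failure censoring with group size k and sampling from 1 − (1 − F)^k; the inverse-Weibull quantile function and the scheme in the usage example are illustrative assumptions (with k = 1 the sketch reduces to ordinary progressive Type-II censoring).

```python
import numpy as np

def pffc_sample(n_groups, m, R, k, F_inv, seed=0):
    """Simulate a progressive first-failure censored sample (hedged sketch).

    n_groups : number of groups placed on test
    m        : number of observed first failures
    R        : removal scheme (R_1, ..., R_m) with m + sum(R) = n_groups
    k        : group size
    F_inv    : quantile function of the baseline lifetime distribution
    """
    assert m + sum(R) == n_groups
    rng = np.random.default_rng(seed)
    W = rng.uniform(size=m)
    V = np.empty(m)
    for i in range(1, m + 1):
        gamma_i = i + sum(R[m - i:])            # i + R_m + ... + R_{m-i+1}
        V[i - 1] = W[i - 1] ** (1.0 / gamma_i)
    U = 1.0 - np.cumprod(V[::-1])               # progressive Type-II uniform order statistics
    return F_inv(1.0 - (1.0 - U) ** (1.0 / k))  # invert the group-minimum CDF 1 - (1 - F)^k

# Hypothetical usage with an assumed inverse-Weibull baseline F(x) = exp(-theta * x**(-beta)).
theta, beta = 1.0, 2.0
F_inv = lambda u: (-np.log(u) / theta) ** (-1.0 / beta)
print(pffc_sample(n_groups=30, m=10, R=[2] * 10, k=2, F_inv=F_inv))
```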
- 1.
- 2. Among the various methods, the Bayes estimates exhibit the smallest MSEs and AWs for all of the parameters and reliability measures, indicating superior performance compared with the MLEs.
- 3. The Bayes estimates under the GE loss function with the indicated value of its shape parameter deliver more accurate estimates, as reflected in their smaller MSEs.
- 4. For the LINEX and GE loss functions, the Bayes estimates obtained with one of the two shape-parameter values considered outperform those obtained with the other, as they yield smaller MSEs.
- 5. For the SE loss function, the Bayes estimates are superior to those derived under the LINEX loss function with one of the shape values, as demonstrated by their smaller MSEs.
- 6. The Bayes estimates under the LINEX loss function with the other shape value perform better than those under the SE loss function, as evidenced by their smaller MSEs.
- 7. Analysis of all tables reveals that increasing the group size k leads to higher MSEs and AWs for all estimated quantities.
- 8. For fixed sample sizes and numbers of observed failures, censoring scheme CSI proves to be the most effective, offering the smallest MSEs and AWs.
- 9. In general, all point estimates perform well, since the average biases are very small and tend to zero as n and m increase.
- 10. The MLE and Bayesian methods provide very similar estimates, and both yield ACIs with coverage probabilities close to the nominal 95% level; however, the Bayesian CRIs achieve the highest CPs.
5. Application to Real-Life Data
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.5956 | 0.5809 | 0.5906 | 0.5616 | 0.5748 | 0.5277 | |
0.0250 | 0.0134 | 0.0146 | 0.0129 | 0.0132 | 0.0114 | ||
CSII | 0.6129 | 0.5925 | 0.6020 | 0.5752 | 0.5993 | 0.5384 | |
−0.0199 | 0.0174 | 0.0186 | 0.0156 | 0.0163 | 0.0142 | ||
CSIII | 0.6353 | 0.6136 | 0.6209 | 0.6001 | 0.6160 | 0.5544 | |
−0.0176 | 0.0187 | 0.0203 | 0.0165 | 0.0177 | 0.0158 | ||
CSI | 0.5446 | 0.5239 | 0.5332 | 0.5047 | 0.5152 | 0.4860 | |
−0.0021 | 0.0022 | 0.0023 | 0.0020 | 0.0023 | 0.0019 | ||
CSII | 0.5705 | 0.5501 | 0.5716 | 0.5351 | 0.5616 | 0.5047 | |
0.0029 | 0.0027 | 0.0029 | 0.0024 | 0.0028 | 0.0021 | ||
CSIII | 0.6039 | 0.588 | 0.6045 | 0.5476 | 0.5967 | 0.5288 | |
0.0041 | 0.0039 | 0.0040 | 0.0036 | 0.0038 | 0.0026 | ||
CSI | 0.4742 | 0.447 | 0.4595 | 0.4174 | 0.4532 | 0.3919 | |
0.0016 | 0.0015 | 0.0016 | 0.0013 | 0.0014 | 0.0009 | ||
CSII | 0.5057 | 0.4715 | 0.4999 | 0.4422 | 0.4851 | 0.4096 | |
−0.0253 | 0.0022 | 0.0022 | 0.0023 | 0.0022 | 0.0018 | ||
CSIII | 0.5354 | 0.5063 | 0.5147 | 0.4691 | 0.5261 | 0.4423 | |
−0.0278 | 0.0025 | 0.0024 | 0.0021 | 0.0023 | 0.0014 | ||
CSI | 0.6652 | 0.6508 | 0.6605 | 0.6304 | 0.6554 | 0.5997 | |
−0.0283 | 0.0236 | 0.0265 | 0.0225 | 0.0246 | 0.0174 | ||
CSII | 0.6966 | 0.6811 | 0.6905 | 0.6627 | 0.6855 | 0.6322 | |
−0.0235 | 0.0250 | 0.0271 | 0.0242 | 0.0259 | 0.0189 | ||
CSIII | 0.7174 | 0.7105 | 0.7244 | 0.6864 | 0.7164 | 0.6495 | |
−0.0286 | 0.0263 | 0.0295 | 0.0255 | 0.0271 | 0.0194 | ||
CSI | 0.6353 | 0.6170 | 0.6296 | 0.5956 | 0.6208 | 0.5471 | |
−0.0334 | 0.0321 | 0.0354 | 0.0314 | 0.0325 | 0.0264 | ||
CSII | 0.6724 | 0.6608 | 0.6707 | 0.6263 | 0.6660 | 0.6065 | |
−0.0314 | 0.0387 | 0.0390 | 0.0368 | 0.0354 | 0.0297 | ||
CSIII | 0.6978 | 0.6810 | 0.6893 | 0.6643 | 0.6855 | 0.6244 | |
0.0456 | 0.0442 | 0.0453 | 0.0412 | 0.0428 | 0.0355 | ||
CSI | 0.5610 | 0.5454 | 0.5744 | 0.5222 | 0.5610 | 0.4616 | |
−0.0019 | 0.0018 | 0.0019 | 0.0014 | 0.0013 | 0.0007 | ||
CSII | 0.5956 | 0.5803 | 0.5915 | 0.5445 | 0.5897 | 0.4911 | |
−0.0017 | 0.0021 | 0.0023 | 0.0020 | 0.0021 | 0.0008 | ||
CSIII | 0.6279 | 0.6093 | 0.6217 | 0.5784 | 0.6163 | 0.5471 | |
0.0013 | 0.0012 | 0.012 | 0.0010 | 0.0011 | 0.0009 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.3547 | 0.3375 | 0.3488 | 0.3154 | 0.3304 | 0.2785 | |
CSII | 0.3756 | 0.3511 | 0.3624 | 0.3308 | 0.3592 | 0.2899 | |
CSIII | 0.4036 | 0.3765 | 0.3855 | 0.3601 | 0.3795 | 0.3074 | |
CSI | 0.2966 | 0.2745 | 0.2843 | 0.2547 | 0.2654 | 0.2362 | |
CSII | 0.3255 | 0.3026 | 0.3267 | 0.2863 | 0.3154 | 0.2547 | |
CSIII | 0.3647 | 0.3457 | 0.3654 | 0.2999 | 0.3561 | 0.2796 | |
CSI | 0.2249 | 0.1998 | 0.2111 | 0.1742 | 0.2054 | 0.1536 | |
CSII | 0.2557 | 0.2223 | 0.2499 | 0.1955 | 0.2353 | 0.1678 | |
CSIII | 0.2866 | 0.2563 | 0.2649 | 0.2201 | 0.2768 | 0.1956 | |
CSI | 0.4425 | 0.4236 | 0.4362 | 0.3974 | 0.4295 | 0.3596 | |
CSII | 0.4852 | 0.4639 | 0.4768 | 0.4392 | 0.4699 | 0.3997 | |
CSIII | 0.5147 | 0.5048 | 0.5247 | 0.4712 | 0.5133 | 0.4219 | |
CSI | 0.4036 | 0.3807 | 0.3964 | 0.3547 | 0.3854 | 0.2993 | |
CSII | 0.4521 | 0.4366 | 0.4499 | 0.3922 | 0.4436 | 0.3678 | |
CSIII | 0.4869 | 0.4637 | 0.4752 | 0.4413 | 0.4699 | 0.3899 | |
CSI | 0.3147 | 0.2975 | 0.3299 | 0.2727 | 0.3147 | 0.2131 | |
CSII | 0.3547 | 0.3368 | 0.3499 | 0.2965 | 0.3478 | 0.2412 | |
CSIII | 0.3942 | 0.3713 | 0.3865 | 0.3345 | 0.3798 | 0.2993 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.0293 | 0.0274 | 0.0289 | 0.0235 | 0.0282 | 0.0199 | |
CSII | 0.0346 | 0.0325 | 0.0338 | 0.0264 | 0.0331 | 0.0225 | |
CSIII | 0.0415 | 0.0388 | 0.0401 | 0.0296 | 0.0395 | 0.0257 | |
CSI | 0.0223 | 0.0204 | 0.0217 | 0.0187 | 0.0214 | 0.0154 | |
CSII | 0.0264 | 0.0245 | 0.0260 | 0.0222 | 0.0253 | 0.0186 | |
CSIII | 0.0313 | 0.0296 | 0.0311 | 0.0254 | 0.0302 | 0.0213 | |
CSI | 0.0164 | 0.0145 | 0.0160 | 0.0129 | 0.0159 | 0.0099 | |
CSII | 0.0202 | 0.0183 | 0.0195 | 0.0155 | 0.0191 | 0.0127 | |
CSIII | 0.0251 | 0.0233 | 0.0249 | 0.0185 | 0.0242 | 0.0149 | |
CSI | 0.0353 | 0.0332 | 0.0341 | 0.0277 | 0.0339 | 0.0241 | |
CSII | 0.0391 | 0.0375 | 0.0387 | 0.0335 | 0.0381 | 0.0296 | |
CSIII | 0.0442 | 0.0425 | 0.0432 | 0.0376 | 0.0429 | 0.0332 | |
CSI | 0.0292 | 0.0273 | 0.0283 | 0.0225 | 0.0278 | 0.0189 | |
CSII | 0.0331 | 0.0315 | 0.0326 | 0.0257 | 0.0321 | 0.0231 | |
CSIII | 0.0367 | 0.0351 | 0.0362 | 0.0285 | 0.0357 | 0.0259 | |
CSI | 0.0234 | 0.0215 | 0.0234 | 0.0176 | 0.0225 | 0.0139 | |
CSII | 0.0285 | 0.0264 | 0.0279 | 0.0223 | 0.0271 | 0.0187 | |
CSIII | 0.0334 | 0.0321 | 0.0331 | 0.0267 | 0.0328 | 0.0235 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.1825 | 0.1597 | 0.1693 | 0.1355 | 0.1622 | 0.1176 | |
CSII | 0.2354 | 0.2143 | 0.2261 | 0.1524 | 0.2199 | 0.1369 | |
CSIII | 0.2677 | 0.2581 | 0.2652 | 0.1744 | 0.2607 | 0.1532 | |
CSI | 0.1163 | 0.0956 | 0.1123 | 0.0846 | 0.1069 | 0.0723 | |
CSII | 0.1456 | 0.1178 | 0.1357 | 0.0936 | 0.1243 | 0.0855 | |
CSIII | 0.1932 | 0.1699 | 0.1756 | 0.1235 | 0.1722 | 0.1062 | |
CSI | 0.0986 | 0.0775 | 0.0869 | 0.0734 | 0.0824 | 0.0593 | |
CSII | 0.1235 | 0.0992 | 0.0111 | 0.0846 | 0.0105 | 0.0692 | |
CSIII | 0.1577 | 0.1347 | 0.1453 | 0.1067 | 0.1422 | 0.0899 | |
CSI | 0.2546 | 0.2348 | 0.2453 | 0.1911 | 0.2398 | 0.1637 | |
CSII | 0.2865 | 0.2645 | 0.2733 | 0.2299 | 0.2675 | 0.1894 | |
CSIII | 0.3155 | 0.2943 | 0.3057 | 0.2654 | 0.2956 | 0.2278 | |
CSI | 0.1736 | 0.1564 | 0.1635 | 0.1157 | 0.1598 | 0.0969 | |
CSII | 0.2239 | 0.2077 | 0.1911 | 0.1356 | 0.1863 | 0.1175 | |
CSIII | 0.2536 | 0.2394 | 0.2456 | 0.1752 | 0.2431 | 0.1406 | |
CSI | 0.1235 | 0.1155 | 0.1237 | 0.0975 | 0.1196 | 0.0786 | |
CSII | 0.1536 | 0.1463 | 0.1501 | 0.1237 | 0.1498 | 0.0991 | |
CSIII | 0.1837 | 0.1694 | 0.1732 | 0.1423 | 0.1702 | 0.1195 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.0225 | 0.0205 | 0.0209 | 0.0097 | 0.0198 | 0.0092 | |
CSII | 0.0268 | 0.0243 | 0.0257 | 0.0112 | 0.0235 | 0.0099 | |
CSIII | 0.0312 | 0.0276 | 0.0286 | 0.0145 | 0.2647 | 0.0115 | |
CSI | 0.0175 | 0.0156 | 0.0167 | 0.0079 | 0.0145 | 0.0068 | |
CSII | 0.0213 | 0.0204 | 0.0211 | 0.0085 | 0.0194 | 0.0074 | |
CSIII | 0.0254 | 0.0239 | 0.0246 | 0.0102 | 0.0231 | 0.0087 | |
CSI | 0.0116 | 0.0107 | 0.0111 | 0.0068 | 0.0099 | 0.0058 | |
CSII | 0.0147 | 0.0129 | 0.0132 | 0.0076 | 0.0125 | 0.0066 | |
CSIII | 0.0205 | 0.0195 | 0.0199 | 0.0083 | 0.0187 | 0.0075 | |
CSI | 0.0273 | 0.0255 | 0.0258 | 0.0117 | 0.0245 | 0.0109 | |
CSII | 0.0324 | 0.0305 | 0.0309 | 0.0159 | 0.0295 | 0.0134 | |
CSIII | 0.0367 | 0.0351 | 0.0367 | 0.0178 | 0.0346 | 0.0156 | |
CSI | 0.0213 | 0.0202 | 0.0207 | 0.0093 | 0.0195 | 0.0082 | |
CSII | 0.0276 | 0.0259 | 0.0261 | 0.0099 | 0.0256 | 0.0089 | |
CSIII | 0.0347 | 0.0327 | 0.0332 | 0.0117 | 0.0325 | 0.0096 | |
CSI | 0.0169 | 0.0148 | 0.0156 | 0.0088 | 0.0145 | 0.0077 | |
CSII | 0.0198 | 0.0182 | 0.0189 | 0.0094 | 0.0175 | 0.0083 | |
CSIII | 0.0235 | 0.0214 | 0.0227 | 0.0105 | 0.0205 | 0.0095 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.0321 | 0.0097 | 0.0107 | 0.0088 | 0.0091 | 0.0073 | |
CSII | 0.0435 | 0.0109 | 0.0135 | 0.0092 | 0.0098 | 0.0079 | |
CSIII | 0.0512 | 0.0123 | 0.0154 | 0.0106 | 0.0110 | 0.0084 | |
CSI | 0.0275 | 0.0091 | 0.0094 | 0.0081 | 0.0084 | 0.0067 | |
CSII | 0.0346 | 0.0096 | 0.0103 | 0.0086 | 0.0093 | 0.0072 | |
CSIII | 0.0429 | 0.0111 | 0.0124 | 0.0093 | 0.0098 | 0.0079 | |
CSI | 0.0198 | 0.0083 | 0.0086 | 0.0065 | 0.0075 | 0.0058 | |
CSII | 0.0236 | 0.0087 | 0.0092 | 0.0073 | 0.0081 | 0.0064 | |
CSIII | 0.0319 | 0.0094 | 0.0105 | 0.0084 | 0.0089 | 0.0071 | |
CSI | 0.0521 | 0.0124 | 0.0137 | 0.0096 | 0.0112 | 0.0085 | |
CSII | 0.0583 | 0.0147 | 0.0153 | 0.0111 | 0.0135 | 0.0094 | |
CSIII | 0.0622 | 0.0179 | 0.0185 | 0.0138 | 0.0165 | 0.0109 | |
CSI | 0.0446 | 0.0097 | 0.0101 | 0.0087 | 0.0092 | 0.0078 | |
CSII | 0.0492 | 0.0113 | 0.0125 | 0.0093 | 0.0099 | 0.0084 | |
CSIII | 0.0536 | 0.0128 | 0.0139 | 0.0098 | 0.0117 | 0.0089 | |
CSI | 0.0255 | 0.0085 | 0.0089 | 0.0075 | 0.0081 | 0.0068 | |
CSII | 0.0376 | 0.0093 | 0.0097 | 0.0081 | 0.0087 | 0.0076 | |
CSIII | 0.0427 | 0.0105 | 0.0119 | 0.0094 | 0.0093 | 0.0082 |
CS | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
---|---|---|---|---|---|---|
CSI | 0.9145 | 0.8841 | 0.9023 | 0.7992 | 0.8755 | 0.7254 | |
CSII | 0.9542 | 0.9253 | 0.9344 | 0.8257 | 0.9128 | 0.7623 | |
CSIII | 0.9993 | 0.9547 | 0.9637 | 0.8644 | 0.9456 | 0.7952 | |
CSI | 0.8347 | 0.7962 | 0.8027 | 0.7442 | 0.7834 | 0.6791 | |
CSII | 0.8725 | 0.8355 | 0.8462 | 0.7725 | 0.8253 | 0.7248 | |
CSIII | 0.9166 | 0.8723 | 0.8923 | 0.7999 | 0.8641 | 0.7634 | |
CSI | 0.7825 | 0.7481 | 0.7533 | 0.6992 | 0.7354 | 0.6223 | |
CSII | 0.8264 | 0.7835 | 0.7964 | 0.7296 | 0.7753 | 0.6545 | |
CSIII | 0.8736 | 0.8364 | 0.8432 | 0.7542 | 0.8255 | 0.7039 | |
CSI | 1.2147 | 1.0849 | 1.0009 | 0.8999 | 1.1556 | 0.8556 | |
CSII | 1.2965 | 1.2667 | 1.2795 | 0.9536 | 1.2534 | 0.9145 | |
CSIII | 1.3462 | 1.3148 | 1.3254 | 0.9978 | 1.3015 | 0.9652 | |
CSI | 1.0984 | 0.9782 | 0.9965 | 0.8546 | 0.9547 | 0.7966 | |
CSII | 1.1568 | 1.0999 | 1.1174 | 0.9147 | 1.0987 | 0.8469 | |
CSIII | 1.2547 | 1.1799 | 1.1867 | 0.9648 | 1.1365 | 0.8893 | |
CSI | 0.9278 | 0.8865 | 0.8964 | 0.7967 | 0.8795 | 0.7361 | |
CSII | 0.9645 | 0.9257 | 0.9345 | 0.8342 | 0.9144 | 0.7564 | |
CSIII | 1.1326 | 0.9724 | 0.9831 | 0.8824 | 0.9637 | 0.7992 |
CS | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) |
---|---|---|---|---|---|---|---|---|
CSI | 1.2654 | 0.912 | 0.7936 | 0.941 | 0.6635 | 0.918 | 0.3523 | 0.939 | |
CSII | 1.3145 | 0.925 | 0.8234 | 0.942 | 0.7456 | 0.915 | 0.4536 | 0.941 | |
CSIII | 1.3869 | 0.919 | 0.8736 | 0.939 | 0.8215 | 0.922 | 0.4962 | 0.938 | |
CSI | 1.1652 | 0.925 | 0.6645 | 0.949 | 0.5863 | 0.941 | 0.2987 | 0.951 | |
CSII | 1.2235 | 0.918 | 0.7144 | 0.951 | 0.6241 | 0.939 | 0.3649 | 0.952 | |
CSIII | 1.2967 | 0.922 | 0.7869 | 0.947 | 0.6863 | 0.938 | 0.4325 | 0.949 | |
CSI | 0.9984 | 0.936 | 0.5763 | 0.955 | 0.4496 | 0.954 | 0.2147 | 0.954 | |
CSII | 1.1362 | 0.941 | 0.6347 | 0.949 | 0.5147 | 0.947 | 0.2869 | 0.957 | |
CSIII | 1.2147 | 0.939 | 0.7264 | 0.951 | 0.5636 | 0.952 | 0.3468 | 0.959 | |
CSI | 1.4666 | 0.923 | 0.8563 | 0.941 | 0.8345 | 0.927 | 0.5524 | 0.945 | |
CSII | 1.5123 | 0.921 | 0.9364 | 0.950 | 0.8994 | 0.925 | 0.6347 | 0.937 | |
CSIII | 1.6325 | 0.931 | 1.0567 | 0.939 | 0.9253 | 0.919 | 0.7221 | 0.941 | |
CSI | 1.2954 | 0.936 | 0.7236 | 0.952 | 0.7236 | 0.954 | 0.4362 | 0.961 | |
CSII | 1.3358 | 0.942 | 0.8147 | 0.947 | 0.7893 | 0.942 | 0.5124 | 0.942 | |
CSIII | 1.4125 | 0.941 | 0.9362 | 0.955 | 0.8475 | 0.938 | 0.5766 | 0.948 | |
CSI | 1.1647 | 0.955 | 0.6472 | 0.961 | 0.6625 | 0.950 | 0.3654 | 0.963 | |
CSII | 1.2113 | 0.949 | 0.7554 | 0.952 | 0.7334 | 0.949 | 0.4355 | 0.954 | |
CSIII | 1.2954 | 0.951 | 0.8233 | 0.955 | 0.7993 | 0.947 | 0.5067 | 0.955 |
CS | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) |
---|---|---|---|---|---|---|---|---|
CSI | 1.8932 | 0.912 | 0.8495 | 0.948 | 0.4231 | 0.939 | 0.2341 | 0.956 | |
CSII | 1.9968 | 0.923 | 0.9635 | 0.946 | 0.4763 | 0.928 | 0.2536 | 0.951 | |
CSIII | 2.1345 | 0.919 | 1.2145 | 0.939 | 0.4963 | 0.937 | 0.2847 | 0.955 | |
CSI | 1.6456 | 0.925 | 0.7145 | 0.951 | 0.3765 | 0.947 | 0.1963 | 0.959 | |
CSII | 1.7473 | 0.934 | 0.8566 | 0.947 | 0.3954 | 0.951 | 0.2245 | 0.949 | |
CSIII | 1.9745 | 0.939 | 0.9932 | 0.949 | 0.4165 | 0.946 | 0.2568 | 0.957 | |
CSI | 1.5536 | 0.951 | 0.6389 | 0.952 | 0.2965 | 0.955 | 0.1565 | 0.962 | |
CSII | 1.6477 | 0.938 | 0.7456 | 0.955 | 0.3215 | 0.951 | 0.1848 | 0.957 | |
CSIII | 1.8632 | 0.946 | 0.8763 | 0.949 | 0.3647 | 0.947 | 0.2269 | 0.958 | |
CSI | 2.3937 | 0.925 | 1.3496 | 0.951 | 0.6155 | 0.923 | 0.3997 | 0.946 | |
CSII | 2.4568 | 0.931 | 1.4658 | 0.955 | 0.6745 | 0.933 | 0.4421 | 0.951 | |
CSIII | 2.5661 | 0.929 | 1.6377 | 0.949 | 0.7231 | 0.941 | 0.4936 | 0.948 | |
CSI | 1.9998 | 0.945 | 1.0965 | 0.953 | 0.4236 | 0.939 | 0.2863 | 0.951 | |
CSII | 2.2357 | 0.944 | 1.1354 | 0.954 | 0.4968 | 0.941 | 0.3367 | 0.944 | |
CSIII | 2.3699 | 0.939 | 1.3659 | 0.948 | 0.5543 | 0.938 | 0.3875 | 0.941 | |
CSI | 1.7586 | 0.951 | 0.7963 | 0.961 | 0.3767 | 0.952 | 0.2365 | 0.961 | |
CSII | 1.8743 | 0.948 | 0.8672 | 0.957 | 0.4362 | 0.945 | 0.2966 | 0.957 | |
CSIII | 1.9521 | 0.952 | 0.9935 | 0.959 | 0.4839 | 0.949 | 0.3452 | 0.952 |
CS | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) | ACIW (MLE) | CP (MLE) | CRIW (Bayes) | CP (Bayes) |
---|---|---|---|---|---|---|---|---|
CSI | 0.5172 | 0.925 | 0.3481 | 0.939 | 5.1654 | 0.942 | 3.6473 | 0.952 | |
CSII | 0.5563 | 0.928 | 0.3952 | 0.932 | 5.9268 | 0.931 | 4.1287 | 0.955 | |
CSIII | 0.6247 | 0.921 | 0.4328 | 0.934 | 6.7475 | 0.935 | 5.2473 | 0.951 | |
CSI | 0.4436 | 0.933 | 0.2868 | 0.945 | 4.2587 | 0.942 | 2.7849 | 0.949 | |
CSII | 0.4863 | 0.929 | 0.3314 | 0.942 | 5.0625 | 0.951 | 3.6147 | 0.951 | |
CSIII | 0.5361 | 0.918 | 0.3799 | 0.939 | 5.8697 | 0.947 | 4.4693 | 0.948 | |
CSI | 0.3961 | 0.941 | 0.2358 | 0.952 | 3.6544 | 0.949 | 2.0247 | 0.955 | |
CSII | 0.4315 | 0.939 | 0.2863 | 0.955 | 4.5152 | 0.950 | 2.7965 | 0.961 | |
CSIII | 0.4799 | 0.942 | 0.3157 | 0.961 | 5.1337 | 0.946 | 3.6473 | 0.957 | |
CSI | 0.6344 | 0.939 | 0.4583 | 0.955 | 6.2653 | 0.939 | 4.5472 | 0.941 | |
CSII | 0.6894 | 0.928 | 0.4892 | 0.949 | 6.9112 | 0.933 | 5.1237 | 0.939 | |
CSIII | 0.7155 | 0.941 | 0.5213 | 0.957 | 7.5334 | 0.921 | 5.8624 | 0.951 | |
CSI | 0.5467 | 0.955 | 0.3661 | 0.961 | 5.5632 | 0.938 | 3.4251 | 0.949 | |
CSII | 0.5993 | 0.947 | 0.4055 | 0.958 | 6.2754 | 0.941 | 4.6557 | 0.951 | |
CSIII | 0.6342 | 0.948 | 0.4512 | 0.953 | 6.9651 | 0.937 | 5.1125 | 0.944 | |
CSI | 0.4463 | 0.951 | 0.2994 | 0.962 | 4.3322 | 0.948 | 2.5757 | 0.961 | |
CSII | 0.5083 | 0.946 | 0.3466 | 0.953 | 4.8693 | 0.945 | 3.3145 | 0.954 | |
CSIII | 0.5637 | 0.949 | 0.3937 | 0.955 | 5.6624 | 0.949 | 4.1956 | 0.957 |
| Parameter | MLE | Bayes: SE | Bayes: LINEX | Bayes: LINEX | Bayes: GE | Bayes: GE |
|---|---|---|---|---|---|---|
|  | 0.2835 | 0.4866 | 0.4723 | 0.4548 | 0.4437 | 0.4266 |
|  | 1.0837 | 1.0337 | 1.2134 | 1.0145 | 1.1965 | 0.9976 |
|  | 0.9474 | 0.9599 | 0.9644 | 0.9343 | 0.9475 | 0.9291 |
|  | 0.7338 | 0.9783 | 0.9455 | 0.9341 | 0.9552 | 0.9369 |
|  | 0.4095 | 0.2944 | 0.3541 | 0.3225 | 0.3478 | 0.2899 |
|  | 1.1288 | 2.5387 | 2.5541 | 1.7644 | 1.9658 | 1.6634 |
| Parameter | ACI [Lower, Upper] | ACI Width | CRI [Lower, Upper] | CRI Width |
|---|---|---|---|---|
|  | [0.1652, 0.3758] | 0.2106 | [0.4131, 0.8026] | 0.3895 |
|  | [1.0651, 1.1026] | 0.0375 | [0.9120, 1.5972] | 0.6852 |
|  | [0.9270, 0.9683] | 0.0414 | [0.9181, 0.9735] | 0.0554 |
|  | [0.6915, 0.7760] | 0.0845 | [0.9299, 0.9917] | 0.0618 |
|  | [0.3382, 0.4809] | 0.1428 | [0.2517, 0.4052] | 0.1535 |
|  | [0.7646, 1.4930] | 0.7284 | [1.5219, 2.6354] | 1.1064 |