Many entropy-related methods for signal classification have been proposed and applied successfully over the last several decades. However, it is sometimes difficult to find the optimal measure and the optimal parameter configuration for a specific purpose or context. Suboptimal settings may therefore yield subpar results that fail to reach the desired level of statistical significance. To increase signal classification accuracy in such situations, this paper proposes statistical models built from uncorrelated measures in order to exploit the possible synergies between them. The methods employed are permutation entropy (PE), approximate entropy (ApEn), and sample entropy (SampEn). Since PE is based on subpattern ordinal differences, whereas ApEn and SampEn are based on subpattern amplitude differences, we hypothesized that combining PE with either of the other methods would outperform any of them individually. The dataset consisted of body temperature records, for which no single measure achieved a classification accuracy above 80%, either in this study or in previous ones. The results confirmed that classification accuracy rose to 90% when PE and ApEn were combined in a logistic regression model.
Keywords: approximate entropy; body temperature; logistic regression; permutation entropy; sample entropy; signal classification.
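The sketch below is a minimal illustration, not the authors' code, of the combination strategy described in the abstract: compute PE and ApEn for each record and use both as features of a logistic regression classifier. The embedding parameters (m, tau), the tolerance factor r, the use of scikit-learn, and the synthetic placeholder data standing in for body temperature records are all assumptions made for illustration.

```python
import math
from itertools import permutations

import numpy as np
from sklearn.linear_model import LogisticRegression


def permutation_entropy(x, m=3, tau=1):
    """Normalized PE: Shannon entropy of the ordinal-pattern distribution."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (m - 1) * tau
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(n_patterns):
        window = x[i:i + m * tau:tau]
        counts[tuple(np.argsort(window))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float) / n_patterns
    return -np.sum(p * np.log(p)) / np.log(math.factorial(m))


def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) with r = r_factor * std(x), following Pincus' definition."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(dim):
        n = len(x) - dim + 1
        emb = np.array([x[i:i + dim] for i in range(n)])
        # Chebyshev distance between every pair of embedded vectors
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.sum(dist <= r, axis=1) / n  # self-matches included, as in ApEn
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)


def entropy_features(records, m_pe=3, m_apen=2):
    """Stack PE and ApEn into a two-column feature matrix, one row per record."""
    return np.array([[permutation_entropy(x, m=m_pe),
                      approximate_entropy(x, m=m_apen)] for x in records])


# Usage sketch with synthetic signals in place of real temperature records.
rng = np.random.default_rng(0)
records = [rng.normal(size=300).cumsum() for _ in range(40)]
labels = np.array([0] * 20 + [1] * 20)

X = entropy_features(records)
model = LogisticRegression().fit(X, labels)
print("training accuracy:", model.score(X, labels))
```

Because PE captures ordinal structure and ApEn captures amplitude regularity, the two features are largely complementary, which is the rationale for letting the logistic model weight them jointly rather than thresholding either measure alone.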