Abstract.
The basic objective of blind signal separation is to recover a set of source signals from a set of observations that are mixtures of the sources, with no, or only very limited, knowledge of the mixing structure and the source signals. Many algorithms have been proposed to extract the original sources; among them, the cross-correlation and constant modulus algorithm (CC-CMA) appears to be the algorithm of choice owing to its computational simplicity.
An important issue in the CC-CMA algorithm is global convergence analysis, because the cost function is neither quadratic nor convex and contains undesirable stationary points. If these undesirable points are local minima, the convergence of the algorithm cannot be guaranteed, and CC-CMA may fail to separate the source signals.
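For orientation, one common form of such a cross-correlation constant modulus cost in the literature, for two separator outputs y_1 and y_2, combines a constant-modulus term per output with a cross-correlation penalty (an assumed textbook form, not necessarily the exact cost analyzed in this paper):

\[
  J \;=\; \sum_{i=1}^{2} E\!\left[\bigl(|y_i|^2 - \gamma\bigr)^2\right] \;+\; \lambda\,\bigl|E[\,y_1 y_2^{*}\,]\bigr|^2,
\]

where \(\gamma\) is the constant-modulus dispersion constant and \(\lambda > 0\) weights the decorrelation penalty. Being quartic in the separator coefficients, such a cost is clearly neither quadratic nor convex.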
The main result of this paper is to complete the classification of these stationary points and to prove that they are not local minima unless the mixing parameter is equal to 1.
This is achieved by using the theory of discriminant varieties to determine the stationary points as functions of the parameter, and then by showing that the Hessian matrix of the cost function is not positive semidefinite at these stationary points unless the mixing parameter is 1.
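The certification pattern described above can be illustrated with a small computer-algebra computation. The following SymPy sketch uses a toy parameter-dependent quartic cost (a hypothetical stand-in, not the paper's CC-CMA cost) to show the two steps: solve the gradient system symbolically in terms of the parameter, then test positive semidefiniteness of the Hessian at each stationary point.

# Minimal SymPy sketch of the certification pattern: find the
# stationary points of a parameter-dependent cost symbolically,
# then classify each one via the Hessian. The cost J below is a
# TOY quartic with a parameter-weighted cross term; it is NOT the
# paper's CC-CMA cost.
import sympy as sp

x, y, a = sp.symbols('x y a', real=True)

# Toy quartic cost: two constant-modulus-style terms plus a
# cross term weighted by the parameter a.
J = (x**2 - 1)**2 + (y**2 - 1)**2 + a*(x*y)**2

grad = [sp.diff(J, v) for v in (x, y)]
H = sp.hessian(J, (x, y))

# Stationary points, expressed as functions of the parameter a.
for pt in sp.solve(grad, [x, y], dict=True):
    Hp = H.subs(pt)
    # A symmetric 2x2 matrix is positive semidefinite iff its
    # trace and determinant are both nonnegative, so printing
    # these two invariants classifies each point in terms of a.
    print(pt, sp.simplify(Hp.det()), sp.simplify(Hp.trace()))

For this toy cost, the output shows that whether a stationary point is a saddle or a minimum depends on the value of the parameter a, which is precisely the kind of parameter-dependent classification that the discriminant-variety analysis resolves for the true CC-CMA cost.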
Cite this article
Gu, N., Lazard, D., Rouillier, F. et al. Using Computer Algebra to Certify the Global Convergence of a Numerical Optimization Process. Math.comput.sci. 1, 291–304 (2007). https://doi.org/10.1007/s11786-007-0021-7