Abstract: We develop nonparametric methods, and theory, for analysing data on a random $p$-vector $X$ represented as a linear form in a $p$-vector $Q$, say $X = AQ$, where the components of $Q$ are nonnegative and uncorrelated. Problems of this nature are motivated by a wide range of applications in which physical considerations deny the possibility that $Q$ can have negative components. Our approach to inference is founded on a necessary and sufficient condition for the existence of unique, nonnegative-score principal components. The condition replaces an earlier, sufficient constraint given in the engineering literature, and is related to a notion of sparsity that arises frequently in nonnegative principal component analysis. We discuss theoretical aspects of our estimators of the transformation that produces nonnegative-score principal components, showing that the estimators have optimal properties.
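To fix ideas, the following is a minimal Python sketch of the data-generating model described above, not of the paper's estimation procedure. The notation $X = AQ$, the dimension $p$, the sample size $n$, the exponential source distribution, and the particular mixing matrix are illustrative assumptions introduced here; the sketch only simulates a nonnegative, uncorrelated source vector, mixes it linearly, and shows that ordinary principal component scores need not be nonnegative, which is the problem the proposed transformation addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimension and sample size (assumptions, not from the paper).
p, n = 3, 1000

# Q: p-vector with nonnegative, uncorrelated components.
# Independent exponential components satisfy both requirements.
Q = rng.exponential(scale=1.0, size=(n, p))

# A: an arbitrary invertible p x p mixing matrix (hypothetical example).
A = np.array([[1.0, 0.4, 0.2],
              [0.3, 1.0, 0.5],
              [0.1, 0.6, 1.0]])

# Observed data: each row is a realisation of X = A Q.
X = Q @ A.T

# Ordinary principal component scores of the centred data.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
scores = Xc @ eigvecs

# These scores are generally not nonnegative; the paper's methodology
# estimates a transformation whose scores, like the components of Q, are.
print("fraction of negative ordinary PC scores:", np.mean(scores < 0))
```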
Key words and phrases: Correlation, independent component analysis, nonparametric statistics, permutation, principal component analysis, rate of convergence, rotation.