Abstract: We present a Bayesian approach for nonparametric function estimation based on a continuous wavelet dictionary, in which the unknown function is modeled by a random sum of wavelet functions at arbitrary locations and scales. By avoiding the dyadic constraints of orthonormal wavelet bases, the continuous overcomplete wavelet dictionary has greater flexibility to adapt to the structure of the data and may lead to sparser representations. The price of this flexibility is the computational challenge of searching over an infinite number of potential dictionary elements. We develop a novel reversible jump Markov chain Monte Carlo algorithm that exploits local features in the proposal distributions to improve computational efficiency and to achieve better mixing of the Markov chain. Performance comparisons in terms of sparsity and mean squared error are carried out on standard wavelet test functions. Results on an unequally spaced example show that our method compares favorably to methods based on interpolation or imputation.
Key words and phrases: Bayesian inference, nonparametric regression, overcomplete dictionaries, reversible jump Markov chain Monte Carlo, stochastic expansions, wavelets.