Multiresolutional techniques for digital image filtering and watermarking
This thesis examines the use of multiresolutional techniques in two areas of digital image processing: denoising (speckle reduction) and watermarking. A speckle reduction algorithm operating in the wavelet domain is proposed. This novel algorithm iteratively reduces the difference between the estimated noise standard deviation in an image and the standard deviation of the noise removed. A method for assessing the overall performance of a filter, based upon noise removal and edge preservation, is presented. Comparisons between the novel denoising algorithm and existing denoising filters are carried out using test images and medical ultrasound images. Results show that the novel denoising algorithm reduces speckle drastically whilst maintaining sharp edges.

Two distinct areas of digital image watermarking are addressed in this thesis: (1) the presentation of a novel watermarking system for copyright protection, and (2) a fair comparison of the effects of incorporating Error Correcting Codes (ECC) into various watermarking systems. The newly proposed watermarking system is blind, quantization based and operates in the wavelet domain. Tests carried out on this novel system show it to be highly robust and reliable. An extensive and fair study of the effects of incorporating ECCs (Bose, Chaudhuri and Hocquenghem (BCH) and repetition codes) into various watermarking systems is carried out. Spatial, Discrete Cosine Transform (DCT) and wavelet based systems are tested. It is shown that it is not always beneficial to add ECCs to a watermarking system.
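The abstract does not specify how the iterative matching of the estimated and removed noise standard deviations is performed. As a rough illustration only, the sketch below assumes soft thresholding of the detail subbands of a one-level Haar wavelet transform and a bisection search on the threshold until the standard deviation of the removed noise reaches the estimate; the Haar basis, soft thresholding, and bisection are all assumptions for illustration, not necessarily the thesis's method.

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar transform (image sides must be even):
    returns the approximation and three detail subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

def iterative_denoise(img, sigma_est, steps=30):
    """Soft-threshold the detail subbands, bisecting on the
    threshold until the std of the removed noise matches
    sigma_est (the estimated noise std of the image)."""
    ll, lh, hl, hh = haar2d(img)
    lo, hi = 0.0, 10.0 * sigma_est
    den = img
    for _ in range(steps):
        t = (lo + hi) / 2.0
        soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
        den = ihaar2d(ll, soft(lh), soft(hl), soft(hh))
        removed_std = (img - den).std()
        # removed_std grows monotonically with t, so bisect
        lo, hi = (t, hi) if removed_std < sigma_est else (lo, t)
    return den
```

Stopping when the removed-noise standard deviation equals the estimate prevents both under-smoothing (too little noise removed) and over-smoothing (signal energy removed along with the noise).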
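The thesis's blind, quantization-based wavelet watermarking system is not detailed in the abstract. A minimal sketch of the general idea behind quantization-based embedding (quantization index modulation applied per coefficient, with a quantization step `delta`) is shown below; the per-coefficient scheme and the step size are assumptions for illustration, not the system proposed in the thesis.

```python
import numpy as np

def qim_embed(coeffs, bits, delta=8.0):
    """Embed one bit per coefficient by quantizing it onto one of
    two interleaved lattices: multiples of delta for bit 0, or the
    same lattice shifted by delta/2 for bit 1."""
    coeffs = np.asarray(coeffs, dtype=float)
    dither = np.asarray(bits) * (delta / 2.0)
    return np.round((coeffs - dither) / delta) * delta + dither

def qim_extract(coeffs, delta=8.0):
    """Blind extraction: no original image needed. Each bit is the
    index of the lattice nearest to the received coefficient."""
    coeffs = np.asarray(coeffs, dtype=float)
    d0 = np.abs(coeffs - np.round(coeffs / delta) * delta)
    shifted = coeffs - delta / 2.0
    d1 = np.abs(shifted - np.round(shifted / delta) * delta)
    return (d1 < d0).astype(int)
```

The detector needs only `delta`, which is what makes the scheme blind; perturbations smaller than `delta / 4` cannot move a coefficient closer to the wrong lattice, which is the source of the robustness that quantization-based methods trade against embedding distortion.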
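Of the two ECC families studied, the repetition code is the simpler: each watermark bit is repeated a fixed number of times and recovered by majority vote. A brief sketch follows, with the repetition factor `r = 3` chosen arbitrarily for illustration.

```python
import numpy as np

def rep_encode(bits, r=3):
    """Rate-1/r repetition code: repeat each payload bit r times."""
    return np.repeat(np.asarray(bits), r)

def rep_decode(channel_bits, r=3):
    """Majority vote over each group of r received bits; corrects
    up to (r - 1) // 2 bit errors per group."""
    groups = np.asarray(channel_bits).reshape(-1, r)
    return (groups.sum(axis=1) > r // 2).astype(int)
```

The trade-off the thesis examines is visible even here: the code corrects isolated channel errors, but the payload must shrink by a factor of `r` at a fixed embedding capacity, so adding the ECC is not guaranteed to improve overall watermark reliability.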