Figure 30 - uploaded by Eric Balster
“Susie” Reconstructed via the Standard TS Wavelet (Quantization = 64). 

Source publication
Article
Full-text available
This report primarily addresses the problem of quantization noise, since it is one of the very few processes that can eliminate valuable signal information. In a lossy compression system, the quantization step is solely responsible for the information loss that reduces the quality of the reconstructed signal. For this effort, adaptive fi...
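The quantization step the abstract describes can be sketched in a few lines. This is an illustrative example, not code from the report: the function names and sample values are invented, and the step size 64 is taken from the figure caption above.

```python
import numpy as np

def quantize(coeffs, q):
    """Uniform scalar quantization: divide by the step q and round.
    The rounding discards the fractional part -- this is the sole
    source of information loss in the compression pipeline."""
    return np.round(coeffs / q).astype(int)

def dequantize(indices, q):
    """Reconstruction: scale the integer indices back up."""
    return indices * q

# Illustrative transform-coefficient values (not from the report)
x = np.array([130.7, -3.2, 75.0, 12.4])
q = 64  # quantization step, as in the "Susie" example above
x_hat = dequantize(quantize(x, q), q)
# The reconstruction error x - x_hat is bounded by q / 2 per sample
```

Because `dequantize(quantize(x, q), q)` cannot recover the rounded-off fraction, no downstream processing can undo this loss; the papers below attack it instead by evolving better transforms around the quantizer.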

Context in source publication

Context 1
... sample image results for “TEST 8” are shown in Figures 29-31. As can be seen, slight improvements in shading and detail are visible in Figure 31 as compared to the image reconstructed using the standard TS wavelet shown in Figure ...

Similar publications

Article
Full-text available
A long-distance imaging system can be strongly affected by atmospheric turbulence. Here a novel method is suggested for mitigating the effects of atmospheric distortion on practical images, especially airborne turbulence, which can severely corrupt a region of interest (ROI). In order to extract precise details about substance behind the distorted la...
Article
Full-text available
In this paper, a novel algorithm for the accurate localization of the QRS complex with low average time error is proposed. The underlying idea is that the various features of the ECG signal, such as the P, Q, R, S and T peaks, can be independently detected from the raw ECG recording and fused together to obtain a better estimate of the QRS position. To explore, in this paper,...
Article
Full-text available
Transmitting information in the form of images has gained much importance in the modern age. Images are often corrupted by various types of noise during acquisition and transmission, and such images have to be cleaned before use in any application. Image denoising has been an active research area in image processing for decades. Wavelet transform has been a...
Article
Full-text available
The process of removing noise from an image remains a demanding problem for researchers. Several algorithms exist, and each has its own assumptions, merits, and demerits. The prime focus of this paper is the preprocessing of an image before it can be used in applications; the preprocessing is done by de-noising of image...
Conference Paper
Full-text available
In many situations, the Electrocardiogram (ECG) is recorded during ambulatory or strenuous conditions, such that the signal is corrupted by different types of noise, sometimes originating from another physiological process of the body. Hence, noise removal is an important aspect of signal processing. Here five different filters, i.e. median, Low Pass...

Citations

... An ongoing program of research conducted at Wright-Patterson Air Force Base and the University of Alaska Anchorage ([3], [4]) has provided considerable evidence suggesting that the search space of non-traditional transforms is indeed rich with such solutions. For selected classes of periodic, one-dimensional signals subjected to quantization, our GAs evolved DWT ...
Conference Paper
Full-text available
Ongoing research has established a new methodology for using genetic algorithms [2] to evolve forward and inverse transforms that significantly reduce quantization error in reconstructed signals and images. The approach promises to revolutionize the signal and image processing field, producing both higher-quality images and higher compression ratios than are currently possible with wavelet-based techniques.
... I chose to concentrate on the Daubechies-4 (D4) discrete wavelet transform (DWT), a standard wavelet designed by Ingrid Daubechies, although the research is applicable to any existing DWT [2,3,4]. The D4 is usually represented by the set of scaling numbers h_i and the set of wavelet numbers g_i (see Figure 1), together referred to as the forward transform. ...
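As a sketch of the transform this excerpt refers to, the following computes one level of the D4 forward transform from the scaling numbers h_i and wavelet numbers g_i. The coefficient formulas are the standard Daubechies-4 values; the periodic boundary handling, function name, and test signal are illustrative assumptions, not taken from the cited work.

```python
import numpy as np

# Daubechies-4 scaling numbers h_i (lowpass analysis filter)
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
# Wavelet numbers g_i from the quadrature-mirror rule g_i = (-1)^i h_{3-i}
g = np.array([h[3], -h[2], h[1], -h[0]])

def d4_forward(x):
    """One level of the D4 DWT, assuming periodic signal extension.
    Returns approximation (a) and detail (d) coefficients."""
    n = len(x)
    a = np.empty(n // 2)
    d = np.empty(n // 2)
    for i in range(n // 2):
        window = x[np.arange(2 * i, 2 * i + 4) % n]  # wrap at the edge
        a[i] = np.dot(h, window)
        d[i] = np.dot(g, window)
    return a, d

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])  # toy signal
a, d = d4_forward(x)
# D4 is orthonormal, so energy is preserved: ||x||^2 = ||a||^2 + ||d||^2
```

Because the transform is orthonormal, quantization error added to (a, d) maps to an equal-energy error in the reconstruction, which is why the evolved inverse transforms in these papers are scored directly on reconstruction MSE.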
Article
Full-text available
The use of digital images is steadily increasing in personal digital photography, medical imaging, and fingerprint image databases. The goal of this research is to improve the image quality of a given compressed digital image while maintaining the same file size. Wavelet-based image compression is improved upon by using genetic algorithms on a supercomputer to evolve transforms that yield better image quality after compression.
Article
Full-text available
Nowadays, optimal and intelligent design approaches are vital in almost all areas of engineering, and scientists and engineers are attempting to make frameworks and models more proficient and intelligent. This paper presents a detailed investigation of the design of various digital filters using optimization algorithms. Digital filters are generally classified into two types, FIR and IIR, and further into one-dimensional, two-dimensional, and three-dimensional filters for signal, image, and video processing respectively. Designing a digital filter that satisfies all the required conditions perfectly is challenging, so, apart from conventional mathematical methods, optimization algorithms can be used to design optimal digital filters. IIR (infinite impulse response) filters have an impulse response of infinite duration; FIR (finite impulse response) filters have an impulse response of finite duration. In this paper we discuss the design of various optimal digital filters, for the processing of signals, images, and video, based on evolutionary and swarm-intelligence algorithms such as the Genetic Algorithm, Particle Swarm Optimization, Artificial Bee Colony Optimization, the Cuckoo Search Algorithm, Differential Evolution, Gravitational Search, Harmony Search, Spiral Optimization, teaching–learning-based optimization, wind-driven optimization, and hybridizations of these algorithms.
Conference Paper
This research established a methodology for using a genetic algorithm to evolve coefficients for matched forward and inverse transform pairs. Beginning with an initial population of randomly mutated copies of the coefficients representing a standard wavelet, our GA consistently evolved transforms that outperformed wavelets for image compression and reconstruction applications under conditions subject to quantization error. Transforms optimized against a single representative image also outperformed wavelets when subsequently tested against other images from our test set. The new methodology has the potential to revolutionize the signal and image processing fields.
Conference Paper
This investigation uses a genetic algorithm to optimize coefficient sets describing inverse transforms that significantly reduce mean squared error of reconstructed images. Quantization error introduced during image compression and reconstruction is one of the worst noise sources, due to the fact that information is always permanently lost during the transformation process. Our approach establishes an adaptive filtering methodology for evolving transforms that outperform discrete wavelet inverse transforms for the reconstruction of images subjected to quantization error. Inverse transforms evolved against a single training image consistently generalize to exhibit superior performance against other images from the test set.
Conference Paper
This paper describes a genetic algorithm that evolves optimized sets of coefficients for one-dimensional signal reconstruction under lossy conditions due to quantization. Beginning with a population of mutated copies of the set of coefficients describing a standard wavelet-based inverse transform, the genetic algorithm systematically evolves a new set of coefficients that significantly reduces mean squared error (relative to the performance of the selected wavelet) for various classes of one-dimensional signals. The evolved transforms also outperform wavelets when subsequently tested against random signals from the same class.
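The methodology these abstracts share can be sketched as a toy example: start from mutated copies of a standard transform's coefficients and select for low reconstruction MSE under quantization. This is not the authors' code; the Haar wavelet stands in for the TS/D4 wavelets of the papers, and the population size, mutation scale, and training signal are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_forward(x):
    """Standard Haar analysis: normalized sums and differences of pairs."""
    e, o = x[0::2], x[1::2]
    return (e + o) / np.sqrt(2), (e - o) / np.sqrt(2)

def inverse(a, d, c):
    """Inverse transform parameterized by a coefficient set c.
    c = [1/sqrt(2), 1/sqrt(2)] reproduces the exact Haar synthesis."""
    x = np.empty(2 * len(a))
    x[0::2] = c[0] * a + c[1] * d
    x[1::2] = c[0] * a - c[1] * d
    return x

def mse_fitness(c, x, q=64):
    """Reconstruction MSE after quantizing the transform coefficients."""
    a, d = haar_forward(x)
    a_q, d_q = np.round(a / q) * q, np.round(d / q) * q
    return np.mean((x - inverse(a_q, d_q, c)) ** 2)

x = rng.uniform(0, 255, size=64)  # arbitrary training signal

# Initial population: mutated copies of the standard coefficients,
# with one unmutated copy retained as a baseline individual.
pop = np.full((40, 2), 1 / np.sqrt(2)) + rng.normal(0, 0.05, (40, 2))
pop[0] = np.array([1.0, 1.0]) / np.sqrt(2)

# Elitist GA loop: keep the 10 best, refill with Gaussian mutants.
for _ in range(200):
    scores = np.array([mse_fitness(c, x) for c in pop])
    elite = pop[np.argsort(scores)[:10]]
    children = elite[rng.integers(0, 10, 30)] + rng.normal(0, 0.01, (30, 2))
    pop = np.vstack([elite, children])

best = pop[np.argmin([mse_fitness(c, x) for c in pop])]
baseline = mse_fitness(np.array([1.0, 1.0]) / np.sqrt(2), x)
evolved = mse_fitness(best, x)
# Elitism guarantees evolved <= baseline on the training signal.
```

The generalization claim in the abstracts corresponds to evaluating `best` against signals other than the training signal `x`; here only the training-signal comparison is shown.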