Fig 1 - uploaded by Taha Hasan
The transform between a domain block (D_j) and a range block (R_i)


Source publication
Article
Full-text available
In this study, two methods are suggested for alleviating the long encoding time of the encoding process in Fractal Image Compression (FIC). The first, called Zero Mean Intensity Level (ZMIL), is based on using an unconventional affine parameter (namely the range block mean) that has better properties than the conventional offset pa...

Contexts in source publication

Context 1
... of all large blocks, known as domain blocks (D), constructs a codebook called the domain pool (S) (Lisa, 2000). After partitioning a given image into R-blocks and D-blocks, the task is to find, for each range block R_i, a domain block D_j and a contractive map w_i that minimizes the distance between R_i and the mapped D_j, as illustrated in Fig. 1 (Lisa, 2000). This shows why fractal compression is a slow technique: each range block must be compared to all domain blocks, including their eight symmetry orientations (Fig. 2). This exhaustive search finds the best match between domain and range blocks, i.e. the pairing that satisfies the minimum distortion error E(R, D) of ...
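The exhaustive matching described above can be sketched as follows. This is a minimal illustration of the brute-force search, assuming blocks are already extracted and downsampled to the range-block size; the function name and structure are illustrative, not the paper's implementation.

```python
import numpy as np

def best_match(range_block, domain_blocks):
    """Brute-force fractal matching: compare one range block R against every
    domain block D under all 8 dihedral symmetry orientations, keeping the
    affine transform (scale s, offset o) that minimizes the squared error
    E(R, D). Sketch only; quantization of s and o is omitted."""
    best = (float("inf"), None)  # (error, (domain index, symmetry, s, o))
    R = range_block.astype(float)
    n = R.size
    for j, D in enumerate(domain_blocks):
        for k in range(8):  # 4 rotations, each optionally mirrored
            T = np.rot90(D, k % 4).astype(float)
            if k >= 4:
                T = np.fliplr(T)
            # least-squares contrast s and offset o for R ~ s*T + o
            denom = n * (T * T).sum() - T.sum() ** 2
            s = 0.0 if denom == 0 else (n * (R * T).sum() - T.sum() * R.sum()) / denom
            o = (R.sum() - s * T.sum()) / n
            err = ((s * T + o - R) ** 2).sum()
            if err < best[0]:
                best = (err, (j, k, s, o))
    return best
```

The two nested loops (every domain block times eight orientations) are exactly why plain FIC encoding is slow, which motivates the speed-up techniques the article proposes.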
Context 2
... encoding time is gained. From the results, the encoding time is reduced by about 80% compared with the time required in the full-search case. It is also noticed that the compression ratio increases while the reconstructed image retains good resolution. When the no-symmetry option is combined with RDIS, the encoding time is reduced to only one second. Fig. 9, 10 and 11 show the effects of the symmetry ...

Similar publications

Article
Full-text available
This paper studies computer desktop visual image compression technology based on a fuzzy clustering algorithm and proposes a computer desktop image compression scheme based on HEVC and color clustering. Aiming at active adaptive block partitioning of computer desktop images, a high-quality, low-complexity compression algorithm for computer...
Conference Paper
Full-text available
Image compression is the reduction of the amount of data required to represent an image. To compress an image efficiently, various techniques are used to decrease storage space and to increase the efficiency of transferring images over a network for better access. This paper explains compression methods such as JPEG 2000, EZW, SPIHT (Set Partit...
Article
Full-text available
Fuzzy transform is a relatively recent fuzzy approximation method, mainly used for image and general data processing. Due to the growing interest in the application of the fuzzy transform over recent years, it seems appropriate to provide a review of the technique. In this paper, we recall F-transform-based compression methods for data and images. The rela...
Article
Full-text available
This paper presents a joint compression and encryption technique using Set Partitioning in Hierarchical Tree (SPIHT) in Multi-resolution Singular Value Decomposition (MSVD) domain. The main idea of this work is to identify significant and less significant information in the MSVD domain. The core information is encrypted using pseudo-random number s...
Article
Full-text available
A new processing algorithm based on fractal image compression is proposed for image-processing efficiency. An image is partitioned into non-overlapping blocks called range blocks and overlapping blocks called domain blocks, with the domain blocks generally larger than the range blocks, to achieve a rapid encoding time. This research introduces a ne...

Citations

... In this paper an adaptive method is proposed to reduce the long encoding time of FIC, increase the compression ratio and preserve the reconstructed image quality. This method works on: 1- Reducing the complexity of the matching operations by using Zero Mean Intensity Level Fractal Image Compression (ZMIL FIC), which speeds up the encoding operation and increases both the compression ratio and the reconstructed image quality, as illustrated in our previous work [25]. 2- Minimizing the number of matching operations by reducing both the range and domain blocks needed in the matching operations. For this purpose, an adaptive quadtree partitioning technique is used, and then three techniques are applied: the first, called Range Exclusion (RE), uses a variance factor to reduce the number of range blocks by excluding homogeneous ranges from the matching process; the second, called Variance Domain Selection (VDS), searches only the domain blocks with a small variance difference from the encoded range; the third, called Reducing the Domain Image Size (RDIZ), reduces the domain pool by shrinking the domain image to only 1/16th of the original image size. ...
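The Range Exclusion (RE) idea in the excerpt above can be sketched briefly. This is a hypothetical illustration: the function name and the threshold value are assumptions, not the paper's actual parameters; only the variance test itself comes from the excerpt.

```python
import numpy as np

def partition_ranges(blocks, var_threshold=5.0):
    """Range Exclusion (RE) sketch: range blocks whose variance falls below a
    threshold are treated as homogeneous, coded directly by their mean, and
    skipped in the search; only the remaining blocks enter the costly
    domain-matching stage."""
    homogeneous, to_search = [], []
    for i, b in enumerate(blocks):
        if np.var(b) < var_threshold:
            homogeneous.append((i, float(b.mean())))  # store the mean only
        else:
            to_search.append(i)
    return homogeneous, to_search
```

Since flat regions dominate many natural images, excluding them up front removes a large share of the matching operations at negligible quality cost.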
... And in this research it was used and developed by merging it with other speed-up techniques to obtain a high-performance method. As illustrated in our previous work [25], ZMIL can convert the full search scheme into a one-parameter optimization problem, since the optimal approximation in the decoding unit for every range block can be obtained from Eq. (10). ...
... In the introduced transformation, r is uniformly quantized with 6 bits and s with 2 bits, instead of 5 and 7 bits for s and o in BFIC, and this leads to a higher CR [25]. ...
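The zero-mean transform and its coarse quantization can be sketched as follows. This is a hedged illustration: the transform R ≈ s·(D − mean(D)) + r follows the ZMIL description (the range mean r replaces the offset), but the parameter ranges used for quantization ([0, 255] for r, [−1, 1] for s) are assumptions, not values stated in the excerpt.

```python
import numpy as np

def zmil_code(range_block, domain_block, r_bits=6, s_bits=2):
    """ZMIL-style coding sketch: the domain block is made zero-mean, so the
    offset parameter is replaced by the range mean r and only the scale s is
    fitted. r and s are then uniformly quantized with r_bits and s_bits, as
    the citation describes (6 and 2 bits instead of 7 and 5 in BFIC)."""
    R = range_block.astype(float)
    D = domain_block.astype(float)
    Dz = D - D.mean()                    # zero-mean domain block
    r = R.mean()                         # range mean replaces the offset o
    denom = (Dz * Dz).sum()
    s = 0.0 if denom == 0 else ((R - r) * Dz).sum() / denom
    # uniform quantization (assumed ranges: r in [0, 255], s in [-1, 1])
    r_q = round(r / 255 * (2 ** r_bits - 1))
    s_q = round((float(np.clip(s, -1, 1)) + 1) / 2 * (2 ** s_bits - 1))
    return r_q, s_q
```

Because the zero-mean domain makes r independent of s, the search collapses to fitting a single parameter per candidate, and spending fewer bits on s than BFIC's s/o pair is what raises the compression ratio.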
Article
In this paper an Adaptive Fractal Image Compression (AFIC) algorithm is proposed to reduce the long encoding time of Fractal Image Compression (FIC). AFIC works on minimizing the complexity and the number of matching operations by reducing both the range and domain blocks needed in the matching process; for this purpose, Zero Mean Intensity Level Fractal Image Compression based on Quadtree partitioning, Variance Factor Range Exclusion, Variance Factor Domain Selection and Domain Pool Reduction techniques are used. This in turn affects the encoding time, compression ratio and image quality. The results show that AFIC significantly speeds up the encoding process and achieves a higher compression ratio, with a slight diminution in the quality of the reconstructed image. In comparison with some recent methods, the proposed method spends much less encoding time and achieves a higher compression ratio, while the quality of the reconstructed images is almost the same.