Fig. 7. Scrambled input sequence.

Source publication
Article
Full-text available
Data stored on physical storage devices and transmitted over communication channels often contain substantial redundancy, which compression techniques can reduce to conserve storage space and shorten transmission time. The need for adequate security measures, such as secret key control in specific techniques, raises con...

Context in source publication

Context 1
... cases where the frequency of a character's occurrence exceeds the number of rows in the matrix, a new key value is determined by calculating the remainder of the frequency divided by the number of rows. Fig. 7 illustrates this ...
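A minimal Python sketch of this remainder rule (illustrative only; the function name and the treatment of frequencies that already fit within the matrix are assumptions, not details from the paper):

```python
def key_value(frequency: int, num_rows: int) -> int:
    """Map a character's occurrence count to a key value.

    Per the excerpt above: when the frequency exceeds the number of
    matrix rows, the key becomes the remainder of the frequency
    divided by the row count.
    """
    if frequency <= num_rows:
        return frequency  # fits within the matrix; assumed unchanged
    return frequency % num_rows

# Example: 19 occurrences in an 8-row matrix -> 19 % 8 = 3
print(key_value(19, 8))  # 3
```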

Citations

... Using BWT, these similar values are grouped together, which can significantly reduce the size of the data. BWT can also be used for methods that compress and encrypt data sent in power grids [65]. ...
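For reference, the forward transform can be sketched naively in Python (a minimal O(n² log n) illustration using explicit rotations; production implementations build suffix arrays instead):

```python
def bwt(text: str, end_marker: str = "$") -> str:
    """Return the Burrows-Wheeler Transform of `text`."""
    s = text + end_marker  # unique sentinel keeps the transform invertible
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    # The transform is the last column of the sorted rotation matrix;
    # similar characters cluster into runs here, which aids compression.
    return "".join(row[-1] for row in rotations)

print(bwt("banana"))  # annb$aa
```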
Article
Full-text available
Significant amounts of data need to be transferred in order to optimize the operation of power grids. The development of advanced metering and control infrastructure ensures growth in the amount of data transferred within smart grids. Data compression is a strategy to reduce this burden. This paper presents current challenges in the field of time-series data compression. The paper’s novel contribution is the division of smart-grid data into real-time data used for control purposes and big data sets used for non-time-critical analysis of the system. These two applications have different requirements for effective compression. Currently used algorithms are listed and described, with their advantages and drawbacks for both applications. Details needed for the implementation of an algorithm are also provided. The comprehensive analysis and comparison are intended to facilitate the design of a data compression method tailored to a particular application. An important contribution is the description of the influence of data compression methods on cybersecurity, one of the major concerns in modern power grids. Future work includes the development of adaptive compression methods based on artificial intelligence, especially machine learning, and on quantum computing. This review offers a solid foundation for the research and design of data compression methods.
... Detailed instructions for both the forward and reverse BWT are available in reference [22]. The BWT method [23] combines the ideas of encryption and compression to make data both smaller and more secure. During the evaluation, it was determined that the BWT method reduced the data size to nearly 90% of its original size. ...
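The reverse transform can likewise be sketched naively (this assumes the sentinel-based forward transform shown earlier; practical decoders use the linear-time last-to-first mapping rather than repeated sorting):

```python
def inverse_bwt(transformed: str, end_marker: str = "$") -> str:
    """Recover the original text from its Burrows-Wheeler Transform."""
    n = len(transformed)
    table = [""] * n
    # Prepending the transform column and re-sorting n times rebuilds
    # the complete sorted matrix of rotations, one column per pass.
    for _ in range(n):
        table = sorted(transformed[i] + table[i] for i in range(n))
    original = next(row for row in table if row.endswith(end_marker))
    return original[:-1]  # drop the sentinel

print(inverse_bwt("annb$aa"))  # banana
```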
Article
Full-text available
This research paper conducts a comprehensive analysis of three key lossless image compression algorithms: Run-Length Encoding (RLE), the Burrows-Wheeler Transform (BWT), and Differential Pulse Code Modulation (DPCM). The increasing demand for efficient image storage and transmission necessitates a thorough examination of these algorithms. Lossless compression plays a crucial role in diminishing data redundancy while safeguarding the integrity and quality of images. The study encompasses data collection, performance metrics, and algorithm evaluation. Results reveal the strengths and weaknesses of each algorithm. RLE excels in image quality preservation but may not achieve the highest compression ratios. DPCM provides a compromise between resource-efficient compression and image fidelity. BWT offers a competitive balance between compression efficiency and image quality. Based on the comprehensive analysis of the three algorithms, BWT emerges as a versatile choice that offers competitive compression while maintaining reasonable image quality. However, when choosing the most suitable algorithm, it is essential to consider specific application requirements, including the desired level of image quality preservation and the availability of computational resources.
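As context for the comparison, run-length encoding is the simplest of the three schemes; a minimal Python sketch (illustrative only, not the paper's implementation):

```python
from itertools import groupby

def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Encode a byte string as (byte value, run length) pairs."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(data)]

print(rle_encode(b"aaaabbc"))  # [(97, 4), (98, 2), (99, 1)]
```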
Article
In 1999, the Polynomial Reconstruction Problem (PRP) was put forward as a new hard mathematical problem. A univariate PRP scheme by Augot and Finiasz was introduced at Eurocrypt in 2003, and this cryptosystem was fully cryptanalyzed in 2004. In 2013, a bivariate PRP cryptosystem was developed as a modified version of Augot and Finiasz's original work. This study describes a decryption failure that can occur in both cryptosystems. We demonstrate that decryption failure can occur when the error has a weight greater than the number of monomials in a secret polynomial, p. The study also determines the upper bound that should be applied to avoid decryption failure.
Article
As a result of the influence of individual appearance and lighting conditions, aberrant noise spots cause significant mis-segmentation in frontal portraits. This paper presents an accurate portrait segmentation approach that combines wavelet proportional shrinkage with an improved sparrow search algorithm (SSA) for clustering to address the accuracy challenge of frontal portrait segmentation. The brightness component of the portrait in HSV space is first denoised by wavelet proportional shrinkage. An elite inverse learning approach and an adaptive weighting factor are then used to optimize the initial cluster centers of the K-Means algorithm, improving the initial distribution and accelerating the convergence of the SSA population. The pixel segmentation accuracy of the proposed method is approximately 70% and 15% higher, respectively, than that of two comparable traditional methods, while the similarity of color image features is approximately 10% higher. Experiments show that the proposed method achieves a high level of accuracy under varying lighting conditions.