Figure 2. Four quadrant windows 1, 2, 3 and 4 used for directional window size selection by the ICI rule.

Source publication
Article
Full-text available
We describe a novel approach to the problem of window size (bandwidth) selection for filtering a noisy image signal. The approach is based on the intersection of confidence intervals (ICI) rule and yields an algorithm that is simple to implement and nearly optimal in terms of the point-wise mean squared error risk. The local polynomial a...

Contexts in source publication

Context 1
... ICI rule is graphically illustrated in Fig. 1, where the vertical lines with arrows show the successive intersections of the confidence intervals (1, 2), (1, 2, 3), and (1, 2, 3, 4). Assuming that the intersection with the fourth confidence interval (corresponding to h = h_4) is empty, we obtain the "optimal" adaptive window size h+ = h_3. ...
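The excerpt above walks through the core of the ICI selection: keep enlarging the window while the new confidence interval still intersects all previous ones, and stop just before the intersection becomes empty. The sketch below is a minimal illustration of that logic; the function name, the dyadic window set mentioned in the comments and the Γ value are assumptions for this example, not code from the source publication.

```python
import numpy as np

def ici_window_select(estimates, stds, gamma=2.0):
    """Intersection of confidence intervals (ICI) rule, sketched.

    estimates : y_hat(h_1), ..., y_hat(h_J) for increasing window sizes
    stds      : the corresponding standard deviations of those estimates
    gamma     : confidence interval width parameter (threshold)

    Returns the last index j for which the running intersection of the
    confidence intervals D_1, ..., D_j is still non-empty.
    """
    lower, upper = -np.inf, np.inf          # running intersection of intervals
    best = 0
    for j, (y, s) in enumerate(zip(estimates, stds)):
        lower = max(lower, y - gamma * s)   # D_j = [y - gamma*s, y + gamma*s]
        upper = min(upper, y + gamma * s)
        if lower > upper:                   # intersection became empty
            break                           # keep the previously accepted index
        best = j
    return best

# Toy data for dyadic window sizes h_k = 2**k, k = 0..3: the fourth interval
# no longer overlaps, so the rule returns index 2, i.e. h+ = h_3 in the
# 1-based numbering used in the excerpt above.
print(ici_window_select([1.02, 1.00, 0.98, 1.60], [0.20, 0.12, 0.08, 0.05]))  # -> 2
```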
Context 2
... more complex approach assumes that the varying window is composed of a number of separate segments, say the four quadrants shown in Figure 2. The centre of the window is the origin (0, 0) of the Cartesian coordinate system. ...
Context 3
... ŷ_j(x, h+_j(x)) are the estimates with the ICI rule adaptive window size, j = 1, 2, 3, 4, obtained respectively for the windows 1, 2, 3, 4 in Figure 2. Further, k_j and std_j are the weights and the standard deviations of these estimates ŷ_j(x, h+_j(x)). ...
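Context 3 above fuses the four directional estimates with weights k_j and standard deviations std_j. One common choice, used here purely as an illustration and not necessarily the exact weighting of the source publication, is inverse-variance weighting, k_j ∝ 1/std_j².

```python
import numpy as np

def fuse_directional_estimates(y_hats, stds):
    """Combine the quadrant estimates y_hat_j(x, h+_j(x)) into a single value.

    Assumed weighting: k_j proportional to 1 / std_j**2, normalized so that
    the weights sum to one (inverse-variance weighting).
    """
    y_hats = np.asarray(y_hats, dtype=float)
    inv_var = 1.0 / np.asarray(stds, dtype=float) ** 2
    weights = inv_var / inv_var.sum()
    return float(np.sum(weights * y_hats))

# Example: four directional estimates at one pixel; the noisier fourth
# quadrant (larger std) contributes almost nothing to the fused value.
print(fuse_directional_estimates([0.95, 1.05, 1.00, 1.40], [0.05, 0.05, 0.10, 0.30]))
```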
Context 4
... Figure 4 we emphasize that the obtained window sizes actually correspond to the intuitively expected behavior of the varying window size for smoothing the data when the true image is known. Thus the window sizes delineate the true image of the square, and their variations provide a shadowing of the image from different sides, in full agreement with the directional windows used for smoothing (see Figure 2). ...
Context 5
... first part is applied for a point-wise image segmentation. This segmentation assumes that the LPA with the ICI rule is used for every pixel in order to find the adaptive sizes of the four directional rectangular windows shown in Figure 2. As a result, every pixel can be an entry of many different estimates obtained for adaptive varying-size windows with different centers. ...
Context 6
... Figure 6 shows the varying adaptive window sizes obtained respectively for the windows I, II, III and IV (Figure 2, h_k = 2^k, k = 0, 1, ..., 6). Here black and white areas correspond, respectively, to small and large window sizes. ...

Citations

... Applying the filtering technique, we estimate the signal x(k) from the noisy signal y(k) in such a way that the estimation error is minimized. The LPA-based filter design technique consists of locally fitting a polynomial to the noisy data within a sliding window of an adaptive size [9]-[11], [16]-[18]. The optimal window size minimizes the mean squared estimation error and is selected by using the statistical test based on the intersection of the confidence intervals. ...
Conference Paper
Full-text available
Smart meters are modernizing electrical energy systems, solving numerous problems of the classical systems and introducing new functionalities to the grid. This paper aims to give an insight into these systems in terms of their security and privacy aspects. Special attention is paid to the energy theft problem (meter tampering, physical anti-tampering measures, and theft detection methods). We next address other security issues present in remote metering (attack types, risks, and threats to the system). Also, necessary security requirements and signature-based, anomaly-based, and specification-based Intrusion Detection Systems (IDS) are defined. We conclude by listing privacy concerns arising as remote meters and advanced metering infrastructures (AMI) are increasingly deployed.
... Applying the filtering technique, we estimate the signal x(k) from the noisy signal y(k) in such a way that the estimation error is minimized. The LPA-based filter design technique consists of locally fitting a polynomial to the noisy data within a sliding window of an adaptive size [9]-[11], [16]-[18]. The optimal window size minimizes the mean squared estimation error and is selected by using the statistical test based on the intersection of the confidence intervals. ...
Conference Paper
The detection of the gravitational-wave (GW) events from the noisy measured data requires reliable filtering techniques. This paper provides the performance analysis of the intersection of confidence intervals (ICI) and local polynomial approximation (LPA) based techniques when applied to the filtering of the GW signals in the real-life, low signal-to-noise-ratio (SNR) conditions. The techniques' sensitivities to the parameter changes are studied by evaluating the performance indices values. The analysis shows that the proper selection of the algorithm parameters is important in obtaining the optimal filtering performance, with the technique based on the relative intersection of confidence intervals (RICI) having an additional parameter that may compensate for the sub-optimally selected ICI parameter. Moreover, the evaluation of the results obtained by applying the optimally tuned filtering techniques suggests that the RICI-based technique provides better performance in filtering the GW signals, even compared to the tested wavelet-based techniques.
... In recent years, compressed sensing computational ghost imaging (CSCGI) technology has been applied to CGI reconstruction [7]-[11]. Although compressed sensing (CS) is promising in CGI, CS has two major problems [7], [12]-[14]. On the one hand, a CS algorithm requires partial knowledge of the target images in advance to reconstruct images from some samples; further, the images will not be sparsely fixed, which limits its practical application. ...
Article
Full-text available
Computational ghost imaging is difficult to apply under low sampling rate. We propose high-speed computational ghost imaging based on an auto-encoder network to reconstruct images with high quality under low sampling rate. The auto-encoder convolutional neural network is designed, and the object images can be reconstructed accurately without labeled images. Experimental results show that our method can greatly improve the peak signal-to-noise ratio and structural similarity of the test samples, which are up to 18 and 0.7, respectively, under low sampling rate. Our method only needs 1/10 of traditional deep learning samples to achieve fast and high-quality image reconstruction, and the network also has a certain generalization to the gray-scale images.
... Since the shape of the patch is adaptive to the image structures [10], the SA patch can avoid smoothing across edges. To extract SA patches efficiently, the anisotropic local polynomial approximation-intersection of confidence intervals (LPA-ICI) technique [11] is used. For each target pixel x ∈ X , we extract a target patch of size b × b centered at the target pixel and an SA mask is obtained for the target patch by its shape-adaptive neighborhood. ...
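The excerpt above builds a shape-adaptive (SA) mask for each b × b target patch from its anisotropic LPA-ICI neighborhood. As a rough illustration tied to the four quadrant windows of Figure 2, the sketch below rasterizes a boolean mask from four per-quadrant adaptive sizes; the quadrant-to-array convention and the helper name are assumptions for this example.

```python
import numpy as np

def quadrant_sa_mask(b, h_quad):
    """Build a b-by-b boolean shape-adaptive mask around the centre pixel.

    h_quad : adaptive sizes (h1, h2, h3, h4) of the four quadrant windows
             in Figure 2, e.g. as selected by the ICI rule; each value is
             assumed to satisfy 0 <= h_j <= b // 2.
    """
    c = b // 2                                  # centre pixel of the patch
    mask = np.zeros((b, b), dtype=bool)
    h1, h2, h3, h4 = h_quad
    # One simple convention (array rows grow downwards):
    mask[c - h1:c + 1, c:c + h1 + 1] = True     # quadrant 1: up and right
    mask[c - h2:c + 1, c - h2:c + 1] = True     # quadrant 2: up and left
    mask[c:c + h3 + 1, c - h3:c + 1] = True     # quadrant 3: down and left
    mask[c:c + h4 + 1, c:c + h4 + 1] = True     # quadrant 4: down and right
    return mask

print(quadrant_sa_mask(9, (1, 3, 2, 1)).astype(int))
```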
Article
Full-text available
In this letter, a novel single depth map super-resolution algorithm is proposed, which combines a non-local prior with a local smoothness prior. Unlike color-guided methods, the proposed method does not need a corresponding color image to aid the depth map super-resolution. To explore the non-local self-similarity in the depth map, a shape-adaptive adjusted non-local regression is constructed using the shape-adaptive similar patch groups. This prior can make full use of the non-local information of the depth map and alleviate the effect of irrelevant pixels. To capture different local structures, a direction-based local smoothness prior is proposed to model the different directional information, which can preserve the different structures in the depth map. Experimental results indicate that the proposed approach achieves better reconstruction performance than the state-of-the-art methods.
... where x̂_n(k, h) represents the signal sample value estimated using n samples in its vicinity by applying the kernel with varying width h(k). The mean squared estimation error, MSE(k, h), may be defined, with respect to the estimation bias b_n(k, h) and the estimation variance σ²_n(k, h), as [30,31]: ...
... The crucial task of the adaptive filtering procedure includes selecting the kernel width h_o(k) that minimizes MSE(k, h), thus providing the optimal bias-variance trade-off [30,31]: ...
... The ICI algorithm's performance is highly sensitive to the selection of the optimal value of the threshold parameter Γ, as too small values result in signal undersmoothing, and too large values cause signal oversmoothing [24,30]. ...
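The equations referenced (but not reproduced) in the three excerpts above are the usual bias-variance decomposition of the point-wise risk and the resulting optimal width; the following is a hedged reconstruction consistent with the symbols defined there, not a verbatim quote of [30,31]:

```latex
\mathrm{MSE}(k,h) = b_n^{2}(k,h) + \sigma_n^{2}(k,h),
\qquad
h_{o}(k) = \arg\min_{h}\,\mathrm{MSE}(k,h).
```

Enlarging h typically reduces the variance term while increasing the bias term, which is why the ICI threshold Γ has to balance undersmoothing against oversmoothing, as noted in the last excerpt.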
Article
Full-text available
The real-life signals captured by different measurement systems (such as modern maritime transport characterized by challenging and varying operating conditions) are often subject to various types of noise and other external factors in the data collection and transmission processes. Therefore, the filtering algorithms are required to reduce the noise level in measured signals, thus enabling more efficient extraction of useful information. This paper proposes a locally-adaptive filtering algorithm based on the radial basis function (RBF) kernel smoother with variable width. The kernel width is calculated using the asymmetrical combined-window relative intersection of confidence intervals (RICI) algorithm, whose parameters are adjusted by applying the particle swarm optimization (PSO) based procedure. The proposed RBF-RICI algorithm’s filtering performances are analyzed on several simulated, synthetic noisy signals, showing its efficiency in noise suppression and filtering error reduction. Moreover, compared to the competing filtering algorithms, the proposed algorithm provides better or competitive filtering performance in most considered test cases. Finally, the proposed algorithm is applied to the noisy measured maritime data, proving to be a possible solution for a successful practical application in data filtering in maritime transport and other sectors.
... This results in the instantaneous slope changes and other features in s(k) being well preserved. In order to achieve this goal, we applied the LPA method [65]-[69] as the filter design technique and proposed the adaptive RICI algorithm for the filter support selection. ...
... The LPA method provides the estimate ŝ(k) from the noisy measurements x(k), defined in (1), by fitting a polynomial to measurement data within a sliding window defined in the vicinity of the considered measurement. The polynomial, obtained as a linear combination of basis vectors, is fitted locally for each considered measurement so that it minimizes the following loss function using the weighted least squares (WLS) criterion [65]-[68]: ...
... The window function ψ_w(k) defines the location of the polynomial fitting with respect to the central point k_0 and, in normalized form, satisfies the conventional kernel properties [65,66]: ...
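The excerpts above describe the LPA estimator as a weighted least squares polynomial fit inside a sliding window shaped by ψ_w(k). A minimal sketch of such an estimator is given below; the Gaussian window, the polynomial order and the function name are illustrative assumptions, not the cited papers' implementation.

```python
import numpy as np

def lpa_estimate(x, k0, half_width, order=2):
    """Local polynomial approximation (LPA) estimate at sample index k0.

    A polynomial of the given order is fitted to the noisy samples inside a
    window centred at k0 by weighted least squares, with an (assumed)
    Gaussian window psi_w; the fitted value at the window centre is returned
    as the estimate s_hat(k0).
    """
    n = len(x)
    idx = np.arange(max(0, k0 - half_width), min(n, k0 + half_width + 1))
    t = idx - k0                                          # local coordinates
    psi = np.exp(-0.5 * (t / max(half_width, 1)) ** 2)    # window weights
    # polyfit with w=sqrt(psi) minimizes sum psi * (x - poly(t))**2 (WLS).
    coeffs = np.polyfit(t, np.asarray(x)[idx], deg=order, w=np.sqrt(psi))
    return float(np.polyval(coeffs, 0.0))                 # value at the centre

# Toy usage: denoise one sample of a noisy ramp.
rng = np.random.default_rng(0)
clean = np.linspace(0.0, 1.0, 101)
noisy = clean + 0.05 * rng.standard_normal(101)
print(lpa_estimate(noisy, k0=50, half_width=8, order=1))  # close to clean[50] = 0.5
```

In the adaptive schemes discussed on this page, half_width would not be fixed but chosen per sample by the ICI or RICI rule.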
Article
Full-text available
Gravitational-wave data (first detected in 2015 by the Advanced LIGO interferometers and recognized with the Nobel Prize in 2017) are characterized by non-Gaussian and non-stationary noise. The ever-increasing amount of acquired data requires the development of efficient denoising algorithms that will enable the detection of gravitational-wave events embedded in low signal-to-noise-ratio (SNR) environments. In this paper, an algorithm based on the local polynomial approximation (LPA) combined with the relative intersection of confidence intervals (RICI) rule for the filter support selection is proposed to denoise the gravitational-wave burst signals from core collapse supernovae. The LPA-RICI denoising method's performance is tested on three different burst signals, numerically generated and injected into the real-life noise data collected by the Advanced LIGO detector. The analysis of the experimental results obtained by several case studies (conducted at different signal source distances corresponding to different SNR values) indicates that the LPA-RICI method efficiently removes the noise and simultaneously preserves the morphology of the gravitational-wave burst signals. The technique offers reliable denoising performance even at very low SNR values. Moreover, the analysis shows that the LPA-RICI method outperforms the approach combining LPA and the original intersection of confidence intervals (ICI) rule, a total-variation (TV) based method, a method based on neighboring thresholding in the short-time Fourier transform (STFT) domain, and three wavelet-based denoising techniques by increasing the improvement in the SNR by up to 118.94% and the peak SNR by up to 138.52%, as well as by reducing the root mean squared error by up to 64.59%, the mean absolute error by up to 55.60%, and the maximum absolute error by up to 84.79%.
... In addition, we required each tumor sample to have at least three of the following five purity estimates: Estimate, Absolute, LUMP, IHC, and the consensus purity estimate (CPE) (Katkovnik et al., 2002; Pagès et al., 2010; Carter et al., 2012; Yoshihara et al., 2013; Zheng et al., 2014; Aran et al., 2015). On the same datasets, we also applied THetA (Oesper et al., 2013, 2014), a popular tool for assessing CNA and admixture from sequencing data. ...
... To demonstrate a practical application of the vPR values, we use them to estimate the tumor purity of the samples. To this end, we compared the vPR-based purity (VBP) estimates with ESTIMATE, ABSOLUTE, LUMP, IHC, and the Consensus Purity Estimation (CPE) (Katkovnik et al., 2002; Pagès et al., 2010; Carter et al., 2012; Yoshihara et al., 2013; Zheng et al., 2014; Aran et al., 2015). ...
Article
Full-text available
Variant allele frequencies (VAF) are an important measure of genetic variation that can be estimated at single-nucleotide variant (SNV) sites. RNA and DNA VAFs are used as indicators of a wide-range of biological traits, including tumor purity and ploidy changes, allele-specific expression and gene-dosage transcriptional response. Here we present a novel methodology to assess gene and chromosomal allele asymmetries and to aid in identifying genomic alterations in RNA and DNA datasets. Our approach is based on analysis of the VAF distributions in chromosomal segments (continuous multi-SNV genomic regions). In each segment we estimate variant probability, a parameter of a random process that can generate synthetic VAF samples that closely resemble the observed data. We show that variant probability is a biologically interpretable quantitative descriptor of the VAF distribution in chromosomal segments which is consistent with other approaches. To this end, we apply the proposed methodology on data from 72 samples obtained from patients with breast invasive carcinoma (BRCA) from The Cancer Genome Atlas (TCGA). We compare DNA and RNA VAF distributions from matched RNA and whole exome sequencing (WES) datasets and find that both genomic signals give very similar segmentation and estimated variant probability profiles. We also find a correlation between variant probability with copy number alterations (CNA). Finally, to demonstrate a practical application of variant probabilities, we use them to estimate tumor purity. Tumor purity estimates based on variant probabilities demonstrate good concordance with other approaches (Pearson's correlation between 0.44 and 0.76). Our evaluation suggests that variant probabilities can serve as a dependable descriptor of VAF distribution, further enabling the statistical comparison of matched DNA and RNA datasets. Finally, they provide conceptual and mechanistic insights into relations between structure of VAF distributions and genetic events. The methodology is implemented in a Matlab toolbox that provides a suite of functions for analysis, statistical assessment and visualization of Genome and Transcriptome allele frequencies distributions. GeTallele is available at: https://github.com/SlowinskiPiotr/GeTallele.
... On the other hand, undersized w shrinks the window and, unfortunately, increases the variance of the estimation error. Thus, the optimal w (which minimizes estimation error by providing the trade-off between variance and bias of the estimation error) is to be calculated using the relative intersection of confidence intervals based algorithm [25]. ...
... A brief outline of the intersection of confidence intervals based algorithm is given in the sequel. Its detailed description may be found in [24,25,29-31]. The absolute estimation error for the tth measurement is [23-25,29] ...
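The excerpts above rely on the relative intersection of confidence intervals (RICI) algorithm, whose defining expressions are not quoted on this page. The sketch below follows the commonly published formulation, in which the overlap of the running intersection is measured relative to the width of the current confidence interval; it is an approximation for illustration, not the reference implementation of [25].

```python
import numpy as np

def rici_select(estimates, stds, gamma=2.0, r_c=0.85):
    """Relative intersection of confidence intervals (RICI) selection, sketched.

    Besides the ICI width parameter gamma, a relative-overlap threshold r_c
    is used: the overlap of the running intersection is divided by the width
    of the current interval, and the largest window keeping this ratio at or
    above r_c is selected.
    """
    lower, upper = -np.inf, np.inf
    best = 0
    for j, (y, s) in enumerate(zip(estimates, stds)):
        d_low, d_high = y - gamma * s, y + gamma * s
        lower, upper = max(lower, d_low), min(upper, d_high)
        overlap = max(0.0, upper - lower)
        if overlap / (d_high - d_low) < r_c:    # relative overlap R_j too small
            break                               # keep the previously accepted index
        best = j
    return best

# With partially overlapping intervals RICI (r_c = 0.85) stops at index 1,
# whereas the plain ICI rule sketched earlier would continue to index 2.
print(rici_select([1.00, 1.10, 1.25, 1.60], [0.20, 0.12, 0.08, 0.05]))  # -> 1
```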
Article
The paper proposes a power system state estimator upgraded by the improved relative intersection of confidence intervals (RICI) algorithm combined with the local polynomial approximation (LPA). In the proposed approach, a novel LPA-RICI denoising technique is utilized to preprocess the input measurements. The accuracy of the adaptive, data-driven LPA-RICI based state estimator was tested on the IEEE test systems with 14 and 30 buses, as well as on a model of the real-life Croatian transmission power system. Due to its adaptivity to high transitions in measurement series achieved by varying filter support size, the LPA-RICI based state estimator enhanced the estimation accuracy for all tested power systems, and at the same time reduced the number of iterations required for the estimator to converge when compared to the classical weighted least squares (WLS) based estimator. Namely, improvement when using the proposed preprocessor was obtained in estimation of the input voltage magnitudes, active and reactive power flows and injections for the Croatian transmission power system by up to 29.3% and 19.4% in terms of the reduction of the mean squared error (MSE) and mean absolute error (MAE), respectively. Furthermore, the LPA-RICI based estimator resulted in an overall estimation quality enhancement for the Croatian transmission power system by up to 13.3%, 12.3%, and 3.7%, in terms of the objective function, variance of the estimated states, and average time for the state estimator to converge, respectively.
... In addition, we required each tumour sample to have at least three of the following five purity estimates: Estimate, Absolute, LUMP, IHC, and the consensus purity estimate (CPE) (Supplementary Table 1). Finally, each sample was required to have a CNA estimation (genomic segment means based on the Genome-Wide-SNPv6 hybridization array) (Aran et al., 2015; Carter et al., 2012; Katkovnik et al., 2002; Pagès et al., 2010; Yoshihara et al., 2013; Zheng et al., 2014). ...
... As a proof of concept, we assessed matched normal and tumour exome and transcriptome sequencing data of 72 breast carcinoma (BRCA) datasets with pre-assessed copy-number and genome admixture estimation acquired through TCGA (Supplementary Table 1). For these datasets, purity and genome admixture have been assessed using at least three of the following five approaches: ESTIMATE, ABSOLUTE, LUMP, IHC, and the Consensus Purity Estimation (CPE) (Aran et al., 2015; Carter et al., 2012; Katkovnik et al., 2002; Pagès et al., 2010; Yoshihara et al., 2013; Zheng et al., 2014). In addition, on the same datasets we applied THetA, a popular tool for assessing CNA and admixture from sequencing data (Oesper et al., 2013; Oesper et al., 2014). ...
Preprint
Full-text available
Background: Asymmetric allele expression typically indicates functional and/or structural features associated with the underlying genetic variants. When integrated, RNA and DNA allele frequencies can reveal patterns characteristic of a wide range of biological traits, including ploidy changes, genome admixture, allele-specific expression and gene-dosage transcriptional response. Results: To assess RNA and DNA allele frequencies from matched sequencing datasets, we introduce a method for generating model distributions of variant allele frequencies (VAF) with a given variant read probability. In contrast to other methods, which are based on whole sequences or single SNVs, the proposed methodology uses continuous multi-SNV genomic regions. The methodology is implemented in the GeTallele toolbox, which provides a suite of functions for integrative analysis, statistical assessment and visualization of Genome and Transcriptome allele frequencies. Using model VAF probabilities, GeTallele allows estimation and comparison of variant read probabilities (VAF distributions) in a sequencing dataset. We demonstrate this functionality across cancer DNA and RNA sequencing datasets. Conclusion: Based on our evaluation, variant read probabilities can serve as a dependable indicator to assess gene and chromosomal allele asymmetries and to aid calls of genomic events in matched RNA and DNA sequencing datasets. Contact: P.M.Slowinski@exeter.ac.uk
... Local polynomial approximation (LPA) is a non-parametric estimation method proposed and developed in statistics for error reduction (Schoukens et al., 2013), for the processing of noise-distorted data (in one or several dimensions) (Katkovnik et al., 2002), and for the design of filters for transient, non-stationary and time-varying-frequency signals (Katkovnik et al., 2006; Lerga et al., 2008; Rojas et al., 2018). This article presents a method for noise reduction in measurements of the electrical signals produced by PDs, based on the local polynomial approximation (LPA) combined with the intersection of confidence intervals (ICI) algorithm, which solves the problem of selecting the window size and the optimal value of the confidence threshold. ...
... Originally, the LPA was proposed and applied in statistics for scalar processing and for the multidimensional analysis of noisy data (Katkovnik et al., 2002). In terms of the filtering process, the LPA makes it possible to determine approximate versions (estimates) of a signal y(t) in a localized manner, using a sliding window. ...
... To solve the problem of selecting the size of the window used in the LPA, the intersection of confidence intervals (ICI) algorithm can be applied; this is a mathematical method proposed and applied to the multidimensional analysis of noisy data (Katkovnik et al., 2002). This algorithm provides an adaptive selection of the window width and allows a finite set of bandwidths (the set H) to be included among its parameters, which must satisfy the following condition: ...
Article
Full-text available
This paper presents an alternative for noise reduction in electrical signals produced by partial discharges, PDs. The method is based on the local polynomial approximation (LPA) and its combination with the intersection of confidence intervals (ICI) algorithm. The method differs from other techniques by providing the best version of a signal from a mathematical and statistical analysis that does not depend on the calculation of its representation in the time-frequency plane. Likewise, an evaluation of the proposed method is done from experimentally obtained PD signals. Finally, the results are compared with those obtained by different filters (Fourier, FIR and discrete wavelets transform-DWT). This comparison shows that, although the methods based on the DWT provide good results, the LPA-ICI method provides signals with high signal-to-noise ratio (>70%), a high correlation (>0.899) and a low amplitude distortion (<3.5%).