Article

A new process analytical technology soft sensor based on electrical tomography for real-time monitoring of multiphase systems

Abstract

Background: Electrical tomography is widely recognized for its high time resolution and low cost. However, the implementation of electrical tomographic solutions has been hindered by the associated high computational overhead, which delays the analysis, and by numerical instability, which results in unclear reconstructed images. Therefore, it has mostly been applied offline, for qualitative tasks and with some delay; applications requiring fast response times and quantification have been hindered or ruled out. Results: In this article, we propose a new process analytical technology soft sensor that maps electrical tomography signals directly to the relevant parameter to be monitored. The data acquisition and estimation steps occur almost instantaneously, and the final accuracy is very good (R² = 0.994). Significance and novelty: The proposed methodology opens up good prospects for real-time quantitative applications. It was successfully tested on a pilot piping installation where the target property is the interface height between two immiscible fluids.
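As a rough illustration of the idea (not the authors' actual pipeline, which is not detailed here), the sketch below regresses a raw tomography measurement vector directly onto the monitored property with PLS, skipping image reconstruction entirely; the synthetic data, frame size and model choice are all assumptions.

```python
# A minimal sketch of the soft-sensor concept: regress boundary-voltage
# frames directly onto the target property (here, an interface height).
# Synthetic data stand in for real electrical tomography measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_frames, n_meas = 500, 104                 # assumed frame size for a 16-electrode ring
height = rng.uniform(0.1, 0.9, n_frames)    # interface height (target property)
W = rng.normal(size=n_meas)                 # assumed linear sensitivity of the measurements
V = np.outer(height, W) + 0.01 * rng.normal(size=(n_frames, n_meas))

X_tr, X_te, y_tr, y_te = train_test_split(V, height, random_state=0)
sensor = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R2 on held-out frames:", r2_score(y_te, sensor.predict(X_te)))
```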

References

Article
This paper presents a review of two-dimensional (2D) and three-dimensional (3D) electrical tomography (ET) hardware accelerators for real-time applications. While many recent review papers have discussed various algorithms for image reconstruction or acquisition systems, none of them has considered state-of-the-art hardware implementations of the associated image reconstruction algorithms to achieve real-time performance, especially for 3D ET, where the computation requirement is excessively high. 3D ET is useful in various applications such as robotics, autonomous vehicles, and process control, but it is computationally very expensive with respect to its 2D counterpart. Most implementations are based on single- or multi-core CPUs and, to a lesser extent, on either graphics processing units (GPUs) or field-programmable gate arrays (FPGAs). However, there is a clear gap between the currently available processors, whose computation power exceeds hundreds of tera-operations per second (TOPS) at reasonably low power consumption, and the ones recently used in ET systems. This gives great potential for next-generation ET systems to achieve real-time 2D and 3D ET reconstruction within a small form factor. The paper summarizes the most recent ET hardware systems with respect to reconstruction quality and processing frame rate, the reconstruction methods used, and optimization strategies and future directions.
Article
Understanding the behaviour of suspension flows continues to be a subject of great interest given its industrial relevance, regardless of the long time and effort dedicated to it by the scientific and industrial communities. Information about several flow characteristics, such as flow regime, relative velocity between phases, and spatial distribution of the phases, is essential for the development of accurate models describing processes involving pulp suspensions. Among the diverse non-invasive techniques for flow characterisation that have been reported in the literature for obtaining experimental data about suspension flow in different processes, Electrical Tomography is one of the most interesting, since it offers perhaps the best compromise among cost, portability and, above all, safety of handling (no radiation is required, which would otherwise demand special care). In this paper, a brief review and comparison of existing technologies for pulp suspension flow monitoring is presented, together with their strengths and weaknesses. Emphasis is given to Electrical Tomography because it offers the above-mentioned compromise and was thus the strategy adopted by the authors to characterise different flow processes (solid–liquid, liquid–liquid, fibres, etc.). The portable EIT system produced is described, and examples of results of its use for pulp suspension flow characterisation are reported and discussed.
Article
In this paper a number of LTspice simulations have been carried out on an Electrical Impedance Tomography (EIT) system, which includes the whole analog and digital circuitry as well as the subject to be examined (phantom model). The aim of this study is to show how the analog and digital parts, the electrodes and the subject's physical properties may impact the measurements and the quality of the reconstructed image. This could provide a useful tool for designing an EIT system. Special attention has been given to the current source's output impedance and swing, to the noise produced by the circuits and to the resolution and sampling rate of the analog-to-digital converters (ADCs). Furthermore, some 3D phantom subjects have been modeled and simulated as equivalent circuits, merged with the simulated EIT hardware, in order to observe how changes in their properties interact with the whole circuitry and affect the final result. Observations show that mirrored current sources with output impedance z_out > 350 kΩ and a sufficiently high ADC sampling rate (f_sample ≥ 16·f_in) can result in accurate impedance measurements and therefore quality image reconstruction within a frequency span of at least 10 to 100 kHz. Moreover, possible hardware failures (electrode disconnections and imbalanced contact impedances) can be detected by a simple examination of the first extracted image and measurement set, so that, by direct modification of the reconstruction process, a corrected result can be obtained.
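The two pass/fail design criteria quoted in this abstract are easy to encode; a toy helper, with the thresholds taken directly from the text:

```python
# A toy check of the two design criteria reported above (thresholds taken
# straight from the abstract): current-source output impedance and ADC
# sampling rate relative to the injected signal frequency.
def eit_frontend_ok(z_out_ohm: float, f_sample_hz: float, f_in_hz: float) -> bool:
    """True if z_out > 350 kOhm and f_sample >= 16 * f_in."""
    return z_out_ohm > 350e3 and f_sample_hz >= 16 * f_in_hz

print(eit_frontend_ok(500e3, 1.6e6, 100e3))   # True
print(eit_frontend_ok(200e3, 1.6e6, 100e3))   # False: output impedance too low
```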
Article
This paper presents a new approach to analyzing measurement records from industrial processes. The proposed methodology is based on the model of contextual processing and uses big data from experimental process tomography datasets. Electrical capacitance tomography (ECT) is used for non-invasive flow monitoring and data acquisition. The measurement data are collected, stored and processed to identify process regimes and process threats. A specific physical modification was introduced into the pneumatic conveying flow rig in order to study flow behaviour under extreme conditions, extending the available knowledge base. A support vector machine (SVM) was applied for data classification. This study illustrates how contextual processing can facilitate data interpretation and opens the way for the development of methods for detecting pre-emergency flow patterns.
Article
Flow visualization and characterization of multiphase flows have been the quest of many fluid mechanicians. The process is fairly straightforward only when there is good optical access (i.e., the vessel is not opaque or there are appropriate viewing ports) and the flow is transparent, implying a very low volume fraction of the dispersed phase; however, when optical access is not good or the fluid is opaque, alternative methods must be developed. Several different noninvasive visualization tools have been developed to provide high-quality qualitative and quantitative data on various multiphase flow characteristics, and overviews of these methods have appeared in the literature. X-ray imaging is one family of noninvasive measurement techniques used extensively for product testing and evaluation of static objects with complex structures. X-rays can also be used to visualize and characterize multiphase flows. This paper provides a review of the current status of X-ray flow visualization and details various X-ray flow visualization methods that can provide qualitative and quantitative information about the characteristics of complex multiphase flows. [DOI: 10.1115/1.4004367]
Article
Despite decades of research, the study of suspension flows continues to be a subject of great scientific interest. In the development of accurate models for suspension-related processes, prior knowledge of several flow characteristics is essential, such as the spatial distribution of phases, flow regime, relative velocity between phases, etc. Several non-invasive techniques for flow characterisation can be found in the literature; however, electrical tomography offers a vast field of possibilities due to its low cost, portability and, above all, safety of handling. In this paper, a review of the use of electrical tomography for industrial/process monitoring purposes will be presented, giving information about its evolution throughout the years and about the limitations and advantages of the different configurations. Moreover, the signal de-convolution strategies used to obtain images of the process will also be discussed. The most recent advances in both fields will be presented. Additionally, the strategy adopted by the authors to produce a portable EIT system will be described. Finally, the future challenges for electrical tomography will be addressed.
Article
Principal component analysis is one of the most important and powerful methods in chemometrics as well as in a wealth of other areas. This paper provides a description of how to understand, use, and interpret principal component analysis. The paper focuses on the use of principal component analysis in typical chemometric areas but the results are generally applicable.
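For readers who want the mechanics behind the tutorial, a minimal PCA via the SVD of the mean-centred data matrix (synthetic data, illustration only):

```python
# A minimal PCA sketch (centred SVD), showing scores, loadings and the
# per-component explained variance described in the tutorial above.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=50)   # build in some correlation

Xc = X - X.mean(axis=0)                          # mean-centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                   # T = U S (sample coordinates)
loadings = Vt.T                                  # P (variable directions)
explained = s**2 / np.sum(s**2)
print("variance explained per PC:", np.round(explained, 3))
```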
Article
Different approaches have been followed to model the hydraulic transport of particles, ranging from purely empirical correlations to general models based on fundamental principles. However, these models suffer from uncertainties associated with the parameters in the constitutive equations and from the scarcity of experimental data in the literature. Non-intrusive techniques such as Electrical Impedance Tomography (EIT) can be used to circumvent the difficulties associated with sampling techniques. EIT is an imaging technique for the phase distribution in a two-phase flow field, allowing reconstruction of the resistivity/conductivity distribution from electrical data in a medium subjected to arbitrary excitations. Our efforts were concentrated on the development of a new EIT system that is analogue-based, portable, low-cost and capable of providing high-quality, sharp images when used to characterize the flow of particle suspensions. A voltage source was used, rather than a more complex and costly current source, since it provided the EIT system with a more precise and flexible current output. The data acquisition system consists of 16 electrodes equally spaced on the boundary of a tube and a custom dedicated electronic apparatus. The software supplies results in the form of 2D reconstructed images that allow mapping the phase distribution inside the tube.
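The abstract does not state the measurement protocol; assuming the common adjacent (neighbouring) strategy for a 16-electrode ring, the pattern enumeration looks like this sketch:

```python
# A sketch of the adjacent (neighbouring) protocol often used with
# 16-electrode EIT rings; whether the authors used this protocol is an
# assumption made purely for illustration.
def adjacent_protocol(n_electrodes: int = 16):
    """Return (drive pair, measure pair) tuples, skipping driven electrodes."""
    patterns = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            meas = (m, (m + 1) % n_electrodes)
            if set(drive) & set(meas):
                continue                  # never measure on a driven electrode
            patterns.append((drive, meas))
    return patterns

print(len(adjacent_protocol(16)))         # 16 x 13 = 208 raw measurements
```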
Article
The thermodynamics of the general system of two immiscible electrolytes in the presence of an electric field depends strongly on the distribution of ions near the liquid interface. Here, we calculate the corresponding electrostatic potential difference, excess surface tension, and differential capacity via Monte Carlo simulations, which include ion correlations and polarization effects, and via a modified nonlinear Poisson–Boltzmann theory. Macroscopically, we find good agreement between our results and experimental data without needing any fitting parameter. At higher salt concentrations, charge overcompensation in the lower-permittivity region is observed, which results in a local inversion of the electric field accompanied by charge inversion near the interface. We find that these interesting phenomena are mainly driven by the excluded-volume effects associated with large organic ions in the oil phase, although polarization effects and between-layer ion correlations have a significant impact in the adsorption of ions close to the liquid interface. In addition, our Monte Carlo simulations predict that the differential capacity is maximal at the point of zero charge, in contrast with the classical Poisson–Boltzmann theory results.
Article
This paper provides a practical guide to variable selection in chemometrics with a focus on regression-based calibration models. Several approaches, such as genetic algorithms (GAs), jack-knifing and forward selection, are explained, as well as how to choose between different kinds of variable selection methods. The emphasis in this paper is on how to use variable selection in practice and avoid the most common pitfalls.
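A compact sketch of one of the methods named above, forward selection driven by cross-validated R² (hypothetical data; real use would add the safeguards against overfitting that the paper discusses):

```python
# Greedy forward selection: add the variable that most improves cross-
# validated R2 and stop when no remaining candidate helps.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
y = 2 * X[:, 3] - X[:, 7] + 0.5 * rng.normal(size=100)

selected, best = [], -np.inf
while True:
    candidates = [j for j in range(X.shape[1]) if j not in selected]
    if not candidates:
        break
    scores = {j: cross_val_score(LinearRegression(),
                                 X[:, selected + [j]], y, cv=5).mean()
              for j in candidates}
    j_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best:                    # no candidate improves the score
        break
    selected.append(j_best)
    best = s_best
print("selected variables:", selected, "CV R2:", round(best, 3))
```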
Article
In the petrochemical industry, the product quality reflects the commercial and operational performance of a manufacturing process. However, real-time measurement of product quality is generally difficult. Online prediction of quality using readily available, frequent process measurements would be beneficial in terms of operation and quality control. In this paper, a novel soft sensor technology based on partial least squares (PLS) regression is developed and applied to a refining process for quality prediction. The modeling process is described, with emphasis on data preprocessing, multivariate outlier detection and variable selection. An enhancement of the PLS strategy to take into account the dynamics in the process data is also discussed. The proposed approach is applied to data from a refining process, and the performance of the resulting soft sensor is evaluated by comparison with laboratory data and analyzer measurements.
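One common way to give a PLS soft sensor the dynamic awareness this abstract mentions is to augment X with time-lagged copies of the measurements; a sketch under assumed data and lag choices:

```python
# Augment X with lagged copies so a linear PLS model can capture process
# dynamics; variable meanings and the lag count are illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def add_lags(X: np.ndarray, n_lags: int) -> np.ndarray:
    """Stack X(t), X(t-1), ..., X(t-n_lags) column-wise (rows are trimmed)."""
    blocks = [X[n_lags - k : X.shape[0] - k] for k in range(n_lags + 1)]
    return np.hstack(blocks)

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 4))              # frequent process measurements
y = X[:, 0] + np.roll(X[:, 1], 1)          # quality depends on a past value
y[0] = y[1]                                # discard the wrap-around sample

X_dyn = add_lags(X, n_lags=2)              # rows 2..199 with lags 0, 1, 2
model = PLSRegression(n_components=4).fit(X_dyn, y[2:])
print("fitted R2:", model.score(X_dyn, y[2:]))
```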
Article
The possibility of reconstructing the velocity structure of inspected objects with a high spatial resolution and high sensitivity in ultrasonic tomographic nondestructive testing within the framework of a wave model has been demonstrated in a real experiment. In this study, a scheme of a tomographic experiment with rotation is proposed, which provides sounding of an inspected object from multiple sides. A tomographic scheme of the experiment with linear antenna transducer arrays operating at a frequency of ∼5 MHz was used. The experiment was conducted on dedicated samples including inserts with different sound propagation velocities. It was shown that the velocity structure and the boundaries of inserts can be reconstructed in the transmission and the reflection schemes. The reconstruction of the velocity structure was formulated as a nonlinear coefficient inverse problem for a scalar wave equation. Efficient iterative methods for its solution on a supercomputer were developed using direct formulas to compute the gradient of the residual functional between the computed and experimentally measured wave field at the detectors. Nonlinearity of the inverse problem of ultrasound tomography leads to multiple local minima of the residual functional. A two-stage iterative method was used for velocity reconstruction. The transmission scheme enables a spatial resolution of approximately 1 mm with a velocity contrast of 2%.
Article
In spite of decades of study and investigation, research on tomography, and on electrical resistance tomography (ERT) in particular, remains a focus of immense scientific interest. ERT provides the ability to measure the conductivity distribution inside a process plant and delivers time-evolving multidimensional information. Such important and otherwise inaccessible information enhances critical process knowledge whilst improving the design and function of process equipment. ERT has been employed in a variety of fields, including chemical engineering. This paper reviews previous research carried out on the application of ERT within the chemical engineering arena. The applications are classified based on the objective of the ERT measurements, the unit operations ERT has been utilized on, the media under examination, and the other technologies and data processing techniques used in combination with ERT. The objective of this taxonomy is to offer the reader a broad insight into the current state of ERT-related research and developed applications in the chemical engineering field, and to assist in the identification of research gaps for future investigation.
Article
Spectral AutoML is a platform for the fast development of PAT soft sensors that considers the combined effect of pre-processing, band selection, band-wise resolution definition, hyper-parameter tuning and model estimation. Spectral AutoML was compared with models developed under the classic paradigm, and their performance was assessed on an independent test set. The validation study concerns the prediction of 12 different diesel fuel properties using FTIR-ATR spectra. The proposed framework led to clearly better predictions for 8 of the 12 properties and minor improvements for 3 properties. The Spectral AutoML results were obtained overnight, without interfering with the daily work of the users, while the benchmark models resulted from several months of work and fine-tuning of the methods. The results demonstrate the added value of the proposed Spectral AutoML approach in terms of prediction accuracy, model development time and reduced dependence on resident experts.
Article
A systematic approach for advanced soft sensor development was applied to predict free fatty acid (FFA) content from NIR spectra under real plant conditions, namely in process streams from a biofuel-producing unit integrating raw materials with time-varying complex matrices (e.g., waste cooking oils, WCOs). The proposed methodology systematically screened 52 combinations of preprocessing and inferential modeling methods, including current state-of-the-art predictive methodologies and the recently proposed multiresolution soft sensors. The models' prediction capabilities were compared through a rigorous framework based on Monte Carlo double cross-validation and statistical hypothesis testing. This study used 119 samples with FFA content in the range of 0.1030% to 5.6740%. The best model is based on the novel multiresolution soft sensor framework. This model had a coefficient of determination (R²) of 0.9792 and a prediction root mean squared error of 0.1187 (13.6% lower than the best model based on standard modelling methodologies). The proposed approach can easily be replicated in other scenarios where soft sensors based on Process Analytical Technology (PAT) are to be developed.
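A sketch of Monte Carlo double cross-validation as an evaluation protocol of the kind described: repeated random outer splits, with hyper-parameters tuned only on the training portion by an inner CV (data and grid are hypothetical):

```python
# Monte Carlo double cross-validation: the outer loop estimates test error
# over repeated random splits; the inner GridSearchCV tunes n_components
# using the training part only, so tuning never sees the test data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)
X = rng.normal(size=(150, 20))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=150)

outer_rmse = []
for rep in range(20):                              # Monte Carlo repetitions
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=rep)
    inner = GridSearchCV(PLSRegression(), {"n_components": range(1, 8)}, cv=5)
    inner.fit(X_tr, y_tr)                          # inner CV picks the model
    rmse = mean_squared_error(y_te, inner.predict(X_te)) ** 0.5
    outer_rmse.append(rmse)
print("RMSE: %.3f +/- %.3f" % (np.mean(outer_rmse), np.std(outer_rmse)))
```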
Article
The Research Octane Number (RON) is a key parameter for specifying gasoline quality. It assesses the ability to resist engine knocking as the fuel burns in the combustion chamber. In this work we address the critical but complex problem of predicting RON using real process data in the context of a catalytic reforming process from a petrochemical refinery. We considered data collected from the process over an extended period of time (21 months). RON measurements are obtained offline, by laboratory analysis, with a significant delay and at much lower rates than process measurements. The proposed workflow covers everything from data collection, cleaning and pre-processing to data-driven modelling, analysis and validation for a real industrial refinery located in Portugal. The accuracy achieved with the best soft sensors opens up prospects for industrial applications, and the results obtained also provide relevant information about the main RON variability sources.
Article
Measuring in real time the two-phase flow composition of a mixed fluid having a high gas void fraction (GVF) remains a challenging task in oil and gas fields. Such fluids are abundant in gas pipelines, where pressure and temperature fluctuations lead to condensate gas. This may also be the case for crude oil produced from CO₂- or steam-based enhanced oil recovery, where the injected gas is mixed with the produced oil. This article presents a new concept for high-GVF measurement and flow regime determination using a terahertz-based imaging system. It exploits the fact that the gas phase has very low absorption of THz waves, while the liquid yields an absorption factor that is proportional to its amount. The recent availability of low-cost THz imaging systems that can generate two-dimensional images at more than 100 frames/s makes them well suited for flow metering applications. Two different artificial intelligence algorithms, namely support vector machines (SVM) and artificial neural networks (ANN), were assessed using an in-house multiphase flow loop. The corresponding results reveal that while both ANN and SVM yield very accurate results, the SVM technique performed slightly better, achieving a maximal GVF error of 0.46% over the GVF range from 80% to 100%. In addition, it could accurately determine all three types of flow regime (i.e., annular, stratified, or slug flow). This suggests that the technique can be considered a good candidate for next-generation flow metering and imaging of multiphase flows.
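The measurement principle lends itself to a simple simulation: absorption grows with the liquid fraction, and an SVM and a small neural network are regressed on flattened frames. Everything below (frame size, models, noise) is an assumption for illustration only:

```python
# Toy version of the paper's comparison: regress GVF on flattened "THz
# frames" whose mean absorption scales with the liquid fraction (1 - GVF).
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n, px = 400, 16 * 16
gvf = rng.uniform(0.80, 1.00, n)                    # gas void fraction targets
frames = (1 - gvf)[:, None] * rng.uniform(0.5, 1.5, (n, px))  # absorption ~ liquid

X_tr, X_te, y_tr, y_te = train_test_split(frames, gvf, random_state=0)
for name, model in [("SVM", SVR(C=10.0, epsilon=0.001)),
                    ("ANN", MLPRegressor(hidden_layer_sizes=(32,),
                                         max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    max_err = np.max(np.abs(model.predict(X_te) - y_te))
    print(f"{name}: max absolute GVF error = {max_err:.4f}")
```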
Article
The optimized operation of modern analytical instrumentation is a critical but complex task. It involves the simultaneous consideration of a large number of factors, both qualitative and quantitative, where multiple responses should be quantified and several goals need to be adequately pondered, such as global quantification performance, selectivity, and cost. Furthermore, the problem is highly case specific, depending on the type of instrument, the target analytes, and the media where they are dispersed. Therefore, an optimization procedure should be conducted frequently, which implies that it should be efficient (requiring a low number of experiments), as simple as possible (from experimental design to data analysis) and informative (interpretable and conclusive). The success of this task is fundamental for achieving the scientific goals and for justifying, in the long run, the high economic investments made and the significant costs of operation. In this article, we present a systematic optimization procedure for the prevalent class of situations where multiple responses are available regarding a family of chemical compounds (instead of a single analyte). This class of problems leads to responses exhibiting mutual correlations, for which, furthermore, several goals need to be simultaneously considered. Our approach explores the latent variable structure of the responses, created by the chemical affinities of the compounds under analysis, and the orthogonality of the interpretable extracted components to conduct their simultaneous optimization with respect to different analysis goals. The proposed methodology was applied to a real case study involving the quantification of a family of analytes with impact on wine aroma.
Article
In the big data and Manufacturing 4.0 era, there is a growing interest in using advanced analytical platforms to develop predictive modeling approaches that take advantage of the wealth of data available. Typically, practitioners have their own favorite methods for addressing the modeling task, as a result of their technical background, past experience or the software available, among other possible reasons. However, the importance of this task in the future justifies and requires more informed decisions about the predictive solution to adopt. Therefore, a wider variety of methods should be considered and assessed before taking the final decision. Having passed through this process many times and in different application scenarios (chemical industry, biofuels, drink and food, shipping industry, etc.), the authors developed a software framework that is able to speed up the selection process while securing a rigorous and robust assessment: the Predictive Analytics Comparison framework (PAC). PAC is a systematic and robust framework for model screening and development that was developed in Matlab, but its implementation can be carried out on other software platforms. It comprises four essential blocks: i) Analytics Domain; ii) Data Domain; iii) Comparison Engine; iv) Results Report. PAC was developed for the case of a single response variable, but can be extended to multiple responses by considering each one separately. Some case studies are presented in this article in order to illustrate PAC's efficiency and robustness for problem-specific method screening in the absence of prior knowledge. For instance, the analysis of a real-world dataset reveals that, even when addressing the same predictive problem and using the same response variable, the best modeling approach may not be the one foreseen a priori and may not even be always the same when different predictor sets are used. With increasing frequency, situations like these raise considerable challenges for practitioners, underlining the importance of having a tool such as PAC to assist them in making more informed decisions and to benefit from the availability of data in Manufacturing 4.0 environments.
Article
The quality of information generated in data-driven empirical studies is of central importance in Industry 4.0. However, despite its undeniable and widely accepted importance, insufficient attention has been devoted to its rigorous assessment and analysis. Consequently, if information quality cannot be measured, it also cannot be improved, and therefore current efforts for extracting value from big data empirical studies and data collectors are exposed to the risk of generating limited findings and insights, leading to suboptimal solutions. In this article we describe and apply a framework for evaluating, analysing and improving the quality of information generated in empirical studies, called InfoQ, in the context of the Chemical Processing Industry (CPI). This systematic framework can be used by anyone involved in data-driven activities, irrespective of the context and specific goals. The application of the InfoQ framework to several case studies is described in detail in order to illustrate its practical relevance.
Article
Data-driven models used in soft sensor applications are expected to capture the dominant relationships between the different process variables and the outputs, while accounting for their high-dimensional, dynamic and multiresolution character. While the first two characteristics are often addressed, the multiresolution aspect is usually disregarded and confused with a multirate scenario: multiresolution occurs when variables have different levels of granularity due to, for instance, automatic averaging operations over certain time windows; on the other hand, a multirate structure is caused by the existence of different sampling rates, but the granularity of the recorded values is the same. This has two major and immediate implications. Firstly, current methods are unable to handle variables with different resolutions in a consistent and rigorous way, since they tacitly assume that data represent instant observations and not averages over time windows. Secondly, even if data are available at a single resolution (i.e., all variables with the same granularity), it is not guaranteed that the native resolution of the predictors is the most appropriate for modeling. Therefore, soft sensor development must address not only the selection of the best set of predictors to be included in the model, but also the optimum resolution to adopt for each predictor. In this work, two novel multiresolution frameworks for soft sensor development are proposed, MRSS-SC and MRSS-DC, which actively introduce multiresolution into the data by searching for the best granularity for each variable. The performance of these methodologies is comparatively assessed against current single-resolution counterparts. The optimized multiresolution soft sensors are bound to be at least as good as their single-resolution versions, and the results confirm that they almost always perform substantially better.
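The core multiresolution move, re-expressing a predictor at coarser granularities by window-averaging and searching for the best window, can be sketched as follows (the MRSS-SC/MRSS-DC machinery itself is not reproduced):

```python
# Re-express a predictor at a coarser granularity via a trailing average,
# then pick the window giving the best cross-validated fit. Synthetic data;
# the "true" resolution is planted at window = 8.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def to_resolution(x: np.ndarray, window: int) -> np.ndarray:
    """Replace each value by the trailing average over `window` samples."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="full")[: len(x)]

rng = np.random.default_rng(6)
x_fine = rng.normal(size=500)
y = to_resolution(x_fine, 8) + 0.1 * rng.normal(size=500)

for window in (1, 2, 4, 8, 16, 32):
    X = to_resolution(x_fine, window).reshape(-1, 1)
    score = cross_val_score(LinearRegression(), X, y, cv=5).mean()
    print(f"window={window:2d}  CV R2={score:.3f}")
```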
Article
The key quality features of industrial processes are typically obtained offline, with a considerable delay, and through recourse to expensive equipment. To avoid this experimental burden, soft sensors have been developed. However, current methodologies assume that all variables under analysis have the same level of granularity, while in reality they often present a multiresolution structure, with some variables containing instantaneous information about the process and others representing averages over hours, shifts, or production batches. Furthermore, this multiresolution structure is quite often confused with a multirate or multiscale problem, and therefore a clear definition of their main differences is highlighted. Based on this distinction, we propose a new class of model structures that explicitly incorporates multiresolution in industrial soft sensors. Several simulated examples demonstrate that the proposed approach leads to better prediction capabilities than its counterparts and is also robust to mismatches between the modelling assumptions and the actual multiresolution structure of the data.
Article
This paper presents a new hardware algorithm for real-time and nonlinear 2-D electrical capacitance tomography (ECT) imaging, along with its parallel hardware architecture. A potential application of this system is to reconstruct in real time the cross-sectional image of a two-phase fluid with different dielectric constants as it passes through a given section of a pipeline. The proposed hardware algorithm explores the spatial correlation that may occur between one or several consecutive frames. It uses this property to reformulate the classical regularized forward and inverse problems, which are very time-consuming and prevent their use in fast ECT applications. As a result, the proposed algorithm features a single-step iteration with a substantial reduction of the size of the Jacobian matrix. In addition, the intrinsically parallel nature of the algorithm makes it suitable for a parallel hardware architecture. This architecture is based on a pipelined multiprocessor architecture using advanced features of field-programmable gate array technology. It exploits the variable bit-width and floating-point multiplier arrays available in the digital signal processor blocks to cooperatively perform the partial matrix product with associated arithmetic and logic units and distributed memory. The experimental results obtained on a two-phase flow loop demonstrate the capability of the system to build in real time and with good accuracy (e.g., less than 3% error) the cross-sectional image of the fluid passing through the pipeline. Around 560 frames of 4096 moving-boundary pixels can be reconstructed per second using IEEE 754 floating-point data representation and a clock frequency of 400 MHz, for a total power consumption of less than 33 W.
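The single-step linearized update that such accelerators build on can be illustrated with a Tikhonov-regularized least-squares operator applied to one measurement frame; the sensitivity matrix below is random, purely for shape, and all sizes are assumptions:

```python
# One-step linearized reconstruction: precompute R = (J^T J + aI)^-1 J^T
# once, then each capacitance frame maps to an image by a single product.
import numpy as np

rng = np.random.default_rng(7)
n_meas, n_pix = 66, 812                  # e.g. a 12-electrode ECT ring (assumed)
J = rng.normal(size=(n_meas, n_pix))     # stand-in sensitivity (Jacobian) matrix
alpha = 1e-2                             # regularization strength

R = np.linalg.solve(J.T @ J + alpha * np.eye(n_pix), J.T)

x_true = np.zeros(n_pix); x_true[100:140] = 1.0        # a synthetic inclusion
frame = J @ x_true + 0.01 * rng.normal(size=n_meas)    # one measurement frame
image = R @ frame                                      # one-step reconstruction
print("inclusion recovered above background:",
      image[100:140].mean() > image.mean())
```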
Article
In this paper, a new system that improves the image obtained by an array of ultrasonic sensors using Electrical Resistance Tomography (ERT) is presented. One of its target applications is the automatic exploration of soft tissues, where different organs and possible anomalies simultaneously exhibit different electrical conductivities and different acoustic impedances. The exclusive use of the ERT technique usually leads to significant uncertainties around the regions' boundaries and usually generates images with relatively low resolution. The proposed method shows that by properly combining this technique with an ultrasonic-based method, which can provide good localization of some edge points, the accuracy of the shape of individual cells can be improved if these edge points are used as constraints during the inversion procedure. The performance of the proposed reconstruction method was assessed by conducting extensive tests on simulated phantoms which mimic soft tissues. The obtained results clearly show that this method outperforms single-modality techniques that use either ultrasound or ERT imaging alone.
Article
Gas dispersion in horizontal developing pipe flow downstream of a 90°-tee mixer and an in-line mechanical mixer was investigated by means of electrical resistance tomography (ERT) for both water and softwood kraft pulp suspensions over a range of fibre mass concentrations (0–3.0%), superficial liquid/pulp velocities (0.5–5.0 m/s) and superficial gas velocities (0.11–0.44 m/s). A gas mixing index, derived from the standard deviation of local gas holdup in each image pixel, quantified the uniformity of gas in cross-sectional planes along the pipe for various flow patterns. The distribution of the gas phase in a cross-section was determined from the vertical gas holdup profiles, and the relative size of gaseous entities for different flow conditions was evaluated based on the scale of segregation. For air–water flow, the gas uniformity and size of gaseous entities depended strongly on the flow pattern. For pulp fibre suspensions, the gas flow varied significantly depending on the flow regime. Mixing was similar to that in water when the flow was turbulent for dilute suspensions, but differed greatly for higher mass concentrations, likely due to robust fibre networks of a plug in the core of the pipe causing bubbles to concentrate near the wall and accelerating coalescence. The impeller disrupted the plug and distributed gas throughout the cross-section, leading to significantly improved gas uniformity in the high-shear zone around the impeller, but decaying turbulence and re-establishment of fibre networks caused bubbles to coalesce at the top of the pipe and worse mixing downstream.
Article
In permittivity distribution reconstruction using electrical capacitance tomography (ECT), it is usually required to divide the image area into a finite number of elements. Since finer meshes lead to more accurate results at the cost of slower reconstruction, a good tradeoff is usually sought by researchers. In this paper, a new method for reconstructing the image area in a hierarchical manner is proposed. It consists of gradually localizing the regions of interest which hold the inhomogeneous phases by refining the pixels only around their boundaries. To further improve the reconstructed images, this paper suggests a new ECT device consisting of a multitude of miniaturized pressure and temperature sensors distributed at different locations of a cross section of a pipeline (in addition to the electrical electrodes surrounding the pipe). Using these sensors, an estimation of the density distribution of the process across a section of the pipeline can be performed using the Bernoulli equation. These density data are then used as a hard constraint for the forward and inverse problems, which use the data acquired from the electrical electrodes. Experimental results on synthetic and real images show that the proposed scheme improves the accuracy and the quality of the reconstructed images while keeping the computation time significantly lower than that of other traditional methods.
Article
A well-defined variance of reconstruction error (VRE) is proposed to determine the number of principal components in a PCA model for best reconstruction. Unlike most other methods in the literature, the proposed VRE method has a guaranteed minimum over the number of PCs corresponding to the best reconstruction. Therefore, it avoids the arbitrariness of other methods with monotonic indices. The VRE can also be used to remove variables that are little correlated with others and cannot be reliably reconstructed from the correlation-based PCA model. The effectiveness of this method is demonstrated with a simulated process.
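In the spirit of the VRE criterion (the exact weighting of the original paper is not reproduced here), each variable can be reconstructed from the others through the PCA residual projector and the error variance tracked against the number of PCs; a sketch on synthetic data with three true factors:

```python
# Reconstruction-based selection of the PCA dimension: for each candidate
# number of PCs l, reconstruct every variable from the others via the
# residual-space projector C = I - P_l P_l^T and sum the error variances.
import numpy as np

rng = np.random.default_rng(8)
T = rng.normal(size=(300, 3))                    # 3 true latent factors
P = rng.normal(size=(8, 3))
X = T @ P.T + 0.1 * rng.normal(size=(300, 8))    # 8 observed variables
X = (X - X.mean(0)) / X.std(0)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
for l in range(1, 7):
    P_l = Vt[:l].T
    C = np.eye(8) - P_l @ P_l.T                  # residual-space projector
    err = 0.0
    for i in range(8):
        # x_hat_i = -sum_{k != i} C_ik x_k / C_ii minimizes the SPE over x_i
        x_hat = -(X @ C[:, i] - X[:, i] * C[i, i]) / C[i, i]
        err += np.var(x_hat - X[:, i])
    print(f"l={l}: total reconstruction error variance = {err:.3f}")
```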
Article
Partial Least Squares (PLS) is by far the most popular regression method for building multivariate calibration models for spectroscopic data. However, the success of the conventional PLS approach depends on the availability of a 'representative data set' as the model needs to be trained for all expected variation at the prediction stage. When the concentration of the known interferents and their correlation with the analyte of interest change in a fashion which is not covered in the calibration set, the predictive performance of inverse calibration approaches such as conventional PLS can deteriorate. This underscores the need for calibration methods that are capable of building multivariate calibration models which can be robustified against the unexpected variation in the concentrations and the correlations of the known interferents in the test set. Several methods incorporating 'a priori' information such as pure component spectra of the analyte of interest and/or the known interferents have been proposed to build more robust calibration models. In the present study, four such calibration techniques have been benchmarked on two data sets with respect to their predictive ability and robustness: Net Analyte Preprocessing (NAP), Improved Direct Calibration (IDC), Science Based Calibration (SBC) and Augmented Classical Least Squares (ACLS) Calibration. For both data sets, the alternative calibration techniques were found to give good prediction performance even when the interferent structure in the test set was different from the one in the calibration set. The best results were obtained by the ACLS model incorporating both the pure component spectra of the analyte of interest and the interferents, resulting in a reduction of the RMSEP by a factor 3 compared to conventional PLS for the situation when the test set had a different interferent structure than the one in the calibration set.
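The classical-least-squares backbone of methods such as ACLS is short to write down: with the pure spectra of the analyte and the known interferents as rows of S, concentrations follow from a per-sample least-squares fit (synthetic Gaussian bands below, illustration only):

```python
# Classical least squares with known pure component spectra: X = C S plus
# noise, so concentrations are recovered by lstsq on each measured spectrum.
import numpy as np

wl = np.linspace(0, 1, 200)
def band(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

S = np.vstack([band(0.3, 0.05),              # analyte of interest
               band(0.5, 0.08),              # known interferent 1
               band(0.7, 0.04)])             # known interferent 2

rng = np.random.default_rng(9)
C_true = rng.uniform(0, 1, size=(20, 3))     # concentrations of 20 mixtures
X = C_true @ S + 0.002 * rng.normal(size=(20, 200))

C_hat, *_ = np.linalg.lstsq(S.T, X.T, rcond=None)   # solve S^T c = x per sample
print("max concentration error:", np.abs(C_hat.T - C_true).max())
```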
Article
A condensed review of recent advances accomplished in the development and the applications of noninvasive tomographic and velocimetric measurement techniques to multiphase flows and systems is presented. In recent years utilization of such noninvasive techniques has become widespread in many engineering disciplines that deal with systems involving two immiscible phases or more. Tomography provides concentration, holdup, or 2D or 3D density distribution of at least one component of the multiphase system, whereas velocimetry provides the dynamic features of the phase of interest such as the flow pattern, the velocity field, the 2D or 3D instantaneous movements, etc. The following review is divided into two parts. The first part summarizes progress and developments in flow imaging techniques using γ-ray and X-ray transmission tomography; X-ray radiography; neutron transmission tomography and radiography; positron emission tomography; X-ray diffraction tomography; nuclear magnetic resonance imaging; electrical capacitance tomography; optical tomography; microwave tomography; and ultrasonic tomography. The second part of the review summarizes progress and developments in the following velocimetry techniques:  positron emission particle tracking; radioactive particle tracking; cinematography; laser-Doppler anemometry; particle image velocimetry; and fluorescence particle image velocimetry. The basic principles of tomography and velocimetry techniques are outlined, along with advantages and limitations inherent to each technique. The hydrodynamic and structural information yielded by these techniques is illustrated through a literature survey on their successful applications to the study of multiphase systems in such fields as particulate solids processes, fluidization engineering, porous media, pipe flows, transport within packed beds and sparged reactors, etc.
Article
This paper details the development of a tomographic technique for imaging gas–solid flow distributions in pneumatic conveying pipelines. The technique utilizes ultrasonic transmission-mode measurements constrained to the megahertz region. Image reconstruction is performed by an efficient backprojection method implemented with standard graphics algorithms. Simulated reconstructions of dense and dilute distributions are presented. These results demonstrate the capabilities and limitations of the technique. Aspects of transducer array design are also addressed. An optimal arrangement for imaging dense-phase flow distributions is derived, and the characteristics of air-loaded and water-loaded, matched and unmatched piezoceramic transducers are evaluated. The validity of the technique is demonstrated using a low-frequency (72 kHz) system constructed with prototype fan-shaped-beam electrostatic transducers. Further development of the technique for practical application is discussed.
Article
A review of existing and developing process tomographic instrumentation suitable for characterising dry and wet particulate systems is presented. Factors governing the selection of sensing techniques appropriate for static and dynamic imaging of a wide range of single and multiphase particulate processes are discussed. The paper presents a systematic comparison of different image reconstruction methods. Examples of existing, developing and proven applications are cited. Future needs and ways in which these needs can be met are suggested, including the use of multi-modality methods in which different types of sensing methods are embodied in a single tomographic instrument.
Article
A simple way to develop non-linear PLS models is presented: INLR (implicit non-linear latent variable regression). The paper shows that by simply adding squared x-variables x_a^2, both the square and cross terms of the latent variables are implicitly included in the resulting PLS model. This approach works when X itself is well modelled by a projection model X ≈ TP^T. Hence, if a latent structure is present in X, it is not necessary to include the cross terms of the X-variables in the polynomial expansion. Analogously, for cubic non-linearities, expanding X with cubic terms x_a^3 is sufficient. INLR is attractive in that all essential features of PLS are preserved, i.e. (a) it can handle many noisy and collinear variables, (b) it is stable and gives reliable results, and (c) all PLS plots and diagnostics still apply. The principles of INLR are outlined and illustrated with three chemical examples where INLR improved the modelling and predictions compared with ordinary linear PLS.
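INLR is simple enough to sketch directly: append the squared x-variables to X and fit ordinary linear PLS (synthetic curved response, for illustration):

```python
# INLR as described above: augment X with its squared terms so a linear PLS
# model implicitly carries square and cross terms of the latent variables.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(10)
X = rng.normal(size=(120, 5))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=120)  # curved response

X_inlr = np.hstack([X, X**2])                 # augment with squared x-variables
linear = PLSRegression(n_components=3).fit(X, y)
inlr = PLSRegression(n_components=3).fit(X_inlr, y)
print("linear PLS R2:", round(linear.score(X, y), 3))
print("INLR PLS R2:  ", round(inlr.score(X_inlr, y), 3))
```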
Article
With the development of measurement instrumentation methods and metrology, one is very often able to rigorously specify the uncertainty associated with each measured value (e.g. concentrations, spectra, process sensors). The use of this information, along with the corresponding raw measurements, should, in principle, lead to more sound ways of performing data analysis, since the quality of data can be explicitly taken into account. This should be true, in particular, when noise is heteroscedastic and of a large magnitude. In this paper we focus on alternative multivariate linear regression methods conceived to take into account data uncertainties. We critically investigate their prediction and parameter estimation capabilities and suggest some modifications of well-established approaches. All alternatives are tested under simulation scenarios that cover different noise and data structures. The results thus obtained provide guidelines on which methods to use and when. Interestingly enough, some of the methods that explicitly incorporate uncertainty information in their formulations tend not to perform as well in the examples studied, whereas others that do not do so present an overall good performance.
Article
Multivariate monitoring and control schemes based on latent variable methods have been receiving increasing attention from industrial practitioners over the last 15 years. Several companies have enthusiastically adopted the methods and have reported many success stories. Applications have been reported where multivariate statistical process control, fault detection and diagnosis are achieved by utilizing the latent variable space, for continuous and batch processes, as well as for process transitions such as start-ups and restarts. This paper gives an overview of the latest developments in multivariate statistical process control (MSPC) and its application to fault detection and isolation (FDI) in industrial processes. It provides a critical review of the methodology and describes how it is transferred to the industrial environment. Recent applications of latent variable methods to process control as well as to image analysis for monitoring and feedback control are discussed. Finally, it is emphasized that the multivariate nature of the data should be preserved when data compression and data preprocessing are applied. It is shown that univariate data compression and reconstruction may hinder the validity of multivariate analysis by introducing spurious correlations.
Article
In this paper we develop the mathematical and statistical structure of PLS regression. We show the PLS regression algorithm and how it can be interpreted in model building. The basic mathematical principles that lie behind two block PLS are depicted. We also show the statistical aspects of the PLS method when it is used for model building. Finally we show the structure of the PLS decompositions of the data matrices involved.
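A bare-bones PLS1 in the NIPALS style the paper formalizes, extracting components via covariance weights, scores, loadings and deflation (illustrative only; library implementations add centering conventions and stopping rules):

```python
# One-response (PLS1) NIPALS: weight from the X-y covariance, scores,
# loadings, then deflation of X before extracting the next component.
import numpy as np

def pls1_nipals(X, y, n_comp):
    X, y = X - X.mean(0), y - y.mean()        # work on centered copies
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = X.T @ y; w /= np.linalg.norm(w)   # weight: covariance direction
        t = X @ w                             # scores
        p = X.T @ t / (t @ t)                 # X loadings
        q.append(t @ y / (t @ t))             # y loading (regression on t)
        X = X - np.outer(t, p)                # deflate X
        W.append(w); P.append(p)
    return np.array(W).T, np.array(P).T, np.array(q)

rng = np.random.default_rng(11)
X = rng.normal(size=(80, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=80)
W, P, q = pls1_nipals(X, y, 3)
B = W @ np.linalg.inv(P.T @ W) @ q            # regression vector (centered data)
print("coefficients:", np.round(B, 2))
```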
Article
To implement on-line process monitoring techniques such as principal component analysis (PCA) or partial least squares (PLS), it is necessary to extract data associated with the normal operating conditions from the plant historical database for calibrating the models. One way to do this is to use robust outlier detection algorithms such as resampling by half-means (RHM), smallest half volume (SHV), or ellipsoidal multivariate trimming (MVT) in the off-line model building phase. While RHM and SHV are conceptually clear and statistically sound, the computational requirements are heavy. Closest distance to center (CDC) is proposed in this paper as an alternative for outlier detection. The use of Mahalanobis distance in the initial step of MVT for detecting outliers is known to be ineffective. To improve MVT, CDC is incorporated with MVT. The performance was evaluated relative to the goal of finding the best half of a data set. Data sets were derived from the Tennessee Eastman process (TEP) simulator. Comparable results were obtained for RHM, SHV, and CDC. Better performance was obtained when CDC is incorporated with MVT, compared to using CDC and MVT alone. All robust outlier detection algorithms outperformed the standard PCA algorithm. The effect of auto scaling, robust scaling and a new scaling approach called modified scaling were investigated. With the presence of multiple outliers, auto scaling was found to degrade the performance of all the robust techniques. Reasonable results were obtained with the use of robust scaling and modified scaling.
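CDC itself is almost a one-liner: rank samples by distance to a resistant centre estimate and keep the closest half; a sketch using the coordinate-wise median as the centre (the paper's exact centre estimate may differ):

```python
# Closest-distance-to-center (CDC) idea: distances to a resistant centre
# pick out the "best half" of the data, excluding gross outliers.
import numpy as np

rng = np.random.default_rng(12)
X = rng.normal(size=(100, 5))
X[:10] += 8.0                                 # plant 10 gross outliers

center = np.median(X, axis=0)                 # resistant location estimate (assumed)
d = np.linalg.norm(X - center, axis=1)
keep = np.argsort(d)[: len(X) // 2]           # the best half of the data
print("outliers kept in best half:", int(np.sum(keep < 10)))   # expect 0
```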
Article
This paper presents an application of electrical resistance tomography to the investigation of mixing processes at plant scale. An 8-plane 16-electrode ring sensor installed within a stirred tank with a 1.5 m inner diameter is described. Three-dimensional and non-stationary behaviour of mixing processes are illustrated by the images obtained simultaneously from eight axial levels along the tank height. The results illustrate air-core vortex detection, miscible fluid mixing and gas-liquid mixing processes.
Article
This paper summarizes the characteristics of electrical tomography techniques, highlights their current applications and gives an indication of their future applications in the chemical process engineering environment.
Article
Electrical tomographic imaging has been applied to a broad range of chemical engineering processes, including: bubble columns, fluidised beds, pneumatic transport, liquid mixing, cyclonic separation, pressure filtration, liquid pipe-flow, polymerisation, emergency depressurisation, and paste extrusion. Two imaging approaches are described, electrical capacitance tomography (ECT) and electrical impedance tomography (EIT). To date, these have primarily been used as low-cost research tools for studying process dynamics, although they potentially may also act as sensors permitting on-line monitoring and control. Various aspects of design, operation and data processing are described, along with a review of applications in the literature.
Article
With the growth of computer usage at all levels in the process industries, the volume of available data has also grown enormously, sometimes to levels that render analysis difficult. Most of this data may be characterized as historical in the sense that it was not collected on the basis of experiments designed to test specific statistical hypotheses. Consequently, the resulting datasets are likely to contain unexpected features (e.g. outliers from various sources, unsuspected correlations between variables, etc.). This observation is important for two reasons: first, these data anomalies can completely negate the results obtained by standard analysis procedures, particularly those based on squared error criteria (a large class that includes many SPC and chemometrics techniques). Secondly and sometimes more importantly, an understanding of these data anomalies may lead to extremely valuable insights. For both of these reasons, it is important to approach the analysis of large historical datasets with the initial objective of uncovering and understanding their gross structure and character. This paper presents a brief survey of some simple procedures that have been found to be particularly useful at this preliminary stage of analysis.
Article
In the last two decades Soft Sensors established themselves as a valuable alternative to the traditional means for the acquisition of critical process variables, process monitoring and other tasks which are related to process control. This paper discusses characteristics of the process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, like the chemical industry, bioprocess industry, steel industry, etc. The focus of this work is put on the data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though yet not completely realised, potential. A comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques as well as a discussion of some open issues in the Soft Sensor development and maintenance and their possible solutions are the main contributions of this work.
Article
The present paper is mainly expository, giving a review of Nonlinear Iterative PArtial Least Squares (NIPALS) modelling, an approach of general scope for cause-effect inference and prediction. What is new in the paper should lie in the arrangement of the material and in the emphasis on the explicit definition of latent variables that is a characteristic feature of NIPALS modelling.
Article
This paper presents a review of electrical tomography methods for investigating, monitoring and controlling gas–solids and liquid–solids systems. The physical laws governing the electrical measurements and issues associated with image reconstruction are described in some detail. Experimental results, obtained for a number of case studies conducted in pilot-plant-scale and industrial rigs, are presented. These include circulating fluidised bed, pneumatic and hydraulic conveyor, multiphase flow metering and hydrocyclone flow. Instantaneous images, captured at speeds of up to 200 frames per second, illustrate how flow patterns vary and reveal the dynamic behaviour of two-phase systems. The application of electrical tomography for control and fault diagnosis in industrial systems is addressed; the examples include dense pneumatic conveying and hydrocyclone performance.
Article
PLS-regression (PLSR) is the PLS approach in its simplest and, in chemistry and technology, most used form (two-block predictive PLS). PLSR is a method for relating two data matrices, X and Y, by a linear multivariate model, but goes beyond traditional regression in that it also models the structure of X and Y. PLSR derives its usefulness from its ability to analyze data with many, noisy, collinear, and even incomplete variables in both X and Y. PLSR has the desirable property that the precision of the model parameters improves with an increasing number of relevant variables and observations. This article reviews PLSR as it has developed to become a standard tool in chemometrics, used in chemistry and engineering. The underlying model and its assumptions are discussed, and commonly used diagnostics are reviewed together with the interpretation of the resulting parameters. Two examples are used as illustrations: first, a Quantitative Structure–Activity Relationship (QSAR)/Quantitative Structure–Property Relationship (QSPR) data set of peptides is used to outline how to develop, interpret and refine a PLSR model; second, a data set from the manufacturing of recycled paper is analyzed to illustrate time-series modelling of process data by means of PLSR and time-lagged X-variables.
Article
Pixel-based tomography has been used with great success for medical applications, where it is most appropriate, but this approach does not always transfer easily to industrial applications. For example, pixel-based image reconstruction from electrical impedance tomography measurements is well known to be an ill-posed problem, and with high noise levels such tomograms cannot be reliable. An alternative approach is to use a parametric representation of the tomogram, for which reconstruction can be better posed. The primary reason for parametric modelling, however, is interpretation. This paper compares parametric modelling to other methods and then gives an example of the method for an application to a hydrocyclone. Two tomographic modalities are discussed and the results from parametric modelling are validated. This example demonstrates the great power achievable from a parametric modelling approach to tomographic imaging of industrial processes.
Article
The latent variable multivariate regression (LVMR) model is made up of two sets of variables, X and Y, both of which contain a latent variable structure plus random error. The wide applicability of this model is illustrated in this paper with several real examples. The chemometrics community has developed several empirical methods to estimate the latent structure in this model, including partial least squares regression (PLS) and principal components regression (PCR). However, the majority of the statistical work in this area relies on the standard or reduced rank regression models, thus ignoring the latent variable nature of the X data. Considering methods like PLS and PCR in the context of these models has led to some misleading conclusions. This paper reaffirms the claim made frequently in the chemometrics literature that the reason PLS and PCR have been successful is that they take into account the latent variable structure in the data. It is also shown through several examples that the LVMR model provides the means to model more effectively many datasets in applied science resulting in improved techniques for process monitoring, experimental design and prediction. The focus in this paper is on the general model rather than on parameter estimation methods.
Article
This paper describes the results of a series of experiments using Electrical Impedance Tomography (EIT) to investigate the distribution of solids inside a hydrocyclone. A case study is presented for a 44 mm diameter hydrocyclone. Amongst the characteristics examined are: (i) identification of particle distribution inside the separator, (ii) air core formation as a function of the feed rate and solids concentration, and (iii) the manner in which the air core behaviour can be related to the type of underflow discharge (spray, rope).