Article

A particle filter-based data assimilation framework for discrete event simulations

Abstract

With the advent of new sensor technologies and communication solutions, the availability of data for discrete event systems has greatly increased. This motivates research on data assimilation for discrete event simulations, a topic that has not yet fully matured. This paper presents a particle filter-based data assimilation framework for discrete event simulations. The framework is formally defined based on the Discrete Event System Specification (DEVS) formalism. To effectively apply particle filtering in discrete event simulations, we introduce an interpolation operation that considers the elapsed time (i.e., the time elapsed since the last state transition) when retrieving the model state (which was ignored in related work) in order to obtain updated state values. The data assimilation problem ultimately boils down to estimating the posterior distribution of a state trajectory with variable dimension. Although this seems problematic, it is proven that in practice the sequential importance sampling algorithm can safely be applied to update the random measure (i.e., a set of particles and their importance weights) that approximates this posterior distribution of the state trajectory with variable dimension. To illustrate the working of the proposed data assimilation framework, a case is studied in a gold mine system to estimate truck arrival times at the bottom of the vertical shaft. The results show that the framework is able to provide accurate estimation results in discrete event simulations; it is also shown that the framework is robust to errors both in the simulation model and in the data.
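The core mechanics described above can be illustrated with a short sketch. The fragment below is a minimal, illustrative rendering of sequential importance sampling for a discrete event model in which an interpolation step accounts for the elapsed time since the last state transition before each particle is weighted; it is not the authors' implementation, and all names (Particle, simulate_until, interpolate, likelihood) as well as the drift and noise assumptions are made up for the example.

```python
# Minimal sketch (not the authors' implementation) of sequential importance
# sampling (SIS) for a discrete event model, with an elapsed-time
# interpolation step before weighting. All identifiers are illustrative.
import numpy as np

rng = np.random.default_rng(0)


class Particle:
    def __init__(self, state, last_event_time):
        self.state = state                    # state set at the last event
        self.last_event_time = last_event_time


def simulate_until(particle, t):
    """Advance the particle's discrete event model to time t (stochastic).
    Placeholder dynamics: exponentially spaced events that jump the state."""
    while True:
        dt = rng.exponential(scale=1.0)
        if particle.last_event_time + dt > t:
            break
        particle.last_event_time += dt
        particle.state += rng.normal(0.0, 0.5)
    return particle


def interpolate(particle, t):
    """Estimate the state at observation time t using the elapsed time since
    the last state transition, instead of returning the stale event state."""
    elapsed = t - particle.last_event_time
    drift_rate = 0.1                          # assumed known inter-event drift
    return particle.state + drift_rate * elapsed


def likelihood(observation, predicted, noise_std=0.2):
    return np.exp(-0.5 * ((observation - predicted) / noise_std) ** 2)


def sis_step(particles, weights, observation, t):
    """One SIS update: propagate, interpolate at t, reweight, normalize.
    weights is a NumPy array of importance weights."""
    for i, p in enumerate(particles):
        simulate_until(p, t)
        predicted = interpolate(p, t)
        weights[i] *= likelihood(observation, predicted)
    weights /= weights.sum()
    return particles, weights
```

In the full framework a resampling step would be triggered when the effective sample size drops; this fragment only shows where the elapsed-time interpolation enters the weight update.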

Article
Full-text available
The globalization of the world’s economies is a major challenge to local industry and it is pushing the manufacturing sector to its next transformation – predictive manufacturing. In order to become more competitive, manufacturers need to embrace emerging technologies, such as advanced analytics and cyber-physical system-based approaches, to improve their efficiency and productivity. With an aggressive push towards “Internet of Things”, data has become more accessible and ubiquitous, contributing to the big data environment. This phenomenon necessitates the right approach and tools to convert data into useful, actionable information.
Conference Paper
Full-text available
This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and their management. The implementations of these generic ideas in AutoMod, SLX, ExtendSim, and Simio are described. The paper concludes with several examples of "why it matters" for modelers to know how their simulation software works, including discussion of AutoMod, SLX, ExtendSim, Simio, Arena, ProModel, and GPSS/H.
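As a companion to this tutorial-style summary, the following sketch shows the generic future-event-list mechanics it refers to: a time-ordered list of pending events and a loop that repeatedly executes the most imminent one. The names are illustrative and not taken from AutoMod, SLX, ExtendSim, Simio, or any other package mentioned.

```python
# Generic event-list mechanics: a heap ordered by event time and a loop that
# pops and executes the most imminent event. Illustrative only.
import heapq

future_event_list = []          # heap of (event_time, seq, action)
clock = 0.0
_seq = 0                        # tie-breaker so equal-time events stay ordered


def schedule(delay, action):
    global _seq
    heapq.heappush(future_event_list, (clock + delay, _seq, action))
    _seq += 1


def run(until):
    global clock
    while future_event_list and future_event_list[0][0] <= until:
        clock, _, action = heapq.heappop(future_event_list)
        action()                # executing an event may schedule new ones


# Example: a trivial arrival process that reschedules itself every 5 time units.
def arrival():
    print(f"arrival at t={clock:.1f}")
    schedule(5.0, arrival)


schedule(0.0, arrival)
run(until=20.0)
```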
Article
Full-text available
These training course lecture notes are an advanced and comprehensive presentation of most data assimilation methods that are considered useful in applied meteorology and oceanography today. Some are considered old-fashioned but are still valuable for low-cost applications. Others have not yet been implemented in realistic applications, but they are regarded as the future of data assimilation. A mathematical approach has been chosen, which allows a compact and rigorous presentation of the algorithms, though only some basic mathematical competence is required from the reader. This document has been put together with the help of previous lecture notes, which are now superseded.
Article
Full-text available
The DEVS formalism was conceived by Zeigler [Zei84a, Zei84b] to provide a rigorous common basis for discrete-event modelling and simulation. For the class of formalisms denoted as discrete-event [Nan81], system models are described at an abstraction level where the time base is continuous (the real numbers), but during a bounded time-span, only a finite number of relevant events occur. These events can cause the state of the system to change. In between events, the state of the system does not change. This is unlike continuous models in which the state of the system may change continuously over time. As an extension of Finite State Automata, the DEVS (Discrete Event System Specification) formalism captures concepts from discrete event simulation. As such it is a sound basis for meaningful model exchange in the discrete event realm.
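A minimal rendering of an atomic DEVS model may help make the formalism concrete. The sketch below assumes the standard atomic DEVS structure (state set, internal and external transition functions, output function, and time advance); the Processor example and all identifiers are illustrative, not tied to any particular DEVS tool.

```python
# Minimal atomic DEVS interface, following the usual structure
# (S, delta_int, delta_ext, lambda, ta). Illustrative names only.
from dataclasses import dataclass


@dataclass
class ProcessorState:
    phase: str = "idle"     # "idle" or "busy"
    job: object = None


class Processor:
    """A single-server processor: accepts a job, stays busy for a fixed time,
    then outputs the finished job."""
    SERVICE_TIME = 3.0

    def time_advance(self, s):                 # ta(s)
        return self.SERVICE_TIME if s.phase == "busy" else float("inf")

    def external_transition(self, s, e, x):    # delta_ext(s, e, x)
        # e is the elapsed time since the last transition; x is the input job.
        if s.phase == "idle":
            return ProcessorState(phase="busy", job=x)
        return s                               # ignore inputs while busy

    def internal_transition(self, s):          # delta_int(s)
        return ProcessorState(phase="idle", job=None)

    def output(self, s):                       # lambda(s), fired before delta_int
        return s.job
```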
Conference Paper
Full-text available
Here we describe recent advances in particle filtering algorithms and models for tracking of manoeuvring objects in clutter. The methods build on the basic variable dimension particle filtering algorithms introduced in S.J. Godsill and J. Vermaak (2004), in which a new type of dynamical model is introduced whose state variables arrive at unknown times relative to the observation process (hence 'variable rate'). Targets are assumed to follow deterministic trajectories in between state times, determined by an appropriate model, such as a differential equation model for the object. The framework allows for automatic modelling and estimation of the trajectories of targets using an adaptation of particle filtering methods (A. Doucet et al., 2000) to the variable dimension setting. In this paper we introduce more effective sampling schemes for the variable rate setting that ensure future states are only generated as and when required, new dynamical models appropriate for manoeuvring objects, and new observation models under the assumption of a nonhomogeneous Poisson process for both targets and clutter. Simulations show very effective tracking performance under challenging settings which cannot be emulated in a standard fixed rate scheme.
Article
Full-text available
Simulating wildfire spread and containment remains a challenging problem due to the complexity of fire behavior. In this paper, the authors present an integrated simulation environment for surface wildfire spread and containment called DEVS-FIRE. DEVS-FIRE is based on the discrete event system specification (DEVS) and uses a cellular space model for simulating wildfire spread and agent models for simulating wildfire containment. The cellular space model incorporates real spatial fuels data, terrain data and temporal weather data into the prediction of wildfire behavior across both time and space. DEVS-FIRE is designed to be integrated with stochastic optimization models that use the scenario results from the simulation to determine an optimal mix of firefighting resources to dispatch to a wildfire. Preliminary computational experiments with fuel, terrain and weather data for a real forest demonstrate the viability of the integrated simulation environment for wildfire spread and containment.
Conference Paper
Full-text available
This contribution is devoted to the comparison of various resampling approaches that have been proposed in the literature on particle filtering. It is first shown using simple arguments that the so-called residual and stratified methods do yield an improvement over the basic multinomial resampling approach. A simple counter-example showing that this property does not hold true for systematic resampling is given. Finally, some results on the large-sample behavior of the simple bootstrap filter algorithm are given. In particular, a central limit theorem is established for the case where resampling is performed using the residual approach.
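For reference, the following sketch shows two of the schemes compared in this work, basic multinomial resampling and systematic resampling, in their textbook form; it is not the authors' experimental code.

```python
# Two standard resampling schemes. Both take normalized weights and return
# the indices of the particles to keep.
import numpy as np


def multinomial_resample(weights, rng=np.random.default_rng()):
    n = len(weights)
    return rng.choice(n, size=n, p=weights)


def systematic_resample(weights, rng=np.random.default_rng()):
    n = len(weights)
    # One uniform draw, then n evenly spaced positions on [0, 1).
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0            # guard against floating point round-off
    return np.searchsorted(cumulative, positions)
```

Residual resampling, the third scheme analyzed, deterministically copies each particle floor(N*w_i) times and resamples only the remaining particles multinomially.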
Conference Paper
Even though DEVS provides a convenient framework for discrete event modeling, it can be observed that there is a large difference between the abstraction levels of conceptual models and their specifications in DEVS. Equation based modeling is a declarative modeling style that has become popular for describing continuous dynamic systems in a way that is conceptually closer to conceptual models. Although equation based modeling languages usually include the notion of a discrete event, they are not a natural choice for discrete event modeling. This paper combines equation based modeling with constraint solving in order to create a fully equation based discrete event modeling style. The approach centers around phases, which are described by a constraint (an inequality) and behavior (a set of equations). The resulting models are fully compliant with DEVS: phase descriptions are transformed to internal transitions and time advance. The main contribution of this paper is the adoption of constraints and relations as the sole mechanism to specify models and using an algebraic solver to infer transitions and time advance. Compared to equation based systems, the contribution is to make constraint inequations first-class citizens and to use them symbolically to determine events.
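As a toy illustration of the general idea (and only that; the paper's modeling language and solver are not reproduced here), the snippet below uses a symbolic solver to turn a phase described by a behavior equation and a boundary constraint into a time advance, i.e., the time of the next internal transition.

```python
# Toy illustration: the phase "filling" holds while level(t) = level0 + rate*t
# stays below a threshold, so the internal transition time is obtained by
# solving the boundary equation for t. Not the paper's language or solver.
import sympy as sp

t, level0, rate, threshold = sp.symbols("t level0 rate threshold", positive=True)

level = level0 + rate * t                       # behavior: set of equations
boundary = sp.Eq(level, threshold)              # constraint becomes an equality
time_advance = sp.solve(boundary, t)[0]         # symbolic time to next event

print(time_advance)                             # (threshold - level0)/rate
print(time_advance.subs({level0: 2, rate: 0.5, threshold: 10}))  # 16
```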
Article
Assimilating real-time sensor data into large-scale spatial-temporal simulations, such as simulations of wildfires, is a promising technique for improving simulation results. This calls for advanced data assimilation methods that can work with the complex structures and nonlinear behaviors associated with the simulation models. This article presents a data assimilation framework using Sequential Monte Carlo (SMC) methods for wildfire spread simulations. The models and algorithms of the framework are described, and experimental results are provided. This work demonstrates the feasibility of applying SMC methods to data assimilation of wildfire spread simulations. The developed framework can potentially be generalized to other application areas where sophisticated simulation models are used.
Article
Research on using high-resolution event-based data for traffic modeling and control is still at an early stage. In this paper, we provide a comprehensive overview of what has been achieved and also think ahead about what can be achieved in the future. It is our opinion that using high-resolution event data, instead of conventional aggregate data, could bring significant improvements to current research and practice in traffic engineering. Event data record the times when a vehicle arrives at and departs from a vehicle detector. From that, each vehicle's on-detector time and the time gap between two consecutive vehicles can be derived. Such detailed information is of great importance for traffic modeling and control. As reviewed in this paper, current research has demonstrated that event data are extremely helpful in the fields of detector error diagnosis, vehicle classification, freeway travel time estimation, arterial performance measures, signal control optimization, traffic safety, traffic flow theory, and environmental studies. In addition, the cost of event data collection is low compared to other data collection techniques, since event data can be collected directly from existing controller cabinets without any changes to the infrastructure, and can be collected continuously in 24/7 mode. This brings many research opportunities, as suggested in the paper.
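The two derived quantities mentioned above are simple to compute once the event records are available; the following toy example with made-up timestamps shows the derivation.

```python
# Toy example (made-up numbers): each vehicle's on-detector time and the gap
# between consecutive vehicles at the same detector.
events = [  # (arrival_time_s, departure_time_s) per vehicle, in passage order
    (100.0, 100.4),
    (103.2, 103.7),
    (109.1, 109.5),
]

on_detector_times = [dep - arr for arr, dep in events]
gaps = [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]

print(on_detector_times)   # approximately [0.4, 0.5, 0.4]
print(gaps)                # approximately [2.8, 5.4]
```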
Article
DEVS-FIRE is a discrete event system specification (DEVS) model for simulating wildfire spread and suppression. It employs a cellular space model to simulate fire spread and agent models that interact with the cellular space to simulate fire suppression with realistic tactics. The complex interplay among forest cells and agents calls for formal treatment of the fire spread and fire suppression models to verify the correctness of DEVS-FIRE. This paper gives formal design specifications of fire spread and suppression agent models used in DEVS-FIRE and applies DEVS-FIRE to both artificially generated and real topography, fuels and weather data for a study area located in the US state of Texas. The paper also develops a new method, called pre_Schedule, for scheduling ignition events of forest cells more efficiently than the original onTime_Schedule event scheduling method used in DEVS-FIRE. Simulation results show the performance improvement of the new method, and demonstrate the utility of DEVS-FIRE as a viable discrete event model for wildfire simulations.
Article
In k-means clustering, we are given a set of n data points in d-dimensional space R^d and an integer k, and the problem is to determine a set of k points in R^d, called centers, so as to minimize the mean squared distance from each data point to its nearest center. A popular heuristic for k-means clustering is Lloyd's algorithm. In this paper, we present a simple and efficient implementation of Lloyd's k-means clustering algorithm, which we call the filtering algorithm. This algorithm is easy to implement, requiring a kd-tree as the only major data structure. We establish the practical efficiency of the filtering algorithm in two ways. First, we present a data-sensitive analysis of the algorithm's running time, which shows that the algorithm runs faster as the separation between clusters increases. Second, we present a number of empirical studies both on synthetically generated data and on real data sets from applications in color quantization, data compression, and image segmentation.
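For orientation, the sketch below is plain Lloyd's algorithm; the paper's contribution, the kd-tree-based filtering variant, accelerates exactly this assign-and-update iteration and is not reproduced here.

```python
# Plain Lloyd's algorithm (reference version, no kd-tree acceleration).
import numpy as np


def lloyd(points, k, iters=100, rng=np.random.default_rng(0)):
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of the points assigned to it.
        new_centers = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```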
Article
The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
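A compact sketch of the stochastic EnKF analysis step with perturbed observations is given below for orientation; it assumes a linear observation operator and is not the formulation or program listing from this paper.

```python
# Stochastic EnKF analysis step with perturbed observations (linear H assumed).
import numpy as np


def enkf_analysis(ensemble, H, y, R, rng=np.random.default_rng(0)):
    """ensemble: (n_state, n_members) forecast ensemble
    H: (n_obs, n_state) observation operator
    y: (n_obs,) observation vector
    R: (n_obs, n_obs) observation error covariance"""
    n_state, n_members = ensemble.shape
    # Ensemble anomalies and sample forecast covariance.
    mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - mean
    P = A @ A.T / (n_members - 1)
    # Kalman gain.
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    # Perturbed observations, one realization per member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_members).T
    # Update every member toward its perturbed observation.
    return ensemble + K @ (Y - H @ ensemble)
```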
Conference Paper
Wildfire propagation is a complex process influenced by many factors. Simulation models of wildfire spread, such as DEVS-FIRE, are important tools for studying fire behavior. This paper presents how sequential Monte Carlo methods, i.e., particle filters, can work together with DEVS-FIRE for better simulation and prediction of wildfire. We define an application framework of particle filters for the problem of wildfire spread using the DEVS-FIRE model, and discuss several applications. A case study example is provided and preliminary results are presented.
Article
In this paper, we propose to model basic continuous components of dynamic systems in a way that facilitates the transposition to a G-DEVS model, a paradigm that offers the ability to develop a uniform approach to modeling hybrid systems (an abstraction closer to real systems), i.e., systems composed of both continuous and discrete components. Our approach is thus clearly a discrete event approach in which the time interval between two calculation steps is based on the behavior changes of the process, rather than being constant and/or given a priori; the underlying objective is to strictly satisfy a given accuracy at a low computational cost. More precisely, we present a generalized discrete event model of an integrator using polynomial descriptions of input-output trajectories. We show its great capability of easily handling the delicate problem of input discontinuities, and a detailed comparison with classical discrete time simulation methods demonstrates its relevant properties. Several examples, including a complete hybrid system, illustrate our results.
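The snippet below is a toy first-order quantized integrator meant only to illustrate the underlying principle that the time between calculation steps is derived from behavior changes rather than fixed in advance; it is not the G-DEVS polynomial model of the paper, and the quantum-based scheme shown is an assumption made for the example.

```python
# Toy first-order quantized integrator: the next event time is the time at
# which the output would change by one quantum, assuming a locally constant
# derivative. Illustrates the principle only.
def quantized_integrate(derivative, x0, t_end, quantum=0.1):
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        dx = derivative(t, x)
        if dx == 0.0:
            break                       # no change scheduled: passive state
        dt = quantum / abs(dx)          # time advance implied by the behavior
        t, x = t + dt, x + quantum * (1 if dx > 0 else -1)
        trajectory.append((t, x))
    return trajectory


# Example: exponential decay dx/dt = -x, starting from x = 1.
print(quantized_integrate(lambda t, x: -x, 1.0, 5.0)[:5])
```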
Article
Time and state descriptions form the core of a simulation model representation. The historical influence of initial application areas and the exigencies of language implementations have created a muddled view of the time and state relationships. As a consequence, users of simulation programming languages work in relative isolation; model development, simulation application, model portability, and the communication of results are inhibited and simulation practice fails to contribute to the recognition of an underlying foundation or integrating structure. A model representation structure has been forged from a small set of basic definitions which carefully distinguish the state and time relationships. This paper focuses on the coordination of the time and state concepts using “object” as the link. In addition to clarifying the relationships, the structure relates the concept of “state-sequenced simulation” to the variations in time flow mechanisms. In conclusion, some speculations are offered regarding alternative algorithms for time flow mechanisms.
Article
Atmosphere and ocean systems can be simulated effectively by discrete numerical models and, provided that the initial states of the system are known, accurate forecasts of future dynamical behaviour can be determined. Complete information defining all of the states of the system at a specified time is, however, rarely available. Moreover, both the models and the measured data contain inaccuracies and random noise. In this case, observations of the system measured over an interval of time can be used in combination with the model equations to derive estimates of the expected values of the states. The problem of constructing a 'state-estimator,' or 'observer,' for these systems can be treated by using feedback design techniques from control theory. For the very large nonlinear systems arising in climate, weather and ocean prediction, however, traditional control techniques are not practicable, and 'data assimilation' schemes are used instead to generate accurate state-estimates (see, for example, Daley, 1994; Bennett, 1992).
Conference Paper
Sequences of events are an important form of data that occurs in many application domains, such as telecommunications, biostatistics, user interface design, etc. We present a simple model for measuring the similarity of event sequences, and show that the resulting measure of distance can be efficiently computed using a form of dynamic programming.
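The flavor of the dynamic programming computation can be sketched as follows; the cost model (unit insertion/deletion, time-difference cost for matching event types) is an illustrative assumption rather than the paper's exact similarity measure.

```python
# Edit-distance-style similarity between two event sequences, where events are
# (type, time) pairs. Cost model is an illustrative assumption.
def event_sequence_distance(seq_a, seq_b, indel_cost=1.0, time_scale=10.0):
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel_cost
    for j in range(1, m + 1):
        d[0][j] = j * indel_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            type_a, t_a = seq_a[i - 1]
            type_b, t_b = seq_b[j - 1]
            if type_a == type_b:
                match = d[i - 1][j - 1] + abs(t_a - t_b) / time_scale
            else:
                match = float("inf")      # different event types cannot match
            d[i][j] = min(match,
                          d[i - 1][j] + indel_cost,    # delete from seq_a
                          d[i][j - 1] + indel_cost)    # insert into seq_a
    return d[n][m]


print(event_sequence_distance([("A", 1), ("B", 5)], [("A", 2), ("B", 9)]))  # 0.5
```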
Article
Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data on-line as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or "particle") representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
Article
Standard algorithms in tracking and other state-space models assume identical and synchronous sampling rates for the state and measurement processes. However, real trajectories of objects are typically characterized by prolonged smooth sections, with sharp, but infrequent, changes. Thus, a more parsimonious representation of a target trajectory may be obtained by direct modeling of maneuver times in the state process, independently from the observation times. This is achieved by assuming the state arrival times to follow a random process, typically specified as Markovian, so that state points may be allocated along the trajectory according to the degree of variation observed. The resulting variable dimension state inference problem is solved by developing an efficient variable rate particle filtering algorithm to recursively update the posterior distribution of the state sequence as new data becomes available. The methodology is quite general and can be applied across many models where dynamic model uncertainty occurs on-line. Specific models are proposed for the dynamics of a moving object under internal forcing, expressed in terms of the intrinsic dynamics of the object. The performance of the algorithms with these dynamical models is demonstrated on several challenging maneuvering target tracking problems in clutter.
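A compressed sketch of the variable rate idea follows; it is illustrative rather than the paper's algorithm, with exponential sojourn times, constant-velocity motion between state arrivals, and a particle represented as a plain dict of lists.

```python
# Variable rate sketch: each particle carries its own sequence of state
# arrival times; new arrivals are sampled only as needed to cover the next
# observation time, and the state at the observation time follows
# deterministically from the most recent arrival. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)


def extend_particle(particle, t_obs, mean_sojourn=2.0):
    """Sample Markovian (here exponential) sojourns until the particle's
    arrival times cover the observation time t_obs."""
    while particle["times"][-1] < t_obs:
        dt = rng.exponential(mean_sojourn)
        new_velocity = rng.normal(0.0, 1.0)              # new manoeuvre
        last_t = particle["times"][-1]
        last_pos = particle["positions"][-1] + particle["velocities"][-1] * dt
        particle["times"].append(last_t + dt)
        particle["positions"].append(last_pos)
        particle["velocities"].append(new_velocity)


def position_at(particle, t):
    """Deterministic motion between arrival times: constant velocity here."""
    k = np.searchsorted(particle["times"], t, side="right") - 1
    elapsed = t - particle["times"][k]
    return particle["positions"][k] + particle["velocities"][k] * elapsed


# Example: one particle starting at the origin with zero velocity.
p = {"times": [0.0], "positions": [0.0], "velocities": [0.0]}
extend_particle(p, t_obs=5.0)
print(position_at(p, 5.0))
```

Reweighting against the observation likelihood then proceeds as in a fixed-rate particle filter, using position_at(particle, t_obs) as the predicted measurement.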
Article
An overview of discrete-event dynamic systems (DEDSs) is given, explaining what they are and discussing their modeling. The types of models are outlined, and factors to be taken into account in modeling are examined. They are: the discontinuous nature of discrete events; the continuous nature of most performance measures; the importance of probabilistic formulation; the need for hierarchical analysis; the presence of dynamics; and the feasibility of the computational burden. It is stressed that DEDS research must include experiment as well as theory.
Gillijns S, Mendoza O, Chandrasekar J, et al. What is the ensemble Kalman filter and how well does it work? In: Proceedings of the 2006 American Control Conference, Minneapolis, MN, USA, 14-16 June 2006, pp. 4448-4453. Piscataway, NJ: IEEE.
Yuan Y. Lagrangian multi-class traffic state estimation. PhD Thesis, Delft University of Technology, 2013.
Wang M and Hu X. Data assimilation in agent based simulation of smart environments using particle filters. Simulat Model Pract Theor 2015; 56: 36-54.
Long Y. Data assimilation for spatial temporal simulations using localized particle filtering. PhD Thesis, Georgia State University, 2016.
Wu P. Sequential Monte Carlo based data assimilation framework and toolkit for dynamic system simulations. PhD Thesis, Georgia State University, 2017.