Figure 1 - uploaded by Stewart Robinson
Simulation Model Verification and Validation in the Modelling Process  


Context in source publication

Context 1
... logical sequence to these steps, they are not necessarily performed in a strictly sequential manner, and iteration through the steps is necessary. Figure 1, which is adapted from Sargent (1992), shows how V&V map onto the modelling process: each stage in the modelling process is accompanied, in parallel, by a verification or validation activity. ...

Similar publications

Article
Full-text available
The maintainability parameter is the basis of quantitative maintainability analysis, and it is widely used in maintainability verification, demonstration, and evaluation. A nonlinear least-squares method for estimating the parameters of the maintenance time distribution, based on particle swarm optimization, is proposed. The parameter's confidence interval is chosen as...
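The approach described above, fitting distribution parameters by nonlinear least squares with a particle swarm optimiser, can be sketched as follows. This is a minimal illustration rather than the cited method: the exponential repair-time model, the swarm settings, and the sample data are all assumptions.

```python
import math
import random

def fit_exponential_rate(times, n_particles=20, iters=100, seed=1):
    """Nonlinear least-squares fit of an exponential repair-time CDF,
    F(t) = 1 - exp(-rate * t), to the empirical CDF of observed
    maintenance times, using a minimal particle swarm optimiser."""
    times = sorted(times)
    n = len(times)
    ecdf = [(i + 1) / n for i in range(n)]  # empirical CDF values

    def sse(rate):  # sum of squared CDF errors at the observed times
        return sum((1 - math.exp(-rate * t) - p) ** 2
                   for t, p in zip(times, ecdf))

    rng = random.Random(seed)
    pos = [rng.uniform(0.01, 5.0) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # each particle's best position
    pbest_err = [sse(x) for x in pos]
    g = min(range(n_particles), key=lambda i: pbest_err[i])
    gbest, gbest_err = pbest[g], pbest_err[g]

    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # standard PSO velocity update: inertia + cognitive + social terms
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = max(1e-6, pos[i] + vel[i])  # keep the rate positive
            err = sse(pos[i])
            if err < pbest_err[i]:
                pbest[i], pbest_err[i] = pos[i], err
                if err < gbest_err:
                    gbest, gbest_err = pos[i], err
    return gbest

# Illustrative repair times (hours)
repair_times = [0.4, 0.9, 1.3, 1.8, 2.4, 3.1, 4.0, 5.5]
rate = fit_exponential_rate(repair_times)
print(f"estimated repair rate: {rate:.3f} per hour")
```

A confidence interval for the fitted parameter, as the abstract suggests, could then be obtained by bootstrapping the observed times and repeating the fit.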
Article
Full-text available
We introduce a local feature descriptor for large-scale image retrieval applications, called DELF (DEep Local Feature). The new feature is based on convolutional neural networks, which are trained without object- and patch-level annotations on a landmark image dataset. To enhance DELF's image retrieval performance, we also propose an attention mech...
Preprint
Full-text available
We introduce a probabilistic robustness measure for Bayesian Neural Networks (BNNs), defined as the probability that, given a test point, there exists a point within a bounded set such that the BNN prediction differs between the two. Such a measure can be used, for instance, to quantify the probability of the existence of adversarial examples. Buil...
Technical Report
Full-text available
VeriDevOps aims at bringing together fast and cost-effective security verification through formal modelling and verification, as well as test generation, selection, execution and analysis capabilities to enable companies to deliver quality systems with confidence in a fast-paced DevOps environment. Security requirements are intended to be processed...
Conference Paper
Full-text available
Safety cases embody arguments that demonstrate how safety properties of a system are upheld. Such cases implicitly document the barriers that must exist between hazards and vulnerable components of a system. For safety certification, it is the analysis of these barriers that provides confidence in the safety of the system. The explicit representatio...

Citations

... In contrast to the computational studies, physical case studies provide a tangible and experimental approach to validate the effectiveness of new methods and tools in real-world prototyping scenarios [284]. The following physical case studies aim to ascertain the tool's effectiveness and practical feasibility towards addressing research questions RQ2/RQ3. ...
Thesis
Full-text available
The tensions between prototype iteration and its associated costs, both in terms of time and resource, are identified as a critical challenge in contemporary prototyping practices. Whilst it is acknowledged that increased prototyping correlates to improved products and greater product innovation, the cost of prototype iteration, particularly in the later stages of the development process is a key factor in determining the extent of prototyping that is undertaken, thus the potential for product success. Addressing this challenge, the research presented in this thesis investigates the development of a methodology for prototype remanufacturing in New Product Development (NPD). Remanufacturing, defined as the restoration of used products to a like-new functional state, emerges as a strategic approach with potential to significantly improve the efficiency of prototype iteration. By remanufacturing, an existing prototype could be ’edited’ to embody only the necessary change between design iterations, promising to reduce not only the time and cost associated with prototyping but also aligning with the shift towards sustainable practices in NPD. This thesis therefore aims to investigate the unique challenges and opportunities that remanufacturing presents when applied in the context of prototyping in NPD. The research adopts a methodological framework encompassing: a comprehensive analysis of design change between high-fidelity prototype iterations, exploring the potential benefits of remanufacturing, and the formalisation of remanufacturing strategies for prototype remanufacturing. Central to this research is the development and optimisation of a computational-remanufacturing tool, designed to enhance the computational efficiency of the prototype remanufacturing process. The research addresses key issues such as the fidelity-efficiency trade-off in prototyping and the need for rapid iteration processes, particularly in the later stages of NPD. 
Through a series of case studies, simulations, and empirical evaluations, the thesis demonstrates the practical feasibility and benefits of implementing a remanufacturing method in prototype development. In conclusion, the research establishes remanufacturing as a method with significant potential to augment the prototyping process. Findings show remanufacturing to yield significant time and resource savings whilst maintaining a relative level of prototype fidelity and functionality between design iterations. The research offers a novel approach to prototype development, in particular towards iteration, marked by increased efficiency and sustainability. The thesis concludes with guidance for the integration of remanufacturing strategies to support design and engineering practitioners.
... A significant element of any simulation study is the verification and validation of the simulation model [43]. We explain their application procedures in this section. ...
Article
Full-text available
This paper studies e-grocery order fulfillment policies by leveraging both customer and e-grocery-based data. Through the utilization of historical purchase data, product popularity trends, and delivery patterns, allocation strategies are informed to optimize performance metrics such as fill rate, carbon emissions, and cost per order. The study aims to conduct a sensitivity analysis to identify key drivers influencing these performance metrics. The results highlight that fulfillment policies optimized with the utilization of the mentioned data metrics demonstrate superior performance compared to policies not informed by data. These findings underscore the critical role of integrating data-driven models in e-grocery order fulfillment. Based on the outcomes, a grocery allocation policy, considering both proximity and product availability, emerges as promising for simultaneous improvements in several performance metrics. The study recommends that e-grocery companies leverage customer data to design and optimize delivery-oriented policies and strategies. To ensure adaptability to new trends or changes in delivery patterns, continual evaluation and improvement of e-grocery fulfillment policies are emphasized.
... This includes both black-box and white-box validation. Although black-box validation, which considers whether the overall behaviour of the model represents the behaviour of the real system with sufficient accuracy for its purpose (Kleijnen & Wan, 2007; Robinson, 1997), requires a completed simulation model, modellers should plan for this activity at the model conceptualisation phase. Therefore, planning for black-box validation is discussed within the framework. ...
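Black-box validation of the kind described above is commonly operationalised by checking whether the real system's observed performance falls within a confidence interval constructed from independent model replications. A minimal sketch, in which the replication results, the real-system mean, and the 95% level are illustrative assumptions:

```python
import statistics
from statistics import NormalDist

def replication_ci(replications, level=0.95):
    """Confidence interval for the mean of independent model replications.

    Uses a normal approximation, reasonable for ~30+ replications;
    a t-based interval would be preferred for fewer."""
    n = len(replications)
    mean = statistics.fmean(replications)
    se = statistics.stdev(replications) / n ** 0.5
    z = NormalDist().inv_cdf(0.5 + level / 2)
    return mean - z * se, mean + z * se

# Illustrative numbers: mean waiting times (minutes) from 30 model runs,
# and the mean observed in the real system.
model_runs = [8.2, 7.9, 8.5, 8.1, 7.7, 8.4, 8.0, 8.3, 7.8, 8.6,
              8.1, 8.2, 7.9, 8.0, 8.4, 8.2, 7.6, 8.3, 8.1, 8.0,
              8.5, 7.9, 8.2, 8.1, 7.8, 8.3, 8.0, 8.2, 8.4, 7.9]
real_mean = 8.15

lo, hi = replication_ci(model_runs)
valid = lo <= real_mean <= hi
print(f"95% CI for model mean: ({lo:.2f}, {hi:.2f}); real mean inside: {valid}")
```

Passing such a check does not prove validity; it only fails to reject the model for this one output, which is why black-box comparisons are normally run on several outputs.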
Article
Full-text available
The growing complexity of systems and problems that stakeholders from the private and public sectors have sought advice on has led systems modellers to increasingly use multimethodology and to combine multiple OR/MS methods. This includes hybrid simulation that combines two or more of the following methods: system dynamics (SD), discrete-event simulation, and agent-based models (ABM). Although a significant number of studies describe the application of hybrid simulation across different domains, research on the theoretical and practical aspects of combining simulation modelling methods, particularly the combining of SD and ABM, is still limited. Existing frameworks for combining simulation methods are high-level and lack methodological clarity and practical guidance on modelling decisions and elements specific to hybrid simulation that modellers need to consider. This paper proposes a practical framework for developing a conceptual hybrid simulation model that is built on reviews and reflections of theoretical and application literature on combining methods. The framework is then used to inform and guide the process of conceptual model building for a case study in controlling the spread of COVID-19 in care homes. In addition, reflection on the use of the framework for the case study led to refining the framework itself. This case study is also used to demonstrate how the framework informs the structural design of a hybrid simulation model and relevant modelling decisions during the conceptualisation phase.
... Conducting pilot and case studies to validate regional wealth models is critical in developing and maintaining accurate and reliable models of wealth that are adaptable as new information becomes available and necessitates change [76][77][78]. Multiple authors have identified several steps necessary for model validation studies [79][80][81][82][83]. Those steps may include (but are not limited to) defining the scope and objectives of the pilot study, selecting the appropriate data sources, designing the study methodology, collecting and analyzing data, and evaluating the results. ...
Article
Full-text available
National-level studies present the development techniques and challenges of sustaining energy-rich economies, particularly those in the developing world. However, examples of the application and interpolation of these broad-scale analyses to the regional level are scarce. Conversely, methods used at national levels are often infeasible when using higher-resolution regional or local data. Ultimately, progress in developing, managing, and advancing regional wealth databases and models is significantly missing from the literature. Herein, proposed pathways and general development frameworks are presented based on the presumptive constancy of total capital stock. Processes are outlined for acquiring information (data) and developing models to serve as a basis for qualitative and quantitative analyses of sustainable development policymaking decisions. We present a discussion around the sustainable wealth of energy-rich regions, and we suggest potential workflow methods for developing regional wealth knowledge bases and regional wealth models (RWMs). Structural scaffolding opportunities are presented for the validation of RWMs using pilot studies, followed by the process of disseminating modeling outcomes. Finally, we offer recommendations and needed innovations to advance the development of RWMs. The objectives of this article are not to provide a comprehensive literature review or consider all potential perspectives but rather to identify tools and necessary enhancements to established methods for assessing and modeling regional wealth and provide an inroad for readers wishing to learn more. The increased awareness generated through this article will mobilize assistance and generate new information that will strengthen this emergent area of research to intensify regional wealth sustainability for future generations.
... Verification and validation of a simulation model (adapted from [22]). ...
Chapter
Full-text available
When designing simulations, the objective is to create a representation of a real-world system or process to understand, analyze, predict, or improve its behavior. Typically, the first step in assessing the credibility of a simulation model for its intended purpose involves conducting a face validity check. This entails a subjective assessment by individuals knowledgeable about the system to determine if the model appears plausible. The emerging field of process mining can aid in the face validity assessment process by extracting process models and insights from event logs generated by the system being simulated. Process mining techniques, combined with the visual representation of discovered process models, offer a novel approach for experts to evaluate the validity and behavior of simulation models. In this context, outliers can play a key role in evaluating the face validity of simulation models by drawing attention to unusual behaviors that can either raise doubts about or reinforce the model’s credibility in capturing the full range of behaviors present in the real world. Outliers can provide valuable information that can help identify concerns, prompt improvements, and ultimately enhance the validity of the simulation model. In this paper, we propose an approach that uses process mining techniques to detect outlier behaviors in agent-based simulation models with the aim of utilizing this information for evaluating face validity of simulation models. We illustrate our approach using the Schelling segregation model.
... Therefore, the model works as expected and produces no logical errors, which satisfies the verification condition. Validation is the process of ensuring the accuracy of the model (Robinson, 1997; Moazzami et al., 2013; Madadi et al., 2013). The one-year inventory level provided by the company is used to validate the dynamic model. ...
... Rather, model output (simulated data) and data from the real system are what have been referred to as "two moving targets" that we try to overlay one upon the other (Rykiel, 1996). The data used to validate the scale and/or the model must themselves be validated (Robinson, 1997). Data validation requires that the data and its processing meet a specified standard (quality assurance) and that the interpretation of the data is demonstrably valid (Sargent, 2008). ...
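Validating the data themselves, as the excerpt notes, means checking that they meet a specified quality standard before they are used to judge the model. A minimal quality-assurance sketch; the field names, units, and range thresholds are illustrative assumptions, not a published standard:

```python
def validate_records(records, required=("timestamp", "service_time"),
                     max_service_time=120.0):
    """Screen raw system data before it is used for model validation.

    Flags records with missing fields or out-of-range values; the
    thresholds stand in for a project-specific data quality standard."""
    issues = []
    for i, rec in enumerate(records):
        for field in required:
            if rec.get(field) is None:
                issues.append((i, f"missing {field}"))
        st = rec.get("service_time")
        if st is not None and not (0.0 < st <= max_service_time):
            issues.append((i, f"service_time out of range: {st}"))
    return issues

sample = [
    {"timestamp": "2024-01-02T09:00", "service_time": 12.5},
    {"timestamp": None, "service_time": 9.0},                 # missing field
    {"timestamp": "2024-01-02T09:10", "service_time": -3.0},  # impossible value
]
print(validate_records(sample))
```

Screening of this kind addresses the quality-assurance half of data validation; whether the cleaned data are interpreted validly remains a separate, largely judgemental question.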
Article
Full-text available
Models of socio-environmental or social-ecological systems (SES) commonly address problems requiring interdisciplinary scientific expertise and input from a heterogeneous group of stakeholders. In SES modelling multiple interactions occur on different scales among various phenomena. These scale phenomena include the technical, such as system variables, process detail, inputs and outputs, which most often require spatial, temporal, thematic and organisational choices. From a good practice and project efficiency perspective, the problem scoping and conceptual model formulation phase of modelling is the one to address well from the outset. During this phase, intense and substantive discussions should arise regarding appropriate scales at which to represent the different phenomena. Although the details of these discussions influence the path of model development, they are seldom documented and as a result often forgotten. We draw upon personal experience with existing protocols and communications in recent literature to propose preliminary guidelines for documenting these early discussions about the scale(s) of the studied phenomena. Our guidelines aim to aid modelling group members in building and capturing the richness of their rationale for scoping and scale decisions. The resulting transcripts are intended to promote transparency of modelling decisions and provide essential support for the justification of the final model for its intended use. They also facilitate adaptive
... We use outcomes of real target cases to validate or support our agent-based model and simulations. Validation is critical for simulation studies, and it checks and determines whether the model is sufficiently accurate to back-calculate real-world cases (Robinson 1997). Based on behavior rules and mechanism settings (in the previous section), we try to find optimal parameters to confirm the model's validity by adaptively traversing parameters. ...
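Adaptively traversing parameters to confirm validity, as described above, amounts to searching the parameter space for values whose simulated outcome best reproduces the real target case. A toy calibration sketch; the one-parameter stand-in model, the grid, and the target figure are illustrative assumptions, not the cited 3D model:

```python
def simulate(panic_threshold, n_agents=1000):
    """Stand-in for a simulation run: a deterministic toy response curve
    mapping one behaviour parameter to an outcome (e.g. casualties)."""
    return n_agents / (1.0 + 10.0 * panic_threshold)

def calibrate(target, candidates):
    """Pick the parameter whose simulated outcome is closest to the
    real-world target outcome (squared-error discrepancy)."""
    return min(candidates, key=lambda p: (simulate(p) - target) ** 2)

observed_casualties = 125.0                   # illustrative real-case figure
grid = [i / 100 for i in range(1, 101)]       # candidate thresholds 0.01..1.00
best = calibrate(observed_casualties, grid)
print(f"best-fitting parameter: {best:.2f}")
```

In a stochastic agent-based model the discrepancy would be averaged over replications, and matching one target case supports, but does not by itself establish, the model's validity.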
Article
Full-text available
Emergencies such as terrorist attacks, with large numbers of casualties, have spread worldwide and have long been a global issue. Previous researchers employed traditional two-dimensional (2D) models to simulate the crowd dynamics between terrorists and civilians. However, these 2D models simplify real situations and do not consider individual heights and visions, so more accurate models are needed. In this work, we extend the 2D model and propose a three-dimensional (3D) model, whose core is to bring individuals' heights into the decision-making process of the 3D agents, for both terrorists and civilians. We first build the 3D environment. For the crowd dynamics, under the perception-decision-behavior framework, our 3D model includes individualized heights for all agents. Comparing the 2D and 3D models, we find that individual heights and visions greatly shape the outcomes. Height heterogeneity has significant effects on attack deaths and slight effects on stampede deaths, because smaller heights slow the crowd's movement and higher height heterogeneity impairs the visibility of civilians. The effects of height heterogeneity on deaths become more obvious as the group size of civilians exceeds 1900. We find a phase-transition threshold of 3030, beyond which stampede deaths exceed attack deaths. Moreover, the size effect of heroes follows the law of diminishing marginal returns. We also find that the number of heroes should be twice that of terrorists, which can guide better allocation of the police force and other public resources in emergency responses. To strengthen the counter-force, the heterogeneity of civilian heights should be controlled and self-motivated heroes should be encouraged, which is critical for public safety worldwide.
... To this end, Verification and Validation (V&V) techniques are generally carried out to assure the effectiveness of a simulation model (Kleijnen, 1995). Specifically, the verification process assures that the conceptual model of the problem was transformed into a computer simulation model with sufficient accuracy (Robinson, 1997). The well-structured debug tool of NetLogo® and its model visualisation were used to perform a dynamic verification test of the simulation model, an approach widely used in the literature (Sargent, 2013). ...
Article
Nowadays, the increase in patient demand and the decline in resources are lengthening patient waiting times in many chemotherapy oncology departments. Therefore, enhancing healthcare services is necessary to reduce patient complaints. Reducing the patient waiting times in the oncology departments represents one of the main goals of healthcare managers. Simulation models are considered an effective tool for identifying potential ways to improve patient flow in oncology departments. This paper presents a new agent-based simulation model designed to be configurable and adaptable to the needs of oncology departments which have to interact with an external pharmacy. When external pharmacies are utilised, a courier service is needed to deliver the individual therapies from the pharmacy to the oncology department. An oncology department located in southern Italy was studied through the simulation model and different scenarios were compared with the aim of selecting the department configuration capable of reducing the patient waiting times.
... In the fourth stage (verification and validation), trust in the study's outcomes is increased by using simulations (Robinson, 1997). The goal of the verification stage is to check that the simulation model is free of bugs and that its logic works (Shannon, 1998). The validation test is used to determine whether the model fits the real conditions and to confirm that the model is accurate enough to reflect the system under investigation (Robinson, 1997, 2004). Law and Kelton (2000) pointed out that the verification stage requires software debugging and checking for inappropriate implementations of the conceptual model, while the validation stage requires verifying the calculations. ...
Article
Full-text available
Poultry products such as chicken and eggs are part of staple foods and a source of protein in Indonesia so their availability for consumers needs to be ensured. Besides that, there is also a pressing need to reduce imports and keep prices stable. In Indonesia, poultry products are consumed more than beef, fish, and other staple protein products. This paper aims to develop a model of poultry product-supply chain using System Dynamics (SD) approach to a) capture the causality between variables in the availability of corn supply, feed mills, chicken production and customer demand; and b) to create a scenario where commodity demands can be fulfilled adequately so that food security can be strengthened. As the main ingredient of poultry feed, corn produce can be increased by using a tight planting method and the use of superior seeds. The sustainable availability of chicken feed is then expected to help boost poultry produce, in addition to improving the chicken farming methods per se. The proposed model is expected to provide a better understanding of poultry product food chain, which will subsequently inform relevant policymakers in formulating strategic programmes to strengthen food security.