Article

Introduction to the Special Issue on Software Architecture.


Abstract

Maintaining the consistency of multiple program representations, such as abstract syntax trees and program dependence graphs, in a program manipulation tool is difficult. This paper describes a hybrid software architecture for a meaning-preserving program ...
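The consistency problem described in this abstract can be made concrete with a small sketch: a primary representation notifies derived representations whenever it changes, so that each derived view re-derives itself. This is only an illustrative observer-style sketch under invented names (SyntaxTree, DependenceGraph, rebuild), not the hybrid architecture the paper describes.

```python
# Illustrative sketch only: keeping a derived representation in sync with a primary one
# via change notifications. Names are invented, not taken from the paper.
class SyntaxTree:
    def __init__(self):
        self.nodes = []
        self._observers = []          # derived representations to notify

    def attach(self, observer):
        self._observers.append(observer)

    def add_node(self, node):
        self.nodes.append(node)
        self._notify()

    def _notify(self):
        for obs in self._observers:
            obs.rebuild(self)         # each derived view re-derives itself


class DependenceGraph:
    def __init__(self, tree):
        self.edges = []
        tree.attach(self)
        self.rebuild(tree)

    def rebuild(self, tree):
        # Trivial stand-in for real dependence analysis: chain consecutive nodes.
        self.edges = list(zip(tree.nodes, tree.nodes[1:]))


tree = SyntaxTree()
graph = DependenceGraph(tree)
tree.add_node("x = 1")
tree.add_node("y = x + 1")
print(graph.edges)                    # [('x = 1', 'y = x + 1')]
```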


... The term 'architecture' is widely used in publications. It is generally assumed that the reader knows what this term means (Zachman, 1987; Van Waes, 1991; Soni et al., 1995; Garlan and Perry, 1995). In this section, an initial definition is derived from a small dictionary survey. ...
... Obviously, more actors became necessary to control the growing complexity of software. Garlan and Perry (1995) found that the term 'architecture' is used in a number of ways in software engineering. Among the various uses are a) the architecture of a particular system, as in 'the architecture of this system consists of the following three components,' b) an architectural style, as in 'this system adopts a client-server architecture,' and c) the general study of architecture, as in 'the papers in that issue are about architecture.' ...
... Zachman states that the increased scope of design and levels of complexity of system implementations are forcing the use of architectural models for defining and controlling the interfaces and the integration of the system components (Zachman, 1987). Moreover, at their best, architectural descriptions expose the high-level constraints on system design, as well as the rationale for making specific architectural choices (Garlan and Perry, 1995). ...
Book
Full-text available
Ta Prohm, one of Angkor’s temples, was built in approximately 1186 AD. After the defeat of the Khmer empire, most of the temples were left to the jungle. For centuries, the jungle threatened the structures of Ta Prohm. The temple was intertwined with trees that reduced the strength of its structure. Ta Prohm survived the encroaching trees, but would have been lost in the jungle without human intervention. However, removing some of the trees might lead to a partial collapse of the temple. After years of evolution, shop floor control systems frequently consist of entangled components. Modifications violate the system architecture, and the complexity of the system becomes increasingly hard to manage. Because of the entanglement of the components, changes in a component propagate to other components. Systems become so complex that they can hardly be changed anymore. Systems architecting tries to maintain the integrity of complex systems so that they can be adapted to future requirements. This study explores the systems architecting discipline.
... The scalability of the HSI is an entry question. For a client-server architecture, where agents act as clients and APIs as servers, the governance relationship from the constrained configuration defines the use patterns of the service (adapted from Päivärinta 2019, citing Garlan & Perry 1995). ...
... Definition of Centralized Software Architecture (Päivärinta 2019, 20, citing Clements et al. 2003; Garlan & Perry 1995; Perry & Wolf 1992); Microservice Pattern (ibid., 20, citing Medvidovic & Taylor 2010); Virtualized Docker Containers (ibid., 20, citing Docker 2019); API transport (ibid.,); Authentication (ibid., 38-40); Application Portal (ibid., 41-46); Virtual Machines (ibid., 50-51). ...
Article
Human systems integration (HSI) involves Human Factors and Ergonomics (HFE), Human-Machine Interaction (HMI), engineering, and domain experience, which are the initial components of systems engineering (SE) in all industries' economies: wellbeing, transportation, energy, IT, retail, finance, manufacturing, and production. HSI can be achieved by combining virtual prototyping with Human-In-the-Loop (HIL) simulations. HMI is typically a model-based and patented innovation; it uses HIL and requires a homogenized systemic reflection with feedback. Virtual Reality (VR) Human-Centered Design (HCD) is sustainable. VR-controlled HCD acts as a definitive Key Enabling Technology (KET) concept in considering the full range of system Life Cycle Assessments (LCAs) and whether the process is sustainable. To this end, on the planet earth, human organizational elements are not only assessed during the design process but the whole LC of a system. Against intuitive education, it has been stated that conservative and narrow LCA should not be implemented in a sustainable world but instead Cradle-to-Cradle (C2C) design from social, economic, or environmental terms; the objective is to increase positive impacts, not reduce negative ones as in LCA. By enabling virtual environments, digital tools enable these new capabilities, which should be realized as sustainable by Digital Twin (DT) formable as a Sustainable Model Based HSI (SMBHSI) concept with high-level Artificial Intelligence (AI) and C2C consideration forming the level of the metaverse, straining from VR. DT-based Internet of Things (IoT) solutions enable investigators to test scenarios for future foresight, give corporations abilities to benefit from performance metrics based on domain experience, and are a crucial concept in SMBHSI. A case on this proceeding instance will display an example for SMBHSI when the method is scoping review to the strategic objective to form an up-to-date linear outlook. Integrating HSI on AI and C2C thinking methodologies helps to save resources and move towards the globalized green level of circular economies, representing the economic integration of the economy of the human system by indirect resources utilization suggestion as indicative as inference and adaptation to blockchains.
... A. Dependency Analysis Ideally, the system architecture is thoroughly developed according to the project goals and requirements before the actual development phase takes place [9], [10]. This approach allows more precise project management in the future stages and helps achieve higher standards of clarity in communication between all stakeholders involved [11]. ...
... Despite the universal acceptance of these facts in industry, initial design decisions might be overturned with time, due to unforeseen problems and limitations or due to poor development practice. The erosion of a codebase's clean structure is common [12], and in some cases it might weaken or even destroy the initial intent completely [9]. Furthermore, unintended design alterations induce the accumulation of technical debt, which may make the current development and maintenance processes more convoluted if left unchecked. ...
Preprint
Full-text available
Dependency analysis is recognized as an important field of software engineering due to a variety of reasons. There exists a large pool of tools providing assistance to software developers and architects. Analysis of inter- and intra-project dependencies can help provide various insights about the entire development process. There is, however, currently a lack of tools that would support researchers by extracting intra-project dependencies data in a format most suited for further analysis. In this paper we introduce DepMiner - an open source, language-agnostic tool for mining detailed dependencies data from source code, based on extensive static analysis capabilities of an industry standard IDE. DepMiner can be easily integrated into arbitrary mining pipelines to conduct large-scale source code processing jobs involving intra-project dependencies. It is easily extensible to support other languages of source code, different granularities of analysis, and other use-specific needs.
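As a rough illustration of what mining intra-project dependencies from source code can look like (not DepMiner itself, which builds on the static analysis capabilities of an IDE), the sketch below uses Python's standard ast module to list the modules each file of a project imports.

```python
# Hedged sketch: extract module-level import dependencies from Python sources with the
# standard library's ast module. A toy stand-in, not the DepMiner tool.
import ast
from pathlib import Path

def imports_of(path: Path) -> set[str]:
    tree = ast.parse(path.read_text(encoding="utf-8"))
    deps = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            deps.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            deps.add(node.module)
    return deps

def project_dependencies(root: str) -> dict[str, set[str]]:
    # Map each Python file under `root` to the set of modules it imports.
    return {str(p): imports_of(p) for p in Path(root).rglob("*.py")}

if __name__ == "__main__":
    for module, deps in project_dependencies(".").items():
        print(module, "->", sorted(deps))
```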
... Concerning software architecture evaluation, intended as a way to achieve quality attributes (i.e., maintainability and reliability in a system), some approaches have emerged, the most prominent being ATAM, proposed by the Software Engineering Institute [4,5,14,30]. Typical research in this domain is about how architectural patterns and guidelines impact software components and configurations [22]. A survey study [16] analyzes architectural patterns to identify potential risks, and to verify the quality requirements that have been addressed in the architectural design of a system. ...
... SwQualityFactor ⊑ Description (18); SwQualityFactor ⊑ ∀usesQualChar.SwQualityChar (19); SwQualityFactor ⊑ ∃usesQualChar.SwQualityChar (20); MeasureRes ⊑ ∃assess.SwQualityChar (21); MeasureRes ⊑ =1 hasValue.Value (22); MeasureRes ⊑ =1 hasMetric.Metric (23); ArcAlignmentRes ⊑ MeasureRes (24) ...
Article
Full-text available
Quality, architecture, and process are considered the keystones of software engineering. ISO defines them in three separate standards. However, their interaction has been scarcely studied, so far. The SQuAP model (Software Quality, Architecture, Process) describes twenty-eight main factors that impact on software quality in banking systems, and each factor is described as a relation among some characteristics from the three ISO standards. Hence, SQuAP makes such relations emerge rigorously, although informally. In this paper, we present SQuAP-Ont, an OWL ontology designed by following a well-established methodology based on the re-use of Ontology Design Patterns (i.e. ODPs). SQuAP-Ont formalises the relations emerging from SQuAP to represent and reason via Linked Data about software engineering in a three-dimensional model consisting of quality, architecture, and process ISO characteristics.
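As an illustration of how description-logic axioms of the kind quoted in the excerpt above might be written down, here is a hedged sketch using the owlready2 library; the class and property names are taken from the excerpt, while the ontology IRI and this particular encoding are assumptions for illustration, not the authors' SQuAP-Ont source.

```python
# Hedged sketch (not the SQuAP-Ont source): encoding a few axioms such as
# SwQualityFactor ⊑ ∃usesQualChar.SwQualityChar with owlready2.
from owlready2 import Thing, ObjectProperty, get_ontology

onto = get_ontology("http://example.org/squap-sketch.owl")  # made-up IRI

with onto:
    class SwQualityChar(Thing): pass
    class SwQualityFactor(Thing): pass
    class MeasureRes(Thing): pass
    class ArcAlignmentRes(MeasureRes): pass       # ArcAlignmentRes ⊑ MeasureRes

    class usesQualChar(ObjectProperty):
        domain = [SwQualityFactor]
        range = [SwQualityChar]

    class assess(ObjectProperty):
        domain = [MeasureRes]
        range = [SwQualityChar]

    # SwQualityFactor ⊑ ∀usesQualChar.SwQualityChar and ⊑ ∃usesQualChar.SwQualityChar
    SwQualityFactor.is_a.append(usesQualChar.only(SwQualityChar))
    SwQualityFactor.is_a.append(usesQualChar.some(SwQualityChar))
    # MeasureRes ⊑ ∃assess.SwQualityChar
    MeasureRes.is_a.append(assess.some(SwQualityChar))

onto.save(file="squap-sketch.owl")
```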
... The software architecture sub-field has deep roots dating back to the very early days of software engineering, but started gaining traction in earnest in the mid to late 1980s. Garlan and Perry [40] noted one of the trends that prompted the attention to software architecture: ...
Preprint
Full-text available
This chapter seeks to support software engineering (SE) researchers and educators in teaching the importance of theory as well as the theorizing process. Drawing on insights from other fields, the chapter presents 12 intermediate products of theorizing and what they mean in an SE context. These intermediate products serve different roles: some are theory products to frame research studies, some are theory generators, and others are components of theory. Whereas the SE domain doesn't have many theories of its own, these intermediate products of theorizing can be found widely. The chapter aims to help readers to recognize these intermediate products, their role, and how they can help in the theorizing process within SE research. To illustrate their utility, the chapter then applies the set of intermediate theorizing products to the software architecture research field. The chapter ends with a suggested structure for a 12-week course on theorizing in SE which can be readily adapted by educators.
... Software architecture can contribute to reducing time-to-market by establishing strategies such as the use of existing assets and common architectural frameworks. It can also optimize the integration and the mechanisms for generating the architecture (Garlan and Perry, 1995). For open source ecosystems, time-to-market depends on the business scenario and community goals. ...
Article
Full-text available
The health state of a software ecosystem has been determined by its capacity of growth and longevity. Three health indicators represent a healthy software ecosystem: robustness, productivity, and niche creation. Studies focusing on understanding the causes and processes of the state of health of ecosystems have used these indicators largely. Researchers have intensified studies to understand how to achieve a good health state. Despite the growing number of studies, there is little knowledge about influences and actions to achieve health and, more specifically, that consider the effects of the software architecture on the ecosystem. This article presents a study exploring seven open source ecosystems within different domains to describe the influence of architectural practices on the software ecosystem health in terms of their motivations and effects. Our main goal was to understand how the software architecture and related practices can contribute to a healthy ecosystem. We conducted a netnography-based study to gather practices used to create and maintain the software architecture of these ecosystems. Our study brings evidence that architectural practices play a critical role in the achievement of ecosystems’ health. We found fifty practices that have influenced different aspects of health indicators. We highlight the importance of five influential factors – business goals, experience, requirements, resources, and time-to-market – for motivating the adoption of such practices. These factors may also contribute to understanding different strategies used to achieve a good health state. Moreover, we proposed a novel health indicator, trustworthiness, that accounts for the normal operation of a healthy software ecosystem.
... The architecture of a system can be defined as an abstraction of the system in the form of a set of software structures needed to reason about it [145]. An important concept when discussing system architectures is the architectural style, which defines constraints on the form and structure of an architecture [146]. This is closely related to the architectural pattern, which is a reusable, well-established architectural solution to a recurring design problem [145]. ...
Preprint
Full-text available
In recent years, there has been an increased focus on early detection, prevention, and prediction of diseases. This, together with advances in sensor technology and the Internet of Things, has led to accelerated efforts in the development of personal health monitoring systems. Semantic technologies have emerged as an effective way to not only deal with the issue of interoperability associated with heterogeneous health sensor data, but also to represent expert health knowledge to support complex reasoning required for decision-making. This study evaluates the state of the art in the use of semantic technologies in sensor-based personal health monitoring systems. Using a systematic approach, a total of 40 systems representing the state of the art in the field are analysed. Through this analysis, six key challenges that such systems must overcome for optimal and effective health monitoring are identified: interoperability, context awareness, situation detection, situation prediction, decision support, and uncertainty handling. The study critically evaluates the extent to which these systems incorporate semantic technologies to deal with these challenges and identifies the prominent architectures, system development and evaluation methodologies that are used. The study provides a comprehensive mapping of the field, identifies inadequacies in the state of the art, and provides recommendations for future research directions.
... Software architectures have played a central role in the development of successful software systems over the past 20 years [15,25]. Garlan and Perry [10] state that software architectures can have a positive impact in many aspects of software development, such as understanding, reuse, evolution, analysis, and management. Thereby, software architectures are also an important factor in guaranteeing software systems' quality [6], such as maintainability, dependability, and interoperability [26]. ...
... A software architecture aims to explain the organization of a software system, presenting its main components, the interactions among them, and the interactions with other (external) systems [68]. There are several styles to describe the architecture of an IoT platform. ...
Article
Full-text available
Due to the large variety of Internet of Things (IoT) platforms, selecting the right one to implement an IoT solution is a tough task. To help developers make the right selection, this paper presents a Systematic Multivocal Mapping Study on IoT platforms and their main software elements, to define their anatomy considering how they have been studied by market analysts and academia. By using a precise protocol defined in this work, it was possible to select 50 academic articles and industry reports that perform IoT platform descriptions, evaluations and comparisons. As a result, this paper identified that the most important IoT platforms are AWS IoT, Azure IoT, Watson IoT, PTC ThingWorx and Google IoT. Their main capabilities are Interoperability, Security & Privacy, Developer Support, Data Management, Device Management and Services Management. An architectural model was also defined, with the main platform components highlighted according to their relevance, the main communication models (Publish/Subscribe and REST APIs), and the common API that should be implemented by IoT platforms.
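To make the Publish/Subscribe communication model mentioned above concrete, here is a minimal hedged sketch using the paho-mqtt client library (1.x-style callbacks); the broker address, topic, and payload are placeholders, and the snippet is not tied to any of the platforms surveyed.

```python
# Minimal Publish/Subscribe sketch with paho-mqtt; broker host and topic are placeholders.
import json
import paho.mqtt.client as mqtt

BROKER = "test.mosquitto.org"          # placeholder public broker
TOPIC = "devices/sensor-42/telemetry"  # made-up topic

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)            # applications subscribe to device topics

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)

# A device would publish readings to the same topic:
client.publish(TOPIC, json.dumps({"temperature": 21.5}))
client.loop_forever()
```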
... To fully appreciate the differences between testing MSBAs and traditional applications, it is necessary to describe them from their software architecture perspectives. Garlan and Perry (1995) define software architecture as "the structure of the components of a program/system, their interrelationships, and principles and guidelines governing their design and evolution over time." This definition refers to software architecture as high-level structures of a software system, the discipline of creating such structures, and their documentation. ...
Article
Full-text available
In recent years, there has been an increase in the number of software applications developed using the microservices architectural pattern. This trend is due to the benefits it offers over the more traditional N-tier architectural patterns that use monolithic designs for each tier. The value of using the microservices architectural pattern, particularly in the cloud, has been pioneered by companies such as Netflix and Google. These companies have created protocols and tools to support the development of cloud-based applications. However, the testing of microservices applications continues to be challenging due to the added complexity of network communication between the collaborating services. In addition, an increasing number of tools are being used to test microservices-based applications, which makes selecting the most appropriate tool(s) a challenging task. In this article, we compare several open-source tools used to support the testing of microservices based on testing levels, the scaffolding required, languages used for test cases, and the type of interface used to interact with the applications under test. We describe a prototype for a microservices-based application called Rideshare that allows users to reserve rides from available drivers. Using the Rideshare application, we performed a study using a subset of selected open-source tools to determine the overhead added by these tools. We present the results of the study and describe our experiences in configuring the tools to test the Rideshare application using different testing approaches.
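As a flavour of what a service-level test for such an application could look like, the sketch below uses pytest conventions and the requests library against a hypothetical endpoint; the URL, route, and payload are invented and are not taken from the Rideshare prototype or from any of the tools compared in the article.

```python
# Hedged sketch of a service-level test for a microservice, using pytest + requests.
# The base URL, endpoint and payload are hypothetical, not the article's Rideshare API.
import requests

BASE_URL = "http://localhost:8080"     # assumed local deployment of the service under test

def test_create_ride_request_returns_created():
    payload = {"riderId": "r-1", "pickup": "A", "dropoff": "B"}
    response = requests.post(f"{BASE_URL}/rides", json=payload, timeout=5)
    assert response.status_code == 201
    body = response.json()
    assert body["riderId"] == "r-1"
    assert "rideId" in body

def test_unknown_ride_returns_not_found():
    response = requests.get(f"{BASE_URL}/rides/does-not-exist", timeout=5)
    assert response.status_code == 404
```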
... The primary mechanism for coordination is the software architecture, which describes the activities and interdependencies in terms of components and their relationships [17]. Moving from activity (and subsystem) dependencies toward system-level dependencies requires software architects and developers to have a common understanding of the software architecture [35]. ...
Conference Paper
Full-text available
Despite considerable efforts to address organizational problems of distributed software development, currently available solutions do not seem to be sufficient. They are fragmented into individual patterns either not forming coherent pattern languages to address organizational distributed software development or being incorporated into extensive pattern languages for organizing software development in general. Another problem is their disconnection from the current technological support for collaboration. We attempt to overcome these problems by providing a set of six organizational patterns for distributed software development. We relate them to each other and to other known patterns and practices, practically establishing a pattern language for the organization of distributed software development. The overall idea of how this pattern language can be used is presented using a pattern story of a real company.
... II. RELATED WORK The domain of software architecture is not a new topic of study; its importance has been recognized, and researchers have focused on this area for over 25 years [1]. Many researchers have spent time studying how to better architect software products and have analyzed different software architectural patterns from both an academic point of view and a practitioner's one. ...
... Due to competition, this creates pressure on software companies to provide high-quality software faster than their competitors [2], that is, with a shorter time-to-market. These challenges are rising day by day; to handle this issue, IT companies are searching for ways to improve and fast-track the software development process so as to deliver first-class products/software with more requirements in less time, but with the same or fewer man-hours [3], [4]. ...
Conference Paper
Full-text available
Software development organizations face issues such as Technical Debt in their software projects. Technical Debt (TD) is incurred when a person involved in the engineering of software intentionally or unintentionally makes wrong or non-optimal design decisions. This problem occurs due to a non-systematic and undefined approach to managing the high level of uncertainty in requirements. These non-optimal design decisions and the non-systematic approach can result in the introduction of code smells. Code smells are actually technical debt (also known as perceived debt), which may cause a project to fade out with time. The code of a software project with technical debt is considered unclean code. Refactoring is an activity of modifying the code to remove technical debt from a software project and make the project code clean. The goal of this work is to study the impact of removing TD on the effort required to add features and remove bugs from software. To this end, this work uses a stepwise approach: first, code smells (considered as TD) are identified, and in the next step the smells are removed and the impact of removing the code smells (on removing a bug and adding a new feature) is observed in terms of the effort required during maintenance. Technical debt in five open source software applications has been calculated, and the impact of removing the code smells has been observed on one of the applications, named NopCommerce (an e-commerce-based system). The effort to add a new feature and remove a bug, before and after refactoring, has been calculated in terms of man-hours. The effort required to add a new feature and remove a bug from the clean code was 7% less than the effort required to do the same in the unclean code. Whether to perform refactoring or not is a decision made by software developers. The impact of refactoring in reducing the effort required to maintain code will help a developer make a decision regarding refactoring.
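As a small illustration of the kind of refactoring the study describes (removing a duplicated-code smell so that later changes cost less effort), consider this before/after sketch; it is a generic example, not code from NopCommerce or the other applications studied.

```python
# Before: a duplicated-code smell; the discount rule is repeated, so changing it
# means editing (and possibly missing) several places.
def invoice_total_before(items):
    total = 0.0
    for price, qty in items:
        amount = price * qty
        if amount > 100:
            amount *= 0.9   # duplicated discount rule
        total += amount
    return total

def shipping_quote_before(price, qty):
    amount = price * qty
    if amount > 100:
        amount *= 0.9       # same rule, duplicated
    return amount + 4.99

# After: the rule is extracted once, so a new feature or bug fix touches one place.
def discounted(amount: float) -> float:
    return amount * 0.9 if amount > 100 else amount

def invoice_total(items):
    return sum(discounted(price * qty) for price, qty in items)

def shipping_quote(price, qty):
    return discounted(price * qty) + 4.99
```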
... As systems moved to distributed structures, the growing complexity of systems and their environments led to more complicated design decision making and long-term adjustments [15]. This led to software architecture being recognized as a discipline and to the introduction of methods to deal with these more complicated situations, such as viewpoints and views for architectural description, explicit documentation, and the management of stakeholders [16]. The description of proven and reusable architectural styles and the introduction of architectural evaluation techniques permitted the organized comparison of architectural options [17]. Internet-connected systems presented additional new challenges that resulted in a fresh architectural focus on their non-functional properties [18]. ...
Conference Paper
Full-text available
Software architecture is a vital part of every software system, providing a mechanism that can enhance credibility between the client-side and vendor-side organizations. IoT software will face failure if its architecture is not designed as desired. Elaborating the challenges faced by the vendor-side organization at the beginning of software development is an integral part of this study. The Software Architecture Designing Model (SADCM) for IoT software development is intended to reduce the chances of failure of a software system in future use and to make the system environment user friendly. This will assist the vendor organization in overcoming all those challenges that create hurdles in designing the architecture of IoT software. A systematic review of the existing literature on this topic is analyzed to find the challenges. These challenges shall be validated by an empirical approach, and finally the proposed model shall be assessed by a study in a software development organization.
... The analyzer can break down services into a set of functions using requirements gathered in the requirements phase and can determine the functions with a high possibility of change during the execution time of the software product [3], [4]. The structure of a software system is an output of the design phase and an input to the implementation phase [5], [6], [7]. The structure of a software system is used in the implementation phase to program the software system [5,8,9]. ...
Article
Full-text available
A software system is built to serve users in its environment domain. It must have the ability to change its services in the case of user requests or environment changes after deployment time. During the design phase of the development life cycle, the dynamic software architecture must be designed in a way that gives the system the ability to change its services at run time. This paper introduces an Independent Architecture Framework (IAF) for dynamic software systems. The proposed IAF is designed to represent each function independently. These functions are then connected to build the complete set of software system services. Therefore, the new IAF enables the software system to easily add, update, upgrade, and swap its services at run time.
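The idea of representing each function independently and swapping services at run time can be illustrated with a small registry sketch; this is only an illustrative stand-in under assumed names, not the IAF proposed in the article.

```python
# Illustrative sketch (not the article's IAF): a registry that lets services be added,
# replaced, or removed while the system is running.
class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, func):
        self._services[name] = func          # add or swap a service at run time

    def unregister(self, name):
        self._services.pop(name, None)

    def call(self, name, *args, **kwargs):
        if name not in self._services:
            raise KeyError(f"no service registered under '{name}'")
        return self._services[name](*args, **kwargs)


registry = ServiceRegistry()
registry.register("greet", lambda user: f"Hello, {user}!")
print(registry.call("greet", "Ada"))             # Hello, Ada!

# Later, the service is upgraded without stopping the system:
registry.register("greet", lambda user: f"Welcome back, {user}!")
print(registry.call("greet", "Ada"))             # Welcome back, Ada!
```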
... The existing design and implementation languages, however, were not appropriate for describing software architectures. So-called "box-and-lines" informal drawings were used to describe software architectures in the beginning (Garlan and Perry, 1995), but with limited possibilities. Therefore, researchers focused on the development of dedicated ADLs. ...
Article
Purpose The paper proposes the development of a graphical architecture description language (ADL) that allows a better understanding of software architectures for nontechnical actors and purposes and, beyond, can serve as a communication tool between domain experts and IT experts, for instance, in a software development process. Design/methodology/approach The paper follows the methods and guidelines of design science research. By deriving characteristics and general requirements for ADLs from a research literature review and from industry standards, the paper provides a conceptual modeling approach for an ADL. The model design is based on typical requirements and suggestions derived from literature and related work. The application possibilities and advantages are then demonstrated with a usage scenario. Findings The paper elaborates a user-oriented ADL that makes software architecture comprehensible for stakeholders and end users. It provides a high level of abstraction and, thus, is not restricted to a particular domain. The paper also provides a corresponding modeling editor as well as an underlying catalogue with symbols and rules for the ADL. Research limitations/implications As this is a conceptual study, the ADL has not been practically evaluated yet. Thus, the usefulness of this academic approach for the industry remains to be validated. Originality/value The elaborated ADL can serve as a language to visualize software architectures, particularly in the business domain, in a comprehensible manner. Still, it retains the structured character of ADLs to facilitate communication on an IT-near level. In including nontechnical actors, the approach broadens the overall application capabilities of ADLs.
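The structures that ADLs describe (components, connectors, and the configuration wiring them together) can be sketched in a few lines; the sketch below is a generic illustration under invented names, not the graphical ADL proposed in the paper.

```python
# Generic component-and-connector sketch (invented names), illustrating the kind of
# structure an ADL describes rather than any particular ADL.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    provides: list[str] = field(default_factory=list)
    requires: list[str] = field(default_factory=list)

@dataclass
class Connector:
    source: str          # component providing an interface
    target: str          # component requiring it
    interface: str

@dataclass
class Architecture:
    components: list[Component]
    connectors: list[Connector]

    def check(self):
        # A minimal well-formedness check: connectors only reference known components.
        names = {c.name for c in self.components}
        for conn in self.connectors:
            assert conn.source in names and conn.target in names, conn


shop = Architecture(
    components=[
        Component("WebShop", requires=["Payments"]),
        Component("PaymentService", provides=["Payments"]),
    ],
    connectors=[Connector("PaymentService", "WebShop", "Payments")],
)
shop.check()
```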
... SA refers to a set of components of a software system, their connections and their principles and guidelines to manage the development and evolution during software life cycle [14]. SA describes the system's structure, interaction of its components and their core properties, playing an important role as an interface between requirements and source code [15]. ...
... Various advantages of software architecture have been identified by [132,133]: based on the dependencies between components, the architecture guarantees more precise management of the costs and risks of modifications. It also allows an evaluation of the qualities of the system as a whole, as well as of the qualities of each component. ...
... This instance could be represented by means of diagrams and its documentation. An architectural style is responsible for defining the constraints and the structure of how architectural instances should be represented [7]. A reference architecture is a generic architecture for a specific domain that specifies standard interface and integration rules between architecture components [9]. ...
Conference Paper
Full-text available
Context: The modernization of a legacy software system is a reality for companies that need to improve their systems. Architecture-Driven Modernization (ADM) is a model-driven blueprint that supports modernization engineers in modernization processes. ADM considers the usage of reengineering and model-driven architecture (MDA) concepts and advocates the employment of standard metamodels, such as the Knowledge Discovery Metamodel (KDM). One of the main challenges while considering ADM is how modernization tools (MTs) should be developed in order to enable the success of modernization projects that follow the ADM blueprint. Objective: In this paper we address the main requirements and guidelines on how modernization tools should be designed by means of a Reference Architecture (RA) for ADM-based Modernization Tools named RADMoTo. Methodology: We followed a four-step methodological process to build RADMoTo, and to evaluate it we submitted the RA technical report and a survey to ADM experts and software architects in order to get feedback on the RA acceptance level. Results: We found that, in general, the participants believe that our RA provides an acceptable level while considering the architectural views. Conclusion: The feedback from the participants helped in the improvement process of the RA. Thus, we claim that this paper provides concrete guidelines on how modernization tools should be designed and on how architectural instances derived from our RA could benefit from reusability and interoperability.
... These include functionality achieved by the system, such as the availability of the system in the face of faults, the difficulty of making specific changes to the system, and the responsiveness of the system to user requests [2]. Hence, the appropriate use of software architectures can have a positive impact on various aspects of software development, such as understanding, reuse, evolution, analysis, and management of systems, being a factor of extreme importance for the quality of software systems [11]. Considering their relevance, over the years, various approaches, processes, techniques, best practices [5,15,21,29], and an international standard [18] were developed for the software architecture development process. ...
Conference Paper
Research on sustainability in software engineering has gained importance as a result of the need to create better software and therefore avoid compromising future generations' opportunities, whether in the social, economic, technical or environmental dimension. The social dimension encompasses the direct support of software systems in any domain, as well as activities or processes that create benefits for social communities, such as health, education, and transportation. Although social aspects have been previously examined within the broader context of software engineering, the design of software systems based on the notion of social sustainability is still poorly understood and practiced. This paper outlines relevant points surrounding social sustainability as a concern in software architecture design. In particular, we discuss some issues in designing software systems that generate social values and have a positive impact on communities. We hope that this discussion will help broaden the concept of social sustainability in architectural design decisions and bring awareness to the particular needs of software systems that have a direct impact on human well-being and contribute to sustainable development.
... Concerning software architecture evaluation, intended as a way to achieve quality attributes (i.e., maintainability and reliability in a system), some approaches have emerged, the most prominent being ATAM, proposed by the Software Engineering Institute (Kazman et al., 1994; Clements et al., 2002; Bengtsson et al., 2004; Bellomo et al., 2015). Typical research in this domain is about how architectural patterns and guidelines impact software components and configurations (Garlan and Perry, 1995). A survey study (Dobrica and Niemela, 2002) analyzes architectural patterns to identify potential risks, and to verify the quality requirements that have been addressed in the architectural design of a system. ...
Preprint
Full-text available
Quality, architecture, and process are considered the keystones of software engineering. ISO defines them in three separate standards. However, their interaction has been scarcely studied, so far. The SQuAP model (Software Quality, Architecture, Process) describes twenty-eight main factors that impact on software quality in banking systems, and each factor is described as a relation among some characteristics from the three ISO standards. Hence, SQuAP makes such relations emerge rigorously, although informally. In this paper, we present SQuAP-Ont, an OWL ontology designed by following a well-established methodology based on the re-use of Ontology Design Patterns (i.e. ODPs). SQuAP-Ont formalises the relations emerging from SQuAP to represent and reason via Linked Data about software engineering in a three-dimensional model consisting of quality, architecture, and process ISO characteristics.
... Numerous similar issues lead to an ineffective use of documentation and, ultimately, its abandonment. The inconsistency problem caused by the lack of common understanding of the term architectural view was not surprising: after all, there exists no single agreed-upon definition for software architecture (Garlan and Perry, 1995). However, we found out that even some terms that are well rooted and used within the software community were largely misunderstood. ...
Conference Paper
Full-text available
In this paper we draw on our experience in the automotive industry to portray the clear need for proper documentation of Simulink models when they describe the implementations of embedded systems. We effectively discredit the “model is documentation” motto that has been hounding the model-based paradigm of software development. The state of the art of documentation of Simulink designs of embedded systems, both in academia and industrial practice, is reviewed. We posit that lack of proper documentation is costing industry dearly, and propose that a significant change in development culture is needed to properly position documentation within the software development process. Further, we discuss what is required to foster such a culture.
... It should thus be noted that, in a sense, ideas were converging naturally, and it was during that time (ca. 1995) that more attention started to be given to the true meaning of (software) architecture, trying to reach a consensus (Garlan and Perry, 1995). Finally, in 1996 an increasingly universal vocabulary in this discipline started to settle (Shaw and Garlan, 1996), even considering that architectural styles were "becoming the lingua franca of architecture-level design" (Shaw and Clements, 1997), leading eventually to the standardized definition (ISO et al., 2011), which currently is fixed as follows: "fundamental concepts or properties of the system in its environment embodied in its elements, relationships, and in the principles of its design and evolution". ...
Thesis
Full-text available
Over time, people, in the face of vicissitudes and the concrete necessities of life, were forced, in a natural and implicit way, to create defence, protection and even survival mechanisms, having the need to adapt to the surroundings and to use the offered goods and information. In the case of blind or partially sighted people, the surrounding environment in itself shows up as a challenge and natural barrier, limiting their action, movement and orientation. Despite navigation and guidance systems being in constant evolution and development, there are still many gaps and constraints in the support and help of people with visual impairments, and thus the research done in this specific field deserves all the credit and support at all levels, always having as a beacon the improvement of the welfare of people with these difficulties. The project of which this dissertation is part (CE4Blind) builds upon the advantages of the latest technological developments to create innovative solutions that can strongly contribute to the support and inclusion of people with special needs, specifically the blind community and all people suffering from visual impairment. Perfectly aware that the work presented herein tries to support and contribute to the inclusion of blind and partially sighted people - a problem affecting more and more of the world's population - in terms of their mobility, guidance and navigation, an architecture that is governed by foundations that the literature considers feasible and reliable is specified, designed and implemented, also taking into account the importance of partnerships with various institutions, and even with Governments and Countries themselves. As personal motivation, new ideas and contributions have been brought, always keeping in mind a greater good, which is the quality of life of those in need, in particular, the visually impaired.
... In an introductory article to a special issue on software architecture, Garlan and Perry (1995) explained how software architecture was starting to become a necessary discipline in software engineering. They explored several definitions of software architecture and introduced an evolved one. ...
Article
Full-text available
Enterprise Information Systems (EIS) are widely and extensively used in many domains such as banking, telecommunication, e-commerce and government. Although several research studies and investigations have explored the importance of EIS, only a few studies have focused on effective and efficient end-to-end approaches to developing such systems. In this article, a proposed software development framework (Smart-EIS) is presented. The primary objective of Smart-EIS is making the development of high-quality EIS more effective and efficient. In particular, it aims to reduce the development cost and to provide built-in transparent quality, security, performance and user-experience features. A comprehensive review of traditional EIS is presented. This includes a discussion of the characteristics and patterns of such systems, the layered architectural patterns and the main components of these systems. The working methodology for the work discussed in this article depends on dynamically constructing the common and general aspects of EIS at runtime. The methodology starts with extracting metadata models from the traditional architectural and component patterns. Based on these metadata, APIs have been designed and implemented. These libraries were then composed to form the complete proposed framework. In terms of validation and evaluation, the proposed framework, including its APIs, has been implemented as open-source projects, used to build a simple human resource management system, and then utilized to rebuild a student information system. Results of the validation and evaluation have been presented and discussed, which show promising potential.
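The metadata-driven idea described in the methodology (constructing common EIS aspects at runtime from extracted metadata models) can be hinted at with a small sketch; the metadata format and validator below are assumptions for illustration, not the actual Smart-EIS metamodels or APIs.

```python
# Hedged sketch of metadata-driven construction: a generic validator/record factory is
# built at runtime from an entity's metadata. The metadata format is invented, not Smart-EIS.
EMPLOYEE_META = {
    "entity": "Employee",
    "fields": {
        "id":     {"type": int,   "required": True},
        "name":   {"type": str,   "required": True},
        "salary": {"type": float, "required": False},
    },
}

def make_validator(meta):
    def validate(record: dict) -> dict:
        clean = {}
        for field, spec in meta["fields"].items():
            if field not in record:
                if spec["required"]:
                    raise ValueError(f"{meta['entity']}: missing required field '{field}'")
                continue
            clean[field] = spec["type"](record[field])   # coerce to the declared type
        return clean
    return validate

validate_employee = make_validator(EMPLOYEE_META)
print(validate_employee({"id": "7", "name": "Lin", "salary": "1200.5"}))
# {'id': 7, 'name': 'Lin', 'salary': 1200.5}
```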
... Extensions such as hierarchical state machines and statecharts have sought to remedy this. Petri nets have seen numerous extensions, such as colored [18], timed [19], and hybrid [20] Petri nets. ...
Research
Full-text available
Course objectives: to know the intrinsic characteristics of interactive systems, to understand the different abstractions involved in the development of interactive systems and their interconnections, to grasp the problems and solutions related to the design of interactive critical systems, and to have a clear view both of the state of the art of research in Human-Machine Interaction and of the main research directions.
... "The structure of the components of a program/system, their interrelationships, and principles and guidelines governing their design and evolution over time" (Garlan and Perry 1995). Another extended definition was proposed by Kruchten in 1995:
Thesis
Over the last years, the size and complexity of software systems have increased dramatically, making the evolution process more and more complex and consuming a great deal of resources and time. Consequently, software architecture is becoming one of the most important artifacts in planning and carrying out the evolution process. This architecture can provide an overall structural view of the system without undue focus on low-level details. This view can provide a deep understanding of previous design decisions and a means of exploring, analysing and comparing alternative scenarios of evolution. Therefore, software architecture evolution has gained significant importance in both industrial and academic areas in order to develop methods, techniques and tools that can help architects to plan evolution. To this end, an evolution styles approach has been introduced with the aim of capitalising on the recurrent evolution practices in a particular domain and of fostering their reuse. In this thesis, we endeavour to tackle the challenges in software architecture evolution reuse by specifying a standard modeling framework that can conform to different evolution styles and satisfy the concerns of the different teams involved in the evolution process. The primary contribution of this thesis is twofold. First, it introduces a meta-evolution style (meta-modeling language) which specifies the core conceptual elements for software architecture evolution modeling in order to address the difference in modeling concepts among evolution styles approaches. Second, it introduces a new methodology to develop a multi-view and multi-abstraction evolution style in order to reduce the complexity of the evolution process model by breaking down an evolution style into several views, each of which covers a relevant set of aspects or satisfies certain stakeholders’ concerns. The central ideas are embodied in a prototype tool in order to validate the applicability and feasibility of the proposed approaches (method and technique).
... The primary use of systems architecture can have an impact on at least five aspects of its development. For Garlan and Perry (1995), these aspects are: ...
Article
Full-text available
Knowledge is the primary organizational asset, since it becomes a source of profit when directly related to the development of the final product. This is the scenario in which software industry organizations are looking for ways to manage their organizational knowledge, such as through Knowledge Management (KM), which offers processes for the capture, storage, sharing, and application of organizational knowledge. The individual knowledge within the software industry is made explicit through different knowledge products, namely software artifacts. So, investigating means of making knowledge products available to the whole organization is relevant, since it enables organizations to increase their problem-solving capability, keep their processes updated, and grow into more profitable products. In this sense, a system architecture offers robust and integrated features that are essential to aid organizations in knowledge product availability, indexing, and management, and that can also solve the challenge of knowledge fragmentation. Therefore, this paper presents a system architecture aimed at managing knowledge products, grounded in the KM process of capturing, storing, sharing and using organizational knowledge. Such a system architecture is essential to help organizations within the software industry improve the management of their knowledge. This is exploratory research with mixed methods (qualitative and quantitative) that, as a result, presents a system architecture for the management of the knowledge products of organizations belonging to the software development industry.
... Traditional software engineering literature is focused on the well-known suitable patterns for software design [116]. Hence, typical research in this domain is about how architectural patterns and guidelines impact software components and configurations [53]. A survey study analyzes in this perspective architectural patterns to identify potential risks and to verify which quality requirements have been addressed in the design [46]. ...
Article
Full-text available
Information Systems Quality (ISQ) is a critical source of competitive advantages for organizations. In a scenario of increasing competition on digital services, ISQ is a competitive differentiation asset. In this regard, managing, maintaining, and evolving IT infrastructures have become a primary concern of organizations. Thus, a technical perspective on ISQ provides useful guidance to meet current challenges. The financial sector is paradigmatic, since it is a traditional business, with highly complex business-critical legacy systems, facing a tremendous change due to market and regulation drivers. We carried out a Mixed-Methods study, performing a Delphi-like study on the financial sector. We developed a specific research framework to pursue this vertical study. Data were collected in four phases starting with a high-level randomly stratified panel of 13 senior managers and then a target panel of 124 carefully selected and well-informed domain experts. We have identified and dealt with several quality factors; they were discussed in a comprehensive model inspired by the ISO 25010, 42010, and 12207 standards, corresponding to software quality, software architecture, and software process, respectively. Our results suggest that the relationship among quality, architecture, and process is a valuable technical perspective to explain the quality of an information system. Thus, we introduce and illustrate a novel meta-model, named SQuAP (Software Quality, Architecture, Process), which is intended to give a comprehensive picture of ISQ by abstracting and connecting detailed individual ISO models.
... The primary use of systems architecture can have an impact on at least five aspects of its development. For Garlan and Perry (1995), these aspects are: ...
Conference Paper
Full-text available
Software development organizations are dynamic and complex, so they need to continually renew their processes to excel in a highly competitive marketplace. The primary organizational asset for that matter is ‘the knowledge’, since it becomes a source of profit when directly related to the development of the final product. Furthermore, such knowledge is also used to address issues related to the management of people, processes, technologies, and products. Besides, knowledge is essential to increase sustainable competitive advantage and profit, as well as to help in decision making. This is the scenario in which software industry organizations are looking for ways to manage their organizational knowledge. Thus, Knowledge Management (KM) emerges as a practical alternative which offers processes for the capture, storage, sharing, and application of organizational knowledge, aiming at improving the performance of organizations and bringing benefits such as innovation and sustainability. The individual knowledge within the software industry is made explicit through different knowledge products, namely software artifacts. So, investigating means of making knowledge products available to the whole organization is relevant, since it enables organizations to increase their problem-solving capability, keep their processes updated, and grow into more profitable products. In this sense, a system architecture offers robust and integrated features essential to aid organizations in knowledge product availability, indexing, and management. In addition, a system architecture solves the challenge of knowledge fragmentation, which causes knowledge loss and difficulties in the use of organizational knowledge. Therefore, this paper presents a system architecture aimed at managing knowledge products, grounded in the KM process of capturing, storing, sharing and using organizational knowledge. Such a system architecture is essential to help organizations within the software industry improve the management of their knowledge. This is exploratory research with mixed methods (qualitative and quantitative) that, as a result, presents a system architecture for the management of the knowledge products of organizations belonging to the software development industry.
... The architecture is the skeleton of the system and, because of that, it becomes the highest-level plan in the building of a system (KRAFZIG; BANKE; SLAMA, 2004). According to Garlan (1995), the main use of the software architecture may impact five system development aspects: ...
Conference Paper
Full-text available
Knowledge Management (KM) is a relevant discipline for software companies because it can promote innovation and business sustainability in the market in which these companies operate. KM is fundamental for organizing, systematizing, and facilitating information access and control within the organizational environment. Moreover, KM is vital for the organization to achieve productivity and competitiveness in the current market. Thus, in this paper, we present a study regarding the technologies which surround the system architecture for a KM cycle, through exploratory and bibliographical research. Finally, we discuss the technologies to drive a KM cycle-based system architecture for the software industry.
... Traditional software engineering literature is focused on the well-known suitable patterns for software design [116]. Hence, typical research in this domain is about how architectural patterns and guidelines impact software components and configurations [53]. A survey study analyzes in this perspective architectural patterns to identify potential risks and to verify which quality requirements have been addressed in the design [46]. ...
Preprint
Full-text available
Information Systems Quality (ISQ) is a critical source of competitive advantages for organizations. In a scenario of increasing competition on digital services, ISQ is a competitive differentiation asset. In this regard, managing, maintaining, and evolving IT infrastructures have become a primary concern of organizations. Thus, a technical perspective on ISQ provides useful guidance to meet current challenges. The financial sector is paradigmatic, since it is a traditional business, with highly complex business-critical legacy systems, facing a tremendous change due to market and regulation drivers. We carried out a Mixed-Methods study, performing a Delphi-like study on the financial sector. We developed a specific research framework to pursue this vertical study. Data were collected in four phases, starting with a high-level randomly stratified panel of 13 senior managers and then a target panel of 124 carefully selected and well-informed domain experts. We have identified and dealt with several quality factors; they were discussed in a comprehensive model inspired by the ISO 25010, 42010, and 12207 standards, corresponding to software quality, software architecture, and software process, respectively. Our results suggest that the relationship among quality, architecture, and process is a valuable technical perspective to explain the quality of an information system. Thus, we introduce and illustrate a novel meta-model, named SQuAP (Software Quality, Architecture, Process), which is intended to give a comprehensive picture of ISQ by abstracting and connecting detailed individual ISO models.
... Both functional and nonfunctional requirements captured during the requirements phase have to be segregated logically, based on their relationships, into appropriate components. Proper care should be taken by the architect to ensure that requirements are logically grouped into components [14], [15] so that the coupling between components (inter-dependency) is low and the cohesion (intra-dependency) is high. ...
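Grouping requirements (or code units) into components so that coupling stays low and cohesion stays high can be made concrete with a small metric sketch; the dependency data and the component assignment below are invented for illustration only.

```python
# Hedged sketch: given a dependency graph between units and an assignment of units to
# components, count intra-component edges (cohesion) and inter-component edges (coupling).
# The example data is invented.
dependencies = [("login", "session"), ("session", "audit"), ("cart", "pricing"),
                ("pricing", "audit"), ("cart", "session")]
component_of = {"login": "Auth", "session": "Auth", "audit": "Reporting",
                "cart": "Shop", "pricing": "Shop"}

def coupling_and_cohesion(edges, assignment):
    intra = sum(1 for a, b in edges if assignment[a] == assignment[b])
    inter = len(edges) - intra
    return inter, intra

coupling, cohesion = coupling_and_cohesion(dependencies, component_of)
print(f"inter-component (coupling): {coupling}, intra-component (cohesion): {cohesion}")
# inter-component (coupling): 3, intra-component (cohesion): 2
```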
... Architectural style: a set of common restrictions on the form and structure of software architectures (GARLAN; PERRY, 1995; CLEMENTS; KAZMAN, 2012). ...
Thesis
Systems-of-Systems (SoS) encompass diverse and independent systems that must cooperate with each other for performing a combined action that is greater than their individual capabilities. In parallel, architecture descriptions, which are the main artifact expressing software architectures, play an important role in fostering interoperability among constituents by facilitating the communication among stakeholders and supporting the inspection and analysis of the SoS from an early stage of its life cycle. The main problem addressed in this thesis is the lack of adequate architectural descriptions for SoS that are often built without an adequate care to their software architecture. Since constituent systems are, in general, not known at design-time due to the evolving nature of SoS, the architecture description must specify at design-time which coalitions among constituent systems are feasible at run-time. Moreover, as many SoS are being developed for safety-critical domains, additional measures must be placed to ensure the correctness and completeness of architecture descriptions. To address this problem, this doctoral project employs SoSADL, a formal language tailored for the description of SoS that enables one to express software architectures as dynamic associations between independent constituent systems whose interactions are mediated for accomplishing a combined action. To synthesize concrete architectures that adhere to one such description, this thesis develops a formal method, named Ark, that systematizes the steps for producing such artifacts. The method creates an intermediate formal model, named TASoS, which expresses the SoS architecture in terms of a constraint satisfaction problem that can be automatically analyzed for an initial set of properties. The feedback obtained in this analysis can be used for subsequent refinements or revisions of the architecture description. A software tool named SoSy was also developed to support the Ark method as it automates the generation of intermediate models and concrete architectures, thus concealing the use of constraint solvers during SoS design and development. The method and its accompanying tool were applied to model a SoS for urban river monitoring in which the feasibility of candidate abstract architectures is investigated. By formalizing and automating the required steps for SoS architectural synthesis, Ark contributes for adopting formal methods in the design of SoS architectures, which is a necessary step for obtaining higher reliability levels.
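The idea of checking the feasibility of candidate coalitions as a constraint satisfaction problem can be illustrated with the python-constraint library; the constituent systems and the compatibility rule below are invented for illustration and are unrelated to the thesis's actual TASoS models or the SoSy tool.

```python
# Hedged sketch with the python-constraint package: pick one constituent per role of a
# river-monitoring SoS so that the chosen systems share a communication protocol.
# All names and constraints are invented; this is not the thesis's TASoS model.
from constraint import Problem

protocols = {"GaugeA": "lora", "GaugeB": "wifi",
             "DroneX": "wifi", "DroneY": "lora",
             "GatewayP": "lora", "GatewayQ": "wifi"}

problem = Problem()
problem.addVariable("sensor", ["GaugeA", "GaugeB"])
problem.addVariable("drone", ["DroneX", "DroneY"])
problem.addVariable("gateway", ["GatewayP", "GatewayQ"])

# A coalition is feasible only if all chosen constituents speak the same protocol.
problem.addConstraint(
    lambda s, d, g: protocols[s] == protocols[d] == protocols[g],
    ("sensor", "drone", "gateway"),
)

for coalition in problem.getSolutions():
    print(coalition)
# e.g. {'sensor': 'GaugeA', 'drone': 'DroneY', 'gateway': 'GatewayP'}  (all LoRa)
```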
... The architecture of a software system is the high-level organization of [its constituent] computational elements and the interactions between those elements ([5], p. 269). In this context, according to [13], there are two general approaches to software reconfiguration: parameter adaptation and compositional adaptation. ...
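A minimal, hypothetical contrast between the two approaches named in the excerpt (all class names invented): parameter adaptation changes a value the component already exposes, whereas compositional adaptation replaces a component behind a stable interface at run-time:

// Compositional adaptation: the component behind the interface is swapped.
interface Compressor {
    byte[] compress(byte[] data);
}

class FastCompressor implements Compressor {
    public byte[] compress(byte[] data) { return data; } // placeholder: trades ratio for speed
}

class StrongCompressor implements Compressor {
    public byte[] compress(byte[] data) { return data; } // placeholder: trades speed for ratio
}

public class AdaptationDemo {
    // Parameter adaptation: behaviour changes through an exposed setting.
    private int bufferSizeKb = 64;
    private Compressor compressor = new FastCompressor();

    void adaptParameter(int newBufferSizeKb) {       // parameter adaptation
        this.bufferSizeKb = newBufferSizeKb;
    }

    void adaptComposition(Compressor replacement) {  // compositional adaptation
        this.compressor = replacement;
    }

    public static void main(String[] args) {
        AdaptationDemo system = new AdaptationDemo();
        system.adaptParameter(256);                      // tune without restructuring
        system.adaptComposition(new StrongCompressor()); // restructure at run-time
        System.out.println("buffer=" + system.bufferSizeKb + "KB, compressor="
                + system.compressor.getClass().getSimpleName());
    }
}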
Article
Enterprise Applications (EA) are complex software systems for supporting the business of companies. Evolving an EA should not affect its availability: if, for instance, the system has to be shut down temporarily, business operations may be affected. One way to address this problem is the seamless reconfiguration of the affected EA, i.e., applying the relevant changes while the system is running. Our approach to seamless reconfiguration focuses on component-oriented EAs. It is based on the Autonomic Computing infrastructure mKernel, which enables the management of EAs that are realized using Enterprise Java Beans (EJB) 3.0 technology. In contrast to other approaches that provide no or only limited reconfiguration facilities, our approach consists of a comprehensive set of steps that perform fine-grained reconfiguration tasks. These steps can be combined into generic and autonomous reconfiguration procedures for EJB-based EAs. The procedures are not limited to a certain reconfiguration strategy; instead, our approach provides several reusable strategies and is extensible with respect to integrating new ones.
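The mKernel infrastructure and its EJB-specific reconfiguration steps are not reproduced here; the sketch below only illustrates, with invented names, the general shape of a seamless swap: hold off new requests, drain in-flight ones, replace the implementation, then resume, all without shutting the system down:

import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of a seamless swap of a component implementation:
// new calls are briefly held off, in-flight calls are drained, the
// implementation is replaced, and traffic resumes without a shutdown.
public class SeamlessReconfiguration {

    interface PaymentService {
        String pay(String orderId);
    }

    static class PaymentV1 implements PaymentService {
        public String pay(String orderId) { return "v1 paid " + orderId; }
    }

    static class PaymentV2 implements PaymentService {
        public String pay(String orderId) { return "v2 paid " + orderId; }
    }

    private final AtomicReference<PaymentService> active = new AtomicReference<>(new PaymentV1());
    private final AtomicInteger inFlight = new AtomicInteger();
    private volatile boolean accepting = true;

    String invoke(String orderId) throws InterruptedException {
        while (!accepting) Thread.sleep(10);         // step 1: new requests wait during the swap
        inFlight.incrementAndGet();
        try {
            return active.get().pay(orderId);
        } finally {
            inFlight.decrementAndGet();
        }
    }

    void reconfigure(PaymentService replacement) throws InterruptedException {
        accepting = false;                           // step 2: stop admitting new requests
        while (inFlight.get() > 0) Thread.sleep(10); // step 3: drain in-flight requests
        active.set(replacement);                     // step 4: swap the implementation
        accepting = true;                            // step 5: resume normal operation
    }

    public static void main(String[] args) throws InterruptedException {
        SeamlessReconfiguration runtime = new SeamlessReconfiguration();
        System.out.println(runtime.invoke("order-1"));
        runtime.reconfigure(new PaymentV2());
        System.out.println(runtime.invoke("order-2"));
    }
}

A production-grade procedure would of course need stricter synchronisation and state transfer between the old and new component versions.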
... Therefore, ADLs support the description of the application's structure (Soucé and Duchien, 2002). As examples of description languages for this kind of architecture we can cite: Fractal (Bruneton et al., 2006), Wright (Allen et al., 1998), Darwin, Rapide, C2SADEL (Medvidovic et al., 1999), ACME (Garlan and Perry, 1995), and xADL2.0 (Dashofy and Van Der Hoek, 2002). ...
Article
Mobile applications have enjoyed explosive growth in recent years. Taking advantage of this existing software, the constituent software bricks used to compose such a mobile application can take different implementation forms and manipulate heterogeneous data, depending on the user's requirements or the execution context. However, software developers face difficulty in composing already existing software entities because of their heterogeneity. An emerging need is therefore a new modelling space to support the development of heterogeneous mobile applications. In view of this fact, this paper discusses the proposal of a multi-paradigm approach for representing mobile applications based on heterogeneous conceptual bricks, including their architectural conception and the specification of the necessary adaptation mediators. It aims to deal with the heterogeneity of the constituent conceptual bricks and the execution environment of the final product. A conceptual description of a mobile application called ShopReview is presented to show the usability of the proposed paradigm.
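As a toy, hypothetical illustration of an adaptation mediator between heterogeneous bricks (the CSV and record formats below are invented), the mediator translates the representation produced by one brick into the representation expected by another:

import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical adaptation mediator: bridges a CSV-producing brick
// to a consumer that expects key/value records.
public class MediatorDemo {

    interface CsvBrick {                     // heterogeneous source brick
        String fetchCsvRow();                // e.g. "name,price"
    }

    interface RecordConsumer {               // brick expected by the application
        void accept(Map<String, String> rec);
    }

    // The mediator adapts one representation to the other.
    static class CsvToRecordMediator {
        private final CsvBrick source;
        private final RecordConsumer target;

        CsvToRecordMediator(CsvBrick source, RecordConsumer target) {
            this.source = source;
            this.target = target;
        }

        void mediate() {
            String[] fields = source.fetchCsvRow().split(",");
            Map<String, String> rec = new LinkedHashMap<>();
            rec.put("name", fields[0]);
            rec.put("price", fields[1]);
            target.accept(rec);
        }
    }

    public static void main(String[] args) {
        CsvBrick shopFeed = () -> "espresso,2.40";
        RecordConsumer display = rec -> System.out.println("Showing " + rec);
        new CsvToRecordMediator(shopFeed, display).mediate();
    }
}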
Article
Full-text available
The evolution of software architecture has led to the emergence of various paradigms, including monolithic, microservices, and the lesser-explored modular monolith architecture. This paper delves into the historical development of these architectures, assessing their advantages and limitations, with a specific focus on their application in the fintech domain. Through an in-depth literature review and case studies of organizations like Shopify, Root, and Google, the study evaluates the potential of modular monolith architecture as the primary choice for developing efficient payment systems. By addressing the research gaps in existing studies and comparing modular monoliths with traditional monolithic and microservices architectures, this paper provides valuable insights for software developers, architects, and fintech industry professionals.
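As a hedged sketch of the structural idea behind a modular monolith (the payment and ledger modules below are invented, not taken from the cited case studies), everything runs in one deployable, but modules interact only through explicit facades, so a module could later be extracted into a service without touching its callers:

// Hypothetical sketch of modular-monolith boundaries: one deployable,
// but modules expose only a facade and hide their internals.
public class ModularMonolithDemo {

    // --- payments module: public facade + hidden internals ---
    interface PaymentsFacade {
        String charge(String accountId, long amountCents);
    }

    static class PaymentsModule implements PaymentsFacade {
        private final LedgerFacade ledger;           // dependency on another module's facade only

        PaymentsModule(LedgerFacade ledger) { this.ledger = ledger; }

        public String charge(String accountId, long amountCents) {
            String txId = "tx-" + System.nanoTime(); // internal detail, not visible outside
            ledger.record(txId, accountId, amountCents);
            return txId;
        }
    }

    // --- ledger module ---
    interface LedgerFacade {
        void record(String txId, String accountId, long amountCents);
    }

    static class LedgerModule implements LedgerFacade {
        public void record(String txId, String accountId, long amountCents) {
            System.out.printf("ledger: %s %s %d%n", txId, accountId, amountCents);
        }
    }

    public static void main(String[] args) {
        // Both modules are wired in-process (a single deployment unit),
        // yet each could later be extracted into a service behind the same facade.
        PaymentsFacade payments = new PaymentsModule(new LedgerModule());
        System.out.println("charged: " + payments.charge("acct-9", 1250));
    }
}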
Conference Paper
In this work, a pattern system [BMR+96] for object-oriented cryptographic software is proposed. In this system, aspects of cryptographic software architecture are addressed as interrelated micro-architectures [GHJV93]. Four patterns are classified according to the fundamental goals of cryptography [Mv0V96] (see Appendix A). The patterns are: Information Secrecy, Message Integrity, Message Authentication, and Sender Authentication. Four further patterns are obtained from combinations of the previous ones: Secrecy with Authentication, Secrecy with Integrity, Secrecy with Signature, and Signature with Appendix. The eight cryptographic patterns share a common structure and dynamics and can therefore be represented by a more generic pattern, called the Cryptographic Metapattern. The eight patterns and the Cryptographic Metapattern can be arranged as the vertices of a directed acyclic graph in which a path documents a sequence of design decisions.
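As a hedged illustration of one of the combined patterns listed above, Secrecy with Authentication, the sketch below pairs symmetric encryption with an HMAC over the ciphertext (encrypt-then-MAC), using only the standard javax.crypto API; it is not the pattern system's own code, and the cipher mode is simplified for brevity:

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.Mac;
import javax.crypto.SecretKey;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch of the "Secrecy with Authentication" combination:
// encrypt the message for secrecy, then MAC the ciphertext for authentication.
public class SecrecyWithAuthentication {

    public static void main(String[] args) throws Exception {
        SecretKey encKey = KeyGenerator.getInstance("AES").generateKey();
        SecretKey macKey = KeyGenerator.getInstance("HmacSHA256").generateKey();

        byte[] plaintext = "meet at noon".getBytes(StandardCharsets.UTF_8);

        // Secrecy: encrypt the plaintext (the default mode is used only to keep
        // the sketch short; a real design would use an authenticated or at least
        // randomised mode).
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(Cipher.ENCRYPT_MODE, encKey);
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Authentication: MAC the ciphertext.
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(macKey);
        byte[] tag = mac.doFinal(ciphertext);

        // Receiver side: verify the tag before decrypting.
        Mac check = Mac.getInstance("HmacSHA256");
        check.init(macKey);
        boolean authentic = MessageDigest.isEqual(tag, check.doFinal(ciphertext));

        Cipher decipher = Cipher.getInstance("AES");
        decipher.init(Cipher.DECRYPT_MODE, encKey);
        System.out.println("authentic=" + authentic + ", message="
                + new String(decipher.doFinal(ciphertext), StandardCharsets.UTF_8));
    }
}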
Chapter
Software systems can reach enormous sizes, which makes it indispensable to structure them appropriately. This is done by dividing them into subsystems and describing how those subsystems interact. Interfaces encapsulate the subsystems. The interfaces define which ways exist to act on a subsystem from the outside and which services a subsystem offers to the outside. The interface specification corresponds to a contract that binds developers and users, fixes system structures, and supports the division of labour in developing a piece of software. Appropriate structuring is of decisive importance for a whole range of quality concerns of a software system. This also applies to realizing the software on a hardware platform, an execution environment, in which the different subsystems are typically mapped onto different computers, where applicable, and executed there. This chapter introduces the fundamentals of architectural design. Central here are the architectural principles as well as approaches to structuring software-intensive systems.
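A small, hypothetical example of the interface-as-contract idea summarised above: callers can act on the inventory subsystem only through its declared services, so the implementation and its data structures remain encapsulated:

// Hypothetical sketch: the interface fixes which services the subsystem
// offers and is the only way to act on it from the outside.
public class InterfaceContractDemo {

    /**
     * Contract of the inventory subsystem.
     * reserve() returns false (rather than throwing) when stock is insufficient.
     */
    interface Inventory {
        boolean reserve(String articleId, int quantity);
        int available(String articleId);
    }

    // The implementation and its data structures stay hidden behind the interface.
    static class WarehouseInventory implements Inventory {
        private final java.util.Map<String, Integer> stock = new java.util.HashMap<>();

        WarehouseInventory() { stock.put("bolt-m8", 100); }

        public boolean reserve(String articleId, int quantity) {
            int have = stock.getOrDefault(articleId, 0);
            if (have < quantity) return false;
            stock.put(articleId, have - quantity);
            return true;
        }

        public int available(String articleId) {
            return stock.getOrDefault(articleId, 0);
        }
    }

    public static void main(String[] args) {
        Inventory inventory = new WarehouseInventory();   // callers see only the contract
        System.out.println("reserved: " + inventory.reserve("bolt-m8", 10));
        System.out.println("remaining: " + inventory.available("bolt-m8"));
    }
}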
Chapter
Brief summary of the H2020 STORM project on Cultural Heritage protection from climate change effects.
Book
Full-text available
With the importance of Cultural Heritage recognised not only for its contribution to shaping ethnic identities, but also as an important factor in the economic growth of a country through its contribution to tourism, the need to protect and preserve Cultural Heritage sites has been acknowledged as a priority for our societies. In this endeavour, collaboration among experts from different disciplines is important. The STORM project is an attempt to combine the expertise of a multidisciplinary group of academics, researchers, experts and practitioners towards the effective prediction of environmental changes and the identification of threats and conditions that could damage cultural heritage sites. STORM has completed its three years of work, leading to results on scientific and technical advancements, proposed strategies and policies, as well as practices and interventions for the protection of Cultural Heritage. This book reports these results through comprehensive coverage addressing both challenges and problems, and solutions, proposals and insights for the future.
Book
Full-text available
This volume is the result of three years of research and prototype activity performed in the framework of the Horizon 2020 STORM project. Together with the twenty organisations comprising the final phase of the pilots, this book forms one of the last initiatives of the Consortium, which is briefly presented in the poster reproduced in the last pages of this book; the poster had been realised as the project's promotional material when STORM started in 2016. All the authors of the following chapters are members of the STORM team, some of them members of its Executive Board, and the four editors represent different aspects of the complex research performed during the project lifecycle. The authors of this book come from diverse backgrounds, and so do their chapters. The works herein are all evidence of climate change and therefore allow us to understand the multifarious aspects of its threats. Coming back to the STORM project, it is worth giving evidence of its origin by mentioning its framework and the H2020 'call for proposals' to which it responded. STORM was born as a proposal in the summer of 2015, written in reply to a call belonging to the 'Secure Societies' programme, the seventh programme within the Horizon 2020 pillar called 'Societal Challenges'. In detail, considering the four working areas comprising 'Secure Societies', the project relates to the one called Disaster-resilience: safeguarding and securing society, including adapting to climate change and, more specifically, to its third topic, DRS-11-2015: Mitigating the impacts of climate change and natural hazards on cultural heritage sites, structures and artefacts. The Consortium is composed of twenty Partners across six European Countries plus an extra one: Turkey. In addition to the Partners there are also two so-called 'associated Partners', which participate in the research without being funded. The author of the foreword is not an expert in the variety of competencies around the world of cultural heritage protection and preservation. In certain respects this is a disadvantage, even though it enables us to consider the matter from a different perspective. From this angle, this book aims to give evidence of the various technologies and methodologies refined by years of research and experiments in the field to give cultural heritage sites 'resilience' to climate change. The sad reality is that the starting level for every kind of cultural heritage site is 'zero', and only with a lot of determination can this aim be achieved. Further evidence of the difficulties faced is the lack of a proper way to measure resilience in both qualitative and quantitative terms. This means it is not possible to define a 'resilient' cultural heritage site in a shared view. Empirical evidence demonstrates the non-uniformity in defining all the processes and measures adopted by a cultural heritage management to achieve this purpose. This consideration encourages the creation of a proper 'resilience certification', as is happening in other emerging sectors as well [see e.g. the ISO (International Organization for Standardization), which "covers almost every product, process or service imaginable; ISO makes standards used everywhere"].
As for the mentioned certification, its potential for innovation lies in the principles from which it draws inspiration, that is, the sharing of responsibility in the management of conservation issues, the control of activities generating impacts, and the use of market mechanisms that seek in cultural heritage preservation excellence a source of competitive advantage. The strong point of this potential resilience record, beyond the creation of a solid structure capable of systematically controlling and managing climate change and environmental impacts on a cultural heritage site, lies in the pursuit of communication and transparency, i.e. in improving the relations between a cultural heritage site's manager and control bodies, institutions and citizens, one of the pillars on which the STORM project is based. However, there are other vast obstacles before one should think about resilience certification. These relate to the fact that every site and each threat from climate change has its own peculiarities. In other words, it is appropriate to introduce a new relevant concept: the 'quantity of resilience' necessary at each site. This amount is another unknown element which should be studied. But this is another story, which could form the subject of an ad hoc research effort with other multidisciplinary teams. Chapter 1 presents several recommendations that resulted from the experience gained within the STORM project, as well as thoughts from experts, in order to improve government policies on cultural heritage risk management. Starting from the main European and international frameworks, this chapter explores different areas that require improvement in order to implement a Disaster Risk Management (DRM) approach at cultural heritage sites. More precisely, it introduces operative proposals regarding: heritage Conservation; Communication between climate researchers and heritage managers; Coping and adaptive Capacities approaches based on current conceptual models; Cooperation among the different actors involved in the DRM of cultural heritage; and Capacity building of heritage professionals and communities via training and education programmes (the STORM 5 'C's'). A STORM risk-oriented proposal to improve policies at the governmental level focused on prevention (i.e. focused on reducing vulnerabilities and exposure of cultural heritage) is also envisioned, although in a broader scope, in order to answer the common constraints of the different STORM countries. Chapter 2 presents an integrated methodology of risk assessment and management for cultural heritage properties in response to the adverse effects of natural hazards and climate change-related events. The proposed methodology is applied to the five STORM pilot sites to identify and analyse the potential hazards and their corresponding risks. Accordingly, relative risk maps are generated to share a common understanding of the risks with the site managers and stakeholders. The output of the risk assessment for the pilot sites will further support the decision-making process to determine risk treatment strategies, including risk mitigation, risk preparedness, and recovery planning. Chapter 3 focuses on the specific sensors and supporting information technologies developed during the project for timely artefact diagnosis and early detection of potential threats to the cultural heritage.
Several technical solutions were chosen on the basis of the plethora of existing and emerging techniques in this field, discussed, analysed and benchmarked at the first stage of STORM. The selection was determined, first of all, by the peculiarities of the hazards at each of the pilot sites where the technical solution was going to be deployed and, secondarily, by cost-effectiveness and by how safe the diagnostic procedure is for the artefact (in particular, to what extent the measurements are non-destructive and non-invasive). The reviewed sensing and information technologies cover all five pilot sites of the project and numerous measurement techniques and data processing algorithms dealing with assessing structural performance by vibration, crack monitoring, electrical resistivity tomography, ground penetrating and interferometric radar, fibre Bragg grating interrogation, induced fluorescence spectroscopy, multispectral aerial photography, as well as photogrammetry and terrestrial laser scanning. Chapter 4 charts the use of the data streaming in from the tools and sensors used in the STORM project. Several aspects are discussed herein: the analysis of weather data collected from the UK pilot site's weather station, the analysis of earthquake damage on structures at the Turkish pilot site, together with the analysis of the novel Twitter Event Extractor developed by ResilTech, and a cursory analysis of the wireless acoustic sensors currently deployed across the STORM pilot sites to detect hazards from noise. This chapter gives an overview of just a small selection of the data analysis currently being tested across the consortium and within the scope of the STORM project in a bid to help site managers and stakeholders in the efficient monitoring and preservation of their Cultural Heritage sites. Chapter 5 gives an overview of the tools and services developed in the STORM project that contribute to sharing knowledge and critical information to face critical events at Cultural Heritage sites. The STORM Collaborative Decision-Making Dashboard provides two environments, the collaborative and the operative, which are strongly interconnected with one another. The user interface and the services developed in the backend allow the collection, display, storage and retrieval of all the information related to existing knowledge about disastrous events and to new knowledge (e.g. from the situational picture or risk assessment) of the actual situation shared by teams of experts in order to identify the best recovery actions. STORM's surveying and diagnosis service and mobile application make it simpler for sites to monitor their CH assets through the STORM Prevention and Mitigation Processes, allowing issues to be reported within the application while conducting surveying activities, while the STORM Risk Assessment and Management Tool aims at providing site managers and experts with a tool to identify and analyse natural hazards, assessing the level of risk in different areas of a site and assigning a level of priority to the items contained in those areas. Finally, the STORM web-GIS infrastructure supports the visualization of geospatial data managed by several services, such as risk assessment and situational awareness. Chapter 6 describes in detail the cloud-based infrastructure that supports the data management of the STORM platform. An overview of the modular STORM cloud architecture is presented, which consists of a Core cloud and several Edge cloud instances.
Moreover, the STORM platform's authentication and registration mechanisms for establishing secure communication between the sensors and the data analysis services are introduced. Finally, the chapter concludes by defining the interfaces between the Core cloud and the Edge clouds. Chapter 7 presents the STORM System Architecture, inspired by a layered architectural principle that includes six main logical layers (Source, Data, Information, Event, Service and Application Layer) implementing the STORM functional and non-functional requirements. Going through each layer, this chapter gives an overview of the main STORM Logical Architecture sources and modules, including their functionalities, dependencies and basic operations. Moreover, the STORM Interoperability Architecture is described to show the interactions and the control flow among the architectural modules. Finally, the chapter focuses on which technologies are used to implement such functionalities. The technical and implementation aspects of all STORM modules are described, and some technical guidelines and details that match the requirements of the logical architecture are proposed. Chapter 8 gives a brief overview of the advantages and possibilities offered to the protection and enhancement of Cultural Heritage by the chance of always being connected through a network. In particular, it describes the advantages given by the technologies developed within the STORM project and the usefulness of the STORM approach to remote monitoring, since all these help in achieving greater preparedness and effectiveness of interventions, in addition to the possibility of collecting and storing a very large amount of data, in order to prevent damage or material loss. In a connected world, every specialist has the opportunity to acquire the necessary data and to know the situation of a work of art in advance, having the time to plan the right intervention to be carried out and to organize the needed activities with due attention. Particular attention is also given to the usefulness that apps and services created for recreational purposes (e.g. social networks) may have, not only to enhance cultural heritage but also to raise awareness among the population about this theme. Chapter 9 provides an overview of the STORM strategy at the pilot sites, focusing on practical pilot experiences, with an initial assessment of the results achieved so far. Multiple experimental scenarios in five countries (the UK, Italy, Portugal, Greece, and Turkey), covering both slow- and sudden-onset hazards, validate the proposed solutions in relation to the three phases defined in the project: Risk Assessment, Situation Awareness and First Aid activities. STORM introduces a comprehensive approach that supports end users with transversal services such as data analytics and knowledge sharing during all these phases. The book ends with an 'epilogue' containing a 'recipe' for how to proceed in making cultural heritage more resilient against climate change, not only in terms of preservation but also in view of an informed use of this huge asset of which the European Union is guardian, paladin, and proud owner!