Article

An experiment on the role of graphical elements in architecture visualization

Authors:
  • Lufthansa Systems, Budapest, Hungary
To read the full-text of this research, you can request a copy directly from the authors.

Abstract

The evolution and maintenance of large-scale software systems first requires an understanding of their architecture before delving into lower-level details. Tools that facilitate architecture comprehension tasks through visualization provide different sets of configurable graphical elements to present information to their users. We conducted a controlled experiment that exemplifies the critical role of such graphical elements in understanding an architecture. In our setting, different configurations of graphical elements had a significant influence on program comprehension tasks. In particular, a 63% gain in effectiveness in architectural analysis tasks was achieved simply by changing the configuration of the graphical elements of the same tool. Based on the results, we claim that significant effort should be spent on the configuration of architecture visualization tools and that configurability should be a requirement for such tools.


... For example, classification and clustering are two classical techniques used in pattern recognition, both based on data mining and statistics. In this SMS, a single selected study may make use of two or more text analysis techniques; consequently, the overall number of text analysis techniques (80) is greater than the number of selected studies (55). In most of the selected studies, the text analysis techniques can be subdivided into more specific techniques. ...
... Researchers and practitioners have focused their efforts on AU when applying text analysis techniques in SA. Software architecture should be well understood before any corrections or changes are applied to it [55]. AU is a prerequisite for, and supportive of, other architecting activities (e.g., AA, AI, AME, and AE). ...
Article
Full-text available
Context: Information from artifacts in each phase of the software development life cycle can potentially be mined to enhance architectural knowledge. Many text analysis techniques have been proposed for mining such artifacts. However, there is no comprehensive understanding of what artifacts these text analysis techniques analyze, what information they are able to extract, or how they enhance architecting activities. Objective: This systematic mapping study aims to study text analysis techniques for mining architecture-related artifacts and how these techniques have been used, and to identify the benefits and limitations of these techniques and tools with respect to enhancing architecting activities. Method: We conducted a systematic mapping study and defined five research questions. We analyzed the results using descriptive statistics and qualitative analysis methods. Results: Fifty-five studies were finally selected, with the following results: (1) Current text analysis research emphasizes architectural understanding and recovery. (2) A spectrum of text analysis techniques has been used in textual architecture information analysis. (3) Five categories of benefits and three categories of limitations were identified. Conclusions: This study shows a steady interest in textual architecture information analysis. The results give clues for future research directions on improving architecture practice through the use of these text analysis techniques.
... However, understanding software architecture (i.e., the process of acquiring knowledge about certain characteristics of a software system) is not a trivial task [32]. It involves a set of strenuous and time-consuming activities, high complexity, a large amount of data, and a large number of dependencies [32]. Conceptual models of software architecture enable faster and more efficient learning of software architecture and design [3]. ...
Article
Full-text available
The purpose of this research is to determine appropriate methods for the efficient learning and adoption of new software technologies. We considered the Expert Modeling (EM) and Self-Guided Modeling (SGM) approaches of Model Centered Instruction applied to learning Java Web application frameworks. As a conceptual model we focused on software architecture, and thereby provided an integral (expert) meta-model represented in a 3D learning environment. To evaluate this approach, we conducted an experiment with two groups of students using the two different instruction approaches. Finally, we used statistical methods to determine whether there was an observable effect of the instructional approach that used the integral meta-model of the JSP Model 2 software architecture. Although the number of participants in the experiment was limited, our findings confirmed that the choice of Model Centered Instruction has a significant impact on learning efficiency. Our experience shows that applying EM to learning Java Web application frameworks gives better results than applying SGM.
... We focused on the understanding of architecture design, since understandability is one of the main factors influencing the maintainability of software design [27] and is also considered one of the major quality characteristics that an engineering model should have [33]. An architecture design should first be well understood before any changes are applied to it [7]. The goal of this study is to investigate whether using ADDs can improve the understanding of architecture design and the reasoning behind the design rationale. ...
... Knodel et al. conducted a controlled experiment with 29 subjects (researchers and graduate students on software engineering) to investigate the role of configurable graphical elements in architecture understanding [7]. Configurability in this work means, for example, enabling and/or disabling certain graphical elements. ...
Conference Paper
Full-text available
Architectural design decisions (ADDs) and their design rationale, as a paradigm shift in documenting and enriching architecture design descriptions, are supposed to facilitate the understanding of architecture and the reasoning behind the design rationale, which consequently improves the architecting process and leads to better architecture design results. However, the lack of empirical evaluation supporting this statement is one of the major reasons that prevent industrial practitioners from using ADDs in their daily architecting activities. In this paper, we conducted two controlled experiments, as a family of experiments, to investigate how the presence of ADDs can improve the understanding of architecture. The main results of our experiments are: (i) using ADDs and their rationale in architecture documentation does not affect the time needed for completing architecture design tasks; (ii) one experiment and the family of experiments achieved a significantly better understanding of architecture design when using ADDs; and (iii) with regard to the correctness of architecture understanding, more experienced participants benefited more from ADDs than less experienced ones.
... Another lesson we learned from Quante's work is that an experiment that failed with respect to the expected results is not necessarily a failure. Knodel et al. presented the results of a controlled experiment evaluating the role of graphical elements in the visualization of software architecture [13]. In a preliminary step, the authors verified the soundness of the tasks with the help of two experts in the object system (i.e., Tomcat). ...
... [27] and made available the materials and results to facilitate its evaluation and replication:
  • subject selection criteria and justification
  • subject screening materials and results (with private information replaced by unique identifiers)
  • pre-test questions and results keyed to the unique subject identifiers, as well as an explanation of what the questions are designed to evaluate
  • control and treatment groups (i.e., sets of subject identifiers)
  • post-test design and control/treatment group materials, as well as an explanation of the knowledge the post-test questions are designed to evaluate
  • if different instructions are given to the control and treatment groups, some summary of the contents of these instructions
12. To allow an analysis based on the experience level, which is supposed to influence the participants' performance in solving the given tasks [28, 13], we use blocking, which implies dividing the subjects of each treatment into blocks based on their experience and skills. ...
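The blocking step described in the excerpt can be sketched as follows; this is a minimal randomized-block assignment, and the subject IDs and experience scores are invented purely for illustration:

```python
import random

def assign_blocked(subjects, treatments=("control", "treatment"), seed=42):
    """Divide subjects into blocks by experience, then randomize
    treatment assignment within each block (randomized block design)."""
    rng = random.Random(seed)
    # Sort by experience so adjacent subjects form homogeneous blocks.
    ordered = sorted(subjects, key=lambda s: s["experience"])
    assignment = {}
    block_size = len(treatments)
    for i in range(0, len(ordered), block_size):
        block = ordered[i:i + block_size]
        shuffled = list(treatments)[:len(block)]
        rng.shuffle(shuffled)
        for subj, treat in zip(block, shuffled):
            assignment[subj["id"]] = treat
    return assignment

# Hypothetical subjects with experience scores on an arbitrary scale.
subjects = [{"id": f"S{i}", "experience": e}
            for i, e in enumerate([7, 2, 9, 4, 5, 1, 8, 3])]
print(assign_blocked(subjects))
```

Within each experience block, each treatment is assigned exactly once, so treatment groups end up balanced on experience.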
Article
Full-text available
We describe an empirical evaluation of a visualization approach based on a 3D city metaphor, implemented in a tool called CodeCity. We designed the controlled experiment based on a set of lessons extracted from the current body of research and perfected it during a preliminary pilot phase. We then conducted the experiment in four locations across three countries over a period of four months, involving participants from both academia and industry. We detail the decisions behind our design as well as the lessons we learned from this experience. Finally, we present the results of the experiment and the complete set of data required to ensure repeatability of the experiment.
... Although there have been several works on evaluation of visualization techniques, we restrict ourselves to briefly report on controlled experiments aimed at assessing graph-based software visualization tools and approaches used to support program comprehension. Knodel et al. (2008) evaluated the impact of changing the configuration of graphical elements in their tool, called SAVE, for software architecture visualization and evaluation. The experiment compares the influence of two configurations utilizing Tomcat web server as object system. ...
Article
Full-text available
Many researchers have pointed out the lack of empirical studies that systematically examine the advantages and disadvantages of using visualization techniques to support software comprehension. Such studies are indispensable for collecting and analyzing objective, quantifiable evidence about the usefulness of proposed visualization techniques and tools and, moreover, for guiding research in software visualization. In this study, 6 typical software comprehension tasks were performed by 20 students of a software engineering course. Response time was measured and the accuracy of the participants' answers was graded. The results indicate that, on the one hand, the use of the graph-based visualization technique improved the accuracy of the students' answers (21.45% on average); on the other hand, no significant evidence was found of a reduction in the time spent by the students to solve the software comprehension tasks.
Article
Full-text available
The use of mobile devices to make payments has been increasing steadily. Every day more organizations adopt systems that include some form of mobile payment, so fast and agile systems are needed that can guarantee security and reliability for both the operator and the user, so that users obtain a quality service based on mobile technologies. This work analyzes NFC technology in the context of a proposed electronic payment system, applicable to the Quito Metro, to ensure response time and transactional security; a cloud-based system solution is adopted.
... This equates to an overall efficiency gain of 170 % for the eCITY configuration compared to the SAVE configuration. Knodel et al. [18] conduct a similar empirical experiment where they also evaluate the results with respect to the effect size, a representation of the difference in mean values as compared to the standard deviation. Similarly, we calculate the standard deviation in Table 2 using the formula of Hedges et al. [19] and claim an overall effect size of 5.77 standard deviations to be highly significant [20]. ...
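The effect-size computation mentioned in the excerpt can be sketched in a few lines; this is a minimal pure-Python version of Hedges' g (mean difference over the pooled standard deviation, with a small-sample bias correction), and the subject scores are invented for illustration:

```python
from math import sqrt
from statistics import mean, variance

def hedges_g(group_a, group_b):
    """Hedges' g: difference in means divided by the pooled standard
    deviation, with an approximate small-sample bias correction."""
    na, nb = len(group_a), len(group_b)
    # Pooled variance from the two sample variances.
    pooled_var = ((na - 1) * variance(group_a)
                  + (nb - 1) * variance(group_b)) / (na + nb - 2)
    d = (mean(group_a) - mean(group_b)) / sqrt(pooled_var)
    # Correction factor shrinks d slightly for small samples.
    correction = 1 - 3 / (4 * (na + nb) - 9)
    return d * correction

# Illustrative per-subject correctness scores (hypothetical).
treatment = [8, 9, 7, 9, 8]
control = [5, 6, 5, 7, 6]
print(round(hedges_g(treatment, control), 2))
```

An effect of several pooled standard deviations, as claimed in the excerpt, would be far above Cohen's conventional threshold of 0.8 for a "large" effect.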
Chapter
An essential component in the evolution and maintenance of large-scale software systems is to track the structure of a software system in order to explain how the system has evolved to its present state and to predict its future development. Current mainstream tools that facilitate the visualization of the structural evolution of software architecture are confined to easily integrated visualization techniques such as node-link diagrams, while more capable solutions have been proposed in academic research. To bridge this gap, we have incorporated additional views into a conventional tool, integrating an interactive evolving city layout and a combination of charts. However, due to limited access to the stakeholders, it was not possible to solicit them for a formal modeling process. Instead, an early prototype was developed and a controlled experiment was conducted to illustrate the vital role of such in-situ visualization techniques when aiming to understand the evolution of software architecture.
... Visual communication is a key factor in the teaching and learning process of software architecture and systems modeling [8] [9]. Understanding architectures is not a trivial activity, which consumes time and effort, due to factors such as [10]: (i) the high complexity of systems, (ii) the large amount of data, and (iii) the high number of dependencies among their architectural elements. For these reasons, the process of acquiring knowledge should be supported by visualization approaches and tools to reduce the effort of understanding by improving awareness of the behavior, structure, and evolution of the system. ...
Conference Paper
Full-text available
The creation, interpretation, and evolution of models and diagrams are activities commonly performed in software projects, and the teaching of such activities is fundamental for software engineering education. However, it is difficult to see the changes made during model corrections, evolution, or maintenance by the student, as well as by the teacher, when comparing templates and different solutions. In this regard, this paper presents the use of the PREViA approach, which aims at visualizing software architecture models. This work focuses on supporting teaching and learning in the modeling education context. A feasibility study was conducted and its results provided positive evidence for the use of the approach in the correction of diagrams.
... Knodel et al. presented the results of a controlled experiment for the evaluation of the role of graphical elements in visualization of software architecture [KMN08]. In a preliminary step, the authors verified the soundness of the tasks with the help of two experts in the object system (i.e., Tomcat). ...
Article
Full-text available
Software understanding takes up a large share of the total cost of a software system. The high costs attributed to software understanding activities are caused by the size and complexity of software systems, by the continuous evolution that these systems are subject to, and by the lack of physical presence, which makes software intangible. Reverse engineering helps practitioners deal with the intrinsic complexity of software by providing a broad range of patterns and techniques. One of these techniques is software visualization, which makes software more tangible by providing visible representations of software systems. Interpreting a visualization is by no means trivial and requires knowledge about the visual language of the visualization. One means to ease the learning of a new visualization's language is the use of metaphors, which allow the interpretation of new data representations by analogy. Possibly the most popular metaphor for software visualization is the city metaphor, which has been explored in the past by a number of researchers. However, in spite of these efforts, the value of this metaphor for reverse engineering has never been taken beyond anecdotal evidence. In this dissertation, we demonstrate the value of the city metaphor for reverse engineering along two directions. On the one hand, we show that the metaphor is versatile enough to allow the representation of different facets of software. On the other hand, we show that the city metaphor enables the creation of software visualizations which efficiently and effectively support reverse engineering activities. Our interpretation of the city metaphor at its core depicts the system as a city, the packages as districts, and the classes as buildings. The resulting "code city" visualization provides a structural overview of the software system, enriched with contextual data. To be able to perform analyses of real systems using our approach, we implemented a tool called CodeCity.
We demonstrate the versatility of the metaphor by using it in three different analysis contexts, i.e., program comprehension, software evolution analysis, and software design quality assessment. For each of the contexts, we describe the visualization techniques we employ to encode the contextual data in the visualization, and we illustrate the application by means of case studies. The insights gained in the three analysis contexts are complementary to each other, leading to an increasingly more complete "big picture" of the systems. We then demonstrate, by means of an extensive controlled experiment, how the visualizations built on top of our city metaphor effectively and efficiently support reverse engineering activities. The design of our experiment is based on a list of desiderata that we extracted from our survey of the current body of research. We conducted the experiment over a period of six months, in four sites located in three countries, with a heterogeneous sample of subjects composed of fair shares of both academics and industry practitioners. The main result of our experiment was that, overall, our approach outperforms the state-of-practice in supporting users in solving reverse engineering tasks, in terms of both correctness and completion time.
Article
Full-text available
It has been argued that reporting software engineering experiments in a standardized way helps researchers find relevant information, understand how experiments were conducted and assess the validity of their results. Various guidelines have been proposed specifically for software engineering experiments. The benefits of such guidelines have often been emphasized, but the actual uptake and practice of reporting have not yet been investigated since the introduction of many of the more recent guidelines. In this research, we utilize a mixed-method study design including sequence analysis techniques for evaluating to which extent papers follow such guidelines. Our study focuses on the four most prominent software engineering journals and the time period from 2000 to 2020. Our results show that many experimental papers miss information suggested by guidelines, that no de facto standard sequence for reporting exists, and that many papers do not cite any guidelines. We discuss these findings and implications for the discipline of experimental software engineering focusing on the review process and the potential to refine and extend guidelines, among others, to account for theory explicitly.
Conference Paper
Successful software systems evolve over time and are typically tailored to individual customer needs. Consequently, these adaptations result in multiple variants of the system. These multiple variants impose a challenge on the development organizations because the variation points are often neither explicitly known nor managed, and emerge uncontrolled. In this paper, we propose a technique that visualizes the variation points on the level of the software architecture. The technique - called variant comparison - has been successfully applied in one internal and two industrial studies. This paper summarizes our practical experience in the application of variant comparison. Further, we discuss our lessons learned on how the variant comparison can enable explicit variability management in a development organization.
Conference Paper
The ArQuE project has developed an integrated and comprehensive method that enables goal-oriented, architecture-centric development and strategic quality engineering. The consolidated expertise from applying the ArQuE approach at different industry partners shows its applicability and scalability in the embedded systems domain. The approach leads to reduced maintenance effort and simplified evolution of the respective software systems. The experience gained in the instantiations of the approach for the industrial systems documents the overall project success and gives evidence that the overall return on investment of the ArQuE approach is positive.
Article
Full-text available
Software organizations' main assets are not plants, buildings, or expensive machines. A software organization's main asset is its intellectual capital, as it is in sectors such as consulting, law, investment banking, and advertising. The major problem with intellectual capital is that it has legs and walks home every day. At the same rate experience walks out the door, inexperience walks in the door. Whether or not many software organizations admit it, they face the challenge of sustaining the level of competence needed to win contracts and fulfill undertakings.
Article
Full-text available
Software visualization is concerned with the static visualization as well as the animation of software artifacts, such as source code, executable programs, and the data they manipulate, and their attributes, such as size, complexity, or dependencies. Software visualization techniques are widely used in the areas of software maintenance, reverse engineering, and re-engineering, where typically large amounts of complex data need to be understood and a high degree of interaction between software engineers and automatic analyses is required. This paper reports the results of a survey on the perspectives of 82 researchers in software maintenance, reverse engineering, and re-engineering on software visualization. It describes to which degree the researchers are involved in software visualization themselves, what is visualized and how, whether animation is frequently used, whether the researchers believe animation is useful at all, which automatic graph layouts are used if at all, whether the layout algorithms have deficiencies, and, last but not least, where the medium-term and long-term research in software visualization should be directed. The results of this survey help to ascertain the current role of software visualization in software engineering from the perspective of researchers in these domains and give hints on future research avenues. Copyright © 2003 John Wiley & Sons, Ltd.
Conference Paper
Full-text available
The evolution and maintenance of large-scale software systems first requires an understanding of their architecture before delving into lower-level details. Tools facilitating architecture comprehension tasks by visualization provide different sets of graphical elements. We conducted a controlled experiment that exemplifies the critical role of such graphical elements in understanding an architecture. The results show that a different configuration of graphical elements influences program comprehension tasks significantly. In particular, a 63% gain in effectiveness in basic architectural analysis tasks was achieved simply by choosing a different set of graphical elements. Based on the results, we claim that significant effort should be spent on the configuration of architecture visualization tools.
Article
Full-text available
This paper presents an experiment to explain why and under which circumstances visual programming languages are easier to understand than textual programming languages. Toward this goal we bring together research from the psychology of programming and image processing. According to current theories of imagery processing, imagery facilitates quicker access to semantic information. Thus, visual programming languages should allow for quicker construction of a mental representation based on the data-flow relationships of a program than procedural languages do. To test this hypothesis, the mental models of C and spreadsheet programmers were assessed in different program comprehension situations. The results showed that spreadsheet programmers developed data-flow-based mental representations in all situations, while C programmers seemed to access first a control-flow-based and then a data-flow-based mental representation. These results could help to expand theories of mental models from the psychology of programming to account for the effect of imagery.
Conference Paper
Full-text available
An effective approach to program understanding involves browsing, exploring, and creating views that document software structures at different levels of abstraction. While exploring the myriad of relationships in a multi-million-line legacy system, one can easily lose context. One approach to alleviate this problem is to visualize these structures using fisheye techniques. This paper introduces Simple Hierarchical Multi-Perspective views (SHriMPs). The SHriMP visualization technique has been incorporated into the Rigi reverse engineering system. This greatly enhances Rigi's capabilities for documenting design patterns and architectural diagrams that span multiple levels of abstraction. The applicability and usefulness of SHriMPs is illustrated with selected program understanding tasks.
Conference Paper
Full-text available
This paper describes the SHriMP visualization technique for seamlessly exploring software structure and browsing source code, with a focus on effectively assisting hybrid program comprehension strategies. The technique integrates both pan+zoom and fisheye-view visualization approaches for exploring a nested graph view of software structure. The fisheye-view approach handles multiple focal points, which are necessary when examining several subsystems and their mutual interconnections. Source code is presented by embedding code fragments within the nodes of the nested graph. Finer connections among these fragments are represented by a network that is navigated using a hypertext link-following metaphor. SHriMP combines this hypertext metaphor with animated panning and zooming motions over the nested graph to provide continuous orientation and contextual cues for the user. The SHriMP tool is currently being evaluated in several user studies. Observations of users performing program understanding tasks with the tool are discussed.
Conference Paper
Full-text available
Open source software systems provide a variety of field-tested components, offering software development organizations the potential to reuse and adapt such components for their own purposes. The main challenge before achieving the reuse benefits is to acquire a thorough understanding of open source software systems (i.e., the reuse candidates) in order to reason about alternative solutions, to learn about the points where to adapt the system, and eventually to decide whether or not to invest in reuse. Manually analyzing even small systems is a time-consuming, complex, and costly task. In this paper we present a case study where we analyzed the Apache Tomcat web server supported by a software architecture visualization and evaluation tool and demonstrate how the tool facilitated our comprehension tasks to learn about the architectural means and concepts.
Conference Paper
Full-text available
Modern object-oriented programs are hierarchical systems with many thousands of interrelated subsystems. Visualization helps developers to better comprehend these large and complex systems. This paper presents a three-dimensional visualization technique that represents the static structure of object-oriented programs using landscape-like distributions of three-dimensional objects on a two-dimensional plane. The familiar landscape metaphor facilitates intuitive navigation and comprehension. The visual complexity is reduced by adjusting the transparency of object surfaces to the distance of the viewpoint. An approach called Hierarchical Net is proposed for a clear representation of the relationships between the subsystems.
Article
Full-text available
In this paper, we explore the question of whether program understanding tools enhance or change the way that programmers understand programs. The strategies that programmers use to comprehend programs vary widely. Program understanding tools should enhance or ease the programmer's preferred strategies, rather than impose a fixed strategy that may not always be suitable. We present observations from a user study that compares three tools for browsing program source code and exploring software structures. In this study, 30 participants used these tools to solve several high-level program understanding tasks. These tasks required a broad range of comprehension strategies. We describe how these tools supported or hindered the diverse comprehension strategies used. Keywords: fisheye views, program comprehension, program understanding tools, reverse engineering, software maintenance, software visualization, user study.
Article
Full-text available
One possible reason for the continued neglect of statistical power analysis in research in the behavioral sciences is the inaccessibility of or difficulty with the standard material. A convenient, although not comprehensive, presentation of required sample sizes is provided. Effect-size indexes and conventional values for these are given for operationally defined small, medium, and large effects. The sample sizes necessary for .80 power to detect effects at these levels are tabled for 8 standard statistical tests: (1) the difference between independent means, (2) the significance of a product-moment correlation, (3) the difference between independent rs, (4) the sign test, (5) the difference between independent proportions, (6) chi-square tests for goodness of fit and contingency tables, (7) 1-way analysis of variance (ANOVA), and (8) the significance of a multiple or multiple partial correlation.
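The tabled sample sizes described above can be approximated numerically. A minimal sketch for the first test (difference between independent means), using the normal approximation to the two-sample t test with two-tailed α = .05 and power = .80; the results differ slightly from the exact t-based tables:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate sample size per group needed to detect a
    standardized mean difference d with a two-tailed two-sample
    test (normal approximation to the t test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = z.inv_cdf(power)           # quantile for the target power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Cohen's conventional small, medium, and large effect sizes.
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    print(label, n_per_group(d))
```

The steep growth as d shrinks (hundreds of subjects per group for a small effect versus a few dozen for a large one) is exactly why underpowered studies are so common.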
Conference Paper
Full-text available
The software architecture is one of the most crucial artifacts within the lifecycle of a software system. Decisions made at the architectural level directly enable, facilitate, hamper, or interfere with the achievement of business goals, functional and quality requirements. Architecture evaluations play an important role in the development and evolution of software systems since they determine how adequate the architecture is for its intended usage. This paper summarizes our practical experience with using architecture evaluations and gives an overview on when and how static architecture evaluations contribute to architecture development. We identify ten distinct purposes and needs for static architecture evaluations and illustrate them using a set of industrial and academic case studies. In particular, we show how subsequent steps in architecture development are influenced by the results from architecture evaluations.
Conference Paper
Full-text available
Previous studies have shown that novices do not tend to extract or use data-flow information during program comprehension. However, for impact analysis and similar tasks, data-flow information is necessary and highly relevant. Visual data-flow programming languages, such as Prograph/CPX, have been commercially successful, suggesting that they provide effective data-flow representations. To explore data-flow representations for program comprehension, we augment Prograph data-flow programs with control-flow features to determine the effects on comprehension. We hypothesize that combined control/data-flow representations will aid comprehension better than data-flow alone. To validate this hypothesis, we present the results of an experiment comparing three combined representations against a data-flow only representation. While the addition of control-flow was found to be beneficial, the complexity of the representations plays an important role. Complex and highly detailed control-flow, although perceived as useful, is less effective when combined with data-flow than less detailed and less complex control-flow descriptions. This finding suggests a tradeoff exists between a representation's content and complexity. We found a nested representation describing inter-method control-flow to be the most effective for supporting program comprehension.
Conference Paper
Full-text available
Many analyses of software systems can be formalized as relational queries, for example the detection of design patterns, of patterns of problematic design, of code clones, of dead code, and of differences between the as-built and the as-designed architecture. This paper describes the concepts of CrocoPat, a tool for querying and manipulating relations. CrocoPat is easy to use, because of its simple query and manipulation language based on predicate calculus, and its simple file format for relations. CrocoPat is efficient, because it internally represents relations as binary decision diagrams, a data structure that is well-known as a compact representation of large relations in computer-aided verification. CrocoPat is general, because it manipulates not only graphs (i.e. binary relations), but n-ary relations.
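The kinds of relational queries CrocoPat answers can be sketched in plain Python by representing relations as sets of tuples. The example below (relation contents and names are invented for illustration, and ordinary sets stand in for CrocoPat's BDD representation) phrases a "dead code" analysis as reachability over a CALL relation:

```python
# Hypothetical CALL relation: (caller, callee) pairs extracted from code.
CALL = {("main", "parse"), ("parse", "lex"), ("util", "log")}

def image(rel, nodes):
    """Relational image: all b such that (a, b) in rel for some a in nodes."""
    return {b for (a, b) in rel if a in nodes}

def reachable(rel, start):
    """Fixed-point computation of everything reachable from 'start'."""
    seen = set(start)
    frontier = set(start)
    while frontier:
        frontier = image(rel, frontier) - seen
        seen |= frontier
    return seen

# Dead code = entities mentioned in CALL but unreachable from main.
entities = {x for pair in CALL for x in pair}
dead = entities - reachable(CALL, {"main"})
# 'util' and 'log' are never reached from main
```

CrocoPat expresses the same query declaratively in predicate calculus; the fixed-point loop here mirrors what its engine computes over binary decision diagrams.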
Conference Paper
Full-text available
The reflexion model originally proposed by Murphy and Notkin allows one to structurally validate a descriptive or prescriptive architecture model against a source model. First, the entities in the source model are mapped onto the architectural model, then discrepancies between the architecture model and source model are computed automatically. The original reflexion model allows an analyst to specify only non-hierarchical architecture models, which is insufficient for larger systems that are decomposed into hierarchical subsystems. This paper extends the original reflexion model to hierarchical architecture models, describes a method to apply this technique, and reports on case studies conducted on two large-scale and complex applications (namely, the C compiler sdcc for 8-bit microprocessors and the GNU C compiler gcc).
Conference Paper
Full-text available
This paper describes a structured tool demonstration, a hybrid evaluation technique that combines elements from experiments, case studies and technology demonstrations. Developers of program understanding tools were invited to bring their tools to a common location to participate in a scenario with a common subject system. Working simultaneously the tool teams were given reverse engineering tasks and maintenance tasks to complete on an unfamiliar subject system. Observers were assigned to each team to find out how useful the observed program comprehension tool would be in an industrial setting. The demonstration was followed by a workshop panel where the development teams and the observers presented their results and findings from this experience
Conference Paper
Full-text available
The purpose of the article is to report on a structured demonstration for comparing program comprehension tools. Five teams of program comprehension tool designers applied their tools to a set of maintenance tasks on a common subject system. By applying a variety of reverse engineering techniques to a predefined set of tasks, the tools can be compared using a common playing field. A secondary topic of discussion will address the development of “guinea pig” systems and how to use them in a structured demonstration for evaluating software tools
Conference Paper
Full-text available
Software architecture is important for large systems, in which it is the main means for, among other things, controlling complexity. Current ideas on software architectures were not available more than ten years ago. Software developed at that time has been deteriorating from an architectural point of view over the years, as a result of adaptations made in the software because of changing system requirements. Parts of the old software are nevertheless still being used in new product lines. To make changes in that software, like adding features, it is imperative to first adapt the software to accommodate those changes. Architecture improvement of existing software is therefore becoming more and more important. The paper describes a two-phase process for software architecture improvement, which is the synthesis of two research areas: the architecture visualisation and analysis area of Philips Research, and the transformation engines and renovation factories area of the University of Amsterdam. Software architecture transformation plays an important role, and is to our knowledge a new research topic. Phase one of the process is based on Relation Partition Algebra (RPA). By lifting the information to higher levels of abstraction and calculating metrics over the system, all kinds of quality aspects can be investigated. Phase two is based on formal transformation techniques on abstract syntax trees. The software architecture improvement process allows for a fast feedback loop on results, without the need to deal with the complete software and without any interference with the normal development process.
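The "lifting" step that RPA performs can be illustrated with a few lines of Python: a low-level uses relation between source entities is mapped through a part-of function onto subsystem level. This is a simplified sketch with invented names, and dropping intra-subsystem pairs is a choice made here for readability, not necessarily part of the formal algebra:

```python
def lift(rel, part):
    """Lift a file-level 'uses' relation to the subsystem level via a
    part-of mapping; intra-subsystem pairs are elided in this sketch."""
    return {(part[a], part[b]) for (a, b) in rel if part[a] != part[b]}

# Hypothetical example: two UI files and one database file.
part = {"f.c": "UI", "g.c": "UI", "h.c": "DB"}
uses = {("f.c", "g.c"), ("g.c", "h.c")}
# lift(uses, part) leaves a single subsystem-level dependency: UI -> DB
```

Metrics (fan-in, fan-out, cycles) can then be computed over the small lifted relation instead of the full source-level one.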
Conference Paper
Full-text available
Software architecture visualization tools tend to support browsing, that is, exploration by following concepts. If architectural diagrams are to be used during daily software maintenance tasks, these tools also need to support specific fact-finding through searching. Searching is essential to program comprehension and hypothesis testing. Furthermore, searching allows users to reverse the abstractions in architectural diagrams and access facts in the underlying program code. We consider the problem of searching and browsing software architectures using perspectives from information retrieval and program comprehension. After analyzing our own user studies and results from the literature, we propose a solution: the Searchable Bookshelf, an architecture visualization tool that supports both navigation styles. We also present a prototype of our tool which is an extension of an existing architecture visualization tool
Article
Full-text available
The artifacts constituting a software system often drift apart over time. We have developed the software reflexion model technique to help engineers perform various software engineering tasks by exploiting, rather than removing, the drift between design and implementation. More specifically, the technique helps an engineer compare artifacts by summarizing where one artifact (such as a design) is consistent with and inconsistent with another artifact (such as source). The technique can be applied to help a software engineer evolve a structural mental model of a system to the point that it is “good enough” to be used for reasoning about a task at hand. The software reflexion model technique has been applied to support a variety of tasks, including design conformance, change assessment, and an experimental reengineering of the million-lines-of-code Microsoft Excel product. We provide a formal characterization of the reflexion model technique, discuss practical aspects of the approach, relate experiences of applying the approach and tools, and place the technique into the context of related work
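The core comparison in the reflexion model can be sketched compactly: lift the source-level dependencies through the entity-to-module mapping, then partition the result against the architect's model. This is a simplified illustration with invented module and file names, not the authors' implementation:

```python
def reflexion(arch_edges, src_edges, mapping):
    """Classify dependencies as convergences (in both models),
    divergences (in source only), and absences (in architecture only),
    in the spirit of the Murphy/Notkin reflexion model."""
    lifted = {(mapping[a], mapping[b]) for (a, b) in src_edges
              if mapping[a] != mapping[b]}
    convergences = arch_edges & lifted
    divergences = lifted - arch_edges
    absences = arch_edges - lifted
    return convergences, divergences, absences

# Hypothetical example: the architect expects UI -> Logic -> DB,
# but the source also contains a call back from Logic into UI.
arch = {("UI", "Logic"), ("Logic", "DB")}
src = {("win.c", "calc.c"), ("calc.c", "win.c")}
mapping = {"win.c": "UI", "calc.c": "Logic"}
conv, div, absent = reflexion(arch, src, mapping)
```

Here the UI-to-Logic edge converges, the Logic-to-UI call is a divergence worth investigating, and the expected Logic-to-DB edge is absent from the source.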
Article
Full-text available
Modern object-oriented programs are hierarchical systems with many thousands of interrelated subsystems. Visualization helps developers to better comprehend these large and complex systems. This paper presents a three-dimensional visualization technique that represents the static structure of object-oriented programs using landscape-like distributions of three-dimensional objects on a two-dimensional plane. The familiar landscape metaphor facilitates intuitive navigation and comprehension. The visual complexity is reduced by adjusting the transparency of object surfaces to the distance of the viewpoint. An approach called Hierarchical Net is proposed for a clear representation of the relationships between the subsystems.
Article
Full-text available
Tool support is needed to cope with the complexity and the large amounts of information in reverse engineering. By creating representations in another form, often at a higher level of abstraction, state-of-the-art tools aid in reducing complexity and gaining insights into parts of a system's structure. However, orientation and navigation among these representations remains difficult. Often, superfluous tool-induced effort is needed to perform a certain task. We call this artificially added effort friction.
Conference Paper
An experimental environment for reverse engineering Java software is discussed. Static information is extracted from class files and viewed using the Rigi reverse engineering environment. The dynamic information is generated by running the target software under a debugger. The debugged event trace information is viewed as scenario diagrams using a prototype tool called SCED. In SCED, state diagrams can be synthesized automatically from scenario diagrams. Dynamic information can also be attached to the static Rigi graph. Both static and dynamic views contain information about software artifacts and their relations. Such overlapping information forms a connection for information exchange between the views. SCED scenario diagrams are used for slicing the Rigi view, and the Rigi view, in turn, is used to guide the generation of SCED scenario diagrams and for raising their level of abstraction.
Article
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture
Article
Recent advances in statistical methods for meta-analysis help reviewers to identify systematic variation in research results.
Article
Compared to a decade ago, when the first comprehensive study was done in software maintenance, many changes have occurred in the practice of system development. Longitudinal data were obtained by using the same survey instrument, updated to reflect current practices, and sampling the same population. Comparing the current with 1977 results has helped to identify the persistent problems and issues as well as the emerging problems and issues. One of the important, but somewhat disturbing, conclusions is that maintenance problems are pretty much the same as during the 1970s (except for minor changes), despite advances made in structured methodologies and techniques. In terms of specific problems, personnel problems of maintenance programmers, i.e. turnover and availability, and programmer effectiveness problems, i.e. skills, motivation and productivity, have shown a rise, while problems associated with users' knowledge of computer systems have declined.
Conference Paper
In many contexts, humans often represent their own "neighborhood" in great detail, yet only major landmarks further away. This suggests that such views ("fisheye views") might be useful for the computer display of large information structures like programs, data bases, online text, etc. This paper explores fisheye views, presenting in turn naturalistic studies, a general formalism, a specific instantiation, a resulting computer program, example displays, and an evaluation.
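Furnas' formalism assigns each node a degree of interest, DOI(x | focus y) = API(x) - D(x, y): a priori importance minus distance from the current focus, with low-scoring nodes elided from the display. The sketch below instantiates this on a small tree (node names invented; API(x) = -depth(x) and D = path distance, one common choice):

```python
# Tiny tree given as a child -> parent map; "root" is the top.
parent = {"root": None, "a": "root", "b": "root",
          "a1": "a", "a2": "a", "b1": "b"}

def path_to_root(x):
    path = []
    while x is not None:
        path.append(x)
        x = parent[x]
    return path

def distance(x, y):
    """Path distance in the tree, via the common-ancestor count."""
    px, py = path_to_root(x), path_to_root(y)
    common = len(set(px) & set(py))
    return (len(px) - common) + (len(py) - common)

def doi(x, focus):
    """Degree of interest: a priori importance minus focus distance."""
    depth = len(path_to_root(x)) - 1
    return -depth - distance(x, focus)

# With focus on "a1", its ancestors and siblings score higher than a
# distant leaf such as "b1", which a fisheye display would elide.
```

Thresholding DOI then yields exactly the fisheye effect: rich local detail around the focus, only landmarks (shallow ancestors) elsewhere.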
Article
Visualization of software systems is a widely used technique in software engineering. This paper proposes a 3D user-navigable software visualization system, termed KScope, that is comprised of a modular, component-based architecture. The flexibility of this construction allows for a variety of component configurations to validate experimental software visualization techniques. The first iteration of KScope is described and evaluated.
Conference Paper
Sab is a tool for automatic generation of class diagrams from Java code. In addition to hierarchical layout, it features a wealth of user interaction facilities for dealing with complex software. Its main trait, however, is its support for visualizing hierarchical layers of a system. Sab is well suited for exploring and reengineering arbitrary systems, including the standard libraries. The central features of the tool are presented, and an overview on its implementation is given
Article
In the Program Understanding Project at IBM's Research Division, work began in late 1986 on tools which could help programmers in two key areas: static analysis (reading the code) and dynamic analysis (running the code). The work is reported in the companion papers by Cleveland and by Pazel in this issue. The history and background which motivated and which led to the start of this research on tools to assist programmers in understanding existing program code is reported here.
Conference Paper
This paper describes a multi-perspective software visualization environment, SHriMP, which combines single view and multi-view techniques to support software exploration at both the architectural and source code levels. SHriMP provides three different views: a primary nested view and two subsidiary views. The primary nested view employs fisheye views of nested graphs, provides contextual cues, and supports general exploration activities. In SHriMP, subsidiary views exist as a searching tool and a relation tracer. These views complement each other and allow programmers to examine a software system from multiple perspectives.
Conference Paper
We present MetricView, a software visualization and exploration tool that combines traditional UML diagram visualization with metric visualization in an effective way. MetricView is very easy and natural to use for software architects and developers yet offers a powerful set of mechanisms that allow fine customization of the visualizations for getting specific insights. We discuss several visual and architectural design choices which turned out to be important in the construction of MetricView, and illustrate our approach with several results using real-life datasets.
Conference Paper
Visualization is a sound means to facilitate understanding of complex correlations and offers a broad variety of concepts. A problem with the visualization of software architectures is that there is almost no empirical evidence of the benefits of particular visualization concepts. That is why we introduce an approach that explicitly integrates architecture development with the selection, use, and validation of visualization metaphors. We successfully applied this approach to realize our software architecture visualization tool and empirically validated the use of the visualization concepts in the software architecture domain. We claim that software architecture visualizations should always be thoroughly assessed, especially by software architects in an industrial environment.
Conference Paper
The Eclipse platform presents an opportunity to openly collaborate and share visualization tools amongst the research community and with developers. In this paper, we present our own experiences of "plugging in" our visualization tool, SHriMP Views, into this environment. The Eclipse platform's Java Development Tools (JDT) and CVS plug-ins provide us with invaluable information on software artifacts, relieving us from the burden of creating this functionality from scratch. This allows us to focus our efforts on the quality of our visualizations and, as our tool is now part of a full-featured Java IDE, gives us greater opportunities to evaluate our visualizations. The integration process required us to re-think some of our tool's architecture, strengthening its ability to be plugged into other environments. We step through a real-life scenario, using our newly integrated tool to aid us in merging two branches of source code. Finally, we detail some of the issues we have encountered in this integration and provide recommendations for other developers of visualization tools considering integration with the Eclipse platform.
Conference Paper
Product line practices are increasingly becoming popular in the domain of embedded software systems. This paper presents results of assessing the success, consistency, and quality of Testo's product line of climate and flue gas measurement devices after its construction and the delivery of three commercial products. The results of the assessment showed that the incremental introduction of architecture-centric product line development can be considered successful, even though there is no quantifiable reduction so far in time-to-market or in development and maintenance costs. The success is mainly shown by the ability of Testo to develop more complex products and the satisfaction of the involved developers. A major issue encountered is ensuring the quality of reusable components and the conformance of the products to the architecture during development and maintenance.
Book
Like other sciences and engineering disciplines, software engineering requires a cycle of model building, experimentation, and learning. Experiments are valuable tools for all software engineers who are involved in evaluating and choosing between different methods, techniques, languages and tools. The purpose of Experimentation in Software Engineering is to introduce students, teachers, researchers, and practitioners to empirical studies in software engineering, using controlled experiments. The introduction to experimentation is provided through a process perspective, and the focus is on the steps that we have to go through to perform an experiment. The book is divided into three parts. The first part provides a background of theories and methods used in experimentation. Part II then devotes one chapter to each of the five experiment steps: scoping, planning, execution, analysis, and result presentation. Part III completes the presentation with two examples. Assignments and statistical material are provided in appendixes. Overall the book provides indispensable information regarding empirical studies in particular for experiments, but also for case studies, systematic literature reviews, and surveys. It is a revision of the authors' book, which was published in 2000. In addition, substantial new material, e.g. concerning systematic literature reviews and case study research, is introduced. The book is self-contained and it is suitable as a course book in undergraduate or graduate studies where the need for empirical studies in software engineering is stressed. Exercises and assignments are included to combine the more theoretical material with practical aspects. Researchers will also benefit from the book, learning more about how to conduct empirical studies, and likewise practitioners may use it as a "cookbook" when evaluating new methods or techniques before implementing them in their organization.
Article
Documentation is an integral part of any software system. It contains the information that is necessary to effectively and successfully develop, use, and maintain a software system. In practice, however, the creation of appropriate documentation is largely neglected, resulting in documentation of low quality. The quality of software documentation depends on the tasks it is used for and can thus only be assessed objectively to a small extent. A view-based approach to software documentation is, therefore, presented and validated in this thesis that improves the quality of software documentation by developing customized documents that provide, for the users of the documentation, precisely the information they require. The approach supports software-developing organizations in improving their documentation by providing a quality model to assess the quality of their current documentation and a method to systematically improve that quality based on established documentation practices.
Article
Thesis (M.S.)--State University of New York at New Paltz, 2001. Bibliographical references: leaf 49. Photocopy.
Article
Software organizations' main assets are not plants, buildings, or expensive machines. A software organization's main asset is its intellectual capital, as it is in sectors such as consulting, law, investment banking, and advertising. The major problem with intellectual capital is that it has legs and walks home every day. At the same rate experience walks out the door, inexperience walks in the door. Whether or not many software organizations admit it, they face the challenge of sustaining the level of competence needed to win contracts and fulfill undertakings.
Conference Paper
The paper describes a system, Imsovision, for visualizing object-oriented software in a virtual reality environment. A visualization language (COOL) is defined that maps C++ source code to a visual representation. Our aim is to develop a language with few metaphors and constructs, but with the ability to represent a variety of elements with no ambiguity or loss of meaning. In addition, the visualization has to make maximal use of the potential of the medium. The design of the OO software system and its attributes are represented in the visualization. Class information, relationships between classes, and metric information is displayed. VRML is used for the visualization, and it is rendered in the CAVE environment.