Conference Paper · PDF available

Decomposition of data flow diagrams

Abstract

Data flow diagrams are an important design aid in system development. CASE tools allow data flow diagram construction and modification to be automated. Decomposition is the top-down development of a data flow diagram, starting with the system inputs and the system outputs. Decomposition may also be automated, resulting in an interactive process for data flow diagram design. Adler (1988) described an algebra for the decomposition of data flow diagrams, together with a set of quality measures. The authors show that these quality measures do not correspond to the intuitive notion of a good decomposition, and propose a new set of criteria that does. The use of Adler's algebra leads to an inefficient decomposition process, and one that is not guaranteed to find a good decomposition. The authors give an efficient algorithm which yields a good decomposition.
... Adler (1988) presented an algebra formalizing the decomposition procedure for data flow diagrams; however, it has drawbacks, including inefficiency and no guarantee of finding a good decomposition. Arndt and Guercio (1992) improved on Adler's work by supplying new criteria and implementing their own algorithm for the decomposition of data flow diagrams. Both studies, however, lack any assessment of the decomposition result beyond the intuitive notion. ...
Article
Microservices architecture emphasizes employing multiple small-scale and independently deployable microservices, rather than encapsulating all function capabilities into one monolith. Correspondingly, microservice-oriented decomposition, which has been identified as an extremely challenging task, plays a crucial and prerequisite role in developing microservice-based systems. To address the challenges in such a task, we propose a dataflow-driven semi-automatic decomposition approach. In particular, a four-step decomposition procedure is defined: (1) conduct the business requirement analysis to generate use case and business logic specifications; (2) construct the fine-grained Data Flow Diagrams (DFD) and the process-datastore version of DFD (DFDPS) representing the business logic; (3) extract the dependencies between processes and datastores into decomposable sentence sets; and (4) identify candidate microservices by clustering processes and their closely related datastores into individual modules from the decomposable sentence sets. To validate this microservice-oriented decomposition approach, we performed a case study on the Cargo Tracking System, a typical case decomposed by other microservice identification methods (Service Cutter and API Analysis), and made comparisons in terms of specific coupling and cohesion metrics. The results show that the proposed dataflow-driven decomposition approach can recommend microservice candidates with sound coupling and cohesion through a rigorous and easy-to-operate implementation with semi-automatic support.
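The clustering step (4) above can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' tool: the example "decomposable sentences", process names, and the rule "processes sharing a datastore form one candidate service" are all assumptions made for the sketch.

```python
from collections import defaultdict

# Hypothetical "decomposable sentences": (process, operation, datastore)
# triples extracted from a DFD, as in step (3) of the approach above.
sentences = [
    ("ViewCargos", "read", "CargoDB"),
    ("BookCargo", "write", "CargoDB"),
    ("HandleCargoEvent", "write", "EventDB"),
    ("ViewTrackings", "read", "EventDB"),
    ("CreateLocation", "write", "LocationDB"),
]

def cluster_microservices(sentences):
    """Group processes that touch the same datastore into one candidate
    microservice (a crude stand-in for the paper's clustering step)."""
    # Union-find over processes and datastores.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for proc, _op, store in sentences:
        union(proc, store)
    # Collect each connected component as one candidate service.
    clusters = defaultdict(set)
    for proc, _op, store in sentences:
        clusters[find(store)].update({proc, store})
    return [sorted(c) for c in clusters.values()]

for candidate in cluster_microservices(sentences):
    print(candidate)
```

On this toy input the sketch yields three candidates, one per datastore; a real clustering would also weigh operation types and cross-datastore dependencies.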
... Adler [19] presented an algebra formalizing the decomposition procedure for data flow diagrams; however, it has drawbacks, including inefficiency and no guarantee of finding a good decomposition. Arndt et al. [20] improved on Adler's work by supplying new criteria and implementing their own algorithm for the decomposition of data flow diagrams. Both studies, however, lack any assessment of the decomposition result beyond the intuitive notion. ...
... DFD is an important technique in system development. DFD allows developers to show the system's flow [4]. As shown in Figure 2, there are four symbols used to represent system requirements, which are external agents, data flows, data stores and processes [5]. ...
Article
Full-text available
[Muhamad Amirul Bin Mat Hussain, Ahmad Suhaimi Baharudin, Kamal Karkonasasi (2016). USM Internship and Career Portal. International Journal of Applied Engineering Research, ISSN 0973-4562, Volume 11, Number 20, pp. 10247-10251. Indexing: Scopus.] The use of information technology in people's lives is rapidly becoming more important every day. The development and use of web pages and portals is part of this information technology, and is becoming more popular because it makes it easier for people to access information and services all over the world. This paper discusses the development of the USM Internship and Career Portal. The portal was developed to overcome the problems and limitations that arise when using the manual system, whose workflow is neither efficient nor effective and is no longer reliable. The proposed system is divided into seven modules: Profile, Internship, Career, Forum, Evaluation, Report, and Logbook.
... A Data Flow Diagram is a graphical system model that shows all of the main requirements for an information system in one diagram. It represents the flow of information within the system: for example, how information enters and leaves the system, what changes the information, where information is stored, and so on [5]. ...
Article
Full-text available
[Nur Fatin Eisa, Ahmad Suhaimi Baharudin, Kamal Karkonasasi (2016). Developing ADUN (Ahli Dewan Undangan Negeri / State Assemblyman) E-Community Portal: Announcement Module and Discussion Module. International Journal of Engineering and Technology (IJET), p-ISSN: 2319-8613, e-ISSN: 0975-4024, Vol. 8, Number 3, Jun-Jul 2016, pp. 1463-1470. Indexing: Scopus.] Nowadays, many government agencies have already developed portal sites for their citizens. Such portals are particularly useful to those who are dissatisfied with a government service. However, their main disadvantage is the lack of public services offered to citizens. The ADUN e-community portal is therefore designed to utilize information and help the ADUN manage complaints created by the community. Using the system, community members can keep a close eye on the progress of troublesome complaints that disturb their living conditions, and the ADUN can take proper control of the information and complaints received from the community. The system is divided into nine modules: my profile, announcement, discussion, complaints, directory, message board, crime prevention tips, report, and analysis. This paper discusses the announcement module and the discussion module in detail.
... A DFD (see Appendix A) graphically describes a system from a general to a specific perspective; the description is based on inputs, outputs and processes. A decomposition technique is essential for starting from a high-level description and refining it until the desired level of detail is obtained [AG92]; basically, this consists of defining the main functionality or process of the system as the top level, from which further functionalities are defined at sub-levels. Figure 3.5 illustrates two levels of description. ...
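The leveling just described can be modeled as a simple tree, where refining a process adds a sub-level below it. A minimal sketch, with all process names invented for illustration:

```python
# A hypothetical two-level DFD hierarchy mirroring the top-level /
# sub-level description above.
dfd = {
    "0 ManageOrders": {          # top-level process
        "1 ReceiveOrder": {},    # sub-level processes (not yet refined)
        "2 CheckStock": {},
        "3 ShipOrder": {},
    }
}

def levels(node):
    """Count how many levels of description a DFD subtree contains."""
    if not node:
        return 0
    return 1 + max(levels(child) for child in node.values())

print(levels(dfd))  # two levels, as in the Figure 3.5 example
```

Refining "2 CheckStock" into its own child processes would raise the count to three, which is exactly how decomposition proceeds until the desired detail is reached.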
Thesis
Daimler FleetBoard offers telematic services by means of special hardware installed in customers' vehicles to collect and send data to the FleetBoard Service Centre (FBSC) platform. FBSC is in charge of receiving, processing and storing data generated by vehicles. The quality assurance and testing department guarantees that the telematic services meet their purpose and that no failures exist in the system. To that end, software to simulate vehicles' behaviour is required to test the functionalities of FBSC. However, a problem arises because this software uses simulated data instead of real data. In addition, the process of creating routes for simulations is manual. Based on these problems, the objective of this thesis is to design, implement and evaluate a prototype as a mechanism for importing routes generated by real vehicles into the simulator's database, to emphasise the use of real data in simulations. Additionally, the process of creating routes is optimized using Web Map Services to automate it. Finally, an evaluation of the prototypical implementation is conducted to guarantee the proper operation of the prototype's layers: the web GUI (supported by Java Server Faces), the business logic, and the persistence layer (backed by the Java Persistence API).
... Dragos Truscan, João M. Fernandes, Johan Lilius, T. Arndt, and A. Guercio stated that the set of quality measures described by Adler does not correspond to the intuitive notion of a good decomposition, and that the use of Adler's algebra leads to an inefficient decomposition process, as well as one which is not guaranteed to find a good decomposition [4]. These authors proposed an approach to automating the process of DFD design. ...
Conference Paper
Full-text available
In the past two decades there has been continuous change in software development. Organizations use different programming languages for developing different software applications. Applications developed earlier were based on procedural programming languages like C, FORTRAN, COBOL, etc. Applications being developed now may be based on object-oriented languages, procedural languages, or a mix of both. In order to understand how an information system is designed, one may need to understand the behavior of the program. The behavior of the program can be understood with the help of design information, and this design information about the application program can be abstracted from the data flow diagram. In this paper we propose a methodology to abstract the behavior of a program and then represent this behavior in the form of a data flow diagram through a series of steps.
Conference Paper
Software engineering is particularly concerned with the construction of large systems. Existing software engineering tools tend to be adequate for medium-size systems, but not as useful for large and very large systems. This paper presents a model viewing system as the basis for graph-based CASE tools which overcomes this lack of scalability. With existing commercial tools, the number of user steps to browse or edit a model increases with the size of the model. With the approach of this paper, the number of steps remains constant regardless of the model size. In fact, only one step is required for operations such as adding a flow between two processes anywhere in the model, or moving a submodel to a new parent. This paper outlines the approach with a structured analysis example, provides a formal description of the model viewing system, and discusses some limitations.
Article
DARTS—a design method for real-time systems—leads to a highly structured modular system with well-defined interfaces and reduced coupling between tasks.
Article
A layout algorithm is presented that allows the automatic drawing of data flow diagrams, a diagrammatic representation widely used in the functional analysis of information systems. A grid standard is defined for such diagrams, and aesthetics for a good readability are identified. The layout algorithm receives as input an abstract graph, specifying connectivity relations between the elements of the diagram, and produces as output a corresponding diagram according to the aesthetics. The basic strategy is to build incrementally the layout; first, a good topology is constructed with few crossings between edges; subsequently, the shape of the diagram is determined in terms of angles appearing along edges; and finally, dimensions are given to the graph, obtaining a grid skeleton for the diagram.
Article
DeMarco's "Structured Analysis and System Specification" is the final paper chosen for inclusion in this book of classic articles on the structured revolution. It is the last of three on the subject of analysis, and, together with Ross/Schoman [Paper 22] and Teichroew/Hershey [Paper 23], provides a good idea of the direction that structured analysis will be taking in the next few years. Any competent systems analyst undoubtedly could produce a five-page essay on "What's Wrong with Conventional Analysis." DeMarco, being an ex-analyst, does so with pithy remarks, describing conventional analysis as follows: "Instead of a meaningful interaction between analyst and user, there is often a period of fencing followed by the two parties' studiously ignoring each other... The cost-benefit study is performed backwards by deriving the development budget as a function of expected savings. (Expected savings were calculated by prorating cost reduction targets handed down from On High.)" In addition to providing refreshing prose, DeMarco's approach differs somewhat --- in terms of emphasis --- from that of Teichroew/Hershey and of Ross/Schoman. Unlike his colleagues, DeMarco stresses the importance of the maintainability of the specification. Take, for instance, the case of one system consisting of six million lines of COBOL and written over a period of ten years by employees no longer with the organization. Today, nobody knows what the system does. Not only have the program listings and source code been lost --- a relatively minor disaster that we all have seen too often --- but the specifications are completely out of date. Moreover, the system has grown so large that neither the users nor the data processing people have the faintest idea of what the system is supposed to be doing, let alone how the mysterious job is being accomplished!
The example is far from hypothetical, for this is the fate that all large systems eventually will suffer, unless steps are taken to keep the specifications both current and understandable across generations of users. The approach that DeMarco suggests --- an approach generally known today as structured analysis --- is similar in form to that proposed by Ross and Schoman, and emphasizes a top-down, partitioned, graphic model of the system-to-be. However, in contrast to Ross and Schoman, DeMarco also stresses the important role of a data dictionary and the role of scaled-down specifications, or minispecs, to be written in a rigorous subset of the English language known as Structured English. DeMarco also explains carefully how the analyst proceeds from a physical description of the user's current system, through a logical description of that same system, and eventually into a logical description of the new system that the user wants. Interestingly, DeMarco uses top-down, partitioned dataflow diagrams to illustrate this part of the so-called Project Life Cycle --- thus confirming that such a graphic model can be used to portray virtually any system. As in other short papers on the subject, the details necessary for carrying out DeMarco's approach are missing or are dealt with in a superficial manner. Fortunately, the details can be found: Listed at the end of the paper are references to three full-length books and one videotape training course, all dealing with the kind of analysis approach recommended by DeMarco.
Article
Classical and formal methods of information and software systems development are reviewed. The use of computer-aided software engineering (CASE) is discussed. These automated environments and tools make it practical and economical to use formal system-development methods. Their features, tools, and adaptability are discussed. The opportunities that CASE environments provide to use analysis techniques to assess the reliability of information systems before they are implemented, to audit a completed system against its design, and to maintain the system description as accurate documentation are examined.
Article
Data flow diagram process decomposition, as applied in the analysis phase of software engineering, is a top-down method that takes a process, and its input and output data flows, and logically implements the process as a network of smaller processes. The decomposition is generally performed in an ad hoc manner by an analyst applying heuristics, expertise, and knowledge to the problem. An algebra that formalizes process decomposition is presented using the De Marco representation scheme. In this algebra, the analyst relates the disjoint input and output sets of a single process by specifying the elements of an input/output connectivity matrix. A directed acyclic graph is constructed from the matrix and is the decomposition of the process. The graph basis, grammar matrix, graph interpretations, and the operators of the algebra are discussed. A decomposition procedure for applying the algebra, prototype and production tools, and the outlook are also discussed.
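The connectivity-matrix idea can be sketched as follows. This is a naive illustration, not Adler's actual algebra: the process and data names and the one-subprocess-per-output rule are assumptions made for the example, whereas the real algebra derives a richer acyclic graph via its operators.

```python
# An input/output connectivity matrix records which inputs of a
# process contribute to which outputs. A naive decomposition spawns
# one subprocess per output, consuming exactly the inputs connected
# to it, yielding a trivial two-level directed acyclic graph.

inputs = ["order", "stock_level"]
outputs = ["invoice", "reorder_request"]

# connectivity[i][j] == 1  iff  inputs[i] contributes to outputs[j]
connectivity = [
    [1, 0],  # order       -> invoice
    [1, 1],  # stock_level -> invoice, reorder_request
]

def decompose(inputs, outputs, connectivity):
    """Return one subprocess per output, wired to its connected inputs."""
    subprocesses = []
    for j, out in enumerate(outputs):
        ins = [inputs[i] for i in range(len(inputs)) if connectivity[i][j]]
        subprocesses.append({"name": f"produce_{out}", "inputs": ins, "output": out})
    return subprocesses

for p in decompose(inputs, outputs, connectivity):
    print(p["name"], p["inputs"], "->", p["output"])
```

Even this toy version shows why the matrix matters: changing a single entry rewires which data flows reach which subprocess, which is exactly the design decision the analyst records.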
D. T. Ross and J. W. Brackett, "An Approach to Structured Analysis," Computer Decisions, vol. 8, no. 9, pp. 40-44, Sept. 1976.