Figure 7 - uploaded by Jakob Mauss
An 8-bit full adder. For the empirical comparison, we used a set R of 137 relations, representing eight 1-bit full adders connected in series as shown in Figure 7, and the assignments c_0 = 1 and, for 0 ≤ k ≤ 7, x_k = 0, y_k = 1. If we add one more relation of the form c_k = 0, then R becomes inconsistent and contains a minimal conflict M of size |M| = 2 + 3k. This gives us 8 different sets R_k of size 138, each containing a minimal conflict M of size 2 + 3k.
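The conflict sizes quoted in the caption can be checked with a short simulation. The sketch below is illustrative (not the paper's code) and assumes that each propagation stage j < k contributes three relations to the conflict: the assignments x_j = 0 and y_j = 1 and the carry relation of adder j.

```python
# Illustrative sketch: why asserting c_k = 0 yields a minimal conflict
# of size 2 + 3k in the 8-bit adder example with c_0 = 1, x_k = 0, y_k = 1.
def full_adder_carry(x, y, c_in):
    """Carry-out of a 1-bit full adder: majority(x, y, c_in)."""
    return (x & y) | (c_in & (x ^ y))

c = 1  # the assignment c_0 = 1
for k in range(8):
    # With x_k = 0 and y_k = 1 the carry propagates unchanged,
    # so every c_k equals 1 and the extra relation c_k = 0 conflicts.
    assert c == 1
    # Minimal conflict M: the two relations c_0 = 1 and c_k = 0,
    # plus 3 relations per stage j < k (assumed per-stage count).
    print(f"k={k}: |M| = {2 + 3 * k}")
    c = full_adder_carry(0, 1, c)
```

Running the loop reproduces the sizes 2, 5, 8, ..., 23 for k = 0 to 7.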


Source publication
Conference Paper
Full-text available
We address here the following question: Given an inconsistent theory, find a minimal subset of it responsible for the inconsistency. Such conflicts are essential for problem solvers that make use of conflict-driven search (cf. (2, 4, 9)), for interactive applications where explanations are required (cf. (16, 22)), or as supporting tools for consist...

Citations

... In that case, representations that precisely express the inaccuracy, such as inequalities or intervals of values [lb, ub] may be used. Several papers deal with consistency-based diagnosis given inaccurate data [8,18,20,22]. ...
... Fortunately, for diagnosis, it is not always necessary to consider all derived intervals or inequations. Unnecessary computations can be avoided by ignoring derived intervals and inequations as long as there is no evidence that they cannot be ignored, and by identifying minimal conflicts using the derivation tree for a derived inconsistency [22]. ...
Article
Full-text available
Models used for Model-Based Diagnosis usually assume that observations, and predictions based on the system description, are accurate. In some domains, however, this assumption is invalid. Observations may not be accurate, or the behavior model of the system does not allow for accurate predictions. Therefore, the accuracy of predictions, which is a function of the accuracy of the observed system inputs and the behavior model of the system, may differ from the accuracy of the observed system outputs. This paper investigates the consequences of using inaccurate values. An initial version of the paper was published in Roos (2009). The paper will show that traditional notions of preferred diagnoses such as abductive diagnosis and minimum consistency-based diagnosis are no longer suited if the available data has different accuracies. A new notion of preferred diagnoses, called maximal-confirmation diagnoses, is introduced.
... In that case, representations that precisely express the inaccuracy, such as inequalities or intervals of values [lb, ub], may be used [15]. Several papers deal with consistency-based diagnosis given inaccurate data [3, 8, 9, 10]. The use of inaccurate values raises a number of problems with respect to the notion of preferred diagnoses. ...
... Fortunately, for diagnosis, it is not always necessary to consider all derived intervals or inequations. Unnecessary computations can be avoided by ignoring derived intervals and inequations as long as there is no evidence that they cannot be ignored, and by identifying minimal conflicts using the derivation tree for a derived inconsistency [10]. Cordier [2] has addressed consequences of using inaccurate values for abductive diagnosis. ...
Article
Full-text available
Models used for Model-Based Diagnosis usually assume that the inaccuracy of data is smaller than the precision with which the data is described. In some domains, however, this assumption is invalid. Observations may not be accurate or the behavior model of the system does not allow for accurate predictions. Therefore, the accuracy of predictions, which is a function of the accuracy of the observed system inputs and the behavior model of the system, may differ from the accuracy of the observed system outputs. This paper investigates the consequences of using inaccurate values. The paper will show that traditional notions of preferred diagnoses such as abductive diagnosis and minimum consistency-based diagnosis are no longer suited if the available data has different accuracies. A new notion of preferred diagnoses, called maximum confirmation diagnoses, is introduced.
... As extracting a minimal φ-nogood is an activity limited to a branch of a search tree, the proposed algorithms involve (at least, partially) a constructive schema in order to keep some incrementality of the propagation process. On the other hand, the last version of QuickXplain [22] exploits a divide and conquer approach (as in [29]) but is defined in a more general context. For example, it can be used to extract Minimal Unsatisfiable Cores (MUCs) of constraint networks which has been recently studied both theoretically and experimentally in [18]. ...
Article
Full-text available
In this paper, nogood recording is investigated for CSP within the randomization and restart framework. Our goal is to prevent the same situations from occurring from one run to the next. More precisely, nogoods are recorded when the current cutoff value is reached, i.e. before restarting the search algorithm. Such a set of nogoods is extracted from the last branch of the current search tree and exploited using the structure of watched literals originally proposed for SAT. We prove that the worst-case time complexity of extracting such nogoods at the end of each run is only O(n²d), where n is the number of variables of the constraint network and d the size of the greatest domain, whereas for any node of the search tree, the worst-case time complexity of exploiting these nogoods to enforce Generalized Arc Consistency (GAC) is O(n|B|), where |B| denotes the number of recorded nogoods. As the number of nogoods recorded before each new run is bounded by the length of the last branch, the total number of recorded nogoods is polynomial in the number of restarts. Interestingly, we show that when the minimization of the nogoods is envisioned with respect to an inference operator φ, it is possible to directly identify some nogoods that cannot be minimized. For φ = AC (i.e. for MAC), the worst-case time complexity of extracting minimal nogoods is slightly increased to O(en²d³), where e is the number of constraints of the network. Experimentation over a wide range of CSP instances using a generic state-of-the-art CSP solver demonstrates the effectiveness of this approach. Recording nogoods (and in particular, minimal nogoods) from restarts significantly improves the robustness of the solver.
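The extraction of nogoods from the last branch can be sketched roughly as follows. This is a hedged illustration of the idea (not the authors' algorithm or data structures): for each negative decision x ≠ v on the branch, the positive decisions made before it, together with x = v, form a set that cannot be extended to a solution.

```python
# Hedged sketch of extracting nogoods from the last branch at a restart.
# The branch encoding (var, val, positive) is an illustrative assumption.
def nogoods_from_last_branch(branch):
    """branch: list of (var, val, positive) decisions in order.
    For each negative decision x != v, the positive decisions made
    before it, plus the assignment x = v, form a nogood."""
    nogoods, positives = [], []
    for var, val, positive in branch:
        if positive:
            positives.append((var, val))
        else:
            # positives ∪ {var = val} led to a refuted subtree,
            # so it is recorded as a nogood
            nogoods.append(positives + [(var, val)])
    return nogoods

branch = [("x1", 0, True), ("x2", 1, False), ("x3", 2, True), ("x4", 0, False)]
for ng in nogoods_from_last_branch(branch):
    print(ng)
```

The number of nogoods is bounded by the number of negative decisions on the branch, which is consistent with the polynomial bound stated in the abstract.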
... A method to find all MUCs from a given set of constraints has been presented in [12] and in [9], which corresponds to an exhaustive exploration of a so-called CS-tree but is limited by the combinatorial blow-up in the number of subsets of constraints. Other approaches are given in [18] and in [15], where an explanation that is based on the user's preferences is extracted. Also, the PaLM framework [16] provides cores that are not guaranteed to be minimal. ...
Conference Paper
Full-text available
When a constraint satisfaction problem (CSP) admits no solution, most current solvers express that the whole search space has been explored unsuccessfully but do not exhibit which constraints are actually contradicting one another and make the problem infeasible. In this paper, we improve a recent heuristic-based approach to compute infeasible minimal subparts of CSPs, also called minimally unsatisfiable cores (MUCs). The approach is based on the heuristic exploitation of the number of times each constraint has been falsified during previous failed search steps. It appears to improve the performance of the initial technique, which was the most efficient one until now.
... The (new) method QuickXplain [12] exploits a divide-and-conquer approach (based on a dichotomic process) whose complexity has been discussed in Section 4.2. A similar approach, called XC1, has been proposed in [18] in a more general context. ...
Conference Paper
Full-text available
We address the problem of extracting Minimal Unsatisfiable Cores (MUCs) from constraint networks. This computationally hard problem has a practical interest in many application domains such as configuration, planning, diagnosis, etc. Indeed, identifying one or several disjoint MUCs can help circumscribe different sources of inconsistency in order to repair a system. In this paper, we propose an original approach that involves performing successive runs of a complete backtracking search, using constraint weighting, in order to surround an inconsistent part of a network, before identifying all transition constraints belonging to a MUC using a dichotomic process. We show the effectiveness of this approach, both theoretically and experimentally.
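The dichotomic identification of a transition constraint can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the `consistent` oracle and the assignment-style constraints are assumptions made for the example.

```python
# Hedged sketch: dichotomic search for a transition constraint, i.e. the
# first constraint whose addition makes the growing prefix inconsistent.
def transition_constraint(constraints, consistent):
    """Assumes the full list of constraints is inconsistent."""
    lo, hi = 1, len(constraints)  # invariant: prefix of length hi is inconsistent
    while lo < hi:
        mid = (lo + hi) // 2
        if consistent(constraints[:mid]):
            lo = mid + 1  # inconsistency appears later
        else:
            hi = mid      # prefix of length mid is already inconsistent
    return constraints[lo - 1]    # this constraint belongs to some MUC

def consistent(cs):
    # toy oracle: constraints are assignments (var, val);
    # inconsistent iff some variable receives two different values
    seen = {}
    return all(seen.setdefault(var, val) == val for var, val in cs)

print(transition_constraint([("x", 1), ("y", 2), ("x", 2), ("z", 3)], consistent))
# → ('x', 2)
```

Each step halves the candidate range, so only a logarithmic number of consistency checks is needed to locate one transition constraint.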
... Consequently, in order to preserve incrementality in the propagation process, the algorithm proposed in [12,24] involves (at least partially) a constructive approach. The (new) QuickXplain method of U. Junker [13] is based on a divide-and-conquer approach, as is the method proposed in [20]. QuickXplain, which is the explanation component of Ilog's industrial constraint-based configuration tool, can be used to extract MUCs, but in the worst case this method performs more calls to MAC than ours as soon as 2k_e/(k_e − 1) · (log_2(k_e) − 1) < log_2(e). ...
Article
Full-text available
We address the problem of extracting Minimal Unsatisfiable Cores (MUCs) from constraint networks. This computationally hard problem has a practical interest in many application domains such as configuration, planning, diagnosis, etc. Indeed, identifying one or several disjoint MUCs, i.e. MUCs that share no constraint, can help circumscribe different sources of inconsistency in order to repair a system. In this paper, we propose an original approach that involves performing successive runs of a complete backtracking search, using constraint weighting, in order to surround an inconsistent part of a network, before identifying all transition constraints belonging to a MUC using a dichotomic process. We show the effectiveness of this approach, both theoretically and experimentally.
... QUICKXPLAIN will prune all subtrees in the call graph that do not contain an element of P * and thus discovers irrelevant subproblems dynamically. Similar to (Mauss & Tatar 2002), it thus profits from the properties of decomposable problems, but additionally takes preferences into account. ...
... QUICKXPLAIN unifies and improves these two methods by successively decomposing the complete explanation problem into subproblems of the same size. (Mauss & Tatar 2002) follow a similar approach, but do not take preferences into account. (de la Banda, Stuckey, & Wazny 2003) determine all conflicts by exploring a conflict-set tree. ...
Conference Paper
Over-constrained problems can have an exponential number of conflicts, which explain the failure, and an exponential number of relaxations, which restore the consistency. A user of an interactive application, however, desires explanations and relaxations containing the most important constraints. To address this need, we define preferred explanations and relaxations based on user preferences between constraints and we compute them by a generic method which works for arbitrary CP, SAT, or DL solvers. We significantly accelerate the basic method by a divide-and-conquer strategy and thus provide the technological basis for the explanation facility of a principal industrial constraint programming tool, which is, for example, used in numerous configuration applications.
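The divide-and-conquer strategy can be sketched roughly as follows. This is a simplified reconstruction of the published QUICKXPLAIN pseudocode under stated assumptions: the toy `consistent` oracle and the encoding of constraints as variable assignments are illustrative, and the real method works with arbitrary CP, SAT, or DL solvers.

```python
# Hedged sketch of QUICKXPLAIN's divide-and-conquer conflict extraction.
def quickxplain(background, constraints, consistent):
    """Preferred minimal conflict among `constraints`, which are ordered
    from most to least important; returns None if there is no conflict."""
    if consistent(background + constraints):
        return None

    def qx(b, delta, cs):
        if delta and not consistent(b):
            return []            # background alone is already inconsistent
        if len(cs) == 1:
            return list(cs)      # single constraint must be in the conflict
        mid = len(cs) // 2
        c1, c2 = cs[:mid], cs[mid:]
        d2 = qx(b + c1, c1, c2)  # conflict part inside c2, given all of c1
        d1 = qx(b + d2, d2, c1)  # conflict part inside c1, given d2
        return d1 + d2

    return qx(background, [], constraints)

def consistent(cs):
    # toy oracle: constraints are assignments (var, val);
    # inconsistent iff some variable receives two different values
    seen = {}
    return all(seen.setdefault(var, val) == val for var, val in cs)

# most important first: the preferred conflict involves ("x", 1), not ("z", 3)
print(quickxplain([], [("x", 1), ("y", 2), ("x", 2), ("z", 3)], consistent))
# → [('x', 1), ('x', 2)]
```

Halving the constraint set at each level is what prunes the subtrees containing no conflict element, which is the source of the acceleration described in the abstract.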
... is computed, which is advantageous when combined with the MBD engine and partitioning strategies (see below). The model derives a contradiction iff there exists no feasible path between the entry state and the exit state of the program. To determine the set of components the conflict is composed of, we follow the approach of [MT02]. The algorithm can be summarized as follows. ...
Article
Full-text available
This paper introduces an automatic debugging framework that relies on model-based reasoning techniques to locate faults in programs. In particular, model-based diagnosis, together with an abstract interpretation based conflict detection mechanism is used to derive diagnoses, which correspond to possible faults in programs. Design information and partial specifications are applied to guide a model revision process, which allows for automatic detection and correction of structural faults.
... This combination is aided by the fact that both subareas tend to use Constraint Satisfaction Problems (CSPs) as their representation of choice. The diagnosis computation work (Fattah and Dechter, 1995; Stumptner and Wotawa, 2001; Mauss and Tatar, 2002) focuses on the relational combination of the ... A hypergraph of a CSP can be easily constructed by mapping all variables of the CSP to vertices and the constraint scopes to hyperedges. A CSP with an acyclic hypergraph can be solved effectively in a backtrack-free manner by first traversing the graph from the leaves to the root and computing possible value tuples and, secondly, traversing the graph from the root to the leaves and selecting one tuple of a node as a solution. ...
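The two-pass, backtrack-free scheme described above can be sketched for the special case of binary constraints on a rooted tree. This is an illustrative sketch only; the data layout (`domains`, `children`, `allowed`) is an assumption for the example, not the representation used in the cited papers.

```python
# Hedged sketch of backtrack-free solving on a tree-structured CSP:
# an upward pass makes each (parent, child) arc consistent, then a
# downward pass picks supported values without ever backtracking.
def solve_tree_csp(domains, children, allowed, root):
    """domains: var -> set of values; children: var -> list of child vars;
    allowed: (parent, child) -> set of permitted (pval, cval) pairs."""
    def prune_up(var):
        for child in children.get(var, []):
            prune_up(child)
            # keep only values of var supported by some child value
            domains[var] = {
                v for v in domains[var]
                if any((v, w) in allowed[(var, child)] for w in domains[child])
            }

    def assign_down(var, value, solution):
        solution[var] = value
        for child in children.get(var, []):
            # a support exists because `value` survived the upward pass
            w = next(w for w in domains[child]
                     if (value, w) in allowed[(var, child)])
            assign_down(child, w, solution)

    prune_up(root)
    if not domains[root]:
        return None  # the CSP is inconsistent
    solution = {}
    assign_down(root, next(iter(domains[root])), solution)
    return solution

domains = {"a": {1, 2}, "b": {1, 2}, "c": {2}}
children = {"a": ["b"], "b": ["c"]}
allowed = {("a", "b"): {(1, 1), (2, 2)}, ("b", "c"): {(2, 2)}}
print(solve_tree_csp(domains, children, allowed, "a"))  # → {'a': 2, 'b': 2, 'c': 2}
```

The upward pass plays the role of the leaves-to-root tuple computation and the downward pass the root-to-leaves selection mentioned in the quoted passage.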
... The hypergraph corresponding to Fig. 1 is cyclic (see Figure 2). In the rest of the paper, we recapitulate decomposition methods, present a version of TREE* that fits these methods, show the interaction of decomposition and TREE*, and present an extension to the algorithm that can be used with extended domains as presented in (Mauss and Tatar, 2002). ...
... Conceptually, however, nothing is changed since, as we will show, the definition of the TREE* algorithm fits the requirements. We show this by adopting the notation used for the basic computational operations of the aggregation paradigm described in (Mauss and Tatar, 2002). The Rich Constraint Languages approach described in (Mauss and Tatar, 2002) consists of three inference procedures, which are applied to a set of constraints. ...
Conference Paper
Full-text available
Decomposition methods are used to convert general constraint satisfaction problems into an equivalent tree-structured problem that can be solved more effectively. Recently, diagnosis algorithms for tree-structured systems have been introduced, but the prerequisites of coupling these algorithms to the outcome of decomposition methods have not been analyzed in detail, thus limiting their diagnostic applicability. In this paper we generalize the TREE* algorithm and show how to use hypertree decomposition outcomes as input to the algorithm to compute the diagnoses of a general diagnosis problem.
... This corresponds to the exploration of a "CS-tree", but is limited by the combinatorial explosion of the number of possible subproblems of a CSP. Other approaches are given in [22] and [16], where an explanation based on the user's preferences is extracted. There is also the PaLM system [17], implemented in the Choco platform [19], an explanation tool that can answer questions such as: why is there no solution containing the value v_i for a certain variable A? Moreover, in case of unsatisfiability, PaLM can extract an unsatisfiable subproblem, but without any guarantee that it is minimal. ...
Article
Full-text available
In this paper, a new form of explanation and recovery technique for the unsatisfiability of discrete CSPs is introduced. Whereas most approaches amount to providing users with a minimal number of constraints that should be dropped in order to recover satisfiability, a finer-grained alternative technique is introduced. It allows the user to reason both at the constraint and tuple levels by exhibiting both problematic constraints and tuples of values that would allow satisfiability to be recovered if they were not forbidden. To this end, the Minimal Set of Unsatisfiable Tuples (MUST) concept is introduced. Its formal relationships with Minimal Unsatisfiable Cores (MUCs) are investigated. Interestingly, a concept of shared forbidden tuples is derived. Allowing any such tuple makes the corresponding MUC become satisfiable. From a practical point of view, a two-step approach to the explanation and recovery of unsatisfiable CSPs is proposed. First, a recent approach proposed by Hemery et al. is used to locate a MUC. Second, a specific SAT encoding of a MUC allows MUSTs to be computed by taking advantage of the best current technique to locate Minimally Unsatisfiable Sub-formulas (MUSes) of Boolean formulas. Interestingly enough, shared tuples coincide with protected clauses, which are one of the keys to the efficiency of this SAT-related technique. Finally, the feasibility of the approach is illustrated through extensive experimental results.