results for au:Silva_R in:cs

- In this work, we argue for the importance of causal reasoning in creating fair algorithms for decision making. We give a review of existing approaches to fairness, describe work in causality necessary for the understanding of causal approaches, argue why causality is necessary for any approach that wishes to be fair, and give a detailed analysis of the many recent approaches to causality-based fairness.
- May 10 2018 cs.CY arXiv:1805.03522v1. Workshops are used to explore a specific topic, transfer knowledge, solve identified problems or create something new. In funded research projects and other research endeavours, workshops are the mechanism to gather the wider project, community or interested people together around a particular topic. However, natural questions arise: how do we measure the impact of these workshops? Do we know whether they are meeting the goals and objectives we set for them? What indicators should we use? In response to these questions, this paper outlines rules to improve the measurement of the impact of workshops.
- This paper introduces a variational approximation framework based on direct optimization of the scale-invariant Alpha-Beta divergence (sAB divergence). This new objective encompasses most variational objectives that use the Kullback-Leibler, Rényi or gamma divergences. It also gives access to objective functions never before exploited in the context of variational inference. This is achieved via two easy-to-interpret control parameters, which allow for a smooth interpolation over the divergence space while trading off properties such as mass-covering of a target distribution and robustness to outliers in the data. Furthermore, the sAB variational objective can be optimized directly by repurposing existing methods for Monte Carlo computation of complex variational objectives, leading to estimates of the divergence instead of variational lower bounds. We show the advantages of this objective on Bayesian models for regression problems.
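The abstract notes that the sAB family includes the Rényi divergence as a special case and that its objectives can be estimated by Monte Carlo. As a minimal illustration of that estimation idea (using the Rényi divergence rather than the sAB formula itself; the Gaussian test pair and sample size are illustrative choices, not from the paper):

```python
import math
import random

def renyi_divergence_mc(log_p, log_q, sample_p, alpha, n=200_000, seed=0):
    """Monte Carlo estimate of the Renyi divergence
    D_alpha(p||q) = 1/(alpha-1) * log E_{x~p}[(p(x)/q(x))^(alpha-1)]."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        x = sample_p(rng)
        acc += math.exp((alpha - 1.0) * (log_p(x) - log_q(x)))
    return math.log(acc / n) / (alpha - 1.0)

# Two unit-variance Gaussians: p = N(0,1), q = N(1,1).
log_p = lambda x: -0.5 * x * x - 0.5 * math.log(2 * math.pi)
log_q = lambda x: -0.5 * (x - 1.0) ** 2 - 0.5 * math.log(2 * math.pi)
sample_p = lambda rng: rng.gauss(0.0, 1.0)

# Closed form for equal variances: alpha * (mu_p - mu_q)^2 / (2 sigma^2) = 0.25.
est = renyi_divergence_mc(log_p, log_q, sample_p, alpha=0.5)
```

Unlike a variational lower bound, the quantity estimated here is the divergence itself, which is the property the sAB objective exploits.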
- This paper aims to develop formal methods that achieve performance-guaranteed integrated task and motion planning (ITMP) with respect to high-level specifications given in signal temporal logic (STL). The problem is of practical importance because many safety-critical applications in robotics (e.g., navigation, manipulation, and surgery) and autonomous systems (e.g., unmanned aircraft and self-driving cars) require a correct-by-construction design for complex missions. Traditional approaches usually assume a discretization of the continuous state space or a finite partition of the workspace. Instead, we propose an abstraction-free method that synthesizes continuous trajectories directly from the given STL specifications. Our basic idea is to leverage incremental constraint solving by efficiently adding constraints on motion feasibility at the discrete level. Our approach solves the ITMP-from-STL problem using scalable logic-based inference combined with optimization, built on efficient solvers for satisfiability modulo theories (SMT) and linear programming (LP). Consequently, our method has the potential to scale up to high-dimensional continuous dynamics. The proposed design algorithms are proved to be sound and complete, and numerical results illustrate their effectiveness.
- Dec 04 2017 cond-mat.stat-mech cs.GT arXiv:1712.00070v2. In this work, we propose a new $N$-person game in which the players can bet on two options, for example represented by two boxers. Some of the players have privileged information about the boxers, and part of them can provide this information to uninformed players. However, this information may be true, if the informed player is altruistic, or false, if he is selfish. The players are thus divided into three categories: informed altruistic players, informed selfish players, and uninformed players. By considering the matchings ($N/2$ distinct pairs of randomly chosen players) and a payoff for the winning group that follows aspects captured from two important games, the public goods game and the minority game, we show quantitatively and qualitatively how altruism affects the value of privileged information. We analytically locate the regions of positive payoff, which are corroborated by numerical simulations performed for all values of the information and altruism densities, given that the information level of the informed players is known. Finally, in an evolutionary version of the game, we show that the gain of the informed players can worsen under the following procedure: the players increase their investment after positive payoffs and decrease it after negative payoffs.
- Nov 28 2017 cs.SE arXiv:1711.09713v1. We present Boutiques, a system to automatically publish, integrate and execute applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
- Nov 08 2017 cs.RO arXiv:1711.02201v1. Correct-by-construction manipulation planning in a dynamic environment, where other agents can manipulate objects in the workspace, is a challenging problem. The tight coupling of actions and motions between agents and the complexity of mission specifications make the problem computationally intractable. This paper presents a reactive integrated mission and motion planning framework for mobile-robot manipulator systems operating in a partially known environment. We introduce a multi-layered synergistic framework that receives high-level mission specifications expressed in linear temporal logic and generates dynamically feasible, collision-free motion trajectories to achieve them. In the high-level layer, a mission planner constructs a symbolic two-player game between the robots and their environment to synthesize a strategy that adapts to changes in the workspace imposed by other robots. A bilateral synergistic layer maps the designed mission plan to an integrated task and motion planner, constructing a set of robot tasks that move the objects according to the mission strategy. In the low-level planning stage, verifiable motion controllers are designed that can be incrementally composed to guarantee safe motion planning for each high-level induced task. The proposed framework is illustrated with a multi-robot warehouse example in which objects must be moved to various locations.
- Nov 01 2017 cs.CR arXiv:1710.11423v1. Intel(R) Software Guard eXtensions (SGX) is a hardware-based technology for protecting sensitive data from disclosure or modification; it enables user-level applications to allocate protected areas of memory called enclaves. Such memory areas are cryptographically protected even from code running with higher privilege levels. This memory protection can be used to develop secure and dependable applications, but the technology has some limitations: ($i$) the code of an enclave is visible at load time, ($ii$) libraries used by the code must be statically linked, and ($iii$) the protected memory size is limited, requiring page swapping when this limit is exceeded. We present DynSGX, a privacy-preserving tool that enables users and developers to dynamically load and unload code to be executed inside SGX enclaves. Such a technology makes it possible for developers to use public cloud infrastructures to run applications based on sensitive code and data. Moreover, we present a series of experiments that assess how applications dynamically loaded by DynSGX perform in comparison to statically linked applications that disregard the privacy of the enclave code at load time.
- We propose a dynamic edge exchangeable network model that can capture sparse connections observed in real temporal networks, in contrast to existing models which are dense. The model achieved superior link prediction accuracy on multiple data sets when compared to a dynamic variant of the blockmodel, and is able to extract interpretable time-varying community structures from the data. In addition to sparsity, the model accounts for the effect of social influence on vertices' future behaviours. Compared to the dynamic blockmodels, our model has a smaller latent space. The compact latent space requires a smaller number of parameters to be estimated in variational inference and results in a computationally friendly inference algorithm.
- Sep 04 2017 cs.HC arXiv:1709.00111v1. Scientific software often has very particular usability requirements, yet usability is frequently overlooked in this setting. As computational science has emerged as its own discipline, distinct from theoretical and experimental science, it has placed new requirements on future scientific software developments. In this paper, we discuss the background of these problems and introduce nine aspects of good usability. We also highlight best practices for each aspect, with an emphasis on applications in computational science.
- Machine learning can impact people with legal or ethical consequences when it is used to automate decisions in areas such as insurance, lending, hiring, and predictive policing. In many of these scenarios, previous decisions have been made that are unfairly biased against certain subpopulations, for example those of a particular race, gender, or sexual orientation. Since this past data may be biased, machine learning predictors must account for this to avoid perpetuating or creating discriminatory practices. In this paper, we develop a framework for modeling fairness using tools from causal inference. Our definition of counterfactual fairness captures the intuition that a decision is fair towards an individual if it is the same in (a) the actual world and (b) a counterfactual world where the individual belonged to a different demographic group. We demonstrate our framework on a real-world problem of fair prediction of success in law school.
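The counterfactual fairness intuition in (a)/(b) above can be illustrated with a toy structural causal model (all variable names, coefficients, and the abduction step below are hypothetical stand-ins, not the paper's law-school model):

```python
import random

# Toy structural causal model (all names/coefficients hypothetical):
#   protected attribute A, latent background U, observed X = U + 2A, outcome Y = U.
rng = random.Random(1)
u = rng.gauss(0.0, 1.0)      # abducted latent factor for one individual
a = 1                        # actual protected attribute

x_actual = u + 2 * a         # observed feature in the actual world
x_counter = u + 2 * (1 - a)  # same U, counterfactual protected attribute

# "Unfair" predictor: a function of X, which is a descendant of A.
predict_unfair = lambda x: 0.5 * x

# Counterfactually fair predictor: recover U by abduction (U = X - 2A)
# and predict from U only, which is not a descendant of A.
predict_fair = lambda x, a_: 0.5 * (x - 2 * a_)

fair_gap = abs(predict_fair(x_actual, a) - predict_fair(x_counter, 1 - a))
unfair_gap = abs(predict_unfair(x_actual) - predict_unfair(x_counter))
```

The fair predictor's output is unchanged when the protected attribute is counterfactually flipped, while the predictor that consumes the descendant feature directly is not.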
- Nov 22 2016 cs.NI arXiv:1611.06609v1. Wireless networks have become the main form of Internet access. Statistics indicate that global mobile Internet penetration should exceed 70% by 2019. Wi-Fi is an important player in this change. Founded on IEEE 802.11, this technology has a crucial impact on how we share broadband access in both domestic and corporate networks. However, recent works have indicated performance issues in Wi-Fi networks, mainly when they are deployed without planning and under high user density. Hence, different collision avoidance techniques and Medium Access Control (MAC) protocols have been designed to improve Wi-Fi performance. Analyzing the collision problem, this work strengthens the claims found in the literature about low Wi-Fi performance under dense scenarios. It then overviews the MAC protocols used in the IEEE 802.11 standard and discusses solutions to mitigate collisions. Finally, it presents future trends in MAC protocols, which helps in foreseeing expected improvements for the next generation of Wi-Fi devices.
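The density effect discussed above can be sketched with a toy slotted contention model (a deliberate simplification of 802.11 DCF with a hypothetical fixed contention window; real DCF uses exponential window growth after collisions):

```python
import random

def collision_probability(n_stations, cw=16, trials=20_000, seed=42):
    """Fraction of contention rounds in which two or more stations pick
    the same minimal backoff slot, i.e. their frames collide."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(trials):
        # Each station independently draws a backoff slot in [0, cw).
        slots = [rng.randrange(cw) for _ in range(n_stations)]
        # The station(s) with the smallest backoff transmit first;
        # a tie at the minimum means simultaneous transmission, a collision.
        if slots.count(min(slots)) > 1:
            collisions += 1
    return collisions / trials

p_sparse = collision_probability(5)
p_dense = collision_probability(50)
```

Even this crude model reproduces the qualitative claim: with a fixed window, the collision probability climbs steeply as the number of contending stations grows.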
- Cooperative adaptive cruise control (CACC) holds great promise for significantly reducing traffic congestion while maintaining a high level of safety. Recent years have seen an increase in the use of formal methods for the analysis and design of CACC systems. However, most existing results obtained with formal methods assume ideal inter-vehicle communication, which is far from the real-world situation. Hence, we are motivated to close this gap by explicitly considering non-deterministic time delays and packet dropouts due to unreliable inter-vehicle communication. In particular, we consider a passive safety property, which requires a vehicle to avoid any collision that could be considered its fault. Under the assumption that the communication delay is bounded with a known upper bound, we formally verify the passive safety of a class of hybrid CACC systems. This result allows us to define a safe control envelope that guides the synthesis of control signals. Vehicles under CACC within the safe control envelope are guaranteed to avoid active collisions.
- Autonomous navigation with proven collision avoidance in unknown and dynamic environments is still a challenge, particularly when there are moving obstacles. A popular approach to collision avoidance in the face of moving obstacles is based on model predictive algorithms, which, however, may be computationally expensive. Hence, we adopt a reactive potential field approach here. At every cycle, the proposed approach requires only the current robot state relative to the closest obstacle point to compute the potential field at the current position; it is thus more computationally efficient and more suitable for scaling up to multi-agent scenarios. Our main contribution is to model the reactive potential-field motion controller as a hybrid automaton and then formally verify its safety using differential dynamic logic. In particular, we can guarantee a passive safety property: the robot is never to blame for a collision, since a collision can occur only while the robot is at rest. The proposed controller and the verification results are demonstrated via simulations and an implementation on a Pioneer P3-AT robot.
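A minimal version of one reactive potential-field cycle can be sketched as follows (the gains, ranges, and geometry here are illustrative choices, not the verified parameters from the paper, and this sketch omits the hybrid-automaton and passive-safety machinery):

```python
import math

def potential_step(pos, goal, obstacle, k_att=1.0, k_rep=0.5, d0=1.5, dt=0.05):
    """One reactive update: attractive pull toward the goal plus a repulsive
    push away from the closest obstacle point when it is within range d0."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    dx, dy = pos[0] - obstacle[0], pos[1] - obstacle[1]
    d = math.hypot(dx, dy)
    if d < d0:
        # Classic repulsive field: grows as the robot nears the obstacle,
        # vanishes smoothly at the influence radius d0.
        mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
        fx += mag * dx / d
        fy += mag * dy / d
    return (pos[0] + dt * fx, pos[1] + dt * fy)

pos, goal, obstacle = (0.0, 0.0), (5.0, 0.0), (2.0, 0.8)
min_clearance = float("inf")
for _ in range(600):
    pos = potential_step(pos, goal, obstacle)
    min_clearance = min(min_clearance,
                        math.hypot(pos[0] - obstacle[0], pos[1] - obstacle[1]))
dist_to_goal = math.hypot(pos[0] - goal[0], pos[1] - goal[1])
```

Each cycle needs only the current state relative to the closest obstacle point, which is the source of the computational cheapness the abstract claims.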
- We propose a hierarchical design framework to automatically synthesize coordination schemes and control policies for cooperative multi-agent systems that fulfill formal performance requirements, by associating a bottom-up reactive motion controller with a top-down mission plan. On the one hand, starting from a global mission specified as a regular language over all the agents' mission capabilities, a mission planning layer sits on top of the proposed framework, decomposing the global mission into local tasks that are consistent with each agent's individual capabilities and compositionally verifying, via an assume-guarantee paradigm, that the achievement of the local tasks implies the satisfaction of the global mission. On the other hand, bottom-up motion plans for each agent are synthesized from the obtained local missions by composing basic motion primitives, which are verified safe in differential dynamic logic (d$\mathcal{L}$), through a Satisfiability Modulo Theories (SMT) solver that searches for feasible solutions in the face of constraints imposed by the local task requirements and the environment description. We show that the proposed framework can handle dynamic environments because the motion primitives possess reactive features, making the motion plans adaptive to local environmental changes. Furthermore, on-line mission reconfiguration can be triggered by the motion planning layer once no feasible solution can be found through the SMT solver. The effectiveness of the overall design framework is validated by an automated warehouse case study.
- Apr 20 2016 cs.RO arXiv:1604.05657v2. Integrated Task and Motion Planning (ITMP) for mobile robots in a dynamic environment with moving obstacles is a challenging research question that has attracted increasing attention in recent years. Most existing methods are either restricted to static environments or lack performance guarantees. This motivates us to investigate the ITMP problem using formal methods and propose a bottom-up compositional design approach called CoSMoP (Composition of Safe Motion Primitives). Our basic idea is to synthesize a global motion plan by composing simple local moves and actions, and to achieve its performance guarantee through modular and incremental verification. The design consists of two steps. First, basic motion primitives are designed and verified locally. Then, a global motion path is built upon these certified motion primitives by concatenating them. In particular, we model the motion primitives as hybrid automata and verify their safety by formulating them in differential dynamic logic (d$\mathcal{L}$). Furthermore, these proven-safe motion primitives are composed based on an encoding to Satisfiability Modulo Theories (SMT) that takes the geometric constraints into account. Since d$\mathcal{L}$ allows compositional verification, the sequential composition of the safe motion primitives also preserves their safety properties. Therefore, CoSMoP generates plans for given task specifications that are formally proven safe even in the presence of moving obstacles. Illustrative examples are presented to show the effectiveness of the methods.
- Digital images are ubiquitous in our modern lives, with uses ranging from social media to news, and even scientific papers. For this reason, it is crucial to evaluate how accurately people can identify doctored images. In this paper, we report an extensive user study evaluating subjects' capacity to detect fake images. After observing an image, users were asked whether it had been altered or not; if they answered that it had, they had to provide evidence in the form of a click on the image. We collected 17,208 individual answers from 383 users, using 177 images selected from public forensic databases. Unlike previous studies, our method includes safeguards against lucky guesses when evaluating users' answers. Our results indicate that people are poor at differentiating between altered and non-altered images, with an overall accuracy of 58%, and that they correctly identify the modified images only 46.5% of the time. We also track user features such as age, answering time, and confidence, providing an in-depth analysis of how these variables influence users' performance.
- Dec 18 2014 cs.DC arXiv:1412.5557v2. This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enable, and support reproducible research; and (2) individual researchers should conduct each experiment as though someone will replicate that experiment. Participants documented numerous issues, questions, technologies, practices, and potentially promising initiatives emerging from the discussion, but also highlighted four areas of particular interest to XSEDE: (1) documentation and training that promotes reproducible research; (2) system-level tools that provide build- and run-time information at the level of the individual job; (3) the need to model best practices in research collaborations involving XSEDE staff; and (4) continued work on gateways and related technologies. In addition, an intriguing question emerged from the day's interactions: would there be value in establishing an annual award for excellence in reproducible research?
- In a variety of disciplines such as social sciences, psychology, medicine and economics, the recorded data are considered to be noisy measurements of latent variables connected by some causal structure. This corresponds to a family of graphical models known as the structural equation model with latent variables. While linear non-Gaussian variants have been well-studied, inference in nonparametric structural equation models is still underdeveloped. We introduce a sparse Gaussian process parameterization that defines a non-linear structure connecting latent variables, unlike common formulations of Gaussian process latent variable models. The sparse parameterization is given a full Bayesian treatment without compromising Markov chain Monte Carlo efficiency. We compare the stability of the sampling procedure and the predictive ability of the model against the current practice.
- Learning the joint dependence of discrete variables is a fundamental problem in machine learning, with many applications including prediction, clustering and dimensionality reduction. More recently, the framework of copula modeling has gained popularity due to its modular parametrization of joint distributions. Among other properties, copulas provide a recipe for combining flexible models for univariate marginal distributions with parametric families suitable for potentially high dimensional dependence structures. More radically, the extended rank likelihood approach of Hoff (2007) bypasses learning marginal models completely when such information is ancillary to the learning task at hand, as in, e.g., standard dimensionality reduction problems or copula parameter estimation. The main idea is to represent data by their observable rank statistics, ignoring any other information from the marginals. Inference is typically done in a Bayesian framework with Gaussian copulas, and it is complicated by the fact that this implies sampling within a space where the number of constraints increases quadratically with the number of data points. The result is slow mixing when using off-the-shelf Gibbs sampling. We present an efficient algorithm based on recent advances in constrained Hamiltonian Markov chain Monte Carlo that is simple to implement and does not require paying a quadratic cost in sample size.
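The first ingredient above, representing data only through their observable rank statistics mapped to Gaussian scores, can be sketched as follows (a standard normal-scores transform illustrating the rank-only representation, not the constrained HMC sampler itself):

```python
import math
from statistics import NormalDist

def normal_scores(xs):
    """Map observations to Gaussian scores through their ranks only:
    z_i = Phi^{-1}((rank_i - 0.5) / n).  Any strictly monotone transform
    of the data yields exactly the same scores."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    nd = NormalDist()
    return [nd.inv_cdf((r - 0.5) / n) for r in ranks]

data = [3.1, -0.4, 10.0, 0.2]
scores = normal_scores(data)
# A strictly monotone change of the marginals leaves the scores untouched,
# which is why marginal models can be bypassed entirely.
scores_transformed = normal_scores([math.exp(x) for x in data])
```

Bayesian inference with a Gaussian copula then conditions the latent Gaussian variables to respect the ordering constraints implied by these ranks, which is the quadratically constrained space the abstract refers to.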
- Oct 29 2012 cs.SE arXiv:1210.7030v1. In this document we share the experiences gained throughout the development of a metro system case study. The model is constructed in Event-B using its respective tool set, the Rodin platform. Starting from the requirements and adding more detail to the model in a stepwise manner through refinement, we identify key points and the available plugins necessary for modelling large systems (requirements engineering, decomposition, generic instantiation, among others), note which capabilities are still lacking, and discuss the strengths and weaknesses of the tool.
- Observed associations in a database may be due in whole or part to variations in unrecorded (latent) variables. Identifying such variables and their causal relationships with one another is a principal goal in many scientific and practical domains. Previous work shows that, given a partition of observed variables such that members of a class share only a single latent common cause, standard search algorithms for causal Bayes nets can infer structural relations between latent variables. We introduce an algorithm for discovering such partitions when they exist. Uniquely among available procedures, the algorithm is (asymptotically) correct under standard assumptions in causal Bayes net search algorithms, requires no prior knowledge of the number of latent variables, and does not depend on the mathematical form of the relationships among the latent variables. We evaluate the algorithm on a variety of simulated data sets.
- Latent variable models are used to estimate quantities of interest that are observable only up to some measurement error. In many studies, such variables are known but not precisely quantifiable (such as "job satisfaction" in social sciences and marketing, "analytical ability" in educational testing, or "inflation" in economics). This leads to the development of measurement instruments that record noisy indirect evidence for such unobserved variables, such as surveys, tests and price indexes. In such problems, there are postulated latent variables and a given measurement model. At the same time, other unanticipated latent variables can add further unmeasured confounding to the observed variables. The problem is how to deal with these unanticipated latent variables. In this paper, we provide a method, loosely inspired by canonical correlation, that makes use of background information concerning the "known" latent variables. Given a partially specified structure, it provides a structure learning approach to detect "unknown unknowns": the confounding effect of potentially infinitely many other latent variables. This is done without explicitly modeling such extra latent factors. Because of the special structure of the problem, we are able to exploit a new variation of composite likelihood fitting to learn this structure efficiently. Validation is provided by experiments on synthetic data and by the analysis of a large survey of over 100,000 staff members of the National Health Service of the United Kingdom.
- We introduce priors and algorithms to perform Bayesian inference in Gaussian models defined by acyclic directed mixed graphs. Such a class of graphs, composed of directed and bi-directed edges, is a representation of conditional independencies that is closed under marginalization and arises naturally from causal models which allow for unmeasured confounding. Monte Carlo methods and a variational approximation for such models are presented. Our algorithms for Bayesian inference allow the evaluation of posterior distributions for several quantities of interest, including causal effects that are not identifiable from data alone but could otherwise be inferred where informative prior knowledge about confounding is available.
- Knowing which individuals can be more efficient in spreading a pathogen through a given environment is a fundamental question in disease control. Indeed, in recent years the spread of epidemic diseases and its relationship with the topology of the underlying system have been a recurrent topic in complex network theory, for both network models and real-world data. In this paper we explore possible correlations between the heterogeneous spread of an epidemic disease governed by the susceptible-infected-recovered (SIR) model and several attributes of the originating vertices, considering Erdős-Rényi (ER), Barabási-Albert (BA) and random geometric graphs (RGG), as well as a real-world case study, the US Air Transportation Network, which comprises the 500 busiest US airports and their interconnections. Initially, heterogeneous spreading is obtained in the RGG networks, for which we analytically derive an expression for the distribution of the spreading rates among the established contacts by assuming that such rates decay exponentially with the distance separating the individuals. This distribution is also used for the ER and BA models, where we observe topological effects on the correlations. In the case of the airport network, the spreading rates are defined empirically, assumed to be directly proportional to seat availability. Across both the theoretical and the real networks considered, we observe a high correlation between the total epidemic prevalence and the degree, strength, and accessibility of the epidemic sources. For attributes such as betweenness centrality and the $k$-shell index, however, the correlation depends on the topology considered.
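The RGG construction with distance-decaying rates can be sketched with a toy discrete-time SIR simulation (the network size, connection radius, and all rates below are illustrative, not the paper's parameters or its analytical derivation):

```python
import math
import random

rng = random.Random(7)
n, radius = 200, 0.15
pts = [(rng.random(), rng.random()) for _ in range(n)]

# Random geometric graph: connect nodes closer than `radius`; the per-contact
# infection rate decays exponentially with the distance between the endpoints.
edges = {}
for i in range(n):
    for j in range(i + 1, n):
        d = math.dist(pts[i], pts[j])
        if d < radius:
            beta = 0.9 * math.exp(-d / radius)
            edges.setdefault(i, []).append((j, beta))
            edges.setdefault(j, []).append((i, beta))

state = ["S"] * n          # susceptible / infected / recovered
state[0] = "I"             # a single epidemic source
gamma = 0.2                # recovery probability per step
for _ in range(100):
    new_state = list(state)
    for i in range(n):
        if state[i] == "I":
            for j, beta in edges.get(i, []):
                if state[j] == "S" and rng.random() < beta:
                    new_state[j] = "I"
            if rng.random() < gamma:
                new_state[i] = "R"
    state = new_state

recovered = state.count("R")
```

Re-running this with the source placed at vertices of different degree or accessibility is the kind of experiment whose outcomes the abstract correlates with vertex attributes.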
- Nov 14 2011 physics.soc-ph cs.DL arXiv:1111.2829v1. Many discussions have enlarged the bibliometrics literature since Hirsch's proposal of the so-called $h$-index. Ranking papers according to their citations, this index quantifies a researcher by the largest number $h$ of his or her papers that are each cited at least $h$ times. A closed formula for the $h$-index distribution that can be applied to distinct databases is not yet known. In fact, obtaining such a distribution requires knowledge of the citation distribution of the authors and its specificities. Instead of dealing with randomly chosen researchers, here we address different groups based on distinct databases. The first group is composed of physicists and biologists, with data extracted from the Institute of Scientific Information (ISI). The second group is composed of computer scientists, whose data were extracted from the Google Scholar system. In this paper, we obtain a general formula for the $h$-index probability density function (pdf) for groups of authors by using generalized exponentials in the context of escort probability. Our analysis includes the use of several statistical methods to estimate the necessary parameters, as well as an exhaustive comparison among the candidate distributions that may describe how citations are distributed among authors. The $h$-index pdf should be used to classify groups of researchers from a quantitative point of view, which is of particular interest for replacing obscure qualitative methods.
- Some hard problems from lattices, like LWE (Learning with Errors), are particularly suitable for application in Cryptography due to the possibility of using worst-case to average-case reductions as evidence of strong security properties. In this work, we show two LWE-based constructions of zero-knowledge identification schemes and discuss their performance and security. We also highlight the design choices that make our solution of both theoretical and practical interest.
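The LWE structure underlying such constructions (not the identification protocol itself, and with toy parameters far too small for real security) can be sketched as:

```python
import random

def lwe_sample(s, q=3329, n_eq=8, err_bound=3, rng=None):
    """Generate LWE equations b = <a, s> + e (mod q) with small error e.
    Recovering s from (A, b) is the hard problem; without the error terms
    it would be plain Gaussian elimination."""
    rng = rng or random.Random(0)
    A, b = [], []
    for _ in range(n_eq):
        a = [rng.randrange(q) for _ in range(len(s))]
        e = rng.randint(-err_bound, err_bound)
        A.append(a)
        b.append((sum(ai * si for ai, si in zip(a, s)) + e) % q)
    return A, b

s = [5, 1, 7, 2]                       # secret vector (toy dimension)
A, b = lwe_sample(s)

# With the secret, the residuals recover the small errors; without it,
# b is conjectured to be computationally indistinguishable from uniform mod q.
q = 3329
residuals = [(bi - sum(ai * si for ai, si in zip(a, s))) % q for a, bi in zip(A, b)]
centered = [r if r <= q // 2 else r - q for r in residuals]
```

An identification scheme built on this asymmetry lets a prover demonstrate knowledge of `s` without revealing it; the worst-case to average-case reductions mentioned above are what make hardness assumptions on such instances credible.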
- Directed acyclic graphs (DAGs) are a popular framework to express multivariate probability distributions. Acyclic directed mixed graphs (ADMGs) are generalizations of DAGs that can succinctly capture much richer sets of conditional independencies, and are especially useful in modeling the effects of latent variables implicitly. Unfortunately, there are currently no good parameterizations of general ADMGs. In this paper, we apply recent work on cumulative distribution networks and copulas to propose one general construction for ADMG models. We consider a simple parameter estimation approach, and report some encouraging experimental results.
- Ranking groups of researchers is important in several contexts and can serve many purposes, such as the fair distribution of grants based on scientists' publication output, the awarding of research projects, the assessment of journal editorial boards, and many other applications in a social context. In this paper, we propose a method for measuring the performance of groups of researchers. The proposed method is called the alpha-index and is based on two parameters: (i) the homogeneity of the h-indexes of the researchers in the group; and (ii) the h-group, which is an extension of the h-index to groups. Our method integrates the concepts of homogeneity and absolute value of the h-index into a single measure appropriate for the evaluation of groups. We report on experiments that assess computer science conferences based on the h-indexes of their program committee members. Our results are similar to a manual classification scheme adopted by a research agency.
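The building blocks can be sketched as follows. The h-index computation is standard; the abstract does not give the alpha-index formula, so the homogeneity weighting below is a hypothetical stand-in that merely illustrates the idea of combining a group's mean h with a dispersion penalty:

```python
from statistics import mean, pstdev

def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    cs = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cs, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def group_score(h_indexes):
    """Hypothetical group measure: mean h discounted by relative dispersion,
    so homogeneous groups score higher than unequal ones with the same mean."""
    m = mean(h_indexes)
    if m == 0:
        return 0.0
    return m / (1.0 + pstdev(h_indexes) / m)

hs_homogeneous = [10, 10, 10]
hs_skewed = [28, 1, 1]          # same mean, very unequal
```

Under any measure in this spirit, a program committee of uniformly strong members outranks one whose aggregate citations come from a single star.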
- Jan 08 2010 cs.LG arXiv:1001.1079v1 Discovering latent representations of the observed world has become increasingly relevant in data analysis. Much of the effort concentrates on building latent variables which can be used in prediction problems, such as classification and regression. A related goal of learning latent structure from data is that of identifying which hidden common causes generate the observations, such as in applications that require predicting the effect of policies. This will be the main problem tackled in our contribution: given a dataset of indicators assumed to be generated by unknown and unmeasured common causes, we wish to discover which hidden common causes are those, and how they generate our data. This is possible under the assumption that observed variables are linear functions of the latent causes with additive noise. Previous results in the literature present solutions for the case where each observed variable is a noisy function of a single latent variable. We show how to extend the existing results for some cases where observed variables measure more than one latent variable.
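  The measurement model assumed above can be made concrete with a short generative sketch: observed indicators are linear functions of unmeasured common causes plus additive noise, `x = Lambda @ z + eps`. The loading matrix below lets two indicators measure both latent causes, which is the case the abstract extends earlier single-cause results to; all dimensions and values are illustrative, not from the paper.

  ```python
  import random

  random.seed(0)

  n_latent, n_observed, n_samples = 2, 4, 5

  # Loading matrix: rows are indicators, columns are latent causes.
  # The third and fourth indicators load on both latents.
  Lambda = [[1.0, 0.0],
            [0.8, 0.0],
            [0.5, 0.7],
            [0.3, 0.9]]

  def sample():
      """Draw one observation from the linear-Gaussian measurement model."""
      z = [random.gauss(0, 1) for _ in range(n_latent)]          # hidden causes
      x = [sum(Lambda[i][j] * z[j] for j in range(n_latent))
           + random.gauss(0, 0.1)                                # additive noise
           for i in range(n_observed)]
      return z, x

  data = [sample()[1] for _ in range(n_samples)]
  ```

  The structure-discovery problem the abstract addresses is the inverse of this sketch: recover the sparsity pattern of `Lambda` (which indicators measure which hidden causes) from `data` alone.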
- Analogical reasoning depends fundamentally on the ability to learn and generalize about relations between objects. We develop an approach to relational learning which, given a set of pairs of objects $\mathbf{S}=\{A^{(1)}:B^{(1)},A^{(2)}:B^{(2)},\ldots,A^{(N)}:B^{(N)}\}$, measures how well other pairs A:B fit in with the set $\mathbf{S}$. Our work addresses the following question: is the relation between objects A and B analogous to those relations found in $\mathbf{S}$? Such questions are particularly relevant in information retrieval, where an investigator might want to search for analogous pairs of objects that match the query set of interest. There are many ways in which objects can be related, making the task of measuring analogies very challenging. Our approach combines a similarity measure on function spaces with Bayesian analysis to produce a ranking. It requires data containing features of the objects of interest and a link matrix specifying which relationships exist; no further attributes of such relationships are necessary. We illustrate the potential of our method on text analysis and information networks. An application on discovering functional interactions between pairs of proteins is discussed in detail, where we show that our approach can work in practice even when only a small set of protein pairs is provided.
- Real-time systems are computing systems in which meeting timing requirements is vital for correctness. Consequently, if the real-time requirements of these systems are poorly understood and verified, the results can be disastrous and lead to irremediable project failures in the early phases of development. The present work addresses the problem of detecting deadlock situations early, in the requirements specification phase of a concurrent real-time system, proposing a simple proof-of-concept prototype that combines scenario-based requirements specifications with topology-based techniques. Our efforts concentrate on integrating the formal representation of Message Sequence Chart scenarios into the deadlock detection algorithm of Fajstrup et al., which is based on geometric and algebraic topology.
- May 02 2005 cs.CE arXiv:cs/0505001v1 In this article we study the behavior of a group of economic agents in the context of cooperative game theory, interacting according to rules based on the Potts model with suitable modifications. Each agent can be thought of as belonging to a chain, where agents can only interact with their nearest neighbors (periodic boundary conditions are imposed). Each agent can invest an amount $\sigma_i = 0, \ldots, q-1$. Using the transfer matrix method we study analytically, among other things, the behavior of the investment as a function of a control parameter (denoted $\beta$) for the cases $q=2$ and $q=3$. For $q>3$, numerical evaluation of eigenvalues and high-precision numerical derivatives are used to assess this information.
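  The transfer-matrix method mentioned above can be sketched for a plain one-dimensional $q$-state Potts chain with periodic boundary conditions. The paper's game-theoretic modifications are not specified here, so the standard Potts nearest-neighbor coupling below is an illustrative stand-in; the mechanics (build a $q \times q$ matrix, take the trace of its $n$-th power) are the same.

  ```python
  import math

  def partition_function(q, n_sites, beta, J=1.0):
      """Z = Tr(T^n) for a periodic chain, where T[s, t] = exp(beta * J * delta(s, t))."""
      # q x q transfer matrix for the standard Potts coupling (assumption;
      # the paper's modified rules would change these entries).
      T = [[math.exp(beta * J * (1.0 if s == t else 0.0)) for t in range(q)]
           for s in range(q)]

      def matmul(A, B):
          return [[sum(A[i][k] * B[k][j] for k in range(q)) for j in range(q)]
                  for i in range(q)]

      # Raise T to the n_sites power starting from the identity.
      P = [[1.0 if i == j else 0.0 for j in range(q)] for i in range(q)]
      for _ in range(n_sites):
          P = matmul(P, T)

      # Periodic boundary conditions close the chain: Z is the trace.
      return sum(P[i][i] for i in range(q))
  ```

  For this standard coupling the eigenvalues of $T$ are known in closed form ($e^{\beta J} + q - 1$ once, and $e^{\beta J} - 1$ with multiplicity $q-1$), so the trace computation can be cross-checked against $Z = \lambda_1^n + (q-1)\lambda_2^n$; observables like the average investment then follow from derivatives of $\ln Z$ in $\beta$, which is where the numerical derivatives for $q>3$ come in.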