858 results for Robust Probabilistic Model, Dyslexic Users, Rewriting, Question-Answering
Abstract:
We consider independent edge percolation models on Z, with edge occupation probabilities p_{x,y} = p if |x - y| = 1 and p_{x,y} = 1 - exp(-beta/|x - y|^2) otherwise. We prove that oriented percolation occurs when beta > 1 provided p is chosen sufficiently close to 1, answering a question posed in Newman and Schulman (Commun. Math. Phys. 104:547, 1986). The proof is based on multi-scale analysis.
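For a concrete picture of the model, the following minimal Python sketch samples edge occupations on a finite box under the probabilities stated above; the truncation to {0, ..., N} and all identifiers are illustrative assumptions, not part of the paper.

import numpy as np

def sample_open_edges(N, p, beta, rng):
    # Sample the occupation of every edge {x, y} in the finite box
    # {0, ..., N}, using the probabilities from the abstract:
    # p for nearest neighbours, 1 - exp(-beta/|x - y|^2) otherwise.
    open_edges = []
    for x in range(N + 1):
        for y in range(x + 1, N + 1):
            d = y - x
            prob = p if d == 1 else 1.0 - np.exp(-beta / d**2)
            if rng.random() < prob:
                open_edges.append((x, y))
    return open_edges

edges = sample_open_edges(200, 0.95, 1.5, np.random.default_rng(0))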
Abstract:
We investigate the critical behaviour of a probabilistic mixture of cellular automata (CA) rules 182 and 200 (in Wolfram's enumeration scheme) by mean-field analysis and Monte Carlo simulations. We find that as we switch off one CA and switch on the other by varying the single parameter of the model, the probabilistic CA (PCA) goes through an extinction-survival-type phase transition, and the numerical data indicate that it belongs to the directed percolation universality class of critical behaviour. The PCA displays a characteristic stationary density profile and a slow, diffusive dynamics close to the pure CA 200 point, which we discuss briefly. Remarks on an interesting related stochastic lattice gas are addressed in the conclusions.
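As a rough illustration of such a probabilistic mixture, the sketch below applies rule 200 at each site with probability p and rule 182 otherwise; which rule carries the mixing parameter, and all names, are assumptions made only for illustration.

import numpy as np

# Output bit of an elementary CA rule for neighbourhood code 4l + 2c + r
RULE_182 = np.array([(182 >> k) & 1 for k in range(8)], dtype=np.uint8)
RULE_200 = np.array([(200 >> k) & 1 for k in range(8)], dtype=np.uint8)

def pca_step(state, p, rng):
    # Neighbourhood code at each site, with periodic boundaries
    code = 4 * np.roll(state, 1) + 2 * state + np.roll(state, -1)
    # Site-wise random choice between the two deterministic rules
    use_200 = rng.random(state.size) < p
    return np.where(use_200, RULE_200[code], RULE_182[code])

rng = np.random.default_rng(0)
state = rng.integers(0, 2, size=1000).astype(np.uint8)
densities = []
for _ in range(2000):
    state = pca_step(state, 0.3, rng)
    densities.append(state.mean())   # order parameter for the transition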
Abstract:
Axelrod's model for culture dissemination offers a nontrivial answer to the question of why there is cultural diversity, given that people's beliefs tend to become more similar to each other's as they interact repeatedly. The answer depends on the two control parameters of the model, namely the number F of cultural features that characterize each agent and the number q of traits that each feature can take on, as well as on the size A of the territory or, equivalently, on the number of interacting agents. Here, we investigate the dependence of the number C of distinct coexisting cultures on the area A in Axelrod's model, the culture-area relationship, through extensive Monte Carlo simulations. We find a non-monotonic culture-area relation, for which the number of cultures decreases when the area grows beyond a certain size, provided that q is smaller than a threshold value q_c = q_c(F) and F >= 3. In the limit of infinite area, this threshold value signals the onset of a discontinuous transition between a globalized regime marked by a uniform culture (C = 1) and a completely polarized regime where all C = q^F possible cultures coexist. Otherwise, the culture-area relation exhibits the typical behavior of the species-area relation, i.e., a monotonically increasing curve whose slope is steep at first and steadily levels off at some maximum diversity value.
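A minimal sketch of one interaction step of Axelrod's model (F features, q traits, agents on an L x L grid with periodic boundaries); the update follows the standard model dynamics, and all identifiers are illustrative.

import numpy as np

def axelrod_step(culture, rng):
    # culture: (L, L, F) integer array of trait values in {0, ..., q-1}
    L, _, F = culture.shape
    i, j = rng.integers(L, size=2)                  # pick a random agent
    di, dj = ((-1, 0), (1, 0), (0, -1), (0, 1))[rng.integers(4)]
    ni, nj = (i + di) % L, (j + dj) % L             # a random neighbour
    same = culture[i, j] == culture[ni, nj]
    overlap = int(same.sum())
    # Interact with probability equal to the cultural overlap, and only if
    # the two agents are neither identical nor completely different
    if 0 < overlap < F and rng.random() < overlap / F:
        f = rng.choice(np.flatnonzero(~same))       # a feature they differ on
        culture[i, j, f] = culture[ni, nj, f]

def count_cultures(culture):
    # Number C of distinct coexisting cultures on the grid
    F = culture.shape[-1]
    return len(np.unique(culture.reshape(-1, F), axis=0))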
Abstract:
The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images that present no smoothing and reproduce the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. However, simulated images still lack local accuracy. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, albeit at the expense of providing a single deterministic estimate of the random function.
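The paper's specific postprocessing algorithm is not spelled out in the abstract; as a hedged illustration of the smoothing problem, the sketch below applies a simple affine variance correction that rescales kriging estimates so their spread matches the sample variance. This is a generic textbook correction, not necessarily the algorithm the paper evaluates.

import numpy as np

def affine_smoothing_correction(krig_est, sample_mean, sample_var):
    # Kriging estimates are smoother (lower variance) than the data;
    # rescale them about the mean so the corrected field matches the
    # sample variance. A generic correction, shown for illustration only.
    factor = np.sqrt(sample_var / krig_est.var())
    return sample_mean + factor * (krig_est - sample_mean)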
Abstract:
Architectures based on Coordinated Atomic action (CA action) concepts have been used to build concurrent fault-tolerant systems. This conceptual model combines concurrent exception handling with action nesting to provide a general mechanism both for enclosing interactions among system components and for coordinating forward error recovery measures. This article presents an architectural model to guide the formal specification of concurrent fault-tolerant systems. The architecture provides built-in Communicating Sequential Processes (CSP) and predefined channels to coordinate exception handling among the user-defined components. Hence, safety properties concerning action scoping and concurrent exception handling can be proved using the FDR (Failures-Divergences Refinement) verification tool. As a result, a formal and general architecture supporting software fault tolerance is ready to be used and proved as users define components with normal and exceptional behaviors.
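The cited architecture is specified in CSP and verified with FDR; as a loose, language-shifted illustration of the coordination idea (enclosing concurrent participants in one action scope and resolving their exceptions together), here is a hypothetical Python sketch. It mimics the behaviour informally and is not the CSP specification.

import concurrent.futures

def run_ca_action(participants, resolve):
    # Run all participants of a CA action concurrently, collect every
    # exception raised inside the action scope, and hand the set of
    # concurrent exceptions to one resolution (forward recovery) step.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(p) for p in participants]
        exceptions = []
        for f in concurrent.futures.as_completed(futures):
            if f.exception() is not None:
                exceptions.append(f.exception())
    if exceptions:
        resolve(exceptions)   # coordinated handling, not per-participant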
Abstract:
When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, uncertainty arises naturally in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or due to non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods that exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches, and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated.
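To make the robust objective concrete, here is a hedged sketch of "flat" interval value iteration: at each backup, nature chooses the worst transition distribution inside given probability intervals. It illustrates the MDP-IP semantics the paper starts from, not the factored algorithms it contributes; names and the interval representation are assumptions.

import numpy as np

def worst_case_value(V, lo, hi):
    # Nature's inner minimization: choose p with lo <= p <= hi and
    # sum(p) = 1 minimizing p . V (assumes the intervals admit such a p).
    # Greedy: start at the lower bounds, then pour the remaining mass
    # onto the lowest-value successor states first.
    p = lo.copy()
    slack = 1.0 - p.sum()
    for s in np.argsort(V):
        add = min(hi[s] - lo[s], slack)
        p[s] += add
        slack -= add
        if slack <= 0.0:
            break
    return p @ V

def robust_value_iteration(P_lo, P_hi, R, gamma, iters=200):
    # P_lo, P_hi: (A, S, S) interval transition bounds; R: (S, A) rewards
    A, S, _ = P_lo.shape
    V = np.zeros(S)
    for _ in range(iters):
        Q = np.array([[R[s, a] + gamma * worst_case_value(V, P_lo[a, s], P_hi[a, s])
                       for a in range(A)]
                      for s in range(S)])
        V = Q.max(axis=1)   # max over actions, min over nature
    return V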
Abstract:
The Grubbs measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution; however, this assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative for producing robust estimates while keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for parameter estimation, and to use the local influence method to assess the robustness of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we use the local influence procedure in a study comparing the precision of several thermocouples.
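The flavour of "scale mixtures of normals" robustness can be seen in a toy EM for a Student-t location model with fixed degrees of freedom nu: outlying points receive small weights in the E-step and so barely move the estimates. This univariate sketch only illustrates the mechanism; it is not the paper's EM algorithm for the Grubbs model.

import numpy as np

def t_location_em(x, nu=4.0, iters=50):
    # EM for a Student-t location-scale model (fixed nu). The t model is
    # a scale mixture of normals, so the E-step yields per-observation
    # weights that downweight outliers.
    mu, sigma2 = np.median(x), x.var()
    for _ in range(iters):
        d2 = (x - mu) ** 2 / sigma2
        w = (nu + 1.0) / (nu + d2)           # E-step: mixing weights
        mu = np.sum(w * x) / np.sum(w)       # M-step: weighted mean
        sigma2 = np.mean(w * (x - mu) ** 2)  # M-step: weighted scale
    return mu, sigma2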
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, for which we obtain a rigorous estimate. This matrix is expeditiously computed using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes. We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve.
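The spreading and interpolation operators that the lagged discretization reuses are both built from a regularized delta function; the sketch below shows the standard 4-point Peskin delta and a 1D spreading operation. This is background machinery for illustration, not the paper's solver; grid names and shapes are assumptions.

import numpy as np

def peskin_delta(r):
    # Standard 4-point regularized delta (one Cartesian factor), support |r| < 2
    r = np.abs(r)
    out = np.zeros_like(r)
    near = r < 1.0
    mid = (r >= 1.0) & (r < 2.0)
    out[near] = (3 - 2 * r[near] + np.sqrt(1 + 4 * r[near] - 4 * r[near] ** 2)) / 8
    out[mid] = (5 - 2 * r[mid] - np.sqrt(-7 + 12 * r[mid] - 4 * r[mid] ** 2)) / 8
    return out

def spread_force_1d(X, F, grid, h):
    # Spread Lagrangian point forces F at positions X onto the Eulerian grid;
    # interpolating grid velocity back to X uses the same delta (the adjoint).
    f = np.zeros_like(grid)
    for Xk, Fk in zip(X, F):
        f += Fk * peskin_delta((grid - Xk) / h) / h
    return f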
Abstract:
An administrative border might hinder the optimal allocation of a given set of resources by restricting the flow of goods, services, and people. In this paper we address the question: do administrative borders lead to poor accessibility to public services such as hospitals? In answering this question, we examine the case of Sweden and its regional borders. We use detailed data on the Swedish road network, its hospitals, and its geo-coded population, and we assess the population's spatial accessibility to Swedish hospitals by computing each inhabitant's distance to the nearest hospital. We also elaborate several scenarios, ranging from strongly confining regional borders to no border confinement at all, and recompute the accessibility. Our findings imply that administrative borders only marginally worsen accessibility.
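A minimal sketch of the nearest-hospital computation: the study works with road-network distances, but a straight-line (Euclidean) nearest-neighbour query, shown here with a k-d tree, conveys the idea; coordinates and names are placeholders.

import numpy as np
from scipy.spatial import cKDTree

# Planar coordinates (e.g. projected, in metres); placeholder data
hospitals = np.array([[0.0, 0.0], [50_000.0, 10_000.0]])
population = np.random.default_rng(0).uniform(0, 60_000, size=(100_000, 2))

tree = cKDTree(hospitals)
dist, nearest = tree.query(population)  # distance to, and index of, nearest hospital
print(dist.mean(), np.median(dist))     # simple accessibility summaries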
Abstract:
E-learning has become one of the primary ways of delivering education around the globe. In Somalia, a country torn within and from the global community by a prolonged civil war, the University of Hargeisa has, in collaboration with Dalarna University in Sweden, adopted e-learning for the first time. This study explores barriers to and facilitators of e-learning usage experienced by students in Somalia's higher education, using the University of Hargeisa as a case study. Interviews were conducted with students to explore how the University of Hargeisa's novice users perceived e-learning, and which factors positively and negatively affected their e-learning experiences. The Unified Theory of Acceptance and Use of Technology (UTAUT) model was used as a framework for interpreting the results. The findings show that, in general, the students have a very positive attitude towards e-learning and perceive that it enhances their educational experience. The communication aspect was found to be especially important for Somali students, as it facilitated a feeling of belonging to the global community of students and scholars and alleviated the war-torn country's isolation. However, some socio-cultural aspects of the students' communities negatively affected their e-learning experience. The study ends with recommendations, based on the empirical findings, to promote the use and enhance the experience of e-learning in post-conflict Somali educational institutions.
Abstract:
The open provenance architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to record documentation independently, and allowing the workflow to be executed in any environment. Another noticeable feature is that we distinguish between the data recorded about what has occurred, the process documentation, and the provenance of a data item, which is all that caused the data item to be as it is and is obtained as the result of a query over process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.
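The distinction drawn above reduces, in its simplest form, to a graph query: the provenance of an item is everything reachable through recorded causal relationships. A minimal sketch of this query over process documentation (the data layout and names are illustrative, not OPA's API):

def provenance(item, caused_by):
    # caused_by: process documentation as a mapping effect -> direct causes
    # (covering both events and data items). The provenance of `item` is
    # everything that transitively caused it.
    seen, stack = set(), [item]
    while stack:
        x = stack.pop()
        for cause in caused_by.get(x, ()):
            if cause not in seen:
                seen.add(cause)
                stack.append(cause)
    return seen

doc = {"result": ["analysis_run"], "analysis_run": ["raw_data", "config"]}
assert provenance("result", doc) == {"analysis_run", "raw_data", "config"}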
Abstract:
A description of a data item's provenance can be provided in different forms, and which form is best depends on the intended use of that description. Because of this, different communities have made quite distinct underlying assumptions in their models for electronically representing provenance. Approaches deriving from the library and archiving communities emphasise an agreed vocabulary by which resources can be described and, in particular, their attribution asserted (who created the resource, who modified it, where it was stored, etc.). The primary purpose here is to provide intuitive metadata by which users can search for and index resources. In comparison, models for representing the results of scientific workflows have been developed with the assumption that each event or piece of intermediary data in a process's execution can and should be documented, to give a full account of the experiment undertaken. These occurrences are connected together by stating where one derived from, triggered, or otherwise caused another, and so form a causal graph. Mapping between the two approaches would be beneficial in integrating systems and exploiting the strengths of each. In this paper, we specify such a mapping between Dublin Core and the Open Provenance Model. We further explain the technical issues to overcome and the rationale behind the approach, to allow the same method to apply in mapping similar schemes.
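To illustrate what such a mapping can look like in practice, here is a hypothetical fragment translating Dublin Core attribution metadata into OPM-style causal assertions (an artifact generated by a process controlled by an agent). The specific term-to-edge choices below are illustrative assumptions, not the mapping the paper specifies.

def dc_to_opm(resource, dc):
    # dc: Dublin Core metadata for `resource`, e.g. {"creator": "alice"}
    # Returns (subject, relation, object) assertions in OPM vocabulary.
    edges = []
    if "creator" in dc:
        process = f"creation_of_{resource}"  # hypothetical process node
        edges.append((resource, "wasGeneratedBy", process))
        edges.append((process, "wasControlledBy", dc["creator"]))
    if "source" in dc:
        edges.append((resource, "wasDerivedFrom", dc["source"]))
    return edges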
Abstract:
This thesis discusses three themes: public policy, technology management, and the automotive sector. With the goal of shortening the cycle of technology absorption and development, a substantial volume of resources has been transferred from the public sector to the private sector through what is called inductive public policy (PPI). Governments thereby intend to attract the most technologically capable firms, in the expectation that they will transfer the knowledge they hold to the locality where they settle. In Brazil, one of the target sectors of this type of policy has been the automotive sector, a circumstance observed at different moments in history. Indeed, the Brazilian Automotive Regime intends not only to accelerate the country's development but also to promote a significant transfer of technology. The analysis of PPIs, for all its importance, is strongly influenced and hindered both by their advocates and by their detractors, who view them simply in terms of success or failure; beyond this difficulty, there is also the heavy ideological content underlying the arguments, which causes the evaluation to be lost in an inconclusive picture. After all, are these initiatives beneficial or not for the country and for regional economies? Moreover, the effectiveness, and therefore the soundness, of this strategy can only be assessed ex post facto, once public resources have already been committed, perhaps irremediably. For this reason, this study develops an ex ante analysis of inductive public policies, making use of a comprehensive model that allows a longitudinal analysis and thus captures changes in the environment. Among other things, it seeks to answer the following question: is it possible today to infer the contribution, whether positive or negative, that the Brazilian Automotive Regime and its state-level offshoots will bring to the technological capability surrounding the attracted firm? The problem and the research question were approached predominantly from a qualitative standpoint, and the chosen method was the case study. With the aid of the proposed model, the potential for an increase in technological capability induced by the installation of the General Motors do Brasil assembly plant in Gravataí, Rio Grande do Sul, was analyzed and evaluated. It is concluded that the benefits foreseen by the Brazilian Automotive Regime with respect to local technological capability are unlikely to be achieved through the installation of new automotive firms or the modernization of existing ones.
Abstract:
Point pattern matching in Euclidean spaces is one of the fundamental problems in pattern recognition, with applications ranging from computer vision to computational chemistry. Whenever two complex patterns are encoded by two sets of points identifying their key features, their comparison can be seen as a point pattern matching problem. This work proposes a single approach to both exact and inexact point set matching in Euclidean spaces of arbitrary dimension. In the case of exact matching, the approach is guaranteed to find an optimal solution. For inexact matching (when noise is involved), experimental results confirm the validity of the approach. We start by regarding point pattern matching as a weighted graph matching problem. We then formulate the weighted graph matching problem as one of Bayesian inference in a probabilistic graphical model. By exploiting the existence of fundamental constraints in patterns embedded in Euclidean spaces, we prove that for exact point set matching a simple graphical model is equivalent to the full model. Exact probabilistic inference in this simple model has polynomial time complexity with respect to the number of elements in the patterns to be matched. This gives rise to a technique that, for exact matching, provably finds a global optimum in polynomial time for any dimensionality of the underlying Euclidean space. Computational experiments comparing this technique with well-known probabilistic relaxation labeling show significant performance improvement for inexact matching. The proposed approach is significantly more robust under augmentation of the sizes of the involved patterns, and in the absence of noise the results are always perfect.
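The "fundamental constraints" exploited here are rigidity constraints: an assignment between two point sets is an exact match precisely when it preserves all pairwise distances. The sketch below checks that invariant for a candidate assignment; it illustrates the constraint, not the paper's polynomial-time inference algorithm.

import numpy as np

def is_exact_match(P, Q, assignment, tol=1e-9):
    # P, Q: (n, d) point sets; assignment[i] is the index in Q matched to P[i].
    # Exact matching requires every pairwise distance to be preserved.
    Qm = Q[assignment]
    dP = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
    dQ = np.linalg.norm(Qm[:, None, :] - Qm[None, :, :], axis=-1)
    return bool(np.all(np.abs(dP - dQ) < tol))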
Abstract:
The rapid growth of urban areas has a significant impact on traffic and transportation systems. New management policies and planning strategies are clearly necessary to cope with the increasingly limited capacity of existing road networks. The concept of Intelligent Transportation Systems (ITS) arises in this scenario: rather than attempting to increase road capacity by means of physical modifications to the infrastructure, the premise of ITS relies on the use of advanced communication and computer technologies to handle today's traffic and transportation facilities. Influencing users' behaviour patterns is a challenge that has stimulated much research in the ITS field, where human factors are gaining great importance for modelling, simulating, and assessing such an innovative approach. This work aims at using Multi-agent Systems (MAS) to represent traffic and transportation systems in the light of the new performance measures brought about by ITS technologies. Agents are well suited to represent components of a system that are geographically and functionally distributed, as most components in traffic and transportation are. A BDI (beliefs, desires, and intentions) architecture is presented as an alternative to the traditional models used to represent driver behaviour within microscopic simulation, allowing for an explicit representation of users' mental states. Basic concepts of ITS and MAS are presented, as well as some application examples related to the subject. This has motivated the extension of an existing microscopic simulation framework to incorporate MAS features and enhance the representation of drivers; in this way, demand is generated by a population of agents as the result of their daily decisions on route and departure time. The extended simulation model, which now supports the interaction of BDI driver agents, was implemented, and different experiments were performed to test the approach in commuter scenarios. MAS provides a process-driven approach that fosters the construction of modular, robust, and scalable models, characteristics lacking in earlier result-driven approaches. Its abstraction premises allow for a closer association between the model and its practical implementation. Uncertainty and variability are addressed in a straightforward manner, as cognitive architectures such as the BDI approach used in this work provide an easier representation of human-like behaviours within the driver structure. In this way, MAS extends microscopic traffic simulation to better address the complexity inherent in ITS technologies.
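A toy sketch of a BDI-style commuter agent deciding departure time and route from its mental state; the attributes and decision rule are hypothetical simplifications of the architecture described, intended only to show how beliefs, desires, and intentions can be made explicit in code.

from dataclasses import dataclass, field

@dataclass
class BDIDriver:
    # Beliefs: expected travel time per route (minutes), learned from past days
    beliefs: dict = field(default_factory=lambda: {"highway": 35.0, "arterial": 40.0})
    # Desire: arrive by this time (minutes after midnight)
    desired_arrival: float = 9 * 60.0
    # Intention: the currently committed plan (route, departure time)
    intention: tuple = None

    def deliberate(self):
        route = min(self.beliefs, key=self.beliefs.get)  # fastest believed route
        depart = self.desired_arrival - self.beliefs[route]
        self.intention = (route, depart)
        return self.intention

    def update_beliefs(self, route, observed_time, alpha=0.2):
        # Revise beliefs from the day's experience (exponential smoothing)
        self.beliefs[route] += alpha * (observed_time - self.beliefs[route])

driver = BDIDriver()
print(driver.deliberate())            # e.g. ('highway', 505.0)
driver.update_beliefs("highway", 50.0)
print(driver.deliberate())            # may switch route or depart earlier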