835 results for Computer networks -- Simulation methods
Abstract:
In this work we apply a quantum-circuit treatment to describe nuclear spin relaxation. Starting from Redfield theory, we obtain a description of quadrupolar relaxation as a computational process in a spin-3/2 system, through a model in which the environment is composed of five qubits and three different quantum noise channels. The interaction between the environment and the spin-3/2 nuclei is described by a quantum circuit fully compatible with the Redfield theory of relaxation. Theoretical predictions are compared to experimental data, and a short review of quantum channels and relaxation in NMR qubits is also presented.
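For intuition only (an illustrative sketch, not the five-qubit construction of the paper), a quantum noise channel acts on a density matrix through Kraus operators, ρ ↦ Σ_k E_k ρ E_k†. The snippet below applies a single-qubit amplitude-damping channel; the channel choice and the damping parameter are assumptions made for illustration.

```python
import numpy as np

def apply_channel(rho, kraus_ops):
    """Apply a quantum channel given by Kraus operators: rho -> sum_k E_k rho E_k^dagger."""
    return sum(E @ rho @ E.conj().T for E in kraus_ops)

# Amplitude-damping channel with damping probability gamma (illustrative value).
gamma = 0.1
E0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
E1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

# Start from the excited state |1><1| and let the channel relax it toward |0><0|.
rho = np.array([[0.0, 0.0], [0.0, 1.0]])
rho = apply_channel(rho, [E0, E1])
print(rho)  # population partially transferred from |1> to |0>
```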
Abstract:
There has been great interest in deciding whether a combinatorial structure satisfies some property, or in estimating the value of some numerical function associated with this combinatorial structure, by considering only a randomly chosen substructure of sufficiently large, but constant, size. These problems are called property testing and parameter testing, where a property or parameter is said to be testable if it can be estimated accurately in this way. The algorithmic appeal is evident, as, conditional on sampling, this leads to reliable constant-time randomized estimators. Our paper addresses property testing and parameter testing for permutations from a subpermutation perspective; more precisely, we investigate permutation properties and parameters that can be well approximated based on a randomly chosen subpermutation of much smaller size. In this context, we use a theory of convergence of permutation sequences developed by the present authors [C. Hoppen, Y. Kohayakawa, C.G. Moreira, R.M. Sampaio, Limits of permutation sequences through permutation regularity, Manuscript, 2010, 34pp.] to characterize testable permutation parameters, along the lines of the work of Borgs et al. [C. Borgs, J. Chayes, L. Lovasz, V.T. Sos, B. Szegedy, K. Vesztergombi, Graph limits and parameter testing, in: STOC '06: Proceedings of the 38th Annual ACM Symposium on Theory of Computing, ACM, New York, 2006, pp. 261-270] in the case of graphs. Moreover, we obtain a permutation result in the direction of a famous result of Alon and Shapira [N. Alon, A. Shapira, A characterization of the (natural) graph properties testable with one-sided error, SIAM J. Comput. 37 (6) (2008) 1703-1727] stating that every hereditary graph property is testable. (C) 2011 Elsevier B.V. All rights reserved.
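To make the sampling paradigm concrete, here is a minimal sketch (not taken from the paper) in which one testable parameter, the inversion density, is estimated from a single random subpermutation; the parameter, the permutation, and the sample size k are illustrative assumptions.

```python
import random

def inversion_density(perm):
    """Fraction of pairs (i, j), i < j, with perm[i] > perm[j]."""
    n = len(perm)
    inv = sum(perm[i] > perm[j] for i in range(n) for j in range(i + 1, n))
    return inv / (n * (n - 1) / 2)

def random_subpermutation(perm, k):
    """Pattern induced by k uniformly chosen positions of perm."""
    idx = sorted(random.sample(range(len(perm)), k))
    vals = [perm[i] for i in idx]
    rank = {v: r for r, v in enumerate(sorted(vals))}
    return [rank[v] for v in vals]

n, k = 2000, 100
perm = list(range(n))
random.shuffle(perm)
estimate = inversion_density(random_subpermutation(perm, k))
exact = inversion_density(perm)  # O(n^2), fine as a reference at this size
print(estimate, exact)           # the two values should be close
```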
Abstract:
The InteGrade project is a multi-university effort to build a novel grid computing middleware based on the opportunistic use of resources belonging to user workstations. The InteGrade middleware currently enables the execution of sequential, bag-of-tasks, and parallel applications that follow the BSP or the MPI programming models. This article presents the lessons learned over the last five years of InteGrade's development and describes the solutions achieved concerning support for robust application execution. The contributions cover the related fields of application scheduling, execution management, and fault tolerance. We present our solutions, describing their implementation principles and their evaluation through the analysis of several experimental results. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
The InteGrade middleware intends to exploit the idle time of computing resources in computer laboratories. In this work we investigate the performance of running parallel applications with communication among processors on the InteGrade grid. As costly communication on a grid can be prohibitive, we explore the so-called systolic, or wavefront, paradigm to design parallel algorithms in which no global communication is used. To evaluate the InteGrade middleware, we considered three parallel algorithms that solve the matrix chain product problem, the 0-1 knapsack problem, and the local sequence alignment problem, respectively. We show that these three applications running under the InteGrade middleware and MPI take only slightly more time than the same applications running on a cluster with only LAM-MPI support. The results can be considered promising, as the time difference between the two is not substantial. The overhead of the InteGrade middleware is acceptable in view of the benefits it provides to the user, including job submission, checkpointing, security, and job migration. Copyright (C) 2009 John Wiley & Sons, Ltd.
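As a minimal illustration of the wavefront (systolic) paradigm (a sketch under assumed scoring parameters, not the implementation evaluated in the paper), the local sequence alignment score matrix can be filled along anti-diagonals: every cell on anti-diagonal i + j = d depends only on diagonals d - 1 and d - 2, so the cells of one diagonal form a wave of independent tasks requiring only neighbor communication.

```python
def smith_waterman_wavefront(a, b, match=2, mismatch=-1, gap=-1):
    """Fill the local-alignment score matrix in wavefront (anti-diagonal) order.

    In a parallel setting, all cells on one anti-diagonal are independent of
    each other and need only values from the two previous diagonals.
    """
    n, m = len(a), len(b)
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for d in range(2, n + m + 1):                     # anti-diagonal index i + j
        for i in range(max(1, d - m), min(n, d - 1) + 1):
            j = d - i
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i - 1][j - 1] + s,
                          H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman_wavefront("ACACACTA", "AGCACACA"))
```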
Abstract:
Belief revision deals with the problem of adding new information to a knowledge base in a consistent way. Ontology debugging, on the other hand, aims to find the axioms in a terminological knowledge base that caused the base to become inconsistent. In this article, we propose a belief revision approach to finding and repairing inconsistencies in ontologies represented in some description logic (DL). As the usual belief revision operators cannot be directly applied to DLs, we propose new operators that can be used with more general logics and show that, in particular, they can be applied to the logics underlying OWL-DL and OWL-Lite.
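As a hedged illustration of the debugging side (a generic primitive, not the revision operators proposed in the article), one common building block is shrinking an inconsistent set of axioms to a minimal inconsistent core, given a consistency oracle; is_consistent below is a hypothetical stand-in for a DL reasoner call.

```python
def minimal_inconsistent_subset(axioms, is_consistent):
    """Linear deletion-based shrinking of an inconsistent axiom set.

    `is_consistent(axiom_list)` is a hypothetical oracle (e.g. a DL reasoner
    call); the input is assumed inconsistent and axioms pairwise distinct.
    The returned core is minimal: dropping any one axiom restores consistency.
    """
    core = list(axioms)
    for ax in list(axioms):
        candidate = [a for a in core if a != ax]
        if len(candidate) < len(core) and not is_consistent(candidate):
            core = candidate  # ax is not needed to produce the conflict
    return core

# Toy demo: "axioms" are integers, and a set is inconsistent iff it contains
# both 1 and -1 (a stand-in for a pair of contradictory statements).
oracle = lambda axs: not (1 in axs and -1 in axs)
print(minimal_inconsistent_subset([3, 1, 7, -1, 5], oracle))  # -> [1, -1]
```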
Abstract:
The ever-increasing robustness and reliability of flow-simulation methods have consolidated CFD as a major tool in virtually all branches of fluid mechanics. Traditionally, those methods have played a crucial role in the analysis of flow physics. In more recent years, though, the subject has broadened considerably, with the development of optimization and inverse design applications. Since then, the search for efficient ways to evaluate flow-sensitivity gradients has received the attention of numerous researchers. In this scenario, the adjoint method has emerged as, quite possibly, the most powerful tool for the job, which heightens the need for a clear understanding of its conceptual basis. Yet, some of its underlying aspects are still subject to debate in the literature, despite all the research that has been carried out on the method. Such is the case with the adjoint boundary and internal conditions, in particular. The present work aims to shed more light on that topic, with emphasis on the need for an internal shock condition. By following the path of previous authors, the quasi-1D Euler problem is used as a vehicle to explore those concepts. The results clearly indicate that the behavior of the adjoint solution through a shock wave ultimately depends upon the nature of the objective functional.
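For orientation (a generic discrete-adjoint statement in assumed notation, not the paper's quasi-1D derivation), the appeal of the method is that one adjoint solve yields the sensitivity of the objective with respect to arbitrarily many design variables:

```latex
% R(U, \alpha) = 0 is the flow residual, J(U, \alpha) the objective functional,
% \alpha the design variables, and \psi the adjoint variable.
\left(\frac{\partial R}{\partial U}\right)^{\!T}\!\psi
    = \left(\frac{\partial J}{\partial U}\right)^{\!T},
\qquad
\frac{\mathrm{d}J}{\mathrm{d}\alpha}
    = \frac{\partial J}{\partial \alpha}
    - \psi^{T}\,\frac{\partial R}{\partial \alpha}.
```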
Abstract:
Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed by the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria that select lag and rank simultaneously have superior performance in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
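To fix notation (a standard textbook formulation, assumed here rather than quoted from the paper): in a vector error-correction model, the SCCF restriction states that some linear combinations of the differenced series are white noise, which zeroes those combinations out of the error-correction term and every short-run coefficient matrix:

```latex
% VECM with cointegrating rank r (standard notation, assumed):
\Delta y_t = \alpha \beta' y_{t-1}
           + \sum_{i=1}^{p-1} \Gamma_i \,\Delta y_{t-i} + \varepsilon_t .
% SCCF: there exists a cofeature matrix \delta of full column rank with
% \delta' \Delta y_t = \delta' \varepsilon_t (white noise), equivalently
\delta' \alpha = 0, \qquad \delta' \Gamma_i = 0 \quad (i = 1, \dots, p-1).
```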
Abstract:
With the aim of using an IP network to implement closed control loops, this work carries out a study of the operation of a distributed dynamic control system, comparing it with the operation of a conventional local control system. In general, the decision to design a distributed control architecture is based on simplicity, cost reduction, and reliability; hence the use of an IP network is an important differentiator. The goal of a control network is not to transmit digital data, but sampled analog data. Thus, the usual metrics of computer networks, such as data volume and transfer rate, become secondary in a control network. Techniques are proposed to handle delayed packets and to recover the performance of the control system over the IP network. The key to this method is to estimate the contents of delayed packets based on the dynamic model of the system, keeping the system at an adequate level of performance. The system considered is the control of an anthropomorphic manipulator with two arms and a stereo vision head, totaling 18 joints. The results obtained show that a large part of the system's performance can be recovered.
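A minimal sketch of the delayed-packet idea (illustrative only; the plant matrices below are invented stand-ins, not the 18-joint manipulator model of the thesis): when a measurement packet misses its deadline, the controller substitutes a prediction obtained by propagating the last estimate through the dynamic model.

```python
import numpy as np

# Hypothetical discrete-time plant x[k+1] = A x[k] + B u[k], y[k] = C x[k].
A = np.array([[1.0, 0.1], [0.0, 1.0]])   # illustrative double-integrator-like model
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])

def control_step(x_hat, u_prev, packet):
    """One controller step over the IP network.

    If the measurement packet arrived, use it; if it was delayed or lost,
    substitute the model's one-step prediction of the output.
    """
    x_pred = A @ x_hat + B @ u_prev                       # model-based prediction
    y = packet if packet is not None else (C @ x_pred).item()
    # ... y would feed the feedback law; here we just return the estimate
    return x_pred, y

x_hat, u_prev = np.zeros(2), np.array([0.0])
x_hat, y = control_step(x_hat, u_prev, packet=None)       # delayed packet handled
```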
Abstract:
Researchers often rely on the t-statistic to make inferences about parameters in statistical models. It is common practice to obtain critical values by simulation techniques. This paper proposes a novel numerical method to obtain an approximately similar test. This test rejects the null hypothesis when the test statistic is larger than a critical value function (CVF) of the data. We illustrate this procedure when regressors are highly persistent, a case in which commonly used simulation methods encounter difficulties in controlling size uniformly. Our approach works satisfactorily, controls size, and yields a test that outperforms the two other known similar tests.
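For context, here is a generic sketch of the standard simulation approach that the paper improves upon (the regression design and significance level are toy assumptions): simulate the t-statistic under the null and take an upper quantile as the critical value.

```python
import numpy as np

rng = np.random.default_rng(0)

def t_stat_under_null(n):
    """t-statistic for the slope in y = b*x + e with true b = 0 (toy design)."""
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)               # null: no relationship
    b = x @ y / (x @ x)
    resid = y - b * x
    se = np.sqrt(resid @ resid / (n - 1) / (x @ x))
    return b / se

draws = np.array([t_stat_under_null(100) for _ in range(10_000)])
crit = np.quantile(draws, 0.95)              # 5%-level one-sided critical value
print(crit)                                  # close to the usual value of about 1.65
```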
Abstract:
Wireless Sensor Network (WSN) methods applied to oil lifting constitute an area of growing technical and scientific demand, in view of the optimizations they enable in existing processes. The main objective of this dissertation is to present the development of embedded systems dedicated to a wireless sensor network based on IEEE 802.15.4, using the ZigBee protocol, connecting sensors, actuators, and the PLC (Programmable Logic Controller), in order to solve the problems present in the deployment and maintenance of the physical communication of current oil-lifting units based on the Plunger-Lift method. The embedded systems developed for this application are responsible for acquiring information from the sensors and controlling the actuators of the devices present at the well; they also use the Modbus protocol so that this network becomes transparent to the PLC, which is responsible for controlling production and delivering information to the SISAL supervisory system.
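As a small illustration of the Modbus layer (a generic protocol detail, not code from the dissertation), every Modbus RTU frame carries a CRC-16 computed with initial value 0xFFFF and the reflected polynomial 0xA001; the request bytes below are a standard read-holding-registers example.

```python
def modbus_crc16(frame: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

# Read-holding-registers request (slave 1, function 3, address 0, count 2).
pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
crc = modbus_crc16(pdu)
frame = pdu + bytes([crc & 0xFF, crc >> 8])  # CRC is sent low byte first
print(frame.hex())                           # -> 010300000002c40b
```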
Abstract:
This paper is the result of a Master's dissertation studying the role and history of scientific communication, especially the changes that have occurred after the appearance of electronic communication and computer networks. This study showed that hypertext systems are increasingly being used in the scientific and academic world in the production of electronic journals, making it possible for users to rapidly access information in their area. However, these systems need to be improved to help the user during search and access to information. Both printed journals migrating to electronic media and exclusively electronic journals should present the current quality indicators. An attempt was made to discover whether characteristics related to printed journals are being maintained in their electronic counterparts. For this, a prototype model was developed to analyze the structure of electronic scientific journals; it comprises 14 criteria expressing aspects of quality for these journals. It includes elements of website information architecture and those already in place in printed scientific journals, in order to ensure that the basic functions - archiving and dissemination - are maintained in electronic publishing. Each criterion consists of variables that measure the maintenance of these functions both in the migrating printed journals and in the exclusively electronic ones. This prototype model was used to analyze Ciência da Informação On-line and DataGramaZero - Revista de Ciência da Informação. Results indicate that the model is able to determine whether the basic functions of archiving and dissemination are being maintained in electronic journals, which justifies its application to them. The model can help librarians, authors, and users of electronic journals to identify quality journals, and can assist editors in developing their projects. The material from the study may be included in the pre-service and in-service education of Information Science professionals and may support editors of scientific journals.