101 results for Teoria da computação


Relevance:

20.00%

Abstract:

Infographics have historically accompanied the evolution of journalism, from the incipient handmade models of the eighteenth century to the computers and sophisticated software of today. Whether to respond to the advent of TV and the decline of printed-newspaper readership, or to represent the Gulf War, where photography was not allowed, infographics reached modern levels of production and publication. Technical devices enabled infographics to evolve in the environment of the internet, with conditions for manipulation by the reader and the incorporation of video, audio and animation, in the style of so-called interactive infographics. These digital models of information visualization have recently arrived at daily newspapers in the Northeast and on their respective web sites, with regionalized features. This paper therefore proposes to explore and describe the processes of producing interactive infographics, taking the example of the Diário do Nordeste, in Fortaleza, Ceará, whose infographics department was created one year ago. To do so, it draws on aspects that guide the theory of journalism, such as newsmaking, the filters that shape the productive routine (gatekeeping) and the construction stages of the news. This research also draws on the theoretical framework on the subject, on the essential characteristics of computer graphics, and on methodological procedures and systematic empirical observations of production routines in the newsroom that can attest to limitations and/or advances.

Relevance:

20.00%

Abstract:

The following work interprets and analyzes the problem of induction under a vision founded on set theory and probability theory, as a basis for resolving its negative philosophical implications for systems of inductive logic in general. Given the importance of the problem and the relatively recent developments in these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a field of relatively unexplored and promising possibilities has been opened. The key point of the study consists in modeling the information acquisition process using concepts of set theory, followed by a treatment using probability theory. Throughout the study, two major obstacles to the probabilistic justification were identified: the problem of defining the concept of probability and that of defining rationality, as well as the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, in order to facilitate the treatment of the problem through specific situations without losing their original characteristics, so that the conclusions can be extended to classic cases such as the question of the continuity of the sunrise.

Relevance:

20.00%

Abstract:

In this thesis we study some problems related to petroleum reservoirs using methods and concepts of Statistical Physics. The thesis can be divided in two parts. The first one introduces a study of the percolation problem on a random multifractal support, motivated by its potential application in modelling oil reservoirs. We developed a heterogeneous and anisotropic grid that follows a random multifractal distribution of its sites. We then determine the percolation threshold for this grid, the fractal dimension of the percolating cluster and the critical exponents β and ν. In the second part, we propose an alternative systematics for modelling and simulating oil reservoirs. We introduce a statistical model based on a stochastic formulation of Darcy's Law. In this model, the distribution of permeabilities is locally equivalent to the basic model of bond percolation.
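
For readers new to the setup, the sketch below estimates the spanning probability of ordinary site percolation on a uniform square lattice using union-find. The lattice type, the threshold value quoted and all names are illustrative assumptions only; the thesis works on a multifractal, anisotropic grid rather than this uniform one.

import random

def percolates(L, p, seed=None):
    """Does an L x L site-percolation sample span from top to bottom?"""
    rng = random.Random(seed)
    occupied = [[rng.random() < p for _ in range(L)] for _ in range(L)]

    parent = list(range(L * L + 2))  # two extra virtual nodes: top, bottom
    TOP, BOTTOM = L * L, L * L + 1

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for i in range(L):
        for j in range(L):
            if not occupied[i][j]:
                continue
            idx = i * L + j
            if i == 0:
                union(idx, TOP)
            if i == L - 1:
                union(idx, BOTTOM)
            if i > 0 and occupied[i - 1][j]:   # neighbour above
                union(idx, (i - 1) * L + j)
            if j > 0 and occupied[i][j - 1]:   # neighbour to the left
                union(idx, i * L + j - 1)

    return find(TOP) == find(BOTTOM)

# Spanning fraction near the uniform square-lattice site threshold p_c ~ 0.5927.
samples = 200
hits = sum(percolates(64, 0.5927, seed=s) for s in range(samples))
print(f"spanning fraction: {hits / samples:.2f}")

Near p_c the spanning fraction changes sharply with p; locating that sharp change is what a numerical determination of the percolation threshold exploits.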

Relevance:

20.00%

Abstract:

In this work we investigate the stochastic behavior of a large class of systems with variable damping which are described by a time-dependent Lagrangian. Our stochastic approach is based on the Langevin treatment describing the motion of a classical Brownian particle of mass m. Two situations of physical interest are considered. In the first one, we discuss in detail an application of the standard Langevin treatment (white noise) to the variable damping system. In the second one, a more general viewpoint is adopted by assuming a given expression for the so-called colored noise. For both cases, the basic differential equations are analytically solved and all the physically relevant quantities are explicitly determined. The results depend on an arbitrary parameter q measuring how the behavior of the system departs from that of the standard Brownian particle with constant viscosity. Several types of stochastic behavior (superdiffusive and subdiffusive) are obtained as the free parameter varies continuously. However, all the results of the conventional Langevin approach with constant damping are recovered in the limit q = 1.
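
As a point of reference for the Langevin treatment mentioned above (a generic textbook form, not necessarily the thesis's exact equations or normalization), the equation of motion with a time-dependent damping coefficient \gamma(t) and Gaussian white noise reads

m\,\dot{v}(t) = -m\,\gamma(t)\,v(t) + \eta(t), \qquad \langle \eta(t) \rangle = 0, \qquad \langle \eta(t)\,\eta(t') \rangle = 2D\,\delta(t - t'),

while the colored-noise case replaces the delta correlator by one with a finite correlation time \tau, e.g. the Ornstein-Uhlenbeck form \langle \eta(t)\,\eta(t') \rangle = (D/\tau)\, e^{-|t - t'|/\tau}.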

Relevance:

20.00%

Abstract:

The standard kinetic theory for a nonrelativistic dilute gas is generalized in the spirit of the nonextensive statistical distribution introduced by Tsallis. The new formalism depends on an arbitrary parameter q measuring the degree of nonextensivity. In the limit q = 1, the extensive Maxwell-Boltzmann theory is recovered. Starting from a purely kinetic deduction of the velocity q-distribution function, the Boltzmann H-theorem is generalized to include the possibility of nonextensive out-of-equilibrium effects. Based on this investigation, it is proved that Tsallis' distribution is the necessary and sufficient condition defining a thermodynamic equilibrium state in the nonextensive context. This result follows naturally from the generalized transport equation and also from the extended H-theorem. Two physical applications of the nonextensive effects have been considered. Closed analytic expressions were obtained for the Doppler broadening of spectral lines from an excited gas, as well as for the dispersion relations describing the electrostatic oscillations in a dilute electronic plasma. In the latter case, a comparison with the experimental results strongly suggests a Tsallis distribution with the parameter q smaller than unity. A complementary study is related to the thermodynamic behavior of a relativistic imperfect simple fluid. Using nonequilibrium thermodynamics, we show how the basic primary variables, namely the energy-momentum tensor and the particle and entropy fluxes, depend on the several dissipative processes present in the fluid. The temperature variation law for this moving imperfect fluid is also obtained, and the Eckart and Landau-Lifshitz formulations are recovered as particular cases.
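
For orientation, the velocity q-distribution in the Tsallis framework is usually written as the power law below (a standard form in the nonextensive literature; A_q is a normalization constant, and the thesis's conventions may differ):

f_q(v) = A_q \left[ 1 - (1 - q)\,\frac{m v^2}{2 k_B T} \right]^{1/(1-q)},

which recovers the Maxwell-Boltzmann exponential as q \to 1, since \lim_{q \to 1} \left[ 1 - (1 - q)\,x \right]^{1/(1-q)} = e^{-x}.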

Relevance:

20.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Abstract:

One of the current challenges of Ubiquitous Computing is the development of complex applications, those that are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware, and it is necessary to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform which integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications via the definition of semantic workflows containing the abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan consists of a workflow instance containing activities that are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform also supports execution adaptation in case of service failures, user mobility and degradation of service quality. The validation of OpenCOPI is performed through the development of case studies, specifically applications for the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, comparing it with the benefits provided, and the efficiency of OpenCOPI's selection and adaptation mechanisms.

Relevance:

20.00%

Abstract:

Interval mathematics is a mathematical theory that originated in the 1960s with the goal of answering questions of accuracy and efficiency that arise in the practice of scientific computing and in the solution of numerical problems. Classical approaches to computability theory deal with discrete problems (for example, over the natural numbers, the integers, strings over a finite alphabet, graphs, etc.). However, fields of pure and applied mathematics deal with problems involving real and complex numbers. This happens, for example, in numerical analysis, dynamical systems, computational geometry and optimization theory. Thus, a computational approach to continuous problems is desirable, or even necessary, to deal formally with analog computations and scientific computing in general. In the literature there are different approaches to computability over the real numbers, but an important difference between these approaches lies in how the real number is represented. There are basically two lines of study of computability on the continuum. In the first one, an approximation of the output with arbitrary precision is computed from a reasonable approximation of the input [Bra95]. The other line of research on real computability was developed by Blum, Shub and Smale [BSS89]. In this approach, the so-called BSS machines, a real number is viewed as a single finished entity and the computable functions are generated from a class of basic functions (in a way similar to the partial recursive functions). In this dissertation we study the BSS model, used to characterize a theory of computability over the real numbers, and extend it to model computability on the space of real intervals. We thus present an approach to interval computability that is epistemologically different from the one studied by Bedregal and Acióly [Bed96, BA97a, BA97b], in which a real interval is viewed as the limit of rational intervals, and the computability of a real interval function depends on the computability of a function over the rational intervals.
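
As a minimal illustration of the interval arithmetic underlying this theory, the toy sketch below implements the classical endpoint formulas for interval sum and product. It is illustrative only: the dissertation works with an abstract BSS-style machine model over real intervals, not with floating-point code.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d]: min/max over all four endpoint products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

x = Interval(1.0, 2.0)
y = Interval(-3.0, 0.5)
print(x + y)   # Interval(lo=-2.0, hi=2.5)
print(x * y)   # Interval(lo=-6.0, hi=1.0)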

Relevance:

20.00%

Abstract:

The Nelson-Oppen combination method allows several decision procedures, each designed for a specific theory, to be combined to reason about more comprehensive theories, through the principle of equality propagation. Theorem provers based on this model benefit from its modular character and can evolve more easily, incrementally. Difference logic is a subtheory of linear arithmetic. It is formed by constraints of the form x − y ≤ c, where x and y are variables and c is a constant. Difference logic is very common in several problems, such as digital circuits, scheduling, temporal systems, etc., and is predominant in several other cases. Difference logic can also be modeled using graph theory, which allows several efficient and well-known graph algorithms to be used. A decision procedure for difference logic is capable of reasoning over thousands of constraints. The main goal of a decision procedure for difference logic is to report whether a set of difference logic constraints is satisfiable (the variables can take values that make the set consistent) or not. Moreover, to work in a Nelson-Oppen combination framework, the decision procedure needs other functionalities, such as generation of variable equalities, proofs of inconsistency, premises, etc. This work presents a decision procedure for the theory of difference logic within an architecture based on the Nelson-Oppen combination method. The work was carried out by integrating it into the haRVey prover, where it was possible to observe its operation. Implementation details and experimental tests are reported.
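
The graph encoding alluded to above is standard: each constraint x − y ≤ c becomes an edge y → x of weight c, and the set is satisfiable exactly when the graph has no negative-weight cycle, which Bellman-Ford detects. The sketch below is this textbook procedure, not necessarily the exact one integrated into haRVey.

def difference_logic_sat(variables, constraints):
    """constraints: list of (x, y, c) meaning x - y <= c.
    Returns True iff the constraint set is satisfiable."""
    # Edge y -> x with weight c; starting every distance at 0 simulates
    # a virtual source with zero-weight edges to all nodes.
    dist = {v: 0 for v in variables}
    edges = [(y, x, c) for (x, y, c) in constraints]

    # Relax |V| - 1 times, then one extra pass to detect a negative cycle.
    for _ in range(len(variables) - 1):
        for y, x, c in edges:
            if dist[y] + c < dist[x]:
                dist[x] = dist[y] + c
    return all(dist[y] + c >= dist[x] for y, x, c in edges)

# x - y <= 2, y - z <= -1, z - x <= -2: the cycle sums to -1 < 0, so unsat.
print(difference_logic_sat(
    ["x", "y", "z"],
    [("x", "y", 2), ("y", "z", -1), ("z", "x", -2)]))   # False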

Relevance:

20.00%

Abstract:

Intending to understand how the human mind operates, some philosophers and psychologists began to study rationality. Theories were built from those studies, and nowadays that interest has been extended to many other areas, such as computer engineering and computer science, but with a slight difference in its goal: to understand the mind's operational process and apply it to agent modelling, making possible implementations (in software or hardware) under the agent-oriented paradigm, where agents are able to deliberate their own plans of action. In computer science, the subarea of multiagent systems has progressed building on several works concerning artificial intelligence, computational logic, distributed systems, game theory and even philosophy and psychology. The present work shows how a logical formalisation extension can be obtained for a rational agent architecture model called BDI (based on Bratman's philosophical theory), in which agents are capable of deliberating actions from their beliefs, desires and intentions. The formalisation of this model is called BDI logic, and it is a modal logic (in general, a branching-time logic) with three accessibility relations: B, D and I. Here we show two possible extensions that transform BDI logic into a modal-fuzzy logic where the formulae and the accessibility relations can be evaluated by values from the interval [0,1].
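
To give the flavour of evaluating formulae over [0,1], the toy below uses the standard Zadeh min/max connectives on graded belief, desire and intention degrees. The actual modal-fuzzy semantics of the two extensions in the thesis (graded accessibility relations B, D and I) is richer than this, so everything here is an illustrative assumption.

# Degrees of truth in [0, 1] with Zadeh connectives.
def f_and(a, b): return min(a, b)
def f_or(a, b):  return max(a, b)
def f_not(a):    return 1.0 - a

# Graded attitudes of an agent towards the proposition "goal reachable".
belief, desire, intention = 0.9, 0.7, 0.6

# e.g. degree to which the agent both believes and intends it:
print(f_and(belief, intention))      # 0.6
print(f_or(f_not(belief), desire))   # max(0.1, 0.7) = 0.7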

Relevance:

20.00%

Abstract:

In this work the technique of Differential Cryptanalysis, introduced in 1990 by Biham and Shamir, is applied to the Papílio cryptosystem, developed by Karla Ramos, to test it and, most importantly, to prove the technique's relevance, as already shown for other block ciphers such as DES, Blowfish and FEAL-N(X). This technique is based on the analysis of differences between plaintexts and their respective ciphertexts, in search of patterns that assist in the discovery of the subkeys and, consequently, of the master key. These differences are obtained by XOR operations. Through this analysis, in addition to obtaining the patterns of Papílio, we seek to obtain its main characteristics and the behavior of Papílio throughout its 16 rounds, identifying and replacing, when necessary, factors that can be improved in accordance with its pre-established definitions, thus providing greater security in the use of the algorithm.
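
The central bookkeeping object of differential cryptanalysis is the difference distribution table of an S-box: for every input XOR difference it counts how often each output XOR difference occurs. The sketch below builds one for the small 4-bit teaching S-box from Heys' tutorial, not for Papílio's actual components.

# Difference distribution table for a toy 4-bit S-box.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

# ddt[dx][dy] = number of inputs x with SBOX[x] ^ SBOX[x ^ dx] == dy
ddt = [[0] * 16 for _ in range(16)]
for x in range(16):
    for dx in range(16):
        dy = SBOX[x] ^ SBOX[x ^ dx]
        ddt[dx][dy] += 1

# High-probability differentials (dx != 0) are the attacker's foothold.
best = max((ddt[dx][dy], dx, dy) for dx in range(1, 16) for dy in range(16))
print("best differential (count, dx, dy):", best)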

Relevance:

20.00%

Abstract:

Despite the emergence of other forms of artificial lift, sucker rod pumping systems remain predominant because of their flexibility of operation and lower investment cost compared to the other lifting techniques developed. A successful rod pumping design necessarily involves delivering the estimated flow rate while controlling the wear of the pumping equipment used in the mounted configuration. However, mediating these elements is particularly challenging, especially for designers new to this work, who still lack the experience needed to produce good pumping projects in time. Even with the various computer applications on the market intended to facilitate this task, they must face a gruelling process of trial and error until they reach the most appropriate combination of equipment for installation in the well. This thesis proposes the creation of an expert system for the design of sucker rod pumping systems. Its mission is to guide a petroleum engineer in the task of selecting a set of equipment appropriate to the context given by the characteristics of the oil that will be raised to the surface. Features such as the level of gas separation, the presence of corrosive elements, and the possibility of sand production and waxing are taken into account in selecting the pumping unit, the sucker-rod string and the subsurface pump, and their operation mode. The system approximates the inference process to human reasoning, which leads to results closer to those obtained by a specialist. To this end, its production rules were based on the theory of fuzzy sets, able to model the vague concepts typically present in human reasoning. The calculations of the operating parameters of the pumping system are made by the API RP 11L method. Based on the input information, the system is able to return to the user a set of pumping configurations that meet a given design flow rate, without subjecting the selected equipment to an effort beyond what it can bear.
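
As a toy of the kind of fuzzy production rule involved, the sketch below fires one Mamdani-style rule with triangular membership functions. The variable names, membership shapes and thresholds are invented for exposition and are not taken from the thesis.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical rule: IF sand_content is high AND gas_fraction is high
# THEN preference for a sand-tolerant pump configuration is strong.
sand_ppm, gas_frac = 800.0, 0.35
sand_high = tri(sand_ppm, 200.0, 1000.0, 1800.0)   # 0.75
gas_high  = tri(gas_frac, 0.1, 0.5, 0.9)           # 0.625
rule_strength = min(sand_high, gas_high)           # Mamdani-style AND
print(f"rule activation: {rule_strength:.2f}")     # 0.62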

Relevance:

20.00%

Abstract:

The process of choosing the best components to build systems has become increasingly complex. It becomes more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances occur mainly when dealing with systems involving critical requirements, such as the timing constraints in distributed multimedia systems, the network bandwidth in mobile applications, or the reliability in real-time systems. This work proposes a process of dynamic selection of architectural configurations based on the non-functional requirements of the system, which can be used during a dynamic adaptation. This proposal uses MAUT (Multi-Attribute Utility Theory) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel is proposed which can be used to describe the application's requirements in terms of non-functional criteria and their expected values, expressing them in order to select the desired configuration. As a proof of concept, a module that performs the dynamic choice of configurations, MoSAC, was implemented. This module was built using a component-based development (CBD) approach and performs the selection of architectural configurations through the proposed multi-criteria selection process. This work also presents a case study in which an application was developed in the context of Digital TV to evaluate the time the module takes to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.
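
A common instantiation of MAUT is the weighted additive utility U(a) = Σᵢ wᵢ·uᵢ(a) over criteria normalized to [0,1], sketched below. The criteria names and weights are made up, and MoSAC's actual scoring function may differ.

# Additive multi-attribute utility: weights sum to 1, scores lie in [0, 1].
weights = {"latency": 0.5, "bandwidth": 0.3, "reliability": 0.2}

# Normalized scores u_i(a) for each candidate architectural configuration.
configs = {
    "configA": {"latency": 0.9, "bandwidth": 0.4, "reliability": 0.7},
    "configB": {"latency": 0.6, "bandwidth": 0.9, "reliability": 0.8},
}

def utility(scores):
    return sum(weights[c] * scores[c] for c in weights)

best = max(configs, key=lambda name: utility(configs[name]))
print(best, utility(configs[best]))   # configB 0.73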

Relevance:

20.00%

Abstract:

The course of Algorithms and Programming proves to be a real obstacle for many students of computing degrees. Students who are not familiar with the new ways of thinking the course requires, and who lack certain skills it demands, encounter difficulties that sometimes result in course repetition and dropout. Faced with this problem, a survey of the difficulties experienced by students was conducted as a way to understand the problem and to guide solutions that try to solve or ease those difficulties. This paper describes a methodology to be applied in the classroom based on the concepts of David Ausubel's Meaningful Learning. In addition to this theory, a tool developed at UFRN, named Takkou, was used with the intent of better motivating students in algorithms classes and exercising logical reasoning. Finally, a comparative evaluation of the suggested methodology against the traditional methodology was carried out, and the results were discussed.

Relevance:

20.00%

Abstract:

Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the process of verification and validation of a system. Model-Based Testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an example of a B specification from industry. Based on this case study, we obtained subsidies to improve it. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. Besides, we implemented a tool to automate the method and applied it to more complex case studies.
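
To make the partitioning techniques above concrete, the sketch below derives classic boundary values from an invariant-style range constraint. It is a generic illustration of boundary value analysis, not the B-specific generation algorithm of the method.

def boundary_values(lo, hi):
    """Classic boundary value analysis for an invariant lo <= x <= hi:
    the boundaries, their inner neighbours, and one invalid value each side."""
    valid   = [lo, lo + 1, hi - 1, hi]   # positive test inputs
    invalid = [lo - 1, hi + 1]           # negative test inputs
    return valid, invalid

# Invariant fragment: 0 <= counter <= 100
valid, invalid = boundary_values(0, 100)
print("positive cases:", valid)     # [0, 1, 99, 100]
print("negative cases:", invalid)   # [-1, 101]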