998 results for TEORIA DA ARQUITETURA


Relevance:

20.00%

Publisher:

Abstract:

This work presents an extension of the haRVey prover aimed at the verification of proof obligations arising from the B method. The B method of software development covers the specification, design, and implementation phases of the software life cycle. In the verification context, the proof tools Prioni, Z/EVES, and Atelier-B/Click'n'Prove stand out. They support formalisms for checking the satisfiability of formulas of axiomatic set theory and can therefore be applied to the B method. SMT checking consists of checking the satisfiability of quantifier-free first-order formulas with respect to a decidable theory. The SMT-checking approach implemented by the automated theorem prover haRVey is presented; it adopts the theory of arrays, which cannot express all the constructions needed by set-based specifications. To extend SMT checking to set theories, the Zermelo-Fraenkel (ZFC) and von Neumann-Bernays-Gödel (NBG) set theories stand out. Since the SMT-checking approach implemented in haRVey requires a theory with a finite axiomatization and can be extended to undecidable theories, NBG is a suitable option for extending haRVey's deductive power to set theory. Thus, by mapping the set operators provided by the B language to classes of the NBG theory, an alternative SMT-checking approach applicable to the B method is obtained.
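
A minimal, hedged illustration of the SMT-checking step described above - deciding the satisfiability of a quantifier-free formula with respect to a decidable theory - using the Z3 solver's Python API rather than haRVey, and without the B/NBG mapping itself; the variables and constraints are illustrative only.

from z3 import Ints, Solver, sat

# Quantifier-free constraints over linear integer arithmetic, a decidable theory.
x, y, z = Ints("x y z")
s = Solver()
s.add(x + y == 10, x - z <= 3, y > z)

if s.check() == sat:
    print("satisfiable, e.g.:", s.model())
else:
    print("unsatisfiable")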

Relevance:

20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico

Relevance:

20.00%

Publisher:

Abstract:

The Nelson-Oppen combination method allows several decision procedures, each designed for a specific theory, to be combined in order to reason over richer theories, through the principle of equality propagation. Theorem provers based on this model benefit from its modular character and can evolve more easily and incrementally. Difference logic is a fragment of linear arithmetic formed by constraints of the form x − y ≤ c, where x and y are variables and c is a constant. Difference logic is very common in many problems, such as digital circuits, scheduling, and timed systems, and is predominant in several other cases. It can also be modeled using graph theory, which allows many efficient, well-known graph algorithms to be used. A decision procedure for difference logic must be able to reason over thousands of constraints. Its main goal is to determine whether a set of difference-logic constraints is satisfiable, that is, whether the variables can take values that make the set consistent. Moreover, to work within a Nelson-Oppen combination framework, the decision procedure needs additional functionality, such as generating equalities between variables, producing proofs of inconsistency, tracking premises, and so on. This work presents a decision procedure for the theory of difference logic within an architecture based on the Nelson-Oppen combination method. The procedure was integrated into the haRVey prover, where its operation could be observed. Implementation details and experimental tests are reported.
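
The graph-theoretic view mentioned above can be made concrete: a set of constraints x − y ≤ c is satisfiable exactly when the constraint graph (one edge from y to x with weight c per constraint) has no negative-weight cycle, which Bellman-Ford-style relaxation detects, and the resulting shortest distances themselves form a satisfying assignment. A hedged Python sketch of that idea (not the haRVey implementation):

def diff_logic_sat(constraints):
    """constraints: list of (x, y, c), meaning x - y <= c. Returns a model dict or None."""
    variables = {v for x, y, _ in constraints for v in (x, y)}
    # Edge y -> x with weight c for each constraint; a virtual source reaches every variable.
    dist = {v: 0 for v in variables}
    for _ in range(len(variables)):
        changed = False
        for x, y, c in constraints:
            if dist[y] + c < dist[x]:
                dist[x] = dist[y] + c
                changed = True
        if not changed:
            return dist          # distances stabilized: they satisfy every constraint
    for x, y, c in constraints:  # still relaxable after |V| rounds: negative cycle
        if dist[y] + c < dist[x]:
            return None
    return dist

# Example: x - y <= 2, y - z <= -1, z - x <= -2 is unsatisfiable (cycle of weight -1).
print(diff_logic_sat([("x", "y", 2), ("y", "z", -1), ("z", "x", -2)]))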

Relevance:

20.00%

Publisher:

Abstract:

It is increasingly common for a single computational system to be used through different devices - personal computers, mobile phones and others - and software platforms - graphical user interface systems, Web systems and others. Depending on the technologies involved, different software architectures may be employed. Web systems, for example, typically use a client-server architecture, usually extended into three tiers, while systems with graphical interfaces commonly adopt the MVC style. The use of architectures with different styles hinders the interoperability of systems spanning multiple platforms. A further complication is that the user interface often has a different structure, appearance and behaviour on each device, which leads to low usability. Finally, building user interfaces specific to each device, with distinct features and technologies, is work that must be done individually and does not scale. This study addresses some of these problems by presenting a platform-independent reference architecture that allows the user interface to be built from an abstract specification described in a user interface specification language, MML. This solution is designed to offer greater interoperability between different platforms, greater consistency between the user interfaces, and greater flexibility and scalability for the incorporation of new devices.
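
A toy Python sketch of the underlying idea only: one abstract, platform-independent UI description rendered by platform-specific back ends. The dictionary format and renderer names below are hypothetical illustrations, not MML syntax and not the reference architecture described above.

abstract_ui = {
    "title": "Login",
    "widgets": [
        {"kind": "text_input", "id": "user", "label": "User"},
        {"kind": "text_input", "id": "pass", "label": "Password", "secret": True},
        {"kind": "button", "id": "ok", "label": "Sign in"},
    ],
}

def render_html(ui):
    # Web back end: same abstract description, HTML widgets.
    rows = [f'<label>{w["label"]}</label><input>' if w["kind"] == "text_input"
            else f'<button>{w["label"]}</button>' for w in ui["widgets"]]
    return f'<h1>{ui["title"]}</h1>\n' + "\n".join(rows)

def render_text_console(ui):
    # Console back end: same abstract description, text widgets.
    return "\n".join(f'[{w["kind"]}] {w["label"]}' for w in ui["widgets"])

print(render_html(abstract_ui))
print(render_text_console(abstract_ui))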

Relevance:

20.00%

Publisher:

Abstract:

Intending to understand how the human mind operates, philosophers and psychologists began to study rationality. Theories were built from those studies, and nowadays that interest has been extended to many other areas, such as computer engineering and computer science, with a slightly different goal: to understand the mind's operational process and apply it to the modelling of agents, making it possible to implement software or hardware under the agent-oriented paradigm, in which agents are able to deliberate their own plans of action. In computer science, the sub-area of multi-agent systems has progressed through work on artificial intelligence, computational logic, distributed systems, game theory and even philosophy and psychology. The present work shows how to obtain a logical formalisation extension of a rational-agent architecture model called BDI (based on Bratman's philosophical theory), in which agents are capable of deliberating actions from their beliefs, desires and intentions. The formalisation of this model is called BDI logic; it is a modal logic (in general, a branching-time logic) with three accessibility relations: B, D and I. Two possible extensions are presented that transform BDI logic into a modal-fuzzy logic, in which formulae and accessibility relations can be evaluated by values from the interval [0,1].
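
As a hedged sketch of what such a fuzzification can look like (one standard fuzzy-Kripke reading, not necessarily the exact semantics adopted in the work): with a graded accessibility relation B : W x W -> [0,1] and a graded valuation v(phi, w) in [0,1], the belief modality can be evaluated as

v(B\varphi, w) = \inf_{w' \in W} \max\bigl(1 - B(w, w'),\; v(\varphi, w')\bigr)

so a formula is believed to the degree that it holds in every world accessible to a non-negligible degree; the classical two-valued semantics is recovered when B and v take only the values 0 and 1.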

Relevance:

20.00%

Publisher:

Abstract:

Middleware platforms have been widely used as an underlying infrastructure for the development of distributed applications. They provide distribution and heterogeneity transparency and a set of services that ease the construction of distributed applications. Nowadays, middleware platforms accommodate an increasing variety of requirements to satisfy distinct application domains. This broad range of application requirements increases the complexity of the middleware, due to the introduction of many crosscutting concerns in the architecture, which are not properly modularized by traditional programming techniques, resulting in tangling and scattering of these concerns in the middleware code. The presence of these crosscutting concerns limits middleware scalability, and the aspect-oriented paradigm has been used successfully to improve the modularity, extensibility and customization capabilities of middleware. This work presents AO-OiL, an aspect-oriented (AO) middleware architecture based on the AO middleware reference architecture. This middleware follows the philosophy that middleware functionality must be driven by the application requirements. AO-OiL consists of an AO refactoring of the OiL (Orb in Lua) middleware in order to separate basic and crosscutting concerns. The proposed architecture was implemented in Lua and RE-AspectLua. To evaluate the impact of the refactoring on the middleware architecture, this work presents a comparative performance analysis between AO-OiL and OiL.
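
A hedged Python sketch of the idea behind such a refactoring (AO-OiL itself is written in Lua with RE-AspectLua): a crosscutting concern such as call timing/logging is kept in one aspect and woven around the base code instead of being tangled into every method. The class and function names below are hypothetical.

import functools
import time

def logging_aspect(func):
    # "Advice" executed around every call to the decorated join point.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper

class Orb:
    # The base middleware concern stays free of logging code.
    @logging_aspect
    def invoke(self, obj, method, *args):
        return f"invoked {method} on {obj}"

print(Orb().invoke("remote_counter", "increment", 1))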

Relevance:

20.00%

Publisher:

Abstract:

Nowadays many electronic devices support digital video; examples include cellphones, digital cameras, video cameras and digital televisions. However, raw videos contain a huge amount of data - millions of bits - when represented as captured. Storing them in this primary form would require a huge amount of disk space, and transmitting them would require a huge bandwidth. Video compression therefore becomes essential to make information storage and transmission possible. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution videos according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the adopted data reuse scheme reduces the bandwidth required to execute motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final video coder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System.
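
For reference, the core operation such a module accelerates is block matching. Below is a hedged software sketch of full-search block matching with the sum of absolute differences (SAD) criterion; the block size and search range are illustrative and are not the parameters of the hardware architecture described above.

import numpy as np

def full_search(ref, cur, bx, by, block=8, search=4):
    # Best motion vector for the block of 'cur' whose top-left corner is (by, bx).
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = np.abs(target - cand).sum()
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(1, 2), axis=(0, 1))   # current frame = reference shifted down 1, right 2
print(full_search(ref, cur, bx=16, by=16))      # expect motion vector (-1, -2) with SAD 0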

Relevance:

20.00%

Publisher:

Abstract:

There is a need for multi-agent system designers to determine the quality of systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, embodied-agent researchers have aimed to create agents capable of affective, natural interaction with users that produces a beneficial or desirable result. As a consequence, several studies proposing agent architectures with emotions have appeared without appropriate methods for assessing these architectures. The objective of this study is to propose a methodology for evaluating emotional agent architectures, which assesses the quality attributes of the architectural design and, through human-computer interaction evaluation, the effects on the subjective experience of users of applications that implement them. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes assessed are extensibility, modularity and complexity. In assessing the effects on users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, as well as average time and average number of attempts. We experimented with this approach and evaluated five emotional agent architectures: BDIE, DETT, Camurra-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the Minesweeper game and evaluated for human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In the assessment of the users' subjective experience, the differences between the architectures were insignificant.

Relevance:

20.00%

Publisher:

Abstract:

Traceability between the models of the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which allows the correspondence between these two activities together with variability management. To address this issue, this work presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated through a case study: the GingaForAll SPL. To automate the transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using MDD (Model-Driven Development) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodels, together with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated with respect to quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability and maintainability, extracted from the CAFÉ Quality Model.
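
A toy Python sketch of what a single mapping rule of this kind does: turning one goal-model element into an architectural component while carrying its variability information across the transformation. The real work expresses such rules in ATL over Ecore metamodels; the element and attribute names below are hypothetical, not PL-AOVgraph or PL-AspectualACME syntax.

def map_goal_to_component(goal):
    # Transform one goal-model element into an architectural component description.
    return {
        "component": goal["name"],
        "ports": [f"provide_{task}" for task in goal.get("tasks", [])],
        "variability": goal.get("variability", "mandatory"),  # preserved across the mapping
    }

goal = {"name": "PlayMedia", "tasks": ["decode", "render"], "variability": "optional"}
print(map_goal_to_component(goal))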

Relevance:

20.00%

Publisher:

Abstract:

In everyday life we constantly perform two frequent and very important actions: classifying (sorting into classes) and making decisions. When we face problems of relatively high complexity, we tend to seek other opinions, usually from people who have some knowledge of the problem domain or who are, as far as possible, experts in it, in order to help us in the decision-making process. In both the classification process and the decision-making process, we are guided by the characteristics of the specific problem. The characterization of a set of objects is part of decision making in general. In Machine Learning, this classification happens through a learning algorithm, and the characterization is applied to databases. Classification algorithms can be employed individually or in committees of machines. Choosing the best methods to use in the construction of a committee is a very arduous task. In this work, meta-learning techniques are investigated for selecting the best configuration parameters of homogeneous committees for various classification problems. These parameters are: the base classifier, the architecture and the size of this architecture. We investigated nine candidate inducers for the base classifier, two architecture generation methods and nine committee sizes. Dimensionality reduction techniques were applied to the meta-databases in search of improvements. Five classification methods are investigated as meta-learners in the process of choosing the best parameters of a homogeneous committee.
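
A hedged illustration of the three committee parameters named above (base classifier, architecture generation method and committee size) using scikit-learn, where bagging plays the role of one possible generation method; the parameter values are illustrative, not the configurations studied in the work.

from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
committee = BaggingClassifier(
    estimator=DecisionTreeClassifier(),  # base classifier ("base_estimator" in older scikit-learn)
    n_estimators=25,                     # committee size
)                                        # bagging itself is the architecture generation method
print(cross_val_score(committee, X, y, cv=5).mean())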

Relevance:

20.00%

Publisher:

Abstract:

The increasing complexity of integrated circuits has boosted the development of communication architectures such as Networks-on-Chip (NoCs) as an architectural alternative for the interconnection of Systems-on-Chip (SoCs). Networks-on-Chip favor component reuse, parallelism and scalability, enhancing reusability in projects of dedicated applications. In the literature, many proposals have been made, suggesting different configurations for network-on-chip architectures. Among the networks-on-chip considered, the IPNoSys architecture is a non-conventional one, since it allows operations to be executed while the communication process is performed. This study aims to evaluate the execution of dataflow-based applications on IPNoSys, focusing on their adaptation to the design constraints. Dataflow-based applications are characterized by a continuous stream of data on which operations are executed. We expect this type of application to benefit from running on IPNoSys, because its programming model is similar to the execution model of this network. By observing the behavior of these applications when running on IPNoSys, changes were made to the execution model of the IPNoSys network, allowing the implementation of instruction-level parallelism. For these purposes, the implementations of dataflow applications were analyzed and compared.
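
A toy Python sketch of the dataflow execution model referred to above (not the IPNoSys architecture itself): every operation fires as soon as all of its operands have arrived, and operations whose operands are ready at the same time are independent and could run in parallel. The graph and names are hypothetical.

def run_dataflow(nodes, inputs):
    # nodes: {name: (operation, [operand names])}; inputs: initial operand values.
    values = dict(inputs)
    pending = dict(nodes)
    while pending:
        ready = [n for n, (_, deps) in pending.items() if all(d in values for d in deps)]
        if not ready:
            raise RuntimeError("deadlock: remaining nodes have unsatisfied operands")
        for n in ready:  # these firings are mutually independent -> potential parallelism
            op, deps = pending.pop(n)
            values[n] = op(*(values[d] for d in deps))
    return values

# (a + b) * (a - c): the two inner operations can fire in parallel.
graph = {
    "sum":  (lambda x, y: x + y, ["a", "b"]),
    "diff": (lambda x, y: x - y, ["a", "c"]),
    "prod": (lambda x, y: x * y, ["sum", "diff"]),
}
print(run_dataflow(graph, {"a": 6, "b": 2, "c": 1}))  # prod = (6 + 2) * (6 - 1) = 40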

Relevance:

20.00%

Publisher:

Abstract:

This study aims to analyze how hotel managers perceive the relationship between chaos theory and tourism, to describe how tourism management is thought out and conceived, to show the chaos-and-complexity ("chaoplexic") characteristics of tourism and of its management, and to identify the elements of the strategic pattern of management and planning from the point of view of lodging facility managers. Data were obtained through intensive direct observation and interviews with hoteliers in the city of Natal/RN. It finds that the theories of change and chaos provide a view of the set of relations, aspects and implications of tourism, based on environmental and market uncertainty and instability. It shows that most models for analyzing, planning and managing tourism are still limited because of their linear, one-dimensional character. It stresses that the goal of chaos theory is to explain phenomena, not to make predictions, since the long-term future cannot be predicted. It treats management strategies as generators of new sources of uncertainty, since reality cannot be completely represented by models. Being aware of these facts means, however, considering the field of multiple possibilities and opportunities of change and chaos in the management of tourism development. These ideas were applied to the analysis of tourism management in the locality investigated. The questions raised in the study were based on bibliographic research grounded mainly on authors such as Beni, Prahalad, Mintzberg, Ritto, Senge and Hamel. It concludes that, in the knowledge era, organizations that remain tied to the mechanistic paradigm have difficulty keeping up with market changes; since hospitality is a service industry with complex, adaptive organizational characteristics, it is essential that hotel enterprises operate as interactive systems that follow the changes of such a turbulent and unstable market, finding in chaos and complexity theory one of the possibilities for understanding reality outside the mechanistic model.

Relevance:

20.00%

Publisher:

Abstract:

Social Representation Theory provides the means to scientifically analyze what is called common sense, suggesting that the individual's thought be given credibility by recognizing the logic behind it. Representations allow us to interpret, understand, explain and thus classify information, events and people. In this sense, this study aimed to analyze how the social representations of itinerant traders who work in Ponta Negra/RN can be used as an element for tourism planning. To achieve the desired goals, we conducted a qualitative, descriptive study, using as data collection methods bibliographic research, the free word association technique and a questionnaire, applied to 90 itinerant traders who work on Ponta Negra/RN beach. For data analysis, literature analysis and the EVOC and SPHINX software packages were used. The research revealed a predominance of male itinerant traders between 18 and 28 years old, with incomplete primary education, not contributing to Previdência Social and working seven days a week. The core elements of the representations show that their knowledge is guided by knowledge collectively shared in the culture of tourism, which is seen as something that brings economic benefits (money) to society through travel and entertainment. Planning represents forward thinking, based on development plans that seek improvement and organization. In the structure and operation of tourism planning in Natal/RN, itinerant traders had no representation. It is concluded that understanding the needs of itinerant traders provides support for developing strategies for tourism development. This is achieved by including them in tourism planning, since it enables tourism managers to understand how they perceive, interpret and act on their surrounding reality, given that these representations are fundamental in forming opinions and establishing individual and collective attitudes. Thus, it is an important theory to support social research with individuals who live the local reality and needs but who remain on the margins of economic decision-making processes in Brazil.

Relevance:

20.00%

Publisher:

Abstract:

In this work, two distinct procedures are used to study some complex systems. In the first part, we study the use of the wavelet transform in the analysis and characterization of (multi)fractal time series. We test the reliability of the Wavelet Transform Modulus Maxima (WTMM) method with respect to the multifractal formalism by calculating the singularity spectrum of time series whose fractality is well known a priori. Next, we use the WTMM method to study the fractality of lung crackle sounds, a biological time series. Since crackle sounds are produced by the opening of initially closed pulmonary airways (bronchi, bronchioles and alveoli), we can obtain information on the cascade of airway openings across the whole lung. As this phenomenon is associated with the architecture of the pulmonary tree, which displays fractal geometry, the analysis and fractal characterization of this sound may provide important parameters for comparing healthy lungs with lungs affected by disorders that change the geometry of the lung tree, such as obstructive and degenerative parenchymal diseases, as occurs, for example, in pulmonary emphysema. In the second part, we study a site percolation model on square lattices, where the percolating cluster grows governed by a control rule corresponding to an automatic search method. In this percolation model, which has characteristics of self-organized criticality, the automatic search does not rely on Leath's algorithm. It uses the following control rule: p_{t+1} = p_t + k(R_c − R_t), where p is the percolation probability, k is a kinetic parameter with 0 < k < 1, and R is the fraction of percolating finite square lattices of side L (L x L). This rule provides a time series corresponding to the dynamical evolution of the system, in particular of the percolation probability p. We then analyze the scaling of the signal obtained in this way. The model enables the study of the automatic search method used for site percolation on square lattices, evaluating the dynamics of its parameters as the system approaches the critical point. It shows that the time elapsed until the system reaches the critical point and t_cor, the time required for the system to lose its correlations, are both inversely proportional to k, the kinetic parameter of the control rule. We also verify that the system has two different time scales: one in which it shows 1/f noise, indicating that it is strongly correlated, and another in which it shows white noise, indicating that the correlations have been lost. Over long time intervals, the dynamics of the system shows ergodicity.
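
A minimal Python sketch of the control rule quoted above, p_{t+1} = p_t + k(R_c − R_t), with R_t estimated as the fraction of random L x L lattices that span from top to bottom at probability p_t; the lattice size, k, R_c and sample counts are illustrative, not the values used in the work.

import numpy as np
from scipy.ndimage import label

def spans(lattice):
    # True if some occupied cluster connects the top row to the bottom row.
    labels, _ = label(lattice)
    return bool(set(labels[0][labels[0] > 0]) & set(labels[-1][labels[-1] > 0]))

def run_search(L=32, k=0.3, Rc=0.5, steps=200, samples=20, seed=0):
    rng = np.random.default_rng(seed)
    p, trajectory = 0.2, []
    for _ in range(steps):
        Rt = np.mean([spans(rng.random((L, L)) < p) for _ in range(samples)])
        p = min(max(p + k * (Rc - Rt), 0.0), 1.0)   # control rule drives p toward criticality
        trajectory.append(p)
    return trajectory

print("final p:", run_search()[-1])   # should settle near the effective threshold of the finite lattice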

Relevance:

20.00%

Publisher:

Abstract:

In this work we study a new risk model, proposed by Yang (2003), for a firm that is sensitive to its credit quality. Recursive equations are obtained for the finite-time ruin probability and for the distribution of the time of ruin, and systems of Volterra-type integral equations are obtained for the ultimate ruin probability, the severity of ruin, and the distribution of the surplus before and after ruin.
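
For orientation only, in the classical Cramér-Lundberg model (not Yang's credit-quality-sensitive model, whose equations the work derives) the ultimate ruin probability \psi(u) satisfies a Volterra-type integral equation of the kind referred to above:

\psi(u) = \frac{\lambda}{c} \int_0^u \psi(u - x)\,(1 - F(x))\,dx + \frac{\lambda}{c} \int_u^\infty (1 - F(x))\,dx

where u is the initial surplus, c the premium rate, \lambda the claim arrival intensity and F the claim-size distribution.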