644 results for CNPQ::CIENCIAS EXATAS E DA TERRA::QUIMICA


Abstract:

As hardware and software technologies advance, the development models of computational systems also change. New methodologies for user interface specification are being created around user interface description languages (UIDLs). A UIDL provides a precise description at a higher level of abstraction, independent of how the interface will be implemented. A major problem is that, even with these methodologies, there is still a large gap between a UIDL and its design, that is, between the abstract and the concrete. The BRIDGE tool (Interface Design Generator Environment) was created to be a bridge between a specification language (the Interactive Message Modeling Language, IMML) and its implementation in Java, linking the abstract (specification) to the concrete (implementation). IMML is a model-based language that allows the designer to work at distinct abstraction levels, each model corresponding to one level. IMML is an XML language grounded in Semiotic Engineering, which treats the computational system, its user interface and its elements as a metacommunicative artifact: these elements must transmit a message to the user about which task is to be performed and how to reach that goal. With BRIDGE, we intend to provide extensive support for the design task, the most important being user interface prototyping. BRIDGE makes design easier and more intuitive, starting from an interface specification language.
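
A minimal sketch of the kind of abstract-to-concrete step such a generator automates, assuming a hypothetical specification element named "command" with a "label" attribute; the element, the attribute and the class below are illustrative and are not the actual IMML vocabulary or the BRIDGE code:

```java
import javax.swing.JButton;
import javax.swing.JFrame;

// Illustrative only: maps a hypothetical "command" specification element
// (with a "label" attribute) to a concrete Swing widget, the kind of
// abstract-to-concrete mapping a generator tool performs.
public class CommandToSwing {

    // Tiny stand-in for a parsed specification element (hypothetical).
    record SpecElement(String name, String label) {}

    static JButton generate(SpecElement element) {
        // The concrete widget carries the label declared in the abstract spec.
        return new JButton(element.label());
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("Generated UI");
        frame.add(generate(new SpecElement("command", "Save document")));
        frame.setSize(240, 120);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}
```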

Abstract:

Formal methods should be used to specify and verify on-card software in Java Card applications. Furthermore, the Java Card programming style requires runtime verification of all input conditions of all on-card methods, the main goal being to preserve the data stored in the card. Design by Contract, and in particular the JML language, is an option for this kind of development and verification, since runtime verification is part of the Design by Contract method as implemented by JML. However, JML and its currently available runtime verification tools were not designed with Java Card limitations in mind and are not Java Card compliant. In this thesis, we analyze how much of this situation is really intrinsic to Java Card limitations and how much is simply a matter of a complete redesign of JML and its tools. We propose the requirements for a new, Java Card compliant language and indicate the lines along which a compiler for this language should be built. JCML strips from JML the aspects that do not apply to Java Card, such as concurrency and unsupported types. This would not be enough, however, without a great effort to optimize the verification code generated by its compiler, since this verification code must run on the card. The JCML compiler, although much more restricted than the one for JML, is able to generate Java Card compliant verification code for some lightweight specifications. As a conclusion, we present a Java Card compliant variant of JML, named JCML (Java Card Modeling Language), together with a preliminary version of its compiler.
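
A small sketch of the Design by Contract style involved, using JML-style requires/ensures annotations on a toy debit method together with a hand-written precondition check of the sort a contract compiler has to emit for the card; the class, the method and the shape of the generated check are assumptions for illustration, not actual JCML output:

```java
// Illustrative purse method annotated in the JML style; the runtime check
// below mimics what a contract compiler would have to generate on-card.
public class Purse {

    private short balance;

    //@ requires amount > 0 && amount <= balance;
    //@ ensures  balance == \old(balance) - amount;
    public void debit(short amount) {
        // Precondition check written as plain, Java Card friendly code
        // (no string building, no heavy runtime objects).
        if (!(amount > 0 && amount <= balance)) {
            // In a real applet this would be ISOException.throwIt(...);
            // a plain runtime exception keeps this sketch self-contained.
            throw new IllegalArgumentException();
        }
        balance -= amount;
    }

    public short getBalance() {
        return balance;
    }
}
```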

Abstract:

Because exact solution of combinatorial optimization problems is very difficult, several heuristic methods have been developed, and for many years the performance of these approaches was not analyzed in a systematic way. The proposal of this work is to carry out a statistical analysis of heuristic approaches to the Traveling Salesman Problem (TSP). The focus of the analysis is to evaluate the performance of each approach with respect to the computational time needed to reach the optimal solution of a given TSP instance. Survival analysis was used, together with hypothesis tests for the equality of survival functions. The evaluated approaches were divided into three classes: Lin-Kernighan algorithms, evolutionary algorithms and Particle Swarm Optimization. In addition to those approaches, the analysis includes a memetic algorithm (for symmetric and asymmetric TSP instances) that uses the Lin-Kernighan heuristic as its local search procedure.
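
The survival function in this kind of analysis is commonly estimated with the Kaplan-Meier product-limit estimator; the sketch below computes it from a list of run times, treating runs that never reached the optimum within the time limit as censored. This is a generic illustration of the statistical tool under those assumptions, not the experimental code of the thesis:

```java
import java.util.Arrays;
import java.util.Comparator;

// Generic Kaplan-Meier product-limit estimator: S(t) is the estimated
// probability that a run has NOT yet reached the optimum by time t.
public class KaplanMeier {

    record Run(double time, boolean reachedOptimum) {}

    static void printSurvivalCurve(Run[] runs) {
        Run[] sorted = runs.clone();
        Arrays.sort(sorted, Comparator.comparingDouble(Run::time));

        double survival = 1.0;
        int atRisk = sorted.length;
        for (Run run : sorted) {
            if (run.reachedOptimum()) {
                // One "event" (optimum reached) among the runs still at risk.
                survival *= 1.0 - 1.0 / atRisk;
                System.out.printf("t = %8.2f s  S(t) = %.3f%n", run.time(), survival);
            }
            // Censored runs (time limit hit) only leave the risk set.
            atRisk--;
        }
    }

    public static void main(String[] args) {
        printSurvivalCurve(new Run[] {
            new Run(1.2, true), new Run(2.5, true), new Run(3.0, false),
            new Run(4.7, true), new Run(9.9, false)
        });
    }
}
```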

Abstract:

Some individual techniques of supervised Machine Learning (ML), also known as classifiers or classification algorithms, provide solutions that are, most of the time, considered efficient. However, experimental results obtained with large pattern sets, or with data containing a significant amount of irrelevant or incomplete features, show a decrease in the accuracy of these techniques; in other words, such techniques cannot recognize patterns efficiently in complex problems. With the intention of improving the performance and efficiency of these ML techniques, the idea arose of making several ML algorithms work jointly, which gave origin to the term Multi-Classifier System (MCS). An MCS has as components different ML algorithms, called base classifiers, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must exhibit a certain diversity, in other words, a difference between the results obtained by each classifier that composes the system; there is no point in building an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a constant search to improve the results obtained by this type of system. Aiming at this improvement, at more consistent results and at a larger diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance that a certain classifier assigned when associating each pattern with a given class, and they are used, together with the outputs of the classifiers, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, which can be divided into two categories: static weights and dynamic weights. The first category is characterized by weights whose values do not change during the classification process, unlike the second category, in which the values are modified during classification. In this work, an analysis is carried out to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, the diversity obtained by the MCSs is analyzed, in order to verify whether there is any relation between the use of weights in MCSs and different levels of diversity.
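
A minimal sketch of the combination step described above, assuming static per-classifier weights and a simple weighted majority vote; the classifier interface, the toy classifiers and the weight values are illustrative, not the specific combination rules studied in the thesis:

```java
import java.util.HashMap;
import java.util.Map;

// Weighted majority vote over the outputs of the base classifiers.
// Static weights: fixed before the classification process starts.
public class WeightedVote {

    interface Classifier {
        String classify(double[] pattern);
    }

    static String combine(Classifier[] classifiers, double[] weights, double[] pattern) {
        Map<String, Double> score = new HashMap<>();
        for (int i = 0; i < classifiers.length; i++) {
            String label = classifiers[i].classify(pattern);
            // Each base classifier contributes its weight to the class it predicts.
            score.merge(label, weights[i], Double::sum);
        }
        return score.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .orElseThrow()
                .getKey();
    }

    public static void main(String[] args) {
        Classifier[] base = {
            p -> "A",           // toy classifiers with fixed answers,
            p -> "B",           // just to exercise the combination rule
            p -> "B"
        };
        double[] weights = {0.6, 0.25, 0.15};   // static weights (illustrative)
        // Prints "A": the heavily weighted classifier outvotes the other two.
        System.out.println(combine(base, weights, new double[] {1.0, 2.0}));
    }
}
```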

Abstract:

Java Card technology allows the development and execution of small applications embedded in smart cards. A Java Card application is composed of an external card client and an on-card application that implements the services available to the client through an Application Programming Interface (API). Usually, these applications manipulate and store important information, such as money and confidential data of their owners. Thus, it is necessary to adopt rigor when developing a smart card application in order to improve its quality and trustworthiness. The use of formal methods in the development of these applications is a way to reach these quality requirements. The B method is one of many formal methods for system specification. Development in B starts with the functional specification of the system, continues with the application of optional refinements to the specification and, from the last refinement level, makes it possible to generate code in some programming language. The B formalism has good tool support, and its application to Java Card is adequate, since the specification and development of APIs is one of the major applications of B. The BSmart method proposed here aims to promote the rigorous development of Java Card applications up to the generation of their code, based on the refinement of a formal specification written in the B notation. This development is supported by the BSmart tool, composed of programs that automate each stage of the method, and by a library of B modules and Java Card classes that model primitive types, essential Java Card API classes and reusable data structures.
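
For reference, the on-card code that such a refinement chain ultimately has to produce is a Java Card applet with the general shape sketched below, using the standard javacard.framework API; the class name and the instruction byte are illustrative and unrelated to any particular BSmart-generated code:

```java
import javacard.framework.APDU;
import javacard.framework.Applet;
import javacard.framework.ISO7816;
import javacard.framework.ISOException;

// Skeleton of an on-card applet: the target shape for generated Java Card
// code (names here are illustrative, not BSmart output).
public class DemoApplet extends Applet {

    private static final byte INS_GET_VALUE = (byte) 0x10; // illustrative INS

    public static void install(byte[] bArray, short bOffset, byte bLength) {
        new DemoApplet().register();
    }

    public void process(APDU apdu) {
        byte[] buffer = apdu.getBuffer();
        switch (buffer[ISO7816.OFFSET_INS]) {
            case INS_GET_VALUE:
                // Service implementation would go here.
                break;
            default:
                ISOException.throwIt(ISO7816.SW_INS_NOT_SUPPORTED);
        }
    }
}
```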

Abstract:

In this work, we propose a multi-agent system for digital image steganalysis based on the polygynic bee model. The approach aims to solve the problem of automatic steganalysis of digital media, with a case study on digital images. The system architecture was designed not only to detect whether a file is suspected of carrying a hidden message, but also to extract the hidden message or information about it. Several experiments were performed whose results confirm a substantial improvement (from a 67% to an 82% success rate) when the multi-agent approach is used, a gain not observed in traditional systems. An ongoing application of the technique is the detection of anomalies in digital data produced by sensors that capture brain emissions in small animals. The detection of such anomalies can be used to support theories and evidence of imagery completion carried out by the brain in visual cortex areas during sleep.

Abstract:

In systems that combine the outputs of classification methods (combination systems), such as ensembles and multi-agent systems, one of the main constraints is that the base components (classifiers or agents) should be diverse among themselves; in other words, there is clearly no accuracy gain in a system composed of a set of identical base components. One way of increasing diversity is through the use of feature selection or data distribution methods in combination systems. In this work, the impact of using data distribution methods among the components of combination systems is investigated. Different data distribution methods are used, and the combination systems are analyzed under several different configurations. As a result of this analysis, the aim is to detect which combination systems are more suitable for feature distribution among their components.
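
A minimal sketch of one common data distribution scheme, random feature subspaces, in which each component is trained on a different random subset of the features; the subset size and the number of components below are illustrative parameters, not the configurations evaluated in the work:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Random-subspace style feature distribution: each component of the
// combination system receives its own random subset of the features.
public class FeatureDistribution {

    static List<int[]> distribute(int numFeatures, int numComponents,
                                  int featuresPerComponent, long seed) {
        Random random = new Random(seed);
        List<int[]> subsets = new ArrayList<>();
        for (int c = 0; c < numComponents; c++) {
            List<Integer> indices = new ArrayList<>();
            for (int f = 0; f < numFeatures; f++) indices.add(f);
            Collections.shuffle(indices, random);
            // Keep the first featuresPerComponent indices for this component.
            int[] subset = indices.subList(0, featuresPerComponent)
                                  .stream().mapToInt(Integer::intValue).toArray();
            subsets.add(subset);
        }
        return subsets;
    }

    public static void main(String[] args) {
        // 10 features shared among 3 components, 4 features each (illustrative).
        distribute(10, 3, 4, 42L).forEach(s -> System.out.println(Arrays.toString(s)));
    }
}
```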

Abstract:

Most of the solutions proposed as candidates for the implementation of audio and video distribution services have been designed under particular assumptions about the infrastructure, the format of the video streams to be transmitted, or the types of clients to be served. Applications that use video distribution services usually have to deal with large oscillations in the demand for the service, caused by users joining and leaving it; it suffices to observe the enormous variation in the audience levels of television programs. This behavior imposes an important requirement on this class of distributed systems: the capacity to reconfigure themselves as a consequence of variations in demand. This dissertation presents a study of the use of mobile agents to implement the servers of a video distribution service called DynaVideo. One of the main characteristics of this service is the ability to adjust its configuration in response to variations in demand. Since DynaVideo servers can replicate themselves and are implemented as mobile code, their placement can be optimized to meet a given demand and, as a consequence, the configuration of the service can be adjusted to minimize the resources needed to distribute video to its users. The main contribution of this dissertation is to prove the viability of the concept of servers implemented as Java mobile agents based on the Aglets software development environment.
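
A rough sketch of the replicate-and-migrate behavior described above, written against a hypothetical mobile-agent interface; this is not the Aglets API, and every name and method here is an assumption made only to illustrate the idea:

```java
// Hypothetical mobile-agent style sketch: a video server that replicates
// itself and moves a replica closer to the demand. NOT the Aglets API.
public class VideoServerAgent {

    interface AgentRuntime {
        VideoServerAgent cloneAgent(VideoServerAgent original);    // replicate
        void dispatch(VideoServerAgent agent, String destination); // migrate
    }

    private final int capacity;   // clients one replica can serve
    private int currentClients;

    VideoServerAgent(int capacity, int currentClients) {
        this.capacity = capacity;
        this.currentClients = currentClients;
    }

    // Called periodically: when demand exceeds capacity, spawn a replica
    // and send it to a host closer to the overloaded region.
    void rebalance(AgentRuntime runtime, String crowdedRegion) {
        if (currentClients > capacity) {
            runtime.dispatch(runtime.cloneAgent(this), crowdedRegion);
            currentClients = capacity;
        }
    }

    public static void main(String[] args) {
        AgentRuntime runtime = new AgentRuntime() {
            public VideoServerAgent cloneAgent(VideoServerAgent a) {
                return new VideoServerAgent(a.capacity, 0);
            }
            public void dispatch(VideoServerAgent a, String destination) {
                System.out.println("replica dispatched to " + destination);
            }
        };
        new VideoServerAgent(100, 180).rebalance(runtime, "host-near-clients");
    }
}
```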

Abstract:

Pervasive applications use context provision middleware as an infrastructure to obtain context information. Typically, those applications use publish/subscribe communication to eliminate direct coupling between components and to allow selective dissemination of information based on the interests of the communicating elements. Composite event mechanisms are used together with such middleware to aggregate individual low-level events, originating from heterogeneous sources, into high-level context information relevant to the application. CES (Composite Event System) is a composite event mechanism that works in cooperation with several context provision middleware systems simultaneously. With this integration, applications use CES to subscribe to composite events; CES, in turn, subscribes to the primitive events in the appropriate underlying middleware and notifies the applications when the composite events happen. Furthermore, CES offers a language with a set of operators for the definition of composite events that also allows context information sharing.
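
A minimal sketch of the pattern underneath such a mechanism: primitive events arrive from different sources, and a composite subscription fires only when a conjunction of them has been observed. The detector and its API are hypothetical, not the CES language or interface:

```java
import java.util.HashSet;
import java.util.Set;
import java.util.function.Consumer;

// Hypothetical composite-event detector: fires one high-level event once
// every required primitive event type has been observed at least once.
public class ConjunctionDetector {

    private final Set<String> required;
    private final Set<String> seen = new HashSet<>();
    private final Consumer<Set<String>> onComposite;

    ConjunctionDetector(Set<String> required, Consumer<Set<String>> onComposite) {
        this.required = required;
        this.onComposite = onComposite;
    }

    // Called by the underlying middleware for each primitive event.
    void onPrimitiveEvent(String type) {
        if (required.contains(type)) {
            seen.add(type);
            if (seen.containsAll(required)) {
                onComposite.accept(Set.copyOf(seen));  // notify the application
                seen.clear();                          // re-arm the detector
            }
        }
    }

    public static void main(String[] args) {
        ConjunctionDetector detector = new ConjunctionDetector(
                Set.of("user.in.room", "temperature.high"),
                events -> System.out.println("Composite event: " + events));
        detector.onPrimitiveEvent("temperature.high");
        detector.onPrimitiveEvent("user.in.room");   // composite fires here
    }
}
```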

Abstract:

The increase in transistor integration capacity has made it possible to develop complete systems, with several components, on a single chip, the so-called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as happens with buses, or become an ad hoc solution, as with bus hierarchies. Hence, the most suitable interconnection subsystem for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other projects. On the other hand, NoCs can increase design complexity, chip area and dissipated power, so it becomes necessary either to change the way they are used or to change the development paradigm. Thus, a NoC-based system is proposed in which applications are described as packets and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which keeps finding a new destination until all instructions have been executed. The objective, therefore, is to study the viability of developing this system, named the IPNoSys system. In this study, a cycle-accurate simulation tool was developed in SystemC to simulate the system executing applications, which were implemented in a packet description language also developed for this study. Through the simulation tool, several results were obtained that could be used to evaluate system performance. The methodology used to describe an application consists of transforming the high-level application into a data-flow graph, which then becomes one or more packets. This methodology was used in three applications: a counter, a 2-D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application; the DCT was used for comparison with the STORM platform; finally, the floating-point addition evaluated the efficiency of a software routine that executes an instruction not implemented in hardware. The simulation results confirm the viability of developing the IPNoSys system: they show that it is possible to execute applications described as packets, sequentially or in parallel, without interruptions caused by deadlock, and that the execution time of IPNoSys is better than that of the STORM platform.
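
A rough sketch of the kind of packet-driven execution such a model implies: a packet carries a header and a list of instructions, and each router along the path consumes the instruction at the head before forwarding the rest. The field layout and opcodes are purely illustrative, not the IPNoSys packet format (whose reference implementation is in SystemC):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Illustrative packet-driven execution: each router consumes the head
// instruction of the packet passing through it and forwards the remainder.
public class PacketExecutionSketch {

    record Instruction(String opcode, int a, int b) {}

    record Packet(int sourceRouter, int destinationRouter, Deque<Instruction> instructions) {}

    static void executeAtRouter(int routerId, Packet packet) {
        Instruction inst = packet.instructions().poll();
        if (inst == null) return;                 // nothing left to execute
        int result = switch (inst.opcode()) {
            case "ADD" -> inst.a() + inst.b();
            case "MUL" -> inst.a() * inst.b();
            default -> throw new IllegalArgumentException(inst.opcode());
        };
        System.out.println("router " + routerId + ": " + inst.opcode() + " -> " + result);
    }

    public static void main(String[] args) {
        Packet packet = new Packet(0, 3,
                new ArrayDeque<>(List.of(new Instruction("ADD", 2, 3),
                                         new Instruction("MUL", 4, 5))));
        for (int router = 0; router <= 3 && !packet.instructions().isEmpty(); router++) {
            executeAtRouter(router, packet);
        }
    }
}
```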

Abstract:

Combinatorial optimization problems have the goal of maximizing or minimizing functions defined over a finite domain. Metaheuristics are methods designed to find good solutions in this finite domain, sometimes the optimal one, using a subordinate heuristic that is modeled for each particular problem. This work presents algorithms based on particle swarm optimization (a metaheuristic) applied to combinatorial optimization problems: the Traveling Salesman Problem and the Multicriteria Degree-Constrained Minimum Spanning Tree Problem. The first problem optimizes a single objective, while the second deals with several objectives. In order to evaluate the performance of the proposed algorithms, they are compared, in terms of the quality of the solutions found, with other approaches.
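
For reference, the canonical (continuous) particle swarm update on which combinatorial adaptations are built can be written as below, where x_i is the position of particle i, v_i its velocity, p_i its best position so far, g the best position found by the swarm, w, c_1, c_2 are parameters and r_1, r_2 are random numbers in [0, 1]:

```latex
\begin{aligned}
v_i^{t+1} &= w\, v_i^{t} + c_1 r_1 \left(p_i - x_i^{t}\right) + c_2 r_2 \left(g - x_i^{t}\right)\\
x_i^{t+1} &= x_i^{t} + v_i^{t+1}
\end{aligned}
```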

Abstract:

Programs manipulate information. Information, however, is abstract in nature and needs to be represented, usually by data structures, so that it can be manipulated. This work presents AGraphs, a data representation and exchange format that uses typed directed graphs with a simulation of hyperedges and hierarchical graphs. Associated with the AGraphs format there is a manipulation library with a simple programming interface, tailored to the language being represented. The AGraphs format was used in an ad hoc manner as the representation format in tools developed at UFRN and, to make it usable in other tools, a precise description of the format and the development of support tools became necessary. This precise description and these tools have been developed and are described in this work. The work also compares the AGraphs format with other representation and exchange formats (e.g. ATerms, GDL, GraphML, GraX, GXL and XML). The main objective of this comparison is to capture important characteristics and to identify where the AGraphs concepts can still evolve.
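
A minimal sketch of the underlying data model, a typed directed graph in which every node and edge carries a type label; the classes and the example types are illustrative and do not reproduce the actual AGraphs API:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative typed directed graph: every node and every edge has a type
// label, which is the essential ingredient of formats in this family.
public class TypedGraph {

    record Node(int id, String type) {}
    record Edge(Node source, Node target, String type) {}

    private final List<Node> nodes = new ArrayList<>();
    private final List<Edge> edges = new ArrayList<>();

    Node addNode(String type) {
        Node node = new Node(nodes.size(), type);
        nodes.add(node);
        return node;
    }

    Edge addEdge(Node source, Node target, String type) {
        Edge edge = new Edge(source, target, type);
        edges.add(edge);
        return edge;
    }

    public static void main(String[] args) {
        TypedGraph graph = new TypedGraph();
        Node decl = graph.addNode("VarDecl");
        Node use  = graph.addNode("VarUse");
        graph.addEdge(use, decl, "refersTo");
        System.out.println(graph.nodes.size() + " node(s), " + graph.edges.size() + " edge(s)");
    }
}
```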

Abstract:

Interval mathematics is a mathematical theory that originated in the 1960s with the goal of answering questions of accuracy and efficiency that arise in the practice of scientific computing and in the solution of numerical problems. The classical approaches to computability theory deal with discrete problems (for example, over the natural numbers, the integers, strings over a finite alphabet, graphs, etc.). However, fields of pure and applied mathematics deal with problems involving real and complex numbers, as happens, for example, in numerical analysis, dynamical systems, computational geometry and optimization theory. Thus, a computational approach to continuous problems is desirable, or even necessary, to deal formally with analog computations and with scientific computing in general. In the literature there are different approaches to computability over the real numbers, but an important difference among them lies in the way the real number is represented. There are basically two lines of study of computability over the continuum. In the first of them, an approximation of the output with arbitrary precision is computed from a reasonable approximation of the input [Bra95]. The other line of research on real computability was developed by Blum, Shub and Smale [BSS89]. In this approach, the so-called BSS machines, a real number is seen as a finished entity, and the computable functions are generated from a class of basic functions (in a way similar to the partial recursive functions). In this dissertation we study the BSS model, used to characterize a computability theory over the real numbers, and extend it to model computability over the space of real intervals. We thus arrive at an approach to interval computability that is epistemologically different from the one studied by Bedregal and Acióly [Bed96, BA97a, BA97b], in which a real interval is seen as the limit of rational intervals, and the computability of a real interval function depends on the computability of a function over the rational intervals.
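
For context, the basic arithmetic on real intervals that any such model has to account for is standard; writing X = [x_lo, x_hi] and Y = [y_lo, y_hi]:

```latex
\begin{aligned}
X + Y &= [\,x_{lo}+y_{lo},\; x_{hi}+y_{hi}\,]\\
X - Y &= [\,x_{lo}-y_{hi},\; x_{hi}-y_{lo}\,]\\
X \cdot Y &= [\,\min S,\; \max S\,], \quad
  S = \{x_{lo}y_{lo},\ x_{lo}y_{hi},\ x_{hi}y_{lo},\ x_{hi}y_{hi}\}
\end{aligned}
```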

Abstract:

This work performs an algorithmic study of the optimization of a conformal radiotherapy treatment plan. We first give an overview of cancer, radiotherapy and the physics of the interaction of ionizing radiation with matter. A proposal for the optimization of a radiotherapy treatment plan is then developed in a systematic way. We present the multicriteria problem paradigm and the concepts of Pareto optimum and Pareto dominance, and propose a generic optimization model for radiotherapy treatment. We construct the input of the model, estimate the dose delivered by the radiation using the dose matrix, and define the objective function of the model. The complexity of optimization models in radiotherapy treatment is typically NP, which justifies the use of heuristic methods. We propose three distinct methods: MOGA, MOSA and MOTS. The design of these three metaheuristic procedures is presented; for each one we give a brief motivation, the algorithm itself and the method for tuning its parameters. The three methods are applied to a concrete case and their performances are compared. Finally, for each method, the quality of the Pareto sets, some solutions and the respective Pareto curves are analyzed.
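
For reference, the Pareto dominance relation used in this kind of multicriteria comparison can be stated as follows (for minimization of objectives f_1, ..., f_k): a solution x dominates a solution y when

```latex
f_i(x) \le f_i(y) \ \text{for all } i \in \{1,\dots,k\}
\quad\text{and}\quad
f_j(x) < f_j(y) \ \text{for at least one } j .
```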