219 results for "Criptografia de dados (Computação)" (data encryption, computing)


Relevance:

20.00%

Publisher:

Abstract:

In this work we study the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn as estimators of the density function f of data taking values on a k-dimensional sphere.
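The abstract leaves the estimator class implicit. For concreteness, one standard kernel estimator for spherical data in the directional-statistics literature (an assumption here, not necessarily the exact class studied in the thesis) is:

```latex
% A standard kernel density estimator for observations X_1, ..., X_n on the
% sphere S^k (assumed form; the thesis may study a more general class):
f_n(x) \;=\; \frac{c_{h_n}(K)}{n} \sum_{i=1}^{n}
             K\!\left( \frac{1 - x^{\top} X_i}{h_n^{2}} \right),
\qquad x \in S^{k},
```

where K is a kernel, h_n -> 0 is a bandwidth sequence, and c_{h_n}(K) is the normalizing constant that makes f_n integrate to one over the sphere.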

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

Among the many aspects of elderly health, oral health deserves special attention because, historically, dental services have not treated this population group as a priority for care. It is therefore necessary to produce a multidimensional indicator capable of measuring all the oral alterations found in an elderly person, making it easier to categorize oral health as a whole. Such an indicator will be an important instrument for ranking care priorities aimed at the elderly population. This study therefore proposes the construction and validation of an oral health indicator from the secondary data collected by the SB Brasil 2010 project for the 65-to-74 age group. The sample comprised the 7,619 individuals aged 65 to 74 who took part in the survey across the five regions of Brazil. These individuals underwent an epidemiological assessment of their oral health conditions based on the CPO-d (DMFT), CPI and PIP indices; the use of and need for dental prostheses were also recorded, along with social, economic and demographic characteristics. A factor analysis based on principal components identified a relatively small number of common factors. After the factors were named, the factor scores were summed for each individual, and dichotomizing this sum yielded the proposed oral health indicator. Twelve oral health variables from the SB Brasil 2010 database and three socioeconomic and demographic variables were included in the factor analysis. Based on the Kaiser criterion, five factors were retained, together explaining 70.28% of the total variance of the variables included in the model: factor 1 alone explains 32.02% of this variance, factor 2 explains 14.78%, and factors 3, 4 and 5 explain 8.90%, 7.89% and 6.68%, respectively. Based on the factor loadings, factor 1 was named "sound teeth and little prosthesis use", factor 2 "periodontal disease present", factor 3 "need for rehabilitation", and factors 4 and 5 "caries" and "favorable social condition", respectively. To guarantee the representativeness of the proposed indicator, a second factor analysis was carried out on a subsample of the elderly population investigated, and the applicability of the indicator was tested through its association with other variables of the study. Finally, it is worth noting that the indicator produced here was able to aggregate diverse information about the oral health and social conditions of these individuals, condensing many data points into one simple piece of information that helps health managers see the real intervention needs regarding the oral health of a given population.
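The pipeline described (principal-component extraction, Kaiser retention, summed factor scores, dichotomization) can be sketched as follows; the cut-off rule and the orientation of the indicator are assumptions made for the example:

```python
# Hypothetical sketch of the indicator-construction pipeline described above.
import numpy as np

def build_indicator(X):
    """X: (n_individuals, n_variables) matrix of survey variables."""
    Z = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # standardize
    R = np.corrcoef(Z, rowvar=False)                    # correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]                   # descending eigenvalues
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    k = int((eigvals > 1.0).sum())                      # Kaiser criterion: eigenvalue > 1
    loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])    # factor loadings (for naming)
    scores = Z @ eigvecs[:, :k]                         # factor scores per individual
    total = scores.sum(axis=1)                          # sum of factor scores
    threshold = np.median(total)                        # assumed dichotomization rule
    indicator = (total > threshold).astype(int)         # 1 = better oral health (assumption)
    return indicator, eigvals[:k] / eigvals.sum(), loadings
```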

Relevance:

20.00%

Publisher:

Abstract:

One of the current challenges of Ubiquitous Computing is the development of complex applications, that is, applications that are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware platforms, and the developer needs to know the peculiarities of each one, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context-provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications through the definition of semantic workflows containing an abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan is a workflow instance whose activities are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware platforms in an independent and transparent way. Moreover, the platform supports execution adaptation in case of service failures, user mobility and degradation of service quality. OpenCOPI is validated through the development of case studies, specifically applications for the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, compares it with the benefits provided, and measures the efficiency of OpenCOPI's selection and adaptation mechanisms.
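The abstract does not detail OpenCOPI's interfaces, but the selection-and-adaptation loop it describes can be sketched generically. The sketch below is an illustration under assumptions (the Service fields, the QoS score and the rebinding rule are all invented), not OpenCOPI's real API:

```python
# Illustrative sketch: each abstract activity in a semantic workflow is bound
# to the best available concrete Web service, and rebinding happens when the
# selected service fails or its quality degrades.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    provides: str        # semantic concept the service implements
    quality: float       # aggregated QoS score in [0, 1]
    alive: bool = True

def build_execution_plan(workflow, registry):
    """workflow: list of abstract activity names; registry: list of Service."""
    plan = []
    for activity in workflow:
        candidates = [s for s in registry if s.provides == activity and s.alive]
        if not candidates:
            raise RuntimeError(f"no service available for activity {activity!r}")
        plan.append(max(candidates, key=lambda s: s.quality))  # pick best QoS
    return plan

def execute_with_adaptation(workflow, registry, invoke):
    """Re-selects a service for an activity whenever its invocation fails."""
    for activity in workflow:
        while True:
            service = build_execution_plan([activity], registry)[0]
            try:
                invoke(service)          # caller-supplied invocation function
                break
            except Exception:
                service.alive = False    # mark failed, retry with next candidate
```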

Relevance:

20.00%

Publisher:

Abstract:

Traditional applications of feature selection in areas such as data mining, machine learning and pattern recognition aim to improve the accuracy and reduce the computational cost of the model. This is done by removing redundant, irrelevant or noisy data, finding a representative subset of the data that reduces its dimensionality without loss of performance. With the development of research on ensembles of classifiers, and the verification that this type of model outperforms the individual models when the base classifiers are diverse, a new field of application opens up for feature selection research: finding diverse subsets of features for building the base classifiers of ensemble systems. This work proposes an approach that maximizes the diversity of the ensembles by selecting feature subsets using a model that is independent of the learning algorithm and has low computational cost. This is done using bio-inspired metaheuristics with filter-based evaluation criteria.
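As an illustration of the approach's two ingredients, the sketch below pairs one bio-inspired metaheuristic (a plain genetic algorithm, one possible choice among many) with a filter criterion (mutual information) and a diversity bonus; everything here, including the weighting alpha, is an assumption made for the example:

```python
# Sketch: evolve a binary feature mask scored by a filter criterion plus a
# bonus for differing from masks already assigned to other ensemble members.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def fitness(mask, X, y, chosen_masks, alpha=0.5):
    if mask.sum() == 0:
        return -np.inf
    relevance = mutual_info_classif(X[:, mask.astype(bool)], y).mean()
    # diversity: mean Hamming distance to masks already picked for the ensemble
    diversity = np.mean([np.mean(mask != m) for m in chosen_masks]) if chosen_masks else 0.0
    return relevance + alpha * diversity

def select_subset(X, y, chosen_masks, pop=20, gens=30, rng=np.random.default_rng(0)):
    n = X.shape[1]
    population = rng.integers(0, 2, size=(pop, n))
    for _ in range(gens):
        scores = np.array([fitness(ind, X, y, chosen_masks) for ind in population])
        parents = population[np.argsort(scores)[-pop // 2:]]       # truncation selection
        cuts = rng.integers(1, n, size=len(parents))
        children = np.array([np.concatenate([a[:c], b[c:]])        # one-point crossover
                             for a, b, c in zip(parents, rng.permutation(parents), cuts)])
        flips = rng.random(children.shape) < 1.0 / n               # bit-flip mutation
        children[flips] ^= 1
        population = np.vstack([parents, children])
    return max(population, key=lambda ind: fitness(ind, X, y, chosen_masks))
```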

Relevance:

20.00%

Publisher:

Abstract:

Smart card applications represent a growing market. Usually this kind of application manipulates and stores critical information that requires some level of security, such as financial or confidential data. The quality and trustworthiness of smart card software can be improved through a rigorous development process that embraces formal software engineering techniques. In this work we propose the BSmart method, a specialization of the B formal method dedicated to the development of Java Card smart card applications. The method describes how a Java Card application can be generated from a B refinement process of its formal abstract specification. The development is supported by a set of tools that automates the generation of some required refinements and the translation to the Java Card client (host) and server (applet) applications. With respect to verification, the method's development process was formalized and verified in the B method itself, using the Atelier B tool [Cle12a]. We emphasize that the Java Card application is translated from the last stage of refinement, named implementation. This translation process was specified in ASF+SDF [BKV08], describing the grammars of both languages (in SDF) and the code transformations through rewrite rules (in ASF). This specification was an important support during the development of the translator and contributes to the tool's documentation. We also emphasize the KitSmart library [Dut06, San12], an essential component of BSmart, containing models of all 93 classes/interfaces of the Java Card API 2.2.2, of Java/Java Card data types, and machines that can be useful to the specifier but are not part of the standard Java Card library. In order to validate the method, its tool support and KitSmart, we developed an electronic passport application following the BSmart method. We believe that the results reached in this work contribute to Java Card development, allowing the generation of complete (client and server components) Java Card applications that are less subject to errors.
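The ASF+SDF translation is specified as rewrite rules over the grammars of both languages. As a loose, language-shifted analogue (Python instead of ASF+SDF, with an invented two-rule toy grammar; the real BSmart rules over B and Java Card are far richer), rule-based translation looks like this:

```python
# Toy analogue of translation by rewrite rules: each rule maps a
# source-language AST pattern to a target-language construct.
def rewrite(node):
    kind = node[0]
    if kind == "assign":                  # B-style assignment x := e
        _, var, expr = node
        return f"{var} = {rewrite(expr)};"
    if kind == "plus":
        _, a, b = node
        return f"({rewrite(a)} + {rewrite(b)})"
    if kind in ("var", "lit"):
        return str(node[1])
    raise ValueError(f"no rewrite rule for node kind {kind!r}")

# rewrite(("assign", "x", ("plus", ("var", "x"), ("lit", 1)))) -> "x = (x + 1);"
```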

Relevance:

20.00%

Publisher:

Abstract:

The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. This raises challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, due to its scalability, reusability and intrinsic parallelism. Communication in a network-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers of the network, from the source to the destination of the communication, and simultaneous communications between different source-destination pairs are possible. Building on this fact, this work proposes transforming the entire communication infrastructure of the network-on-chip, with its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, packets are formed by the instructions and data that represent the applications, and the instructions are executed by the routers as the packets are transmitted, exploiting the pipeline and the parallelism of the communications. Traditional processors are not used; only simple cores that control access to memory remain. An implementation of this idea, called IPNoSys (Integrated Processing NoC System), has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets while preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support. As proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing various parameters to be configured and several results to be obtained for its evaluation.
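A toy sketch of the central idea, execution performed by routers while packets travel, might look as follows; the packet layout, instruction set and path representation are invented for illustration and do not follow the real IPNoSys formats:

```python
# Toy sketch of execute-while-routing: a packet carries an operand stack and
# a queue of simple instructions; each router on the path pops and executes
# one instruction, then forwards the packet to the next hop.
from collections import deque

def route_and_execute(packet, path):
    ops = {"ADD": lambda a, b: a + b, "MUL": lambda a, b: a * b}
    for router_id in path:                       # hop through the NoC path
        if packet["instrs"]:
            opcode = packet["instrs"].popleft()
            b, a = packet["stack"].pop(), packet["stack"].pop()
            packet["stack"].append(ops[opcode](a, b))
            print(f"router {router_id}: executed {opcode}")
    return packet["stack"][-1]                   # result delivered at destination

pkt = {"instrs": deque(["ADD", "MUL"]), "stack": [2, 3, 4]}
# routers compute 3 + 4 = 7, then 2 * 7 = 14 along the path
route_and_execute(pkt, path=[(0, 0), (0, 1), (1, 1)])
```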

Relevance:

20.00%

Publisher:

Abstract:

Clustering is a very important task in data mining, image processing and pattern recognition. One of the most popular clustering algorithms is Fuzzy C-Means (FCM). This thesis proposes a new way of calculating the cluster centers in the FCM algorithm, called ckMeans, which can also be applied to several FCM variants, in particular those that use other distances. The goal of this change is to reduce the number of iterations and the processing time of these algorithms without affecting the quality of the partition, or even to improve the number of correct classifications in some cases. In addition, an algorithm based on ckMeans was developed to manipulate interval data, considering interval membership degrees. This algorithm allows interval data to be represented without converting them into punctual data, as happens in other extensions of FCM that deal with interval data. To validate the proposed methodologies, the ckMeans, K-Means and FCM algorithms were compared (since the center calculation proposed here is similar to that of K-Means), considering three different distances and several well-known databases. The results of the interval ckMeans were also compared with those of other clustering algorithms on an interval database of minimum and maximum monthly temperatures for a given year in 37 cities distributed across the continents.
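The sketch below shows where such a center-calculation change plugs into FCM. It assumes, based on the stated similarity to K-Means, that the modified step computes each center from the points whose highest membership falls in that cluster; the exact ckMeans rule is defined in the thesis:

```python
# FCM with a swappable center step: classical fuzzy centers, or a
# K-Means-style step (assumed here as an approximation of ckMeans).
import numpy as np

def fcm(X, c, m=2.0, iters=100, tol=1e-5, kmeans_centers=False,
        rng=np.random.default_rng(0)):
    n = len(X)
    U = rng.random((n, c)); U /= U.sum(axis=1, keepdims=True)  # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        if kmeans_centers:
            hard = U.argmax(axis=1)                            # best cluster per point
            V = np.array([X[hard == j].mean(axis=0) if (hard == j).any()
                          else X[rng.integers(n)] for j in range(c)])
        else:
            V = (Um.T @ X) / Um.sum(axis=0)[:, None]           # classical FCM centers
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        W = 1.0 / D ** (2.0 / (m - 1.0))                       # standard membership update
        U_new = W / W.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new; break
        U = U_new
    return V, U
```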

Relevance:

20.00%

Publisher:

Abstract:

Formal methods should be used to specify and verify on-card software in Java Card applications. Furthermore, the Java Card programming style requires runtime verification of all input conditions of all on-card methods, the main goal being to preserve the data on the card. Design by Contract, and in particular the JML language, is an option for this kind of development and verification, since runtime verification is part of the Design by Contract method as implemented by JML. However, JML and its currently available runtime verification tools were not designed with Java Card limitations in mind and are not Java Card compliant. In this thesis, we analyze how much of this situation is really intrinsic to Java Card limitations and how much is simply a matter of completely redesigning JML and its tools. We propose the requirements for a new, Java Card compliant language and indicate the lines along which a compiler for this language should be built. JCML strips from JML the non-Java Card aspects, such as concurrency and unsupported types. This would not be enough, however, without a great optimization effort on the verification code generated by the compiler, since this verification code must run on the card. The JCML compiler, although much more restricted than the JML one, is able to generate Java Card compliant verification code for some lightweight specifications. In conclusion, we present a Java Card compliant variant of JML, JCML (Java Card Modeling Language), with a preliminary version of its compiler.
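As a language-neutral analogue of what such contract compilation produces (Python here, not JCML or Java Card; the decorator and the Purse class are invented for illustration), runtime verification of preconditions can be pictured as:

```python
# Design-by-contract analogue: every public entry point checks its
# preconditions before touching persistent state.
import functools

def requires(*conditions):
    """Attach precondition checks to a method; each gets the call's arguments."""
    def wrap(method):
        @functools.wraps(method)
        def checked(*args, **kwargs):
            for cond in conditions:
                if not cond(*args, **kwargs):
                    raise ValueError(f"precondition violated calling {method.__name__}")
            return method(*args, **kwargs)
        return checked
    return wrap

class Purse:
    def __init__(self, balance=0):
        self.balance = balance

    # JML-style contract: requires amount > 0 && amount <= balance
    @requires(lambda self, amount: amount > 0,
              lambda self, amount: amount <= self.balance)
    def debit(self, amount):
        self.balance -= amount
```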

Relevance:

20.00%

Publisher:

Abstract:

Although individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, supply solutions that are usually considered efficient, experimental results obtained on large pattern sets, and/or on sets with an expressive amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. Aiming at better performance and efficiency, the idea arose of making several ML algorithms work jointly, giving origin to the term Multi-Classifier System (MCS). An MCS has different ML algorithms as components, called base classifiers, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by the base classifiers must present a certain diversity, that is, a difference between the results obtained by each classifier composing the system; it is pointless to have an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a continuous search to improve the results obtained by this type of system. Aiming at this improvement, at more consistent results and at greater diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance a certain classifier had when associating each pattern with a given class, and they are used, together with the outputs of the classifiers, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, divided into two categories: static weights, whose values do not change during the classification process, and dynamic weights, whose values do change during the classification process. In this work, an analysis is made to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, the diversity obtained by the MCSs is analyzed, in order to verify whether there is a relation between the use of weights in MCSs and different levels of diversity.
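A minimal sketch of weighted output combination, assuming two common weighting choices (validation accuracy as the static weight and per-sample confidence as the dynamic weight; the thesis compares several such schemes):

```python
# Weighted fusion of base-classifier outputs in an MCS.
import numpy as np

def combine(probas, static_w=None, dynamic=False):
    """probas: (n_classifiers, n_samples, n_classes) class-probability outputs."""
    probas = np.asarray(probas)
    if dynamic:
        # dynamic weight: each classifier's confidence (max probability) per sample
        w = probas.max(axis=2, keepdims=True)            # (n_clf, n_samples, 1)
    else:
        # static weight: one fixed value per classifier, e.g. validation accuracy
        w = np.asarray(static_w)[:, None, None]          # (n_clf, 1, 1)
    fused = (w * probas).sum(axis=0) / w.sum(axis=0)     # weighted average
    return fused.argmax(axis=1)                          # final class per sample
```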

Relevance:

20.00%

Publisher:

Abstract:

Java Card technology allows the development and execution of small applications embedded in smart cards. A Java Card application is composed of an external card client and an on-card application that implements the services available to the client by means of an Application Programming Interface (API). Usually, these applications manipulate and store important information, such as cash and confidential data of their owners. Thus, it is necessary to adopt rigor when developing a smart card application in order to improve its quality and trustworthiness. The use of formal methods in the development of these applications is a way to reach these quality requirements. The B method is one of many formal methods for system specification. Development in B starts with a functional specification of the system, continues with the application of optional refinements to the specification and, from the last level of refinement, code can be generated for some programming language. The B formalism has good tool support, and its application to Java Card is adequate, since the specification and development of APIs is one of the major applications of B. The BSmart method proposed here aims to promote the rigorous development of Java Card applications up to the generation of their code, based on the refinement of a formal specification described in the B notation. The development is supported by the BSmart tool, composed of programs that automate each stage of the method, and by a library of B modules and Java Card classes that model primitive types, essential Java Card API classes and reusable data structures.

Relevance:

20.00%

Publisher:

Abstract:

In this work, we propose a multi-agent system for digital image steganalysis based on the polygynous bee colony model. The approach aims at automatic steganalysis of digital media, with a case study on digital images. The system architecture was designed not only to detect whether a file is suspected of carrying a hidden message, but also to extract the hidden message or information about it. Several experiments were performed, and their results confirm a substantial improvement (from a 67% to an 82% success rate) when the multi-agent approach is used, a gain not observed in traditional systems. An ongoing application of the technique is the detection of anomalies in digital data produced by sensors that capture brain signals in small animals; detecting such anomalies can be used to test theories and evidence of imagery completion performed by the brain in visual cortex areas during sleep.
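Purely as an illustration of the agent organization (the bee-inspired architecture and the detection features in the thesis are far more elaborate), independent agents can each compute one classical LSB suspicion feature and have their scores fused:

```python
# Toy multi-agent steganalysis: each "worker" agent computes one weak LSB
# indicator over a grayscale image; a "queen" fuses the scores into a verdict.
import numpy as np

def lsb_randomness(img):
    """img: 2-D uint8 array. Near 1 means the LSB plane looks random,
    which is consistent with (but far from proof of) LSB embedding."""
    bits = img.flatten() & 1
    return 1.0 - abs(bits.mean() - 0.5) * 2

def pair_flatness(img):
    """Near 1 means (2i, 2i+1) histogram pairs are equalized, the effect
    that LSB embedding tends to produce (chi-square attack intuition)."""
    hist = np.bincount(img.flatten(), minlength=256).astype(float)
    pairs = hist.reshape(128, 2)
    diffs = np.abs(pairs[:, 0] - pairs[:, 1]) / (pairs.sum(axis=1) + 1e-9)
    return 1.0 - diffs.mean()

def is_suspicious(img, threshold=0.75):
    agents = [lsb_randomness, pair_flatness]     # each agent runs one analysis
    scores = np.array([agent(img) for agent in agents])
    return scores.mean() > threshold             # queen fuses the agents' scores
```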

Relevance:

20.00%

Publisher:

Abstract:

The increase in transistor integration capacity has made it possible to develop complete systems, with several components, on a single chip, the so-called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as buses do, or be an ad hoc solution, as bus hierarchies are. The interconnection subsystem considered ideal for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other projects. However, NoCs can raise the complexity of the project, the on-chip area and the dissipated power, so it is necessary either to change the way they are used or to change the development paradigm. Thus, a NoC-based system is proposed in which applications are described as packets and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which finds further destinations until all instructions have been executed. The objective, therefore, is to study the viability of developing this system, named the IPNoSys system. A cycle-accurate simulation tool was developed in SystemC, together with a packet description language, also developed for this study, in which the applications were implemented. Through the simulation tool, several results were obtained to evaluate the system's performance. The methodology used to describe an application consists of transforming the high-level application into a data-flow graph that becomes one or more packets. This methodology was applied to three applications: a counter, a 2-D DCT (DCT-2D) and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application; the DCT was used for comparison with the STORM platform; and the floating-point addition evaluated the efficiency of a software routine implementing an instruction not available in hardware. The simulation results confirm the viability of the IPNoSys system: they show that it is possible to execute applications described as packets, sequentially or in parallel, without interruptions caused by deadlock, and that the execution time of IPNoSys is better than that of the STORM platform.

Relevance:

20.00%

Publisher:

Abstract:

Programs manipulate information. Information, however, is abstract in nature and needs to be represented, usually by data structures, to be manipulable. This work presents AGraphs, a data representation and exchange format that uses typed directed graphs with a simulation of hyperedges and hierarchical graphs. Associated with the AGraphs format there is a manipulation library with a simple programming interface, tailored to the language being represented. The AGraphs format was used in an ad hoc manner as the representation format in tools developed at UFRN; to make it usable in other tools, an accurate description and supporting tools were necessary. This description and these tools were developed and are described in this work. The work also compares the AGraphs format with other representation and exchange formats (e.g. ATerms, GDL, GraphML, GraX, GXL and XML), with the main objective of capturing important characteristics and identifying where the AGraphs concepts can still evolve.
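A minimal sketch of the two core ideas, typed nodes and edges, and hyperedges simulated by ordinary nodes, with invented names (the real AGraphs interface is defined in the thesis):

```python
# Typed directed graph with simulated hyperedges and room for hierarchy.
from dataclasses import dataclass, field

@dataclass
class Node:
    ident: object
    ntype: str                                     # node type label
    children: list = field(default_factory=list)   # hierarchical graphs: nested nodes

@dataclass
class Graph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)      # (etype, source_id, target_id)

    def add_node(self, ident, ntype):
        self.nodes[ident] = Node(ident, ntype)
        return self.nodes[ident]

    def add_edge(self, etype, src, dst):
        self.edges.append((etype, src, dst))

    def add_hyperedge(self, etype, *endpoints):
        # simulate an n-ary edge: one relation node plus a binary edge per endpoint
        rel = self.add_node(f"hyper:{len(self.edges)}", etype)
        for i, endpoint in enumerate(endpoints):
            self.add_edge(f"{etype}#{i}", rel.ident, endpoint)
        return rel
```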

Relevance:

20.00%

Publisher:

Abstract:

Motion estimation is the main contributor to data reduction in digital video encoding and is also the most computationally demanding step. H.264 is the newest video compression standard and was planned to double the compression ratio achieved by previous standards. It was developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a partnership effort known as the Joint Video Team (JVT). H.264 introduces novelties that improve motion estimation efficiency, such as variable block sizes, quarter-pixel precision and multiple reference frames. This work defines a hardware/software architecture for motion estimation using a full-search algorithm, variable block sizes and mode decision. The work considers the use of reconfigurable devices, soft processors and embedded-system development tools such as Quartus II, SOPC Builder, Nios II and ModelSim.
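A minimal software sketch of full-search block matching, the algorithm family named above (a fixed 16x16 block and a small search window are assumptions for the example; the architecture in the work also handles variable block sizes and mode decision):

```python
# Full search: scan every candidate displacement in a search window of the
# reference frame and keep the motion vector with the smallest SAD.
import numpy as np

def full_search(cur, ref, y, x, block=16, radius=7):
    """Return ((dy, dx), sad) minimizing the Sum of Absolute Differences."""
    target = cur[y:y + block, x:x + block].astype(np.int32)
    best, best_mv = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > ref.shape[0] or xx + block > ref.shape[1]:
                continue                      # candidate falls outside the frame
            cand = ref[yy:yy + block, xx:xx + block].astype(np.int32)
            sad = np.abs(target - cand).sum()
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best
```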