945 results for decentralized attribute-based encryption
Abstract:
OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point processes and wavelet analysis techniques were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. The spatial distribution of the events is non-homogeneous, with high concentration in two districts and three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools for estimating the intensity of events are useful for studying urban violence. Wavelet analysis is useful for computing the expected number of events and their respective confidence bands for any sub-region and, consequently, for specifying risk estimates that could be used in decision-making processes for public policies.
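The intensity-estimation step can be sketched in Python: bin event coordinates on a regular grid and denoise the resulting histogram with a one-level Haar wavelet transform. This is a minimal stand-in for the study's wavelet method, assuming synthetic coordinates and an illustrative grid; it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic event coordinates (km); two clusters mimic the concentrated pattern
pts = np.vstack([rng.normal([2, 2], 0.5, (300, 2)),
                 rng.normal([6, 6], 0.8, (200, 2))])

# bin events on a regular grid: counts per cell approximate the raw intensity
bins = 16
H, xe, ye = np.histogram2d(pts[:, 0], pts[:, 1], bins=bins, range=[[0, 8], [0, 8]])

def haar_smooth(a):
    """One-level 2D Haar decomposition; detail coefficients are dropped,
    which denoises the count surface while preserving the total mass."""
    ll = (a[0::2, 0::2] + a[0::2, 1::2] + a[1::2, 0::2] + a[1::2, 1::2]) / 4.0
    return np.kron(ll, np.ones((2, 2)))  # reconstruct with details set to zero

smooth = haar_smooth(H)
cell_area = (xe[1] - xe[0]) * (ye[1] - ye[0])
intensity = smooth / cell_area  # expected number of events per unit area
# the expected count in any sub-region is the sum of intensity * area there
```

A real analysis would use several decomposition levels and a threshold rule on the detail coefficients instead of discarding them outright, plus confidence bands; the mass-preservation property shown here is what makes sub-region expected counts meaningful.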
Abstract:
An Electrocardiogram (ECG) monitoring system deals with several challenges related to noise sources. The main goal of this work was the study of adaptive signal processing algorithms for ECG noise reduction applied to real signals. This document presents an adaptive filtering technique, based on the Least Mean Square (LMS) algorithm, to remove the artefacts introduced into the ECG signal by electromyography (EMG) and power line noise. Real noise signals were used in these experiments, mainly to assess the difference in the algorithm's performance between real and simulated noise sources. Very good results were obtained, owing to the noise removal capability of this technique.
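The LMS noise canceller described above can be sketched as follows. The sampling rate, filter order, and step size are illustrative choices, and the "ECG" waveform and mains reference are synthetic stand-ins for the real recordings used in the study.

```python
import numpy as np

def lms_cancel(primary, reference, mu=0.02, order=16):
    """LMS adaptive noise canceller.

    primary:   ECG + interference (signal picked up at the electrodes)
    reference: signal correlated with the interference (e.g. mains pickup)
    Returns the error signal e, which approximates the clean ECG.
    """
    n = len(primary)
    w = np.zeros(order)
    e = np.zeros(n)
    for k in range(order, n):
        x = reference[k - order:k][::-1]  # tap-delay line of the reference
        y = w @ x                         # filter output = interference estimate
        e[k] = primary[k] - y             # error = cleaned ECG sample
        w += 2 * mu * e[k] * x            # LMS weight update
    return e

# toy demonstration: a spiky synthetic "ECG" plus 50 Hz mains interference
fs = 500.0
t = np.arange(0, 4, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 31           # crude beat-like pulses
noise = 0.5 * np.sin(2 * np.pi * 50 * t + 0.3)    # phase-shifted mains pickup
ref = np.sin(2 * np.pi * 50 * t)                  # reference from the power line
clean = lms_cancel(ecg + noise, ref)
```

The canceller converges because the reference is correlated with the interference but not with the ECG; the step size `mu` trades convergence speed against steady-state weight jitter.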
Abstract:
Manor-house tourism (turismo de habitação) has grown steadily in Portugal over recent years, and the Internet has gradually replaced the more traditional channels for marketing this service. However, the widespread adoption of websites, in this and other economic sectors, has not always been synonymous with quality, which justifies the search for evaluation systems for this type of software product that can be applied systematically and effectively. This work was developed with that purpose: to build a website evaluation model specific to turismo de habitação, in the conviction that specific areas demand specific approaches. A search for models oriented towards this activity produced no results, unlike other areas, such as academia, where adequate models already exist. Hence the need to combine more generic ideas and concepts from diverse sources with elements specific to turismo de habitação, adapting them to the objectives of this work. Thus, drawing on elements of the ISO 9126 standard; on concepts of usability, functionality, credibility, and others; on a business-performance approach; on existing models aimed at other areas; and on an exploratory empirical model developed in the meantime, a model was conceived and implemented. Its essential characteristics are a three-level structure; a set of thirty-eight evaluated attributes, with greater emphasis on those considered to have the strongest influence on the performance of the underlying economic activity; and a variable weighting of each attribute's impact on the final result.
To give the model flexibility and to counter the subjectivity inherent in weighting the impact of each evaluated attribute, three alternative scenarios with distinct weightings were implemented in the model, each valuing certain types of attributes and devaluing others. Naturally, some conclusions about the overall picture were already drawn when the exploratory model was implemented, but the definitive model gave them greater consistency and detail.
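The three-scenario weighting can be illustrated with a toy computation. The attribute names and weights below are hypothetical (not the model's actual thirty-eight attributes): the point is that the same attribute scores produce a different final result under each weighting scenario.

```python
# attribute scores on a 0-5 scale (hypothetical evaluation of one website)
scores = {"usability": 4, "functionality": 3, "credibility": 5}

scenarios = {  # each scenario values some attribute groups over others
    "balanced":        {"usability": 1/3, "functionality": 1/3, "credibility": 1/3},
    "user-centred":    {"usability": 0.5, "functionality": 0.2, "credibility": 0.3},
    "business-driven": {"usability": 0.2, "functionality": 0.5, "credibility": 0.3},
}

def final_score(scores, weights):
    """Weighted sum of attribute scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[a] * w for a, w in weights.items())

for name, w in scenarios.items():
    print(name, round(final_score(scores, w), 2))
# balanced 4.0, user-centred 4.1, business-driven 3.8
```

Publishing results under all three weightings, as the model does, makes the subjectivity of any single weighting explicit rather than hiding it in one final number.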
Abstract:
Semantic Web technologies such as RDF, OWL and SPARQL have seen strong growth and acceptance in recent years. Projects such as DBpedia and OpenStreetMap are beginning to reveal the true potential of Linked Open Data. Semantic search engines, however, still lag behind this surge of semantic technologies. The available solutions rely mostly on natural language processing resources; powerful Semantic Web tools such as ontologies, inference engines and semantic query languages are not yet common. In addition, there are specific difficulties in implementing a semantic search engine. As demonstrated in this dissertation, a federated architecture is necessary to exploit the full potential of Linked Open Data. A federated system in that environment, however, exhibits performance problems that must be solved through cooperation between data sources. The current standard query language for the Semantic Web, SPARQL, offers no mechanism for such cooperation. This dissertation proposes a federated architecture with mechanisms that enable cooperation between data sources. It addresses the performance problem by proposing a centrally managed index, as well as mappings between the data models of each data source. The proposed architecture is modular, allowing repositories and features to grow simply and in a decentralized way, much like Linked Open Data and the World Wide Web itself. The architecture handles both natural language term searches and formal SPARQL queries; the repositories considered, however, contain only RDF data. This dissertation builds on multiple shared and interlinked ontologies.
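The centrally managed index can be sketched as follows (the class and endpoint names are hypothetical): each data source cooperates by publishing the terms it can answer, and the federator routes a term query only to the relevant endpoints instead of broadcasting it to every source.

```python
class Federator:
    """Toy central index: term -> set of endpoint ids."""

    def __init__(self):
        self.index = {}

    def register(self, endpoint, terms):
        """An endpoint cooperates by publishing the terms it can answer."""
        for t in terms:
            self.index.setdefault(t.lower(), set()).add(endpoint)

    def route(self, query_terms):
        """Return only the endpoints relevant to the query terms."""
        hits = [self.index.get(t.lower(), set()) for t in query_terms]
        return set.union(*hits) if hits else set()

fed = Federator()
fed.register("dbpedia", ["city", "person", "film"])
fed.register("osm", ["city", "street", "building"])
print(sorted(fed.route(["street"])))  # ['osm'] — only one source holds street data
print(sorted(fed.route(["city"])))    # ['dbpedia', 'osm'] — both are queried
```

In the dissertation's setting the index entries would be kept consistent with the sources' RDF data and complemented by data-model mappings; this sketch only shows why routing through a shared index avoids the cost of querying every federated source.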
Oxidative Leaching of metals from electronic waste with solutions based on quaternary ammonium salts
Abstract:
The treatment of waste electrical and electronic equipment (WEEE) is a problem which receives ever more attention: inadequate treatment results in harmful products ending up in the environment. This project investigates an alternative route for recycling metals from printed circuit boards (PCBs) obtained from discarded computers. The process is based on aqueous solutions composed of an etchant, either 0.2 M CuCl2·2H2O or 0.2 M FeCl3·6H2O, and a quaternary ammonium salt (quat) such as choline chloride or chlormequat. These solutions are reminiscent of deep eutectic solvents (DES) based on quats. DES are quite similar to ionic liquids (ILs) and are likewise used as alternative solvents with a great diversity of physical properties, making them attractive replacements for hazardous, volatile solvents (e.g. VOCs). A notable difference between the solutions used in this project and genuine DES or ILs is the addition of rather large quantities of water. It is shown that the presence of water has considerable advantages for the leaching of metals, while the properties typical of DES still remain. The oxidizing capacity of Cu(II) stems from the existence of a stable Cu(I) species in quat-based DES, so the leaching is driven by the activity of the Cu(II)/Cu(I) redox couple. The advantage of Fe(III) in combination with DES is that the Fe(III)/Fe(II) redox couple becomes reversible, which is not the case in pure water. This opens perspectives for regeneration of the etching solution. In this project the leaching of copper was studied as a function of gradually increasing water content, from 0 to 100 w%, at the same concentration of copper chloride or iron(III) chloride, at room temperature and at 80 °C. The solutions were also tested on real PCBs. At room temperature, a maximum leaching effect for copper was obtained with 30 w% choline chloride and 0.2 M CuCl2·2H2O.
The leaching effect is even stronger at 80 °C, but of course these solutions consume more energy. For aluminium, tin, zinc and lead, leaching was faster at 80 °C. Iron and nickel dissolved easily at room temperature. The solutions were not able to dissolve gold, silver, rhodium or platinum.
Abstract:
This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability studies (FMC-TRel) methodology, which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling of system component outage parameters. The statistical records allow the fuzzy membership functions of the component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation, built on these fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states are obtained, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on Optimal Power Flow, that reschedules generation to alleviate constraint violations while avoiding any load curtailment if possible or, otherwise, minimizing the total load curtailment for the states identified by the contingency analysis. For the system states that cause load curtailment, an optimization approach is applied to reduce the probability of occurrence of these states while minimizing the cost of achieving that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in the identification of critical components and in the planning of future investments in the transmission power system. A case study based on the IEEE 24-bus Reliability Test System (RTS) 1996 is presented to illustrate the application of the proposed methodology in detail.
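The hybrid fuzzy/Monte Carlo sampling idea can be sketched in a few lines. This is a simplification, not the FMC-TRel implementation: each component's outage probability is taken as a triangular fuzzy number, sampled here as a triangular distribution (one common interpretation), and Monte Carlo trials then capture the randomness of the outages themselves.

```python
import random

random.seed(1)

def sample_fuzzy_rate(low, mode, high):
    """Draw an outage probability from a triangular fuzzy number,
    interpreted here as a triangular distribution."""
    return random.triangular(low, high, mode)

def outage_frequency(components, trials=20000):
    """Monte Carlo: fraction of trials with at least one component out."""
    failures = 0
    for _ in range(trials):
        out = False
        for low, mode, high in components:
            p = sample_fuzzy_rate(low, mode, high)  # fuzziness of the parameter
            if random.random() < p:                 # randomness of the outage
                out = True
        if out:
            failures += 1
    return failures / trials

# two hypothetical lines with fuzzy outage probabilities around 2% and 5%
components = [(0.01, 0.02, 0.03), (0.03, 0.05, 0.07)]
print(round(outage_frequency(components), 3))  # ≈ 1 - 0.98 * 0.95 ≈ 0.069
```

Each sampled system state would then feed the contingency analysis and remedial-action steps described above; the sketch only shows how fuzziness and randomness are combined in the state sampling.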
Abstract:
The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability by requiring acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers, through a minimally intrusive 1-lead ECG setup with gel-free Ag/AgCl electrodes as the interface with the skin. The collected signal is significantly noisier than the ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Through a simple minimum distance criterion between the test patterns and the enrollment database, results have revealed this to be a promising technique for biometric applications.
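The final matching step can be sketched as follows (the heartbeat shapes are synthetic and the normalization length is illustrative): each beat is time-normalized by resampling and amplitude-normalized, and identification picks the enrolled template at minimum Euclidean distance.

```python
import numpy as np

def normalize(beat, length=100):
    """Time-normalize (resample to a fixed length) and amplitude-normalize."""
    x = np.interp(np.linspace(0, 1, length),
                  np.linspace(0, 1, len(beat)), beat)
    return (x - x.mean()) / (x.std() + 1e-12)

def identify(test_beat, enrollment):
    """Minimum-distance criterion against the enrollment templates."""
    t = normalize(test_beat)
    dists = {uid: np.linalg.norm(t - normalize(tpl))
             for uid, tpl in enrollment.items()}
    return min(dists, key=dists.get)

# toy templates: two synthetic heartbeat shapes with different peak widths
n = np.arange(120)
alice = np.exp(-((n - 60) ** 2) / 50.0)    # narrow peak
bob = np.exp(-((n - 60) ** 2) / 400.0)     # broad peak
enrollment = {"alice": alice, "bob": bob}

noisy = alice + 0.05 * np.random.default_rng(0).normal(size=120)
print(identify(noisy, enrollment))  # -> alice
```

In the paper's pipeline the beats would first pass through filtering, peak detection and segmentation; the sketch starts from already-segmented heartbeat waveforms.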
Abstract:
The main objective of this paper is to evaluate the key elements in the construction of consistent organisational messages over time. In order to accomplish that, we propose the alignment of several elements: vision, mission, objectives, cultural values, optimal identity attributes, positioning, type of messages, communication style and means, and image...
Abstract:
Given the vast amount of data available on the Internet, one of the greatest challenges in the virtual world is recommending information to its users. On the other hand, this large amount of data can be useful for improving recommendations if it is annotated and interlinked with provenance data. This work addresses the recommendation of (changes to) access permissions on resources to the resource owner, rather than the recommendation of the resource itself to a potential consumer/reader. To enable the recommendation of access to a given resource, regardless of the domain where it is hosted, distributed access control systems and domain-independent resource tracking and recommendation mechanisms are essential. The main objective of this thesis is therefore to use traces of actions performed on resources (i.e. information that relates resources and users across the Web, independently of the network domain) to enable the recommendation of access privileges on those resources to other users.
The development of this thesis produced the following contributions: an analysis of the state of the art in recommendation and of recommender systems potentially usable for privilege recommendation (section 2.3); an analysis of the state of the art in information tracking and provenance mechanisms (section 2.2); the proposal of a domain-independent access privilege recommender system and its integration into the previously proposed access control system (section 3.1); the survey, analysis and specification of the access privilege information to be used in the recommender system (section 2.1); the specification of the action-tracking information to be used in access privilege recommendation (section 4.1.1); the specification of the feedback information produced by the access recommender system and its reuse in the recommender system (section 4.1.3); the specification, implementation and integration of the access privilege recommender system into the existing platform (sections 4.2 and 4.3); and the evaluation experiments on the privilege recommender system, together with the analysis of the results obtained (section 5).
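A toy sketch of the core idea, with hypothetical users, resources and privileges: traces of actions performed on a resource are compared, and a user whose actions match those of an existing privilege holder is suggested to the owner for the same privilege. The thesis's system works domain-independently over Web-scale provenance data; this only conveys the matching intuition.

```python
from collections import Counter

# action traces: (user, resource, action) tuples gathered across domains
traces = [
    ("ana", "doc1", "view"), ("ana", "doc1", "comment"),
    ("rui", "doc1", "view"), ("rui", "doc1", "comment"),
    ("eva", "doc1", "view"),
]
granted = {("ana", "doc1"): "edit"}  # ana already holds the edit privilege

def recommend(resource, traces, granted):
    """Suggest (user, privilege) pairs for users whose action profile on the
    resource matches that of a user who already holds a privilege."""
    by_user = {}
    for u, r, a in traces:
        if r == resource:
            by_user.setdefault(u, Counter())[a] += 1
    out = []
    for (holder, r), priv in granted.items():
        if r != resource or holder not in by_user:
            continue
        for u, acts in by_user.items():
            if u != holder and (u, r) not in granted and acts == by_user[holder]:
                out.append((u, priv))
    return out

print(recommend("doc1", traces, granted))  # [('rui', 'edit')] — rui acts like ana
```

A real recommender would use similarity scores and the feedback loop described in section 4.1.3 rather than exact profile equality.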
Abstract:
Radiotherapy (RT) is one of the most important approaches in the treatment of cancer, and its performance can be improved in three different ways: through the optimization of the dose distribution, by the use of different irradiation techniques, or through the study of radiobiological initiatives. The first is purely physical, because it is related to the physical dose distribution. The others are purely radiobiological, because they increase the differential effect between the tumour and the healthy tissues. Treatment Planning Systems (TPS) are used in RT to create dose distributions with the purpose of maximizing tumour control and minimizing complications in the healthy tissues. Inverse planning uses dose optimization techniques that satisfy the criteria specified by the user regarding the target and the organs at risk (OARs). Dose optimization is possible through the analysis of dose-volume histograms (DVH) and with the use of computed tomography, magnetic resonance and other digital imaging techniques.
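The DVH analysis mentioned above can be sketched as follows (the voxel doses are synthetic, invented for illustration): a cumulative DVH gives, for each dose level, the fraction of a structure's volume receiving at least that dose, which is the curve a TPS plots per target or organ at risk.

```python
import numpy as np

def cumulative_dvh(dose, bins=50):
    """Cumulative dose-volume histogram: fraction of the structure volume
    receiving at least each dose level."""
    edges = np.linspace(0, dose.max(), bins)
    volume = np.array([(dose >= d).mean() for d in edges])
    return edges, volume

rng = np.random.default_rng(0)
target = np.clip(rng.normal(60, 2, 10_000), 0, None)  # synthetic target doses (Gy)
oar = np.clip(rng.normal(20, 5, 10_000), 0, None)     # synthetic OAR doses (Gy)

d_t, v_t = cumulative_dvh(target)
d_o, v_o = cumulative_dvh(oar)
# a typical plan metric, e.g. V50 for the OAR: volume fraction receiving >= 50 Gy
v50_oar = (oar >= 50).mean()
```

Inverse planning turns constraints on such curves (e.g. "OAR V50 below x%") into the objective the optimizer satisfies.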
Abstract:
The electrooxidative behaviour of citalopram (CTL) in aqueous media was studied by cyclic voltammetry (CV) and square-wave voltammetry (SWV) at a glassy-carbon electrode. The electrochemical behaviour of CTL involves two electrons and two protons in the irreversible, diffusion-controlled oxidation of the tertiary amine group. The maximum analytical signal was obtained in a phosphate buffer (pH = 8.2). For analytical purposes, an SWV method and a flow-injection analysis (FIA) system with amperometric detection were developed. The optimised SWV method showed a linear range between 1.10×10⁻⁵ and 1.20×10⁻⁴ mol L⁻¹, with a limit of detection (LOD) of 9.5×10⁻⁶ mol L⁻¹. Using the FIA method, a linear range between 2.00×10⁻⁶ and 9.00×10⁻⁵ mol L⁻¹ and an LOD of 1.9×10⁻⁶ mol L⁻¹ were obtained. The validation of both methods revealed good performance characteristics, confirming their applicability for the quantification of CTL in several pharmaceutical products.
Abstract:
Conference - 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), Jun 24-27, 2013
Abstract:
Reactive oxygen species (ROS) are produced as a consequence of normal aerobic metabolism and are able to induce oxidative DNA damage. At the cellular level, the protective effect of antioxidants can be evaluated by examining the integrity of the DNA nucleobases using electrochemical techniques. Herein, the use of an adenine-rich oligonucleotide (dA21) adsorbed on carbon paste electrodes for the assessment of antioxidant capacity is proposed. The method was based on the partial damage of a DNA layer adsorbed on the electrode surface by OH• radicals generated by the Fenton reaction, and on the subsequent electrochemical oxidation of the intact adenine bases to generate an oxidation product able to catalyze the oxidation of NADH. The presence of antioxidant compounds scavenged hydroxyl radicals, leaving more adenines unoxidized and thus increasing the electrocatalytic current of NADH measured by differential pulse voltammetry (DPV). Using ascorbic acid (AA) as a model antioxidant species, the detection of as little as 50 nM of AA in aqueous solution was possible. The protection efficiency was evaluated for several antioxidant compounds. The biosensor was applied to the determination of the total antioxidant capacity (TAC) in beverages.
Abstract:
In this study, a method for the electrochemical quantification of the total antioxidant capacity (TAC) in beverages was developed. The method is based on the oxidative damage to the purine bases, adenine or guanine, that are immobilized on a glassy carbon electrode (GCE) surface. The oxidative lesions on the DNA bases were promoted by the sulfate radical generated by the persulfate/iron(II) system. The presence of antioxidants in the reaction system protected the DNA bases immobilized on the GCE by scavenging the sulfate radical. Square-wave voltammetry (SWV) was the electrochemical technique used to perform this study. The efficiencies of five antioxidants (ascorbic acid, gallic acid, caffeic acid, coumaric acid and resveratrol) in scavenging the sulfate radical and, therefore, their ability to protect the purine bases immobilized on the GCE were investigated. These results demonstrated that the purine-based biosensor is suitable for the rapid assessment of the TAC in flavors and flavored water.
Abstract:
Dissertation submitted for the degree of Master in Electrical Engineering, branch of Automation and Industrial Electronics