991 results for digital capture
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects: 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data including stylus position (x-y coordinates) and timestamps (milliseconds) were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis; the score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis.
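The APEN computation described above can be sketched as follows. This is a minimal illustration of the standard Approximate Entropy algorithm with the abstract's parameters (window size m=4, tolerance r = 0.2 × SD of the series); the function and variable names are illustrative, not taken from the study's software.

```python
import numpy as np

def approximate_entropy(x, m=4, r_frac=0.2):
    """Approximate Entropy of a 1-D time series.

    m      -- window (embedding) size; the study used m = 4.
    r_frac -- similarity tolerance as a fraction of the series SD (0.2 here).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_frac * np.std(x)

    def phi(m):
        # all overlapping windows of length m
        windows = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of windows
        dist = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        # fraction of windows within tolerance r (self-matches included,
        # so the argument of the log is never zero)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    # irregular series lose many more matches when the window grows by one
    return phi(m) - phi(m + 1)
```

In the study the resulting value was further normalized by the total drawing completion time; with a timestamp series `t` in milliseconds that would be `approximate_entropy(x) / (t[-1] - t[0])`.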
The first method was based on the Digital Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment; this score is henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity; this score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55). CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD.
APEN was not correlated with either of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
Abstract:
Nowadays the real contribution of light to accelerating the chemical reaction in dental bleaching is viewed with skepticism, mostly because the actual mechanisms of its contribution remain obscure. Objectives: To determine the influence of the pigment of three colored bleaching gels on light distribution and absorption in teeth; to accomplish this, bovine teeth and three colored bleaching gels were used in this experiment. It is well known that dark molecules absorb light and increase the local temperature, raising the bleaching rate; these molecules are located at the interface between enamel and dentin. Methods: This study was carried out using an argon laser at 455 nm with 150 mW and an LED with the same characteristics, together with three colored gels (green, blue and red); the digital images were captured with a CCD camera connected to a PC. The images were processed in a mathematical environment (MATLAB R12). Results: The results obtained show that the color of the bleaching gel significantly influences the absorption of light at specific sites of the teeth. Conclusions: This poor absorption may be one of the major factors behind the skepticism about the contribution of light to the process that can be observed in the literature nowadays.
Abstract:
INTRODUCTION: The use of computerized photogrammetry in place of goniometry, or vice versa, in clinical practice still lacks consistent grounding. OBJECTIVES: The objectives of this study were: to verify the inter- and intra-examiner reliability of the angular measurements obtained with computerized photogrammetry and with goniometry, and to determine the parallel reliability between these two assessment instruments. MATERIALS AND METHODS: 26 volunteers and 4 examiners took part in the study. Data collection was carried out in 4 sequential stages: marking of the anatomical reference points, measurement and recording of the goniometric values, capture of the volunteer's image with the markers attached to the body, and evaluation of the photographic record in the ImageJ program. RESULTS: The goniometer is a reliable instrument according to most of the evidence; however, the reliability of the measurements depends mainly on the standardization of procedures. Methodological considerations regarding the establishment of reliability and the standardization of marker placement are necessary, so as to offer even more reliable assessment options for clinical practice. CONCLUSION: Both instruments are reliable and acceptable, but more evidence is still needed to support their use, since few researchers have used the same study design and comparing results across studies is often difficult.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Graduate Program in Digital Television: Information and Knowledge - FAAC
Abstract:
Graduate Program in Mechanical Engineering - FEG
Abstract:
In soil surveys, several sampling systems can be used to define the most representative sites for sample collection and description of soil profiles. In recent years, the conditioned Latin hypercube sampling system has gained prominence for soil surveys. In Brazil, most soil maps are at small scales and in paper format, which hinders their refinement. The objectives of this work were: (i) to compare two sampling systems based on the conditioned Latin hypercube for mapping soil classes and soil properties; (ii) to retrieve information from a detailed-scale soil map of a pilot watershed for its refinement, comparing two data mining tools, and to validate the new soil map; and (iii) to create and validate a soil map of a much larger, similar area by extrapolating the information extracted from the existing soil map. Two sampling schemes were created, one by the conditioned Latin hypercube and one by the cost-constrained conditioned Latin hypercube. At each prospection site, soil classification and measurement of the A horizon thickness were performed. Maps were generated and validated for each sampling scheme, comparing the efficiency of the two methods. The conditioned Latin hypercube captured greater variability of soils and properties than the cost-constrained version, although the former involved greater difficulty in field work. The conditioned Latin hypercube can thus capture greater soil variability, while the cost-constrained conditioned Latin hypercube presents great potential for use in soil surveys, especially in areas of difficult access. From an existing detailed-scale soil map of a pilot watershed, topographical information for each soil class was extracted from a Digital Elevation Model and its derivatives using two data mining tools. Maps were generated with each tool. The more accurate of the two tools was used to extrapolate soil information to a much larger, similar area, and the generated map was validated.
It was possible to retrieve the existing soil map information and apply it to a larger area with similar soil forming factors, at much lower financial cost. The KnowledgeMiner data mining tool, together with ArcSIE, used to create the soil map, presented the better results and enabled the existing soil map to be used to extract soil information and apply it to similar, larger areas at reduced cost, which is especially important in developing countries with limited financial resources for such activities, such as Brazil.
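The idea behind conditioned Latin hypercube sampling, choosing n field sites so that each environmental covariate (e.g. the DEM derivatives mentioned above) is sampled roughly once per quantile stratum, can be sketched as below. The published cLHS algorithm uses simulated annealing; this greedy single-pass version is only an illustration under that simplification, and all names are hypothetical.

```python
import numpy as np

def clhs_greedy(X, n, seed=0):
    """Greedy approximation of conditioned Latin hypercube sampling.

    X -- array (n_sites, n_covariates): covariate values at candidate sites.
    n -- number of sites to select (also the number of quantile strata).
    Returns the indices of the selected sites.
    """
    rng = np.random.default_rng(seed)
    n_sites, n_cov = X.shape
    # quantile edges split each covariate into n equal-probability strata
    edges = np.quantile(X, np.linspace(0, 1, n + 1), axis=0)
    # stratum index (0..n-1) of every site, per covariate
    strata = np.stack(
        [np.searchsorted(edges[1:-1, j], X[:, j]) for j in range(n_cov)], axis=1)
    filled = np.zeros((n_cov, n), dtype=int)  # stratum occupancy per covariate
    chosen = []
    for _ in range(n):
        best, best_score = -1, -1
        for i in rng.permutation(n_sites):
            if i in chosen:
                continue
            # prefer sites that fill still-empty strata in many covariates
            score = sum(filled[j, strata[i, j]] == 0 for j in range(n_cov))
            if score > best_score:
                best, best_score = int(i), score
        chosen.append(best)
        filled[np.arange(n_cov), strata[best]] += 1
    return np.array(chosen)
```

A cost-constrained variant would add a per-site cost term (e.g. distance from roads) to the score, so that cheaper sites are preferred among equally informative ones.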
Abstract:
This work presents the development, implementation and field-test results of a power system stabilizer (PSS) designed with digital control techniques to damp electromechanical oscillation modes observable in the electrical power signals measured at a 350 MVA hydro-generating unit of the Tucuruí hydroelectric plant. A methodology for identifying linear parametric models of the autoregressive with exogenous inputs (ARX) type is presented and applied to estimate models capable of capturing the relevant information (damping and natural frequency) of the system's dominant electromechanical modes. From the ARX parametric model, the damping digital control law of the PSS is then synthesized by radially shifting the poles of the closed-loop transfer function. An RST canonical structure was used for the synthesis of the digital control law. For the field tests, the damping control law of the digital PSS was coded in C and embedded in a prototype whose hardware is based on a DSPIC 30F3014 microcontroller, which incorporates a large number of peripherals for data acquisition and communication. To assess the performance of the digital PSS, experimental tests were carried out on a 350 MVA generating unit of powerhouse number 1 of the Tucuruí plant. The stabilizer acts by modulating the voltage reference of the automatic voltage regulator of the generating unit, according to the oscillations observed in the electrical power measured at the generator stator. The field-test results showed excellent performance of the digital PSS in damping an electromechanical mode with a natural frequency of approximately 1.7 Hz.
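The ARX identification step described above reduces to a linear least-squares problem. The sketch below fits the standard ARX structure y[t] + a1·y[t-1] + … + a_na·y[t-na] = b1·u[t-1] + … + b_nb·u[t-nb] + e[t]; the model orders and names are illustrative, since the abstract does not give the orders actually used.

```python
import numpy as np

def fit_arx(y, u, na=4, nb=4):
    """Least-squares ARX fit.

    y, u   -- output (e.g. electrical power) and input (excitation) series.
    na, nb -- number of autoregressive and exogenous coefficients.
    Returns (a, b): coefficients of A(z) (excluding the leading 1) and B(z).
    """
    y = np.asarray(y, dtype=float)
    u = np.asarray(u, dtype=float)
    n0 = max(na, nb)
    # regressor rows: [-y[t-1] .. -y[t-na], u[t-1] .. u[t-nb]]
    rows = [np.concatenate([-y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n0, len(y))]
    Phi = np.array(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n0:], rcond=None)
    return theta[:na], theta[na:]
```

From the fitted (a, b) one obtains the discrete transfer function B(z)/A(z), on which the radial pole-shifting step described in the abstract can then be applied to synthesize the damping control law.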
Abstract:
Sand flies (Phlebotominae) are a group of insects of great medical and veterinary interest, above all because of their role as vectors of the leishmaniases. Correct identification of these species in the field is an essential practice in entomological research and in the control of Leishmania vectors, especially in the state of Pará, where the sand fly fauna is highly diverse in comparison with the rest of Brazil. However, this practice has been carried out by technicians trained on the identification key updated in 2003, supported by hand drawings from various researchers, which are often difficult to interpret. This study therefore aimed to demonstrate the value of digital photomicrography as a supporting tool for the identification and recording of sand flies of the subgenus Psychodopygus. The methodological approach was the capture of images of slides from the sand fly collection of the Leishmaniasis Laboratory of the Instituto Evandro Chagas, covering the various epidemiological study areas of leishmaniasis in the state of Pará from 1970 to the present, using the following systems: Axiostar, Canon via Phototube, and a conventional camera placed over the microscope eyepiece. From the captured images, a database was built, organized according to the taxonomic hierarchy of Phlebotominae, and the images were then compared with the traditional illustrations of identification keys. From a total of 2105 slides, 222 images of sand flies were obtained. A database containing 344 images of the 17 Psychodopygus species was prepared during the study.
It is concluded that the structures drawn by different researchers over the years show differences, reflecting the subjectivity of interpretation of the same species within the subgenus Psychodopygus. Digital photomicrography thus proves to be a highly useful and important resource, offering advantages such as better image quality, durability, faithfulness to what is actually observed, and technological practicality, giving the professional greater confidence and reliability during epidemiological investigation.
Abstract:
Graduate Program in Information Science - FFC
Abstract:
This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language, which basically adopts an XHTML syntax, able to capture a posteriori only the content of a digital document. It is compared with other languages and proposals, in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user advantages, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), and from e-learning (IsaLearning) to professional printing (IsaPress).
Abstract:
Carbon dioxide (CO2) capture and storage experiments were conducted at ambient conditions in sodium carbonate (Na2CO3) solutions of varying weight percent. Experiments were conducted to determine the optimal amount of Na2CO3 in solution for CO2 absorption. It was concluded that a 2% Na2CO3 solution, by weight, was the most efficient; the 2% solution is able to absorb 0.5 g CO2/g Na2CO3. These results led to studies of how gas bubble size affects carbon dioxide absorption in the solution, conducted using ASTM porosity gas diffusers (fine, medium, and extra coarse) to vary the bubble size. The medium porosity gas diffuser was the most efficient, absorbing CO2 at 50%. The dependence on bubble size shows that absorption of carbon dioxide into the sodium carbonate solution is mass-transfer limited. Once the capture stage was optimized (amount of Na2CO3 in solution and bubble size), the next step was to determine whether carbon dioxide could be stored as a calcium carbonate mineral using calcium-rich industrial waste and whether the sodium carbonate solution could be simultaneously regenerated. Studies of CO2 sequestration at ambient conditions have shown that it is possible to permanently sequester CO2 in the form of calcium carbonate using a calcium-rich industrial waste. Studies have also shown that it is possible to regenerate a fraction of the sodium carbonate solution.
Abstract:
The novel approach to carbon capture and storage (CCS) described in this dissertation is a significant departure from the conventional approach to CCS. The novel approach uses a sodium carbonate solution to first capture CO2 from post-combustion flue gas streams. The captured CO2 is then reacted with an alkaline industrial waste material, at ambient conditions, to regenerate the carbonate solution and permanently store the CO2 in the form of an added-value carbonate mineral. Conventional CCS makes use of a hazardous amine solution for CO2 capture, a costly thermal regeneration stage, and the underground storage of supercritical CO2. The objective of the present dissertation was to examine each individual stage (capture and storage) of the proposed approach to CCS. Study of the capture stage found that a 2% w/w sodium carbonate solution was optimal for CO2 absorption in the present system, yielding the best tradeoff between the CO2 absorption rate and the CO2 absorption capacity of the solutions tested. Examination of CO2 absorption in the presence of flue gas impurities (NOx and SOx) found that carbonate solutions possess a significant advantage over amine solutions, in that they can be used for multi-pollutant capture: all of the NOx and SOx fed to the carbonate solution was captured. Optimization studies found that it was possible to increase the absorption rate of CO2 into the carbonate solution by as much as 14% by adding a surfactant to the solution to chemically alter the gas bubble size. Three coal combustion fly ash materials were chosen as the alkaline industrial waste materials for studying the storage of CO2 and the regeneration of the absorbent. X-ray diffraction analysis of reacted fly ash samples confirmed that the captured CO2 reacts with the fly ash materials to form a carbonate mineral, specifically calcite.
Studies found that after a five-day reaction time, 75% utilization of the waste material for CO2 storage could be achieved while regenerating the absorbent. The regenerated absorbent exhibited nearly identical CO2 absorption capacity and CO2 absorption rate to those of a fresh Na2CO3 solution.
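The capture/regeneration cycle described in these two abstracts is consistent with the standard carbonate/bicarbonate chemistry sketched below. These overall reactions are my reading, not equations stated by the author; the calcium source in the fly ash experiments is assumed to be the free lime (CaO/Ca(OH)2) fraction of the waste:

    Capture:                    Na2CO3 + CO2 + H2O -> 2 NaHCO3
    Storage and regeneration:   Ca(OH)2 + 2 NaHCO3 -> CaCO3 + Na2CO3 + 2 H2O

The second reaction precipitates the captured CO2 as calcite, matching the X-ray diffraction results, while returning Na2CO3 to solution for reuse.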