987 results for Absolute orientation method
Abstract:
Image orientation is a basic problem in Digital Photogrammetry. While interior and relative orientation have been successfully automated, the same cannot be said of absolute orientation. This process can be automated using an approach based on relational matching and a heuristic that exploits the analytical relation between straight-line features in object space and their homologues in image space. The method also includes a built-in self-diagnosis, based on implementing the data-snooping statistical test within the spatial resection process using Iterated Extended Kalman Filtering (IEKF). The aim of this paper is to present the basic principles of the proposed approach and results based on real data.
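As a hedged illustration (the abstract does not spell the test out), data snooping usually screens each observation with Baarda's standardized residual,

\[ w_i = \frac{\hat v_i}{\sigma_0 \sqrt{q_{\hat v_i \hat v_i}}} , \]

where \(\hat v_i\) is the estimated residual, \(\sigma_0\) the a-priori standard deviation of unit weight and \(q_{\hat v_i \hat v_i}\) the corresponding diagonal element of the residual cofactor matrix; an observation is flagged as a blunder when \(|w_i|\) exceeds the critical value of the standard normal test (e.g. 3.29 for \(\alpha = 0.1\%\)). In an IEKF-based spatial resection, an analogous test on the filter innovations would be the natural place to embed such self-diagnosis.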
Abstract:
OBJECTIVE: In ictal scalp electroencephalography (EEG), the presence of artefacts and the wide range of discharge patterns are hurdles to good diagnostic accuracy. Quantitative EEG aids the lateralization and/or localization of epileptiform activity. METHODS: Twelve patients achieving Engel Class I/IIa outcome one year after temporal lobe surgery were selected, with approximately 1-3 ictal EEGs analyzed per patient. The EEG signals were denoised with the discrete wavelet transform (DWT), followed by computation of the normalized absolute slopes and spatial interpolation of the scalp topography with detection of local maxima. For localization, the region with the highest normalized absolute slopes at the time epileptiform activity was registered (>2.5 times the standard deviation) was designated the region of onset. For lateralization, the cerebral hemisphere registering the first appearance of normalized absolute slopes >2.5 times the standard deviation was designated the side of onset. For comparison, all EEG episodes were reviewed by two neurologists blinded to clinical information, who determined the localization and lateralization of seizure onset by visual analysis. RESULTS: 16/25 seizures (64%) were correctly localized by the visual method and 21/25 (84%) by the quantitative EEG method. 12/25 seizures (48%) were correctly lateralized by the visual method and 23/25 (92%) by the quantitative EEG method. The McNemar test gave p=0.15 for localization and p=0.0026 for lateralization when comparing the two methods. CONCLUSIONS: The quantitative EEG method correctly lateralized significantly more seizure episodes, and there was a trend towards more correctly localized seizures. SIGNIFICANCE: Coupling DWT with the absolute slope method helps clinicians achieve better EEG diagnostic accuracy.
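A minimal sketch of the signal-processing chain described above, assuming PyWavelets for the DWT step; the wavelet, the normalization and the 2.5-SD thresholding convention actually used by the authors are not given in the abstract, so the choices below are illustrative only.

```python
# Hedged sketch (not the authors' code): DWT denoising followed by a
# normalized absolute-slope score per channel, with a mean + 2.5*SD cutoff.
import numpy as np
import pywt  # PyWavelets

def normalized_absolute_slope(eeg, fs, wavelet="db4", level=5):
    """eeg: (n_channels, n_samples) array; fs: sampling rate in Hz."""
    scores = []
    for ch in eeg:
        # Wavelet decomposition and soft-thresholding of the detail coefficients
        coeffs = pywt.wavedec(ch, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
        thr = sigma * np.sqrt(2 * np.log(len(ch)))               # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, wavelet)[: len(ch)]
        # Absolute slope (first difference scaled by the sampling rate),
        # normalized by the channel's own variability (assumed convention)
        abs_slope = np.abs(np.diff(denoised)) * fs
        scores.append(abs_slope.mean() / (abs_slope.std() + 1e-12))
    scores = np.asarray(scores)
    onset_channels = scores > scores.mean() + 2.5 * scores.std()  # assumed rule
    return scores, onset_channels
```

In practice the per-channel scores would then be interpolated over the scalp, with the earliest supra-threshold region or hemisphere taken as the onset, as the abstract describes.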
Abstract:
13th International Conference on Autonomous Robot Systems (Robotica), 2013, Lisboa
Abstract:
Objectives: The resazurin microtitre plate assay (REMA) was evaluated for determining the susceptibility of Mycobacterium tuberculosis to pyrazinamide and was compared with the broth microdilution method (BMM), the absolute concentration method (ACM) and pyrazinamidase (PZase) determination. Methods: Thirty-four M. tuberculosis clinical isolates (26 susceptible and 8 resistant to pyrazinamide) and the reference strains M. tuberculosis H37Rv ATCC 27294 and Mycobacterium bovis AN5 were tested. Results: REMA and BMM showed 100% specificity and sensitivity when compared with ACM; BMM, however, required a longer reading time. The PZase determination assay showed 87.50% sensitivity and 100% specificity. Conclusions: All methods tested in this preliminary study showed excellent sensitivity and specificity for determining the pyrazinamide susceptibility of M. tuberculosis, but REMA was faster, low-cost and easy to perform and interpret. Additional studies evaluating REMA for differentiating pyrazinamide-resistant and -susceptible M. tuberculosis should be conducted on an extended panel of clinical isolates.
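For reference (our reading of the figures, not stated explicitly in the abstract): with the 8 pyrazinamide-resistant isolates as the positive class, 87.50% sensitivity corresponds to 7 of the 8 resistant isolates being correctly detected (7/8 = 0.875), and 100% specificity to all 26 susceptible isolates being correctly classified (26/26).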
Abstract:
A notable phenomenon of our century has been the emergence of so-called "global art" (Belting), which registers the crisis of the "art world" and the widespread dissemination of artistic practices. In this context the work of Rothko gains an unexpected force. Although usually inscribed within "modernism," with its values of purity and specificity of the medium, in this case painting, our investigation reveals that the Rothkonian gesture far exceeds this representation, which would lead one to distinguish radically between a mythical and surrealist phase, an abstractionist "colour field" phase and, finally, a sublime phase of the paintings for Rothko's ecumenical chapel. There is an evident continuity that points to a geoaesthetics in which the earth and its habitability play a crucial role; hence the need to inscribe Rothko's work within contemporary geophilosophy as outlined by Gilles Deleuze and Félix Guattari. A critical analysis of Rothko's work and aesthetics was therefore carried out, which, prophetically but unconsciously, seems to open the way to thinking an art of the earth. This line of continuity runs through Rothko's entire oeuvre, reflecting a picturing of the world and the will to create a pictorial and poetic world reduced to minimal, post-figurative elements in which the recurrence of motifs such as frame and opening, horizon line, porticoes and passages can still be recognised. In a second moment, this "unconscious" dimension is explored in a personal artistic project, unfolding in pictorial approaches of painting, installation and video, which we call "The Earth as Event." This project extends the Rothkonian effort while profoundly altering it, notably in the use of materials, in the mutation of the use of colour, and in the way the figurative elements are radically altered by the mere transposition of the perspective used. If the Rothkonian resonance is very much present, no less present is the intention of a dialogical confrontation with the work of Mark Rothko. What in this important artist was unconscious, marked by myth and theology, by the delimitation of a decidedly classical horizon line and, above all, by a markedly theological verticality, becomes in "The Earth as Event" the matter that is profoundly radicalised, as is the conceptual logic, which is preferentially circular, without absolute orientation, and incomplete, implying another vision of the "opening"/"closure" so essential in Rothko's work. This investigation is expected to make a significant contribution to current debates on contemporary art.
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The methodologies compared were: two iterative single-orientation methods minimizing the l2 or l1-TV norm of the prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF=0.95, r²_MCF=0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
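For context, single-orientation QSM reconstructions of the kind compared here typically solve a regularized dipole inversion; in a generic notation (ours, not necessarily the exact functionals used by the authors),

\[ \chi^{\ast} = \arg\min_{\chi}\; \bigl\| W \bigl( F^{-1} D F \chi - \phi \bigr) \bigr\|_2^{2} \;+\; \lambda \bigl\| M\, G \chi \bigr\|_{1} , \]

where \(F\) is the Fourier transform, \(D\) the dipole kernel in k-space, \(\phi\) the unwrapped, background-corrected field map, \(W\) a data-fidelity weight, \(G\) a gradient operator and \(M\) an edge mask derived from the magnitude prior. Replacing the l1 term with \(\|G\chi\|_2^2\) gives the l2-regularized variant, while COSMOS avoids the regularization by combining acquisitions at multiple head orientations to make the inversion well-posed.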
Abstract:
Objective: To assess the primary health care attributes of first-contact access, comprehensiveness, coordination, continuity, family guidance and community orientation. Method: An evaluative, quantitative, cross-sectional study with 35 professional teams of the Family Health Program in the Alfenas region, Minas Gerais, Brazil. Data were collected with the Primary Care Assessment Tool - Brazil, professional version. Results: A low percentage of medical experts was found among the participants, who evaluated the attributes with high scores, with the exception of first-contact access. Data analysis revealed needs for improvement in hours of service, in the forms of communication between clients and healthcare services and between clients and professionals, and in the counter-referral mechanism. Conclusion: There is a mismatch between the provision of services and the needs of the population, which compromises the quality of primary health care.
Abstract:
The long DNA biopolymer is condensed inside the nucleus of eukaryotic cells with the help of small proteins called histones. In addition to their condensing function, these histones are also the target of numerous post-translational modifications (PTMs), particularly on their N-terminal tails. These reversible modifications are part of a heritable epigenetic histone code that dynamically orchestrates and modulates chromatin-related events such as gene activation and silencing, as well as DNA replication and repair. These modifications are consequently implicated in the signalling and progression of cancers such as leukaemia. Elucidating histone modifications is therefore important for understanding their biological functions. An analytical methodology was developed in the laboratory to isolate, detect and quantify histone PTMs using a rapid two-pronged approach supported by specialised bioinformatics tools. The methodology was validated using wild-type histones as well as two types of acetyltransferase-deficient mutant histones. Of the three histone sources used, the only PTM showing a significant change was acetylation of histone H3 at lysine 56 (H3K56ac). The expression and stoichiometry of this PTM from wild-type and mutant cells were accurately determined and compared. The versatile scan functions of a hybrid linear quadrupole ion trap instrument were used to improve the detection of intact proteins. The "enhanced multiply charged" (EMC) scan mode was modified to contain and detect intact protein ions in the linear ion trap. This scan mode, named "targeted EMC" (tEMC), quadrupled the sensitivity (signal/interference) and quintupled the resolution of the conventional scan mode. Moreover, the charge-separation capability of tEMC significantly reduced space-charge effects in the linear ion trap. The superior resolution of the tEMC mode made it possible to distinguish several modified isoforms, particularly for histone H3. Analysis of tryptic histone peptides using the MRM scan mode allowed the sequencing and quantification of PTMs with a high degree of precision. The only PTM that was under-expressed in the DOT1L mutant relative to the wild-type histones was methylation of histone H3 at lysine 79 (H3K79me1). The effects of two HDAC inhibitors (HDACi) on histone PTM expression were evaluated using the analytical methodology described above. Histones extracted from normal and cancer cells were exposed to Vorinostat (SAHA) or Entinostat (MS-275) for 24 to 72 hours. Two histones were mainly affected, namely H3 and H4. Surprisingly, the same effects were not detected when the normal cells were treated with the HDACi for 48 to 72 hours. An absolute quantification method with a calibration curve was developed for the H3K56ac peptide. Contrary to some published reports, our results show that this PTM is present in mammalian cells at very low stoichiometry (<0.1%) and is not significantly overexpressed after HDACi treatment.
Abstract:
Protecting the quality of children's growth and development is a paramount requirement for the betterment of a nation. The double burden of child malnutrition is emerging worldwide; it can strongly influence the quality of child brain development and cannot be compensated for later in life. Milk occupies a notable share of the diet during infancy and childhood, so a deeper insight into milk consumption patterns may help explain the phenomenon of the double burden of child malnutrition in relation to cognitive impairment. Objective: The present study aims (1) to examine the current face of the Indonesian double burden of child malnutrition in a case study in Bogor, West Java, Indonesia, (2) to investigate the association of this phenomenon with child brain development, and (3) to examine the contribution of socioeconomic status and milk consumption to this phenomenon, in order to formulate possible solutions to this problem. Design: A cross-sectional study using a structured coded questionnaire was conducted among 387 children aged 5-6 years and their parents from 8 areas in Bogor, West Java, Indonesia, from November 2012 to December 2013, recording socioeconomic status, anthropometric measurements and breastfeeding history. Diet and probability of milk intake were assessed with two 24-h dietary recalls and a food frequency questionnaire (FFQ). Usual daily milk intake was calculated using the Multiple Source Method (MSM). Brain development indicators (IQ, EQ, learning and memory ability) were assessed using the Projective Multi-phase Orientation method to study the correlation between the double burden of child malnutrition and brain development. Results and conclusions: A snapshot of the double burden of child malnutrition is shown in Bogor, West Java, Indonesia, where the prevalence of Severe Acute Malnutrition (SAM) is 27.1%, Moderate Acute Malnutrition (MAM) is 24.9%, and overnutrition is 7.7%. This phenomenon is shown to impair child brain development. Malnourished children, both under- and over-nourished, have significantly (P<0.05) lower memory ability than normal children (memory score, N: SAM = 45.2, 60; MAM = 48.5, 61; overweight = 48.4, 43; obesity = 47.9, 60; normal = 52.4, 163). Plausible explanations are that the lack of nutrient intake during the growth-spurt period in undernourished children, or increasing adiposity in overnourished children, may affect the growth of the hippocampal region, which is responsible for memory ability. Whether for undernutrition or overnutrition, preventive action is preferable in order to avoid ongoing loss of cognitive performance in the next generation. Possible solutions include promoting breastfeeding initiation and exclusive breastfeeding for infants, supporting the consumption of a normal portion of milk (250 to 500 ml per day) for children, and breaking the chain of poverty through socioeconomic improvement. National food security, finally, is the fundamental basis for the betterment of the next generation. In the global context, the causes of under- and over-nutrition have to be tackled through integrated and systemic approaches for a better quality of the next generation of human beings.
Abstract:
1. Suction sampling is a popular method for the collection of quantitative data on grassland invertebrate populations, although there have been no detailed studies of the effectiveness of the method. 2. We investigate the effect of sampling effort (duration and number of suction samples) and sward height on the efficiency of suction sampling of grassland beetle, true bug, planthopper and spider populations. We also compare suction sampling with an absolute sampling method based on the destructive removal of turfs. 3. Sampling for a duration of 16 seconds was sufficient to collect 90% of all individuals and species of grassland beetles, with less time required for the true bugs, spiders and planthoppers. The number of samples required to collect 90% of the species was more variable, although in general 55 sub-samples were sufficient for all groups except the true bugs. Increasing sward height had a negative effect on the capture efficiency of suction sampling. 4. The assemblage structure of beetles, planthoppers and spiders was independent of the sampling method (suction or absolute) used. 5. Synthesis and applications. In contrast to other sampling methods used in grassland habitats (e.g. sweep netting or pitfall trapping), suction sampling is an effective quantitative tool for the measurement of invertebrate diversity and assemblage structure, provided sward height is included as a covariate. The effective sampling of beetles, true bugs, planthoppers and spiders together requires a minimum sampling effort of 110 sub-samples of 16 seconds each. Such sampling intensities can be adjusted depending on the taxa sampled, and we provide information to minimize sampling problems associated with this versatile technique. Suction sampling should remain an important component of the toolbox of techniques used in both experimental and management sampling regimes within agroecosystems, grasslands and other low-lying vegetation types.
Abstract:
Outliers are observations that appear inconsistent with the rest of the data. Also called atypical, extreme or aberrant values, these inconsistencies may be caused by policy changes or economic crises, unexpected cold or heat waves, or measurement and typing errors, among other factors. Outliers are not necessarily incorrect values, but when they arise from measurement or typing errors they can distort the results of an analysis and lead the researcher to mistaken conclusions. The aim of this work is to study and compare different methods for detecting abnormalities in price series of the Consumer Price Index (IPC) computed by the Brazilian Institute of Economics (IBRE) of Fundação Getulio Vargas (FGV). The IPC measures the price variation of a fixed basket of goods and services that make up the usual expenses of families with income between 1 and 33 monthly minimum wages, and it is used mainly as a reference index for assessing consumer purchasing power. In addition to the method currently used at IBRE by the price analysts, the methods considered in this study are: variations of the IBRE method, the Boxplot method, the SIQR Boxplot method, the Adjusted Boxplot method, the Resistant Fences method, the Quartile and Modified Quartile methods, the Median Absolute Deviation method, and Tukey's Algorithm. These methods were applied to data from the municipalities of Rio de Janeiro and São Paulo. To analyse the performance of each method, the true extreme values must be known in advance; in this work, therefore, the analysis was carried out assuming that the prices discarded or changed by the analysts during the review process are the true outliers. The IBRE method is strongly correlated with the prices changed or discarded by the analysts, so this assumption may influence the results and favour the IBRE method over the others. Nevertheless, it makes it possible to compute two measures by which the methods are evaluated. The first is the method's hit rate, which gives the proportion of true outliers detected. The second is the number of false positives produced by the method, which indicates how many values had to be flagged for a true outlier to be detected. The higher the hit rate and the lower the number of false positives, the better the method's performance. On this basis it was possible to build a ranking of the methods' performance and to identify the best among those analysed. For the municipality of Rio de Janeiro, some variations of the IBRE method performed as well as or better than the original method, while for São Paulo the IBRE method performed best. In future work, the methods are expected to be tested on simulated data or on datasets widely used in the literature, so that the assumption that the prices discarded or changed by the analysts during the review process are the true outliers does not affect the results.
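As a hedged sketch of one of the methods compared, the Median Absolute Deviation rule, in Python; the cutoff k = 3.5 is a common textbook choice and not necessarily the value adopted in the study.

```python
# Hedged sketch (not the IBRE production code): flagging outliers in a vector
# of price relatives with the Median Absolute Deviation (MAD) rule.
import numpy as np

def mad_outliers(x, k=3.5):
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    # 0.6745 rescales the MAD so the score is comparable to a z-score
    # under normality; guard against a zero MAD.
    if mad == 0:
        return np.zeros_like(x, dtype=bool)
    score = 0.6745 * np.abs(x - med) / mad
    return score > k

prices = [1.02, 0.99, 1.01, 1.03, 4.80, 1.00]   # toy price relatives
print(mad_outliers(prices))                      # only the 4.80 entry is flagged
```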
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In this work we propose a technique that uses uncontrolled small-format aerial images (SFAI) and stereophotogrammetry techniques to construct georeferenced mosaics. Images are obtained using a simple digital camera coupled with a radio-controlled (RC) helicopter. Techniques for removing common distortions are applied, and the relative orientation of the models is recovered using projective geometry. Ground truth points are used to obtain the absolute orientation, together with a definition of scale and a coordinate system that relates image measurements to the ground. The mosaic is read into a GIS system, providing useful information to different types of users, such as researchers, governmental agencies, employees, fishermen and tourism enterprises. Results are reported, illustrating the applicability of the system. The main contribution is the generation of georeferenced mosaics using SFAIs, which have not yet been broadly explored in cartography projects. The proposed architecture presents a viable and much less expensive solution compared to systems using controlled pictures.
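A minimal sketch of the closed-form absolute orientation step the paragraph refers to, assuming corresponding model and ground points have already been matched; this is the standard Umeyama/Horn similarity-transform estimate, not necessarily the paper's exact implementation.

```python
# Hedged sketch: absolute orientation of a photogrammetric model from ground
# control points, i.e. the similarity transform (scale s, rotation R,
# translation t) mapping model coordinates onto ground coordinates, estimated
# in closed form via SVD.
import numpy as np

def absolute_orientation(model_pts, ground_pts):
    """model_pts, ground_pts: (n, 3) arrays of corresponding points, n >= 3."""
    X = np.asarray(model_pts, dtype=float)
    Y = np.asarray(ground_pts, dtype=float)
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my
    # Cross-covariance between the centred point sets and its SVD
    U, S, Vt = np.linalg.svd(Yc.T @ Xc / len(X))
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt                                 # rotation
    s = np.trace(np.diag(S) @ D) / Xc.var(axis=0).sum()   # isotropic scale
    t = my - s * R @ mx                            # translation
    return s, R, t
```

Applying the recovered (s, R, t) to every model point expresses the mosaic in the ground coordinate system, which is what ties the imagery to the GIS.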
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This thesis presents a detailed study of fundamental properties of the calcite CaCO3(10.4) surface and related mineral surfaces, made possible not only by the use of non-contact atomic force microscopy but, above all, by the measurement of force fields. The absolute surface orientation, as well as the underlying atomic-scale process, was successfully identified for the calcite (10.4) surface. The adsorption of chiral molecules on calcite is relevant to biomineralisation, which makes an understanding of the surface symmetry indispensable, and measuring the surface force field at the atomic scale is a central aspect of this. Such a force map not only sheds light on the interaction of the surface with molecules, which is important for biomineralisation, but also offers the possibility of identifying atomic-scale processes and thus surface properties. The introduction of a highly flexible measurement protocol ensures the reliable acquisition of the surface force field, a capability not available commercially. The conversion of the raw Δf data into the vertical force Fz is, however, not a trivial procedure, especially when smoothing of the data is considered. This thesis describes in detail how Fz can be calculated correctly under the experimental conditions of this work. It further describes how the lateral forces Fy and the dissipation Γ were obtained, in order to exploit the full potential of this measurement method. To understand atomic-scale processes on surfaces, the short-range chemical forces Fz,SR are of utmost importance. Long-range contributions must be fitted to Fz and subtracted. This is an error-prone task, which was mastered in this work by finding three independent criteria that determine the onset zcut of Fz,SR, a quantity of central importance for this task. A thorough error analysis shows that taking the deviation of the lateral forces from one another as the criterion yields trustworthy Fz,SR. This is the first time a study has provided a criterion for determining zcut, complemented by a detailed error analysis. With the knowledge of Fz,SR and Fy it was possible to identify one of the fundamental properties of the CaCO3(10.4) surface: the absolute surface orientation. A strong tilt of the imaged objects
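A minimal sketch of the background-subtraction step described above, assuming a simple van der Waals power law for the long-range contribution and SciPy for the fit; the model, the initial guesses and the choice of zcut are illustrative assumptions, not the procedure used in the thesis.

```python
# Hedged sketch: separating the short-range force Fz,SR from a measured
# Fz(z) curve by fitting a long-range background only at distances beyond
# an onset z_cut and subtracting it everywhere.
import numpy as np
from scipy.optimize import curve_fit

def long_range_model(z, C, z0):
    # Simple van der Waals-like background for a sphere-plane geometry (assumed)
    return -C / (z - z0) ** 2

def short_range_force(z, fz, z_cut):
    """z, fz: 1D arrays describing the force curve; z_cut: onset of Fz,SR."""
    far = z > z_cut                                       # fit only the long-range tail
    popt, _ = curve_fit(long_range_model, z[far], fz[far],
                        p0=(1e-28, z.min() - 1e-10), maxfev=10000)
    fz_sr = fz - long_range_model(z, *popt)               # subtract the background
    return fz_sr, popt
```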