916 results for Topology-based methods


Relevance:

90.00%

Publisher:

Abstract:

The realization that statistical physics methods can be applied to analyze written texts represented as complex networks has led to several developments in natural language processing, including automatic summarization and evaluation of machine translation. Most importantly, so far only a few metrics of complex networks have been used, and therefore there is ample opportunity to enhance statistics-based methods as new measures of network topology and dynamics are created. In this paper, we employ for the first time the metrics betweenness, vulnerability and diversity to analyze written texts in Brazilian Portuguese. Using strategies based on diversity metrics, a better performance in automatic summarization is achieved in comparison to previous work employing complex networks. With an optimized method, the ROUGE score (an automatic evaluation method used in summarization) was 0.5089, which is the best value ever achieved for an extractive summarizer with statistical methods based on complex networks for Brazilian Portuguese. Furthermore, the diversity metric can detect keywords with high precision, which is why we believe it is suitable for producing good summaries. It is also shown that incorporating linguistic knowledge through a syntactic parser does enhance the performance of the automatic summarizers, as expected, but the increase in the ROUGE score is only minor. These results reinforce the suitability of complex network methods for improving automatic summarizers in particular, and for processing text in general. (C) 2011 Elsevier B.V. All rights reserved.
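
To make the text-as-network representation concrete, here is a minimal sketch (my own illustration in Python with networkx, not the authors' code) of building a word-adjacency network and ranking words by betweenness, one of the metrics the paper employs:

```python
import networkx as nx

text = "complex networks can model written texts as word adjacency networks"
words = text.lower().split()

# Word-adjacency network: consecutive words become connected nodes.
g = nx.Graph()
g.add_edges_from(zip(words, words[1:]))

# Betweenness centrality: words lying on many shortest paths between others.
ranking = sorted(nx.betweenness_centrality(g).items(),
                 key=lambda kv: kv[1], reverse=True)
print(ranking[:3])   # high-betweenness words are keyword candidates
```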

Relevance:

90.00%

Publisher:

Abstract:

Evapotranspiration (ET) plays an important role in global climate dynamics and in the primary production of terrestrial ecosystems; it represents the mass and energy transfer from the land surface to the atmosphere. Limitations to measuring ET at large scales using ground-based methods have motivated the development of satellite remote sensing techniques. The purpose of this work is to evaluate the accuracy of the SEBAL algorithm for estimating surface turbulent heat fluxes at regional scale, using 28 images from MODIS. SEBAL estimates are compared with eddy-covariance (EC) measurements and results from the hydrological model MGB-IPH. SEBAL instantaneous estimates of latent heat flux (LE) yielded r² = 0.64 and r² = 0.62 over sugarcane croplands and savannas, respectively, when compared against in situ EC estimates. At the same sites, daily aggregated estimates of LE yielded r² = 0.76 and r² = 0.66, respectively. Energy balance closure showed that turbulent fluxes were underestimated by 7% over sugarcane croplands and by 9% over savannas. Average daily ET from SEBAL is in close agreement with estimates from the hydrological model for an overlay of 38,100 km² (r² = 0.88). The inputs to which the algorithm is most sensitive are the vegetation index (NDVI), the temperature gradient (dT) used to compute sensible heat flux (H), and net radiation (Rn). It was verified that SEBAL tends to overestimate results at both local and regional scales, probably because of its low sensitivity to soil moisture and water stress. Nevertheless, the results confirm the potential of the SEBAL algorithm, when used with MODIS images, for estimating instantaneous LE and daily ET over large areas.
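
The core of SEBAL is solving the surface energy balance for LE as a residual, then scaling the instantaneous satellite estimate to a daily total via the evaporative fraction. A minimal sketch of that arithmetic (illustrative numbers, not values from the paper):

```python
def latent_heat_flux(rn, g, h):
    """Surface energy balance residual: LE = Rn - G - H (all in W/m^2)."""
    return rn - g - h

def evaporative_fraction(le, rn, g):
    """EF = LE / (Rn - G); commonly assumed constant over the day so an
    instantaneous satellite estimate can be aggregated to daily ET."""
    return le / (rn - g)

le = latent_heat_flux(rn=600.0, g=80.0, h=170.0)    # -> 350 W/m^2
print(le, evaporative_fraction(le, 600.0, 80.0))    # EF ~ 0.67
```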

Relevance:

90.00%

Publisher:

Abstract:

Background: Leishmania (Leishmania) amazonensis infection in man results in a clinical spectrum of disease manifestations ranging from cutaneous to mucosal or visceral involvement. In the present study, we have investigated the genetic variability of 18 L. amazonensis strains isolated in northeastern Brazil from patients with different clinical manifestations of leishmaniasis. Parasite DNA was analyzed by sequencing of the ITS flanking the 5.8S subunit of the ribosomal RNA genes, by RAPD and SSR-PCR, and by PFGE followed by hybridization with gene-specific probes. Results: ITS sequencing and PCR-based methods revealed genetic heterogeneity among the L. amazonensis isolates examined, and molecular karyotyping also showed variation in the chromosome size of different isolates. Unrooted genetic trees separated the strains into different groups. Conclusion: These results indicate that L. amazonensis strains isolated from leishmaniasis patients from northeastern Brazil are genetically diverse; however, no correlation between genetic polymorphism and phenotype was found.

Relevance:

90.00%

Publisher:

Abstract:

Introduction: Toxoplasmosis may be life-threatening in fetuses and in immune-deficient patients. Conventional laboratory diagnosis of toxoplasmosis is based on the presence of IgM and IgG anti-Toxoplasma gondii antibodies; however, molecular techniques have emerged as alternative tools due to their increased sensitivity. The aim of this study was to compare the performance of 4 PCR-based methods for the laboratory diagnosis of toxoplasmosis. One hundred pregnant women who seroconverted during pregnancy were included in the study. The definition of cases was based on a 12-month follow-up of the infants. Methods: Amniotic fluid samples were subjected to DNA extraction and amplification by the following 4 Toxoplasma techniques performed with parasite B1 gene primers: conventional PCR, nested-PCR, multiplex-nested-PCR, and real-time PCR. Seven parameters were analyzed: sensitivity (Se), specificity (Sp), positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), negative likelihood ratio (NLR) and efficiency (Ef). Results: Fifty-nine of the 100 infants had toxoplasmosis; 42 (71.2%) had IgM antibodies at birth but were asymptomatic, and the remaining 17 cases had non-detectable IgM antibodies but high IgG antibody titers that were associated with retinochoroiditis in 8 (13.5%) cases, abnormal cranial ultrasound in 5 (8.5%) cases, and signs/symptoms suggestive of infection in 4 (6.8%) cases. The conventional PCR assay detected 50 cases (9 false-negatives), nested-PCR detected 58 cases (1 false-negative and 4 false-positives), multiplex-nested-PCR detected 57 cases (2 false-negatives), and real-time PCR detected 58 cases (1 false-negative). Conclusions: The real-time PCR assay was the best-performing technique based on the parameters of Se (98.3%), Sp (100%), PPV (100%), NPV (97.6%), PLR (∞), NLR (0.017), and Ef (99%).
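
All seven parameters follow directly from the confusion counts. Reconstructing the real-time PCR results from the abstract (58 of 59 cases detected, one false-negative, no false-positives among the 41 uninfected infants) reproduces the reported values; a minimal sketch:

```python
def diagnostic_parameters(tp, fp, tn, fn):
    se = tp / (tp + fn)                               # sensitivity
    sp = tn / (tn + fp)                               # specificity
    ppv = tp / (tp + fp)                              # positive predictive value
    npv = tn / (tn + fn)                              # negative predictive value
    plr = se / (1 - sp) if sp < 1 else float("inf")   # positive likelihood ratio
    nlr = (1 - se) / sp                               # negative likelihood ratio
    ef = (tp + tn) / (tp + fp + tn + fn)              # efficiency (accuracy)
    return se, sp, ppv, npv, plr, nlr, ef

# Real-time PCR: 58 true positives, 1 false-negative, 41 true negatives.
print(diagnostic_parameters(tp=58, fp=0, tn=41, fn=1))
# -> Se 0.983, Sp 1.0, PPV 1.0, NPV 0.976, PLR inf, NLR 0.017, Ef 0.99
```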

Relevance:

90.00%

Publisher:

Abstract:

In this dissertation, building on the parallel/orthogonal space method, a new method for calculating general massive two-loop three-point tensor integrals with planar and crossed reduced planar topologies was developed. A tensor reduction for integrals with a general tensor structure in Minkowski space was worked out and implemented. An algorithm for the semi-analytical calculation of the most difficult integrals remaining after the tensor reduction was developed and implemented; for the other basis integrals, well-established methods can be used. The implementation is complete for the UV-finite parts of those master integrals which, after tensor reduction, still possess the aforementioned topologies, and the numerical integrations have proven to be stable. For the remaining parts of the project, well-established methods can be used; to a large extent, only interfaces to existing programs still need to be written, and for the few remaining special topologies, (well-established) methods still need to be implemented. The computer programs created within this project will also feed into the xloops project for more general processes, which is why they were designed and implemented for general processes as far as possible. The algorithm mentioned above was developed in particular for the evaluation of the fermionic NNLO corrections to the leptonic weak mixing angle and to similar processes. Within this dissertation, a large part of the work required for the fermionic NNLO corrections to the effective coupling constants of Z decay (and hence for the weak mixing angle) was carried out.
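
As background for this class of integrals (an illustration of the standard definitions, not a formula from the thesis itself): in the parallel/orthogonal space method each loop momentum is split into a component in the subspace spanned by the external momenta and an orthogonal remainder, and a generic scalar integral of the two-loop three-point class can be written as

```latex
% Parallel/orthogonal split: p_1, p_2 span the "parallel" subspace.
k_i = k_{i\parallel} + k_{i\perp}, \qquad
k_{i\parallel} \in \operatorname{span}\{p_1, p_2\}, \qquad
k_{i\perp} \cdot p_1 = k_{i\perp} \cdot p_2 = 0,

% Generic massive two-loop three-point (vertex) integral with six
% propagators, the q_j being linear combinations of k_1, k_2, p_1, p_2:
T(p_1, p_2) = \int \frac{d^D k_1 \, d^D k_2}{(i\pi^{D/2})^2}
\prod_{j=1}^{6} \frac{1}{q_j^2 - m_j^2 + i\varepsilon}.
```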

Relevance:

90.00%

Publisher:

Abstract:

This thesis studies molecular dynamics simulations on two levels of resolution: the detailed level of atomistic simulations, where the motion of explicit atoms in a many-particle system is considered, and the coarse-grained level, where the motion of superatoms composed of up to 10 atoms is modeled. While atomistic models are capable of describing material-specific effects on small scales, the time and length scales they can cover are limited due to their computational costs. Polymer systems are typically characterized by effects on a broad range of length and time scales, so it is often impossible to atomistically simulate the processes which determine macroscopic properties in polymer systems. Coarse-grained (CG) simulations extend the range of accessible time and length scales by three to four orders of magnitude. However, no standardized coarse-graining procedure has been established yet. Following the ideas of structure-based coarse-graining, a coarse-grained model for polystyrene is presented. Structure-based methods parameterize CG models to reproduce static properties of atomistic melts such as radial distribution functions between superatoms or other probability distributions for coarse-grained degrees of freedom. Two enhancements of the coarse-graining methodology are suggested. Correlations between local degrees of freedom are implicitly taken into account by additional potentials acting between neighboring superatoms in the polymer chain. This improves the reproduction of local chain conformations and allows the study of different tacticities of polystyrene. It also gives better control of the chain stiffness, which agrees perfectly with the atomistic model, and leads to a reproduction of experimental results for overall chain dimensions, such as the characteristic ratio, for all tacticities. The second new aspect is the computationally cheap development of nonbonded CG potentials based on the sampling of pairs of oligomers in vacuum. Static properties of polymer melts are thus obtained as predictions of the CG model, in contrast to other structure-based CG models, which are iteratively refined to reproduce reference melt structures. The dynamics of simulations at the two levels of resolution are also compared. The time scales of dynamical processes in atomistic and coarse-grained simulations can be connected by a time scaling factor, which depends on several specific system properties such as molecular weight, density, temperature, and the other components in mixtures. In this thesis, the influence of molecular weight in systems of oligomers and the situation in two-component mixtures are studied. For a system of small additives in a melt of long polymer chains, the temperature dependence of the additive diffusion is predicted and compared to experiments.
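
One common way to obtain such a time scaling factor is to match mean-squared displacements between the two resolutions; a minimal sketch (my own illustration, not the thesis code, assuming both MSD curves actually cross the chosen target value):

```python
import numpy as np

def msd(traj):
    """Mean-squared displacement for a trajectory of shape (frames, particles, 3)."""
    disp = traj - traj[0]                    # displacement relative to frame 0
    return (disp ** 2).sum(axis=2).mean(axis=1)

def time_scaling_factor(msd_aa, dt_aa, msd_cg, dt_cg, target):
    """Ratio of the times at which both resolutions first reach the same MSD."""
    t_aa = dt_aa * np.argmax(msd_aa >= target)   # first frame crossing target
    t_cg = dt_cg * np.argmax(msd_cg >= target)
    return t_aa / t_cg                           # > 1: CG dynamics run faster
```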

Relevance:

90.00%

Publisher:

Abstract:

The topic of this work is nonparametric permutation-based methods aiming to find a ranking (stochastic ordering) of a given set of groups (populations), gathering together information from multiple variables under more than one experimental design. The problem of ranking populations arises in several fields of science from the need to compare G > 2 given groups or treatments when the main goal is to find an order while taking several aspects into account. As can be imagined, this problem is not only of theoretical interest but also has a recognised relevance in several fields, such as industrial experiments or the behavioural sciences, and this is reflected by the vast literature on the topic, although the problem is sometimes associated with different keywords such as "stochastic ordering", "ranking", "construction of composite indices", or even "ranking probabilities" outside of the strictly statistical literature. The properties of the proposed method are empirically evaluated by means of an extensive simulation study, where several aspects of interest are allowed to vary within a reasonable practical range. These aspects comprise: sample size, number of variables, number of groups, and distribution of the noise/error. The flexibility of the approach lies mainly in the several available choices for the test statistic and in the different types of experimental design that can be analysed. This renders the method able to be tailored to the specific problem and to the nature of the data at hand. To perform the analyses, an R package called SOUP (Stochastic Ordering Using Permutations) has been written and is available on CRAN.
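
SOUP itself is distributed as an R package; as a language-neutral illustration of the underlying idea, the following Python sketch ranks groups by counting pairwise permutation-test "wins" (a deliberate simplification, not the SOUP algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_pvalue(x, y, n_perm=2000):
    """One-sided permutation p-value for H1: mean(x) > mean(y)."""
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                          # random relabelling
        hits += pooled[:len(x)].mean() - pooled[len(x):].mean() >= observed
    return (hits + 1) / (n_perm + 1)

def rank_groups(groups, alpha=0.05):
    """Order groups by how many pairwise comparisons they 'win'."""
    wins = [sum(perm_pvalue(g, h) < alpha for h in groups if h is not g)
            for g in groups]
    return np.argsort(wins)[::-1]                    # most wins first
```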

Relevance:

90.00%

Publisher:

Abstract:

In the present work, one top-down (TD) and two bottom-up (BU) MALDI/ESI mass spectrometry/HPLC methods were developed with the aim of analyzing ocular surface components, i.e. tear film and conjunctival cells. A detailed insight into the development steps is given, and the approaches are examined with respect to their suitability and methodological limits. While the TD approach proved suitable mainly for the analysis of raw, largely unprocessed cell samples, the BU approach allowed processed conjunctival cells, as well as tear film, to be analyzed proteomically with high sensitivity and accuracy. Using the LC MALDI BU method, more than 200 tear proteins could be listed, and using the LC ESI method more than 1000 tear and conjunctival cell proteins. The ESI and MALDI methods differed clearly in the quantity and quality of their results, which is why different proteomic fields of application were proposed for the two methods. Furthermore, the developed LC MALDI/ESI BU platform, building on its advantages over the TD approach, was used to investigate therapeutic influences on the ocular surface, with a focus on the topical application of taurine and of Taflotan® sine. For taurine, anti-inflammatory effects, documented by dynamic changes in the tear film, were demonstrated, and beneficial, concentration-dependent modes of action were also shown in studies on conjunctival cells. For the application of preservative-free Taflotan® sine, LC ESI BU analysis showed, on the basis of dynamic tear proteome changes, a regeneration of the ocular surface in patients with primary open-angle glaucoma (POAG) suffering from dry eye after a therapeutic switch from Xalatan®. These results were confirmed by microarray (MA) analyses. In both the taurine studies and the Taflotan® sine study, characteristic proteins of the ocular surface were documented that permit an objective assessment of the health status of the ocular surface. A combination of Taflotan® sine and taurine was proposed and discussed as a possible strategy for treating dry eye in POAG patients.

Relevance:

90.00%

Publisher:

Abstract:

Conventional inorganic materials for x-ray radiation sensors suffer from several drawbacks, including their inability to cover large curved areas, mechanical stiffness, lack of tissue equivalence and toxicity. Semiconducting organic polymers represent an alternative and have been employed as direct photoconversion material in organic diodes. In contrast to inorganic detector materials, polymers allow low-cost and large-area fabrication by solvent-based methods. In addition, their processing is compliant with flexible low-temperature substrates. Flexible and large-area detectors are needed for dosimetry in medical radiotherapy and security applications. The objective of my thesis is to achieve optimized organic polymer diodes for flexible, direct x-ray detectors. To this end, polymer diodes based on two different semiconducting polymers, polyvinylcarbazole (PVK) and poly(9,9-dioctylfluorene) (PFO), have been fabricated. The diodes show state-of-the-art rectifying behaviour and hole transport mobilities comparable to reference materials. In order to improve the x-ray stopping power, high-Z nanoparticles (Bi2O3 or WO3) were added to realize a polymer-nanoparticle composite with optimized properties. X-ray detector characterization resulted in sensitivities of up to 14 µC/Gy/cm² for PVK when the diodes were operated in reverse. The addition of nanoparticles could further improve the performance, and a maximum sensitivity of 19 µC/Gy/cm² was obtained for the PFO diodes. Compared to the pure PFO diode this corresponds to a five-fold increase and thus highlights the potential of nanoparticles for polymer detector design. Interestingly, the pure polymer diodes showed an order-of-magnitude increase in sensitivity when operated in the forward regime. The increase was attributed to a different detection mechanism based on the modulation of the diode's conductivity.
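
Sensitivities quoted in µC/Gy/cm² relate the beam-induced charge to absorbed dose and active area. A minimal sketch of that conversion (the current, dose rate, and area below are hypothetical values, chosen only to reproduce the reported 19 µC/Gy/cm²):

```python
# X-ray detector sensitivity: S = Q / (D * A), with Q the beam-induced
# charge [C], D the absorbed dose [Gy], and A the active area [cm^2].
def sensitivity_uC_per_Gy_cm2(photocurrent_A, dark_current_A,
                              dose_rate_Gy_s, area_cm2):
    """Sensitivity from a current measurement under a constant dose rate."""
    net_current = photocurrent_A - dark_current_A    # beam-induced current [A]
    charge_per_dose = net_current / dose_rate_Gy_s   # [C/Gy]
    return 1e6 * charge_per_dose / area_cm2          # [uC/Gy/cm^2]

# Hypothetical numbers: 1.9 nA net current, 2 mGy/s, 0.05 cm^2 -> 19.0
print(sensitivity_uC_per_Gy_cm2(2.0e-9, 0.1e-9, 2.0e-3, 0.05))
```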

Relevance:

90.00%

Publisher:

Abstract:

In this paper we present a novel hybrid approach for multimodal medical image registration based on diffeomorphic demons. Diffeomorphic demons have proven to be a robust and efficient way of performing intensity-based image registration. A very recent extension even allows the use of mutual information (MI) as a similarity measure to register multimodal images. However, due to the intensity correspondence uncertainty existing in some anatomical parts, it is difficult for a purely intensity-based algorithm to solve the registration problem. Therefore, we propose to combine the resulting transformations from both intensity-based and landmark-based methods for multimodal non-rigid registration based on diffeomorphic demons. Several experiments on different types of MR images were conducted, for which we show that a better anatomical correspondence between the images can be obtained using the hybrid approach than using either intensity information or landmarks alone.
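
The abstract does not spell out how the two transformations are combined; one plausible scheme, shown here purely as a sketch, is a spatially weighted blend of the intensity-based and landmark-based displacement fields, trusting landmarks where they are locally reliable:

```python
import numpy as np

def blend_displacement_fields(u_intensity, u_landmark, confidence):
    """Weighted fusion of two dense 2-D displacement fields of shape (H, W, 2).

    confidence: array in [0, 1], high where landmark information is reliable
    (e.g. a Gaussian falloff around each landmark), low elsewhere.
    """
    w = confidence[..., None]                 # broadcast over the vector dim
    return w * u_landmark + (1.0 - w) * u_intensity

# Example: trust landmarks near a single landmark at (32, 32) on a 64x64 grid.
yy, xx = np.mgrid[0:64, 0:64]
conf = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 8.0 ** 2))
u = blend_displacement_fields(np.zeros((64, 64, 2)),
                              np.ones((64, 64, 2)), conf)
```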

Relevance:

90.00%

Publisher:

Abstract:

This study aimed to evaluate the influence of professional prophylactic methods on the performance of the DIAGNOdent 2095, DIAGNOdent 2190 and VistaProof devices in detecting occlusal caries. Assessments were performed on 110 permanent teeth at baseline and after a sodium bicarbonate jet or prophylactic paste and rinsing. Performance in terms of sensitivity improved after rinsing of the occlusal surfaces when the prophylactic paste was used. However, the sodium bicarbonate jet did not significantly influence the performance of the fluorescence-based methods. It can be concluded that different professional prophylactic methods can significantly influence the performance of fluorescence-based methods for occlusal caries detection.

Relevance:

90.00%

Publisher:

Abstract:

Misconceptions exist in all fields of learning and develop through a person's preconceptions of how the world works. Students with misconceptions in chemical engineering are not capable of correctly transferring knowledge to a new situation and will likely arrive at an incorrect solution. The purpose of this thesis was to repair misconceptions in thermodynamics by using inquiry-based activities. Inquiry-based learning is a method of teaching that involves hands-on learning and self-discovery. Previous work has shown that inquiry-based methods result in better conceptual understanding by students relative to traditional lectures. The thermodynamics activities were designed to guide students towards the correct conceptual understanding by having them observe a preconception fail to hold up in an experiment or simulation. The developed activities focus on the following topics in thermodynamics: "internal energy versus enthalpy", "equilibrium versus steady state", and "entropy". For each topic, two activities were designed to clarify the concept and ensure it was properly grasped. Each activity was coupled with an instruction packet containing the experimental procedure as well as pre- and post-analysis questions, which were used to analyze the effect of the activities on the students' responses. Concept inventories were used to monitor students' conceptual understanding at the beginning and end of the semester. The results did not show a statistically significant increase in the overall concept inventory scores for students who performed the activities compared to traditional learning. There was, however, a statistically significant increase in the concept area scores for "internal energy versus enthalpy" and "equilibrium versus steady state". Although there was no significant increase in concept inventory scores for "entropy", written analyses showed that most students' misconceptions were repaired. Students transferred knowledge effectively and retained most of the information in the concept areas of "internal energy versus enthalpy" and "equilibrium versus steady state".

Relevance:

90.00%

Publisher:

Abstract:

This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure to be used for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
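
As a small taste of the graph-matching primitives such methods build on (an illustration with networkx, not an excerpt from the book):

```python
import networkx as nx

# Two small graphs that differ by one edge.
g1 = nx.cycle_graph(4)                  # 0-1-2-3-0
g2 = nx.path_graph(4)                   # 0-1-2-3

print(nx.is_isomorphic(g1, g2))         # False: the cycle has an extra edge
print(nx.graph_edit_distance(g1, g2))   # 1.0: delete one edge to match
```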

Relevance:

90.00%

Publisher:

Abstract:

This study compared the performance of fluorescence-based methods, radiographic examination, and the International Caries Detection and Assessment System (ICDAS) II on occlusal surfaces. One hundred and nineteen permanent human molars were assessed twice by 2 experienced dentists using laser fluorescence (LF and LFpen) and fluorescence camera (FC) devices, ICDAS II and bitewing radiographs (BW). After the measurements, the teeth were histologically prepared and assessed for caries extension. The sensitivities for dentine caries detection were 0.86 (FC), 0.78 (LFpen), 0.73 (ICDAS II), 0.51 (LF) and 0.34 (BW). The specificities were 0.97 (BW), 0.89 (LF), 0.65 (ICDAS II), 0.63 (FC) and 0.56 (LFpen). BW presented the highest values of positive likelihood ratio (LR+, 12.47) and negative likelihood ratio (LR-, 0.68). Rank correlations with histology were 0.53 (LF), 0.52 (LFpen), 0.41 (FC), 0.59 (ICDAS II) and 0.57 (BW). The area under the ROC curve varied from 0.72 to 0.83. Inter- and intraexaminer intraclass correlation values were, respectively, 0.90 and 0.85 (LF), 0.93 and 0.87 (LFpen), and 0.85 and 0.76 (FC). The ICDAS II kappa values were 0.51 (interexaminer) and 0.61 (intraexaminer). The BW kappa values were 0.50 (interexaminer) and 0.62 (intraexaminer). The Bland-Altman limits of agreement were 46.0 and 38.2 (LF), 55.6 and 40.0 (LFpen), and 1.12 and 0.80 (FC) for intra- and interexaminer reproducibility. The posttest probability for dentine caries detection was high for BW and LF. In conclusion, LFpen, FC and ICDAS II presented better sensitivity, and LF and BW better specificity. ICDAS II combined with BW showed the best performance and is the best combination for detecting caries on occlusal surfaces.
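
The inter- and intraexaminer kappa values reported here are Cohen's kappa, which corrects raw agreement for agreement expected by chance; a minimal sketch (hypothetical ratings, not the study data):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ICDAS-style scores from two examiners on the same 10 teeth.
examiner_1 = [0, 1, 2, 2, 3, 0, 1, 4, 2, 3]
examiner_2 = [0, 1, 2, 3, 3, 0, 2, 4, 2, 3]

# Cohen's kappa: 1 = perfect agreement, 0 = chance-level agreement.
print(cohen_kappa_score(examiner_1, examiner_2))
```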

Relevance:

90.00%

Publisher:

Abstract:

Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To appropriately deal with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPWs and proper imputation procedures can be easily constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU. Copyright © 2013 John Wiley & Sons, Ltd.
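
A minimal sketch of the IPW idea with simulated data (my own illustration, not the paper's estimator): outcomes ascertained for a known fraction of patients LTFU are up-weighted by the inverse of that sampling probability, which removes the bias of a complete-case analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100_000
lost = rng.random(n) < 0.3                       # 30% lost to follow-up
# Non-ignorable LTFU: mortality is higher among lost patients.
died = np.where(lost, rng.random(n) < 0.20, rng.random(n) < 0.05)

# Outcome ascertainment for a 25% tracing sample of those lost
# (e.g. linkage to a national vital registry).
traced = lost & (rng.random(n) < 0.25)
known = ~lost | traced                           # outcome is known

# Each traced patient stands in for 1 / 0.25 = 4 lost patients.
weights = np.where(lost, 1 / 0.25, 1.0)[known]
ipw_mortality = np.average(died[known], weights=weights)

naive_mortality = died[~lost].mean()             # complete-case: biased low
print(f"naive {naive_mortality:.3f}  IPW {ipw_mortality:.3f}  "
      f"true {died.mean():.3f}")
```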