973 results for Iterative Closest Point (ICP) Algorithm
Abstract:
This dissertation is concerned with the development of algorithmic methods for the unsupervised learning of natural language morphology, using a symbolically transcribed wordlist. It focuses on the case of languages approaching the introflectional type, such as Arabic or Hebrew. The morphology of such languages is traditionally described in terms of discontinuous units: consonantal roots and vocalic patterns. Inferring this kind of structure is a challenging task for current unsupervised learning systems, which generally operate with continuous units. In this study, the problem of learning root-and-pattern morphology is divided into a phonological and a morphological subproblem.
The phonological component of the analysis seeks to partition the symbols of a corpus (phonemes, letters) into two subsets that correspond well with the phonetic definition of consonants and vowels; building on this result, the morphological component attempts to establish the list of roots and patterns in the corpus and to infer the rules that govern their combination. We assess the extent to which this can be done on the basis of two hypotheses: (i) the distinction between consonants and vowels can be learned by observing their tendency to alternate in speech; (ii) roots and patterns can be identified as sequences of the previously discovered consonants and vowels, respectively. The proposed algorithm uses a purely distributional method for partitioning symbols. It then applies analogical principles to identify a preliminary set of reliable roots and patterns and to enlarge this set gradually. The extension process is guided by an evaluation procedure based on the minimum description length principle, in line with the approach to morphological learning embodied in LINGUISTICA (Goldsmith, 2001). The algorithm is implemented as a computer program named ARABICA and is evaluated on its ability to account for the system of plural formation in a corpus of Arabic nouns. This thesis shows that complex linguistic structures can be discovered without recourse to a rich set of a priori hypotheses about the phenomena under consideration. It illustrates the possible synergy between learning mechanisms operating at distinct levels of linguistic description, and attempts to determine where and why such cooperation fails. It concludes that the tension between the universality of the consonant-vowel distinction and the specificity of root-and-pattern structure is crucial for understanding the strengths and weaknesses of this approach.
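The purely distributional partitioning step can be illustrated with a Sukhotin-style alternation heuristic, which classifies as vowels the symbols that alternate most strongly with the rest of the inventory. The abstract does not specify ARABICA's exact procedure, so the sketch below, including its toy word list, is only an assumed stand-in:

```python
from collections import defaultdict

def sukhotin(words):
    """Split an alphabet into putative vowels and consonants from the
    tendency of symbols to alternate with one another (Sukhotin-style)."""
    m = defaultdict(int)
    symbols = set()
    for w in words:
        symbols.update(w)
        for a, b in zip(w, w[1:]):
            if a != b:                       # self-adjacency is ignored
                m[a, b] += 1
                m[b, a] += 1
    # Row sums; every symbol starts out classified as a consonant.
    s = {c: sum(m[c, d] for d in symbols) for c in symbols}
    vowels = set()
    while True:
        rest = symbols - vowels
        c = max(rest, key=s.get, default=None)
        if c is None or s[c] <= 0:
            break                            # no remaining symbol alternates enough
        vowels.add(c)
        for d in rest - {c}:                 # discount links to the new vowel
            s[d] -= 2 * m[c, d]
    return vowels, symbols - vowels

print(sukhotin(["kataba", "kutiba", "kitaab", "kutub", "maktab"]))
# -> ({'a', 'u', 'i'}, {'k', 't', 'b', 'm'})
```

Under hypothesis (ii), the morphological component would then read off candidate roots as the consonant subsequences of each word and candidate patterns as the complementary vowel material.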
Abstract:
A major issue in the application of waveform inversion methods to crosshole georadar data is the accurate estimation of the source wavelet. Here, we explore the viability and robustness of incorporating this step into a time-domain waveform inversion procedure through an iterative deconvolution approach. Our results indicate that, at least in non-dispersive electrical environments, such an approach provides remarkably accurate and robust estimates of the source wavelet even in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity. Our results also indicate that the proposed source wavelet estimation approach is relatively insensitive to ambient noise and to the phase characteristics of the starting wavelet. Finally, there appears to be little-to-no trade-off between the wavelet estimation and the tomographic imaging procedures.
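As a rough illustration of the idea (the paper's own scheme is a time-domain iterative deconvolution embedded in the inversion, which the abstract does not spell out), a least-squares estimate of a source wavelet shared by all traces can be obtained by regularized spectral division of observed against modeled responses. The function name and the damping scheme below are assumptions:

```python
import numpy as np

def estimate_wavelet(observed, synthetic, eps=1e-3):
    """Least-squares estimate of a single source wavelet w such that
    observed ~= w (convolved with) synthetic for every trace, via damped
    spectral division (a frequency-domain stand-in for iterative
    deconvolution).

    observed, synthetic: 2-D arrays, one trace per row, equal lengths."""
    O = np.fft.rfft(observed, axis=1)
    S = np.fft.rfft(synthetic, axis=1)
    num = np.sum(np.conj(S) * O, axis=0)     # stack information over traces
    den = np.sum(np.abs(S) ** 2, axis=0)
    W = num / (den + eps * den.max())        # water-level damping
    return np.fft.irfft(W, n=observed.shape[1])
```

Inside a waveform inversion, such an estimate would be refreshed after each model update (synthetics recomputed with the current permittivity and conductivity model), alternating until both the wavelet and the model stabilize.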
Abstract:
Chronic hepatitis C is a major healthcare problem. The response to antiviral therapy in patients with chronic hepatitis C has previously been defined biochemically and by PCR. However, changes in the hepatic venous pressure gradient (HVPG) may be considered an adjunctive end point for the therapeutic evaluation of antiviral therapy in chronic hepatitis C. HVPG measurement is a validated technique that is safe, well tolerated, well established, and reproducible. Serial HVPG measurements may be the best way to evaluate the response to therapy in chronic hepatitis C.
Abstract:
Vascular access-related infections are among the leading causes of nosocomial infection. They encompass colonization of the device by microorganisms, insertion-site infections, and the bacteremias and fungemias attributed to them. On average, a bacteremia complicates 3 to 5 of every 100 venous lines, corresponding to 2 to 14 episodes per 1000 catheter-days. This proportion is only the visible part of the iceberg, since most episodes of clinical sepsis with no apparent associated infectious focus are now considered secondary to vascular access devices. Therapeutic principles are presented after a brief review of the pathophysiology. Several preventive approaches are then discussed, including recent data on the use of catheters impregnated with disinfectants or antibiotics.
Abstract:
The epithelial Na+ channel (ENaC) belongs to a new class of channel proteins called the ENaC/DEG superfamily involved in epithelial Na+ transport, mechanotransduction, and neurotransmission. The role of ENaC in Na+ homeostasis and in the control of blood pressure has been demonstrated recently by the identification of mutations in ENaC beta and gamma subunits causing hypertension. The function of ENaC in Na+ reabsorption depends critically on its ability to discriminate between Na+ and other ions like K+ or Ca2+. ENaC is virtually impermeant to K+ ions, and the molecular basis for its high ionic selectivity is largely unknown. We have identified a conserved Ser residue in the second transmembrane domain of the ENaC alpha subunit (alphaS589), which when mutated allows larger ions such as K+, Rb+, Cs+, and divalent cations to pass through the channel. The relative ion permeability of each of the alphaS589 mutants is related inversely to the ionic radius of the permeant ion, indicating that alphaS589 mutations increase the molecular cutoff of the channel by modifying the pore geometry at the selectivity filter. Proper geometry of the pore is required to tightly accommodate Na+ and Li+ ions and to exclude larger cations. We provide evidence that ENaC discriminates between cations mainly on the basis of their size and the energy of dehydration.
Abstract:
Developmental constraints have been postulated to limit the space of feasible phenotypes and thus shape animal evolution. These constraints have been suggested to be strongest during either early or mid-embryogenesis, corresponding to the early conservation model or the hourglass model, respectively. Conflicting results have been reported, but recent studies of animal transcriptomes have favored the hourglass model. Studies usually report descriptive statistics calculated for all genes over all developmental time points. This introduces dependencies between the sets of compared genes and may lead to biased results. Here we overcome this problem using an alternative modular analysis. We used the Iterative Signature Algorithm to identify distinct modules of genes co-expressed specifically in consecutive stages of zebrafish development. We then performed a detailed comparison of several gene properties between modules, allowing for a less biased and more powerful analysis. Notably, our analysis corroborated the hourglass pattern at the regulatory level, with the sequences of regulatory regions most conserved for genes expressed in mid-development; in contrast to some previous studies, however, no such pattern appeared at the level of gene sequence, age, or expression. The early conservation model was supported by gene duplication and gene birth, both of which were rarest for genes expressed in early development. Finally, for all gene properties, we observed the least conservation for genes expressed in late development or in the adult, consistent with both models. Overall, with the modular approach, we showed that different levels of molecular evolution follow different patterns of developmental constraint. Thus both models are valid, but with respect to different genomic features.
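For readers unfamiliar with it, the Iterative Signature Algorithm alternates between two thresholded projections until a self-consistent gene/condition module emerges. The sketch below is a minimal single-seed variant with assumed normalization and threshold conventions, not the exact pipeline of this study:

```python
import numpy as np

def isa_module(E, t_g=2.0, t_c=2.0, n_iter=50, seed=0):
    """Single-seed Iterative Signature Algorithm sketch: alternately score
    conditions against the current gene set and genes against the current
    condition set, thresholding both, until a fixed point is reached.

    E: genes x conditions expression matrix."""
    rng = np.random.default_rng(seed)
    Eg = (E - E.mean(axis=0)) / E.std(axis=0)    # z-scores within each condition
    Ec = (E - E.mean(axis=1, keepdims=True)) / E.std(axis=1, keepdims=True)
    genes = rng.random(E.shape[0]) < 0.05        # random seed set of genes
    genes[rng.integers(E.shape[0])] = True       # never start from an empty set
    conds = np.zeros(E.shape[1], dtype=bool)
    for _ in range(n_iter):
        c_score = Eg[genes].mean(axis=0)         # how coherent is each condition
        conds = c_score > t_c * c_score.std()
        if not conds.any():
            break
        g_score = Ec[:, conds].mean(axis=1)      # how well each gene follows it
        new_genes = g_score > t_g * g_score.std()
        if not new_genes.any() or (new_genes == genes).all():
            break                                # converged (or died out)
        genes = new_genes
    return genes, conds
```

In practice, many random seeds are run and the resulting fixed points are deduplicated into the final set of modules, here one per window of consecutive developmental stages.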
Abstract:
Point defects of opposite signs can alternately nucleate on the -1/2 disclination line that forms near the free surface of a confined nematic liquid crystal. We show the existence of metastable configurations consisting of periodic repetitions of such defects. These configurations are characterized by a minimal interdefect spacing that is seen to depend on sample thickness and on an applied electric field. The time evolution of the defect distribution suggests that the defects attract at small distances and repel at large distances.
Abstract:
Pursuant to House File 451, the Single Point of Entry Long-Term Living Resources System Team, involving several state agencies as well as interested associations, submitted a report to the legislature with recommendations for establishing a single point of entry system.
Abstract:
Audit report on the City of Center Point, Iowa for the year ended June 30, 2012
Abstract:
The primary goal of this project is to demonstrate the accuracy and utility of a freezing drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). The problems caused by freezing precipitation range from simple traffic delays to major accidents involving fatalities. Freezing drizzle can also impose economic costs on communities through lost work hours, vehicular damage, and downed power lines. Transportation agencies have means to perform preventive and reactive roadway treatments, but freezing drizzle can be difficult to forecast accurately, or even to detect, because weather radar and surface observation networks observe these conditions poorly. Detecting freezing precipitation is problematic and requires special instrumentation and analysis. The Federal Aviation Administration's (FAA) development of aircraft anti-icing and deicing technologies has produced a freezing drizzle algorithm that uses air temperature data and a specialized sensor capable of detecting ice accretion. At present, however, roadway ESSs are not capable of reporting freezing drizzle. This study investigates the use of the methods developed for the FAA and the National Weather Service (NWS) within a roadway environment, detecting the occurrence of freezing drizzle with a combination of icing detection equipment and available ESS sensors. The study incorporated the algorithm initially developed, and subsequently modified, for the FAA's aircraft icing work, applying it to data from standard roadway ESSs. This work lays the foundation for addressing the central question of interest to winter maintenance professionals: whether roadside freezing precipitation detection (e.g., icing detection) sensors can be used to determine when pavement icing occurs during freezing precipitation events and at what rate.
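The abstract names the algorithm's ingredients (an ice-accretion signal plus standard ESS observations) but not its thresholds or exact logic, so the following is only a schematic decision rule, with every field name and threshold an invented placeholder:

```python
from dataclasses import dataclass

@dataclass
class EssReading:
    air_temp_c: float        # ambient air temperature
    ice_rate_mm_h: float     # accretion rate from the icing sensor
    precip_rate_mm_h: float  # rate from the ESS precipitation sensor

def freezing_drizzle(r: EssReading,
                     t_max_c: float = 0.0,
                     ice_min: float = 0.1,
                     precip_max: float = 0.25) -> bool:
    """Flag freezing drizzle: ice is accreting at sub-freezing temperature
    while the precipitation sensor sees little or nothing (drizzle is
    largely invisible to radar and standard precipitation sensors).
    All thresholds are illustrative placeholders, not values from the
    study."""
    return (r.air_temp_c <= t_max_c
            and r.ice_rate_mm_h >= ice_min
            and r.precip_rate_mm_h <= precip_max)

print(freezing_drizzle(EssReading(-2.0, 0.3, 0.0)))   # True
```

The key design point such a rule captures is that freezing drizzle is inferred from the disagreement between sensors: accretion without reported precipitation.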
Abstract:
3D dose reconstruction is a verification of the delivered absorbed dose. Our aim was to describe and evaluate a 3D dose reconstruction method applied to phantoms in the context of narrow beams. A solid water phantom and a phantom containing a bone-equivalent material were irradiated on a 6 MV linac. The transmitted dose was measured using one array of a 2D ion-chamber detector. The dose reconstruction was obtained by an iterative algorithm. A phantom set-up error and interfraction organ motion were simulated to test the algorithm's sensitivity. In all configurations, convergence was reached within three iterations. Local agreement between the reconstructed and planned doses was within 3%/3 mm, except at a few points in the penumbra. The reconstructed primary fluences were consistent with the planned ones, validating the whole reconstruction process. These results validate our method in a simple geometry and for narrow beams. The method is sensitive to a set-up error of a heterogeneous phantom and to interfraction motion of heterogeneous organs.
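The abstract does not detail the iterative algorithm, but the general shape of such transit-dosimetry reconstructions can be sketched as a multiplicative fixed-point correction of the primary fluence under a simplified forward model. The toy Beer-Lambert-plus-scatter model and all numbers below are assumptions; the clinical operator is far more detailed:

```python
import numpy as np

MU = 0.005  # toy linear attenuation coefficient, 1/mm

def forward(phi, depth, scatter=0.08):
    """Toy transmission model: Beer-Lambert attenuation of the primary
    fluence plus a crude uniform scatter term coupling detector pixels."""
    primary = phi * np.exp(-MU * depth)
    return primary + scatter * primary.mean()

def reconstruct_fluence(t_meas, depth, n_iter=3):
    """Recover the primary fluence from the measured transmitted signal
    by multiplicative fixed-point correction (illustrative only)."""
    phi = t_meas / np.exp(-MU * depth)       # attenuation-only first guess
    for _ in range(n_iter):
        phi *= t_meas / np.maximum(forward(phi, depth), 1e-12)
    return phi

depth = np.full(32, 200.0)                   # 20 cm water-equivalent path
true_phi = np.linspace(0.9, 1.1, 32)         # slightly modulated beam profile
t = forward(true_phi, depth)                 # simulated measurement
print(np.abs(reconstruct_fluence(t, depth) - true_phi).max())
```

In the study's setting, the reconstructed primary fluence would then feed a dose recomputation, and it is that 3D dose which is compared against the plan at the 3%/3 mm level.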