Abstract:
Distalization of the maxillary molars is a treatment option for Class II malocclusion when the involvement is mainly dentoalveolar. Intraoral devices such as the Pendulum appliance do not depend on patient compliance, but they often produce undesirable effects, such as buccal tipping of the anterior teeth that serve as anchorage and tipping of the distalized molars. With the advent of Temporary Anchorage Devices (TADs) such as the mini-implant, anchorage can be achieved predictably and efficiently. Accordingly, in a prospective study, the dental changes produced by distalization of maxillary molars with a modified Pendulum appliance, supported by two mini-implants placed in the palate, were evaluated in 10 individuals (2 female and 8 male, mean age 14.3 years). The sample consisted of 20 digitized 3D models obtained at two stages: at the start of treatment (T1) and after distalization with 1 mm of overcorrection (T2), allowing sagittal and transverse dental changes and possible rotational, angular and vertical movements to be quantified. The results showed effective, statistically significant distalization in the sagittal direction for the maxillary second molars; for the maxillary first molars, averaging 4.34 mm on the right side and 3.91 mm on the left; and for the second premolars, 2.06 mm on the right side and 1.95 mm on the left. For the anterior teeth, however, anchorage loss was observed. In the transverse direction, the largest increase occurred in the region of the posterior teeth.
The rotational, angular and vertical movements of the maxillary first molars indicate that mesiobuccal rotation and distal crown tipping of these teeth occurred on both sides; the vertical measurements show significant movement only for the right first molar, with distal tipping due to intrusion of the distal cusp. The appliance proved effective in correcting the Class II malocclusion in a mean time of 6.2 months. (AU)
Abstract:
Bolton's anterior tooth-size ratio should be considered during case planning when an optimal occlusion is desired at the end of orthodontic treatment. This study aimed to determine whether the anterior Bolton ratio is related to the following variables: the buccolingual thickness of the maxillary incisors; the relationship between the angulation of the maxillary incisors and the mesiodistal space they occupy; and overjet and overbite; and whether sexual dimorphism exists. Thirty-five pairs of plaster casts with natural normal occlusion were evaluated, from 27 female and 8 male white individuals aged between 13 years and 17 years 4 months (mean age: 15 years 8 months). A digital caliper and a ruler fragment were used to obtain the measurements. The anterior ratio found was 77.48% (SD ±2.22), statistically similar to the value proposed by Bolton, 77.20% (SD ±1.65). According to Pearson's test, only overbite showed a statistically significant relationship with the anterior tooth-size ratio. No sexual dimorphism was observed. (AU)
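The anterior Bolton ratio discussed above is a simple arithmetic quantity: the sum of the mesiodistal widths of the six mandibular anterior teeth divided by the sum for the six maxillary anterior teeth, expressed as a percentage. A minimal sketch follows; the tooth widths are hypothetical illustrative values, not measurements from the study:

```python
def bolton_anterior_ratio(mandibular_widths, maxillary_widths):
    """Anterior (six-tooth) Bolton ratio: sum of the mesiodistal widths of
    the six mandibular anterior teeth divided by the sum for the six
    maxillary anterior teeth, as a percentage."""
    if len(mandibular_widths) != 6 or len(maxillary_widths) != 6:
        raise ValueError("expected six measurements per arch")
    return 100.0 * sum(mandibular_widths) / sum(maxillary_widths)

# Hypothetical mesiodistal widths in mm, canine to canine:
lower = [5.0, 5.5, 6.7, 6.7, 5.5, 5.0]   # sums to 34.4 mm
upper = [8.5, 6.5, 7.2, 7.2, 6.5, 8.5]   # sums to 44.4 mm
print(f"{bolton_anterior_ratio(lower, upper):.2f}%")  # 77.48%
```

A ratio near Bolton's proposed 77.2% indicates harmony between the arches; larger values suggest a relative mandibular tooth-size excess.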
Abstract:
This study evaluated the mesiodistal angulation and the distalization of the maxillary first and second permanent molars in patients treated with the Forsus® appliance in conjunction with fixed orthodontic appliances. The sample consisted of 44 lateral cephalograms (22 right-side and 22 left-side) obtained from 11 computed tomography scans of 11 patients, acquired at two time points: before (T1) and after (T2) placement of the Forsus® appliance. The patients were treated at the graduate orthodontics clinic of the Universidade Metodista de São Paulo. After the cephalogram-type slices were obtained, points, lines and planes were marked and the variables of interest were measured. To evaluate the space available for the maxillary third molars, a reference line (the PTVR line) was used, drawn through the PTV point perpendicular to the Frankfurt plane. The space evaluated extended from the PTVR line to the distal surfaces of the maxillary first and second permanent molars. To evaluate the long axis of the maxillary first and second molars, the angle between these teeth and the palatal plane was measured. The measurements were made with the Radiocef Studio 2 software, and the paired t-test was used for the statistical analysis. It was concluded that distalization and an increase in distal angulation occurred for the maxillary first and second molars, with the second molars showing smaller amounts of distalization and distal crown angulation; the effects on the right and left sides were similar; and a reduction in the probability of eruption of the maxillary third molars was observed.
Abstract:
This paper surveys the context of feature extraction by neural network approaches, and compares and contrasts their behaviour as prospective data visualisation tools in a real world problem. We also introduce and discuss a hybrid approach which allows us to control the degree of discriminatory and topographic information in the extracted feature space.
Abstract:
This thesis is a study of the generation of topographic mappings - dimension reducing transformations of data that preserve some element of geometric structure - with feed-forward neural networks. As an alternative to established methods, a transformational variant of Sammon's method is proposed, where the projection is effected by a radial basis function neural network. This approach is related to the statistical field of multidimensional scaling, and from that the concept of a 'subjective metric' is defined, which permits the exploitation of additional prior knowledge concerning the data in the mapping process. This then enables the generation of more appropriate feature spaces for the purposes of enhanced visualisation or subsequent classification. A comparison with established methods for feature extraction is given for data taken from the 1992 Research Assessment Exercise for higher educational institutions in the United Kingdom. This is a difficult high-dimensional dataset, and illustrates well the benefit of the new topographic technique. A generalisation of the proposed model is considered for implementation of the classical multidimensional scaling (MDS) routine. This is related to Oja's principal subspace neural network, whose learning rule is shown to descend the error surface of the proposed MDS model. Some of the technical issues concerning the design and training of topographic neural networks are investigated. It is shown that neural network models can be less sensitive to entrapment in the sub-optimal local minima that badly affect the standard Sammon algorithm, and tend to exhibit good generalisation as a result of implicit weight decay in the training process. It is further argued that for ideal structure retention, the network transformation should be perfectly smooth for all inter-data directions in input space. Finally, there is a critique of optimisation techniques for topographic mappings, and a new training algorithm is proposed.
A convergence proof is given, and the method is shown to produce lower-error mappings more rapidly than previous algorithms.
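The stress measure at the heart of Sammon's method, which the transformational variant above minimises via an RBF network, can be written down in a few lines. The sketch below only evaluates the objective on a pair of configurations (the data here are arbitrary random points, not from the thesis); it does not perform the optimisation:

```python
import numpy as np

def sammon_stress(X, Y):
    """Sammon's stress: weighted sum of squared discrepancies between
    pairwise distances in the input space (X) and the projected space (Y),
    each pair weighted by the inverse of its input-space distance."""
    dx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    dy = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), k=1)   # count each pair once
    dx, dy = dx[iu], dy[iu]
    return np.sum((dx - dy) ** 2 / dx) / np.sum(dx)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))    # hypothetical high-dimensional data
Y = X[:, :2]                    # a crude 2-D 'projection' for illustration
print(sammon_stress(X, Y))      # nonzero: the projection distorts distances
```

A perfect structure-preserving mapping gives zero stress; the inverse-distance weighting is what makes the method emphasise local (small-distance) structure.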
Abstract:
Visualization has proven to be a powerful and widely-applicable tool for the analysis and interpretation of data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and sub-clusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach first on a toy data set, and then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multi-phase flows in oil pipelines and to data in 36 dimensions derived from satellite images.
Abstract:
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Previous attempts to formulate mixture models for PCA have therefore to some extent been ad hoc. In this paper, PCA is formulated within a maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context of clustering, density modelling and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
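As a rough illustration of the "combination of local linear PCA projections" idea, the sketch below uses hard k-means assignments followed by a per-cluster SVD. This is deliberately the ad hoc version: the paper's contribution is precisely to replace these hard assignments with the responsibilities of a well-defined probabilistic mixture fitted by EM. The function name and toy data are mine, not the paper's:

```python
import numpy as np

def local_pca(X, k=2, q=1, iters=20, seed=0):
    """Hard-assignment sketch of local linear PCA: partition the data with
    a basic k-means loop, then fit a separate q-dimensional PCA (via SVD)
    inside each cluster."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):                    # guard against empty clusters
                centres[j] = members.mean(axis=0)
    components = []
    for j in range(k):
        Xc = X[labels == j] - centres[j]        # centre each cluster locally
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        components.append(Vt[:q])               # top-q local principal axes
    return labels, centres, components

# Hypothetical data: two well-separated clouds in 3-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, size=(30, 3)),
               rng.normal(5.0, 0.1, size=(30, 3))])
labels, centres, comps = local_pca(X, k=2, q=1)
```

In the probabilistic formulation each component is a full density model, so the soft responsibilities, mixing proportions and local subspaces all fall out of a single EM likelihood maximisation instead of the two separate heuristic stages above.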
Abstract:
Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
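The maximum-likelihood solution described above has a closed form in terms of the eigendecomposition of the sample covariance: the noise variance is the average of the discarded eigenvalues, and the weight matrix spans the leading principal axes. A minimal sketch, taking the arbitrary rotation matrix R to be the identity and using synthetic data of my own invention:

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood probabilistic PCA fit: sigma^2 is the
    mean of the discarded sample-covariance eigenvalues, and the columns of
    W span the leading q principal axes, scaled by sqrt(lambda - sigma^2)."""
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)      # ML sample covariance
    lam, U = np.linalg.eigh(S)
    lam, U = lam[::-1], U[:, ::-1]              # descending eigenvalue order
    sigma2 = lam[q:].mean()                     # ML noise variance
    W = U[:, :q] * np.sqrt(lam[:q] - sigma2)    # ML weights (rotation R = I)
    return W, sigma2, mu

# Hypothetical data: a strong 2-D subspace embedded in 5 dimensions plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 5)) \
    + 0.1 * rng.normal(size=(500, 5))
W, sigma2, mu = ppca_ml(X, q=2)
```

By construction the model covariance W @ W.T + sigma2 * I has the same leading q eigenvalues as the sample covariance, with the remaining directions flattened to the noise level sigma2; the paper's EM algorithm reaches the same solution iteratively without forming the full eigendecomposition.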
Abstract:
Throughout the 1970s and 1980s, West Germany was considered to be one of the world’s most successful economic and political systems. In his seminal 1987 analysis of West Germany’s ‘semisovereign’ system of governance, Peter Katzenstein attributed this success to a combination of a fragmented polity, consensus politics and incremental policy changes. However, unification in 1990 has both changed Germany’s institutional configuration and created economic and social challenges on a huge scale. This volume therefore asks whether semisovereignty still exists in contemporary Germany and, crucially, whether it remains an asset in terms of addressing these challenges. By shadowing and building on the original study, an eminent team of British, German and American scholars analyses institutional changes and the resulting policy developments in key sectors, with Peter Katzenstein himself providing the conclusion. Together, the chapters provide a landmark assessment of the outcomes produced by one of the world’s most important countries. Contents: 1. Introduction: semisovereignty challenged Simon Green and William E. Paterson; 2. Institutional transfer: can semisovereignty be transferred? The political economy of Eastern Germany Wade Jacoby; 3. Political parties Thomas Saalfeld; 4. Federalism: the new territorialism Charlie Jeffery; 5. Shock-absorbers under stress. Parapublic institutions and the double challenges of German unification and European integration Andreas Busch; 6. Economic policy management: catastrophic equilibrium, tipping points and crisis interventions Kenneth Dyson; 7. Industrial relations: from state weakness as strength to state weakness as weakness. Welfare corporatism and the private use of the public interest Wolfgang Streeck; 8. Social policy: crisis and transformation Roland Czada; 9. Immigration and integration policy: between incrementalism and non-decisions Simon Green; 10. Environmental policy: the law of diminishing returns? Charles Lees; 11. 
Administrative reform Klaus H. Goetz; 12. European policy-making: between associated sovereignty and semisovereignty William E. Paterson; 13. Conclusion: semisovereignty in United Germany Peter J. Katzenstein.
Abstract:
Ten grades of ABS and four grades of polypropylene have been plated with various copper + nickel + chromium coatings and subjected to a variety of tests. In corrosion studies the pre-electroplating sequence and plastics type have been shown to influence performance. One ABS pre-electroplating sequence was consistently associated with better corrosion performance; two factors were responsible for this, namely the more severe nature of the etch and the relatively more noble electroless nickel. Statistical analysis has indicated that the order of severity of the corrosion tests was static-mobile-CASS, the latter being the least severe. In mechanical tests two properties of ABS and polypropylene, ductility and impact strength, have been shown to be adversely affected when electrodeposited layers were applied. This is due to a complex of factors, the most important of which is the notch sensitivity of the plastics. Peel adhesion has been studied on flat panels and also on ones which had a ridge and a valley moulded into one face. High adhesion peaks occurred on the flat face at regions associated with the ridge and valley. The local moulding conditions induced by the features were responsible for this phenomenon. In the main programme the thermal cycling test was shown to be more likely than the peel adhesion test to give an indication of the service performance of electroplated plastics.
Abstract:
A novel approach to the visualisation and analysis of datasets, and one which is particularly applicable to those of a high dimension, is discussed in the context of real applications. A feed-forward neural network is utilised to effect a topographic, structure-preserving, dimension-reducing transformation of the data, with an additional facility to incorporate different degrees of associated subjective information. The properties of this transformation are illustrated on synthetic and real datasets, including the 1992 UK Research Assessment Exercise for funding in higher education. The method is compared and contrasted to established techniques for feature extraction, and related to topographic mappings, the Sammon projection and the statistical field of multidimensional scaling.
Abstract:
The industry has not clearly focused on many important problems, such as rewarding service workers based on productivity. Instead, many industry leaders have focused on "straw men issues," issues that are more rhetoric than substance. The authors examine some of these so-called issues in detail: governmental wage policies, immigration laws, the quality of the work force, service worker training, and gratuity management, to provide a fresh look at worker productivity beyond the rhetoric and myths that prevail.
Abstract:
Ocean acidification (OA) is known to affect bivalve early life-stages. We tested responses of blue mussel larvae to a wide range of pH in order to identify their tolerance threshold. Our results confirmed that decreasing seawater pH and decreasing saturation state increase larval mortality rate and the percentage of abnormally developing larvae. Virtually no larvae reared at an average pHT of 7.16 were able to feed or reach the D-shell stage, and their development appeared to be arrested at the trochophore stage. However, larvae were capable of reaching the D-shell stage under milder acidification (pHT = 7.35, 7.6, 7.85), including in under-saturated seawater with an aragonite saturation state (Ω aragonite) as low as 0.54 ± 0.01 (mean ± s.e.m.), with a tipping point for normal development identified at pHT 7.765. Additionally, the growth rate of normally developing larvae was not affected by lower pHT, despite the potential increased energy costs associated with compensatory calcification in response to increased shell dissolution. Overall, our results on OA impacts on mussel larvae suggest that an average pHT of 7.16 is beyond their physiological tolerance threshold, and indicate a shift in energy allocation towards growth in some individuals, revealing potential OA resilience.
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.