957 results for Statistical analysis techniques
Abstract:
Master's degree in Cardiovascular Diagnostic and Intervention Technology - Specialization area: Cardiovascular Intervention.
Abstract:
Master's degree in Physiotherapy
Abstract:
Introduction: Apical extrusion of debris (AED), an undesirable consequence of root canal instrumentation, can be associated with pain/swelling and may delay periapical healing. Our study aimed to evaluate and quantify AED in canals instrumented with continuous rotary and reciprocating instrumentation systems. Materials and Methods: 80 single-canal teeth without previous endodontic treatment were randomly divided into 4 groups (n=20): One Shape®, Protaper® NEXT, Hyflex® EDM and WaveOne® Gold. An Eppendorf tube (ET) was weighed beforehand on a precision analytical balance and, with a tooth inserted, was mounted in a modified device similar to the method described by Myers & Montgomery. The canals were instrumented and irrigated with distilled water. The instrumented teeth were removed from the ETs, which were then filled with distilled water up to 1.5 ml, incubated at 70 °C for five days and weighed again; the weight of the debris was determined from the difference between the initial and final weights. The data were analysed statistically using IBM SPSS Statistics 22, with α=0.05. Kruskal-Wallis tests were performed, followed by post-hoc comparisons with p-value adjustment by the Dunn-Bonferroni method. Results: AED occurred with all instrumentation techniques. The statistical analysis showed significant differences in AED between the techniques used (p=0.002), namely between WaveOne® Gold and One Shape® (p=0.003), WaveOne® Gold and Protaper® NEXT (p=0.023) and WaveOne® Gold and Hyflex® EDM (p=0.028). Conclusions: The One Shape® technique produced the least AED, while the reciprocating WaveOne® Gold technique constitutes the greatest risk factor, having produced the most AED. The results of this study indicate that practitioners should be aware of the AED that can occur with each instrument, which may serve as a basis for selecting a particular instrument. Clinical implications: The choice of root canal instrumentation system influences debris extrusion. Funding sources: Thanks to the companies Micro-Mega, France, and COLTÉNE and Dentsply Maillefer, Switzerland.
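The omnibus test and post-hoc comparisons described above were run in SPSS; as a minimal open-source sketch of the same kind of analysis, the snippet below applies a Kruskal-Wallis test and Dunn post-hoc comparisons with Bonferroni adjustment using SciPy and the scikit-posthocs package. The group names echo the abstract, but the simulated debris weights and the choice of scikit-posthocs are assumptions for illustration only.

    import numpy as np
    import pandas as pd
    from scipy.stats import kruskal
    import scikit_posthocs as sp

    rng = np.random.default_rng(0)
    # Simulated extruded-debris weights in grams for the four systems (n=20 each).
    groups = {
        "OneShape": rng.gamma(2.0, 0.0002, 20),
        "ProtaperNEXT": rng.gamma(2.0, 0.0003, 20),
        "HyflexEDM": rng.gamma(2.0, 0.0003, 20),
        "WaveOneGold": rng.gamma(2.0, 0.0005, 20),
    }

    # Omnibus Kruskal-Wallis test across the four instrumentation systems.
    h_stat, p_value = kruskal(*groups.values())
    print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_value:.4f}")

    # Dunn post-hoc pairwise comparisons with Bonferroni adjustment of the p-values.
    long = pd.DataFrame(
        [(w, name) for name, weights in groups.items() for w in weights],
        columns=["weight", "system"],
    )
    print(sp.posthoc_dunn(long, val_col="weight", group_col="system", p_adjust="bonferroni"))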
Abstract:
This study compares the thermal performance of tiles made from recycled material (waste packaging cardboard with aluminized film) with fiber-bitumen, fiber-cement and red ceramic tiles, in order to verify the suitability of the recycled tile for use in the hot and humid, low-latitude climate. The samples were selected according to their availability on the Natal - RN market, as sold to consumers. The methodology was based on studies that used an experimental apparatus composed of thermal chambers heated by banks of incandescent bulbs to analyze the thermal performance of materials. The tiles under study were submitted to analyses of thermal performance, thermophysical properties and absorptance, using thermal performance chambers, thermophysical property measurements and a portable spectrometer, respectively. A comparative analysis of thermal performance was made between two samples of the recycled material with different dimple sizes and amounts of aluminum, in order to verify whether these characteristics interfered with their thermal performance; the results showed no significant performance differences between the samples. The data obtained in the thermal performance chambers, confirmed by statistical analysis, showed that the recycled-material tile has a thermal performance similar to that of the fiber-cement tile. In addition to these tests, automatic monitoring of a building covered with recycled-material tiles was carried out to verify its thermal performance in a real situation. The results showed that recycled tiles must be used with technical criteria similar to those used for fiber-cement tiles with regard to heat gain into the building. These criteria should take local characteristics into account, especially in regions with a hot and humid climate, and, according to the literature, their use should be associated with thermal insulation elements and passive techniques such as ventilated attics, ceilings and greater ceiling heights.
Abstract:
Increases in pediatric thyroid cancer incidence could be partly due to previous clinical intervention. This retrospective cohort study used 1973-2012 data from the Surveillance, Epidemiology, and End Results program to assess the association between previous radiation therapy exposure and the development of second primary thyroid cancer (SPTC) among 0-19-year-old children. Statistical analysis included the calculation of summary statistics and univariable and multivariable logistic regression analyses. Relative to no previous radiation therapy exposure, cases exposed to radiation had 2.46 times the odds of developing SPTC (95% CI: 1.39-4.34). After adjustment for sex and age at diagnosis, Hispanic children who received radiation therapy for a first primary malignancy had 3.51 times the odds of developing SPTC compared to Hispanic children who had not received radiation therapy [AOR=3.51, 99% CI: 0.69-17.70, p=0.04]. These findings support the development of age-specific guidelines for the use of radiation-based interventions among children with and without cancer.
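As a hedged illustration of the multivariable logistic regression used above to derive adjusted odds ratios, the sketch below fits a logit model with statsmodels on simulated cohort data. The variable names (radiation, female, age_dx, sptc) and all numbers are hypothetical; this is not the SEER data or the study's code.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated cohort: sptc = 1 if a second primary thyroid cancer developed.
    rng = np.random.default_rng(1)
    n = 2000
    df = pd.DataFrame({
        "radiation": rng.integers(0, 2, n),   # previous radiation therapy (0/1)
        "female": rng.integers(0, 2, n),      # sex
        "age_dx": rng.integers(0, 20, n),     # age at first diagnosis (0-19 years)
    })
    logit_true = -4.0 + 1.2 * df["radiation"] + 0.4 * df["female"] + 0.05 * df["age_dx"]
    df["sptc"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

    # Multivariable logistic regression: exponentiated coefficients are adjusted odds ratios.
    model = smf.logit("sptc ~ radiation + female + age_dx", data=df).fit(disp=False)
    summary = np.exp(model.conf_int())
    summary.columns = ["CI 2.5%", "CI 97.5%"]
    summary["AOR"] = np.exp(model.params)
    print(summary)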
Abstract:
Institutions are widely regarded as important, even ultimate, drivers of economic growth and performance. Recent mainstream work in institutional economics has concentrated on the effect of persisting, often imprecisely measured institutions and on cataclysmic events as agents of noteworthy institutional change. As a consequence, institutional change without large-scale shocks has received little attention. In this dissertation I apply a complementary, quantitative-descriptive approach that relies on measures of actually enforced institutions to study institutional persistence and change over a long time period that is undisturbed by the typically studied cataclysmic events. Placing institutional change at the center of attention makes it possible to recognize different speeds of institutional innovation and the continuous coexistence of institutional persistence and change. Specifically, I combine text mining procedures, network analysis techniques and statistical approaches to study persistence and change in England’s common law over the Industrial Revolution (1700-1865). Based on the doctrine of precedent - a peculiarity of common law systems - I construct and analyze what appears to be the first citation network reflecting lawmaking in England. Most strikingly, I find large-scale change in the making of English common law around the turn of the 19th century - a period free from the typically studied cataclysmic events. Within a few decades, a legal innovation process with low depreciation rates (1 to 2 percent) and strong past-persistence transitioned to a present-focused innovation process with significantly higher depreciation rates (4 to 6 percent) and weak past-persistence. Comparison with U.S. Supreme Court data reveals a similar U.S. transition towards the end of the 19th century. The English and U.S. transitions appear to have unfolded in a very specific manner: a new body of law arose during the transitions and developed in a self-referential manner while the existing body of law lost influence, but remained prominent. Additional findings suggest that Parliament doubled its influence on the making of case law within the first decades after the Glorious Revolution and that England’s legal rules manifested a high degree of long-term persistence. The latter allows for the possibility that the often-noted persistence of institutional outcomes derives from the actual persistence of institutions.
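A toy sketch of the depreciation idea discussed above: given a citation network between cases with known decision years, one can look at how old the cited precedents are and back out a crude exponential "depreciation" rate. The case names, years and the exponential forgetting model below are invented simplifications, not the dissertation's actual data or method.

    import numpy as np
    import networkx as nx

    # Toy citation network with hypothetical decision years; edges point from citing to cited case.
    years = {"A": 1710, "B": 1750, "C": 1790, "D": 1800, "E": 1820, "F": 1850}
    G = nx.DiGraph([("C", "A"), ("D", "B"), ("D", "C"), ("E", "C"), ("E", "D"), ("F", "D"), ("F", "E")])

    # Age of each cited precedent at the moment it is cited.
    lags = np.array([years[citing] - years[cited] for citing, cited in G.edges()])

    # Under an exponential forgetting model, P(cite a precedent of age t) ~ exp(-delta * t),
    # so the mean citation lag estimates 1/delta; delta plays the role of a depreciation rate.
    delta = 1.0 / lags.mean()
    print(f"implied depreciation rate ≈ {delta:.3f} per year")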
Abstract:
The formation of cartilage tissue depends on the coordination of cell-cell and cell-ECM interactions that lead to cell polarity, migration and differentiation of precursor mesenchymal cells during chondrogenesis. Many of these events are mediated by ECM components such as glycoconjugates, whose sugar residues, such as galactose or amino sugars, act as ligands for regulatory molecules. The aim of this study was to identify the presence and distribution of different glycoconjugates and their sugar residues during chondrogenesis by histochemistry and lectin histochemistry techniques. For this purpose, embryos from pregnant Wistar rats were collected from E12 to E20 and fixed. Some were stained with alizarin red S/alcian blue to demonstrate cartilage and bone formation in whole-mount embryos. Serial sections (5-7 μm thickness) of the other embryos were stained with: 1) alcian blue (pH 1) for S-GAGs, 2) alcian blue (pH 2.5) for C-GAGs, 3) PAS/alcian blue for neutral and acidic sugars, and 4) toluidine blue for metachromatic substances. Stained sections were graded according to staining intensity (0-5 grading method). Statistical analysis showed significant differences for these substances among experimental groups. Lectin histochemistry with MPA, VVA, SBA and OFA demonstrated differences in sugar residues between organs during chondrogenesis. It appears that the synthesis and secretion of glycoconjugates and the changes in their sugar residues follow a spatiotemporal, developmentally regulated pattern.
Abstract:
Background: Learning styles are cognitive, emotional, and physiological traits, as well as indicators of how learners perceive, interact, and respond to their learning environments. According to Honey-Mumford, learning styles are classified as active, reflective, theoretical, and pragmatic. Objective: The purpose of this study was to identify the predominant learning styles among pharmacy students at the Federal University of Paraná, Brazil. Methods: An observational, cross-sectional, and descriptive study was conducted using the Honey-Alonso Learning Style Questionnaire. Students in the Bachelor of Pharmacy program were invited to participate in this study. The questionnaire comprised 80 randomized questions, 20 for each of the four learning styles. The maximum possible score was 20 points for each learning style, and cumulative scores indicated the predominant learning styles among the participants. Honey-Mumford (1986) proposed five preference levels for each style (very low, low, moderate, high, and very high), called the general interpretation scale, to avoid students identifying with one learning style and ignoring the characteristics of the other styles. Statistical analysis was performed using the Statistical Package for the Social Sciences (SPSS) version 20.0. Results: This study included 297 students (70% of all pharmacy students at the time) with a median age of 21 years. Women comprised 77.1% of participants. The predominant style among pharmacy students at the Federal University of Paraná was the pragmatist, with a median of 14 (high preference). The pragmatist style prevails in people who are able to discover techniques related to their daily learning, because such people are curious to discover new strategies and attempt to verify whether the strategies are efficient and valid. Because these people are direct and objective in their actions, pragmatists prefer to focus on practical issues that are validated and on problem situations. There was no statistically significant difference between genders with regard to learning styles. Conclusion: The pragmatist style is the prevailing style among pharmacy students at the Federal University of Paraná. Although students may have a learning preference, that preference is not the only manner in which they can learn, nor is it the only manner in which they can be taught. Awareness of students' learning styles can be used to adapt the methodology used by teachers to render the teaching-learning process effective and long-lasting. The content taught to students should be presented in different manners, because varying teaching methods can develop learning skills in students.
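As a hedged illustration of how an 80-item, four-style questionnaire of this kind can be scored, the sketch below totals 20 items per style and maps each total to a preference level. The item-to-style mapping, the random answers and the cutoff values are invented for illustration and are not the published Honey-Alonso interpretation scale.

    import random
    from collections import Counter

    STYLES = ("active", "reflective", "theoretical", "pragmatic")

    def score(responses, item_style):
        """responses: item number -> 0/1 agreement; item_style: item number -> style name."""
        totals = Counter({style: 0 for style in STYLES})
        for item, answer in responses.items():
            totals[item_style[item]] += answer
        return dict(totals)

    def preference_level(points, cutoffs=(6, 10, 14, 18)):
        # Cutoffs are illustrative only, not the published general interpretation scale.
        labels = ("very low", "low", "moderate", "high", "very high")
        return labels[sum(points >= c for c in cutoffs)]

    # Example: 80 items, 20 per style, answered at random.
    random.seed(0)
    item_style = {i: STYLES[i % 4] for i in range(80)}
    responses = {i: random.randint(0, 1) for i in range(80)}
    print({style: (pts, preference_level(pts)) for style, pts in score(responses, item_style).items()})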
Abstract:
This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use models on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of free-flow speed distributions; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles in its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability. A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Finally, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take into account the survey design. All methods are rigorously validated on both real and simulated data.
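For the multidimensional integrals mentioned above, a minimal sketch of evaluating a correlated multivariate normal rectangle probability is shown below, with a Monte Carlo check. SciPy's multivariate normal CDF is, to my understanding, based on Genz-style numerical integration; the dimension, correlation structure and evaluation point are arbitrary illustrative choices, not the dissertation's model.

    import numpy as np
    from scipy.stats import multivariate_normal

    # A correlated 4-dimensional normal, as would arise from the error terms of an
    # unordered (probit-type) discrete choice model.
    mean = np.zeros(4)
    cov = 0.5 * np.eye(4) + 0.5 * np.ones((4, 4))   # equicorrelated errors
    upper = np.array([0.3, -0.1, 0.7, 0.0])

    # Rectangle probability P(X <= upper) via numerical integration.
    prob = multivariate_normal(mean=mean, cov=cov).cdf(upper)
    print(f"numerical integration: {prob:.4f}")

    # Crude Monte Carlo check of the same probability.
    rng = np.random.default_rng(0)
    draws = rng.multivariate_normal(mean, cov, size=200_000)
    print(f"Monte Carlo:           {(draws <= upper).all(axis=1).mean():.4f}")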
Abstract:
Background: The aim of this study was the evaluation of a fast Gradient Spin Echo technique (GraSE) for cardiac T2-mapping, combining a robust estimation of T2 relaxation times with short acquisition times. The sequence was compared against two previously introduced T2-mapping techniques in a phantom and in vivo. Methods: Phantom experiments were performed at 1.5 T using a commercially available cylindrical gel phantom. Three different T2-mapping techniques were compared: a Multi Echo Spin Echo (MESE; serving as a reference), a T2-prepared balanced Steady State Free Precession (T2prep) and a Gradient Spin Echo sequence. For the subsequent in vivo study, 12 healthy volunteers were examined on a clinical 1.5 T scanner. The three T2-mapping sequences were performed at three short-axis slices. Global myocardial T2 relaxation times were calculated and statistical analysis was performed. For the assessment of pixel-by-pixel homogeneity, the number of segments showing an inhomogeneous T2 value distribution, defined by a pixel SD exceeding 20% of the corresponding observed T2 time, was counted. Results: Phantom experiments showed a greater difference of measured T2 values between T2prep and MESE than between GraSE and MESE, especially for species with low T1 values. Both GraSE and T2prep resulted in an overestimation of T2 times compared to MESE. In vivo, significant differences between mean T2 times were observed. In general, T2prep resulted in the lowest (52.4 ± 2.8 ms) and GraSE in the highest T2 estimates (59.3 ± 4.0 ms). Analysis of pixel-by-pixel homogeneity revealed the lowest number of segments with inhomogeneous T2 distribution for GraSE-derived T2 maps. Conclusions: The GraSE sequence is a fast and robust sequence, combining advantages of both the MESE and T2prep techniques, and promises to improve the clinical applicability of T2-mapping in the future. Our study revealed significant differences in derived mean T2 values when applying different sequence designs. Therefore, a systematic comparison of different cardiac T2-mapping sequences and the establishment of dedicated reference values should be the goal of future studies.
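The abstract describes pixel-wise T2 estimation and a homogeneity criterion based on the pixel SD exceeding 20% of the observed T2. The sketch below shows a generic mono-exponential T2 fit on simulated multi-echo data together with that criterion; the echo times, noise model and segment size are assumptions, and this is not the vendor sequences or the study's fitting code.

    import numpy as np
    from scipy.optimize import curve_fit

    def decay(te, s0, t2):
        # Mono-exponential signal model S(TE) = S0 * exp(-TE / T2).
        return s0 * np.exp(-te / t2)

    te = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])           # echo times in ms
    rng = np.random.default_rng(0)
    true_t2 = rng.normal(55.0, 3.0, size=(8, 8))                  # simulated segment, T2 in ms
    signals = 1000.0 * np.exp(-te[None, None, :] / true_t2[..., None])
    signals += rng.normal(0.0, 10.0, signals.shape)               # additive noise (Gaussian approximation)

    # Pixel-wise fit of S0 and T2.
    t2_map = np.empty(true_t2.shape)
    for i in range(signals.shape[0]):
        for j in range(signals.shape[1]):
            (s0_hat, t2_hat), _ = curve_fit(decay, te, signals[i, j], p0=(1000.0, 50.0))
            t2_map[i, j] = t2_hat

    mean_t2, sd_t2 = t2_map.mean(), t2_map.std()
    # Homogeneity criterion from above: a segment is inhomogeneous if pixel SD > 20% of its mean T2.
    print(f"mean T2 = {mean_t2:.1f} ms, SD = {sd_t2:.1f} ms, inhomogeneous = {sd_t2 > 0.2 * mean_t2}")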
Abstract:
International audience
Abstract:
Dissertation (Master's) — Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, Programa de Pós-Graduação em Informática, 2016.
Abstract:
The ability to adapt and the speed of decision-making distinguish the companies that best manage to compete and grow in the global market. To act quickly, organizations need increasingly effective information systems, and a new role considered fundamental for companies has recently emerged: the Data Scientist. It is in this context, and to respond to current and future challenges, that increasingly advanced information systems arise, supported by statistical analysis and visualization models. This work consists of creating a methodology for developing default-prediction and consumer-profile models, applied to credit cards, based on a behavioural-analysis approach and using survival analysis techniques. Techniques for processing the collected data are defined, and a non-parametric Kaplan-Meier model and several Cox proportional hazards models are estimated. Using the time-dependent ROC curve, the AUC and the Gini index, it is concluded that the final model performs well at identifying customers in default or with a propensity to default.
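As a minimal sketch of the survival-analysis workflow described above (a Kaplan-Meier estimate plus Cox proportional-hazards regression), the code below uses the lifelines package on simulated credit-card data. The covariates, the simulated hazard and the use of the concordance index in place of a time-dependent ROC/AUC or Gini index are all assumptions for illustration, not the thesis's data or tooling.

    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Simulated portfolio: "duration" is months on book, "default" marks the event
    # (1 = defaulted) or censoring (0 = still performing at the end of observation).
    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "utilization": rng.uniform(0.0, 1.0, n),          # hypothetical behavioural covariates
        "prior_delinquency": rng.integers(0, 2, n),
    })
    hazard = 0.02 * np.exp(1.5 * df["utilization"] + 0.8 * df["prior_delinquency"])
    time_to_default = rng.exponential((1.0 / hazard).to_numpy())
    censor_time = rng.uniform(6.0, 36.0, n)
    df["duration"] = np.minimum(time_to_default, censor_time)
    df["default"] = (time_to_default <= censor_time).astype(int)

    # Non-parametric Kaplan-Meier estimate of the survival (non-default) curve.
    km = KaplanMeierFitter().fit(df["duration"], event_observed=df["default"])
    print(f"median time to default: {km.median_survival_time_:.1f} months")

    # Cox proportional-hazards model relating covariates to the default hazard.
    cph = CoxPHFitter().fit(df, duration_col="duration", event_col="default")
    cph.print_summary()
    print(f"concordance index (discrimination, akin to AUC/Gini): {cph.concordance_index_:.3f}")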
Abstract:
Part 19: Knowledge Management in Networks
Abstract: