918 results for STATISTICAL-METHOD


Relevance: 60.00%

Publisher:

Abstract:

Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of auditory system behaviour, though its interpretation is subjective because of the manual process employed in its study and the clinical experience required for its analysis. When analysing the ABR, clinicians are often interested in identifying the ABR signal components referred to as Jewett waves. In particular, detecting and studying the time at which these waves occur (i.e., the wave latency) is a practical tool for diagnosing disorders affecting the auditory system. Significant differences in inter-examiner results may lead to completely distinct clinical interpretations of the state of the auditory system. In this context, the aim of this research was to evaluate inter-examiner agreement and variability in the manual classification of ABR. Methods: A total of 160 ABR data samples were collected, at four different stimulus intensities (80 dBHL, 60 dBHL, 40 dBHL and 20 dBHL), from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). Four examiners with expertise in the manual classification of ABR components participated in the study. The Bland-Altman statistical method was employed to assess inter-examiner agreement and variability. The mean, standard deviation and error of the bias, which is the difference between examiners' annotations, were estimated for each pair of examiners. Scatter plots and histograms were employed for data visualization and analysis. Results: In most comparisons the differences between examiners' annotations were below 0.1 ms, which is clinically acceptable. In four cases, a large error and standard deviation (>0.1 ms) were found, indicating the presence of outliers and thus discrepancies between examiners.
Conclusions: Our results quantify the inter-examiner agreement and variability of the manual analysis of ABR data, and also allow different patterns of manual ABR analysis to be determined.
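The Bland-Altman assessment described above can be sketched as follows; the latency annotations below are hypothetical values, and the 1.96-standard-deviation limits of agreement follow the standard Bland-Altman convention:

```python
# Minimal Bland-Altman sketch for two examiners' wave latency
# annotations (values in ms are hypothetical illustrations).
import statistics

examiner_a = [5.50, 5.62, 5.48, 5.71, 5.55, 5.60]
examiner_b = [5.52, 5.60, 5.50, 5.68, 5.58, 5.63]

# Bias is the mean difference between paired annotations.
diffs = [a - b for a, b in zip(examiner_a, examiner_b)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)

# 95% limits of agreement: bias +/- 1.96 standard deviations.
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

# A pair of examiners would be flagged when the bias or the spread
# exceeds the clinically acceptable 0.1 ms threshold.
print(bias, loa_low, loa_high)
```

In practice the scatter plot of differences against pairwise means, as used in the study, would accompany these numbers.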

Relevance: 60.00%

Publisher:

Abstract:

Lack of access to insurance exacerbates the impact of climate variability on smallholder farmers in Africa. Unlike traditional insurance, which compensates proven agricultural losses, weather index insurance (WII) pays out in the event that a weather index is breached. In principle, WII could be provided to farmers throughout Africa, but there are two data-related hurdles. First, most farmers do not live close enough to a rain gauge with a sufficiently long record of observations. Second, mismatches between weather indices and yield may expose farmers to uncompensated losses, and insurers to unfair payouts, a phenomenon known as basis risk. In essence, basis risk results from complexities in the progression from meteorological drought (rainfall deficit) to agricultural drought (low soil moisture). In this study, we use a land-surface model to describe the transition from meteorological to agricultural drought. We demonstrate that spatial and temporal aggregation of rainfall results in a clearer link with soil moisture, and hence a reduction in basis risk. We then use an advanced statistical method to show how optimal aggregation of satellite-based rainfall estimates can reduce basis risk, enabling remotely sensed data to be used robustly for WII.
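The effect of temporal aggregation can be illustrated with a toy sketch (synthetic data, not the study's land-surface model): if soil moisture behaves like a running mean of recent rainfall, an index built from aggregated rainfall correlates with it far better than daily rainfall does:

```python
# Toy illustration: aggregating rainfall over a longer window
# strengthens its correlation with (synthetic) soil moisture.
import random
import statistics

random.seed(42)
days = 730
rain = [random.random() for _ in range(days)]

window = 30
# Synthetic soil moisture: a 30-day running mean of rainfall.
soil = [statistics.mean(rain[i - window + 1:i + 1]) for i in range(window, days)]
daily = rain[window:]
# 30-day aggregated rainfall index aligned with the same days.
agg = [sum(rain[i - window + 1:i + 1]) for i in range(window, days)]

def corr(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# The aggregated index tracks soil moisture far better than daily rain,
# which is the mechanism by which aggregation reduces basis risk.
print(corr(daily, soil), corr(agg, soil))
```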

Relevance: 60.00%

Publisher:

Abstract:

The 5′ cis-regulatory region of the CCR5 gene exhibits a strong signature of balancing selection in several human populations. Here we analyze the polymorphism of this region in Amerindians from Amazonia, who have a complex demographic history, including recent bottlenecks that are known to reduce genetic variability. Amerindians show high nucleotide diversity (π = 0.27%) and a significantly positive Tajima's D, and carry haplotypes associated with both weak and strong gene expression. To evaluate whether these signatures of balancing selection could be explained by demography, we perform neutrality tests based on empirical and simulated data. The observed Tajima's D was higher than that of other world populations: higher than that found for 18 noncoding regions of South Amerindians, and higher than 99.6% of simulated genealogies, which assume nonequilibrium conditions. Moreover, comparing Amerindians and Asians, the Fst for the CCR5 cis-regulatory region was unusually low relative to neutral markers. These findings indicate that, despite their complex demographic history, South Amerindians carry a detectable signature of selection on the CCR5 cis-regulatory region.
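Tajima's D, the neutrality statistic used above, compares nucleotide diversity (π) with the diversity expected from the number of segregating sites; a minimal sketch using the standard Tajima (1989) constants, over four hypothetical aligned sequences, might look like:

```python
# Minimal Tajima's D sketch (Tajima 1989) over aligned sequences;
# the four short sequences below are hypothetical illustrations.
def tajimas_d(seqs):
    n = len(seqs)
    length = len(seqs[0])
    # S: number of segregating (polymorphic) sites.
    s = sum(1 for i in range(length) if len({q[i] for q in seqs}) > 1)
    # pi: mean number of pairwise differences.
    pairs = n * (n - 1) // 2
    diffs = sum(sum(a != b for a, b in zip(seqs[i], seqs[j]))
                for i in range(n) for j in range(i + 1, n))
    pi = diffs / pairs
    # Standard normalizing constants from Tajima (1989).
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n ** 2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    return (pi - s / a1) / (e1 * s + e2 * s * (s - 1)) ** 0.5

# A positive D (excess of intermediate-frequency variants) is the
# signature of balancing selection reported for CCR5 above.
print(tajimas_d(["AAAA", "AAAT", "AATT", "ATTT"]))
```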

Relevance: 60.00%

Publisher:

Abstract:

The increase in biodiversity from high to low latitudes is a widely recognized biogeographical pattern. According to the latitudinal gradient hypothesis (LGH), this pattern was shaped by differential effects of Late Quaternary climatic changes across a latitudinal gradient. Here, we evaluate the effects of climatic changes across a tropical latitudinal gradient and their implications for the diversification of an Atlantic Forest (AF) endemic passerine. We studied the intraspecific diversification and historical demography of Sclerurus scansor, based on mitochondrial (ND2, ND3 and cytb) and nuclear (FIB7) gene sequences. Phylogenetic analyses recovered three well-supported clades associated with distinct latitudinal zones. Coalescent-based methods were applied to estimate divergence times and changes in effective population sizes. Estimates of divergence times indicate that intraspecific diversification took place during the Middle-Late Pleistocene. Distinct demographic scenarios were identified, with the southern lineage exhibiting a clear signature of demographic expansion, while the central one remained more stable. The northern lineage, contrasting with LGH predictions, exhibited a clear sign of a recent bottleneck. Our results suggest that different AF regions reacted distinctly, even in opposite ways, under the same climatic period, simultaneously producing favourable scenarios for isolation and contact among populations.

Relevance: 60.00%

Publisher:

Abstract:

Background: Genetic variation for environmental sensitivity indicates that animals are genetically different in their response to environmental factors. Environmental factors are either identifiable (e.g. temperature), and called macro-environmental, or unknown, and called micro-environmental. The objectives of this study were to develop a statistical method to estimate genetic parameters for macro- and micro-environmental sensitivities simultaneously, to investigate bias and precision of the resulting estimates of genetic parameters, and to develop and evaluate the use of Akaike's information criterion based on h-likelihood to select the best-fitting model. Methods: We assumed that genetic variation in macro- and micro-environmental sensitivities is expressed as genetic variance in the slope of a linear reaction norm and in environmental variance, respectively. A reaction norm model to estimate genetic variance for macro-environmental sensitivity was combined with a structural model for residual variance to estimate genetic variance for micro-environmental sensitivity, using a double hierarchical generalized linear model in ASReml. Akaike's information criterion was constructed as a model selection criterion using the approximated h-likelihood. Populations of sires with large half-sib offspring groups were simulated to investigate bias and precision of the estimated genetic parameters. Results: Designs with 100 sires, each with at least 100 offspring, are required to have standard deviations of estimated variances lower than 50% of the true value. When the number of offspring increased, standard deviations of estimates across replicates decreased substantially, especially for genetic variances of macro- and micro-environmental sensitivities. Standard deviations of estimated genetic correlations across replicates were quite large (between 0.1 and 0.4), especially when sires had few offspring. Practically no bias was observed for estimates of any of the parameters.
Using Akaike's information criterion, the true genetic model was selected as the best statistical model in at least 90% of 100 replicates when the number of offspring per sire was 100. Application of the model to lactation milk yield in dairy cattle showed that genetic variance for micro- and macro-environmental sensitivities exists. Conclusion: The algorithm and model selection criterion presented here can contribute to a better understanding of the genetic control of macro- and micro-environmental sensitivities. Designs or datasets should have at least 100 sires, each with 100 offspring.
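The model structure described above can be written compactly; this is a sketch of a generic double hierarchical reaction-norm formulation, not the paper's exact parameterization (the symbols below are illustrative):

```latex
% Sketch: reaction norm for macro-environmental sensitivity combined
% with a structural model for the residual (micro-environmental) variance.
\begin{align}
y_{ij} &= \mu + a_i + b_i\,x_j + e_{ij},
\qquad e_{ij} \sim N\!\left(0,\ \sigma^2_{e,i}\right),\\
\log \sigma^2_{e,i} &= \eta + v_i,
\end{align}
```

where $y_{ij}$ is the phenotype of offspring $j$ of sire $i$, $x_j$ an environmental covariate, $a_i$ and $b_i$ the genetic effects on level and slope (macro-environmental sensitivity), and $v_i$ the genetic effect on the residual variance (micro-environmental sensitivity).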

Relevance: 60.00%

Publisher:

Abstract:

This research is concerned with the mechanical and physical properties of hemp fibre reinforced concrete (HFRC). An experimental program was developed based on the statistical method of fractional factorial design. The variables for the experimental study were: (1) mixing method; (2) fibre content by weight; (3) aggregate size; and (4) fibre length. Their effects on the compressive and flexural performance of HFRC composites were investigated. The specific gravity and water absorption ratio of HFRC were also studied. The results indicate that the compressive and flexural properties can be modelled using a simple empirical linear expression based on statistical analysis and regression, and that hemp fibre content (by weight) is the critical factor affecting the compressive and flexural properties of HFRC.
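A simple empirical linear fit of the kind described can be sketched with ordinary least squares; the strength values below are hypothetical illustrations, not the study's measurements:

```python
# Least-squares sketch: compressive strength vs hemp fibre content.
# Data points are hypothetical (fibre content % by weight, strength MPa).
fibre = [0.0, 0.5, 1.0, 1.5, 2.0]
strength = [30.0, 29.0, 28.0, 27.0, 26.0]

n = len(fibre)
mx = sum(fibre) / n
my = sum(strength) / n
# Closed-form simple linear regression coefficients.
slope = sum((x - mx) * (y - my) for x, y in zip(fibre, strength)) \
        / sum((x - mx) ** 2 for x in fibre)
intercept = my - slope * mx

# A negative slope means strength drops as fibre content rises.
print(slope, intercept)  # prints -2.0 30.0
```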

Relevance: 60.00%

Publisher:

Abstract:

Colour patterns and their visual backgrounds consist of a mosaic of patches that vary in colour, brightness, size, shape and position. Most studies of crypsis, aposematism, sexual selection, or other forms of signalling concentrate on one or two patch classes (colours), either ignoring the rest of the colour pattern, or analysing the patches separately. We summarize methods of comparing colour patterns making use of known properties of bird eyes. The methods are easily modifiable for other animal visual systems. We present a new statistical method to compare entire colour patterns rather than comparing multiple pairs of patches. Unlike previous methods, the new method detects differences in the relationships among the colours, not just differences in colours. We present tests of the method's ability to detect a variety of kinds of differences between natural colour patterns and provide suggestions for analysis.

Relevance: 60.00%

Publisher:

Abstract:

Stormwater pipe systems in Australia are designed to convey water from rainfall and surface runoff only and do not transport sewage. Any blockage can cause flooding events, with the probability of subsequent property damage. Proactive maintenance plans that can enhance their serviceability need to be developed based on a sound deterioration model. This paper uses a neural network (NN) approach to model deterioration in serviceability of concrete stormwater pipes, which make up the bulk of the stormwater network in Australia. System condition data was collected using CCTV images. The outcomes of the model are the identification of the significant factors influencing serviceability deterioration and the forecasting of the change in serviceability condition over time for individual pipes based on the pipe attributes. The proposed method is validated and compared with multiple discriminant analysis, a traditional statistical method. The results show that the NN model can be applied to forecasting serviceability deterioration. However, further improvements in data collection and condition grading schemes should be carried out to increase the prediction accuracy of the NN model.
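A minimal sketch of the neural-network idea, far simpler than the NN used in the study, is a single logistic unit trained by gradient descent; the pipe attributes and labels below are hypothetical:

```python
# Single logistic neuron: predicts 'deteriorated' (1) vs 'sound' (0)
# from two scaled pipe attributes (age, diameter). Hypothetical data.
import math

# (age, diameter) -> condition; older pipes deteriorate in this toy set.
data = [((0.1, 0.5), 0), ((0.2, 0.8), 0), ((0.3, 0.4), 0),
        ((0.8, 0.5), 1), ((0.9, 0.7), 1), ((0.7, 0.3), 1)]

w = [0.0, 0.0]
b = 0.0
lr = 1.0

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))   # sigmoid activation

# Plain stochastic gradient descent on the logistic (cross-entropy) loss.
for _ in range(2000):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(accuracy)
```

A real deterioration model would use many more attributes, hidden layers, and held-out validation data, as the study does.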

Relevance: 60.00%

Publisher:

Abstract:

We present a method for foreground/background separation of audio using a background modelling technique. The technique models the background in an online, unsupervised, and adaptive fashion, and is designed for application to long term surveillance and monitoring problems. The background is determined using a statistical method to model the states of the audio over time. In addition, three methods are used to increase the accuracy of background modelling in complex audio environments. Such environments can cause the failure of the statistical model to accurately capture the background states. An entropy-based approach is used to unify background representations fragmented over multiple states of the statistical model. The approach successfully unifies such background states, resulting in a more robust background model. We adaptively adjust the number of states considered background according to background complexity, resulting in the more accurate classification of background models. Finally, we use an auxiliary model cache to retain potential background states in the system. This prevents the deletion of such states due to a rapid influx of observed states that can occur for highly dynamic sections of the audio signal. The separation algorithm was successfully applied to a number of audio environments representing monitoring applications.
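The state-based background idea can be sketched in miniature: quantized audio frames are counted online, and a frame is labelled background when its state accounts for a large enough share of the observations so far. This is a drastic simplification of the statistical model described above, with hypothetical frame values:

```python
# Miniature online background model for audio frames: states are
# quantized feature values; frequent states are treated as background.
from collections import Counter

def classify(frames, threshold=0.3):
    """Label each frame foreground/background online. A frame is
    background once its quantized state accounts for at least
    `threshold` of all observations so far (early frames are
    trivially background in this toy version)."""
    counts = Counter()
    labels = []
    for t, frame in enumerate(frames, start=1):
        state = round(frame, 1)        # crude quantization into states
        counts[state] += 1
        labels.append("background" if counts[state] / t >= threshold
                      else "foreground")
    return labels

# A steady hum (around 0.5) with a brief transient (0.9) in the middle.
stream = [0.50, 0.51, 0.49, 0.50, 0.90, 0.50, 0.51]
print(classify(stream))
```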

Relevance: 60.00%

Publisher:

Abstract:

It is well known that the outcome of an intervention is affected both by the inherent effects of the intervention and by the patient's expectations. For this reason, in comparative clinical trials an effort is made to conceal the nature of the administered intervention from the participants in the trial, i.e. to blind the trial. Yet in practice, perfect blinding is impossible to ensure or even to verify post hoc. The current clinical standard is to follow up the trial with an auxiliary questionnaire, which allows trial participants to express in closed form their belief concerning the intervention, i.e. their trial group assignment (treatment or control). Auxiliary questionnaire responses are then used to compute the extent of blinding in the trial in the form of a blinding index. If the estimated extent of blinding exceeds a particular threshold, the trial is deemed sufficiently blinded; otherwise, the strength of evidence of the trial is brought into question, which may necessitate repeating the trial. In this paper we make several contributions. Firstly, we identify a series of problems with the aforesaid clinical practice and discuss them in the context of the most commonly used blinding indexes. Secondly, we formulate a novel approach for handling imperfectly blinded trials. We adopt a feedback questionnaire of the same form as that currently in use, but interpret the collected data using a novel statistical method, significantly different from those proposed in previous work. Unlike previously proposed approaches, our method is free of ad hoc parameters and robust to small changes in the participants' feedback responses. Our method also does not discard any data and is not predicated on any strong assumptions used to interpret participants' feedback. The key idea behind the present method is that it is meaningful to compare only the corresponding treatment and control participant sub-groups, that is, sub-groups matched by their auxiliary responses.
A series of experiments on simulated trials demonstrates the effectiveness of the proposed approach and its superiority over those currently in use.
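The key idea of comparing only treatment and control participants matched by their auxiliary (guess) responses can be sketched as follows; the response records are hypothetical:

```python
# Sketch: compare outcomes only within strata defined by participants'
# blinding-questionnaire guesses. Records are hypothetical.
from collections import defaultdict
from statistics import mean

# (group, guess, outcome) triples for a simulated trial.
records = [
    ("treatment", "guess_treatment", 7.0),
    ("treatment", "guess_treatment", 6.5),
    ("control",   "guess_treatment", 5.0),
    ("control",   "guess_treatment", 5.5),
    ("treatment", "guess_control",   6.0),
    ("control",   "guess_control",   5.0),
    ("treatment", "dont_know",       6.2),
    ("control",   "dont_know",       5.1),
]

# Bucket outcomes by (guess, group); compare only matched sub-groups.
buckets = defaultdict(list)
for group, guess, outcome in records:
    buckets[(guess, group)].append(outcome)

effects = {}
for guess in {g for _, g, _ in records}:
    t = buckets.get((guess, "treatment"))
    c = buckets.get((guess, "control"))
    if t and c:  # compare only strata present in both arms
        effects[guess] = mean(t) - mean(c)

print(effects)
```

No auxiliary responses are discarded here; strata without both arms simply contribute no comparison.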

Relevance: 60.00%

Publisher:

Abstract:

Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the method of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest-ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues-greater-than-one rule and scree tests to decide upon the number of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established but outdated heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies.
Based on the findings from factor analysis research, it seems likely that the use of such methods may have had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper.
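For example, the eigenvalues-greater-than-one (Kaiser) rule mentioned above retains one component per correlation-matrix eigenvalue above 1; a sketch assuming NumPy is available, with a hypothetical three-variable correlation matrix:

```python
# Kaiser (eigenvalues > 1) rule sketch on a hypothetical 3x3
# correlation matrix: two correlated items plus one unrelated item.
import numpy as np

# Hypothetical correlation matrix for three observed variables.
R = np.array([[1.0, 0.7, 0.2],
              [0.7, 1.0, 0.2],
              [0.2, 0.2, 1.0]])

eigvals = np.linalg.eigvalsh(R)          # eigenvalues, ascending
n_retain = int((eigvals > 1.0).sum())    # Kaiser rule: keep those > 1

print(sorted(eigvals, reverse=True), n_retain)
```

As the paper notes, this heuristic is widely used but often over- or under-extracts; parallel analysis is the commonly recommended alternative.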

Relevance: 60.00%

Publisher:

Abstract:

As the service life of a water supply network (WSN) grows, aging of the pipe network becomes an increasingly serious problem. Because an urban water supply network is a hidden underground asset, it is difficult for monitoring staff to classify pipe network faults directly, even with modern detection technology. In this paper, based on basic property data (e.g. diameter, material, pressure, distance to pump, distance to tank, load) of the water supply network, the decision tree algorithm C4.5 was applied to classify the condition of water supply pipelines. Part of the historical data was used to build a decision tree classification model, and the remaining historical data was used to validate the established model. Statistical methods were used to assess the decision tree model, including a basic statistical method, Receiver Operating Characteristic (ROC) and Recall-Precision Curves (RPC). These methods were successfully used to assess the accuracy of the established classification model of the water pipe network. The purpose of the classification model was to classify the specific condition of the water pipe network, so that pipelines can be maintained according to the classification results: asset unserviceable (AU), near perfect condition (NPC) and serious deterioration (SD). This research thus focused on pipe classification, which plays a significant role in maintaining water supply networks in the future.
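C4.5 selects splits by information gain (the basis of its gain-ratio criterion); the core entropy-and-gain computation can be sketched as follows, with hypothetical pipe records:

```python
# Information-gain sketch behind C4.5: pick the attribute whose split
# most reduces class entropy. Pipe records are hypothetical.
import math
from collections import Counter, defaultdict

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def info_gain(records, attr, labels):
    # Partition labels by the value of the chosen attribute.
    groups = defaultdict(list)
    for rec, lab in zip(records, labels):
        groups[rec[attr]].append(lab)
    # Weighted entropy remaining after the split.
    remainder = sum(len(g) / len(labels) * entropy(g)
                    for g in groups.values())
    return entropy(labels) - remainder

# attr 0: material, attr 1: load class; label: condition grade.
records = [("cast_iron", "high"), ("cast_iron", "low"),
           ("pvc", "high"), ("pvc", "low")]
labels = ["SD", "SD", "NPC", "NPC"]  # serious deterioration / near perfect

# Material separates the classes perfectly; load class does not.
print(info_gain(records, 0, labels), info_gain(records, 1, labels))
# prints 1.0 0.0
```

C4.5 proper normalizes this gain by the split information (gain ratio) and handles continuous attributes by thresholding; this sketch shows only the entropy core.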

Relevance: 60.00%

Publisher:

Abstract:

A study of the process of revising the conceptual and methodological parameters guiding the Pesquisa Nacional por Amostra de Domicílios (PNAD) of the Fundação Instituto Brasileiro de Geografia e Estatística (IBGE), covering the period from 1989 to 1992. The aim is to show the process of statistical production, as well as to understand the relations between object, theory and method in investigations directed at measuring reality. In light of reflections on the construction of scientific knowledge and the role of method in the research process, the specificity of the statistical method is discussed. The results indicate that the process of constructing the PNAD establishes itself as a scientific practice.

Relevance: 60.00%

Publisher:

Abstract:

This dissertation was developed in the Business Administration programme of a higher education institution in the city of Belém, with the aim of understanding the main variables involved in the institutional evaluation of teaching staff. It was motivated mainly by concern for the quality of education currently offered throughout the country, and seeks to understand how the institutional evaluation of teaching staff can serve as a source of information for decision making by course management. To this end, a descriptive case study with a quantitative approach was carried out, using structured interviews applied to students and their respective lecturers in Administration, and the multivariate statistical method of Correspondence Analysis to cross-check the collected data. Supported by a theoretical framework drawn from several authors, the results made it possible to identify, by cross-referencing the institutional evaluation performed by the students with the self-evaluation answered by the lecturers, six questions from which conclusions could be drawn, generating important information for the coordination of the course in question. Evaluating the paths taken thus aims to make a small contribution to transforming the teacher evaluation process into one that is increasingly respected and kept up to date with the new context and with organizational, governmental and market demands, insofar as it represents an important source of measurement and information for decision making by university managers.

Relevance: 60.00%

Publisher:

Abstract:

The objective of this study is to assess whether corporate governance practices evolved between 2005 and 2012 in the Brazilian companies that went public between 2004 and 2006, and whether this evolution represented an improvement in governance practices. We also seek to understand whether this evolution occurred continuously throughout the period studied or only in the first years after the IPO, and whether it occurred only in requirements mandated by law or reflected a real change of culture in the companies after going public. To measure the quality of governance practices, we used the Corporate Governance Index (CGI) developed by Leal and Carvalhal-da-Silva (2007) to assess governance practices in Brazilian companies. The index comprises 24 binary questions, and practices are considered to have improved the closer a company's index gets to 24 points. The index was calculated year by year for the period and companies studied. The index was then divided into two parts: questions directly related to laws and regulations, and questions not directly related to laws and regulations. Using Student's t-test and Kendall's W, it was statistically shown that corporate governance practices improved between 2005 and 2012; however, it is also statistically shown that this improvement is concentrated in the 2005-2008 period, with no significant change in practices found between 2009 and 2012. It was also found that there was a positive evolution in governance practices not directly related to laws and regulations between 2005 and 2012; however, when the questions indirectly affected by changes in CVM instructions during the period are removed from this test, a worsening of governance practices between 2005 and 2012 is found.
These results suggest that companies are engaged in a "check the box" approach to good governance practices, complying with what is mandated by laws and regulations, and they call into question whether the evolution of laws, regulations and codes of good corporate governance practice in Brazil is fulfilling its ultimate goal of developing a culture of good governance in Brazilian companies.