972 results for Trivariate Normal Distribution
Abstract:
In previous Statnotes, many of the statistical tests described rely on the assumption that the data are a random sample from a normal or Gaussian distribution. These include most of the tests in common usage, such as the ‘t’ test, the various types of analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). In microbiology research, however, not all variables can be assumed to follow a normal distribution. Yeast populations, for example, are a notable feature of freshwater habitats, representatives of over 100 genera having been recorded. Most common are the ‘red yeasts’ such as Rhodotorula, Rhodosporidium, and Sporobolomyces and ‘black yeasts’ such as Aureobasidium pullulans, together with species of Candida. Despite the abundance of genera and species, the overall density of an individual species in freshwater is likely to be low and hence samples taken from such a population will contain very low numbers of cells. A rare organism living in an aquatic environment may be distributed more or less at random in a volume of water and therefore samples taken from such an environment may result in counts which are more likely to be distributed according to the Poisson than the normal distribution. The Poisson distribution was named after the French mathematician Siméon Poisson (1781-1840) and has many applications in biology, especially in describing rare or randomly distributed events, e.g., the number of mutations in a given sequence of DNA after exposure to a fixed amount of radiation or the number of cells infected by a virus given a fixed level of exposure. This Statnote describes how to fit the Poisson distribution to counts of yeast cells in samples taken from a freshwater lake.
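The fitting procedure this abstract describes can be sketched as follows: estimate the Poisson rate by the sample mean, then compare observed and expected class frequencies with a chi-square goodness-of-fit test. The counts below are hypothetical, not the paper's data.

```python
import numpy as np
from scipy import stats

# Hypothetical counts of yeast cells per water sample (illustrative only).
counts = np.array([0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 2, 1, 0, 1, 0, 0, 1, 2, 0, 1])

# The maximum likelihood estimate of the Poisson rate is the sample mean.
lam = counts.mean()

# Observed vs expected frequencies for the count classes 0, 1, 2, 3+.
values, observed = np.unique(counts, return_counts=True)
expected = len(counts) * stats.poisson.pmf(values, lam)
# Lump the upper tail into the last class so expected frequencies sum to n.
expected[-1] += len(counts) * stats.poisson.sf(values[-1], lam)

# Chi-square goodness of fit; one extra degree of freedom is lost
# because the rate was estimated from the data.
chi2 = ((observed - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2, df=len(values) - 2)
```

A large p-value indicates no evidence against the Poisson model, consistent with cells distributed at random in the water volume.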
Abstract:
Using a modified deprivation (or poverty) function, in this paper we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution using Indian survey data. We show that when income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219], whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.
Abstract:
Evelina Ilieva Veleva - The Wishart distribution arises in practice as the distribution of the sample covariance matrix for observations from a multivariate normal distribution. Some marginal densities are derived by integrating the density of the Wishart distribution. Necessary and sufficient conditions for the positive definiteness of a matrix are proved, which provide the bounds needed for the integration.
Abstract:
The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when the observation is below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead. This is the so-called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof was first established for the normal distribution and extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
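The estimation problem described above can be sketched numerically: working on the log scale reduces the lognormal likelihood to a normal one (the invariance argument in the abstract), with censored observations contributing the normal CDF below the detection limit. Data, detection limit, and starting values here are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy import stats, optimize

# Simulated lognormal data with a lower detection limit (left censoring).
rng = np.random.default_rng(0)
x = rng.lognormal(mean=1.0, sigma=0.5, size=200)
limit = 1.5                        # assumed detection limit, for illustration
observed = np.log(x[x >= limit])   # exact measurements, on the log scale
n_cens = (x < limit).sum()         # count of censored observations
log_limit = np.log(limit)

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)      # parameterize to keep sigma positive
    ll = stats.norm.logpdf(observed, mu, sigma).sum()
    ll += n_cens * stats.norm.logcdf(log_limit, mu, sigma)  # censored mass
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
median_hat = np.exp(mu_hat)                    # lognormal median = exp(mu)
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)   # lognormal mean
```

The final two lines show the application mentioned in the abstract: the fitted (mu, sigma) directly give estimates of the lognormal median and mean.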
Abstract:
The aim of this study was to evaluate the degree of conversion (DC) and the cytotoxicity of photo-cured experimental resin composites containing 4-(N,N-dimethylamino)phenethyl alcohol (DMPOH) combined with camphorquinone (CQ), compared with ethyl 4-(dimethylamino)benzoate (EDAB). The resin composites were mechanically blended using 35 wt% of an organic matrix and 65 wt% of filler loading. To this matrix were added 0.2 wt% of CQ and 0.2 wt% of one of the reducing agents tested. Samples of 5x1 mm (n=5) were first submitted to DC measurement and then pre-immersed in complete culture medium without 10% (v/v) bovine serum for 1 h or 24 h at 37 °C in a humidified incubator with 5% CO2 and 95% humidity to evaluate the cytotoxic effects of the experimental resin composites using the MTT assay on immortalized human keratinocyte cells. Because the data lacked a normal distribution, the statistical analysis was performed using the nonparametric Kruskal-Wallis test to evaluate cytotoxicity and one-way analysis of variance to evaluate the DC. For multiple comparisons, the cytotoxicity analyses were submitted to the Student-Newman-Keuls test and the DC analysis to Tukey's HSD post-hoc test (α=0.05). No significant differences were found between the DC of DMPOH (49.9%) and EDAB (50.7%). The 1 h outcomes showed no significant difference in cell viability between EDAB (99.26%), DMPOH (94.85%) and the control group (100%). After 24 h, no significant difference was found between EDAB (48.44%) and DMPOH (38.06%), but a significant difference was found compared with the control group (p<0.05). DMPOH presented similar DC and cytotoxicity compared with EDAB when associated with CQ.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Prostaglandins control osteoblastic and osteoclastic function under physiological or pathological conditions and are important modulators of the bone healing process. The non-steroidal anti-inflammatory drugs (NSAIDs) inhibit cyclooxygenase (COX) activity and consequently prostaglandin synthesis. Experimental and clinical evidence has indicated a risk for reparative bone formation related to the use of non-selective (COX-1 and COX-2) and COX-2 selective NSAIDs. Ketorolac is a non-selective NSAID which, at low doses, has a preferential COX-1 inhibitory effect, and etoricoxib is a new selective COX-2 inhibitor. Although literature data have suggested that ketorolac can interfere negatively with long bone fracture healing, there seems to be no study associating etoricoxib with reparative bone formation. Paracetamol/acetaminophen, one of the first choices for pain control in clinical dentistry, has been considered a weak anti-inflammatory drug, although supposedly capable of inhibiting COX-2 activity in inflammatory sites. OBJECTIVE: The purpose of the present study was to investigate whether paracetamol, ketorolac and etoricoxib can hinder alveolar bone formation, taking the filling of the rat extraction socket with newly formed bone as the experimental model. MATERIAL AND METHODS: The degree of new bone formation inside the alveolar socket was estimated two weeks after tooth extraction by a differential point-counting method, using an optical microscope with a digital camera for image capture and histometry software. Differences between groups were analyzed by ANOVA after confirming a normal distribution of the sample data. RESULTS AND CONCLUSIONS: Histometric results confirmed that none of the tested drugs had a detrimental effect on the volume fraction of bone trabeculae formed inside the alveolar socket.
Abstract:
The evaluation of the coefficient of variation (CV) as a measure of experimental precision has been carried out for several crops, animal species and forages, through studies proposing classification ranges for CV values that consider the mean, the standard deviation and the distribution of the CV values of the various response variables involved in the experiments. The aim of this work was to study the distribution of CV values from experiments with the common bean crop, proposing ranges to guide researchers in evaluating their studies for each variable. The data were obtained from a review of journals that publish scientific articles on the common bean crop. The variables considered were: yield, number of pods per plant, number of grains per pod, 100-grain weight, final stand, plant height and harvest index. Ranges of CV values were obtained for each variable based on the normal distribution, also using the distribution of the sample quantiles and the median and pseudo-sigma, classifying them as low, medium, high and very high. The statistical calculations for checking the normality of the data were implemented as a function in the free statistical software R. The results indicated that the CV ranges differed among the variables, showing wide variation and justifying the need for a specific evaluation range for each variable.
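The median/pseudo-sigma classification mentioned above can be sketched as follows. The CV values and the band boundaries (median ± multiples of the pseudo-sigma, IQR/1.349) are illustrative assumptions in the spirit of such proposals, not the paper's published ranges.

```python
import numpy as np

# Hypothetical CV values (%) collected from published bean-crop trials.
cv = np.array([8.2, 10.5, 12.1, 14.8, 15.3, 17.9, 18.4, 20.2, 22.7, 25.1,
               26.4, 28.9, 31.0, 34.6, 38.2])

# Robust centre and spread: the median and the pseudo-sigma (IQR / 1.349),
# which estimates the normal sigma without being pulled by outliers.
med = np.median(cv)
q1, q3 = np.percentile(cv, [25, 75])
pseudo_sigma = (q3 - q1) / 1.349

def classify(v):
    # Illustrative bands: low < med - ps; medium up to med + ps;
    # high up to med + 2*ps; very high beyond that.
    if v < med - pseudo_sigma:
        return "low"
    if v <= med + pseudo_sigma:
        return "medium"
    if v <= med + 2 * pseudo_sigma:
        return "high"
    return "very high"

labels = [classify(v) for v in cv]
```

Because the bands depend on the distribution of CV values for each response variable, recomputing them per variable yields the variable-specific ranges the abstract argues for.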
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
Abstract:
This paper proposes a regression model based on the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and Jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models.
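The residual check described in this and the previous abstract can be illustrated with a simulation: generate censored lifetimes, form martingale-type residuals r_M = delta - H(t) from the cumulative hazard, transform them to deviance residuals, and compare their empirical distribution with the standard normal. For simplicity this sketch uses an exponential model with known rate rather than a fitted log-modified Weibull regression.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 2000
rate = 1.0
t_event = rng.exponential(1 / rate, n)      # true lifetimes
t_cens = rng.exponential(3.0, n)            # independent censoring times
t = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)   # 1 = observed, 0 = censored

H = rate * t                                # cumulative hazard of Exp(rate)
r_mart = delta - H                          # martingale residuals

# The deviance transform symmetrizes the left-skewed martingale residuals.
inner = -2 * (r_mart + np.where(delta > 0,
                                delta * np.log(delta - r_mart), 0.0))
r_dev = np.sign(r_mart) * np.sqrt(inner)

# Kolmogorov-Smirnov distance from N(0, 1): small when the model is adequate.
ks_stat, ks_p = stats.kstest(r_dev, "norm")
```

Plotting `r_dev` against fitted values, or as a normal QQ-plot, reproduces the kind of diagnostic display the simulation studies in these papers rely on.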
Abstract:
The Central Limit Theorem and the Law of Large Numbers are among the most important results in probability theory. The first seeks conditions under which [formula] converges in distribution to the normal distribution with parameters 0 and 1 as n tends to infinity, where Sn is the sum of n independent random variables. The second establishes conditions under which [formula] converges to zero or, equivalently, under which [formula] converges to the expectation of the random variables when they are identically distributed. In both cases the sequences considered are of the form [formula], where [formula] and [formula] are real constants. Characterizing the possible limits of such sequences is one of the goals of this dissertation, since they do not converge exclusively to a degenerate random variable or to one with a normal distribution, as in the Law of Large Numbers and the Central Limit Theorem, respectively. We are thus naturally led to the study of infinitely divisible and stable distributions and their respective limit theorems, and this is the main goal of this dissertation. The proofs of the theorems rely mainly on Lyapunov's method, which consists of analyzing the convergence of the sequence of characteristic functions corresponding to the random variables. Accordingly, this work also gives a detailed treatment of such functions.
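The classical convergence in the first theorem above can be illustrated numerically (an illustration of the standard CLT, not material from the dissertation): standardized sums (S_n - n·mu) / (sigma·sqrt(n)) of i.i.d. non-normal variables approach N(0, 1).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 400, 5000
mu, sigma = 0.5, np.sqrt(1 / 12)            # mean and sd of Uniform(0, 1)

# Each row is one realization of S_n, the sum of n i.i.d. uniforms.
samples = rng.uniform(0.0, 1.0, size=(reps, n))
s_n = samples.sum(axis=1)
z = (s_n - n * mu) / (sigma * np.sqrt(n))   # standardized sums

# The empirical distribution of z is close to the standard normal.
ks_stat, ks_p = stats.kstest(z, "norm")
```

Replacing the uniform draws with a heavy-tailed law that has no finite variance breaks this convergence, which is exactly where the stable distributions studied in the dissertation take over.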
Abstract:
Master's degree in Management and Business Control
Abstract:
Chocolate is considered a complex emulsion and a luxury food which, during consumption, triggers stimuli that activate the pleasure centres of the human brain. Given the importance of this food, it is necessary to study and evaluate the best way to improve chocolate quality. The aim of this work was to verify and analyse the quality of the chocolate-mass manufacturing process with respect to (i) the traceability of the raw materials and of the finished product and (ii) the determination and study of the effect of some process parameters on the characteristics of the mass, through the variables viscosity, shear stress, yield value and particle size. These variables were measured in milk chocolate masses with the formulation name CAI, coming from the company's two manufacturing units (UF1 and UF2). The parameters studied at UF1 were the influence of the conches and of the ingredients. At UF2, the influence of manufacturing rework was studied, as well as the influence of the rework together with the effect of one ingredient, sugar. The results for viscosity, shear stress and yield value were analysed statistically by analysis of variance (ANOVA), using the Kolmogorov-Smirnov, Shapiro-Wilk and Levene tests to verify the conditions of applicability of this analysis. The particle-size results, as they did not follow a normal distribution, were analysed by the nonparametric Kruskal-Wallis method. These analyses were carried out in the "Statistical Package for the Social Sciences" (SPSS) program. From the results obtained, it is concluded that, for UF1, the conche affects the shear stress, viscosity and yield value of the chocolate produced, in that there are differences between the conches studied. For this unit it is also concluded that the ingredients influence the particle size of the mass.
In the case of UF2, it is concluded that the shear stress is affected only by the sugar batch, the viscosity is affected both by the sugar batch and by the presence of manufacturing rework, and the yield value is not affected by either of these effects. The particle size in this unit is affected by the sugar batches studied.
Abstract:
Master's degree in Accounting and Financial Analysis
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always included or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all the data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. In the case of the great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work, some reflections on these methodologies will be presented, in particular on the main constraints that often occur during the information-collecting process and on the various possibilities for linking these different techniques. At the end, some particular cases of the application of these statistical methods will also be illustrated.
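The variogram requirement mentioned above (regular grids, enough pairs per lag) can be sketched with an empirical semivariogram. The grid, soil-property values and lag classes below are synthetic assumptions for illustration.

```python
import numpy as np

# Synthetic geo-referenced samples on a 10 x 10 regular grid: a smooth
# spatial trend plus measurement noise, standing in for a soil property.
rng = np.random.default_rng(7)
xs, ys = np.meshgrid(np.arange(10.0), np.arange(10.0))
coords = np.column_stack([xs.ravel(), ys.ravel()])
values = np.sin(coords[:, 0] / 3) + 0.1 * rng.standard_normal(len(coords))

# Empirical semivariogram: gamma(h) = mean of 0.5 * (z_i - z_j)^2
# over all pairs of points whose separation distance falls near lag h.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
sq = 0.5 * (values[:, None] - values[None, :]) ** 2
bins = np.arange(0.5, 8.5, 1.0)             # lag classes centred on 1..7
gamma = []
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d >= lo) & (d < hi)
    gamma.append(sq[mask].mean())
```

With a regular grid, every lag class receives many pairs, which is exactly why the text insists on regular sampling in sufficient quantity: irregular or sparse designs leave some lags with too few pairs for the variogram to be representative.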