487 results for Laplace eigenfunctions


Relevance:

10.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Industrial Engineering and Management

Relevance:

10.00%

Publisher:

Abstract:

This dissertation presents and critically studies two new adaptive sampling methods and a new performance measure for sampling methods, in the context of statistical quality control. Taking a Shewhart-type control chart for the mean as the baseline, we study their statistical properties and compare their statistical performance with some of the most frequently cited methods in the literature.

First, we develop a new adaptive sampling method in which the intervals between samples are obtained from the density function of the reduced (standard) Laplace distribution. This method proves particularly efficient at detecting moderate and large shifts in the mean, is insensitive to the lower bound imposed on the sampling interval, and is robust across the non-normality scenarios considered for the quality characteristic. In certain situations it is always more efficient than the method with adaptive sampling intervals, fixed sample sizes, and fixed control limit coefficients.

Building on that method and on a method in which the sampling intervals are set before process monitoring begins, based on the system's cumulative hazard rate, we then present a new sampling method that combines predefined intervals with adaptive intervals. In this method, the sampling instants are defined as the weighted average of the instants given by the two methods, with more weight assigned to the adaptive method for moderate shifts (where the predefined method is less effective) and more weight to the predefined method in the remaining cases (where the adaptive method is less effective). In this way, the sampling instants, initially scheduled according to the expected occurrence of a shift based on the system's lifetime distribution, are adapted according to the value of the sample statistic computed at the previous instant. This method is always more efficient than the classical periodic scheme, which holds for no other adaptive scheme, and more efficient than the VSI sampling method for some sampling pairs, making it a strong alternative to the sampling procedures found in the literature.

Finally, we present a new performance measure for sampling methods. Assuming that the two methods under comparison have the same mean time of malfunction, their performance is compared through the average number of samples collected while the process is in control. Taking the system lifetime into account, under different hazard rates, this measure proves robust and allows, in an economic context, better control of costs per unit of time.
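
The core step of the first method, mapping the current standardized sample mean to the next sampling interval through the reduced Laplace density, can be sketched in Python. This is a minimal illustration under stated assumptions, not the dissertation's calibrated scheme: the function name and the bounds h_max and h_min are hypothetical, and the rescaling simply makes the interval longest when the process is on target.

    import math

    def next_sampling_interval(z, h_max=8.0, h_min=0.1):
        # Density of the reduced (standard) Laplace distribution at the
        # standardized sample mean z: f(z) = 0.5 * exp(-|z|).
        f = 0.5 * math.exp(-abs(z))
        # Rescale so the interval equals h_max when the process is on
        # target (z = 0) and shrinks as z drifts toward the control limits.
        h = h_max * (f / 0.5)
        # Never sample more often than the smallest admissible interval.
        return max(h, h_min)

The dissertation additionally calibrates the scale so that the in-control sampling rate matches the comparable fixed-interval chart; that calibration is omitted here.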

Relevance:

10.00%

Publisher:

Abstract:

In a manufacturing plant, one of the decisive factors in meeting the required targets is the ability to produce without unplanned stoppages; that is, equipment reliability must be high. For this to happen, maintenance of that equipment must be planned and carried out in a coherent, structured way. In this case study, statistical models based on the data available for 2014 were used to determine the reliability, maintainability, and availability of the system components. The analysis of the overall system began with a survey of the number of failures per piece of equipment and their durations, collected from the opened/closed work-order records for each item. The Laplace test was used to determine the failure trend and, with the help of maintainability and reliability indicators, the intrinsic availability of the system was determined. Next, a more detailed study of the times recorded during each breakdown was carried out, aiming to reduce the gaps in the database used and thereby increase its reliability. The critical equipment was identified, taken to be the items performing below the values acceptable for a continuous and efficient production flow. Finally, a survey of the costs associated with each item's life cycle was carried out, to help the company decide whether the better option is to replace the critical equipment or to repair it. With all this information, a document was produced showing which equipment most influences line availability, together with all maintenance and component costs. It is hoped that this study gives the company the tools it needs to make a decision that improves the system's production flow.
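
The Laplace test mentioned here has a standard time-truncated form. A minimal sketch, assuming the failures of a repairable system are recorded as cumulative times over an observation window (0, T]:

    import math

    def laplace_trend_statistic(failure_times, T):
        # failure_times: cumulative failure times observed over (0, T].
        n = len(failure_times)
        mean_time = sum(failure_times) / n
        # U is approximately standard normal under a homogeneous Poisson
        # process; U >> 0 suggests deterioration (failures concentrating
        # late in the window), U << 0 suggests reliability growth.
        return (mean_time - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))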

Relevance:

10.00%

Publisher:

Abstract:

The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While the first versions of the central limit theorem are already due to Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established, and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
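
For reference, the classical statement together with the Berry-Esseen bound cited here: for i.i.d. X_1, X_2, ... with mean μ, variance σ² ∈ (0, ∞), and finite third absolute moment,

    \sup_{x \in \mathbb{R}} \left| P\!\left( \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \le \frac{C\,\mathbb{E}|X_1 - \mu|^3}{\sigma^3 \sqrt{n}},

where Φ is the standard normal distribution function and C is an absolute constant; the left-hand side tending to zero is the uniform convergence the abstract opens with.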

Relevance:

10.00%

Publisher:

Abstract:

We establish existence and non-existence results for the Brezis-Nirenberg type problem involving the square root of the Laplacian in a bounded domain with zero Dirichlet boundary condition.
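
Schematically, and hedging on the exact nonlinearity studied, a Brezis-Nirenberg type problem for the square root of the Laplacian on a bounded domain Ω ⊂ R^n reads

    (-\Delta)^{1/2} u = \lambda u + |u|^{2^\sharp - 2} u \ \text{in } \Omega, \qquad u = 0 \ \text{on } \partial\Omega, \qquad 2^\sharp = \frac{2n}{n-1},

where 2^♯ is the critical exponent associated with (−Δ)^{1/2}; existence versus non-existence then hinges on the range of λ.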

Relevance:

10.00%

Publisher:

Abstract:

A sequence of “inner equations” attached to certain perturbations of the McMillan map was considered in [MSS09], where their solutions were used to measure an exponentially small separatrix splitting. We prove here all the results relative to these equations that are necessary to complete the proof of the main result of [MSS09]. The present work relies on ideas from resurgence theory: we describe the formal solutions, study the analyticity of their Borel transforms, and use Écalle's alien derivations to measure the discrepancy between different Borel-Laplace sums.
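
The Borel-Laplace machinery invoked here is standard in resurgence theory: for a formal series φ̃(z) = Σ_{n≥0} a_n z^{−n−1}, the formal Borel transform and the Laplace sum in a direction θ are

    \hat{\varphi}(\zeta) = \sum_{n \ge 0} a_n \frac{\zeta^n}{n!}, \qquad (\mathcal{L}^{\theta} \hat{\varphi})(z) = \int_0^{e^{i\theta}\infty} e^{-z\zeta}\, \hat{\varphi}(\zeta)\, d\zeta,

and alien derivations quantify the singularities of the Borel transform that make sums taken in different directions disagree.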

Relevance:

10.00%

Publisher:

Abstract:

Genetic evaluation using animal models or pedigree-based models generally assumes only autosomal inheritance. Bayesian animal models provide a flexible framework for genetic evaluation, and we show how the model can readily accommodate situations where the trait of interest is influenced by both autosomal and sex-linked inheritance. This allows for simultaneous calculation of autosomal and sex-chromosomal additive genetic effects. Inferences were performed using integrated nested Laplace approximations (INLA), a non-sampling-based Bayesian inference methodology. We provide a detailed description of how to calculate the inverse of the X- or Z-chromosomal additive genetic relationship matrix, needed for inference. The case study of eumelanic spot diameter in a Swiss barn owl (Tyto alba) population shows that this trait is substantially influenced by variation in genes on the Z-chromosome (σ_z^2 = 0.2719 and σ_a^2 = 0.4405). Further, a simulation study for this study system shows that the animal model accounting for both autosomal and sex-chromosome-linked inheritance is identifiable, that is, the two effects can be distinguished, and provides accurate inference on the variance components.
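
In schematic form (the notation is assumed here, not taken from the paper), the model with both inheritance modes is

    y = X\beta + a + z + e, \qquad a \sim N(0, \sigma_a^2 A), \quad z \sim N(0, \sigma_z^2 Z), \quad e \sim N(0, \sigma_e^2 I),

where A is the autosomal additive genetic relationship matrix and Z its sex-chromosomal counterpart, whose inverse the paper shows how to compute directly.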

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. The Wavelet Approximation (WA) method is especially suitable for non-smooth distributions, often arising in small or concentrated portfolios, when the hypotheses of the Basel II formulas are violated. To test the methodology we consider the Vasicek one-factor portfolio credit loss model as our model framework. WA is an accurate, robust and fast method, allowing VaR to be estimated much more quickly than with a Monte Carlo (MC) method at the same level of accuracy and reliability.
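
Schematically (resolution index and normalization assumed here), the WA method expands the loss distribution at resolution m as

    F(x) \approx \sum_{k=0}^{2^m - 1} c_{m,k}\, \phi_{m,k}(x), \qquad \phi_{m,k}(x) = 2^{m/2}\, \phi(2^m x - k),

with φ the Haar scaling function; the coefficients c_{m,k} are recovered by numerically inverting the Laplace transform of F, after which the VaR at confidence level α is read off by solving F(x) = α.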

Relevance:

10.00%

Publisher:

Abstract:

We introduce an algebraic operator framework to study discounted penalty functions in renewal risk models. For inter-arrival and claim size distributions with rational Laplace transform, the usual integral equation is transformed into a boundary value problem, which is solved by symbolic techniques. The factorization of the differential operator can be lifted to the level of boundary value problems, amounting to iteratively solving first-order problems. This leads to an explicit expression for the Gerber-Shiu function in terms of the penalty function.
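
The Gerber-Shiu (expected discounted penalty) function referred to here is, in standard notation,

    m(u) = \mathbb{E}\left[ e^{-\delta \tau}\, w\big(U(\tau^-), |U(\tau)|\big)\, \mathbf{1}_{\{\tau < \infty\}} \,\big|\, U(0) = u \right],

where U is the surplus process, τ the time of ruin, δ ≥ 0 the discount rate, and w a penalty function of the surplus immediately before ruin and the deficit at ruin.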

Relevance:

10.00%

Publisher:

Abstract:

Natural selection is typically exerted at some specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to incorrect inference about population parameters. When the missing-data process relates to the trait of interest, valid inference requires explicit modeling of the missing process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls (Tyto alba), our model indicates that the missing individuals would display large black spots, and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool to correctly estimate the magnitude of both natural selection and additive genetic variance.
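
In schematic form (the link scale and coefficient names are assumed here), the shared parameter model couples the two submodels through the additive genetic effects a:

    y = X\beta + Z a + e \quad \text{(animal model)}, \qquad \operatorname{logit} P(y_i \text{ observed}) = x_i^{\top}\gamma + \kappa\, a_i \quad \text{(missingness model)},

so a nonzero κ indicates that missingness depends on genetic merit, i.e. the data are not missing at random.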

Relevance:

10.00%

Publisher:

Abstract:

Analyzing functional data often leads to finding common factors, for which functional principal component analysis proves to be a useful tool to summarize and characterize the random variation in a function space. The representation in terms of eigenfunctions is optimal in the sense of L² approximation. However, the eigenfunctions are not always directed towards an interesting and interpretable direction in the context of functional data and thus could obscure the underlying structure. To overcome this difficulty, an alternative to functional principal component analysis is proposed that produces directed components which may be more informative and easier to interpret. These structural components are similar to principal components, but are adapted to situations in which the domain of the function may be decomposed into disjoint intervals such that there is effectively independence between intervals and positive correlation within intervals. The approach is demonstrated with synthetic examples as well as real data. Properties for special cases are also studied.
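
The L²-optimality mentioned here is the Karhunen-Loève property: a random function X admits the expansion

    X(t) = \mu(t) + \sum_{k \ge 1} \xi_k\, \phi_k(t),

where the φ_k are the eigenfunctions of the covariance operator and the scores ξ_k are uncorrelated; truncating after K terms gives the best K-dimensional approximation in mean integrated squared error, which is precisely what the proposed structural components trade off against interpretability.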

Relevance:

10.00%

Publisher:

Abstract:

Study of the likelihood and prevalence of COPD patients over one year in a family medicine practice, during 2012 and the first two months of 2013. In a health-centre practice of about 1500 patients, the probabilistic evolution was studied every 6 months according to the theory of Laplace. The study analyzes COPD, its symptoms, etiology, clinical presentation in consultation, and its treatment in family medicine.
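
The abstract does not specify which result of Laplace is meant; if it is the rule of succession, then after observing s occurrences in n comparable periods the estimated probability of occurrence in the next period is

    P = \frac{s + 1}{n + 2}.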

Relevance:

10.00%

Publisher:

Abstract:

Includes: Dancourt

Relevance:

10.00%

Publisher:

Abstract:

The contributions of the correlated and uncorrelated components of the electron-pair density to atomic and molecular intracule I(r) and extracule E(R) densities and to their Laplacian functions ∇²I(r) and ∇²E(R) are analyzed at the Hartree-Fock (HF) and configuration interaction (CI) levels of theory. The topologies of the uncorrelated components of these functions can be rationalized in terms of the corresponding one-electron densities. In contrast, by analyzing the correlated components of I(r) and E(R), namely I_C(r) and E_C(R), the effect of electron Fermi and Coulomb correlation can be assessed at the HF and CI levels of theory. Moreover, the contribution of Coulomb correlation can be isolated by means of difference maps between I_C(r) and E_C(R) distributions calculated at the two levels of theory. As application examples, the He, Ne, and Ar atomic series, the C₂²⁻, N₂, and O₂²⁺ molecular series, and the C₂H₄ molecule have been investigated. For these atoms and molecules, it is found that Fermi correlation accounts for the main characteristics of I_C(r) and E_C(R), with Coulomb correlation slightly increasing the locality of these functions at the CI level of theory. Furthermore, I_C(r), E_C(R), and the associated Laplacian functions reveal the short-ranged nature and high isotropy of Fermi and Coulomb correlation in atoms and molecules.
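
For reference, the intracule and extracule densities are the distributions of the interelectronic vector and of the electron-pair center of mass, respectively:

    I(\mathbf{r}) = \Big\langle \Psi \Big| \sum_{i<j} \delta\big(\mathbf{r} - \mathbf{r}_i + \mathbf{r}_j\big) \Big| \Psi \Big\rangle, \qquad E(\mathbf{R}) = \Big\langle \Psi \Big| \sum_{i<j} \delta\Big(\mathbf{R} - \tfrac{\mathbf{r}_i + \mathbf{r}_j}{2}\Big) \Big| \Psi \Big\rangle.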

Relevance:

10.00%

Publisher:

Abstract:

A topological analysis of intracule and extracule densities and their Laplacians, computed within the Hartree-Fock approximation, is presented. The analysis of the density distributions reveals that, among all possible electron-electron interactions in atoms and between atoms in molecules, only very few can be rigorously located as local maxima. In contrast, they are clearly identified as local minima in the topology of the Laplacian maps. The conceptually different interpretations of intracule and extracule maps are also discussed in detail. An application example to the C₂H₂, C₂H₄, and C₂H₆ series of molecules is presented.