951 results for stochastic analysis


Relevance:

60.00%

Publisher:

Abstract:

Void damage in materials has long been a focus of attention for researchers in mechanics and materials science; many research methods exist and rich results have been obtained. Most of these studies, however, are based on a single void or a finite number of voids, and rarely treat large populations of voids as a whole. This paper starts from the observation that the fracture and failure of ductile metal alloys typically involves large populations of voids. We attempt to treat these voids collectively, focusing on the region ahead of the tip of a blunting, propagating initial crack. Starting from the conservation equation for the micro-void number density, we discuss the equation governing the collective evolution of the void-damage number density ahead of the crack tip and its solution, and examine the distribution and evolution of the moments of the damage. Observations and statistics of the blunting and propagation of mode-I initial cracks, and of voids on the fracture surfaces, in a series of low-carbon alloy steel specimens are compared with simulation results, showing the same trends. Both simulation and experiment indicate that, during the collective evolution of void damage ahead of the crack tip, the damage moments decrease with distance from the crack tip, while the distribution grows with time and tends toward a stable distribution. Finally, motivated by the experimentally observed non-uniformity of void-damage evolution caused by material inhomogeneity, we introduce the concept of random fluctuations and derive a conservation equation for the evolution of the local void number density. Simulations reveal a difference between mean-field theory and the local void number density conservation theory, and this difference is confirmed by analysis of the full-field void number density evolution conservation equation.

Relevance:

60.00%

Publisher:

Abstract:

Conventional Hidden Markov models generally consist of a Markov chain observed through a linear map corrupted by additive noise. This general class of model has enjoyed a huge and diverse range of applications, for example, speech processing, biomedical signal processing and, more recently, quantitative finance. However, a lesser known extension of this general class of model is the so-called Factorial Hidden Markov Model (FHMM). FHMMs also have diverse applications, notably in machine learning, artificial intelligence and speech recognition [13, 17]. FHMMs extend the usual class of HMMs by supposing that the partially observed state process is a finite collection of distinct Markov chains, either statistically independent or dependent. There is also considerable current activity in applying collections of partially observed Markov chains to complex action recognition problems; see, for example, [6]. In this article we consider the Maximum Likelihood (ML) parameter estimation problem for FHMMs. Much of the extant literature on this problem presents parameter estimation schemes based on full-data log-likelihood EM algorithms. This approach can be slow to converge and often imposes heavy demands on computer memory; the latter point is particularly relevant for the class of FHMMs whose state-space dimensions are relatively large. The contribution of this article is to develop new recursive formulae for a filter-based EM algorithm that can be implemented online. Our new formulae are equivalent ML estimators; however, they are purely recursive and so significantly reduce numerical complexity and memory requirements. A computer simulation is included to demonstrate the performance of our results. © Taylor & Francis Group, LLC.
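As a hedged illustration of the model class described above (the state structure only, not the authors' filter-based EM algorithm), a factorial HMM can be simulated as a collection of statistically independent Markov chains observed jointly through a linear map with additive Gaussian noise. All parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate_fhmm(n_steps, transition_mats, emission_weights, noise_std, rng):
    """Simulate a factorial HMM: several independent Markov chains observed
    jointly through a linear map corrupted by additive Gaussian noise."""
    n_chains = len(transition_mats)
    states = np.zeros((n_steps, n_chains), dtype=int)
    for m, P in enumerate(transition_mats):
        s = rng.integers(P.shape[0])            # random initial state
        for t in range(n_steps):
            s = rng.choice(P.shape[0], p=P[s])  # one Markov step
            states[t, m] = s
    # linear observation map corrupted by additive noise
    obs = states @ emission_weights + noise_std * rng.normal(size=n_steps)
    return states, obs

rng = np.random.default_rng(0)
P1 = np.array([[0.9, 0.1], [0.2, 0.8]])   # chain 1 transition matrix
P2 = np.array([[0.7, 0.3], [0.3, 0.7]])   # chain 2 transition matrix
states, obs = simulate_fhmm(200, [P1, P2], np.array([1.0, 0.5]), 0.1, rng)
```

Note how the effective state space is the product of the chains' state spaces, which is why memory demands grow quickly for full-data EM schemes.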

Relevance:

60.00%

Publisher:

Abstract:

We consider a method for approximate inference in hidden Markov models (HMMs). The method circumvents the need to evaluate conditional densities of observations given the hidden states. It may be considered an instance of Approximate Bayesian Computation (ABC) and it involves the introduction of auxiliary variables valued in the same space as the observations. The quality of the approximation may be controlled to arbitrary precision through a parameter ε > 0. We provide theoretical results which quantify, in terms of ε, the ABC error in approximation of expectations of additive functionals with respect to the smoothing distributions. Under regularity assumptions, we bound this error in terms of ε and the number n of time steps over which smoothing is performed. For numerical implementation, we adopt the forward-only sequential Monte Carlo (SMC) scheme of [14] and quantify the combined error from the ABC and SMC approximations. This yields some of the first quantitative results for ABC methods which jointly treat the ABC and simulation errors, with a finite number of data and simulated samples. © Taylor & Francis Group, LLC.
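A minimal sketch of the ABC idea described above, under assumed toy dynamics: the observation density is never evaluated; instead each particle draws an auxiliary pseudo-observation and is kept with positive weight only if that pseudo-observation falls within ε of the actual observation. This is a generic ABC bootstrap filter, not the forward-only smoothing scheme of [14]; all dynamics and values are illustrative assumptions.

```python
import numpy as np

def abc_bootstrap_filter(observations, n_particles, eps, transition, emit, rng):
    """ABC particle filter: weight particles by whether a simulated
    pseudo-observation lands within eps of the actual observation."""
    x = transition(np.zeros(n_particles), rng)           # crude initialisation
    for y_t in observations:
        x = transition(x, rng)                           # propagate particles
        u = emit(x, rng)                                 # auxiliary pseudo-observations
        w = (np.abs(u - y_t) < eps).astype(float)        # ABC indicator kernel
        if w.sum() == 0.0:
            w = np.ones(n_particles)                     # guard against total rejection
        w /= w.sum()
        x = x[rng.choice(n_particles, size=n_particles, p=w)]  # multinomial resample
    return x

rng = np.random.default_rng(1)
transition = lambda x, rng: 0.8 * x + rng.normal(size=x.shape)
emit = lambda x, rng: x + 0.5 * rng.normal(size=x.shape)
observations = np.zeros(20)   # toy observed sequence
particles = abc_bootstrap_filter(observations, 500, 1.0, transition, emit, rng)
```

Shrinking `eps` tightens the approximation at the cost of more rejected pseudo-observations, which is the trade-off the ε-dependent error bounds quantify.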

Relevance:

60.00%

Publisher:

Abstract:

This work analyzes the relationship between large food webs describing potential feeding relations between species and smaller sub-webs thereof describing relations actually realized in local communities of various sizes. Special attention is given to the relationships between patterns of phylogenetic correlations encountered in large webs and sub-webs. Based on the current theory of food-web topology as implemented in the matching model, it is shown that food webs are scale invariant in the following sense: given a large web described by the model, a smaller, randomly sampled sub-web thereof is described by the model as well. A stochastic analysis of model steady states reveals that such a change in scale goes along with a re-normalization of model parameters. Explicit formulae for the renormalized parameters are derived. Thus, the topology of food webs at all scales follows the same patterns, and these can be revealed by data and models referring to the local scale alone. As a by-product of the theory, a fast algorithm is derived which yields sample food webs from the exact steady state of the matching model for a high-dimensional trophic niche space in finite time. (C) 2008 Elsevier B.V. All rights reserved.
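The sampling notion used above — a local sub-web as the feeding relations realized among a randomly chosen subset of species — can be sketched as an induced sub-matrix of the full web's adjacency matrix. The random web below is a toy placeholder, not the matching model itself; the scale-invariance result concerns webs generated by that model.

```python
import numpy as np

def sample_subweb(adjacency, n_species, rng):
    """Randomly sample a local sub-web: the induced sub-matrix of the
    full web's adjacency matrix on a random subset of species."""
    total = adjacency.shape[0]
    keep = rng.choice(total, size=n_species, replace=False)
    return adjacency[np.ix_(keep, keep)]

rng = np.random.default_rng(2)
web = (rng.random((50, 50)) < 0.15).astype(int)   # toy 50-species web
sub = sample_subweb(web, 10, rng)                 # 10-species local community
```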

Relevance:

60.00%

Publisher:

Abstract:

Aim: To determine whether internal limiting membrane (ILM) peeling is cost-effective compared with no peeling for patients with an idiopathic stage 2 or 3 full-thickness macular hole. Methods: A cost-effectiveness analysis was performed alongside a randomised controlled trial. 141 participants were randomly allocated to receive macular-hole surgery, with either ILM peeling or no peeling. Health-service resource use, costs and quality of life were calculated for each participant. The incremental cost per quality-adjusted life year (QALY) gained was calculated at 6 months. Results: At 6 months, the total costs were on average higher (£424, 95% CI -182 to 1045) in the No Peel arm, primarily owing to the higher reoperation rate in the No Peel arm. The mean additional QALYs from ILM peel at 6 months were 0.002 (95% CI -0.01 to 0.013), adjusting for baseline EQ-5D and other minimisation factors. A mean incremental cost per QALY was not computed, as Peeling was on average less costly and slightly more effective. A stochastic analysis suggested that there was more than a 90% probability that Peeling would be cost-effective at a willingness-to-pay threshold of £20 000 per QALY. Conclusion: Although there is no evidence of a statistically significant difference in either costs or QALYs between macular hole surgery with or without ILM peeling, the balance of probabilities is that ILM Peeling is likely to be a cost-effective option for the treatment of macular holes. Further long-term follow-up data are needed to confirm these findings.
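The dominance conclusion can be checked with the point estimates reported above, via the incremental net monetary benefit (a standard cost-effectiveness summary; the abstract's stochastic analysis additionally accounts for sampling uncertainty, which this arithmetic does not).

```python
# Illustrative cost-effectiveness arithmetic using the trial's point estimates.
wtp = 20_000.0        # willingness-to-pay threshold per QALY (GBP)
delta_cost = -424.0   # Peel minus No Peel: No Peel was on average £424 dearer
delta_qaly = 0.002    # mean additional QALYs from ILM peel at 6 months

# Incremental net monetary benefit: positive values favour peeling.
inmb = wtp * delta_qaly - delta_cost
print(inmb)   # 464.0 - peeling is both cheaper and slightly more effective
```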

Relevance:

60.00%

Publisher:

Abstract:

The process of accounting for heterogeneity has made significant advances in statistical research, primarily in the framework of stochastic analysis and the development of multiple-point statistics (MPS). Among MPS techniques, the direct sampling (DS) method is tested to determine its ability to delineate heterogeneity from aerial magnetics data in a regional sandstone aquifer intruded by low-permeability volcanic dykes in Northern Ireland, UK. The use of two two-dimensional bivariate training images aids in creating spatial probability distributions of heterogeneities of hydrogeological interest, despite relatively ‘noisy’ magnetics data (i.e. including hydrogeologically irrelevant urban noise and regional geologic effects). These distributions are incorporated into a hierarchy system where previously published density function and upscaling methods are applied to derive regional distributions of equivalent hydraulic conductivity tensor K. Several K models, as determined by several stochastic realisations of MPS dyke locations, are computed within groundwater flow models and evaluated by comparing modelled heads with field observations. Results show a significant improvement in model calibration when compared to a simplistic homogeneous and isotropic aquifer model that does not account for the dyke occurrence evidenced by airborne magnetic data. The best model is obtained when normal and reverse polarity dykes are computed separately within MPS simulations and when a probability threshold of 0.7 is applied. The presented stochastic approach also provides improvement when compared to a previously published deterministic anisotropic model based on the unprocessed (i.e. noisy) airborne magnetics. This demonstrates the potential of coupling MPS to airborne geophysical data for regional groundwater modelling.

Relevance:

60.00%

Publisher:

Abstract:

This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and jump telegraph processes; the study includes the computation of the distribution of each process, their means and variances, and their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump telegraph processes: it describes how to compute risk-neutral measures, establishes the no-arbitrage condition for this type of model, and finally computes the prices of European call and put options.
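As a hedged illustration of the basic object studied in the first part, a (driftless, jump-free) telegraph process can be simulated by alternating two velocity regimes at exponential switching times; all parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate_telegraph(t_end, rates, velocities, rng, x0=0.0):
    """Simulate a telegraph process up to time t_end: the position moves
    with velocity velocities[i] while in regime i, and the regime flips
    after an exponential holding time with rate rates[i]."""
    t, x, i = 0.0, x0, 0
    while t < t_end:
        hold = rng.exponential(1.0 / rates[i])  # exponential holding time
        dt = min(hold, t_end - t)               # do not overshoot t_end
        x += velocities[i] * dt
        t += dt
        i = 1 - i                               # switch regime
    return x

rng = np.random.default_rng(3)
positions = [simulate_telegraph(1.0, (2.0, 2.0), (1.0, -1.0), rng)
             for _ in range(1000)]
```

With speeds ±1 the position at time 1 is confined to [-1, 1], a finite-velocity feature that distinguishes telegraph models from Brownian-driven ones.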

Relevance:

60.00%

Publisher:

Abstract:

The objective of this document is to compile some classical results on the existence and uniqueness of solutions of backward stochastic differential equations (BSDEs), with particular emphasis on the case of monotone coefficients, and on their connection with viscosity solutions of systems of second-order semilinear parabolic and elliptic partial differential equations (PDEs).

Relevance:

60.00%

Publisher:

Abstract:

For Wiener spaces conditional expectations and $L^{2}$-martingales w.r.t. the natural filtration have a natural representation in terms of chaos expansion. In this note an extension to larger classes of processes is discussed. In particular, it is pointed out that orthogonality of the chaos expansion is not required.
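For orientation, the representation referred to above can be written out (a standard Wiener-chaos fact, with $I_n$ the $n$-fold Wiener–Itô integral):

```latex
F \;=\; \sum_{n \ge 0} I_n(f_n)
\quad\Longrightarrow\quad
\mathbb{E}\bigl[\,F \mid \mathcal{F}_t\,\bigr]
\;=\; \sum_{n \ge 0} I_n\!\bigl(f_n \, \mathbf{1}_{[0,t]}^{\otimes n}\bigr),
```

so the $L^{2}$-martingale $t \mapsto \mathbb{E}[F \mid \mathcal{F}_t]$ is obtained chaos by chaos, by truncating each kernel $f_n$ to $[0,t]^n$.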

Relevance:

60.00%

Publisher:

Abstract:

Following the worldwide trend toward better risk management, the Brazilian insurance market regulator, having already implemented the other risk modules, is at an advanced stage of developing its model for measuring insurers' market risk. Once the discussions end, companies will be forced to use a model that today has many flaws, generating a demand for additional capital from their shareholders that could drive some of them into insolvency. The main objective of this study is to analyse the adequacy of the model and to support the discussion so as to improve the final model, through comparative analyses with other models in Brazil and abroad, scenario studies and market views. Overall, the analyses reveal serious problems in the model, such as capital injections being required of extremely profitable companies, and insufficient security guarantees from the pure use of shock factors rather than a stochastic analysis. Finally, some solutions are suggested to minimise the effect of the model's inadequacy, along with suggestions for its improvement, so that shareholders are not harmed, the regulator can adequately manage the risks, and society benefits from the soundness of the companies to which it entrusted its risks.

Relevance:

60.00%

Publisher:

Abstract:

Associated with an ordered sequence of an even number 2N of positive real numbers is a birth and death process (BDP) on {0, 1, 2,..., N} having these real numbers as its birth and death rates. We generate another birth and death process from this BDP on {0, 1, 2,..., 2N}. This can be further iterated. We illustrate with an example from tan(kz). In BDP, the decay parameter, viz., the largest non-zero eigenvalue is important in the study of convergence to stationarity. In this article, the smallest eigenvalue is found to be useful.
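As a sketch of the setup above, an ordered sequence of 2N positive reals can be arranged as the birth and death rates of a finite-state generator, whose eigenvalues are real and non-positive with a single zero; the decay parameter is the magnitude of the largest non-zero eigenvalue. The rates below are illustrative, and this sketch does not reproduce the article's iteration construction.

```python
import numpy as np

def bdp_generator(birth, death):
    """Generator of a birth-death process on {0, ..., N}: birth[i] is the
    rate of i -> i+1 and death[i] the rate of i+1 -> i."""
    n = len(birth) + 1
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i + 1] = birth[i]
        Q[i + 1, i] = death[i]
    Q -= np.diag(Q.sum(axis=1))   # conservative generator: rows sum to zero
    return Q

# an ordered sequence of 2N positive reals, split into birth and death rates
birth = [1.0, 2.0, 3.0]
death = [1.5, 2.5, 3.5]
Q = bdp_generator(birth, death)
eigvals = np.sort(np.linalg.eigvals(Q).real)
# decay parameter: magnitude of the largest non-zero eigenvalue
decay = -max(e for e in eigvals if e < -1e-9)
```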

Relevance:

60.00%

Publisher:

Abstract:

In this study we analyzed the influence of demographic parameters on the population dynamics of Tribolium castaneum, combining empiricism and population theory to analyze the effects of environmental heterogeneity. We employed Ricker models designed to study a two-patch system, taking into account both deterministic and stochastic analysis. Results were expressed as bifurcation diagrams and stochastic simulations. Dynamic equilibrium was investigated in depth, with results suggesting specific parametric spaces in response to environmental heterogeneity and migration. Population equilibrium patterns, synchrony and persistence in T. castaneum are discussed.
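A minimal sketch of the two-patch Ricker system described above (deterministic skeleton only, without the stochastic component; the parameter values and the symmetric-migration coupling are illustrative assumptions):

```python
import numpy as np

def two_patch_ricker(n0, r, k, m, steps):
    """Iterate a coupled two-patch Ricker map: local growth in each patch,
    followed by symmetric migration of a fraction m between patches."""
    x = np.array(n0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(r * (1.0 - x / k))   # Ricker growth per patch
        x = x + m * (x[::-1] - x)           # net exchange between patches
    return x

# heterogeneous growth rates model environmental heterogeneity across patches
pops = two_patch_ricker([0.5, 1.5], r=np.array([1.2, 1.8]),
                        k=np.array([1.0, 1.0]), m=0.1, steps=200)
```

Sweeping `r` and `m` over a grid and recording the attractor reproduces the kind of bifurcation diagrams the study reports.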

Relevance:

60.00%

Publisher:

Abstract:

The inherent stochastic character of most of the physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo Simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability. The computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. What is more, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The topic of the second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of constituent bars in existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage affecting only a small portion of the bar, Genetic Algorithms have proved to be an effective tool for solving the problem.
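As a hedged, one-dimensional sketch of the "translation field" idea mentioned above (not the thesis's multi-dimensional algorithm): generate a correlated Gaussian sequence, then push it through the Gaussian CDF and an inverse target CDF to impose a non-Gaussian marginal. The AR(1) correlation structure and the exponential target marginal are illustrative assumptions.

```python
import math
import numpy as np

def gaussian_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def translation_field(n, corr, rng):
    """1-D translation-field sketch: correlated Gaussian sequence (AR(1)
    with unit marginal variance), mapped through the Gaussian CDF and the
    inverse CDF of a unit-mean exponential distribution."""
    g = np.empty(n)
    g[0] = rng.normal()
    innov_std = math.sqrt(1.0 - corr**2)     # keeps marginal variance at 1
    for t in range(1, n):
        g[t] = corr * g[t - 1] + innov_std * rng.normal()
    u = np.array([gaussian_cdf(z) for z in g])   # uniform marginals
    return -np.log(1.0 - u)                      # inverse exponential CDF

rng = np.random.default_rng(4)
field = translation_field(5000, corr=0.8, rng=rng)
```

Because the mapping is a monotone memoryless transform of a Gaussian process, properties such as crossing rates and extreme-value distributions follow from the underlying Gaussian field, which is the appeal of translation fields.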

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents a methodological contribution to the problem of optimally operating a hydropower reservoir during flood events, under a multiobjective and stochastic approach. A methodology is proposed for assessing flood control strategies in a multiobjective and probabilistic framework. In addition, a dynamic flood control environment was developed for real-time operation with forecasts, combining an optimization model with simulation algorithms. These tools assist dam managers in deciding which reservoir operation is most appropriate.
A detailed review of the literature showed that most existing studies of flood control reservoir operation use a reduced number of inflow series or hydrographs to characterize the possible scenarios, so the satisfactory functioning of a given strategy may be limited to similar hydrological situations. On the other hand, most available studies tackle the problem of multipurpose flood control operation over an entire flood season lasting several months. These characteristics differ from the reality of reservoir management in Spain. With computational advances in real-time data management, there is a growing trend toward real-time operation tools with forecasts for determining short-term operation, including flood control.
The strategy-assessment methodology proposed in this thesis is based on determining the behavior of each strategy over a wide range of floods representative of the hydrological forcing of the dam. To that end, an indicator-based evaluation system is combined with a stochastic flood generation environment, yielding an implicitly stochastic analysis framework. The evaluation system consists of three stages — characterization, synthesis and comparison — in order to handle the complex structure of the resulting data and carry out the evaluation. In the first stage, characterization variables are defined, linked to the aspects to be evaluated (dam safety, flood control, energy generation, etc.); each variable characterizes the behavior of a given strategy for one aspect and one event. In the second stage, this information is synthesized into a set of indicators, as small as possible. Finally, the comparison is carried out on these indicators, either by aggregating the objectives into a single indicator, or by applying the Pareto dominance criterion to obtain a set of suitable solutions.
This methodology was applied to calibrate the parameters of a reservoir flood control optimization model and to compare it with another operating rule, using the aggregation approach. The methodology was then extended to assess and compare existing operating rules for flood control in hydropower reservoirs, using the dominance criterion. The versatility of the methodology allows other applications, such as determining safety levels or volumes, or selecting spillway dimensions among several alternatives. The dynamic flood control environment, with its combined optimization-simulation approach, exploits the advantages of both types of model and facilitates interaction with dam operators. Results improve on those obtained with a reactive operating rule, even when forecasts deviate considerably from the observed hydrograph. This contributes to reducing the oft-mentioned gap between theoretical development and the practical application of optimal reservoir management models.
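The Pareto-dominance comparison stage described above can be sketched as a simple non-dominated filter. The indicator names and values below are illustrative assumptions (both objectives are to be minimised), not the thesis's actual indicators.

```python
def pareto_front(solutions):
    """Return the non-dominated subset of candidate operating strategies.
    Each solution is a tuple of indicator values, all to be minimised;
    b dominates a if b is no worse in every indicator and differs from a."""
    front = []
    for a in solutions:
        dominated = any(
            all(b[i] <= a[i] for i in range(len(a))) and b != a
            for b in solutions
        )
        if not dominated:
            front.append(a)
    return front

# toy indicators: (maximum reservoir level, downstream flood damage)
candidates = [(0.9, 5.0), (0.8, 7.0), (0.95, 4.0), (0.9, 6.0)]
front = pareto_front(candidates)   # (0.9, 6.0) is dominated by (0.9, 5.0)
```

The aggregation approach would instead collapse each tuple to a single weighted score, yielding one optimum rather than a set of good solutions.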

Relevance:

60.00%

Publisher:

Abstract:

A new structure with the special property that instantaneous resurrection and mass disaster are imposed on an ordinary birth-death process is considered. Under the condition that the underlying birth-death process is exit or bilateral, we are able to give easily checked existence criteria for such Markov processes. A very simple uniqueness criterion is also established. All honest processes are explicitly constructed. Ergodicity properties for these processes are investigated. Surprisingly, it can be proved that all the honest processes are not only recurrent but also ergodic without imposing any extra conditions. Equilibrium distributions are then established. Symmetry and reversibility of such processes are also investigated. Several examples are provided to illustrate our results.