916 results for Random coefficients


Relevance: 20.00%

Abstract:

In many situations probability models are more realistic than deterministic models, and several phenomena occurring in physics are studied as random phenomena changing with time and space; indeed, stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values in a set T. The collection of random variables {X(t), t ∈ T} is then called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values that X(t) can assume is called the state space.
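A simple symmetric random walk is perhaps the most elementary example of such a process: here T is the set of discrete time steps and the state space is the integers. A minimal illustrative sketch (not from the abstract itself):

```python
import numpy as np

def random_walk(n_steps, seed=0):
    """Simulate a simple symmetric random walk X(t), t = 0, ..., n_steps:
    a basic stochastic process whose state space is the integers."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([-1, 1], size=n_steps)       # +1 or -1, each with probability 1/2
    return np.concatenate(([0], np.cumsum(steps)))  # X(0) = 0, then partial sums

path = random_walk(1000)
```

Each `path[t]` is one realization of the random variable X(t); repeating the simulation with different seeds samples the process.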


In this paper, we solve the duplication problem P_n(ax) = sum_{m=0}^{n} C_m(n,a) P_m(x), where {P_n}_{n>=0} belongs to a wide class of polynomials, including the classical orthogonal polynomials (Hermite, Laguerre, Jacobi) as well as the classical discrete orthogonal polynomials (Charlier, Meixner, Krawtchouk) for the specific case a = −1. We give closed-form expressions as well as recurrence relations satisfied by the duplication coefficients.
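For a concrete instance of the duplication problem, the coefficients C_m(n, a) for the (physicists') Hermite polynomials can be checked numerically by passing through the monomial basis; this brute-force route, built on `numpy.polynomial`, is only a sanity check, not the closed forms or recurrences derived in the paper:

```python
import numpy as np
from numpy.polynomial import hermite as H  # physicists' Hermite polynomials H_n

def hermite_duplication(n, a):
    """Coefficients C_m(n, a) with H_n(a*x) = sum_m C_m(n, a) H_m(x),
    computed numerically via the monomial basis."""
    mono = H.herm2poly([0] * n + [1])        # H_n expanded in powers of x
    scaled = mono * a ** np.arange(n + 1)    # substitute x -> a*x
    return H.poly2herm(scaled)               # re-expand in the Hermite basis

# Parity gives H_n(-x) = (-1)^n * H_n(x), so for a = -1 and even n
# the duplication coefficients collapse to a single entry.
c = hermite_duplication(4, -1.0)
```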


In this paper we derive an identity for the Fourier coefficients of a differentiable function f(t) in terms of the Fourier coefficients of its derivative f'(t). This yields an algorithm to compute the Fourier coefficients of f(t) whenever the Fourier coefficients of f'(t) are known, and vice versa. Furthermore, this generates an iterative scheme for N times differentiable functions, complementing the direct computation of Fourier coefficients via the defining integrals, which can also be treated automatically in certain cases.
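For a smooth 2π-periodic f the basic identity is c_k(f') = i k c_k(f), so c_k(f) = c_k(f')/(i k) for k ≠ 0, with c_0(f) supplied separately (e.g. as the mean of f). The following numerical check of this relation via the FFT is an illustration, not the paper's algorithm:

```python
import numpy as np

# Smooth 2*pi-periodic test function and its exact derivative.
N = 64
x = 2 * np.pi * np.arange(N) / N
f = np.exp(np.sin(x))
df = np.cos(x) * f

k = np.fft.fftfreq(N, d=1.0 / N)   # integer wavenumbers 0, 1, ..., -1
c_f = np.fft.fft(f) / N            # approximate Fourier coefficients of f
c_df = np.fft.fft(df) / N          # ... and of f'

# Recover c_k(f) from c_k(f') using c_k(f') = i*k*c_k(f).
recovered = np.zeros_like(c_f)
nz = k != 0
recovered[nz] = c_df[nz] / (1j * k[nz])
recovered[~nz] = c_f[~nz]          # c_0 must come from f itself
```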


In this work, we have mainly achieved the following:

1. We review the main methods used for the computation of the connection and linearization coefficients between orthogonal polynomials of a continuous variable and, using a new approach, solve the duplication problem for these polynomial families.
2. We review the main methods used for the computation of the connection and linearization coefficients of orthogonal polynomials of a discrete variable, and solve the duplication and linearization problems for all orthogonal polynomials of a discrete variable.
3. We propose a method to generate the connection, linearization and duplication coefficients for q-orthogonal polynomials.
4. We propose a unified method to obtain these coefficients in a generic way for orthogonal polynomials on quadratic and q-quadratic lattices.

Our algorithmic approach to computing linearization, connection and duplication coefficients is based on the one used by Koepf and Schmersau and on the NaViMa algorithm. Our main technique is to use explicit formulas for structural identities of classical orthogonal polynomial systems, and we obtain our results by an application of computer algebra. The major algorithmic tools for our development are Zeilberger's algorithm, the q-Zeilberger algorithm, the Petkovšek-van-Hoeij algorithm, the q-Petkovšek-van-Hoeij algorithm, and Algorithm 2.2, p. 20 of Koepf's book "Hypergeometric Summation", together with its q-analogue.
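A connection problem asks for the coefficients c_m with P_n(x) = sum_m c_m Q_m(x) for two polynomial families. The smallest possible illustration, expanding Legendre polynomials in the Chebyshev basis through the monomial basis with `numpy.polynomial`, gives the flavour of the problem (it is not the structural-identity algorithms of the work itself):

```python
import numpy as np
from numpy.polynomial import chebyshev, legendre

def legendre_to_chebyshev(n):
    """Connection coefficients c_m with P_n(x) = sum_m c_m T_m(x),
    computed numerically by passing through the monomial basis."""
    mono = legendre.leg2poly([0] * n + [1])   # P_n in powers of x
    return chebyshev.poly2cheb(mono)          # re-expand in Chebyshev T_m

# P_2(x) = (3x^2 - 1)/2 = (1/4) T_0(x) + (3/4) T_2(x)
c = legendre_to_chebyshev(2)
```

The algorithms surveyed in the work instead produce such coefficients symbolically, as hypergeometric terms, for whole parameterized families at once.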


We have developed a technique called RISE (Random Image Structure Evolution), by which one may systematically sample continuous paths in a high-dimensional image space. A basic RISE sequence depicts the evolution of an object's image from a random field, along with the reverse sequence which depicts the transformation of this image back into randomness. The processing steps are designed to ensure that important low-level image attributes such as the frequency spectrum and luminance are held constant throughout a RISE sequence. Experiments based on the RISE paradigm can be used to address some key open issues in object perception. These include determining the neural substrates underlying object perception, the role of prior knowledge and expectation in object perception, and the developmental changes in object perception skills from infancy to adulthood.
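The key constraint, holding the frequency spectrum and mean luminance fixed while the image's structure dissolves, can be illustrated with a Fourier phase-scrambling sketch. This is a simplified 2D stand-in for the published RISE procedure, with all parameters illustrative:

```python
import numpy as np

def phase_scramble(img, alpha, seed=0):
    """Blend the Fourier phases of a real image with the phases of a random
    real field, keeping the amplitude spectrum fixed. alpha = 0 returns the
    original image, alpha = 1 a fully phase-randomized field, so sweeping
    alpha traces a RISE-like sequence."""
    rng = np.random.default_rng(seed)
    F = np.fft.fft2(img)
    rand_phase = np.angle(np.fft.fft2(rng.standard_normal(img.shape)))
    rand_phase[0, 0] = 0.0                    # keep mean luminance fixed
    theta = (1 - alpha) * np.angle(F) + alpha * rand_phase
    # For intermediate alpha the blended spectrum may lose exact conjugate
    # symmetry at a few self-conjugate bins; taking the real part discards
    # the tiny imaginary residue.
    return np.real(np.fft.ifft2(np.abs(F) * np.exp(1j * theta)))

img = np.random.default_rng(5).uniform(size=(32, 32))
midway = phase_scramble(img, 0.5)   # partially dissolved image
```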


This article examines the effect on price of different characteristics of holiday hotels in the sun-and-beach segment, from the hedonic-function perspective. Monthly prices of the majority of hotels on the Spanish continental Mediterranean coast were gathered from May to October 1999 from tour operator catalogues. Hedonic functions are specified as random-effect models and parametrized as structural equation models with two latent variables: a random peak-season price and a random width of seasonal fluctuations. Characteristics of the hotels and of the regions where they are located are used as predictors of both latent variables. Besides hotel category, the region, distance to the beach, availability of parking and room equipment have an effect on peak price and also on seasonality. 3-star hotels have the highest seasonality and hotels located in the southern regions the lowest, which could be explained by a warmer climate in autumn.
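The two latent variables can be made concrete with a small simulation: each hotel gets a latent peak price and a latent seasonal width, and monthly prices are generated and then recovered by least squares. All numbers below are hypothetical, not the paper's data, and plain per-hotel regression stands in for the structural equation model:

```python
import numpy as np

# Hotel i has latent peak price a_i and latent seasonal width b_i;
# the month-m price is p_im = a_i - b_i * d_m + noise, where d_m is
# the distance (in months) from the August peak.
rng = np.random.default_rng(1)
n_hotels = 200
d = np.array([3.0, 2.0, 1.0, 0.0, 1.0, 2.0])     # May..October vs. August
a = 100 + 20 * rng.standard_normal(n_hotels)      # latent peak prices
b = 8 + 2 * rng.standard_normal(n_hotels)         # latent seasonal widths
prices = a[:, None] - b[:, None] * d[None, :] + rng.standard_normal((n_hotels, 6))

# Recover both latent variables for every hotel by least squares.
X = np.column_stack([np.ones(6), -d])
coef, *_ = np.linalg.lstsq(X, prices.T, rcond=None)
a_hat, b_hat = coef
```

In the paper, hotel and region characteristics would then be used as predictors of a_hat and b_hat (or, properly, of the latent variables inside one joint model).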


We analyze a finite horizon, single product, periodic review model in which pricing and production/inventory decisions are made simultaneously. Demands in different periods are random variables that are independent of each other, and their distributions depend on the product price. Pricing and ordering decisions are made at the beginning of each period and all shortages are backlogged. The ordering cost includes both a fixed cost and a variable cost proportional to the amount ordered. The objective is to find an inventory policy and a pricing strategy maximizing expected profit over the finite horizon. We show that when the demand model is additive, the profit-to-go functions are k-concave and hence an (s,S,p) policy is optimal. In such a policy, the period inventory is managed based on the classical (s,S) policy and the price is determined based on the inventory position at the beginning of each period. For more general demand functions, i.e., multiplicative plus additive functions, we demonstrate that the profit-to-go function is not necessarily k-concave and an (s,S,p) policy is not necessarily optimal. We introduce a new concept, symmetric k-concave functions, and apply it to provide a characterization of the optimal policy.
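The (s,S) part of this structure can be seen in a tiny backward-induction example. The sketch below fixes the price (so it shows only the inventory half of the (s,S,p) policy), uses a cost-minimization formulation, and all parameters are hypothetical:

```python
import numpy as np

K, c, h, b = 5.0, 1.0, 1.0, 9.0        # fixed cost, unit cost, holding, backlog
demand = np.arange(5)                  # demand uniform on {0, ..., 4}
probs = np.full(5, 0.2)
states = np.arange(-10, 21)            # inventory positions (backlog allowed)
T = 4                                  # number of periods

V = np.zeros(len(states))              # terminal value function
for t in range(T):
    newV = np.empty(len(states))
    order_up_to = np.empty(len(states), dtype=int)
    for i, x in enumerate(states):
        best, best_y = np.inf, int(x)
        for y in range(int(x), int(states[-1]) + 1):   # order up to level y >= x
            nxt = np.clip(y - demand, states[0], states[-1]) - states[0]
            cost = (K * (y > x) + c * (y - x)
                    + probs @ (h * np.maximum(y - demand, 0)
                               + b * np.maximum(demand - y, 0))
                    + probs @ V[nxt])
            if cost < best - 1e-12:
                best, best_y = cost, y
        newV[i], order_up_to[i] = best, best_y
    V = newV

# After the loop, order_up_to is the first-period policy: it should have the
# (s,S) form "order up to S exactly when the inventory position is below s".
orders = order_up_to != states
s = states[~orders].min()
S = order_up_to[orders][0]
```

k-concavity of the profit-to-go (here, K-convexity of the cost-to-go) is what guarantees this threshold structure.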


We analyze an infinite horizon, single product, periodic review model in which pricing and production/inventory decisions are made simultaneously. Demands in different periods are identically distributed random variables that are independent of each other and their distributions depend on the product price. Pricing and ordering decisions are made at the beginning of each period and all shortages are backlogged. Ordering cost includes both a fixed cost and a variable cost proportional to the amount ordered. The objective is to maximize expected discounted, or expected average profit over the infinite planning horizon. We show that a stationary (s,S,p) policy is optimal for both the discounted and average profit models with general demand functions. In such a policy, the period inventory is managed based on the classical (s,S) policy and price is determined based on the inventory position at the beginning of each period.
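For the discounted infinite-horizon case, value iteration on the same simplified fixed-price inventory model (hypothetical parameters, cost-minimization form) converges to a single value function, and the policy read off from it is stationary with the threshold structure described above:

```python
import numpy as np

K, c, h, b, beta = 5.0, 1.0, 1.0, 9.0, 0.8   # costs and discount factor
demand, probs = np.arange(5), np.full(5, 0.2)
states = np.arange(-10, 21)

def bellman(V):
    """One step of value iteration; returns the new values and the policy."""
    newV = np.empty(len(states))
    policy = np.empty(len(states), dtype=int)
    for i, x in enumerate(states):
        best, best_y = np.inf, int(x)
        for y in range(int(x), int(states[-1]) + 1):
            nxt = np.clip(y - demand, states[0], states[-1]) - states[0]
            cost = (K * (y > x) + c * (y - x)
                    + probs @ (h * np.maximum(y - demand, 0)
                               + b * np.maximum(demand - y, 0))
                    + beta * (probs @ V[nxt]))
            if cost < best - 1e-12:
                best, best_y = cost, y
        newV[i], policy[i] = best, best_y
    return newV, policy

V = np.zeros(len(states))
for _ in range(200):
    V_new, policy = bellman(V)
    if np.max(np.abs(V_new - V)) < 1e-6:   # contraction: convergence guaranteed
        break
    V = V_new
```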


We generalize a previous model of time-delayed reaction–diffusion fronts (Fort and Méndez 1999 Phys. Rev. Lett. 82 867) to allow for a bias in the microscopic random walk of particles or individuals. We also present a second model which takes the time order of events (diffusion and reproduction) into account. As an example, we apply them to the human invasion front across the USA in the 19th century. The corrections relative to the previous model are substantial. Our results are relevant to physical and biological systems with anisotropic fronts, including particle diffusion in disordered lattices, population invasions, the spread of epidemics, etc.
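The qualitative effect of a bias on the front speed can be seen in a mean-field lattice sketch: particles hop left/right with (possibly unequal) probabilities and then reproduce logistically. This toy model uses hypothetical parameters and omits the time delay that is central to the paper; it only illustrates that a rightward bias accelerates the rightward front:

```python
import numpy as np

def front_speed(p_right, p_left, r=0.5, n=400, steps=300):
    """Speed of the rightward front of a biased hopping + logistic growth
    model on a 1D lattice (cells per time step)."""
    u = np.zeros(n)
    u[:10] = 1.0                        # population occupies the left edge
    p_stay = 1.0 - p_right - p_left
    pos = []
    for t in range(steps):
        u = p_right * np.roll(u, 1) + p_left * np.roll(u, -1) + p_stay * u
        u[0], u[-1] = u[1], 0.0         # crude boundaries (kill wrap-around)
        u = u + r * u * (1 - u)         # logistic reproduction
        pos.append(np.max(np.nonzero(u > 0.5)[0]))   # front position
    pos = np.array(pos)
    # Fit the late-time front trajectory to estimate the asymptotic speed.
    return np.polyfit(np.arange(steps // 2, steps), pos[steps // 2:], 1)[0]

v_unbiased = front_speed(0.25, 0.25)
v_biased = front_speed(0.35, 0.15)   # rightward bias
```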


Introduction: non-alcoholic fatty liver disease (NAFLD) is a very common condition with an insidious course. There is still no consensus on its diagnosis, follow-up and treatment, mainly because of limited knowledge of its natural history and the difficulty of an accurate non-invasive diagnosis. Materials and methods: prospective, observational, cross-sectional correlation study using non-random sampling of the patients attending the medical check-up service of the Fundación CardioInfantil – Instituto de Cardiología. Clinical and laboratory variables such as Body Mass Index, transaminases, triglycerides and the ultrasonographic appearance of the liver were evaluated. Non-parametric analysis of variance was performed with the Kruskal-Wallis test, and correlation analysis with Spearman's correlation coefficient. Results: 619 patients were included. A statistically significant variation (p < 0.001) was found among all the analysed variables when grouped according to the ultrasonographic appearance of the liver. Statistically significant positive correlation coefficients (p < 0.001) were also found for the same variables. Discussion: ultrasonographic evaluation of the liver is an attractive option for the diagnosis and follow-up of patients with NAFLD because it is non-invasive, inexpensive and widely available. Given the variation of the clinical parameters with hepatic appearance, the results suggest that this tool may be useful both at diagnosis and during follow-up in this population. The correlation coefficients suggest that the possibility of predicting blood variables with this method deserves further study.
Conclusions: taken together, the results of this study support the usefulness of ultrasonographic evaluation of the liver as an assessment and possible follow-up tool in patients with suspected NAFLD in this population.
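The two statistical procedures used in the study can be illustrated with `scipy.stats` on synthetic data; the numbers below are entirely hypothetical stand-ins for the patient data:

```python
import numpy as np
from scipy import stats

# Hypothetical BMI values for three ultrasonographic grades (0, 1, 2),
# with the group mean increasing by grade.
rng = np.random.default_rng(0)
bmi_by_grade = [24 + g * 2 + rng.standard_normal(100) for g in range(3)]

# Kruskal-Wallis: non-parametric test that the three groups differ in location.
H, p_kw = stats.kruskal(*bmi_by_grade)

# Spearman: rank correlation between a liver score and BMI.
score = np.concatenate([np.full(100, g) for g in range(3)]) + rng.uniform(0, 0.5, 300)
bmi = np.concatenate(bmi_by_grade)
rho, p_sp = stats.spearmanr(score, bmi)
```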


This paper uses Colombian household survey data collected over the period 1984-2005 to estimate Gini coefficients along with their corresponding standard errors. We find a statistically significant increase in wage income inequality following the adoption of the liberalisation measures of the early 1990s, and mixed evidence during the recovery years that followed the economic recession of the late 1990s. We also find that in several cases the observed differences in the Gini coefficients across cities have not been statistically significant.
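A Gini coefficient with a resampling-based standard error can be sketched in a few lines; the bootstrap here is a generic illustration and need not match the variance estimator used in the paper, and the wage sample is simulated:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a sample of non-negative incomes,
    G = 2 * sum_i i*x_(i) / (n * sum_i x_i) - (n + 1)/n with x sorted."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return 2 * (ranks @ x) / (n * x.sum()) - (n + 1) / n

def gini_se(x, n_boot=500, seed=0):
    """Bootstrap standard error of the Gini coefficient."""
    rng = np.random.default_rng(seed)
    reps = [gini(rng.choice(x, size=len(x), replace=True)) for _ in range(n_boot)]
    return np.std(reps, ddof=1)

# Simulated log-normal wages (illustrative, not the Colombian survey data).
wages = np.random.default_rng(42).lognormal(mean=8, sigma=0.6, size=2000)
g, se = gini(wages), gini_se(wages)
```

Comparing two cities' Gini estimates then amounts to checking whether their difference is large relative to the combined standard errors, which is the kind of test the paper performs.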


The aim of this document is to collect some classical results on the existence and uniqueness of solutions of stochastic differential equations with terminal condition (backward stochastic differential equations, BSDEs), with particular emphasis on the case of monotone coefficients, and on their connection with viscosity solutions of systems of semilinear parabolic and elliptic second-order partial differential equations (PDEs).


A branching random motion on a line, with abrupt changes of direction, is studied. The branching mechanism, which is independent of the random motion, and the intensities of the reverses are defined by a particle's current direction. A solution of a certain hyperbolic system of coupled non-linear equations (a Kolmogorov-type backward equation) has a so-called McKean representation via such processes. Commonly this system possesses travelling-wave solutions. The convergence of solutions with Heaviside terminal data to the travelling waves is discussed. This paper realizes the McKean programme for the Kolmogorov-Petrovskii-Piskunov equation in this case. The Feynman-Kac formula plays a key role.
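The convergence of Heaviside-type data to a travelling wave is easiest to see numerically in the classical (parabolic) KPP equation u_t = u_xx + u(1 − u), where the front settles at the minimal speed 2. The finite-difference sketch below illustrates that limit, not the hyperbolic branching model itself:

```python
import numpy as np

# Explicit finite differences for u_t = u_xx + u*(1 - u).
dx, dt, L, T = 0.2, 0.01, 200.0, 40.0   # dt/dx^2 = 0.25 <= 0.5: stable
x = np.arange(0, L, dx)
u = (x < 10).astype(float)              # Heaviside-like initial data
times, fronts = [], []
for n in range(int(T / dt)):
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    lap[0] = lap[-1] = 0.0              # crude no-flux boundaries
    u = u + dt * (lap + u * (1 - u))
    if n % 100 == 0:                    # record the front once per unit time
        times.append(n * dt)
        fronts.append(x[np.argmax(u < 0.5)])   # leftmost point below 1/2

# Late-time slope of the front trajectory approaches the minimal speed 2
# (from below, with a slow logarithmic correction).
speed = np.polyfit(times[len(times) // 2:], fronts[len(fronts) // 2:], 1)[0]
```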


One of the key aspects of 3D-image registration is the computation of the joint intensity histogram. We propose a new approach to compute this histogram using uniformly distributed random lines to sample stochastically the overlapping volume between two 3D images. The intensity values are captured from the lines at evenly spaced positions, taking a different random initial offset for each line. This method provides an accurate, robust and fast mutual information-based registration. The interpolation effects are drastically reduced, owing to the stochastic nature of the line generation, and the alignment process is also accelerated. The results obtained show that the introduced method performs better than the classic computation of the joint histogram.
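The sampling scheme can be sketched in 2D: random lines with evenly spaced sample positions and a random per-line offset feed a joint histogram, from which mutual information is computed. This is an illustrative reduction of the idea (the paper works with 3D volumes and uniformly distributed lines through the overlap region); nearest-neighbour lookup stands in for interpolation:

```python
import numpy as np

def mutual_information_lines(img_a, img_b, n_lines=400, step=1.0, bins=16, seed=0):
    """Estimate the mutual information of two equally sized 2D images from a
    joint histogram built by sampling random lines, reading intensities at
    evenly spaced positions with a random initial offset per line."""
    rng = np.random.default_rng(seed)
    h, w = img_a.shape
    samples_a, samples_b = [], []
    for _ in range(n_lines):
        p = rng.uniform([0, 0], [h - 1, w - 1])        # random starting point
        theta = rng.uniform(0, np.pi)                  # random direction
        d = np.array([np.sin(theta), np.cos(theta)])
        t = rng.uniform(0, step) + step * np.arange(int(max(h, w) * 1.5))
        pts = p + t[:, None] * d                       # evenly spaced positions
        r = np.round(pts[:, 0]).astype(int)
        c = np.round(pts[:, 1]).astype(int)
        ok = (r >= 0) & (r < h) & (c >= 0) & (c < w)   # keep in-image samples
        samples_a.append(img_a[r[ok], c[ok]])
        samples_b.append(img_b[r[ok], c[ok]])
    a, b = np.concatenate(samples_a), np.concatenate(samples_b)
    joint, _, _ = np.histogram2d(a, b, bins=bins)      # joint intensity histogram
    pab = joint / joint.sum()
    pa, pb = pab.sum(1), pab.sum(0)
    nz = pab > 0
    return np.sum(pab[nz] * np.log(pab[nz] / np.outer(pa, pb)[nz]))

rng = np.random.default_rng(3)
img = rng.uniform(size=(64, 64))
noise = rng.uniform(size=(64, 64))
mi_self = mutual_information_lines(img, img)     # identical images: high MI
mi_indep = mutual_information_lines(img, noise)  # unrelated images: low MI
```

A registration loop would maximize this estimate over candidate transforms of one image.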