36 results for "Consommation de poisson"


Relevance: 10.00%

Abstract:

In this paper we study one-dimensional reflected backward stochastic differential equations in which the noise is driven by a Brownian motion and an independent Poisson point process, and the solution is forced to stay above a right-continuous, left-limited (RCLL) obstacle. We prove existence and uniqueness of the solution using a penalization method combined with a monotone limit theorem.
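
As a minimal sketch (the notation below is assumed for illustration and is not taken from the paper), an equation of the type described here, together with the penalization scheme, can be written as:

```latex
% Reflected BSDE driven by a Brownian motion B and a compensated Poisson measure \tilde{N},
% with terminal condition \xi, driver f, RCLL obstacle (L_t), and a nondecreasing process K
% that pushes the solution upwards only when it touches the obstacle:
\begin{aligned}
Y_t &= \xi + \int_t^T f(s, Y_s, Z_s, U_s)\,ds + K_T - K_t
       - \int_t^T Z_s\,dB_s - \int_t^T\!\!\int_E U_s(e)\,\tilde{N}(ds,de),\\
Y_t &\ge L_t, \qquad \int_0^T \big(Y_{t^-} - L_{t^-}\big)\,dK_t = 0 .
\end{aligned}
% Penalization replaces K by n \int_t^T (Y^n_s - L_s)^- ds and lets n \to \infty.
```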

Relevance: 10.00%

Abstract:

In this work, an analytical model of the DC characteristics of the double-gate MOSFET (DG-MOSFET) is implemented, based on the solution of the Poisson equation and on drift-diffusion theory [1]. The asymmetric double-gate MOSFET offers great flexibility in the design of the threshold voltage and the OFF current. The analytical model reproduces the DC characteristics of the long-channel DG-MOSFET and forms the basis for building SPICE-type circuit models.
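
For orientation only, a minimal sketch of the two ingredients named above (a 1D Poisson equation across the film and a drift-diffusion current integral), in generic notation that is assumed here rather than taken from the work:

```latex
% 1D Poisson equation across an undoped silicon film (x: depth, \psi: electrostatic potential,
% V: channel quasi-Fermi potential, V_T = kT/q, n_i: intrinsic carrier density):
\frac{d^2\psi}{dx^2} \;=\; \frac{q\,n_i}{\varepsilon_{\mathrm{Si}}}\,
      \exp\!\left(\frac{\psi - V}{V_T}\right),
% Drift-diffusion (Pao-Sah-type) drain current, integrating the inversion charge Q_i
% over the quasi-Fermi potential from source to drain:
\qquad I_{DS} \;=\; \mu\,\frac{W}{L}\int_0^{V_{DS}} Q_i(V)\,dV .
```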

Relevance: 10.00%

Abstract:

This empirical work studies the influence of immigrant students on individuals' school choice in one of the most populated regions of Spain: Catalonia. Using a Poisson model, we estimate the probability that a school already attended by immigrant students is chosen by natives and by immigrants, respectively. The data, provided by the Catalonia School Department, describe the characteristics of all primary and secondary schools in Catalonia during the 2001/02 and 2002/03 school years. The results support the evidence that native Catalan families avoid schools attended by immigrants; natives prefer not to interact with immigrants. Private schools are more successful at avoiding immigrants. Finally, the main reason for non-natives' choice is the presence of other non-natives in the same school.
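
A minimal sketch of a Poisson count regression of this kind, on synthetic data and with hypothetical variable names (the study's actual covariates and dataset are not reproduced here):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools = 500

# Synthetic school-level data (hypothetical covariates, not the study's variables).
df = pd.DataFrame({
    "immigrant_share": rng.uniform(0, 0.5, n_schools),  # share of immigrant pupils
    "private": rng.integers(0, 2, n_schools),            # 1 = private school
    "capacity": rng.integers(25, 100, n_schools),         # places offered
})
# Simulate native enrolment counts that decline with the immigrant share.
lam = np.exp(3.0 - 2.0 * df["immigrant_share"] + 0.2 * df["private"])
df["native_enrolment"] = rng.poisson(lam)

# Poisson regression: how native enrolment counts vary with school characteristics,
# with (log) capacity as an exposure offset.
model = smf.glm(
    "native_enrolment ~ immigrant_share + private",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["capacity"]),
).fit()
print(model.summary())
```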

Relevance: 10.00%

Abstract:

It has recently been found that a number of systems displaying crackling noise also show a remarkable behavior regarding the temporal occurrence of successive events versus their size: a scaling law for the probability distributions of waiting times as a function of a minimum size is fulfilled, signaling the existence in those systems of self-similarity in time and size. This property is also present in some non-crackling systems. Here, the uncommon character of the scaling law is illustrated with simple marked renewal processes, built by definition with no correlations. Whereas processes with a finite mean waiting time do not fulfill a scaling law in general and tend towards a Poisson process in the limit of very large sizes, processes without a finite mean tend to another class of distributions, characterized by double power-law waiting-time densities. This is somewhat reminiscent of the generalized central limit theorem. A model with short-range correlations is not able to escape the attraction of those limit distributions. A discussion of open problems in the modeling of these properties is provided.
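
A minimal numerical sketch of a marked renewal process of the kind discussed (i.i.d. waiting times, i.i.d. power-law marks, no correlations by construction; all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Marked renewal process: waiting times and sizes (marks) are i.i.d. and mutually
# independent, so the process carries no correlations by construction.
waits = rng.exponential(scale=1.0, size=n)        # finite-mean waiting times
sizes = (1.0 - rng.random(n)) ** (-1.0 / 1.5)     # Pareto-like marks, P(S > s) ~ s^{-1.5}

times = np.cumsum(waits)

# Waiting times between events whose size exceeds a minimum threshold s_min:
# thin the event sequence and re-measure the gaps.
for s_min in (1.0, 5.0, 25.0):
    kept = times[sizes >= s_min]
    gaps = np.diff(kept)
    print(f"s_min={s_min:5.1f}  kept={kept.size:7d}  "
          f"mean gap={gaps.mean():8.3f}  CV={gaps.std() / gaps.mean():5.3f}")
# With finite-mean waiting times, thinning by size drives the gaps towards an
# exponential (Poisson-like) distribution: the coefficient of variation tends to 1.
```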

Relevance: 10.00%

Abstract:

This paper tries to resolve some of the main shortcomings in the empirical literature on location decisions for new plants, namely spatial effects and overdispersion. Spatial effects are omnipresent, being a source of overdispersion in the data as well as a factor shaping the functional relationship between the variables that explain a firm's location decisions. Using count data models, empirical researchers have dealt with overdispersion and excess zeros through developments of the Poisson regression model. This study aims to take this a step further by adopting Bayesian methods and models in order to tackle the excess of zeros, spatial and non-spatial overdispersion, and spatial dependence simultaneously. Data for Catalonia are used and location determinants are analysed to that end. The results show that spatial effects are determinant. Additionally, overdispersion is decomposed into an unstructured i.i.d. effect and a spatially structured effect.

Keywords: Bayesian Analysis, Spatial Models, Firm Location. JEL Classification: C11, C21, R30.
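
One common way to write a count model with both unstructured and spatially structured overdispersion is a BYM-type specification; the sketch below uses generic notation and is not necessarily the exact parameterization used in the paper:

```latex
% y_i: number of new plants located in area i;  x_i: location determinants.
y_i \sim \mathrm{Poisson}(\lambda_i), \qquad
\log \lambda_i = x_i^{\top}\beta + v_i + u_i,
% v_i: unstructured (i.i.d.) overdispersion;  u_i: spatially structured effect,
% e.g. an intrinsic CAR prior built from the neighbourhood structure of the areas
% (n_i: number of neighbours of area i, j \sim i: areas adjacent to i):
v_i \sim \mathcal{N}(0, \sigma_v^2), \qquad
u_i \mid u_{-i} \sim \mathcal{N}\!\Big(\tfrac{1}{n_i}\textstyle\sum_{j \sim i} u_j,\; \tfrac{\sigma_u^2}{n_i}\Big).
```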

Relevance: 10.00%

Abstract:

A collection of spherical obstacles in the unit ball in Euclidean space is said to be avoidable for Brownian motion if there is a positive probability that Brownian motion diffusing from some point in the ball will avoid all the obstacles and reach the boundary of the ball. The centres of the spherical obstacles are generated according to a Poisson point process, while the radius of an obstacle is a deterministic function. If avoidable configurations are generated with positive probability, Lundh calls this percolation diffusion. An integral condition for percolation diffusion is derived in terms of the intensity of the point process and the function that determines the radii of the obstacles.
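
In symbols (notation assumed here purely to formalize the definition above; the integral condition itself is not reproduced):

```latex
% Obstacles: balls centred at the points x of a Poisson point process \Pi in the unit
% ball B(0,1) \subset \mathbb{R}^d, with deterministic radii r(x):
A \;=\; \bigcup_{x \in \Pi} B\big(x, r(x)\big),
\qquad
\text{avoidable} \iff
\mathbb{P}_{z}\big( T_{\partial B(0,1)} < T_{A} \big) > 0 \;\text{ for some } z \in B(0,1),
% where T_E is the first hitting time of the set E by Brownian motion started at z.
```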

Relevance: 10.00%

Abstract:

Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment, detecting tumors which would remain undiagnosed during a lifetime. The aims of this study were: first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data; and second, to quantify the extent of BC overdiagnosis. We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Our results support the existence of overdiagnosis in Catalonia attributable to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
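
Schematically (illustrative notation only; the person-years offset and the exact functional form are assumptions, not the paper's specification), an age-cohort Poisson regression with the covariates listed above is of the form:

```latex
% y_{ac}: observed invasive BC cases at age a in birth cohort c;
% N_{ac}: person-years at risk (assumed here as an offset).
y_{ac} \sim \mathrm{Poisson}(N_{ac}\,\lambda_{ac}), \qquad
\log \lambda_{ac} = f(a) + \beta_1\,\mathrm{CFR}_c + \beta_2\,\mathrm{mammo50}_c + \beta_3\, c ,
% f(a): age effect;  CFR_c: completed fertility rate of cohort c;
% mammo50_c: percentage of women using mammography at age 50;  c: year of birth.
% Setting the screening covariate to zero gives the background incidence without screening.
```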

Relevance: 10.00%

Abstract:

This paper examines a dataset which is modeled well by the Poisson-log normal process and by this process mixed with log normal data, both of which are then turned into compositions. This generates compositional data that has zeros without any need for conditional models or for assuming that there is missing or censored data that needs adjustment. It also enables us to model dependence on covariates and within the composition.
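
A minimal simulation sketch (all parameter values illustrative) of how Poisson-log normal counts turned into compositions produce genuine zeros without any censoring mechanism:

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_parts = 1000, 4

# Poisson-log normal counts: log-rates are multivariate normal, counts are Poisson.
mu = np.array([2.0, 1.0, 0.0, -1.0])          # illustrative mean log-rates
cov = 0.5 * np.eye(n_parts) + 0.25            # illustrative covariance matrix
log_rates = rng.multivariate_normal(mu, cov, size=n_samples)
counts = rng.poisson(np.exp(log_rates))

# Closure: turn each count vector into a composition (proportions summing to 1).
totals = counts.sum(axis=1, keepdims=True)
compositions = counts / np.maximum(totals, 1)  # avoid 0/0 when a whole row is zero

# Zeros arise naturally whenever a Poisson count is zero -- no censoring needed.
print("fraction of zero parts:", (counts == 0).mean())
print("first few compositions:\n", compositions[:3].round(3))
```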

Relevance: 10.00%

Abstract:

In some European countries, processed meat products can account for nearly 20% of daily sodium intake. The meat industry is therefore trying to reduce the salt content of meat products in order to meet both consumer expectations and the demands of health authorities. The Quick-Dry-Slice process (QDS®), combined with the use of salt substitutes for sodium chloride (NaCl), has made it possible to successfully manufacture low-salt fermented sausages while shortening the production cycle and without adding extra NaCl. Non-destructive on-line measurement technologies, such as X-rays and electromagnetic induction, make it possible to classify fresh hams according to their fat content, a crucial parameter for adjusting the duration of the salting stage. X-ray technology can also be used to estimate the amount of salt incorporated during salting. Information on salt and fat content is important for optimizing the dry-cured ham production process by reducing the variability of salt content between and within batches, and also for reducing the salt content of the final product. Other technologies, such as near-infrared spectroscopy (NIRS) and microwave spectroscopy, are also useful for controlling the production process and for characterizing and classifying processed meat products according to their salt content. Most of these technologies can easily be applied on-line in industry to control the manufacturing process and thus obtain meat products with the desired characteristics.

Relevance: 10.00%

Abstract:

We include solvation effects in tight-binding Hamiltonians for hole states in DNA. The corresponding linear-response parameters are derived from accurate estimates of solvation energy calculated for several hole charge distributions in DNA stacks. Two models are considered: (A) the correction to a diagonal Hamiltonian matrix element depends only on the charge localized on the corresponding site, and (B) in addition to this term, the reaction field due to adjacent base pairs is accounted for. We show that both schemes give very similar results. The effects of the polar medium on the hole distribution in DNA are studied. We conclude that the effects of polar surroundings essentially suppress charge delocalization in DNA, and hole states in (GC)n sequences are localized on individual guanines.
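
Schematically (generic notation and sign convention assumed here for illustration; the paper's actual parameterization is not reproduced), the two corrections described read:

```latex
% \epsilon_i: tight-binding site energy for the hole on base pair i;  q_i: hole charge on that site.
% Model A: the correction depends only on the charge localized on site i (sign convention assumed):
\epsilon_i \;=\; \epsilon_i^{(0)} \;-\; \gamma\, q_i ,
% Model B: in addition, the reaction field of the adjacent base pairs is included:
\epsilon_i \;=\; \epsilon_i^{(0)} \;-\; \gamma\, q_i \;-\; \gamma'\,(q_{i-1} + q_{i+1}),
% with \gamma, \gamma' linear-response parameters fitted to the computed solvation energies.
```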

Relevance: 10.00%

Abstract:

Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows.

Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes.

Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
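
A minimal Poisson τ-leap sketch for a toy birth-death system (the RK extension described above is not reproduced; this only illustrates the basic fixed-step leap, with illustrative rate constants):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy system: 0 --k1--> X (birth),  X --k2--> 0 (death).
k1, k2 = 10.0, 0.1
stoich = np.array([+1, -1])          # change in X for each reaction channel

def propensities(x):
    # a_j(x): rate at which each channel fires in the current state.
    return np.array([k1, k2 * x])

def tau_leap(x0, tau, n_steps):
    """Fixed-step Poisson tau-leap: each channel fires Poisson(a_j * tau) times per step."""
    x, traj = x0, [x0]
    for _ in range(n_steps):
        a = propensities(x)
        k = rng.poisson(a * tau)                  # firings of each channel in [t, t + tau)
        x = max(x + int(np.dot(k, stoich)), 0)    # crude guard against negative populations
        traj.append(x)
    return np.array(traj)

traj = tau_leap(x0=0, tau=0.05, n_steps=2000)
# For this system the stationary law is Poisson(k1/k2), so mean and variance should both be ~100.
print("mean:", traj[500:].mean(), " variance:", traj[500:].var(), " theory:", k1 / k2)
```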

Relevance: 10.00%

Abstract:

Models are presented for the optimal location of hubs in airline networks that take congestion effects into consideration. Hubs, which are the most congested airports, are modeled as M/D/c queuing systems, that is, Poisson arrivals, deterministic service times, and c servers. A formula is derived for the probability of a given number of customers in the system, which is later used to propose a probabilistic constraint. This constraint limits the probability of having b airplanes in the queue to be less than a value α. Due to the computational complexity of the formulation, the model is solved using a metaheuristic based on tabu search. Computational experience is presented.
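
The paper works with an analytical formula, which is not reproduced here; as a hedged illustration, the quantity constrained above can also be estimated by simulating an M/D/c queue directly (arrival rate, service time, server count, and threshold below are all assumed values):

```python
import numpy as np

rng = np.random.default_rng(4)

def mdc_queue_prob(lam, service, c, b, n_arrivals=100_000):
    """Estimate P(at least b customers waiting) in an M/D/c queue by simulation.

    lam: Poisson arrival rate, service: deterministic service time, c: servers.
    By PASTA, the distribution seen by Poisson arrivals equals the time average.
    """
    arrivals = np.cumsum(rng.exponential(1.0 / lam, n_arrivals))
    free_at = np.zeros(c)               # time at which each server next becomes free
    starts = np.empty(n_arrivals)       # FCFS service start time of each customer
    for i, t in enumerate(arrivals):
        j = np.argmin(free_at)          # earliest available server
        starts[i] = max(t, free_at[j])
        free_at[j] = starts[i] + service
    # Customer i finds in queue all earlier customers whose service has not yet started.
    waiting_seen = np.array([
        i - np.searchsorted(starts[:i], t, side="right")
        for i, t in enumerate(arrivals)
    ])
    return (waiting_seen >= b).mean()

# Example: 8 servers, 2-minute deterministic service, 3 arrivals per minute, threshold b = 5.
print(mdc_queue_prob(lam=3.0, service=2.0, c=8, b=5))
```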

Relevance: 10.00%

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
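
The classical problem the authors refer to can be written, in schematic notation assumed here, as follows:

```latex
% Observe i.i.d. counts k_1, \dots, k_m from a Binomial(N, p) with BOTH N and p unknown
% (in the RM reading: N is the market size, p a purchase probability, and the
%  no-purchase outcomes are never observed directly).
L(N, p) \;=\; \prod_{t=1}^{m} \binom{N}{k_t}\, p^{\,k_t}\,(1-p)^{\,N-k_t},
\qquad N \in \mathbb{N},\; p \in (0,1).
% Joint maximization over (N, p) is notoriously unstable, which is why the heuristic above
% leans on the structure of the purchase model (e.g. multinomial logit), the variety of
% offer sets, and qualitative knowledge of arrival rates.
```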

Relevance: 10.00%

Abstract:

The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. an image that, if it were truly the source of radiation in a patient, could have generated the initial data through the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as a conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
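
Schematically (generic notation assumed here, not the paper's), a Bayesian objective combining the Poisson counting likelihood with an entropy prior is of the form:

```latex
% \lambda_j: activity in voxel j;  a_{ij}: probability that an emission from voxel j is
% detected in bin i;  y_i: measured counts;  \hat{y}_i = \sum_j a_{ij}\lambda_j: expected counts.
\max_{\lambda \ge 0}\;
\sum_i \Big( y_i \log \hat{y}_i - \hat{y}_i \Big)
\;-\; \beta \sum_j \lambda_j \log \lambda_j ,
% First term: Poisson log-likelihood (up to constants); second term: entropy prior,
% with \beta playing the role of the adjustable contrast parameter mentioned above.
```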