937 results for Discrete time inventory models


Relevance: 40.00%

Abstract:

This paper investigates which properties money-demand functions must satisfy to be consistent with multidimensional extensions of Lucas' (2000) versions of the Sidrauski (1967) and shopping-time models. We also investigate how these classes of models relate to each other with regard to the rationalization of money demands. We conclude that money-demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.

Relevance: 40.00%

Abstract:

In this article, proportional hazards and logistic models for grouped survival data are extended to incorporate time-dependent covariates. The extension was motivated by a forestry experiment designed to compare five different water stresses in Eucalyptus grandis seedlings, with seedling lifetime as the response. The data were grouped because the seedlings were visited by the researcher on only three occasions. Shoot height was also measured on each of these occasions and is therefore a time-dependent covariate. Both extended models were fitted to this example, with very similar results.
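A key preprocessing step behind grouped-data hazard models with time-dependent covariates is expanding each subject into one binary record per interval at risk. The sketch below illustrates this person-period expansion; the data and field names are made up for illustration, not taken from the experiment described.

```python
# Hypothetical sketch: expanding grouped survival data into person-period
# records, the usual preparation for fitting discrete-time hazard models
# (cloglog link ~ grouped proportional hazards, logit link ~ logistic model).

def to_person_period(subjects):
    """Each subject: (last interval observed, event flag, list of
    time-dependent covariate values, one per interval at risk)."""
    rows = []
    for sid, (last, event, covs) in enumerate(subjects):
        for t in range(1, last + 1):
            died = 1 if (event and t == last) else 0
            rows.append({"id": sid, "interval": t,
                         "height": covs[t - 1], "event": died})
    return rows

# Three seedlings observed on up to three occasions (intervals); values made up.
data = [
    (3, 0, [10.0, 14.0, 17.0]),   # survived all three visits (censored)
    (2, 1, [9.0, 11.0]),          # died in interval 2
    (1, 1, [8.0]),                # died in interval 1
]
rows = to_person_period(data)
print(len(rows))                       # 3 + 2 + 1 = 6 binary records
print(sum(r["event"] for r in rows))   # 2 events
```

Each row can then be passed to any binomial GLM routine, with the time-varying covariate simply entering as the value current in that interval.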

Relevance: 40.00%

Abstract:

This work develops a new methodology for discriminating between models for interval-censored data, based on bootstrap residual simulation and on the deviance difference between one model and another, following Hinde (1992). Data of this kind can generate a large number of tied observations, in which case survival time can be regarded as discrete. The Cox proportional hazards model for grouped data (Prentice & Gloeckler, 1978) and the logistic model (Lawless, 1982) can therefore be fitted as generalized linear models. Whitehead (1989) treated censoring as an indicator variable with a binomial distribution and fitted the Cox proportional hazards model using the complementary log-log link function; likewise, the logistic model can be fitted using the logit link. The proposed methodology is an alternative to the score tests developed by Colosimo et al. (2000), in which both models arise as particular cases of the asymmetric Aranda-Ordaz family for discrete binary data, with the tests built on the corresponding link functions. The motivating example is a dataset from an experiment on a flax cultivar planted on four substrata susceptible to the pathogen Fusarium oxysporum; the response, time until blighting, was observed in intervals over 52 days. The results were compared with the model fits and the AIC values.
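The two competing fits differ only in the link function, and the discrimination statistic is a deviance difference. A minimal sketch of those ingredients, with an invented linear predictor standing in for an actual GLM fit, is:

```python
# Illustrative sketch (assumed notation): the cloglog inverse link gives the
# grouped proportional hazards model, the logit inverse link the logistic
# model; the discrimination statistic is the binomial deviance difference,
# calibrated in the paper by bootstrap resimulation (Hinde, 1992).
import math

def inv_cloglog(eta):   # P(event) under the grouped Cox model
    return 1.0 - math.exp(-math.exp(eta))

def inv_logit(eta):     # P(event) under the logistic model
    return 1.0 / (1.0 + math.exp(-eta))

def binomial_deviance(y, p):
    dev = 0.0
    for yi, pi in zip(y, p):
        pi = min(max(pi, 1e-12), 1 - 1e-12)   # guard the logs
        dev += -2.0 * (yi * math.log(pi) + (1 - yi) * math.log(1 - pi))
    return dev

y = [1, 0, 1, 1, 0]
eta = [0.3, -1.2, 0.8, 0.1, -0.7]   # a fitted linear predictor (made up)
d_cloglog = binomial_deviance(y, [inv_cloglog(e) for e in eta])
d_logit = binomial_deviance(y, [inv_logit(e) for e in eta])
print(round(d_cloglog - d_logit, 3))   # the deviance-difference statistic
```

In the actual methodology each model would be refitted to bootstrap-resimulated responses to obtain the reference distribution of this difference.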

Relevance: 40.00%

Abstract:

Ties among event times are often recorded in survival studies. For example, in a two-week laboratory study in which event times are measured in days, ties are very likely to occur. The proportional hazards model can be used in this setting with an approximated partial likelihood function, and this approximation works well when the number of ties is small. Discrete regression models, on the other hand, are suggested when the data are heavily tied. In many situations, however, it is not clear which approach should be used in practice. In this work, empirical guidelines based on Monte Carlo simulations are provided. The recommendations are based on a measure of the amount of tied data present and on the mean square error. An example illustrates the proposed criterion.
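A guideline keyed to "the amount of tied data" needs a concrete tie measure. One simple candidate, shown below purely for illustration (not necessarily the paper's exact definition), is the proportion of observations that share their event time with at least one other:

```python
# Illustrative tie measure: fraction of observations whose event time is
# shared with at least one other observation.
from collections import Counter

def tie_fraction(times):
    counts = Counter(times)
    tied = sum(c for c in counts.values() if c > 1)
    return tied / len(times)

days = [3, 5, 5, 7, 7, 7, 10, 12]   # event times in days (made up)
print(tie_fraction(days))            # 5 of 8 observations tied -> 0.625
```

A high value of such a measure would point toward the discrete models; a low value toward the approximated partial likelihood.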

Relevance: 40.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%

Abstract:

The discrete phase-space approach to the quantum mechanics of degrees of freedom without classical counterparts is applied to the many-fermion/quasi-spin Lipkin model. The Wigner function is written for some chosen states associated with discrete angle and angular momentum variables, and the time evolution is calculated numerically using the discrete von Neumann-Liouville equation. Direct evidence of a tunnelling effect is extracted from the time evolution of the Wigner function. A connection with an SU(2)-based semiclassical continuous approach to the Lipkin model is also presented.
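To make the notion of a discrete Wigner function concrete, here is a minimal sketch for an odd-dimensional space, using the standard odd-N construction; it does not reproduce the specific Schwinger-basis mapping of the paper:

```python
# Sketch of a discrete Wigner function on Z_N (odd N):
#   W(q,p) = (1/N) * sum_u rho[(q+u)%N][(q-u)%N] * exp(-4*pi*i*p*u/N)
# Summing W over p recovers the diagonal rho[q][q]; summing over all (q,p)
# recovers tr(rho) = 1.
import cmath

def discrete_wigner(rho, N):
    W = [[0.0] * N for _ in range(N)]
    for q in range(N):
        for p in range(N):
            s = 0j
            for u in range(N):
                s += (rho[(q + u) % N][(q - u) % N]
                      * cmath.exp(-4j * cmath.pi * p * u / N))
            W[q][p] = (s / N).real
    return W

N = 3
# Pure state |0><0| as a density matrix.
rho = [[1.0 if i == j == 0 else 0.0 for j in range(N)] for i in range(N)]
W = discrete_wigner(rho, N)
print(round(sum(sum(row) for row in W), 6))   # normalization: 1.0
```

For this state W is 1/N along the q = 0 column and zero elsewhere, so the position marginal correctly concentrates on q = 0.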

Relevance: 40.00%

Abstract:

In this work we compare estimates of the parameters of ARCH models obtained with a complete Bayesian method, which adopts a non-informative prior distribution, and with an empirical Bayesian method, which adopts an informative prior distribution. We also consider a reparameterization of these models that maps the parameter space into real space, which permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated on the Telebras series from the Brazilian financial market. The results show that both methods can fit ARCH models with different numbers of parameters; the empirical Bayesian method provided a more parsimonious model and a better fit than the complete Bayesian method.
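The reparameterization idea can be sketched for an ARCH(1) model, where the constraints alpha0 > 0 and 0 <= alpha1 < 1 suggest a log and a logit transform; the specific transforms and names below are illustrative assumptions, not the paper's exact choice:

```python
import math, random

# ARCH(1): sigma_t^2 = alpha0 + alpha1 * eps_{t-1}^2, with alpha0 > 0 and
# 0 <= alpha1 < 1.  Mapping (alpha0, alpha1) to the real line allows normal
# priors on the transformed parameters (illustrative transforms).

def to_real(alpha0, alpha1):
    return math.log(alpha0), math.log(alpha1 / (1.0 - alpha1))

def to_constrained(phi0, phi1):
    return math.exp(phi0), 1.0 / (1.0 + math.exp(-phi1))

def simulate_arch1(alpha0, alpha1, n, seed=0):
    rng = random.Random(seed)
    eps, out = 0.0, []
    for _ in range(n):
        sigma2 = alpha0 + alpha1 * eps * eps
        eps = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        out.append(eps)
    return out

phi = to_real(0.1, 0.4)
a0, a1 = to_constrained(*phi)
print(round(a0, 6), round(a1, 6))   # round-trips back to 0.1 0.4
```

An MCMC sampler would then propose moves in the unconstrained (phi0, phi1) space and transform back when evaluating the ARCH likelihood.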

Relevance: 40.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 40.00%

Abstract:

The von Neumann-Liouville time evolution equation is represented in a discrete quantum phase space. The mapped Liouville operator and the corresponding Wigner function are written explicitly for the problem of a magnetic moment interacting with a magnetic field, and the precessing solution is found. The propagator is also discussed, and a time-interval operator, associated with a unitary operator that shifts the energy levels in the Zeeman spectrum, is introduced. This operator is associated with the particular dynamical process and is not the continuous parameter describing the time evolution. The pair of unitary operators that shift time and energy is shown to obey the Weyl-Schwinger algebra. (C) 1999 Elsevier B.V. All rights reserved.

Relevance: 40.00%

Abstract:

Using the flexibility and constructive definition of the Schwinger bases, we developed different mapping procedures to enhance different aspects of the dynamics and of the symmetries of an extended version of the two-level Lipkin model. The classical limits of the dynamics are discussed in connection with the different mappings. Discrete Wigner functions are also calculated. © 1995.

Relevance: 40.00%

Abstract:

The dynamics of a discrete soliton in an array of Bose-Einstein condensates under the action of a periodically time-modulated atomic scattering length [Feshbach-resonance management (FRM)] is investigated. Both slow and rapid modulations, in comparison with the tunneling frequency, are considered. For rapid modulation, an averaged equation is derived: a generalized discrete nonlinear Schrödinger equation that includes higher-order effective nonlinearities and intersite nonlinear interactions. Direct simulations demonstrate that modulation of sufficient strength results in splitting of the soliton.
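A toy version of the setting can be integrated directly: a discrete nonlinear Schrödinger (DNLS) lattice whose nonlinearity coefficient is modulated in time, standing in for FRM. Coefficients, modulation form, and initial profile below are illustrative, not the paper's:

```python
import cmath, math

# Toy DNLS with time-modulated nonlinearity g(t) = g0 + g1*cos(omega*t):
#   i dpsi_n/dt = -C (psi_{n+1} + psi_{n-1}) - g(t) |psi_n|^2 psi_n
# integrated with classical RK4 on a periodic lattice.

def rhs(psi, t, C, g0, g1, omega):
    g = g0 + g1 * math.cos(omega * t)
    N = len(psi)
    return [1j * (C * (psi[(n + 1) % N] + psi[(n - 1) % N])
                  + g * abs(psi[n]) ** 2 * psi[n]) for n in range(N)]

def rk4_step(psi, t, dt, *args):
    k1 = rhs(psi, t, *args)
    k2 = rhs([p + 0.5 * dt * k for p, k in zip(psi, k1)], t + 0.5 * dt, *args)
    k3 = rhs([p + 0.5 * dt * k for p, k in zip(psi, k2)], t + 0.5 * dt, *args)
    k4 = rhs([p + dt * k for p, k in zip(psi, k3)], t + dt, *args)
    return [p + dt / 6 * (a + 2 * b + 2 * c + d)
            for p, a, b, c, d in zip(psi, k1, k2, k3, k4)]

N, dt = 21, 0.01
psi = [1.0 / math.cosh(n - N // 2) + 0j for n in range(N)]   # sech-shaped pulse
t = 0.0
for _ in range(500):
    psi = rk4_step(psi, t, dt, 1.0, 1.0, 0.5, 10.0)
    t += dt
norm = sum(abs(p) ** 2 for p in psi)
print(round(norm, 3))   # the DNLS flow conserves the norm (up to RK4 error)
```

Observing the splitting reported in the abstract would require stronger modulation and tracking the pulse profile over a longer run; the sketch only shows the machinery.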

Relevance: 40.00%

Abstract:

As the number of simulation experiments increases, the validation and verification of these models demand special attention from simulation practitioners. A review of the current scientific literature shows that the operational-validation descriptions presented in many papers disagree both on the importance assigned to this process and on the techniques applied, whether subjective or objective. To orient professionals, researchers and students in simulation, this article compiles statistical techniques for the operational validation of discrete simulation models into a practical guide. The guide's applicability was then evaluated on two study objects representing two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application the guide identified distinct steps, owing to the different aspects that characterize the analyzed distributions. © 2011 Brazilian Operations Research Society.
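One objective technique such a guide would typically include is a two-sample comparison of real-system output against model output. The sketch below uses the two-sample Kolmogorov-Smirnov statistic with its standard large-sample 5% critical value; the data are invented cycle times, not the paper's case studies:

```python
import math

# Two-sample Kolmogorov-Smirnov statistic: the largest gap between the
# empirical CDFs of the real and simulated outputs.
def ks_two_sample(a, b):
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in sorted(set(a) | set(b)):
        fa = sum(1 for v in a if v <= x) / len(a)
        fb = sum(1 for v in b if v <= x) / len(b)
        d = max(d, abs(fa - fb))
    return d

real = [10.2, 11.5, 9.8, 10.9, 11.1, 10.4]   # observed cycle times (made up)
sim  = [10.0, 11.3, 10.1, 10.8, 11.4, 10.5]  # simulated cycle times (made up)
d = ks_two_sample(real, sim)
n, m = len(real), len(sim)
crit = 1.36 * math.sqrt((n + m) / (n * m))   # ~5% large-sample critical value
print(d <= crit)   # True: no evidence the model's output distribution differs
```

Failing to reject here is only one piece of evidence; the point of a practical guide is to combine several such tests across the relevant output measures.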

Relevance: 40.00%

Abstract:

This paper describes a program for the automatic generation of code for Intel's 8051 microcontroller. The code is generated from a place-transition Petri net specification, with the goal of minimizing programming time. The code generated by our program has been observed to match the net model exactly, and no changes to the generated code are needed for compilation to the target architecture. © 2011 IFAC.
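The execution semantics that generated code must reproduce on the target is simple: a transition is enabled when its input places hold enough tokens, and firing it moves tokens from input to output places. A minimal interpreter, with an illustrative one-transition net:

```python
# Minimal place-transition Petri net semantics (illustrative net and names).

def enabled(marking, pre):
    """A transition is enabled if every input place has enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Firing consumes tokens from input places, produces them in outputs."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Two places and one transition t moving a token from p0 to p1.
marking = {"p0": 1, "p1": 0}
t_pre, t_post = {"p0": 1}, {"p1": 1}
if enabled(marking, t_pre):
    marking = fire(marking, t_pre, t_post)
print(marking)   # {'p0': 0, 'p1': 1}
```

A code generator like the one described would emit the 8051 equivalent of this enable-test-and-fire loop for each transition of the specified net.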

Relevance: 40.00%

Abstract:

This paper presents the application of a time-domain tool for three-phase harmonic propagation analysis that uses the Norton model to represent non-linear loads, making the harmonic current flows more faithful for analyzing system operation and the influence of mitigation elements. The software makes it possible to obtain results closer to the real distribution network by considering voltage and current unbalances and the application of mitigation elements for harmonic distortions. A real case study is presented, with network data and the equipment connected to the network, together with the modeling of non-linear loads based on real data obtained from Points of Common Coupling (PCCs) of interest to a distribution company.
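The Norton representation of a non-linear load at a given harmonic order reduces to a current source in parallel with an admittance, so the bus voltage through the network impedance Z solves V = Z(I_N - Y_N V), i.e. V = Z I_N / (1 + Z Y_N). A single-phase, single-harmonic sketch with made-up phasor values:

```python
# Norton model of a non-linear load at one harmonic order (illustrative
# single-phase values):  V = Z * (I_N - Y_N * V)  =>  V = Z*I_N / (1 + Z*Y_N)

def norton_bus_voltage(Z, I_N, Y_N):
    return Z * I_N / (1.0 + Z * Y_N)

Z = complex(0.5, 2.0)      # network harmonic impedance (ohms), made up
I_N = complex(10.0, 0.0)   # Norton source current (A), made up
Y_N = complex(0.1, -0.05)  # Norton admittance (S), made up
V = norton_bus_voltage(Z, I_N, Y_N)
I_load = I_N - Y_N * V     # harmonic current actually injected by the load
print(round(abs(V), 3), round(abs(I_load), 3))
```

A three-phase tool repeats this per phase and per harmonic order with matrix-valued Z and Y_N, which is what captures the unbalance effects the abstract mentions.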

Relevance: 40.00%

Abstract:

The aim of this paper is to compare 18 reference evapotranspiration models with the standard Penman-Monteith model in the Jaboticabal, São Paulo, region on four time scales: daily, 5-day, 15-day and seasonal. Five years of daily meteorological data were used for the analyses of accuracy (mean absolute percentage error, MAPE), precision (R²) and tendency (bias; systematic error, SE). The results were also compared at the 95% probability level with Tukey's test. The Priestley-Taylor (1972) method was the most accurate for all time scales, the Tanner-Pelton (1960) method was the most accurate in the winter, and the Thornthwaite (1948) method was the most accurate among the methods that use only temperature data.
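The accuracy metric in such comparisons, MAPE, is just the mean of the absolute relative deviations from the Penman-Monteith reference. A minimal sketch with invented ET0 values (not the Jaboticabal data):

```python
# Mean absolute percentage error of a candidate ET0 model against
# Penman-Monteith reference values (illustrative numbers).

def mape(reference, model):
    return 100.0 / len(reference) * sum(
        abs((r - m) / r) for r, m in zip(reference, model))

pm   = [4.1, 3.8, 5.0, 4.6]   # Penman-Monteith ET0, mm/day (made up)
cand = [4.3, 3.6, 5.2, 4.4]   # e.g. a Priestley-Taylor-style estimate
print(round(mape(pm, cand), 2))   # -> 4.62
```

The 5-day, 15-day and seasonal comparisons would apply the same metric after aggregating the daily ET0 values onto each time scale.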