989 results for Stochastic modelling
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite difference type approximation scheme is used on a coarse grid of low discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
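A minimal sketch of the interpolation step described above, assuming a Sobol low-discrepancy grid and a polynomial regression surrogate for the value function at intermediate points (the dimension, grid size, and toy value function are illustrative, not taken from the paper):

```python
# Sketch: regression of a value function sampled on a low-discrepancy grid.
# Assumes scipy >= 1.7 (scipy.stats.qmc) and scikit-learn are available.
import numpy as np
from scipy.stats import qmc
from sklearn.linear_model import Ridge
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

dim = 10                      # problem dimension (illustrative)
n_grid = 1024                 # coarse low-discrepancy grid size

# Coarse grid of Sobol points in [0, 1]^dim
grid = qmc.Sobol(d=dim, scramble=True).random(n_grid)

# Placeholder for the values produced by the finite-difference scheme
# (a toy function stands in for the computed value function).
values = np.sum(grid**2, axis=1)

# Regression used to evaluate the value function at intermediate points
model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3))
model.fit(grid, values)

# Query the fitted surrogate at arbitrary intermediate states
query = np.random.rand(5, dim)
print(model.predict(query))
```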
Abstract:
The Drivers Scheduling Problem (DSP) consists of selecting a set of duties for vehicle drivers, for example bus, train, plane or boat drivers or pilots, for the transportation of passengers or goods. This is a complex problem because it involves several constraints related to labour and company rules and can also present different evaluation criteria and objectives. Being able to develop an adequate model for this problem, one that represents the real problem as closely as possible, is an important research area. The main objective of this research work is to present new mathematical models for the DSP that represent all the complexity of the drivers scheduling problem, and also to demonstrate that the solutions of these models can be easily implemented in real situations. This issue has been recognized by several authors as an important problem in public transportation. The most well-known and general formulation for the DSP is a Set Partitioning/Set Covering model (SPP/SCP). However, to a large extent these models simplify some of the specific business aspects and issues of real problems. This makes it difficult to use these models in automatic planning systems, because the schedules obtained must be modified manually before they can be implemented in real situations. Based on extensive passenger transportation experience with bus companies in Portugal, we propose new alternative models to formulate the DSP. These models are also based on Set Partitioning/Covering models; however, they take into account the bus operator's issues and the perspectives, opinions and environment of the user. We follow the steps of the Operations Research methodology, which consist of: Identify the Problem; Understand the System; Formulate a Mathematical Model; Verify the Model; Select the Best Alternative; Present the Results of the Analysis; and Implement and Evaluate. All the processes are carried out with the close participation and involvement of the final users from different transportation companies. The planners' opinions and main criticisms are used to improve the proposed model in a continuous enrichment process. The final objective is to have a model that can be incorporated into an information system and used as an automatic tool to produce driver schedules. Therefore, the criterion for evaluating the models is their capacity to generate real and useful schedules that can be implemented without many manual adjustments or modifications. We have considered the following as measures of the quality of the model: simplicity, solution quality and applicability. We tested the alternative models on a set of real data obtained from several different transportation companies and analyzed the optimal schedules obtained with respect to the applicability of the solution to the real situation. To do this, the schedules were analyzed by the planners to determine their quality and applicability. The main result of this work is the proposal of new mathematical models for the DSP that better represent the realities of passenger transportation operators and lead to better schedules that can be implemented directly in real situations.
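For reference, a minimal sketch of the set partitioning/set covering formulation mentioned above (generic textbook notation, not the thesis' exact model): with binary variable $x_j$ selecting duty $j$ of cost $c_j$, and $a_{ij} = 1$ if duty $j$ covers piece of work $i$,

```latex
\min_{x \in \{0,1\}^{|J|}} \sum_{j \in J} c_j x_j
\quad \text{s.t.} \quad
\sum_{j \in J} a_{ij} x_j = 1 \ \ \forall i \in I \ (\text{SPP})
\qquad \text{or} \qquad
\sum_{j \in J} a_{ij} x_j \ge 1 \ \ \forall i \in I \ (\text{SCP}).
```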
Abstract:
This paper evaluates new evidence on price setting practices and inflation persistence in the euro area with respect to its implications for macro modelling. It argues that several of the most commonly used assumptions in micro-founded macro models are seriously challenged by the new findings.
Abstract:
The principal aim of this paper is to estimate a stochastic frontier cost function and an inefficiency effects model in the analysis of the primary health care services purchased by the public authority and supplied by 180 providers in 1996 in Catalonia. The evidence from our sample does not support the premise that contracting out has helped improve purchasing cost efficiency in primary care. Inefficient purchasing cost was observed in the component of this purchasing cost explicitly included in the contract between purchaser and provider. There are no observable incentives for the contracted-out primary health care teams to minimise prescription costs, which are not explicitly included in the present contracting system.
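A generic sketch of the kind of composed-error specification referred to above (the exact functional form used in the paper is not given in the abstract): for provider $i$, with observed cost $C_i$, outputs $y_i$ and input prices $w_i$,

```latex
\ln C_i = f(y_i, w_i; \beta) + v_i + u_i,
\qquad v_i \sim N(0, \sigma_v^2), \qquad u_i \ge 0,
```

where $v_i$ is random noise and the one-sided term $u_i$ captures cost inefficiency, possibly modelled as a function of explanatory variables in the inefficiency effects equation.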
Abstract:
The need for integration in supply chain management leads us to consider the coordination of two logistic planning functions: transportation and inventory. The coordination of these activities can be an extremely important source of competitive advantage in supply chain management. The battle for cost reduction can pass through the equilibrium of transportation versus inventory managing costs. In this work, we study the specific case of an inventory-routing problem over a one-week planning period with different types of demand. A heuristic methodology, based on Iterated Local Search, is proposed to solve the Multi-Period Inventory Routing Problem with stochastic and deterministic demand.
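A minimal sketch of the Iterated Local Search skeleton underlying the proposed heuristic (the cost, local search and perturbation routines are placeholders, not the problem-specific operators from the paper):

```python
# Sketch of an Iterated Local Search (ILS) loop; the problem-specific pieces
# (cost, local_search, perturb, initial_solution) are placeholders.
import random

def iterated_local_search(initial_solution, cost, local_search, perturb,
                          max_iters=1000, seed=0):
    random.seed(seed)
    best = local_search(initial_solution)        # improve the starting solution
    best_cost = cost(best)
    current = best
    for _ in range(max_iters):
        candidate = local_search(perturb(current))   # kick, then re-optimise
        candidate_cost = cost(candidate)
        if candidate_cost < best_cost:               # keep the incumbent
            best, best_cost = candidate, candidate_cost
        # simple acceptance criterion: always restart from the best found
        current = best
    return best, best_cost
```

The "restart from the best solution" acceptance rule shown here is only one common choice; ILS variants also accept worse candidates to diversify the search.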
Abstract:
In this paper, generalizing results in Alòs, León and Vives (2007b), we see that the dependence of jumps in the volatility under a jump-diffusion stochastic volatility model has no effect on the short-time behaviour of the at-the-money implied volatility skew, although the corresponding Hull and White formula depends on the jumps. Towards this end, we use Malliavin calculus techniques for Lévy processes based on Løkka (2004), Petrou (2006), and Solé, Utzet and Vives (2007).
Abstract:
In this paper we use Malliavin calculus techniques to obtain an expression for the short-time behavior of the at-the-money implied volatility skew for a generalization of the Bates model, in which the volatility need be neither a diffusion nor a Markov process, as the examples in Section 7 show. This expression depends on the derivative of the volatility in the sense of Malliavin calculus.
Abstract:
The evolution of boundedly rational rules for playing normal form games is studied within stationary environments of stochastically changing games. Rules are viewed as algorithms prescribing strategies for the different normal form games that arise. It is shown that many of the folk results of evolutionary game theory typically obtained with a fixed game and fixed strategies carry over to the present case. The results are also related to recent experiments on rules and games.
Abstract:
Youth is one of the phases in the life cycle when some of the most decisive life transitions take place. Entering the labour market or leaving the parental home are events with important consequences for the economic well-being of young adults. In this paper, the interrelationship between employment, residential emancipation and poverty dynamics is studied for eight European countries by means of an econometric model with feedback effects. Results show that genuine state dependence in youth poverty is positive and highly significant. Evidence shows there is a strong causal effect between poverty and leaving home in Scandinavian countries; however, time in economic hardship does not last long. In Southern Europe, instead, youth tend to leave their parental home much later in order to avoid falling into a poverty state that is more persistent. Past poverty has negative consequences on the likelihood of employment.
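A generic sketch of the kind of dynamic specification that captures "genuine state dependence" with feedback effects (an illustrative reduced form, not the exact system estimated in the paper): for individual $i$ in year $t$, with poverty indicator $p_{it}$, employment $e_{it}$ and residential emancipation $h_{it}$,

```latex
p_{it} = \mathbf{1}\!\left[\gamma\, p_{i,t-1} + \delta_1 e_{i,t-1} + \delta_2 h_{i,t-1}
        + x_{it}'\beta + \alpha_i + \varepsilon_{it} > 0\right],
```

with analogous equations for $e_{it}$ and $h_{it}$; a positive $\gamma$, once the unobserved heterogeneity $\alpha_i$ is controlled for, corresponds to positive genuine state dependence.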
Abstract:
Despite the importance of supplier inducement and brand loyalty in the drug purchasing process, little empirical evidence is to be found with regard to the influence that these factors exert on patients' decisions. Under the new scenario of easier access to information, patients are becoming more demanding and even go as far as questioning their physicians' prescriptions. Furthermore, new regulation also encourages patients to adopt an active role in the decision between brand-name and generic drugs. Using a stated preference model based on a choice survey, I have found evidence of how significant physicians' prescription and pharmacists' recommendation become throughout the drug purchase process and to what extent brand loyalty influences the final decision. As far as we are aware, this paper is the first to explicitly take consumers' preferences into account rather than focusing on the behavior of health professionals.
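A generic sketch of the discrete-choice setup behind a stated preference survey of this kind (illustrative notation and attributes, not the exact specification in the paper): respondent $n$ chooses alternative $j$ (e.g. brand-name vs. generic) according to

```latex
U_{nj} = V_{nj} + \varepsilon_{nj}, \qquad
V_{nj} = \beta_1\,\mathrm{price}_{nj} + \beta_2\,\mathrm{physician}_{nj}
       + \beta_3\,\mathrm{pharmacist}_{nj} + \beta_4\,\mathrm{brand}_{nj},
\qquad
P(y_n = j) = \frac{e^{V_{nj}}}{\sum_{k} e^{V_{nk}}},
```

where the $\varepsilon_{nj}$ are i.i.d. extreme value errors, giving the conditional logit choice probabilities.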
Abstract:
Context There are no evidence syntheses available to guide clinicians on when to titrate antihypertensive medication after initiation. Objective To model the blood pressure (BP) response after initiating antihypertensive medication. Data sources electronic databases including Medline, Embase, Cochrane Register and reference lists up to December 2009. Study selection Trials that initiated antihypertensive medication as single therapy in hypertensive patients who were either drug naive or had a placebo washout from previous drugs. Data extraction Office BP measurements at a minimum of two weekly intervals for a minimum of 4 weeks. An asymptotic approach model of BP response was assumed and non-linear mixed effects modelling used to calculate model parameters. Results and conclusions Eighteen trials that recruited 4168 patients met inclusion criteria. The time to reach 50% of the maximum estimated BP lowering effect was 1 week (systolic 0.91 weeks, 95% CI 0.74 to 1.10; diastolic 0.95, 0.75 to 1.15). Models incorporating drug class as a source of variability did not improve fit of the data. Incorporating the presence of a titration schedule improved model fit for both systolic and diastolic pressure. Titration increased both the predicted maximum effect and the time taken to reach 50% of the maximum (systolic 1.2 vs 0.7 weeks; diastolic 1.4 vs 0.7 weeks). Conclusions Estimates of the maximum efficacy of antihypertensive agents can be made early after starting therapy. This knowledge will guide clinicians in deciding when a newly started antihypertensive agent is likely to be effective or not at controlling BP.
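A sketch of one common way to write the asymptotic response model described above (the exact parameterisation used in the review is not given in the abstract): the BP reduction $t$ weeks after starting therapy,

```latex
\Delta BP(t) = E_{\max}\left(1 - 2^{-t/t_{50}}\right),
```

where $E_{\max}$ is the maximum achievable BP lowering and $t_{50}$ is the time to reach 50% of that maximum, estimated at about one week for both systolic and diastolic pressure.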
Abstract:
Designing an efficient sampling strategy is of crucial importance for habitat suitability modelling. This paper compares four such strategies, namely 'random', 'regular', 'proportional-stratified' and 'equal-stratified', to investigate (1) how they affect prediction accuracy and (2) how sensitive they are to sample size. In order to compare them, a virtual species approach (Ecol. Model. 145 (2001) 111) in a real landscape, based on reliable data, was chosen. The distribution of the virtual species was sampled 300 times using each of the four strategies in four sample sizes. The sampled data were then fed into a GLM to make two types of prediction: (1) habitat suitability and (2) presence/absence. Comparing the predictions to the known distribution of the virtual species allows model accuracy to be assessed. Habitat suitability predictions were assessed by Pearson's correlation coefficient and presence/absence predictions by Cohen's K agreement coefficient. The results show the 'regular' and 'equal-stratified' sampling strategies to be the most accurate and most robust. We propose the following characteristics to improve sample design: (1) increase sample size, (2) prefer systematic to random sampling and (3) include environmental information in the design.
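A compact sketch of the evaluation loop described above, using a synthetic landscape and only the 'random' and 'regular' strategies for brevity (the virtual species, sample size and thresholds are placeholders, not the paper's data):

```python
# Sample a known ("virtual") distribution with different strategies,
# fit a binomial GLM, and score the predictions against the truth.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Toy landscape: two environmental layers on a grid, and a known suitability.
n_cells = 10_000
env = rng.normal(size=(n_cells, 2))
true_suitability = 1 / (1 + np.exp(-(1.5 * env[:, 0] - env[:, 1])))
presence = rng.binomial(1, true_suitability)

def sample_random(n):
    return rng.choice(n_cells, size=n, replace=False)

def sample_regular(n):
    return np.linspace(0, n_cells - 1, n).astype(int)   # systematic sample

for name, sampler in [("random", sample_random), ("regular", sample_regular)]:
    idx = sampler(300)
    glm = LogisticRegression().fit(env[idx], presence[idx])   # binomial GLM
    pred_suit = glm.predict_proba(env)[:, 1]
    pred_pa = (pred_suit >= 0.5).astype(int)
    r, _ = pearsonr(pred_suit, true_suitability)       # suitability accuracy
    kappa = cohen_kappa_score(presence, pred_pa)       # presence/absence accuracy
    print(f"{name:8s}  Pearson r = {r:.3f}  Cohen's kappa = {kappa:.3f}")
```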
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. The paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and describe stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
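A highly simplified sketch of the general idea: fit several stochastic SVR realizations to conditioning (well) data, then weight them in a Bayesian way by their mismatch with dynamic observations. All data below are synthetic placeholders and the semi-supervised extension of the paper is not reproduced.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Conditioning data: property values known at a few well locations (x, y).
wells_xy = rng.uniform(0, 1, size=(20, 2))
wells_val = np.sin(4 * wells_xy[:, 0]) + rng.normal(0, 0.05, size=20)

# Ensemble of stochastic realizations built by jittering the conditioning data.
realizations = []
for _ in range(50):
    noisy_val = wells_val + rng.normal(0, 0.1, size=wells_val.shape)
    realizations.append(SVR(kernel="rbf", C=10.0, epsilon=0.05)
                        .fit(wells_xy, noisy_val))

# Dynamic observation (placeholder): here the mean predicted property stands
# in for the response that a flow simulator would return per realization.
grid = rng.uniform(0, 1, size=(500, 2))
observed_response = 0.2
responses = np.array([m.predict(grid).mean() for m in realizations])

# Bayesian-style weights from a Gaussian likelihood of the data mismatch.
sigma_obs = 0.05
log_like = -0.5 * ((responses - observed_response) / sigma_obs) ** 2
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()

# Weighted uncertainty envelope of the forecast response.
order = np.argsort(responses)
cdf = np.cumsum(weights[order])
p10, p90 = np.interp([0.1, 0.9], cdf, responses[order])
print(f"P10-P90 envelope: [{p10:.3f}, {p90:.3f}]")
```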
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets to retrieve the vertical profile of the refractive index gradient in the boundary layer is also presented. In particular, their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High resolution radiosounding data are identified as the best available dataset to reproduce accurately the local propagation conditions, while lower resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Model) are able to retrieve a tendency to superrefraction but not to detect ducting conditions. Observing the ray tracing of the centre, lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
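For context, a standard way to express the propagation regimes discussed above (textbook relations, not taken from the paper): with refractivity $N = (n-1)\times 10^6$, height $h$ and Earth radius $R_e$, the modified refractivity is

```latex
M(h) = N(h) + \frac{h}{R_e}\times 10^6 \;\approx\; N(h) + 0.157\,h \quad (h \text{ in m}),
\qquad \text{ducting where } \frac{dM}{dh} < 0
\;\Longleftrightarrow\; \frac{dN}{dh} < -157\ \mathrm{N\text{-}units\,km^{-1}},
```

while less extreme negative gradients give superrefraction, bending the beam towards the ground and producing the anaprop clutter studied in the paper.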