937 results for Discrete time inventory models
Abstract:
Current developments in international climate policy require Germany to reduce its greenhouse gas emissions. The most important greenhouse gas is carbon dioxide, which is released into the atmosphere by the combustion of fossil fuels. The reduction targets can in principle be met by cutting emissions and by creating carbon sinks. Sinks here denote the biological storage of carbon in soils and forests. An important factor influencing these processes is the spatial dynamics of land use in a region. In this work, the model system HILLS is developed and used to simulate these complex interactions in the German federal state of Hesse. The aim is to use HILLS not only to analyse the current state but also to examine scenarios of future regional land-use development and its effect on the carbon budget up to 2020. To represent the spatial and temporal dynamics of land use in Hesse, the model LUCHesse is developed. Its task is to simulate the relevant processes on a 1 km² grid, with the rates of change prescribed exogenously as area trends at the level of the Hessian districts. LUCHesse consists of submodels for the processes: (A) expansion of settlement and commercial areas, (B) structural change in the agricultural sector, and (C) establishment of new forest areas (afforestation). Each submodel comprises methods for assessing the site suitability of the grid cells for different land-use classes and for allocating the prescribed trends to those grid cells best suited to a given land-use class. The submodels are validated against statistical data for the period 1990 to 2000. A simulation run produces digital maps of the land-use distribution in Hesse at discrete time steps.
To simulate carbon storage, a modified version of the ecosystem model Century is developed (GIS-Century). It allows a controlled simulation run in annual steps and supports the integration of the model as a component of the HILLS model system. Several application schemes for GIS-Century are developed with which the effects of arable land set-aside, afforestation, and the management of existing forests on carbon storage can be examined. The model and the application schemes are validated against field and literature data. HILLS implements a sequential coupling of LUCHesse with GIS-Century. The spatial coupling takes place on the 1 km² grid; the temporal coupling is achieved by introducing a land-use vector that describes the land-use change of a grid cell over the simulation period. In addition, HILLS integrates both models into a geographic information system (GIS) via a service- and database-oriented concept, so that the GIS functions for spatial data management and processing can be used. As an application of the model system, a reference scenario for Hesse with a time horizon of 2020 is computed. The scenario assumes implementation of the AGENDA 2000 policy in the agricultural sector, which leads to large-scale set-aside of arable land, while current trends in area expansion are extrapolated for settlement and commerce as well as afforestation. With HILLS it is now possible to quantify the effect of these land-use changes on biological carbon storage. While the expansion of settlement areas is identified as a carbon source (37 kt C/a), the most important sink is found in the management of existing forests (794 kt C/a).
In addition, the set-aside of arable land (26 kt C/a) and afforestation (29 kt C/a) lead to additional carbon storage. For carbon storage in soils, the simulation experiments show very clearly that this sink is only of limited duration.
Abstract:
We develop an extension to the tactical planning model (TPM) for a job shop by the third author. The TPM is a discrete-time model in which all transitions occur at the start of each time period. The time period must be defined appropriately in order for the model to be meaningful. Each period must be short enough so that a job is unlikely to travel through more than one station in one period. At the same time, the time period needs to be long enough to justify the assumptions of continuous workflow and Markovian job movements. We build an extension to the TPM that overcomes this restriction of period sizing by permitting production control over shorter time intervals. We achieve this by deriving a continuous-time linear control rule for a single station. We then determine the first two moments of the production level and queue length for the workstation.
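The period-level clearing behaviour described above can be illustrated with a toy single-station model. This is a minimal sketch, not the authors' TPM extension or their continuous-time control rule: the linear rule here simply releases a fixed fraction `alpha` of the queued work each period, and all parameter values are illustrative. The first two moments of production and queue length are then estimated empirically.

```python
import random
import statistics

def simulate_station(alpha=0.3, mean_work=5.0, periods=20000, seed=1):
    """Toy single-station clearing model: each period the station releases
    a fixed fraction `alpha` of the work in queue (a linear control rule),
    i.e. P_t = alpha * (Q_{t-1} + A_t). Returns empirical first two
    moments of production level and queue length."""
    rng = random.Random(seed)
    queue = 0.0
    prod, qlen = [], []
    for _ in range(periods):
        queue += rng.expovariate(1.0 / mean_work)  # random workload arriving this period
        p = alpha * queue                          # linear control: clear a fraction
        queue -= p
        prod.append(p)
        qlen.append(queue)
    burn = periods // 10                           # discard the initial transient
    return (statistics.mean(prod[burn:]), statistics.pstdev(prod[burn:]),
            statistics.mean(qlen[burn:]), statistics.pstdev(qlen[burn:]))
```

In steady state the mean production level equals the mean arriving work, and the mean queue settles at `mean_work * (1 - alpha) / alpha`, which the simulation reproduces.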
Abstract:
Traditional inventory models focus on risk-neutral decision makers, i.e., characterizing replenishment strategies that maximize expected total profit, or equivalently, minimize expected total cost over a planning horizon. In this paper, we propose a framework for incorporating risk aversion in multi-period inventory models as well as multi-period models that coordinate inventory and pricing strategies. In each case, we characterize the optimal policy for various measures of risk that have been commonly used in the finance literature. In particular, we show that the structure of the optimal policy for a decision maker with exponential utility functions is almost identical to the structure of the optimal risk-neutral inventory (and pricing) policies. Computational results demonstrate the importance of this approach not only to risk-averse decision makers, but also to risk-neutral decision makers with limited information on the demand distribution.
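The contrast between risk-neutral and exponential-utility objectives can be sketched with a single-period newsvendor example. This is an illustrative stand-in, not the paper's multi-period model: prices, costs, the demand distribution and the risk-aversion coefficient are all assumed values, and the optimum is found by grid search over sampled demands.

```python
import math
import random
import statistics

def newsvendor_profit(q, demand, price=10.0, cost=6.0):
    """Single-period profit: sell min(q, demand) at `price`, pay `cost` per unit ordered."""
    return price * min(q, max(demand, 0.0)) - cost * q

def best_order(utility, demands, grid):
    """Order quantity on `grid` maximizing the sample-average utility of profit."""
    return max(grid, key=lambda q: statistics.fmean(
        utility(newsvendor_profit(q, d)) for d in demands))

def compare_policies(seed=0, n=4000, risk_aversion=0.05):
    rng = random.Random(seed)
    demands = [rng.gauss(100.0, 30.0) for _ in range(n)]
    grid = range(50, 151)
    q_neutral = best_order(lambda w: w, demands, grid)           # maximize expected profit
    q_averse = best_order(lambda w: -math.exp(-risk_aversion * w),
                          demands, grid)                         # exponential utility
    return q_neutral, q_averse
```

With these assumed parameters the exponential-utility decision maker orders less than the risk-neutral one, since larger orders carry more downside risk.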
Abstract:
Piecewise linear systems arise as mathematical models in many practical applications, often from the linearization of nonlinear systems. There are two main approaches to dealing with these systems, according to their continuous-time or discrete-time aspects. We propose an approach based on state transformation, more particularly the partition of the phase portrait into different regions, where each subregion is modeled as a two-dimensional linear time-invariant system. The Takagi-Sugeno model, which is a combination of local models, is then computed. The simulation results show that the Alpha partition is well suited for dealing with such systems.
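The blending step of a Takagi-Sugeno model can be sketched for a scalar state. This is a generic illustration of T-S inference, not the paper's Alpha partition: the two regions, their membership functions and the local model coefficients below are invented for the example.

```python
def ts_output(x, local_models, memberships):
    """Takagi-Sugeno inference: blend local linear models a*x + b with
    normalized membership weights from each region of the partition."""
    w = [mu(x) for mu in memberships]
    total = sum(w)
    if total == 0.0:
        raise ValueError("x lies outside every region")
    return sum((wi / total) * (a * x + b)
               for wi, (a, b) in zip(w, local_models))

# two illustrative regions with a linear overlap around x = 0
memberships = [lambda x: max(0.0, min(1.0, 1.0 - x)),   # 'left' region
               lambda x: max(0.0, min(1.0, 1.0 + x))]   # 'right' region
local_models = [(2.0, 1.0),    # local dynamics fitted in the left region
                (-1.0, 1.0)]   # local dynamics fitted in the right region
```

Deep inside a region the output reduces to that region's local model; in the overlap it interpolates smoothly between the two.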
Determinants of university dropout in the Faculty of Economics of the Universidad del Rosario
Abstract:
This paper analyses the problem of student dropout in the Faculty of Economics of the Universidad del Rosario, through a study of the individual, academic and socioeconomic factors that determine the risk of dropping out. To this end, duration-model analysis is used. Specifically, a discrete-time proportional hazard model is estimated with and without observed heterogeneity (Prentice-Gloeckler, 1978 and Meyer, 1980). The results show that male students, students linked to the labour market, and students coming from other regions face the highest dropout risk. In addition, the student's age increases the risk; however, its effect decreases marginally as age rises. Keywords: student dropout, duration models, proportional hazard. JEL classification: C41, C13, I21.
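The discrete-time proportional hazard specification cited above (Prentice-Gloeckler) links each interval's dropout hazard to covariates through a complementary log-log function. The sketch below shows only that functional form and the resulting survival curve; the coefficient and baseline values are invented for illustration, not estimates from the paper.

```python
import math

def hazard_cloglog(xb, gamma_t):
    """Discrete-time proportional-hazard (complementary log-log) hazard for
    interval t: h_t(x) = 1 - exp(-exp(x'beta + gamma_t))."""
    return 1.0 - math.exp(-math.exp(xb + gamma_t))

def survival_curve(xb, gammas):
    """Probability of surviving (not dropping out) through each interval,
    given the linear predictor x'beta and baseline terms gamma_t."""
    s, out = 1.0, []
    for g in gammas:
        s *= 1.0 - hazard_cloglog(xb, g)
        out.append(s)
    return out
```

A positive coefficient (here a linear predictor of 0.5 versus 0.0) shifts the whole hazard up proportionally, so the riskier profile's survival curve lies uniformly below the baseline.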
Abstract:
This paper focuses on the analysis of the shares of Ecopetrol, a company representative of the oil and natural gas extraction market in Colombia (SP&G), during the period from 22 May 2012 to 30 August 2013. During this time the share price underwent a series of variations related to the company's new share issue. This change in the asset's behaviour raised a series of questions about (i) the market's reaction to different events occurring within firms and in their environment, and (ii) the ability of financial models to predict and understand the observed reactions of assets (understood as debt). The paper also discusses its relevance in line with the objectives and work of the School of Management of the Universidad del Rosario, specifically on topics of organisational endurance within the Management line, where understanding debt as part of a firm's current operation and as a determining variable for its future behaviour is especially important. Once the relationship between this work and the University is clarified, various financial concepts and theories that have made it possible to study the market in greater detail, with the aim of reducing investment risks, are developed. This analysis proceeds in two parts: (i) discrete-time models and (ii) continuous-time models.
Once the models studied so far are clearer, the data are analysed using chaos models and recurrence analysis, which show that the shares behave chaotically while still exhibiting certain relationships between current and historical prices, with defined patterns linking prices, quantities, the macroeconomic environment and the organisation. In addition, the oil market in Colombia is described and Ecopetrol is studied as a company and as the main axis of that market in the country. Ecopetrol is representative because it is one of the country's largest fiscal contributors, since its income derives from assets in the subsoil, so that oil rents include taxes on production, transformation and consumption (Ecopetrol, 2003). Finally, the results of the work are presented, together with the analysis supporting a number of recommendations based on what was observed.
Abstract:
The integration of processes at different scales is a key problem in the modelling of cell populations. Owing to increased computational resources and the accumulation of data at the cellular and subcellular scales, the use of discrete, cell-level models, which are typically solved using numerical simulations, has become prominent. One of the merits of this approach is that important biological factors, such as cell heterogeneity and noise, can be easily incorporated. However, it can be difficult to efficiently draw generalizations from the simulation results, as, often, many simulation runs are required to investigate model behaviour in typically large parameter spaces. In some cases, discrete cell-level models can be coarse-grained, yielding continuum models whose analysis can lead to the development of insight into the underlying simulations. In this paper we apply such an approach to the case of a discrete model of cell dynamics in the intestinal crypt. An analysis of the resulting continuum model demonstrates that there is a limited region of parameter space within which steady-state (and hence biologically realistic) solutions exist. Continuum model predictions show good agreement with corresponding results from the underlying simulations and experimental data taken from murine intestinal crypts.
Abstract:
The performance of various statistical models and commonly used financial indicators for forecasting securitised real estate returns are examined for five European countries: the UK, Belgium, the Netherlands, France and Italy. Within a VAR framework, it is demonstrated that the gilt-equity yield ratio is in most cases a better predictor of securitized returns than the term structure or the dividend yield. In particular, investors should consider in their real estate return models the predictability of the gilt-equity yield ratio in Belgium, the Netherlands and France, and the term structure of interest rates in France. Predictions obtained from the VAR and univariate time-series models are compared with the predictions of an artificial neural network model. It is found that, whilst no single model is universally superior across all series, accuracy measures and horizons considered, the neural network model is generally able to offer the most accurate predictions for 1-month horizons. For quarterly and half-yearly forecasts, the random walk with a drift is the most successful for the UK, Belgian and Dutch returns and the neural network for French and Italian returns. Although this study underscores market context and forecast horizon as parameters relevant to the choice of the forecast model, it strongly indicates that analysts should exploit the potential of neural networks and assess more fully their forecast performance against more traditional models.
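The random walk with drift named above as the best quarterly and half-yearly forecaster is simple to state. The sketch below is the textbook form of that benchmark, not code from the study: the h-step forecast is the last observation plus h times the average one-period change.

```python
def rw_drift_forecast(series, horizon):
    """h-step-ahead forecast from a random walk with drift:
    y_hat(T + h) = y_T + h * mean one-period change over the sample."""
    drift = (series[-1] - series[0]) / (len(series) - 1)
    return series[-1] + horizon * drift
```

On a perfectly linear series the forecast simply extends the line, and with zero drift it collapses to the naive "no change" forecast.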
Abstract:
We propose and analyse a class of evolving network models suitable for describing a dynamic topological structure. Applications include telecommunication, on-line social behaviour and information processing in neuroscience. We model the evolving network as a discrete time Markov chain, and study a very general framework where, conditioned on the current state, edges appear or disappear independently at the next timestep. We show how to exploit symmetries in the microscopic, localized rules in order to obtain conjugate classes of random graphs that simplify analysis and calibration of a model. Further, we develop a mean field theory for describing network evolution. For a simple but realistic scenario incorporating the triadic closure effect that has been empirically observed by social scientists (friends of friends tend to become friends), the mean field theory predicts bistable dynamics, and computational results confirm this prediction. We also discuss the calibration issue for a set of real cell phone data, and find support for a stratified model, where individuals are assigned to one of two distinct groups having different within-group and across-group dynamics.
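The Markov-chain framework above, with conditionally independent edge updates and a triadic-closure boost, can be sketched directly. The transition probabilities and network size below are invented for illustration, and only the qualitative rule (friends of friends are more likely to link) is taken from the abstract.

```python
import random

def evolve_network(n=20, steps=50, p_base=0.02, p_tri=0.2, p_del=0.1, seed=3):
    """Discrete-time Markov chain on graphs: conditioned on the current
    state, each absent edge appears with probability p_tri if its endpoints
    share a neighbour (triadic closure) and p_base otherwise, while each
    present edge disappears with probability p_del. Given the state, all
    edge updates are independent."""
    rng = random.Random(seed)
    edges = set()
    for _ in range(steps):
        nxt = set()
        for i in range(n):
            for j in range(i + 1, n):
                if (i, j) in edges:
                    if rng.random() > p_del:        # edge survives this step
                        nxt.add((i, j))
                else:
                    has_common = any(
                        (min(i, k), max(i, k)) in edges and
                        (min(j, k), max(j, k)) in edges
                        for k in range(n) if k not in (i, j))
                    if rng.random() < (p_tri if has_common else p_base):
                        nxt.add((i, j))             # edge appears
        edges = nxt
    return edges
```

Because triadic closure makes dense neighbourhoods self-reinforcing, raising `p_tri` relative to `p_base` is what produces the bistable (sparse versus clustered) behaviour the mean field theory predicts.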
Abstract:
We consider the problem of discrete time filtering (intermittent data assimilation) for differential equation models and discuss methods for its numerical approximation. The focus is on methods based on ensemble/particle techniques and on the ensemble Kalman filter technique in particular. We summarize as well as extend recent work on continuous ensemble Kalman filter formulations, which provide a concise dynamical systems formulation of the combined dynamics-assimilation problem. Possible extensions to fully nonlinear ensemble/particle based filters are also outlined using the framework of optimal transportation theory.
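The intermittent assimilation cycle (forecast, then analysis) can be sketched for the simplest case of a scalar state observed directly. This is a minimal perturbed-observation ensemble Kalman filter, not the continuous formulations discussed in the paper; the forecast model, noise levels and member count are all illustrative.

```python
import random
import statistics

def enkf_analysis(ensemble, y_obs, obs_var, rng):
    """Perturbed-observation EnKF analysis for a scalar state with H = 1:
    x_a = x_f + K * (y + eta - x_f), with the gain K built from the
    ensemble forecast variance."""
    var_f = statistics.pvariance(ensemble)
    gain = var_f / (var_f + obs_var)
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

def run_demo(truth=3.0, members=200, cycles=30, obs_var=0.04, seed=11):
    """Assimilate noisy observations of a fixed truth, starting from a
    poor initial ensemble; returns the final ensemble mean."""
    rng = random.Random(seed)
    ens = [rng.gauss(0.0, 1.0) for _ in range(members)]
    for _ in range(cycles):
        ens = [x + rng.gauss(0.0, 0.05) for x in ens]   # forecast step + model noise
        y = truth + rng.gauss(0.0, obs_var ** 0.5)      # noisy observation
        ens = enkf_analysis(ens, y, obs_var, rng)
    return statistics.mean(ens)
```

Repeating the forecast/analysis cycle pulls the ensemble mean from its poor initial guess onto the truth, with the ensemble spread supplying the gain at each step.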
Abstract:
The authors model retail rents in the United Kingdom with use of vector-autoregressive and time-series models. Two retail rent series are used, compiled by LaSalle Investment Management and CB Hillier Parker, and the emphasis is on forecasting. The results suggest that the vector-autoregression and time-series models used in this paper can pick up important features of the data that are useful for forecasting purposes. The relative forecasting performance of the models appears to depend on the length of the forecast horizon. The results also show that the variables appropriate for inclusion in the vector-autoregression systems differ between the two rent series, suggesting that the structure of optimal models for predicting retail rents may be specific to the rent index used. Ex ante forecasts from our time-series models suggest that both the LaSalle Investment Management and CB Hillier Parker real retail rent series will exhibit an annual growth rate above their long-term mean.
Abstract:
The paper discusses how variations in the pattern of convective plasma flows should be included in self-consistent time-dependent models of the coupled ionosphere-thermosphere system. The author shows how these variations depend upon the mechanism by which the solar wind flow excites the convection. The modelling of these effects is not just of relevance to the polar ionosphere. This is because the influence of convection is not confined to high latitudes: the resultant heating and composition changes in the thermosphere are communicated to lower latitudes by the winds, which are also greatly modified by the plasma convection. These thermospheric changes alter the global distribution of plasma by modulating the rates of the chemical reactions which are responsible for the loss of plasma. Hence the modelling of these high-latitude processes is of relevance to the design and operation of HF communication, radar and navigation systems worldwide.
Abstract:
A particle filter method is presented for the discrete-time filtering problem with nonlinear Itô stochastic ordinary differential equations (SODEs) with additive noise, assumed to be analytically integrable as a function of the underlying vector Wiener process and time. The Diffusion Kernel Filter is arrived at by a parametrization of small noise-driven state fluctuations within branches of prediction and a local use of this parametrization in the Bootstrap Filter. The method applies for small noise and short prediction steps. With explicit numerical integrators, the operations count in the Diffusion Kernel Filter is shown to be smaller than in the Bootstrap Filter whenever the initial state for the prediction step has sufficiently few moments. The established parametrization is a dual formula for the analysis of sensitivity to Gaussian initial perturbations and to noise perturbations in deterministic models, showing in particular how the stability of a deterministic dynamics is modelled by noise on short times, and how the diffusion matrix of an SODE should be modelled (i.e. defined) for a Gaussian-initial deterministic problem to be cast as an SODE problem. From this, a novel definition of prediction may be proposed that coincides with the deterministic path within the branch of prediction whose information entropy at the end of the prediction step is closest to the average information entropy over all branches. Tests are made with the Lorenz-63 equations, showing good results both for the filter and for the definition of prediction.
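The Bootstrap Filter that serves as the paper's baseline follows a predict-weight-resample cycle. The sketch below shows that generic cycle for a scalar state with identity dynamics and Gaussian observations; it is not the Diffusion Kernel Filter, and every numeric value is illustrative.

```python
import math
import random

def bootstrap_step(particles, y_obs, obs_sd, proc_sd, rng):
    """One predict-weight-resample cycle of the Bootstrap Filter for a
    scalar state: propagate each particle through a noise-driven model
    step, weight by the Gaussian observation likelihood, then resample
    with replacement."""
    pred = [x + rng.gauss(0.0, proc_sd) for x in particles]     # predict
    w = [math.exp(-0.5 * ((y_obs - x) / obs_sd) ** 2) for x in pred]
    total = sum(w)
    if total == 0.0:                      # all weights underflowed
        w, total = [1.0] * len(pred), float(len(pred))
    return rng.choices(pred, weights=[wi / total for wi in w], k=len(pred))

def run_filter(truth=2.0, n=1000, cycles=20, obs_sd=0.3, proc_sd=0.1, seed=5):
    """Track a fixed truth from a diffuse initial cloud; returns the
    final particle mean."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 2.0) for _ in range(n)]
    for _ in range(cycles):
        y = truth + rng.gauss(0.0, obs_sd)
        particles = bootstrap_step(particles, y, obs_sd, proc_sd, rng)
    return sum(particles) / len(particles)
```

Resampling at every step keeps the particle cloud concentrated where the likelihood is high, which is exactly the per-step cost the Diffusion Kernel Filter aims to reduce.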
Abstract:
We introduce in this paper a new class of discrete generalized nonlinear models that extends the binomial, Poisson and negative binomial models for count data. This class includes important special cases such as log-nonlinear models, logit, probit and negative binomial nonlinear models, and generalized Poisson and generalized negative binomial regression models, among others, enabling the fitting of a wide range of models to count data. We derive an iterative process for fitting these models by maximum likelihood and discuss inference on the parameters. The usefulness of the new class is illustrated with an application to a real data set.
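The iterative maximum-likelihood fitting of such count models can be sketched for the simplest member of the family, a log-linear Poisson regression. This is a minimal stand-in for the paper's fitting process: gradient ascent on the Poisson log-likelihood replaces whatever iterative scheme the authors derive, and all data-generating values are invented.

```python
import math
import random

def rpois(rng, lam):
    """Knuth's multiplication-based Poisson sampler (stdlib has none)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def fit_loglinear_poisson(xs, ys, iters=2000, lr=0.05):
    """Fit E[y] = exp(b0 + b1*x) by gradient ascent on the Poisson
    log-likelihood; the score equations are sum(y - mu) = 0 and
    sum((y - mu) * x) = 0."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu          # score for the intercept
            g1 += (y - mu) * x    # score for the slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1
```

Because the Poisson log-likelihood is concave in (b0, b1), the iteration converges to the maximum-likelihood estimates, which on simulated data land near the true coefficients.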
Abstract:
Internet Protocol TV (IPTV) is predicted to be a key technology winner in the future. Efforts to accelerate deployment have centred on a centralized IPTV model comprising the VHO, encoders, controller, access network and home network. Regardless of whether the network is delivering live TV, VOD or time-shift TV, all content and network traffic resulting from subscriber requests must traverse the entire network from the super-headend all the way to each subscriber's set-top box (STB). IPTV services require very stringent QoS guarantees. When IPTV traffic shares network resources with other traffic such as data and voice, ensuring its QoS while efficiently utilizing network resources is a key and challenging issue; QoS is measured here in the network-centric terms of delay jitter, packet losses and bounds on delay. The main focus of this thesis is optimized bandwidth allocation and smooth data transmission: a traffic model is proposed for smoothly delivering video services over an IPTV network, together with an evaluation of its QoS performance. Following Maglaris et al. [5], the coding bit rate of a single video source is first analysed. Various statistical quantities are derived from bit-rate data collected with a conditional-replenishment interframe coding scheme. Two correlated Markov process models (one in discrete time and one in continuous time) are shown to fit the experimental data and are used to model the input rates of several independent sources into a statistical multiplexer. A preventive control mechanism comprising connection admission control (CAC) and traffic policing is used for traffic control. The QoS of a common bandwidth scheduler (FIFO) is evaluated using fluid models with Markovian queueing methods, and the results are analysed both by simulation and analytically, measuring packet loss, overflow and mean waiting time among the network users.
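The multiplexing setup described above, independent two-state Markov sources feeding a FIFO scheduler, can be sketched as a discrete-time simulation. This is an illustrative toy in the spirit of the Maglaris et al. source model, not the thesis's fluid/Markovian analysis: all transition probabilities, rates and buffer sizes are assumed values.

```python
import random

def simulate_multiplexer(n_sources=20, p_on=0.1, p_off=0.3, rate_on=1.0,
                         capacity=12.0, buffer_max=50.0, steps=20000, seed=7):
    """Discrete-time sketch: N independent on/off Markov sources feed a
    FIFO multiplexer with fixed per-slot service capacity and a finite
    buffer. Returns the mean queue length and the fraction of slots in
    which the buffer overflows (traffic is lost)."""
    rng = random.Random(seed)
    on = [False] * n_sources
    q = 0.0
    overflow_slots = 0
    q_total = 0.0
    for _ in range(steps):
        for i in range(n_sources):              # independent Markov transitions
            if on[i]:
                on[i] = rng.random() >= p_off   # stay on unless it switches off
            else:
                on[i] = rng.random() < p_on     # switch on
        arrivals = rate_on * sum(on)
        q = max(0.0, q + arrivals - capacity)   # serve up to `capacity` per slot
        if q > buffer_max:
            overflow_slots += 1
            q = buffer_max                      # excess traffic is dropped
        q_total += q
    return q_total / steps, overflow_slots / steps
```

With the stationary on-probability p_on/(p_on + p_off) = 0.25, the mean offered load of 5 units/slot sits well below the capacity of 12, so the queue and the overflow fraction both stay small; pushing the load toward capacity makes both statistics grow sharply, which is the trade-off the bandwidth-allocation analysis quantifies.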