993 results for DEPENDENT INTER-OCCURRENCES TIMES


Relevance: 100.00%

Abstract:

We consider the problem of estimating the mean and variance of the time between occurrences of an event of interest (inter-occurrence times), where some form of dependence between two consecutive time intervals is allowed. Two basic density functions are taken into account: the Weibull and the generalised exponential density functions. In order to capture the dependence between two consecutive inter-occurrence times, we assume that the shape and/or the scale parameters of the two density functions are given by auto-regressive models. Expressions for the mean and variance of the inter-occurrence times are presented. The models are applied to ozone data from two regions of Mexico City. The estimation of the parameters is performed from a Bayesian point of view via Markov chain Monte Carlo (MCMC) methods.
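As a minimal sketch of the quantities involved, the following Python snippet computes the mean and variance of a Weibull inter-occurrence time from its shape and scale, and iterates a hypothetical AR(1) recursion on the log-scale parameter. The AR link (names `phi`, `c`) is an illustrative assumption, not the paper's exact specification:

```python
import math

def weibull_mean_var(shape, scale):
    # Mean and variance of a Weibull(shape k, scale lam) distribution:
    # E[T] = lam * Gamma(1 + 1/k)
    # Var[T] = lam^2 * (Gamma(1 + 2/k) - Gamma(1 + 1/k)^2)
    g1 = math.gamma(1.0 + 1.0 / shape)
    g2 = math.gamma(1.0 + 2.0 / shape)
    return scale * g1, scale ** 2 * (g2 - g1 ** 2)

def ar1_scale_path(lam0, phi, c, n):
    # Hypothetical AR(1) recursion on the log-scale parameter:
    # log(lam_t) = c + phi * log(lam_{t-1})
    lams = [lam0]
    for _ in range(n - 1):
        lams.append(math.exp(c + phi * math.log(lams[-1])))
    return lams
```

With shape = 1 the Weibull reduces to the exponential distribution, so the mean equals the scale and the variance equals the scale squared, which is a quick sanity check on the formulas.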

Relevance: 100.00%

Abstract:

Real-time scheduling usually considers worst-case values for the parameters of task (or message stream) sets in order to provide safe schedulability tests for hard real-time systems. However, worst-case conditions introduce a level of pessimism that is often inadequate for a certain class of (soft) real-time systems. In this paper we provide an approach for computing the stochastic response time of tasks whose inter-arrival times are described by discrete probability distribution functions, instead of minimum inter-arrival time (MIT) values.
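Stochastic analyses of this kind typically manipulate discrete distributions directly; the basic building block is convolution of two probability mass functions, which gives the distribution of a sum of independent random quantities (e.g. accumulated demand). A minimal sketch, not the paper's actual algorithm:

```python
from collections import defaultdict

def convolve_pmf(p, q):
    # Convolution of two discrete PMFs given as {value: probability} dicts;
    # the result is the PMF of the sum of the two independent random variables.
    out = defaultdict(float)
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] += px * qy
    return dict(out)
```

For example, convolving the PMF {1: 0.5, 2: 0.5} with itself yields {2: 0.25, 3: 0.5, 4: 0.25}, and probabilities still sum to one.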

Relevance: 100.00%

Abstract:

We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in the case of Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used. On the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
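The two-sample Kolmogorov-Smirnov statistic used above is simply the maximum absolute difference between the two empirical distribution functions. A self-contained sketch (library routines such as SciPy's `ks_2samp` would normally be used instead):

```python
import bisect

def ks_two_sample(a, b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    # difference between the two empirical CDFs, evaluated at the
    # pooled data points (where the step functions jump).
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in set(a) | set(b):
        fa = bisect.bisect_right(a, x) / len(a)
        fb = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d
```

Identical samples give a statistic of 0, while completely disjoint samples give 1.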

Relevance: 100.00%

Abstract:

In today’s competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need of developing efficient methods to solve complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems, and are usually associated with cleaning operations and with changing tools and shapes in machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times and delivery times; the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
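A priority dispatching rule for the first problem can be sketched as follows: at each decision point, among the released jobs, pick the one with the largest delivery time (a common heuristic for maximum lateness), then account for the sequence-dependent setup before processing. This is an illustrative rule under assumed data structures, not a rule taken from the paper:

```python
def dispatch_max_lateness(jobs, setup):
    # jobs: list of (release, processing, delivery) triples.
    # setup[(i, j)]: setup time when job j follows job i; (None, j) for the
    # first job on the machine. Returns the maximum lateness (here, the
    # latest delivery completion time) under a largest-delivery-time rule.
    t, last, pending, lmax = 0, None, list(range(len(jobs))), 0
    while pending:
        released = [j for j in pending if jobs[j][0] <= t] or \
                   [min(pending, key=lambda j: jobs[j][0])]
        j = max(released, key=lambda j: jobs[j][2])  # largest delivery time
        r, p, q = jobs[j]
        t = max(t, r) + setup[(last, j)] + p  # wait for release, set up, process
        lmax = max(lmax, t + q)               # job is delivered at t + q
        pending.remove(j)
        last = j
    return lmax
```

For two jobs (0, 2, 1) and (0, 1, 3) with unit setups between them, the rule schedules the job with delivery time 3 first and yields a maximum lateness of 5.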

Relevance: 100.00%

Abstract:

The Long Term Evolution (LTE) cellular technology is expected to extend the capacity and improve the performance of current 3G cellular networks. Among the key mechanisms in LTE responsible for traffic management is the packet scheduler, which handles the allocation of resources to active flows in both the frequency and time dimensions. This paper investigates, for various scheduling schemes, how they affect the inter-cell interference characteristics and how the interference in turn affects the user’s performance. A special focus of the analysis is the impact of flow-level dynamics resulting from random user behaviour. For this we use a hybrid analytical/simulation approach which enables fast evaluation of flow-level performance measures. Most interestingly, our findings show that the scheduling policy significantly affects the inter-cell interference pattern, but that the scheduler-specific pattern has little impact on flow-level performance.
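One policy commonly compared in such LTE studies is proportional fair scheduling, which selects the flow maximizing the ratio of its instantaneous achievable rate to its long-run average throughput. A minimal sketch of that selection step (the abstract does not name the specific policies compared, so this is only an assumed example):

```python
def pf_select(instantaneous_rates, average_rates):
    # Proportional fair selection: serve the flow with the largest ratio of
    # instantaneous achievable rate to its running average throughput.
    return max(range(len(instantaneous_rates)),
               key=lambda i: instantaneous_rates[i] / average_rates[i])
```

With instantaneous rates [10, 4] and averages [10, 1], the second flow is chosen despite its lower absolute rate, illustrating the fairness trade-off.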

Relevance: 100.00%

Abstract:

We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
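For orientation, the independent baseline that such mixing constructions generalize admits a fully explicit ruin probability. This is the textbook compound Poisson result with exponential claims, stated here as context rather than as a formula from the paper:

```latex
% Classical Cram\'er--Lundberg model: claims arrive as a Poisson process
% with rate \lambda, claim sizes are i.i.d. Exp(\beta), premium rate
% c > \lambda/\beta. With independence, the ruin probability from initial
% capital u is
\psi(u) = \frac{\lambda}{c\beta}\, e^{-\left(\beta - \lambda/c\right) u},
\qquad u \ge 0 .
```

The contribution of the paper is that comparably explicit formulas survive when dependence is introduced among claim sizes or among inter-occurrence times via mixing.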

Relevance: 100.00%

Abstract:

Presented at the seminar "Action temps réel: infrastructures et services systèmes" (Real-time action: infrastructures and system services), 10 April 2015, Brussels, Belgium.

Relevance: 100.00%

Abstract:

Bacterial cellulose produced by Gluconacetobacter xylinus was used to produce cellulose nanocrystals by sulfuric acid hydrolysis. Hydrolysis was performed with 64% sulfuric acid at 50 °C, with the hydrolysis time ranging between 5 and 90 min. The nanocrystals were observed to have size distributions that were dependent on hydrolysis time up to 10 min, after which the suspensions showed distributions closer in size. Results from thermal analysis and X-ray diffraction showed that the amorphous cellulose was removed, leaving only the crystalline portion. Self-supported films formed from the nanocrystal suspensions exhibited iridescence. The films were characterized by microscopy and specular reflectance measurements.

Relevance: 100.00%

Abstract:

The thesis deals with the analysis of some stochastic inventory models with pooling/retrial of customers. In the first model we analyze an (s,S) production inventory system with retrial of customers. Arrivals of customers from outside the system form a Poisson process. The inter-production times are exponentially distributed with parameter µ. When the inventory level reaches zero, further arriving demands are sent to an orbit of capacity M (< ∞). Customers who find the orbit full while the inventory level is zero are lost to the system. The inter-demand times of the orbital customers are exponentially distributed with parameter γ. In Model II we extend these results to a perishable inventory system, assuming that the lifetime of each item is exponentially distributed with parameter θ. The study then deals with an (s,S) production inventory with service times and retrial of unsatisfied customers, where primary demands occur according to a Markovian Arrival Process (MAP), and with an (s,S) retrial inventory with service times in which primary demands occur according to a Batch Markovian Arrival Process (BMAP). Next, an (s,S) inventory system with service times is considered, in which primary demands occur according to a Poisson process with parameter λ. The study then concentrates on two models with postponed demands. In the first model we analyze an (s,S) inventory system with postponed demands where arrivals of demands form a Poisson process. In the second model, we extend these results to a perishable inventory system, assuming that the lifetime of each item follows an exponential distribution with parameter θ; it is also assumed that when the inventory level is zero, an arriving demand enters the pool with probability β and is lost forever with complementary probability (1 − β). Finally, we analyze an (s,S) production inventory system with switching time. A lot of work has been reported under the assumption that the switching time is negligible, but this is not the case in several real-life situations.
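The dynamics of the first model can be illustrated with a toy discrete-event simulation using competing exponential clocks: Poisson external demands, exponential production while replenishment is switched on (at level s, off at S), and orbit retrials at rate γ per orbital customer. This is only a simulation sketch of the described system, not the thesis's analytical (matrix-analytic) solution:

```python
import random

def simulate_sS_retrial(s, S, lam, mu, gamma, M, horizon, seed=0):
    # Returns the fraction of time the system spends with zero inventory.
    # lam: external demand rate; mu: production rate while producing;
    # gamma: retrial rate per orbit customer; M: orbit capacity.
    rng = random.Random(seed)
    t, inv, orbit, producing = 0.0, S, 0, False
    zero_time = 0.0
    while t < horizon:
        rates = [lam,
                 mu if producing else 0.0,
                 gamma * orbit if inv > 0 else 0.0]
        dt = rng.expovariate(sum(rates))
        if inv == 0:
            zero_time += min(dt, horizon - t)
        t += dt
        if t >= horizon:
            break
        u = rng.random() * sum(rates)
        if u < rates[0]:                     # external demand arrives
            if inv > 0:
                inv -= 1
            elif orbit < M:
                orbit += 1                   # join the orbit; else lost
        elif u < rates[0] + rates[1]:        # one item produced
            inv += 1
        else:                                # successful retrial from orbit
            inv -= 1
            orbit -= 1
        producing = inv <= s or (producing and inv < S)
    return zero_time / horizon
```

Such a simulation is useful as a cross-check on analytically computed performance measures like the stock-out probability.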

Relevance: 100.00%

Abstract:

This thesis is devoted to the study of some stochastic models in inventories. An inventory system is a facility at which items of material are stocked. In order to promote the smooth and efficient running of a business, and to provide adequate service to customers, an inventory of materials is essential for any enterprise. When uncertainty is present, inventories are used as a protection against the risk of stock-out. It is advantageous to procure an item before it is needed, at a lower marginal cost, and bulk purchasing allows price discounts to be availed. All of these contribute to the formation of inventory. Maintaining inventories is a major expenditure for any organization. For each inventory, the fundamental questions are how much new stock should be ordered and when the orders should be placed. The present study considers several models for single- and two-commodity stochastic inventory problems. The thesis discusses two models. In the first model, we examine the case in which the times elapsed between two consecutive demand points are independent and identically distributed with common distribution function F(.) with finite mean, and in which the demand magnitude depends only on the time elapsed since the previous demand epoch; the time between disasters has an exponential distribution. In Model II, the inter-arrival times of disasters have a general distribution with finite mean, and the quantity destroyed depends on the time elapsed between disasters; demands form compound Poisson processes. The thesis also deals with a linearly correlated bulk-demand two-commodity inventory problem, where each arrival demands a random number of items of each commodity C1 and C2, the maximum quantities demanded being a (< S1) and b(
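For the compound Poisson demand mentioned in Model II, the first two moments of cumulative demand over an interval follow directly from Wald-type identities, which is often all that is needed for setting reorder levels. A minimal sketch (generic result, not a formula taken from the thesis):

```python
def compound_poisson_moments(rate, t, jump_mean, jump_second_moment):
    # Cumulative demand over (0, t] is D = sum_{i=1}^{N} Y_i, where
    # N ~ Poisson(rate * t) and the Y_i are i.i.d. jump sizes. Then:
    #   E[D]   = rate * t * E[Y]
    #   Var[D] = rate * t * E[Y^2]
    mean = rate * t * jump_mean
    var = rate * t * jump_second_moment
    return mean, var
```

With unit jumps (E[Y] = E[Y^2] = 1) this reduces to an ordinary Poisson count, so mean and variance coincide, which is a convenient sanity check.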

Relevance: 100.00%

Abstract:

Pós-graduação em Engenharia Mecânica (Graduate Program in Mechanical Engineering) - FEG

Relevance: 100.00%

Abstract:

This work is motivated by biological questions concerning the behaviour of membrane potentials in neurons. A widely studied model for spiking neurons is the following. Between spikes, the membrane potential behaves like a diffusion process X given by the SDE dX_t = beta(X_t) dt + sigma(X_t) dB_t, where (B_t) denotes a standard Brownian motion. Spikes are explained as follows: as soon as the potential X crosses a certain excitation threshold S, a spike occurs; afterwards the potential is reset to a fixed value x_0. In applications it is sometimes possible to observe a diffusion process X between spikes and to estimate the coefficients beta(.) and sigma(.) of the SDE. Nevertheless, the values x_0 and S must be determined in order to fully specify the model. One way to approach this problem is to regard x_0 and S as parameters of a statistical model and to estimate them. In this work, four different cases are discussed, in which we assume that the membrane potential X between spikes is, respectively, a Brownian motion with drift, a geometric Brownian motion, an Ornstein-Uhlenbeck process, or a Cox-Ingersoll-Ross process. Moreover, we observe the times between consecutive spikes, which we regard as i.i.d. hitting times of the threshold S by X started at x_0. The first two cases are very similar, and in each the maximum likelihood estimator can be given explicitly; furthermore, using LAN theory, the optimality of these estimators is shown. In the OU and CIR cases we choose a minimum-distance method based on comparing the empirical and the true Laplace transform with respect to a Hilbert space norm. We prove that all estimators are strongly consistent and asymptotically normal.
In the last chapter we check the efficiency of the minimum-distance estimators on simulated data. Furthermore, applications to real data sets and their results are discussed in detail.
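Simulated inter-spike intervals of the kind used in the last chapter can be generated by Euler-Maruyama discretization of the diffusion and recording first-passage times of the threshold. The sketch below does this for the Ornstein-Uhlenbeck case; the parameter names and the time cap `t_max` are illustrative assumptions, and this implements only the simulation, not the thesis's estimators:

```python
import math
import random

def ou_hitting_times(x0, S, theta, mu, sigma, n, dt=1e-3, seed=0, t_max=50.0):
    # Simulate n first-passage times of the threshold S for an
    # Ornstein-Uhlenbeck process dX_t = theta * (mu - X_t) dt + sigma dB_t
    # started at x0, mimicking inter-spike intervals. Paths are truncated
    # at t_max to guarantee termination.
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        x, t = x0, 0.0
        while x < S and t < t_max:
            x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            t += dt
        times.append(t)
    return times
```

When the mean-reversion level mu lies above the threshold S, the drift pulls the path across quickly, so hitting times concentrate near the deterministic crossing time.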

Relevance: 100.00%

Abstract:

Methane is a strong greenhouse gas, and large uncertainties exist concerning the future evolution of its atmospheric abundance. Analyzing methane mixing ratios and stable isotope ratios in air trapped in polar ice sheets helps in reconstructing the evolution of its sources and sinks in the past. This is important to improve predictions of atmospheric CH4 mixing ratios in the future under the influence of a changing climate. The aim of this study is to assess whether past atmospheric δ13C(CH4) variations can be reliably reconstructed from firn air measurements. Isotope reconstructions obtained with a state-of-the-art firn model from different individual sites show unexpectedly large discrepancies and are mutually inconsistent. We show that small changes in the diffusivity profiles at individual sites lead to strong differences in the firn fractionation, which can explain a large part of these discrepancies. Using slightly modified diffusivities for some sites, and neglecting samples for which the firn fractionation signals are strongest, a combined multi-site inversion can be performed, which returns an isotope reconstruction that is consistent with the firn data. However, the isotope trends are lower than what has been concluded from Southern Hemisphere (SH) archived air samples and high-accumulation ice core data. We conclude that with the current datasets and understanding of firn air transport, a high-precision reconstruction of δ13C of CH4 from firn air samples is not possible, because reconstructed atmospheric trends over the last 50 yr of 0.3-1.5 ‰ are of the same magnitude as the inherent uncertainties of the method: the firn fractionation correction (up to ~2 ‰ at individual sites), the Kr isobaric interference (up to ~0.8 ‰, system dependent), inter-laboratory calibration offsets (~0.2 ‰) and uncertainties in past CH4 levels (~0.5 ‰).

Relevance: 100.00%

Abstract:

In-situ Fe isotope measurements have been carried out to estimate the impact of hydrothermal metamorphic overprinting on the Fe isotopic composition of Fe-Ti oxides and Fe sulfides in the different lithologies of the drilled rocks from IODP Hole 1256D (eastern equatorial Pacific; 15 Ma crust formed at the East Pacific Rise). Most igneous rocks have a very restricted range in their 56Fe/54Fe ratio. In contrast, the Fe isotope compositions of hot fluids (> 300 °C) from mid-ocean-ridge spreading centers define a narrow range that is shifted to lower delta 56Fe values by 0.2-0.5 per mil compared to igneous rocks. Therefore, it is expected that mineral phases containing large amounts of Fe are especially affected by interaction with a fluid that fractionates Fe isotopes during exsolution/precipitation of those minerals. We used a femtosecond UV laser ablation system to determine mineral 56Fe/54Fe ratios of selected samples with a precision of < 0.1 per mil (2 sigma level) at the micrometer scale. We found significant variations of the delta 56Fe (IRMM-014) values in the minerals between different samples as well as within samples and mineral grains. The overall observed range of delta 56Fe (magnetite) in Hole 1256D rocks is from -0.12 to +0.64 per mil, and that of delta 56Fe (ilmenite) from -0.77 to +0.01 per mil. Pyrite in the lowermost sheeted dike section is clearly distinguishable from the other investigated lithological units, having positive delta 56Fe values between +0.29 and +0.56 per mil, whereas pyrite in the other samples generally has negative delta 56Fe values from -1.10 to -0.59 per mil. One key observation is that the temperature-dependent inter-mineral fractionations of Fe isotopes between magnetite and ilmenite are systematically shifted towards higher values compared to theoretically expected values, while synthesized, well-equilibrated magnetite-ilmenite pairs are compatible with the theoretical predictions.
Theoretical considerations including beta-factors of different aqueous Fe chlorides and Rayleigh-type fractionations in the presence of a hydrous, chlorine-bearing fluid can explain this observation. The disagreement between observed and theoretical equilibrium fractionation, the fact that magnetite, in contrast to ilmenite, shows a slight downhole trend in delta 56Fe values, and the observation of small-scale heterogeneities within single mineral grains imply that a general re-equilibration of the magnetite-ilmenite pairs is overprinted by kinetic fractionation effects, caused by the interaction of magnetite/ilmenite with hydrothermal fluids penetrating the upper oceanic crust during cooling, or by incomplete re-equilibration at low temperatures. Furthermore, the observation of significant small-scale variations in the 56Fe/54Fe ratios of single minerals in this study highlights the importance of high-spatial-resolution analyses of stable isotope ratios for further investigations.

Relevance: 100.00%

Abstract:

This paper focuses on the design of railway timetables considering a variable elastic demand profile along a whole design day. Timetabling is the third stage in the classical hierarchical railway planning process. Most previous works on this topic consider uniform demand behavior over short planning intervals. In this paper, we propose a MINLP model for designing non-periodic timetables on a railway corridor where demand depends on waiting times. In the elastic-demand case, long waiting times lead to a loss of passengers, who may select an alternative transportation mode. The mode choice is modeled using two alternative methods. The first is based on a sigmoid function and can be used when no information about competing modes is available. In the second, the mode choice probability is obtained using a Logit model that explicitly considers the existence of a main alternative mode. In both cases, with the purpose of obtaining optimal departure times, minimization of the loss of passengers is used as the objective function. Finally, as an illustration, the timetabling MINLP model with both mode choice methods is applied to a real case, and computational results are shown.
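The two mode-choice mechanisms can be sketched as simple functions of waiting time and utilities. The parameter names (`wait_ref`, `slope`) and the utility inputs are illustrative assumptions; the paper's exact functional forms are embedded in its MINLP model:

```python
import math

def sigmoid_retention(wait, wait_ref, slope):
    # Hypothetical sigmoid rule: fraction of passengers retained by rail as
    # a decreasing S-curve of the waiting time, with no competitor data.
    return 1.0 / (1.0 + math.exp(slope * (wait - wait_ref)))

def logit_rail_share(v_rail, v_alt):
    # Binary Logit: probability of choosing rail given the systematic
    # utilities of rail and of the main alternative mode.
    e_rail, e_alt = math.exp(v_rail), math.exp(v_alt)
    return e_rail / (e_rail + e_alt)
```

At the reference waiting time the sigmoid retains half the demand, and equal utilities in the Logit likewise give a 50/50 split; both shares then decrease as rail waiting time (and hence its disutility) grows.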