864 results for Production Inventory Model with Switching Time


Relevance:

100.00%

Publisher:

Abstract:

Over the twentieth century, the allocation of women's time changed dramatically. This paper explores the implications for the allocation of married women's time stemming from: (1) the household revolution associated with the introduction of a variety of labor-saving devices in the home; (2) the remarkable increase in the relative wage of women; and (3) changes in childcare requirements associated with changes in fertility patterns. To do so, we construct a life-cycle model with home production and childcare constraints. The parameters of the childcare production function are estimated using micro evidence from U.S. time use data. We find that the increase in the relative wage of women is the most important explanation of the increase in married women's market work time over the twentieth century. Changes in fertility had large effects up to 1980, but little effect thereafter. The declining price of durables has an appreciable effect only since 1980, an effect that is consistent with a broader interpretation of durable goods reflecting the marketization of home production.
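
To make the mechanism concrete, here is a deliberately stylized numerical sketch of the trade-off the abstract describes, not the paper's estimated life-cycle model: a single-period time-allocation problem in which market hours compete with home production, childcare can be done at home or purchased, and a higher relative wage pulls time into market work. All functional forms and parameter values are illustrative assumptions.

```python
import numpy as np

# Stylized one-period allocation (illustrative assumptions, not the paper's estimated model):
# quasi-linear utility u = c + alpha*log(g); market consumption c = w*n - p_care*x;
# home good g produced from the home time left after childcare done personally.
T, alpha, p_care, req = 1.0, 0.2, 0.8, 0.3   # time endowment, preference weight, care price, care requirement

def market_hours(w, grid=201):
    best_u, best_n = -np.inf, 0.0
    for n in np.linspace(0.0, T, grid):              # market hours
        for x in np.linspace(0.0, req, grid):        # childcare purchased in the market
            g = (T - n) - (req - x)                  # time left for home production
            c = w * n - p_care * x                   # market consumption
            if c <= 0 or g <= 0:
                continue
            u = c + alpha * np.log(g)
            if u > best_u:
                best_u, best_n = u, n
    return best_n

for w in (0.5, 1.0, 2.0):   # a rising relative wage pulls time into market work
    print(f"wage {w:.1f}: optimal market hours {market_hours(w):.2f}")
```

With these assumptions, optimal market hours rise from roughly 0.3 to 0.9 of the time endowment as the wage rises, first because outsourcing childcare becomes worthwhile and then because home time becomes more expensive.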

Relevance:

100.00%

Publisher:

Abstract:

Therapeutic drug monitoring is recommended for dose adjustment of immunosuppressive agents. The relevance of using the area under the curve (AUC) as a biomarker in the therapeutic monitoring of cyclosporine (CsA) in hematopoietic stem cell transplantation is supported by a growing number of studies. However, for reasons intrinsic to the way the AUC is calculated, its use in the clinical setting is not practical. Limited sampling strategies, based on regression approaches (R-LSS) or Bayesian approaches (B-LSS), are practical alternatives for satisfactory estimation of the AUC. However, for these methodologies to be applied effectively, their design must accommodate clinical reality, notably by requiring a minimal number of concentrations spread over a short sampling window. Moreover, particular attention should be paid to ensuring their adequate development and validation. It is also important to mention that irregularity in the timing of blood sample collection can have a non-negligible impact on the predictive performance of R-LSS; to date, this impact has not been studied. This doctoral thesis addresses these issues in order to allow precise and practical estimation of the AUC. The studies were carried out in the context of CsA use in pediatric patients who had undergone hematopoietic stem cell transplantation. First, multiple regression and population pharmacokinetic (Pop-PK) approaches were used in a constructive way to develop and adequately validate LSS. Then, several Pop-PK models were evaluated, keeping in mind their intended use for AUC estimation. The performance of B-LSS targeting different versions of the AUC was also studied. Finally, the impact of deviations between actual blood sampling times and the planned nominal times on the predictive performance of R-LSS was quantified using a simulation approach considering diverse and realistic scenarios of potential errors in the blood sampling schedule. This work first led to the development of R-LSS and B-LSS with satisfactory clinical performance that are also practical, since they involve 4 or fewer sampling points obtained within 4 hours post-dose. From the Pop-PK analysis, a two-compartment structural model with a lag time was retained. However, the final model, notably the one with covariates, did not improve the performance of the B-LSS compared with the structural models (without covariates). In addition, we showed that B-LSS perform better for the AUC derived from simulated concentrations that exclude residual errors, which we termed the "underlying AUC", than for the observed AUC calculated directly from the measured concentrations. Finally, our results showed that irregularity in blood sampling times has an important impact on the predictive performance of R-LSS; this impact depends on the number of samples required, but even more on the duration of the sampling process involved.
We also showed that sampling-time errors made at moments when the concentration changes rapidly are those that most affect the predictive power of R-LSS. More interestingly, we highlighted that even though different R-LSS may perform similarly when based on nominal times, their tolerance to sampling-time errors can differ widely. In fact, adequate consideration of the impact of these errors can lead to more reliable selection and use of R-LSS. Through an in-depth investigation of various aspects underlying limited sampling strategies, this thesis provides notable methodological improvements and proposes new avenues to ensure their reliable and informed use, while promoting their suitability for clinical practice.
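
To illustrate the regression-based limited sampling (R-LSS) idea in generic terms, the sketch below simulates one-compartment concentration profiles, computes a dense trapezoidal AUC as the reference, and regresses that AUC on two concentrations drawn within 4 hours post-dose. The pharmacokinetic parameters, sampling times and error model are hypothetical placeholders, not the thesis's validated CsA models or data.

```python
import numpy as np

rng = np.random.default_rng(0)

t_full = np.linspace(0.25, 12.0, 48)          # dense grid for the reference AUC (h)
t_lss = np.array([1.0, 3.0])                  # two sampling times within 4 h post-dose (hypothetical)

def simulate_subject(dose=100.0):
    """One-compartment oral-absorption profile with hypothetical population parameters."""
    ka = rng.lognormal(np.log(1.2), 0.3)      # absorption rate (1/h)
    ke = rng.lognormal(np.log(0.25), 0.3)     # elimination rate (1/h)
    v = rng.lognormal(np.log(50.0), 0.2)      # volume of distribution (L)
    def conc(times):
        c = dose * ka / (v * (ka - ke)) * (np.exp(-ke * times) - np.exp(-ka * times))
        return c * np.exp(rng.normal(0.0, 0.1, size=times.shape))   # residual error
    return conc

n = 200
auc_ref, x = np.empty(n), np.empty((n, 3))
for i in range(n):
    conc = simulate_subject()
    auc_ref[i] = np.trapz(conc(t_full), t_full)   # reference AUC by the trapezoidal rule
    x[i] = [1.0, *conc(t_lss)]                    # intercept plus the two limited samples

coef, *_ = np.linalg.lstsq(x, auc_ref, rcond=None)   # R-LSS: AUC ~ a0 + a1*C(1h) + a2*C(3h)
pred = x @ coef
print("R-LSS coefficients:", np.round(coef, 2))
print("R^2 of the limited-sampling estimator:",
      round(1 - np.var(auc_ref - pred) / np.var(auc_ref), 3))
```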

Relevance:

100.00%

Publisher:

Abstract:

The thesis deals with some non-linear Gaussian and non-Gaussian time series models and concentrates mainly on the properties and application of a first-order autoregressive process with Cauchy marginal distribution. Time series relating to prices, consumption, money in circulation, bank deposits and bank clearings, sales and profit in a department store, national income and foreign exchange reserves, and prices and dividends of shares on a stock exchange are examples of economic and business time series. The thesis discusses the application of a threshold autoregressive (TAR) model and tries to fit this model to time series data. Another important non-linear model considered is the ARCH model, and a third is the TARCH model. The main objective is to identify an appropriate model for a given set of data. The data considered are daily coconut oil prices over a period of three years. Since these are price data, consecutive prices may not be independent, and hence a time-series-based model is more appropriate. The study also examines properties such as ergodicity, the mixing property and time reversibility, as well as various procedures for estimating the unknown parameters of the process.
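
A minimal simulation sketch of the process at the centre of the thesis, a stationary first-order autoregressive series with a Cauchy marginal distribution; it relies on the stability of the Cauchy law, and the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stationary AR(1) with a Cauchy marginal (illustrative parameters).
# If X_{t-1} ~ Cauchy(0, sigma) and eps_t ~ Cauchy(0, (1 - rho) * sigma) independently,
# then X_t = rho * X_{t-1} + eps_t is again Cauchy(0, sigma), by stability of the Cauchy law.
rho, sigma, n = 0.6, 1.0, 10_000

x = np.empty(n)
x[0] = sigma * rng.standard_cauchy()                      # start in the stationary law
eps = (1.0 - rho) * sigma * rng.standard_cauchy(n)        # innovations
for t in range(1, n):
    x[t] = rho * x[t - 1] + eps[t]

# For a Cauchy series, moments do not exist, so summarise with quantiles:
# the median should be near 0 and the interquartile range near 2*sigma.
q1, med, q3 = np.percentile(x, [25, 50, 75])
print(f"median = {med:.3f}, IQR = {q3 - q1:.3f} (theory: 0 and {2 * sigma:.1f})")
```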

Relevance:

100.00%

Publisher:

Abstract:

In this thesis a T-policy is applied to an inventory system with random lead time and to the repair process in the reliability of a k-out-of-n system. An inventory system may be regarded as a system for keeping records of the amounts of commodities in stock. Reliability is defined as the ability of an entity to perform a required function under given conditions for a given time interval, and it is measured by the probability that the entity can perform that function over the interval. The thesis considers a k-out-of-n system with repair and two modes of service under a T-policy: the first server is always available, while the second server is activated on the elapse of T time units. The lead time is exponentially distributed, and T is exponentially distributed, measured from the epoch at which the second server was deactivated after completing the repair of all failed units in the previous cycle, or from the moment n−k failed units accumulate. Repaired units are assumed to be as good as new. Three situations are studied: the cold, warm and hot systems. A k-out-of-n system is called cold, warm or hot according as the functional units do not fail, fail at a lower rate, or fail at the same rate when the system is down as when it is up.
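
For background only, the snippet below evaluates the textbook reliability of a k-out-of-n:G system with independent, identical components; the thesis itself analyses the richer repairable system with exponential lead time and a second server switched on under the T-policy, which this simple formula does not capture.

```python
from math import comb

def kofn_reliability(k: int, n: int, p: float) -> float:
    """Reliability of a k-out-of-n:G system with independent, identical components.

    Generic textbook formula (binomial tail probability), given here only for context;
    it ignores repair, lead times and the T-policy studied in the thesis.
    """
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(kofn_reliability(2, 4, 0.9))   # a 2-out-of-4 system of 0.9-reliable units -> 0.9963
```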

Relevance:

100.00%

Publisher:

Abstract:

This thesis is entitled “Modelling and analysis of recurrent event data with multiple causes”. Survival data is a term used for data that measure the time to occurrence of an event; in survival studies this time is generally referred to as the lifetime. Recurrent event data are commonly encountered in longitudinal studies when individuals are followed to observe repeated occurrences of certain events. In many practical situations, individuals under study are exposed to failure due to more than one cause, and the eventual failure can be attributed to exactly one of these causes. The proposed model is useful in real-life situations for studying the effect of covariates on recurrences of events due to different causes. In Chapter 3, an additive hazards model for gap time distributions of recurrent event data with multiple causes is introduced, and parameter estimation and asymptotic properties are discussed. In Chapter 4, a shared frailty model for the analysis of bivariate competing risks data is presented, and estimation procedures for the shared gamma frailty model, with and without covariates, using the EM algorithm are discussed. In Chapter 6, two nonparametric estimators for the bivariate survivor function of paired recurrent event data are developed; their asymptotic properties are studied, the estimators are applied to a real-life data set, and simulation studies are carried out to assess their efficiency.
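
For reference, the additive hazards structure used in Chapter 3 is usually written in the following standard form (generic notation; the thesis's exact gap-time specification with multiple causes may differ in detail):

```latex
% Additive hazards model for the gap time of a recurrent event of cause j,
% given covariates Z(t): a cause-specific baseline plus a linear covariate effect.
\lambda_j\{t \mid Z(t)\} \;=\; \lambda_{0j}(t) \;+\; \beta_j^{\top} Z(t), \qquad j = 1,\dots,J .
```

The covariates thus shift the hazard additively, whereas a Cox-type model would act multiplicatively on the baseline hazard.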

Relevance:

100.00%

Publisher:

Abstract:

The influence of vacancy concentration on the behavior of the three-dimensional random field Ising model with metastable dynamics is studied. We have focused our analysis on the number of spanning avalanches, which allows a clear determination of the critical line where the hysteresis loops change from continuous to discontinuous. By a detailed finite-size scaling analysis we determine the phase diagram and numerically estimate the critical exponents along the whole critical line. Finally, we discuss the origin of the curvature of the critical line at high vacancy concentration.
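
As a rough illustration of the dynamics being studied (a small-lattice sketch with illustrative parameters, not the finite-size scaling study itself), the following implements the zero-temperature metastable rule for the site-diluted 3D random field Ising model: the external field is raised slowly and every spin whose local field becomes positive flips, possibly triggering an avalanche.

```python
import numpy as np

rng = np.random.default_rng(2)

# Zero-temperature metastable dynamics of the site-diluted 3D random-field Ising model.
# Illustrative sizes and parameters only; vacancies are sites that carry spin 0.
L, J, sigma, c_vac = 8, 1.0, 2.2, 0.1        # linear size, coupling, field disorder, vacancy concentration

occ = rng.random((L, L, L)) >= c_vac         # occupied sites (False = vacancy)
h = rng.normal(0.0, sigma, (L, L, L))        # quenched random fields
s = np.where(occ, -1, 0)                     # start fully magnetized down

def neighbor_sum(spins):
    total = np.zeros_like(spins)
    for axis in range(3):                    # periodic boundaries in all three directions
        total += np.roll(spins, 1, axis) + np.roll(spins, -1, axis)
    return total

def relax(H):
    """Flip every unstable down spin until the metastable state at field H is reached."""
    while True:
        local = J * neighbor_sum(s) + h + H
        unstable = occ & (s == -1) & (local > 0)
        if not unstable.any():
            break
        s[unstable] = 1                      # flipping may destabilize neighbours -> avalanche

for H in np.linspace(-6.0, 6.0, 241):        # slowly increasing external field
    before = int((s == 1).sum())
    relax(H)
    flipped = int((s == 1).sum()) - before   # spins flipped in this field increment (may merge avalanches)
    if flipped > 50:
        print(f"H = {H:+.2f}: {flipped} spins flipped")
```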

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we attempt a probabilistic analysis of some physically realizable, though complex, storage and queueing models. It is essentially a mathematical study of the stochastic processes underlying these models. Our aim is to gain an improved understanding of the behaviour of such models, which may widen their applicability. Different inventory systems with random lead times, vacation to the server, bulk demands, varying ordering levels, etc. are considered. We also study some finite and infinite capacity queueing systems with bulk service and vacation to the server, and obtain the transient solution in certain cases. Each chapter of the thesis is provided with its own introduction and some important references.
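
As a generic illustration of the kind of system analysed, here is a short simulation of an M/M/1 queue with multiple server vacations: whenever the queue empties, the server takes successive exponentially distributed vacations and resumes service only on returning to a non-empty queue. The rates are illustrative and this is not one of the thesis's specific models.

```python
import numpy as np

rng = np.random.default_rng(3)

# M/M/1 queue with multiple server vacations (illustrative rates).
lam, mu, theta, n = 0.6, 1.0, 0.5, 100_000    # arrival, service and vacation rates; customers

arrivals = np.cumsum(rng.exponential(1.0 / lam, n))
services = rng.exponential(1.0 / mu, n)

waits = np.empty(n)
free_at = 0.0                                  # epoch at which the server can next start a service
for i in range(n):
    if arrivals[i] < free_at:
        start = free_at                        # customer waits for the busy server
    else:
        start = free_at                        # queue empty: server keeps taking vacations
        while start < arrivals[i]:
            start += rng.exponential(1.0 / theta)
    waits[i] = start - arrivals[i]
    free_at = start + services[i]

# The classical decomposition result suggests E[Wq] ~ lam/(mu*(mu-lam)) + 1/theta here.
print(f"mean waiting time with vacations: {waits.mean():.3f}")
print(f"M/M/1 mean wait without vacations: {lam / (mu * (mu - lam)):.3f}")
```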

Relevance:

100.00%

Publisher:

Abstract:

A thunderstorm, resulting from vigorous convective activity, is one of the most spectacular weather phenomena in the atmosphere. A common feature of the weather during the pre-monsoon season over the Indo-Gangetic Plain and northeast India is the outburst of severe local convective storms, commonly known as ‘Nor’westers’ (as they move from northwest to southeast). These severe thunderstorms, with their associated thunder, squall lines, lightning and hail, cause extensive losses in agriculture, damage to structures and loss of life. In this paper, sensitivity experiments have been conducted with the Non-hydrostatic Mesoscale Model (NMM) to test the impact of three microphysical schemes in capturing the severe thunderstorm event that occurred over Kolkata on 15 May 2009. The results show that the WRF-NMM model with the Ferrier microphysical scheme reproduces the cloud and precipitation processes more realistically than the other schemes. We have also attempted to diagnose four severe thunderstorms that occurred during the pre-monsoon seasons of 2006, 2007 and 2008 through the simulated radar reflectivity fields from the NMM model with the Ferrier microphysics scheme, and validated the model results against Kolkata Doppler Weather Radar (DWR) observations. The composite radar reflectivity simulated by the WRF-NMM model clearly shows the movement of the severe thunderstorms as observed in the DWR imagery, but fails to capture the observed intensity. These analyses demonstrate the capability of the high-resolution WRF-NMM model in simulating severe thunderstorm events and indicate that the 3 km model improves upon current abilities in simulating severe thunderstorms over the east Indian region.
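
As a simplified illustration of how a model field can be put on the same footing as radar observations, the snippet below converts a rain rate to an equivalent reflectivity with the classical Marshall-Palmer relation. This is only a stand-in for the comparison described above; WRF's own reflectivity diagnostics are computed from the hydrometeor mixing ratios predicted by the chosen microphysics scheme.

```python
import numpy as np

def rain_rate_to_dbz(rain_rate_mm_per_h: np.ndarray) -> np.ndarray:
    """Convert rain rate to reflectivity via the Marshall-Palmer relation Z = 200 * R**1.6.

    A generic simplification for illustration only; it is not the reflectivity
    diagnostic used by the WRF-NMM microphysics schemes discussed in the paper.
    """
    z = 200.0 * np.power(np.maximum(rain_rate_mm_per_h, 1e-3), 1.6)   # mm^6 m^-3
    return 10.0 * np.log10(z)                                          # dBZ

print(rain_rate_to_dbz(np.array([1.0, 10.0, 50.0])))   # light rain, heavy rain, severe convection
```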

Relevance:

100.00%

Publisher:

Abstract:

The marine fungus Aspergillus awamori BTMFW032, recently reported by us, produces acidophilic tannase as an extracellular enzyme. Here, we report the application of this enzyme to the synthesis of propyl gallate by direct transesterification of tannic acid and to tea cream solubilisation, besides the simultaneous production of gallic acid along with tannase under submerged fermentation by this fungus. The acidophilic tannase enabled synthesis of propyl gallate by direct transesterification of tannic acid using propanol as the organic reaction medium under low-water conditions. The identity of the product was confirmed by thin layer chromatography and Fourier transform infrared spectroscopy. It was noted that 699 U/ml of enzyme could give 60% solubilisation of tea cream within 1 h. The enzyme production medium was optimized using a Box–Behnken design for simultaneous synthesis of tannase and gallic acid. Process variables including tannic acid, sodium chloride, ferrous sulphate, dipotassium hydrogen phosphate, incubation period and agitation were recognized as the critical factors that influence tannase and gallic acid production. The model obtained predicted 4,824.61 U/ml of tannase and 136.206 μg/ml of gallic acid after 48 h of incubation, whereas the optimized medium supported 5,085 U/ml of tannase and 372.6 μg/ml of gallic acid after 36 and 84 h of incubation, respectively, a 15-fold increase in both enzyme and gallic acid production. The results indicate scope for utilizing this acidophilic tannase for transesterification of tannic acid into propyl gallate, for tea cream solubilisation and for simultaneous production of gallic acid along with tannase.
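
The response-surface workflow behind a Box-Behnken optimisation can be sketched compactly: build the design, fit a full second-order polynomial by least squares, and locate the predicted optimum. The example below uses three coded factors and a synthetic response; the study itself optimised six process variables against measured tannase and gallic acid yields.

```python
import itertools
import numpy as np

# Box-Behnken design for three coded factors: all (+/-1, +/-1, 0) pairs plus centre points.
edges = []
for i, j in itertools.combinations(range(3), 2):
    for a, b in itertools.product((-1.0, 1.0), repeat=2):
        run = [0.0, 0.0, 0.0]
        run[i], run[j] = a, b
        edges.append(run)
design = np.array(edges + [[0.0, 0.0, 0.0]] * 3)        # 12 edge runs + 3 centre runs

# Synthetic responses standing in for measured activity (hypothetical true surface + noise).
rng = np.random.default_rng(4)
def true_surface(x):                                     # unknown in a real experiment
    return 100 + 8 * x[:, 0] + 5 * x[:, 1] - 6 * x[:, 0] ** 2 - 4 * x[:, 1] ** 2 - 3 * x[:, 2] ** 2
y = true_surface(design) + rng.normal(0.0, 1.0, len(design))

# Full quadratic model: intercept, linear terms, two-factor interactions, squared terms.
def features(x):
    cols = [np.ones(len(x))] + [x[:, k] for k in range(3)]
    cols += [x[:, i] * x[:, j] for i, j in itertools.combinations(range(3), 2)]
    cols += [x[:, k] ** 2 for k in range(3)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(features(design), y, rcond=None)

# Locate the predicted optimum on a coarse grid of the coded region [-1, 1]^3.
grid = np.array(list(itertools.product(np.linspace(-1, 1, 21), repeat=3)))
best = grid[np.argmax(features(grid) @ beta)]
print("predicted optimum (coded units):", np.round(best, 2))
```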

Relevance:

100.00%

Publisher:

Abstract:

L-Glutamine amidohydrolase (L-glutaminase, EC 3.5.1.2) is a therapeutically and industrially important enzyme. Because it is a potent antileukemic agent and a flavor-enhancing agent used in the food industry, many researchers have focused their attention on L-glutaminase. In this article, we report the continuous production of extracellular L-glutaminase by the marine fungus Beauveria bassiana BTMF S-10 in a packed-bed reactor. Parameters influencing bead production and performance under batch mode were optimized in the order: support (Na-alginate) concentration, concentration of CaCl2 for bead preparation, curing time of beads, spore inoculum concentration, activation time, initial pH of the enzyme production medium, incubation temperature, and retention time. The parameters optimized under batch mode for L-glutaminase production were incorporated into the continuous production studies. Beads with 12 × 10^8 spores/g of beads were activated in a solution of 1% glutamine in seawater for 15 h, and the activated beads were packed into a packed-bed reactor. Enzyme production medium (pH 9.0) was pumped through the bed, and the effluent was collected from the top of the column. The effects of medium flow rate, substrate concentration, aeration, and bed height on continuous production of L-glutaminase were studied. Production was monitored for 5 h in each case, and the volumetric productivity was calculated. Under the optimized conditions for continuous production, the reactor gave a volumetric productivity of 4.048 U/(mL·h), which indicates that continuous production of the enzyme by Ca-alginate-immobilized spores is well suited for B. bassiana and results in a higher yield of enzyme within a shorter time. The results indicate the scope of utilizing immobilized B. bassiana for continuous commercial production of L-glutaminase.
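
As a worked illustration of the volumetric-productivity figure quoted above: for a continuously operated packed bed, volumetric productivity is the enzyme activity leaving the reactor per unit time divided by the bed volume. The flow rate and bed volume below are hypothetical, chosen only to show the arithmetic behind a value such as 4.048 U/(mL·h).

```python
# Volumetric productivity of a continuously operated packed-bed reactor:
#   productivity [U/(mL*h)] = effluent activity [U/mL] * flow rate [mL/h] / bed volume [mL]
# All numbers below are hypothetical and serve only to illustrate the calculation.
effluent_activity = 20.24   # U/mL in the outflow (hypothetical)
flow_rate = 60.0            # mL/h through the bed (hypothetical)
bed_volume = 300.0          # mL of packed beads (hypothetical)

productivity = effluent_activity * flow_rate / bed_volume
print(f"volumetric productivity = {productivity:.3f} U/(mL*h)")   # -> 4.048 with these inputs
```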

Relevance:

100.00%

Publisher:

Abstract:

To study the behaviour of beam-to-column composite connections, more sophisticated finite element models are required, since the component model has some severe limitations. In this research a generic finite element model for composite beam-to-column joints with welded connections is developed using current state-of-the-art local modelling. Applying a mechanically consistent scaling method, it can provide the constitutive relationship for a plane rectangular macro element with beam-type boundaries. This macro element, which preserves local behaviour and allows the transfer of five independent states between local and global models, can then be implemented in high-accuracy frame analysis with the possibility of limit state checks. So that the macro element for the scaling method can be used in a practical manner, a generic geometry program, proposed as a new idea in this study, is also developed for this finite element model. With generic programming, a set of global geometric variables can be input to generate a specific instance of the connection without much effort. The proposed finite element model generated by this generic programming is validated against test results from the University of Kaiserslautern. Finally, two illustrative examples of applying this macro element approach are presented. The first example demonstrates how to obtain the constitutive relationships of the macro element: with certain assumptions for a typical composite frame, the constitutive relationships can be represented by bilinear laws for the macro bending and shear states, which are then coupled by a two-dimensional surface law with yield and failure surfaces. The second example presents a scaling concept that combines sophisticated local models with a frame analysis using the macro element approach, as a practical application of this numerical model.
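
A minimal sketch of a bilinear constitutive law of the kind mentioned above for a single macro state, here moment versus rotation with an elastic branch and a reduced post-yield stiffness. The parameters are hypothetical, and the actual macro element couples the bending and shear states through a two-dimensional yield/failure surface rather than treating them independently.

```python
def bilinear_moment(theta: float,
                    k_elastic: float = 40_000.0,   # kNm/rad, initial stiffness (hypothetical)
                    m_yield: float = 200.0,        # kNm, yield moment (hypothetical)
                    k_plastic: float = 4_000.0) -> float:
    """Bilinear moment-rotation law for a single macro state under monotonic loading.

    Elastic branch up to the yield moment, then a hardening branch with reduced stiffness.
    Sign-symmetric; no unloading or hysteresis, which a full macro element would also need.
    """
    theta_y = m_yield / k_elastic
    if abs(theta) <= theta_y:
        return k_elastic * theta
    sign = 1.0 if theta > 0 else -1.0
    return sign * (m_yield + k_plastic * (abs(theta) - theta_y))

for theta in (0.002, 0.005, 0.02):
    print(f"rotation {theta:.3f} rad -> moment {bilinear_moment(theta):.1f} kNm")
```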

Relevance:

100.00%

Publisher:

Abstract:

The traditional task of a central bank is to preserve price stability and, in doing so, not to impair the real economy more than necessary. To meet this challenge, it is of great relevance whether inflation is only driven by inflation expectations and the current output gap or whether it is, in addition, influenced by past inflation. In the former case, as described by the New Keynesian Phillips curve, the central bank can immediately and simultaneously achieve price stability and equilibrium output, the so-called ‘divine coincidence’ (Blanchard and Galí 2007). In the latter case, the achievement of price stability is costly in terms of output and will be pursued over several periods. Similarly, it is important to distinguish this latter case, which describes ‘intrinsic’ inflation persistence, from that of ‘extrinsic’ inflation persistence, where the sluggishness of inflation is not a ‘structural’ feature of the economy but merely ‘inherited’ from the sluggishness of the other driving forces, inflation expectations and output. ‘Extrinsic’ inflation persistence is usually considered to be the less challenging case, as policy-makers are supposed to fight the persistence in the driving forces, especially to reduce the stickiness of inflation expectations through a credible monetary policy, in order to re-establish the ‘divine coincidence’. The scope of this dissertation is to contribute to the vast literature and ongoing discussion on inflation persistence. Chapter 1 describes the policy consequences of inflation persistence and summarizes the empirical and theoretical literature. Chapter 2 compares two models of staggered price setting, one with a fixed two-period duration and the other with a stochastic duration of prices. I show that in an economy with a timeless optimizing central bank the model with two-period alternating price setting (for most parameter values) leads to more persistent inflation than the model with stochastic price duration. This result amends earlier work by Kiley (2002), who found that the model with stochastic price duration generates more persistent inflation in response to an exogenous monetary shock. Chapter 3 extends the two-period alternating price-setting model to the case of 3- and 4-period price durations. This results in a more complex Phillips curve with a negative impact of past inflation on current inflation. As simulations show, this multi-period Phillips curve generates too low a degree of autocorrelation and too-early turning points of inflation, and is outperformed by a simple hybrid Phillips curve. Chapter 4 starts from the critique by Driscoll and Holden (2003) of the relative real-wage model of Fuhrer and Moore (1995). While taking seriously the critique that Fuhrer and Moore’s model collapses to a much simpler one without intrinsic inflation persistence if one takes their arguments literally, I extend the model with a term for inequality aversion. This model extension is not only in line with experimental evidence but results in a hybrid Phillips curve with inflation persistence that is observationally equivalent to that presented by Fuhrer and Moore (1995). In Chapter 5, I present a model that in particular allows the relationship between fairness attitudes and time preference (impatience) to be studied. In the model, two individuals take decisions in two subsequent periods. In period 1, both individuals are endowed with resources and are able to donate a share of their resources to the other individual.
In period 2, the two individuals may join in common production after bargaining over the split of its output. The size of the production output depends on the relative share of resources at the end of period 1, as the human capital of the individuals, which is built from their resources, cannot be fully substituted for one another. Therefore, it may be rational for a well-endowed individual in period 1 to act in a seemingly ‘fair’ manner and to donate some of its own resources to its poorer counterpart. This decision also depends on the individuals’ impatience, which is induced by the small but positive probability that production will not be possible in period 2. As a general result, the individuals in the model economy are more likely to behave in a ‘fair’ manner, i.e., to donate resources to the other individual, the lower their own impatience and the higher the productivity of the other individual. As the (seemingly) ‘fair’ behavior is modelled as an endogenous outcome and is related to the aspect of time preference, the presented framework may help to further integrate behavioral economics and macroeconomics.
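
For reference, the two benchmark specifications contrasted above can be written as follows (standard textbook forms with generic coefficients, not the exact equations of any particular chapter):

```latex
% Purely forward-looking New Keynesian Phillips curve (the 'divine coincidence' case):
\pi_t = \beta\, \mathbb{E}_t[\pi_{t+1}] + \kappa\, x_t

% Hybrid Phillips curve with intrinsic (backward-looking) inflation persistence:
\pi_t = \gamma_f\, \mathbb{E}_t[\pi_{t+1}] + \gamma_b\, \pi_{t-1} + \kappa\, x_t
```

Here \pi_t is inflation and x_t the output gap; a positive backward-looking weight \gamma_b is what makes inflation 'intrinsically' persistent, so that disinflation becomes costly in terms of output and the divine coincidence no longer holds.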

Relevance:

100.00%

Publisher:

Abstract:

This project seeks to understand the approach that has allowed Crepes & Waffles to achieve business longevity, for which it was awarded the Premio Empresario Colombiano Mariposa de Lorenz (2008). The business practice of Crepes & Waffles highlights its quality management, organizational identity and strategy, factors that we contrast with the business-longevity measurement tools proposed by the Universidad del Rosario, in order to produce a diagnosis of their effectiveness in the company.

Relevance:

100.00%

Publisher:

Abstract:

In today's hyperconnected, dynamic world, laden with uncertainty, conventional analytical methods and models are showing their limitations. Organizations therefore need useful tools that employ information technology and computational simulation models as mechanisms for decision-making and problem-solving. One of the most recent, powerful and promising of these is agent-based modeling and simulation (MSBA, for its Spanish acronym). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies and solve problems of various kinds. Nevertheless, as far as we know, there is no situational review of MSBA and its application to organizational research. It should also be noted that, given its novelty, the topic has not been sufficiently disseminated and worked on in Latin America. Consequently, this project aims to produce a situational review of MSBA and its impact on organizational research.
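
To make the idea tangible, here is a minimal agent-based sketch, a toy adoption model in which each agent decides based on a few randomly assigned peers and an aggregate adoption curve emerges from these local rules. It is a generic illustration of the technique, not a model drawn from the review.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy agent-based model of innovation adoption in an organization (illustrative only):
# each agent observes a few random peers and adopts with a probability that rises with
# the adopting share among them, plus a small spontaneous-adoption rate.
n_agents, n_peers, steps = 500, 5, 40
spontaneous, social = 0.01, 0.30

adopted = np.zeros(n_agents, dtype=bool)
adopted[rng.choice(n_agents, 5, replace=False)] = True          # a handful of early adopters
peers = rng.integers(0, n_agents, size=(n_agents, n_peers))     # fixed random peer groups

history = []
for _ in range(steps):
    peer_share = adopted[peers].mean(axis=1)                    # share of adopting peers per agent
    p_adopt = spontaneous + social * peer_share
    adopted |= rng.random(n_agents) < p_adopt                   # adoption is absorbing here
    history.append(adopted.mean())

print("adoption share over time:", np.round(history[::8], 2))
```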