960 results for Life-time distribution
Abstract:
One hundred and ten specimens of Pseudoplatystoma corruscans (Pimelodidae) and 582 specimens of Schizodon borelli (Anostomidae) collected in the upper Paraná River were analyzed. On necropsy, 74% of P. corruscans were found to be parasitized, with proteocephalidean cestodes being the most numerous. In S. borelli, the prevalence of parasitism reached 19.42%, and the nematode Cucullanus pinnai was the most abundant parasite. The absence of correlation between endoparasite diversity and the standard length of the two host species indicates that each presents homogeneous feeding behaviour throughout its lifetime, permitting uniform recruitment of the same species of endoparasites during its entire ontogenetic development. The independence of diversity values from the sex of P. corruscans and S. borelli shows that the ecological relationships are similar between males and females in these species. The infrapopulations of both hosts presented a typical overdispersed distribution pattern with isolationist characteristics.
Abstract:
Changes in the life tables of Rhodnius neivai due to variations in environmental temperature were studied on the basis of nine cohorts: three kept at 22°C, three at 27°C and three at 32°C. Cohorts were censused daily during the nymphal instars and weekly as adults, and nine complete horizontal life tables were built. A high negative correlation between temperature and age at first laying was registered (r = -0.84). Age at maximum reproduction was significantly lower at 32°C. The average number of eggs per female per week and the total eggs per female over its lifetime were significantly lower at 22°C. The total number of eggs per cohort and the total number of reproductive weeks were significantly higher at 27°C. At 32°C, generation time was significantly lower. At 27°C, the net reproductive rate and total reproductive value were significantly higher. At 22°C, the intrinsic growth, finite growth and finite birth rates were significantly lower, and the instantaneous death rate was significantly higher.
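The demographic parameters reported above (net reproductive rate, generation time, intrinsic and finite rates of increase) all follow from the cohort survivorship and fecundity schedules. A minimal sketch of those standard calculations is given below; the lx and mx values are illustrative placeholders, not data from the R. neivai cohorts.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative weekly survivorship (lx) and fecundity (mx) schedules; placeholder
# values only, not the R. neivai cohort data.
age = np.arange(10)                                                 # age x in weeks
lx = np.array([1.0, 0.95, 0.90, 0.85, 0.80, 0.70, 0.55, 0.35, 0.15, 0.05])
mx = np.array([0.0, 0.0, 0.0, 2.0, 5.0, 6.0, 4.0, 2.0, 0.5, 0.0])  # eggs/female/week

R0 = np.sum(lx * mx)                       # net reproductive rate
T = np.sum(age * lx * mx) / R0             # cohort generation time (weeks)

# Intrinsic rate of increase r solves the Euler-Lotka equation: sum(exp(-r*x)*lx*mx) = 1
r = brentq(lambda r: np.sum(np.exp(-r * age) * lx * mx) - 1.0, 1e-9, 5.0)
lam = np.exp(r)                            # finite rate of increase per week

print(f"R0 = {R0:.2f}, T = {T:.2f} weeks, r = {r:.3f}/week, lambda = {lam:.3f}")
```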
Abstract:
Background: We aimed to analyze the rate and time distribution of pre- and post-morbid cerebrovascular events in a single ischemic stroke population, and whether these depend on the etiology of the index stroke. Methods: In 2,203 consecutive patients admitted to a single stroke center registry (ASTRAL), the ischemic stroke that led to admission was considered the index event. Frequency distribution and cumulative relative distribution graphs of the most recent and first recurrent event (ischemic stroke, transient ischemic attack, intracranial or subarachnoid hemorrhage) were drawn in weekly and daily intervals for all strokes and for all stroke types. Results: Compared with identical time points before the index stroke, the frequency of events was most markedly reduced in the first week after the stroke (1.0 vs. 4.2%, p < 0.001) and the first month (2.7 vs. 7.4%, p < 0.001), and the reduction then ebbed over the first year (8.4 vs. 13.1%, p < 0.001). On a daily basis, the peak frequency was observed at day -1 (1.6%), with a reduction to 0.7% on the index day and 0.17% 24 h after. The event rate in patients with atherosclerotic stroke was particularly high around the index event, but the 1-year cumulative recurrence rate was similar in all stroke types. Conclusions: We confirm a short window of increased vulnerability in ischemic stroke and show a 4-, 3- and 2-fold reduction in post-stroke events at 1 week, 1 month and 1 year, respectively, compared to identical pre-stroke periods. This break in the 'stroke wave' is particularly striking after atherosclerotic and lacunar strokes.
Abstract:
PURPOSE: To compare clinical benefit response (CBR) and quality of life (QOL) in patients receiving gemcitabine (Gem) plus capecitabine (Cap) versus single-agent Gem for advanced/metastatic pancreatic cancer. PATIENTS AND METHODS: Patients were randomly assigned to receive GemCap (oral Cap 650 mg/m² twice daily on days 1 through 14 plus Gem 1,000 mg/m² in a 30-minute infusion on days 1 and 8 every 3 weeks) or Gem (1,000 mg/m² in a 30-minute infusion weekly for 7 weeks, followed by a 1-week break, and then weekly for 3 weeks every 4 weeks) for 24 weeks or until progression. CBR criteria and QOL indicators were assessed over this period. CBR was defined as improvement from baseline for ≥4 consecutive weeks in pain (pain intensity or analgesic consumption) and Karnofsky performance status, stability in one but improvement in the other, or stability in pain and performance status but improvement in weight. RESULTS: Of 319 patients, 19% treated with GemCap and 20% treated with Gem experienced a CBR, with a median duration of 9.5 and 6.5 weeks, respectively (P < .02); 54% of patients treated with GemCap and 60% treated with Gem had no CBR (remaining patients were not assessable). There was no treatment difference in QOL (n = 311). QOL indicators were improving under chemotherapy (P < .05). These changes differed by the time to failure, with a worsening 1 to 2 months before treatment failure (all P < .05). CONCLUSION: There is no indication of a difference in CBR or QOL between GemCap and Gem. Regardless of their initial condition, some patients experience an improvement in QOL on chemotherapy, followed by a worsening before treatment failure.
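The composite clinical benefit response rule quoted above is essentially a per-week classification followed by a run-length requirement. The sketch below is a simplified reading of that rule, assuming each component has already been graded as 'improved', 'stable' or 'worse' relative to baseline; the study's detailed scoring of pain intensity, analgesic consumption and Karnofsky performance status is not reproduced here.

```python
from typing import Sequence, Tuple

def weekly_benefit(pain: str, kps: str, weight: str) -> bool:
    """One week meets the benefit criterion if pain and KPS both improve,
    or one improves while the other is stable, or both are stable but
    weight improves. Each argument is 'improved', 'stable' or 'worse'."""
    if pain == "improved" and kps == "improved":
        return True
    if {pain, kps} == {"improved", "stable"}:
        return True
    if pain == "stable" and kps == "stable" and weight == "improved":
        return True
    return False

def clinical_benefit_response(weeks: Sequence[Tuple[str, str, str]]) -> bool:
    """A patient is a responder if the weekly criterion holds for
    at least 4 consecutive weeks."""
    run = 0
    for pain, kps, weight in weeks:
        run = run + 1 if weekly_benefit(pain, kps, weight) else 0
        if run >= 4:
            return True
    return False

# Example: benefit sustained only in weeks 3-6 -> still a responder
status = [("stable", "stable", "stable")] * 2 + [("improved", "stable", "stable")] * 4
print(clinical_benefit_response(status))   # True
```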
Abstract:
Woven monofilament, multifilament, and spun yarn filter media have long been the standard media in liquid filtration equipment. While the energy for a solid-liquid separation process is determined by the engineering work, it is the interface between the slurry and the equipment - the filter media - that greatly affects the performance characteristics of the unit operation. Those skilled in the art are well aware that a poorly designed filter medium may endanger the whole operation, whereas well-performing filter media can make the operation smooth and economical. As mineral and pulp producers seek to produce ever finer and more refined fractions of their products, it is becoming increasingly important to be able to dewater slurries with average particle sizes around 1 µm using conventional, high-capacity filtration equipment. Furthermore, the surface properties of the media must not allow sticky and adhesive particles to adhere to the media. The aim of this thesis was to test how the dirt-repellency, electrical resistance and high-pressure filtration performance of selected woven filter media can be improved by modifying the fabric or yarn with coating, chemical treatment and calendering. The results achieved by chemical surface treatments clearly show that the surface properties of woven media can be modified to achieve lower electrical resistance and improved dirt-repellency. The main challenge with the chemical treatments is abrasion resistance: while the experimental results indicate that the treatment is sufficiently permanent to resist standard weathering conditions, it may still prove inadequately durable in actual use. From the pressure filtration studies in this work, it seems clear that conventional woven multifilament fabrics still perform surprisingly well against the coated media in terms of filtrate clarity and cake build-up. Especially in cases where the feed slurry concentration was low and the pressures moderate, the conventional media seemed to outperform the coated media. In cases where the feed slurry concentration was high, the tightly woven media performed well against the monofilament reference fabrics, but seemed to do worse than some of the coated media. This result is somewhat surprising in that the high initial specific resistance of the coated media would suggest that they blind more easily than the plain woven media. The results indicate, however, that it is actually the woven media that gradually clogs during the course of filtration. In conclusion, it seems evident that there is a pressure limit above which the woven media loses its capacity to keep the solid particles from penetrating the structure. This finding suggests that for extreme pressures the only foreseeable solution is a coated fabric supported by a woven fabric strong enough to hold the structure together. That said, the high-pressure filtration process seems to follow somewhat different laws than the more conventional processes. Based on the results, it may well be that the role of the cloth is above all to support the cake, and the main performance-determining factor is a long service life. Measuring the pore size distribution with a commercially available porometer gives a fairly accurate picture of the pore size distribution of a fabric, but fails to give insight into which of the pore sizes is the most important in determining the flow through the fabric.
Historically, air (and sometimes water) permeability measures have been the standard for evaluating media filtration performance, including particle retention. Permeability, however, is a function of a multitude of variables and does not directly allow estimation of the effective pore size. In this study a new method for estimating the effective pore size and open pore area in a densely woven multifilament fabric was developed. The method combines a simplified equation for the electrical resistance of the fabric with the Hagen-Poiseuille flow equation to estimate the effective pore size of a fabric and the total open area of its pores. The results are validated by comparison with measured values of the largest pore size (bubble point) and the average pore size, and show good correlation with the measurements. However, the measured and estimated values tend to diverge for high weft density fabrics. This phenomenon is thought to result from the more tortuous flow path of denser fabrics, and could most probably be remedied by using a different value for the tortuosity factor.
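The thesis's estimation method combines an electrical-resistance model of the fabric with Hagen-Poiseuille flow; its actual equations are not reproduced here. The sketch below only illustrates the Hagen-Poiseuille side of the idea: given a measured volumetric flow through n parallel cylindrical pores of length equal to the fabric thickness, solve for the effective pore radius. The pore count (which in the thesis follows from the electrical-resistance measurement) is treated here as a given input, and all numbers are illustrative.

```python
import math

def effective_pore_radius(Q, delta_p, thickness, n_pores, viscosity):
    """Effective pore radius [m], assuming n_pores identical cylindrical
    capillaries of length `thickness` in parallel, each obeying
    Hagen-Poiseuille flow: Q_pore = pi * r**4 * delta_p / (8 * mu * L)."""
    q_pore = Q / n_pores
    return (8.0 * viscosity * thickness * q_pore / (math.pi * delta_p)) ** 0.25

def open_pore_area(radius, n_pores):
    """Total open area [m^2] of the estimated pores."""
    return n_pores * math.pi * radius ** 2

# Illustrative numbers only (water at 20 C, 1 bar pressure drop):
r_eff = effective_pore_radius(Q=1.0e-6, delta_p=1.0e5, thickness=0.5e-3,
                              n_pores=1.0e7, viscosity=1.0e-3)
print(f"effective pore radius ≈ {r_eff * 1e6:.2f} µm")
print(f"total open pore area ≈ {open_pore_area(r_eff, 1.0e7) * 1e6:.2f} mm²")
```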
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and spatio-temporal clusters of environmental point data. The clustering methods developed were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Typically, environmental phenomena can be modelled as stochastic point processes where each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information such as burned area, ignition cause, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. The space-time pattern characterisation thereby represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures, both global (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider complex spatial constraints, the high variability or the multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space in which phenomena take place by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, thus assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and one for predicting fire ignition susceptibility. In this regard, the main objective of this thesis was to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. This thesis thus responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in five scientific journals. National and international collaborations were also established and successfully accomplished.
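Among the global clustering measures listed above, the Morisita index is the simplest to illustrate: the study area is divided into Q cells of equal size, events are counted per cell, and the index compares the observed probability that two random events fall in the same cell with that expected under complete spatial randomness. Below is a minimal sketch of this idea, not the thesis's implementation (which additionally handles the Validity Domain constraints); the synthetic point patterns are purely illustrative.

```python
import numpy as np

def morisita_index(x, y, n_cells_per_side):
    """Morisita index I_delta for a 2-D point pattern.
    I_delta ~ 1 for random patterns, > 1 for clustered, < 1 for regular."""
    counts, _, _ = np.histogram2d(x, y, bins=n_cells_per_side)
    n_i = counts.ravel()          # event counts per cell
    N = n_i.sum()                 # total number of events
    Q = n_i.size                  # number of cells
    return Q * np.sum(n_i * (n_i - 1)) / (N * (N - 1))

# Example: clustered synthetic points vs. uniformly random points
rng = np.random.default_rng(0)
centres = rng.uniform(0, 1, size=(20, 2))
clustered = centres[rng.integers(0, 20, 2000)] + rng.normal(0, 0.01, (2000, 2))
uniform = rng.uniform(0, 1, size=(2000, 2))

print(morisita_index(clustered[:, 0], clustered[:, 1], 10))  # well above 1
print(morisita_index(uniform[:, 0], uniform[:, 1], 10))      # close to 1
```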
Abstract:
This thesis introduces a real-time simulation environment based on the multibody simulation approach. The environment consists of components that are used in conventional product development, including computer-aided drawing, visualization, dynamic simulation and finite element software architecture, data transfer and haptics. These components are combined to perform as a coupled system on one platform. The environment is used to simulate mobile and industrial machines at different stages of the product life cycle; consequently, the demands of the simulated scenarios vary. In this thesis, the real-time simulation environment is used to study a reel mechanism of a paper machine and a gantry crane. These case systems are used to demonstrate the usability of the real-time simulation environment for fault detection purposes and in the context of a training simulator. In order to describe the dynamic performance of a mobile or industrial machine, the nonlinear equations of motion must be defined. In this thesis, the dynamic behaviour of machines is modelled using the multibody simulation approach. A multibody system may consist of rigid and flexible bodies which are joined using kinematic joint constraints, while force components are used to describe the actuators. The strength of multibody dynamics lies in its ability to describe, in a systematic manner, nonlinearities arising from wear of the components, friction, large rotations or contact forces. For this reason, the interfaces between subsystems such as the mechanics, hydraulics and control systems of a mechatronic machine can be defined and analyzed in a straightforward manner.
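For reference, the equations of motion that such a multibody formulation typically solves can be written in the standard constrained (differential-algebraic) form below. This is a generic textbook form under the usual assumptions (rigid bodies, ideal joints), not necessarily the exact formulation used in the thesis.

```latex
% M(q): mass matrix, q: generalized coordinates, C(q,t) = 0: kinematic joint constraints,
% C_q: constraint Jacobian, \lambda: Lagrange multipliers (joint reaction forces),
% Q_e: applied/actuator forces, Q_v: velocity-dependent (quadratic) inertia terms.
\begin{aligned}
  M(q)\,\ddot{q} + C_q^{\mathsf{T}}(q,t)\,\lambda &= Q_e(q,\dot{q},t) + Q_v(q,\dot{q}),\\
  C(q,t) &= 0.
\end{aligned}
```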
Abstract:
We generalize to arbitrary waiting-time distributions some results previously derived for discrete distributions. We show that, for any two waiting-time distributions with the same mean delay time, the one with higher dispersion leads to a faster front. Experimental data on the speed of virus infections in a plaque are correctly explained by the theoretical predictions using a Gaussian delay-time distribution, which is more realistic for this system than the Dirac delta distribution considered previously [J. Fort and V. Méndez, Phys. Rev. Lett. 89, 178101 (2002)].
Abstract:
The marine environment is certainly one of the most complex systems to study, not only because of the challenges posed by the nature of the waters, but especially due to the interactions of physical, chemical and biological processes that control the cycles of the elements. Together with analytical chemists, oceanographers have been making a great effort to advance knowledge of the distribution patterns of trace elements and of the processes that determine their biogeochemical cycles and influences on the climate of the planet. The international academic community is now in a prime position to perform the first global-scale study of trace elements and their isotopes in the marine environment (GEOTRACES) and to evaluate the effects of major global changes associated with the influence of megacities distributed around the globe. This effort is only possible thanks to the development of highly sensitive detection methods and the use of clean sampling and handling techniques, together with a joint international program working toward the clear objective of expanding the frontiers of the biogeochemistry of the oceans and related topics, including climate change issues and ocean acidification associated with alterations in the carbon cycle. It is expected that the oceanographic data produced in the coming decade will allow a better understanding of biogeochemical cycles, and especially the assessment of changes in trace elements and contaminants in the oceans due to anthropogenic influences, as well as their effects on ecosystems and climate. Computational models are to be constructed to simulate the conditions and processes of the modern oceans and to allow predictions. The environmental changes arising from human activity since the 18th century (the period also called the Anthropocene) have made the Earth System even more complex. Anthropogenic activities have altered both terrestrial and marine ecosystems, and the legacy of these impacts on the oceans includes: a) pollution of the marine environment by solid waste, including plastics; b) pollution by chemical and medical substances (including those for veterinary use) such as hormones, antibiotics, and legal and illegal drugs, leading to possible endocrine disruption of marine organisms; and c) ocean acidification, the collateral effect of anthropogenic emissions of CO2 into the atmosphere, irreversible on a human lifetime scale. Unfortunately, the anthropogenic alteration of the hydrosphere due to inputs of plastics, metals, hydrocarbons, contaminants of emerging concern and even formerly "exotic" trace elements, such as rare earth elements, is likely to accelerate in the near future. These emerging contaminants will likely soon present difficulties for studies in pristine environments. All this knowledge brings with it a great responsibility: helping to envisage viable adaptation and mitigation solutions to the problems identified. The greatest challenge currently faced by Brazil is to create a framework project to develop education, science and technology applied to oceanography and related areas. Such a framework would strengthen the present working groups and enhance capacity building, allowing broader Brazilian participation in joint international actions and scientific programs. Recently, the establishment of the National Institutes of Science and Technology (INCTs) for marine science and the creation of the National Institute of Oceanographic and Hydrological Research represent an exemplary start.
However, the participation of the Brazilian academic community in the latest advances at the frontier of chemical oceanography is extremely limited, largely due to: i. the absence of physical infrastructure for the preparation and processing of field samples at the ultra-trace level; ii. limited access to oceanographic cruises, due to the small number of Brazilian vessels and/or the absence of "clean" laboratories on board; iii. restricted international cooperation; iv. the limited analytical capacity of Brazilian institutions for the analysis of trace elements in seawater; v. the high cost of ultrapure reagents associated with processing a large number of samples; and vi. the lack of qualified technical staff. Advances in knowledge, analytical capabilities and the increasing availability of analytical resources offer favorable conditions for chemical oceanography to grow. The Brazilian academic community is maturing and willing to play a role in strengthening marine science research programs by connecting them with educational and technological initiatives in order to preserve the oceans and to promote the development of society.
Abstract:
Percarboxylic acids are commonly used as disinfection and bleaching agents in the textile, paper, and fine chemical industries. All of these applications are based on the oxidative potential of these compounds. Despite the high interest in these chemicals, they are unstable and explosive, which increases the risk of the synthesis processes and of transportation; the safety criteria of the production process must therefore be considered. Microreactors represent a technology that efficiently exploits the safety advantages resulting from small scale, and microreactor technology was therefore used in the synthesis of peracetic acid and performic acid. These percarboxylic acids were produced at different temperatures, residence times and catalyst (i.e., sulfuric acid) concentrations. Both synthesis reactions appeared to be rather fast: with performic acid, equilibrium was reached in 4 min at 313 K, and with peracetic acid in 10 min at 343 K. In addition, the experimental results were used to study the kinetics of the formation of performic and peracetic acid. The advantages of the microreactors in this study were efficient temperature control, even for a very exothermic reaction, and good mixing due to the short diffusion distances; reaction rates could therefore be determined with high accuracy. Three different models were considered for estimating the kinetic parameters, such as reaction rate constants and activation energies. Of these three models, the laminar flow model with a radial velocity distribution gave the most precise parameters. However, sulfuric acid creates many drawbacks in this synthesis process, and a "greener" route using a heterogeneous catalyst for the synthesis of performic acid in a microreactor was therefore studied. The cation exchange resin Dowex 50 Wx8 presented very high activity and a long lifetime in this reaction. In the presence of this catalyst, equilibrium was reached in 120 seconds at 313 K, which indicates a rather fast reaction. In addition, the safety advantages of microreactors were investigated in this study. Four different conventional methods were used: production of peracetic acid served as a test case, and the safety of a conventional batch process was compared with an on-site continuous microprocess. It was found that the conventional methods for the analysis of process safety may not be reliable and adequate for a radically novel technology such as microreactors. This is understandable because the conventional methods are partly based on experience, which is very limited in connection with a totally novel technology. Therefore, a checklist-based method was developed to study the safety of intensified and novel processes at an early stage of process development. The checklist was formulated using the concept of layers of protection for a chemical process. The traditional process and three intensified processes for hydrogen peroxide synthesis were selected as test cases. With these real cases, it was shown that several positive and negative effects on safety can be detected in process intensification. The general claim that safety is always improved by process intensification was thus called into question.
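By way of illustration of the kinetic parameter estimation mentioned above (rate constants at several temperatures, then an activation energy), the sketch below fits a pseudo-first-order approach to equilibrium to synthetic concentration-time data and extracts the activation energy from an Arrhenius plot. This is a generic, ideally mixed simplification with placeholder data, not the laminar flow model with radial velocity distribution that gave the most precise parameters in the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
R = 8.314  # gas constant, J/(mol K)

# Pseudo-first-order approach to equilibrium: c(t) = c_eq * (1 - exp(-k t))
def approach_to_equilibrium(t, c_eq, k):
    return c_eq * (1.0 - np.exp(-k * t))

temps = np.array([303.0, 313.0, 323.0])                       # K (placeholder values)
t = np.linspace(0.0, 600.0, 13)                               # s
k_true = 0.004 * np.exp(-50_000 / R * (1.0 / temps - 1.0 / 313.0))

rate_constants = []
for k in k_true:
    # Synthetic "measured" concentrations with a little noise
    c = approach_to_equilibrium(t, 1.0, k) + rng.normal(0.0, 0.01, t.size)
    (_, k_fit), _ = curve_fit(approach_to_equilibrium, t, c, p0=[1.0, 0.01])
    rate_constants.append(k_fit)

# Arrhenius plot: ln k = ln A - Ea / (R T), so the slope of ln k vs. 1/T is -Ea/R
slope, _ = np.polyfit(1.0 / temps, np.log(rate_constants), 1)
print(f"estimated activation energy ≈ {-slope * R / 1000:.1f} kJ/mol")
```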
Abstract:
Life cycle costing (LCC) practices are spreading from the military and construction sectors to a wider range of industries. Suppliers as well as customers are demanding comprehensive cost knowledge that includes all relevant cost elements through the life cycle of products. The problem of total cost visibility is being acknowledged, and the performance of suppliers is evaluated not just by the low acquisition costs of their products but by the total value provided through the lifetime of their offerings. The main purpose of this thesis is to provide the case company with a better understanding of product cost structure. Moreover, the comprehensive theoretical treatment serves as a guideline or methodology for the further LCC process. The research includes a constructive analysis of LCC-related concepts and features as well as an overview of life cycle support services in the manufacturing industry. The case study aims to review the existing LCC practices within the case company and provide suggestions for improvement. It includes the identification of the most relevant life cycle cost elements, the development of a cost breakdown structure and a generic cost model for data collection. Moreover, certain cost-effective suggestions are provided as well. This research should support decision-making processes, the assessment of the economic viability of products, financial planning, sales and other processes within the case company.
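A minimal sketch of the kind of generic life cycle cost model referred to above: acquisition cost plus discounted recurring operating and maintenance costs over the service life, plus discounted disposal cost. The cost elements, service life and discount rate are illustrative assumptions, not the case company's cost breakdown structure.

```python
def life_cycle_cost(acquisition, annual_costs, disposal, discount_rate):
    """Total discounted life cycle cost.
    annual_costs: yearly operating + maintenance costs for years 1..n.
    Disposal is assumed to occur at the end of the last year."""
    n = len(annual_costs)
    npv_operating = sum(c / (1 + discount_rate) ** year
                        for year, c in enumerate(annual_costs, start=1))
    npv_disposal = disposal / (1 + discount_rate) ** n
    return acquisition + npv_operating + npv_disposal

# Example: 10-year service life, flat 12 k€/year operating cost, 8% discount rate
print(life_cycle_cost(acquisition=100_000,
                      annual_costs=[12_000] * 10,
                      disposal=5_000,
                      discount_rate=0.08))
```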
Abstract:
Preparation for embryo implantation requires extensive adaptation of the uterine microenvironment. This process consists of cell proliferation and cell differentiation resulting in the transformation of endometrial fibroblasts into a new type of cell, the decidual cell. In the present study, we followed the space-time distribution of versican and hyaluronan (HA) in different tissues of the uterus before and after embryo implantation. Fragments of mouse uteri obtained on the fourth, fifth, sixth and seventh days of pregnancy were fixed in Methacarn, embedded in Paraplast and cut into 5-µm thick sections. HA was detected using a biotinylated fragment of the proteoglycan aggrecan, which binds to this glycosaminoglycan with high affinity and specificity. Versican was detected with a polyclonal antibody. Both reactions were developed by peroxidase methods. Before embryo implantation, both HA and versican were present in the endometrial stroma. However, after embryo implantation, HA disappeared from the decidual region immediately surrounding the implantation chamber, whereas versican accumulated in the same region. The differences observed in the expression of HA and versican suggest that both molecules may participate in the process of endometrial decidualization and/or embryo implantation.
Abstract:
This work aimed to evaluate the physicochemical, physical, chromatic, microbiological, and sensorial stability of a non-dairy dessert elaborated with soy, guava juice, and oligofructose over 60 days of refrigerated storage, as well as to estimate its shelf life. Titratable acidity, pH, instrumental color, water activity, ascorbic acid, and physical stability were measured. Panelists (n = 50) from the campus community used a hedonic scale to assess the acceptance, purchase intent, creaminess, flavor, taste, acidity, color, and overall appearance of the dessert during the 60 days. The data showed that the parameters differed significantly (p < 0.05) from the initial time and could be fitted to mathematical equations with coefficients of determination above 71%, making them suitable for prediction purposes. Creaminess and acceptance did not differ statistically over the 60-day period; taste, flavor, and acidity kept a suitable hedonic score during storage. Notwithstanding, the sample showed good physical stability against gravity and provided more than 15% of the Brazilian Daily Recommended Value of copper, iron, and ascorbic acid. The estimated shelf life of the product was 79 days, considering overall acceptance, acceptance index and purchase intent.
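The shelf-life figure above is obtained by regressing sensory scores on storage time and extrapolating to the point where the fitted score crosses an acceptability threshold. The sketch below is a hedged illustration of that idea; the regression form, threshold and data points are illustrative, not the study's.

```python
import numpy as np

# Illustrative overall-acceptance scores (9-point hedonic scale) over 60 days of storage
days = np.array([0, 10, 20, 30, 40, 50, 60])
acceptance = np.array([7.8, 7.7, 7.5, 7.3, 7.1, 6.9, 6.7])

# Linear fit: acceptance(t) = a + b * t
b, a = np.polyfit(days, acceptance, 1)
residuals = acceptance - (a + b * days)
r2 = 1 - np.sum(residuals**2) / np.sum((acceptance - acceptance.mean())**2)

# Shelf life = storage time at which the fitted score reaches the acceptability cutoff
cutoff = 6.0   # assumed minimum acceptable hedonic score
shelf_life = (cutoff - a) / b
print(f"R² = {r2:.2f}, estimated shelf life ≈ {shelf_life:.0f} days")
```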
Abstract:
Measurements were performed in the semiconductor optical characterization laboratory of Prof. Richard Leonelli of the Department of Physics at the Université de Montréal. The InGaN/GaN nanowires were provided by the group of Prof. Zetian Mi of the Department of Electrical and Computer Engineering at McGill University.
Abstract:
This thesis deals with the analysis of some stochastic inventory models with pooling/retrial of customers. In the first model we analyze an (s,S) production inventory system with retrial of customers. Arrivals of customers from outside the system form a Poisson process, and inter-production times are exponentially distributed with parameter µ. When the inventory level reaches zero, further arriving demands are sent to an orbit of capacity M (< ∞); customers who find the orbit full while the inventory level is zero are lost to the system. Inter-demand times of the orbital customers are exponentially distributed with parameter γ. In Model II we extend these results to a perishable inventory system, assuming that the lifetime of each item is exponentially distributed with parameter θ. The study then deals with an (s,S) production inventory with service times and retrial of unsatisfied customers, where primary demands occur according to a Markovian Arrival Process (MAP). We also consider an (s,S) retrial inventory with service time in which primary demands occur according to a Batch Markovian Arrival Process (BMAP), and an (s,S) inventory system with service time in which primary demands occur according to a Poisson process with parameter λ. The study then concentrates on two further models: in the first we analyze an (s,S) inventory system with postponed demands, where arrivals of demands form a Poisson process; in the second we extend these results to a perishable inventory system, assuming that the lifetime of each item follows an exponential distribution with parameter θ. It is also assumed that when the inventory level is zero, an arriving demand enters the pool with probability β and is lost forever with the complementary probability (1 - β). Finally, we analyze an (s,S) production inventory system with switching time. A lot of work has been reported under the assumption that the switching time is negligible, but this is not the case in several real-life situations.
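The models above are analysed with Markov chain and matrix-analytic techniques; as a purely illustrative counterpart, the discrete-event sketch below simulates the common building block, an (s,S) production inventory with Poisson demand, exponential production times and (optionally) exponentially distributed item lifetimes. Lost demands here simply stand in for the orbit/pool mechanisms of the retrial and postponed-demand models, and all parameter values are placeholders.

```python
import random

def simulate_sS(s, S, lam, mu, theta, horizon):
    """Event-driven simulation of an (s,S) production inventory.
    lam: Poisson demand rate; mu: production rate (active only while producing);
    theta: per-item perishing rate (0 disables perishability)."""
    t, inv, producing = 0.0, S, False
    served = lost = 0
    while t < horizon:
        rates = {'demand': lam,
                 'produce': mu if producing else 0.0,
                 'perish': theta * inv}
        t += random.expovariate(sum(rates.values()))
        event = random.choices(list(rates), weights=rates.values())[0]
        if event == 'demand':
            if inv > 0:
                inv -= 1; served += 1
            else:
                lost += 1        # stands in for joining the orbit/pool in the retrial models
        elif event == 'produce':
            inv += 1
        elif event == 'perish':
            inv -= 1
        if inv <= s:
            producing = True     # switch production on when the level drops to s
        elif inv >= S:
            producing = False    # switch production off when S is reached
    return served, lost

print(simulate_sS(s=2, S=10, lam=1.0, mu=1.5, theta=0.05, horizon=10_000))
```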