18 results for reliability of supply

at Cochin University of Science


Relevance: 100.00%

Abstract:

Coordination among supply chain members is essential for better supply chain performance, and an effective way to improve coordination is to implement proper coordination mechanisms. The primary objective of this research is to study the performance of a multi-level supply chain when selected coordination mechanisms are used separately and in combination, under both lost-sale and backorder cases. The mechanisms studied are price discount, delay in payment, and different types of information sharing. Mathematical modelling and simulation modelling are used to analyse the performance of the supply chain under these mechanisms. Initially, a three-level supply chain consisting of a supplier, a manufacturer and a retailer was used to study, by mathematical modelling, the combined effect of price discount and delay in payment on supply chain profit. This study showed that implementing individual mechanisms improves performance compared with no coordination, and that combining mechanisms further improves performance in most cases. The three-level chain was then extended to a three-level network supply chain consisting of four retailers, two wholesalers, and a manufacturer with an infinite part supplier. The performance of this network was analysed under both lost-sale and backorder cases using simulation modelling with the same mechanisms, price discount and delay in payment. This study also showed that performance improves significantly when a combination of mechanisms is used, as found earlier, and that the increase in profit from delay in payment, and from combining price discount with delay in payment, is relatively high in the lost-sale case.
Sensitivity analysis showed that the retailer's order cost plays a major role in supply chain performance, since it determines the order quantities of the other players, and that supply chain profit changes proportionally with a change in the rate of return of any player. In the case of price discount, elasticity of demand is an important factor in improving performance. It was also found that a change in the permissible delay in payment given by the seller to the buyer affects supply chain profit more than the delay in payment availed by the buyer from the seller. In continuation of the above, the performance of a four-level supply chain consisting of a manufacturer, a wholesaler, a distributor and a retailer, with information sharing as the coordination mechanism, was studied under lost-sale and backorder cases using a simulation game with live players. The best performance was obtained when sharing demand and supply chain performance information, compared with the other seven types of information sharing, including the traditional method. This study also revealed that the effect of information sharing on performance is relatively higher under lost sale than under backorder. In-depth analysis showed that a lack of information sharing need not always result in the bullwhip effect; instead, it produced a large increase in lost-sales cost or backorder cost, which is also unfavourable for the supply chain. Overall analysis quantified the extent of improvement in performance under different cases, and the sensitivity analysis revealed useful insights about the decision variables that will help supply chain practitioners take appropriate decisions.
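The qualitative finding that coordination raises total chain profit can be illustrated with a deliberately simple two-level sketch. All parameters (demand curve, costs, prices) are hypothetical; this is not the thesis's mathematical model, only a minimal example of the mechanism.

```python
# Hypothetical two-level chain: a supplier sells to a retailer, who faces
# price-elastic demand. A coordinated price discount, passed on to the
# consumer, raises demand enough to lift TOTAL chain profit.

def retail_demand(p, a=200.0, b=4.0):
    """Linear price-sensitive demand faced by the retailer (invented)."""
    return max(a - b * p, 0.0)

def chain_profit(wholesale, retail, unit_cost=10.0):
    """Combined supplier + retailer profit at the given prices."""
    d = retail_demand(retail)
    supplier = (wholesale - unit_cost) * d
    retailer = (retail - wholesale) * d
    return supplier + retailer

no_coord = chain_profit(wholesale=30.0, retail=40.0)   # 1200.0
discount = chain_profit(wholesale=25.0, retail=35.0)   # 1500.0: discount passed on
```

With these invented numbers the discounted, coordinated chain earns more in total, mirroring the direction (not the magnitude) of the thesis's result.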

Relevance: 100.00%

Abstract:

The application of queueing theory to areas like computer networking, ATM facilities, telecommunications and numerous other situations has led to extensive study of queueing models, and the subject has become an ever-expanding branch of applied probability. The thesis discusses the reliability of a k-out-of-n system in which the server also attends external customers when there are no failed components (main customers), under a retrial policy. It further studies a multi-server, infinite-capacity queueing system in which each customer arrives as an ordinary customer but can turn into a priority customer while waiting in the queue; a finite-capacity multi-server queueing system with self-generation of priority customers; and a single-server, infinite-capacity retrial queue in which a customer in the orbit can turn into a priority customer, leaving the system if the server is already busy with a priority-generated customer and otherwise being taken into service immediately. The arrival process follows a Markovian arrival process (MAP) and service times follow a Markovian service process (MSP).
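As a static building block for these models, the reliability of a k-out-of-n system of i.i.d. components can be sketched as below. This is a snapshot calculation only: it ignores repair, retrials and external customers, which are the thesis's actual subject.

```python
from math import comb

def k_out_of_n_reliability(k: int, n: int, p: float) -> float:
    """Probability that at least k of n independent components work,
    each with reliability p (binomial tail; no repair dynamics)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Example: a 2-out-of-3 system of components with reliability 0.9
r = k_out_of_n_reliability(2, 3, 0.9)   # 3*0.9^2*0.1 + 0.9^3 = 0.972
```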

Relevance: 100.00%

Abstract:

Software systems are progressively being deployed in many facets of human life, and the implications of the failure of such systems have an assorted impact on their customers. The fundamental aspect that supports a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability; most earlier works focused on software reliability with no consideration of the hardware parts, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying the failure data for hardware and software components and building a model on it to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modelling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an attempt to present an integrated model for predicting the reliability of a computing system. The developed model has been compared with existing models and its usefulness is discussed.
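A minimal sketch of such a combined estimate is given below, assuming a constant-failure-rate (exponential) hardware model and the standard Goel-Okumoto software reliability growth model combined in series. The thesis's integrated model may differ; all parameter values here are invented.

```python
from math import exp

def hw_reliability(x: float, lam: float = 1e-4) -> float:
    """Hardware survival over a mission of length x (constant rate lam)."""
    return exp(-lam * x)

def sw_reliability(t: float, x: float, a: float = 100.0, b: float = 0.01) -> float:
    """Software: Goel-Okumoto NHPP model with mean value m(u) = a(1 - e^{-bu});
    probability of no software failure in (t, t+x) after t hours of testing."""
    m = lambda u: a * (1.0 - exp(-b * u))
    return exp(-(m(t + x) - m(t)))

def system_reliability(t: float, x: float) -> float:
    """Series combination: both hardware and software must survive the mission."""
    return hw_reliability(x) * sw_reliability(t, x)

early = system_reliability(t=0.0, x=10.0)     # little testing done yet
late = system_reliability(t=1000.0, x=10.0)   # after long testing: higher
```

The series product reflects the simplest combined view: either a hardware or a software failure brings the system down.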

Relevance: 90.00%

Abstract:

In this thesis a T-policy is applied to an inventory system with random lead time, and to repair in a k-out-of-n reliability system. An inventory system may be regarded as a system for keeping records of the amounts of commodities in stock. Reliability is defined as the ability of an entity to perform a required function under given conditions for a given time interval, and is measured by the probability that the entity can perform that function over the interval. The thesis considers a k-out-of-n system with repair and two modes of service under the T-policy: the first server is always available, and the second server is activated on the elapse of T time units. The lead time is exponentially distributed, and T is exponentially distributed, measured from the epoch at which the second server was deactivated after completing the repair of all failed units in the previous cycle, or from the moment n-k failed units accumulate. Repaired units are assumed to be as good as new. Three situations are studied: cold, warm and hot systems. A k-out-of-n system is called cold, warm or hot according as the functional units do not fail, fail at a lower rate, or fail at the same rate when the system is down as when it is up.
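The two triggers for activating the second server can be illustrated with a toy Monte Carlo experiment: with failures arriving as a Poisson process and T exponentially distributed, estimate how often the timer T elapses before n-k failures accumulate. The rates below are invented and this is not the thesis's model, only a sketch of the race between the two triggers.

```python
import random

def p_timer_first(n=10, k=7, mu=0.5, alpha=0.3, trials=20000, seed=1):
    """Estimate P(T elapses before n-k failures accumulate).
    Failure gaps ~ Exp(mu) (Poisson failures); T ~ Exp(alpha).
    The exact answer here is 1 - (mu/(mu+alpha))^(n-k)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        T = rng.expovariate(alpha)
        # time until n-k failures = sum of n-k exponential inter-failure gaps
        t_fail = sum(rng.expovariate(mu) for _ in range(n - k))
        hits += T < t_fail
    return hits / trials

p_hat = p_timer_first()   # exact value for these rates: 1 - 0.625^3 ~ 0.756
```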

Relevance: 90.00%

Abstract:

The study develops a model for selecting the location of a refinery in India, identifies the characteristics to be considered when configuring it, and develops models for integrated supply chain planning for a refinery. Locating and removing inbound, internal and outbound logistics problems in an existing refinery, and the overall design of a logistics information system for a refinery, are the main objectives of the study. A brief description of supply chain management (SCM), the elements of SCM and their significance, logistics costs in the petroleum industry and their impact, and the dynamics of petroleum logistics practices are also presented. The scope for applying SCM in a petroleum refinery is discussed, together with a review of the investigations carried out by earlier researchers in the area of supply chain management, both in general and with specific reference to petroleum refining.
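As a purely hypothetical illustration of the location-selection step, a simple factor-rating calculation is sketched below. The criteria, weights and scores are all invented and are not taken from the study.

```python
# Factor-rating sketch: each candidate site gets a weighted score over
# criteria such as crude access, port proximity and demand centres.
criteria_weights = {"crude_access": 0.40, "port": 0.35, "demand": 0.25}

sites = {
    "Site A": {"crude_access": 8, "port": 9, "demand": 6},
    "Site B": {"crude_access": 7, "port": 6, "demand": 9},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of a site's criterion scores."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

best = max(sites, key=lambda s: weighted_score(sites[s]))  # "Site A" here
```

Real refinery-location models add capacity, transport and tax terms; this only shows the shape of the comparison.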

Relevance: 90.00%

Abstract:

Ongoing changes in the global economic structure, along with the information revolution, have produced an environment in which knowledge and skills, or education and training, are increasingly valued commodities. This rests on the simple notion that a nation's economic progress is linked to education and training, an idea embodied in the theory of human capital, according to which the knowledge and skills found in labour represent valuable resources for the market. The important assumptions of human capital theory are thus: (1) human capital is an investment for the future; (2) more training and education lead to better work skills; (3) educational institutions play a central role in the development of human capital; and (4) the technological revolution is often cited as the most pressing reason why education and knowledge are becoming valuable economic commodities. The objectives of the present study are the investment and institutional or structural framework of higher education in Kerala; the higher-education market and the strengths and weaknesses of supply and demand conditions; the costs and benefits of higher education in Kerala; the impact of recent policy changes in higher education; the need to expand the higher-education market to address the grave problem of unemployment on the basis of systematic manpower planning; and the association of higher education with income and employment.

Relevance: 90.00%

Abstract:

This study concerns the stability of random sums and extremes. The difficulty of finding exact sampling distributions creates considerable problems in computing probabilities concerning sums that involve a large number of terms. Functions of the sample observations that are of natural interest, other than the sum, are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems such as the study of size effects on material strength, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of air pollution levels. It may be noticed that the theories of sums and extremes are mutually connected. For instance, in the search for asymptotic normality of sums, it is assumed that at least the variance of the population is finite; in such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
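The closing remark can be demonstrated numerically: for a finite-variance population, the maximum contributes a vanishing fraction of the sum as the sample grows. A small Monte Carlo sketch with an Exp(1) population (chosen only for illustration):

```python
import random

def max_over_sum(n: int, seed: int = 0) -> float:
    """Ratio of the sample maximum to the sample sum for n i.i.d.
    Exp(1) draws — a finite-variance population."""
    rng = random.Random(seed)            # fixed seed for reproducibility
    xs = [rng.expovariate(1.0) for _ in range(n)]
    return max(xs) / sum(xs)

small = max_over_sum(100)      # max is a noticeable share of the sum
large = max_over_sum(100_000)  # max ~ log n, sum ~ n: share shrinks
```

For Exp(1), the maximum grows like log n while the sum grows like n, so the ratio decays roughly like (log n)/n.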

Relevance: 90.00%

Abstract:

In the present environment, industry must provide products of high quality. The quality of a product is judged by the period of time over which it can successfully perform its intended function without failure. The causes of failures can be ascertained through life-testing experiments, and the times to failure due to different causes are likely to follow different distributions. Knowledge of these distributions is essential to eliminate the causes of failures and thereby improve the quality and reliability of products. The main accomplishment expected of the study is to develop statistical tools that facilitate the analysis of lifetime data arising in such and similar contexts.
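A minimal example of such a tool, assuming complete (uncensored) failure data and an exponential lifetime model — the study's tools are more general, and the failure times below are invented:

```python
from math import log

def fit_exponential(failure_times):
    """MLE of the failure rate for i.i.d. exponential lifetimes:
    lambda-hat = n / (total time on test) = 1 / mean lifetime."""
    return len(failure_times) / sum(failure_times)

def log_likelihood(lam, failure_times):
    """Exponential log-likelihood, used to compare candidate rates."""
    return sum(log(lam) - lam * t for t in failure_times)

times = [120.0, 340.0, 95.0, 210.0, 150.0]   # hypothetical hours to failure
lam_hat = fit_exponential(times)             # 1 / 183 hours here
```

In practice one would compare competing distributions (Weibull, lognormal, ...) by likelihood and handle censored observations; the exponential case just shows the fitting step.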

Relevance: 90.00%

Abstract:

In this paper, we describe the use of an open-cell photoacoustic configuration for evaluating the thermal effusivity of liquid crystals. The feasibility, precision and reliability of the method are first established by measuring the thermal effusivities of water and glycerol, for which the effusivity values are accurately known. To demonstrate the use of the method in the thermal characterization of liquid crystals, we have measured thermal effusivity values in various mesophases of the 4-cyano-4'-octyloxybiphenyl (8OCB) and 4-cyano-4'-heptyloxybiphenyl (7OCB) liquid crystals using a variable-temperature open photoacoustic cell. A comparison of the measured values for the two liquid crystals shows that the thermal effusivities of 7OCB in the nematic and isotropic phases are slightly less than those of 8OCB in the corresponding phases.
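The calibration step can be checked against handbook values: thermal effusivity is e = sqrt(k * rho * c). The property values below are common room-temperature handbook figures (assumed, not taken from the paper), giving the familiar e of about 1.6e3 W s^0.5 m^-2 K^-1 for water.

```python
from math import sqrt

def effusivity(k: float, rho: float, c: float) -> float:
    """Thermal effusivity e = sqrt(k * rho * c).
    k in W/(m K), rho in kg/m^3, c in J/(kg K); e in W s^0.5 m^-2 K^-1."""
    return sqrt(k * rho * c)

e_water = effusivity(0.60, 998.0, 4182.0)     # ~1.58e3, matches handbook values
e_glycerol = effusivity(0.285, 1261.0, 2430.0)  # ~9.3e2
```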


Relevance: 90.00%

Abstract:

Non-destructive testing (NDT) is the use of non-invasive techniques to determine the integrity of a material, component or structure. Engineers and scientists use NDT in a variety of applications, including medical imaging, materials analysis and process control. The photothermal beam deflection technique is one of the most promising NDT technologies, and tremendous R&D effort has gone into improving its efficiency and simplicity. It is popular because it can probe surfaces irrespective of the size of the sample and its surroundings, and it has been used to characterize several semiconductor materials because of its non-destructive, non-contact evaluation strategy; its application extends further to the analysis of a wide variety of materials. The instrumentation of an NDT technique is crucial for any material analysis. Chapter two explores the various excitation sources, source modulation techniques, and detection and signal processing schemes currently practised; the features of the experimental arrangement, including the steps for alignment, automation, data acquisition and data analysis, are explained in detail. Theoretical studies form the backbone of photothermal techniques, and the outcome of a theoretical work is the foundation of an application. The reliability of the theoretical model developed and used here is established from studies on crystalline samples. The technique is applied to the analysis of transport properties such as thermal diffusivity, mobility, surface recombination velocity and minority-carrier lifetime, to thermal imaging of solar cell absorber layer materials such as CuInS2, CuInSe2 and SnS thin films, and to the analysis of In2S3 thin films, which are used as a buffer layer material in solar cells.
The various influences of film composition and of chlorine and silver incorporation in this material are brought out through the measurement of transport properties and the analysis of sub-band-gap levels. The application of the photothermal deflection technique to the characterization of solar cells is a relatively new area that requires considerable attention. Chapter six therefore elucidates the theoretical aspects of applying photothermal techniques to solar cell analysis; the experimental design and method for determining solar cell efficiency, optimum load resistance and series resistance, with results from the analysis of a CuInS2/In2S3-based solar cell, form the skeleton of this chapter.
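The solar-cell figures of merit mentioned above follow from standard definitions: fill factor FF = (Vmp * Imp) / (Voc * Isc) and efficiency eta = Pmax / Pin. The I-V values below are invented for illustration, not measured data from this work.

```python
def fill_factor(v_oc: float, i_sc: float, v_mp: float, i_mp: float) -> float:
    """Fill factor: maximum-power-point power over the Voc*Isc envelope."""
    return (v_mp * i_mp) / (v_oc * i_sc)

def efficiency(v_mp: float, i_mp: float, p_in: float) -> float:
    """Conversion efficiency: electrical output at max power / incident power."""
    return (v_mp * i_mp) / p_in

# Hypothetical cell: Voc = 0.55 V, Isc = 20 mA, max-power point 0.42 V / 17 mA,
# with 0.1 W of light incident on the cell area.
ff = fill_factor(v_oc=0.55, i_sc=0.020, v_mp=0.42, i_mp=0.017)
eta = efficiency(0.42, 0.017, p_in=0.1)
```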

Relevance: 90.00%

Abstract:

The Internet today has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, so that going back is beyond imagination, and critical information is also transferred through it. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders, and the whole development in this area can become null and void if fool-proof security of the data is not ensured. It is hence a challenge for the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication for data sent through the Internet. Several popular and dependable techniques have been in wide use for quite a long time, but this long-term exposure makes them vulnerable to successful or near-successful attacks; hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms used in this area, focusing on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed, their performance was studied, and necessary modifications were made, yielding an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320.
Detailed analysis and comparison with existing popular schemes were carried out to establish the security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms show that they are dependable alternatives.
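For orientation only, the basic structure of a stream cipher — a keystream XORed with the plaintext, so the same operation both encrypts and decrypts — can be sketched as below. This toy hash-counter construction is NOT MAJE4 (which is not reproduced here) and is not meant to be secure; it only illustrates the keystream/XOR mechanism the abstract refers to.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream: concatenate SHA-256(key || counter) blocks, trim to n bytes."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; applying it twice recovers the input."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"confidential payload"
ct = xor_cipher(b"demo-key", msg)   # encrypt
pt = xor_cipher(b"demo-key", ct)    # same call decrypts
```

Real stream ciphers add a nonce and never reuse a keystream; a MAC (as MACJER-320 provides) would be computed over the ciphertext to give integrity as well.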

Relevance: 90.00%

Abstract:

Occupational stress is becoming a major issue on both the corporate and the social agenda. In industrialized countries there have been quite dramatic changes in working conditions during the last decade, caused by economic, social and technical development. As a consequence, people at work today are exposed to high quantitative and qualitative demands, as well as hard competition driven by the global economy. A recent report says that ailments due to work-related stress are likely to cost India's exchequer around 72,000 crores between 2009 and 2015. Though India is a fast-developing country, it has yet to create facilities to mitigate the adverse effects of work stress, and only little effort has been made to assess work-related stress. In the absence of well-defined standards for assessing work-related stress in India, an attempt is made here to develop factors for the evaluation of work stress. Accordingly, with the help of the existing literature and in consultation with safety experts, seven factors for the evaluation of work stress were developed, and an instrument (questionnaire) based on these factors was constructed. The validity and unidimensionality of the questionnaire were ensured by confirmatory factor analysis, and its reliability was established before administration. While analysing the relationships between the variables, it was noted that no relationship exists between them, and hence the factors are treated as independent variables for the purposes of the research. Initially, five profit-making manufacturing industries under the public sector in the state of Kerala were selected for the study, and the influence of the factors responsible for work stress was analysed in these industries.
These industries were classified into two types, chemical and heavy engineering, based on the product manufactured and the work environment, and the analysis was carried out for these two categories. The variation of work stress with the age, designation and experience of employees was analysed by means of one-way ANOVA. Three different models of work stress, namely a factor model, a structural equation model and a multinomial logistic regression model, were then built to analyse the association of the factors responsible for work stress; all were found equally good at predicting work stress. The present study indicates that work stress exists among employees in public sector industries in Kerala. Employees in the 40-45-year age group and the 15-20-year experience group had relatively higher work demand, low job control and low support at work; low job control was noted at lower designation levels, particularly at the worker level. Hence the instrument developed using the seven factors, namely demand, control, manager support, peer support, relationship, role and change, can be effectively used for the evaluation of work stress in industries.
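The group comparison described above rests on the one-way ANOVA F statistic, which can be computed directly from group data. The stress scores below are hypothetical, chosen only to show the calculation, not data from the study.

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic:
    F = (between-group SS / (k-1)) / (within-group SS / (n-k))."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical mean-stress scores for three age groups of employees
groups = [[3.1, 3.4, 2.9], [4.0, 4.2, 3.8], [2.5, 2.7, 2.6]]
F = one_way_anova_F(groups)   # compared against an F(k-1, n-k) critical value
```

A large F relative to the F(k-1, n-k) critical value indicates that at least one group mean differs, which is the form of conclusion drawn for the age and experience groups above.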

Relevance: 90.00%

Abstract:

This study examines how far price fluctuations in pepper can be controlled in the Indian context, so as to give the primary producers a reasonable and stable income and thereby ensure adequate encouragement for higher production and better export earnings. In studying methods of controlling violent price fluctuations, an important question is whether the present system of supply management is satisfactory, the more so when demand is likely to be manipulated by the importers and wholesalers of the foreign countries. Though pepper is the most important of all the spices grown in India, little work has been done so far on the problems and prospects of this commodity.