11 results for IPO Failures

at Cochin University of Science


Relevance:

10.00%

Publisher:

Abstract:

In the present environment, industry is expected to provide products of high quality. The quality of a product is judged by the period of time over which it can successfully perform its intended functions without failure. The causes of failure can be ascertained through life-testing experiments, and the times to failure due to different causes are likely to follow different distributions. Knowledge of these distributions is essential for eliminating causes of failure and thereby improving the quality and reliability of products. The main accomplishment expected from the study is the development of statistical tools that facilitate the analysis of lifetime data arising in such and similar contexts.
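
To make the idea concrete, the following is a minimal sketch, using hypothetical failure-time data grouped by cause and a Weibull model chosen purely for illustration, of fitting a separate lifetime distribution to each cause of failure; it is not the statistical tooling developed in the thesis.

    # Fit a separate lifetime distribution to the failure times of each cause.
    # Data values and the Weibull choice are assumptions for illustration only.
    import numpy as np
    from scipy import stats

    failure_times = {
        "cause_A": np.array([120.0, 340.0, 410.0, 560.0, 610.0, 830.0]),
        "cause_B": np.array([45.0, 95.0, 150.0, 210.0, 300.0]),
    }

    for cause, times in failure_times.items():
        # Two-parameter Weibull fit (location fixed at zero) for this cause.
        shape, loc, scale = stats.weibull_min.fit(times, floc=0)
        print(f"{cause}: Weibull shape={shape:.2f}, scale={scale:.1f}")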

Relevance:

10.00%

Publisher:

Abstract:

Comprehensive studies integrating the production, consumption and marketing of ornamental fishes have never been conducted in Kerala, and this study is an attempt to highlight these dynamics in a systematic manner through a primary empirical study. The advantages of such an approach are many. First, the study provides detailed empirical data, within a given time frame, on the various interlinked economic activities in the ornamental fisheries sector. Second, the study tries to improve upon previous discipline-bound studies by adopting an integrated approach. As Kerala has diverse ecological and climatic conditions, case studies on various ornamental fish production systems help in identifying suitable culture practices for selected environmental conditions. One of the major conclusions of this study, as explained in the concluding chapter, is that although Kerala is blessed with the favourable geographic and climatic conditions needed for developing an ornamental fishery, diverse local conditions and requirements constrain the adoption of modern methods of enterprise development. The consumer studies conducted help to identify the demand for ornamental fishes and the factors driving it. The market studies help in understanding the forces behind the domestic market, which is very promising but neglected. The viability studies throw light on the economic performance of both the production systems and the trading units of ornamental fishes. Despite the economic significance of the ornamental fisheries industry, its development has been constrained by various bio-technical, cultural, socio-economic, organisational and, above all, institutional and policy failures. The outcome of the study identifies the constraints facing the industry and the institutional arrangements needed for its development.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a methodology for linking Total Productive Maintenance (TPM) and Quality Function Deployment (QFD). The synergic power of TPM and QFD led to the formation of a new maintenance model named Maintenance Quality Function Deployment (MQFD). This model was found to be so powerful that it could overcome the drawbacks of TPM by taking care of customer voices. Those voices of customers are used to develop the house of quality. The outputs of the house of quality, which are in the form of technical languages, are submitted to the top management for making strategic decisions. The technical languages concerned with enhancing maintenance quality are strategically directed by the top management towards the adoption of the eight TPM pillars. The TPM characteristics developed through the eight pillars are fed into the production system, where their implementation is focused on increasing the values of the maintenance quality parameters, namely overall equipment efficiency (OEE), mean time between failures (MTBF), mean time to repair (MTTR), performance quality, availability and mean down time (MDT). The outputs from the production system are required to be reflected in the form of business values, namely improved maintenance quality, increased profit, upgraded core competence and enhanced goodwill. A unique feature of the MQFD model is that it is not necessary to change or dismantle the existing processes of developing the house of quality and TPM projects, which may already be in practice in the company concerned. Thus, the MQFD model enables the tactical marriage between QFD and TPM. First, the literature was reviewed. The results of this review indicated that no activities had so far been reported on integrating QFD in TPM and vice versa. During the second phase, a survey was conducted in six companies in which TPM had been implemented. The objective of this survey was to locate any traces of QFD implementation in the TPM programmes being implemented in these companies. The survey results indicated that no effort to integrate QFD in TPM had been made in these companies. After completing these two phases of activities, the MQFD model was designed. The details of this work are presented in this research work. Following this, explorative studies on implementing the MQFD model in real-time environments were conducted. In addition, an empirical study was carried out to examine the receptivity of the MQFD model among practitioners and across multifarious organizational cultures. Finally, a sensitivity analysis was conducted to find the hierarchy of the various factors influencing MQFD in a company. Throughout the research work, the theory and practice of MQFD were juxtaposed by presenting and publishing papers among scholarly communities and by conducting case studies in real-time scenarios.
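
As a rough illustration of the maintenance quality parameters listed above, the sketch below computes OEE, MTBF, MTTR, availability and MDT from their standard textbook definitions; every figure is an assumed placeholder and the calculation is not taken from the thesis.

    # Standard-definition computation of the maintenance quality parameters named above.
    # All input values are assumed for illustration.
    uptime_hours = 640.0
    downtime_hours = 40.0
    number_of_failures = 8
    total_units = 5000
    good_units = 4800
    ideal_cycle_time_hours = 0.12   # ideal production time per unit

    mtbf = uptime_hours / number_of_failures            # mean time between failures
    mttr = downtime_hours / number_of_failures          # mean time to repair
    mdt = downtime_hours / number_of_failures           # mean down time (repair-only assumption)
    availability = uptime_hours / (uptime_hours + downtime_hours)
    performance = (ideal_cycle_time_hours * total_units) / uptime_hours
    quality = good_units / total_units
    oee = availability * performance * quality          # overall equipment efficiency
    print(f"MTBF={mtbf:.1f} h, MTTR={mttr:.1f} h, "
          f"Availability={availability:.2%}, OEE={oee:.2%}")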

Relevance:

10.00%

Publisher:

Abstract:

So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. The design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach to the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
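
For reference, the vector hazard rate of Johnson and Kotz (1975), on which the abstract indicates the proposed analysis is built, is defined for a bivariate lifetime (X1, X2) with joint survival function S(x1, x2) = P(X1 > x1, X2 > x2) by

\[
  \mathbf{h}(x_1, x_2) = -\nabla \log S(x_1, x_2)
  = \left( -\frac{\partial}{\partial x_1} \log S(x_1, x_2),\;
           -\frac{\partial}{\partial x_2} \log S(x_1, x_2) \right).
\]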

Relevance:

10.00%

Publisher:

Abstract:

The term reliability of an equipment or device is often meant to indicate the probability that it carries out its expected functions adequately, without failure and within specified performance limits, at a given age, for a desired mission time, when put to use under the designated application and operating environmental stress. A broad classification of the approaches employed in reliability studies can be made as probabilistic and deterministic. The main interest in the former is to devise tools and methods to identify the random mechanism governing the failure process through a proper statistical framework, while the latter addresses the question of finding the causes of failure and the steps needed to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability is focused on methods of arriving at reasonable models of failure times and on the failure patterns that induce such models. The application of the methodology of lifetime distributions is not confined to the assessment of the endurance of equipment and systems only, but ranges over a wide variety of scientific investigations where the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, the biomedical sciences, economics, extreme value theory, etc.
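
For reference, when the lifetime is treated as a non-negative random variable T with distribution function F and density f, the standard quantities of this framework are

\[
  R(t) = P(T > t) = 1 - F(t), \qquad
  \lambda(t) = \frac{f(t)}{R(t)}, \qquad
  \mathrm{MTTF} = \int_0^{\infty} R(t)\, dt ,
\]

where R is the reliability (survival) function, lambda the hazard (failure) rate and MTTF the mean time to failure; choosing the form of F, i.e. the life distribution, is exactly the modelling question described above.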

Relevance:

10.00%

Publisher:

Abstract:

With the increasing popularity of wireless networks and their applications, mobile ad-hoc networks (MANETs) have emerged. The topology of a MANET is highly dynamic and its nodes are highly mobile, so the rate of link failure is high. There is no central control over the nodes; control is distributed among them, and each node can act as a router or a source. MANETs have been considered isolated, stand-alone networks. Nodes can join or leave at any time, and no infrastructure is required, so the network can be set up anywhere at any time and trouble-free communication is possible. Because of the greater chance of link failures, collisions and transmission errors in a MANET, maintenance of the network becomes costly. Studies show that frequent link failures are an important, and unpredictable, factor in diminishing network performance. The main objective of this paper is to study route instability in the AODV protocol and suggest a solution for improvement. The paper proposes a new approach to reduce route failures by storing an alternate route in the intermediate nodes. In this algorithm the intermediate nodes are also involved in the route discovery process. This reduces the route establishment overhead as well as the time to find a new route when a link failure occurs.
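
The following is a minimal, hypothetical sketch of the idea described above: an intermediate node caches an alternate next hop per destination so that a link failure can be repaired locally before falling back to a full AODV route discovery. The class and method names are illustrative assumptions, not the authors' implementation.

    # Sketch of alternate-route caching at an intermediate node (illustrative only).
    class RouteEntry:
        def __init__(self, next_hop, hop_count):
            self.next_hop = next_hop
            self.hop_count = hop_count
            self.alternate = None          # second-best next hop learned during discovery

    class IntermediateNode:
        def __init__(self):
            self.routes = {}               # destination -> RouteEntry

        def on_route_reply(self, destination, next_hop, hop_count):
            """Record the best route; keep the next-best reply as an alternate."""
            entry = self.routes.get(destination)
            if entry is None or hop_count < entry.hop_count:
                new_entry = RouteEntry(next_hop, hop_count)
                new_entry.alternate = entry.next_hop if entry else None
                self.routes[destination] = new_entry
            elif entry.alternate is None and next_hop != entry.next_hop:
                entry.alternate = next_hop

        def on_link_failure(self, destination):
            """Switch to the cached alternate, if any, instead of rediscovering the route."""
            entry = self.routes.get(destination)
            if entry and entry.alternate:
                entry.next_hop, entry.alternate = entry.alternate, None
                return entry.next_hop      # local repair succeeded
            self.routes.pop(destination, None)
            return None                    # fall back to a fresh AODV route discovery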

Relevance:

10.00%

Publisher:

Abstract:

This study reports the details of the finite element analysis of eleven shear-critical, partially prestressed concrete T-beams having steel fibers over partial or full depth. Prestressed T-beams with shear span-to-depth ratios of 2.65 and 1.59 that failed in shear have been analyzed using the ‘ANSYS’ program. The ‘ANSYS’ model accounts for nonlinearities such as bond-slip of the longitudinal reinforcement, post-cracking tensile stiffness of the concrete, stress transfer across the cracked blocks of the concrete, and load sustenance through the bridging action of steel fibers at the crack interface. The concrete is modeled using the ‘SOLID65’ eight-node brick element, which is capable of simulating the cracking and crushing behavior of brittle materials. The reinforcement, comprising deformed bars, prestressing wires and steel fibers, has been modeled discretely using the ‘LINK8’ 3D spar element. The slip between the reinforcement (rebars, fibers) and the concrete has been modeled using the ‘COMBIN39’ nonlinear spring element, connecting the nodes of the ‘LINK8’ elements representing the reinforcement to the nodes of the ‘SOLID65’ elements representing the concrete. The ‘ANSYS’ model correctly predicted the diagonal tension failure and shear compression failure of the prestressed concrete beams observed in the experiments. The capability of the model to capture the critical crack regions, loads and deflections for various types of shear failure in prestressed concrete beams has been illustrated.
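
Purely as an illustration of the element choices described above, the fragment below sketches how such a model could be set up through the PyMAPDL interface (the ansys-mapdl-core package); the property values are assumed placeholders and the fragment is not the authors' input file.

    # Hypothetical PyMAPDL-style sketch of the element types named in the abstract;
    # all numerical values are assumed placeholders, not the study's data.
    from ansys.mapdl.core import launch_mapdl

    mapdl = launch_mapdl()
    mapdl.prep7()
    mapdl.et(1, "SOLID65")                 # concrete brick element with cracking/crushing
    mapdl.et(2, "LINK8")                   # discrete reinforcement: bars, wires, fibers
    mapdl.et(3, "COMBIN39")                # nonlinear spring for bond-slip
    mapdl.mp("EX", 1, 30e9)                # assumed concrete modulus (Pa)
    mapdl.mp("PRXY", 1, 0.2)               # assumed Poisson's ratio
    mapdl.tb("CONCR", 1)                   # concrete failure data table for SOLID65
    mapdl.tbdata(1, 0.3, 0.9, 3e6, 40e6)   # shear transfer (open/closed crack), ft, fc (assumed)
    mapdl.exit()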

Relevance:

10.00%

Publisher:

Abstract:

In order to minimize the risk of failures or major renewals of hull structures during a ship's expected life span, it is imperative that precautions be taken to ensure an adequate margin of safety against any one, or combination, of failure modes, including excessive yielding, buckling, brittle fracture, fatigue and corrosion. The most efficient system for combating underwater corrosion is 'cathodic protection'. The basic principle of this method is that the ship's structure is made cathodic, i.e. the anodic (corrosion) reactions are suppressed by the application of an opposing current, and the ship is thereby protected. This paper deals with the state of the art in cathodic protection and its programming in ship structures.
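
As a simple numerical illustration of the opposing-current principle, the sketch below estimates the mean protection current and sacrificial anode mass for an assumed hull, using the generic current-demand and charge-capacity relations of cathodic protection design practice; every figure is an assumed placeholder and the calculation is not drawn from the paper.

    # Rough sacrificial-anode sizing sketch (illustrative, assumed values only).
    design_current_density = 0.020   # A/m^2, assumed mean demand for a coated hull in seawater
    wetted_area = 2500.0             # m^2, assumed wetted surface of the hull
    design_life_years = 5.0
    anode_capacity = 2000.0          # Ah/kg, assumed electrochemical capacity of the anode alloy
    utilisation_factor = 0.9

    mean_current = design_current_density * wetted_area                 # A
    total_charge = mean_current * design_life_years * 8760.0            # Ah over the design life
    anode_mass = total_charge / (anode_capacity * utilisation_factor)   # kg of anode material
    print(f"Mean protection current: {mean_current:.0f} A")
    print(f"Required sacrificial anode mass: {anode_mass:.0f} kg")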

Relevance:

10.00%

Publisher:

Abstract:

A sandwich construction is a special form of laminated composite consisting of a lightweight core sandwiched between two stiff, thin face sheets. Owing to its high stiffness-to-weight ratio, sandwich construction is widely adopted in the aerospace industry. As a process-dependent bonded structure, the most severe defects associated with sandwich construction are debond (skin-core bond failure) and dent (locally deformed skin associated with core crushing). Debond may be attributed to initial manufacturing flaws or in-service loads, while a dent can be caused by tool drops or impacts by foreign objects. This paper presents an evaluation of the performance of a honeycomb sandwich cantilever beam in the presence of debond or dent, using layered finite element models. The dent is idealized by accounting for core crushing in the core thickness along with the eccentricity of the skin. The debond is idealized using multilaminate modeling at the debond location, with contact elements between the laminates. Vibration and buckling analyses of metallic honeycomb sandwich beams with and without damage are carried out. The buckling load factor, natural frequencies, mode shapes and modal strain energy are evaluated using the finite element package ANSYS 13.0. The study shows that debond affects the performance of the structure more severely than dent. The reduction in the fundamental frequencies due to the presence of a dent or debond is not significant for the cases considered, but debond reduces the buckling load factor significantly. A dent of size 8-20% of the core thickness shows a 13% reduction in the buckling load capacity of the sandwich column, whereas a debond of the same size reduces the buckling load capacity by about 90%. This underscores the importance of detecting these damages at the initiation stage itself, to avoid catastrophic failures. The influence of the damages on the fundamental frequencies, mode shapes and modal strain energy is examined, and the effectiveness of these parameters as a damage detection tool for sandwich structures is also assessed.
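
For orientation, the sketch below evaluates classical closed-form sandwich-beam estimates (thin-face flexural rigidity, core shear stiffness, fundamental cantilever frequency and shear-corrected buckling load) of the kind against which finite element results are often checked; the material and geometric values are assumed, not those of the study.

    # Classical sandwich cantilever estimates (illustrative, assumed properties).
    import math

    E_f, rho_f, t_f = 70e9, 2700.0, 0.5e-3   # face sheets: modulus (Pa), density (kg/m^3), thickness (m)
    G_c, rho_c, c   = 45e6, 80.0, 10e-3      # honeycomb core: shear modulus, density, thickness
    b, L = 0.05, 0.4                          # beam width and cantilever length (m)

    d = c + t_f                               # distance between face-sheet centroids
    D = E_f * b * t_f * d**2 / 2.0            # flexural rigidity (thin-face approximation)
    S = b * d**2 / c * G_c                    # core shear stiffness
    m = b * (2.0 * rho_f * t_f + rho_c * c)   # mass per unit length

    f1 = (1.875**2 / (2.0 * math.pi)) * math.sqrt(D / (m * L**4))   # fundamental frequency (Hz)
    P_E = math.pi**2 * D / (2.0 * L)**2       # Euler load of a fixed-free column
    P_cr = P_E / (1.0 + P_E / S)              # shear-corrected buckling load of the sandwich column
    print(f"f1 = {f1:.1f} Hz, P_cr = {P_cr/1e3:.2f} kN")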

Relevance:

10.00%

Publisher:

Abstract:

This is an attempt to understand the important factors that control the occurrence, development and hydrochemical evolution of groundwater resources in sedimentary multi-aquifer systems. The primary objective of this work is an integrated study of the hydrogeology and hydrochemistry, with a view to elucidating the hydrochemical evolution of groundwater resources in the aquifer systems. The study was taken up in a typical coastal sedimentary aquifer system evolved under a fluvio-marine environment in the coastal area of Kerala known as Kuttanad. The present study has been carried out to understand the aquifer systems of the Kuttanad area, their inter-relationships and their evolution. The multi-aquifer systems in the Kuttanad basin were formed from sediments deposited under fluvio-marine and fluvial depositional environments, and the marine transgressions and regressions of the geological past, together with palaeoclimatic conditions, influenced the hydrochemical environment in these aquifers. The evolution of the groundwater and the hydrochemical processes involved in the formation of the present-day water quality are elucidated from hydrochemical studies and from information derived from the aquifer geometry and hydraulic properties. The Kuttanad area comprises three types of aquifer systems, namely a phreatic aquifer underlain by a Recent confined aquifer, followed by Tertiary confined aquifers. These systems were formed by the deposition of sediments under fluvio-marine and fluvial environments. The study of the hydrochemical and hydraulic properties of the three aquifer systems proved that they are separate entities. The phreatic aquifers in the area have low hydraulic gradients and high rejected recharge. The Recent confined aquifer has very poor hydraulic characteristics, and recharge to this aquifer is very low. The Tertiary aquifer system is the most potential fresh water aquifer system in the area, and groundwater flow in this aquifer converges towards the central part of the study area (Alleppey town) due to large-scale pumping of water for water supply from this aquifer system. Mixing of waters and anthropogenic interferences are the dominant processes modifying the hydrochemistry of the phreatic aquifers, whereas leaching of salts and cation exchange are the dominant processes modifying the hydrochemistry of groundwater in the confined aquifer system of the Recent alluvium. Two significant chemical reactions modifying the hydrochemistry in the Recent aquifers are the oxidation of iron in ferruginous clays, which contributes hydrogen ions, and the decomposition of organic matter in the aquifer system, which consumes hydrogen ions. The hydrochemical environment is entirely different in the Tertiary aquifers, as the groundwater in this system consists of palaeo waters evolved during various marine transgressions and regressions, and these waters are being modified by the leaching of salts, cation exchange and chemical reactions under a strongly reducing environment. It is proved that the salinity observed in the groundwaters of the Tertiary aquifers is not due to seawater mixing or intrusion, but to the dissolution of salts from the clay formations and ion exchange processes. Fluoride contamination in this aquifer system lacks a regional pattern and is more or less site specific in nature. The lowering of piezometric heads in the Tertiary aquifer system has developed as a consequence of large-scale pumping over a long period. Hence, pumping from this aquifer system has to be regulated as a groundwater management strategy. Pumping from the Tertiary aquifers with high-capacity pumps leads to well failures and to the mixing of saline water from the brackish zones; such mixing zones are noticed in the hydrochemical studies. This is the major contamination of the Tertiary aquifer system and requires immediate attention. The use of pumps above 10 HP capacity in wells tapping the Tertiary aquifers should be discouraged for the sustainable development of these aquifers. The recharge areas need to be identified precisely so that the aquifer systems can be recharged through artificial means.

Relevance:

10.00%

Publisher:

Abstract:

Software systems are progressively being deployed in many facets of human life, and the failure of such systems has a varied impact on their customers. The fundamental aspect that supports a software system is a focus on quality. Reliability describes the ability of a system to function under a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier works focused on software reliability with no consideration of the hardware parts, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for the hardware and software components and building a model based on them to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modeling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an attempt to present an integrated model for the prediction of the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
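
One hedged way to picture such a combined hardware-software model, assuming a constant hardware failure rate in series with a Goel-Okumoto software reliability growth model (a choice made here only for illustration, not necessarily the model developed in the work), is sketched below.

    # Series combination of hardware and software reliability (illustrative assumptions).
    import math

    def hardware_reliability(mission_time, failure_rate):
        """R_hw(t) = exp(-lambda * t) for a constant hardware failure rate."""
        return math.exp(-failure_rate * mission_time)

    def software_reliability(mission_time, elapsed_test_time, a, b):
        """Goel-Okumoto NHPP: m(t) = a(1 - exp(-b t)); R(x | t) = exp(-[m(t + x) - m(t)])."""
        m = lambda t: a * (1.0 - math.exp(-b * t))
        return math.exp(-(m(elapsed_test_time + mission_time) - m(elapsed_test_time)))

    def system_reliability(mission_time, elapsed_test_time, failure_rate, a, b):
        """The computing system works only if both hardware and software work."""
        return (hardware_reliability(mission_time, failure_rate)
                * software_reliability(mission_time, elapsed_test_time, a, b))

    print(system_reliability(mission_time=100.0, elapsed_test_time=1000.0,
                             failure_rate=1e-4, a=100.0, b=5e-3))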