9 results for Analysis of multiple regression
at Cochin University of Science
Abstract:
This thesis is entitled "Modelling and Analysis of Recurrent Event Data with Multiple Causes". Survival data is a term used for data that measure the time to occurrence of an event. In survival studies, the time to occurrence of an event is generally referred to as the lifetime. Recurrent event data are commonly encountered in longitudinal studies in which individuals are followed to observe repeated occurrences of certain events. In many practical situations, the individuals under study are exposed to failure due to more than one cause, and the eventual failure can be attributed to exactly one of these causes. The proposed model is useful in real-life situations for studying the effect of covariates on recurrences of certain events due to different causes. In Chapter 3, an additive hazards model for the gap time distributions of recurrent event data with multiple causes is introduced, and the parameter estimation and asymptotic properties are discussed. In Chapter 4, a shared frailty model for the analysis of bivariate competing risks data is presented, and estimation procedures for the shared gamma frailty model, with and without covariates, using the EM algorithm are discussed. In Chapter 6, two nonparametric estimators for the bivariate survivor function of paired recurrent event data are developed; their asymptotic properties are studied, the proposed estimators are applied to a real-life data set, and simulation studies are carried out to assess their efficiency.
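As a rough, hedged sketch of the kind of specification Chapter 3 refers to (the notation here is purely illustrative; the thesis may define the model differently), an additive hazards model for the cause-specific hazard of a gap time could be written as:

```latex
% Illustrative notation only; the exact formulation in the thesis may differ.
% \lambda_{jk}(t \mid Z) : hazard of the j-th gap time failing due to cause k
% \lambda_{0k}(t)        : unspecified baseline hazard for cause k
% \beta_k                : regression coefficients, Z_j(t) time-dependent covariates
\[
  \lambda_{jk}\bigl(t \mid Z_j(t)\bigr) \;=\; \lambda_{0k}(t) \;+\; \beta_k^{\top} Z_j(t),
  \qquad k = 1, \dots, K .
\]
```

Under an additive specification of this kind the covariates shift the baseline hazard rather than scaling it, which is what distinguishes this family from proportional hazards models.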
Abstract:
A data centre is a centralized repository, either physical or virtual, for the storage, management and dissemination of data and information organized around a particular body, and it is the nerve centre of the present IT revolution. Data centres are expected to serve uninterruptedly round the year, and enabling them to perform their functions consumes enormous energy in the present scenario. Tremendous growth in demand from the IT industry has made it customary to develop newer technologies for the better operation of data centres. Energy conservation activities in data centres mainly concentrate on the air conditioning system, since it is the major mechanical sub-system and consumes a considerable share of the total power consumption of the data centre. The data centre energy metric is best represented by the power usage effectiveness (PUE), which is defined as the ratio of the total facility power to the IT equipment power. Its value will be greater than one, and a large value of PUE indicates that the sub-systems draw more power from the facility and that the performance of the data centre will be poor from the standpoint of energy conservation. PUE values of 1.4 to 1.6 are achievable by proper design and management techniques. Optimizing the air conditioning system brings an enormous opportunity to bring down the PUE value. The air conditioning system can be optimized by two approaches, namely thermal management and air flow management. Thermal management systems have now been introduced by some companies, but they are highly sophisticated and costly and have not received much attention in rule-of-thumb design practice.
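A minimal worked example of the PUE metric defined above; the power figures are hypothetical and are not taken from the thesis:

```python
# Hypothetical illustration of the PUE metric described above.
# PUE = total facility power / IT equipment power; values closer to 1 are better.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness of a data centre."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: 700 kW drawn by the whole facility,
# of which 500 kW reaches the IT equipment.
print(pue(700.0, 500.0))  # 1.4 -> within the 1.4-1.6 range cited as achievable
```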
Abstract:
This work identifies the importance of plenum pressure on the performance of the data centre. The present methodology followed in the industry considers the pressure drop across the tile as a dependent variable, but it is shown in this work that it is the single independent variable responsible for the entire flow dynamics in the data centre, and any design or assessment procedure must consider the pressure difference across the tile as the primary independent variable. This concept is further explained by studies on the effect of dampers on the flow characteristics. The dampers are found to introduce an additional pressure drop, thereby reducing the effective pressure drop across the tile. The effect of a damper is to change the flow in both quantitative and qualitative aspects, but only the quantitative effect is considered when the damper is used as an aid for capacity control. Results from the present study suggest that the use of dampers must be avoided in data centres and that well-designed tiles which give the required flow rates must be used in the appropriate locations. In the present study the effect of hot air recirculation is studied with suitable assumptions. It identifies that the pressure drop across the tile is the dominant parameter which governs the recirculation. The rack suction pressure of the hardware, along with the pressure drop across the tile, determines the point of recirculation in the cold aisle. The positioning of hardware in the racks plays an important role in controlling the recirculation point. The present study is thus helpful in the design of data centre air flow, based on the theory of jets. The air flow can be modelled both quantitatively and qualitatively based on the results.
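A minimal sketch of the relationship implied above, assuming the common orifice-flow model for a perforated floor tile; the discharge coefficient, tile open area and pressure values are hypothetical and are not taken from the thesis:

```python
import math

# Hedged sketch: flow delivered by a perforated tile, assuming the standard
# orifice-flow relation Q = Cd * A * sqrt(2 * dP / rho). All numbers are
# hypothetical; the thesis bases its analysis on the theory of jets.

RHO_AIR = 1.2  # kg/m^3

def tile_flow_m3s(delta_p_pa: float, open_area_m2: float = 0.09, cd: float = 0.6) -> float:
    """Volumetric flow through a tile for a given pressure drop across it."""
    return cd * open_area_m2 * math.sqrt(2.0 * max(delta_p_pa, 0.0) / RHO_AIR)

plenum_gauge_pressure = 20.0  # Pa, hypothetical plenum pressure
damper_pressure_drop = 8.0    # Pa, additional drop introduced by a damper

# A damper reduces the effective pressure drop across the tile, and hence the flow.
print(tile_flow_m3s(plenum_gauge_pressure))                         # without damper
print(tile_flow_m3s(plenum_gauge_pressure - damper_pressure_drop))  # with damper
```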
Abstract:
In the present study the effect of hot air recirculation is studied with suitable assumptions. It identifies that the pressure drop across the tile is the dominant parameter which governs the recirculation. The rack suction pressure of the hardware, along with the pressure drop across the tile, determines the point of recirculation in the cold aisle. The positioning of hardware in the racks plays an important role in controlling the recirculation point. The present study is thus helpful in the design of data centre air flow, based on the theory of jets. The air flow can be modelled both quantitatively and qualitatively based on the results.
Abstract:
A new approach, the multipole theory (MT) method, is presented for the computation of the cutoff wavenumbers of waveguides partially filled with dielectric. The MT formulation of the eigenvalue problem of an inhomogeneous waveguide is derived. Representative computational examples, including dielectric-rod-loaded rectangular and double-ridged waveguides, are given to validate the theory and to demonstrate the degree of its efficiency.
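For context only (this is not the multipole theory formulation itself), the cutoff wavenumbers of an empty rectangular waveguide have a closed form, which is the usual baseline against which dielectric-loaded results are checked. A minimal sketch with standard WR-90 dimensions chosen purely for illustration:

```python
import math

# Baseline check, not the MT method: closed-form cutoff wavenumbers of an
# EMPTY rectangular waveguide, k_c = sqrt((m*pi/a)^2 + (n*pi/b)^2).
# Dielectric loading shifts these values, which is what the MT method computes.

def cutoff_wavenumber(m: int, n: int, a: float, b: float) -> float:
    """Cutoff wavenumber (rad/m) of the TE_mn / TM_mn mode of an a x b guide."""
    return math.sqrt((m * math.pi / a) ** 2 + (n * math.pi / b) ** 2)

# WR-90 dimensions (22.86 mm x 10.16 mm) used as an illustrative example.
a, b = 22.86e-3, 10.16e-3
print(cutoff_wavenumber(1, 0, a, b))  # dominant TE10 mode
print(cutoff_wavenumber(2, 0, a, b))  # TE20 mode
```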
Abstract:
This thesis describes the development and analysis of an Isosceles Trapezoidal Dielectric Resonator Antenna (ITDRA) realized with different DR orientations and suitable feed configurations, enabling it to be used for multiband, dual-band dual-polarized and wideband applications. The motivation for this work has been the need for a compact, highly efficient, low-cost antenna suitable for multiband application, dual-band dual-polarized operation and broadband operation, with the possibility of use with MICs, so as to ensure less expensive, more efficient and higher quality wireless communication systems. To satisfy these challenging demands, a novel-shaped Dielectric Resonator (DR) is fabricated and investigated for the required properties by trying out different orientations of the DR on a simple microstrip feed, as well as with a slotted ground plane. The thesis initially discusses and evaluates recent and past developments within the microwave industry on this topic through a concise review of the literature. Then the theoretical aspects of the DRA and different feeding techniques are described. Following this, the fabrication and characterization of the DRA are explained. To achieve the desired requirements, both simulations and experimental measurements were undertaken. A 3-D finite element method (FEM) electromagnetic simulation tool, HFSS™ by Agilent, is used to determine the optimum geometry of the dielectric resonator; it was found to be useful in producing approximate results, although it had some limitations. A numerical analysis technique, the finite difference time domain (FDTD) method, is used at the end for validating the results of the wideband design. MATLAB is used for modelling the ITDR and implementing the FDTD analysis. In conclusion, this work offers a new, efficient and relatively simple alternative for antennas to be used for multiple requirements in wireless communication systems.
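The thesis implements its FDTD analysis in MATLAB; purely as an illustration of the leapfrog update underlying the method, here is a minimal 1-D sketch in Python. The grid size, time steps and source are hypothetical, and the actual antenna analysis requires a 3-D formulation with the ITDRA geometry and absorbing boundaries:

```python
import numpy as np

# Minimal 1-D FDTD sketch (normalized units) of the leapfrog E/H update.
# Illustrative only; not the thesis' 3-D MATLAB implementation.

nz, nt = 200, 400      # hypothetical grid size and number of time steps
ez = np.zeros(nz)      # electric field samples
hy = np.zeros(nz - 1)  # magnetic field samples on the staggered grid
courant = 0.5          # normalized Courant factor (stable for 1-D)

for t in range(nt):
    # Update H from the curl of E, then E from the curl of H (leapfrog in time).
    hy += courant * (ez[1:] - ez[:-1])
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Soft Gaussian source injected near the left edge of the grid.
    ez[10] += np.exp(-((t - 60) / 15.0) ** 2)

print(float(np.max(np.abs(ez))))  # crude check that the field has evolved
```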
Abstract:
Warships are generally sleek and slender, with V-shaped sections and block coefficients below 0.5, compared to the fuller forms and higher values of commercial ships. They normally operate in the higher Froude number regime, and the hydrodynamic design is primarily aimed at achieving higher speeds with the minimum power. Therefore the structural design and analysis methods are different from those for commercial ships. Certain design guidelines have been given in documents like the Naval Engineering Standards, and one of the new developments in this regard is the introduction of classification society rules for the design of warships. The marine environment imposes subjective and objective uncertainties on the ship structure, and the uncertainties in loads, material properties, etc. make reliable prediction of ship structural response a difficult task. Strength, stiffness and durability criteria for warship structures can be established by investigations on elastic analysis, ultimate strength analysis and reliability analysis. For the analysis of complicated warship structures, special means and valid approximations are required. Preliminary structural design of a frigate-size ship has been carried out. A finite element model of the hold, representative of the complexities in the geometric configuration, has been created using the finite element software NISA. Two other models representing the geometry to a limited extent have also been created: one with two transverse frames and the attached plating along with the longitudinal members, and the other representing the plating and longitudinal stiffeners between two transverse frames. Linear static analysis of the three models has been carried out, each with three different boundary conditions. The structural responses have been checked for deflections and stresses against the permissible values, and the structure has been found adequate in all the cases. The stresses and deflections predicted by the frame model are comparable with those of the hold model, but no such comparison has been realized between the inter-stiffener plating model and the other two models. Progressive collapse analyses of the models have been conducted for the three boundary conditions, considering geometric nonlinearity and then combined geometric and material nonlinearity for the hold and frame models. The von Mises-Ilyushin yield criterion with an elastic-perfectly plastic stress-strain curve has been chosen. In each case, P-Delta curves have been generated and the ultimate load causing failure (ultimate load factor) has been identified as a multiple of the design load specified by NES. Reliability analysis of the hull module under combined geometric and material nonlinearities has been conducted. The Young's modulus and the shell thickness have been chosen as the random variables, and randomly generated values have been used in the analysis. The First Order Second Moment method has been used to predict the reliability index and, thereafter, the probability of failure. The values have been compared against standard values published in the literature.
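A minimal sketch of the First Order Second Moment calculation mentioned above, assuming a simple linear limit state of resistance minus load effect; the means and standard deviations are hypothetical, whereas the thesis derives its statistics from the nonlinear finite element results:

```python
import math
from statistics import NormalDist

# Hedged sketch of a First Order Second Moment (FOSM) reliability estimate for
# a linear limit state g = R - S (resistance minus load effect).
# The statistics below are hypothetical, not taken from the thesis.

def fosm_reliability(mu_r: float, sigma_r: float, mu_s: float, sigma_s: float):
    """Return (reliability index beta, probability of failure Pf)."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    pf = NormalDist().cdf(-beta)
    return beta, pf

beta, pf = fosm_reliability(mu_r=1.8, sigma_r=0.2, mu_s=1.0, sigma_s=0.15)
print(beta, pf)  # beta = 3.2, Pf ~ 7e-4 for these illustrative values
```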
Abstract:
Occupational stress is becoming a major issue on both the corporate and the social agenda. In industrialized countries, there have been quite dramatic changes in the conditions at work during the last decade, caused by economic, social and technical development. As a consequence, people at work today are exposed to high quantitative and qualitative demands as well as hard competition caused by the global economy. A recent report says that ailments due to work-related stress are likely to cost India's exchequer around 72,000 crores between 2009 and 2015. Though India is a fast-developing country, it is yet to create facilities to mitigate the adverse effects of work stress; moreover, only limited efforts have been made to assess work-related stress. In the absence of well-defined standards to assess work-related stress in India, an attempt is made in this direction to develop factors for the evaluation of work stress. Accordingly, with the help of the existing literature and in consultation with safety experts, seven factors for the evaluation of work stress are developed. An instrument (questionnaire) was developed using these seven factors for the evaluation of work stress. The validity and unidimensionality of the questionnaire were ensured by confirmatory factor analysis, and its reliability was ensured before administration. While analysing the relationship between the variables, it is noted that no relationship exists between them, and hence the above factors are treated as independent factors/variables for the purpose of the research. Initially, five profit-making manufacturing industries under the public sector in the state of Kerala were selected for the study, and the influence of the factors responsible for work stress is analysed in these industries. These industries were classified into two types, namely chemical and heavy engineering, based on the product manufactured and the work environment, and the analysis is further carried out for these two categories. The variation of work stress with the age, designation and experience of the employees is analysed by means of one-way ANOVA. Further, three different types of work stress models, namely factor models, structural equation models and multinomial logistic regression models, were developed to analyse the association of the factors responsible for work stress; all these models are found to be equally good in predicting work stress. The present study indicates that work stress exists among employees in public sector industries in Kerala. Employees belonging to the 40-45 years age group and the 15-20 years experience group had relatively higher work demand, low job control and low support at work. Low job control was noted among lower designation levels, particularly at the worker level in these industries. Hence the instrument developed using the seven factors, namely demand, control, manager support, peer support, relationship, role and change, can be effectively used for the evaluation of work stress in industries.
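A minimal sketch of the one-way ANOVA comparison described above, using hypothetical stress scores for three illustrative age groups; the real analysis uses the questionnaire data collected in the study:

```python
from scipy import stats

# Hedged sketch of the one-way ANOVA used to compare mean work-stress scores
# across groups. The scores below are hypothetical, not the study's data.
group_30_35 = [2.1, 2.4, 2.0, 2.6, 2.3]
group_40_45 = [3.1, 3.4, 2.9, 3.6, 3.2]
group_50_55 = [2.5, 2.7, 2.6, 2.9, 2.4]

f_stat, p_value = stats.f_oneway(group_30_35, group_40_45, group_50_55)
print(f_stat, p_value)  # a small p-value suggests mean stress differs across groups
```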
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, and primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics, and its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also in that of Haavelmo, from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace and Gauss, the discipline of econometrics later witnessed the applied work of Edgeworth and Mitchell. A very significant milestone in its evolution has been the work of Tinbergen, Frisch and Haavelmo in their development of multiple regression and correlation analysis; they used these techniques to test different economic theories using time series data. In spite of the fact that some predictions based on econometric methodology might have gone wrong, the sound scientific nature of the discipline cannot be ignored. This is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning for the various inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the interdisciplinary nature of econometrics (a unification of economics, statistics and mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often only Economics students are offered the subject, as those of other disciplines might not have an adequate economics background to understand it. In fact, econometrics is quite relevant even for technical courses (like engineering), business management courses (like the MBA) and professional accountancy courses, and it is more relevant still for research students of the various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy, etc. Accordingly, the analytical ability of students can be sharpened and their ability to look into socio-economic problems with a mathematical approach can be improved, enabling them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, needs to be pointed out here. Mere learning of the econometric methodology or the underlying theories alone would not have much practical utility for students in their future careers, whether in academics, industry or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics, and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
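As a minimal illustration of the multiple regression technique referred to above, fitted by ordinary least squares; the data are simulated purely for illustration and have no connection with the paper:

```python
import numpy as np

# Hedged sketch: ordinary least squares for a multiple regression
# y = b0 + b1*x1 + b2*x2, solved with numpy's least-squares routine.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.5 + 2.0 * x1 - 0.7 * x2 + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), x1, x2])     # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # estimated coefficients
print(beta)  # should be close to the true values [1.5, 2.0, -0.7]
```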