827 results for Causal inference
Abstract:
In this paper, we describe our investigation of the cointegration and causal relationships between energy consumption and economic output in Australia over a period of five decades. The framework used in this paper is the single-sector aggregate production function, which is the first comprehensive approach used in an Australian study of this type to include energy, capital and labour as separate inputs of production. The empirical evidence points to a cointegration relationship between energy and output and implies that energy is an important variable in the cointegration space, as are conventional inputs capital and labour. We also find some evidence of bidirectional causality between GDP and energy use. Although the evidence of causality from energy use to GDP was relatively weak when using the thermal aggregate of energy use, once energy consumption was adjusted for energy quality, we found strong evidence of Granger causality from energy use to GDP in Australia over the investigated period. The results are robust, irrespective of the assumptions of linear trends in the cointegration models, and are applicable for different econometric approaches.
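As an illustrative aside, the Granger-causality logic invoked in this abstract can be sketched as a restricted-versus-unrestricted regression comparison: does adding lagged energy use improve the forecast of output? The sketch below is a minimal one-lag, bivariate version on synthetic data, not the paper's multivariate cointegration framework; every variable and setting is an assumption for illustration.

```python
# Minimal one-lag Granger-causality F-test on synthetic data (illustrative
# only; the paper works in a multivariate cointegrated VAR with capital
# and labour as additional inputs).
import random

def ols_rss(X, y):
    """Residual sum of squares from OLS via normal equations (Gaussian elimination)."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    c = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):                       # forward elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for r in range(k):
                A[q][r] -= f * A[p][r]
            c[q] -= f * c[p]
    b = [0.0] * k                            # back substitution
    for p in range(k - 1, -1, -1):
        b[p] = (c[p] - sum(A[p][q] * b[q] for q in range(p + 1, k))) / A[p][p]
    return sum((y[i] - sum(X[i][q] * b[q] for q in range(k))) ** 2 for i in range(n))

def granger_f(x, y):
    """F-statistic: does lagged x improve a one-lag autoregression of y?"""
    n = len(y) - 1
    Xr = [[1.0, y[t]] for t in range(n)]          # restricted: y_t ~ y_{t-1}
    Xu = [[1.0, y[t], x[t]] for t in range(n)]    # unrestricted: adds x_{t-1}
    yt = [y[t + 1] for t in range(n)]
    rss_r, rss_u = ols_rss(Xr, yt), ols_rss(Xu, yt)
    return (rss_r - rss_u) / (rss_u / (n - 3))    # 1 restriction, n - 3 df

random.seed(1)
x = [random.gauss(0, 1) for _ in range(200)]
y = [0.0]
for t in range(199):
    y.append(0.5 * y[-1] + 0.8 * x[t] + random.gauss(0, 0.1))
print(round(granger_f(x, y)))  # a large F suggests x Granger-causes y
```

In the full VAR setting the same idea extends to several lags and additional regressors, with the null hypothesis that all lagged-energy coefficients in the output equation are jointly zero.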
Abstract:
The problem of ‘wet litter’, which occurs primarily in grow-out sheds for meat chickens (broilers), has been recognised for nearly a century. Nevertheless, it is an increasingly important problem in contemporary chicken-meat production, as wet litter and associated conditions, especially footpad dermatitis, have developed into tangible welfare issues. This is only compounded by the market demand for chicken paws and by compromised bird performance. This review considers the multidimensional causal factors of wet litter. While many causal factors can be listed, it is evident that the critical ones could be described as micro-environmental factors, chief amongst them proper management of drinking systems and adequate shed ventilation. Thus, this review focuses on these environmental factors and pays less attention to issues stemming from health and nutrition. Clearly, there are times when the related avian health issues of coccidiosis and necrotic enteritis cannot be overlooked, and the development of efficacious vaccines for the latter disease would be advantageous. Presently, the inclusion of phytate-degrading enzymes in meat chicken diets is routine and, therefore, the implication that exogenous phytases may contribute to wet litter is given consideration. Opinion is somewhat divided as to how best to counter the problem of wet litter, as some see education and extension as being more beneficial than furthering research efforts. However, it may prove instructive to assess the practice of whole grain feeding in relation to litter quality and the incidence of footpad dermatitis. Additional research could investigate the relationships between dietary concentrations of key minerals and the application of exogenous enzymes with litter quality.
Abstract:
The paper presents an innovative approach to modelling the causal relationships of human errors in rail crack incidents (RCI) from a managerial perspective. A Bayesian belief network is developed to model RCI by considering the human errors of designers, manufacturers, operators and maintainers (DMOM) and the causal relationships involved. A set of dependent variables whose combinations express the relevant functions performed by each DMOM participant is used to model the causal relationships. A total of 14 RCI on Hong Kong’s mass transit railway (MTR) from 2008 to 2011 are used to illustrate the application of the model. Bayesian inference is used to conduct an importance analysis to assess the impact of the participants’ errors. Sensitivity analysis is then employed to gauge the effect of an increased probability of occurrence of human errors on RCI. Finally, strategies for human error identification and mitigation of RCI are proposed. The identification of the maintainer’s ability in the case study as the most important factor influencing the probability of RCI implies a priority need to strengthen the maintenance management of the MTR system and suggests that improving the inspection ability of the maintainer is likely to be an effective strategy for RCI risk mitigation.
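A hedged sketch of the kind of Bayesian-network computation this abstract describes: the posterior probability of a participant's error given an observed incident, computed by enumeration over parent configurations. The two-parent structure and all probabilities below are invented for illustration; they are not the paper's fitted network.

```python
# Toy Bayesian-network inference by enumeration for a rail-crack model
# (hypothetical probabilities; the paper's BBN covers all DMOM participants).
from itertools import product

p_m = 0.10           # P(maintainer error), assumed prior
p_o = 0.05           # P(operator error), assumed prior
p_crack = {          # P(crack | maintainer error, operator error), assumed CPT
    (1, 1): 0.9, (1, 0): 0.6, (0, 1): 0.3, (0, 0): 0.01,
}

def posterior_maintainer_given_crack():
    """P(maintainer error | crack observed), summing over all parent states."""
    joint = {}
    for m, o in product((0, 1), repeat=2):
        pm = p_m if m else 1 - p_m
        po = p_o if o else 1 - p_o
        joint[(m, o)] = pm * po * p_crack[(m, o)]
    evidence = sum(joint.values())                       # P(crack)
    return sum(v for (m, _), v in joint.items() if m == 1) / evidence

print(round(posterior_maintainer_given_crack(), 3))  # → 0.736
```

The same enumeration, repeated while perturbing one prior at a time, is the essence of the sensitivity analysis the abstract mentions.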
Abstract:
The goal of this study was to examine the role of organizational causal attribution in understanding the relation of work stressors (work-role overload, excessive role responsibility, and unpleasant physical environment) and personal resources (social support and cognitive coping) to such organizational-attitudinal outcomes as work engagement, turnover intention, and organizational identification. In some analyses, cognitive coping was also treated as an organizational outcome. Causal attribution was conceptualized in terms of four dimensions: internality-externality (attributing the cause of one’s successes and failures to oneself, as opposed to external factors), stability (thinking that the cause of one’s successes and failures is stable over time), globality (perceiving the cause to be operative in many areas of one’s life), and controllability (believing that one can control the causes of one’s successes and failures). Several hypotheses were derived from Karasek’s (1989) Job Demands–Control (JD-C) model and from the Job Demands–Resources (JD-R) model (Demerouti, Bakker, Nachreiner & Schaufeli, 2001). Based on the JD-C model, a number of moderation effects were predicted, stating that the strength of the association of work stressors with the outcome variables (e.g. turnover intentions) varies as a function of the causal attribution; for example, an unpleasant work environment is more strongly associated with turnover intention among those with an external locus of causality than among those with an internal locus of causality. From the JD-R model, a number of hypotheses on the mediation model were derived.
They were based on two processes posited by the model: an energy-draining process in which work stressors, along with a mediating effect of causal attribution for failures, deplete the nurses’ energy, leading to turnover intention, and a motivational process in which personal resources, along with a mediating effect of causal attribution for successes, foster the nurses’ engagement in their work, leading to higher organizational identification and to decreased intention to leave the nursing job. For instance, it was expected that the relationship between work stressors and turnover intention could be explained (mediated) by a tendency to attribute one’s work failures to stable causes. The data were collected from Finnish hospital nurses using e-questionnaires. Overall, 934 nurses responded to the questionnaires. Work stressors and personal resources were measured by five scales derived from the Occupational Stress Inventory-Revised (Osipow, 1998). Causal attribution was measured using the Occupational Attributional Style Questionnaire (Furnham, 2004). Work engagement was assessed through the Utrecht Work Engagement Scale (Schaufeli et al., 2002), turnover intention by the Van Veldhoven & Meijman (1994) scale, and organizational identification by the Mael & Ashforth (1992) measure. The results provided support for the function of causal attribution in the overall work stress process. Findings related to the moderation model can be summarized in three main results. First, external locus of causality, along with job level, moderated the relationship between work overload and cognitive coping: this interaction was evidenced only among nurses in non-supervisory positions. Second, external locus of causality and job level together moderated the relationship between physical environment and turnover intention.
An opposite pattern was found for this interaction: among nurses, externality exacerbated the effect of perceived unpleasantness of the physical environment on turnover intention, whereas among supervisors internality produced the same effect. Third, job level also revealed a moderating effect of controllability attribution on the relationship between physical environment and cognitive coping. Findings related to the mediation model for the energetic process indicated that the partial model, in which work stressors also have a direct effect on turnover intention, fitted the data better. In the mediation model for the motivational process, an intermediate mediation effect, in which the effects of personal resources on turnover intention went through two mediators (i.e., causal dimensions and organizational identification), fitted the data better. All dimensions of causal attribution appeared to follow a somewhat unique pattern of mediation effects, not only for the energetic but also for the motivational process. Overall, findings on the mediation models partly supported the two simultaneous underlying processes proposed by the JD-R model. While in the energetic process the dimension of externality partially mediated the relationship between stressors and turnover, all the dimensions of causal attribution appeared to entail significant mediator effects in the motivational process. The general findings supported the moderating and mediating effects of causal attribution in the work stress process. The study contributes to several research traditions, including the interaction approach and the JD-C and JD-R models. However, many potential functions of organizational causal attribution are yet to be evaluated by relevant academic and organizational research.
Keywords: organizational causal attribution, optimistic/pessimistic attributional style, work stressors, organizational stress process, stressors in the nursing profession, hospital nursing, JD-R model, personal resources, turnover intention, work engagement, organizational identification.
Abstract:
In this thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is directly based on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small sample correction for the likelihood ratio (LR) test of cointegrating rank with the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock price data, and the finite sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden 1970–2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen’s LR tests for cointegration. In all papers we work with two data sets. The first data set is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)–2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace’s Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace’s inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly; this was depicted by assuming a priori distributions for the parameters. Laplace’s inference model dominated statistical thinking for a century. Sample selection in Laplace’s investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method to draw samples. Its idea was that the sample would be a miniature of the population, and it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer’s method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations; it was based on Laplace’s inference model. R. A. Fisher’s contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population, under the assumption that population parameters are constants. Fisher’s theory did not include a priori probabilities. Jerzy Neyman adopted Fisher’s inference model and applied it to finite populations, with the difference that Neyman’s inference model does not include any assumptions about the distributions of the study variables. Applying Fisher’s fiducial argument, he developed the theory of confidence intervals. Neyman’s last contribution to survey sampling presented a theory for double sampling. This gave the central idea for statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
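The confidence-interval theory for finite populations mentioned in this abstract can be illustrated with a short sketch: a Neyman-style interval for a population mean under simple random sampling without replacement, using the finite-population correction. The population and sample size below are synthetic assumptions.

```python
# Confidence interval for a finite-population mean under simple random
# sampling without replacement (illustrative synthetic population).
import math
import random

def srs_confidence_interval(sample, N, z=1.96):
    """Approximate 95% CI with the finite-population correction (1 - n/N)."""
    n = len(sample)
    mean = sum(sample) / n
    s2 = sum((x - mean) ** 2 for x in sample) / (n - 1)   # sample variance
    se = math.sqrt((1 - n / N) * s2 / n)                  # fpc-adjusted SE
    return mean - z * se, mean + z * se

random.seed(7)
population = [random.gauss(50, 10) for _ in range(10_000)]
sample = random.sample(population, 400)
lo, hi = srs_confidence_interval(sample, len(population))
print(round(lo, 2), round(hi, 2))
```

Note how the correction factor (1 - n/N) shrinks the interval as the sample becomes a larger fraction of the finite population, vanishing entirely when n = N.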
Abstract:
Our ability to infer protein quaternary structure automatically from atom and lattice information is inadequate, especially for weak complexes and heteromeric quaternary structures. Several approaches exist, but they have limited performance. Here, we present a new scheme to infer protein quaternary structure from lattice and protein information, with all-around coverage for strong, weak and very weak affinity homomeric and heteromeric complexes. The scheme combines a naive Bayes classifier and point group symmetry under a Boolean framework to detect quaternary structures in the crystal lattice. It consistently produces >= 90% coverage across diverse benchmarking data sets, including a notably superior 95% coverage for recognition of heteromeric complexes, compared with 53% on the same data set by the current state-of-the-art method. A detailed study of a limited number of prediction-failed cases offers interesting insights into the intriguing nature of protein contacts in the lattice. The findings have implications for accurate inference of the quaternary states of proteins, especially weak affinity complexes.
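A minimal sketch of the naive Bayes ingredient named in this abstract, applied to a toy version of the task (deciding whether a lattice contact is biological or crystal packing). The features, training counts, and labels are invented for illustration; the authors' scheme additionally combines point group symmetry under a Boolean framework.

```python
# Toy naive Bayes classifier for lattice contacts (invented training data).
# Features: coarse interface-area class and a binary conservation flag.
train = [
    ("large", 1, "bio"), ("large", 1, "bio"), ("medium", 1, "bio"),
    ("large", 0, "bio"), ("small", 0, "xtal"), ("small", 0, "xtal"),
    ("medium", 0, "xtal"), ("small", 1, "xtal"),
]

def nb_posterior(area, cons):
    """P(label | features) with Laplace smoothing, assuming feature independence."""
    scores = {}
    for lab in ("bio", "xtal"):
        rows = [r for r in train if r[2] == lab]
        prior = len(rows) / len(train)
        p_area = (sum(r[0] == area for r in rows) + 1) / (len(rows) + 3)  # 3 area classes
        p_cons = (sum(r[1] == cons for r in rows) + 1) / (len(rows) + 2)  # 2 flag values
        scores[lab] = prior * p_area * p_cons
    total = sum(scores.values())
    return {lab: s / total for lab, s in scores.items()}

post = nb_posterior("large", 1)
print(max(post, key=post.get))  # → bio
```

A large, conserved interface is scored as biological; the smoothing keeps unseen feature combinations from collapsing a class probability to zero.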
Abstract:
In this paper, we give a method for probabilistic assignment in the Realistic Abductive Reasoning Model. The knowledge is assumed to be represented in the form of causal chaining, namely a hyper-bipartite network. The hyper-bipartite network is the most generalized form of knowledge representation for which, so far, there has been no way of assigning probability to the explanations. First, the inference mechanism using the realistic abductive reasoning model is briefly described, and then probability is assigned to each of the explanations so as to pick out the explanations in decreasing order of plausibility.
Abstract:
Satisfiability algorithms for propositional logic have improved enormously in recent years. This improvement increases the attractiveness of satisfiability methods for first-order logic that reduce the problem to a series of ground-level satisfiability problems. R. Jeroslow introduced a partial instantiation method of this kind that differs radically from the standard resolution-based methods. This paper lays the theoretical groundwork for an extension of his method that is general enough and efficient enough for general logic programming with indefinite clauses. In particular, we improve Jeroslow's approach by (1) extending it to logic with functions, (2) accelerating it through the use of satisfiers, as introduced by Gallo and Rago, and (3) simplifying it to obtain further speedup. We provide a similar development for a "dual" partial instantiation approach defined by Hooker and suggest a primal-dual strategy. We prove correctness of the primal and dual algorithms for full first-order logic with functions, as well as termination on unsatisfiable formulas. We also report some preliminary computational results.
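The ground-level satisfiability problems these methods reduce to can be illustrated with a toy sketch: instantiate a universally quantified clause over a finite domain, then test the resulting propositional problem by brute force. Real partial instantiation methods ground lazily and use far better solvers; the clauses and domain below are invented for illustration.

```python
# Ground a first-order clause over a finite domain, then brute-force the
# resulting propositional satisfiability problem (toy illustration).
from itertools import product

# Theory: for all X: ~p(X) v q(X); plus facts p(a) and ~q(b).
domain = ["a", "b"]
# A literal is (predicate, constant, sign); sign True = positive literal.
clauses = [[("p", x, False), ("q", x, True)] for x in domain]  # ~p(X) v q(X)
clauses += [[("p", "a", True)], [("q", "b", False)]]

atoms = sorted({(pred, c) for cl in clauses for pred, c, _ in cl})

def satisfiable(clauses):
    """Try every truth assignment to the ground atoms."""
    for values in product([False, True], repeat=len(atoms)):
        model = dict(zip(atoms, values))
        if all(any(model[(p, c)] == sign for p, c, sign in cl) for cl in clauses):
            return True
    return False

print(satisfiable(clauses))                             # → True
print(satisfiable(clauses + [[("q", "a", False)]]))     # adding ~q(a) → False
```

Adding ~q(a) is unsatisfiable because p(a) and ~p(a) v q(a) together force q(a) to be true.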
Abstract:
Prediction of variable bit rate compressed video traffic is critical to dynamic allocation of resources in a network. In this paper, we propose a technique for preprocessing the dataset used for training a video traffic predictor. The technique involves identifying the noisy instances in the data using a fuzzy inference system. We focus on three prediction techniques, namely linear regression, neural networks and support vector regression, and analyze their performance on H.264 video traces. Our experimental results reveal that data preprocessing greatly improves the performance of linear regression and the neural network, but is not effective for support vector regression.
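A hedged sketch of the overall idea: train a one-step linear predictor on a synthetic frame-size trace, flag high-residual training instances (a crude stand-in for the paper's fuzzy inference system), and refit on the filtered data. The trace model, injected noise, and threshold are all assumptions for illustration.

```python
# One-step linear frame-size predictor with a crude noise filter standing in
# for the fuzzy inference system (synthetic AR(1)-style trace).
import random

def fit_line(xs, ys):
    """Closed-form least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

random.seed(3)
trace = [1000.0]
for _ in range(300):
    trace.append(0.9 * trace[-1] + 100 + random.gauss(0, 20))
noisy = trace[:]
for i in range(50, 300, 25):      # inject corrupted training instances
    noisy[i] += 2000

xs, ys = noisy[:-1], noisy[1:]
a, b = fit_line(xs, ys)           # fit on contaminated data
# Filter: drop the ~10% of instances with the largest residuals.
res = [abs(y - (a + b * x)) for x, y in zip(xs, ys)]
cut = sorted(res)[int(0.9 * len(res))]
keep = [(x, y) for x, y, r in zip(xs, ys, res) if r <= cut]
a2, b2 = fit_line([x for x, _ in keep], [y for _, y in keep])

def mse(a, b):
    """Prediction error on the clean underlying trace."""
    return sum((y - (a + b * x)) ** 2
               for x, y in zip(trace[:-1], trace[1:])) / (len(trace) - 1)

print(mse(a2, b2) < mse(a, b))    # filtering should help on the clean trace
```

The refit after filtering should track the true dynamics more closely; the paper's fuzzy system plays the role of the residual threshold here, assigning each instance a graded noisiness score instead of a hard cut.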
Abstract:
Effective network overload alleviation is essential for maintaining security and integrity from the operational viewpoint of deregulated power systems. This paper aims at developing a methodology to reschedule active power generation from the sources in order to manage network congestion under normal/contingency conditions. An effective method has been proposed using a fuzzy rule-based inference system. The virtual flows concept, which provides the partial contributions/counter flows in the network elements, is used as a basis in the proposed method to manage network congestion to the extent possible. The proposed method is illustrated on a sample 6-bus test system and on the modified IEEE 39-bus system.
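A toy sketch of a fuzzy rule-based inference step of the kind this abstract describes: map a line's loading to an amount of generation to reschedule, using triangular memberships and weighted-average defuzzification. The membership breakpoints, rule outputs, and MW figures are illustrative assumptions, not the paper's rule base.

```python
# Toy fuzzy rule-based step for congestion relief (invented rules/numbers).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reschedule_amount(loading):
    """Map line loading (% of limit) to MW of generation to back off."""
    # Rule strengths: IF loading is {normal, high, critical} ...
    normal = tri(loading, 0, 60, 90)
    high = tri(loading, 80, 100, 120)
    critical = tri(loading, 110, 140, 170)
    # ... THEN reduce {0, 20, 50} MW; weighted-average defuzzification.
    w = normal + high + critical
    if w == 0:
        return 0.0
    return (normal * 0 + high * 20 + critical * 50) / w

print(reschedule_amount(115))  # → 32.0 (between the "high" and "critical" rules)
print(reschedule_amount(50))   # → 0.0 (fully "normal" loading)
```

In the proposed method the inputs would come from the virtual-flows analysis (each generator's partial contribution to the overloaded element), so the rescheduling burden falls on the generators that actually load the congested line.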
Abstract:
In this paper, we consider inference for the component and system lifetime distribution of a k-unit parallel system with independent components, based on system data. The components are assumed to have identical Weibull distributions. We obtain the maximum likelihood estimates of the unknown parameters based on system data, and the Fisher information matrix is derived. We propose the β-expectation tolerance interval and the β-content γ-level tolerance interval for the life distribution of the system. The performance of the estimators and tolerance intervals is investigated via a simulation study. A simulated dataset is analyzed for illustration.