902 results for indirect causality


Relevance:

70.00%

Publisher:

Abstract:

We propose methods for testing hypotheses of non-causality at various horizons, as defined in Dufour and Renault (1998, Econometrica). We study in detail the case of VAR models and we propose linear methods based on running vector autoregressions at different horizons. While the hypotheses considered are nonlinear, the proposed methods only require linear regression techniques as well as standard Gaussian asymptotic distributional theory. Bootstrap procedures are also considered. For the case of integrated processes, we propose extended regression methods that avoid nonstandard asymptotics. The methods are applied to a VAR model of the U.S. economy.
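A hedged Python sketch of the horizon-h projection idea described above, using simulated data; this is not the authors' exact procedure (the bootstrap and the extended regressions for integrated processes are omitted) and all names are illustrative. It regresses y at time t+h on time-t lags of both variables and Wald-tests the zero restrictions on the candidate cause's coefficients.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Toy bivariate system in which x causes y with a one-period delay.
T, p, h = 400, 2, 2
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def horizon_h_test(y, x, p, h):
    """Regress y_{t+h} on p lags of y and of x dated t, ..., t-p+1 and Wald-test the x block."""
    rows, target = [], []
    for t in range(p - 1, len(y) - h):
        rows.append([y[t - j] for j in range(p)] + [x[t - j] for j in range(p)])
        target.append(y[t + h])
    X = sm.add_constant(np.array(rows))
    fit = sm.OLS(np.array(target), X).fit(cov_type="HAC", cov_kwds={"maxlags": h})
    R = np.zeros((p, X.shape[1]))          # zero restrictions on the x-lag coefficients
    for i in range(p):
        R[i, 1 + p + i] = 1.0
    return fit.wald_test(R)

print(horizon_h_test(y, x, p=2, h=2))      # small p-value: x helps predict y two steps ahead

Only the hypothesis changes with the horizon; the estimator remains ordinary least squares with a standard Wald (or F) statistic, which is the kind of linear machinery the abstract refers to.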

Relevance:

60.00%

Publisher:

Abstract:

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECMs assume non-zero entries in all their coefficient matrices. However, applications of VECMs to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECMs may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECMs may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies demonstrate the usefulness of the algorithm, in tests of purchasing power parity and in a three-variable system involving the stock market.
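A loose Python sketch, not the paper's ZNZ algorithm: it fits a full-order VECM to simulated I(1) data with statsmodels and naively flags short-run coefficients with small t-ratios as candidate zero entries, simply to illustrate why patterned (zero-non-zero) specifications arise. The paper's formal selection procedure and its treatment of I(2) series are not reproduced.

import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(2)

# Simulated cointegrated trivariate system: a common stochastic trend plus noise,
# and one unrelated stationary series.
T = 500
trend = np.cumsum(rng.standard_normal(T))
data = np.column_stack([
    trend + rng.standard_normal(T),
    0.5 * trend + rng.standard_normal(T),
    rng.standard_normal(T),
])

res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()

# Naive screen for candidate zero entries in the short-run (Gamma) coefficients.
tratios = res.gamma / res.stderr_gamma
print("loading vectors (alpha):\n", res.alpha)
print("cointegrating vectors (beta):\n", res.beta)
print("short-run coefficients with |t| < 2 (candidate zeros):\n", np.abs(tratios) < 2)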

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a method for diagnosing the impacts of second-home tourism and illustrates it for a Mediterranean Spanish destination. The method applies network analysis software to causal maps in order to create a causal network model based on stakeholder-identified impacts. The main innovation is the analysis of indirect relations in the causal maps to identify the most influential nodes in the model. The results show that the most influential nodes are of a political nature, which contradicts previous diagnoses identifying technical planning as the ultimate cause of problems.
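The abstract does not name the network software or the influence index, so the sketch below uses networkx on an invented stakeholder causal map and two generic measures (downstream reachability and betweenness centrality), purely to illustrate ranking nodes by their direct and indirect relations.

import networkx as nx

causal_links = [                              # illustrative cause -> effect links, not the study's data
    ("land-use policy", "second-home construction"),
    ("technical planning", "second-home construction"),
    ("second-home construction", "water demand"),
    ("second-home construction", "seasonal population"),
    ("seasonal population", "waste generation"),
    ("water demand", "aquifer depletion"),
]

G = nx.DiGraph(causal_links)

# Influence through indirect relations: how many downstream impacts each node can reach.
reach = {n: len(nx.descendants(G, n)) for n in G}
betweenness = nx.betweenness_centrality(G)

for node in sorted(G, key=lambda n: reach[n], reverse=True):
    print(f"{node:26s} reachable impacts = {reach[node]}  betweenness = {betweenness[node]:.2f}")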

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an investigation of synchronisation and causality motivated by problems in computational neuroscience. It addresses both theoretical and practical signal-processing issues regarding the estimation of interdependence from multivariate data generated by a complex underlying dynamical system. The topic is driven by a series of problems in neuroscience, which represent the principal motivation behind the material in this work. The underlying system is the human brain, and the data are generated by modern electromagnetic neuroimaging methods. In this thesis, the underlying functional brain mechanisms are modelled using the recent mathematical formalism of dynamical systems on complex networks. This choice is justified principally by the complex hierarchical and multiscale nature of the brain, and it offers new methods for analysing and modelling its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity patterns developed by the brain's complex network. Three types of connectivity are important: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept referring to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel functional-connectivity algorithms designed to extract different aspects of the interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in a local network from a single time series. This approach is useful for inferring coupling within a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and the inference of coupling from observed data. These methods are designed to estimate medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced to detect the direction of coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are used to test the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in important neurological disorders such as epilepsy and Parkinson's disease. The functional-connectivity methods in this thesis are applied to intracranial EEG recordings to extract features that characterise the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological connectivity patterns.
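As a rough illustration of the directional-coupling component, the sketch below runs the standard linear Granger-causality test on a simulated two-channel recording using statsmodels; the machine-learning extension developed in the thesis, and its univariate and phase-synchronisation statistics, are not reproduced here.

import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)

# Toy coupled system: channel x drives channel y with a two-sample delay.
T = 1000
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 2] + 0.5 * rng.standard_normal()

# Column order matters: the test asks whether the second column Granger-causes the first.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=4, verbose=False)
for lag, (tests, _) in results.items():
    print(f"lag {lag}: ssr F-test p-value = {tests['ssr_ftest'][1]:.4f}")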

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To compare, in patients with cancer and in healthy subjects, resting energy expenditure (REE) measured by traditional indirect calorimetry, by a new portable device (MedGem) and by prediction equations. DESIGN: Cross-sectional clinical validation study. SETTING: Private radiation oncology centre, Brisbane, Australia. SUBJECTS: Cancer patients (n = 18) and healthy subjects (n = 17) aged 37-86 y, with body mass indices ranging from 18 to 42 kg/m(2). INTERVENTIONS: Oxygen consumption (VO(2)) and REE were measured by VMax229 (VM) and MedGem (MG) indirect calorimeters in random order after a 12-h fast and 30-min rest. REE was also calculated from the MG without adjustment for nitrogen excretion (MGN) and estimated from Harris-Benedict prediction equations. Data were analysed using the Bland and Altman approach, based on a clinically acceptable difference between methods of 5%. RESULTS: The mean bias (MGN-VM) was 10% with limits of agreement of -42 to 21% for cancer patients, and -5% with limits of -45 to 35% for healthy subjects. Less than half of the cancer patients (n = 7, 46.7%) and only a third (n = 5, 33.3%) of healthy subjects had measured REE by MGN within clinically acceptable limits of VM. Predicted REE showed a mean bias (HB-VM) of -5% for cancer patients and 4% for healthy subjects, with limits of agreement of -30 to 20% and -27 to 34%, respectively. CONCLUSIONS: Limits of agreement for the MG and the Harris-Benedict equations compared with traditional indirect calorimetry were similar but wide, indicating poor clinical accuracy for determining the REE of individual cancer patients and healthy subjects.
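The Bland-Altman calculation behind the reported bias and limits of agreement can be sketched as follows; the REE values below are invented for illustration and are not the study's measurements.

import numpy as np

ree_vm = np.array([1450, 1600, 1380, 1720, 1550, 1490, 1630, 1400])   # VMax229, kcal/day (illustrative)
ree_mg = np.array([1320, 1680, 1300, 1650, 1610, 1350, 1700, 1330])   # MedGem, kcal/day (illustrative)

# Percentage difference relative to the mean of the two methods, as in a Bland-Altman plot.
pct_diff = 100 * (ree_mg - ree_vm) / ((ree_vm + ree_mg) / 2)

bias = pct_diff.mean()
sd = pct_diff.std(ddof=1)
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd

print(f"mean bias = {bias:.1f}%, 95% limits of agreement = ({lower:.1f}%, {upper:.1f}%)")
# Agreement is then judged against the clinically acceptable difference of 5% used in the study.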

Relevance:

20.00%

Publisher:

Abstract:

Government figures put the current indigenous unemployment rate at around 23%, three times the rate for other Australians. This thesis aims to assess whether Australian indirect discrimination legislation can provide a remedy for one of the causes of indigenous unemployment: the systemic discrimination which can result from the mere operation of established procedures of recruitment and hiring. The impact of those practices on indigenous people is examined in the context of an analysis of anti-discrimination legislation and cases from all Australian jurisdictions, from the passing of the Racial Discrimination Act by the Commonwealth in 1975 to the present. The thesis finds a number of reasons why the legislation fails to provide equality of opportunity for indigenous people seeking to enter the workforce. In nearly all jurisdictions it is obscurely drafted, used mainly by educated middle-class white women, and provides remedies which tend to be compensatory damages rather than changes to recruitment policy. White dominance of the legal process has produced legislative and judicial definitions of "race" and "Aboriginality" which focus on biology rather than cultural difference. In the commissions and tribunals, complaints of racial discrimination are often rejected on the grounds of being "vexatious" or "frivolous", of not reaching the required standard of proof, or of not showing a causal connection between race and the conduct complained of. In all jurisdictions the cornerstone of liability is whether a particular employment term, condition or practice is reasonable. The thesis evaluates the approaches taken by appellate courts, including the High Court, and concludes that there is a trend towards an interpretation of reasonableness which favours employer arguments such as economic rationalism, the maintenance of good industrial relations, managerial prerogative to hire and fire, and the protection of majority rights. The thesis recommends that separate, clearly drafted legislation be passed to address indigenous disadvantage and that indigenous people be involved in all stages of the process.

Relevance:

20.00%

Publisher:

Abstract:

Neopolycystus sp. is the only primary egg parasitoid associated with the pest beetle Paropsis atomaria in subtropical eucalypt plantations, but its impact on host populations is unknown. The simplified ecosystem represented by the plantation habitat, the lack of interspecific competition for host and parasitoid, and the multivoltinism of the host population make this an ideal system for quantifying the direct and indirect effects of egg parasitism and, hence, its effects on host population dynamics. Within-batch, between-batch and overall egg-batch parasitism rates were determined at three field sites over two field seasons and up to seven host generations. The effects of exposure time (egg batch age), host density, and proximity to native forest and water sources on egg parasitism rates were also tested. Neopolycystus sp. exerts a significant influence on P. atomaria populations in Eucalyptus cloeziana plantations in south-eastern Queensland, causing the direct (13%) and indirect (15%) mortality of almost one-third of all eggs in the field. Across seasons and generations, 45% of egg batches were parasitised, with a within-batch parasitism rate of around 30%. Between-batch parasitism increased up to 5–6 days after oviposition in the field, although within-batch parasitism rates generally did not. However, there were few apparent patterns to egg parasitism, with rates often varying significantly between sites and seasons.

Relevance:

20.00%

Publisher:

Abstract:

To test the hypothesis that households' intentions to replace their old car have a direct negative relationship to its perceived quality ('current level') and a direct positive relationship to their aspirations for a new car ('aspiration level'), a rotating panel of car owners was interviewed every four months for two years. The hypothesis received support in this data set. In addition, the results showed that the age of the car, the total number of miles driven, and the number of anticipated repairs affected the current level, whereas marital status, the number of children, consumer confidence, and environmental concern affected the aspiration level.

Relevance:

20.00%

Publisher:

Abstract:

Boards of directors are thought to provide access to a wealth of knowledge and resources for the companies they serve, and are considered important to corporate governance. Under the Resource Based View (RBV) of the firm (Wernerfelt, 1984), boards are viewed as a strategic resource available to firms. As a consequence, there has been a significant research effort aimed at establishing a link between board attributes and company performance. In this thesis I explore and extend the study of interlocking directorships (Mizruchi, 1996; Scott, 1991a) by examining the links between directors' opportunity networks and firm performance. Specifically, I use resource dependence theory (Pfeffer & Salancik, 1978) and social capital theory (Burt, 1980b; Coleman, 1988) as the basis for a new measure of a board's opportunity network. I contend that both directors' formal company ties and their social ties determine a director's opportunity network, through which they are able to access and mobilise resources for their firms. This approach is based on recent studies suggesting that measuring interlocks at the director level, rather than at the firm level, may be a more reliable indicator of the phenomenon. The research uses publicly available data drawn from Australia's top-105 listed companies and their directors in 1999. I employ Social Network Analysis (SNA) (Scott, 1991b), using the UCINET software, to analyse individual directors' formal and social networks. SNA is used to measure the number of ties a director has to other directors in the top-105 company director network at both one and two degrees of separation, that is, direct ties and indirect (or 'friend of a friend') ties. These individual measures of director connectedness are aggregated to produce a board-level network metric, which is compared with measures of firm performance using multiple regression analysis. Performance is measured with accounting-based and market-based measures. Findings indicate that better-connected boards are associated with higher market-based company performance (measured by Tobin's q). However, weaker and mostly unreliable associations were found for the accounting-based performance measure (ROA). Furthermore, formal (or corporate) network ties are a stronger predictor of market performance than total network ties (comprising social and corporate ties). Similarly, strong ties (connectedness at degree 1) are better predictors of performance than weak ties (connectedness at degree 2). My research makes four contributions to the literature on director interlocks. First, it introduces a new way of measuring a board's opportunity network based on the director, rather than the company, as the unit of interlock. Second, it establishes evidence of a relationship between market-based measures of firm performance and the connectedness of a firm's board. Third, it establishes that directors' formal corporate ties matter more to market-based firm performance than their social ties. Fourth, it establishes that directors' strong direct ties are more important to market-based performance than weak ties. The thesis concludes with implications for research and practice, including a more speculative interpretation of these results. In particular, I raise the possibility of reverse causality, that is, that networked directors seek to join high-performing companies. The relationship may thus be a result of symbolic action by companies seeking to increase their legitimacy rather than a reflection of the social capital available to them. This is an important consideration worthy of future investigation.
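A minimal sketch of the director-level connectedness measure, using networkx instead of UCINET and invented board memberships: direct ties link directors who sit on the same board, degree-2 ties add friend-of-a-friend reach, and the director scores are aggregated to the board level.

import networkx as nx
from itertools import combinations

board_memberships = {                     # illustrative data, not the 1999 top-105 sample
    "Alpha Ltd": ["Ang", "Bell", "Chen"],
    "Beta Ltd": ["Chen", "Dawes"],
    "Gamma Ltd": ["Dawes", "Evans", "Ford"],
}

# Director network: an edge between every pair of directors serving on the same board.
G = nx.Graph()
for members in board_memberships.values():
    G.add_edges_from(combinations(members, 2))

def ties(director, degree):
    """Distinct directors reachable within `degree` steps, excluding the director themself."""
    return len(nx.single_source_shortest_path_length(G, director, cutoff=degree)) - 1

for board, members in board_memberships.items():
    direct = sum(ties(d, 1) for d in members)
    within_two = sum(ties(d, 2) for d in members)
    print(f"{board}: aggregated direct ties = {direct}, ties within two degrees = {within_two}")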

Relevance:

20.00%

Publisher:

Abstract:

In the student learning literature, the traditional view holds that when students are faced with heavy workload, poor teaching, and content that they cannot relate to (all important aspects of the learning context), they are more likely to adopt a surface approach to learning because of stress, lack of understanding and a lack of perceived relevance of the content (Kreber, 2003; Lizzio, Wilson, & Simons, 2002; Ramsden, 1989; Ramsden, 1992; Trigwell & Prosser, 1991; Vermunt, 2005). For example, in studies involving health and medical sciences students, courses that used student-centred, problem-based approaches to teaching and learning were found to elicit a deeper approach to learning than teacher-centred, transmissive approaches (Patel, Groen, & Norman, 1991; Sadlo & Richardson, 2003). It is generally accepted that the line of causation runs from the learning context (or rather, students' self-reported data on the learning context) to students' learning approaches. That is, it is the learning context as revealed by students' self-reported data that elicits the associated learning behaviour. However, other studies have found that the same teaching and learning environment can be perceived differently by different students. In a study of students' perceptions of assessment requirements, Sambell and McDowell (1998) found that students "are active in the reconstruction of the messages and meanings of assessment" (p. 391), and that their interpretations are greatly influenced by their past experiences and motivations. In a qualitative study of Hong Kong tertiary students, Kember (2004) found that students using a surface learning approach reported heavier workloads than students using a deep learning approach. According to Kember, if students learn by extracting meaning from the content and making connections, they are more likely to see the higher-order intentions embodied in the content and the higher cognitive abilities being assessed. On the other hand, if they rote-learn for the graded task, they fail to see the hierarchical relationships in the content and to connect the information. These rote-learners tend to see the assessment as requiring memorisation and regurgitation of a large amount of unconnected knowledge, which explains why they experience a high workload. Kember (2004) thus postulates that it is the learning approach that influences how students perceive workload. Campbell and her colleagues made a similar observation in their interview study of secondary students' perceptions of teaching in the same classroom (Campbell et al., 2001). This discussion suggests that students' learning approaches can influence their perceptions of assessment demands and of other aspects of the learning context, such as the relevance of content and teaching effectiveness. In other words, perceptions of elements in the teaching and learning context are endogenously determined. This study investigated the causal relationships at the individual level between learning approaches and perceptions of the learning context in economics education. Students' learning approaches and their perceptions of the learning context were measured; the elements of the learning context investigated were teaching effectiveness, workload and content. The authors are aware of the existence of other elements of the learning context, such as generic skills, goal clarity and career preparation; these aspects, however, were beyond the scope of the present study and were therefore not investigated.

Relevance:

20.00%

Publisher:

Abstract:

We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms by using indirect inference. ABC methods are useful for posterior inference in the presence of an intractable likelihood function. In the indirect inference approach to ABC the parameters of an auxiliary model fitted to the data become the summary statistics. Although applicable to any ABC technique, we embed this approach within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning. This methodological development was motivated by an application involving data on macroparasite population evolution modelled by a trivariate stochastic process for which there is no tractable likelihood function. The auxiliary model here is based on a beta–binomial distribution. The main objective of the analysis is to determine which parameters of the stochastic model are estimable from the observed data on mature parasite worms.
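A hedged Python sketch of the indirect-inference idea for ABC: a toy overdispersed-count simulator stands in for the trivariate macroparasite model, a beta-binomial auxiliary model is fitted by maximum likelihood so that its parameter estimates act as the summary statistics, and plain rejection ABC replaces the paper's adaptive sequential Monte Carlo scheme. All names and numbers are illustrative, not the paper's.

import numpy as np
from scipy.stats import betabinom
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N_TRIALS = 20            # trials per observation in the toy simulator
N_OBS = 100              # observations per simulated data set

def simulate_counts(theta, rng):
    """Toy 'intractable' simulator: beta-mixed binomial counts driven by theta = (mu, phi)."""
    mu, phi = theta
    p = rng.beta(mu * phi, (1 - mu) * phi, size=N_OBS)
    return rng.binomial(N_TRIALS, p)

def fit_auxiliary(counts):
    """Fit the beta-binomial auxiliary model by maximum likelihood; return (alpha, beta)."""
    def nll(log_params):
        a, b = np.exp(log_params)
        return -betabinom.logpmf(counts, N_TRIALS, a, b).sum()
    res = minimize(nll, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(res.x)

def abc_rejection(observed, n_draws, keep):
    """Keep the prior draws whose auxiliary-model fits are closest to the observed data's fit."""
    s_obs = fit_auxiliary(observed)
    draws, dists = [], []
    for _ in range(n_draws):
        theta = (rng.uniform(0.05, 0.95), rng.uniform(0.5, 20.0))    # draw from a uniform prior
        s_sim = fit_auxiliary(simulate_counts(theta, rng))
        draws.append(theta)
        dists.append(np.linalg.norm(np.log(s_sim) - np.log(s_obs)))
    idx = np.argsort(dists)[:keep]
    return np.array(draws)[idx]

observed = simulate_counts((0.3, 5.0), rng)          # stand-in for the field data
posterior_sample = abc_rejection(observed, n_draws=300, keep=30)
print("approximate posterior mean of (mu, phi):", posterior_sample.mean(axis=0))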