958 results for empirical models


Relevance: 30.00%

Abstract:

The advances that have characterized spatial econometrics in recent years are mostly theoretical and have not yet found extensive empirical application. In this work we aim to review the main tools of spatial econometrics and to present an empirical application of one of the most recently introduced estimators. Despite the numerous alternatives that econometric theory provides for the treatment of spatial (and spatiotemporal) data, empirical analyses are still limited by the lack of corresponding routines in statistical and econometric software. Spatiotemporal modeling is one of the most recent developments in spatial econometric theory, and the finite-sample properties of the estimators that have been proposed are currently being tested in the literature. We compare several estimators (a quasi-maximum likelihood, QML, estimator and some GMM-type estimators) for a fixed-effects dynamic panel data model under certain conditions, by means of a Monte Carlo simulation analysis. We focus on different settings, characterized by either fully stable or quasi-unit-root series. We also investigate the extent of the bias caused by a non-spatial estimation of a model when the data exhibit different degrees of spatial dependence. Finally, we provide an empirical application of a QML estimator for a time-space dynamic model which includes a temporal, a spatial, and a spatiotemporal lag of the dependent variable. We do so in a relevant and prolific field of analysis in which spatial econometrics has so far found only limited space, in order to explore the value added of considering the spatial dimension of the data: we study the determinants of cropland value in the Midwestern U.S.A. over the years 1971-2009, taking the present value model (PVM) as the theoretical framework of analysis.
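
As a point of reference for the specification described above, a generic time-space dynamic panel equation from this literature can be written as follows (an illustrative textbook form; the exact model estimated in the work may differ in its regressors and fixed-effects treatment):

```latex
% Generic time-space dynamic panel specification (illustrative only).
y_t = \tau\, y_{t-1} + \rho\, W y_t + \eta\, W y_{t-1} + X_t \beta + \mu + \varepsilon_t
```

Here $y_t$ is the $N \times 1$ vector of the dependent variable (e.g. cropland values) at time $t$, $W$ is a spatial weights matrix, $\tau$, $\rho$ and $\eta$ are the temporal, spatial and spatiotemporal autoregressive parameters, and $\mu$ collects the unit fixed effects.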

Relevance: 30.00%

Abstract:

In Sub-Saharan Africa, non-democratic events such as civil wars and coups d'état destroy economic development. This study investigates both domestic and spatial effects on the likelihood of civil wars and coups d'état. For civil wars, increasing income growth is one of the most common prescriptions in the literature for stopping wars. This study adds a focus on ethnic fractionalization. IV-2SLS is applied to overcome the causality problem. The findings document that income growth significantly reduces the number and degree of violent events in highly ethnically fractionalized countries; otherwise there is a trade-off: income growth reduces the number of wars but increases their level of violence in countries with a few large ethnic groups. Promoting growth should therefore take ethnic composition into account. This study also investigates the clustering and contagion of civil wars using spatial panel data models. Onset, incidence, and end of civil conflicts spread across the network of neighboring countries, while peace, the end of conflicts, diffuses only to the nearest neighbor. There is evidence of an indirect link from neighboring income growth, absent excessive inequality, to a reduced likelihood of civil wars. For coups d'état, this study revisits their diffusion, both for all types of coups and for successful ones only. The results show that both domestic and spatial determinants matter in different periods. Domestic income growth plays the major role in reducing the likelihood of a coup before the end of the Cold War, while spatial effects are the negative determinants afterwards. Results on the probability of a successful coup are similar. After the Cold War, international organisations have actively promoted democracy and exerted pressure against coups d'état, and this appears to be effective. In sum, this study highlights the role of domestic ethnic fractionalization and the spread of neighboring effects in the likelihood of non-democratic events in a country. Policy implementation should take these factors into account.
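
For illustration, a minimal numpy sketch of the two-stage least squares (IV-2SLS) idea referred to above, with simulated data and hypothetical variable names (a rainfall-type instrument, income growth, a conflict outcome) that are not taken from the study:

```python
import numpy as np

def two_stage_least_squares(y, X_endog, X_exog, Z):
    """Minimal 2SLS: instrument the endogenous regressors, then run OLS.

    y       : (n,)  outcome, e.g. a measure of conflict
    X_endog : (n,k) endogenous regressors, e.g. income growth
    X_exog  : (n,m) exogenous controls (should include a constant column)
    Z       : (n,l) excluded instruments, e.g. rainfall shocks (l >= k)
    """
    W = np.column_stack([Z, X_exog])             # full instrument set
    # First stage: project the endogenous regressors on the instruments.
    X_endog_hat = W @ np.linalg.lstsq(W, X_endog, rcond=None)[0]
    # Second stage: OLS of y on fitted endogenous regressors and controls.
    X_hat = np.column_stack([X_endog_hat, X_exog])
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    return beta

# Hypothetical usage with simulated data
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 1))                      # instrument (e.g. rainfall)
u = rng.normal(size=n)                           # unobserved confounder
growth = (0.8 * z[:, 0] + u + rng.normal(size=n)).reshape(-1, 1)
const = np.ones((n, 1))
conflict = 1.0 - 0.5 * growth[:, 0] + u          # endogeneity through u
print(two_stage_least_squares(conflict, growth, const, z))
```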

Relevance: 30.00%

Abstract:

Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation, and financial markets. One cause of the empirical failure of New Keynesian models lies partially in the Rational Expectations (RE) paradigm, which imposes a tight structure on the dynamics of the system. Under this hypothesis, agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the "Quasi-Rational" Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the lag length of the reduced form is the same as in the "best" statistical model.
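
For orientation, the baseline system of Euler equations of a small-scale New Keynesian model is commonly written as below (a textbook formulation shown only to fix notation; the paper's pseudo-structural form is built from such a system but is not reproduced here):

```latex
% Baseline three-equation New Keynesian system (textbook form, illustrative);
% \hat{E}_t denotes the agents' (quasi-rational) expectation operator that
% replaces the rational expectation E_t.
\begin{aligned}
\tilde{y}_t &= \hat{E}_t\,\tilde{y}_{t+1} - \sigma^{-1}\bigl(i_t - \hat{E}_t\,\pi_{t+1}\bigr) + u^{y}_t && \text{(IS / Euler equation)}\\
\pi_t &= \beta\,\hat{E}_t\,\pi_{t+1} + \kappa\,\tilde{y}_t + u^{\pi}_t && \text{(Phillips curve)}\\
i_t &= \phi_{\pi}\,\pi_t + \phi_{y}\,\tilde{y}_t + u^{i}_t && \text{(monetary policy rule)}
\end{aligned}
```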

Relevance: 30.00%

Abstract:

In the first chapter, I develop a panel no-cointegration test which extends the bounds test of Pesaran, Shin and Smith (2001) to the panel framework by considering the individual regressions in a Seemingly Unrelated Regression (SUR) system. This makes it possible to take into account unobserved common factors that contemporaneously affect all the units of the panel and provides, at the same time, unit-specific test statistics. Moreover, the approach is particularly suited when the number of individuals in the panel is small relative to the number of time series observations. I develop the algorithm to implement the test and use Monte Carlo simulation to analyze its properties. The small sample properties of the test are remarkable compared to those of its single-equation counterpart. I illustrate the use of the test with a test of Purchasing Power Parity in a panel of EU15 countries. In the second chapter of my PhD thesis, I verify the Expectation Hypothesis of the Term Structure (EHTS) in the repurchase agreement (repo) market with a new testing approach. I consider an "inexact" formulation of the EHTS, which models a time-varying component in the risk premia, and I treat the interest rates as a non-stationary cointegrated system. The effect of heteroskedasticity is controlled by means of testing procedures (bootstrap and heteroskedasticity correction) which are robust to variance and covariance shifts over time. I find that the long-run implications of the EHTS are verified. A rolling window analysis clarifies that the EHTS is only rejected in periods of turbulence in financial markets. The third chapter introduces the Stata command "bootrank", which implements the bootstrap likelihood ratio rank test algorithm developed by Cavaliere et al. (2012). The command is illustrated through an empirical application on the term structure of interest rates in the US.
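
As a rough illustration of the single-equation building block, the following Python sketch computes the F-statistic of the conditional error-correction regression underlying the Pesaran, Shin and Smith (2001) bounds test; the panel/SUR extension developed in the chapter is not implemented here, and the data and lag choices are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def bounds_test_fstat(y, x, p=1):
    """F-statistic for the null of no level relationship in a conditional
    error-correction model (single-equation sketch of the PSS bounds test).

    y, x : 1-D arrays (e.g. log exchange rate and relative prices for a
           PPP-type application); p : number of lagged differences.
    """
    df = pd.DataFrame({"y": y, "x": x})
    df["dy"] = df["y"].diff()
    df["dx"] = df["x"].diff()
    df["y_lag"] = df["y"].shift(1)
    df["x_lag"] = df["x"].shift(1)
    for i in range(1, p + 1):
        df[f"dy_lag{i}"] = df["dy"].shift(i)
        df[f"dx_lag{i}"] = df["dx"].shift(i)
    df = df.dropna()
    regressors = ["y_lag", "x_lag", "dx"] + \
        [f"dy_lag{i}" for i in range(1, p + 1)] + \
        [f"dx_lag{i}" for i in range(1, p + 1)]
    X = sm.add_constant(df[regressors])
    res = sm.OLS(df["dy"], X).fit()
    # Joint exclusion of the lagged levels; compare with the PSS critical bounds.
    return res.f_test("y_lag = 0, x_lag = 0").fvalue

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=200))     # simulated I(1) regressor
y = 0.5 * x + rng.normal(size=200)      # cointegrated with x
print(bounds_test_fstat(y, x))
```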

Relevance: 30.00%

Abstract:

The diet of early human ancestors has received renewed theoretical interest since the discovery of elevated δ13C values in the enamel of Australopithecus africanus and Paranthropus robustus. As a result, the hominin diet is hypothesized to have included C4 grass or the tissues of animals which themselves consumed C4 grass. On mechanical grounds, such a diet is incompatible with the dental morphology and dental microwear of early hominins. Most inferences, particularly for Paranthropus, favor a diet of hard or mechanically resistant foods. This discrepancy has invigorated the longstanding hypothesis that hominins consumed plant underground storage organs (USOs). Plant USOs are attractive candidate foods because many bulbous grasses and cormous sedges use C4 photosynthesis. Yet mechanical data for USOs, or any putative hominin food, are scarce. To fill this empirical void, we measured the mechanical properties of USOs from 98 plant species from across sub-Saharan Africa. We found that rhizomes were the most resistant to deformation and fracture, followed by tubers, corms, and bulbs. An important result of this study is that corms exhibited low toughness values (mean = 265.0 J m-2) and relatively high Young's modulus values (mean = 4.9 MPa). This combination of properties fits many descriptions of the hominin diet as consisting of hard-brittle objects. When compared to corms, bulbs are tougher (mean = 325.0 J m-2) and less stiff (mean = 2.5 MPa). Again, this combination of traits resembles dietary inferences, especially for Australopithecus, which is predicted to have consumed soft-tough foods. Lastly, we observed the roasting behavior of Hadza hunter-gatherers and measured the effects of roasting on the toughness of undomesticated tubers. Our results support assumptions that roasting lessens the work of mastication and, by inference, the cost of digestion. Together these findings provide the first mechanical basis for discussing the adaptive advantages of roasting tubers and the plausibility of USOs in the diet of early hominins.
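
For readers unfamiliar with the two reported properties, their standard materials-testing definitions (assumed here; the authors' exact test protocol is not described in this abstract) are:

```latex
% Standard definitions of the two reported mechanical properties (general
% materials-testing sense, not the authors' specific protocol).
R = \frac{W_f}{A} \quad \text{(toughness: fracture work $W_f$ per unit crack area $A$, in J\,m$^{-2}$)} \\
E = \frac{\sigma}{\varepsilon} \quad \text{(Young's modulus: stress over strain in the elastic regime, reported here in MPa)}
```

On these definitions, corms (low R, high E) behave like hard-brittle foods, whereas bulbs (higher R, lower E) behave like soft-tough foods, which is how the abstract links the measurements to dietary inferences.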

Relevance: 30.00%

Abstract:

Marginal generalized linear models can be used for clustered and longitudinal data by fitting a model as if the data were independent and using an empirical estimator of the parameter standard errors. We extend this approach to data where the number of observations correlated with a given one grows with the sample size. We show that parameter estimates are consistent and asymptotically normal, with a slower convergence rate than for independent data, and that an information sandwich variance estimator is consistent. We present two problems that motivated this work: the modelling of patterns of HIV genetic variation and the behavior of clustered-data estimators when clusters are large.
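
A minimal numpy sketch of the working-independence fit with an empirical ("sandwich") covariance estimator for clustered data, using simulated data; this is a generic illustration of the approach, not the estimator studied in the paper:

```python
import numpy as np

def sandwich_variance(X, y, clusters):
    """Working-independence OLS fit with an empirical ("sandwich") estimator
    of the coefficient covariance, robust to within-cluster correlation.

    X : (n,p) design matrix, y : (n,) response,
    clusters : (n,) cluster labels (observations within a cluster may be correlated).
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # fit as if independent
    resid = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(clusters):
        idx = clusters == g
        s = X[idx].T @ resid[idx]                    # cluster score contribution
        meat += np.outer(s, s)
    cov = bread @ meat @ bread                       # sandwich: bread * meat * bread
    return beta, cov

# Hypothetical usage with simulated clustered data
rng = np.random.default_rng(2)
n_clusters, m = 50, 10
clusters = np.repeat(np.arange(n_clusters), m)
cluster_effect = rng.normal(size=n_clusters)[clusters]
x = rng.normal(size=n_clusters * m)
y = 1.0 + 2.0 * x + cluster_effect + rng.normal(size=n_clusters * m)
X = np.column_stack([np.ones_like(x), x])
beta, cov = sandwich_variance(X, y, clusters)
print(beta, np.sqrt(np.diag(cov)))                   # estimates and robust SEs
```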

Relevance: 30.00%

Abstract:

We consider nonparametric missing data models for which the censoring mechanism satisfies coarsening at random and which allow complete observations on the variable X of interest. We show that, beyond some empirical process conditions, the only essential condition for efficiency of an NPMLE of the distribution of X is that the regions associated with incomplete observations on X contain enough complete observations. This is heuristically explained by describing the EM algorithm. We prove identifiability of the self-consistency equation and efficiency of the NPMLE in order to make this statement rigorous. The usual kind of differentiability conditions in the proof are avoided by using an identity which holds for the NPMLE of linear parameters in convex models. We provide a bivariate censoring application in which the condition, and hence the NPMLE, fails, but where other estimators, not based on the NPMLE principle, are highly inefficient. It is shown how to slightly reduce the data so that the conditions hold for the reduced data. The conditions are verified for the univariate censoring, double censoring, and Ibragimov-Has'minskii models.
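
The EM/self-consistency idea mentioned above can be illustrated with a small numpy sketch for a discrete distribution, in which incomplete observations redistribute their mass over their coarsened regions (a generic toy illustration, not the bivariate censoring application of the paper):

```python
import numpy as np

def npmle_self_consistency(support, regions, n_iter=500):
    """Self-consistency (EM) iteration for the NPMLE of a discrete distribution
    when some observations are only known to lie in a region (coarsening at
    random assumed). Illustrative sketch only.

    support : (k,) distinct values seen in the complete observations
    regions : list of 0/1 masks of length k; a complete observation marks a
              single support point, an incomplete one marks its whole region.
    """
    k = len(support)
    p = np.full(k, 1.0 / k)                   # initial probability mass function
    regions = np.array(regions, dtype=float)  # (n, k) indicator matrix
    for _ in range(n_iter):
        # E-step: split each observation's unit mass over its region,
        # proportionally to the current estimate p.
        w = regions * p
        w /= w.sum(axis=1, keepdims=True)
        # M-step: the new estimate averages these conditional masses.
        p = w.mean(axis=0)
    return p

# Toy example: support {1,2,3}; two complete observations at 1, one at 2,
# and one incomplete observation known only to lie in the region {2,3}.
support = np.array([1.0, 2.0, 3.0])
regions = [[1, 0, 0], [1, 0, 0], [0, 1, 0], [0, 1, 1]]
print(npmle_self_consistency(support, regions))
```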

Relevance: 30.00%

Abstract:

The flammability zone boundaries are very important properties for preventing explosions in the process industries. Within the boundaries a flame or explosion can occur, so it is important to understand these boundaries to prevent fires and explosions. Very little work has been reported in the literature on modeling the flammability zone boundaries. Two boundaries are defined and studied: the upper flammability zone boundary and the lower flammability zone boundary. Three methods are presented to predict the upper and lower flammability zone boundaries: the linear model, the extended linear model, and an empirical model. The linear model is a thermodynamic model that uses the upper flammability limit (UFL) and lower flammability limit (LFL) to calculate two adiabatic flame temperatures. When the proper assumptions are applied, the linear model can be reduced to the well-known equation y_LOC = z·y_LFL for estimation of the limiting oxygen concentration. The extended linear model attempts to account for the changes in the reactions along the UFL boundary. Finally, the empirical method fits the boundaries with linear equations between the UFL or LFL and the intercept with the oxygen axis. Comparison of the models to experimental data of the flammability zone shows that the best model for estimating the flammability zone boundaries is the empirical method. It is shown that it fits the limiting oxygen concentration (LOC), upper oxygen limit (UOL), and lower oxygen limit (LOL) quite well. The regression coefficient values for the fits to the LOC, UOL, and LOL are 0.672, 0.968, and 0.959, respectively. This is better than the fit of the "z·y_LFL" method for the LOC, for which the regression coefficient value is 0.416.
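
For concreteness, a worked instance of the y_LOC = z·y_LFL estimate mentioned above, using textbook values for methane (illustrative numbers, not data from this work):

```python
# Worked instance of the linear-model estimate y_LOC = z * y_LFL
# (illustrative textbook values for methane, not data from the study).

def estimate_loc(z, lfl_vol_pct):
    """Limiting oxygen concentration from the stoichiometric oxygen
    coefficient z (mol O2 per mol fuel) and the lower flammability limit."""
    return z * lfl_vol_pct

# Methane: CH4 + 2 O2 -> CO2 + 2 H2O, so z = 2; LFL is about 5 vol%.
print(estimate_loc(z=2, lfl_vol_pct=5.0))   # ~10 vol% O2 estimated LOC
```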

Relevance: 30.00%

Abstract:

BACKGROUND: This empirical study analyzes the current status of Cochrane Reviews (CRs) and the strength of their recommendations for evidence-based decision making in the field of general surgery. METHODS: A systematic literature search of the Cochrane Database of Systematic Reviews and the Cochrane Collaboration's homepage was conducted to identify available CRs on surgical topics. Quantitative and qualitative characteristics, utilization, and formulated treatment recommendations were evaluated by 2 independent reviewers. The association of review characteristics with treatment recommendation was analyzed using univariate and multivariate logistic regression models. RESULTS: Ninety-three CRs, including 1,403 primary studies and 246,473 patients, were identified. The mean number of included primary studies per CR was 15.1 (standard deviation [SD] 14.5), including 2,650 (SD 3,340) study patients. A mean of 2.5 (SD 8.3) nonrandomized trials was included per analyzed CR. Seventy-two (77%) CRs were published or updated in 2005 or later. Explicit treatment recommendations were given in 45 (48%). The presence of a treatment recommendation was associated with the number of included primary studies and the proportion of randomized studies. Utilization of surgical CRs remained low and showed large inter-country differences. Surgical CRs were accessed most in the UK, USA, and Australia, followed by several Western and Eastern European countries. CONCLUSION: Only a minority of available CRs address surgical questions, and their current usage is low. Instead of unsystematically increasing the number of surgical CRs, it would be far more efficient to focus the review process on relevant surgical questions. Prioritization of CRs needs valid methods, which should be developed by the scientific surgical community.
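
A minimal sketch of the kind of logistic regression analysis described in the methods, with simulated review-level data and hypothetical variable names (not the study's actual dataset or results):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated review-level data: whether a CR gives an explicit treatment
# recommendation, as a function of the number of included primary studies and
# the proportion of randomized trials. All values are synthetic.
rng = np.random.default_rng(3)
n = 93
df = pd.DataFrame({
    "n_primary_studies": rng.poisson(15, size=n),
    "prop_randomized": rng.uniform(0, 1, size=n),
})
logit = -2.0 + 0.08 * df["n_primary_studies"] + 1.5 * df["prop_randomized"]
df["recommendation"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["n_primary_studies", "prop_randomized"]])
model = sm.Logit(df["recommendation"], X).fit(disp=False)
print(np.exp(model.params))   # odds ratios for the associations of interest
```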

Relevance: 30.00%

Abstract:

Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation modeling (SEM) approach for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications by, first, estimating a multinomial choice model and, second, estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models for enhancing the understanding of choice processes. In addition to the usually studied, directly observable variables such as travel time, we show how abstract motivations such as power and hedonism, as well as attitudes such as a desire for flexibility, impact travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modeling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.
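
The generic structure of an ICLV model of this kind can be sketched as follows (illustrative notation only; the study's exact specification and its Mplus syntax are not reproduced here):

```latex
% Generic ICLV structure (illustrative notation).
\begin{aligned}
U_{nj} &= X_{nj}\beta + \lambda_j^{\prime} LV_n + \varepsilon_{nj} && \text{(utility of mode } j \text{ for person } n\text{: multinomial choice part)}\\
LV_n   &= \Gamma z_n + B\,LV_n + \omega_n && \text{(structural part; } B \text{ carries hierarchical relations between latent variables)}\\
I_n    &= \Lambda LV_n + \nu_n && \text{(measurement part: survey indicators of the latent variables)}
\end{aligned}
```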

Relevance: 30.00%

Abstract:

Landscapes of education are a new topic within the debate about adequate and just education and human development for everybody. In particular, children and youths from social classes affected by poverty, a lack of prospects, or minimal schooling are a focal group that should be offered new approaches and opportunities for cognitive and social development by way of these landscapes of education. It has become apparent that the traditional school alone does not suffice to meet this need. There is no doubt that competency-based orientation and employability are core areas with the help of which the generation now growing up will manage the start of its professional career. In addition, and no less important, this development involves individual, social, cultural and societal perspectives that can be combined under the term of human development. In this context, the Capability Approach elaborated by Amartya Sen and Martha Nussbaum has developed a more extensive concept of human development and related it to empirical instruments. Using the analytic concepts of individual capabilities and societal opportunities, they shaped a socio-political formula that should be adapted in particular to modern social work. Moreover, the Capability Approach offers a critical foil for the further development and revision of institutionalised approaches in education and human development.

Relevance: 30.00%

Abstract:

Introduction: Prospective memory (PM), the ability to remember to perform intended activities in the future (Kliegel & Jäger, 2007), is crucial for success in everyday life. PM seems to improve gradually over the childhood years (Zimmermann & Meier, 2006), but little is known about PM competences in young school children in general, and even less is known about the factors influencing its development. Currently, a number of studies suggest that executive functions (EF) are potentially influencing processes (Ford, Driscoll, Shum & Macaulay, 2012; Mahy & Moses, 2011). Additionally, metacognitive processes (MC: monitoring and control) are assumed to be involved in optimizing one's performance (Krebs & Roebers, 2010, 2012; Roebers, Schmid, & Roderer, 2009). Yet the relations between PM, EF and MC remain relatively unspecified. We intend to empirically examine the structural relations between these constructs. Method: A cross-sectional study including 119 2nd graders (M_age = 95.03, SD_age = 4.82) will be presented. Participants (n = 68 girls) completed three EF tasks (Stroop, updating, shifting), a computerised event-based PM task and a MC spelling task. The latent variables PM, EF and MC, represented by manifest variables derived from the conducted tasks, were interrelated by structural equation modelling. Results: Analyses revealed clear associations between the three cognitive constructs PM, EF and MC (r_PM-EF = .45, r_PM-MC = .23, r_EF-MC = .20). A three-factor model, as opposed to one- or two-factor models, fit the data excellently (χ²(17, N = 119) = 18.86, p = .34, RMSEA = .030, CFI = .990, TLI = .978). Discussion: The results indicate that already in young elementary school children, PM, EF and MC are empirically well distinguishable, yet substantially interrelated. PM and EF seem to share a substantial amount of variance, while for MC, more unique processes may be assumed.
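
For reference, the reported fit indices are conventionally computed from the chi-square statistics of the fitted model (M) and the baseline model (B) as shown below (standard formulas; software implementations may differ in minor details such as using N rather than N - 1):

```latex
% Conventional formulas for the reported fit indices
% (M = fitted model, B = baseline/independence model).
\mathrm{RMSEA} = \sqrt{\frac{\max\!\bigl(\chi^2_M - df_M,\,0\bigr)}{df_M\,(N-1)}},\qquad
\mathrm{CFI} = 1 - \frac{\max\!\bigl(\chi^2_M - df_M,\,0\bigr)}{\max\!\bigl(\chi^2_B - df_B,\;\chi^2_M - df_M,\;0\bigr)},\qquad
\mathrm{TLI} = \frac{\chi^2_B/df_B - \chi^2_M/df_M}{\chi^2_B/df_B - 1}
```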

Relevance: 30.00%

Abstract:

Family change theory suggests three ideal-typical family models characterized by different combinations of emotional and material interdependencies in the family. Its major proposition is that in economically developing countries with a collectivistic background, a family model of emotional interdependence emerges from a family model of complete interdependence. The current study aims to identify and compare patterns of family-related value orientations related to family change theory across three cultures and two generations. Overall, N = 919 dyads of mothers and their adolescent children from Germany, Turkey, and India participated in the study. Three clusters were identified, representing the family models of independence, interdependence, and emotional interdependence, respectively. The identification of an emotionally interdependent value pattern using a person-oriented approach, in particular, is an important step in the empirical validation of family change theory. The preference for the three family models differed across as well as within cultures and generations, in line with theoretical predictions. Dyadic analyses pointed to substantial intergenerational similarities as well as differences in family models, reflecting both cultural continuity and change in family-related value orientations.

Relevance: 30.00%

Abstract:

Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20–30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time series, e.g., in the context of therapeutic brain stimulation. In this paper we present some first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time series. More specifically, we learn distinct graphical models (so-called Chow–Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks, based on thresholding of the absolute value of the Pearson correlation coefficient (CC) matrix. Using various measures, the networks thus obtained are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of peri-ictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for peri-ictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for modeling also the temporal features of iEEG signals.
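
A compact Python sketch of the Chow–Liu step described above for Gaussian-approximated signals: pairwise mutual information is computed from the empirical correlation matrix and a maximum-weight spanning tree is extracted. The data, threshold and Gaussian assumption are illustrative and not taken from the paper:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_tree(signals):
    """Chow-Liu tree for multichannel signals under a Gaussian approximation.

    signals : (n_channels, n_samples) array (e.g. one time slice of iEEG data).
    Returns the empirical correlation matrix and the adjacency matrix of the
    maximum-mutual-information spanning tree.
    """
    cc = np.corrcoef(signals)                          # empirical CC matrix
    # Gaussian mutual information between channel pairs: -0.5 * log(1 - r^2).
    mi = -0.5 * np.log(np.clip(1.0 - cc**2, 1e-12, None))
    np.fill_diagonal(mi, 0.0)
    # Maximum-weight spanning tree = minimum spanning tree on negated weights.
    tree = minimum_spanning_tree(-mi).toarray() != 0
    return cc, tree | tree.T                           # symmetrized adjacency

# Simulated example standing in for one iEEG time slice (synthetic data).
rng = np.random.default_rng(4)
common = rng.normal(size=1000)                         # shared driving signal
signals = np.vstack([common + 0.5 * rng.normal(size=1000) for _ in range(6)])
cc, tree = chow_liu_tree(signals)
functional_network = np.abs(cc) > 0.7                  # illustrative |CC| threshold
print(tree.astype(int))
print(functional_network.astype(int))
```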