50 results for Higher order terms


Relevance:

90.00%

Publisher:

Abstract:

Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies therefore require memory proportional to the square of the number of SNPs; a genome-wide epistasis search would require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn's disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 2.1 GHz processors. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn's disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rate. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn's disease, MBMDR-3.0.3 identified epistasis involving regions that are well known in the field and that can be explained from a biological point of view. This demonstrates the power of our software to find relevant higher-order phenotype-genotype associations.

Relevance:

80.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that, as the number of simulations diverges, the estimator is consistent, and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
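The kernel-smoothing step can be sketched in a few lines: simulate a long path at a trial parameter value, then estimate E[y | x = x0] with a Nadaraya-Watson smoother. The linear toy model, bandwidth, and sample sizes below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Long simulation at a trial parameter value theta: a hypothetical
# linear model y = theta * x + noise standing in for the true DGP.
theta = 0.7
S = 20_000
x_sim = rng.normal(size=S)
y_sim = theta * x_sim + 0.3 * rng.normal(size=S)

def kernel_moment(x0, x, y, h=0.1):
    """Nadaraya-Watson estimate of E[y | x = x0] from simulated draws."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

# Conditional moments at a few conditioning points; in this toy model
# the true value is theta * x0, recovered as S grows.
m_hat = {x0: kernel_moment(x0, x_sim, y_sim) for x0 in (-1.0, 0.0, 1.0)}
```

These smoothed moments would then be plugged into standard method of moments conditions in place of analytic conditional expectations.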

Relevance:

80.00%

Publisher:

Abstract:

Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic, for example, an initial estimator or a set of sample moments. We propose to (re-)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator, which we refer to as a Bayesian Indirect Likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher-order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications, including dynamic and nonlinear panel data models, a structural auction model and two DSGE models, show that the proposed estimators indeed have attractive finite sample properties.
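A minimal sketch of the simulated-MIL idea, under strong simplifying assumptions: a normal location model, the sample mean as the statistic Zn, a fixed-bandwidth Gaussian KDE for the unknown density of Zn, and a grid search instead of a proper optimizer. None of these choices come from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Observed data from N(theta0, 1); the statistic Z_n is the sample mean.
theta0, n = 1.5, 50
z_obs = rng.normal(theta0, 1, size=n).mean()

def simulated_log_likelihood(theta, S=2000, h=0.05):
    """KDE approximation to the log density of Z_n under a candidate theta."""
    z_sim = rng.normal(theta, 1, size=(S, n)).mean(axis=1)  # S simulated Z_n
    dens = np.exp(-0.5 * ((z_sim - z_obs) / h) ** 2).mean() / (h * np.sqrt(2 * np.pi))
    return np.log(dens + 1e-300)

# Maximize the simulated indirect likelihood over a coarse parameter grid.
grid = np.linspace(0.5, 2.5, 41)
theta_mil = grid[np.argmax([simulated_log_likelihood(t) for t in grid])]
```

In this toy model the density of Zn peaks when theta equals the observed mean, so `theta_mil` lands close to `z_obs`.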

Relevance:

80.00%

Publisher:

Abstract:

The classical wave-of-advance model of the neolithic transition (i.e., the shift from hunter-gatherer to agricultural economies) is based on Fisher's reaction-diffusion equation. Here we present an extension of Einstein's approach to Fickian diffusion, incorporating reaction terms. On this basis we show that second-order terms in the reaction-diffusion equation, which have been neglected up to now, are not in fact negligible and can lead to important corrections. The resulting time-delayed model agrees quite well with observations.
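The correction at issue can be written out explicitly. Starting from Fisher's equation and keeping terms of second order in the delay (generation) time T yields a hyperbolic, time-delayed reaction-diffusion equation; a sketch, with p the population density and F the net growth term (the notation is assumed, not taken verbatim from the paper):

```latex
% Classical Fisher wave-of-advance model
\frac{\partial p}{\partial t} = D\,\nabla^2 p + F(p),
\qquad F(p) = a\,p\left(1 - \frac{p}{p_{\max}}\right)

% Retaining second-order terms in the delay time T gives the
% time-delayed (hyperbolic) reaction-diffusion equation
\frac{\partial p}{\partial t}
  + \frac{T}{2}\,\frac{\partial^2 p}{\partial t^2}
  = D\,\nabla^2 p + F(p) + \frac{T}{2}\,\frac{\partial F}{\partial t}
```

The extra terms on both sides vanish as T goes to zero, recovering Fisher's equation, but for a finite generation time they slow the predicted front speed, which is the correction relevant to the neolithic-transition data.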

Relevance:

80.00%

Publisher:

Abstract:

Research project carried out during a stay at the Max Planck Institute for Human Cognitive and Brain Sciences, Germany, between 2010 and 2012. The main objective of this project was to study subcortical structures in detail, specifically the role of the basal ganglia in cognitive control during linguistic and non-linguistic processing. To achieve a fine-grained differentiation of the different basal ganglia nuclei, ultra-high-field, high-resolution magnetic resonance imaging (7T MRI) was used. The lateral prefrontal cortex and the basal ganglia work together to mediate working memory and the top-down regulation of cognition. This circuit regulates the balance between automatic and higher-order cognitive responses. Three main experimental conditions were created: non-ambiguous, ungrammatical, and ambiguous sentences/sequences. Non-ambiguous sentences/sequences should elicit an automatic response, whereas ambiguous and ungrammatical sentences/sequences produce a conflict with the automatic response and therefore require a higher-order cognitive response. Within the domain of the control response, ambiguity and ungrammaticality represent two different dimensions of conflict resolution: while a temporarily ambiguous sentence/sequence has a correct interpretation, this is not the case for ungrammatical sentences/sequences. In addition, the experimental design included a linguistic and a non-linguistic manipulation, which tested the hypothesis that the effects are domain-general, as well as a semantic and a syntactic manipulation that assessed the differences between processing "intrinsic" vs. "structural" ambiguity/error. The results of the first (syntactic-linguistic) experiment showed a rostroventral-caudodorsal gradient of cognitive control within the caudate nucleus, that is, with the most rostral regions supporting the highest levels of cognitive processing.

Relevance:

80.00%

Publisher:

Abstract:

In this paper I explore two hypotheses: (1) formal child care availability for children under three has a positive effect on childbearing; and (2) this effect varies across contexts, according to the degree of adaptation of social institutions to changes in gender roles. Event history models with regional fixed effects are applied to data from the European Community Household Panel (1994-2001). The results show a significant and positive effect of regional day care availability on both first and higher-order births, while results are consistent with the second hypothesis only for second or higher-order births.
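As a toy illustration of the estimation approach, a discrete-time event-history model can be fitted as a logit on person-year data. The sketch below uses simulated data with an assumed positive day-care effect and, for brevity, omits the regional fixed effects and duration controls that the paper's models include:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical person-year file: each row is one woman-year at risk;
# the outcome is whether a birth occurs in that year.
n = 20_000
daycare = rng.uniform(0.0, 0.5, size=n)      # regional coverage rate (share)
X = np.column_stack([np.ones(n), daycare])
true_beta = np.array([-2.0, 1.5])            # assumed positive day-care effect
birth = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))

# Discrete-time hazard model fitted as a logit via Newton-Raphson.
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))                    # fitted hazards
    hessian = (X * (mu * (1 - mu))[:, None]).T @ X      # information matrix
    beta = beta + np.linalg.solve(hessian, X.T @ (birth - mu))
```

A positive estimated slope on `daycare` corresponds to the paper's hypothesis (1); testing hypothesis (2) would additionally interact availability with context indicators.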

Relevance:

80.00%

Publisher:

Abstract:

The paper proposes a numerical solution method for general equilibrium models with a continuum of heterogeneous agents, which combines elements of projection and of perturbation methods. The basic idea is to solve first for the stationary solution of the model, without aggregate shocks but with fully specified idiosyncratic shocks. Afterwards one computes a first-order perturbation of the solution in the aggregate shocks. This approach makes it possible to include a high-dimensional representation of the cross-sectional distribution in the state vector. The method is applied to a model of household saving with uninsurable income risk and liquidity constraints. The model includes not only productivity shocks, but also shocks to redistributive taxation, which cause substantial short-run variation in the cross-sectional distribution of wealth. If those shocks are operative, it is shown that a solution method based on very few statistics of the distribution is not suitable, while the proposed method can solve the model with high accuracy, at least for the case of small aggregate shocks. Techniques are discussed to reduce the dimension of the state space such that higher-order perturbations are feasible. Matlab programs to solve the model can be downloaded.
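The two-step structure can be summarized compactly. Stacking the discretized cross-sectional distribution and the individual decision rules into one large state vector X_t, the equilibrium conditions form a system that is first solved without aggregate shocks (the projection step) and then linearized around that stationary point (the perturbation step). This is a sketch of the generic approach, not the paper's exact notation:

```latex
% Equilibrium conditions with aggregate shock z_t, X_t stacking the
% discretized distribution and decision-rule coefficients
\mathbb{E}_t\, F\!\left(X_{t+1},\, X_t,\, z_{t+1},\, z_t\right) = 0

% Step 1 (projection): stationary solution without aggregate shocks
F\!\left(\bar{X},\, \bar{X},\, 0,\, 0\right) = 0

% Step 2 (perturbation): first-order expansion around \bar{X}
X_t - \bar{X} \;\approx\; A\,\left(X_{t-1} - \bar{X}\right) + B\, z_t
```

Because only the aggregate shock is perturbed, the idiosyncratic nonlinearities (e.g., the liquidity constraint) remain fully nonlinear inside the stationary solution, while X_t can include hundreds of distribution points.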

Relevance:

80.00%

Publisher:

Abstract:

In this paper we examine the effect of tax policy on the relationship between inequality and growth in a two-sector non-scale model. In non-scale models, the long-run equilibrium growth rate is determined by technological parameters and is independent of macroeconomic policy instruments. However, this does not imply that fiscal policy is unimportant for long-run economic performance. It has important effects on the levels of key economic variables such as the per capita stock of capital and output. Hence, although the economy grows at the same rate across steady states, the bases for economic growth may differ. The model has three essential features. First, we explicitly model skill accumulation; second, we introduce government finance into the production function; and third, we introduce an income tax to mirror the fiscal events of the 1980s and 1990s in the US. The fact that the non-scale model is associated with higher-order dynamics enables it to replicate the distinctly non-linear nature of inequality in the US with relative ease. The results derived in this paper draw attention to the fact that the non-scale growth model not only fits the US data well for the long run (Jones, 1995b) but also possesses unique abilities in explaining short-term fluctuations of the economy. It is shown that during the transition the response of the simulated relative wage to changes in the tax code is rather non-monotonic, quite in accordance with the US inequality pattern of the 1980s and early 1990s. More specifically, we have analyzed in detail the dynamics following the simulation of an isolated tax decrease and an isolated tax increase. After a tax decrease, the skill premium follows a lower trajectory than the one it would follow without the tax decrease; hence inequality is reduced for several periods after the fiscal shock.
Conversely, following a tax increase, the evolution of the skill premium remains above the trajectory the skill premium would follow with no tax increase. Consequently, a tax increase implies a higher level of inequality in the economy.

Relevance:

80.00%

Publisher:

Abstract:

We calculate the production of two b-quark pairs in hadron collisions. Sources of multiple pairs are multiple interactions and higher-order perturbative QCD mechanisms. We subsequently investigate the competing effects of multiple b-pair production on measurements of CP violation: (i) the increase in event rate with multiple b-pair cross sections, which may reach values of the order of 1 b in the presence of multiple interactions, and (ii) the dilution of b versus b̄ tagging efficiency because of the presence of events with four B mesons. The impact of multiple B-meson production is small unless the cross section for producing a single pair exceeds 1 mb. We show that even for larger values of the cross section the competing effects (i) and (ii) roughly compensate, so that there is no loss in the precision with which CP-violating CKM angles can be determined.

Relevance:

80.00%

Publisher:

Abstract:

Starting from a recent model of the η′N interaction, we evaluate the η′-nucleus optical potential, including the contribution of lowest order in density, tρ/2mη′, together with the second-order terms accounting for η′ absorption by two nucleons. We also calculate the formation cross section of the η′ bound states from (π, p) reactions on nuclei. The η′-nucleus potential suffers from uncertainties tied to the poorly known η′N interaction, which can be partially constrained by the experimental modulus of the η′N scattering length and/or the recently measured transparency ratios in η′ nuclear photoproduction. Assuming an attractive interaction and taking the claimed experimental value |aη′N| = 0.1 fm, we obtain an η′ optical potential in nuclear matter at saturation density of Vη′ = −(8.7 + 1.8i) MeV, not attractive enough to produce η′ bound states in light nuclei. Larger values of the scattering length give rise to deeper optical potentials, with moderate enough imaginary parts. For a value |aη′N| = 0.3 fm, which can still be considered to lie within the uncertainties of the experimental constraints, the spectra of light and medium nuclei show clear structures associated with η′-nuclear bound states and with threshold enhancements in the unbound region.
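The quoted real part can be roughly reproduced from the lowest-order tρ term alone. The back-of-the-envelope check below assumes the standard low-density relation V = −(2π/μ)aρ (with ħ = c = 1 and a sign convention in which a positive scattering length gives attraction) and approximate masses; the imaginary part involves the complex amplitude and two-nucleon absorption, which this sketch ignores:

```python
import math

# Approximate inputs (assumptions, not the paper's exact values)
hbarc = 197.327                      # MeV fm
m_etap, M_N = 957.8, 938.9           # eta' and nucleon masses, MeV
mu = m_etap * M_N / (m_etap + M_N)   # reduced mass, MeV
a = 0.1                              # |a_eta'N| in fm
rho0 = 0.17                          # nuclear saturation density, fm^-3

# Low-density (t*rho) estimate of the real part of the optical potential
V_real = -2 * math.pi * hbarc**2 / mu * a * rho0   # MeV
```

The result is close to −8.8 MeV, in line with the abstract's quoted real part of −8.7 MeV, and it scales linearly with the scattering length, which is why the |a| = 0.3 fm scenario yields a potential roughly three times deeper.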

Relevance:

80.00%

Publisher:

Abstract:

We present an update of neutral Higgs boson decays into bottom quark pairs in the minimal supersymmetric extension of the standard model. In particular the resummation of potentially large higher-order corrections due to the soft supersymmetry (SUSY) breaking parameters Ab and is extended. The remaining theoretical uncertainties due to unknown higher-order SUSY-QCD corrections are analyzed quantitatively.

Relevance:

80.00%

Publisher:

Abstract:

In a recent paper, Komaki studied the second-order asymptotic properties of predictive distributions, using the Kullback-Leibler divergence as a loss function. He showed that estimative distributions with asymptotically efficient estimators can be improved by predictive distributions that do not belong to the model. The model is assumed to be a multidimensional curved exponential family. In this paper we generalize the result, taking as loss function any f-divergence. A relationship arises between alpha-connections and optimal predictive distributions. In particular, using an alpha-divergence to measure the goodness of a predictive distribution, the optimal shift of the estimative distribution is related to alpha-covariant derivatives. The expression that we obtain for the asymptotic risk is also useful for studying the higher-order asymptotic properties of an estimator in the mentioned class of loss functions.
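For reference, the loss family in question can be written down. An f-divergence between densities p and q is generated by any convex f with f(1) = 0, and the alpha-divergences are the sub-family obtained from one particular choice of f (standard definitions; conventions for the alpha parametrization vary across the literature):

```latex
% f-divergence between densities p and q
D_f(p \,\|\, q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,
\qquad f \text{ convex},\; f(1) = 0

% alpha-divergence sub-family (\alpha \neq \pm 1)
f_\alpha(u) \;=\; \frac{4}{1-\alpha^2}\left(1 - u^{(1+\alpha)/2}\right)
```

The Kullback-Leibler divergence used by Komaki is recovered in the limits alpha to ±1, which is why the alpha-divergence results generalize his second-order analysis.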

Relevance:

80.00%

Publisher:

Abstract:

Identifiability of the so-called ω-slice algorithm is proven for ARMA linear systems. Although proofs were developed in the past for the simpler cases of MA and AR models, they were not extendible to general exponential linear systems. The results presented in this paper demonstrate a unique feature of the ω-slice method: unbiasedness and consistency when the order is overdetermined, regardless of the IIR or FIR nature of the underlying system, together with numerical robustness.

Relevance:

80.00%

Publisher:

Abstract:

This study was conducted at colleges in three countries (United States, Venezuela, and Spain) and across three academic disciplines (engineering, education, and business) to examine how experienced faculty define competencies for their discipline and design instructional interaction for online courses. A qualitative research design employing in-depth interviews was selected. Results show that disciplinary knowledge takes precedence when faculty members select competencies to be developed in online courses for their respective professions. In all three disciplines, the design of interaction to correspond with disciplinary competencies was often influenced by contextual factors that modify faculty intention. Therefore, instructional design will vary across countries within the same discipline to address the local context, such as the needs and expectations of the learners; faculty perspectives, beliefs, and values; and the needs of the institution, the community, and the country. Faculty in the three disciplines across the three countries agreed on the importance of the following competencies: knowledge of the field; higher-order cognitive processes such as critical thinking, analysis, problem solving, and transfer of knowledge; oral and written communication skills; teamwork; decision making; and leadership and management skills, indicating far more similarities than differences in competencies between the three applied disciplines. We found a lack of correspondence between faculty's intent to develop collaborative learning skills and their actual development. Contextual factors such as faculty's prior experience in design, student reluctance to engage in collaborative learning, and institutional assessment systems that focus on individual performance were among the reasons for this gap.