965 results for linear-threshold model
Abstract:
The objectives of the current study were to assess the feasibility of using stayability traits to improve fertility of Nellore cows and to examine the genetic relationship among stayabilities at different ages. Stayability was defined as whether a cow calved every year up to 5 (Stay5), 6 (Stay6), or 7 (Stay7) yr of age or more, given that she was provided the opportunity to breed. Data were analyzed with a maximum a posteriori probit threshold model to predict breeding values on the liability scale, whereas the Gibbs sampler was used to estimate variance components. The EBV were obtained using all animals included in the pedigree or bulls with at least 10 daughters with stayability observations, and average genetic trends were obtained on the liability scale and transformed to the probability scale. Additional analyses were performed to study the genetic relationship among stayability traits, which were compared by contrasting results in terms of EBV and the average genetic superiority as a function of the selected proportion of sires. Heritability estimates and SD were 0.25 +/- 0.02, 0.22 +/- 0.03, and 0.28 +/- 0.03 for Stay5, Stay6, and Stay7, respectively. Average genetic trends, by year, were 0.51, 0.34, and 0.38% for Stay5, Stay6, and Stay7, respectively. Estimates of EBV SD, on the probability scale, for all animals included in the pedigree and for bulls with at least 10 daughters with stayability observations were 7.98 and 12.95, 6.93 and 11.38, and 8.24 and 14.30% for Stay5, Stay6, and Stay7, respectively. A reduction in the average genetic superiorities in Stay7 would be expected if selection were based on Stay5 or Stay6. Nonetheless, the reduction in EPD, depending on selection intensity, is on average only 0.74 and 1.55%, respectively. Regressions of the sires' EBV for Stay5 and Stay6 on the sires' EBV for Stay7 confirmed these results.
The heritability and genetic trend estimates for all stayability traits indicate that it is possible to improve fertility through selection based on a threshold analysis of stayability. The SD of EBV for the stayability traits show that there is adequate genetic variability among animals to justify including stayability as a selection criterion. The near-linear relationship among the stayability traits indicates that selection for improved female fertility would be more effective if based on predictions for the Stay5 trait.
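The liability-to-probability transformation used above is the standard normal CDF implied by the probit link. A minimal sketch of the mechanics, with purely illustrative liability values (the base liability and sire EBV below are not estimates from the study):

```python
from statistics import NormalDist

def liability_to_probability(ebv_liability, base_liability):
    """Map a liability-scale breeding value to the probability scale via
    the standard normal CDF (probit link)."""
    return NormalDist().cdf(base_liability + ebv_liability)

# Illustrative values only (not taken from the study):
p_base = liability_to_probability(0.0, -0.5)   # population-average stayability
p_sire = liability_to_probability(0.3, -0.5)   # daughters of a +0.3 sire
```

A positive liability-scale EBV therefore always raises the probability of staying, but by an amount that depends on where the population mean sits on the normal curve.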
Abstract:
Complex mass poles, or ghost poles, are present in the Hartree-Fock solution of the Schwinger-Dyson equation for the nucleon propagator in renormalizable models with Yukawa-type meson-nucleon couplings, as shown many years ago by Brown, Puff and Wilets (BPW). These ghosts violate basic theorems of quantum field theory, and their origin is related to the ultraviolet behavior of the model interactions. Recently, Krein et al. proved that the ghosts disappear when vertex corrections are included in a self-consistent way, softening the interaction sufficiently in the ultraviolet region. In previous studies of pi N scattering using "dressed" nucleon propagators and bare vertices, done by Nutt and Wilets in the 1970s (NW), it was found that if these poles are explicitly included, the value of the isospin-even amplitude A(+) is satisfied within 20% at threshold. The absence of a theoretical explanation for the ghosts and the lack of chiral symmetry in these previous studies led us to re-investigate the subject using the approach of the linear sigma-model and to study the interplay of low-energy theorems for pi N scattering and ghost poles. For bare interaction vertices we find that ghosts are present in this model as well and that the A(+) value is badly described. As a first approach to removing these complex poles, we dress the vertices with phenomenological form factors, and reasonable agreement with experiment is achieved. In order to fix the two cutoff parameters, we use the A(+) value in the chiral limit (m(pi) --> 0) and the experimental value of the isoscalar scattering length. Finally, we test our model by calculating the phase shifts for the S waves, and we find good agreement at threshold. © 1997 Elsevier B.V.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We present a bidomain fire-diffuse-fire model that facilitates mathematical analysis of propagating waves of elevated intracellular calcium (Ca) in living cells. Modelling Ca release as a threshold process allows the explicit construction of travelling wave solutions to probe the dependence of Ca wave speed on physiologically important parameters such as the threshold for Ca release from the endoplasmic reticulum (ER) to the cytosol, the rate of Ca resequestration from the cytosol to the ER, and the total [Ca] (cytosolic plus ER). Interestingly, linear stability analysis of the bidomain fire-diffuse-fire model predicts the onset of dynamic wave instabilities leading to the emergence of Ca waves that propagate in a back-and-forth manner. Numerical simulations are used to confirm the presence of these so-called "tango waves" and the dependence of Ca wave speed on the total [Ca]. The original publication is available at www.springerlink.com (Journal of Mathematical Biology)
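The fire-diffuse-fire mechanism described above (release sites that fire once local Ca crosses a threshold, with the released Ca then diffusing to trigger neighboring sites) can be illustrated with a simple one-dimensional finite-difference sketch. All parameter values below are illustrative, and this single-domain toy omits the ER compartment of the paper's bidomain model:

```python
import numpy as np

# 1D fire-diffuse-fire sketch: discrete release sites fire once, when the
# local cytosolic Ca exceeds a threshold, and the released Ca spreads by
# diffusion (explicit finite differences, periodic boundaries).
# All parameter values are illustrative.
L, D, dx, dt = 60, 1.0, 1.0, 0.1
threshold, release = 0.5, 15.0
sites = np.arange(4, L - 3, 4)           # evenly spaced release sites
fired = np.zeros(len(sites), dtype=bool)
c = np.zeros(L)
c[sites[0]] = 1.0                        # ignite the first site

for _ in range(5000):
    lap = np.roll(c, 1) - 2 * c + np.roll(c, -1)
    c += dt * D * lap / dx**2            # stable: D*dt/dx**2 = 0.1 <= 0.5
    for k, s in enumerate(sites):
        if not fired[k] and c[s] > threshold:
            fired[k] = True
            c[s] += release              # one-shot Ca release
```

With these parameters every site eventually fires, so a saltatory wave sweeps the domain; lowering the release amount or raising the threshold makes propagation fail, which is the parameter dependence the explicit travelling-wave construction probes analytically.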
Abstract:
We introduce the log-beta Weibull regression model based on the beta Weibull distribution (Famoye et al., 2005; Lee et al., 2007). We derive expansions for the moment generating function which do not depend on complicated functions. The new regression model represents a parametric family of models that includes as sub-models several widely known regression models that can be applied to censored survival data. We employ a frequentist analysis, a jackknife estimator, and a parametric bootstrap for the parameters of the proposed model. We derive the appropriate matrices for assessing local influences on the parameter estimates under different perturbation schemes and present some ways to assess global influences. Further, for different parameter settings, sample sizes, and censoring percentages, several simulations are performed. In addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be extended to a modified deviance residual in the proposed regression model applied to censored data. We define martingale and deviance residuals to evaluate the model assumptions. The extended regression model is very useful for the analysis of real data and could give more realistic fits than other special regression models.
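The jackknife estimator mentioned above can be illustrated on a much simpler target than the log-beta Weibull likelihood. The sketch below applies the standard leave-one-out jackknife to the (biased) exponential-rate MLE, purely to show the mechanics of the bias correction and standard-error estimate; the sample size and true rate are illustrative:

```python
import numpy as np

# Leave-one-out jackknife: bias-corrected point estimate and a
# standard-error estimate for a plug-in estimator.
rng = np.random.default_rng(42)
true_rate = 2.0
x = rng.exponential(1.0 / true_rate, size=50)

def rate_mle(sample):
    """MLE of the exponential rate; biased upward by a factor n/(n-1)."""
    return 1.0 / sample.mean()

n = len(x)
theta_hat = rate_mle(x)
loo = np.array([rate_mle(np.delete(x, i)) for i in range(n)])
theta_jack = n * theta_hat - (n - 1) * loo.mean()   # bias-corrected estimate
se_jack = np.sqrt((n - 1) / n * ((loo - loo.mean()) ** 2).sum())
```

The same leave-one-out loop applies to any estimator of the regression parameters; only `rate_mle` would be replaced by the model fit.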
Abstract:
In this thesis we implement estimating procedures in order to estimate threshold parameters for continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund to be considered is the Converging Europe Bond fund from the Schroder company and the data are daily prices from the year 2005.
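The conditional least squares idea — profile a residual sum of squares over candidate thresholds after fitting regime-wise drifts — can be sketched for an Euler-discretized threshold Brownian motion with drift. All parameter values and the grid below are illustrative, not the thesis's implementation:

```python
import numpy as np

# Threshold Brownian motion with drift: the drift switches sign at
# r_true, pulling the path back toward the threshold.  The threshold is
# then recovered by conditional least squares over a candidate grid.
rng = np.random.default_rng(0)
mu_above, mu_below = -1.0, 1.0
sigma, r_true, dt, n = 0.3, 0.0, 0.01, 20000

x = np.empty(n)
x[0] = 0.5
for i in range(n - 1):
    drift = mu_above if x[i] >= r_true else mu_below
    x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

dx_inc = np.diff(x)

def sse(r):
    """Residual sum of squares when each regime gets its own mean increment."""
    below = x[:-1] < r
    total = 0.0
    for mask in (below, ~below):
        if mask.any():
            total += ((dx_inc[mask] - dx_inc[mask].mean()) ** 2).sum()
    return total

grid = np.linspace(np.quantile(x, 0.1), np.quantile(x, 0.9), 200)
r_hat = min(grid, key=sse)               # conditional least squares estimate
```

The same profiling step carries over to the geometric Brownian motion and Ornstein-Uhlenbeck cases; only the regime-wise regression for the drift changes.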
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to suggest. In particular, we postulate that runs of very large returns can be predictable for small time periods. In order to prove this we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that make it different from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with General Motors stock price data covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second. JEL classification: C12; C15; C22; C51. Keywords and Phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
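The defining feature of the proposed model — the regime is selected by the shock of the preceding period — can be sketched with a stripped-down two-regime TAR. There is no GARCH component here, and the coefficients and threshold are illustrative, not the paper's TAR(3,1)-GARCH(1,1) specification:

```python
import numpy as np

# Two-regime TAR in which the threshold variable is last period's shock:
# a large negative innovation switches the process into a more
# persistent "stress" regime.  All values are illustrative.
rng = np.random.default_rng(1)
n, thr = 5000, -2.0
phi_calm, phi_stress = 0.1, 0.6
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    phi = phi_stress if eps[t - 1] < thr else phi_calm
    y[t] = phi * y[t - 1] + eps[t]

# Recover the regime-wise AR coefficients by per-regime least squares
stress = eps[:-1] < thr                  # regime indicator for y[1:]
def ols_phi(mask):
    y_lag, y_cur = y[:-1][mask], y[1:][mask]
    return float((y_lag * y_cur).sum() / (y_lag * y_lag).sum())

phi_hat_stress = ols_phi(stress)
phi_hat_calm = ols_phi(~stress)
```

Because the regime indicator is the lagged innovation rather than the lagged level, a dread event is followed by a run of dependent returns exactly as the abstract describes, while calm periods show almost no persistence.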
Abstract:
In the context of the two-stage threshold model of decision making, with the agent's choices determined by the interaction of three "structural variables," we study the restrictions on behavior that arise when one or more of these variables is exogenously known. Our results supply necessary and sufficient conditions for consistency with the model for all possible states of partial knowledge, and for both single- and multivalued choice functions.
Abstract:
Division of labor in social insects is a key determinant of their ecological success. Recent models emphasize that division of labor is an emergent property of the interactions among nestmates obeying simple behavioral rules. However, the role of evolution in shaping these rules has been largely neglected. Here, we investigate a model that integrates the perspectives of self-organization and evolution. Our point of departure is the response threshold model, where we allow thresholds to evolve. We ask whether the thresholds will evolve to a state where division of labor emerges in a form that fits the needs of the colony. We find that division of labor can indeed evolve through the evolutionary branching of thresholds, leading to workers that differ in their tendency to take on a given task. However, the conditions under which division of labor evolves depend on the strength of selection on the two fitness components considered: the amount of work performed and the distribution of workers over tasks. When selection is strongest on the amount of work performed, division of labor evolves if switching tasks is costly. When selection is strongest on worker distribution, division of labor is less likely to evolve. Furthermore, we show that a biased distribution (such as 3:1) of workers over tasks is not easily achievable by a threshold mechanism, even under strong selection. Contrary to expectation, multiple matings of colony foundresses impede the evolution of specialization. Overall, our model sheds light on the importance of considering the interaction between specific mechanisms and ecological requirements to better understand the evolutionary scenarios that lead to division of labor in complex systems. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00265-012-1343-2) contains supplementary material, which is available to authorized users.
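The response threshold model that serves as the point of departure is commonly written as P(engage) = s^n / (s^n + θ^n): a worker with threshold θ responds to a task of stimulus level s with a probability that rises steeply once s exceeds θ. A minimal sketch (the exponent n = 2 and the threshold values are illustrative):

```python
def response_probability(stimulus, theta, n=2):
    """Probability that a worker with threshold theta engages a task
    whose current stimulus level is `stimulus` (response threshold curve)."""
    return stimulus**n / (stimulus**n + theta**n)

# Illustrative thresholds: after evolutionary branching, low-threshold
# workers respond far more often than high-threshold nestmates at the
# same stimulus level, which is the emergent division of labor.
p_specialist = response_probability(1.0, 0.5)
p_generalist = response_probability(1.0, 2.0)
```

Evolution in the model acts on θ: branching into distinct threshold values converts this single response curve into two worker classes with different task tendencies.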
Abstract:
PURPOSE: The longitudinal relaxation rate (R1) measured in vivo depends on the local microstructural properties of the tissue, such as macromolecular, iron, and water content. Here, we use whole-brain multiparametric in vivo data and a general linear relaxometry model to describe the dependence of R1 on these components. We explore a) the validity of having a single fixed set of model coefficients for the whole brain and b) the stability of the model coefficients in a large cohort. METHODS: Maps of magnetization transfer (MT) and effective transverse relaxation rate (R2*) were used as surrogates for macromolecular and iron content, respectively. Spatial variations in these parameters reflected variations in underlying tissue microstructure. A linear model was applied to the whole brain, including gray/white matter and deep brain structures, to determine the global model coefficients. Synthetic R1 values were then calculated using these coefficients and compared with the measured R1 maps. RESULTS: The model's validity was demonstrated by correspondence between the synthetic and measured R1 values and by high stability of the model coefficients across a large cohort. CONCLUSION: A single set of global coefficients can be used to relate R1, MT, and R2* across the whole brain. Our population study demonstrates the robustness and stability of the model. Magn Reson Med 73:1309-1314, 2015. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc.
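The general linear relaxometry model described here amounts to fitting R1 ≈ β0 + βMT·MT + βR2*·R2* across voxels and then rebuilding "synthetic" R1 from the global coefficients. A minimal sketch with synthetic voxel data (all coefficients and value ranges below are invented for illustration, not the study's estimates):

```python
import numpy as np

# Fit a single set of global coefficients relating R1 to MT and R2*
# across voxels, then reconstruct synthetic R1 from the fit.
rng = np.random.default_rng(7)
n_vox = 1000
MT = rng.uniform(0.5, 2.0, n_vox)        # MT saturation (arbitrary units)
R2s = rng.uniform(10.0, 30.0, n_vox)     # effective R2* (1/s)
beta = np.array([0.2, 0.25, 0.01])       # hypothetical global coefficients
R1 = (beta[0] + beta[1] * MT + beta[2] * R2s
      + 0.005 * rng.standard_normal(n_vox))   # "measured" R1 with noise

X = np.column_stack([np.ones(n_vox), MT, R2s])
beta_hat, *_ = np.linalg.lstsq(X, R1, rcond=None)
R1_synth = X @ beta_hat                  # synthetic R1 from the global fit
```

Agreement between `R1_synth` and the measured `R1`, and stability of `beta_hat` across subjects, are the two validity checks the abstract describes.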