54 results for equilibrium asset pricing models with latent variables


Relevance: 100.00%

Abstract:

The human papillomavirus (HPV) has historically been associated with head and neck cancers, although its role in oral carcinogenesis remains poorly defined. The purpose of this study was to investigate the prevalence of HPV in mouth floor squamous cell carcinoma and correlate it with clinicopathologic variables, risk factors and survival. HPV presence was evaluated by nested polymerase chain reaction (nPCR) in 29 paraffin-embedded specimens of mouth floor squamous cell carcinoma. HPV DNA was detected in 17.2% (5 of 29) of the specimens; the highest prevalence was observed in non-smoking patients over the age of 60 years. All HPV DNA-positive specimens came from men with clinical stage III and IV lesions, most of which were moderately differentiated. Despite this pattern, no statistically significant differences were observed among the analyzed variables, including patients' survival. The relatively low incidence of HPV DNA in these tumors suggests that this virus does not, by itself, have a significant role in the development of mouth floor squamous cell carcinoma. J Oral Pathol Med (2008) 37: 593-598

Relevance: 100.00%

Abstract:

In some supergravity models there are light, weakly coupled scalar (S) and pseudoscalar (P) particles. These particles accompany a superlight gravitino. In these models the decay S/P → γγ exists. We examine constraints on this process by treating these photons as responsible for the extragalactic background light. We also consider the amount of S/P particles produced through the fusion of cosmic background photons and contributing to the effective number of light neutrino species during primordial nucleosynthesis. We obtain bounds on the gravitino mass complementary to the existing ones.

Relevance: 100.00%

Abstract:

The soliton spectrum (massive and massless) of a family of integrable models with local U(1) and U(1) ⊗ U(1) symmetries is studied. These models represent relevant integrable deformations of the SL(2,ℝ) ⊗ U(1)^{n-1} WZW and SL(2,ℝ) ⊗ SL(2,ℝ) ⊗ U(1)^{n-2} WZW models. Their massless solitons appear as specific topological solutions of the U(1) (or U(1) ⊗ U(1)) CFTs. The nonconformal analog of the GKO coset formula is derived and used in the construction of the composite massive solitons of the ungauged integrable models. © SISSA/ISAS 2002.

Relevance: 100.00%

Abstract:

We explore regions of parameter space in a simple exponential model of the form V = V₀ e^{-λQ/M_p} that are allowed by observational constraints. We find that the level of fine tuning in these models is no different from that of more sophisticated models of dark energy. We study a transient regime in which the parameter λ has to be less than √3 and the fixed point Ω_Q = 1 has not yet been reached. All values of the parameter λ that lead to this transient regime are permitted. We also point out that this model can accelerate the universe today even for λ > √2, leading to a halt of the present acceleration in the future and thus avoiding the horizon problem. We conclude that this model cannot be discarded by current observations. © SISSA/ISAS 2002.
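The √2 and √3 thresholds above follow from the standard fixed-point analysis of exponential quintessence; as a minimal sketch (the function names are ours, not the paper's), the scalar-dominated attractor has equation of state w_Q = λ²/3 − 1, so permanent acceleration requires λ < √2, while √2 < λ < √3 allows only the transient acceleration the abstract describes:

```python
import math

def fixed_point_w(lam):
    """Equation of state at the scalar-dominated fixed point of exponential
    quintessence, V = V0 * exp(-lam * Q / Mp): w_Q = lam**2 / 3 - 1.
    (Standard result; the fixed point Omega_Q = 1 exists for lam**2 < 6.)"""
    return lam ** 2 / 3.0 - 1.0

def accelerates_at_fixed_point(lam):
    """Acceleration at the fixed point needs w_Q < -1/3, i.e. lam < sqrt(2)."""
    return fixed_point_w(lam) < -1.0 / 3.0

# lam < sqrt(2): the attractor accelerates forever.
# sqrt(2) < lam < sqrt(3): acceleration can only be transient, which is
# the regime that lets the model avoid the horizon problem.
```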

Relevance: 100.00%

Abstract:

Linear mixed effects models have been widely used in the analysis of data where responses are clustered around some random effects, so that it is not reasonable to assume independence between observations in the same cluster. In most biological applications, the distributions of the random effects and of the residuals are assumed to be Gaussian, which makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions are described for robust inference. Specific distributions examined include univariate and multivariate versions of the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted and Markov chain Monte Carlo (MCMC) is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
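The normal/independent family is computationally convenient because each member is a scale mixture of normals; a minimal sketch of the Student-t case (our own illustration, not the paper's code) draws the latent mixing weight explicitly, which is exactly the data-augmentation step a Gibbs sampler would use:

```python
import math
import random

def student_t_via_augmentation(df, rng):
    """Draw a Student-t(df) variate as a scale mixture of normals:
    w ~ Gamma(df/2, rate=df/2), then e | w ~ N(0, 1/w).
    In a Gibbs sampler the latent w_i are updated alongside the model
    parameters, automatically down-weighting outlying observations."""
    w = rng.gammavariate(df / 2.0, 2.0 / df)  # (shape, scale) parametrisation
    return rng.gauss(0.0, 1.0 / math.sqrt(w))

rng = random.Random(42)
draws = [student_t_via_augmentation(5.0, rng) for _ in range(20000)]
```

The sample variance of the draws should approach df/(df − 2) = 5/3, the variance of a Student-t with 5 degrees of freedom, confirming the mixture representation.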

Relevance: 100.00%

Abstract:

We analyze double Higgs boson production at the Large Hadron Collider in the context of Little Higgs models. In double Higgs production, the diagrams involved are directly related to those that cause the cancellation of the quadratic divergence of the Higgs self-energy, providing a robust prediction for this class of models. We find that in extensions of this model with the inclusion of a so-called T-parity, there is a significant enhancement in the cross sections as compared to the Standard Model. © SISSA 2006.

Relevance: 100.00%

Abstract:

Numerical modeling of the interaction between waves and coastal structures is a challenge due to the many nonlinear phenomena involved, such as wave propagation, wave transformation with water depth, interaction between incident and reflected waves, run-up/run-down and wave overtopping. Numerical models based on a Lagrangian formulation, like SPH (Smoothed Particle Hydrodynamics), allow complex free-surface flows to be simulated. Validation of these numerical models is essential, but comparing numerical results with experimental data is not an easy task. In the present paper, two SPH numerical models, SPHysics LNEC and SPH UNESP, are validated by comparing numerical results of waves interacting with a vertical breakwater against data obtained in physical model tests performed in one of LNEC's flumes. To achieve this validation, the experimental set-up is designed to be compatible with the characteristics of the numerical models: the flume dimensions are exactly the same for the numerical and physical models, and the incident wave characteristics are identical, which allows the accuracy of the numerical models to be determined, particularly regarding two complex phenomena: wave breaking and impact loads on the breakwater. It is shown that partial renormalization, i.e. renormalization applied only to particles near the structure, is a promising compromise and an original method that allows waves to be propagated without diffusion while the pressure field near the structure is modeled accurately.
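The core of any SPH model of this kind is a smoothing kernel and a density summation over neighbouring particles; a generic sketch (the cubic-spline kernel is a common choice in SPH codes, though the specific kernels used by SPHysics LNEC and SPH UNESP may differ) looks like this:

```python
import math

def cubic_spline_w(r, h):
    """2-D cubic-spline smoothing kernel W(r, h), a standard SPH kernel
    with compact support r < 2h (generic sketch, not the papers' code)."""
    q = r / h
    sigma = 10.0 / (7.0 * math.pi * h * h)  # 2-D normalisation constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def density(i, positions, masses, h):
    """SPH density summation: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    xi, yi = positions[i]
    return sum(m * cubic_spline_w(math.hypot(xi - x, yi - y), h)
               for (x, y), m in zip(positions, masses))
```

Renormalization (the technique discussed above) corrects this summation near boundaries, where the kernel support is truncated; applying it only to particles near the structure is the partial scheme the paper advocates.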

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Starting with a brief introduction to the evolution of asset pricing models, we introduce the critique made by Benoit Mandelbrot of the normality hypothesis used in building such models. This critique arose when empirical and theoretical models were confronted and the expected results diverged from the ones obtained. Next, we reproduce Mandelbrot's alternative, which he believes is sufficient to solve the main problems implied by the normality hypothesis.
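The practical force of the critique is easy to see numerically: under the normality hypothesis, the standard normal tail formula makes a 5-sigma daily return essentially impossible, yet markets produce such moves repeatedly (a sketch of the arithmetic, our illustration rather than the paper's):

```python
import math

def normal_two_sided_tail(z):
    """P(|X| > z) for a standard normal, via the complementary error function."""
    return math.erfc(z / math.sqrt(2.0))

p = normal_two_sided_tail(5.0)
# Under normality a 5-sigma move occurs roughly once every 1/p days of
# trading, i.e. once in thousands of years -- fat-tailed (stable)
# alternatives of the kind Mandelbrot proposed assign such events far
# larger probabilities, matching the observed frequency of market crashes.
```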

Relevance: 100.00%

Abstract:

An inclusive search for supersymmetric processes that produce final states with jets and missing transverse energy is performed in pp collisions at a centre-of-mass energy of 8 TeV. The data sample corresponds to an integrated luminosity of 11.7 fb-1 collected by the CMS experiment at the LHC. In this search, a dimensionless kinematic variable, αT, is used to discriminate between events with genuine and misreconstructed missing transverse energy. The search is based on an examination of the number of reconstructed jets per event, the scalar sum of transverse energies of these jets, and the number of these jets identified as originating from bottom quarks. No significant excess of events over the standard model expectation is found. Exclusion limits are set in the parameter space of simplified models, with a special emphasis on both compressed-spectrum scenarios and direct or gluino-induced production of third-generation squarks. For the case of gluino-mediated squark production, gluino masses up to 950-1125 GeV are excluded depending on the assumed model. For the direct pair-production of squarks, masses up to 450 GeV are excluded for a single light first- or second-generation squark, increasing to 600 GeV for bottom squarks. © 2013 CERN for the benefit of the CMS collaboration.

Relevance: 100.00%

Abstract:

The objective of this work is to develop a non-stoichiometric equilibrium model to study parameter effects in the gasification process of a feedstock in downdraft gasifiers. The non-stoichiometric equilibrium model is also known as the Gibbs free energy minimization method. Four models were developed and tested. First, a pure non-stoichiometric equilibrium model called M1 was developed; then the methane content was constrained by correlating experimental data, generating model M2. A kinetic constraint that determines the apparent gasification rate was considered for model M3, and finally the two aforementioned constraints were implemented together in model M4. Models M2 and M4 proved to be the most accurate among the four developed models, with mean RMS (root mean square error) values of 1.25 each. The gasification of Brazilian Pinus elliottii in a downdraft gasifier with air as gasification agent was also studied. The input parameters considered were: (a) equivalence ratio (0.28-0.35); (b) moisture content (5-20%); (c) gasification time (30-120 min) and (d) carbon conversion efficiency (80-100%). (C) 2014 Elsevier Ltd. All rights reserved.
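The objective function that a non-stoichiometric (Gibbs free energy minimization) model minimises can be sketched as follows; the species data below are placeholders, and a working model would minimise this expression subject to element-balance constraints, which is where methods like those in M2-M4 add their extra constraints:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def total_gibbs(n, gform, T):
    """Total Gibbs free energy of an ideal-gas mixture,
    G = sum_i n_i * (dG_f,i + R * T * ln(n_i / n_tot)),
    the objective minimised in a non-stoichiometric equilibrium model.
    n: molar amounts of each species; gform: standard Gibbs energies of
    formation at T (placeholder values in any example usage)."""
    ntot = sum(n)
    return sum(ni * (g + R * T * math.log(ni / ntot))
               for ni, g in zip(n, gform) if ni > 0)
```

Two sanity checks follow from the formula: for a single species the mixing term vanishes (G = n·ΔG_f), and ideal mixing always lowers G relative to zero-formation-energy pure components.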

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

The optimized allocation of protective devices at strategic points of the circuit improves the quality of the energy supply and the system reliability indices. This paper presents a nonlinear integer programming (NLIP) model with binary variables to deal with the problem of protective device allocation in the main feeder and all branches of an overhead distribution circuit, improving the reliability indices and providing customers with a service of high quality and reliability. The constraints of the problem take into account technical and economic limitations, such as coordination problems of serial protective devices, available equipment, the importance of the feeder and the circuit topology. The use of genetic algorithms (GAs) is proposed to solve this problem, using a binary representation that indicates the allocation (1) or not (0) of protective devices (reclosers, sectionalizers and fuses) at predefined points of the circuit. Results are presented for a real circuit (134 buses) with the possibility of protective device allocation at 29 points, and the ability of the algorithm to find good solutions while significantly improving the reliability indicators is shown. (C) 2003 Elsevier B.V. All rights reserved.
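The GA encoding described above can be sketched generically; the operators below (elitism, one-point crossover, bitwise mutation) are standard choices, not necessarily the exact ones of the paper, and the fitness function (reliability benefit minus equipment cost) is left to the caller:

```python
import random

def binary_ga(fitness, n_bits, pop_size=30, generations=60,
              cx_rate=0.8, mut_rate=0.02, rng=None):
    """Minimal binary genetic algorithm: each chromosome is a 0/1 vector
    marking the candidate points where a protective device is (1) or is
    not (0) allocated.  Generic sketch under stated assumptions."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = scored[:2]                                # elitism: keep best two
        while len(nxt) < pop_size:
            a, b = rng.sample(scored[:pop_size // 2], 2)  # parents from top half
            if rng.random() < cx_rate:
                cut = rng.randrange(1, n_bits)            # one-point crossover
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [bit ^ (rng.random() < mut_rate) for bit in child]  # mutate
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness (maximise the number of allocated devices) just to exercise
# the search loop; a real fitness would score reliability minus cost.
best = binary_ga(lambda bits: sum(bits), n_bits=10)
```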

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)