956 results for Multinomial logit models with random coefficients (RCL)
Abstract:
This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use, using data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of the free-flow speed distribution; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles among its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability.
A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Overall, this work contributes to the literature on transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take into account the survey design. All methods are rigorously validated on both real and simulated data.
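The Genz algorithm mentioned above is also the basis of the multivariate normal CDF routine shipped with SciPy, so the kind of multidimensional integral involved can be sketched as follows. This is an illustrative sketch only: the dimension, covariance, and cutoff values are invented, not taken from the dissertation's models.

```python
import numpy as np
from scipy.stats import multivariate_normal

# A 4-dimensional rectangle probability of the kind that appears in
# unordered (probit-style) discrete-continuous likelihoods.
# Equicorrelated covariance with correlation 0.5 (assumed values).
dim = 4
cov = 0.5 * np.eye(dim) + 0.5 * np.ones((dim, dim))
upper = np.array([0.5, 0.3, 0.1, 0.8])

# SciPy evaluates this CDF with a quasi-Monte Carlo scheme in the
# spirit of Genz's algorithm; there is no closed form for dim > 1.
p = multivariate_normal(mean=np.zeros(dim), cov=cov).cdf(upper)
print(p)
```

Because the integral is approximated stochastically, repeated calls can return slightly different values; this is exactly the kind of noise that complicates the likelihood optimization discussed in the abstract.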
Abstract:
Creation of cold dark matter (CCDM) can macroscopically be described by a negative pressure, and, therefore, the mechanism is capable of accelerating the Universe, without the need of an additional dark energy component. In this framework, we discuss the evolution of perturbations by considering a Neo-Newtonian approach where, unlike in the standard Newtonian cosmology, the fluid pressure is taken into account even in the homogeneous and isotropic background equations (Lima, Zanchin, and Brandenberger, MNRAS 291, L1, 1997). The evolution of the density contrast is calculated in the linear approximation and compared to the one predicted by the Lambda CDM model. The difference between the CCDM and Lambda CDM predictions at the perturbative level is quantified by using three different statistical methods, namely: a simple chi(2)-analysis in the relevant parameter space, a Bayesian statistical inference, and, finally, a Kolmogorov-Smirnov test. We find that under certain circumstances, the CCDM scenario analyzed here predicts an overall dynamics (including Hubble flow and matter fluctuation field) which fully recovers that of the traditional cosmic concordance model. Our basic conclusion is that such a reduction of the dark sector provides a viable alternative description to the accelerating Lambda CDM cosmology.
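Of the three comparison methods named above, the Kolmogorov-Smirnov test is the easiest to sketch. The samples below are synthetic stand-ins drawn from the same distribution, standing in for matched CCDM and Lambda CDM predictions; they are not the paper's actual density-contrast output.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Hypothetical draws of a growth-related statistic under the two
# cosmologies (invented numbers; identical distributions by design,
# mimicking the paper's finding that the models are indistinguishable).
sample_lcdm = rng.normal(loc=0.50, scale=0.05, size=2000)
sample_ccdm = rng.normal(loc=0.50, scale=0.05, size=2000)

stat, pvalue = ks_2samp(sample_lcdm, sample_ccdm)
# A small statistic / large p-value means the test cannot tell the
# two samples apart.
print(stat, pvalue)
```

The two-sample KS statistic is the maximum vertical distance between the empirical CDFs, so it needs no binning choices, unlike the chi(2)-analysis.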
Abstract:
This paper proposes a regression model considering the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and Jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended for a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
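The core estimation step, maximum likelihood under right censoring, can be sketched for the plain two-parameter Weibull; the paper's modified Weibull adds an extra parameter, which is omitted here to keep the sketch short. All data and parameter values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n, shape_true, scale_true, cens_time = 2000, 1.5, 2.0, 3.0

# Simulated lifetimes, right-censored at a fixed time (an assumption;
# the paper's censoring mechanism may differ).
t = scale_true * rng.weibull(shape_true, size=n)
observed = np.minimum(t, cens_time)
uncensored = t < cens_time

def neg_loglik(theta):
    k, lam = np.exp(theta)          # log-parameterisation keeps k, lam > 0
    z = (observed / lam) ** k
    # Uncensored points contribute log f(t); censored points log S(t) = -z.
    ll = np.sum(uncensored * (np.log(k) - k * np.log(lam)
                              + (k - 1) * np.log(observed))) - np.sum(z)
    return -ll

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(shape_hat, scale_hat)
```

The same log-likelihood split (density term for failures, survival term for censored observations) carries over to the modified Weibull, with its bathtub-shaped hazard.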
Abstract:
Much of the published work regarding the Isotropic Singularity is performed under the assumption that the matter source for the cosmological model is a barotropic perfect fluid, or even a perfect fluid with a gamma-law equation of state. There are, however, some general properties of cosmological models which admit an Isotropic Singularity, irrespective of the matter source. In particular, we show that the Isotropic Singularity is a point-like singularity and that vacuum space-times cannot admit an Isotropic Singularity. The relationships between the Isotropic Singularity, the energy conditions, and the Hubble parameter are explored. A review of work by the authors, regarding the Isotropic Singularity, is presented.
Abstract:
Nine classes of integrable open boundary conditions, further extending the one-dimensional U_q(gl(2|2)) extended Hubbard model, have been constructed previously by means of the boundary Z_2-graded quantum inverse scattering method. The boundary systems are now solved by using the algebraic Bethe ansatz method, and the Bethe ansatz equations are obtained for all nine cases.
Abstract:
When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
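A minimal sketch of the augmented-observation idea: a time-varying constraint R_t beta_t = q_t is stacked onto the measurement equation as an extra (near-)noiseless observation row before the Kalman filter update. The random-walk state equation, the toy demand-style data, and all numbers are assumptions for illustration; the paper's exact formulation and its numerical optimisation of hyperparameters are not reproduced.

```python
import numpy as np

def kf_step(beta, P, x, y, R_t, q_t, sig2_eps, sig2_eta):
    """One Kalman filter step with an augmented observation equation."""
    k = len(beta)
    # Prediction under a random-walk state equation (an assumption).
    beta_pred = beta
    P_pred = P + sig2_eta * np.eye(k)
    # Augment: stack the data row x and the constraint row R_t; the
    # constraint enters with a tiny variance so it binds the update.
    Z = np.vstack([x, R_t])
    obs = np.array([y, q_t])
    H = np.diag([sig2_eps, 1e-12])
    F = Z @ P_pred @ Z.T + H
    K = P_pred @ Z.T @ np.linalg.inv(F)
    beta_new = beta_pred + K @ (obs - Z @ beta_pred)
    P_new = (np.eye(k) - K @ Z) @ P_pred
    return beta_new, P_new

# Toy demand-style example: two coefficients constrained to sum to one.
rng = np.random.default_rng(3)
beta_true = np.array([0.3, 0.7])
beta, P = np.zeros(2), np.eye(2)
for _ in range(50):
    x = rng.normal(size=2)
    y = x @ beta_true + 0.1 * rng.normal()
    beta, P = kf_step(beta, P, x, y,
                      R_t=np.array([1.0, 1.0]), q_t=1.0,
                      sig2_eps=0.01, sig2_eta=1e-6)
print(beta)
```

Because the constraint row has (near-)zero measurement variance, each filtered state satisfies the current constraint almost exactly, while the ordinary observation row drives the estimate toward the data.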
Abstract:
In this Letter we study the process of gluon fusion into a pair of Higgs bosons in a model with one universal extra dimension. We find that the contributions from the extra top quark Kaluza-Klein excitations lead to a Higgs pair production cross section at the LHC that can be significantly altered compared to the Standard Model value for small values of the compactification scale. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
Three kinds of integrable Kondo impurity additions to one-dimensional q-deformed extended Hubbard models are studied by means of the boundary Z_2-graded quantum inverse scattering method. The boundary K matrices depending on the local magnetic moments of the impurities are presented as nontrivial realisations of the reflection equation algebras in an impurity Hilbert space. The models are solved by using the algebraic Bethe ansatz method, and the Bethe ansatz equations are obtained.
Abstract:
Questionnaire surveys, while more economical, typically achieve poorer response rates than interview surveys. We used data from a national volunteer cohort of young adult twins, who were scheduled for assessment by questionnaire in 1989 and by interview in 1996-2000, to identify predictors of questionnaire non-response. Out of a total of 8536 twins, 5058 completed the questionnaire survey (59% response rate), and 6255 completed a telephone interview survey conducted a decade later (73% response rate). Multinomial logit models were fitted to the interview data to identify socioeconomic, psychiatric and health behavior correlates of non-response in the earlier questionnaire survey. Male gender, education below University level, and being a dizygotic rather than monozygotic twin, all predicted reduced likelihood of participating in the questionnaire survey. Associations between questionnaire response status and psychiatric history and health behavior variables were modest, with history of alcohol dependence and childhood conduct disorder predicting decreased probability of returning a questionnaire, and history of smoking and heavy drinking more weakly associated with non-response. Body-mass index showed no association with questionnaire non-response. Despite a poor response rate to the self-report questionnaire survey, we found only limited sampling biases for most variables. While not appropriate for studies where socioeconomic variables are critical, it appears that survey by questionnaire, with questionnaire administration by telephone to non-responders, will represent a viable strategy for gene-mapping studies requiring that large numbers of relatives be screened.
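The multinomial logit fitted above can be sketched as a direct maximum-likelihood estimation. The data here are synthetic (an intercept, one covariate, three outcome categories), not the twin-cohort data; the reference-category parameterisation is a standard convention, assumed rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(1)
n, n_cov, n_classes = 3000, 2, 3   # intercept + 1 covariate; 3 outcomes

X = np.column_stack([np.ones(n), rng.normal(size=n)])
# True coefficients; the first outcome is the reference (all zeros).
B_true = np.array([[0.0, 0.5, -0.5],
                   [0.0, 1.0, -1.0]])
logits = X @ B_true
P = np.exp(logits - logsumexp(logits, axis=1, keepdims=True))
y = (rng.random((n, 1)) > P.cumsum(axis=1)).sum(axis=1)   # sampled outcomes

def neg_loglik(b_flat):
    # Free parameters only for the non-reference outcomes.
    B = np.column_stack([np.zeros(n_cov),
                         b_flat.reshape(n_cov, n_classes - 1)])
    lg = X @ B
    return -(lg[np.arange(n), y] - logsumexp(lg, axis=1)).sum()

res = minimize(neg_loglik, np.zeros(n_cov * (n_classes - 1)), method="BFGS")
B_hat = res.x.reshape(n_cov, n_classes - 1)
print(B_hat)
```

Each column of B_hat gives the log-odds of one outcome category relative to the reference, which is how predictor effects on non-response versus participation would be read off in a study like the one above.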
Abstract:
Doctoral thesis, Economic and Business Sciences (Economic and Social Development and Public Economics), 16 January 2014, Universidade dos Açores.
Abstract:
We consider the quark sector of theories containing three scalar SU(2)_L doublets in the triplet representation of A_4 (or S_4) and three generations of quarks in arbitrary A_4 (or S_4) representations. We show that for all possible choices of quark field representations and for all possible alignments of the Higgs vacuum expectation values that can constitute global minima of the scalar potential, it is not possible to obtain simultaneously nonvanishing quark masses and a nonvanishing CP-violating phase in the Cabibbo-Kobayashi-Maskawa quark mixing matrix. As a result, in this minimal form, models with three scalar fields in the triplet representation of A_4 or S_4 cannot be extended to the quark sector in a way consistent with experiment. DOI: 10.1103/PhysRevD.87.055010.
Abstract:
We study neutrino masses and mixing in the context of flavor models with A_4 symmetry, three scalar doublets in the triplet representation, and three lepton families. We show that there is no representation assignment that yields a dimension-5 mass operator consistent with experiment. We then consider a type-I seesaw with three heavy right-handed neutrinos, explaining in detail why it fails, and allowing us to show that agreement with the present neutrino oscillation data can be recovered with the inclusion of dimension-3 heavy neutrino mass terms that break softly the A_4 symmetry.
Abstract:
Dynamic parallel scheduling using work-stealing has gained popularity in academia and industry for its good performance, ease of implementation and theoretical bounds on space and time. Cores treat their own double-ended queues (deques) as a stack, pushing and popping threads from the bottom, but treat the deque of another randomly selected busy core as a queue, stealing threads only from the top, whenever they are idle. However, this standard approach cannot be directly applied to real-time systems, where the importance of parallelising tasks is increasing due to the limitations of multiprocessor scheduling theory regarding parallelism. Using one deque per core is an obvious source of priority inversion, since high-priority tasks may end up enqueued behind lower-priority tasks, possibly leading to deadline misses because the lower-priority tasks are then the ones eligible to be stolen. Our proposal is to replace the single non-priority deque of work-stealing with ordered per-processor priority deques of ready threads. The scheduling algorithm starts with a single deque per core, but unlike traditional work-stealing, the total number of deques in the system may now exceed the number of processors. Instead of stealing randomly, cores steal from the highest-priority deque.
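A single-threaded sketch of the stealing rule described above: per-core deques keyed by priority, LIFO access for the owner, FIFO access for thieves, and the thief choosing the globally highest-priority deque rather than a random victim. Larger numbers mean higher priority here (an assumption), and real-time concerns such as locking and deque migration are omitted; all names are invented.

```python
from collections import deque

class Core:
    def __init__(self, cid):
        self.cid = cid
        self.deques = {}                     # priority -> deque of tasks

    def push(self, prio, task):
        # Owner pushes at the bottom of the deque for that priority.
        self.deques.setdefault(prio, deque()).append(task)

    def pop_local(self):
        # Owner works LIFO (bottom) on its highest-priority deque.
        if not self.deques:
            return None
        prio = max(self.deques)
        task = self.deques[prio].pop()
        if not self.deques[prio]:
            del self.deques[prio]
        return task

def steal(cores, thief_id):
    # Unlike classic random stealing, steal FIFO (top) from the
    # highest-priority non-empty deque across all other cores.
    best = None
    for c in cores:
        if c.cid == thief_id or not c.deques:
            continue
        prio = max(c.deques)
        if best is None or prio > best[0]:
            best = (prio, c)
    if best is None:
        return None
    prio, victim = best
    task = victim.deques[prio].popleft()
    if not victim.deques[prio]:
        del victim.deques[prio]
    return task

cores = [Core(0), Core(1)]
cores[0].push(1, "a"); cores[0].push(5, "b"); cores[0].push(5, "c")
stolen = steal(cores, thief_id=1)            # idle core 1 steals
print(stolen)
```

The priority-ordered victim selection is what removes the priority-inversion path described above: an idle core can never pick up a low-priority task while a higher-priority one sits stealable elsewhere.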
Abstract:
We study some properties of the monotone solutions of the boundary value problem (p(u'))' - cu' + f(u) = 0, u(-infinity) = 0, u(+infinity) = 1, where f is a continuous function, positive in (0, 1) and taking the value zero at 0 and 1, and p may be an increasing homeomorphism of (0, 1) or (0, +infinity) onto [0, +infinity). This problem arises when we look for travelling waves for the reaction-diffusion equation ∂u/∂t = ∂/∂x [p(∂u/∂x)] + f(u), with the parameter c representing the wave speed. A possible model for the nonlinear diffusion is the relativistic curvature operator p(v) = v/sqrt(1 - v^2). The same ideas apply when p is given by the one-dimensional p-Laplacian p(v) = |v|^(p-2) v. In this case, an advection term is also considered. We show that, as for the classical Fisher-Kolmogorov-Petrovski-Piskounov equations, there is an interval of admissible speeds c and we give characterisations of the critical speed c. We also present some examples of exact solutions. (C) 2014 Elsevier Inc. All rights reserved.