932 results for Multilevel linear model


Relevance: 40.00%
Publisher:
Abstract:

We reassess the method of the linear delta expansion for the calculation of effective potentials in superspace, by adopting the improved version of the super-Feynman rules in the framework of the O'Raifeartaigh model for spontaneous supersymmetry breaking. The effective potential is calculated using both the fastest apparent convergence and the principle of minimal sensitivity criteria and the consistency and efficacy of the method are checked in deriving the Coleman-Weinberg potential.

Relevance: 40.00%
Publisher:
Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 40.00%
Publisher:
Abstract:

It is of major importance to consider non-ideal energy sources in engineering problems: they act on an oscillating system and, at the same time, experience a reciprocal action from the system. Here, a non-ideal system is studied in which the interaction between the energy source and the motion is accomplished through a special kind of friction. Results on the stability and instability of the equilibrium point of this system are obtained, and its bifurcation curves are determined. Hopf bifurcations are found in the parameter set of the oscillating system.

Relevance: 40.00%
Publisher:
Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%
Publisher:
Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%
Publisher:
Abstract:

Complex mass poles, or ghost poles, are present in the Hartree-Fock solution of the Schwinger-Dyson equation for the nucleon propagator in renormalizable models with Yukawa-type meson-nucleon couplings, as shown many years ago by Brown, Puff and Wilets (BPW). These ghosts violate basic theorems of quantum field theory, and their origin is related to the ultraviolet behavior of the model interactions. Recently, Krein et al. proved that the ghosts disappear when vertex corrections are included in a self-consistent way, softening the interaction sufficiently in the ultraviolet region. In previous studies of pi N scattering using a "dressed" nucleon propagator and bare vertices, carried out by Nutt and Wilets in the 1970s (NW), it was found that if these poles are explicitly included, the value of the isospin-even amplitude A^(+) is reproduced to within 20% at threshold. The absence of a theoretical explanation for the ghosts and the lack of chiral symmetry in these earlier studies led us to re-investigate the subject using the linear sigma-model and to study the interplay of low-energy theorems for pi N scattering and ghost poles. For bare interaction vertices we find that ghosts are present in this model as well and that the A^(+) value is badly described. As a first approach to removing these complex poles, we dress the vertices with phenomenological form factors, and reasonable agreement with experiment is achieved. In order to fix the two cutoff parameters, we use the A^(+) value in the chiral limit (m(pi) --> 0) and the experimental value of the isoscalar scattering length. Finally, we test our model by calculating the S-wave phase shifts and find good agreement at threshold. (C) 1997 Elsevier B.V.

Relevance: 40.00%
Publisher:
Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. The setups typically present in lot sizing problems are relaxed, together with the integer frequencies of the cutting patterns in the cutting problem. A large-scale linear optimization problem therefore arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still captures the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
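The pricing step of a column generation scheme for the cutting stock problem can be illustrated in a few lines. The sketch below is a toy, not the paper's implementation: it solves the classical Gilmore-Gomory pricing subproblem, where, given dual values for the part types, an unbounded knapsack dynamic program finds the single-stock cutting pattern of maximum total dual value. A pattern whose value exceeds 1 has negative reduced cost and would enter the master LP.

```python
# Toy Gilmore-Gomory pricing step (illustrative only): find the cutting
# pattern of one stock piece that maximizes total dual value, via an
# unbounded knapsack dynamic program over the stock length.
def best_pattern(stock_len, part_lens, duals):
    # best[c] = maximum dual value achievable within capacity c
    best = [0.0] * (stock_len + 1)
    choice = [-1] * (stock_len + 1)   # last part cut at capacity c
    for c in range(1, stock_len + 1):
        for i, (l, d) in enumerate(zip(part_lens, duals)):
            if l <= c and best[c - l] + d > best[c]:
                best[c] = best[c - l] + d
                choice[c] = i
    # Recover the pattern (how many pieces of each part type).
    pattern = [0] * len(part_lens)
    c = stock_len
    while c > 0 and choice[c] != -1:
        i = choice[c]
        pattern[i] += 1
        c -= part_lens[i]
    return pattern, best[stock_len]
```

In a full column generation loop this routine would be called once per iteration with the current LP duals, stopping when no pattern prices out.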

Relevance: 40.00%
Publisher:
Abstract:

We consider model selection uncertainty in linear regression. We study theoretically and by simulation the approach of Buckland and co-workers, who proposed estimating a parameter common to all models under study by taking a weighted average over the models, using weights obtained from information criteria or the bootstrap. This approach is compared with the usual approach in which the 'best' model is used, and with Bayesian model averaging. The weighted predictor behaves similarly to model averaging, with generally more realistic mean-squared errors than the usual model-selection-based estimator.
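The weighting scheme can be made concrete. The sketch below is a minimal illustration of information-criterion weights in the spirit of Buckland and co-workers, assuming Akaike weights w_i proportional to exp(-Delta_i/2) with Delta_i the AIC difference from the best model; the numeric values are invented, not from the paper.

```python
# Minimal sketch of information-criterion model averaging: each candidate
# model contributes its estimate of a common parameter, weighted by
# Akaike weights w_i ∝ exp(-Delta_i / 2), Delta_i = AIC_i - min_j AIC_j.
import math

def akaike_weights(aics):
    delta = [a - min(aics) for a in aics]
    raw = [math.exp(-d / 2.0) for d in delta]
    total = sum(raw)
    return [r / total for r in raw]

def averaged_estimate(estimates, aics):
    # Weighted average of the per-model estimates of the common parameter.
    w = akaike_weights(aics)
    return sum(wi * ei for wi, ei in zip(w, estimates))
```

The bootstrap-based weights mentioned in the abstract would replace `akaike_weights` with resampling frequencies of each model being selected.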

Relevance: 40.00%
Publisher:
Abstract:

Classical procedures for model updating in non-linear mechanical systems based on vibration data can fail because the common linear metrics are not sensitive to non-linear behavior caused by gaps, backlash, bolts, joints, materials, etc. Several strategies have been proposed in the literature to allow a correctly representative model of non-linear structures. The present paper evaluates the performance of two approaches based on different objective functions. The first is a time-domain methodology based on the proper orthogonal decomposition constructed from the output time histories. The second approach uses objective functions with multiple convolutions described by the first- and second-order discrete-time Volterra kernels. In order to discuss the results, a benchmark of a clamped-clamped beam with a pre-applied static load is simulated and updated using proper orthogonal decomposition and Volterra series. The comparisons and discussion of the results show the practical applicability and drawbacks of both approaches.
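The first objective function rests on the proper orthogonal decomposition of measured responses, which can be sketched with an SVD. The snippet below is only an illustration of the standard snapshot-POD construction (numpy assumed; the snapshot layout and per-channel mean removal are my assumptions, not necessarily the paper's exact procedure).

```python
# Sketch of snapshot POD: stack output time histories column-wise,
# remove each channel's mean, and take the left singular vectors as
# the proper orthogonal modes; squared singular values give the
# energy captured by each mode.
import numpy as np

def pod_modes(snapshots, n_modes):
    # snapshots: (n_dof, n_samples), one column per time instant
    X = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    energy = s**2 / np.sum(s**2)   # fraction of signal energy per mode
    return U[:, :n_modes], energy[:n_modes]
```

In an updating loop, an objective function could then compare the modes (or their subspace angles) between measured and simulated responses.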

Relevance: 40.00%
Publisher:
Abstract:

This work addresses the solution of the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to dealing with this kind of problem is the inclusion of a non-linear cost constraint in the control problem. The control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem that, for high-order systems, can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with conventional robust MPC and tested through the simulation of a reactor system from the process industry.
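As background to the formulations compared above: for a nominal linear model with no constraints, the finite-horizon MPC action reduces to a linear least-squares problem. The sketch below shows only that baseline computation, with hypothetical system matrices; the paper's actual contribution (robust zone control via LMIs under model uncertainty) is not reproduced here.

```python
# Baseline sketch only: unconstrained finite-horizon MPC for a nominal
# model x+ = A x + B u, y = C x. Minimizing
#   ||predicted outputs - reference||^2 + lam * ||u||^2
# over the input sequence is a regularized least-squares problem.
import numpy as np

def mpc_action(A, B, C, x0, ref, horizon, lam=0.1):
    n, m = A.shape[0], B.shape[1]
    F = np.zeros((horizon, n))            # free-response map (single output)
    G = np.zeros((horizon, horizon * m))  # forced-response (Toeplitz) map
    Ak = np.eye(n)
    for k in range(horizon):
        Ak = A @ Ak                       # Ak = A^(k+1)
        F[k] = C @ Ak                     # y_{k+1} free-response row
        for j in range(k + 1):
            G[k, j*m:(j+1)*m] = C @ np.linalg.matrix_power(A, k - j) @ B
    # Normal equations of min ||G u + F x0 - ref||^2 + lam ||u||^2
    H = G.T @ G + lam * np.eye(horizon * m)
    u = np.linalg.solve(H, G.T @ (ref - F @ x0))
    return u[:m]                          # receding horizon: apply first move
```

Adding input/output constraints turns this into a QP, and the robust versions discussed in the abstract replace it with NLP or, in the proposed approach, an LMI problem.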

Relevance: 40.00%
Publisher:
Abstract:

The beta-Birnbaum-Saunders (Cordeiro and Lemonte, 2011) and Birnbaum-Saunders (Birnbaum and Saunders, 1969a) distributions have been used quite effectively to model failure times of materials subject to fatigue and, more generally, lifetime data. We define the log-beta-Birnbaum-Saunders distribution as the distribution of the logarithm of the beta-Birnbaum-Saunders distribution. Explicit expressions for its generating function and moments are derived. We propose a new log-beta-Birnbaum-Saunders regression model that can be applied to censored data and used more effectively in survival analysis. We obtain the maximum likelihood estimates of the model parameters for censored data and investigate influence diagnostics. The new location-scale regression model is modified to allow for the possibility that long-term survivors may be present in the data. Its usefulness is illustrated by means of two real data sets. (C) 2011 Elsevier B.V. All rights reserved.

Relevance: 40.00%
Publisher:
Abstract:

This paper addresses the numerical solution of random crack propagation problems by coupling the boundary element method (BEM) with reliability algorithms. The crack propagation phenomenon is efficiently modelled using the BEM, due to its mesh-reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in the search for the design point. Different experiment designs and adaptive schemes are considered. The alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of the two coupling methods is compared in application to several crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of the problem's nonlinearity. The computational cost of direct coupling was shown to be a fraction of the cost of the response surface solutions, regardless of the experiment design or adaptive scheme considered. (C) 2012 Elsevier Ltd. All rights reserved.
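The design-point search at the heart of both reliability algorithms can be illustrated with the HLRF iteration of first-order reliability (FORM). In the paper the limit state stays implicit in the BEM response; the toy below uses an explicit linear limit state in standard normal space, so the reliability index is known in closed form (all values hypothetical).

```python
# HLRF iteration for the FORM design point in standard normal space:
#   u_{k+1} = [(grad_g . u_k - g(u_k)) / ||grad_g||^2] * grad_g
# The design point is the most likely failure point; its norm is the
# reliability index beta.
import math

def hlrf(g, grad_g, u0, tol=1e-8, max_iter=100):
    u = list(u0)
    for _ in range(max_iter):
        gu, dg = g(u), grad_g(u)
        norm2 = sum(d * d for d in dg)
        scale = (sum(d * ui for d, ui in zip(dg, u)) - gu) / norm2
        u_new = [scale * d for d in dg]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            return u_new, math.sqrt(sum(ui * ui for ui in u_new))
        u = u_new
    return u, math.sqrt(sum(ui * ui for ui in u))

# Toy linear limit state g(u) = 3 - (0.6 u1 + 0.8 u2); since the
# gradient has unit norm, the reliability index should be exactly 3.
g = lambda u: 3.0 - 0.6 * u[0] - 0.8 * u[1]
dg = lambda u: [-0.6, -0.8]
```

In the direct coupling scheme described above, `g` would call the BEM solver and `grad_g` would be obtained from the numerical response rather than analytically.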

Relevance: 40.00%
Publisher:
Abstract:

In the present work we perform an econometric analysis of the Tribal art market. To this aim, we use a unique and original database that includes information on Tribal art auctions worldwide from 1998 to 2011. In the literature, art prices are modelled through the hedonic regression model, a classic fixed-effects model. The main drawback of the hedonic approach is the large number of parameters, since, in general, art data include many categorical variables. In this work, we propose a multilevel model for the analysis of Tribal art prices that takes into account the influence of time on artwork prices. It is natural to assume that time influences the price dynamics in various ways. Nevertheless, since the set of objects changes at every auction date, we do not have repeated measurements of the same items over time. Hence, the dataset does not constitute a proper panel; rather, it has a two-level structure in which items, the level-1 units, are grouped in time points, the level-2 units. The main theoretical contribution is the extension of classical multilevel models to cope with the case described above. In particular, we introduce a model with time-dependent random effects at the second level. We propose a novel specification of the model, derive the maximum likelihood estimators and implement them through the E-M algorithm. We test the finite-sample properties of the estimators, and the validity of our purpose-written R code, by means of a simulation study. Finally, we show that the new model considerably improves the fit of the Tribal art data with respect to both the hedonic regression model and the classic multilevel model.
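The two-level structure described above (items nested in auction dates) can be illustrated with a toy random-intercept model, y_ij = mu + u_j + e_ij, where u_j is the level-2 (date) effect. The sketch below recovers the two variance components with ANOVA-type method-of-moments estimators; it is only a simplified stand-in for the paper's model with time-dependent random effects estimated by E-M, and all numbers are invented.

```python
# Toy two-level random-intercept model: level-1 units (items) grouped
# in level-2 units (auction dates). Simulate balanced data, then
# recover the variance components by one-way ANOVA moments:
#   Var(group mean) = sigma2_u + sigma2_e / n_per_group.
import random
import statistics

def simulate(n_groups, n_per_group, mu, sd_u, sd_e, seed=42):
    rng = random.Random(seed)
    data = []
    for _ in range(n_groups):
        u = rng.gauss(0.0, sd_u)   # shared date effect for this group
        data.append([mu + u + rng.gauss(0.0, sd_e) for _ in range(n_per_group)])
    return data

def variance_components(data):
    n = len(data[0])
    group_means = [statistics.fmean(g) for g in data]
    sigma2_e = statistics.fmean(statistics.variance(g) for g in data)
    between = statistics.variance(group_means)
    sigma2_u = max(between - sigma2_e / n, 0.0)
    return sigma2_u, sigma2_e
```

The paper's model goes beyond this by letting the second-level random effects depend on time and by estimating everything jointly via maximum likelihood with the E-M algorithm.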