940 results for Linear model


Relevance: 40.00%
Publisher:
Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%
Publisher:
Abstract:

Complex mass poles, or ghost poles, are present in the Hartree-Fock solution of the Schwinger-Dyson equation for the nucleon propagator in renormalizable models with Yukawa-type meson-nucleon couplings, as shown many years ago by Brown, Puff and Wilets (BPW). These ghosts violate basic theorems of quantum field theory, and their origin is related to the ultraviolet behavior of the model interactions. Recently, Krein et al. proved that the ghosts disappear when vertex corrections are included in a self-consistent way, softening the interaction sufficiently in the ultraviolet region. In previous studies of pi N scattering using a "dressed" nucleon propagator and bare vertices, carried out by Nutt and Wilets in the 1970s (NW), it was found that if these poles are explicitly included, the value of the isospin-even amplitude A^(+) is satisfied to within 20% at threshold. The absence of a theoretical explanation for the ghosts and the lack of chiral symmetry in these previous studies led us to re-investigate the subject using the approach of the linear sigma-model and to study the interplay of low-energy theorems for pi N scattering and ghost poles. For bare interaction vertices we find that ghosts are present in this model as well and that the A^(+) value is badly described. As a first approach to removing these complex poles, we dress the vertices with phenomenological form factors, and reasonable agreement with experiment is achieved. In order to fix the two cutoff parameters, we use the A^(+) value in the chiral limit (m_pi --> 0) and the experimental value of the isoscalar scattering length. Finally, we test our model by calculating the phase shifts for the S waves, and we find good agreement at threshold. (C) 1997 Elsevier B.V.

Relevance: 40.00%
Publisher:
Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups, which are typically present in lot sizing problems, are relaxed together with the integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present some sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
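The pricing step of the column generation technique described above can be sketched as an unbounded knapsack problem: given the dual prices of the part-demand rows of the master LP, a new cutting pattern is worth adding only if its reduced cost (one roll minus the dual value it collects) is negative. A minimal pure-Python sketch; the part widths, dual values, and roll width below are hypothetical:

```python
def best_pattern(widths, duals, roll_width):
    """Pricing subproblem for cutting-stock column generation:
    maximise sum(duals[i] * a[i]) s.t. sum(widths[i] * a[i]) <= roll_width,
    a[i] >= 0 integer. Solved by DP over the residual roll width.
    A new pattern improves the master LP iff 1 - best_value < 0."""
    best = [0.0] * (roll_width + 1)
    take = [-1] * (roll_width + 1)   # item added at this capacity (-1: leave a unit unused)
    for cap in range(1, roll_width + 1):
        best[cap] = best[cap - 1]
        for i, w in enumerate(widths):
            if w <= cap and best[cap - w] + duals[i] > best[cap]:
                best[cap] = best[cap - w] + duals[i]
                take[cap] = i
    # recover the pattern: how many pieces of each width to cut from one roll
    pattern = [0] * len(widths)
    cap = roll_width
    while cap > 0:
        i = take[cap]
        if i < 0:
            cap -= 1
        else:
            pattern[i] += 1
            cap -= widths[i]
    return best[roll_width], pattern

# hypothetical instance: two part widths, duals from the current master LP
value, pattern = best_pattern(widths=[3, 5], duals=[0.4, 0.7], roll_width=11)
```

In a full implementation this routine alternates with re-solving the restricted master LP until no pattern with negative reduced cost remains.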

Relevance: 40.00%
Publisher:
Abstract:

We consider model selection uncertainty in linear regression. We study theoretically and by simulation the approach of Buckland and co-workers, who proposed estimating a parameter common to all models under study by taking a weighted average over the models, using weights obtained from information criteria or the bootstrap. This approach is compared with the usual approach in which the 'best' model is used, and with Bayesian model averaging. The weighted predictor behaves similarly to model averaging, with generally more realistic mean-squared errors than the usual model-selection-based estimator.
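The information-criterion weights used in this style of model averaging are typically computed from AIC differences (Akaike weights). A minimal sketch of that computation; the AIC values below are hypothetical:

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - AIC_min)).
    The model-averaged estimate is then sum_i w_i * theta_hat_i."""
    a_min = min(aics)                 # subtract the minimum for numerical stability
    raw = [math.exp(-0.5 * (a - a_min)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

# hypothetical AIC values for three candidate regression models
weights = akaike_weights([100.0, 102.0, 110.0])
```

The best model (smallest AIC) receives the largest weight, and a difference of 2 AIC units corresponds to a weight ratio of e.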

Relevance: 40.00%
Publisher:
Abstract:

Classical procedures for model updating in non-linear mechanical systems based on vibration data can fail because the common linear metrics are not sensitive to non-linear behavior caused by gaps, backlash, bolts, joints, materials, etc. Several strategies have been proposed in the literature to allow a correct, representative model of non-linear structures. The present paper evaluates the performance of two approaches based on different objective functions. The first is a time-domain methodology based on the proper orthogonal decomposition constructed from the output time histories. The second approach uses objective functions with multiple convolutions described by the first- and second-order discrete-time Volterra kernels. In order to discuss the results, a benchmark of a clamped-clamped beam with a pre-applied static load is simulated and updated using proper orthogonal decomposition and Volterra series. The comparison and discussion of the results show the practical applicability and drawbacks of both approaches.
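The proper orthogonal decomposition of the output time histories can be sketched without any linear-algebra library: the dominant POD mode is the leading eigenvector of the snapshot correlation matrix, which power iteration recovers. A minimal sketch with a fabricated two-sensor snapshot set (not data from the paper):

```python
def dominant_pod_mode(snapshots, iters=200):
    """snapshots: list of time samples, each a list of sensor readings.
    Returns the dominant POD mode, i.e. the unit leading eigenvector of the
    correlation matrix R[i][j] = mean over time of x_i(t) * x_j(t)."""
    n = len(snapshots)
    m = len(snapshots[0])
    R = [[sum(s[i] * s[j] for s in snapshots) / n for j in range(m)]
         for i in range(m)]
    v = [1.0] * m                     # power iteration starting vector
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# fabricated two-sensor histories, all proportional to the shape (0.6, 0.8)
snapshots = [[1.5, 2.0], [-3.0, -4.0], [6.0, 8.0]]
mode = dominant_pod_mode(snapshots)
```

In an updating loop, the objective function would then compare such modes (or the associated energies) between the measured and simulated responses.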

Relevance: 40.00%
Publisher:
Abstract:

This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to dealing with this kind of problem is the inclusion of a non-linear cost constraint in the control problem. The control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem, which for high-order systems can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with the conventional robust MPC and tested through the simulation of a reactor system from the process industry.

Relevance: 40.00%
Publisher:
Abstract:

The beta-Birnbaum-Saunders (Cordeiro and Lemonte, 2011) and Birnbaum-Saunders (Birnbaum and Saunders, 1969a) distributions have been used quite effectively to model failure times of materials subject to fatigue and lifetime data. We define the log-beta-Birnbaum-Saunders distribution as the distribution of the logarithm of a beta-Birnbaum-Saunders random variable. Explicit expressions for its generating function and moments are derived. We propose a new log-beta-Birnbaum-Saunders regression model that can be applied to censored data and used more effectively in survival analysis. We obtain the maximum likelihood estimates of the model parameters for censored data and investigate influence diagnostics. The new location-scale regression model is modified to allow for the possibility that long-term survivors may be present in the data. Its usefulness is illustrated by means of two real data sets. (C) 2011 Elsevier B.V. All rights reserved.
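For intuition on the log transform underlying such a model: a Birnbaum-Saunders variate can be generated from a standard normal Z via T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)^2 + 1))^2, which is strictly positive, so its logarithm is always well defined. A minimal sketch with illustrative shape/scale values (not from the paper):

```python
import math
import random

def bs_sample(alpha, beta, n, seed=0):
    """Draw n Birnbaum-Saunders(alpha, beta) variates via the
    standard-normal representation; the median of T is exactly beta."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        h = alpha * z / 2.0
        out.append(beta * (h + math.sqrt(h * h + 1.0)) ** 2)
    return out

samples = bs_sample(alpha=0.5, beta=2.0, n=1000)
logs = [math.log(t) for t in samples]  # support of the log-lifetime is the whole real line
```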

Relevance: 40.00%
Publisher:
Abstract:

This paper addresses the numerical solution of random crack propagation problems using the coupling of the boundary element method (BEM) and reliability algorithms. The crack propagation phenomenon is efficiently modelled using the BEM, due to its mesh reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in the search for the design point. Different experiment designs and adaptive schemes are considered. The alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of both coupling methods is compared in application to some crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of the problem's nonlinearity. The computational cost of direct coupling was shown to be a fraction of the cost of the response surface solutions, regardless of the experiment design or adaptive scheme considered. (C) 2012 Elsevier Ltd. All rights reserved.
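The design-point search common to both coupling strategies can be sketched with the standard HLRF (Hasofer-Lind-Rackwitz-Fiessler) recursion in standard normal space; in the paper the limit state and its gradients come from the BEM model, whereas the limit state below is a hypothetical analytic stand-in:

```python
import math

def hlrf(g, grad, u0, tol=1e-8, max_iter=100):
    """HLRF iteration: find the design point u* minimizing ||u|| on g(u) = 0;
    the reliability index is beta = ||u*||."""
    u = list(u0)
    for _ in range(max_iter):
        gv = g(u)
        gr = grad(u)
        norm2 = sum(x * x for x in gr)
        # HLRF update: u_new = ((grad . u - g(u)) / ||grad||^2) * grad
        c = (sum(a * b for a, b in zip(gr, u)) - gv) / norm2
        u_new = [c * a for a in gr]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(x * x for x in u))
    return u, beta

# hypothetical linear limit state g(u) = 3 - u1 - u2; design point at (1.5, 1.5)
g = lambda u: 3.0 - u[0] - u[1]
grad = lambda u: [-1.0, -1.0]
u_star, beta = hlrf(g, grad, [0.0, 0.0])
```

In the direct-coupling variant, `grad` would be replaced by finite differences of the numerical (BEM) response, which is what keeps the limit state implicit.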

Relevance: 40.00%
Publisher:
Abstract:

In this thesis, functional renormalization group techniques were applied to the study of the quantum field theory of a scalar field with O(N) symmetry, both in flat (Euclidean) spacetime and in the case of coupling to a gravitational field within the asymptotic safety paradigm. The first chapter briefly reviews some basic concepts of field theory in a Euclidean space of arbitrary dimension. The second chapter discusses extensively the functional renormalization method devised by Wetterich and provides a first simple example of its application, the scalar model. The third chapter studies in detail the O(N) model in flat spacetime, deriving analytically the flow equations for the relevant quantities of the model, and then specializes to the case of infinite N. The fourth chapter begins the analysis of the fixed-point equations in the infinite-N limit, starting from the case of vanishing anomalous dimension and constant wave-function renormalization (the LPA approximation), already studied in the literature; the NLO case in the derivative expansion is then considered. The fifth chapter introduces the non-minimal coupling to a gravitational field, whose quantum nature is treated at the QFT level according to the asymptotic safety renormalizability paradigm. For this model, the fixed-point equations for the main observables are derived and their behavior is studied for different values of N.
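The exact flow equation at the heart of the functional method discussed above is the Wetterich equation; in standard notation, with Gamma_k the effective average action, R_k the infrared regulator, and t = ln k:

```latex
\partial_t \Gamma_k \;=\; \frac{1}{2}\,
\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)} + R_k\right)^{-1}\partial_t R_k\right]
```

In the LPA mentioned above, Gamma_k is truncated to a standard kinetic term plus a running potential U_k, and this trace reduces to an ordinary differential equation for U_k.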

Relevance: 40.00%
Publisher:
Abstract:

To propose the determination of the macromolecular baseline (MMBL) in clinical 1H MR spectra based on T(1) and T(2) differentiation using 2D fitting in FiTAID, a general Fitting Tool for Arrays of Interrelated Datasets.