864 results for Linear Entropy
Abstract:
A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well-known that the usual large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of linear dynamics generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
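The pole search described above can be illustrated with a stripped-down numerical sketch: a single real Laguerre pole is optimized by line-searched gradient descent on the output MSE, with the paper's exact back-propagation-through-time gradients replaced by a central finite difference. The filter-bank size, the "true" pole of 0.6, and the input signal are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def laguerre_outputs(u, pole, n_filters):
    """Outputs of a discrete-time Laguerre filter bank with real pole in (-1, 1):
    a first-order section sqrt(1-p^2)/(1 - p z^-1) followed by cascaded
    all-pass sections (z^-1 - p)/(1 - p z^-1)."""
    N = len(u)
    L = np.zeros((n_filters, N))
    gain = np.sqrt(1.0 - pole**2)
    for k in range(N):
        L[0, k] = pole * (L[0, k - 1] if k else 0.0) + gain * u[k]
    for i in range(1, n_filters):
        for k in range(N):
            x_prev = L[i - 1, k - 1] if k else 0.0
            y_prev = L[i, k - 1] if k else 0.0
            L[i, k] = x_prev - pole * L[i - 1, k] + pole * y_prev
    return L

def cost(pole, u, y, n_filters=3):
    """Least-squares fit of the static map, then the output MSE."""
    L = laguerre_outputs(u, pole, n_filters)
    coef, *_ = np.linalg.lstsq(L.T, y, rcond=None)
    return float(np.mean((y - L.T @ coef) ** 2))

rng = np.random.default_rng(0)
u = rng.standard_normal(400)
y = np.array([0.5, -0.3, 0.2]) @ laguerre_outputs(u, 0.6, 3)  # "true" pole 0.6

pole, eps = 0.2, 1e-5
initial_cost = cost(pole, u, y)
current = initial_cost
for _ in range(40):
    grad = (cost(pole + eps, u, y) - cost(pole - eps, u, y)) / (2 * eps)
    step = 0.5
    while step > 1e-7:  # backtracking line search along -grad
        cand = float(np.clip(pole - step * grad, -0.95, 0.95))
        c = cost(cand, u, y)
        if c < current:
            pole, current = cand, c
            break
        step /= 2
```

The paper replaces the finite-difference gradient with exact analytic gradients and the plain descent with Levenberg-Marquardt; the sketch only shows the overall structure of the search.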
Abstract:
In this article, we are interested in evaluating different parameter estimation strategies for a multiple linear regression model. To estimate the model parameters, we used data from a clinical trial whose aim was to verify whether the mechanical test of the maximum force property (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats of the strain Rattus norvegicus albinus, Wistar variety. Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the bootstrap method, based on resampling procedures.
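The classical and bootstrap strategies compared above can be sketched on synthetic data standing in for the clinical trial (the coefficients, sample size, and noise level are assumptions; the Bayesian strategy is omitted since it needs a prior and a sampler):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the clinical data: y = 1 + 2*x1 - 0.5*x2 + noise
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Classical strategy: ordinary least squares
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bootstrap strategy: refit on case-resampled data to gauge uncertainty
B = 500
boot = np.empty((B, 3))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
se_boot = boot.std(axis=0)
```

The bootstrap standard errors come from the spread of the refitted coefficients over the resampled data sets, with no normality assumption on the errors.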
Abstract:
Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Generally, the normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subjects variation. In this article, we utilize skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent distributions are an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash, and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study. (C) 2009 Elsevier B.V. All rights reserved.
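The skew-normal member of the class can be sampled directly from Azzalini's stochastic representation; a small sketch (the shape parameter alpha = 4 and the sample size are illustrative):

```python
import numpy as np

def rvs_skew_normal(alpha, size, rng):
    """Sample SN(0, 1, alpha) via Azzalini's stochastic representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1 with delta = alpha/sqrt(1 + alpha^2)
    and U0, U1 iid standard normal."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0 = rng.standard_normal(size)
    u1 = rng.standard_normal(size)
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

rng = np.random.default_rng(0)
z = rvs_skew_normal(alpha=4.0, size=100_000, rng=rng)
delta = 4.0 / np.sqrt(17.0)   # E[Z] = delta * sqrt(2/pi) for SN(0, 1, alpha)
```

The heavier-tailed members (skew-t, skew-slash, skew-contaminated normal) follow by additionally scaling Z with an independent mixing variable, which is what the "/independent" in the class name refers to.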
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first one consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other process consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and the parts) and trim losses (in the cutting process). We present some sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
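The heart of a column generation scheme for cutting stock is the pricing subproblem: given the dual prices of the restricted master LP, find the cutting pattern of maximum dual value by an unbounded knapsack DP. A minimal sketch (the roll length, piece sizes, and dual values are made up for illustration):

```python
def best_pattern(roll_len, sizes, duals):
    """Pricing subproblem of Gilmore-Gomory column generation: maximise
    sum_i duals[i] * a_i subject to sum_i sizes[i] * a_i <= roll_len with
    a_i non-negative integers -- an unbounded knapsack solved by DP."""
    best = [0.0] * (roll_len + 1)    # best[c]: max dual value using capacity c
    take = [None] * (roll_len + 1)   # piece index chosen at capacity c, if any
    for c in range(1, roll_len + 1):
        best[c], take[c] = best[c - 1], None      # option: leave one unit idle
        for i, (s, d) in enumerate(zip(sizes, duals)):
            if s <= c and best[c - s] + d > best[c]:
                best[c], take[c] = best[c - s] + d, i
    pattern, c = [0] * len(sizes), roll_len       # trace the choices back
    while c > 0:
        if take[c] is None:
            c -= 1
        else:
            pattern[take[c]] += 1
            c -= sizes[take[c]]
    return pattern, best[roll_len]

pat_a, val_a = best_pattern(10, [3, 4, 5], [1.0, 1.0, 1.0])   # equal duals
pat_b, val_b = best_pattern(10, [3, 4, 5], [0.3, 0.45, 0.6])
```

A pattern whose value exceeds 1 (the LP cost of one roll) prices out and enters the restricted master problem as a new column; the loop terminates when no such pattern exists.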
Abstract:
We introduce a problem called maximum common characters in blocks (MCCB), which arises in applications of approximate string comparison, particularly in the unification of possibly erroneous textual data coming from different sources. We show that this problem is NP-complete, but can nevertheless be solved satisfactorily using integer linear programming for instances of practical interest. Two integer linear formulations are proposed and compared in terms of their linear relaxations. We also compare the results of the approximate matching with other known measures such as the Levenshtein (edit) distance. (C) 2008 Elsevier B.V. All rights reserved.
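The Levenshtein (edit) distance used for comparison is the classic dynamic program; a compact sketch:

```python
def levenshtein(a, b):
    """Levenshtein (edit) distance by the standard dynamic program,
    kept to two rows so memory is O(len(b))."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]
```

Unlike MCCB, which is NP-complete, this distance is computable in O(len(a) * len(b)) time, which is why it serves as a natural baseline measure.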
Abstract:
The estimation of data transformations is very useful to yield response variables that closely satisfy a normal linear model. Generalized linear models enable the fitting of models to a wide range of data types. These models are based on exponential dispersion models. We propose a new class of transformed generalized linear models to extend the Box and Cox models and the generalized linear models. We use the generalized linear model framework to fit these models and discuss maximum likelihood estimation and inference. We give a simple formula to estimate the parameter that indexes the transformation of the response variable for a subclass of models. We also give a simple formula to estimate the rth moment of the original dependent variable. We explore the possibility of applying these models to time series data to extend the generalized autoregressive moving average models discussed by Benjamin et al. [Generalized autoregressive moving average models. J. Amer. Statist. Assoc. 98, 214-223]. The usefulness of these models is illustrated in a simulation study and in applications to three real data sets. (C) 2009 Elsevier B.V. All rights reserved.
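The transformation-parameter estimation can be illustrated for the ordinary Box-Cox case: profile out the mean and variance and maximize the resulting log-likelihood of lambda over a grid (the log-normal test data and the grid are assumptions; the paper's subclass formula is not reproduced here):

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform; the lam -> 0 limit is log(y)."""
    y = np.asarray(y, dtype=float)
    if abs(lam) < 1e-8:
        return np.log(y)
    return (y**lam - 1.0) / lam

def profile_loglik(y, lam):
    """Normal-model log-likelihood of lam with mu, sigma profiled out,
    including the Jacobian term (lam - 1) * sum(log y)."""
    z = boxcox(y, lam)
    n = len(z)
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

rng = np.random.default_rng(1)
y = np.exp(rng.normal(size=500))     # log-normal data: lam = 0 is the ideal transform
grid = np.linspace(-1.0, 1.5, 101)
lam_hat = grid[np.argmax([profile_loglik(y, l) for l in grid])]
```

For log-normal data the profile likelihood peaks near lambda = 0, recovering the log transform.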
Abstract:
Conventional procedures employed in the modeling of viscoelastic properties of polymers rely on the determination of the polymer's discrete relaxation spectrum from experimentally obtained data. In the past decades, several analytical regression techniques have been proposed to determine an explicit equation that describes the measured spectra. Taking a different approach, the procedure introduced herein constitutes a simulation-based computational optimization technique based on a non-deterministic search method arising from the field of evolutionary computation. Instead of comparing numerical results, the purpose of this paper is to highlight some subtle differences between both strategies and to focus on which properties of the exploited technique emerge as new possibilities for the field. To illustrate this, the cases examined show how the employed technique can outperform conventional approaches in terms of fitting quality. Moreover, in some instances, it produces equivalent results with far fewer fitting parameters, which is convenient for computational simulation applications. The problem formulation and the rationale of the highlighted method are discussed herein and constitute the main intended contribution. (C) 2009 Wiley Periodicals, Inc. J Appl Polym Sci 113: 122-135, 2009
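A toy version of such an evolutionary fit: a two-mode Prony series G(t) = g1 e^(-t/tau1) + g2 e^(-t/tau2) is fitted to synthetic relaxation data with an elitist (mu + lambda) evolution strategy. All numbers are illustrative; the paper's actual algorithm and material data are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

t = np.linspace(0.0, 5.0, 60)
# Synthetic "measured" relaxation modulus: two Prony modes (assumed values)
data = 2.0 * np.exp(-t / 0.5) + 1.0 * np.exp(-t / 3.0)

def err(p):
    """MSE of a two-mode Prony series with parameters (g1, tau1, g2, tau2)."""
    g1, tau1, g2, tau2 = np.abs(p) + 1e-9   # keep moduli and times positive
    model = g1 * np.exp(-t / tau1) + g2 * np.exp(-t / tau2)
    return float(np.mean((model - data) ** 2))

# Elitist (mu + lambda) evolution strategy: mutate, pool, keep the best mu
pop = rng.uniform(0.1, 4.0, size=(20, 4))
e0 = min(err(p) for p in pop)
for _ in range(200):
    kids = pop + rng.normal(scale=0.1, size=pop.shape)
    union = np.vstack([pop, kids])
    pop = union[np.argsort([err(p) for p in union])][:20]
best = pop[0]
```

Because parents survive into the selection pool, the best error is non-increasing across generations, which is the property that makes such searches robust despite being non-deterministic.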
Abstract:
This paper presents an automatic method to detect and classify weathered aggregates by assessing changes of colors and textures. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms (commonly considered for classification purposes). The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
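The entropy feature referred to can be taken as the Shannon entropy of the gray-level histogram of an image patch; a minimal sketch (patch sizes and gray levels are arbitrary):

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the gray-level histogram of an image patch."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

flat = np.full((32, 32), 128)                 # uniform patch: zero entropy
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, size=(32, 32))   # near-uniform gray levels
```

A flat patch scores 0 bits and a maximally varied one approaches log2(256) = 8 bits, so the feature discriminates smooth from textured (e.g. weathered) surfaces.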
Abstract:
The development of circadian sleep-wakefulness rhythm was investigated by a longitudinal study of six normal infants. We propose an entropy based measure for the sleep/wake cycle fragmentation. Our results confirm that the sleep/wake cycle fragmentation and the sleep/wake ratio decrease, while the circadian power increases during the maturation process of infants. In addition to these expected linear trends in the variables devised to quantify sleep consolidation, circadian power and sleep/wake ratio, we found that they present infradian rhythms in the monthly range. (C) 2009 Elsevier B.V. All rights reserved.
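One plausible instantiation of an entropy-based fragmentation index (an assumption; the paper's exact definition may differ) is the Shannon entropy of the bout-length distribution of the binary sleep/wake sequence:

```python
import numpy as np

def fragmentation_entropy(states):
    """Shannon entropy (bits) of the bout-length distribution of a binary
    sleep/wake sequence: consolidated records (few, similar bouts) score
    low; fragmented records (many bouts of varied length) score high."""
    states = np.asarray(states)
    change = np.flatnonzero(np.diff(states)) + 1
    bouts = np.diff(np.concatenate(([0], change, [len(states)])))  # run lengths
    _, counts = np.unique(bouts, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

consolidated = np.array([0] * 120 + [1] * 120)   # two long bouts
rng = np.random.default_rng(3)
fragmented = rng.integers(0, 2, size=240)        # many short, varied bouts
```

Under such an index, the reported maturation trend corresponds to the entropy decreasing month by month as sleep consolidates.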
Abstract:
We study and compare the information loss of a large class of Gaussian bipartite systems. It includes the usual Caldeira-Leggett-type model as well as Anosov models (parametric oscillators, the inverted oscillator environment, etc.), which exhibit instability, one of the most important characteristics of chaotic systems. We establish a rigorous connection between the quantum Lyapunov exponents and coherence loss, and show that in the case of unstable environments coherence loss is completely determined by the upper quantum Lyapunov exponent, a behavior which is more universal than that of the Caldeira-Leggett-type model.
Abstract:
Measurements of X-ray diffraction, electrical resistivity, and magnetization are reported across the Jahn-Teller phase transition in LaMnO(3). Using a thermodynamic equation, we obtained the pressure derivative of the critical temperature (T(JT)), dT(JT)/dP = -28.3 K GPa(-1). This approach also reveals that 5.7(3)J(mol K)(-1) comes from the volume change and 0.8(2)J(mol K)(-1) from the magnetic exchange interaction change across the phase transition. Around T(JT), a robust increase in the electrical conductivity takes place and the electronic entropy change, which is assumed to be negligible for the majority of electronic systems, was found to be 1.8(3)J(mol K)(-1).
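The "thermodynamic equation" is presumably the Clausius-Clapeyron relation for a first-order transition (an assumption; the paper may use a variant):

```latex
\frac{dT_{JT}}{dP} = \frac{\Delta V}{\Delta S}
```

If the three quoted contributions exhaust the decomposition, the total entropy change across the transition is \Delta S = 5.7 + 0.8 + 1.8 = 8.3\,\mathrm{J\,mol^{-1}\,K^{-1}}.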
Abstract:
We report the partitioning of the interaction-induced static electronic dipole (hyper)polarizabilities for linear hydrogen cyanide complexes into contributions arising from various interaction energy terms. We analyzed the nonadditivities of the studied properties and used these data to predict the electric properties of an infinite chain. The interaction-induced static electric dipole properties and their nonadditivities were analyzed using an approach based on numerical differentiation of the interaction energy components estimated in an external electric field. These were obtained using the hybrid variational-perturbational interaction energy decomposition scheme, augmented with coupled-cluster calculations, with singles, doubles, and noniterative triples. Our results indicate that the interaction-induced dipole moments and polarizabilities are primarily electrostatic in nature; however, the composition of the interaction hyperpolarizabilities is much more complex. The overlap effects substantially quench the contributions due to electrostatic interactions, and therefore, the major components are due to the induction and exchange induction terms, as well as the intramolecular electron-correlation corrections. A particularly intriguing observation is that the interaction first hyperpolarizability in the studied systems not only is much larger than the corresponding sum of monomer properties, but also has the opposite sign. We show that this effect can be viewed as a direct consequence of hydrogen-bonding interactions that lead to a decrease of the hyperpolarizability of the proton acceptor and an increase of the hyperpolarizability of the proton donor. In the case of the first hyperpolarizability, we also observed the largest nonadditivity of interaction properties (nearly 17%) which further enhances the effects of pairwise interactions.
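The numerical differentiation of field-dependent energies described above can be sketched with central finite differences on a toy energy expansion E(F) = E0 - mu F - (1/2) alpha F^2 - (1/6) beta F^3 (the coefficients are assumptions, standing in for the ab initio interaction energies):

```python
import numpy as np

# Toy field-dependent energy; coefficients are illustrative assumptions
E0, mu, alpha, beta = -1.0, 0.4, 2.0, 5.0

def E(F):
    return E0 - mu * F - 0.5 * alpha * F**2 - beta * F**3 / 6.0

h = 1e-3  # finite-field step
mu_fd = -(E(h) - E(-h)) / (2 * h)                                  # dipole: -dE/dF
alpha_fd = -(E(h) - 2 * E(0.0) + E(-h)) / h**2                     # polarizability: -d2E/dF2
beta_fd = -(E(2 * h) - 2 * E(h) + 2 * E(-h) - E(-2 * h)) / (2 * h**3)  # first hyperpolarizability
```

Applying the same stencils to each interaction-energy component separately is what yields the component-wise partitioning of the (hyper)polarizabilities.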
Abstract:
The concept of Fock space representation is developed to deal with stochastic spin lattices written in terms of fermion operators. A density operator is introduced in order to parallel the developments for the bosonic case in the literature. Some general conceptual quantities for spin lattices are then derived, including the notion of generating function and path integral via Grassmann variables. The formalism is used to derive the Liouvillian of the d-dimensional linear Glauber dynamics in the Fock-space representation. Then the time evolution equations for the magnetization and the two-point correlation function are derived in terms of the number operator. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The magnetic linear dichroism (MLD) at band-edge photon energies in the Voigt geometry was calculated for EuTe. At the spin-flop transition, MLD shows a step-like increase. Above the spin-flop transition MLD slowly decreases and becomes zero when the averaged electronic charge becomes symmetric relative to the axis of light propagation. Further increase of the magnetic field causes ferromagnetic alignment of the spins along the magnetic field direction, and MLD is recovered but with an opposite sign, and reaches maximum absolute values. These results are explained by the rearrangement of the Eu(2+) spin distribution in the crystal lattice as a function of magnetic field, due to the Zeeman interaction, demonstrating that MLD can be a sensitive probe of the spin order in EuTe, and provides information that is not accessible from other magneto-optical techniques, such as magnetic circular dichroism measurement studies.
Abstract:
An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way to automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate changing and general surveillance. Regions representing aquatic, rural and urban areas are identified and the accuracy of the proposed segmentation methodology is evaluated. The comparison with gray level images revealed that the color information is fundamental to obtain an accurate segmentation. (C) 2010 Elsevier B.V. All rights reserved.
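One classic entropy-based segmentation criterion in this spirit (an assumption, not necessarily the paper's exact method) is Kapur's maximum-entropy thresholding, shown here on a synthetic two-region grayscale image:

```python
import numpy as np

def kapur_threshold(img, bins=256):
    """Maximum-entropy (Kapur) threshold: pick t maximising the sum of the
    Shannon entropies of the below- and above-threshold histograms."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    c = p.cumsum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        w0, w1 = c[t - 1], 1.0 - c[t - 1]
        if w0 <= 0 or w1 <= 0:
            continue  # one class empty: criterion undefined
        p0, p1 = p[:t] / w0, p[t:] / w1
        h0 = -(p0[p0 > 0] * np.log(p0[p0 > 0])).sum()
        h1 = -(p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

rng = np.random.default_rng(0)
dark = rng.normal(60, 10, size=(40, 40))      # e.g. an aquatic region
bright = rng.normal(180, 10, size=(40, 40))   # e.g. an urban region
img = np.clip(np.vstack([dark, bright]), 0, 255)
t = kapur_threshold(img)
```

For color images as in the abstract, such a criterion is applied per channel (or to a combined color histogram) rather than to a single gray-level histogram.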