46 results for generalized entropy


Relevance: 20.00%

Abstract:

This article addresses the problem of estimating the Quality of Service (QoS) of a composite service given the QoS of the services participating in the composition. Previous solutions to this problem impose restrictions on the topology of the orchestration models, for example limiting their applicability to well-structured orchestration models. This article lifts these restrictions by proposing a method for aggregate QoS computation that deals with more general types of unstructured orchestration models. The applicability and scalability of the proposed method are validated using a collection of models from industrial practice.
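As background, the following minimal Python sketch shows aggregate QoS computation for well-structured orchestrations, recursing over sequence and parallel blocks. It is illustrative only and is not the article's method, whose point is precisely to handle unstructured models that such a recursion cannot; the mini-language for orchestration nodes is a hypothetical notation.

```python
# Baseline sketch: aggregate QoS for well-structured orchestrations by
# recursing over block structure. The article lifts exactly this restriction.

def aggregate(node):
    """Return (response_time, availability) for an orchestration node.

    node is ("task", rt, avail), ("seq", [children]) or ("par", [children]),
    a hypothetical mini-language used only for this sketch.
    """
    kind = node[0]
    if kind == "task":
        _, rt, avail = node
        return rt, avail
    children = [aggregate(c) for c in node[1]]
    avail = 1.0
    for _, a in children:
        avail *= a                          # all branches must succeed
    times = [rt for rt, _ in children]
    if kind == "seq":
        return sum(times), avail            # response times add in sequence
    if kind == "par":
        return max(times), avail            # parallel waits for slowest branch
    raise ValueError(kind)

# Two tasks in parallel followed by a third task:
model = ("seq", [("par", [("task", 120, 0.99), ("task", 80, 0.995)]),
                 ("task", 50, 0.999)])
print(aggregate(model))                     # (170, ~0.984)
```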

Relevance: 20.00%

Abstract:

In the finite element modelling of steel frames, external loads usually act along the members rather than at the nodes only. Conventionally, when a member is subjected to these transverse loads, they are converted to nodal forces acting at the ends of the elements into which the member is discretised, by either the lumping or the consistent nodal load approach. For a contemporary geometrically non-linear analysis in which the axial force in the member is large, accurate solutions are achieved by discretising the member into many elements, which compromises the efficiency of the method for analysing large steel frames. Herein, a numerical technique to include the transverse loading in the non-linear stiffness formulation for a single element is proposed, which is able to predict the structural responses of steel frames involving the effects of first-order member loads as well as the second-order coupling effect between the transverse load and the axial force in the member. This allows for a minimal discretisation of a frame for second-order analysis. Those conventional analyses which do include transverse member loading must use prescribed stiffness matrices for the plethora of specific loading patterns encountered. This paper shows, however, that the principle of superposition can be applied to the equilibrium condition, so that the form of the stiffness matrix remains unchanged and only the magnitude of the loading needs to change in the stiffness formulation. This novelty allows a very useful generalised stiffness formulation to be derived for a single higher-order element with arbitrary transverse loading patterns. The results are verified against analytical stability function studies, as well as against numerical results reported by independent researchers on several simple structural frames.
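The superposition idea can be sketched in Python. The sketch below uses a first-order Euler-Bernoulli beam element with textbook fixed-end forces, which is a simplification: the paper's element is higher-order and couples the transverse load with the axial force. It only illustrates how the stiffness matrix can be formed once while arbitrary load patterns enter through a superposed load vector.

```python
import numpy as np

def beam_stiffness(EI, L):
    """4x4 bending stiffness for DOFs [v1, theta1, v2, theta2]."""
    return EI / L**3 * np.array([
        [ 12.0,   6*L,  -12.0,   6*L  ],
        [ 6*L,  4*L**2,  -6*L, 2*L**2 ],
        [-12.0,  -6*L,   12.0,  -6*L  ],
        [ 6*L,  2*L**2,  -6*L, 4*L**2 ]])

def fixed_end_forces(L, udl=0.0, midspan_point=0.0):
    """Fixed-end force vector; individual load cases superpose linearly."""
    f_udl = np.array([udl*L/2,  udl*L**2/12,  udl*L/2, -udl*L**2/12])
    f_pt  = np.array([midspan_point/2,  midspan_point*L/8,
                      midspan_point/2, -midspan_point*L/8])
    return f_udl + f_pt     # superposition: K is untouched, only this changes

K = beam_stiffness(EI=2.1e8, L=6.0)           # formed once per element
for w, P in [(10.0, 0.0), (10.0, 50.0)]:      # different transverse loadings
    F = fixed_end_forces(6.0, udl=w, midspan_point=P)
    # ... assemble F into the global load vector; K stays the same
```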

Relevance: 20.00%

Abstract:

In this article we study azimuthal shear deformations in a compressible isotropic elastic material. This class of deformations involves an azimuthal displacement as a function of the radial and axial coordinates. The equilibrium equations are formulated in terms of the Cauchy-Green strain tensors and form an overdetermined system of partial differential equations for which solutions do not exist in general. By means of a Legendre transformation, necessary and sufficient conditions for the material to support this deformation are obtained explicitly, in the sense that every solution to the azimuthal equilibrium equation will satisfy the remaining two equations. Additionally, we show how these conditions suffice to support all currently known deformations that locally reduce to simple shear. These conditions are then expressed in terms of the invariants of both the Cauchy-Green strain and stretch tensors. Several classes of strain energy functions for which this deformation can be supported are studied. For certain boundary conditions, exact solutions to the equilibrium equations are obtained.
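For orientation, the deformation class can be written in cylindrical polar coordinates; the sketch below uses the standard form of azimuthal shear with assumed notation, not necessarily the article's.

```latex
% Standard form of azimuthal shear (notation assumed): a material point
% (R, \Theta, Z) in the reference configuration maps to
\begin{align*}
  r = R, \qquad \theta = \Theta + \phi(R, Z), \qquad z = Z,
\end{align*}
% so the single unknown is the azimuthal displacement \phi(R, Z).
% Substituting into the three equilibrium equations yields one equation
% governing \phi and two further conditions, hence the overdetermined
% system described above.
```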

Relevance: 20.00%

Abstract:

In the electricity market environment, load-serving entities (LSEs) inevitably face risks in purchasing electricity because many uncertainties are involved. To maximize profits and minimize risks, LSEs need an optimal strategy for allocating the purchased electricity amount among different electricity markets such as the spot market, the bilateral contract market, and the options market. Because risks originate from uncertainties, an approach is presented that addresses the risk evaluation problem by the combined use of the lower partial moment and information entropy (LPME). The lower partial moment measures the amount and probability of the loss, whereas the information entropy represents the uncertainty of the loss. Electricity purchasing is a repeated procedure; therefore, the model presented represents a dynamic strategy. Under the chance-constrained programming framework, the developed optimization model minimizes the risk of the electricity purchasing portfolio across the different markets, subject to the constraint that the actual profit of the LSE concerned is not less than the specified target at a required confidence level. The particle swarm optimization (PSO) algorithm is then employed to solve the optimization model. Finally, a numerical example is used to illustrate the basic features of the developed model and method.
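A minimal Python sketch of the two risk ingredients named above follows; the scenario profits, the target, and the order of the lower partial moment are assumptions for illustration, not values from the paper.

```python
import numpy as np

def lower_partial_moment(profits, target, n=2):
    """E[max(target - profit, 0)^n]: size and probability of the shortfall."""
    shortfall = np.maximum(target - profits, 0.0)
    return np.mean(shortfall ** n)

def shortfall_entropy(profits, target, bins=10):
    """Shannon entropy of the loss distribution: uncertainty of the loss."""
    losses = np.maximum(target - profits, 0.0)
    counts, _ = np.histogram(losses, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

profits = np.random.default_rng(0).normal(100.0, 15.0, 5000)  # fake scenarios
print(lower_partial_moment(profits, target=95.0),
      shortfall_entropy(profits, target=95.0))
```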

Relevance: 20.00%

Abstract:

In this paper I propose that identity is momentary, fluid, and multiple while simultaneously providing us with a sense of sameness and continuity. Building on Valsiner's ideas about human sense-making, I suggest that we can reasonably deal with the multiplicity/unity paradox if we conceive of this process as resulting in the construction of a fuzzy field of hyper-generalized personal sense, which ordinarily functions as an implicit and unspeakable background to our everyday functioning, while being constantly re-created through momentary instances of foregrounded and explicit identity-dialogues. I illustrate the ideas put forward in the paper by analysing the case of a young woman experiencing a change in her being. Finally, in an attempt to illustrate and further develop the case, I introduce the metaphor of carpet-weaving as an apposite image for thinking about identity as a process of being that is multiple and fragmented, yet also unified and constant.

Relevance: 20.00%

Abstract:

The finite element method divides a continuous domain with complex geometry into simple discrete subdomains using approximate element functions, and continuous element loads are likewise converted into nodal loads by the traditional lumping or consistent load methods. This standardises a plethora of element loads within a single numerical procedure, but it restricts the effect of element loads to the nodal solution: the accurate continuous solutions along an element that account for element loads are recovered only at the element nodes, and are further limited to either the displacement or the force field, depending on which type of approximate function is used. Analytical stability functions, by contrast, give accurate continuous element solutions under element loads; unfortunately, their expressions are so diverse and distinct for different element loads that they deter a general numerical routine for practical applications. To this end, this paper presents a displacement-based finite element formulation (the generalised element load method) that accommodates a plethora of element load effects in a unified fashion, which the stability functions cannot achieve, and that generates continuous first- and second-order elastic displacement and force solutions along an element without appreciable loss of accuracy relative to the analytical approach, which neither the lumping nor the consistent load method can achieve. The salient and unique features of the generalised element load method are thus its robustness, versatility, and accuracy in continuous element solutions under a great diversity of transverse element loads.
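For contrast, the conventional consistent-load conversion that the paper improves on can be sketched in a few lines of Python: equivalent nodal loads are obtained by integrating the transverse load against Hermite shape functions, after which everything between the nodes is invisible to the solution. Names and values are illustrative.

```python
import numpy as np
from scipy.integrate import quad

def hermite_shapes(x, L):
    """Cubic Hermite shape functions of a beam element of length L."""
    xi = x / L
    return np.array([
        1 - 3*xi**2 + 2*xi**3,        # N1: transverse displacement, node 1
        L*(xi - 2*xi**2 + xi**3),     # N2: rotation, node 1
        3*xi**2 - 2*xi**3,            # N3: transverse displacement, node 2
        L*(-xi**2 + xi**3)])          # N4: rotation, node 2

def consistent_loads(q, L):
    """Equivalent nodal load vector f_i = int N_i(x) q(x) dx for a load q."""
    return np.array([quad(lambda x, i=i: hermite_shapes(x, L)[i] * q(x),
                          0.0, L)[0] for i in range(4)])

L = 6.0
print(consistent_loads(lambda x: 10.0, L))          # UDL: [wL/2, wL^2/12, ...]
print(consistent_loads(lambda x: 10.0 * x / L, L))  # triangular load
```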

Relevance: 20.00%

Abstract:

Classifying each stage of a progressive disease such as Alzheimer's disease is a key issue for disease prevention and treatment. In this study, we derived structural brain networks from diffusion-weighted MRI using whole-brain tractography, since there is growing interest in relating connectivity measures to clinical, cognitive, and genetic data. Relatively little work has used machine learning to make inferences about variations in brain networks in the progression of Alzheimer's disease. Here we developed a framework that uses generalized low rank approximations of matrices (GLRAM) and modified linear discriminant analysis for unsupervised feature learning and classification of connectivity matrices. We apply the methods to brain networks derived from DWI scans of 41 people with Alzheimer's disease, 73 people with early mild cognitive impairment (EMCI), 38 people with late mild cognitive impairment (LMCI), 47 elderly healthy controls, and 221 young healthy controls. Our results show that this new framework can significantly improve classification accuracy when combining multiple datasets; this suggests the value of using data beyond the classification task at hand to model variations in brain connectivity.
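A minimal sketch of the GLRAM step follows, after Ye's alternating algorithm; the network size, target ranks, and iteration count are assumptions for illustration.

```python
import numpy as np

def glram(As, k1, k2, iters=20):
    """Find shared projections L, R so each matrix A_i ~ L @ M_i @ R.T."""
    r, c = As[0].shape
    R = np.eye(c)[:, :k2]                      # initial right projection
    for _ in range(iters):
        SL = sum(A @ R @ R.T @ A.T for A in As)
        _, L = np.linalg.eigh(SL)
        L = L[:, -k1:]                         # top-k1 eigenvectors
        SR = sum(A.T @ L @ L.T @ A for A in As)
        _, R = np.linalg.eigh(SR)
        R = R[:, -k2:]                         # top-k2 eigenvectors
    Ms = [L.T @ A @ R for A in As]             # low-rank features per subject
    return L, R, Ms

rng = np.random.default_rng(0)
As = [rng.normal(size=(68, 68)) for _ in range(10)]   # fake 68-node networks
L, R, Ms = glram(As, k1=10, k2=10)
print(Ms[0].shape)   # (10, 10) feature matrix fed to the classifier
```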

Relevance: 20.00%

Abstract:

We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice.
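A sketch of the Gaussian pseudolikelihood criterion, written from its standard definition rather than taken from the paper: for each cluster, the fitted working covariance V_i is scored against the residuals, and the working model with the larger total is preferred.

```python
import numpy as np

def gaussian_pseudolikelihood(residuals, covariances):
    """Sum over clusters of -0.5 * (log det V_i + r_i' V_i^{-1} r_i)."""
    gpl = 0.0
    for r, V in zip(residuals, covariances):
        _, logdet = np.linalg.slogdet(V)
        gpl += -0.5 * (logdet + r @ np.linalg.solve(V, r))
    return gpl

# Usage: fit the same mean model under, say, exchangeable and AR(1) working
# structures, build each cluster's implied covariance, and keep the working
# model with the larger criterion value.
```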

Relevance: 20.00%

Abstract:

Objective: To discuss generalized estimating equations as an extension of generalized linear models by commenting on the paper by Ziegler and Vens, "Generalized Estimating Equations: Notes on the Choice of the Working Correlation Matrix". Methods: An international group of experts was invited to comment on this paper. Results: Several perspectives were taken by the discussants. Econometricians established parallels to the generalized method of moments (GMM). Statisticians discussed model assumptions and the aspect of missing data. Applied statisticians commented on practical aspects of data analysis. Conclusions: In general, careful modelling of the correlation is encouraged when considering estimation efficiency and other implications, and a comparison of the choice of instruments in GMM and in generalized estimating equations (GEE) would be worthwhile. Some theoretical drawbacks of GEE need to be further addressed and require careful analysis of data; this particularly applies to the situation when data are missing at random.

Relevance: 20.00%

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE) because an inappropriate choice will lead to inefficient parameter estimation. We investigate the well-known criterion QIC for selecting a working correlation structure, and find that the performance of QIC is deteriorated by a term that is theoretically independent of the correlation structures but has to be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on the performance of QIC. Extensive simulation studies indicate that the CIC achieves a remarkable improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
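The idea can be sketched as follows, with notation assumed rather than copied from the paper: QIC penalises with trace(Omega_I V_R), where Omega_I is the model-based information under independence and V_R is the robust covariance under working structure R; CIC keeps only this trace and drops the quasi-likelihood term that is independent of R in theory but noisy in estimation.

```python
import numpy as np

def cic(omega_indep, v_robust):
    """Correlation information criterion: the trace penalty of QIC alone."""
    return np.trace(omega_indep @ v_robust)

# Usage: estimate v_robust under each candidate working structure
# (independence, exchangeable, AR(1), ...) and keep the structure
# minimising cic(omega_indep, v_robust).
```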

Relevance: 20.00%

Abstract:

The article describes a generalized estimating equations approach that was used to investigate the impact of technology on vessel performance in a trawl fishery during 1988-96, while accounting for spatial and temporal correlations in the catch-effort data. Robust estimation of parameters in the presence of several levels of clustering depended more on the choice of cluster definition than on the choice of correlation structure within the cluster. Models with smaller cluster sizes produced stable results, while models with larger cluster sizes, that may have had complex within-cluster correlation structures and that had within-cluster covariates, produced estimates sensitive to the correlation structure. The preferred model arising from this dataset assumed that catches from a vessel were correlated in the same years and the same areas, but independent in different years and areas. The model that assumed catches from a vessel were correlated in all years and areas, equivalent to a random effects term for vessel, produced spurious results. This was an unexpected finding that highlighted the need to adopt a systematic strategy for modelling. The article proposes a modelling strategy of selecting the best cluster definition first, and the working correlation structure (within clusters) second. The article discusses the selection and interpretation of the model in the light of background knowledge of the data and utility of the model, and the potential for this modelling approach to apply in similar statistical situations.
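The proposed strategy can be sketched with statsmodels GEE; the data frame, variable names, and Poisson mean model below are hypothetical stand-ins for the catch-effort data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical stand-in for the catch-effort data.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({"catch": rng.poisson(5, n),
                   "tech": rng.integers(0, 2, n),
                   "effort": rng.normal(size=n),
                   "vessel": rng.integers(0, 20, n)})
df["vessel_year_area"] = (df["vessel"].astype(str) + "_" +
                          rng.integers(0, 8, n).astype(str))

# Step 1: compare cluster definitions under a simple working structure.
for cluster in ["vessel_year_area", "vessel"]:
    m = smf.gee("catch ~ tech + effort", groups=df[cluster], data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable()).fit()
    print(cluster, round(m.params["tech"], 3), round(m.bse["tech"], 3))

# Step 2: with the clusters fixed, compare working correlation structures.
for struct in [sm.cov_struct.Independence(), sm.cov_struct.Exchangeable()]:
    m = smf.gee("catch ~ tech + effort", groups=df["vessel_year_area"],
                data=df, family=sm.families.Poisson(), cov_struct=struct).fit()
```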

Relevance: 20.00%

Abstract:

The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g. Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables; examples include generalized linear or additive models with variable selection (Hastie et al., 2002), and classification trees with complexity-based or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable without considering particular combinations; examples include neural networks, boosted or bagged regression trees, and Maximum Entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approaches. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see review in Low Choy et al., 2010), but few methods have been published for informative variable selection; one example is Bayesian trees (O'Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise. We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine-learning SDMs, or to define priors within Bayesian SDMs.
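One simple way to wire elicited quality scores into the second (penalised-selection) approach is sketched below; this is an illustration, not the protocol itself. Scores in (0, 1] rescale each column before an L1-penalised fit, so lower-quality variables face an effectively larger penalty, in the style of the adaptive lasso. All data and scores here are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))                        # candidate covariates
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=500) > 0).astype(int)

quality = np.array([0.9, 0.2, 0.8, 0.3, 0.5, 0.1])   # elicited a priori scores
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.2)
model.fit(X * quality, y)                            # down-weight doubtful vars
coef_original_scale = model.coef_.ravel() * quality  # back to original scale
print(np.nonzero(model.coef_.ravel())[0])            # variables retained
```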

Relevance: 20.00%

Abstract:

While many measures of viewpoint goodness have been proposed in computer graphics, none have been evaluated for ribbon representations of protein secondary structure. To fill this gap, we conducted a user study on Amazon's Mechanical Turk platform, collecting human viewpoint preferences from 65 participants for 4 representative superfamilies of protein domains. In particular, we evaluated viewpoint entropy, which was previously shown to be a good predictor of human viewpoint preference for other, mostly non-abstract objects. In a second study, we asked 7 molecular biology experts to find the best viewpoint of the same protein domains and compared their choices with viewpoint entropy. Our results show that viewpoint entropy overall is a significant predictor of human viewpoint preference for ribbon representations of protein secondary structure. However, the accuracy is highly dependent on the complexity of the structure: while most participants agree on good viewpoints for small, non-globular structures with few secondary structure elements, viewpoint preference varies considerably for complex structures. Finally, experts tend to choose viewpoints of both low and high viewpoint entropy to emphasize different aspects of the respective structure.
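Viewpoint entropy, as commonly defined, is the Shannon entropy of the relative projected areas visible from a viewpoint; a minimal sketch follows, with the rendering and background-area handling omitted and the areas given as assumed inputs.

```python
import numpy as np

def viewpoint_entropy(projected_areas):
    """Shannon entropy of relative projected face areas for one viewpoint."""
    a = np.asarray(projected_areas, dtype=float)
    p = a[a > 0] / a.sum()              # relative visible area per face
    return -np.sum(p * np.log2(p))      # high = many evenly visible faces

print(viewpoint_entropy([0.5, 0.3, 0.2]))        # moderately balanced view
print(viewpoint_entropy([0.97, 0.02, 0.01]))     # one face dominates: low
```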