11 results for Convergence Analysis

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

80.00%

Publisher:

Abstract:

The least-mean-fourth (LMF) algorithm is known for its fast convergence and low steady-state error, especially in sub-Gaussian noise environments. Recent work on normalised versions of the LMF algorithm has further enhanced its stability and performance in both Gaussian and sub-Gaussian noise environments. For example, the recently developed normalised LMF (XE-NLMF) algorithm is normalised by the mixed signal and error powers, and weighted by a fixed mixed-power parameter. Unfortunately, the performance of this algorithm depends on the selection of this mixing parameter. In this work, a time-varying mixed-power parameter technique is introduced to overcome this dependency. A convergence analysis, transient analysis, and steady-state behaviour of the proposed algorithm are derived and verified through simulations. An enhancement in performance is obtained through the use of this technique in two different scenarios. Moreover, the tracking analysis of the proposed algorithm is carried out in the presence of two sources of nonstationarity: (1) carrier frequency offset between transmitter and receiver and (2) random variations in the environment. Close agreement between analysis and simulation results is obtained. The results show that, unlike in the stationary case, the steady-state excess mean-square error is not a monotonically increasing function of the step size. (c) 2007 Elsevier B.V. All rights reserved.
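The normalised LMF idea described in this abstract can be sketched as follows. This is a minimal illustrative implementation only: the particular form of the mixed-power denominator, the error-driven rule for the time-varying mixing parameter `gamma`, and the function name are assumptions for illustration, not the update derived in the paper.

```python
import numpy as np

def xe_nlmf_sketch(x, d, mu=0.01, n_taps=8, eps=1e-6):
    """Illustrative normalised LMF adaptive filter.

    The core step w += mu * e^3 * u / denom is the standard LMF
    (fourth-moment) update, normalised here by a gamma-weighted mix of
    input power and error power. The gamma adaptation rule below is a
    toy stand-in for a time-varying mixing-parameter scheme.
    """
    w = np.zeros(n_taps)
    gamma = 0.5                           # initial mixing parameter (assumed)
    errors = []
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]  # regressor, newest sample first
        e = d[k] - w @ u                   # a-priori estimation error
        denom = gamma * (u @ u) + (1 - gamma) * e**2 + eps
        w += mu * (e**3) * u / denom       # normalised LMF update
        # toy time-varying rule: lean on input-power normalisation while
        # the error is large, on error-power normalisation as it shrinks
        gamma = float(np.clip(gamma + 0.01 * (np.tanh(abs(e)) - 0.5), 0.0, 1.0))
        errors.append(e)
    return w, np.array(errors)

# identify a short FIR channel observed in sub-Gaussian (uniform) noise
rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, -0.3, 0.1, 0.0, 0.0, 0.0, 0.0])
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.uniform(-1.0, 1.0, len(x))
w, e = xe_nlmf_sketch(x, d)
```

With the error-power term in the denominator, the effective step grows only linearly in the error (as in NLMS), which is what keeps the cubic LMF update stable for large initial errors.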

Relevance:

40.00%

Publisher:

Abstract:

A method is proposed to accelerate the evaluation of the Green's function of an infinite doubly periodic array of thin wire antennas. The method is based on the expansion of the Green's function into series corresponding to the propagating and evanescent waves, and on the use of Poisson and Kummer transformations enhanced with the analytic summation of the slowly convergent asymptotic terms. Unlike existing techniques, the procedure reported here provides uniform convergence regardless of the geometrical parameters of the problem or the plane wave excitation wavelength. In addition, it is numerically stable and does not require numerical integration or internal tuning parameters, since all necessary series are calculated directly in terms of analytical functions. This means that, for nonlinear problem scenarios, the algorithm can be deployed without run-time intervention or recursive adjustment within a harmonic balance engine. Numerical examples are provided to illustrate the efficiency and accuracy of the developed approach as compared with the Ewald method, which for these classes of problems requires run-time splitting-parameter adaptation.
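The Kummer transformation underlying this acceleration can be illustrated on a scalar model series (a stand-in for the vector Green's-function series; the series S(a) = Σ 1/(n² + a²) and the helper names are purely illustrative): subtract the slowly convergent asymptotic part Σ 1/n², whose sum π²/6 is known in closed form, and the residual series decays like 1/n⁴ instead of 1/n².

```python
import math

def direct_partial_sum(a, N):
    # direct evaluation of S(a) = sum_{n>=1} 1/(n^2 + a^2); terms decay like 1/n^2
    return sum(1.0 / (n * n + a * a) for n in range(1, N + 1))

def kummer_partial_sum(a, N):
    # Kummer transformation: subtract the asymptotic series sum 1/n^2 and
    # add its closed form pi^2/6 back analytically; the residual terms
    # -a^2 / (n^2 (n^2 + a^2)) decay like 1/n^4
    residual = sum(1.0 / (n * n + a * a) - 1.0 / (n * n) for n in range(1, N + 1))
    return math.pi ** 2 / 6.0 + residual

# closed form for comparison: S(a) = (pi*a*coth(pi*a) - 1) / (2*a^2)
a = 0.7
exact = (math.pi * a / math.tanh(math.pi * a) - 1.0) / (2.0 * a * a)
err_direct = abs(direct_partial_sum(a, 100) - exact)
err_kummer = abs(kummer_partial_sum(a, 100) - exact)
```

At 100 terms the transformed sum is several orders of magnitude more accurate than the direct one, which is the mechanism the paper exploits for the slowly convergent asymptotic terms.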

Relevance:

30.00%

Publisher:

Abstract:

A new rigorous numerical-analytical technique, based upon the Galerkin method with entire-domain basis functions, has been developed and applied to the study of periodic aperture arrays containing multiple dissimilar apertures of complex shapes in a stratified medium. The rapid uniform convergence of the solutions has enabled a comprehensive parametric study of complex array arrangements. The developed theory has revealed new effects of the aperture shape and layout on the array performance. The physical mechanisms underlying the TM wave resonances and Luebbers' anomaly have been explained for the first time.
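The idea of a Galerkin scheme with entire-domain basis functions can be sketched on a much simpler 1-D model problem than the aperture-array integral equations treated in the paper (the Poisson problem and all names below are illustrative only): each basis function spans the whole domain and satisfies the boundary conditions exactly, which is what yields the rapid convergence.

```python
import numpy as np

def galerkin_sine_solve(f, n_modes=25, n_quad=400):
    # entire-domain basis phi_k(x) = sin(k*pi*x) satisfies u(0) = u(1) = 0
    # exactly; for -u'' = f the stiffness matrix <phi_k', phi_m'> is
    # diagonal with entries (k*pi)^2 / 2, so the Galerkin system is trivial
    dx = 1.0 / n_quad
    xm = (np.arange(n_quad) + 0.5) * dx          # midpoint quadrature nodes
    k = np.arange(1, n_modes + 1)
    phi = np.sin(np.pi * np.outer(k, xm))        # basis sampled at the nodes
    load = phi @ f(xm) * dx                      # load vector <f, phi_k>
    coeff = load / ((k * np.pi) ** 2 / 2.0)      # solve the diagonal system
    return lambda x: np.sin(np.pi * np.outer(np.atleast_1d(x), k)) @ coeff

# -u'' = 1 with u(0) = u(1) = 0 has exact solution u(x) = x(1 - x)/2
u_h = galerkin_sine_solve(lambda x: np.ones_like(x))
```

With a few dozen modes the midpoint value already matches the exact 0.125 to well below quadrature accuracy, illustrating the uniform convergence that entire-domain bases can provide.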

Relevance:

30.00%

Publisher:

Abstract:

Previous papers have noted the difficulty in obtaining neural models which are stable under simulation when trained using prediction-error-based methods. Here the differences between series-parallel and parallel identification structures for training neural models are investigated. The effect of the error surface shape on training convergence and simulation performance is analysed using a standard algorithm operating in both training modes. A combined series-parallel/parallel training scheme is proposed, aiming to provide a more effective means of obtaining accurate neural simulation models. Simulation examples show the combined scheme is advantageous in circumstances where the solution space is known or suspected to be complex. (c) 2006 Elsevier B.V. All rights reserved.
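The two identification structures contrasted in this abstract can be demonstrated on a first-order linear model (a deliberately simple stand-in for a neural model; the parameter values and names are illustrative):

```python
import numpy as np

def series_parallel_error(theta, u, y):
    # series-parallel (one-step-ahead): the regressor uses *measured*
    # past outputs, giving a comparatively smooth error surface
    a, b = theta
    return y[1:] - (a * y[:-1] + b * u[:-1])

def parallel_error(theta, u, y):
    # parallel (simulation): the regressor uses the model's *own* past
    # outputs, so parameter errors compound over the simulation horizon
    a, b = theta
    y_sim = np.empty_like(y)
    y_sim[0] = y[0]
    for k in range(1, len(y)):
        y_sim[k] = a * y_sim[k - 1] + b * u[k - 1]
    return y[1:] - y_sim[1:]

# data generated by y[k] = 0.8*y[k-1] + 0.5*u[k-1] (noise-free for clarity)
rng = np.random.default_rng(4)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(1, 500):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

biased = (0.9, 0.5)  # mis-estimated pole
e_sp = np.mean(np.abs(series_parallel_error(biased, u, y)))
e_par = np.mean(np.abs(parallel_error(biased, u, y)))
```

The same parameter error produces a much larger residual in parallel mode, which is why a combined scheme, minimising the series-parallel error first and then refining against the parallel (simulation) error, can be attractive when the solution space is complex.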

Relevance:

30.00%

Publisher:

Abstract:

Reliable prediction of long-term medical device performance using computer simulation requires consideration of variability in surgical procedure, as well as patient-specific factors. However, even deterministic simulation of long-term failure processes for such devices is time and resource consuming so that including variability can lead to excessive time to achieve useful predictions. This study investigates the use of an accelerated probabilistic framework for predicting the likely performance envelope of a device and applies it to femoral prosthesis loosening in cemented hip arthroplasty.
A creep and fatigue damage failure model for bone cement, in conjunction with an interfacial fatigue model for the implant–cement interface, was used to simulate loosening of a prosthesis within a cement mantle. A deterministic set of trial simulations was used to account for variability of a set of surgical and patient factors, and a response surface method was used to perform and accelerate a Monte Carlo simulation to achieve an estimate of the likely range of prosthesis loosening. The proposed framework was used to conceptually investigate the influence of prosthesis selection and surgical placement on prosthesis migration.
Results demonstrate that the response surface method is capable of dramatically reducing the time to achieve convergence in the mean and variance of predicted response variables. A critical requirement for realistic predictions is the size and quality of the initial training dataset used to generate the response surface; further work is required to establish recommendations for a minimum number of initial trials. Results of this conceptual application predicted that loosening was sensitive to the implant size and femoral width. Furthermore, different rankings of implant performance were predicted when only individual simulations (e.g. an average condition) were used to rank implants, compared with when stochastic simulations were used. In conclusion, the proposed framework provides a viable approach to predicting realistic ranges of loosening behaviour for orthopaedic implants in reduced timeframes compared with conventional Monte Carlo simulations.
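The accelerated probabilistic framework can be sketched in miniature: a small deterministic trial set feeds a quadratic response surface, and the Monte Carlo sampling then runs on the cheap surface instead of the expensive model. The `expensive_simulation` function and its two inputs are hypothetical stand-ins for the finite element loosening model and the surgical/patient factors.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_simulation(cement_modulus, stem_offset):
    # stand-in for a long-running FE loosening simulation (a smooth
    # invented response; the real model is far more complex)
    return (0.2 + 0.05 * cement_modulus**2 + 0.3 * stem_offset
            + 0.1 * cement_modulus * stem_offset)

# 1) small deterministic trial set spanning the input ranges
grid = np.array([(x, y) for x in np.linspace(-1, 1, 5)
                        for y in np.linspace(-1, 1, 5)])
trials = np.array([expensive_simulation(x, y) for x, y in grid])

# 2) fit a quadratic response surface by least squares
def features(x, y):
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

coef, *_ = np.linalg.lstsq(features(grid[:, 0], grid[:, 1]), trials, rcond=None)

# 3) cheap Monte Carlo on the surface instead of the expensive model
xs = rng.normal(0.0, 0.3, 100_000)
ys = rng.normal(0.0, 0.3, 100_000)
mc = features(xs, ys) @ coef
mean_migration, var_migration = mc.mean(), mc.var()
```

Each surface evaluation is a dot product, so the 100,000-sample Monte Carlo run costs less than a handful of the original trials, which is the source of the reported speed-up; the quality of the estimate is bounded by how well the initial trials cover the input space.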

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a Statistical Shape Model for Human Figure Segmentation in gait sequences. Point Distribution Models (PDM) generally use Principal Component Analysis (PCA) to describe the main directions of variation in the training set. However, PCA assumes a number of restrictions on the data that do not always hold. In this work, we explore the potential of Independent Component Analysis (ICA) as an alternative shape decomposition for PDM-based Human Figure Segmentation. The shape model obtained enables accurate estimation of human figures despite segmentation errors in the input silhouettes and exhibits good convergence properties.
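A Point Distribution Model of the kind discussed here can be sketched with the baseline PCA decomposition; swapping in ICA (e.g. a FastICA implementation) at the decomposition step yields the variant the paper explores. The elliptical toy contours below are purely illustrative stand-ins for aligned gait silhouettes.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy training set: elliptical contours with random axis lengths, each
# flattened to [x1, y1, ..., xn, yn] (real silhouettes would be aligned first)
n_shapes, n_points = 60, 20
t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
shapes = np.array([
    np.column_stack([(1.0 + 0.2 * rng.standard_normal()) * np.cos(t),
                     (1.5 + 0.3 * rng.standard_normal()) * np.sin(t)]).ravel()
    for _ in range(n_shapes)
])

# PDM: mean shape plus the leading modes of variation from PCA (via SVD);
# an ICA-based model would replace this decomposition step
mean_shape = shapes.mean(axis=0)
_, _, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
modes = Vt[:2]                        # top-2 variation modes
b = (shapes - mean_shape) @ modes.T   # per-shape mode coefficients
recon = mean_shape + b @ modes        # reconstruct from the low-dim model
```

Constraining the coefficients `b` to plausible ranges is what lets such a model reject segmentation errors in an input silhouette: the fitted shape is forced back onto the learned shape subspace.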

Relevance:

30.00%

Publisher:

Abstract:

Community structure depends on both deterministic and stochastic processes. However, patterns of community dissimilarity (e.g. differences in species composition) are difficult to interpret in terms of the relative roles of these processes. Local communities can be more dissimilar than (divergence), less dissimilar than (convergence), or as dissimilar as a hypothetical control based on either null or neutral models. However, several mechanisms may result in the same pattern, or act concurrently to generate a pattern, and much recent research has focused on unravelling these mechanisms and their relative contributions. Using a simulation approach, we addressed the effect of a complex but realistic spatial structure in the distribution of the niche axis, and we analysed patterns of species co-occurrence and beta diversity as measured by dissimilarity indices (e.g. the Jaccard index) using either expectations under a null model or neutral dynamics (i.e., based on switching off the niche effect). The strength of niche processes, dispersal, and environmental noise strongly interacted, so that niche-driven dynamics may result in local communities that either diverge or converge depending on the combination of these factors. Thus, a fundamental result is that, in real systems, interacting processes of community assembly can be disentangled only by measuring traits such as niche breadth and dispersal. The ability to detect the signal of the niche was also dependent on the spatial resolution of the sampling strategy, which must account for the multiple-scale spatial patterns in the niche axis. Notably, some of the patterns we observed correspond to patterns of community dissimilarity previously observed in the field, and suggest mechanistic explanations for them or the data required to resolve them.
Our framework offers a synthesis of the patterns of community dissimilarity produced by the interaction of deterministic and stochastic determinants of community assembly in a spatially explicit and complex context.
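The divergence/convergence comparison described above can be sketched as follows. The richness-preserving shuffle used as the null model here is one common, simple choice, not necessarily the randomisation used in the study, and the toy metacommunity is random by construction.

```python
import numpy as np

rng = np.random.default_rng(3)

def jaccard_dissimilarity(a, b):
    # 1 - |intersection| / |union| for presence/absence vectors
    inter = np.sum(a & b)
    union = np.sum(a | b)
    return 1.0 - inter / union if union else 0.0

def mean_pairwise_dissimilarity(m):
    n = m.shape[0]
    return np.mean([jaccard_dissimilarity(m[i], m[j])
                    for i in range(n) for j in range(i + 1, n)])

# toy metacommunity: 10 local communities x 30 species (presence/absence)
comm = rng.random((10, 30)) < 0.4
obs = mean_pairwise_dissimilarity(comm)

# null model: shuffle species identities within each community
# independently, preserving each community's richness
null = np.array([
    mean_pairwise_dissimilarity(np.array([rng.permutation(row) for row in comm]))
    for _ in range(200)
])

# communities diverge if obs sits above the null envelope, converge if below
p_upper = np.mean(null >= obs)
```

As the abstract notes, the same observed position relative to the null envelope can arise from several mechanisms, so this comparison alone cannot identify which assembly process is acting.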

Relevance:

30.00%

Publisher:

Abstract:

Developed countries, led by the EU and the US, have consistently called for ‘deeper integration’ over the course of the past three decades, i.e. the convergence of ‘behind-the-border’ or domestic policies and rules such as services, competition, public procurement, intellectual property (“IP”) and so forth. Following the collapse of the Doha Development Round, the EU and the US have pursued this push for deeper integration by entering into deep and comprehensive free trade agreements (“DCFTAs”) that are comprehensive insofar as they are not limited to tariffs but extend to regulatory trade barriers. More recently, the EU and the US launched negotiations on a Transatlantic Trade and Investment Partnership (“TTIP”) and a Trade in Services Agreement (“TISA”), which put tackling barriers resulting from divergences in domestic regulation in the area of services at the very top of the agenda. Should these agreements come to pass, they may well set the template for the rules of international trade and define the core features of domestic services market regulation. This article examines the regulatory disciplines in the area of services included in existing EU and US DCFTAs from a comparative perspective in order to delineate possible similarities and divergences and assess the extent to which these DCFTAs can shed some light into the possible outcome and limitations of future trade negotiations in services. It also discusses the potential impact of such negotiations on developing countries and, more generally, on the multilateral process.

Relevance:

30.00%

Publisher:

Abstract:

The Arc-Length Method is a solution procedure that enables a generic non-linear problem to pass limit points. Some examples are provided of solutions to mode-jumping problems using a commercial finite element package, and other investigations are carried out on a simple structure for which the numerical solution can be compared with an analytical one. It is shown that the Arc-Length Method is not reliable when bifurcations are present in the primary equilibrium path; the presence of very sharp snap-backs or special boundary conditions may also cause convergence difficulty at limit points. An improvement to the predictor used in the incremental procedure is suggested, together with a reliable criterion for selecting the appropriate solution of the quadratic arc-length constraint. The gap that is sometimes observed between the experimental load level of mode-jumping and its arc-length prediction is explained through an example.
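The basic mechanism by which arc-length continuation passes limit points can be illustrated on a one-degree-of-freedom snap-through problem. The sketch below uses the linearised-constraint (Riks) variant, which sidesteps the two-root selection issue of the quadratic spherical constraint discussed above, and the cubic force law is invented purely for illustration.

```python
import numpy as np

def internal_force(u):
    # cubic load-displacement law exhibiting snap-through (two limit points)
    return u**3 - 1.5 * u**2 + 0.6 * u

def tangent_stiffness(u):
    return 3.0 * u**2 - 3.0 * u + 0.6

def riks_trace(dl=0.05, n_steps=40, P=1.0, tol=1e-12):
    """Arc-length (Riks, linearised-constraint) continuation, 1 DOF.

    The augmented system { lam*P - N(u) = 0 ; t . d - dl = 0 } remains
    solvable at limit points where the tangent stiffness vanishes and
    plain load control would fail.
    """
    u, lam = 0.0, 0.0
    t_prev = np.array([0.0, 1.0])          # previous direction (start: load dir)
    path = [(u, lam)]
    for _ in range(n_steps):
        # path tangent: K*du = P*dlam  ->  direction (P, K), normalised,
        # with its sign chosen to keep marching the same way along the path
        K = tangent_stiffness(u)
        t = np.array([P, K]) / np.hypot(P, K)
        if t @ t_prev < 0.0:
            t = -t
        # predictor step along the tangent
        u_n, lam_n = u + dl * t[0], lam + dl * t[1]
        # corrector: Newton on equilibrium residual + arc-length constraint
        for _ in range(50):
            r = np.array([lam_n * P - internal_force(u_n),
                          t[0] * (u_n - u) + t[1] * (lam_n - lam) - dl])
            if np.max(np.abs(r)) < tol:
                break
            J = np.array([[-tangent_stiffness(u_n), P],
                          [t[0], t[1]]])
            du, dlam = np.linalg.solve(J, -r)
            u_n += du
            lam_n += dlam
        u, lam, t_prev = u_n, lam_n, t
        path.append((u, lam))
    return np.array(path)

path = riks_trace()
```

The traced load factor rises, falls past the first limit point, and rises again beyond the snap-through, whereas load-controlled Newton iteration would jump or fail at the first limit point. The sign-continuity test on the tangent plays the role of the root-selection criterion in the quadratic variant.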

Relevance:

30.00%

Publisher:

Abstract:

This paper revisits work on the socio-political amplification of risk, which predicts that those living in developing countries are exposed to greater risk than residents of developed nations. This prediction contrasts with the neoliberal expectation that market-driven improvements in working conditions within industrialising/developing nations will lead to a global convergence of hazard exposure levels. It also contradicts the assumption of risk society theorists that there will be a ubiquitous increase in risk exposure across the globe, which will primarily affect technically more advanced countries. Reviewing qualitative evidence on the impact of structural adjustment reforms in industrialising countries, the export of waste and hazardous waste recycling to these countries, and new patterns of domestic industrialisation, the paper suggests that workers in industrialising countries continue to face far greater levels of hazard exposure than those of developed countries. This view is confirmed when a data set including 105 major multi-fatality industrial disasters from 1971 to 2000 is examined. The paper concludes that there is empirical support for the predictions of socio-political amplification of risk theory, which finds clear expression in the data in a consistent pattern of significantly greater fatality rates per industrial incident in industrialising/developing countries.