967 results for graphic computation


Relevance:

10.00%

Publisher:

Abstract:

This paper deals with the calculation of the discrete approximation to the full spectrum of the tangent operator for the stability problem of the symmetric flow past a circular cylinder. It is also concerned with the localization of the Hopf bifurcation in laminar flow past a cylinder, when the stationary solution loses stability and the flow becomes periodic in time. The main problem is to determine the critical Reynolds number for which a pair of eigenvalues crosses the imaginary axis. We thus present a divergence-free method, based on a decoupling of the vector of velocities in the saddle-point system from the vector of pressures, allowing the computation of eigenvalues, from which we can deduce the fundamental frequency of the time-periodic solution. The calculation showed that stability is lost through a symmetry-breaking Hopf bifurcation and that the critical Reynolds number is in agreement with values reported in previous computations. (c) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
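
As a hedged illustration of the eigenvalue test described above (not of the paper's divergence-free saddle-point method), the sketch below tracks the rightmost eigenvalue of a discretized tangent operator as the Reynolds number is swept and reports the first value at which a complex pair crosses the imaginary axis; linearized_operator is a hypothetical, user-supplied stand-in.

# Sketch: locate the Hopf point by tracking the rightmost eigenvalue of a
# (hypothetical) linearized operator A(Re).  Assumes a user-supplied
# linearized_operator(Re) returning a dense matrix; this is not the paper's
# divergence-free saddle-point formulation.
import numpy as np
from scipy.linalg import eig

def rightmost_eigenvalue(A):
    """Eigenvalue of A with the largest real part."""
    w = eig(A, right=False)
    return w[np.argmax(w.real)]

def find_hopf_point(linearized_operator, re_values):
    """Return (Re, eigenvalue) at the first sign change of the rightmost real part."""
    prev_lam = None
    for re in re_values:
        lam = rightmost_eigenvalue(linearized_operator(re))
        if prev_lam is not None and prev_lam.real < 0.0 <= lam.real:
            # A pair crossed the imaginary axis just before this Re;
            # Im(lam) approximates the angular frequency of the periodic solution.
            return re, lam
        prev_lam = lam
    return None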

Relevance:

10.00%

Publisher:

Abstract:

Electrical impedance tomography (EIT) captures images of internal features of a body. Electrodes are attached to the boundary of the body, low intensity alternating currents are applied, and the resulting electric potentials are measured. Then, based on the measurements, an estimation algorithm obtains the three-dimensional internal admittivity distribution that corresponds to the image. One of the main goals of medical EIT is to achieve high resolution and an accurate result at low computational cost. However, when the finite element method (FEM) is employed and the corresponding mesh is refined to increase resolution and accuracy, the computational cost increases substantially, especially in the estimation of absolute admittivity distributions. Therefore, we consider in this work a fast iterative solver for the forward problem, which was previously reported in the context of structural optimization. We propose several improvements to this solver to increase its performance in the EIT context. The solver is based on the recycling of approximate invariant subspaces, and it is applied to reduce the EIT computation time for a constant and high resolution finite element mesh. In addition, we consider a powerful preconditioner and provide a detailed pseudocode for the improved iterative solver. The numerical results show the effectiveness of our approach: the proposed algorithm is faster than the preconditioned conjugate gradient (CG) algorithm. The results also show that even on a standard PC without parallelization, a high mesh resolution (more than 150,000 degrees of freedom) can be used for image estimation at a relatively low computational cost. (C) 2010 Elsevier B.V. All rights reserved.
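
For reference, the baseline the abstract compares against is preconditioned CG; a minimal Jacobi-preconditioned conjugate gradient for a sparse symmetric positive-definite system is sketched below (it does not implement the subspace-recycling solver proposed in the paper).

# Minimal preconditioned conjugate gradient (Jacobi preconditioner) for a
# sparse SPD system A x = b.  Reference baseline only; the paper's solver
# additionally recycles approximate invariant subspaces between solves.
import numpy as np
import scipy.sparse as sp

def pcg(A, b, tol=1e-8, maxiter=1000):
    x = np.zeros_like(b)
    M_inv = 1.0 / A.diagonal()          # Jacobi preconditioner
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Example: 1D Laplacian
n = 1000
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = pcg(A, b)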

Relevance:

10.00%

Publisher:

Abstract:

A model predictive controller (MPC) is proposed that is robustly stable for some classes of model uncertainty and for unknown disturbances. The case of open-loop stable systems is considered, where only the inputs and controlled outputs are measured. It is assumed that the controller will work in a scenario where target tracking is also required. Here, the approach is extended to the nominal infinite-horizon MPC with output feedback. The method considers an extended cost function that can be made globally convergent for any finite input horizon considered for the uncertain system, and is based on the explicit inclusion of cost-contracting constraints in the control problem. The controller handles the output-feedback case through a non-minimal state-space model that is built from past output measurements and past input increments. The application of the robust output-feedback MPC is illustrated through the simulation of a low-order multivariable system.
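
To make the receding-horizon idea concrete, here is a minimal, generic unconstrained MPC sketch for a known SISO state-space model; it illustrates only finite-horizon prediction plus least-squares optimization, not the robust infinite-horizon output-feedback formulation proposed above, and the horizon and weight values are arbitrary.

# Generic unconstrained receding-horizon tracking sketch for a known SISO
# state-space model (A, B, C).  Illustration only; not the robust
# output-feedback MPC of the paper.
import numpy as np

def mpc_step(A, B, C, x0, r, N=10, lam=0.01):
    """Return the first input of the horizon minimizing
       sum_k (y_k - r)^2 + lam * u_k^2, unconstrained."""
    Phi = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(1, N + 1)])
    Gamma = np.zeros((N, N))
    for i in range(N):                      # y_{i+1} depends on u_0..u_i
        for j in range(i + 1):
            Gamma[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()
    free = Phi @ x0                         # free response of the output
    H = Gamma.T @ Gamma + lam * np.eye(N)
    u = np.linalg.solve(H, Gamma.T @ (np.full(N, r) - free))
    return u[0]                             # receding horizon: apply only u[0]

# Example: double integrator tracking r = 1
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
x = np.array([0.0, 0.0])
for _ in range(30):
    u = mpc_step(A, B, C, x, r=1.0)
    x = A @ x + B.flatten() * u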

Relevance:

10.00%

Publisher:

Abstract:

Scheduling parallel and distributed applications efficiently onto grid environments is a difficult task and a great variety of scheduling heuristics has been developed aiming to address this issue. A successful grid resource allocation depends, among other things, on the quality of the available information about software artifacts and grid resources. In this article, we propose a semantic approach to integrate selection of equivalent resources and selection of equivalent software artifacts to improve the scheduling of resources suitable for a given set of application execution requirements. We also describe a prototype implementation of our approach based on the Integrade grid middleware and experimental results that illustrate its benefits. Copyright (C) 2009 John Wiley & Sons, Ltd.
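
A toy sketch of the underlying idea, requirement-driven resource selection that accepts equivalent software artifacts, is given below; the data structures and equivalence table are hypothetical and stand in for the ontology-based matching built on the Integrade middleware.

# Toy sketch of requirement-driven resource selection with equivalence
# classes of software artifacts.  Data structures are hypothetical and only
# illustrate the idea; the paper uses a semantic (ontology-based) approach
# on top of the Integrade grid middleware.
EQUIVALENT = {                     # artifact -> acceptable substitutes
    "libblas": {"libblas", "openblas", "atlas"},
}

resources = [
    {"name": "node-a", "os": "linux",   "mem_gb": 8,  "software": {"openblas"}},
    {"name": "node-b", "os": "linux",   "mem_gb": 2,  "software": {"libblas"}},
    {"name": "node-c", "os": "windows", "mem_gb": 16, "software": {"atlas"}},
]

def satisfies(resource, req):
    if resource["os"] != req["os"] or resource["mem_gb"] < req["min_mem_gb"]:
        return False
    for artifact in req["software"]:
        acceptable = EQUIVALENT.get(artifact, {artifact})
        if not (acceptable & resource["software"]):
            return False
    return True

req = {"os": "linux", "min_mem_gb": 4, "software": ["libblas"]}
candidates = [r["name"] for r in resources if satisfies(r, req)]
print(candidates)   # ['node-a']: openblas accepted as equivalent to libblas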

Relevance:

10.00%

Publisher:

Abstract:

Lightning-induced overvoltages have a considerable impact on the power quality of overhead distribution and telecommunications systems, and various models have been developed for the computation of the electromagnetic transients caused by indirect strokes. The most adequate has been shown to be the one proposed by Agrawal et al.; the Rusck model can be viewed as a particular case, as both models are equivalent when the lightning channel is perpendicular to the ground plane. In this paper, an extension of the Rusck model that enables the calculation of lightning-induced transients considering flashes to nearby elevated structures and realistic line configurations is tested against data obtained from both natural lightning and scale-model experiments. The latter, performed under controlled conditions, can also be used to verify the validity of other coupling models and relevant codes. The so-called Extended Rusck Model, which is shown to be sufficiently accurate, is applied to the analysis of lightning-induced voltages on lines with a shield wire and/or surge arresters. The investigation indicates that the ratio between the peak values of the voltages induced by typical first and subsequent strokes can be either greater or smaller than unity, depending on the line configuration.
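
For background, the classical simplified Rusck expression estimates the peak voltage induced on an infinitely long single conductor above perfectly conducting ground by a stroke to flat ground; it is quoted below in its textbook form as an assumption, since the paper's Extended Rusck Model generalizes it to elevated strike objects and realistic line configurations.

# Classical simplified Rusck estimate of the peak lightning-induced voltage
# on an infinitely long single conductor over perfectly conducting ground
# (textbook form, quoted as an assumption; the Extended Rusck Model of the
# paper generalizes this setting).
import math

def rusck_peak_voltage(i_peak_ka, h_m, y_m, v_ratio=0.4):
    """Peak induced voltage in kV.

    i_peak_ka -- return-stroke peak current (kA)
    h_m       -- line height (m)
    y_m       -- distance from the stroke to the line (m)
    v_ratio   -- return-stroke speed as a fraction of the speed of light
    """
    z0 = 30.0  # (1 / (4*pi)) * sqrt(mu0 / eps0), in ohms
    factor = 1.0 + (v_ratio / math.sqrt(2.0)) / math.sqrt(1.0 - 0.5 * v_ratio ** 2)
    return z0 * i_peak_ka * h_m / y_m * factor

# Example: 30 kA stroke, 10 m high line, 100 m away -> on the order of 100 kV
print(round(rusck_peak_voltage(30.0, 10.0, 100.0), 1))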

Relevance:

10.00%

Publisher:

Abstract:

Second-order phase-locked loops (PLLs) are devices that are able to provide synchronization between the nodes of a network even under severe quality restrictions on the signal propagation. Consequently, they are widely used in telecommunications and control. Conventional master-slave (M-S) clock-distribution systems are being replaced by mutually connected (MC) ones due to their good potential for new types of application such as wireless sensor networks, distributed computation and communication systems. Here, by using analytical reasoning, a nonlinear algebraic system of equations is proposed to establish the existence conditions for the synchronous state in an MC PLL network. Numerical experiments confirm the analytical results and provide ideas about how the network parameters affect the reachability of the synchronous state. The phase-difference oscillation amplitudes are related to the node parameters, helping in the design of PLL neural networks. Furthermore, estimation of the acquisition time as a function of the node parameters allows the performance evaluation of time-distribution systems and neural networks based on phase-locking techniques. (c) 2008 Elsevier GmbH. All rights reserved.
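
An illustrative numerical experiment in the spirit described above is sketched below: a small, fully connected network of second-order PLL-like nodes with sinusoidal phase detectors is integrated from random initial phases and checked for a synchronous (phase-locked) state. The model and gain values are generic textbook choices, not the paper's equations.

# Illustrative simulation of a small, fully connected network of
# second-order PLL-like nodes with sinusoidal phase detectors.
# Generic model and hypothetical gains, not the paper's equations.
import numpy as np
from scipy.integrate import solve_ivp

N, MU, K = 4, 1.0, 2.0      # number of nodes, loop damping, coupling gain

def rhs(t, state):
    theta, dtheta = state[:N], state[N:]
    coupling = np.array([np.mean(np.sin(theta - theta[i])) for i in range(N)])
    return np.concatenate([dtheta, -MU * dtheta + K * coupling])

rng = np.random.default_rng(0)
y0 = np.concatenate([rng.uniform(-np.pi, np.pi, N), np.zeros(N)])
sol = solve_ivp(rhs, (0.0, 50.0), y0, rtol=1e-8, atol=1e-8)

final_theta, final_freq = sol.y[:N, -1], sol.y[N:, -1]
diffs = np.angle(np.exp(1j * (final_theta - final_theta[0])))
print("phase differences w.r.t. node 0:", np.round(diffs, 4))
print("max |dtheta/dt|:", np.max(np.abs(final_freq)))  # ~0 when a synchronous state is reached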

Relevance:

10.00%

Publisher:

Abstract:

This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and detection problems at their maximum likelihood, both related to direct-sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA MuChE and multi-user detection (MuD) show that the proposed genetic-algorithm multi-user channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both the maximum-likelihood multi-user channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multi-user detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity when compared to the optimum multi-user detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was (jointly) analyzed in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
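
As a hedged illustration of the heuristic machinery involved, a generic real-coded genetic algorithm skeleton (tournament selection, arithmetic crossover, Gaussian mutation, elitism) is sketched below; the fitness function is a toy stand-in, whereas in the paper it would be the DS/CDMA multiuser likelihood metric.

# Generic real-coded genetic algorithm skeleton (tournament selection,
# arithmetic crossover, Gaussian mutation, elitism).  The fitness below is a
# toy stand-in for the multiuser likelihood metric of the paper.
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):
    return -np.sum((x - 0.5) ** 2)          # toy objective: maximum at x = 0.5

def ga(dim=8, pop_size=40, generations=100, p_mut=0.1, sigma=0.05):
    pop = rng.uniform(0.0, 1.0, (pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = [pop[np.argmax(scores)].copy()]           # elitism
        while len(new_pop) < pop_size:
            a = rng.integers(pop_size, size=2)               # tournament 1
            b = rng.integers(pop_size, size=2)               # tournament 2
            p1 = pop[a[np.argmax(scores[a])]]
            p2 = pop[b[np.argmax(scores[b])]]
            w = rng.uniform()                                # arithmetic crossover
            child = w * p1 + (1.0 - w) * p2
            mask = rng.uniform(size=dim) < p_mut             # Gaussian mutation
            child = child + mask * rng.normal(0.0, sigma, dim)
            new_pop.append(child)
        pop = np.array(new_pop)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]

print(np.round(ga(), 3))   # approaches the optimum at 0.5 in every coordinate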

Relevance:

10.00%

Publisher:

Abstract:

In many engineering applications, the time coordination of geographically separated events is of fundamental importance, as in digital telecommunications and integrated digital circuits. Mutually connected (MC) networks are very good candidates for some new types of application, such as wireless sensor networks. This paper presents a study of the behavior of MC networks of digital phase-locked loops (DPLLs). Analytical results are derived showing that, even for static networks without delays, different synchronous states may exist for the network. An upper bound for the number of such states is also presented. Numerical simulations are used to show the following results: (i) the synchronization precision in MC DPLL networks; (ii) the existence of synchronous states for the network does not guarantee that they are reached; and (iii) different synchronous states may be reached for different initial conditions. These results are important in the neural computation context, as in this case each synchronous state may be associated with a different analog memory. (C) 2010 Elsevier B.V. All rights reserved.
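
The following toy experiment illustrates result (iii) only: a discrete-time ring of identical coupled phase nodes is iterated from many random initial conditions and the winding number of the final locked state is recorded, showing that different initial conditions can end up in different phase-locked states. It is a generic coupled-map sketch, not the DPLL model analyzed in the paper.

# Toy illustration of result (iii): a discrete-time ring of identical
# coupled phase nodes can settle into different phase-locked ("twisted")
# states depending on the initial condition.  Generic coupled-map sketch,
# not the paper's DPLL equations.
import numpy as np

N, EPS, STEPS = 20, 0.2, 4000
rng = np.random.default_rng(2)

def winding_number(phases):
    # Sum of wrapped phase differences around the ring, in units of 2*pi.
    d = np.angle(np.exp(1j * np.diff(np.append(phases, phases[0]))))
    return int(round(np.sum(d) / (2 * np.pi)))

states = set()
for _ in range(30):
    phi = rng.uniform(-np.pi, np.pi, N)
    for _ in range(STEPS):
        left, right = np.roll(phi, 1), np.roll(phi, -1)
        phi = phi + EPS * (np.sin(left - phi) + np.sin(right - phi))
    states.add(winding_number(phi))

print("distinct locked states (winding numbers):", sorted(states))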

Relevance:

10.00%

Publisher:

Abstract:

This work presents a method for predicting resource availability in opportunistic grids by means of use pattern analysis (UPA), a technique based on non-supervised learning methods. The prediction method rests on the assumption that there exist several classes of computational resource use patterns, which can be used to predict resource availability. Trace-driven simulations validate this basic assumption and also provide the parameter settings for the accurate learning of resource use patterns. Experiments with an implementation of the UPA method show the feasibility of its use in the scheduling of grid tasks with very little overhead. The experiments also demonstrate the method's superiority over other predictive and non-predictive methods. An adaptive prediction method is suggested to deal with the lack of training data at initialization. Further adaptive behaviour is motivated by experiments which show that, in some special environments, reliable resource use patterns may not always be detected. Copyright (C) 2009 John Wiley & Sons, Ltd.
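
A minimal sketch of the idea behind use-pattern analysis is shown below: daily availability traces are clustered with k-means and a machine's cluster centroid is used as its availability forecast. The data are synthetic and the parameters hypothetical; this is not the UPA implementation.

# Minimal sketch of use-pattern analysis: cluster daily availability traces
# with k-means and use a machine's cluster centroid as its forecast.
# Synthetic data and hypothetical parameters; not the UPA implementation.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
hours = np.arange(24)

# Synthetic traces: "office" machines busy 9-18h, "lab" machines mostly idle.
office = (1.0 - ((hours >= 9) & (hours < 18)).astype(float)) + rng.normal(0, 0.05, (60, 24))
lab = np.full((40, 24), 0.9) + rng.normal(0, 0.05, (40, 24))
traces = np.clip(np.vstack([office, lab]), 0.0, 1.0)   # fraction of CPU available

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(traces)
centroids = model.cluster_centers_

# Predict tomorrow's availability for a machine from its cluster centroid.
new_trace = traces[0]
predicted = centroids[model.predict(new_trace.reshape(1, -1))[0]]
print(np.round(predicted, 2))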

Relevance:

10.00%

Publisher:

Abstract:

Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, for instance by creating anti-virus programs that act as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the overall operation of a system, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and how its parameters relate to network characteristics is explained. Then, disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks. (C) 2009 Elsevier Inc. All rights reserved.
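
For background, the sketch below integrates the standard SIR model with vital dynamics (births and deaths at rate mu), whose basic reproduction number R0 = beta / (gamma + mu) separates the disease-free from the endemic regime; the rates are hypothetical and the equations are the textbook ones, not the modified network version proposed in the paper.

# Standard SIR model with vital dynamics, used only to illustrate the
# disease-free vs. endemic equilibria; the paper analyses a modified SIR
# whose parameters map to network characteristics.
import numpy as np
from scipy.integrate import solve_ivp

BETA, GAMMA, MU, POP = 0.5, 0.1, 0.01, 1.0   # hypothetical rates, normalized population

def sir(t, y):
    s, i, r = y
    return [MU * POP - BETA * s * i / POP - MU * s,
            BETA * s * i / POP - (GAMMA + MU) * i,
            GAMMA * i - MU * r]

r0 = BETA / (GAMMA + MU)                     # basic reproduction number
sol = solve_ivp(sir, (0.0, 2000.0), [0.99, 0.01, 0.0], rtol=1e-8)

print(f"R0 = {r0:.2f}")
if r0 > 1.0:
    i_star = MU * POP * (r0 - 1.0) / BETA    # endemic infected fraction
    print(f"endemic equilibrium I* = {i_star:.4f}, simulated I(T) = {sol.y[1, -1]:.4f}")
else:
    print("disease-free equilibrium is stable")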

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single-hazard models, such as the Weibull and the log-logistic models, is that they accommodate a large variety of nonmonotone hazard shapes, such as bathtub and multimodal curves. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. A discussion of the computation of the likelihood displacement, as well as of the normal curvature in the local influence method, is presented. Finally, an example with real data is given for illustration.
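
For reference, in Cook's (1986) local-influence framework, which the diagnostics above build on, the likelihood displacement and the normal curvature in a unit direction l are conventionally written as

\[
\mathrm{LD}(\omega) = 2\left[\ell(\hat{\theta}) - \ell(\hat{\theta}_{\omega})\right],
\]
\[
C_{l} = 2\left|\, l^{\top} \Delta^{\top} \left[\ddot{L}(\hat{\theta})\right]^{-1} \Delta\, l \,\right|, \qquad \lVert l \rVert = 1,
\]

where \ddot{L}(\hat{\theta}) is the Hessian of the log-likelihood and \Delta is the matrix of second derivatives of the perturbed log-likelihood with respect to the parameters and the perturbation vector \omega, both evaluated at the maximum likelihood estimate and at the null perturbation \omega_0. These are the standard expressions, quoted as background rather than as the paper's exact formulas.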

Relevance:

10.00%

Publisher:

Abstract:

We study in detail the so-called beta-modified Weibull distribution, motivated by the wide use of the Weibull distribution in practice and by the fact that the generalization provides a continuous crossover towards cases with different shapes. The new distribution is important since it contains as special sub-models some widely known distributions, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among several others. It also provides more flexibility to analyse complex real data. Various mathematical properties of this distribution are derived, including its moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are also derived for the chf, mean deviations, Bonferroni and Lorenz curves, reliability and entropies. The estimation of parameters is approached by two methods: moments and maximum likelihood. We compare by simulation the performance of the estimates from these methods. We obtain the expected information matrix. Two applications are presented to illustrate the proposed distribution.
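
Under one common parametrization (assumed here, since the abstract does not spell it out), the beta-modified Weibull cdf is obtained by applying the regularized incomplete beta function to the modified Weibull cdf G(x) = 1 - exp(-alpha x^gamma e^(lambda x)); the sketch below evaluates it and checks two of the stated special cases.

# One common parametrization of the beta-modified Weibull cdf (assumed):
# F(x) = I_{G(x)}(a, b) with G(x) = 1 - exp(-alpha * x**gamma * exp(lam * x)),
# where I_z(a, b) is the regularized incomplete beta function.
import numpy as np
from scipy.special import betainc

def bmw_cdf(x, a, b, alpha, gamma, lam):
    g = 1.0 - np.exp(-alpha * np.power(x, gamma) * np.exp(lam * x))
    return betainc(a, b, g)

# Sanity checks: with a = b = 1 the beta layer is the identity, and with
# lam = 0 the base model reduces to an ordinary Weibull cdf.
x = np.linspace(0.01, 5.0, 5)
print(np.round(bmw_cdf(x, 1.0, 1.0, 1.0, 1.5, 0.0), 4))
print(np.round(1.0 - np.exp(-np.power(x, 1.5)), 4))   # plain Weibull, same values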

Relevance:

10.00%

Publisher:

Abstract:

A four-parameter extension of the generalized gamma distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lies in its ability to model monotone and non-monotone failure rate functions, which are quite common in lifetime data analysis and reliability. The new distribution has a number of well-known lifetime special sub-models, such as the exponentiated Weibull, exponentiated generalized half-normal, exponentiated gamma and generalized Rayleigh, among others. We derive two infinite sum representations for its moments. We calculate the density of the order statistics and two expansions for their moments. The method of maximum likelihood is used for estimating the model parameters and the observed information matrix is obtained. Finally, a real data set from the medical area is analysed.
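
As a small, hedged illustration of maximum-likelihood fitting for one of the named sub-models, the sketch below fits SciPy's exponentiated Weibull distribution to simulated lifetimes; it covers only that special case, not the four-parameter extension of the generalized gamma studied in the paper, and the parameter values are arbitrary.

# Maximum-likelihood fit of the exponentiated Weibull sub-model using SciPy.
# Illustrates only one special case named in the abstract, not the full
# four-parameter extension of the generalized gamma.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
true_a, true_c, true_scale = 2.5, 1.2, 3.0           # shape, shape, scale (arbitrary)
data = stats.exponweib.rvs(true_a, true_c, loc=0, scale=true_scale,
                           size=2000, random_state=rng)

a_hat, c_hat, loc_hat, scale_hat = stats.exponweib.fit(data, floc=0)
print(f"a = {a_hat:.2f}, c = {c_hat:.2f}, scale = {scale_hat:.2f}")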

Relevance:

10.00%

Publisher:

Abstract:

We introduce the log-beta Weibull regression model based on the beta Weibull distribution (Famoye et al., 2005; Lee et al., 2007). We derive expansions for the moment generating function which do not depend on complicated functions. The new regression model represents a parametric family of models that includes as sub-models several widely known regression models applicable to censored survival data. We employ a frequentist analysis, a jackknife estimator, and a parametric bootstrap for the parameters of the proposed model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Further, several simulations are performed for different parameter settings, sample sizes, and censoring percentages. In addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be extended to a modified deviance residual in the proposed regression model applied to censored data. We define martingale and deviance residuals to evaluate the model assumptions. The extended regression model is very useful for the analysis of real data and can give more realistic fits than other special regression models.
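
For reference, the martingale and deviance residuals for censored survival data are conventionally defined as

\[
r_{M_i} = \delta_i - \hat{\Lambda}(t_i),
\qquad
r_{D_i} = \operatorname{sign}(r_{M_i})\left\{-2\left[r_{M_i} + \delta_i \log\left(\delta_i - r_{M_i}\right)\right]\right\}^{1/2},
\]

where \delta_i is the censoring indicator and \hat{\Lambda}(t_i) the fitted cumulative hazard for subject i; these are the standard definitions, which the paper adapts to the log-beta Weibull regression model.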

Relevance:

10.00%

Publisher:

Abstract:

A simplex-lattice statistical design was employed to study an optimization method for a preservative system in an ophthalmic suspension of dexamethasone and polymyxin B. The assay matrix generated 17 formulas, which were differentiated by the preservatives and EDTA (disodium ethylenediaminetetraacetate), the independent variables being: X-1 = chlorhexidine digluconate (0.010 % w/v); X-2 = phenylethanol (0.500 % w/v); X-3 = EDTA (0.100 % w/v). The dependent variable was the D-value, obtained from the microbial challenge of the formulas and calculated by modeling the microbial killing process with an exponential function. The analysis of the dependent variable, performed using the Design Expert/W software, originated cubic equations with terms derived from the stepwise adjustment method for the challenge microorganisms: Pseudomonas aeruginosa, Burkholderia cepacia, Staphylococcus aureus, Candida albicans and Aspergillus niger. Besides the mathematical expressions, the response surfaces and the contour graphics were obtained for each assay. The contour graphs were overlaid in order to identify a region containing the most adequate formulas (graphic strategy), having as representatives: X-1 = 0.10 (0.001 % w/v); X-2 = 0.80 (0.400 % w/v); X-3 = 0.10 (0.010 % w/v). Additionally, in order to minimize the response (D-value), a numerical strategy corresponding to the use of the desirability function was applied, which resulted in the following combination of independent variables: X-1 = 0.25 (0.0025 % w/v); X-2 = 0.75 (0.375 % w/v); X-3 = 0. The formulas derived from the two strategies (graphic and numerical) were submitted to microbial challenge, and the experimental D-value obtained was compared to the theoretical D-value calculated from the cubic equation. The two D-values were similar in all the assays except the one related to Staphylococcus aureus. This microorganism, as well as Pseudomonas aeruginosa, presented intense susceptibility to the formulas independently of the preservative and EDTA concentrations. The formulas derived from both the graphic and the numerical strategies met the criteria recommended by the official method. It was concluded that the proposed model allowed the optimization of the formulas with respect to their preservation.
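
Since the D-value used above is the decimal reduction time of the log-linear kill model N(t) = N0 * 10^(-t/D), it can be estimated from challenge-test survivor counts by a simple regression; a minimal sketch with made-up counts is shown below.

# Minimal sketch: estimate the D-value from microbial challenge counts under
# the log-linear kill model N(t) = N0 * 10**(-t / D), i.e. the D-value is the
# time for a one-log10 reduction.  The counts below are made up for illustration.
import numpy as np

t_hours = np.array([0.0, 6.0, 24.0, 48.0, 168.0])          # sampling times
cfu_ml = np.array([1.0e6, 3.2e5, 1.5e4, 4.0e2, 1.0e1])     # survivor counts

slope, intercept = np.polyfit(t_hours, np.log10(cfu_ml), 1)
d_value = -1.0 / slope
print(f"D-value ~ {d_value:.1f} h")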