984 results for Meta-heuristics algorithms


Relevance: 20.00%

Publisher:

Abstract:

Magdeburg, Univ., Fak. für Mathematik, Habil.-Schr., 2006

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Although diuretics are the mainstay of treatment for acute decompensated heart failure (ADHF), inadequate responses and complications have led to the use of extracorporeal ultrafiltration (UF) as an alternative strategy for reducing volume overload in patients with ADHF. Objective: The aim of our study is to perform a meta-analysis of the results obtained from studies on extracorporeal venous ultrafiltration and compare them with those of standard diuretic treatment for volume overload reduction in acute decompensated heart failure. Methods: MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials databases were systematically searched using pre-specified criteria. Pooled estimates of outcomes after 48 h (weight change, serum creatinine level, and all-cause mortality) were computed using random-effects models. Pooled weighted mean differences were calculated for weight loss and change in creatinine level, whereas a pooled risk ratio was used for the analysis of the binary all-cause mortality outcome. Results: A total of nine studies, involving 613 patients, met the eligibility criteria. The mean weight loss in patients who underwent UF therapy was 1.78 kg (95% Confidence Interval (CI): −2.65 to −0.91 kg; p < 0.001) greater than in those who received standard diuretic therapy. The post-intervention creatinine level, however, was not significantly different (mean change = −0.25 mg/dL; 95% CI: −0.56 to 0.06 mg/dL; p = 0.112). The risk of all-cause mortality did not differ between patients treated with UF and patients treated with standard diuretics (pooled RR = 1.00; 95% CI: 0.64–1.56; p = 0.993). Conclusion: Compared with standard diuretic therapy, UF treatment for volume overload reduction in individuals suffering from ADHF resulted in a significant reduction of body weight within 48 h. However, no significant decrease in serum creatinine level or reduction in all-cause mortality was observed.
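As a hedged illustration of the statistics reported above, the sketch below pools study-level mean differences with a DerSimonian-Laird random-effects model, a standard way to obtain a pooled weighted mean difference and its 95% CI. The input effects and variances are made-up placeholders, not the data of this review.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool study-level effects with a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w_fixed = 1.0 / variances                        # inverse-variance (fixed-effect) weights
    mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
    q = np.sum(w_fixed * (effects - mu_fixed) ** 2)  # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance estimate
    w = 1.0 / (variances + tau2)                     # random-effects weights
    mu = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return mu, (mu - 1.96 * se, mu + 1.96 * se)      # pooled effect and 95% CI

# Illustrative weight-change differences (kg) and variances, NOT the review's data:
print(dersimonian_laird([-1.5, -2.1, -0.9], [0.20, 0.35, 0.25]))
```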

Relevance: 20.00%

Publisher:

Abstract:

Abstract Clinical decision-making requires the synthesis of evidence from literature reviews focused on a specific theme. Evidence synthesis is performed with qualitative assessments and systematic reviews of randomized clinical trials, typically combined with statistical pooling through pairwise meta-analysis. Beyond pairwise comparisons, these methods include adjusted indirect comparison meta-analysis, network meta-analysis, and mixed-treatment comparison. These tools allow the synthesis of evidence and the comparison of effectiveness in cardiovascular research.
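For concreteness, one of the techniques named above, the adjusted indirect comparison (often attributed to Bucher), reduces to a simple calculation when treatments A and B have each been compared with a common comparator C. A minimal sketch, with illustrative inputs only:

```python
import math

def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
    """Adjusted indirect comparison of A vs B through a common comparator C.

    d_ac, d_bc: effect estimates (e.g., log risk ratios) of A vs C and B vs C.
    """
    d_ab = d_ac - d_bc                          # indirect A-vs-B effect
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)  # variances add for independent trials
    return d_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

# Illustrative log risk ratios, not taken from any particular study:
print(bucher_indirect(-0.30, 0.10, -0.10, 0.12))
```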

Relevance: 20.00%

Publisher:

Abstract:

Abstract Hypertension affects 25% of the world's population and is considered a risk factor for cardiovascular disorders and other diseases. The aim of this study was to examine the evidence regarding the acute effect of exercise on blood pressure (BP) using meta-analytic measures. Sixty-five studies were compared using effect sizes (ES), with heterogeneity and Z tests used to determine whether the ES differed from zero. The mean corrected global ES for exercise conditions were −0.56 (−4.80 mmHg) for systolic BP (sBP) and −0.44 (−3.19 mmHg) for diastolic BP (dBP) (z ≠ 0 for all; p < 0.05). The reduction in BP was significant regardless of the participants' initial BP level, gender, physical activity level, antihypertensive drug intake, type of BP measurement, time of day at which BP was measured, type of exercise performed, and exercise training program (p < 0.05 for all). ANOVA tests revealed that BP reductions were greater when participants were male, not receiving antihypertensive medication, and physically active, and when the exercise performed was jogging. A significant inverse correlation was found between age and BP ES, between body mass index (BMI) and sBP ES, between the duration of the exercise session and sBP ES, and between the number of sets performed in the resistance exercise program and sBP ES (p < 0.05). Regardless of the characteristics of the participants and the exercise, there was a reduction in BP in the hours following an exercise session. However, the hypotensive effect was greater when exercise was performed as a preventive strategy by physically active individuals not taking antihypertensive medication.
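As a rough illustration of the meta-analytic measures described, the sketch below pools standardized effect sizes by inverse-variance weighting and applies a Z test against zero. The study values are hypothetical, not drawn from the sixty-five studies analysed.

```python
import math

def pooled_effect_size(es, variances):
    """Inverse-variance weighted mean effect size with a z test against zero."""
    w = [1.0 / v for v in variances]
    mean_es = sum(wi * ei for wi, ei in zip(w, es)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return mean_es, mean_es / se

# Illustrative systolic-BP effect sizes from hypothetical studies:
es, z = pooled_effect_size([-0.60, -0.45, -0.62], [0.04, 0.05, 0.03])
print(f"pooled ES = {es:.2f}, z = {z:.2f}")  # |z| > 1.96 -> differs from zero at p < 0.05
```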

Relevance: 20.00%

Publisher:

Abstract:

This work describes a test tool for performance testing of different end-to-end available-bandwidth estimation algorithms and their various implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in the congestion control mechanism of high-performance reliable transport protocols. The main idea of this paper is to describe the options for providing an available-bandwidth estimation mechanism for high-speed data transport protocols and to develop the basic functionality of a test tool that can manage the test-application entities on all testing hosts involved, aided by middleware.
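The paper does not publish its tool's code; the sketch below only illustrates the kind of middleware-style orchestration described, launching estimator entities on remote testing hosts and collecting their raw output. The host names and estimator command lines are placeholders.

```python
import subprocess

# Hypothetical estimator implementations to compare; the command lines are placeholders.
ESTIMATORS = {
    "pathload": "pathload_rcv -s {sender}",
    "abing":    "abing -d {sender}",
}

def run_estimator(name, receiver_host, sender_host):
    """Launch one available-bandwidth estimator on a remote testing host via ssh."""
    cmd = ESTIMATORS[name].format(sender=sender_host)
    result = subprocess.run(
        ["ssh", receiver_host, cmd],
        capture_output=True, text=True, timeout=120,
    )
    return result.stdout  # raw tool output; parsing is tool-specific

for name in ESTIMATORS:
    print(name, run_estimator(name, "rcv.example.net", "snd.example.net"))
```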

Relevance: 20.00%

Publisher:

Abstract:

In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT), and proper techniques for maximizing FFT/IFFT execution speed, such as pipelining, parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUTs) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to the various FFT/IFFT algorithms, along with their ability to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the ability of the FFT/IFFT hardware to support the parallelism afforded by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance when a specialized FFT/IFFT hardware architecture exploits the parallelism of the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in a polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
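To make the LUT idea concrete, here is a minimal sketch of a direct DFT that precomputes its sine/cosine twiddle factors once and reuses them via the exponent's periodicity. This illustrates the principle in software only; it is not the paper's hardware design.

```python
import cmath

def dft_with_lut(x):
    """Direct DFT that reads twiddle factors from a precomputed look-up table."""
    n = len(x)
    # LUT of the n distinct twiddle factors W_n^k = e^(-2*pi*i*k/n), computed once.
    lut = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n)]
    return [
        sum(x[t] * lut[(k * t) % n] for t in range(n))  # exponent k*t folds mod n
        for k in range(n)
    ]

print(dft_with_lut([1.0, 2.0, 3.0, 4.0]))  # expect [10, -2+2j, -2, -2-2j]
```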

Relevance: 20.00%

Publisher:

Abstract:

Some practical aspects of implementing genetic algorithms for the life-cycle management of electrotechnical equipment are considered.

Relevance: 20.00%

Publisher:

Abstract:

Experimental data commonly show persistent oscillations in aggregate outcomes and high levels of heterogeneity in individual behavior, as well as significant deviations from aggregate Nash equilibrium predictions. In this paper, we employ an evolutionary model with boundedly rational agents to explain these findings. We use data from common property resource experiments (Casari and Plott, 2003). Instead of positing individual-specific utility functions, we model decision makers as selfish and identical. Agent interaction is simulated using an individual-learning genetic algorithm, where agents have constraints on their working memory, a limited ability to maximize, and experiment with new strategies. We show that the model replicates most of the patterns found in common property resource experiments.
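A minimal sketch of an individual-learning genetic algorithm in this spirit: each selfish agent evolves its own small strategy pool under a working-memory constraint, keeps only the better half of its pool (limited ability to maximize), and occasionally experiments by mutation. The payoff function and all parameters are illustrative stand-ins, not the calibration used with the Casari and Plott (2003) data.

```python
import random

POOL = 6     # working-memory constraint: how many strategies an agent retains
MUT = 0.05   # experimentation (mutation) rate

def payoff(own, others_mean):
    """Illustrative common-pool payoff: returns decline as total appropriation rises."""
    return own * (10.0 - 0.5 * (own + others_mean))

def step(agent, others_mean):
    """One individual-learning step: evaluate own pool, keep the better half, experiment."""
    ranked = sorted(agent, key=lambda s: payoff(s, others_mean), reverse=True)
    survivors = ranked[: POOL // 2]
    offspring = [
        max(0.0, s + random.gauss(0.0, 1.0)) if random.random() < MUT else s
        for s in random.choices(survivors, k=POOL - POOL // 2)
    ]
    return survivors + offspring

random.seed(1)
agents = [[random.uniform(0.0, 10.0) for _ in range(POOL)] for _ in range(8)]
mean_play = 5.0
for _ in range(200):
    plays = [max(a, key=lambda s: payoff(s, mean_play)) for a in agents]  # each plays its best guess
    agents = [step(a, mean_play) for a in agents]
    mean_play = sum(plays) / len(plays)
print([round(max(a, key=lambda s: payoff(s, mean_play)), 2) for a in agents])  # heterogeneous strategies
```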

Relevance: 20.00%

Publisher:

Abstract:

"Vegeu el resum a l'inici del fitxer adjunt."

Relevance: 20.00%

Publisher:

Abstract:

We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of the dynamics under strongly simplifying assumptions (i.e. only three strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax these strongly simplifying assumptions, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations describe a behavior of the system very close to the one predicted by the replicator dynamics, without imposing any of the assumptions of the analytical model. Our main conclusion is that analytical and computational models are good complements for research in the social sciences; in particular, computational models are extremely useful for extending the scope of the analysis to complex scenarios.
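A minimal sketch of discrete-time replicator dynamics over three strategies, in the spirit of the analytical model described; the payoff matrix shown is a placeholder, not the paper's repeated Prisoners' Dilemma payoffs.

```python
import numpy as np

def replicator_step(x, A, dt=0.01):
    """One Euler step of replicator dynamics: x_i grows when its fitness beats the mean."""
    fitness = A @ x                      # fitness of each strategy against the population
    x = x + dt * x * (fitness - x @ fitness)
    return x / x.sum()                   # renormalize to correct Euler drift

# Placeholder 3x3 payoff matrix (rows = own strategy, columns = opponent's):
A = np.array([[3.0, 0.0, 4.0],
              [5.0, 1.0, 1.0],
              [2.0, 2.0, 3.0]])

x = np.array([0.4, 0.3, 0.3])            # initial population shares
for _ in range(10_000):
    x = replicator_step(x, A)
print(x)                                  # long-run strategy shares
```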

Relevance: 20.00%

Publisher:

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from the data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide an efficient means of modelling local anomalies that may typically arise at an early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns; this is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
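One simple way to realize a multi-scale SVR of the kind described is a custom kernel that mixes a short-scale and a large-scale RBF kernel (a weighted sum of valid kernels is itself a valid kernel). The sketch below uses scikit-learn for illustration; the length scales, mixing weight, and synthetic data are assumptions, and this is not the authors' exact formulation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

def multi_scale_kernel(X, Y, gamma_short=10.0, gamma_large=0.1, w=0.5):
    """Mixture of a short-scale and a large-scale RBF kernel."""
    return w * rbf_kernel(X, Y, gamma=gamma_short) + (1 - w) * rbf_kernel(X, Y, gamma=gamma_large)

# Synthetic 1-D field: a smooth large-scale trend plus one short-scale local anomaly.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = (np.sin(0.5 * X[:, 0])
     + 0.8 * np.exp(-20 * (X[:, 0] - 3.0) ** 2)   # local anomaly near x = 3
     + 0.05 * rng.normal(size=200))

model = SVR(kernel=multi_scale_kernel, C=10.0, epsilon=0.01)
model.fit(X, y)
print(model.predict(np.array([[3.0], [8.0]])))  # near the anomaly vs. the smooth region
```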

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Surgical clipping of unruptured intracranial aneurysms (UIAs) has recently been challenged by the emergence of endovascular treatment. We performed an updated systematic review and meta-analysis on the surgical treatment of UIAs, in an attempt to determine the aneurysm occlusion rates and the safety of surgery in the modern era. METHODS: A detailed protocol was developed prior to conducting the review, according to the Cochrane Collaboration guidelines. Electronic databases spanning January 1990 to April 2011 were searched, complemented by hand searching. Heterogeneity was assessed using I², and publication bias with funnel plots. Surgical mortality and morbidity were analysed with weighted random-effects models. RESULTS: 60 studies with 9845 patients harbouring 10,845 aneurysms were included. Mortality occurred in 157 patients (1.7%; 99% CI 0.9% to 3.0%; I² = 82%). Unfavourable outcomes, including death, occurred in 692 patients (6.7%; 99% CI 4.9% to 9.0%; I² = 85%). Morbidity rates were significantly greater in higher-quality studies and with large or posterior-circulation aneurysms. Reported morbidity rates decreased over time. The studies were generally of poor quality; funnel plots showed heterogeneous results and publication bias, and data on aneurysm occlusion rates were scant. CONCLUSIONS: In studies published between 1990 and 2011, clipping of UIAs was associated with 1.7% mortality and 6.7% overall morbidity. The reputed durability of clipping has not been rigorously documented. Given the quality of the included studies, the available literature cannot properly guide clinical decisions.
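As a hedged illustration of the heterogeneity statistic quoted in these results, the sketch below derives I² from Cochran's Q under inverse-variance weighting; the study-level inputs are invented placeholders, not this review's data.

```python
def i_squared(effects, variances):
    """I^2: percentage of total variability attributable to between-study heterogeneity."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - pooled) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Illustrative study-level log-odds of an unfavourable outcome:
print(f"I^2 = {i_squared([-3.2, -2.1, -2.8, -1.5], [0.04, 0.09, 0.05, 0.10]):.0f}%")
```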