995 results for randomized algorithms


Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
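Although the paper's exact merit function is not reproduced in this abstract, the general repulsion idea can be sketched in a few lines. In the sketch below (our own illustration: the test system, the penalty form, and all constants are assumptions), every root already found adds an erf-shaped bump to the merit value, so repeated Nelder-Mead searches are steered toward new roots.

```python
import numpy as np
from math import erf
from scipy.optimize import minimize

def system(x):
    """Example system F(x) = 0 with two real roots (hypothetical test problem)."""
    return np.array([x[0]**2 + x[1]**2 - 1.0,   # unit circle
                     x[0] - x[1]**2])            # parabola

def merit(x, found, rho=1.0, delta=5.0):
    """Penalty-type merit function: ||F(x)||^2 inflated near already-found
    roots via an erf-based repulsion term (one plausible form, not
    necessarily the paper's exact definition)."""
    base = np.sum(system(x)**2)
    for xi in found:
        d = np.linalg.norm(x - xi)
        base += rho * (1.0 - erf(delta * d))  # ~rho at a found root, ->0 far away
    return base

found = []
rng = np.random.default_rng(0)
for _ in range(20):                       # multistart with repulsion
    x0 = rng.uniform(-2, 2, size=2)
    res = minimize(merit, x0, args=(found,), method='Nelder-Mead')
    if res.fun < 1e-6 and all(np.linalg.norm(res.x - xi) > 1e-3 for xi in found):
        found.append(res.x)

print(found)  # distinct roots located by the repulsion scheme
```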

Relevance:

20.00%

Publisher:

Abstract:

Optimization with stochastic algorithms has become a relevant research field. Due to its stochastic nature, its assessment is not straightforward and involves integrating accuracy and precision. Performance profiles for the mean do not show the trade-off between accuracy and precision, and parametric stochastic profiles require strong distributional assumptions and are limited to the mean performance for a large number of runs. In this work, bootstrap performance profiles are used to compare stochastic algorithms for different statistics. This technique allows the estimation of the sampling distribution of almost any statistic even with small samples. Multiple comparison profiles are presented for more than two algorithms. The advantages and drawbacks of each assessment methodology are discussed.
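For concreteness, here is a minimal sketch of the two ingredients, assuming hypothetical run data and the median as the statistic of interest: a bootstrap estimate of a statistic's sampling distribution, and a standard performance profile built from per-problem ratios to the best solver.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_stat(runs, stat=np.median, B=1000):
    """Estimate the sampling distribution of `stat` over independent runs
    of a stochastic solver by resampling with replacement."""
    runs = np.asarray(runs)
    return np.array([stat(rng.choice(runs, size=runs.size, replace=True))
                     for _ in range(B)])

def performance_profile(perf, taus):
    """Performance profile: perf[s][p] = measure of solver s on problem p
    (lower is better); returns, for each solver, the fraction of problems
    whose ratio to the per-problem best is <= tau."""
    perf = np.asarray(perf)                   # shape (solvers, problems)
    ratios = perf / perf.min(axis=0)          # ratio to best solver per problem
    return np.array([[np.mean(r <= tau) for tau in taus] for r in ratios])

# Hypothetical data: 3 solvers x 10 problems x 30 runs of final objective value
data = rng.gamma(shape=2.0, scale=1.0, size=(3, 10, 30))

# Bootstrap a per-problem statistic (here the median), then build the profiles
med = np.array([[bootstrap_stat(data[s, p]).mean() for p in range(10)]
                for s in range(3)])
print(performance_profile(med, taus=[1.0, 1.5, 2.0]))
```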

Relevance:

20.00%

Publisher:

Abstract:

Although some studies point to cognitive stimulation as a beneficial therapy for older adults with cognitive impairments, this area of research and practice still lacks dissemination and is underrepresented in many countries. Moreover, the comparative effects of different intervention durations remain to be established and, besides cognitive effects, pragmatic parameters, such as cost-effectiveness and experiential relevance to participants, are seldom explored. In this work, we present a randomized controlled wait-list trial evaluating 2 different intervention durations (standard = 17 vs brief = 11 sessions) of a cognitive stimulation program developed for older adults with cognitive impairments with or without dementia. Twenty participants were randomly assigned to the standard duration intervention program (17 sessions, 1.5 months) or to a wait-list group. At postintervention of the standard intervention group, the wait-list group crossed over to receive the brief intervention program (11 sessions, 1 month). Changes in neuropsychological, functionality, quality of life, and caregiver outcomes were evaluated. Experience during intervention, costs, and feasibility were also evaluated. The current cognitive stimulation programs (i.e., standard and brief) showed high values of experiential relevance for both intervention durations. High adherence, completion rates, and reasonable costs were found for both formats. Further studies are needed to definitively establish the potential efficacy, optimal duration, cost-effectiveness, and experiential relevance for participants of cognitive intervention approaches.

Relevance:

20.00%

Publisher:

Abstract:

PhD thesis in Biomedical Engineering

Relevance:

20.00%

Publisher:

Abstract:

Distributed data aggregation is an important task, allowing the decentralized determination of meaningful global properties that can then be used to direct the execution of other applications. The aggregated values result from the distributed computation of functions like count, sum, and average; application examples include determining the network size, total storage capacity, average load, and majorities, among many others. In the last decade, many different approaches have been proposed, with different trade-offs in terms of accuracy, reliability, and message and time complexity. Due to the considerable amount and variety of aggregation algorithms, it can be difficult and time-consuming to determine which techniques are most appropriate in specific settings, justifying the existence of a survey to aid in this task. This work reviews the state of the art on distributed data aggregation algorithms, providing three main contributions. First, it formally defines the concept of aggregation, characterizing the different types of aggregation functions. Second, it succinctly describes the main aggregation techniques, organizing them in a taxonomy. Finally, it provides some guidelines toward the selection and use of the most relevant techniques, summarizing their principal characteristics.
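As one concrete example from this family of techniques, the toy simulation below sketches push-sum gossip averaging (our own illustration; the survey itself covers many variants): each node repeatedly halves a (sum, weight) pair and ships one half to a random peer, and every local ratio converges to the global average.

```python
import random

def push_sum_average(values, rounds=50, seed=0):
    """Toy synchronous simulation of push-sum gossip averaging: every node
    keeps a (sum, weight) pair, halves it each round, keeps one half, and
    sends the other to a random peer; sum/weight converges to the average."""
    random.seed(seed)
    n = len(values)
    s = list(values)          # sum estimates
    w = [1.0] * n             # weights
    for _ in range(rounds):
        inbox = [[] for _ in range(n)]
        for i in range(n):
            half_s, half_w = s[i] / 2, w[i] / 2
            inbox[i].append((half_s, half_w))                    # keep one half
            inbox[random.randrange(n)].append((half_s, half_w))  # send the other
        for i in range(n):
            s[i] = sum(p[0] for p in inbox[i])
            w[i] = sum(p[1] for p in inbox[i])
    return [si / wi for si, wi in zip(s, w)]

vals = [10.0, 2.0, 7.0, 5.0]
print(push_sum_average(vals))   # every local estimate approaches 6.0
```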

Relevance:

20.00%

Publisher:

Abstract:

Over the last two decades, the results of randomized clinical studies, which are powerful aids for correctly assessing therapeutic strategies, have consolidated cardiological practice. In addition, scientifically interesting hypotheses have been generated by the results of epidemiological studies. Properly conducted randomized studies, free of systematic error and with statistical power adequate to demonstrate moderate and reasonable benefits in relevant clinical outcomes, have provided reliable and strong results that alter clinical practice, thus providing adequate treatment for patients with cardiovascular disease (CVD). The dissemination and use of evidence-based medicine in treating coronary artery disease (CAD) and heart failure (HF), and in prevention, will prevent hundreds of thousands of deaths annually in developed and developing countries. CVD is responsible for approximately 12 million deaths annually throughout the world, and approximately 60% of these deaths occur in developing countries. In recent years, an increase in mortality and morbidity rates due to CVD has occurred in developing countries. This increase indicates that an epidemiological (demographic, economic, and health-related) transition is taking place in developing countries, and this transition implies a global epidemic of CVD, which will require wide-ranging and globally effective strategies for prevention. The identification of conventional and emerging risk factors for CVD, as well as their management in high-risk individuals, has contributed to the decrease in the mortality rate due to CVD. Through national collaboration, several multi-center and multinational randomized and epidemiological studies have been carried out throughout Brazil, contributing not only to generalized scientific growth in different Brazilian hospitals but also to the consolidation of an increasingly evidence-based clinical practice.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To assess the effects of carvedilol in patients with idiopathic dilated cardiomyopathy. METHODS: In a double-blind, randomized, placebo-controlled study, 30 patients (7 women) with functional class II and III heart failure were assessed. Their ages ranged from 28 to 66 years (mean of 43±9 years), and their left ventricular ejection fraction varied from 8% to 35%. Carvedilol was added to the usual therapy of 20 patients; placebo was added to the usual therapy of 10 patients. The initial dose of carvedilol was 12.5 mg, which was increased weekly up to 75 mg/day, according to the patient's tolerance. Clinical assessment, electrocardiography, echocardiography, and radionuclide ventriculography were performed in the pretreatment phase and repeated after 2 and 6 months of medication use. RESULTS: A reduction in heart rate (p=0.016), as well as an increase in left ventricular shortening fraction (p=0.02) and in left ventricular ejection fraction (p=0.017), occurred in the group using carvedilol as compared with the group using placebo. CONCLUSION: Carvedilol added to the usual therapy for heart failure resulted in better heart function.

Relevance:

20.00%

Publisher:

Abstract:

In this project, numerical algorithms will be developed for nonlinear hyperbolic-parabolic systems of partial differential equations. Such systems have applications in wave propagation in aerospace and astrophysical settings. General objectives: 1) develop and improve numerical algorithms to increase the quality of simulations of the propagation and interaction of nonlinear gasdynamic and magnetogasdynamic waves; 2) develop computational codes to simulate high-enthalpy gasdynamic flows, including chemical changes and dispersive and diffusive effects; 3) develop computational codes to simulate ideal and real magnetogasdynamic flows; 4) apply the new algorithms and computational codes to the solution of the aerothermodynamic flow around bodies entering the Earth's atmosphere; 5) apply the new algorithms and computational codes to the simulation of the nonlinear dynamic behavior of magnetic arches in the solar corona; 6) develop new models to describe the nonlinear behavior of magnetic arches in the solar corona. The main objective of this project is to improve numerical algorithms for simulating the propagation and interaction of nonlinear waves in two gaseous media: those without free electric charge (gasdynamic flows) and those with free electric charge (magnetogasdynamic flows). At the same time, computational codes implementing the improved numerical techniques will be developed. The numerical algorithms will be applied to advance knowledge on topics of interest in aerospace engineering, such as the computation of the heat flux and aerothermodynamic forces borne by objects entering the Earth's atmosphere, and on astrophysical topics such as wave propagation and interaction, both for energy transfer and for the generation of instabilities in magnetic arches of the solar corona. These two topics share the numerical techniques and algorithms with which they will be treated. The ideal gasdynamic and magnetogasdynamic equations form hyperbolic systems of differential equations and can be solved using Riemann solvers together with the finite volume method (Toro 1999; Udrea 1999; LeVeque 1992 and 2005). The inclusion of diffusive effects makes the systems hyperbolic-parabolic; the parabolic contribution can be treated as source terms, handled either explicitly or implicitly (Udrea 1999; LeVeque 2005). To analyze the flow around bodies entering the atmosphere, the chemically reacting Navier-Stokes equations will be used as long as the temperature does not exceed 6000 K; at higher temperatures, ionization effects must be considered (Anderson, 1989). Both diffusive effects and chemical changes will be treated as source terms in the Euler equations. To treat wave propagation, energy transfer, and instabilities in magnetic arches of the solar corona, the ideal and real magnetogasdynamic equations will be used; in this case it will also be convenient to implement source terms for transport phenomena such as heat flux and radiation. The codes will use the finite volume technique, together with Total Variation Diminishing (TVD) schemes on structured and unstructured meshes.
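To hint at the numerics involved, here is a minimal finite-volume sketch for the simplest hyperbolic model problem, linear advection, with a minmod-limited (TVD) reconstruction; the project's codes target the full gasdynamic and magnetogasdynamic systems, so everything here (mesh, limiter choice, test pulse) is only illustrative.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter, the simplest TVD-preserving choice."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_tvd(u, a=1.0, cfl=0.9, steps=100):
    """Finite-volume update for u_t + a u_x = 0 (a > 0) on a periodic 1-D mesh,
    using piecewise-linear reconstruction with a minmod limiter."""
    dx = 1.0 / u.size
    dt = cfl * dx / a
    for _ in range(steps):
        du_left = u - np.roll(u, 1)              # backward differences
        du_right = np.roll(u, -1) - u            # forward differences
        slope = minmod(du_left, du_right)
        # upwind (a > 0) interface state at the right face of each cell
        u_face = u + 0.5 * (1.0 - a * dt / dx) * slope
        flux = a * u_face                        # flux[i] approximates F_{i+1/2}
        u = u - dt / dx * (flux - np.roll(flux, 1))
    return u

u0 = np.where(np.abs(np.linspace(0, 1, 200) - 0.3) < 0.1, 1.0, 0.0)  # square pulse
print(advect_tvd(u0).max())  # stays bounded by 1: no spurious oscillations
```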

Relevance:

20.00%

Publisher:

Abstract:

In our previous project, we approximated the computation of a definite integral with integrands of large functional variation. Our approach parallelizes the computation algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results were reported at several national and international conferences; they allowed us to begin a typification of the existing quadrature rules and a classification of some functions used as test functions. These classification and typification tasks have not been completed, so we intend to continue them in order to report on whether or not using our technique is advisable. To carry out this task, a base of test functions will be assembled and the range of quadrature rules used will be broadened. In addition, we propose to restructure the computation of some routines involved in computing the minimum energy of a molecule. This program already exists in a sequential version and is modeled using the LCAO approximation. It achieves successful results in terms of precision compared with similar international publications, but it requires significantly long computation times. Our proposal is to parallelize this algorithm on at least two levels: 1) decide whether it is better to distribute the computation of a single integral among several processors or to distribute different integrals among different processors; we must bear in mind that in parallel architectures based on networks (typically local area networks, LANs), the time spent sending messages between processors is very significant when measured in the number of arithmetic operations a processor can complete in the meantime; 2) if necessary, parallelize the computation of double and/or triple integrals. To develop our proposal, heuristics will be devised to verify and build models for the cases mentioned, aimed at improving the known computation routines, and the algorithms will be tested on benchmark cases. The methodology is the usual one in numerical analysis. Each proposal requires: a) implementing a computation algorithm, aiming at versions that improve on the existing ones; b) comparing against the existing routines to confirm or discard better numerical performance; c) carrying out theoretical error studies related to the method and to the implementation. An interdisciplinary team of researchers from both Computer Science and Mathematics was formed. Goals: to characterize quadrature rules by their effectiveness on functions with oscillatory behavior and with exponential decay, and to develop suitable, optimized computational implementations based on parallel architectures.
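For reference, the sketch below pairs the sequential building block, adaptive quadrature on the Simpson (Newton-Cotes) rule, with the first of the two parallelization levels mentioned above: splitting one integral into coarse panels distributed among processors. The integrand, tolerance, and panel count are hypothetical.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def simpson(f, a, b):
    """Closed Newton-Cotes rule of degree 2 (Simpson's rule) on [a, b]."""
    return (b - a) / 6.0 * (f(a) + 4.0 * f((a + b) / 2.0) + f(b))

def adaptive(f, a, b, tol=1e-9):
    """Classic adaptive Simpson: subdivide until the two-half estimate
    agrees with the whole-interval estimate to within the tolerance."""
    m = (a + b) / 2.0
    whole, left, right = simpson(f, a, b), simpson(f, a, m), simpson(f, m, b)
    if abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    return adaptive(f, a, m, tol / 2.0) + adaptive(f, m, b, tol / 2.0)

def f(x):  # integrand with large functional variation (hypothetical test case)
    return math.sin(50.0 * x) * math.exp(-x)

def chunk(args):
    a, b = args
    return adaptive(f, a, b)

if __name__ == "__main__":
    # Level 1 of the proposal: split one integral on [0, 1] into coarse
    # panels and distribute them among processors, each panel being
    # integrated with the adaptive Simpson routine above.
    edges = [i / 8.0 for i in range(9)]
    panels = list(zip(edges[:-1], edges[1:]))
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(chunk, panels))
    print(total)
```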

Relevance:

20.00%

Publisher:

Abstract:

As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is crucially important that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been placed on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
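The paper's own framework API is not shown in this abstract, so the following is only a sketch of the general pattern it argues for (all types and names are hypothetical): run each candidate algorithm over ground-truth test cases and report a quality score alongside wall-clock time.

```python
import time
from typing import Callable, Dict, List, Tuple

# Hypothetical types: an image is whatever the algorithms consume, and a
# test case pairs an input image with its expected (ground-truth) output.
Image = List[List[int]]
TestCase = Tuple[Image, Image]

def evaluate(algorithms: Dict[str, Callable[[Image], Image]],
             cases: List[TestCase],
             score: Callable[[Image, Image], float]) -> Dict[str, dict]:
    """Run every algorithm over every test case, recording a quality score
    against ground truth and the wall-clock time per case."""
    report = {}
    for name, algo in algorithms.items():
        scores, times = [], []
        for image, truth in cases:
            t0 = time.perf_counter()
            output = algo(image)
            times.append(time.perf_counter() - t0)
            scores.append(score(output, truth))
        report[name] = {"mean_score": sum(scores) / len(scores),
                        "mean_time_s": sum(times) / len(times)}
    return report

# Usage (hypothetical names): evaluate({"thresh_v1": my_threshold}, cases, pixel_accuracy)
```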

Relevance:

20.00%

Publisher:

Abstract:

Background: Ventricular and supraventricular premature complexes (PC) are frequent and usually symptomatic. According to a previous study, magnesium pidolate (MgP) administration to symptomatic patients can reduce PC density and improve symptoms. Objective: To assess the late follow-up of that clinical intervention in patients treated with MgP or placebo. Methods: In the first phase of the study, 90 symptomatic, consecutive patients with PC were randomized (double-blind) to receive either MgP or placebo for 30 days. Monthly follow-up visits were conducted for 15 months to assess symptoms and control electrolytes. 24-hour Holter monitoring was performed twice, regardless of symptoms, or whenever symptoms were present. In the second phase of the study, relapsing patients, who had received MgP or placebo (crossover) in the first phase, were treated with MgP according to the same protocol. Results: Of the 45 patients initially treated with MgP, 17 (37.8%) relapsed during the 15-month follow-up, with varying relapse times. Relapsing patients treated again had a statistically significant reduction in PC density of 138.25/hour (p < 0.001); the crossover patients reduced it by 247/hour (p < 0.001). Patients who did not relapse had a low PC frequency (3 PC/hour). Retreated patients had a 76.5% improvement in symptoms, and crossover patients, 71.4%. Conclusion: Some patients on MgP had relapse of symptoms and PC, indicating that MgP is neither a definitive nor a curative treatment over late follow-up. However, improvement in PC frequency and symptoms was observed in the second phase of treatment, similar to the response in the first phase of treatment.

Relevance:

20.00%

Publisher:

Abstract:

Background: Effective interventions to improve medication adherence are usually complex and expensive. Objective: To assess the impact of a low-cost intervention designed to improve medication adherence and clinical outcomes in post-discharge patients with CVD. Methods: A pilot RCT was conducted at a teaching hospital. The intervention was based on the four-item Morisky Medication Adherence Scale (MMAS-4). The primary outcome measure was medication adherence assessed using the eight-item MMAS at baseline and at 1 month post hospital discharge, and re-assessed 1 year after hospital discharge. Other outcomes included readmission and mortality rates. Results: 61 patients were randomized to intervention (n = 30) and control (n = 31) groups. The mean age of the patients was 61 years (SD 12.73), 52.5% were male, and 57.4% were married or living with a partner. The mean number of prescribed medications per patient was 4.5 (SD 3.3). Medication adherence was correlated with the intervention (p = 0.04): after 1 month, 48.4% of patients in the control group and 83.3% in the intervention group were considered adherent. However, this difference decreased after 1 year, when adherence was 34.8% and 60.9%, respectively. Readmission and mortality rates were related to low adherence in both groups. Conclusion: An intervention based on a validated patient self-report instrument for assessing adherence is a potentially effective method to improve adherent behavior and can be successfully used as a tool to guide adherence counseling in the clinical visit. However, a larger study is required to assess the real impact of the intervention on these outcomes.

Relevance:

20.00%

Publisher:

Abstract:

University of Magdeburg, Faculty of Mathematics, habilitation thesis, 2006

Relevance:

20.00%

Publisher:

Abstract:

This work describes a test tool that allows performance testing of different end-to-end available-bandwidth estimation algorithms, along with their different implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in the congestion control mechanism of high-performance reliable transport protocols. The main idea of this paper is to describe the options that provide an available-bandwidth estimation mechanism for high-speed data transport protocols, and to develop the basic functionality of such a test tool, with which it is possible to manage the test-application entities on all the testing hosts involved, aided by some middleware.
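The tool's actual interfaces are not given in this abstract; the sketch below merely assumes one plausible shape for the comparison harness (the probe commands and the emulated-path setup are hypothetical): run each candidate estimator repeatedly against a path with a known configured bandwidth and summarize accuracy and precision.

```python
import subprocess
import statistics
from dataclasses import dataclass
from typing import List

@dataclass
class Trial:
    tool_cmd: List[str]      # command that prints one estimate in Mbit/s
    true_mbps: float         # bandwidth configured on the emulated path

def run_trials(trials: List[Trial], repetitions: int = 5) -> dict:
    """Run each candidate estimator several times on an emulated path and
    summarize accuracy (relative error) and precision (spread of estimates).
    A real harness would deploy its agents on the test hosts via middleware;
    here each estimator is just a local command, as an assumption."""
    results = {}
    for t in trials:
        estimates = []
        for _ in range(repetitions):
            out = subprocess.run(t.tool_cmd, capture_output=True,
                                 text=True, check=True).stdout
            estimates.append(float(out.strip()))
        rel_err = [abs(e - t.true_mbps) / t.true_mbps for e in estimates]
        results[t.tool_cmd[0]] = {
            "mean_estimate": statistics.mean(estimates),
            "mean_rel_error": statistics.mean(rel_err),
            "stdev": statistics.pstdev(estimates),
        }
    return results

# Usage (hypothetical probe command): run_trials([Trial(["./abw_probe"], 100.0)])
```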