997 results for Randomized algorithm
Abstract:
Over the last two decades the results of randomized clinical studies, which are powerful aids for correctly assessing therapeutic strategies, have consolidated cardiological practice. In addition, scientifically interesting hypotheses have been generated through the results of epidemiological studies. Properly conducted randomized studies, free of systematic errors and with statistical power adequate for demonstrating moderate but clinically relevant benefits in relevant clinical outcomes, have provided reliable and strong results that alter clinical practice, thus providing adequate treatment for patients with cardiovascular disease (CVD). The dissemination and use of evidence-based medicine in treating coronary artery disease (CAD), heart failure (HF), and in prevention will prevent hundreds of thousands of deaths annually in developed and developing countries. CVD is responsible for approximately 12 million deaths annually throughout the world, and approximately 60% of these deaths occur in developing countries. During recent years, an increase in mortality and morbidity rates due to CVD has occurred in developing countries. This increase indicates that an epidemiological (demographic, economic, and health-related) transition is taking place in developing countries, and this transition implies a global epidemic of CVD that will require wide-ranging and globally effective strategies for prevention. The identification of conventional and emerging risk factors for CVD, as well as their management in high-risk individuals, has contributed to the decrease in the mortality rate due to CVD. Through national collaboration, several multicenter and multinational randomized and epidemiological studies have been carried out throughout Brazil, contributing not only to general scientific growth in different Brazilian hospitals but also to the consolidation of an increasingly evidence-based clinical practice.
Abstract:
OBJECTIVE: To assess the effects of carvedilol in patients with idiopathic dilated cardiomyopathy. METHODS: In a double-blind randomized placebo-controlled study, 30 patients (7 women) with functional class II and III heart failure were assessed. Their ages ranged from 28 to 66 years (mean of 43±9 years), and their left ventricular ejection fraction varied from 8% to 35%. Carvedilol was added to the usual therapy of 20 patients; placebo was added to the usual therapy of 10 patients. The initial dose of carvedilol was 12.5 mg, which was increased weekly until it reached 75 mg/day, according to the patient's tolerance. Clinical assessment, electrocardiogram, echocardiogram, and radionuclide ventriculography were performed in the pretreatment phase, being repeated after 2 and 6 months of medication use. RESULTS: A reduction in heart rate (p=0.016) as well as an increase in left ventricular shortening fraction (p=0.02) and in left ventricular ejection fraction (p=0.017) occurred in the group using carvedilol as compared with that using placebo. CONCLUSION: Carvedilol added to the usual therapy for heart failure resulted in better heart function.
Abstract:
Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures enable the acceleration of algorithms when the algorithms are properly parallelized and exploit the specific processing power of the underlying architecture. Most current general-purpose processors integrate several processor cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit; nowadays even desktop computers use multicore processors, and the industry trend is to increase the number of integrated processor cores as technology matures. Graphics Processor Units (GPU), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration: currently available GPUs can run from 200 to 400 parallel processing threads, and the programmability of newer GPUs, denoted General Processing Graphics Processor Units (GPGPU), makes scientific computing feasible on this hardware. However, GPGPUs offer little memory compared with that available to general-purpose processors, so the implementation of algorithms must be addressed carefully.
Finally, Field Programmable Gate Arrays (FPGA) are programmable devices that can implement hardware logic with low latency, high parallelism, and deep pipelines. These devices can be used to implement specific algorithms that must run at very high speeds. However, they are harder to program than software approaches, and debugging is typically time-consuming. In this context, where several alternatives for speeding up algorithms are available, our work aims at determining the main features of these architectures and developing the know-how required to accelerate algorithm execution on them. We look at identifying which algorithms fit best on a given architecture, and at combining the architectures so that they complement one another. In particular, we consider the degree of data dependence, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
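The abstract's central criterion, the degree of data dependence, is what decides how easily an algorithm maps onto SMP cores, GPU threads, or FPGA pipelines. A minimal Python sketch of the point (illustrative only; not code from the thesis, and the function names are invented here):

```python
# Illustrative sketch: iteration independence decides how well work
# distributes across parallel hardware.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def elementwise_square(x, n_workers=4):
    """No cross-iteration dependence: chunks can run on any worker,
    in any order, and the results concatenate back unchanged."""
    chunks = np.array_split(x, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        return np.concatenate(list(ex.map(np.square, chunks)))

def prefix_sum(x):
    """y[t] depends on y[t-1]: naive chunking breaks the result, so this
    recurrence must be restructured (e.g. as a parallel scan) before it
    can exploit SMP cores, GPU threads, or FPGA pipelines."""
    return np.cumsum(x)
```

The elementwise case splits trivially; the recurrence is the kind of dependence pattern that, per the abstract, determines which of the three hardware families is the better fit.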
Abstract:
Background: Ventricular and supraventricular premature complexes (PC) are frequent and usually symptomatic. According to a previous study, magnesium pidolate (MgP) administration to symptomatic patients can improve PC density and symptoms. Objective: To assess the late follow-up of that clinical intervention in patients treated with MgP or placebo. Methods: In the first phase of the study, 90 symptomatic and consecutive patients with PC were randomized (double-blind) to receive either MgP or placebo for 30 days. Monthly follow-up visits were conducted for 15 months to assess symptoms and monitor electrolytes. 24-hour Holter monitoring was performed twice, regardless of symptoms, or whenever symptoms were present. In the second phase of the study, relapsing patients, who had received MgP or placebo (crossover) in the first phase, were treated with MgP according to the same protocol. Results: Of the 45 patients initially treated with MgP, 17 (37.8%) relapsed during the 15-month follow-up, and the relapse time varied. Relapsing patients treated again had a statistically significant reduction in PC density of 138.25/hour (p < 0.001). The crossover patients reduced it by 247/hour (p < 0.001). Patients who did not relapse had a low PC frequency (3 PC/hour). Retreated patients had a 76.5% improvement in symptoms, and crossover patients, 71.4%. Conclusion: Some patients on MgP had relapse of symptoms and PC, indicating that MgP is neither a definitive nor a curative treatment in late follow-up. However, improvement in PC frequency and symptoms was observed in the second phase of treatment, similar to the response in the first phase of treatment.
Abstract:
Background: Effective interventions to improve medication adherence are usually complex and expensive. Objective: To assess the impact of a low-cost intervention designed to improve medication adherence and clinical outcomes in post-discharge patients with CVD. Method: A pilot RCT was conducted at a teaching hospital. The intervention was based on the four-item Morisky Medication Adherence Scale (MMAS-4). The primary outcome measure was medication adherence assessed using the eight-item MMAS at baseline, at 1 month post hospital discharge, and again 1 year after hospital discharge. Other outcomes included readmission and mortality rates. Results: 61 patients were randomized to intervention (n = 30) and control (n = 31) groups. The mean age of the patients was 61 years (SD 12.73), 52.5% were male, and 57.4% were married or living with a partner. The mean number of prescribed medications per patient was 4.5 (SD 3.3). Medication adherence was associated with the intervention (p = 0.04): after 1 month, 48.4% of patients in the control group and 83.3% in the intervention group were considered adherent. However, this difference decreased after 1 year, when adherence was 34.8% and 60.9%, respectively. Readmission and mortality rates were related to low adherence in both groups. Conclusion: The intervention, based on a validated patient self-report instrument for assessing adherence, is a potentially effective method to improve adherent behavior and can be successfully used as a tool to guide adherence counseling in the clinical visit. However, a larger study is required to assess the real impact of the intervention on these outcomes.
Abstract:
Background: Vascular remodeling, the dynamic dimensional change of a vessel in the face of stress, can assume different directions as well as magnitudes in atherosclerotic disease. Classical measurements rely on reference segments at a distance, risking inappropriate comparison between dissimilar vessel portions. Objective: To explore a new method for quantifying vessel remodeling, based on the comparison between a given target segment and its inferred normal dimensions. Methods: Geometric parameters and plaque composition were determined in 67 patients using three-vessel intravascular ultrasound with virtual histology (IVUS-VH). Coronary vessel remodeling at the cross-section (n = 27,639) and lesion (n = 618) levels was assessed using classical metrics and a novel analytic algorithm based on the fractional vessel remodeling index (FVRI), which quantifies the total change in arterial wall dimensions relative to the estimated normal dimension of the vessel. A prediction model was built to estimate the normal dimension of the vessel for calculation of FVRI. Results: According to the new algorithm, the "Ectatic" remodeling pattern was least common, "Complete compensatory" remodeling was present in approximately half of the instances, and "Negative" and "Incomplete compensatory" remodeling types were detected in the remainder. Compared to a traditional diagnostic scheme, FVRI-based classification seemed to better discriminate plaque composition by IVUS-VH. Conclusion: Quantitative assessment of coronary remodeling using target segment dimensions offers a promising approach to evaluate the vessel response to plaque growth/regression.
Abstract:
Magdeburg, University, Faculty of Computer Science, dissertation, 2015
Abstract:
The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to important speedups for the PEA. I provide example code for a simple model that can serve as a template for parallelization of more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
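The structure the note exploits can be sketched in a few lines of Python (illustrative only: a toy AR(1) process stands in for the model simulation, and the note's actual template distributes work across a cluster rather than local workers). Each simulation chunk gets its own seed and runs independently, and the results are concatenated before the NLS fitting step:

```python
# Hedged sketch of parallelizing the PEA's long-simulation step.
# ThreadPoolExecutor keeps the example self-contained; real speedups
# require processes or cluster nodes, as in the note's cluster setup.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(beta, n, seed):
    """One independent chunk of the long simulation (toy AR(1) stand-in)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = eps[0]
    for t in range(1, n):
        x[t] = beta * x[t - 1] + eps[t]
    return x

def parallel_simulate(beta, n_total, n_workers=4):
    """Run chunks concurrently, then concatenate for the NLS fit."""
    chunk = n_total // n_workers
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        futures = [ex.submit(simulate_chunk, beta, chunk, seed)
                   for seed in range(n_workers)]
        return np.concatenate([f.result() for f in futures])
```

Because each chunk is seeded independently, the parallel run reproduces exactly the concatenation of the serial chunk runs, which is what makes the simulation step embarrassingly parallel inside the PEA loop.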
Abstract:
Ma (1996) studied the random order mechanism, a matching mechanism suggested by Roth and Vande Vate (1990) for marriage markets. By means of an example he showed that the random order mechanism does not always reach all stable matchings. Although Ma's (1996) result is true, we show that the probability distribution he presented - and therefore the proof of his Claim 2 - is not correct. The mistake in the calculations by Ma (1996) is due to the fact that even though the example looks very symmetric, some of the calculations are not as "symmetric."
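For readers unfamiliar with the setting: a matching is stable when no man and woman both prefer each other to their assigned partners, and the random order mechanism reaches stability by satisfying blocking pairs as agents arrive in random order. As background only, here is a minimal sketch of the classical Gale-Shapley deferred-acceptance algorithm, which computes the man-optimal stable matching; this is not the random order mechanism itself, and the preference data are invented for illustration:

```python
# Background sketch: Gale-Shapley deferred acceptance (not the random
# order mechanism studied by Ma, 1996).  Each agent maps to a preference
# list ordered from most to least preferred.
def gale_shapley(men_pref, women_pref):
    # Precompute each woman's rank table for O(1) comparisons.
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_pref.items()}
    next_choice = {m: 0 for m in men_pref}   # next woman each man proposes to
    free, husband = list(men_pref), {}
    while free:
        m = free.pop()
        w = men_pref[m][next_choice[m]]
        next_choice[m] += 1
        if w not in husband:                      # w was single: accept
            husband[w] = m
        elif rank[w][m] < rank[w][husband[w]]:    # w trades up: old partner freed
            free.append(husband[w])
            husband[w] = m
        else:                                     # w rejects m
            free.append(m)
    return {m: w for w, m in husband.items()}
```

Deferred acceptance always returns the same (man-optimal) stable matching. The random order mechanism, by contrast, satisfies blocking pairs incrementally as agents arrive, so which stable matching it reaches depends on the arrival order - and computing that probability distribution correctly is exactly where the note locates Ma's (1996) error.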
Abstract:
OBJECTIVE: Tuberculosis (TB) is highly prevalent among HIV-infected people, including those receiving combination antiretroviral therapy (cART), necessitating a well tolerated and efficacious TB vaccine for these populations. We evaluated the safety and immunogenicity of the candidate TB vaccine M72/AS01 in adults with well controlled HIV infection on cART. DESIGN: A randomized, observer-blind, controlled trial (NCT00707967). METHODS: HIV-infected adults on cART in Switzerland were randomized 3 : 1 : 1 to receive two doses, 1 month apart, of M72/AS01, AS01 or 0.9% physiological saline (N = 22, N = 8 and N = 7, respectively) and were followed up to 6 months postdose 2 (D210). Individuals with CD4⁺ cell counts below 200 cells/μl were excluded. Adverse events (AEs) including HIV-specific and laboratory safety parameters were recorded. Cell-mediated (ICS) and humoral (ELISA) responses were evaluated before vaccination, 1 month after each dose (D30, D60) and at D210. RESULTS: Thirty-seven individuals [interquartile range (IQR) CD4⁺ cell counts at screening: 438-872 cells/μl; undetectable HIV-1 viremia] were enrolled; 73% of individuals reported previous BCG vaccination, and 97.3% tested negative on the QuantiFERON-TB assay. For M72/AS01 recipients, no vaccine-related serious AEs or cART-regimen adjustments were recorded, and there were no clinically relevant effects on laboratory safety parameters, HIV-1 viral loads or CD4⁺ cell counts. M72/AS01 was immunogenic, inducing persistent and polyfunctional M72-specific CD4⁺ T-cell responses [medians 0.70% (IQR 0.37-1.07) at D60 and 0.42% (0.24-0.61) at D210], predominantly CD40L⁺IL-2⁺TNF-α⁺, CD40L⁺IL-2⁺ and CD40L⁺IL-2⁺TNF-α⁺IFN-γ⁺. All M72/AS01 vaccinees were seropositive for anti-M72 IgG from the second vaccination until study end. CONCLUSION: M72/AS01 was clinically well tolerated and immunogenic in this population, supporting further clinical evaluation in HIV-infected individuals in TB-endemic settings.
Abstract:
Background: Motive-oriented therapeutic relationship (MOTR) was postulated to be a particularly helpful therapeutic ingredient in the early treatment phase of patients with personality disorders, in particular with borderline personality disorder (BPD). The present randomized controlled study using an add-on design is the first study to test this assumption in a 10-session general psychiatric treatment with patients presenting with BPD on symptom reduction and therapeutic alliance. Methods: A total of 85 patients were randomized. They were either allocated to a manual-based short variant of the general psychiatric management (GPM) treatment (in 10 sessions) or to the same treatment where MOTR was deliberately added to the treatment. Treatment attrition and integrity analyses yielded satisfactory results. Results: The results of the intent-to-treat analyses suggested a global efficacy of MOTR, in the sense of an additional reduction of general problems, i.e. symptoms, interpersonal and social problems (F(1, 73) = 7.25, p < 0.05). However, they also showed that MOTR did not yield an additional reduction of specific borderline symptoms. It was also shown that a stronger therapeutic alliance, as assessed by the therapist, developed in MOTR treatments compared to GPM (Z(55) = 0.99, p < 0.04). Conclusions: These results suggest that adding MOTR to psychiatric and psychotherapeutic treatments of BPD is promising. Moreover, the findings shed additional light on the perspective of shortening treatments for patients presenting with BPD. © 2014 S. Karger AG, Basel.
Abstract:
OBJECTIVES: To evaluate the outcome after Hartmann's procedure (HP) versus primary anastomosis (PA) with diverting ileostomy for perforated left-sided diverticulitis. BACKGROUND: The surgical management of left-sided colonic perforation with purulent or fecal peritonitis remains controversial. PA with ileostomy seems to be superior to HP; however, results in the literature are affected by a significant selection bias. No randomized clinical trial had yet compared the 2 procedures. METHODS: Sixty-two patients with acute left-sided colonic perforation (Hinchey III and IV) from 4 centers were randomized to HP (n = 30) or to PA (with diverting ileostomy, n = 32), with a planned stoma reversal operation after 3 months in both groups. Data were analyzed on an intention-to-treat basis. The primary end point was the overall complication rate. The study was discontinued following an interim analysis that found significant differences in relevant secondary end points as well as a decreasing accrual rate (NCT01233713). RESULTS: Patient demographics were equally distributed in both groups (Hinchey III: 76% vs 75% and Hinchey IV: 24% vs 25%, for HP vs PA, respectively). The overall complication rate for both resection and stoma reversal operations was comparable (80% vs 84%, P = 0.813). Although the outcome after the initial colon resection did not show any significant differences (mortality 13% vs 9% and morbidity 67% vs 75% in HP vs PA), the stoma reversal rate after PA with diverting ileostomy was higher (90% vs 57%, P = 0.005), and serious complications (Grades IIIb-IV: 0% vs 20%, P = 0.046), operating time (73 minutes vs 183 minutes, P < 0.001), hospital stay (6 days vs 9 days, P = 0.016), and in-hospital costs (US $16,717 vs US $24,014) of the reversal operation were significantly lower in the PA group. CONCLUSIONS: This is the first randomized clinical trial favoring PA with diverting ileostomy over HP in patients with perforated diverticulitis.
Abstract:
Vaniprevir (MK-7009) is a macrocyclic hepatitis C virus (HCV) nonstructural protein 3/4A protease inhibitor. The aim of the present phase II study was to examine virologic response rates with vaniprevir in combination with pegylated interferon alpha-2a (Peg-IFN-α-2a) plus ribavirin (RBV). In this double-blind, placebo-controlled, dose-ranging study, treatment-naïve patients with HCV genotype 1 infection (n = 94) were randomized to receive open-label Peg-IFN-α-2a (180 μg/week) and RBV (1,000-1,200 mg/day) in combination with blinded placebo or vaniprevir (300 mg twice-daily [BID], 600 mg BID, 600 mg once-daily [QD], or 800 mg QD) for 28 days, then open-label Peg-IFN-α-2a and RBV for an additional 44 weeks. The primary efficacy endpoint was rapid viral response (RVR), defined as undetectable plasma HCV RNA at week 4. Across all doses, vaniprevir was associated with a rapid two-phase decline in viral load, with HCV RNA levels approximately 3 log(10) IU/mL lower in vaniprevir-treated patients, compared to placebo recipients. Rates of RVR were significantly higher in each of the vaniprevir dose groups, compared to the control regimen (68.8%-83.3% versus 5.6%; P < 0.001 for all comparisons). There were numerically higher, but not statistically significant, early and sustained virologic response rates with vaniprevir, as compared to placebo. Resistance profile was predictable, with variants at R155 and D168 detected in a small number of patients. No relationship between interleukin-28B genotype and treatment outcomes was demonstrated in this study. The incidence of adverse events was generally comparable between vaniprevir and placebo recipients; however, vomiting appeared to be more common at higher vaniprevir doses. CONCLUSION: Vaniprevir is a potent HCV protease inhibitor with a predictable resistance profile and favorable safety profile that is suitable for QD or BID administration.
Abstract:
BACKGROUND: Screening of peripheral atherosclerosis is increasingly used, but few trials have examined its clinical impact. We aimed to assess whether carotid plaque screening helps smokers to improve their health behaviors and cardiovascular risk factors. METHODS: We randomly assigned 536 smokers aged 40 to 70 years to carotid plaque ultrasonographic screening (US group) vs no screening (control group) in addition to individual counseling and nicotine replacement therapy for all participants. Smokers with at least 1 plaque received pictures of their plaques with a 7-minute structured explanation. The outcomes included biochemically validated smoking cessation at 12 months (primary outcome) and changes in cardiovascular risk factor levels and Framingham risk score. RESULTS: At baseline, participants (mean age, 51.1 years; 45.0% women) smoked an average of 20 cigarettes per day with a median duration of 32 years. The US group had a high prevalence of carotid plaques (57.9%). At 12 months, smoking cessation rates were high, but did not differ between the US and control groups (24.9% vs 22.1%; P = .45). In the US group, cessation rates did not differ according to the presence or absence of plaques. Control of cardiovascular risk factors (ie, blood pressure and low-density lipoprotein cholesterol and hemoglobin A(1c) levels in diabetic patients) and mean absolute risk change in Framingham risk score did not differ between the groups. The mean absolute risk change in Framingham risk score was +0.6 in the US group vs +0.3 in the control group (P = .56). CONCLUSION: In smokers, carotid plaque screening performed in addition to thorough smoking cessation counseling is not associated with increased rates of smoking cessation or control of cardiovascular risk factors. Trial Registration clinicaltrials.gov Identifier: NCT00548665.