909 results for decreasing relative risk aversion


Relevance: 100.00%

Abstract:

Although the scientific literature has demonstrated the relevance of oral hygiene with chlorhexidine in preventing ventilator-associated pneumonia, there is wide variation in the concentrations, frequency and techniques used when applying the antiseptic. The aim of this research was to assess the best chlorhexidine concentration for oral hygiene to prevent ventilator-associated pneumonia. A systematic review followed by four meta-analyses using chlorhexidine concentration as the grouping criterion was carried out. Articles in English, Spanish or Portuguese indexed in the Cochrane, Embase, Lilacs, PubMed/Medline and Ovid electronic databases were selected. The search was carried out from May to June 2011. The primary outcome measure of interest was ventilator-associated pneumonia. Ten primary studies were divided into four groups (G1-G4) based on chlorhexidine concentration. G1 (5 primary studies, chlorhexidine 0.12%) showed homogeneity among studies, and the use of chlorhexidine represented a protective factor. G2 (3 primary studies, chlorhexidine 0.20%) showed heterogeneity among studies, and chlorhexidine did not represent a protective factor. G3 (2 primary studies, chlorhexidine 2.00%) showed homogeneity among studies, and the use of chlorhexidine was significant. G4 (all 10 primary studies with different chlorhexidine concentrations) showed homogeneity among studies, and the common relative risk was significant. Statistical analyses showed a protective effect of oral hygiene with chlorhexidine in preventing ventilator-associated pneumonia. However, it was not possible to identify a standard for establishing the optimal chlorhexidine concentration.
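Each of the abstract's four groups pools study-level relative risks into a common RR. As a minimal sketch of how such a pooled estimate can be obtained, the snippet below applies inverse-variance fixed-effect pooling of log relative risks to made-up 2x2 counts; the numbers are purely illustrative and are not taken from the review.

```python
import numpy as np

# Hypothetical per-study counts: events/total in the chlorhexidine and control arms.
studies = [
    # (events_chx, n_chx, events_ctrl, n_ctrl)
    (8, 100, 15, 100),
    (5, 80, 12, 85),
    (10, 120, 18, 118),
]

log_rr, weights = [], []
for a, n1, c, n0 in studies:
    rr = (a / n1) / (c / n0)
    var = 1 / a - 1 / n1 + 1 / c - 1 / n0   # large-sample variance of log(RR)
    log_rr.append(np.log(rr))
    weights.append(1 / var)

log_rr, weights = np.array(log_rr), np.array(weights)
pooled = np.sum(weights * log_rr) / np.sum(weights)   # inverse-variance pooling
se = np.sqrt(1 / np.sum(weights))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"common RR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```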

Relevance: 100.00%

Abstract:

Introduction: Candidemia in critically ill patients is usually a severe and life-threatening condition with a high crude mortality. Very few studies have focused on the impact of candidemia on ICU patient outcome, and the attributable mortality still remains controversial. This study was carried out to determine the attributable mortality of ICU-acquired candidemia in critically ill patients using propensity score matching analysis. Methods: A prospective observational study was conducted of all consecutive non-neutropenic adult patients admitted for at least seven days to 36 ICUs in Spain, France, and Argentina between April 2006 and June 2007. The probability of developing candidemia was estimated using a multivariate logistic regression model. Each patient with ICU-acquired candidemia was matched with two control patients using the nearest available Mahalanobis metric matching within calipers defined by the propensity score. Standardized difference tests (SDTs) for each variable before and after matching were calculated. Attributable mortality was determined by a modified Poisson regression model adjusted for those variables that still presented some imbalance, defined as an SDT > 10%. Results: Thirty-eight candidemias were diagnosed in 1,107 patients (34.3 episodes/1,000 ICU patients). Patients with and without candidemia had an ICU crude mortality of 52.6% versus 20.6% (P < 0.001) and a crude hospital mortality of 55.3% versus 29.6% (P = 0.01), respectively. In the propensity-matched analysis, the corresponding figures were 51.4% versus 37.1% (P = 0.222) and 54.3% versus 50% (P = 0.680). After controlling for residual confounding with the Poisson regression model, the relative risk (RR) of ICU- and hospital-attributable mortality from candidemia was 1.298 (95% confidence interval (CI) 0.88 to 1.98) and 1.096 (95% CI 0.68 to 1.69), respectively. Conclusions: ICU-acquired candidemia in critically ill patients is not associated with an increase in either ICU or hospital mortality.
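The "modified Poisson regression" mentioned here is the common technique of fitting a Poisson GLM to a binary outcome and using robust (sandwich) standard errors so that exponentiated coefficients estimate relative risks. The sketch below shows that idea on a simulated, entirely hypothetical matched data set; the variable names and numbers are assumptions, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical matched data: 'death' is the binary outcome, 'candidemia' the exposure,
# plus one covariate that remained imbalanced (standardized difference > 10%) after matching.
rng = np.random.default_rng(0)
n = 114                                  # e.g. 38 cases and 76 matched controls
candidemia = np.r_[np.ones(38), np.zeros(76)]
apache = rng.normal(20, 6, n)
death = rng.binomial(1, 0.3 + 0.1 * candidemia)

X = sm.add_constant(np.column_stack([candidemia, apache]))
# Modified Poisson regression: Poisson GLM with robust (sandwich) standard errors
fit = sm.GLM(death, X, family=sm.families.Poisson()).fit(cov_type="HC1")
rr = np.exp(fit.params[1])               # exp(beta) estimates the relative risk
ci = np.exp(fit.conf_int()[1])
print(f"RR for candidemia = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```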

Relevance: 100.00%

Abstract:

In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected in many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from the collected disease counts and the expected disease counts computed from reference population disease rates, an SMR is derived for each area as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the small population underlying the area or the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without however leaving the preliminary-study perspective that an analysis of SMR indicators is asked to keep. We implement control of the FDR, a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value based methods. Another peculiarity of the present work is to propose a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts of Bayesian disease mapping models, referring in particular to the Besag, York and Mollié (1991) model, often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of areas declared at high risk (where the null hypothesis is rejected) by averaging the corresponding b_i's. The FDR-hat can be used to provide a simple decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the FDR-hat does not exceed a prefixed value; we call these FDR-hat based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still providing an estimate of the relative risk values, as in the Besag, York and Mollié (1991) model. A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider FDR estimation in the sets formed by all areas whose b_i is below a threshold t. We show graphs of the FDR-hat and of the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between the FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the FDR-hat we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of these estimates in the high-risk areas (known by simulation), obtained both with our model and with the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in the scenarios with small areas, low risk levels and spatially correlated risks, which are our primary target. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low but specificity is high; in these scenarios the use of FDR-hat = 0.05 or FDR-hat = 0.10 based selection rules can be suggested. When the number of true alternative hypotheses (true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and FDR-hat = 0.15 based decision rules gain power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), which results in a loss of specificity of an FDR-hat = 0.05 based decision rule. In such scenarios FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 based decision rules cannot be recommended because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except for the non-small areas and large risk level scenarios. A case study is finally presented to show how the method can be used in epidemiology.
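The FDR-hat based selection rule described above reduces to a simple computation once the posterior null probabilities b_i are available: sort the areas by b_i, grow the rejection set while the running average of the b_i's stays below the target, and report that average as the estimated FDR. The sketch below implements this selection step only; it assumes the hierarchical model (e.g. the BYM model fitted by MCMC) has already produced the b_i's, and the toy probabilities are illustrative, not model output.

```python
import numpy as np

def fdr_selection(post_null_prob, target_fdr=0.05):
    """Select high-risk areas so that the estimated Bayesian FDR stays within target_fdr.

    post_null_prob holds, for each area, the posterior probability of the null hypothesis
    "absence of risk"; the estimated FDR of any rejection set is the average of these
    probabilities over the rejected areas.
    """
    p = np.asarray(post_null_prob, dtype=float)
    order = np.argsort(p)                                  # reject the most "non-null" areas first
    running_fdr = np.cumsum(p[order]) / np.arange(1, p.size + 1)
    k = int(np.searchsorted(running_fdr, target_fdr, side="right"))  # largest set with FDR-hat <= target
    rejected = np.zeros(p.size, dtype=bool)
    rejected[order[:k]] = True
    return rejected, (float(running_fdr[k - 1]) if k else 0.0)

# Toy posterior null probabilities for 8 areas (illustrative only)
flags, fdr_hat = fdr_selection([0.01, 0.03, 0.6, 0.9, 0.02, 0.4, 0.05, 0.8], target_fdr=0.05)
print(flags, fdr_hat)
```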

Relevance: 100.00%

Abstract:

Kidney transplantation is the best treatment option for the restoration of excretory and endocrine kidney function in patients with end-stage renal disease. The success of the transplant is linked to the genetic compatibility between donor and recipient and to progress in surgery and immunosuppressive therapy. Numerous studies have established the importance of innate immunity in transplantation tolerance; in particular, natural killer (NK) cells represent a population of cells involved in the defense against infectious agents and tumor cells. NK cells express on their surface the killer-cell immunoglobulin-like receptors (KIR) which, by recognizing and binding MHC class I antigens, prevent the killing of autologous cells. In the context of solid organ transplantation, and of the kidney in particular, recent studies show some correlation between KIR/HLA incompatibility and transplant outcome, which makes it an interesting perspective, especially as regards the setting of immunosuppressive therapy. The purpose of this study was therefore to assess whether the incompatibility between recipient KIR receptors and HLA class I ligands of the donor could be a useful predictor to improve the survival of the transplanted kidney and to select patients who might benefit from a reduced immunosuppressive regimen. One hundred and thirteen patients who received a renal transplant between 1999 and 2005 were enrolled. Genomic DNA was extracted from each patient and donor, and genotyping of HLA-A, -B, -C and of 14 KIR genes was carried out. Data analysis was conducted as two case-control studies: one aimed at assessing acute rejection and the other at assessing the long-term transplant outcome. The results showed that two genes, KIR2DS1 and KIR3DS1, are associated with the development of acute rejection (p = 0.02 and p = 0.05, respectively). The presence of the KIR2DS3 gene is associated with better serum creatinine and glomerular filtration rate (MDRD) values over time (4 and 5 years after transplantation, p < 0.05), whereas in the presence of its ligand the serum creatinine and MDRD trends seem to worsen in the long term. The analysis of the population according to whether renal function deteriorated in the long term showed that the absence of the KIR2DL1 gene is strongly associated with a 20% increase in the creatinine value at 5 years, with a relative risk of having a creatinine level above the 5-year median of 2.7 (95% CI: 1.7788-2.6631). Finally, in patients carrying KIR3DL2, a donor kidney negative for HLA-A3/A11, compared with a positive one, was associated with a relative risk of having a serum creatinine above the median at 5 years after transplantation of 0.6609 (95% CI: 0.4529-0.9643), suggesting a protective effect of the absence of this ligand.

Relevance: 100.00%

Abstract:

The formation of a market price for an asset can be understood as a superposition of the individual actions of the market participants, which cumulatively generate supply and demand. This is comparable to the emergence of macroscopic properties in statistical physics, which are brought about by microscopic interactions between the system components involved. The distribution of price changes in financial markets differs markedly from a Gaussian distribution. This gives rise to empirical peculiarities of the price process which, besides its scaling behaviour, include non-trivial correlation functions and temporally clustered volatility. The present work focuses on the analysis of financial market time series and the correlations they contain. A new method for quantifying pattern-based complex correlations of a time series is developed. With this methodology, significant evidence is found that typical behavioural patterns of financial market participants manifest themselves on short time scales: the reaction to a given price history is not purely random; rather, similar price histories provoke similar reactions. Building on the study of complex correlations in financial market time series, the question is addressed of which properties change at the transition from a positive to a negative trend. An empirical quantification by means of rescaling yields the result that, independently of the time scale considered, new price extrema are accompanied by an increase in transaction volume and a reduction of the time intervals between transactions. These dependencies exhibit characteristics that are also found in other complex systems in nature, and in physical systems in particular. Over nine orders of magnitude in time these properties are also independent of the market analysed: trends that persist for only seconds show the same characteristics as trends on time scales of months. This opens up the possibility of learning more about financial market bubbles and their collapses, since trends on small time scales occur far more frequently. In addition, a Monte Carlo based simulation of the financial market is analysed and extended in order to reproduce the empirical properties and to gain insight into their causes, which are to be sought partly in the market microstructure and partly in the risk aversion of the trading participants. For the computationally intensive procedures, a substantial reduction in computing time is achieved through parallelisation on a graphics card architecture. To demonstrate the broad range of applications of graphics cards, a standard model of statistical physics, the Ising model, is also ported to the graphics card, with significant runtime gains. Partial results of this work have been published in [PGPS07, PPS08, Pre11, PVPS09b, PVPS09a, PS09, PS10a, SBF+10, BVP10, Pre10, PS10b, PSS10, SBF+11, PB10].
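One way to picture the "pattern-based complex correlations" described above is to condition the next price move on the pattern formed by the preceding moves. The toy sketch below groups each move by the sign pattern of the previous d moves and reports the conditional mean of the following move; it only illustrates the general idea and is not the estimator developed in the thesis. On the uncorrelated random data used here the conditional means stay close to zero, as expected under the null.

```python
import numpy as np

rng = np.random.default_rng(2)
moves = rng.choice([-1, 1], size=10_000)   # toy sequence of up/down price moves
d = 3                                      # length of the conditioning pattern

# Group each move by the pattern of the d preceding moves
patterns = {}
for t in range(d, len(moves)):
    key = tuple(int(x) for x in moves[t - d:t])
    patterns.setdefault(key, []).append(int(moves[t]))

# Persistent deviations of the conditional mean from zero would indicate
# pattern-based (complex) correlations.
for key, nxt in sorted(patterns.items()):
    print(key, f"mean next move = {np.mean(nxt):+.3f} (n = {len(nxt)})")
```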

Relevance: 100.00%

Abstract:

Drug-induced liver damage is commonly referred to by the English term DILI (Drug-Induced Liver Injury). Paracetamol is the most common cause of DILI, followed by antibiotics, NSAIDs and antitubercular drugs. NSAIDs in particular are among the drug classes most widely used in therapy. Numerous case reports describe patients who developed fatal liver injury during treatment with NSAIDs, and many of these drugs have been withdrawn from the market following serious hepatic adverse reactions. The most recent signal of NSAID-induced hepatotoxicity concerns nimesulide; in some European countries, such as Finland, Spain and Ireland, nimesulide has been withdrawn from the market because of its association with a high frequency of hepatotoxicity. On the basis of the data available so far, the European Medicines Agency (EMA) has recently concluded that the benefits of the drug outweigh its risks; a possible increase in the risk of hepatotoxicity associated with nimesulide nevertheless remains an open and much debated question. Other drug classes that can cause acute liver injury, although with an incidence that is not always well defined, include antibiotics such as amoxicillin and the macrolides, statins, and antidepressants. The aim of the study was to determine the relative risk of liver injury induced by drugs with a prevalence of use of 6% or more in the Italian population. A case-control study was designed, conducted by interviewing patients admitted to wards of several Italian hospitals. Our study showed that drug-induced liver injury involves numerous pharmacological classes and that the reporting of such reactions was statistically significant for several active substances. Preliminary data showed a statistically significant odds ratio for nimesulide, NSAIDs, some antibiotics such as the macrolides, and paracetamol.
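Since this is a case-control study, the association measure is the odds ratio rather than the relative risk. As a minimal sketch, the snippet below computes an odds ratio and its Woolf (log-normal) confidence interval from a 2x2 exposure table; the counts are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical case-control counts: exposure to nimesulide among cases with
# acute liver injury and among hospital controls.
a, b = 24, 176   # cases: exposed, unexposed
c, d = 30, 570   # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)            # Woolf's method
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```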

Relevance: 100.00%

Abstract:

Three different studies were carried out to improve the safety and effectiveness of phenprocoumon therapy. First, an action-oriented interaction database for phenprocoumon was compiled on the basis of established databases and information sources on drug interactions (Drugdex, the ABDA database, the Marcumar® prescribing information, the coumarin interaction list of the Federatie van Nederlandse Trombosediensten, and a review of warfarin interactions). For this purpose, relevant information on potential interactions with a total of 375 drugs was compiled in a summary table. This table was reviewed by a three-member expert panel, and the potential interaction partners were assigned to five levels of severity and clinical relevance. For almost 50% of the potential interaction partners, no action was considered necessary. For the remaining potential interaction partners, actions for the clinical management of the interaction were defined depending on the temporal relationship with phenprocoumon intake. Subsequently, an observational study of 116 patients examined the association between the additional intake of potentially interacting drugs (classified in the newly developed database as "high" or "very high" severity) and the frequency of changes in the weekly phenprocoumon dose. The relative risk of a dose adjustment was significantly increased in patients in the interaction group (n = 23) (RR = 1.9; p < 0.001). Further potential influencing factors were advanced age (80-85 years: RR = 2; p < 0.05), multiple comorbidities (4 comorbidities: RR = 2.1; p < 0.05), and impaired renal (RR = 1.47; p > 0.05) and hepatic function (RR = 1.3; p > 0.05). To examine the quality of care of VKA patients at the Thrombosedienst Mainz, the data of 118 patients were analysed retrospectively. The quality parameters assessed were the percentage of INR values within the target range, the TTR (time in therapeutic range), the duration of LMWH therapy, the time to reach the target range, and the average interval between two monitoring visits. In the median, 73% of the measured INR values per patient were within the individual target range. The median TTR was 80%. Patients needed 7 days to reach the target range. LMWH therapy was administered over 8 days. Patients attended a monitoring appointment every 11 days in the median. Benchmarked against internationally published quality indicators for VKA therapy, the quality of care at the Thrombosedienst Mainz can be rated as very good.
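The TTR reported here is conventionally computed with the Rosendaal method, which interpolates INR values linearly between measurement days. The sketch below is a minimal implementation of that idea on an invented INR series; it assumes a target range of INR 2.0-3.0 and is not the Thrombosedienst's actual calculation.

```python
import numpy as np

def ttr_rosendaal(days, inr, low=2.0, high=3.0):
    """Percentage of time in therapeutic range, with INR values interpolated
    linearly between measurement days (Rosendaal method)."""
    in_range = total = 0.0
    for d0, d1, i0, i1 in zip(days[:-1], days[1:], inr[:-1], inr[1:]):
        span = d1 - d0
        daily_inr = np.linspace(i0, i1, int(span) + 1)   # interpolated INR for each day
        in_range += np.mean((daily_inr >= low) & (daily_inr <= high)) * span
        total += span
    return 100.0 * in_range / total

# Illustrative INR series (not patient data from the Thrombosedienst Mainz)
print(f"TTR = {ttr_rosendaal([0, 7, 18, 29], [1.8, 2.4, 3.4, 2.6]):.0f}%")
```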

Relevance: 100.00%

Abstract:

Objectives: The purpose of this study was to assess the impact of renal insufficiency (RI) on the distribution pattern of peripheral arterial disease (PAD). We hypothesised that RI is associated with a distally accentuated involvement of the peripheral arterial tree. Design: This is a retrospective analysis. Materials and Methods: The analysis was based on a consecutive series of 2709 patients with chronic PAD of atherosclerotic origin undergoing primary endovascular treatment of lower-extremity arteries. The atherosclerotic pattern was grouped into femoropopliteal (n = 2085) and infragenicular (n = 892) disease according to the target lesions treated, using iliac disease (n = 1133) as reference. Univariable and multivariable multinomial regression analyses were performed to assess the relation with RI. Results are shown as relative risk ratios (RRRs) with 95% confidence intervals (95% CIs). A p < 0.05 was considered statistically significant. RI was defined as a glomerular filtration rate (GFR) < 60 mL/min/1.73 m². Results: Presence of RI was an independent risk factor for a centrifugal lesion pattern (RRR 1.48, 95% CI: 1.17–1.86, p = 0.001). Moreover, a decrease in GFR by 10 mL/min/1.73 m² was associated with an RRR of 1.08 for below-the-knee arterial disease (95% CI: 1.03–1.13, p = 0.003). Conclusion: Presence and severity of RI are independent predictors of a distal obstructive pattern in patients with symptomatic PAD.
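Relative risk ratios from a multinomial model are the exponentiated coefficients of a multinomial logistic regression with one outcome category (here the iliac level) as reference. The sketch below shows how such RRRs are obtained with statsmodels on simulated placeholder data; the variables and any apparent effects are illustrative assumptions, not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: lesion level (0 = iliac reference, 1 = femoropopliteal, 2 = infragenicular)
# modeled against GFR (per 10 mL/min/1.73 m2) and age.
rng = np.random.default_rng(1)
n = 500
gfr = rng.normal(70, 20, n)
age = rng.normal(68, 10, n)
lesion = rng.integers(0, 3, n)   # placeholder outcome, purely illustrative

X = sm.add_constant(pd.DataFrame({"gfr_per_10": gfr / 10, "age": age}))
fit = sm.MNLogit(lesion, X).fit(disp=False)
rrr = np.exp(fit.params)         # one column of RRRs per non-reference outcome category
print(rrr)
```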

Relevance: 100.00%

Abstract:

Atrioventricular (AV) conduction impairment is well described after surgical aortic valve replacement, but little is known about it in patients undergoing transcatheter aortic valve implantation (TAVI). We assessed AV conduction and the need for a permanent pacemaker in patients undergoing TAVI with the Medtronic CoreValve Revalving System (MCRS) or the Edwards Sapien Valve (ESV). Sixty-seven patients without a pre-existing permanent pacemaker were included in the study. Forty-one patients (61%) and 26 patients (39%) underwent successful TAVI with the MCRS and ESV, respectively. Complete AV block occurred in 15 patients (22%), second-degree AV block in 4 (6%), and new left bundle branch block in 15 (22%). A permanent pacemaker was implanted in 23 patients (34%). Overall, the PR interval and QRS width increased significantly after the procedure (p < 0.001 for both comparisons). Implantation of the MCRS compared to the ESV resulted in a trend toward higher rates of new left bundle branch block and complete AV block (29% vs 12%, p = 0.09 for both comparisons). During follow-up, complete AV block resolved in 64% of patients. In multivariable regression analysis, pre-existing right bundle branch block was the only independent predictor of complete AV block after TAVI (relative risk 7.3, 95% confidence interval 2.4 to 22.2). In conclusion, TAVI is associated with impairment of AV conduction in a considerable proportion of patients; patients with pre-existing right bundle branch block are at increased risk of complete AV block, and complete AV block resolves over time in most patients.

Relevance: 100.00%

Abstract:

Background: Medication-related problems are common in the growing population of older adults, and inappropriate prescribing is a preventable risk factor. Explicit criteria such as the Beers criteria provide a valid instrument for describing the rate of inappropriate medication (IM) prescriptions among older adults. Objective: To reduce IM prescriptions based on explicit Beers criteria using a nurse-led intervention in a nursing-home (NH) setting. Study Design: The pre/post design included IM assessment at study start (pre-intervention), a 4-month intervention period, IM assessment after the intervention period (post-intervention) and a further IM assessment at 1-year follow-up. Setting: 204-bed inpatient NH in Bern, Switzerland. Participants: NH residents aged ≥60 years. Intervention: The intervention included four key elements: (i) adaptation of the Beers criteria to the Swiss setting; (ii) IM identification; (iii) IM discontinuation; and (iv) staff training. Main Outcome Measure: IM prescription at study start, after the 4-month intervention period and at 1-year follow-up. Results: The mean ± SD resident age was 80.3 ± 8.8 years. Residents were prescribed a mean ± SD of 7.8 ± 4.0 medications. The prescription rate of IMs decreased from 14.5% pre-intervention to 2.8% post-intervention (relative risk [RR] = 0.2; 95% CI 0.06, 0.5). The risk of IM prescription increased, although not statistically significantly, in the 1-year follow-up period compared with post-intervention (RR = 1.6; 95% CI 0.5, 6.1). Conclusions: This intervention to reduce IM prescriptions based on explicit Beers criteria was feasible, easy to implement in an NH setting, and resulted in a substantial decrease in IMs. These results underscore the importance of involving nursing staff in the medication prescription process in a long-term care setting.

Relevance: 100.00%

Abstract:

Background There is concern that non-inferiority trials might be deliberately designed to conceal that a new treatment is less effective than a standard treatment. In order to test this hypothesis we performed a meta-analysis of non-inferiority trials to assess the average effect of experimental treatments compared with standard treatments. Methods One hundred and seventy non-inferiority treatment trials published in 121 core clinical journals were included. The trials were identified through a search of PubMed (1991 to 20 February 2009). Combined relative risk (RR) from meta-analysis comparing experimental with standard treatments was the main outcome measure. Results The 170 trials contributed a total of 175 independent comparisons of experimental with standard treatments. The combined RR for all 175 comparisons was 0.994 [95% confidence interval (CI) 0.978–1.010] using a random-effects model and 1.002 (95% CI 0.996–1.008) using a fixed-effects model. Of the 175 comparisons, experimental treatment was considered to be non-inferior in 130 (74%). The combined RR for these 130 comparisons was 0.995 (95% CI 0.983–1.006) and the point estimate favoured the experimental treatment in 58% (n = 76) and standard treatment in 42% (n = 54). The median non-inferiority margin (RR) pre-specified by trialists was 1.31 [inter-quartile range (IQR) 1.18–1.59]. Conclusion In this meta-analysis of non-inferiority trials the average RR comparing experimental with standard treatments was close to 1. The experimental treatments that gain a verdict of non-inferiority in published trials do not appear to be systematically less effective than the standard treatments. Importantly, publication bias and bias in the design and reporting of the studies cannot be ruled out and may have skewed the study results in favour of the experimental treatments. Further studies are required to examine the importance of such bias.
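Pooling many comparisons under a random-effects model, as done here, is commonly carried out by combining log relative risks with DerSimonian-Laird weights. The snippet below is a compact sketch of that estimator applied to a few invented log-RRs and variances, not to the 175 comparisons of the meta-analysis.

```python
import numpy as np

def dersimonian_laird(log_rr, var):
    """Random-effects pooling of log relative risks (DerSimonian-Laird)."""
    log_rr, var = np.asarray(log_rr), np.asarray(var)
    w = 1 / var
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)                 # Cochran's Q
    df = len(log_rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (var + tau2)                             # random-effects weights
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return np.exp(pooled), np.exp([pooled - 1.96 * se, pooled + 1.96 * se])

# Illustrative per-comparison log RRs and variances (invented numbers)
rr, ci = dersimonian_laird([-0.02, 0.01, 0.05, -0.04], [0.003, 0.004, 0.002, 0.005])
print(f"pooled RR = {rr:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```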

Relevance: 100.00%

Abstract:

Background: During acute coronary syndromes patients perceive intense distress. We hypothesized that retrospective ratings of patients' MI-related fear of dying, helplessness, or pain, all assessed within the first year post-MI, are associated with poor cardiovascular outcome. Methods: We studied 304 patients (61 ± 11 years, 85% men) who, a median of 52 days (range 12-365 days) after the index MI, retrospectively rated the level of distress in the form of fear of dying, helplessness, or pain they had perceived at the time of MI on a numeric scale ranging from 0 ("no distress") to 10 ("extreme distress"). Non-fatal hospital readmissions due to cardiovascular disease (CVD)-related events (i.e., recurrent MI, elective and non-elective stent implantation, bypass surgery, pacemaker implantation, cerebrovascular incidents) were assessed at follow-up. The relative CVD event risk was computed for a (clinically meaningful) 2-point increase of distress using Cox proportional hazards models. Results: During a median follow-up of 32 months (range 16-45), 45 patients (14.8%) experienced a CVD-related event requiring hospital readmission. Greater fear of dying (HR 1.21, 95% CI 1.03-1.43), helplessness (HR 1.22, 95% CI 1.04-1.44), or pain (HR 1.27, 95% CI 1.02-1.58) was significantly associated with an increased CVD risk without adjustment for covariates. A similarly increased relative risk emerged in patients with an unscheduled CVD-related hospital readmission, i.e., when excluding patients with elective stenting (fear of dying: HR 1.26, 95% CI 1.05-1.51; helplessness: HR 1.26, 95% CI 1.05-1.52; pain: HR 1.30, 95% CI 1.01-1.66). In the fully adjusted models controlling for age, the number of diseased coronary vessels, hypertension, and smoking, the HRs were 1.24 (95% CI 1.04-1.46) for fear of dying, 1.26 (95% CI 1.06-1.50) for helplessness, and 1.26 (95% CI 1.01-1.57) for pain. Conclusions: Retrospectively perceived MI-related distress in the form of fear of dying, helplessness, or pain was associated with non-fatal cardiovascular outcomes independent of other important prognostic factors.
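The hazard ratios above are expressed per 2-point increase of the 0-10 distress scale. Given a per-unit coefficient from a Cox model, this is simply exp(2*beta), as the short sketch below illustrates; the coefficient and standard error are placeholder assumptions, not values from the study.

```python
import numpy as np

# A Cox model returns a per-unit log hazard ratio (beta) for the 0-10 distress score;
# the abstract reports hazard ratios per 2-point increase, i.e. exp(2 * beta).
beta_per_unit = 0.095   # hypothetical coefficient
se_per_unit = 0.04      # hypothetical standard error

hr_2pt = np.exp(2 * beta_per_unit)
ci_2pt = np.exp(2 * (beta_per_unit + np.array([-1.96, 1.96]) * se_per_unit))
print(f"HR per 2-point increase = {hr_2pt:.2f}, 95% CI {ci_2pt[0]:.2f}-{ci_2pt[1]:.2f}")
```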

Relevance: 100.00%

Abstract:

Background. Measles control may be more challenging in regions with a high prevalence of HIV infection. HIV-infected children are likely to derive particular benefit from measles vaccines because of an increased risk of severe illness. However, HIV infection can impair vaccine effectiveness and may increase the risk of serious adverse events after receipt of live vaccines. We conducted a systematic review to assess the safety and immunogenicity of measles vaccine in HIV-infected children. Methods. The authors searched 8 databases through 12 February 2009, as well as reference lists. Study selection and data extraction were conducted in duplicate. Meta-analysis was conducted when appropriate. Results. Thirty-nine studies published from 1987 through 2008 were included. Of the 19 studies with information about measles vaccine safety, more than half reported no serious adverse events. Among HIV-infected children, 59% (95% confidence interval [CI], 46–71%) were seropositive after receiving standard-titer measles vaccine at 6 months (1 study), comparable to the proportion of seropositive HIV-infected children vaccinated at 9 (8 studies) and 12 months (10 studies). Among HIV-exposed but uninfected and HIV-unexposed children, the proportion of seropositive children increased with increasing age at vaccination. Fewer HIV-infected children were protected after vaccination at 12 months than HIV-exposed but uninfected children (relative risk, 0.61; 95% CI, 0.50–0.73). Conclusions. Measles vaccines appear to be safe in HIV-infected children, but the evidence is limited. When the burden of measles is high, measles vaccination at 6 months of age is likely to benefit children of HIV-infected women, regardless of the child's HIV infection status.

Relevance: 100.00%

Abstract:

BACKGROUND Current guidelines give recommendations for preferred combination antiretroviral therapy (cART). We investigated factors influencing the choice of initial cART in clinical practice and its outcome. METHODS We analyzed treatment-naive adults with human immunodeficiency virus (HIV) infection participating in the Swiss HIV Cohort Study and starting cART from January 1, 2005, through December 31, 2009. The primary end point was the choice of the initial antiretroviral regimen. Secondary end points were virologic suppression, the increase in CD4 cell counts from baseline, and treatment modification within 12 months after starting treatment. RESULTS A total of 1957 patients were analyzed. Tenofovir-emtricitabine (TDF-FTC)-efavirenz was the most frequently prescribed cART (29.9%), followed by TDF-FTC-lopinavir/r (16.9%), TDF-FTC-atazanavir/r (12.9%), zidovudine-lamivudine (ZDV-3TC)-lopinavir/r (12.8%), and abacavir-lamivudine (ABC-3TC)-efavirenz (5.7%). Differences in prescription were noted among different Swiss HIV Cohort Study sites (P < .001). In multivariate analysis, compared with TDF-FTC-efavirenz, starting TDF-FTC-lopinavir/r was associated with prior AIDS (relative risk ratio, 2.78; 95% CI, 1.78-4.35), HIV-RNA greater than 100 000 copies/mL (1.53; 1.07-2.18), and CD4 greater than 350 cells/μL (1.67; 1.04-2.70); TDF-FTC-atazanavir/r with a depressive disorder (1.77; 1.04-3.01), HIV-RNA greater than 100 000 copies/mL (1.54; 1.05-2.25), and an opiate substitution program (2.76; 1.09-7.00); and ZDV-3TC-lopinavir/r with female sex (3.89; 2.39-6.31) and CD4 cell counts greater than 350 cells/μL (4.50; 2.58-7.86). At 12 months, 1715 patients (87.6%) had achieved a viral load of less than 50 copies/mL, and CD4 cell counts had increased by a median (interquartile range) of 173 (89-269) cells/μL. Virologic suppression was more likely with TDF-FTC-efavirenz, and the CD4 increase was higher with ZDV-3TC-lopinavir/r. No differences in outcome were observed among Swiss HIV Cohort Study sites. CONCLUSIONS Large differences in prescription but not in outcome were observed among study sites. A trend toward individualized cART was noted, suggesting that initial cART is significantly influenced by physician preference and patient characteristics. Our study highlights the need for evidence-based data for determining the best initial regimen for different HIV-infected persons.

Relevance: 100.00%

Abstract:

Lifestyle changes should be considered before anything else in patients with dyslipidemia, according to the new guidelines on dyslipidemias of the European Society of Cardiology (ESC) and the European Atherosclerosis Society (EAS). The guidelines recommend the SCORE system (Systematic Coronary Risk Estimation) to classify cardiovascular risk into four categories (very high, high, medium or low risk) as the basis for treatment decisions. HDL cholesterol, which is inversely related to cardiovascular risk, is included in the total risk estimation. In addition to calculating absolute risk, the guidelines contain a table of relative risk, which can be useful in young patients with a low absolute risk but a high risk compared to individuals of the same age group.