913 results for Risk ratio
Abstract:
OBJECTIVES: To evaluate the influence of genetic polymorphisms on the susceptibility to Candida colonization and intra-abdominal candidiasis, a blood culture-negative life-threatening infection in high-risk surgical ICU patients. DESIGN: Prospective observational cohort study. SETTING: Surgical ICUs from two University hospitals of the Fungal Infection Network of Switzerland. PATIENTS: Eighty-nine patients at high risk for intra-abdominal candidiasis (68 with recurrent gastrointestinal perforation and 21 with acute necrotizing pancreatitis). MEASUREMENTS AND MAIN RESULTS: Eighteen single-nucleotide polymorphisms in 16 genes previously associated with development of fungal infections were analyzed from patients' DNA by using an Illumina Veracode genotyping platform. Candida colonization was defined by recovery of Candida species from at least one nonsterile site by twice-weekly monitoring of cultures from oropharynx, stools, urine, skin, and/or respiratory tract. A corrected colonization index greater than or equal to 0.4 defined "heavy" colonization. Intra-abdominal candidiasis was defined by the presence of clinical symptoms and signs of peritonitis or intra-abdominal abscess and isolation of Candida species either in pure or mixed culture from intraoperatively collected abdominal samples. Single-nucleotide polymorphisms in three innate immune genes were associated with development of a Candida corrected colonization index greater than or equal to 0.4 (Toll-like receptor rs4986790, hazard ratio = 3.39; 95% CI, 1.45-7.93; p = 0.005) or occurrence of intra-abdominal candidiasis (tumor necrosis factor-α rs1800629, hazard ratio = 4.31; 95% CI, 1.85-10.1; p = 0.0007; β-defensin 1 rs1800972, hazard ratio = 3.21; 95% CI, 1.36-7.59; p = 0.008). CONCLUSION: We report a strong association between the promoter rs1800629 single-nucleotide polymorphism in tumor necrosis factor-α and an increased susceptibility to intra-abdominal candidiasis in a homogeneous prospective cohort of high-risk surgical ICU patients. This finding highlights the relevance of the tumor necrosis factor-α functional polymorphism in the immune response to fungal pathogens. Immunogenetic profiling in patients at high clinical risk, followed by targeted antifungal interventions, may improve the prevention or preemptive management of this life-threatening infection.
Abstract:
A high heart rate (HR) predicts future cardiovascular events. We explored the predictive value of HR in patients with high-risk hypertension and examined whether blood pressure reduction modifies this association. The participants were 15,193 patients with hypertension enrolled in the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) trial and followed up for 5 years. HR was assessed from electrocardiographic recordings obtained annually throughout the study period. The primary end point was the interval to cardiac events. After adjustment for confounders, the hazard ratio of the composite cardiac primary end point for a 10-beats/min increment in baseline HR was 1.16 (95% confidence interval 1.12 to 1.20). Compared to the lowest HR quintile, the adjusted hazard ratio in the highest quintile was 1.73 (95% confidence interval 1.46 to 2.04). Compared to the pooled lower quintiles of baseline HR, the annual incidence of the primary end point in the top baseline quintile was greater in each of the 5 study years (all p <0.05). The adjusted hazard ratio for the primary end point in the highest in-trial HR quintile versus the lowest quintile was 1.53 (95% confidence interval 1.26 to 1.85). The incidence of primary end points in the highest in-trial HR quintile compared to the pooled 4 lower quintiles was 53% greater in patients with well-controlled blood pressure (p <0.001) and 34% greater in those with uncontrolled blood pressure (p = 0.002). In conclusion, an increased HR is a long-term predictor of cardiovascular events in patients with high-risk hypertension. This effect was not modified by good blood pressure control. It is not yet known whether a therapeutic reduction of HR would improve cardiovascular prognosis.
Abstract:
In recent years, numerous studies have demonstrated the toxic effects of organic micropollutants on the species of our lakes and rivers. Most of these studies, however, have focused on the toxicity of individual substances, whereas organisms are exposed every day to thousands of substances in mixture, and the effects of these cocktails are far from negligible. This doctoral thesis therefore examined models for predicting the environmental risk of such cocktails to the aquatic environment. The main objective was to assess the ecological risk of the mixtures of chemical substances measured in Lake Geneva, and also to take a critical look at the methodologies used, in order to propose adaptations for a better estimation of the risk. In the first part of this work, the risk of pesticide and pharmaceutical mixtures for the Rhône and for Lake Geneva was established using approaches envisioned, in particular, in European legislation. These are screening approaches, i.e. they provide a general assessment of mixture risk. Such an approach makes it possible to identify the most problematic substances, that is, those contributing most to the toxicity of the mixture; in our case these were essentially four pesticides. The study also shows that all substances, even at trace levels, contribute to the effect of the mixture. This observation has implications for environmental management: it implies that all sources of pollutants must be reduced, not only the most problematic ones. However, the proposed approach also presents an important conceptual bias, which makes its use questionable beyond screening and would require an adaptation of the assessment factors employed. In a second part, the study focused on the use of mixture models in environmental risk calculation. Mixture models were developed and validated species by species, not for an assessment of the ecosystem as a whole. Their use should therefore proceed species by species, which is rarely done owing to the lack of available ecotoxicological data. The aim was thus to compare, using randomly generated values, the risk calculated with a rigorous species-by-species method against the risk calculated in the classical way, where the models are applied to the whole community without accounting for inter-species variation. In the majority of cases the results are similar, which validates the traditionally used approach. However, this work identified certain cases in which the classical application can lead to an under- or overestimation of the risk. Finally, the last part of this thesis examined the influence that micropollutant cocktails may have had on communities in situ. To this end, a two-step approach was adopted. First, the toxicity of fourteen herbicides detected in Lake Geneva was determined. Over the period studied, from 2004 to 2009, this herbicide toxicity decreased from 4% of species affected to less than 1%. The question was then whether this decrease in toxicity had an impact on the development of certain species within the algal community. Statistical analysis made it possible to isolate other factors that may influence the flora, such as water temperature or the presence of phosphates, and thus to identify which species turned out to have been influenced, positively or negatively, by the decrease in toxicity in the lake over time. Interestingly, some of these species had already shown similar behaviour in mesocosm studies. In conclusion, this work shows that robust models exist for predicting the risk of micropollutant mixtures to aquatic species, and that they can be used to explain the role of these substances in the functioning of ecosystems. These models nevertheless have limits and underlying assumptions that it is important to consider when applying them.

For several years now, scientists as well as society at large have been concerned about the risk that organic micropollutants may pose to the aquatic environment. Indeed, several studies have shown the toxic effects these substances can induce in organisms living in our lakes and rivers, especially when they are exposed to acute or chronic concentrations. However, most of these studies focused on the toxicity of single compounds, i.e. considered individually. The same holds in the current European regulations concerning the environmental risk assessment procedures for these substances. But aquatic organisms are typically exposed every day and simultaneously to thousands of organic compounds, and the toxic effects resulting from these "cocktails" cannot be neglected. The ecological risk assessment of mixtures of such compounds therefore has to be addressed by scientists in the most reliable and appropriate way. In the first part of this thesis, the procedures currently envisioned for aquatic mixture risk assessment in European legislation are described. These methodologies are based on the mixture model of concentration addition and the use of predicted no-effect concentrations (PNEC) or effect concentrations (EC50) with assessment factors. These principal approaches were applied to two specific case studies, Lake Geneva and the River Rhône in Switzerland, and the outcomes of these applications are discussed. These first-level assessments showed that the mixture risks for the studied cases rapidly exceeded the critical value. This exceedance is generally due to two or three main substances. The proposed procedures therefore allow the identification of the most problematic substances, for which management measures, such as a reduction of their input into the aquatic environment, should be envisioned. However, it was also shown that the risk levels associated with mixtures of compounds are not negligible even without considering these main substances: it is the sum of the substances that is problematic, which is more challenging in terms of risk management. Moreover, a lack of reliability in the procedures was highlighted, which can lead to contradictory results in terms of risk. This result is linked to the inconsistency of the assessment factors applied in the different methods. In the second part of the thesis, the reliability of more advanced procedures to predict the mixture effect on communities in the aquatic system was investigated. These established methodologies combine the model of concentration addition (CA) or response addition (RA) with species sensitivity distribution curves (SSD). Mixture effect predictions, however, have been shown to be consistent only when the mixture models are applied to a single species, and not to several species aggregated simultaneously into an SSD. Hence, a more stringent procedure for mixture risk assessment is proposed: apply the CA or RA model first to each species separately and, in a second step, combine the results to build an SSD for the mixture. Unfortunately, this methodology is not applicable in most cases, because it requires large data sets that are usually not available. Therefore, the differences between the two methodologies were studied with artificially generated data sets to characterize the robustness of the traditional approach of applying the models directly to species sensitivity distributions. The results showed that applying CA directly to SSDs may lead to underestimation of the mixture concentration affecting 5% or 50% of species, especially when the substances have a large standard deviation in their species sensitivity distribution. The application of RA can lead to over- or underestimation, depending mainly on the slope of the dose-response curves of the individual species. The potential underestimation with RA becomes important when the ratio between the EC50 and the EC10 of the dose-response curves of the species composing the SSD is smaller than 100. However, considering common real-world ecotoxicity data for substances, the mixture risk calculated by applying the mixture models directly to SSDs remains consistent and would rather slightly overestimate the risk. These results can be used as a theoretical validation of the currently applied methodology. Nevertheless, when assessing the risk of mixtures with this classical methodology, one has to keep in mind this source of error, especially when the SSDs present a distribution of the data outside the range determined in this study. Finally, in the last part of this thesis, we confronted the mixture effect predictions with biological changes observed in the environment. In this study, long-term monitoring of a large European lake, Lake Geneva, provided the opportunity to assess to what extent the predicted toxicity of herbicide mixtures explains the changes in the composition of the phytoplankton community, next to other classical limnological parameters such as nutrients. To reach this goal, the mixture toxicity of 14 herbicides regularly detected in the lake was calculated over several years, using the concentration addition and response addition models. A decreasing temporal gradient of toxicity was observed from 2004 to 2009. Redundancy analysis and partial redundancy analysis showed that this gradient explains a significant portion of the variation in phytoplankton community composition, even after removing the effect of all other co-variables. Moreover, some species that were found to be influenced, positively or negatively, by the decrease in toxicity in the lake over time showed similar behaviour in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. To conclude, different methods exist to predict the risk of mixtures in ecosystems, but their reliability varies depending on the underlying hypotheses. One should therefore carefully consider these hypotheses, as well as the limits of the approaches, before using the results for environmental risk management.
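The contrast drawn above between applying the concentration addition model directly to an SSD and applying it species by species can be made concrete with a small simulation. The Python sketch below is only an illustration under stated assumptions: the substance names, concentrations and SSD parameters are hypothetical, species sensitivities are drawn independently across substances, and the calculation is a simplified stand-in for the thesis's actual procedures, not a reproduction of them.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

n_species = 1000
exposure = {                 # hypothetical measured concentrations (µg/L)
    "herbicide_A": 0.5,
    "herbicide_B": 0.2,
    "herbicide_C": 0.1,
}
ssd_params = {               # hypothetical SSDs: mean and sd of log10(EC50) across species
    "herbicide_A": (1.0, 0.6),
    "herbicide_B": (0.5, 0.9),
    "herbicide_C": (1.5, 0.4),
}

# (1) "Traditional" shortcut: apply concentration addition directly to the SSDs.
# Sum hazard units relative to each substance's median species sensitivity (HC50),
# then read the potentially affected fraction (PAF) off a pooled SSD.
hazard_units = sum(conc / 10 ** mu
                   for conc, (mu, _) in zip(exposure.values(), ssd_params.values()))
pooled_sd = np.mean([sd for _, sd in ssd_params.values()])
paf_direct = norm.cdf(np.log10(hazard_units) / pooled_sd)

# (2) More stringent route: apply concentration addition species by species.
# Draw one EC50 per species and substance, compute each species' toxic units for
# the mixture, and count the species with TU >= 1 (affected at the EC50 level).
toxic_units = np.zeros(n_species)
for name, conc in exposure.items():
    mu, sd = ssd_params[name]
    ec50 = 10 ** rng.normal(mu, sd, n_species)
    toxic_units += conc / ec50
paf_per_species = np.mean(toxic_units >= 1.0)

print(f"PAF, CA applied directly to the SSDs : {paf_direct:.3f}")
print(f"PAF, CA applied species by species   : {paf_per_species:.3f}")

With these illustrative numbers the two routes give similar affected fractions, which mirrors the conclusion that the shortcut is usually acceptable; widening the spread of one SSD or steepening the species dose-response curves is where the two estimates start to diverge.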
Abstract:
BACKGROUND: The risk of falls is the most commonly cited reason for not providing oral anticoagulation, although the risk of bleeding associated with falls on oral anticoagulants is still debated. We aimed to evaluate whether patients on oral anticoagulation who are at high risk of falls have an increased risk of major bleeding. METHODS: We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants. The outcome was the time to a first major bleed within a 12-month follow-up period; analyses were adjusted for age, sex, alcohol abuse, number of drugs, concomitant treatment with antiplatelet agents, and history of stroke or transient ischemic attack. RESULTS: Among the 515 enrolled patients, 35 had a first major bleed during follow-up (incidence rate: 7.5 per 100 patient-years). Overall, 308 patients (59.8%) were at high risk of falls, and these patients had a nonsignificantly higher crude incidence rate of major bleeding than patients at low risk of falls (8.0 vs 6.8 per 100 patient-years, P=.64). In multivariate analysis, a high risk of falls was not statistically significantly associated with the risk of a major bleed (hazard ratio 1.09; 95% confidence interval, 0.54-2.21). Overall, only 3 major bleeds occurred directly after a fall (incidence rate: 0.6 per 100 patient-years). CONCLUSIONS: In this prospective cohort, patients on oral anticoagulants at high risk of falls did not have a significantly increased risk of major bleeds. These findings suggest that being at risk of falls is not a valid reason to avoid oral anticoagulants in medical patients.
Abstract:
Background: There is little information regarding the health status of migrants compared to subjects who remained in their country of origin. The aim was to compare Portuguese living in Porto (Portugal) with Portuguese migrants living in Lausanne (Switzerland). Design: Cross-sectional studies conducted in Porto (EpiPorto, n=1150) and Lausanne (CoLaus, n=388) among Portuguese subjects aged between 35 and 65 years. Methods: Body mass index, blood pressure, cholesterol and glucose levels were assessed using standardized procedures. Educational level and antihypertensive, hypocholesterolemic and antidiabetic treatments were collected using questionnaires. Results: Portuguese living in Lausanne were younger, more frequently male and had a lower educational level than Portuguese living in Porto. After multivariate adjustment, Portuguese living in Porto had a higher likelihood of being obese [odds ratio and 95% confidence interval: 1.40 (1.01-1.94)] or abdominally obese [OR: 1.40 (1.02-1.93)] than Portuguese living in Lausanne. Portuguese living in Porto had a higher likelihood of being hypertensive than Portuguese living in Lausanne [OR: 1.38 (1.01-1.90)], while no differences were found regarding hypertension management and control. Portuguese living in Porto had a higher likelihood of being hypercholesterolemic [OR: 1.40 (1.06-1.85)] and were less likely to be treated [OR: 0.47 (0.27-0.83)] and controlled [OR: 0.47 (0.27-0.83)] than Portuguese living in Lausanne. Finally, no differences were found regarding smoking or the prevalence and management of diabetes. Conclusion: Portuguese living in Lausanne, Switzerland, present a better cardiovascular risk profile and tend to be better managed regarding their cardiovascular risk factors than Portuguese living in Porto, Portugal.
Abstract:
QUESTIONS UNDER STUDY: Iron deficiency with or without anaemia is the most common deficiency in the world. Its prevalence is higher in developing countries and in low socioeconomic populations. We aimed to determine and compare the prevalence of iron deficiency in an immigrant and a non-immigrant population. METHODS: Every child scheduled for a routine check-up at 12 months of age was allowed to participate in the study. Haemoglobin, ferritin, anthropometric data, and familial and nutritional status were assessed. RESULTS: 586 infants were eligible and 463 were included in the study, as they had assessment data at 12 months. Children were divided into two groups: children of immigrants and children of non-immigrants. The overall prevalence of iron deficiency was 5.7% at 12 months. A significant difference in iron deficiency was noted between the groups at 12 months (p = 0.01). Among risk factors, immigration (odds ratio 2.91; 95% CI 1.05-8.04) and unemployment (odds ratio 6.08; 95% CI 1.18-31.30) had the highest odds in the multivariable analysis. CONCLUSION: The prevalence of iron deficiency in the immigrant population is higher than in non-immigrants. Immigration and the category of employment are risk factors for iron deficiency, whereas starting baby cereals before 9 months is a protective factor. Good socioeconomic conditions in Switzerland and the quality of food for pregnant women and young infants may be the explanation. A study up to five years of age is necessary before drawing general conclusions on infancy.
Abstract:
OBJECTIVE: To investigate HIV-related immunodeficiency as a risk factor for hepatocellular carcinoma (HCC) among persons infected with HIV, while controlling for the effect of frequent coinfection with hepatitis C and B viruses. DESIGN: A case-control study nested in the Swiss HIV Cohort Study. METHODS: Twenty-six HCC patients were identified in the Swiss HIV Cohort Study or through linkage with Swiss Cancer Registries, and were individually matched to 251 controls according to Swiss HIV Cohort Study centre, sex, HIV-transmission category, age and year at enrollment. Odds ratios and corresponding confidence intervals were estimated by conditional logistic regression. RESULTS: All HCC patients were positive for hepatitis B surface antigen or antibodies against hepatitis C virus. HCC patients included 14 injection drug users (three positive for hepatitis B surface antigen and 13 for antibodies against hepatitis C virus) and 12 men having sex with men/heterosexual/other (11 positive for hepatitis B surface antigen, three for antibodies against hepatitis C virus), revealing a strong relationship between HIV transmission route and hepatitis viral type. Latest CD4+ cell count [Odds ratio (OR) per 100 cells/µl decrease = 1.33, 95% confidence interval (CI) 1.06-1.68] and CD4+ cell count percentage (OR per 10% decrease = 1.65, 95% CI 1.01-2.71) were significantly associated with HCC. The effects of CD4+ cell count were concentrated among men having sex with men/heterosexual/other rather than injecting drug users. Highly active antiretroviral therapy use was not significantly associated with HCC risk (OR for ever versus never = 0.59, 95% confidence interval 0.18-1.91). CONCLUSION: Lower CD4+ cell counts increased the risk for HCC among persons infected with HIV, an effect that was particularly evident for hepatitis B virus-related HCC arising in non-injecting drug users.
Abstract:
BACKGROUND & AIMS: Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS: We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years old). Stricture severity was graded based on the degree of difficulty associated with passing of the standard adult endoscope. RESULTS: The median delay in diagnosis of EoE was 6 years (interquartile range, 2-12 years). With increasing duration of delay in diagnosis, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio = 1.08; 95% confidence interval: 1.040-1.122; P < .001). CONCLUSIONS: The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize delay in diagnosis of EoE.
Abstract:
The present paper focuses on the analysis and discussion of a likelihood ratio (LR) development for propositions at a hierarchical level known in this context as the 'offence level'. Existing literature on the topic has considered LR developments for so-called offender-to-scene transfer cases. These settings involve, in their simplest form, a single stain found at a crime scene, but with possible uncertainty about the degree to which that stain is relevant (i.e. whether it was left by the offender). Extensions to multiple stains or multiple offenders have also been reported. The purpose of this paper is to discuss the development of an LR for offence-level propositions when the case settings involve potential transfer in the opposite direction, i.e. victim/scene-to-offender transfer. This setting has not previously been considered. The rationale behind the proposed LR is illustrated through graphical probability models (i.e. Bayesian networks). The role of various uncertain parameters is investigated through sensitivity analyses as well as simulations.
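To make the offence-level idea concrete, the short Python sketch below works through the classic, previously published offender-to-scene case that the abstract cites as prior work (a single stain of uncertain relevance), not the victim/scene-to-offender development of the paper itself. Here r is the probability that the stain is relevant (left by the offender) and gamma is the random-match probability of the observed profile; both values are hypothetical, and the loop mimics, in a crude way, the kind of sensitivity analysis over uncertain parameters mentioned above.

def offence_level_lr(r: float, gamma: float) -> float:
    """LR for a single crime-scene stain whose relevance (probability r of having
    been left by the offender) is uncertain; gamma is the random-match probability."""
    p_match_given_hp = r * 1.0 + (1.0 - r) * gamma   # relevant: certain match; otherwise chance match
    p_match_given_hd = gamma                          # a non-offender matches only by chance
    return p_match_given_hp / p_match_given_hd        # simplifies to r/gamma + (1 - r)

gamma = 1e-6                                          # hypothetical random-match probability
for r in (1.0, 0.9, 0.5, 0.1, 0.0):                   # sensitivity of the LR to the relevance term
    print(f"r = {r:.1f}: LR = {offence_level_lr(r, gamma):,.0f}")

When relevance is certain (r = 1) the LR reduces to 1/gamma, and it degrades gracefully toward 1 as the stain becomes less plausibly connected to the offence, which is exactly the kind of behaviour the Bayesian-network formulation encodes more generally.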
Abstract:
A hospital-based case-control study of 86 cases of thyroid cancer and 317 controls was carried out in the Swiss Canton of Vaud. Patients with thyroid cancer tended to be better educated (odds ratio [OR] 2.1 for ≥14 vs. ≤8 years of education; 95% CI 1.1-4.1) and of higher social class than controls. Cases more often had a history of benign thyroid nodules (OR 25.2, 95% CI 7.6-83.6) and non-toxic goitre (OR 5.3, 95% CI 2.5-11.2). Furthermore, patients with thyroid cancer were more likely to have resided in endemic goitre areas (OR 1.7, 95% CI 1.0-3.0) and to have had first-degree relatives affected by benign thyroid disease (OR 3.9, 95% CI 2.1-7.1). This study therefore offers quantitative evidence of the association between various thyroid diseases and the risk of thyroid cancer which, despite difficulties in the classification of benign and malignant thyroid diseases, is remarkably consistent across studies from different countries.
Abstract:
The method of instrumental variables (referred to as Mendelian randomization when the instrument is a genetic variant) was initially developed to infer the causal effect of a risk factor on an outcome of interest in a linear model. Adapting this method to nonlinear models, however, is known to be problematic. In this paper, we consider the simple case where the genetic instrument, the risk factor, and the outcome are all binary. We compare via simulations the usual two-stage estimate of a causal odds ratio and its adjusted version with an estimate recently proposed in the context of a clinical trial with noncompliance. In contrast to the former two, we confirm that the latter is (under some conditions) a valid estimate of a causal odds ratio defined in the subpopulation of compliers, and we propose its use in the context of Mendelian randomization. By analogy with a clinical trial with noncompliance, compliers are those individuals for whom the presence/absence of the risk factor X is determined by the presence/absence of the genetic variant Z (i.e., for whom we would observe X = Z whatever the alleles randomly received at conception). We also recall and illustrate the huge variability of instrumental variable estimates when the instrument is weak (i.e., with a low percentage of compliers, as is typically the case with genetic instruments, for which this proportion is frequently smaller than 10%), where the interquartile range of our simulated estimates was up to 18 times higher than with a conventional (e.g., intention-to-treat) approach. We thus conclude that the need to find stronger instruments is probably as important as the need to develop a methodology allowing consistent estimation of a causal odds ratio.
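A small simulation in the spirit of this comparison is sketched below. All parameter values are hypothetical and the code is not the paper's methodology: it generates a binary instrument Z, a binary risk factor X that follows Z only for compliers, and a binary outcome Y confounded by an unmeasured U, then contrasts the confounded observational odds ratio with the usual two-stage estimate, written here in its closed-form Wald-type version for a binary instrument (difference in outcome log-odds across Z divided by the difference in exposure prevalence). Consistent with the abstract's point, the two-stage estimate need not recover the conditional log-OR used in the simulation, and its variability grows sharply as the complier fraction shrinks.

import numpy as np

rng = np.random.default_rng(1)
logit = lambda p: np.log(p / (1 - p))
expit = lambda x: 1 / (1 + np.exp(-x))

def simulate(n=20_000, p_complier=0.10, log_or_xy=np.log(2.0)):
    # U: unmeasured confounder; Z: genetic instrument; X follows Z only for compliers.
    u = rng.binomial(1, 0.5, n)
    z = rng.binomial(1, 0.3, n)
    complier = rng.binomial(1, p_complier, n) == 1
    x_noncomplier = rng.binomial(1, 0.2 + 0.6 * u)       # confounded exposure otherwise
    x = np.where(complier, z, x_noncomplier)
    y = rng.binomial(1, expit(-2.0 + log_or_xy * x + 1.5 * u))
    return x, y, z

def observational_log_or(x, y):
    # Confounded exposure-outcome association, ignoring U.
    return logit(y[x == 1].mean()) - logit(y[x == 0].mean())

def two_stage_log_or(x, y, z):
    # Wald-type / two-stage estimate for a binary instrument.
    p_y = [y[z == g].mean() for g in (0, 1)]
    p_x = [x[z == g].mean() for g in (0, 1)]
    return (logit(p_y[1]) - logit(p_y[0])) / (p_x[1] - p_x[0])

naive, iv = [], []
for _ in range(200):                                     # 200 simulated studies
    x, y, z = simulate()
    naive.append(observational_log_or(x, y))
    iv.append(two_stage_log_or(x, y, z))

q25, q75 = np.percentile(iv, [25, 75])
print("conditional log-OR used in the simulation:", round(float(np.log(2.0)), 2))
print("median confounded (observational) log-OR :", round(float(np.median(naive)), 2))
print("median two-stage (IV) log-OR             :", round(float(np.median(iv)), 2))
print("IQR of the two-stage estimates           :", round(float(q75 - q25), 2))

Rerunning with p_complier lowered from 0.10 to 0.02 widens the interquartile range of the two-stage estimates dramatically, which is the weak-instrument behaviour the abstract describes.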
Abstract:
OBJECTIVES: The goal of this study was to determine whether subclinical thyroid dysfunction was associated with incident heart failure (HF) and echocardiographic abnormalities. BACKGROUND: Subclinical hypothyroidism and hyperthyroidism have been associated with cardiac dysfunction. However, long-term data on the risk of HF are limited. METHODS: We studied 3,044 adults ≥65 years of age who were initially free of HF in the Cardiovascular Health Study (CHS). We compared adjudicated HF events over a mean 12-year follow-up and changes in cardiac function over the course of 5 years among euthyroid participants, those with subclinical hypothyroidism (subdivided by thyroid-stimulating hormone [TSH] level: 4.5 to 9.9 and ≥10.0 mU/l), and those with subclinical hyperthyroidism. RESULTS: Over the course of 12 years, 736 participants developed HF events. Participants with TSH ≥10.0 mU/l had a greater incidence of HF compared with euthyroid participants (41.7 vs. 22.9 per 1,000 person-years, p=0.01; adjusted hazard ratio: 1.88; 95% confidence interval: 1.05 to 3.34). Baseline peak E velocity, an echocardiographic measure of diastolic function associated with incident HF in the CHS cohort, was greater in patients with TSH ≥10.0 mU/l than in euthyroid participants (0.80 m/s vs. 0.72 m/s, p=0.002). Over the course of 5 years, left ventricular mass increased among those with TSH ≥10.0 mU/l, but other echocardiographic measurements were unchanged. Patients with TSH 4.5 to 9.9 mU/l or with subclinical hyperthyroidism had no increase in risk of HF. CONCLUSIONS: Compared with euthyroid older adults, those with TSH ≥10.0 mU/l, but not those with TSH <10.0 mU/l, have a moderately increased risk of HF and alterations in cardiac function. Clinical trials should assess whether the risk of HF might be ameliorated by thyroxine replacement in individuals with TSH ≥10.0 mU/l.
Abstract:
Cardiovascular risk assessment might be improved with the addition of emerging new tests derived from atherosclerosis imaging, laboratory tests or functional tests. This article reviews relative risks, odds ratios, receiver-operating characteristic curves, post-test risk calculations based on likelihood ratios, the net reclassification improvement and the integrated discrimination improvement. This serves to determine whether a new test has added clinical value on top of conventional risk testing and how this can be verified statistically. Two clinically meaningful examples illustrate these novel approaches. This work serves as a review and as groundwork for future guidelines on cardiovascular risk prediction that take emerging tests into account, to be proposed by members of the 'Taskforce on Vascular Risk Prediction' under the auspices of the Working Group 'Swiss Atherosclerosis' of the Swiss Society of Cardiology.
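One of the quantities reviewed above, post-test risk based on likelihood ratios, is just Bayes' theorem applied on the odds scale (post-test odds = pre-test odds × LR). The short Python sketch below works through this with purely hypothetical numbers; the sensitivity, specificity and pre-test risk are illustrative and are not taken from the article or from any guideline.

def post_test_risk(pre_test_risk: float, lr: float) -> float:
    """Bayes on the odds scale: post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_risk / (1 - pre_test_risk)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

sensitivity, specificity = 0.80, 0.90                 # hypothetical test characteristics
lr_positive = sensitivity / (1 - specificity)         # LR+ = 8.0
lr_negative = (1 - sensitivity) / specificity         # LR- ~ 0.22

pre = 0.10                                            # e.g. a 10% pre-test risk
print(f"Positive test: LR+ = {lr_positive:.1f}, post-test risk = {post_test_risk(pre, lr_positive):.1%}")
print(f"Negative test: LR- = {lr_negative:.2f}, post-test risk = {post_test_risk(pre, lr_negative):.1%}")

With these numbers a positive result moves a 10% pre-test risk to roughly 47%, and a negative result moves it to about 2.4%, which is the kind of reclassification a new test must achieve to add clinical value on top of conventional risk scores.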
Abstract:
Adherence patterns and their influence on virologic outcome are well characterized for protease inhibitor (PI)- and non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimens. We aimed to determine how patterns of adherence to raltegravir influence the risk of virological failure. We conducted a prospective multicenter cohort study following 81 HIV-infected antiretroviral-naive or -experienced subjects receiving or starting twice-daily raltegravir-based antiretroviral therapy. Their adherence patterns were monitored using the Medication Events Monitoring System. During follow-up (188 ± 77 days), 12 (15%) of the 81 subjects experienced virological failure. Longer treatment interruption (adjusted odds ratio per 24-hour increase: 2.4; 95% confidence interval: 1.2 to 6.9; P < 0.02) and average adherence (odds ratio per 5% increase: 0.68; 95% confidence interval: 0.46 to 1.00; P < 0.05) were both independently associated with virological failure, controlling for prior duration of viral suppression. Timely interdose intervals and high levels of adherence to raltegravir are both necessary to control HIV replication.
Abstract:
BACKGROUND: High-risk sexual behaviors have been suggested as drivers of the recent dramatic increase of sexually transmitted hepatitis C virus (HCV) among human immunodeficiency virus (HIV)-infected men who have sex with men (MSM). METHODS: We assessed the association between the genetic bottleneck of HIV at transmission and the prevalence and incidence of HCV coinfection in HIV-infected MSM from the Swiss HIV Cohort Study (SHCS). As a proxy for the width of the transmission bottleneck, we used the fraction of ambiguous nucleotides detected by genotypic resistance tests sampled during early HIV infection. We defined a broad bottleneck as a fraction of ambiguous nucleotides exceeding a previously established threshold (0.5%). RESULTS: From the SHCS, we identified 671 MSM with available results of HCV serologic tests and with an HIV genotypic resistance test performed during early HIV infection. Of those, 161 (24.0%) exhibited a broad HIV transmission bottleneck, 38 (5.7%) had at least 1 positive HCV test result, and 26 (3.9%) had an incident HCV infection. Individuals with broad HIV transmission bottlenecks exhibited a 2-fold higher odds of having ever experienced an HCV coinfection (odds ratio, 2.2 [95% confidence interval {CI}, 1.1-4.3]) and a 3-fold higher hazard of having an incident HCV infection (hazard ratio, 3.0 [95% CI, 1.4-6.6]) than individuals with narrow HIV transmission bottlenecks. CONCLUSIONS: Our results indicate that the currently occurring sexual spread of HCV is focused on MSM who are prone to exhibit broad HIV transmission bottlenecks. This is consistent with an important role of high-risk behavior and mucosal barrier impairment in the transmission of HCV among MSM.