968 results for Subsequent Risk


Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Treatment of septic hand tenosynovitis is complex, and often requires multiple débridements and prolonged antibiotic therapy. The authors undertook this study to identify factors that might be associated with the need for subsequent débridement (after the initial one) because of persistence or secondary worsening of infection. METHODS: In this retrospective single-center study, the authors included all adult patients who presented to their emergency department from 2007 to 2010 with septic tenosynovitis of the hand. RESULTS: The authors identified 126 adult patients (55 men; median age, 45 years), nine of whom were immunosuppressed. All had community-acquired infection; 34 (27 percent) had a subcutaneous abscess and eight (6 percent) were febrile. All underwent at least one surgical débridement and had concomitant antibiotic therapy (median, 15 days; range, 7 to 82 days). At least one additional surgical intervention was required in 18 cases (median, 1.13 interventions; range, one to five interventions). All but four episodes (97 percent) were cured of infection on the first attempt after a median follow-up of 27 months. By multivariate analysis, only two factors were significantly associated with the outcome "subsequent surgical débridement": abscess (OR, 4.6; 95 percent CI, 1.5 to 14.0) and longer duration of antibiotic therapy (OR, 1.2; 95 percent CI, 1.1 to 1.2). CONCLUSION: In septic tenosynovitis of the hand, the only presenting factor that was statistically predictive of an increased risk of needing a second débridement was the presence of a subcutaneous abscess. CLINICAL QUESTION/LEVEL OF EVIDENCE: Risk, III.
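The multivariate result above can be illustrated with a worked odds-ratio calculation. A minimal sketch in Python, using invented 2×2 counts (not the study's data) purely to show how an OR and its Wald 95% CI are derived:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (not the study's data): of 34 patients with an
# abscess, 10 needed a second debridement; of 92 without, 8 did.
or_, lo, hi = odds_ratio_ci(10, 24, 8, 84)  # OR = 4.375
```

A CI whose lower bound stays above 1, as in the abscess result reported above, is what marks the association as statistically significant.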


BACKGROUND: Temporary increases in plasma HIV RNA ('blips') are common in HIV patients on combination antiretroviral therapy (cART). Blips above 500 copies/mL have been associated with subsequent viral rebound. It is not clear whether this relationship still holds when measurements are made using newer, more sensitive assays. METHODS: We selected antiretroviral-naive patients who then recorded one or more episodes of viral suppression on cART, with HIV RNA measurements made using more sensitive assays (lower limit of detection below 50 copies/mL). We estimated the association in these episodes between blip magnitude and the time to viral rebound. RESULTS: Four thousand ninety-four patients recorded a first episode of viral suppression on cART using more sensitive assays; 1672 patients recorded at least one subsequent suppression episode. Most suppression episodes (87%) were recorded with TaqMan version 1 or 2 assays. Of the 2035 blips recorded, 84%, 12% and 4% were of low (50-199 copies/mL), medium (200-499 copies/mL) and high (500-999 copies/mL) magnitude, respectively. The risk of viral rebound increased as blip magnitude increased, with hazard ratios of 1.20 (95% CI 0.89-1.61), 1.42 (95% CI 0.96-2.19) and 1.93 (95% CI 1.24-3.01) for low, medium and high magnitude blips, respectively; an increase of hazard ratio 1.09 (95% CI 1.03-1.15) per 100 copies/mL of HIV RNA. CONCLUSIONS: With the more sensitive assays now commonly used for monitoring patients, blips above 200 copies/mL are increasingly likely to lead to viral rebound and should prompt a discussion about adherence.
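The per-100-copies hazard ratio reported above implies, under the standard log-linear proportional-hazards assumption, a multiplicative effect for larger blips. The scaling rule itself is standard; applying it to these figures is our illustration, not the paper's analysis:

```python
import math

def scale_hazard_ratio(hr_per_unit, units):
    """Scale a per-unit hazard ratio to `units` units, assuming a
    log-linear (proportional hazards) effect: HR(u) = HR(1) ** u."""
    return math.exp(units * math.log(hr_per_unit))

# HR 1.09 per 100 copies/mL implies, for a 500 copies/mL blip
# (5 units), roughly 1.09 ** 5, i.e. about 1.5:
hr_500 = scale_hazard_ratio(1.09, 5)
```

The categorical estimates in the abstract (e.g. 1.93 for high-magnitude blips) need not match this continuous extrapolation exactly, since they come from a separate model fit.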


In this book, I apply a philosophical approach to study the precautionary principle in environmental (and health) risk decision-making. The principle says that unacceptable environmental and health risks should be anticipated, and they ought to be forestalled before the damage comes to fruition, even if scientific understanding of the risks is inadequate. The study consists of introductory chapters, a summary, and seven original publications that aim at explicating the principle, critically analysing the debate on the principle, and constructing a basis for the well-founded use of the principle. Papers I-V present the main thesis of this research. In the last two papers, the discussion is widened to new directions. The starting question is how well the currently embraced precautionary principle stands up to critical philosophical scrutiny. The approach employed is analytical: mainly conceptual, argumentative and ethical. The study draws upon Anglo-American style philosophy on the one hand, and upon sources of law as well as concrete cases and decision-making practices at the European Union level and in its member countries on the other. The framework is environmental (and health) risk governance, including the related law and policy. The main thesis of this study is that the debate on the precautionary principle needs to be shifted from the question of whether the principle (or its weak or strong interpretation) is well-grounded in general to questions about the theoretical plausibility and ethical and socio-political justifiability of specific understandings of the principle. The real picture of the precautionary principle is more complex than that found (i.e. presumed) in much of the current academic, political and public debate surrounding it. While certain presumptions and interpretations of the principle are found to be sound, others are theoretically flawed or include serious practical problems.
The analysis discloses conceptual and ethical presumptions and elementary understandings of the precautionary principle, critically assesses current practices invoked in the name of the precautionary principle and public participation, and seeks to build bridges between precaution, engagement and philosophical ethics. Hence, it is intended to provide a sound basis upon which subsequent academic scrutiny can build.


BACKGROUND: The diagnosis of pulmonary embolism (PE) in the emergency department (ED) is crucial. As emergency physicians fear missing this potentially life-threatening condition, PE tends to be over-investigated, exposing patients to unnecessary risks and uncertain benefit in terms of outcome. The Pulmonary Embolism Rule-out Criteria (PERC) is an eight-item block of clinical criteria that can identify patients who can safely be discharged from the ED without further investigation for PE. The endorsement of this rule could markedly reduce the number of irradiative imaging studies, ED length of stay, and rate of adverse events resulting from both diagnostic and therapeutic interventions. Several retrospective and prospective studies have shown the safety and benefits of the PERC rule for PE diagnosis in low-risk patients, but the validity of this rule is still controversial. We hypothesize that in European patients with a low gestalt clinical probability and who are PERC-negative, PE can be safely ruled out and the patient discharged without further testing. METHODS/DESIGN: This is a controlled, cluster-randomized trial in 15 centers in France. Each center will be randomized for the sequence of intervention periods: a 6-month intervention period (PERC-based strategy) followed by a 6-month control period (usual care), or in reverse order, with 2 months of "wash-out" between the 2 periods. Adult patients presenting to the ED with a suspicion of PE and a low pretest probability estimated by clinical gestalt will be eligible. The primary outcome is the percentage of failure resulting from the diagnostic strategy, defined as diagnosed venous thromboembolic events at 3-month follow-up, among patients for whom PE has been initially ruled out. DISCUSSION: The PERC rule has the potential to decrease the number of irradiative imaging studies in the ED, and is reported to be safe. However, no randomized study has ever validated the safety of PERC.
Furthermore, some studies have challenged the safety of a PERC-based strategy to rule out PE, especially in Europe, where the prevalence of PE diagnosed in the ED is high. The PROPER study should provide high-quality evidence to settle this issue. If it confirms the safety of the PERC rule, physicians will be able to reduce the number of investigations, associated subsequent adverse events, costs, and ED length of stay for patients with a low clinical probability of PE. TRIAL REGISTRATION: NCT02375919.


Alcohol misuse is the leading cause of cirrhosis and the second most common indication for liver transplantation in the Western world. We performed a genome-wide association study for alcohol-related cirrhosis in individuals of European descent (712 cases and 1,426 controls) with subsequent validation in two independent European cohorts (1,148 cases and 922 controls). We identified variants in the MBOAT7 (P = 1.03 × 10⁻⁹) and TM6SF2 (P = 7.89 × 10⁻¹⁰) genes as new risk loci and confirmed rs738409 in PNPLA3 as an important risk locus for alcohol-related cirrhosis (P = 1.54 × 10⁻⁴⁸) at a genome-wide level of significance. These three loci have a role in lipid processing, suggesting that lipid turnover is important in the pathogenesis of alcohol-related cirrhosis.
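The phrase "genome-wide level of significance" conventionally refers to the P < 5 × 10⁻⁸ threshold, a Bonferroni-style correction for roughly one million independent common-variant tests; whether the study applied exactly this threshold is an assumption here. A quick check that all three reported P values clear it:

```python
# Conventional genome-wide significance threshold (assumed, not
# stated in the abstract): 0.05 / 1e6 independent tests = 5e-8.
GWAS_THRESHOLD = 5e-8

p_values = {
    "MBOAT7": 1.03e-9,
    "TM6SF2": 7.89e-10,
    "PNPLA3_rs738409": 1.54e-48,
}
significant = {snp: p < GWAS_THRESHOLD for snp, p in p_values.items()}
```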


The role of airway inflammation in ventilated preterm newborns and the risk factors associated with the development of chronic lung disease are not well understood. Our objective was to analyze the association between the airway inflammatory response in ventilated preterm infants, assessed by serial measurements of TNF-α and IL-10 in tracheobronchial lavage (TBL), and perinatal factors and lung function measured early in life. A series of TBL samples was collected from ventilated preterm infants (less than 32 weeks of gestational age) and concentrations of TNF-α and IL-10 were measured by ELISA. Pulmonary function tests were performed after discharge by the raised volume rapid compression technique. Twenty-five subjects were recruited and 70 TBL samples were obtained. There was a significant positive association between TNF-α and IL-10 levels and the length of time between rupture of the amniotic membranes and delivery (r = 0.65, P = 0.002, and r = 0.57, P < 0.001, respectively). Lung function was measured between 1 and 22 weeks of corrected age in 10 patients. Multivariable analysis with adjustment for differences in lung volume showed a significant negative association between TNF-α levels and forced expiratory flow (FEF50; r = -0.6; P = 0.04), FEF75 (r = -0.76; P = 0.02), FEF85 (r = -0.75; P = 0.03), FEF25-75 (r = -0.71; P = 0.02), and FEV0.5 (r = -0.39; P = 0.03). These data suggest that TNF-α levels in the airways during the first days of life were associated with subsequent lung function abnormalities measured weeks or months later.


The study aim was to investigate the relationship between factors related to personal cancer history and lung cancer risk, as well as to assess their predictive utility. Characteristics of interest included the number, anatomical site(s), and age of onset of previous cancer(s). Data from the Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial (N = 154,901) and the National Lung Screening Trial (N = 53,452) were analysed. Logistic regression models were used to assess the relationships between each variable of interest and 6-year lung cancer risk. Predictive utility was assessed through changes in area under the curve (AUC) after substitution into the PLCOall2014 lung cancer risk prediction model. Previous lung, uterine and oral cancers were strongly and significantly associated with elevated 6-year lung cancer risk after controlling for confounders. None of these refined measures of personal cancer history offered more predictive utility than the simple (yes/no) measure already included in the PLCOall2014 model.
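The AUC used above to gauge predictive utility is equivalent to the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen control. A self-contained sketch with toy risk scores (hypothetical values, not PLCO/NLST data):

```python
def auc_from_scores(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen case scores higher than
    a randomly chosen control (ties count half)."""
    wins = 0.0
    for x in case_scores:
        for y in control_scores:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Toy predicted 6-year risks for 3 cases and 3 controls:
auc = auc_from_scores([0.8, 0.6, 0.7], [0.2, 0.6, 0.3])
```

Comparing this statistic before and after substituting a refined predictor into a model is the "change in AUC" test the abstract describes.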


Pediatric Crohn's disease (CD) has major consequences for the quality of life of affected patients (growth disturbances, school absenteeism, etc.). The etiology of CD is unknown. The hygiene hypothesis (HH) holds that the sanitary living conditions of industrialized countries prevent antigenic exposure and impede the development of immune tolerance in children. This would lead to an excessive immune reaction upon subsequent exposures and drive the development of chronic inflammatory diseases such as CD. Objective: To analyze the association between the frequency, timing and type of childhood infections (indicators of antigen-rich environments) and the risk of pediatric CD. A case-control study was conducted, with CD cases drawn from a tertiary care hospital in Montreal. Controls, drawn from the registries of the Régie d'assurance maladie du Québec (RAMQ), were matched to cases by age, sex and place of residence. Exposure to infections was determined from ICD-9 diagnostic codes recorded in the RAMQ database. A conditional logistic regression model was built to analyze the association between infections and CD. Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated. Results: 409 cases and 1621 controls were recruited. The results of the analysis suggest a protective effect of childhood infections on CD risk (OR: 0.67 [CI: 0.48-0.93], p = 0.018), particularly during the first 5 years of life (OR: 0.74 [CI: 0.57-0.96], p = 0.025). Kidney and urinary tract infections, as well as oral and (viral) central nervous system infections, appear particularly associated with the protective effect. The study's results support the hygiene hypothesis: exposure to childhood infections could reduce the risk of pediatric CD.


Microcosm studies were performed to evaluate the survival of Escherichia coli, Salmonella paratyphi and Vibrio parahaemolyticus in water and sediment collected from the freshwater region of Vembanad Lake (9°35′N, 76°25′E) along the southwest coast of India. All three test microorganisms showed significantly (p < 0.01) higher survival in sediment compared to overlying water. Survival in sediment types with different particle sizes and organic carbon contents revealed that sediment with small particle size and high organic carbon content could enhance their extended survival (p < 0.05). The results indicate that sediments of the lake could act as a reservoir of pathogenic bacteria and present a potential health hazard through possible resuspension and subsequent ingestion during recreational activities. Therefore, the assessment of bacterial concentration in freshwater lake sediments used for contact and non-contact recreation is of considerable significance for the proper assessment of microbial pollution of the overlying water, and for the management and protection against related health risks at specific recreational sites. In addition, the assessment of bacterial concentration in sediments can be used as a relatively stable indicator of the long-term mean bacterial concentration in the water column above.


Prevalence of faecal coliform bacteria and the survival of Escherichia coli, Vibrio parahaemolyticus and Salmonella paratyphi were studied in the water and sediment from Vembanadu Lake in the presence and absence of protozoan predators. The density of faecal coliform bacteria ranged between mean MPN values of 5080-9000/100 ml in water and 110,000-988,000/g in sediment (p < 0.01), the latter being 110 times greater than in the overlying water. The laboratory microcosm studies revealed that E. coli, V. parahaemolyticus and S. paratyphi showed significantly higher survival (p < 0.05) potential in sediment than in overlying water, both in the presence and absence of protozoan predators. The results indicate that Vembanadu Lake sediment constitutes a reservoir of pathogenic bacteria and presents a potential health hazard through possible resuspension and subsequent ingestion during recreational activities. Therefore, assessment of bacterial concentration in freshwater lake sediments used for contact and non-contact recreation is of considerable significance for the proper assessment of microbial pollution of the overlying water and the management and protection against related health risks at specific recreational sites. In addition, assessment of the bacterial concentration in sediments can be used as a relatively stable indicator of long-term mean bacterial concentration in the water column above.


In a cross-sectional study of 400 randomly selected smallholder dairy farms in the Tanga and Iringa regions of Tanzania, 14.2% (95% confidence interval (CI) = 11.6-17.3) of cows had developed clinical mastitis during the previous year. The point prevalence of subclinical mastitis, defined as a quarter positive by the California Mastitis Test (CMT) or by bacteriological culture, was 46.2% (95% CI = 43.6-48.8) and 24.3% (95% CI = 22.2-26.6), respectively. In a longitudinal disease study in Iringa, the incidence of clinical mastitis was 31.7 cases per 100 cow-years. A randomised intervention trial indicated that intramammary antibiotics significantly reduced the proportion of bacteriologically positive quarters in the short term (14 days post-infusion), but teat dipping had no detectable effect on bacteriological infection and CMT-positive quarters. Other risk and protective factors were identified from both the cross-sectional and longitudinal studies: Boran breeding was a risk factor (odds ratio (OR) = 3.40, 95% CI = 1.00-11.57, P < 0.05 for clinical mastitis, and OR = 3.51, 95% CI = 1.29-9.55, P < 0.01 for a CMT-positive quarter), while the practice of residual calf suckling was protective for a bacteriologically positive quarter (OR = 0.63, 95% CI = 0.48-0.81, P ≤ 0.001) and for a CMT-positive quarter (OR = 0.69, 95% CI = 0.63-0.75, P < 0.001). A mastitis training course for farmers and extension officers was held, and the knowledge gained and use of different methods of dissemination were assessed over time. In a subsequent randomised controlled trial, there were strong associations between knowledge gained and both the individual question asked and the combination of dissemination methods (village meeting, video and handout) used.
This study demonstrated that both clinical and subclinical mastitis are common in smallholder dairying in Tanzania, and that some of the risk and protective factors for mastitis can be addressed by practical management of dairy cows following effective knowledge transfer. (c) 2006 Elsevier B.V. All rights reserved.



Background noise should in theory hinder detection of auditory cues associated with approaching danger. We tested whether foraging chaffinches Fringilla coelebs responded to background noise by increasing vigilance, and examined whether this was explained by predation-risk compensation or by a novel-stimulus hypothesis. The former predicts that only interscan interval should be modified in the presence of background noise, not vigilance levels generally. This is because noise hampers auditory cue detection and increases perceived predation risk primarily when in the head-down position, and also because previous tests have shown that only interscan interval is correlated with predator detection ability in this system. Chaffinches modified only interscan interval, supporting this hypothesis. At the same time, they made significantly fewer pecks when feeding during the background noise treatment, and so the increased vigilance led to a reduction in intake rate, suggesting that compensating for the increased predation risk could indirectly lead to a fitness cost. Finally, the novel-stimulus hypothesis predicts that chaffinches should habituate to the noise, which did not occur within a trial or over 5 subsequent trials. We conclude that auditory cues may be an important component of the trade-off between vigilance and feeding, and discuss possible implications for anti-predation theory and ecological processes.


Background: We have previously reported higher and more variable salivary morning cortisol in 13-year-old adolescents whose mothers were depressed in the postnatal period, compared with control-group adolescents whose mothers did not develop postnatal depression (PND). This observation suggested a biological mechanism by which intrafamilial risk for depressive disorder may be transmitted. In the current article, we examined whether the cortisol disturbances observed at 13 years could predict depressive symptomatology in adolescents at 16 years of age. Methods: We measured self-reported depressive symptoms in 16-year-old adolescents who had (n = 48) or had not (n = 39) been exposed to postnatal maternal depression and examined their prediction by morning and evening cortisol indices obtained via 10 days of salivary collections at 13 years. Results: Elevated morning cortisol secretion at 13 years, and particularly the maximum level recorded over 10 days of collection, predicted elevated depressive symptoms at 16 years over and above 13-year depressive symptom levels and other possible confounding factors. Morning cortisol secretion mediated an association between maternal PND and symptomatology in 16-year-old offspring. Conclusions: Alterations in steroid secretion observed in association with maternal PND may provide a mechanism by which risk for depression is transmitted from mother to offspring.


This paper reviews the evidence relating to the question: does the risk of fungicide resistance increase or decrease with dose? The development of fungicide resistance progresses through three key phases. During the 'emergence phase' the resistant strain has to arise through mutation and invasion. During the subsequent 'selection phase', the resistant strain is present in the pathogen population and the fraction of the pathogen population carrying the resistance increases due to the selection pressure caused by the fungicide. During the final phase of 'adjustment', the dose or choice of fungicide may need to be changed to maintain effective control over a pathogen population where resistance has developed to intermediate levels. Emergence phase: no experimental publications and only one model study report on the emergence phase, and we conclude that work in this area is needed. Selection phase: all the published experimental work, and virtually all model studies, relate to the selection phase. Seven peer-reviewed and four non-peer-reviewed publications report experimental evidence. All show increased selection for fungicide resistance with increased fungicide dose, except for one peer-reviewed publication that does not detect any selection irrespective of dose and one conference-proceedings publication that claims evidence for increased selection at a lower dose. In the mathematical models published, no evidence has been found that a lower dose could lead to a higher risk of fungicide resistance selection. We discuss areas of the dose-rate debate that need further study. These include further work on pathogen-fungicide combinations where the pathogen develops partial resistance to the fungicide, and work on the emergence phase.
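The selection-phase argument can be made concrete with a minimal haploid selection model, where the resistant strain's per-generation fitness advantage s stands in for the selection pressure exerted by the fungicide; the premise that s rises with dose is exactly what the reviewed experiments address. This is an illustrative sketch, not a model taken from the reviewed literature:

```python
def resistant_fraction(f0, s, generations):
    """Fraction of the pathogen population carrying resistance after
    `generations` of selection, in a minimal haploid model where the
    resistant strain has relative fitness (1 + s) per generation
    under fungicide exposure."""
    f = f0
    for _ in range(generations):
        f = f * (1 + s) / (f * (1 + s) + (1 - f))
    return f

# Starting from 1% resistant, a stronger selection pressure (standing
# in for a higher dose) drives the resistant fraction up faster:
low_dose = resistant_fraction(0.01, 0.1, 20)
high_dose = resistant_fraction(0.01, 0.3, 20)
```

Under this model the resistant fraction grows monotonically with s, which is the qualitative pattern the experimental publications summarized above report for increasing dose.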