961 results for Evaluate Risk


Relevance: 30.00%

Abstract:

A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects; thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may make timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a defined period of time. A dentist can then determine the erosive potential of the diet. A table of common beverages and foodstuffs is presented for judging erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk for erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion which has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patient. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimens, stimulation of salivary flow rate, use of buffering medicaments, and particular motivation for nondestructive toothbrushing habits with an erosion-protecting toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all of the causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for the individual should be communicated to the patient.

Relevance: 30.00%

Abstract:

BACKGROUND No data exist on the patterns of biochemical recurrence (BCR) and their effect on survival in patients with high-risk prostate cancer (PCa) treated with surgery. The aim of our investigation was to evaluate the natural history of PCa in patients treated with radical prostatectomy (RP) alone. MATERIALS AND METHODS Overall, 2,065 patients with high-risk PCa treated with RP at 7 tertiary referral centers between 1991 and 2011 were identified. First, we calculated the probability of experiencing BCR after surgery; in particular, we relied on conditional survival estimates for BCR after RP. Competing-risks regression analyses were then used to evaluate the effect of time to BCR on the risk of cancer-specific mortality (CSM). RESULTS Median follow-up was 70 months. Overall, the 5-year BCR-free survival rate was 55.2%. Given BCR-free survivorship at 1, 2, 3, 4, and 5 years, the BCR-free survival rates improved by +7.6%, +4.1%, +4.8%, +3.2%, and +3.7%, respectively. Overall, the 10-year CSM rate was 14.8%. When patients were stratified according to time to BCR, patients experiencing BCR within 36 months of surgery had higher 10-year CSM rates than those experiencing late BCR (19.1% vs. 4.4%; P<0.001). On multivariable analysis, time to BCR was an independent predictor of CSM (P<0.001). CONCLUSIONS Increasing time from surgery is associated with a reduction in the risk of subsequent BCR. Additionally, time to BCR is a predictor of CSM in these patients. These results might help provide clinicians with better follow-up strategies and more aggressive treatments for early BCR.
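The conditional survival figures above follow from the standard identity: the probability of remaining BCR-free for another s years, given t event-free years already, is S(t+s)/S(t). A minimal sketch (the 55.2% five-year value is from the abstract; the other curve points are invented for illustration):

```python
def conditional_survival(surv, t, s):
    # P(event-free to t+s | event-free to t) = S(t+s) / S(t)
    return surv[t + s] / surv[t]

# BCR-free survival curve; only S[5] = 0.552 comes from the abstract
S = {0: 1.00, 1: 0.80, 2: 0.70, 3: 0.63, 4: 0.58, 5: 0.552}
uncond = S[5]                          # unconditional 5-year estimate, 55.2%
cond = conditional_survival(S, 1, 4)   # 0.552 / 0.80 = 0.69, i.e. higher
```

Conditioning on having already survived event-free raises the estimate, which is exactly the pattern the abstract reports (+7.6%, +4.1%, ... per additional event-free year).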

Relevance: 30.00%

Abstract:

BACKGROUND AND AIMS Limited data from large cohorts are available on switching between tumor necrosis factor (TNF) antagonists (infliximab, adalimumab, certolizumab pegol) over time. We aimed to evaluate the prevalence of switching from one TNF antagonist to another and to identify associated risk factors. METHODS Data from the Swiss Inflammatory Bowel Diseases Cohort Study (SIBDCS) were analyzed. RESULTS Of 1731 patients included in the SIBDCS (956 with Crohn's disease [CD] and 775 with ulcerative colitis [UC]), 347 CD patients (36.3%) and 129 UC patients (16.6%) were treated with at least one TNF antagonist. A total of 53/347 (15.3%) CD patients (median disease duration 9 years) and 20/129 (15.5%) UC patients (median disease duration 7 years) needed to switch to a second and/or a third TNF antagonist. Median treatment duration was longest for the first TNF antagonist used (CD 25 months; UC 14 months), followed by the second (CD 13 months; UC 4 months) and third (CD 11 months; UC 15 months). Primary nonresponse, loss of response and side effects were the major reasons to stop and/or switch TNF antagonist therapy. A low body mass index, a short diagnostic delay and extraintestinal manifestations at inclusion were identified as risk factors for switching from the first TNF antagonist within 24 months of its use in CD patients. CONCLUSION Switching TNF antagonists over time is common. The median treatment duration with a specific TNF antagonist diminishes with each additional TNF antagonist used.

Relevance: 30.00%

Abstract:

OBJECTIVES To evaluate the impact of preoperative sepsis on the risk of postoperative arterial and venous thromboses. DESIGN Prospective cohort study using the National Surgical Quality Improvement Program database of the American College of Surgeons (ACS-NSQIP). SETTING Inpatient and outpatient procedures in 374 hospitals of all types across the United States, 2005-12. PARTICIPANTS 2,305,380 adults who underwent surgical procedures. MAIN OUTCOME MEASURES Arterial thrombosis (myocardial infarction or stroke) and venous thrombosis (deep venous thrombosis or pulmonary embolism) in the 30 days after surgery. RESULTS Among all surgical procedures, patients with preoperative systemic inflammatory response syndrome or any sepsis had three times the odds of a postoperative arterial or venous thrombosis (odds ratio 3.1, 95% confidence interval 3.0 to 3.1). The adjusted odds ratios were 2.7 (2.5 to 2.8) for arterial thrombosis and 3.3 (3.2 to 3.4) for venous thrombosis. The adjusted odds ratios for thrombosis were 2.5 (2.4 to 2.6) in patients with systemic inflammatory response syndrome, 3.3 (3.1 to 3.4) in patients with sepsis, and 5.7 (5.4 to 6.1) in patients with severe sepsis, compared with patients without any systemic inflammation. In patients with preoperative sepsis, both emergency and elective surgical procedures carried a twofold increase in the odds of thrombosis. CONCLUSIONS Preoperative sepsis is an important independent risk factor for both arterial and venous thromboses. The risk of thrombosis increases with the severity of the inflammatory response and is higher in both emergency and elective surgical procedures. Suspicion of thrombosis should be higher in patients with sepsis who undergo surgery.
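For readers less familiar with the effect measure, the odds ratios above compare the odds of thrombosis between exposure groups. A minimal sketch of an unadjusted odds ratio with a Wald 95% confidence interval (the study's own estimates come from adjusted models, and the counts below are invented):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: a/b = exposed with/without outcome, c/d = unexposed with/without
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Invented counts for illustration only
or_, lo, hi = odds_ratio_ci(30, 970, 10, 990)  # OR ≈ 3.06
```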

Relevance: 30.00%

Abstract:

OBJECTIVE: To evaluate the incidence of colic and risk factors for colic in equids hospitalized for ocular disease. DESIGN: Retrospective observational study. ANIMALS: 337 equids (317 horses, 19 ponies, and 1 donkey) hospitalized for ocular disease. PROCEDURES: Medical records of equids hospitalized for >24 hours for treatment of ocular disease between January 1997 and December 2008 were reviewed. For equids hospitalized for ocular disease on more than 1 occasion, only information from the first hospitalization was used. Information gathered included the signalment, the type of ocular lesion and the treatment administered, and any colic signs recorded during hospitalization, as well as the severity, presumptive diagnosis, and treatment of the colic. Statistical analysis was used to identify risk factors for colic in equids hospitalized for ocular disease. RESULTS: 72 of 337 (21.4%) equids hospitalized for ocular disease had signs of colic during hospitalization. Most equids (59.7% [43/72]) had mild signs of colic, and most (87.5% [63/72]) were treated medically. Ten of 72 (13.9%) equids with colic had a cecal impaction. Risk factors for colic in equids hospitalized for ocular disease were age (0 to 1 year and ≥21 years) and an increased duration of hospitalization (≥8 days). CONCLUSIONS AND CLINICAL RELEVANCE: There was a high incidence of colic in equids hospitalized with ocular disease in this study. These findings may help identify equids at risk of developing colic and thereby help direct implementation of prophylactic measures.

Relevance: 30.00%

Abstract:

A World Health Organization expert meeting on Ebola vaccines proposed urgent safety and efficacy studies in response to the outbreak in West Africa. One approach to communicable disease control is ring vaccination of individuals at high risk of infection due to their social or geographical connection to a known case. This paper describes the protocol for a novel cluster randomised controlled trial design that uses ring vaccination. In the Ebola ça suffit ring vaccination trial, rings are randomised 1:1 to (a) immediate single-dose vaccination of eligible adults or (b) vaccination delayed by 21 days. Vaccine efficacy against disease is assessed in participants over equivalent periods from the day of randomisation. Secondary objectives include vaccine effectiveness at the level of the ring and the incidence of serious adverse events. Ring vaccination trials are adaptive, can be run until disease elimination, allow interim analyses, and can go dormant during inter-epidemic periods.

Relevance: 30.00%

Abstract:

We developed a model to calculate a quantitative risk score for individual aquaculture sites. The score indicates the risk of a site being infected with a specific fish pathogen (viral haemorrhagic septicaemia virus [VHSV], infectious haematopoietic necrosis virus, or koi herpes virus) and is intended to be used for risk-ranking sites to support surveillance for demonstrating zone or member state freedom from these pathogens. The inputs to the model include a range of quantitative and qualitative estimates of risk factors organised into five risk themes: (1) live fish and egg movements; (2) exposure via water; (3) on-site processing; (4) short-distance mechanical transmission; (5) distance-independent mechanical transmission. The calculated risk score for an individual aquaculture site is a value between zero and one and is intended to indicate the risk of a site relative to that of other sites (thereby allowing ranking). The model was applied to evaluate 76 rainbow trout farms in 3 countries (42 from England, 32 from Italy and 2 from Switzerland) with the aim of establishing their risk of being infected with VHSV. Risk scores for farms in England and Italy showed great variation, clearly enabling ranking. Scores ranged from 0.002 to 0.254 (mean 0.080) in England and from 0.011 to 0.778 (mean 0.130) in Italy, reflecting the diversity of infection status of farms in these countries. Requirements for broader application of the model are discussed. Cost-efficient farm data collection is important to realise the benefits of a risk-based approach.
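As a rough illustration of how such a 0-to-1 score can be assembled, the sketch below combines per-theme risk estimates with a weighted average. The theme names follow the abstract, but the values, weights and aggregation rule are hypothetical; the published model's actual formula may differ:

```python
def site_risk_score(theme_scores, theme_weights):
    # Weighted average of per-theme risk estimates, each in [0, 1];
    # the result stays in [0, 1] and is meaningful only for ranking sites.
    total = sum(theme_weights[t] for t in theme_scores)
    return sum(theme_scores[t] * theme_weights[t] for t in theme_scores) / total

# Hypothetical inputs for one farm (illustrative values only)
themes = {
    "live_fish_and_egg_movements": 0.40,
    "exposure_via_water": 0.10,
    "on_site_processing": 0.00,
    "short_distance_mechanical": 0.05,
    "distance_independent_mechanical": 0.02,
}
weights = {t: 1.0 for t in themes}  # equal weights, purely for illustration
score = site_risk_score(themes, weights)  # ≈ 0.114
```

Because the score is relative, what matters is the ranking it induces across farms, not the absolute value, which is how the authors describe its use in risk-based surveillance.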

Relevance: 30.00%

Abstract:

BACKGROUND Potentially avoidable risk factors continue to cause unnecessary disability and premature death in older people. Health risk assessment (HRA), a method successfully used in working-age populations, is a promising method for cost-effective health promotion and preventive care in older individuals, but the long-term effects of this approach are unknown. The objective of this study was to evaluate the effects of an innovative approach to HRA and counselling in older individuals for health behaviours, preventive care, and long-term survival. METHODS AND FINDINGS This study was a pragmatic, single-centre randomised controlled clinical trial in community-dwelling individuals aged 65 y or older registered with one of 19 primary care physician (PCP) practices in a mixed rural and urban area in Switzerland. From November 2000 to January 2002, 874 participants were randomly allocated to the intervention and 1,410 to usual care. The intervention consisted of HRA based on self-administered questionnaires and individualised computer-generated feedback reports, combined with nurse and PCP counselling over a 2-y period. Primary outcomes were health behaviours and preventive care use at 2 y and all-cause mortality at 8 y. At baseline, participants in the intervention group had a mean ± standard deviation of 6.9 ± 3.7 risk factors (including unfavourable health behaviours, health and functional impairments, and social risk factors) and 4.3 ± 1.8 deficits in recommended preventive care. At 2 y, favourable health behaviours and use of preventive care were more frequent in the intervention than in the control group (based on z-statistics from generalised estimating equation models). For example, 70% compared to 62% were physically active (odds ratio 1.43, 95% CI 1.16-1.77, p = 0.001), and 66% compared to 59% had influenza vaccinations in the past year (odds ratio 1.35, 95% CI 1.09-1.66, p = 0.005). 
At 8 y, based on an intention-to-treat analysis, the estimated proportion alive was 77.9% in the intervention and 72.8% in the control group, for an absolute mortality difference of 4.9% (95% CI 1.3%-8.5%, p = 0.009; based on z-test for risk difference). The hazard ratio of death comparing intervention with control was 0.79 (95% CI 0.66-0.94, p = 0.009; based on Wald test from Cox regression model), and the number needed to receive the intervention to prevent one death was 21 (95% CI 12-79). The main limitations of the study include the single-site study design, the use of a brief self-administered questionnaire for 2-y outcome data collection, the unavailability of other long-term outcome data (e.g., functional status, nursing home admissions), and the availability of long-term follow-up data on mortality for analysis only in 2014. CONCLUSIONS This is the first trial to our knowledge demonstrating that a collaborative care model of HRA in community-dwelling older people not only results in better health behaviours and increased use of recommended preventive care interventions, but also improves survival. The intervention tested in our study may serve as a model of how to implement a relatively low-cost but effective programme of disease prevention and health promotion in older individuals. TRIAL REGISTRATION International Standard Randomized Controlled Trial Number: ISRCTN 28458424.
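The number needed to treat reported above is just the reciprocal of the absolute risk difference, rounded up to a whole person; a quick sketch of the arithmetic:

```python
import math

def number_needed_to_treat(absolute_risk_reduction):
    # NNT = 1 / ARR, rounded up to a whole number of people
    return math.ceil(1 / absolute_risk_reduction)

# The reported absolute mortality difference of 4.9% reproduces the reported NNT
nnt = number_needed_to_treat(0.049)  # 1 / 0.049 ≈ 20.4 -> 21
```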

Relevance: 30.00%

Abstract:

OBJECTIVE In patients with a long life expectancy and high-risk (HR) prostate cancer (PCa), the chance of dying from PCa is not negligible and may change significantly with the time elapsed from surgery. The aim of this study was to evaluate long-term survival patterns in young patients treated with radical prostatectomy (RP) for HR PCa. MATERIALS AND METHODS Within a multi-institutional cohort, 600 young patients (≤59 years) treated with RP between 1987 and 2012 for HR PCa (defined as at least one of the following adverse characteristics: prostate-specific antigen >20, cT3 or higher, biopsy Gleason sum 8-10) were identified. Smoothed cumulative incidence plots were used to assess cancer-specific mortality (CSM) and other-cause mortality (OCM) rates at 10, 15, and 20 years after RP. The same analyses were performed to assess the 5-year probability of CSM and OCM in patients who survived 5, 10, and 15 years after RP. A multivariable competing-risks regression model was fitted to identify predictors of CSM and OCM. RESULTS The 10-, 15- and 20-year CSM and OCM rates were 11.6% and 5.5% vs. 15.5% and 13.5% vs. 18.4% and 19.3%, respectively. The 5-year probabilities of CSM and OCM among patients who survived 5, 10, and 15 years after RP were 6.4% and 2.7% vs. 4.6% and 9.6% vs. 4.2% and 8.2%, respectively. Year of surgery, pathological stage and Gleason score, surgical margin status and lymph node invasion were the major determinants of CSM (all P≤0.03). Conversely, none of the covariates was significantly associated with OCM (all P≥0.09). CONCLUSIONS Very long-term cancer control in young high-risk patients after RP is highly satisfactory. PCa is the leading cause of death during the first 10 years of survivorship after RP; thereafter, mortality not related to PCa becomes the main cause of death. Consequently, surgery should be considered in young patients with high-risk disease, and strict PCa follow-up should be enforced during the first 10 years of survivorship after RP.

Relevance: 30.00%

Abstract:

Background Lack of donor organs remains a major obstacle in organ transplantation. Our aim was to evaluate (1) the association between engaging in high-risk recreational activities and attitudes toward organ donation and (2) the degree of reciprocity between organ acceptance and donation willingness in young men. Methods A 17-item, closed-ended survey was offered to male conscripts aged 18 to 26 years in all Swiss military conscription centers. Predictors of organ donation attitudes were assessed in bivariate analyses and multiple logistic regression. Reciprocity of the intentions to accept and to donate organs was assessed by means of donor card status. Results Among the 1559 responses analyzed, neither motorcycling nor practicing extreme sports was significantly associated with donor card holder status. Family communication about organ donation, student or academic profession, and living in a Latin-language region were predictors of positive organ donation attitudes, whereas residence in a German-speaking region and practicing any religion predicted reluctance. Significantly more respondents were willing to accept than to donate organs, especially among those without family communication concerning organ donation. Conclusions For the first time, it was shown that high-risk recreational activities do not influence organ donation attitudes. Second, a considerable discrepancy in organ donation reciprocity was identified. We propose that increasing this reciprocity could eventually increase organ donation rates.

Relevance: 30.00%

Abstract:

OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients at a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 d. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from EQ-5D); all parameters were measured at 30 d. RESULTS Of 3186 patients (mean age 71 y, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of the QoL measure. Associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analyses, with evidence of effect modification (P for interaction < 0.05) by age and main diagnosis group. CONCLUSION Nutritional risk is common in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.

Relevance: 30.00%

Abstract:

BACKGROUND & AIMS Non-selective beta-blockers (NSBB) are used in patients with cirrhosis and oesophageal varices. Experimental data suggest that NSBB inhibit angiogenesis and reduce bacterial translocation, which may prevent hepatocellular carcinoma (HCC). We therefore assessed the effect of NSBB on HCC by performing a systematic review with meta-analyses of randomized trials. METHODS Electronic and manual searches were combined. Authors were contacted for unpublished data. Included trials assessed NSBB for patients with cirrhosis; the control group could receive any intervention other than NSBB. Fixed- and random-effects meta-analyses were performed with I(2) as a measure of heterogeneity. Subgroup, sensitivity, regression and sequential analyses were performed to evaluate heterogeneity, bias and the robustness of the results after adjusting for multiple testing. RESULTS Twenty-three randomized trials on 2618 patients with cirrhosis were included, of which 12 reported HCC incidence and 23 reported HCC mortality. The mean duration of follow-up was 26 months (range 8-82). In total, 47 of 694 patients randomized to NSBB developed HCC vs 65 of 697 controls (risk difference -0.026; 95% CI -0.052 to -0.001; number needed to treat 38 patients). There was no heterogeneity (I(2) = 7%) or evidence of small-study effects (Egger's P = 0.402). The result was not confirmed in sequential analysis, which suggested that 3719 patients would be needed to achieve the required information size. NSBB did not reduce HCC-related mortality (RD -0.011; 95% CI -0.040 to 0.017). CONCLUSIONS Non-selective beta-blockers may prevent HCC in patients with cirrhosis.
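The headline risk difference can be approximated straight from the raw counts; note that the published -0.026 (and NNT of 38) comes from a pooled, trial-weighted meta-analysis, so the naive aggregate below only approximates it:

```python
def risk_difference(events_trt, n_trt, events_ctl, n_ctl):
    # Unweighted risk difference: P(event | treatment) - P(event | control)
    return events_trt / n_trt - events_ctl / n_ctl

rd = risk_difference(47, 694, 65, 697)  # ≈ -0.0255
nnt = 1 / abs(rd)                       # ≈ 39, close to the pooled NNT of 38
```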

Relevance: 30.00%

Abstract:

UNLABELLED In a prospective multicentre study of bloodstream infection (BSI) from November 01, 2007 to July 31, 2010, seven paediatric cancer centres (PCC) from Germany and one from Switzerland included 770 paediatric cancer patients (58% males; median age 8.3 years, interquartile range (IQR) 3.8-14.8 years) comprising 153,193 individual days of surveillance (in- and outpatient days during intensive treatment). Broviac catheters were used in 63% of all patients and Ports in 20%. One hundred forty-two patients (18%; 95% CI 16 to 21%) experienced at least one BSI (179 BSIs in total; bacteraemia 70%, bacterial sepsis 27%, candidaemia 2%). In 57%, the BSI occurred in inpatients, in 79% after conventional chemotherapy. Only 56% of the patients showed neutropenia at BSI onset. In the multivariate analysis, patients with acute lymphoblastic leukaemia (ALL) or acute myeloblastic leukaemia (AML), relapsed malignancy, and patients with a Broviac faced an increased risk of BSI. Relapsed malignancy (16%) was an independent risk factor for all BSI and for Gram-positive BSI. CONCLUSION This study confirms relapsed malignancy as an independent risk factor for BSIs in paediatric cancer patients. On a unit level, data on BSIs in this high-risk population derived from prospective surveillance are not only mandatory for deciding on empiric antimicrobial treatment but also beneficial in planning and evaluating preventive bundles. WHAT IS KNOWN • Paediatric cancer patients face an increased risk of nosocomial bloodstream infections (BSIs). • In most cases, these BSIs are associated with the use of a long-term central venous catheter (Broviac, Port), severe and prolonged immunosuppression (e.g. neutropenia) and other chemotherapy-induced alterations of host defence mechanisms (e.g. mucositis). WHAT IS NEW • This study is the first multicentre study confirming relapsed malignancy as an independent risk factor for BSIs in paediatric cancer patients.
• It describes the epidemiology of nosocomial BSI in paediatric cancer patients mainly outside the stem cell transplantation setting during conventional intensive therapy and argues for prospective surveillance programmes to target and evaluate preventive bundle interventions.

Relevance: 30.00%

Abstract:

We used meat-inspection data collected over three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so they can be communicated to, and managed by, the slaughter industry and veterinary services. At meat inspection, there were three main predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse it was processed in. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with fattening end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights the fact that the many risk factors for WCC are as complex as the production system itself, with risk factors interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing farmers with feedback on condemnation reasons and on their performance compared with the national/regional average could be a first step towards improving herd management and financial returns for producers.

Relevance: 30.00%

Abstract:

BACKGROUND Limitations in the primary studies constitute one important factor to be considered in the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system for rating quality of evidence. However, in a network meta-analysis (NMA), such evaluation poses a special challenge, because each network estimate receives different amounts of contribution from various studies via direct as well as indirect routes, and because some biases have directions whose repercussions in the network can be complicated. FINDINGS In this report we use the NMA of maintenance pharmacotherapy of bipolar disorder (17 interventions, 33 studies) and demonstrate how to quantitatively evaluate the impact of study limitations using netweight, a STATA command for NMA. For each network estimate, the percentages of contribution from direct comparisons at high, moderate or low risk of bias were quantified. This method has proven flexible enough to accommodate complex biases with direction, such as the one due to the enrichment design seen in some trials of bipolar maintenance pharmacotherapy. CONCLUSIONS Using netweight, therefore, we can evaluate in a transparent and quantitative manner how the limitations of individual studies in an NMA impact the quality of evidence of each network estimate, even when such limitations have clear directions.
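The per-estimate tally described above can be mimicked in a few lines: given each direct comparison's percentage contribution to one network estimate and its risk-of-bias rating, sum the contributions per rating. The comparison labels and numbers below are invented for illustration and are not from the bipolar-disorder NMA:

```python
def bias_contribution(contributions, risk_of_bias):
    # contributions: {direct comparison: % contribution to one network estimate}
    # risk_of_bias:  {direct comparison: "high" | "moderate" | "low"}
    totals = {"high": 0.0, "moderate": 0.0, "low": 0.0}
    for comparison, pct in contributions.items():
        totals[risk_of_bias[comparison]] += pct
    return totals

# Invented example: one network estimate fed by three direct comparisons
share = bias_contribution(
    {"lithium vs placebo": 50.0, "quetiapine vs placebo": 30.0,
     "lithium vs quetiapine": 20.0},
    {"lithium vs placebo": "low", "quetiapine vs placebo": "high",
     "lithium vs quetiapine": "moderate"},
)
# share == {"high": 30.0, "moderate": 20.0, "low": 50.0}
```

A network estimate dominated by high-risk-of-bias contributions would then be rated down under GRADE, which is the decision the netweight output supports.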