18 results for Risk Indicators

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

60.00%

Publisher:

Abstract:

In modern medicine, vigorous efforts are being made in the prediction and prevention of diseases. Mental disorders are suitable candidates for the application of this approach. The currently known neurobiological and psychosocial risk indicators for schizophrenia do not have sufficient predictive power for selective prevention in asymptomatic patients at risk. However, once predictive basic symptoms and, later, pre-psychotic high-risk symptoms of psychosis develop during the initial prodrome, which lasts about five years, the impending outbreak of the disease can be predicted with high accuracy. Research findings suggest a differential strategy of indicated prevention: cognitive behavioral therapy in early initial prodromal states and low-dose atypical antipsychotics in late initial prodromal states. The most important future tasks are improving predictive power through risk enrichment and stratification, as well as confirming existing prevention strategies and developing new ones, with a stronger focus on the etiology of the disorder. In addition, the prediction and prevention approach would benefit from the inclusion of risk symptoms in the DSM-5 criteria.

Relevance:

60.00%

Publisher:

Abstract:

PURPOSE: The aim of this review was to evaluate the clinical outcomes for the different time points of implant placement following tooth extraction. MATERIALS AND METHODS: A PubMed search and a hand search of selected journals were performed to identify clinical studies published in English that reported on outcomes of implants in postextraction sites. Only studies that included 10 or more patients were accepted. For implant success/survival outcomes, only studies with a mean follow-up period of at least 12 months from the time of implant placement were included. The following outcomes were identified: (1) change in peri-implant defect dimension, (2) implant survival and success, and (3) esthetic outcomes. RESULTS AND CONCLUSIONS: Of 1,107 abstracts and 170 full-text articles considered, 91 studies met the inclusion criteria for this review. Bone augmentation procedures are effective in promoting bone fill and defect resolution at implants in postextraction sites, and are more successful with immediate (type 1) and early placement (type 2 and type 3) than with late placement (type 4). The majority of studies reported survival rates of over 95%. Similar survival rates were observed for immediate (type 1) and early (type 2) placement. Recession of the facial mucosal margin is common with immediate (type 1) placement. Risk indicators included a thin tissue biotype, a facial malposition of the implant, and a thin or damaged facial bone wall. Early implant placement (type 2 and type 3) is associated with a lower frequency of mucosal recession compared to immediate placement (type 1).

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE The aim of this cross-sectional study was to estimate bone loss at implants with a platform-switching design and to analyze possible risk indicators after 5 years of loading in a multi-center private practice network. METHOD AND MATERIALS Peri-implant bone loss was measured radiographically as the distance from the implant shoulder to the mesial and distal alveolar crest, respectively. Risk factor analysis for marginal bone loss included the type of implant prosthetic treatment concept and the dental status of the opposite arch. RESULTS A total of 316 implants in 98 study patients were examined after 5 years of loading. The overall mean radiographic bone loss was 1.02 mm (SD ± 1.25 mm, 95% CI 0.90-1.14). Correlation analyses indicated a strong association between peri-implant bone loss > 2 mm and removable implant-retained prostheses, with an odds ratio of 53.8. CONCLUSION The 5-year results of the study show clinically acceptable values of mean bone loss after 5 years of loading. Implant-supported removable prostheses seem to be a strong co-factor for extensive bone level changes compared to fixed reconstructions. However, these results must be interpreted in the context of the special cohort included, examined under private dental office conditions.

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS In a retrospective cohort study, standardized information on occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort-and these factors combined in the HOARSI-as well as management factors were collected at intervals of 1.3-5 years. RESULTS Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%), with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing: 25% (P = .006, OR = 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE Occasional coughing and mucous nasal discharge might represent an increased risk of developing RAO.

Relevance:

30.00%

Publisher:

Abstract:

Ultrasound detection of subclinical atherosclerosis (ATS) may help identify individuals at high cardiovascular risk. Most studies have evaluated intima-media thickness (IMT) at the carotid level. We compared the relationships between the main cardiovascular risk factors (CVRF) and five indicators of ATS (IMT, mean and maximal plaque thickness, mean and maximal plaque area) at both the carotid and femoral levels. Ultrasound was performed on 496 participants aged 45-64 years randomly selected from the general population of the Republic of Seychelles. 73.4% of participants had ≥ 1 plaque (IMT thickening ≥ 1.2 mm) at the carotid level and 67.5% at the femoral level. The variance (adjusted R²) contributed by age, sex and CVRF (smoking, LDL-cholesterol, HDL-cholesterol, blood pressure, diabetes) in predicting any of the ATS markers was larger at the femoral than at the carotid level. At both levels, the association between CVRF and ATS was stronger for plaque-based markers than for IMT. Our findings show that the associations between CVRF and ATS markers were stronger at the femoral than at the carotid level, and with plaque-based markers rather than IMT. Pending comparison of these markers against harder cardiovascular endpoints, our findings suggest that markers based on plaque morphology assessed at the femoral artery level might be useful cardiovascular risk predictors.

Relevance:

30.00%

Publisher:

Abstract:

Soil erosion models and soil erosion risk maps are often used as indicators to assess potential soil erosion in order to assist policy decisions. This paper shows the scientific basis of the soil erosion risk map of Switzerland and its application in policy and practice. Linking a USLE/RUSLE-based model approach (AVErosion), founded on multiple flow algorithms and the unit contributing area concept, with an extremely precise, high-resolution digital terrain model (2 m × 2 m grid) using GIS allows for a realistic assessment of potential soil erosion risk at the single-plot level, uniformly and comprehensively across the agricultural area of Switzerland (862,579 ha in the valley area and the lower mountain regions). National, small-scale soil erosion prognosis has thus reached a level of detail previously possible only in smaller catchment areas or on single plots. Validation was carried out using soil loss data from field mappings of soil erosion damage collected through long-term monitoring in different test areas. 45% of the evaluated agricultural area of Switzerland was classified as low potential erosion risk, 12% as moderate potential erosion risk, and 43% as high potential erosion risk. However, many of the areas classified as high potential erosion risk are located at the transition from the valley to the mountain zone, where much of the land is used as permanent grassland, which drastically lowers its current erosion risk. The present soil erosion risk map serves on the one hand to identify and prioritise high-erosion-risk areas, and on the other hand to promote awareness amongst farmers and authorities. It was published on the internet and will be made available to the authorities in digital form. It is intended as a tool for simplifying and standardising enforcement of the legal framework for soil erosion prevention in Switzerland. The work therefore provides a successful example of cooperation between science, policy and practice.
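The USLE/RUSLE family of models underlying this approach estimates soil loss as a product of empirical factors. A minimal sketch, with purely hypothetical factor values and illustrative risk-class thresholds (not the calibrated AVErosion parameters used for the Swiss map), might look like:

```python
# Sketch of a USLE/RUSLE-style soil-loss estimate for one plot.
# All numbers below are hypothetical, for illustration only.

def usle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr) = R * K * LS * C * P.

    R  - rainfall erosivity factor
    K  - soil erodibility factor
    LS - combined slope length/steepness factor (in AVErosion-type
         approaches derived from a high-resolution DTM with
         multiple-flow-direction contributing areas)
    C  - cover management factor
    P  - support practice factor
    """
    return R * K * LS * C * P

def risk_class(A, low=2.0, high=8.0):
    """Map a soil-loss estimate to a potential-risk class
    (thresholds are illustrative, not the map's legend)."""
    if A < low:
        return "low"
    return "moderate" if A < high else "high"

loss = usle_soil_loss(R=95.0, K=0.30, LS=1.8, C=0.15, P=1.0)
print(risk_class(loss))  # prints: moderate
```

In the mapped workflow, the same multiplication would be evaluated per 2 m × 2 m grid cell and aggregated per plot.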

Relevance:

30.00%

Publisher:

Abstract:

Cardiac troponin I (cTnI) and T (cTnT) have a high sequence homology across phyla and are sensitive and specific markers of myocardial damage. The purpose of this study was to evaluate the Cardiac Reader, a human point-of-care system for the determination of cTnT and myoglobin, and the Abbott Axsym System for the determination of cTnI and creatine kinase isoenzyme MB (CK-MB) in healthy dogs and in dogs at risk for acute myocardial damage because of gastric dilatation-volvulus (GDV) and blunt chest trauma (BCT). In healthy dogs (n = 56), cTnI was below detection limits (<0.1 microg/L) in 35 of 56 dogs (reference range 0-0.7 microg/L), and cTnT was not measurable (<0.05 ng/mL) in all but 1 dog. At presentation, cTnI, CK-MB, myoglobin, and lactic acid were all significantly higher in dogs with GDV (n = 28) and BCT (n = 8) than in control dogs (P < .001), but cTnT was significantly higher only in dogs with BCT (P = .033). Increased cTnI or cTnT values were found in 26 of 28 (highest values 1.1-369 microg/L) and 16 of 28 dogs (0.1-1.7 ng/mL) with GDV, and in 6 of 8 (2.3-82.4 microg/L) and 3 of 8 dogs (0.1-0.29 ng/mL) with BCT, respectively. In dogs suffering from GDV, cTnI and cTnT increased further within the first 48 hours (P < .001). Increased cardiac troponins suggestive of myocardial damage occurred in 93% of dogs with GDV and 75% with BCT. cTnI appeared more sensitive, but cTnT may be a negative prognostic indicator in GDV. Both systems tested seemed applicable for the measurement of canine cardiac troponins, with the Cardiac Reader particularly suitable for use in emergency settings.

Relevance:

30.00%

Publisher:

Abstract:

Quality of life (QL) is an important consideration when comparing adjuvant therapies for early breast cancer, especially if they differ substantially in toxicity. We evaluated QL and Q-TWiST among patients randomised to adjuvant dose-intensive epirubicin and cyclophosphamide administered with filgrastim and progenitor cell support (DI-EC) or standard-dose anthracycline-based chemotherapy (SD-CT). We estimated the duration of chemotherapy toxicity (TOX), time without disease symptoms and toxicity (TWiST), and time following relapse (REL). Patients scored QL indicators. Mean durations for the three transition times were weighted with patient reported utilities to obtain mean Q-TWiST. Patients receiving DI-EC reported worse QL during TOX, especially treatment burden (month 3: P<0.01), but a faster recovery 3 months following chemotherapy than patients receiving SD-CT, for example, less coping effort (P<0.01). Average Q-TWiST was 1.8 months longer for patients receiving DI-EC (95% CI, -2.5 to 6.1). Q-TWiST favoured DI-EC for most values of utilities attached to TOX and REL. Despite greater initial toxicity, quality-adjusted survival was similar or better with dose-intensive treatment as compared to standard treatment. Thus, QL considerations should not be prohibitive if future intensive therapies show superior efficacy.
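The Q-TWiST method described above weights mean state durations by patient-reported utilities. A minimal sketch, with invented durations and utilities rather than the trial's estimates, is:

```python
# Sketch of the Q-TWiST partitioned-survival calculation.
# All durations and utilities below are hypothetical.

def q_twist(tox, twist, rel, u_tox, u_rel):
    """Quality-adjusted TWiST = u_tox*TOX + TWiST + u_rel*REL.

    tox, twist, rel: mean months spent with toxicity (TOX), without
    symptoms or toxicity (TWiST), and following relapse (REL);
    utilities are in [0, 1], and TWiST has utility 1 by definition.
    """
    return u_tox * tox + twist + u_rel * rel

# Hypothetical transition times (months) for the two arms:
di_ec = q_twist(tox=6.0, twist=50.0, rel=4.0, u_tox=0.5, u_rel=0.5)
sd_ct = q_twist(tox=3.0, twist=49.0, rel=6.0, u_tox=0.5, u_rel=0.5)
print(di_ec - sd_ct)  # prints: 1.5  (quality-adjusted months)
```

Varying u_tox and u_rel over [0, 1] gives the threshold (sensitivity) analysis behind statements such as "Q-TWiST favoured DI-EC for most values of utilities attached to TOX and REL".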

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: There is no general agreement on the best indication and timing of vitrectomy in patients suffering from Terson syndrome. Therefore, we reviewed our cases in order to assess factors interfering with the functional outcome and complication rates after vitrectomy. METHODS: In this retrospective consecutive case series, the records of all patients undergoing vitrectomy for Terson syndrome between 1975 and 2005 were evaluated. RESULTS: Thirty-seven patients (45 eyes) were identified, 36 of whom (44 corresponding eyes) were eligible. The best-corrected visual acuity (BCVA) at first and last presentation was 0.07 ± 0.12 and 0.72 ± 0.31, respectively. Thirty-five eyes (79.5%) achieved a postoperative BCVA of ≥ 0.5; 26 eyes (59.1%) achieved a postoperative BCVA of ≥ 0.8. Patients operated on within 90 days of vitreous haemorrhage achieved a better final BCVA than those with a longer latency (BCVA of 0.87 ± 0.27 compared to 0.66 ± 0.31; P = 0.03). Patients younger than 45 years of age achieved a better final BCVA than older patients (0.85 ± 0.24 compared to 0.60 ± 0.33; P = 0.006). Retinal detachment developed in four patients between 6 and 27 months after surgery. Seven patients (16%) required epiretinal membrane peeling and seven required cataract surgery. CONCLUSION: Ninety-eight per cent of our patients experienced a rapid and persisting visual recovery after removal of a vitreous haemorrhage caused by Terson syndrome. A shorter interval between the occurrence of vitreous haemorrhage and surgery, as well as a younger patient age, are predictive of a better outcome. Generally, the surgical risk is low, but complications (namely retinal detachment) may occur late after surgery.

Relevance:

30.00%

Publisher:

Abstract:

The diagnosis of obliterative bronchiolitis syndrome in lung transplantation is presently best established by evaluation of postoperative lung function tests. Unfortunately, the decline in lung function occurs only when obliteration has progressed significantly and is therefore not an early predictive indicator. To distinguish patients at increased risk for the development of obliterative bronchiolitis, we regularly assessed the chemiluminescence response of polymorphonuclear leukocytes, opsonic capacity, and plasma elastase/beta-N-acetylglucosaminidase in 52 outpatients (25 women and 27 men; mean age 45 ± 12 years) who underwent transplantation between January 1991 and January 1992. Recent-onset bronchiolitis within the described observation period occurred in 16 patients (obliterative bronchiolitis group). A matched cohort of 16 patients was formed from the remaining 36 patients according to type of procedure, age and follow-up (control group). Data obtained from a period 6 months before clinical onset of the syndrome showed a significant drop in opsonic capacity (obliterative bronchiolitis group = 87% ± 7%; control = 100% ± 9%; p < 0.023) and a rise in N-acetyl-D-glucosaminidase (obliterative bronchiolitis group = 7.5 ± 2 U/L; control = 5.8 ± 1.8 U/L; p < 0.04). No correlation was found between the number of infectious events or rejection episodes and the incidence of obliterative bronchiolitis. According to these results, it can be concluded that a decrease in plasma opsonic capacity and a rise in beta-N-acetylglucosaminidase may be early markers before the clinical onset of obliterative bronchiolitis. The nonspecific immune system may therefore play an important role in the development of obliterative bronchiolitis.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with risk of bias items. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators of the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09 to 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
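A ratio of odds ratios of this kind quantifies how effect estimates change with study precision. As a much-simplified single-network sketch (the paper's Bayesian hierarchical models are more involved, and the data below are invented), a weighted meta-regression of log odds ratios on study standard error looks like:

```python
import math

# Sketch: inverse-variance weighted regression of log(OR) on the study
# standard error; exp(slope) compares studies one SE unit apart.
# Invented data, not the 32 networks analyzed in the study.

def meta_regression_slope(ors, ses):
    y = [math.log(o) for o in ors]        # effect sizes on the log scale
    x = list(ses)                         # covariate: standard error
    w = [1.0 / se ** 2 for se in ses]     # inverse-variance weights
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    return sxy / sxx

# Small (imprecise) studies reporting exaggerated effects:
slope = meta_regression_slope([1.1, 1.2, 1.6, 2.0], [0.10, 0.15, 0.40, 0.50])
ror_per_se = math.exp(slope)  # > 1: imprecise studies inflate the OR
```

A positive slope (ror_per_se > 1) is the small-study effect the paper reports: larger-variance studies produce larger apparent treatment benefits.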

Relevance:

30.00%

Publisher:

Abstract:

Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost–benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeomorphology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

Relevance:

30.00%

Publisher:

Abstract:

Dendrogeomorphology uses information sources recorded in the roots, trunks and branches of trees and bushes located in the fluvial system to complement (or sometimes even replace) systematic and palaeohydrological records of past floods. The application of dendrogeomorphic data sources and methods to palaeoflood analysis over nearly 40 years has allowed improvements to be made in frequency and magnitude estimations of past floods. Nevertheless, research carried out so far has shown that the dendrogeomorphic indicators traditionally used (mainly scar evidence), and their use to infer frequency and magnitude, have been restricted to a small, limited set of applications. New possibilities with enormous potential remain unexplored. New insights in future research of palaeoflood frequency and magnitude using dendrogeomorphic data sources should: (1) test the application of isotopic indicators (16O/18O ratio) to discover the meteorological origin of past floods; (2) use different dendrogeomorphic indicators to estimate peak flows with 2D (and 3D) hydraulic models and study how they relate to other palaeostage indicators; (3) investigate improved calibration of 2D hydraulic model parameters (roughness); and (4) apply statistics-based cost–benefit analysis to select optimal mitigation measures. This paper presents an overview of these innovative methodologies, with a focus on their capabilities and limitations in the reconstruction of recent floods and palaeofloods.

Relevance:

30.00%

Publisher:

Abstract:

Subclinical thyroid dysfunction has been associated with coronary heart disease, but the risk of stroke is unclear. Our aim is to combine the evidence on the association between subclinical thyroid dysfunction and the risk of stroke in prospective cohort studies. We searched Medline (OvidSP), Embase, Web-of-Science, Pubmed Publisher, Cochrane and Google Scholar from inception to November 2013 using a cohort filter, but without language restriction or other limitations. Reference lists of articles were searched. Two independent reviewers screened articles according to pre-specified criteria and selected prospective cohort studies with baseline thyroid function measurements and assessment of stroke outcomes. Data were derived using a standardized data extraction form. Quality was assessed according to previously defined quality indicators by two independent reviewers. We pooled the outcomes using a random-effects model. Of 2,274 articles screened, six cohort studies, including 11,309 participants with 665 stroke events, met the criteria. Four of six studies provided information on subclinical hyperthyroidism including a total of 6,029 participants and five on subclinical hypothyroidism (n = 10,118). The pooled hazard ratio (HR) was 1.08 (95% CI 0.87-1.34) for subclinical hypothyroidism (I² = 0%) and 1.17 (95% CI 0.54-2.56) for subclinical hyperthyroidism (I² = 67%) compared to euthyroidism. Subgroup analyses yielded similar results. Our systematic review provides no evidence supporting an increased risk for stroke associated with subclinical thyroid dysfunction. However, the available literature is insufficient and larger datasets are needed to perform extended analyses. Also, there were insufficient events to exclude clinically significant risk from subclinical hyperthyroidism, and more data are required for subgroup analyses.
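The pooled HRs above come from a random-effects model. A minimal DerSimonian-Laird sketch on the log scale, with invented study inputs rather than the six cohorts' data, is:

```python
import math

# Sketch of DerSimonian-Laird random-effects pooling of hazard ratios.
# Inputs are invented, not the cohort studies summarized above.

def pool_random_effects(hrs, ses):
    """Pool HRs given their standard errors on the log scale.

    Returns (pooled HR, I^2 in percent).
    """
    y = [math.log(h) for h in hrs]
    w = [1.0 / se ** 2 for se in ses]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]     # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

hr, i2 = pool_random_effects([1.05, 0.95, 1.20], [0.15, 0.20, 0.25])
# These invented studies agree well, so tau^2 = 0 and I^2 = 0%.
```

With tau² = 0 the random-effects weights reduce to the fixed-effect weights; heterogeneous inputs (like the I² = 67% hyperthyroidism set) would widen the pooled confidence interval instead.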

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: There is little knowledge in the literature on the role of time-related variables in the prognosis of acute and subacute low back pain (LBP). OBJECTIVE: The aim of this study was to estimate the relationship between time-related LBP characteristics and prognostic factors for acute/subacute LBP. METHODS: We performed a prospective inception cohort study of 315 patients attending a health practitioner for acute/subacute LBP or recurrent LBP. One-tailed correlations were conducted between patient characteristics and time-related variables. RESULTS: The pattern of correlation between risk factors for, and resources against, persistent LBP differed across three time-related variables. 'Subacute LBP' and 'delayed presentation' were positively associated with psychological factors. Both indicators were negatively correlated with resources against the development of persistent LBP. Moreover, 'delayed presentation' was positively related to occupational stressors. In contrast, 'recurrent LBP' was related only to more impaired health-related factors. CONCLUSIONS: Patients with current LBP who wait longer before seeking help in primary care have a more disadvantageous profile of occupational and psychological risk factors and lower resource levels. A similar but less pronounced pattern occurred in those with subacute LBP compared to those with acute LBP. Consideration of the time characteristics of LBP may help to better understand LBP.