38 results for "Return on preventive measures"
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
PURPOSE To systematically appraise whether anti-infective protocols are effective in preventing biologic implant complications and implant loss after a mean observation period of ≥ 10 years after loading. MATERIALS AND METHODS An electronic search of the Medline via PubMed and Embase via Ovid databases, complemented by a manual search, was conducted up to October 31, 2012. Studies were included provided that they were published in English, German, French, or Italian, and conducted on ≥ 20 partially or fully edentulous patients with dental implants and regular (≥ 1×/year) supportive periodontal therapy (SPT) over a mean observation period of ≥ 10 years. Assessment of the identified studies and data extraction were performed independently by two reviewers. Authors were contacted if required. Collected data were reported by descriptive methods. RESULTS The initial electronic search identified 994 titles from Medline via PubMed and 531 titles from Embase via Ovid. After elimination of duplicate titles and exclusion of 60 full-text articles, 143 articles were analyzed, yielding 15 studies eligible for qualitative analysis. The implant survival rate ranged from 85.7% to 99.2% after a mean observation period of ≥ 10 years. One comparative study assessed the effects of regular SPT on the occurrence of biologic complications and implant loss. Overall, regular diagnosis and implementation of anti-infective therapeutic protocols were effective in the management of biologic complications and prevention of implant loss. Residual probing depths at the end of active periodontal therapy and development of reinfection during SPT represented a significant risk for the onset of peri-implantitis and implant loss. Comparative studies indicated that implant survival and success rates were lower in periodontally compromised vs noncompromised patients.
CONCLUSIONS In order to achieve high long-term survival and success rates of dental implants and their restorations, enrollment in regular SPT including anti-infective preventive measures should be implemented. Therapy of peri-implant mucositis should be considered as a preventive measure for the onset of peri-implantitis. Completion of active periodontal therapy should precede implant placement in periodontally compromised patients.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects. Thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a distinct period of time. Then a dentist can determine the erosive potential of the diet. Particularly, patients with more than four dietary acid intakes have a higher risk for erosion when other risk factors (such as holding the drink in the mouth) are present. Regurgitation of gastric acids (reflux, vomiting, alcohol abuse, etc.) is a further important risk factor for the development of erosion that has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patients. It may comprise dietary advice, optimization of fluoride regimes, stimulation of salivary flow rate, use of buffering medicaments and particular motivation for nondestructive toothbrushing habits with a low-abrasive toothpaste. The frequent use of fluoride gel and fluoride solution in addition to fluoride toothpaste offers the opportunity to somewhat reduce abrasion of tooth substance. It is also advisable to avoid abrasive tooth cleaning and whitening products, since they may remove the pellicle and render teeth more susceptible to erosion. Since erosion, attrition and abrasion often occur simultaneously, all causative components must be taken into consideration when planning preventive strategies.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects. Thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a distinct period of time. Then a dentist can determine the erosive potential of the diet. A table with common beverages and foodstuffs is presented for judging the erosive potential. Particularly, patients with more than four dietary acid intakes have a higher risk for erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion that has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patients. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimes, stimulation of salivary flow rate, use of buffering medicaments and particular motivation for nondestructive toothbrushing habits with an erosion-protecting toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all of the causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for an individual should be communicated to the patient.
Abstract:
Background. To explore effects of a health risk appraisal for older people (HRA-O) program with reinforcement, we conducted a randomized controlled trial in 21 general practices in Hamburg, Germany. Methods. Overall, 2,580 older patients of 14 general practitioners trained in reinforcing recommendations related to HRA-O-identified risk factors were randomized into intervention (n = 878) and control (n = 1,702) groups. Patients (n = 746) of seven additional matched general practitioners who did not receive this training served as a comparison group. Patients allocated to the intervention group, and their general practitioners, received computer-tailored written recommendations, and patients were offered the choice between interdisciplinary group sessions (geriatrician, physiotherapist, social worker, and nutritionist) and home visits (nurse). Results. Among the intervention group, 580 (66%) persons made use of personal reinforcement (group sessions: 503 [87%], home visits: 77 [13%]). At 1-year follow-up, persons in the intervention group had higher use of preventive services (eg, influenza vaccinations, adjusted odds ratio 1.7; 95% confidence interval 1.4–2.1) and more favorable health behavior (eg, high fruit/fiber intake, odds ratio 2.0; 95% confidence interval 1.6–2.6), as compared with controls. Comparisons between intervention and comparison group data revealed similar effects, suggesting that physician training alone had no effect. Subgroup analyses indicated favorable effects for HRA-O with personal reinforcement, but not for HRA-O without reinforcement. Conclusions. HRA-O combined with physician training and personal reinforcement had favorable effects on preventive care use and health behavior.
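The trial above reports its effects as adjusted odds ratios with 95% confidence intervals (e.g. OR 1.7; 95% CI 1.4–2.1). As an illustration of how such a figure is derived, here is a minimal sketch of an unadjusted odds ratio with a 95% Wald confidence interval from a 2×2 table; the counts below are hypothetical and not the study's data (the paper reports adjusted estimates from regression models):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(300, 578, 400, 1302)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Adjusted odds ratios additionally control for covariates via logistic regression, so they generally differ from this crude table-based estimate.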
Abstract:
Despite changes in patient demographics and shortened length of hospital stay, deep vein thrombosis (DVT) remains a major health care problem which may lead to a variety of other high-risk complications. Current treatment guidelines focus on preventive measures. Besides drug therapy, physical measures executed by nursing professionals exist, whose outcomes are discussed controversially. Based on 25 studies found in MEDLINE and the Cochrane Library, this systematic literature review evaluates the effectiveness of intermittent pneumatic compression (IPC) for thrombosis prophylaxis. In almost all medical settings, IPC contributes to a significant reduction in the incidence of DVT. At the same time, IPC has minimal negative side effects and is also cost-effective. Correct application of IPC and patient compliance are essential to achieve its effectiveness. An increased awareness within the healthcare team in identifying the risk for and implementing measures against DVT is needed. Guidelines need to be developed in order to improve the effectiveness of thrombosis prophylaxis with the implementation of IPC.
Abstract:
It has been suggested that there are several distinct phenotypes of childhood asthma or childhood wheezing. Here, we review the research relating to these phenotypes, with a focus on the methods used to define and validate them. Childhood wheezing disorders manifest themselves in a range of observable (phenotypic) features such as lung function, bronchial responsiveness, atopy and a highly variable time course (prognosis). The underlying causes are not sufficiently understood to define disease entities based on aetiology. Nevertheless, there is a need for a classification that would (i) facilitate research into aetiology and pathophysiology, (ii) allow targeted treatment and preventive measures and (iii) improve the prediction of long-term outcome. Classical attempts to define phenotypes have been one-dimensional, relying on single or few features such as triggers (exclusive viral wheeze vs. multiple-trigger wheeze) or time course (early transient wheeze, persistent and late-onset wheeze). These definitions are simple but essentially subjective. Recently, a multi-dimensional approach has been adopted. This approach is based on a wide range of features and relies on multivariate methods such as cluster or latent class analysis. Phenotypes identified in this manner are more complex but arguably more objective. Although phenotypes have an undisputed standing in current research on childhood asthma and wheezing, there is confusion about the meaning of the term 'phenotype', causing much circular debate. If phenotypes are meant to represent 'real' underlying disease entities rather than superficial features, there is a need for validation and harmonization of definitions. The multi-dimensional approach allows validation by replication across different populations and may contribute to a more reliable classification of childhood wheezing disorders and to improved precision of research relying on phenotype recognition, particularly in genetics.
Ultimately, the underlying pathophysiology and aetiology will need to be understood to properly characterize the diseases causing recurrent wheeze in children.
Abstract:
This is a pilot study whose objective was to collect data on attempted suicide in 5 districts of Shanghai and to test the feasibility of introducing an ongoing monitoring of attempted suicide. Data on a total of 363 cases were collected. The mean age of the patients was 33 years, 67% being female. Ingesting drugs or other chemical substances was the main method used for self-harm. Reasons for attempted suicide in these districts of Shanghai often appear to be related to family conflicts and unemployment. In spite of methodological limitations, the recorded data allow some preliminary conclusions regarding the characteristics of patients in districts of Shanghai admitted after a suicide attempt. Continuous monitoring of attempted suicide in this urban area of China should be established and data collection improved to raise awareness in health professionals and to develop preventive measures geared toward the needs of these patients.
Abstract:
INTRODUCTION: Winter sports have evolved from an upper-class activity to a mass industry. Sledging in particular regained popularity at the start of this century, with more and more winter sports resorts offering sledge runs. This study investigated the rates of sledging injuries over the last 13 years and analysed injury patterns specific to certain age groups, enabling us to make suggestions for preventive measures. METHODS: We present a retrospective analysis of prospectively collected data. From 1996/1997 to 2008/2009, all patients with sledging injuries were recorded upon admission to a Level III trauma centre. Injuries were classified into body regions according to the Abbreviated Injury Scale (AIS). The Injury Severity Score (ISS) was calculated. Patients were stratified into 7 age groups. Associations between age and injured body region were tested using the chi-squared test. The slope of the linear regression with 95% confidence intervals was calculated for the proportion of patients with different injured body regions per winter season. RESULTS: 4956 winter sports patients were recorded. 263 patients (5%) sustained sledging injuries. Sledging injury patients had a median age of 22 years (interquartile range [IQR] 14-38 years) and a median ISS of 4 (IQR 1-4). 136 (51.7%) were male. Injuries (AIS≥2) were most frequent to the lower extremities (n=91, 51.7% of all AIS≥2 injuries), followed by the upper extremities (n=48, 27.3%), the head (n=17, 9.7%) and the spine (n=7, 4.0%). AIS≥2 injuries to different body regions varied from season to season, with no significant trends (p>0.19). However, the number of patients admitted with AIS≥2 injuries increased significantly over the seasons analysed (p=0.031), as did the number of patients with any kind of sledging injury (p=0.004). Mild head injuries were most frequent in the youngest age group (1-10 years old). Injuries to the lower extremities were more often seen in the age groups from 21 to 60 years (p<0.001).
CONCLUSION: Mild head trauma was mainly found in very young sledgers, and injuries to the lower extremities were more frequent in adults. In accordance with the current literature, we suggest that sledging should be performed in designated, obstacle-free areas that are specially prepared, and that children should always be supervised by adults. The effect of routine use of helmets and other protective devices needs further evaluation, but it seems evident that these should be obligatory on official runs.
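The abstract above tests associations between age group and injured body region with the chi-squared test. As a minimal sketch of that statistic (with hypothetical counts, not the study's data), Pearson's chi-squared for a contingency table can be computed as:

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: age group (young / adult) vs
# lower-extremity injury (yes / no); illustration only.
stat = chi_squared([[20, 80], [71, 92]])
print(f"chi2 = {stat:.2f}")  # compare against 3.84 (df=1, alpha=0.05)
```

A statistic above the critical value for the table's degrees of freedom corresponds to a significant association; published analyses typically report the exact p-value from the chi-squared distribution instead.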
Abstract:
Background Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional and do not allow for assessing chronology. There are very few prospective studies that explore the predictors for the onset of CWP, where the main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on preventive factors keeping patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources like resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify the risk and protective factors of pain generalization (development of CWP) in primary care for CLBP patients. Methods/Design Fifty-nine general practitioners recruit, over a 5-month period, all consecutive patients who consult their family doctor because of chronic low back pain (where the pain has lasted for 3 months). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, co-morbidities, therapy course, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened on a regular basis (every 3 months) for pain generalization (outcome: incident CWP).
Discussion This cohort study will be the largest study to prospectively analyze predictors for the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease the probability of pain generalization.
Abstract:
INTRODUCTION: Surgical site infections (SSI) are the most common hospital-acquired infections among surgical patients, with significant impact on patient morbidity and health care costs. The Basel SSI Cohort Study was performed to evaluate risk factors and validate current preventive measures for SSI. The objective of the present article was to review the main results of this study and its implications for clinical practice and future research. SUMMARY OF METHODS OF THE BASEL SSI COHORT STUDY: The prospective observational cohort study included 6,283 consecutive general surgery procedures closely monitored for evidence of SSI up to 1 year after surgery. The dataset was analysed for the influence of various potential SSI risk factors, including timing of surgical antimicrobial prophylaxis (SAP), glove perforation, anaemia, transfusion and tutorial assistance, using multiple logistic regression analyses. In addition, post hoc analyses were performed to assess the economic burden of SSI, the efficiency of the clinical SSI surveillance system, and the spectrum of SSI-causing pathogens. REVIEW OF MAIN RESULTS OF THE BASEL SSI COHORT STUDY: The overall SSI rate was 4.7% (293/6,283). While SAP was administered in most patients between 44 and 0 minutes before surgical incision, the lowest risk of SSI was recorded when the antibiotics were administered between 74 and 30 minutes before surgery. Glove perforation in the absence of SAP increased the risk of SSI (OR 2.0; CI 1.4-2.8; p <0.001). No significant association was found for anaemia, transfusion and tutorial assistance with the risk of SSI. The mean additional hospital cost in the event of SSI was CHF 19,638 (95% CI, 8,492-30,784). The surgical staff documented only 49% of in-hospital SSI; the infection control team registered the remaining 51%. Staphylococcus aureus was the most common SSI-causing pathogen (29% of all SSI with documented microbiology). 
No case of an antimicrobial-resistant pathogen was identified in this series. CONCLUSIONS: The Basel SSI Cohort Study suggested that SAP should be administered between 74 and 30 minutes before surgery. Due to the observational nature of these data, corroboration is planned in a randomized controlled trial, which is supported by the Swiss National Science Foundation. Routine change of gloves or double gloving is recommended in the absence of SAP. Anaemia, transfusion and tutorial assistance do not increase the risk of SSI. The substantial economic burden of in-hospital SSI has been confirmed. SSI surveillance by the surgical staff detected only half of all in-hospital SSI, which prompted the introduction of an electronic SSI surveillance system at the University Hospital of Basel and the Cantonal Hospital of Aarau. Due to the absence of multiresistant SSI-causing pathogens, the continuous use of single-shot single-drug SAP with cefuroxime (plus metronidazole in colorectal surgery) has been validated.
Abstract:
There is some evidence that the prevalence of erosion is growing steadily. Because of different scoring systems, samples and examiners, it is difficult to compare and judge the outcomes of the studies. Preschool children aged between 2 and 5 years showed erosion on deciduous teeth in 6-50% of the subjects. Young schoolchildren (aged 5-9) already had erosive lesions on permanent teeth in 14% of the cases. In the adolescent group (aged between 9 and 17), 11-100% of the young people examined showed signs of erosion. Incidence data (= increase of subjects with erosion) evaluated in three of these studies were 12% over 2 years, 18% over 5 years and 27% over 1.5 years. In adults (aged between 18 and 88), prevalence data ranged between 4 and 82%. Incidence data are scarce; only one study was found, and this showed an incidence of 5% for the younger and 18% for the older examined group (= increase of tooth surfaces with erosion). Prevalence data indicated that males had somewhat more erosive tooth wear than females. The distribution of erosion showed a predominance of occlusal surfaces (especially mandibular first molars), followed by facial surfaces (anterior maxillary teeth). Oral erosion was frequently found on maxillary incisors and canines. Overall, prevalence data are not homogeneous. Nevertheless, there is already a trend toward a more pronounced rate of erosion in younger age groups. Therefore, it is important to detect at-risk patients early to initiate adequate preventive measures.
Abstract:
Waterproofing agents are widely used to protect leather and textiles in both domestic and occupational activities. An outbreak of acute respiratory syndrome following exposure to waterproofing sprays occurred during the winter of 2002-2003 in Switzerland. About 180 cases were reported to the Swiss Toxicological Information Centre between October 2002 and March 2003, whereas fewer than 10 cases per year had been recorded previously. The reported cases involved three brands of sprays containing a common waterproofing mixture that had undergone a formulation change in the months preceding the outbreak. A retrospective analysis was undertaken in collaboration with the Swiss Toxicological Information Centre and the Swiss Registries for Interstitial and Orphan Lung Diseases to clarify the circumstances and possible causes of the observed health effects. Individual exposure data were generated with questionnaires and experimental emission measurements. The collected data were used to conduct numerical simulations for 102 cases of exposure. A classical two-zone model was used to assess the aerosol dispersion in the near- and far-field during spraying. The resulting dose and exposure estimates were spread over several orders of magnitude. No dose-response relationship was found between exposure indicators and health-effect indicators (perceived severity and clinical indicators). Weak relationships were found between unspecific inflammatory response indicators (leukocytes, C-reactive protein) and the maximal exposure concentration. The results disclose a high interindividual response variability and suggest that some indirect mechanism(s) predominates in the occurrence of the respiratory disease. Furthermore, no threshold could be found to define a safe level of exposure.
These findings suggest that the improvement of environmental exposure conditions during spraying alone does not constitute a sufficient measure to prevent future outbreaks of waterproofing spray toxicity. More efficient preventive measures are needed prior to the marketing and distribution of new waterproofing agents.
Abstract:
Dental erosion is often described solely as a surface phenomenon, unlike caries, where it has been established that the destructive effects involve both the surface and the subsurface region. However, besides removal and softening of the surface, erosion may involve dissolution of mineral underneath the surface. There is some evidence that the prevalence of this condition is growing steadily. Hence, erosive tooth wear is becoming increasingly significant in the management of the long-term health of the dentition. What is considered an acceptable amount of wear depends on the anticipated lifespan of the dentition and is, therefore, different for deciduous compared to permanent teeth. However, erosive damage to the permanent teeth occurring in childhood may compromise the growing child's dentition for their entire lifetime and may require repeated and increasingly complex and expensive restoration. Therefore, it is important that diagnosis of the tooth wear process in children and adults is made early and adequate preventive measures are undertaken. These measures can only be initiated when the risk factors and the interactions between them are known. A scheme is proposed which allows the possible risk factors and their relation to each other to be examined.
Abstract:
In the dual ex vivo perfusion of an isolated human placental cotyledon, it takes on average 20-30 min to set up stable perfusion circuits for the maternal and fetal vascular compartments. In vivo, placental tissue of all species maintains a highly active metabolism, and it continues to puzzle investigators how this tissue can survive 30 min of ischemia with more or less complete anoxia following expulsion of the organ from the uterus, and do so without severe damage. There seem to be parallels between the "depressed metabolism" seen in the fetus and the immature neonate in the peripartum period and the survival strategies described in mammals with increased tolerance of severe hypoxia, such as hibernators in the state of torpor or deep-sea diving turtles. Increased tolerance of hypoxia in both is explained by "partial metabolic arrest" in the sense of a temporary suspension of Kleiber's rule. Furthermore, the fetus can react to major changes in surrounding oxygen tension by decreasing or increasing the rate of specific basal metabolism, providing protection against severe hypoxia as well as oxidative stress. There is some evidence that adaptive mechanisms allowing increased tolerance of severe hypoxia in the fetus or immature neonate can also be found in placental tissue, of which at least the villous portion is of fetal origin. A better understanding of the molecular details of reprogramming of fetal and placental tissues in late pregnancy may be of clinical relevance for an improved risk assessment of the individual fetus during the critical transition from intrauterine life to the outside and for the development of potential prophylactic measures against severe ante- or intrapartum hypoxia. Responses of the tissue to reperfusion deserve intensive study, since they may provide a rational basis for preventive measures against reperfusion injury and related oxidative stress.
Modification of the handling of placental tissue during postpartum ischemia, and adaptation of the artificial reperfusion, may lead to an improvement of the ex vivo perfusion technique.