100 results for preventative measures
Abstract:
BACKGROUND AND PURPOSE We report on workflow and process-based performance measures and their effect on clinical outcome in Solitaire FR Thrombectomy for Acute Revascularization (STAR), a multicenter, prospective, single-arm study of Solitaire FR thrombectomy in patients with large-vessel anterior circulation stroke. METHODS Two hundred two patients were enrolled across 14 centers in Europe, Canada, and Australia. The following time intervals were measured: stroke onset to hospital arrival, hospital arrival to baseline imaging, baseline imaging to groin puncture, groin puncture to first stent deployment, and first stent deployment to reperfusion. Effects of time of day, general anesthesia use, and multimodal imaging on workflow were evaluated. Patient characteristics and workflow processes associated with prolonged interval times and good clinical outcome (90-day modified Rankin score, 0-2) were analyzed. RESULTS Median times were: onset of stroke to hospital arrival, 123 minutes (interquartile range, 163 minutes); hospital arrival to Thrombolysis in Cerebral Infarction (TICI) 2b/3 reperfusion or final digital subtraction angiography, 133 minutes (interquartile range, 99 minutes); and baseline imaging to groin puncture, 86 minutes (interquartile range, 24 minutes). Time from baseline imaging to puncture was prolonged in patients receiving intravenous tissue-type plasminogen activator (32-minute mean delay) and when magnetic resonance-based imaging was used at baseline (18-minute mean delay). Extracranial carotid disease delayed the puncture-to-first-stent-deployment time by 25 minutes on average. For each 1-hour increase in stroke onset to final digital subtraction angiography (or TICI 2b/3) time, the odds of good clinical outcome decreased by 38%. CONCLUSIONS Interval times in the STAR study reflect current intra-arterial therapy for patients with acute ischemic stroke. Improving workflow metrics can further improve clinical outcomes. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT01327989.
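As a worked reading of the reported effect size (an illustration only, assuming the per-hour effect is constant on the odds scale, as in a standard logistic model; the abstract does not give the exact model specification): a 38% decrease per hour corresponds to a per-hour odds ratio of 0.62, which compounds multiplicatively over time,

$$\mathrm{OR}(t\ \text{hours}) = (1 - 0.38)^{t} = 0.62^{t}, \qquad \text{e.g.}\ \mathrm{OR}(3\ \text{hours}) = 0.62^{3} \approx 0.24.$$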
Abstract:
There is a need for accurate predictions of ecosystem carbon (C) and water fluxes in field conditions. Previous research has shown that ecosystem properties can be predicted from community abundance-weighted means (CWM) of plant functional traits and from measures of trait variability within a community (FDvar). The capacity of traits to predict C and water fluxes, and the seasonal dependency of these trait-function relationships, have not been fully explored. Here we measured daytime C and water fluxes over four seasons in grasslands of a range of successional ages in southern England. In a model selection procedure, we related these fluxes to environmental covariates and plant biomass measures before adding CWM and FDvar plant trait measures that were scaled up from measurements of individual plants grown in greenhouse conditions. Models describing fluxes in periods of low biological activity contained few predictors, which were usually abiotic factors. In more biologically active periods, models contained more predictors, including plant trait measures. Field-based plant biomass measures were generally better predictors of fluxes than CWM and FDvar traits. However, when these measures were used in combination, traits accounted for additional variation. Where traits were significant predictors, their identity often reflected seasonal vegetation dynamics. These results suggest that database-derived trait measures can improve the prediction of ecosystem C and water fluxes. Controlled studies and studies involving more detailed flux measurements are required to validate and explore these findings, a worthwhile effort given the potential for using simple vegetation measures to help predict landscape-scale fluxes.
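To make the CWM and FDvar measures concrete, here is a minimal Python sketch (the function names and example data are hypothetical; the FDvar form shown is the log-variance definition with arctangent scaling commonly attributed to Mason et al., which may differ from the authors' exact implementation):

```python
import math

def cwm(abundances, traits):
    """Community abundance-weighted mean: sum of relative abundance x trait value."""
    total = sum(abundances)
    return sum(a / total * t for a, t in zip(abundances, traits))

def fdvar(abundances, traits):
    """Abundance-weighted variance of log trait values (one common FDvar form;
    assumption: the arctangent scaling bounds the index in [0, 1))."""
    total = sum(abundances)
    p = [a / total for a in abundances]
    log_t = [math.log(t) for t in traits]
    mean_log = sum(pi * lt for pi, lt in zip(p, log_t))
    v = sum(pi * (lt - mean_log) ** 2 for pi, lt in zip(p, log_t))
    return (2 / math.pi) * math.atan(5 * v)

# Hypothetical example: three species, relative biomass and a leaf trait value
print(cwm([0.5, 0.3, 0.2], [12.0, 20.0, 35.0]))    # biomass-weighted trait mean
print(fdvar([0.5, 0.3, 0.2], [12.0, 20.0, 35.0]))  # trait variability in the community
```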
Abstract:
Excessive runoff and soil erosion in the upper Blue Nile Basin pose a threat that has attracted the attention of the Ethiopian government because of serious on-site effects in addition to downstream effects, such as the siltation of water harvesting structures and reservoirs. The objective of this study was to evaluate and recommend effective biophysical soil and water conservation measures in the Debre Mewi watershed, about 30 km south of Lake Tana. Six conservation measures were evaluated for their effects on runoff, soil loss, and forage yield using runoff plots. There was a significant difference between treatments for both runoff and soil loss. The four-year average annual soil loss in the different plots ranged from 26 to 71 t ha⁻¹, and total runoff ranged from 180 to 302 mm, while annual rainfall varied between 854 mm in 2008 and 1247 mm in 2011. Soil bunds combined with elephant grass had the lowest runoff and soil loss compared with the other treatments, whereas the untreated control plot had the highest values for both parameters. As an additional benefit, 2.8 and 0.7 t ha⁻¹ year⁻¹ of dried forage were obtained from elephant and local grasses, respectively. Furthermore, soil bunds combined with Tephrosia increased soil organic matter by 13% compared with the control plot. Soil bund efficiency was significantly enhanced by combining bunds with biological measures, and this combination improved farmers' perception of soil and water conservation measures.
Abstract:
AIM To characterize the subgingival microbiota within a cohort of adult males (n = 32) naïve to oral hygiene practices, and to compare the composition of bacterial taxa present in periodontal sites with various probing depths. MATERIAL AND METHODS Subgingival plaque samples were collected from a single shallow-pocket site [pocket probing depth (PPD) ≤3 mm] and a single deep-pocket site (PPD ≥6 mm) from each subject. A polymerase chain reaction (PCR)-based strategy was used to construct a clone library of 16S ribosomal RNA (rRNA) genes for each site. The sequences of ca. 30-60 plasmid clones were determined for each site to identify resident taxa. Microbial composition was compared using a variety of statistical and bioinformatics approaches. RESULTS A total of 1887 cloned 16S rRNA gene sequences were analysed and assigned to 318 operational taxonomic units (OTUs; 98% identity cut-off). The subgingival microbiota was dominated by Firmicutes (69.8%), Proteobacteria (16.3%), and Fusobacteria (8.0%). The overall composition of microbial communities in shallow sites was significantly different from that in deep sites (∫-LIBSHUFF, p < 0.001). CONCLUSIONS A taxonomically diverse subgingival microbiota was present within this cohort; however, the structures of the microbial communities in the respective subjects exhibited limited variation. Deep and shallow sites contained notably different microbial compositions, but this was not correlated with the rate of periodontal progression.
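As an illustration of the 98% identity cut-off used for OTU assignment (a toy sketch only: real pipelines align sequences before computing pairwise identity, whereas the position-wise comparison below assumes equal-length, pre-aligned reads, and the greedy binning shown is just one common strategy):

```python
def identity(a: str, b: str) -> float:
    """Fraction of matching positions; assumes pre-aligned, equal-length sequences."""
    matches = sum(x == y for x, y in zip(a, b))
    return matches / max(len(a), len(b))

def assign_otus(seqs, cutoff=0.98):
    """Greedy binning: each sequence joins the first OTU whose representative
    it matches at >= cutoff identity, otherwise it founds a new OTU."""
    representatives, otus = [], []
    for s in seqs:
        for i, rep in enumerate(representatives):
            if identity(s, rep) >= cutoff:
                otus[i].append(s)
                break
        else:
            representatives.append(s)
            otus.append([s])
    return otus

# Toy reads (hypothetical): the first two differ at one of 50 positions (98%)
reads = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTAC",
         "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTAT",
         "TTTTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTAC"]
print(len(assign_otus(reads)))  # 2 OTUs at the 98% cut-off
```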
Abstract:
PURPOSE To systematically appraise whether anti-infective protocols are effective in preventing biologic implant complications and implant loss over a mean observation period of ≥10 years after loading. MATERIALS AND METHODS An electronic search of the Medline (via PubMed) and Embase (via Ovid) databases, complemented by a manual search, was conducted up to October 31, 2012. Studies were included provided that they were published in English, German, French, or Italian, and were conducted on ≥20 partially or fully edentulous patients with dental implants receiving regular (≥1×/year) supportive periodontal therapy (SPT) over a mean observation period of ≥10 years. Assessment of the identified studies and data extraction were performed independently by two reviewers, and authors were contacted when clarification was required. Collected data were reported by descriptive methods. RESULTS The initial electronic search identified 994 titles from Medline and 531 titles from Embase. After elimination of duplicate titles and exclusion of 60 full-text articles, 143 articles were analyzed, yielding 15 studies eligible for qualitative analysis. The implant survival rate ranged from 85.7% to 99.2% after a mean observation period of ≥10 years. One comparative study assessed the effects of regular SPT on the occurrence of biologic complications and implant loss. Overall, regular diagnosis and implementation of anti-infective therapeutic protocols were effective in the management of biologic complications and the prevention of implant loss. Residual probing depths at the end of active periodontal therapy and the development of reinfection during SPT represented significant risks for the onset of peri-implantitis and implant loss. Comparative studies indicated that implant survival and success rates were lower in periodontally compromised than in noncompromised patients. CONCLUSIONS To achieve high long-term survival and success rates of dental implants and their restorations, enrollment in regular SPT, including anti-infective preventive measures, should be implemented. Therapy of peri-implant mucositis should be considered a preventive measure against the onset of peri-implantitis. Completion of active periodontal therapy should precede implant placement in periodontally compromised patients.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects; thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may make timely preventive measures difficult. To assess the risk factors, patients should record their dietary intake for a defined period of time, after which a dentist can determine the erosive potential of the diet. A table of common beverages and foodstuffs is presented for judging erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk of erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion that has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patient. It may comprise dietary advice, the use of calcium-enriched beverages, optimization of prophylactic regimens, stimulation of salivary flow rate, use of buffering medicaments, and particular motivation toward nondestructive toothbrushing habits with an erosion-protecting toothpaste and rinsing solutions. Since erosion and abrasion often occur simultaneously, all causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for an individual should be communicated to the patient.
Abstract:
Background: According to the World Health Organization, stroke is the 'incoming epidemic of the 21st century'. In light of recent data suggesting that 85% of all strokes may be preventable, strategies for prevention are moving to the forefront of stroke management. Summary: This review discusses the risk factors for stroke and provides evidence on effective medical interventions and lifestyle modifications for optimal stroke prevention. Key Messages: Stroke risk can be substantially reduced by combining the medical measures proven in many randomized trials with effective lifestyle modifications; global modification of health and lifestyle is more beneficial than treatment of individual risk factors. Clinical Implications: Hypertension is the most important modifiable risk factor for stroke. Effective reduction of blood pressure is essential for stroke prevention and matters more than the choice of antihypertensive drug. Indications for the use of antihypertensive drugs depend on blood pressure values and the vascular risk profile; thus, treatment should be initiated earlier in patients with diabetes mellitus or a high vascular risk profile. Treatment of dyslipidemia with statins, anticoagulation in atrial fibrillation, and carotid endarterectomy for symptomatic high-grade carotid stenosis are also effective for stroke prevention. Lifestyle factors proven to reduce stroke risk include reducing salt intake, smoking cessation, regular physical activity, and maintaining a normal body weight.
Abstract:
OBJECTIVE The cause precipitating intracranial aneurysm rupture remains unknown in many cases. It has been observed that aneurysm ruptures cluster in time, but the trigger mechanism remains obscure. Because solar activity has been associated with cardiovascular mortality and morbidity, we studied its association with aneurysm rupture in the Swiss population. METHODS Patient data were extracted from the Swiss SOS database, which at the time of analysis covered 918 consecutive patients with angiography-proven aneurysmal subarachnoid hemorrhage treated at 7 Swiss neurovascular centers between January 1, 2009, and December 31, 2011. The daily rupture frequency (RF) was correlated with the absolute amount of, and the change in, various parameters representing continuous measurements of solar activity (radioflux [F10.7 index], solar proton flux, solar flare occurrence, planetary K-index/planetary A-index, Space Environment Services Center [SESC] sunspot number, and sunspot area) using Poisson regression analysis. RESULTS During the period of interest, there were 517 days without a recorded aneurysm rupture, and 398, 139, 27, 12, 1, and 1 days with 1, 2, 3, 4, 5, and 6 ruptures per day, respectively. Poisson regression analysis demonstrated a significant correlation between the F10.7 index and RF (incidence rate ratio [IRR] = 1.006303; standard error [SE] = 0.0013201; 95% confidence interval [CI], 1.003719-1.008894; P < 0.001), according to which every 1-unit increase of the F10.7 index increased the expected daily count of aneurysm ruptures by 0.63%. Likewise statistically significant relationships emerged for both the SESC sunspot number (IRR = 1.003413; SE = 0.0007913; 95% CI, 1.001864-1.004965; P < 0.001) and the sunspot area (IRR = 1.000419; SE = 0.0000866; 95% CI, 1.000249-1.000589; P < 0.001). All other variables analyzed showed no significant correlation with RF. CONCLUSIONS We found greater radioflux, SESC sunspot number, and sunspot area to be associated with an increased count of aneurysm ruptures. The clinical meaningfulness of this statistical association must be interpreted carefully, and future studies are warranted to rule out a type I error.
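As a worked reading of the incidence rate ratios (illustrative arithmetic only; extrapolating across many units assumes the log-linear Poisson model holds over the whole range):

$$(\mathrm{IRR} - 1)\times 100\% = (1.006303 - 1)\times 100\% \approx 0.63\%\ \text{per unit of F10.7},$$
$$\mathrm{IRR}^{100} = 1.006303^{100} \approx 1.87\ \text{for a 100-unit increase}.$$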
Abstract:
BACKGROUND Assessing the proportion of patients with well-controlled cardiovascular risk factors underestimates the proportion of patients receiving high-quality care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician responses to poor control of cardiovascular risk factors, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care, or financial incentives to improve quality. METHODS We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia, and diabetes mellitus, we first measured the proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or a return to control without therapy modification, within 12 months among patients with poor control at baseline. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS 20% of patients with hypertension, 41% with dyslipidemia, and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into the measurement of quality of care, 52% to 55% of patients had appropriate quality of care. Over 12 months, therapy was modified for 61% of patients with poor baseline control of hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in the number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to receive appropriate clinical action. We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus, and 3-6% for dyslipidemia. CONCLUSIONS In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing the proportions in control, provides a broader view of quality of care than relying solely on proportions in control. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
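A minimal sketch of the "appropriate clinical action" rule as defined above (the function and field names are hypothetical; the study's actual operationalization may include criteria not stated in the abstract):

```python
def appropriate_clinical_action(therapy_modified_within_12m: bool,
                                returned_to_control_within_12m: bool) -> bool:
    """For a patient with poor risk factor control at baseline, care counts as
    appropriate if therapy was modified within 12 months, or if the risk factor
    returned to control without any therapy modification."""
    return therapy_modified_within_12m or returned_to_control_within_12m

# Hypothetical cohort with poor baseline control: (modified, returned) per patient
cohort = [(True, False), (False, True), (False, False)]
share = sum(appropriate_clinical_action(m, r) for m, r in cohort) / len(cohort)
print(f"{share:.0%} received appropriate clinical action")  # 67%
```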
Abstract:
OBJECTIVES Sensorineural hearing loss from sound overexposure has a considerable prevalence. Identification of sound hazards is crucial because, in the absence of definitive therapies, prevention is the sole alternative to hearing aids. One subjectively loud, yet little studied, potential sound hazard is movie theaters. This study evaluates smartphones as a widely available, validated sound pressure level (SPL) meter and uses them to measure sound levels in movie theaters, to determine whether those levels exceed safe occupational noise exposure limits and whether they differ as a function of movie, theater, presentation time, and seat location within the theater. DESIGN Six smartphones running an SPL meter software application were calibrated against a precision SPL meter and validated as SPL meters. Additionally, three different smartphone generations were compared with an integrating SPL meter. Two different movies, an action movie and a children's movie, were each measured six times in 10 different venues (n = 117). To maximize representativeness, movies were selected from large-release productions with probable high attendance. Movie theaters were selected in the San Francisco, CA, area based on whether they screened both chosen movies and to represent the largest variety of theater proprietors. Measurements were analyzed with regard to differences between theaters, location within the theater, and movie, as well as presentation time and day as indirect indicators of film attendance. RESULTS The smartphone measurements demonstrated high accuracy and reliability. Overall, sound levels in movie theaters did not exceed safe exposure limits by occupational standards. Sound levels varied significantly across theaters, and the action movie showed statistically significantly higher sound levels and exposures than the children's movie. Sound levels decreased with distance from the screen. However, no influence of time of day or day of the week, as indirect indicators of film attendance, could be found. CONCLUSIONS Calibrated smartphones with an appropriate software application, as used in this study, can be utilized as a validated SPL meter. Because of their wide availability, smartphones combined with the software application can provide a high quantity of recreational sound exposure measurements, which can facilitate the identification of potential noise hazards. Sound levels in movie theaters decrease with distance from the screen but do not exceed safe occupational noise exposure limits. Additionally, there are significant differences in sound levels across movie theaters and movies, but not across presentation times.
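For context on the occupational limits referenced above, here is a sketch of the standard NIOSH-style permissible-duration calculation (85 dBA criterion over 8 hours with a 3-dB exchange rate; the abstract does not state which criterion the authors applied, so these parameters are assumptions):

```python
def niosh_permissible_minutes(level_dba: float,
                              criterion_db: float = 85.0,
                              exchange_rate_db: float = 3.0) -> float:
    """Allowed exposure duration: 480 minutes at the criterion level,
    halving for every exchange-rate increase in sound level."""
    return 480.0 / 2 ** ((level_dba - criterion_db) / exchange_rate_db)

# A hypothetical 97-dBA action-movie level would be permissible
# for only 30 minutes under this criterion.
print(niosh_permissible_minutes(97.0))  # 30.0
```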