970 results for Recommendation grades
Abstract:
BACKGROUND: Many emergency department (ED) providers do not follow guideline recommendations for the use of the pneumonia severity index (PSI) to determine the initial site of treatment for patients with community-acquired pneumonia (CAP). We identified the reasons why ED providers hospitalize low-risk patients or manage higher-risk patients as outpatients. METHODS: As a part of a trial to implement a PSI-based guideline for the initial site of treatment of patients with CAP, we analyzed data for patients managed at 12 EDs allocated to a high-intensity guideline implementation strategy study arm. The guideline recommended outpatient care for low-risk patients (nonhypoxemic patients with a PSI risk classification of I, II, or III) and hospitalization for higher-risk patients (hypoxemic patients or patients with a PSI risk classification of IV or V). We asked providers who made guideline-discordant decisions on site of treatment to detail the reasons for nonadherence to guideline recommendations. RESULTS: There were 1,306 patients with CAP (689 low-risk patients and 617 higher-risk patients). Among these patients, physicians admitted 258 (37.4%) of 689 low-risk patients and treated 20 (3.2%) of 617 higher-risk patients as outpatients. The most commonly reported reasons for admitting low-risk patients were the presence of a comorbid illness (178 [71.5%] of 249 patients); a laboratory value, vital sign, or symptom that precluded ED discharge (73 patients [29.3%]); or a recommendation from a primary care or a consulting physician (48 patients [19.3%]). Higher-risk patients were most often treated as outpatients because of a recommendation by a primary care or consulting physician (6 [40.0%] of 15 patients). CONCLUSION: ED providers hospitalize many low-risk patients with CAP, most frequently for a comorbid illness. Although higher-risk patients are infrequently treated as outpatients, this decision is often based on the request of an involved physician.
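To make the guideline's site-of-treatment rule concrete, here is a minimal Python sketch of the decision logic described in this abstract (outpatient care for nonhypoxemic patients with PSI class I-III, hospitalization for hypoxemic or PSI class IV-V patients); the function name and inputs are illustrative assumptions, not part of the trial protocol.

```python
# Minimal sketch of the site-of-treatment rule described above; the function
# name and inputs are hypothetical, not taken from the trial protocol.

def recommend_site_of_treatment(psi_class: int, hypoxemic: bool) -> str:
    """Return the guideline-recommended site of care for a CAP patient.

    psi_class: PSI risk class, 1-5 (I-V).
    hypoxemic: True if the patient is hypoxemic in the ED.
    """
    if psi_class not in (1, 2, 3, 4, 5):
        raise ValueError("PSI risk class must be I-V (1-5)")
    # Low risk: non-hypoxemic and PSI class I, II, or III -> outpatient care
    if not hypoxemic and psi_class <= 3:
        return "outpatient"
    # Higher risk: hypoxemic or PSI class IV/V -> hospitalization
    return "hospitalize"


if __name__ == "__main__":
    print(recommend_site_of_treatment(psi_class=2, hypoxemic=False))  # outpatient
    print(recommend_site_of_treatment(psi_class=4, hypoxemic=False))  # hospitalize
```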
Abstract:
The impact of parental emigration on children's educational outcomes is theoretically ambiguous. Using novel data I collected on migration experience and its timing, family background and school performance of lower secondary pupils in Poland, I analyse the question empirically. Migration is mostly temporary in nature, with one parent engaging in employment abroad. As many as 63% of migrant parents have vocational qualifications, 29% graduated from high school, 4% have no qualifications and the remaining 4% graduated from university. Almost 18% of children are affected by parental migration. Perhaps surprisingly, the estimates suggest that parental employment abroad has a positive immediate impact on a pupil's grade. Parental education appears pivotal; children of high school graduates benefit most. Longer-term effects appear more negative, however, suggesting that prolonged migration significantly lowers a child's grade. Interestingly, siblings' foreign experiences exert a large, positive impact on pupils' grades.
Abstract:
BACKGROUND & AIMS: Since the publications of the ESPEN guidelines on enteral and parenteral nutrition in the ICU, numerous studies have added information to assist the nutritional management of critically ill patients, regarding the recognition of the right population to feed, energy and protein targets, and the route and timing of initiation. METHODS: We reviewed and discussed the literature related to nutrition in the ICU from 2006 until October 2013. RESULTS: It is necessary to identify safe, minimal and maximal amounts of the different nutrients at the different stages of acute illness. These amounts might be specific to different phases in the time course of the patient's illness. The best approach is to target the energy goal defined by indirect calorimetry. A high protein intake (1.5 g/kg/d) is recommended during the early phase of the ICU stay, regardless of the simultaneous calorie intake, as it can reduce catabolism. Later on, a high protein intake remains recommended, likely combined with a sufficient amount of energy to avoid proteolysis. CONCLUSIONS: Pragmatic recommendations are proposed to optimize nutritional therapy in practice based on recent publications. However, on some issues there is insufficient evidence to make expert recommendations.
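As a worked example of the early-phase targets quoted above (protein about 1.5 g/kg/d, energy goal taken from the indirect calorimetry measurement), a small illustrative Python sketch follows; the function and variable names are hypothetical, not from the review.

```python
# Illustrative sketch only: computes the early-phase targets mentioned above
# (protein ~1.5 g/kg/day; energy goal set to the indirect-calorimetry result).
# Function and variable names are hypothetical.

def early_phase_targets(weight_kg: float, measured_ree_kcal: float,
                        protein_g_per_kg: float = 1.5) -> dict:
    """Return daily protein (g) and energy (kcal) targets for the early ICU phase."""
    return {
        "protein_g_per_day": protein_g_per_kg * weight_kg,
        "energy_kcal_per_day": measured_ree_kcal,  # target = indirect calorimetry measurement
    }


if __name__ == "__main__":
    # e.g. a 70 kg patient with a measured resting energy expenditure of 1800 kcal/day
    print(early_phase_targets(weight_kg=70, measured_ree_kcal=1800))
    # -> {'protein_g_per_day': 105.0, 'energy_kcal_per_day': 1800}
```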
Abstract:
BACKGROUND & AIMS: Nutrition therapy is a cornerstone of burn care from the early resuscitation phase until the end of rehabilitation. While several aspects of nutrition therapy are similar in major burns and other critical care conditions, the pathophysiology of burn injury, with its major endocrine, inflammatory, metabolic and immune alterations, requires some specific nutritional interventions. The present text, developed by the French-speaking societies, is updated to provide evidence-based recommendations for clinical practice. METHODS: A group of burn specialists used the GRADE methodology (Grading of Recommendations, Assessment, Development and Evaluation) to evaluate human burn clinical trials between 1979 and 2011. The resulting recommendations, strong suggestions or suggestions were then rated by the non-burn-specialized experts according to their agreement (strong, moderate or weak). RESULTS: Eight major recommendations were made. Strong recommendations were made regarding: 1) early enteral feeding; 2) elevated protein requirements (1.5-2 g/kg in adults, 3 g/kg in children); 3) limitation of glucose delivery to a maximum of 55% of energy and 5 mg/kg/min, with moderate blood glucose control (target ≤ 8 mmol/l) by means of continuous infusion; 4) early association of trace element and vitamin substitution; and 5) the use of non-nutritional strategies to attenuate hypermetabolism by pharmacological (propranolol, oxandrolone) and physical tools (early surgery and a thermo-neutral room) during the first weeks after injury. Suggestions were made, in the absence of indirect calorimetry, to use the Toronto equation (Schofield in children) for determining energy requirements (risk of overfeeding), and to maintain fat administration at ≤ 30% of total energy delivery. CONCLUSION: Nutritional therapy in major burns has evidence-based specificities that contribute to improved clinical outcome.
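The quantitative limits listed above (protein 1.5-2 g/kg, glucose capped at 55% of energy and at the stated infusion rate, fat ≤ 30% of energy) can be turned into a rough worked example. The sketch below is illustrative only; the conversion factors (4 kcal/g glucose, 9 kcal/g fat), the function name and the example patient are assumptions, not part of the recommendations.

```python
# Rough worked example of the macronutrient limits listed above. The energy
# conversion factors and example numbers are illustrative assumptions.

def burn_macronutrient_caps(weight_kg: float, energy_kcal_day: float,
                            glucose_rate_cap_mg_kg_min: float = 5.0) -> dict:
    GLUCOSE_KCAL_PER_G = 4.0   # ~3.4 kcal/g if counting IV dextrose monohydrate
    FAT_KCAL_PER_G = 9.0

    # Glucose is limited both by the energy fraction (55%) and by the infusion rate cap
    glucose_cap_energy_g = 0.55 * energy_kcal_day / GLUCOSE_KCAL_PER_G
    glucose_cap_rate_g = glucose_rate_cap_mg_kg_min * weight_kg * 1440 / 1000.0  # mg/min -> g/day
    return {
        "protein_g_day_range": (1.5 * weight_kg, 2.0 * weight_kg),   # adult range
        "glucose_g_day_max": min(glucose_cap_energy_g, glucose_cap_rate_g),
        "fat_g_day_max": 0.30 * energy_kcal_day / FAT_KCAL_PER_G,
    }


if __name__ == "__main__":
    # e.g. a 70 kg adult with an energy goal of 2500 kcal/day
    print(burn_macronutrient_caps(weight_kg=70, energy_kcal_day=2500))
```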
Abstract:
BACKGROUND: There is no recommendation to screen ferritin levels in blood donors, even though several studies have noted the high prevalence of iron deficiency after blood donation, particularly among menstruating females. Furthermore, some clinical trials have shown that non-anaemic women with unexplained fatigue may benefit from iron supplementation. Our objective is to determine the clinical effect of iron supplementation on fatigue in female blood donors without anaemia, but with a mean serum ferritin ≤ 30 ng/ml. METHODS/DESIGN: In a double-blind randomised controlled trial, we will measure the blood count and ferritin level of women under 50 years of age who donate blood to the University Hospital of Lausanne Blood Transfusion Department, at the time of donation and after 1 week. One hundred and forty donors with a ferritin level ≤ 30 ng/ml and a haemoglobin level ≥ 120 g/l (non-anaemic) a week after the donation will be included in the study and randomised. A one-month course of oral ferrous sulphate (80 mg/day of elemental iron) will be compared with placebo. Self-reported fatigue will be measured using a visual analogue scale. Secondary outcomes are: score of fatigue (Fatigue Severity Scale), maximal aerobic power (Chester Step Test), quality of life (SF-12), and mood disorders (Prime-MD). Haemoglobin and ferritin concentrations will be monitored before and after the intervention. DISCUSSION: Iron deficiency is a potential problem for all blood donors, especially menstruating women. To our knowledge, no other intervention study has yet evaluated the impact of iron supplementation on subjective symptoms after a blood donation. TRIAL REGISTRATION: NCT00689793.
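A minimal sketch of the inclusion check described in this protocol (non-anaemic women under 50 with ferritin ≤ 30 ng/ml and haemoglobin ≥ 120 g/l one week after donation) is shown below; the function and field names are hypothetical, not from the trial documentation.

```python
# Sketch of the trial's post-donation inclusion check as described above.
# Function and parameter names are hypothetical.

def eligible_for_randomisation(age_years: float, ferritin_ng_ml: float,
                               haemoglobin_g_l: float) -> bool:
    """True if a donor meets the post-donation inclusion criteria."""
    return (age_years < 50
            and ferritin_ng_ml <= 30      # iron-deficient, but...
            and haemoglobin_g_l >= 120)   # ...not anaemic


if __name__ == "__main__":
    print(eligible_for_randomisation(age_years=34, ferritin_ng_ml=22, haemoglobin_g_l=128))  # True
    print(eligible_for_randomisation(age_years=34, ferritin_ng_ml=45, haemoglobin_g_l=128))  # False
```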
Abstract:
Introduction: In order to improve the safety of pedicle screw placement, several techniques have been developed. More recently, robotically assisted pedicle insertion has been introduced with the aim of increasing accuracy. The aim of this study was to compare this new technique with the two main pedicle insertion techniques in our unit, namely fluoroscopically assisted vs EMG-aided insertion. Material and methods: A total of 382 screws (78 thoracic, 304 lumbar) were introduced in 64 patients (m/f = 1.37, equally distributed between insertion technique groups) by a single experienced spinal surgeon. Of those, 64 (10 thoracic, 54 lumbar) were introduced in 11 patients using a miniature robotic device based on preoperative CT images under fluoroscopic control. 142 (4 thoracic, 138 lumbar) screws were introduced using lateral fluoroscopy in 27 patients, while 176 (64 thoracic, 112 lumbar) screws in 26 patients were inserted using both fluoroscopy and EMG monitoring. There was no difference in the distribution of scoliotic spines between the 3 groups (n = 13). Screw position was assessed by an independent observer on CTs in the axial, sagittal and coronal planes using the Rampersaud A to D classification. Data for lumbar and thoracic screws were processed separately, as were data obtained from the axial, sagittal and coronal CT planes. Results: Intra- and interobserver reliability of the Rampersaud classification was moderate (0.35 and 0.45, respectively), and was lowest in the axial plane. The total number of misplaced screws (grades C and D) was generally low (12 thoracic and 12 lumbar screws). Misplacement rates were the same in straight and scoliotic spines. The only difference in misplacement rates was observed on axial and coronal images in the EMG-assisted thoracic screw group, with a higher proportion of C or D grades (p < 0.05) in that group. Recorded compound muscle action potential (CMAP) values for the inserted screws were 30.4 mA for the robot and 24.9 mA for the freehand technique, with a mean difference of 5.5 mA (CI 3.8 mA). Discussion: Robotic placement improved the placement of thoracic screws but not that of lumbar screws, possibly because our misplacement rates were in general already close to those of published navigation series. Robotically assisted spine surgery might therefore enhance the safety of screw placement, in particular in training settings where different users at various stages of their learning curve are involved in pedicle instrumentation.
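For readers unfamiliar with the interval reported for the CMAP difference, the sketch below shows one way a confidence interval for a difference in means could be computed (a Welch-style interval); the sample values are invented for illustration and are not the study data.

```python
# Illustrative Welch-style confidence interval for a difference in means,
# e.g. robot vs freehand CMAP thresholds. The sample values are made up.
import numpy as np
from scipy import stats


def mean_diff_ci(a, b, confidence=0.95):
    """Welch confidence interval for mean(a) - mean(b)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
    # Welch-Satterthwaite degrees of freedom
    df = se**4 / ((a.var(ddof=1) / a.size) ** 2 / (a.size - 1)
                  + (b.var(ddof=1) / b.size) ** 2 / (b.size - 1))
    half_width = stats.t.ppf(0.5 + confidence / 2, df) * se
    return diff, (diff - half_width, diff + half_width)


if __name__ == "__main__":
    robot_ma = [28.0, 31.5, 33.2, 29.8, 30.9]        # hypothetical CMAP thresholds (mA)
    freehand_ma = [23.1, 26.4, 24.0, 25.7, 24.3]
    print(mean_diff_ci(robot_ma, freehand_ma))
```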
Development of an optimized methodology for tensile testing of carbon steels in hydrogen environment
Abstract:
The study was performed at OCAS, the Steel Research Centre of ArcelorMittal for the Industry market. The major aim of this research was to obtain an optimized tensile testing methodology with in-situ H-charging to reveal hydrogen embrittlement in various high-strength steels. The second aim of this study was the mechanical characterization of the hydrogen effect on high-strength carbon steels with varying microstructure, i.e. ferrite-martensite and ferrite-bainite grades. The optimal parameters for H-charging - which influence the tensile test results (sample geometry, type of electrolyte, charging methods, effect of steel type, etc.) - were defined and applied to Slow Strain Rate testing, Incremental Step Loading and Constant Load Testing. To better understand the initiation and propagation of cracks during tensile testing with in-situ H-charging, and to correlate them with crystallographic orientation, some materials were analyzed in the SEM in combination with the EBSD technique. The introduction of a notch on the tensile samples makes it possible to reach significantly improved reproducibility of the results. Comparing the various steel grades reveals that Dual Phase (ferrite-martensite) steels are more sensitive to hydrogen-induced cracking than the FB (ferritic-bainitic) ones. This higher sensitivity to hydrogen was reflected in the reduced failure times, increased creep rates and enhanced crack initiation (SEM) of the Dual Phase steels in comparison with the FB steels.
Abstract:
BACKGROUND: The aim of this study was to assess, at the European level and using digital technology, the inter-pathologist reproducibility of the ISHLT 2004 system and to compare it with the 1990 system. We also assessed the reproducibility of the morphologic criteria for the diagnosis of antibody-mediated rejection detailed in the 2004 grading system. METHODS: The hematoxylin-eosin-stained sections of 20 sets of endomyocardial biopsies were pre-selected and graded by two pathologists (A.A. and M.B.) and digitized using a telepathology digital pathology system (Aperio ImageScope System; for details refer to http://aperio.com/). Their diagnoses were considered the index diagnoses, which covered all grades of acute cellular rejection (ACR), early ischemic lesions, Quilty lesions, late ischemic lesions and (in the 2005 system) antibody-mediated rejection (AMR). Eighteen pathologists from 16 heart transplant centers in 7 European countries participated in the study. Inter-observer reproducibility was assessed using Fleiss's kappa and Krippendorff's alpha statistics. RESULTS: The combined kappa value for all grades diagnosed by all 18 pathologists was 0.31 for the 1990 grading system and 0.39 for the 2005 grading system, with alpha statistics of 0.57 and 0.55, respectively. Kappa values by grade for 1990/2005, respectively, were: 0 = 0.52/0.51; 1A/1R = 0.24/0.36; 1B = 0.15; 2 = 0.13; 3A/2R = 0.29/0.29; 3B/3R = 0.13/0.23; and 4 = 0.18. For the 2 cases of AMR, 6 of 18 pathologists correctly suspected AMR on the hematoxylin-eosin slides, whereas in each of 17 of the 18 AMR-negative cases a small percentage of pathologists (range 5% to 33%) overinterpreted the findings as suggestive of AMR. CONCLUSIONS: Reproducibility studies of cardiac biopsies by pathologists in different centers at the international level were feasible using digitized slides rather than conventional histology glass slides. There was a small improvement in interobserver agreement between pathologists of different European centers when moving from the 1990 ISHLT classification to the "new" 2005 ISHLT classification. Morphologic suspicion of AMR in the 2004 system, based on hematoxylin-eosin-stained slides alone, was poor, highlighting the need for better standardization of the morphologic criteria for AMR. Ongoing educational programs are needed to ensure standardization of the diagnosis of both acute cellular and antibody-mediated rejection.
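Fleiss's kappa, the agreement statistic used above, can be computed directly from a subjects-by-categories count matrix (one row per biopsy, one column per grade, entries counting how many raters chose each grade). The short sketch below is a generic implementation with an invented example, not the study's analysis code.

```python
# Compact implementation of Fleiss's kappa for an N-subjects x k-categories
# count matrix, assuming the same number of raters per subject. The tiny
# example matrix is invented for illustration.
import numpy as np


def fleiss_kappa(counts):
    """counts[i, j] = number of raters assigning subject i to category j."""
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts[0].sum()
    # Per-subject observed agreement
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()
    # Chance agreement from the marginal category proportions
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    p_e = np.square(p_j).sum()
    return (p_bar - p_e) / (1.0 - p_e)


if __name__ == "__main__":
    # 4 biopsies, 3 grades, 5 raters each (made-up counts)
    example = [[5, 0, 0],
               [3, 2, 0],
               [1, 3, 1],
               [0, 1, 4]]
    print(round(fleiss_kappa(example), 3))  # ~0.341
```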
Abstract:
Eosinophilic esophagitis (EoE) is a clinicopathologic condition of increasing recognition and prevalence. In 2007, a consensus recommendation provided clinical and histopathologic guidance for the diagnosis and treatment of EoE; however, only a minority of physicians use the 2007 guidelines, which require fulfillment of both histologic and clinical features. Since 2007, the number of EoE publications has doubled, providing new disease insight. Accordingly, a panel of 33 physicians with expertise in pediatric and adult allergy/immunology, gastroenterology, and pathology conducted a systematic review of the EoE literature (since September 2006) using electronic databases. Based on the literature review and the expertise of the panel, information and recommendations were provided in each of the following areas of EoE: diagnostics, genetics, allergy testing, therapeutics, and disease complications. Because accumulating animal and human data have provided evidence that EoE appears to be an antigen-driven immunologic process that involves multiple pathogenic pathways, a new conceptual definition is proposed, highlighting that EoE represents a chronic, immune/antigen-mediated disease characterized clinically by symptoms related to esophageal dysfunction and histologically by eosinophil-predominant inflammation. The diagnostic guidelines continue to define EoE as an isolated chronic disorder of the esophagus, requiring both clinical and pathologic features for diagnosis. Patients commonly have high rates of concurrent allergic diatheses, especially food sensitization, compared with the general population. Proven therapeutic options include chronic dietary elimination, topical corticosteroids, and esophageal dilation. Important additions since 2007 include genetic underpinnings that implicate EoE susceptibility caused by polymorphisms in the thymic stromal lymphopoietin protein gene and the description of a new potential disease phenotype, proton pump inhibitor-responsive esophageal eosinophilia. Further advances and controversies regarding diagnostic methods, surrogate disease markers, allergy testing, and treatment approaches are discussed.
Abstract:
CONTEXT: New trial data and drug regimens that have become available in the last 2 years warrant an update to guidelines for antiretroviral therapy (ART) in human immunodeficiency virus (HIV)-infected adults in resource-rich settings. OBJECTIVE: To provide current recommendations for the treatment of adult HIV infection with ART and use of laboratory-monitoring tools. Guidelines include when to start therapy and with what drugs, monitoring for response and toxic effects, special considerations in therapy, and managing antiretroviral failure. DATA SOURCES, STUDY SELECTION, AND DATA EXTRACTION: Data that had been published or presented in abstract form at scientific conferences in the past 2 years were systematically searched and reviewed by an International Antiviral Society-USA panel. The panel reviewed available evidence and formed recommendations by full panel consensus. DATA SYNTHESIS: Treatment is recommended for all adults with HIV infection; the strength of the recommendation and the quality of the evidence increase with decreasing CD4 cell count and the presence of certain concurrent conditions. Recommended initial regimens include 2 nucleoside reverse transcriptase inhibitors (tenofovir/emtricitabine or abacavir/lamivudine) plus a nonnucleoside reverse transcriptase inhibitor (efavirenz), a ritonavir-boosted protease inhibitor (atazanavir or darunavir), or an integrase strand transfer inhibitor (raltegravir). Alternatives in each class are recommended for patients with or at risk of certain concurrent conditions. CD4 cell count and HIV-1 RNA level should be monitored, as should engagement in care, ART adherence, HIV drug resistance, and quality-of-care indicators. Reasons for regimen switching include virologic, immunologic, or clinical failure and drug toxicity or intolerance. Confirmed treatment failure should be addressed promptly and multiple factors considered. CONCLUSION: New recommendations for HIV patient care include offering ART to all patients regardless of CD4 cell count, changes in therapeutic options, and modifications in the timing and choice of ART in the setting of opportunistic illnesses such as cryptococcal disease and tuberculosis.
Abstract:
Background: Mucosal healing in ulcerative colitis (UC) is reported to be associated with favourable clinical outcomes such as reduced hospitalization and surgery rates. Activity monitoring by endoscopy has its shortcomings due to invasiveness, costs, and potential patient discomfort. Data on the correlation of noninvasive biomarkers with endoscopic severity in UC are scarce. Aim: To evaluate the correlation between endoscopic activity according to the modified Baron Index and fecal calprotectin, C-reactive protein (CRP), blood leukocytes, and the Lichtiger Index (clinical score). Methods: UC patients with left-sided and extensive colitis undergoing complete colonoscopy were prospectively enrolled and scored clinically and endoscopically. Fecal and blood samples were analyzed in UC patients (in a blinded fashion) and controls. The modified Baron score describes the following 5 endoscopic conditions: 0 = normal; 1 = granular mucosa, edema; 2 = friable mucosa but no spontaneous bleeding; 3 = microulcerations with spontaneous bleeding; 4 = gross ulceration, denuded mucosa. Results: We enrolled 228 UC patients (mean age 41 ± 13 years, 39 female) and 52 healthy controls. Disease was located in the left colon in 40%, while 21% had an extensive colitis and 39% a pancolitis. Endoscopic disease activity correlated best with fecal calprotectin (Spearman's rank correlation coefficient r = 0.821), followed by the Lichtiger Index (r = 0.682), CRP (r = 0.556), and blood leukocytes (r = 0.401). Fecal calprotectin was the only marker that could discriminate between different grades of endoscopic activity (grade 0, 25 ± 11 μg/g; grade 1, 44 ± 34 μg/g; grade 2, 111 ± 74 μg/g; grade 3, 330 ± 332 μg/g; grade 4, 659 ± 319 μg/g; P = 0.002 for discriminating grade 0 vs. 1, and P < 0.001 for discriminating grade 1 vs. 2, grade 2 vs. 3, and grade 3 vs. 4). Fecal calprotectin had the highest overall accuracy (91%) for detecting endoscopically active disease (modified Baron Index ≥ 2), followed by a Lichtiger Index score of ≥ 4 (77%), CRP > 5 mg/L (69%) and blood leukocytosis (58%). Conclusions: Fecal calprotectin correlated better with endoscopic disease activity than clinical activity, CRP, and blood leukocytes. The strong correlation with endoscopic disease activity suggests that fecal calprotectin represents a useful biomarker for noninvasive monitoring of disease activity in UC patients.
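The two summary statistics reported above, Spearman's rank correlation with the endoscopic score and the overall accuracy of a cut-off for detecting active disease (modified Baron Index ≥ 2), can be illustrated with a short sketch; the sample values and the 100 μg/g calprotectin cut-off below are hypothetical and are not taken from the study.

```python
# Illustrative computation of a Spearman rank correlation and an overall
# accuracy for a biomarker cut-off. Sample values and cut-off are invented.
import numpy as np
from scipy.stats import spearmanr

baron_grade = np.array([0, 1, 1, 2, 2, 3, 3, 4, 4, 0])                   # endoscopic score
calprotectin = np.array([20, 45, 60, 130, 90, 310, 400, 620, 700, 30])   # ug/g

rho, p_value = spearmanr(calprotectin, baron_grade)

active = baron_grade >= 2              # endoscopically active disease
predicted_active = calprotectin > 100  # hypothetical cut-off
accuracy = np.mean(predicted_active == active)

print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f}), accuracy = {accuracy:.0%}")
```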
Abstract:
The objective of this population-based study was to estimate the liver morbidity attributable to Schistosoma mansoni infection by ultrasonography, adopting the standard protocols proposed at the Cairo Meeting on Ultrasonography, 1991. We examined 2384 individuals representing 20% of the households of the rural population of the Ismailia Governorate, East of Delta, Egypt. The prevalences of S. mansoni and S. haematobium infection were 40.3% and 1.7%, respectively. Portal tract thickening (PTT) of grade 1, 2 and 3, considered diagnostic of schistosomal liver morbidity, was detected in 35.1%, 1.3% and 0.2% of individuals, respectively. Generally, ultrasonographically detected pathological changes increased with age, but correlated with intensity of infection only in the 20-59 year age group. Comparing individuals with and without S. mansoni infection in an endemic and a non-endemic community indicated no significant difference between the former and the latter in either case. In conclusion, ultrasonography had limited value in estimating schistosomal liver morbidity in our population-based study, where early grades of liver morbidity were prevalent. The criteria for diagnosing grade I portal fibrosis need to be revised, as does the staging system proposed by the Cairo Meeting on ultrasonography in schistosomiasis.
Abstract:
Background and objective: Therapeutic drug monitoring (TDM) was introduced in the early 1970s in our hospital (CHUV). It nowadays represents an important routine activity of the Division of Clinical Pharmacology and Toxicology (PCL), and its impact and utility for clinicians required assessment. This study thus evaluated the impact of TDM recommendations in terms of dosage regimen adaptation. Design: A prospective observational study was conducted over 5 weeks. The primary objective was to evaluate the application of our TDM recommendations and to identify potential factors associated with variations in their implementation. The secondary objective was to identify pre-analytical problems linked to the collection and processing of blood samples. Setting: Four representative clinical units at CHUV. Main outcome measure: Clinical data, drug-related data (intake, collection and processing) and all information regarding the implementation of clinical recommendations were collected and analyzed by descriptive statistics. Results: A total of 241 blood measurement requests were collected, of which 105 triggered a recommendation. Of the recommendations delivered, 37% were applied, 25% partially applied and 34% not applied; in 4% the recommendation was not applicable. The determinant factors for implementation were the clinical unit and the mode of transmission of the recommendation (written vs oral). No clear difference between types of drugs could be detected. Pre-analytical problems were not uncommon, mostly related to the completion of request forms and delays in blood sampling (equilibration or steady state not reached). We identified 6% of drug level measurements as inappropriate and unusable, which could generate substantial costs for the hospital. Conclusion: This survey highlighted better implementation of TDM recommendations in clinical units where this routine is well integrated and understood by the medical staff. Our results emphasize the importance of communication with the nurse or physician in charge, either to transmit clinical recommendations or to establish consensual therapeutic targets in specific conditions. Development of strong partnerships between clinical pharmacists or pharmacologists and clinical units would be beneficial to improve the impact of this clinical activity.
Abstract:
The snails Lymnaea (Radix) luteola exhibited marked variations in growth, longevity, and the attainment of sexual maturity at different temperatures and diets. At 10°C, irrespective of food, pH and salinity of the water, the snails had the shortest life span, the highest death rate and the lowest growth rate. At 15°C, the growth rate was comparatively higher and the snails survived for a few more days, but at these temperatures they failed to attain sexual maturity. Snails exposed to pH 5 and 9 at 20°, 25°, 30°, 35°C and room temperatures (19.6°-29.6°C); to 0.5, 1.5 and 2.5 NaCl at 20° and 35°C; and to 2.5 NaCl at 25°C and room temperatures failed to attain sexual maturity. The snails exposed to pH 7 and different salinity grades at 20°, 25°, 30°, 35°C and room temperatures became sexually mature within 25-93 days, depending upon the type of food used in the culture.
Abstract:
Heavy domestic and peridomestic infestations of Triatoma infestans were controlled in two villages in southern Bolivia by the application of deltamethrin SC25 (2.5% suspension concentrate) at a target dose of 25 mg a.i./m². The actual applied dose was monitored by HPLC analysis of filter papers placed at various heights on the house walls, and was shown to range from 0 to 59.6 mg a.i./m², with a mean of 28.5 mg a.i./m². Wall bioassays showed high mortality of T. infestans during the first month after the application of deltamethrin. Mortality declined to zero as summer temperatures increased, but reappeared with the onset of the following winter. In contrast, knockdown was apparent throughout the trial, showing no discernible temperature dependence. House infestation rates, measured by manual sampling and the use of paper sheets to collect bug faeces, declined from 79% at the beginning of the trial to zero at the 6-month evaluation. All but one of the houses were still free of T. infestans at the final evaluation 12 months after spraying, although a small number of bugs were found at this time in 5 of 355 peridomestic dependencies. Comparative cost studies endorse the recommendation of large-scale application of deltamethrin, or a pyrethroid of similar cost-effectiveness, as a means of eliminating domestic T. infestans populations in order to interrupt transmission of Chagas disease.