212 results for Clinical practice guideline


Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND AND METHODS: The objectives of this article were to systematically describe and examine the novel roles and responsibilities assumed by nurses in a forensic consultation for victims of violence at a University Hospital in French-speaking Switzerland. Using a case study methodology, information was collected from two main sources: (a) discussion groups with nurses and forensic pathologists and (b) a review of procedures and protocols. Following a critical content analysis, the roles and responsibilities of the forensic nurses were described and compared with the seven core competencies of advanced nursing practice as outlined by Hamric, Spross, and Hanson (2009). RESULTS: Advanced nursing practice competencies noted in the analysis included "direct clinical practice," "coaching and guidance," and "collaboration." The role of the nurse in terms of "consultation," "leadership," "ethics," and "research" was less evident in the analysis. DISCUSSION AND CONCLUSION: New forms of nursing are indeed practiced in the forensic clinical setting, and our findings suggest that nursing practice in this domain is following in the footsteps of an advanced nursing practice model. Further reflection is required to determine whether the role of the forensic nurse in Switzerland should be developed as that of a clinical nurse specialist or that of a nurse practitioner.

Relevance:

90.00%

Publisher:

Abstract:

Background: There may be a considerable gap between the LDL cholesterol (LDL-C) and blood pressure (BP) goal values recommended by guidelines and the results achieved in daily practice. Design: Prospective cross-sectional survey of cardiovascular disease risk profiles and management, with a focus on lipid lowering and BP lowering in clinical practice. Methods: In phase 1, the cardiovascular risk of patients with a known lipid profile visiting their general practitioner was anonymously assessed according to the PROCAM score. In phase 2, high-risk patients who did not achieve the LDL-C goal of less than 2.6 mmol/l in phase 1 could be further documented. Results: Six hundred thirty-five general practitioners collected data on 23 892 patients with a known lipid profile. Forty percent were high-risk patients (diabetes mellitus, coronary heart disease or PROCAM score >20%), compared with the 27% estimated by the physicians. In high-risk patients, the goal attainment rate for BP was almost double that for LDL-C (62 vs. 37%). Both goals were attained by 25% of patients. LDL-C values in phases 1 and 2 were available for 3097 high-risk patients not at LDL-C goal in phase 1; 32% of these patients achieved the LDL-C goal of less than 2.6 mmol/l after a mean of 17 weeks. The most successful strategies for LDL-C reduction were implemented in only 22% of the high-risk patients. Conclusion: Although patients at high cardiovascular risk were treated more intensively than low- or medium-risk patients, the majority remained insufficiently controlled, which is an incentive for intensified medical education. Adequate implementation of Swiss and international guidelines would be expected to improve achievement of LDL-C and BP goal values in daily practice.
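As a rough illustration of how attainment rates like those quoted above are derived from patient-level data, the sketch below computes LDL-C, BP and combined goal attainment in a high-risk subgroup. The LDL-C goal (<2.6 mmol/l) comes from the abstract; the BP threshold (<140/90 mmHg), the record fields and the toy cohort are assumptions for illustration only.

# Sketch: goal-attainment rates in a high-risk cohort.
# LDL-C goal (<2.6 mmol/l) is taken from the abstract; the BP goal
# (<140/90 mmHg) and the record fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Patient:
    ldl_mmol_l: float
    systolic_mmHg: float
    diastolic_mmHg: float
    high_risk: bool  # diabetes, coronary heart disease or PROCAM score > 20%

def attainment_rates(patients):
    high_risk = [p for p in patients if p.high_risk]
    at_ldl_goal = [p.ldl_mmol_l < 2.6 for p in high_risk]
    at_bp_goal = [p.systolic_mmHg < 140 and p.diastolic_mmHg < 90 for p in high_risk]
    both = [l and b for l, b in zip(at_ldl_goal, at_bp_goal)]
    n = len(high_risk)
    return (sum(at_ldl_goal) / n, sum(at_bp_goal) / n, sum(both) / n)

# Three hypothetical high-risk patients
cohort = [Patient(2.1, 132, 82, True), Patient(3.4, 150, 95, True), Patient(2.5, 138, 88, True)]
print(attainment_rates(cohort))  # -> (LDL-C rate, BP rate, both-goals rate)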

Relevance:

90.00%

Publisher:

Abstract:

RATIONALE, AIMS AND OBJECTIVES: There is little evidence regarding the benefit of stress ulcer prophylaxis (SUP) outside a critical care setting. Overprescription of SUP is not devoid of risks. This prospective study aimed to evaluate the use of proton pump inhibitors (PPIs) for SUP in a general surgery department. METHOD: Data were collected prospectively by pharmacists during an 8-week period on patients hospitalized in a general surgery department (58 beds). Patients with a PPI prescription for the treatment of ulcers, gastro-oesophageal reflux disease, oesophagitis or epigastric pain were excluded. Patients admitted twice during the study period were not included a second time. The American Society of Health-System Pharmacists guidelines on SUP were used to assess the appropriateness of de novo PPI prescriptions. RESULTS: Among the 255 patients in the study, 138 (54%) received prophylaxis with a PPI, of which 86 (62%) were de novo PPI prescriptions. A total of 129 patients (94%) received esomeprazole (in accordance with the hospital drug policy). The most frequent dosage was 40 mg once daily. Use of PPIs for SUP was evaluated in 67 patients. A total of 53 patients (79%) had no risk factors for SUP. Twelve patients had one risk factor and two patients had two. At discharge, PPI prophylaxis was continued in 33% of patients with a de novo PPI prescription. CONCLUSIONS: This study highlights the overuse of PPIs in non-intensive care unit patients and the inappropriate continuation of PPI prescriptions at discharge. Treatment recommendations for SUP are needed to restrict PPI use to justified indications.

Relevance:

90.00%

Publisher:

Abstract:

Diabetes represents an important health burden on our society: in Lausanne (Switzerland), for example, 16% of the adult population have abnormal glucose homeostasis and 6% have diabetes, about a third of whom are unaware of it. Some guidelines identify the "at risk" population for which screening seems indicated. Simple clinical scores have been developed that allow the risk of diabetes to be estimated more accurately and hence screening for the disease to be better targeted. The recent discovery of more than 18 genetic variants associated with an increased risk of developing the disease has made it possible to include individual genotypes in genetic risk scores. In this article we discuss the usefulness of these genetic scores, how they compare with clinical scores, their implications for clinical practice, as well as their potential ethical or economic consequences.
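To make the notion of a genetic risk score concrete, here is a minimal sketch of the usual construction: a count of risk alleles across diabetes-associated variants, weighted by each variant's log odds ratio. The variant identifiers, odds ratios and genotypes are invented for illustration and do not correspond to any published score.

# Sketch of a weighted genetic risk score: the sum of risk-allele counts
# (0, 1 or 2 per variant), each weighted by the variant's log odds ratio.
# Variants and weights below are hypothetical.
import math

RISK_VARIANTS = {           # variant id -> assumed per-allele odds ratio
    "rs0000001": 1.15,
    "rs0000002": 1.10,
    "rs0000003": 1.25,
}

def genetic_risk_score(genotype):
    """genotype maps variant id -> number of risk alleles (0, 1 or 2)."""
    return sum(genotype.get(v, 0) * math.log(or_) for v, or_ in RISK_VARIANTS.items())

person = {"rs0000001": 2, "rs0000002": 0, "rs0000003": 1}
print(round(genetic_risk_score(person), 3))  # weighted allele count on the log-odds scale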

Relevance:

90.00%

Publisher:

Abstract:

Clinical practice guidelines have become an important source of information to support clinicians in the management of individual patients. However, current guideline methods have limitations, including the failure to separate the quality of the evidence from the strength of recommendations. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) working group, an international collaboration of guideline developers, methodologists and clinicians, has developed a system that addresses these shortcomings. Core elements include a transparent methodology for grading the quality of evidence, the distinction between the quality of the evidence and the strength of a recommendation, an explicit balancing of the benefits and harms of health care interventions, and an explicit recognition of the values and preferences that underlie recommendations. The GRADE system has been piloted in various practice settings to ensure that it captures the complexity involved in assessing evidence and grading recommendations while maintaining simplicity and practicality. Many guideline organizations and medical societies have endorsed the system and adopted it for their guideline processes.

Relevance:

90.00%

Publisher:

Abstract:

Tools to predict fracture risk are useful for selecting patients for pharmacological therapy in order to reduce fracture risk and redirect limited healthcare resources to those who are most likely to benefit. FRAX® is a World Health Organization fracture risk assessment algorithm for estimating the 10-year probability of hip fracture and major osteoporotic fracture. Effective application of FRAX® in clinical practice requires a thorough understanding of its limitations as well as its utility. For some patients, FRAX® may underestimate or overestimate fracture risk. In order to address some of the common issues encountered with the use of FRAX® for individual patients, the International Society for Clinical Densitometry (ISCD) and International Osteoporosis Foundation (IOF) assigned task forces to review the medical evidence and make recommendations for optimal use of FRAX® in clinical practice. Among the issues addressed were the use of bone mineral density (BMD) measurements at skeletal sites other than the femoral neck, the use of technologies other than dual-energy X-ray absorptiometry, the use of FRAX® without BMD input, the use of FRAX® to monitor treatment, and the addition of the rate of bone loss as a clinical risk factor for FRAX®. The evidence and recommendations were presented to a panel of experts at the Joint ISCD-IOF FRAX® Position Development Conference, resulting in the development of Joint ISCD-IOF Official Positions addressing FRAX®-related issues.
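FRAX® is distributed as a calculator whose underlying model is not published, so the probability computation itself cannot be reproduced here; the sketch below only illustrates how its clinical risk factor inputs, with femoral neck BMD as an optional field (one of the issues discussed above), might be structured before being passed to the tool. All field names are assumptions.

# Sketch of the FRAX input set; the 10-year probability calculation itself is
# delegated to the FRAX tool and is not reproduced here. Field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FraxInput:
    age: int                      # 40-90 years
    sex: str                      # "F" or "M"
    weight_kg: float
    height_cm: float
    previous_fracture: bool
    parent_fractured_hip: bool
    current_smoking: bool
    glucocorticoids: bool
    rheumatoid_arthritis: bool
    secondary_osteoporosis: bool
    alcohol_3_or_more_units_per_day: bool
    femoral_neck_bmd_tscore: Optional[float] = None  # FRAX can be run with or without BMD

patient = FraxInput(age=68, sex="F", weight_kg=58, height_cm=162,
                    previous_fracture=True, parent_fractured_hip=False,
                    current_smoking=False, glucocorticoids=False,
                    rheumatoid_arthritis=False, secondary_osteoporosis=False,
                    alcohol_3_or_more_units_per_day=False,
                    femoral_neck_bmd_tscore=-2.1)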

Relevance:

90.00%

Publisher:

Abstract:

Vertebral fracture assessments (VFAs) using dual-energy X-ray absorptiometry increase vertebral fracture detection in clinical practice and are highly reproducible. Measures of reproducibility depend on the frequency and distribution of the event. The aim of this study was to compare 2 reproducibility measures, reliability and agreement, in VFA readings in both a population-based and a clinical cohort. We measured agreement and reliability by uniform kappa and Cohen's kappa for vertebral reading and fracture identification in 360 VFAs from a population-based cohort and 85 from a clinical cohort. In the population-based cohort, 12% of vertebrae were unreadable. Vertebral fracture prevalence ranged from 3% to 4%. Inter-reader and intra-reader reliability with Cohen's kappa was fair to good (0.35-0.71 and 0.36-0.74, respectively), with good inter-reader and intra-reader agreement by uniform kappa (0.74-0.98 and 0.76-0.99, respectively). In the clinical cohort, 15% of vertebrae were unreadable, and vertebral fracture prevalence ranged from 7.6% to 8.1%. Inter-reader reliability was moderate to good (0.43-0.71), and agreement was good (0.68-0.91). In clinical situations, the levels of reproducibility measured by the 2 kappa statistics are concordant, so either could be used to measure agreement and reliability. However, if events are rare, as in a population-based cohort, we recommend evaluating reproducibility using the uniform kappa, as Cohen's kappa may be less accurate.
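A minimal sketch of the two statistics compared above for a binary fracture/no-fracture reading: Cohen's kappa corrects observed agreement for the chance agreement implied by the readers' marginal rates, whereas a uniform (Brennan-Prediger-type) kappa fixes chance agreement at 0.5 for two categories. With rare fractures, raw agreement can be high while Cohen's kappa drops, which is the behavior the abstract describes. The toy readings are invented.

# Cohen's kappa vs "uniform" kappa (Brennan-Prediger type) for two binary readers.
def cohens_kappa(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    p1, p2 = sum(r1) / n, sum(r2) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)        # chance agreement from the readers' marginals
    return (po - pe) / (1 - pe)

def uniform_kappa(r1, r2):
    po = sum(a == b for a, b in zip(r1, r2)) / len(r1)
    return (po - 0.5) / (1 - 0.5)             # chance agreement fixed at 1/2 for two categories

# Hypothetical vertebra-level readings (1 = fracture), with rare events:
reader1 = [0] * 97 + [1, 1, 0]
reader2 = [0] * 97 + [1, 0, 1]
print(cohens_kappa(reader1, reader2))   # moderate despite 98% raw agreement
print(uniform_kappa(reader1, reader2))  # high, driven by raw agreement alone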

Relevance:

90.00%

Publisher:

Abstract:

In recent years, evidence has emerged for a bidirectional relationship between sleep and neurological and psychiatric disorders. First, sleep-wake disorders (SWDs) are very common and may be the first or main manifestation of underlying neurological and psychiatric disorders. Second, SWDs may represent an independent risk factor for neuropsychiatric morbidities. Third, sleep-wake function (SWF) may influence the course and outcome of neurological and psychiatric disorders. This review summarizes the most important research and clinical findings in the fields of neuropsychiatric sleep and circadian research and medicine, and discusses the promise they bear for the next decade. The findings summarize discussions conducted in a workshop with 26 European experts in these fields, and formulate specific future priorities for clinical practice and translational research. More generally, the conclusion emerging from this workshop is the recognition of a tremendous opportunity offered by our knowledge of SWF and SWDs, which unfortunately has not yet become a key factor in clinical practice, particularly in Europe. Strengthening undergraduate and postgraduate teaching, creating academic multidisciplinary sleep-wake centres and simplifying the diagnostic approach to SWDs, coupled with targeted treatment strategies, would yield enormous clinical benefits for these diseases.

Relevance:

90.00%

Publisher:

Abstract:

We report the outcomes of a clinical audit examining the criteria used in clinical practice to justify endotracheal tube (ETT) suction, and the extent to which these matched the criteria in the Endotracheal Suction Assessment Tool (ESAT)©. A retrospective audit of patient notes (N = 292) and analyses of the criteria documented by pediatric intensive care nurses to justify ETT suction were undertaken. The median number of documented respiratory and ventilation status criteria per ETT suction event that matched the ESAT© criteria was 2 [interquartile range (IQR) 1-6]. All criteria listed within the ESAT© were documented within the reviewed notes. A direct link was established between the criteria used in current clinical practice for ETT suction and the ESAT©. The ESAT©, therefore, reflects documented clinical decision making and could be used as both a clinical and an educational guide for inexperienced pediatric critical care nurses. One modification to the ESAT© is required: "preparation for extubation" should be added.
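For completeness, a minimal sketch of the summary statistic reported above (median with interquartile range of matched criteria per suction event); the audit tallies below are hypothetical.

# Median and interquartile range of matched-criteria counts per ETT suction event.
import numpy as np

matched_criteria_per_event = np.array([1, 2, 2, 6, 1, 3, 5, 2, 1, 6])  # hypothetical audit tallies
median = np.median(matched_criteria_per_event)
q1, q3 = np.percentile(matched_criteria_per_event, [25, 75])
print(f"median {median:g} [IQR {q1:g}-{q3:g}]")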

Relevance:

90.00%

Publisher:

Abstract:

Second-generation antipsychotics (SGAs) have become the first-line antipsychotic treatment for psychotic disorders due to their better overall tolerability compared with classical antipsychotics. However, metabolic side effects such as weight gain are frequently described during treatment with SGAs and/or other psychotropic drugs, including some antidepressants and mood stabilizers, and may also result in poor adherence to treatment. The aim of this work was to investigate different methods to predict common side effects, in particular weight gain, during treatment with weight gain-inducing psychotropic drugs. First, clinical data were used to determine the potential predictive power of one-month weight gain on weight increase after three and 12 months of treatment (n=351 patients). A fast and strong weight gain of >5% after one month (>5%WG) was found to be the best predictor of substantial weight gain at three (>15%) and 12 months (>20%). Similar analyses in an independent cohort of psychiatric adolescents (n=42) showed that a comparable >4% weight gain at one month is the best predictor of substantial weight gain at three months (>15%). Second, we aimed to determine whether an extensive analysis of genes could be used, in addition to clinical factors, to identify patients at risk of >5%WG or of type 2 diabetes (T2D). Adding genetic markers to clinical variables to predict >5%WG significantly increased the area under the curve (AUC) of the analysis (AUC final: 0.92, AUC clinical: 0.75, p<0.0001, n=248). Conversely, genetic risk scores were found to be associated with T2D (OR: 2.5, p=0.03, n=285), but without a significant increase in AUC compared with prediction based on clinical factors alone. Finally, therapeutic drug monitoring was used to predict extrapyramidal symptoms during risperidone treatment (n=150). An active moiety (the sum of risperidone and its active metabolite 9-hydroxyrisperidone plasma concentrations) of >40 ng/ml should be targeted only in cases of insufficient response. These results highlight different approaches for personalizing psychotropic treatments in order to reduce related side effects. Further research is needed, in particular on the identification of genetic markers, to improve the implementation of these results into clinical practice. 
Lay summary: Atypical antipsychotics and other psychotropic treatments are commonly used to treat the symptoms of schizophrenia and mood disorders. As with any medication, side effects are observed. The objective of this work was to explore different methods that could predict the occurrence of certain adverse effects, in particular weight gain and the onset of diabetes. In a first part, we evaluated the effect of early weight gain on long-term weight gain under psychotropic treatment. The analyses showed, in a psychiatric population, that a weight gain of >5% of initial weight at one month predicted substantial weight gain after three (>15%) and 12 (>20%) months of treatment. A similar result was observed in another, exclusively pediatric, group of patients. In a second part, we evaluated the potential contribution of genetic markers to the prediction of >5% weight gain after one month of treatment and of the onset of type 2 diabetes. For weight gain, combining genetic with clinical data increased the accuracy of the prediction by 17%, from 70% to 87%. For the onset of diabetes, genetic data did not improve the prediction. Finally, we analyzed the possible relationship between blood concentrations of a commonly used atypical antipsychotic, risperidone, and the occurrence of side effects (here, tremor). This study showed that a plasma concentration of the drug above 40 ng/ml should be sought only in cases of insufficient therapeutic response, at the risk of an increased occurrence of side effects such as tremor. 
These results demonstrate the possibility of predicting the occurrence of certain side effects with good accuracy. However, particularly in the field of genetics, further studies are needed to confirm the results obtained in our analyses. Once this step has been taken, it would be possible to use these tools in clinical practice. Ultimately, this could allow prescribers to select the treatments best suited to each patient's specific profile.
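A minimal sketch of the kind of comparison reported in the second part of this work: fit one model on clinical variables alone and one on clinical plus genetic variables, then compare discrimination by the area under the ROC curve. The synthetic data, feature names and logistic regression model are illustrative assumptions; the thesis's actual predictors are not reproduced.

# Compare the AUC of a clinical-only vs a clinical+genetic predictor of >5% weight gain.
# Data, features and model choice are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
clinical = rng.normal(size=(n, 3))             # e.g. age, baseline BMI, early weight change
genetic = rng.binomial(2, 0.3, size=(n, 5))    # risk-allele counts at 5 hypothetical variants
logit = 1.2 * clinical[:, 2] + 0.4 * genetic.sum(axis=1) - 2.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = weight gain >5% at one month

X_clin = clinical
X_full = np.hstack([clinical, genetic])
Xc_tr, Xc_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(X_clin, X_full, y, random_state=0)

auc_clin = roc_auc_score(y_te, LogisticRegression(max_iter=1000).fit(Xc_tr, y_tr).predict_proba(Xc_te)[:, 1])
auc_full = roc_auc_score(y_te, LogisticRegression(max_iter=1000).fit(Xf_tr, y_tr).predict_proba(Xf_te)[:, 1])
print(f"AUC clinical only: {auc_clin:.2f}   AUC clinical + genetic: {auc_full:.2f}")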

Relevance:

90.00%

Publisher:

Abstract:

Some palliative patients at the end of life present symptoms that cannot be relieved despite the use of all the traditional means available. For these patients, when refractory symptoms cause intolerable suffering, palliative sedation is a last-resort measure that can offer transient or even definitive relief. While presenting the ethical issues at stake, this article explores the practical clinical dimensions that may arise during palliative sedation in a patient at the end of life.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Clinical guidelines are essential in implementing and maintaining nationwide stage-specific diagnostic and therapeutic standards. In 2011, the first German expert consensus guideline defined the evidence for diagnosis and treatment of early and locally advanced esophagogastric cancers. Here, we compare this guideline with other national guidelines as well as current literature. METHODS: The German S3-guideline used an approved development process with de novo literature research, international guideline adaptation, or good clinical practice. Other recent evidence-based national guidelines and current references were compared with German recommendations. RESULTS: In the German S3 and other Western guidelines, adenocarcinomas of the esophagogastric junction (AEG) are classified according to formerly defined AEG I-III subgroups due to the high surgical impact. To stage local disease, computed tomography of the chest and abdomen and endosonography are reinforced. In contrast, laparoscopy is optional for staging. Mucosal cancers (T1a) should be endoscopically resected "en-bloc" to allow complete histological evaluation of lateral and basal margins. For locally advanced cancers of the stomach or esophagogastric junction (≥T3N+), preferred treatment is preoperative and postoperative chemotherapy. Preoperative radiochemotherapy is an evidence-based alternative for large AEG type I-II tumors (≥T3N+). Additionally, some experts recommend treating T2 tumors with a similar approach, mainly because pretherapeutic staging is often considered to be unreliable. CONCLUSIONS: The German S3 guideline represents an up-to-date European position with regard to diagnosis, staging, and treatment recommendations for patients with locally advanced esophagogastric cancer. Effects of perioperative chemotherapy versus chemoradiotherapy are still to be investigated for adenocarcinoma of the cardia and the lower esophagus.

Relevance:

80.00%

Publisher:

Abstract:

The accurate estimation of total daily energy expenditure (TEE) in chronic kidney disease patients is essential to allow nutritional requirements to be provided; however, it remains a challenge to measure actual physical activity and resting energy expenditure in maintenance dialysis patients. The direct measurement of TEE, by direct calorimetry or doubly labeled water, cannot be used easily, so in clinical practice TEE is usually estimated from resting energy expenditure and physical activity. Prediction equations may also be used to estimate resting energy expenditure; however, their use has been poorly documented in dialysis patients. Recently, a new system called the SenseWear Armband (BodyMedia, Pittsburgh, PA) was developed to assess TEE, but so far no data have been published in chronic kidney disease patients. The aim of this review is to describe new methods for measuring energy expenditure and physical activity in chronic kidney disease patients.
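To make the "resting energy expenditure times physical activity" estimation approach mentioned above concrete, the sketch below multiplies a resting energy expenditure (REE) prediction equation by a physical activity level (PAL). Mifflin-St Jeor is used only as a well-known example of such an equation; it is not one validated for dialysis patients, and the PAL values shown are generic reference categories.

# TEE estimated as resting energy expenditure (REE) x physical activity level (PAL).
# REE here uses the Mifflin-St Jeor equation as one common example of a prediction
# equation; it is not specific to, or validated for, dialysis patients.
def ree_mifflin_st_jeor(weight_kg, height_cm, age_y, sex):
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_y
    return base + 5 if sex == "M" else base - 161          # kcal/day

def estimated_tee(weight_kg, height_cm, age_y, sex, pal=1.4):
    # PAL ~1.2-1.4 sedentary, ~1.6-1.7 moderately active (generic reference values)
    return ree_mifflin_st_jeor(weight_kg, height_cm, age_y, sex) * pal

print(round(estimated_tee(70, 170, 60, "M", pal=1.3)))  # kcal/day for a hypothetical patient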

Relevance:

80.00%

Publisher:

Abstract:

Pain is frequent in the intensive care unit (ICU) and its management is a major issue for nurses. The assessment of pain is a prerequisite for appropriate pain management. However, pain assessment is difficult when patients are unable to communicate about their experience, and nurses have to base their evaluation on external signs. Clinical practice guidelines highlight the need to use behavioral scales that have been validated for nonverbal patients. Current behavioral pain tools for ICU patients unable to communicate may not be appropriate for nonverbal brain-injured ICU patients, as these patients demonstrate specific responses to pain. This study aimed to identify, describe and validate pain indicators and descriptors in brain-injured ICU patients. A quantitative-dominant, multiphase mixed-methods design was chosen for this study. The first phase aimed to identify indicators and descriptors of pain for nonverbal brain-injured ICU patients using data from three sources: an integrative literature review, a consultation using the nominal group technique with 18 experienced clinicians (12 nurses and 6 physicians), and the results of an observational pilot study with 10 traumatic brain-injured patients. The results of this first phase identified 6 indicators and 47 behavioral, vocal and physiological descriptors of pain that could be included in a pain assessment tool for this population. The sequential second phase tested the psychometric properties of the list of previously identified indicators and descriptors. Content validity was tested with 10 clinical and 4 scientific experts for pertinence and comprehensibility using a structured questionnaire. This process resulted in 33 descriptors being selected out of the 47 previously identified, and six validated indicators. The psychometric properties of the descriptors and indicators were then tested at rest, during non-nociceptive stimulation and during nociceptive stimulation (turning) in a sample of 116 brain-injured ICU patients hospitalized in two university centers. Results showed important variations in the descriptors observed during the nociceptive stimulation, probably due to the heterogeneity of patients' levels of consciousness. Ten descriptors were excluded, as they were observed less than 5% of the time or their reliability was insufficient. All physiological descriptors were deleted, as they showed little variability and inter-observer reliability was lacking. Concurrent validity, testing the association between patients' self-reports of pain and measures performed using the descriptors, was acceptable during nociceptive stimulation (rs=0.527, p=0.003, n=30). 
However, convergent validity (testing the association between the nurses' pain assessments and measures made with the descriptors) and divergent validity (testing the ability of the indicators to discriminate between rest and a nociceptive stimulation) varied according to the level of consciousness. These results highlight the need to study pain descriptors in brain-injured patients with different levels of consciousness and to take into account the heterogeneity of this population in designing a pain assessment tool for nonverbal brain-injured ICU patients.
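A minimal sketch of the concurrent-validity check reported above: a Spearman rank correlation between patients' self-reported pain and the score derived from the behavioral descriptors. The paired scores are invented.

# Spearman correlation between self-reported pain and a behavioral descriptor score.
from scipy.stats import spearmanr

self_report = [0, 2, 3, 5, 6, 7, 4, 1, 8, 2]       # hypothetical 0-10 self-ratings
descriptor_score = [1, 2, 2, 4, 5, 6, 3, 1, 7, 3]  # hypothetical behavioral scores
rs, p = spearmanr(self_report, descriptor_score)
print(f"rs = {rs:.3f}, p = {p:.3f}")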

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Accurate placement of an external ventricular drain (EVD) for the treatment of hydrocephalus is of paramount importance for its functionality and in order to minimize morbidity and complications. The aim of this study was to compare two different drain insertion assistance tools with the traditional free-hand anatomical landmark method, and to measure efficacy, safety and precision. METHODS: Ten cadaver heads were prepared by opening large bone windows centered on Kocher's points on both sides. Nineteen physicians, divided into two groups (trainees and board-certified neurosurgeons), performed EVD insertions. The target for the ventricular drain tip was the ipsilateral foramen of Monro. Each participant inserted the external ventricular catheter in three different ways: 1) free-hand by anatomical landmarks, 2) neuronavigation-assisted (NN), and 3) XperCT-guided (XCT). The number of ventricular hits and dangerous trajectories, the procedure time, the radiation exposure of patients and physicians, the distance of the catheter tip to the target, and the size of deviations projected in the orthogonal planes were measured and compared. RESULTS: Insertion using XCT increased the probability of ventricular puncture from 69.2% to 90.2% (p = 0.02). Non-assisted placements were significantly less precise (catheter tip-to-target distance 14.3 ± 7.4 mm versus 9.6 ± 7.2 mm, p = 0.0003). The procedure time increased from 3.04 ± 2.06 min to 7.3 ± 3.6 min (p < 0.001). The X-ray exposure for XCT was 32.23 mSv, but could be reduced to 13.9 mSv if patients were initially imaged in the hybrid operating suite. No supplementary radiation exposure is needed for NN if patients are initially imaged according to a navigation protocol. CONCLUSION: This ex vivo study demonstrates significantly improved accuracy and safety using either the NN- or the XCT-assisted method. Therefore, efforts should be undertaken to implement these new technologies into daily clinical practice. However, accuracy has to be balanced against the urgency of EVD placement, as the image-guided insertion techniques entail a longer preparation time due to specific image acquisition and trajectory planning.
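A minimal sketch of the two group comparisons reported above: a chi-squared test for the ventricular puncture rates and a Welch t-test, computed from summary statistics, for the tip-to-target distances. The means, SDs and approximate rates follow the abstract; the group sizes and counts are hypothetical.

# Group comparisons of the kind reported above: puncture success rates (chi-squared)
# and catheter tip-to-target distances (Welch t-test from summary statistics).
# Group sizes and counts are hypothetical; means and SDs follow the abstract.
from scipy.stats import chi2_contingency, ttest_ind_from_stats

# success / failure counts per method (hypothetical denominators)
table = [[27, 12],   # free-hand: ~69% ventricular punctures
         [37, 4]]    # XperCT-guided: ~90% ventricular punctures
chi2, p_rate, dof, _ = chi2_contingency(table)
print(f"puncture rate difference: p = {p_rate:.3f}")

# tip-to-target distance (mm): free-hand vs assisted
t, p_dist = ttest_ind_from_stats(mean1=14.3, std1=7.4, nobs1=39,
                                 mean2=9.6, std2=7.2, nobs2=41,
                                 equal_var=False)
print(f"distance difference: p = {p_dist:.4f}")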