817 results for Patient Management
Abstract:
Intoxications are a frequent problem in the ER. In the vast majority of cases, supportive treatment is sufficient. Severe intoxications with unknown agents are considered an indication for a urinary drug screen, which is recommended by several toxicology centers. However, its usefulness for patient management remains uncertain. Study objectives: Evaluation of the impact of a urinary drug screen (Biosite Triage TOX Drug Screen) testing 11 substances (acetaminophen, amphetamines, methamphetamines, barbiturates, benzodiazepines, cocaine, methadone, opioids, phencyclidine, cannabis, tricyclic antidepressants) on initial adult patient management in the emergency department of a university hospital with ~35,000 annual admissions. Methods: Observational retrospective analysis of all tests performed between 09/2009 and 09/2010. A test was defined as useful if it resulted in the administration of a specific antidote (flumazenil/naloxone), the use of a quantitative confirmatory toxicologic test, or a change in the patient's disposition. Results: 57 tests were performed. Patient age was 32 ± 11 (SD) years; 58% were men; 30% were also intoxicated with alcohol. Two patients died (3.5%): the first of a diphenhydramine overdose, the other of a hypertensive intracerebral hemorrhage believed to be caused by cocaine abuse despite a negative urine test. Test indications were: 54% first psychotic episode; 25% acute respiratory failure; 18% coma; 12% seizure; 11% opioid toxidrome; 7% sympathomimetic toxidrome; 5% hypotension; 4% ventricular arrhythmia (VT, VF, torsades de pointes) or long QT. 75% of tests were positive for >=1 substance (mean 1.7 ± 0.9). 47% of results were unexpected by history. 18% of results influenced patient management: 7% had a negative test that confirmed the diagnosis of endogenous psychosis in a first psychotic episode and allowed transfer to psychiatry; 5% received flumazenil/naloxone; 2% had an acetaminophen blood level measured after a positive screen; finally, 4% had an unexpected methadone abuse that required prolongation of hospital stay. Conclusions: A rapid urinary toxicologic screen was seldom used in our emergency department, and its impact on patient management was marginal: only one in six tests influenced treatment decisions.
Abstract:
Background: Microbiological diagnostic procedures have changed significantly over the last decade. Initially, the implementation of the polymerase chain reaction (PCR) resulted in improved detection tests for microbes that were difficult or even impossible to detect by conventional methods such as culture and serology, especially in community-acquired respiratory tract infections (CA-RTI). A further improvement was the development of real-time PCR, which allows end-point detection and quantification, and many diagnostic laboratories have now implemented this powerful method. Objective: At present, new high-performing and convenient molecular tests have emerged that target in parallel many viruses and bacteria responsible for lower and/or upper respiratory tract infections. The range of test formats and microbial agents detected is evolving very quickly, and the added value of these new tests needs to be studied in terms of better use of antibiotics, better patient management, duration of hospitalization and overall costs. Conclusions: Molecular tools for a better microbial documentation of CA-RTI are now available. Controlled studies are now required to address the clinical relevance of these new methods, such as, for example, the role of some newly detected respiratory viruses or of the microbial DNA load in a particular patient at a particular time. The future challenge for molecular diagnosis will be to become easy to handle, highly efficient and cost-effective, delivering rapid results with a direct impact on clinical management.
Abstract:
For patients with brain tumors, identification of diagnostic and prognostic markers in easily accessible biological material, such as plasma or cerebrospinal fluid (CSF), would greatly facilitate patient management. MIC-1/GDF15 (growth differentiation factor 15) is a secreted protein of the TGF-beta superfamily and emerged as a candidate marker exhibiting increasing mRNA expression during malignant progression of glioma. Determination of MIC-1/GDF15 protein levels by ELISA in the CSF of a cohort of 94 patients with intracranial tumors, including gliomas, meningioma and metastasis, revealed significantly increased concentrations in glioblastoma patients (median, 229 pg/ml) when compared with a control cohort of patients treated for non-neoplastic diseases (median below the limit of detection of 156 pg/ml, p < 0.0001, Mann-Whitney test). However, MIC-1/GDF15 levels were not elevated in the matching plasma samples from these patients. Most interestingly, patients with glioblastoma and increased CSF MIC-1/GDF15 had a shorter survival (p = 0.007, log-rank test). In conclusion, MIC-1/GDF15 protein measured in the CSF may have diagnostic and prognostic value in patients with intracranial tumors.
Abstract:
In this article, Médicos Sin Fronteras (MSF) Spain faces the challenge of selecting, piecing together, and conveying in the clearest possible way the main lessons learnt over the course of the last seven years in the world of medical care for Chagas disease. More than two thousand children under the age of 14 have been treated, the majority of whom come from rural Latin American areas with difficult access. It is based on these lessons, learnt through mistakes and successes, that MSF advocates that medical care for patients with Chagas disease be a reality, in a manner which is inclusive (not exclusive), integrated (with medical, psychological, social, and educational components), and in which the patient is actively followed. This must be a multi-disease approach with permanent quality controls in place, based on primary health care (PHC). Rapid diagnostic tests and new medications should be available, as well as therapeutic plans and patient management (including management of side effects), with standardised flows of medical care within PHC linked to the secondary and tertiary levels, and inclusive of epidemiological surveillance systems.
Abstract:
The use of areal bone mineral density (aBMD) for fracture prediction may be enhanced by considering bone microarchitectural deterioration. Trabecular bone score (TBS) helped in redefining a significant subset of non-osteoporotic women as a higher-risk group. INTRODUCTION: TBS is an index of bone microarchitecture. Our goal was to assess the ability of TBS to predict incident fracture. METHODS: TBS was assessed in 560 postmenopausal women from the Os des Femmes de Lyon cohort, who had a lumbar spine (LS) DXA scan (QDR 4500A, Hologic) between years 2000 and 2001. During a mean follow-up of 7.8 ± 1.3 years, 94 women sustained 112 fragility fractures. RESULTS: At the time of the baseline DXA scan, women with incident fracture were significantly older (70 ± 9 vs. 65 ± 8 years) and had a lower LS_aBMD and LS_TBS (both -0.4 SD, p < 0.001) than women without fracture. The magnitude of fracture prediction was similar for LS_aBMD and LS_TBS (odds ratio [95% confidence interval] = 1.4 [1.2;1.7] and 1.6 [1.2;2.0]). After adjustment for age and prevalent fracture, LS_TBS remained predictive of an increased risk of fracture. Yet, its addition to age, prevalent fracture, and LS_aBMD did not significantly improve fracture prediction. When using the WHO classification, 39% of fractures occurred in osteoporotic women, 46% in osteopenic women, and 15% in women with a T-score > -1. Thirty-seven percent of fractures occurred in the lowest quartile of LS_TBS, regardless of BMD. Moreover, 35% of fractures that occurred in osteopenic women were classified below this LS_TBS threshold. CONCLUSION: In conclusion, LS_aBMD and LS_TBS predicted fractures equally well. In our cohort, the addition of LS_TBS to age and LS_aBMD added only limited information on fracture risk prediction. However, using the lowest quartile of LS_TBS helped in redefining a significant subset of non-osteoporotic women as a higher-risk group, which is important for patient management.
Abstract:
Malignant gliomas, notably glioblastoma, are among the most vascularized and angiogenic cancers, and microvascular proliferation is one of the hallmarks for the diagnosis of glioblastoma. Angiogenesis is regulated by a balance of pro- and antiangiogenic signals; overexpression of VEGF and activation of its receptors, most notably VEGFR-2 and -3, result in endothelial cell proliferation and leaky vasculature. Heterogeneous perfusion and oxygenation, peritumoral edema and increased interstitial pressure are the consequence. Both endothelial and tumour cells are strongly dependent on integrin-mediated adhesion for cell proliferation, survival, migration and invasion. Strategies aiming at inhibition of cell signaling and angiogenesis, including integrin inhibitors, have been clinically investigated in gliomas over the last 5 years. Radiological responses, a decreased requirement for corticosteroids and temporary improvement in performance status have repeatedly been observed. Toxicity was mild to moderate and manageable; notably, there was no evidence of a substantially increased incidence of intracranial bleeding. However, definitive comparative (randomized) investigation has failed to demonstrate improved outcome with single-agent inhibition of the EGFR, PDGFR or VEGF/VEGFR pathways in recurrent glioblastoma. Definitive phase III trials combining the anti-VEGF monoclonal antibody bevacizumab, or cilengitide, a peptidic integrin inhibitor, with temozolomide and radiotherapy are ongoing (accrual completed). The integration of anti-angiogenic strategies in the management of malignant glioma also poses entirely new challenges in patient management: 1) Many agents are known to increase the risk of thrombosis, embolism and intracranial bleeding. 2) Evaluation of treatment efficacy is difficult, and new biomarkers of activity, including functional, metabolic or molecular imaging techniques, are urgently needed. Normalization of the vasculature leads to a decrease in contrast enhancement without necessarily reflecting tumour shrinkage; tumour heterogeneity and putative prognostic or predictive factors require early controlled trials, novel trial designs and endpoints. 3) Activation of alternate pathways and tumour escape mechanisms may require combinations of multiple agents, which is often not feasible owing to regulatory restrictions and potential complex toxicities. Emerging clinical and experimental evidence suggests that anti-angiogenic drugs might need to be combined with drugs targeting tumour adaptive mechanisms, in addition to cytotoxic chemotherapy and irradiation, for a maximal antitumour effect.
Abstract:
OBJECTIVE: The reverse transcriptase inhibitor efavirenz is currently used at a fixed dose of 600 mg/d. However, dosage individualization based on plasma concentration monitoring might be indicated. This study aimed to assess the efavirenz pharmacokinetic profile and interpatient versus intrapatient variability in patients who are positive for human immunodeficiency virus, to explore the relationship between drug exposure, efficacy, and central nervous system toxicity and to build up a Bayesian approach for dosage adaptation. METHODS: The population pharmacokinetic analysis was performed by use of NONMEM based on plasma samples from a cohort of unselected patients receiving efavirenz. With the use of a 1-compartment model with first-order absorption, the influence of demographic and clinical characteristics on oral clearance and oral volume of distribution was examined. The average drug exposure during 1 dosing interval was estimated for each patient and correlated with markers of efficacy and toxicity. The population kinetic parameters and the variabilities were integrated into a Bayesian equation for dosage adaptation based on a single plasma sample. RESULTS: Data from 235 patients with a total of 719 efavirenz concentrations were collected. Oral clearance was 9.4 L/h, oral volume of distribution was 252 L, and the absorption rate constant was 0.3 h(-1). Neither the demographic covariates evaluated nor the comedications showed a clinically significant influence on efavirenz pharmacokinetics. A large interpatient variability was found to affect efavirenz relative bioavailability (coefficient of variation, 54.6%), whereas the intrapatient variability was small (coefficient of variation, 26%). An inverse correlation between average drug exposure and viral load and a trend with central nervous system toxicity were detected. 
This enabled the derivation of a dosing adaptation strategy suitable to bring the average concentration into a therapeutic target range of 1000 to 4000 microg/L, optimizing viral load suppression while minimizing central nervous system toxicity. CONCLUSIONS: The high interpatient and low intrapatient variability values, as well as the potential relationship with markers of efficacy and toxicity, support the therapeutic drug monitoring of efavirenz. However, further evaluation is needed before individualization of the efavirenz dosage regimen based on routine drug level monitoring can be recommended for optimal patient management.
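The population model described above (1 compartment, first-order absorption) can be illustrated numerically. The following is a minimal sketch, not the study's actual NONMEM code, using the reported population estimates (oral clearance 9.4 L/h, oral volume of distribution 252 L, absorption rate constant 0.3 h⁻¹); the function names and the standard 600 mg once-daily efavirenz regimen are assumptions for illustration:

```python
import math

# Reported population estimates (oral clearance and volume implicitly
# include bioavailability F, i.e. CL/F and V/F).
CL = 9.4     # oral clearance, L/h
V = 252.0    # oral volume of distribution, L
KA = 0.3     # first-order absorption rate constant, 1/h
KE = CL / V  # elimination rate constant, 1/h (~0.037)

def concentration(dose_mg: float, t_h: float) -> float:
    """Plasma concentration (mg/L) at t hours after a single oral dose,
    1-compartment model with first-order absorption (Bateman equation)."""
    return (dose_mg * KA) / (V * (KA - KE)) * (
        math.exp(-KE * t_h) - math.exp(-KA * t_h)
    )

def average_concentration(dose_mg: float, tau_h: float) -> float:
    """Steady-state average concentration over one dosing interval (mg/L):
    C_avg = Dose / (CL * tau)."""
    return dose_mg / (CL * tau_h)

# 600 mg once daily: predicted average exposure in microg/L.
c_avg = average_concentration(600, 24) * 1000  # ~2660 microg/L
print(f"Predicted average concentration: {c_avg:.0f} microg/L")
```

With these parameters, the predicted steady-state average concentration for 600 mg once daily is roughly 2660 microg/L, comfortably inside the 1000-4000 microg/L target window quoted above; in the study, an individual patient's measured level and a Bayesian update of these population parameters would be used to adjust the dose toward that window.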
Abstract:
One hundred years after its discovery by Carlos Chagas, American trypanosomiasis, or Chagas disease, remains an epidemiologic challenge. Neither a vaccine nor an ideal specific treatment is available for most chronic cases. Therefore, the current strategy for countering Chagas disease consists of preventive actions against the vector and transfusion-transmitted disease. Here, the present challenges, including congenital and oral transmission of Trypanosoma cruzi infections, as well as the future potential for Chagas disease elimination are discussed in light of the current epidemiological picture. Finally, a list of challenging open questions is presented about Chagas disease control, patient management, programme planning and priority definitions faced by researchers and politicians.
Abstract:
Current applications of cardiac magnetic resonance (CMR) imaging offer a wide spectrum of indications in the setting of acute cardiac care. In particular, CMR is helpful for the differential diagnosis of chest pain by detection of myocarditis and pericarditis. Also, takotsubo cardiomyopathy and acute aortic diseases can be evaluated by CMR and are important differential diagnoses in patients with acute chest pain. In patients with restricted windows for echocardiography, CMR is the method of choice to evaluate complications of acute myocardial infarction (AMI). In AMI, CMR allows for a unique characterization of myocardial damage by quantifying necrosis, microvascular obstruction, oedema (= area at risk), and haemorrhage. These capabilities will help us to better understand the pathophysiological events during infarction and will also allow assessment of new treatment strategies in AMI. To what extent the information on tissue damage will guide patient management is not yet clear, and further research in this field is warranted. In the near future, CMR will certainly become more routine in acute cardiac care units, as manufacturers are now focusing strongly on user-friendliness. Finally, in the next decade or so, MRI of other nuclei such as fluorine and carbon might become a clinical reality, which would allow for metabolic and targeted molecular imaging with excellent sensitivity and specificity.
Abstract:
The prevalence of anemia across studies on patients with inflammatory bowel disease (IBD) is high (30%). Both iron deficiency (ID) and anemia of chronic disease contribute most to the development of anemia in IBD. The prevalence of ID is even higher (45%). Anemia and ID negatively impact the patient's quality of life. Therefore, together with adequate control of disease activity, iron replacement therapy should start as soon as anemia or ID is detected to attain a normal hemoglobin (Hb) and iron status. Many patients will respond to oral iron, but compliance may be poor, whereas intravenous (i.v.) compounds are safe, provide a faster Hb increase and iron store repletion, and present a lower rate of treatment discontinuation. Absolute indications for i.v. iron treatment should include severe anemia, intolerance or inappropriate response to oral iron, severe intestinal disease activity, or use of an erythropoietic stimulating agent. Four different products are principally used in clinical practice, which differ in their pharmacokinetic properties and safety profiles: iron gluconate and iron sucrose (lower single doses), and iron dextran and ferric carboxymaltose (higher single doses). After the initial resolution of anemia and the repletion of iron stores, the patient's hematological and iron parameters should be carefully and periodically monitored, and maintenance iron treatment should be provided as required. New i.v. preparations that allow administration of 1000-1500 mg in a single session, thus facilitating patient management, provide an excellent tool to prevent or treat anemia and ID in this patient population, which in turn avoids allogeneic blood transfusion and improves quality of life.
Abstract:
Propofol is progressively replacing benzodiazepines for sedation during endoscopy, even when the sedation is administered by non-anesthesiologists. Propofol ensures a more rapid induction of sedation and recovery and, in certain conditions, higher patient satisfaction and improved quality of endoscopic examination. Specific training is required to use this drug. Patients at risk of complications should be identified before the endoscopy to optimize patient management with an anesthesiologist. After sedation, psychomotor recovery is faster with propofol compared to traditional sedation agents but tasks requiring particular attention (eg, driving) should be avoided. It is important to advise patients of these restrictions in advance.
Abstract:
In 2009, the World Health Organization (WHO) issued a new guideline that stratifies dengue-affected patients into severe (SD) and non-severe dengue (NSD) (with or without warning signs). To evaluate the new recommendations, we completed a retrospective cross-sectional study of the dengue haemorrhagic fever (DHF) cases reported during an outbreak in 2011 in northeastern Brazil. We investigated 84 suspected DHF patients, including 45 (53.6%) males and 39 (46.4%) females. The ages of the patients ranged from 5 to 83 years, and the median age was 29. According to the DHF/dengue shock syndrome classification, 53 (63.1%) patients were classified as having dengue fever and 31 (36.9%) as having DHF. According to the 2009 WHO classification, 32 (38.1%) patients were grouped as having NSD [4 (4.8%) without warning signs and 28 (33.3%) with warning signs] and 52 (61.9%) as having SD. A better performance of the revised classification in the detection of severe clinical manifestations allows for improved detection of patients with SD and may reduce deaths. The revised classification will not only facilitate effective screening and patient management, but will also enable the collection of standardised surveillance data for future epidemiological and clinical studies.
Abstract:
The debate on the merits of observational studies as compared with randomized trials is ongoing. We will briefly touch on this subject, and demonstrate the role of cohort studies for the description of infectious disease patterns after transplantation. The potential benefits of cohort studies for the clinical management of patients, beyond the expected gain in epidemiological knowledge, are reviewed. The newly established Swiss Transplantation Cohort Study, and in particular the part focusing on infectious diseases, will serve as an illustration. A neglected area of research is the indirect value of large, multicenter cohort studies. These benefits can range from deepened collaboration to the development of common definitions and guidelines. Unfortunately, very few data exist on the role of such indirect effects in improving the quality of patient management. This review postulates an important role for cohort studies, which should not be viewed as inferior but as complementary to established research tools, in particular randomized trials. Randomized trials remain the least bias-prone method to establish knowledge regarding the significance of diagnostic or therapeutic measures. Cohort studies have the power to reflect a real-world situation and to pinpoint areas of knowledge as well as of uncertainty. A prerequisite is a prospective design with an inclusive data set, coupled with meticulous insistence on data retrieval and quality.
Abstract:
Background: Although combination antiretroviral therapy (cART) dramatically reduces rates of AIDS and death, a minority of patients experience clinical disease progression during treatment. Objective: To investigate whether detection of CXCR4 (X4)-specific strains or quantification of X4-specific HIV-1 load predicts clinical outcome. Methods: From the Swiss HIV Cohort Study, 96 participants who initiated cART yet subsequently progressed to AIDS or death were compared with 84 contemporaneous, treated nonprogressors. A sensitive heteroduplex tracking assay was developed to quantify plasma X4 and CCR5 variants and resolve HIV-1 load into coreceptor-specific components. Measurements were analyzed as cofactors of progression in multivariable Cox models adjusted for concurrent CD4 cell count and total viral load, applying inverse probability weights to adjust for sampling bias. Results: Patients with X4 variants at baseline displayed reduced CD4 cell responses compared with those without X4 strains (40 versus 82 cells/microl; P = 0.012). The adjusted multivariable hazard ratio (HR) for clinical progression was 4.8 [95% confidence interval (CI) 2.3-10.0] for those demonstrating X4 strains at baseline. The X4-specific HIV-1 load was a similarly independent predictor, with HR values of 3.7 (95% CI 1.2-11.3) and 5.9 (95% CI 2.2-15.0) for baseline loads of 2.2-4.3 and >4.3 log10 copies/ml, respectively, compared with <2.2 log10 copies/ml. Conclusions: HIV-1 coreceptor usage and X4-specific viral loads strongly predicted disease progression during cART, independent of and in addition to CD4 cell count and total viral load. Detection and quantification of X4 strains promise to be clinically useful biomarkers to guide patient management and study HIV-1 pathogenesis.
Abstract:
Clinical evaluation is an integral part of medical practice. However, recent data have demonstrated that a systematic and standardized evaluation modifies the prognosis of our rheumatoid arthritis patients. The systematic use of activity indexes allows us to better appreciate the needs of our patients and the necessity to optimize and intensify treatment. Likewise, self-evaluation tools bring useful information to patient management.