991 results for diagnostic tools


Relevance:

20.00%

Publisher:

Abstract:

The aetiology of autoimmune hepatitis (AIH) is uncertain, but the disease can be triggered in susceptible patients by external factors such as viruses or drugs. AIH usually develops in individuals with a genetic background consisting mainly of certain risk alleles of the major histocompatibility complex (HLA). Many drugs have been linked to AIH phenotypes, which sometimes persist after drug discontinuation, suggesting that they awaken latent autoimmunity. At least three clinical scenarios have been proposed that refer to drug-induced autoimmune liver disease (DIAILD): AIH with drug-induced liver injury (DILI); drug-induced AIH (DI-AIH); and immune-mediated DILI (IM-DILI). In addition, there are instances showing mixed features of DI-AIH and IM-DILI, as well as DILI cases with positive autoantibodies. Histologically distinguishing DILI from AIH remains a challenge. Even more challenging is the differentiation of AIH from DI-AIH when relying mainly on histological features; however, a detailed standardised histological evaluation of large cohorts of AIH and DI-AIH patients would probably reveal more subtle features that could help in the differential diagnosis between the two entities. Growing information on the relationship between drugs and AIH is becoming available, with drugs such as statins and biologic agents more frequently implicated in cases of DIAILD. In addition, there is some evidence that patients diagnosed with DIAILD may have had a previous episode of hepatotoxicity. Further collaborative studies of DIAILD will strengthen the knowledge and understanding of this intriguing and complex disorder, which might represent different phenotypes across the spectrum of disease.


The debate on the merits of observational studies as compared with randomized trials is ongoing. We will briefly touch on this subject and demonstrate the role of cohort studies for the description of infectious disease patterns after transplantation. The potential benefits of cohort studies for the clinical management of patients, beyond the expected gain in epidemiological knowledge, are reviewed. The newly established Swiss Transplantation Cohort Study, and in particular the part focusing on infectious diseases, will serve as an illustration. A neglected area of research is the indirect value of large, multicenter cohort studies. These benefits can range from deepened collaboration to the development of common definitions and guidelines. Unfortunately, very few data exist on the role of such indirect effects in improving the quality of patient management. This review postulates an important role for cohort studies, which should be viewed not as inferior but as complementary to established research tools, in particular randomized trials. Randomized trials remain the least bias-prone method to establish knowledge regarding the significance of diagnostic or therapeutic measures. Cohort studies have the power to reflect a real-world situation and to pinpoint areas of knowledge as well as of uncertainty. A prerequisite is a prospective design with an inclusive data set, coupled with meticulous insistence on data retrieval and quality.


The diagnosis of single-lesion paucibacillary leprosy remains a challenge. Reviews by expert dermatopathologists were compared with quantitative polymerase chain reaction (qPCR) results obtained from 66 single-plaque biopsy samples. Histological findings were graded as high (HP), medium (MP) or low (LP) probability of leprosy, or as other dermatopathy (OD). Mycobacterium leprae-specific genes were detected using qPCR. The biopsies of 47 out of 57 clinically diagnosed patients who received multidrug therapy were classified as HP/MP, eight of which were qPCR negative. In the LP/OD group (n = 19), two out of eight untreated patients showed positive qPCR results. In the absence of typical histopathological features, qPCR may be utilised to aid in the final patient diagnosis, thus reducing overtreatment and diagnostic delay.


Many patients with Chagas disease live in remote communities that lack both the equipment and the trained personnel needed to perform a diagnosis by conventional serology (CS). Thus, reliable tests suitable for use under difficult conditions are required. In this study, we evaluated the ability of personnel with and without laboratory skills to perform immunochromatographic (IC) tests to detect Chagas disease at a primary health care centre (PHCC). We examined whole blood samples from 241 patients and serum samples from 238 patients. We then calculated the percentage of overall agreement (POA) between the readings of the two groups of operators, as well as the sensitivity (S), specificity (Sp) and positive (PPV) and negative (NPV) predictive values of the IC tests compared with CS. We also evaluated the level of agreement between ELISAs and indirect haemagglutination (IHA) tests. The readings of the IC test results showed 100% agreement (POA = 1). The IC test on whole blood showed the following values: S = 87.3%; Sp = 98.8%; PPV = 96.9% and NPV = 95.9%. Additionally, the IC test on serum displayed the following results: S = 95.7%; Sp = 100%; PPV = 100% and NPV = 98.2%. Using whole blood, the agreement with ELISA was 96.3% and the agreement with IHA was 94.1%. Using serum, the agreement with ELISA was 97.8% and the agreement with IHA was 96.6%. The IC test performance with serum samples was excellent and demonstrated its usefulness in a PHCC with minimal equipment. If the S value and NPV of the IC test with whole blood are improved, then this test could also be used in areas lacking laboratories or specialised personnel.
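The S, Sp, PPV and NPV values reported above all derive from a 2×2 comparison against the reference test; a minimal sketch of the calculation (the counts below are hypothetical, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 counts
    against a reference standard (here, conventional serology)."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among reference-positives
        "specificity": tn / (tn + fp),  # true negatives among reference-negatives
        "ppv": tp / (tp + fp),          # reference-positives among test-positives
        "npv": tn / (tn + fn),          # reference-negatives among test-negatives
    }

# Hypothetical counts, for illustration only
metrics = diagnostic_metrics(tp=90, fp=5, fn=10, tn=95)
print({k: round(v * 100, 1) for k, v in metrics.items()})
```

POA, by contrast, is simply the proportion of samples on which the two operator groups gave the same reading.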


Introduction: The high prevalence of disease-related hospital malnutrition justifies the need for screening tools and early detection in patients at risk of malnutrition, followed by an assessment targeted towards diagnosis and treatment. At the same time, there is clear undercoding of malnutrition diagnoses and of the procedures to correct it. Objectives: To describe the INFORNUT program/process and its development as an information system; to quantify performance in its different phases; to cite other tools used as a coding source; to calculate the coding rates for malnutrition diagnoses and related procedures; to show the relationship to mean stay, mortality rate and urgent readmission; and to quantify its impact on the hospital complexity index and its effect on the justification of hospitalization costs. Material and methods: The INFORNUT® process is based on an automated screening program for the systematic detection and early identification of malnourished patients on hospital admission, as well as their assessment, diagnosis, documentation and reporting. Of the total readmissions with stays longer than three days incurred in 2008 and 2010, we recorded patients who underwent analytical screening with an alert for medium or high risk of malnutrition, as well as the subgroup of patients in whom we were able to administer the complete INFORNUT® process, generating a report for each.
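An analytical screening step of this kind can be sketched as a simple rule over routine laboratory values; the cut-offs and scoring below are hypothetical placeholders, since the abstract does not specify INFORNUT's actual filter:

```python
def malnutrition_alert(albumin_g_dl, lymphocytes_per_ul, cholesterol_mg_dl):
    """Return a crude risk level from routine labs.
    All thresholds are hypothetical, not INFORNUT's actual cut-offs."""
    points = 0
    if albumin_g_dl < 3.0:        # marked hypoalbuminaemia
        points += 2
    elif albumin_g_dl < 3.5:      # mild hypoalbuminaemia
        points += 1
    if lymphocytes_per_ul < 1200: # lymphopenia
        points += 1
    if cholesterol_mg_dl < 140:   # hypocholesterolaemia
        points += 1
    if points >= 3:
        return "high"
    if points >= 1:
        return "medium"
    return "low"

print(malnutrition_alert(2.8, 1000, 130))
```

In the real process, an alert at medium or high risk triggers the full assessment, diagnosis and reporting workflow described above.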


Tissue microarray technology was used to establish immunohistochemistry protocols and to determine the specificity of new antisera against various Chlamydia-like bacteria for future use on formalin-fixed and paraffin-embedded tissues. The antisera exhibited strong reactivity against autologous antigen and closely related heterologous antigen, but no cross-reactivity with distantly related species.


BACKGROUND Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. METHODS Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians' perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. 
DISCUSSION This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process.
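The required-sample calculation in the methods follows the standard normal-approximation formula for estimating a proportion, n = z²·p·(1−p)/e²; a minimal sketch (z = 1.96 for 95% confidence is an assumption, and any rounding or further adjustment the authors applied is not stated in the abstract):

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Minimum n to estimate a proportion p to within +/- margin,
    using the normal approximation (default z: 95% confidence)."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# Generic example: most conservative case, p = 0.5 with a 5% margin
print(sample_size_proportion(0.5, 0.05))
```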


There is insufficient evidence on the usefulness of dengue diagnostic tests under routine conditions. We sought to analyse how physicians use dengue diagnostics in order to inform research and development. Subjects attending 14 health institutions in an endemic area of Colombia who either had a clinical diagnosis of dengue or had a dengue test ordered were included in the study. Patterns of test use are described herein. Factors associated with the ordering of dengue diagnostic tests were identified using contingency tables, nonparametric tests and logistic regression. A total of 778 subjects were diagnosed with dengue by the treating physician, of whom 386 (49.5%) were tested for dengue. Another 491 dengue tests were ordered for subjects whose primary diagnosis was not dengue. Severe dengue classification [odds ratio (OR) 2.2; 95% confidence interval (CI) 1.1-4.5], emergency consultation (OR 1.9; 95% CI 1.4-2.5) and month of the year (OR 3.1; 95% CI 1.7-5.5) were independently associated with the ordering of dengue tests. Dengue tests were used both to rule in and to rule out the diagnosis. The latter use is not justified by the sensitivity of current rapid dengue diagnostic tests. Ordering of dengue tests appears to depend on a combination of factors, including physician and institutional preferences, as well as other patient and epidemiological factors.
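The crude version of an odds ratio like those quoted above can be computed from a 2×2 table with a Woolf (log-OR) confidence interval; a minimal sketch (the counts are hypothetical, and the study's reported ORs come from logistic regression, i.e. adjusted estimates, not this crude calculation):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts, for illustration only
print(odds_ratio_ci(30, 70, 20, 100))
```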


The log-ratio methodology makes available powerful tools for analyzing compositional data. Nevertheless, the use of this methodology is only possible for those data sets without null values. Consequently, in those data sets where zeros are present, a previous treatment becomes necessary. Recent advances in the treatment of compositional zeros have centered especially on zeros of a structural nature and on rounded zeros. These tools do not contemplate the particular case of count compositional data sets with null values. In this work we deal with "count zeros" and we introduce a treatment based on a mixed Bayesian-multiplicative estimation. We use the Dirichlet probability distribution as a prior and estimate the posterior probabilities. Then we apply a multiplicative modification to the non-zero values. We present a case study where this new methodology is applied.

Key words: count data, multiplicative replacement, composition, log-ratio analysis
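The Bayesian-multiplicative treatment can be sketched as follows: each zero count is replaced by its Dirichlet posterior-mean proportion, and the observed (non-zero) parts are multiplicatively rescaled so the composition still sums to one. A minimal sketch, assuming a uniform Dirichlet prior (the prior strength is a modelling choice, not fixed by the abstract):

```python
def bayes_mult_replace(counts, prior=1.0):
    """Replace zero counts in a composition via Dirichlet posterior means,
    then multiplicatively rescale the non-zero parts to preserve the unit sum."""
    n = sum(counts)
    k = len(counts)
    s = prior * k  # total prior strength (uniform prior: alpha_i = prior)
    # Posterior-mean proportion for each zero part: alpha_i / (n + s)
    replaced = {i: prior / (n + s) for i, x in enumerate(counts) if x == 0}
    zero_mass = sum(replaced.values())
    result = []
    for i, x in enumerate(counts):
        if x == 0:
            result.append(replaced[i])
        else:
            # Multiplicative adjustment of the observed proportions
            result.append((x / n) * (1.0 - zero_mass))
    return result

composition = bayes_mult_replace([12, 0, 3, 0, 5])
print([round(v, 4) for v in composition], round(sum(composition), 6))
```

The multiplicative step keeps the ratios among the non-zero parts unchanged, which is what makes the replaced composition safe for subsequent log-ratio analysis.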


INTRODUCTION Tolerability and convenience are crucial aspects for the long-term success of combined antiretroviral therapy (cART). The aim of this study was to investigate the impact, in routine clinical practice, of switching to the single-tablet regimen (STR) RPV/FTC/TDF in patients intolerant to previous cART, in terms of patients' well-being, assessed by several validated measures. METHODS Prospective, multicenter study. Adult HIV-infected patients with a viral load under 1,000 copies/mL while receiving stable cART for at least the previous three months, who were switched to RPV/FTC/TDF due to intolerance to the previous regimen, were included. Analyses were performed by intention to treat (ITT). Presence/magnitude of symptoms (ACTG-HIV Symptom Index), quality of life (EQ-5D, EUROQoL & MOS-HIV), adherence (SMAQ), treatment preference and perceived ease of medication (ESTAR) were assessed through 48 weeks. RESULTS An interim analysis of 125 patients with 16 weeks of follow-up was performed. One hundred (80%) were male; mean age was 46 years. Mean CD4 count at baseline was 629.5±307.29 and 123 (98.4%) had a viral load <50 copies/mL; 15% were HCV co-infected. Ninety-two (73.6%) patients switched from an NNRTI (84.8% from EFV/FTC/TDF) and 33 (26.4%) from a PI/r. The most frequent reasons for switching were psychiatric disorders (51.2%), CNS adverse events (40.8%), gastrointestinal disorders (19.2%) and metabolic disorders (19.2%). At the time of this analysis (week 16), four patients (3.2%) had discontinued treatment: one due to adverse events, two due to virologic failure and one with no data. A total of 104 patients (83.2%) were virologically suppressed (<50 copies/mL). The average degree of discomfort in the ACTG-HIV Symptom Index decreased significantly from baseline (21±15.55) to week 4 (10.89±12.36) and week 16 (10.81±12.62), p<0.001. In all patients, quality-of-life tools showed a significant benefit in well-being (Table 1).
Adherence to therapy (SMAQ) increased significantly and progressively from baseline (54.4%) to week 4 (68%), p<0.001, and to week 16 (72.0%), p<0.001. CONCLUSIONS Switching to RPV/FTC/TDF from another ARV regimen due to toxicity significantly improved the quality of life of HIV-infected patients, in both mental and physical components, and improved adherence to therapy while maintaining a good immune and virological response.


A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials. In a breakout session, workshop attendees discussed necessary data elements and standards for the accurate measurement of DILI risk associated with new therapeutic agents in clinical trials. There was agreement that in order to achieve this goal the systematic acquisition of protocol-specified clinical measures and lab specimens from all study subjects is crucial. In addition, standard DILI terms that address the diverse clinical and pathologic signatures of DILI were considered essential. There was a strong consensus that clinical and lab analyses necessary for the evaluation of cases of acute liver injury should be consistent with the US Food and Drug Administration (FDA) guidance on pre-marketing risk assessment of DILI in clinical trials issued in 2009. A recommendation that liver injury case review and management be guided by clinicians with hepatologic expertise was made. Of note, there was agreement that emerging DILI signals should prompt the systematic collection of candidate pharmacogenomic, proteomic and/or metabonomic biomarkers from all study subjects. The use of emerging standardized clinical terminology, CRFs and graphic tools for data review to enable harmonization across clinical trials was strongly encouraged. Many of the recommendations made in the breakout session are in alignment with those made in the other parallel sessions on methodology to assess clinical liver safety data, causality assessment for suspected DILI, and liver safety assessment in special populations (hepatitis B, C, and oncology trials). Nonetheless, a few outstanding issues remain for future consideration.


Leprosy inflammatory episodes [type 1 (T1R) and type 2 (T2R) reactions] represent the major cause of irreversible nerve damage. Leprosy serology is known to be influenced by the patient's bacterial index (BI), with higher positivity in multibacillary (MB) patients, and specific multidrug therapy (MDT) reduces antibody production. This study used ELISA to evaluate antibody responses to the leprosy Infectious Disease Research Institute diagnostic-1 (LID-1) fusion protein and phenolic glycolipid I (PGL-I) in 100 paired serum samples from 50 MB patients, collected in the presence/absence of reactions and, in nonreactional patients, before/after MDT. Patients who presented T2R had a median BI of 3+, while MB patients with T1R and nonreactional patients had a median BI of 2.5+ (p > 0.05). Anti-LID-1 and anti-PGL-I antibodies declined in patients diagnosed during T1R (p < 0.05). Anti-LID-1 levels waned in MB patients with T2R at diagnosis and in nonreactional MB patients (p < 0.05). Higher anti-LID-1 levels were seen in patients with T2R at diagnosis (vs. patients with T1R at diagnosis, p = 0.008; vs. nonreactional patients, p = 0.020) and in patients with T2R during MDT (vs. nonreactional MB patients, p = 0.020). In MB patients, high and persistent anti-LID-1 antibody levels might be a useful tool for clinicians to predict which patients are more susceptible to developing leprosy T2R.