858 results for Early Diagnosis


Relevance:

70.00%

Abstract:

Many factors have been studied as potential correlates of delayed HIV diagnosis and delayed linkage to HIV healthcare, but few studies have analyzed trust as a correlate of these delays. This study assessed whether diminished patient trust in physicians and in the healthcare system was associated with delays in HIV diagnosis and/or linkage to HIV healthcare among a cohort of newly diagnosed HIV-infected persons in Harris County, Texas.

This study is a secondary data analysis of the Attitude and Beliefs and the Steps of HIV Care Study, also known as the Steps Study, a prospective observational cohort study. From January 2006 to October 2007, patients newly diagnosed with HIV infection and not yet in HIV primary care were recruited from publicly funded HIV testing sites in Houston, Texas.

Two outcomes were assessed. The first was whether decreased levels of trust predicted delays in HIV diagnosis. Trust in physicians and trust in the healthcare system were measured with two validated trust scales, and trust scores of those with late diagnosis (CD4 counts <200 cells/mm3) were compared statistically with those with early diagnosis (CD4 counts ≥200 cells/mm3) in a cross-sectional design. Trust was not found to be predictive of delays in HIV diagnosis.

The second outcome used the same trust scales and a prospective cohort design to assess whether trust scores differed between those who successfully linked to HIV healthcare within 6 months of diagnosis and those who did not. Patients with higher trust in physicians and in the healthcare system were significantly more likely to be linked to HIV healthcare than those with lower trust.

Overall, this study showed that among low-income persons with undiagnosed HIV infection, low trust is not a barrier to timely diagnosis of HIV infection. Trust may, however, be a factor in promoting prompt linkage to HIV healthcare among those who are newly diagnosed.
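
The first analysis boils down to a two-group comparison of trust-scale scores between late and early diagnosis. The sketch below shows that comparison in miniature; the group sizes and scores are synthetic placeholders, not the Steps Study data, and the choice of a t-test is an assumption (the abstract does not name the test used).

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Synthetic stand-ins for trust-in-physician scale scores; the study used
# validated scales, but these values and group sizes are invented.
late_dx = rng.normal(38, 6, size=60)    # CD4 < 200 cells/mm3 at diagnosis
early_dx = rng.normal(39, 6, size=90)   # CD4 >= 200 cells/mm3 at diagnosis

t_stat, p_value = ttest_ind(late_dx, early_dx)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```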

Relevance:

70.00%

Abstract:

Most people presenting with rheumatoid arthritis today can expect to achieve disease suppression, can avoid or substantially delay joint damage and deformities, and can maintain a good quality of life. Optimal management requires early diagnosis and treatment, usually with combinations of conventional disease modifying antirheumatic drugs (DMARDs). If these do not effect remission, biological DMARDs may be beneficial. Lack of recognition of the early signs of rheumatoid arthritis, ignorance of the benefits of early application of modern treatment regimens, and avoidable delays in securing specialist appointments may hinder achievement of best outcomes for many patients. Triage for recognising possible early rheumatoid arthritis must begin in primary care settings with the following pattern of presentation as a guide: involvement of three or more joints; early-morning joint stiffness of greater than 30 minutes; or bilateral squeeze tenderness at metacarpophalangeal or metatarsophalangeal joints.
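
The triage guide at the end of this abstract is effectively a decision rule, so it can be written down directly. A minimal sketch follows; the function name and input fields are illustrative, and the rule treats the three features as alternatives, as the abstract's wording ("or") suggests.

```python
def flags_possible_early_ra(involved_joint_count: int,
                            morning_stiffness_minutes: int,
                            bilateral_squeeze_tenderness: bool) -> bool:
    """Primary-care triage check for possible early rheumatoid arthritis.

    Returns True if any feature from the abstract's presentation
    pattern is present, flagging the patient for specialist referral.
    """
    return (
        involved_joint_count >= 3              # three or more joints involved
        or morning_stiffness_minutes > 30      # early-morning stiffness > 30 min
        or bilateral_squeeze_tenderness        # MCP/MTP squeeze tenderness
    )

# Example: 45 minutes of morning stiffness alone is enough to flag.
print(flags_possible_early_ra(1, 45, False))  # True
```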

Relevance:

70.00%

Abstract:

Objectives: Establishing the diagnosis of infective endocarditis (IE) can be difficult when blood cultures remain sterile or echocardiography is inconclusive. Staphylococcus aureus is a common aetiological microorganism in IE and is associated with severe valvular destruction and increased mortality. Early diagnosis using culture- and antibiotic-independent tests would be preferable, allowing prompt antibiotic administration. We have developed and evaluated two serological assays for the rapid identification of a staphylococcal aetiology in infective endocarditis. The assays measure IgG against whole cells of S. aureus and IgG against lipid S, a novel extracellular antigen released by Gram-positive microorganisms. Methods: Serum was collected from 130 patients with IE and 94 control patients. IgG against whole cells of S. aureus and against lipid S was measured by enzyme-linked immunosorbent assay (ELISA). Results: Anti-lipid S IgG titres were higher in IE caused by Gram-positive microorganisms than in controls (p < 0.0001) and higher in staphylococcal IE than in both controls and IE caused by other microorganisms (p = 0.0003). Anti-whole cell staphylococcal IgG was significantly higher in serum from patients with staphylococcal IE than in IE caused by other microorganisms and control samples (p < 0.0001). Conclusion: High anti-whole cell IgG titres are predictive of a staphylococcal aetiology in IE. Elevated serum anti-lipid S IgG titres are predictive of Gram-positive infection compared to controls, with very high titres associated with staphylococcal IE. © 2005 The British Infection Society.

Relevance:

70.00%

Abstract:

RATIONALE: Limitations in methods for the rapid diagnosis of hospital-acquired infections often delay initiation of effective antimicrobial therapy. New diagnostic approaches offer potential clinical and cost-related improvements in the management of these infections. OBJECTIVES: We developed a decision modeling framework to assess the potential cost-effectiveness of a rapid biomarker assay to identify hospital-acquired infection in high-risk patients earlier than standard diagnostic testing. METHODS: The framework includes parameters representing rates of infection, rates of delayed appropriate therapy, and impact of delayed therapy on mortality, along with assumptions about diagnostic test characteristics and their impact on delayed therapy and length of stay. Parameter estimates were based on contemporary, published studies and supplemented with data from a four-site, observational, clinical study. Extensive sensitivity analyses were performed. The base-case analysis assumed 17.6% of ventilated patients and 11.2% of nonventilated patients develop hospital-acquired infection and that 28.7% of patients with hospital-acquired infection experience delays in appropriate antibiotic therapy with standard care. We assumed this percentage decreased by 50% (to 14.4%) among patients with true-positive results and increased by 50% (to 43.1%) among patients with false-negative results using a hypothetical biomarker assay. Cost of testing was set at $110/d. MEASUREMENTS AND MAIN RESULTS: In the base-case analysis, among ventilated patients, daily diagnostic testing starting on admission reduced inpatient mortality from 12.3 to 11.9% and increased mean costs by $1,640 per patient, resulting in an incremental cost-effectiveness ratio of $21,389 per life-year saved. Among nonventilated patients, inpatient mortality decreased from 7.3 to 7.1% and costs increased by $1,381 with diagnostic testing. The resulting incremental cost-effectiveness ratio was $42,325 per life-year saved. Threshold analyses revealed the probabilities of developing hospital-acquired infection in ventilated and nonventilated patients could be as low as 8.4 and 9.8%, respectively, to maintain incremental cost-effectiveness ratios less than $50,000 per life-year saved. CONCLUSIONS: Development and use of serial diagnostic testing that reduces the proportion of patients with delays in appropriate antibiotic therapy for hospital-acquired infections could reduce inpatient mortality. The model presented here offers a cost-effectiveness framework for future test development.
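
As a rough consistency check on the base-case figures for ventilated patients, the sketch below recomputes the incremental cost-effectiveness ratio (ICER = incremental cost divided by incremental life-years). The life-years gained per patient are backed out from the published ICER, since the abstract does not report them directly.

```python
# Base-case figures for ventilated patients, taken from the abstract.
delta_cost = 1640.0            # added cost per patient ($)
mortality_standard = 0.123     # inpatient mortality, standard care
mortality_testing = 0.119      # inpatient mortality, daily biomarker testing
icer_reported = 21389.0        # $ per life-year saved

# Implied life-years gained per patient (not reported in the abstract).
ly_gained = delta_cost / icer_reported                   # ~0.077
deaths_averted = mortality_standard - mortality_testing  # 0.004

# Implied life-years per death averted (~19), consistent with a
# remaining-life-expectancy assumption for survivors.
print(round(ly_gained / deaths_averted, 1))
```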

Relevance:

70.00%

Abstract:

Background: Delirium is highly prevalent, especially in older patients. It independently leads to adverse outcomes but remains under-detected, particularly its hypoactive forms. Although early identification and intervention are important, delirium prevention is key to improving outcomes. The concept of a delirium prodrome has been mooted for decades but remains poorly characterised. Greater understanding of this prodrome would promote prompt identification of delirium-prone patients and facilitate improved strategies for delirium prevention and management. Methods: Medical inpatients aged ≥70 years were screened for prevalent delirium using the Revised Delirium Rating Scale (DRS-R98). Those without prevalent delirium were assessed daily for delirium development, prodromal features and motor subtype. Survival analysis models identified which prodromal features predicted the emergence of incident delirium in the cohort in the first week of admission. The Delirium Motor Subtype Scale-4 was used to ascertain motor subtype. Results: Of 555 patients approached, 191 were included in the prospective study. The median age was 80 years (IQR 10) and 101 (52.9%) were male. Sixty-one patients developed incident delirium within a week of admission. Several prodromal features predicted delirium emergence. First, using a novel Prodromal Checklist based on the existing literature, and controlling for confounders, seven predictive behavioural features were identified in the prodromal period (for example, increasing confusion and being easily distractible). Additionally, using serial cognitive tests and the daily DRS-R98, multiple cognitive and other core delirium features were detected in the prodrome (for example, inattention and sleep-wake cycle disturbance). Examination of longitudinal motor subtypes in delirium cases showed that subtypes were predominantly stable over time, the most prevalent being the hypoactive subtype (62.3%). Discussion: This thesis explored multiple aspects of delirium in older medical inpatients, with particular focus on characterising the delirium prodrome. These findings should help inform future delirium educational programmes and detection and prevention strategies.
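
The survival analysis in the Methods can be sketched with a Cox proportional hazards model. The snippet below is a minimal illustration using the lifelines library on a made-up miniature dataset; the variable names echo two checklist features from the abstract, but the values are invented.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: time to incident delirium in days (censored at 7)
# plus two candidate prodromal features scored before delirium onset.
df = pd.DataFrame({
    "days_to_delirium":     [2, 7, 4, 7, 3, 7, 5, 1, 6, 7],
    "delirium":             [1, 0, 1, 0, 1, 0, 1, 1, 1, 0],  # 1 = incident case
    "increasing_confusion": [1, 0, 1, 1, 1, 0, 0, 1, 0, 0],
    "easily_distractible":  [1, 0, 0, 0, 1, 1, 0, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_delirium", event_col="delirium")
cph.print_summary()  # hazard ratios for each prodromal feature
```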

Relevance:

70.00%

Abstract:

BACKGROUND: Schistosomiasis remains a major public health issue, with an estimated 230 million people infected worldwide. Novel tools for early diagnosis and surveillance of schistosomiasis are currently needed. Elevated levels of circulating microRNAs (miRNAs) are commonly associated with the initiation and progression of human disease pathology. Hence, serum miRNAs are emerging as promising biomarkers for the diagnosis of a variety of human diseases. This study investigated circulating host miRNAs commonly associated with liver diseases and schistosome parasite-derived miRNAs during the progression of hepatic schistosomiasis japonica in two murine models.

METHODOLOGY/PRINCIPAL FINDINGS: Two mouse strains (C57BL/6 and BALB/c) were infected with a low dosage of Schistosoma japonicum cercariae. The dynamic patterns of hepatopathology, the serum levels of liver injury-related enzymes and the levels of circulating miRNAs (both host and parasite-derived) were then assessed during the progression of schistosomiasis japonica. For the first time, an inverse correlation between the severity of hepatocyte necrosis and the level of liver fibrosis was revealed during S. japonicum infection in BALB/c, but not in C57BL/6, mice. Serum levels of the host circulating miRNAs miR-122, miR-21 and miR-34a were inconsistent between the two murine models during infection, which limits their potential value as individual diagnostic biomarkers for schistosomiasis. However, their serum levels in combination may serve as a novel biomarker mirroring the hepatic immune responses induced in the mammalian host during schistosome infection and the degree of hepatopathology. Further, two circulating parasite-specific miRNAs, sja-miR-277 and sja-miR-3479-3p, were shown to have potential as diagnostic markers for schistosomiasis japonica.
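
The "in combination" idea above amounts to scoring several individually noisy markers jointly. A minimal sketch of one way to do this (a logistic combination evaluated by ROC AUC) is below; the data are synthetic stand-ins, not measurements from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 60
infected = np.repeat([0, 1], n // 2)

# Synthetic serum levels for three host miRNAs (arbitrary units):
# individually weak signals standing in for miR-122, miR-21 and miR-34a.
X = rng.normal(size=(n, 3))
X[infected == 1] += [0.4, 0.3, 0.5]   # hypothetical infection-related shift

panel = LogisticRegression().fit(X, infected)
score = panel.predict_proba(X)[:, 1]   # combined biomarker score
print(roc_auc_score(infected, score))  # in-sample AUC of the panel
```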

CONCLUSIONS/SIGNIFICANCE: We provide the first evidence for the potential of utilizing circulating host miRNAs to indicate different immune responses and the severity of hepatopathology outcomes induced in two murine strains infected with S. japonicum. This study also establishes a basis for the early and cell-free diagnosis of schistosomiasis by targeting circulating schistosome parasite-derived miRNAs.

Relevance:

70.00%

Abstract:

It is still difficult to perform an early and accurate diagnosis of dementia, so much research focuses on finding new dementia biomarkers that can aid in that purpose, ideally through noninvasive, rapid, and relatively inexpensive procedures. Several studies have demonstrated that spectroscopic techniques, such as Fourier Transform Infrared (FTIR) spectroscopy and Raman spectroscopy, could provide a useful and accurate procedure for diagnosing dementia. As several biochemical mechanisms related to neurodegeneration and dementia can lead to changes in plasma components and other peripheral body fluids, blood-based samples combined with spectroscopic analysis can be used as a simpler and less invasive technique. This work is intended to confirm some of the hypotheses of previous studies in which FTIR was used to study plasma samples from possible Alzheimer's disease (AD) patients and respective controls, and to verify the reproducibility of this spectroscopic technique in the analysis of such samples. Through spectroscopic analysis combined with multivariate analysis it is possible to discriminate control and demented samples and to identify key spectroscopic differences between these two groups, which allows the identification of metabolites altered in this disease. It can be concluded that there are three spectral regions, 3500-2700 cm-1, 1800-1400 cm-1 and 1200-900 cm-1, from which relevant spectroscopic information can be extracted. In the first region, the main finding is an imbalance between the content of saturated and unsaturated lipids. In the 1800-1400 cm-1 region it is possible to see the presence of protein aggregates and a shift in protein conformation towards highly stable parallel β-sheet. The last region showed the presence of products of lipid peroxidation, related to membrane impairment, and oxidative damage to nucleic acids. The FTIR technique and the information gathered in this work can be used in the construction of classification models for the diagnosis of cognitive dysfunction.
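
The classification models mentioned in the closing sentence can be sketched as a standard chemometrics pipeline: compress the highly collinear spectra with PCA, then separate the groups with a linear discriminant. The code below is a minimal illustration on synthetic stand-ins for plasma FTIR spectra; the discriminating band is invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic spectra: 40 samples x 900 wavenumber points (e.g. a
# 1800-900 cm-1 window). The "dementia" class gets a small absorbance
# shift in one band, loosely mimicking altered beta-sheet content.
X = rng.normal(size=(40, 900))
y = np.repeat([0, 1], 20)          # 0 = control, 1 = dementia
X[y == 1, 400:430] += 0.8          # hypothetical discriminating band

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(cross_val_score(model, X, y, cv=5).mean())  # cross-validated accuracy
```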

Relevance:

70.00%

Abstract:

Background: Evaluation of myocardial function by speckle-tracking echocardiography is a new method for the early diagnosis of systolic dysfunction. Objectives: We aimed to determine myocardial speckle-tracking echocardiography indices in Kawasaki Disease (KD) patients and compare them with the same indices in control subjects. Patients and Methods: Thirty-two patients (65.5% males) with KD and 19 control subjects with normal echocardiography participated in this study. After their demographic characteristics and clinical findings were recorded, all participants underwent transthoracic echocardiography. Strain (S), Strain Rate (SR), Time to Peak Strain (TPS) and Time to Peak Strain Rate (TPSR), longitudinal velocity, and view point velocity images in the two-, three-, and four-chamber views were obtained semi-automatically via speckle-tracking echocardiography. Results: Twenty-four patients (75%) were younger than 4 years. Mean global S and SR were significantly reduced in the KD patients compared to controls (17.03 ± 1.28 vs. 20.22 ± 2.14% and 1.66 ± 0.16 vs. 1.97 ± 0.25 1/second, respectively), while there were no significant differences in mean TPS, TPSR, longitudinal velocity or view point velocity. Using repeated-measures analysis of variance, we observed that S and SR decreased from basal to apical level in both groups. The change in the pattern of age-adjusted mean S and SR across levels was significantly different between the groups (P < 0.001 for both parameters). Conclusions: We showed changes in S and SR in KD patients versus control subjects in the acute phase of KD. However, we suggest that further studies be undertaken to compare S and SR in the acute phase and thereafter in KD patients.

Relevance:

60.00%

Abstract:

Survival from melanoma is strongly related to tumour thickness, so earlier diagnosis has the potential to reduce mortality from this disease. However, in the absence of conclusive evidence that clinical skin examination reduces mortality, evidence-based assessments do not recommend population screening. We aimed to assess whether clinical whole-body skin examination is associated with a reduced incidence of thick melanoma, and whether screening is associated with an increased incidence of thin lesions (possible overdiagnosis). We conducted a population-based case-control study of all Queensland residents aged 20-75 years with a histologically confirmed first primary invasive cutaneous melanoma diagnosed between January 2000 and December 2003. Telephone interviews were completed by 3,762 eligible cases (78.0%) and 3,824 eligible controls (50.4%). Whole-body clinical skin examination in the three years before diagnosis was associated with a 14% lower risk of being diagnosed with a thick melanoma (>0.75 mm) (OR=0.86, 95% CI=0.75, 0.98). Risk decreased for melanomas of increasing thickness: the risk of being diagnosed with a melanoma 0.76-1.49 mm was reduced by 7% (OR=0.93, 95% CI=0.79, 1.10), by 17% for melanomas 1.50-2.99 mm (OR=0.83, 95% CI=0.65, 1.05) and by 40% for melanomas ≥3 mm (OR=0.60, 95% CI=0.43, 0.83). Screening was associated with a 38% higher risk of being diagnosed with a thin invasive melanoma (≤0.75 mm) (OR=1.38, 95% CI=1.22, 1.56). This is the strongest evidence to date that whole-body clinical skin examination reduces the incidence of thick melanoma. Because survival from melanoma is strongly related to tumour thickness, these results suggest that screening would reduce melanoma mortality.
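
The study's effect measures are odds ratios from a case-control design. For readers unfamiliar with the calculation, the sketch below computes a crude OR and its 95% confidence interval (Woolf method) from a hypothetical 2x2 table; the counts are illustrative, and the published estimates were model-adjusted rather than crude.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical counts: exposure = whole-body clinical skin examination in
# the 3 years before diagnosis; cases = thick melanoma; controls as sampled.
screened_cases, unscreened_cases = 120, 200
screened_controls, unscreened_controls = 1700, 2124

odds_ratio = (screened_cases / unscreened_cases) / \
             (screened_controls / unscreened_controls)

# 95% CI on the log-odds scale (Woolf method).
se_log_or = np.sqrt(1 / screened_cases + 1 / unscreened_cases
                    + 1 / screened_controls + 1 / unscreened_controls)
z = norm.ppf(0.975)
ci = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log_or)
print(round(odds_ratio, 2), np.round(ci, 2))  # OR < 1 means lower risk
```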

Relevance:

60.00%

Abstract:

Diabetic peripheral neuropathy (DPN) is one of the most debilitating complications of diabetes and a major cause of foot ulceration and lower limb amputation. Early diagnosis and management are key factors in reducing morbidity and mortality. Current techniques for clinical assessment of DPN are relatively insensitive for detecting early disease or involve invasive procedures such as skin biopsies, so there is a need for less painful, non-invasive and safe evaluation methods. Eye care professionals already play an important role in the management of diabetic retinopathy; however, recent studies have indicated that the eye may also be an important site for the diagnosis and monitoring of neuropathy. Corneal nerve morphology has been shown to be a promising marker of diabetic neuropathy occurring elsewhere in the body, and emerging evidence tentatively suggests that retinal anatomical markers and a range of functional visual indicators could similarly provide useful information regarding neural damage in diabetes, although this line of research is, as yet, less well established. This review outlines the growing body of evidence supporting a potential diagnostic role for retinal structural and visual functional markers in the diagnosis and monitoring of peripheral neuropathy in diabetes.

Relevance:

60.00%

Abstract:

Diabetes is an increasingly prevalent disease worldwide, and early management of its complications can prevent morbidity and mortality in this population. Peripheral neuropathy, a significant complication of diabetes, is the major cause of foot ulceration and amputation in diabetes, and delay in attending to complications of the disease contributes to significant medical expenses for diabetic patients and the community. Early structural changes to the neural components of the retina have been demonstrated to occur prior to the clinically visible retinal vasculature complication of diabetic retinopathy. Additionally, visual function loss has been shown to exist before the ophthalmoscopic manifestations of vasculature damage. The purpose of this thesis was to evaluate the relationship between diabetic peripheral neuropathy and both retinal structure and visual function. The key question was whether diabetic peripheral neuropathy is the potential underlying factor responsible for retinal anatomical change and visual function loss in people with diabetes. This study was conducted on a cohort with type 2 diabetes. Retinal nerve fibre layer (RNFL) thickness was assessed by means of Optical Coherence Tomography (OCT). Visual function was assessed using two different methods: Standard Automated Perimetry (SAP) and flicker perimetry, both performed within the central 30 degrees of fixation. The level of diabetic peripheral neuropathy (DPN) was assessed using two techniques, Quantitative Sensory Testing (QST) and the Neuropathy Disability Score (NDS), which are known to be capable of detecting DPN at very early stages. NDS has also been shown to be a gold standard for detecting 'risk of foot ulceration'. Findings reported in this thesis showed that RNFL thickness, particularly in the inferior quadrant, has a significant association with severity of DPN when the condition is assessed using NDS. More specifically, inferior RNFL thickness was able to differentiate individuals at higher risk of foot ulceration from those at lower risk, indicating that RNFL thickness can predict late-stage DPN. Investigating the association between RNFL thickness and QST did not reveal any meaningful association, indicating that RNFL thickness in this cohort was not as predictive of neuropathy status as NDS. In both of these studies, control participants did not differ from the type 2 cohort without DPN, suggesting that RNFL thickness is not a marker for diagnosing DPN at early stages. The latter finding also indicated that diabetes per se is unlikely to affect RNFL thickness. Visual function as measured by SAP and flicker perimetry was found to be associated with severity of peripheral neuropathy as measured by NDS. These measures were also capable of differentiating individuals at higher risk of foot ulceration; however, visual function also proved not to be a marker for early diagnosis of DPN. Neither SAP nor flicker sensitivity had meaningful associations with DPN when neuropathy status was measured using QST. Importantly, diabetic retinopathy did not explain any of the findings in these experiments. The work described here is valuable as no other research to date has investigated the association between diabetic peripheral neuropathy and either retinal structure or visual function.

Relevance:

60.00%

Abstract:

Background: Queensland men aged 50 years and older are at high risk of melanoma. Early detection via skin self-examination (SSE), particularly whole-body SSE, followed by presentation to a doctor with suspicious lesions, may decrease morbidity and mortality from melanoma. The prevalence of whole-body SSE (wbSSE) is lower in older Queensland men than in other population subgroups. With the exception of the present study, no previous research has investigated the determinants of wbSSE in older men, or interventions to increase the behaviour in this population. Furthermore, although past SSE intervention studies for other populations have cited health behaviour models in the development of interventions, no study has tested these models in full. The Skin Awareness Study: A recent randomised trial, the Skin Awareness Study, tested the impact of a video-delivered intervention compared to written materials alone on wbSSE in men aged 50 years or older (n=930). Men were recruited from the general population and interviewed over the telephone at baseline and at 13 months. The proportion of men who reported wbSSE rose from 10% to 31% in the control group, and from 11% to 36% in the intervention group. Current research: The current research was a secondary analysis of data collected for the Skin Awareness Study. The objectives were as follows: • To describe how men who did not take up any SSE during the study period differed from those who did take up examining their skin. • To determine whether the intervention program was successful in affecting the constructs of the Health Belief Model it was aimed at (self-efficacy, perceived threat, and outcome expectations), and whether this in turn influenced wbSSE. • To determine whether the Health Action Process Approach (HAPA) was a better predictor of wbSSE behaviour than the Health Belief Model (HBM). Methods: For objective 1, men who did not report any past SSE at baseline (n=308) were categorised as having ‘taken up SSE’ (reported SSE at study end) or ‘resisted SSE’ (reported no SSE at study end). Bivariate logistic regression, followed by multivariable regression, investigated the association between participant characteristics measured at baseline and resisting SSE. For objective 2, proxy measures of self-efficacy, perceived threat, and outcome expectations were selected. To determine whether these mediated the effect of the intervention on the outcome, a mediator analysis was performed with all participants who completed interviews at both time points (n=830), following the Baron and Kenny approach modified for use with structural equation modelling (SEM). For objective 3, control group participants only were included (n=410). Proxy measures of all HBM and HAPA constructs were selected, and SEM was used to build up the models and test the significance of each hypothesised pathway. A likelihood ratio test compared the HAPA to the HBM. Results: Amongst men who did not report any SSE at baseline, 27% did not take up any SSE by the end of the study. In multivariable analyses, resisting SSE was associated with having more freckly skin (p=0.027); being unsure about the statement ‘if I saw something suspicious on my skin, I’d go to the doctor straight away’ (p=0.028); not intending to perform SSE (p=0.015); having lower SSE self-efficacy (p<0.001); and having no recommendation for SSE from a doctor (p=0.002). In the mediator analysis none of the tested variables mediated the relationship between the intervention and wbSSE.
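
The mediator analysis follows the Baron and Kenny logic: the intervention should predict the outcome, the intervention should predict the mediator, and the mediator should predict the outcome with the direct intervention effect shrinking once it is included. A minimal regression-based sketch is below (the thesis used an SEM adaptation); all data are synthetic placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 830  # matches the analysis sample size; the values are invented

# Hypothetical variables: trial arm, one candidate mediator (e.g. an
# SSE self-efficacy score), and wbSSE at follow-up (binary).
arm = rng.integers(0, 2, size=n)
mediator = 0.5 * arm + rng.normal(size=n)
wbsse = (0.8 * mediator - 0.2 + rng.logistic(size=n) > 0).astype(int)

# Step 1: the intervention should predict the outcome.
step1 = sm.Logit(wbsse, sm.add_constant(arm)).fit(disp=0)
# Step 2: the intervention should predict the mediator.
step2 = sm.OLS(mediator, sm.add_constant(arm)).fit()
# Step 3: with the mediator included, the direct effect should shrink.
X3 = sm.add_constant(np.column_stack([arm, mediator]))
step3 = sm.Logit(wbsse, X3).fit(disp=0)

print(step1.params[1], step2.params[1], step3.params[1:])
```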
With regard to the health behaviour models, the HBM did not predict wbSSE well overall. Only the construct of self-efficacy was a significant predictor of future wbSSE (p=0.001); neither perceived threat (p=0.584) nor outcome expectations (p=0.220) were. By contrast, when the HAPA constructs were added, all three HBM variables predicted intention to perform SSE, which in turn predicted future behaviour (p=0.015). The HAPA construct of volitional self-efficacy was also associated with wbSSE (p=0.046). The HAPA was a significantly better model than the HBM (p<0.001). Limitations: Items selected to measure the HBM and HAPA model constructs for objectives 2 and 3 may not have accurately reflected each construct. Conclusions: This research added to the evidence base on how best to target interventions to older men and on the appropriateness of particular health behaviour models to guide interventions. The findings indicate that, to overcome resistance, men with more negative pre-existing attitudes to SSE (not intending to do it, lower initial self-efficacy) may need to be targeted with more intensive interventions in the future. Involving general practitioners in recommending SSE to their patients, alongside disseminating an intervention, may increase its success. Comparison of the HBM and HAPA showed that while two of the three HBM variables examined did not directly predict future wbSSE, all three were associated with intention to self-examine skin. This suggests that in this population, intervening on these variables may increase intention to examine skin, but not necessarily the behaviour itself. Future interventions could focus on increasing the motivational variables of perceived threat and outcome expectations, together with both action and volitional self-efficacy, with the aim of increasing intention as well as its translation into taking up and maintaining regular wbSSE.

Relevance:

60.00%

Abstract:

The purpose of this paper is to review the incidence of upper-body morbidity (arm and breast symptoms, impairments, and lymphedema), methods for diagnosis, and prevention and treatment strategies, and to highlight the evidence base for integration of prospective surveillance for upper-body morbidity within standard clinical care of women with breast cancer. Between 10% and 64% of women report upper-body symptoms between 6 months and 3 years after breast cancer, and approximately 20% develop lymphedema. Symptoms remain common into longer-term survivorship, and although lymphedema may be transient for some, those who present with mild lymphedema are at increased risk of developing moderate to severe lymphedema. The etiology of morbidity seems to be multifactorial, with the most consistent risk factors being those associated with extent of treatment. However, known risk factors cannot reliably distinguish between those who will and will not develop upper-body morbidity. Upper-body morbidity may be treatable with physical therapy. There is also evidence in support of integrating regular surveillance for upper-body morbidity into the routine care provided to women with breast cancer, with early diagnosis potentially contributing to more effective management and prevention of progression of these conditions.

Relevance:

60.00%

Abstract:

Early detection through whole-body skin self-examination (wbSSE) may decrease mortality from melanoma. Using the Health Action Process Approach (HAPA) and the Health Belief Model (HBM), we aimed to assess determinants of uptake of wbSSE in 410 men aged 50 years or older who participated in the control group of a randomized trial. Overall, the HAPA was a significantly better predictor of wbSSE than the HBM (p < .001). The HBM construct of self-efficacy was a significant predictor of future wbSSE (p = .001), while neither perceived threat (p = .584) nor outcome expectations (p = .220) were. In contrast, in the HAPA, self-efficacy, perceived threat, and outcome expectations all predicted intention to perform SSE, which in turn predicted behavior (p = .015). The HAPA construct of volitional self-efficacy was also associated with wbSSE (p = .046). The use of the HAPA model for future SSE interventions in this population is warranted.
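
The HAPA-versus-HBM comparison rests on a likelihood ratio test between nested models. The sketch below shows the generic calculation; the log-likelihoods and parameter counts are placeholders, and it assumes (as the analysis implies) that the HBM is nested within the HAPA.

```python
from scipy.stats import chi2

# Placeholder fit statistics for two nested models.
ll_hbm, k_hbm = -512.4, 6      # log-likelihood and free parameters, HBM
ll_hapa, k_hapa = -489.7, 10   # HAPA adds intention and volitional paths

lr_stat = 2 * (ll_hapa - ll_hbm)   # likelihood ratio statistic
df = k_hapa - k_hbm                # difference in free parameters
p_value = chi2.sf(lr_stat, df)
print(f"LR = {lr_stat:.1f}, df = {df}, p = {p_value:.2g}")
```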

Relevance:

60.00%

Abstract:

Background: Patients with Crohn’s disease (CD) often require surgery at some stage of the disease course. Prediction of CD outcome is influenced by clinical, environmental, serological, and genetic factors (eg, NOD2). Being able to identify CD patients at high risk of surgical intervention should assist clinicians in deciding whether or not to prescribe early aggressive treatment with immunomodulators. Methods: We performed a retrospective analysis of selected clinical (age at diagnosis, perianal disease, active smoking) and genetic (NOD2 genotype) data obtained for a population-based CD cohort from the Canterbury Inflammatory Bowel Disease study. Logistic regression was used to identify predictors of complicated outcome in these CD patients (ie, need for inflammatory bowel disease-related surgery). Results: Perianal disease and the NOD2 genotype were the only independent factors associated with the need for surgery in this patient group (odds ratio=2.84 and 1.60, respectively). By combining the associated NOD2 genotype with perianal disease we generated a single “clinicogenetic” variable, which was strongly associated with increased risk of surgery (odds ratio=3.84, P=0.00, confidence interval 2.28-6.46) and offered moderate predictive accuracy (positive predictive value=0.62). Approximately one-third of surgical outcomes in this population are attributable to the NOD2+PA variable (attributable risk=0.32). Conclusions: Knowledge of perianal disease and NOD2 genotype in patients presenting with CD may offer clinicians some decision-making utility for early identification of complicated CD progression and for initiating intensive treatment to avoid surgical intervention. Future studies should investigate combination effects of other genetic, clinical, and environmental factors when attempting to identify predictors of complicated CD outcomes.
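
The abstract's headline numbers (odds ratio, positive predictive value, attributable risk) all derive from a 2x2 cross-tabulation of the clinicogenetic variable against surgery. The sketch below shows the calculations on illustrative counts chosen so the crude OR and PPV land near the published values; it will not reproduce the published attributable risk exactly, since the real counts differ.

```python
# Hypothetical 2x2 counts (illustrative, not from the paper):
#                         surgery  no surgery
surg_pos, nosurg_pos = 62, 38       # NOD2+perianal variable present
surg_neg, nosurg_neg = 85, 200      # variable absent

odds_ratio = (surg_pos / nosurg_pos) / (surg_neg / nosurg_neg)  # ~3.84
ppv = surg_pos / (surg_pos + nosurg_pos)                        # ~0.62

# Population attributable fraction: share of surgeries that would not
# occur if exposed patients had the unexposed group's risk.
n_total = surg_pos + nosurg_pos + surg_neg + nosurg_neg
risk_overall = (surg_pos + surg_neg) / n_total
risk_unexposed = surg_neg / (surg_neg + nosurg_neg)
paf = (risk_overall - risk_unexposed) / risk_overall

print(f"OR={odds_ratio:.2f}, PPV={ppv:.2f}, PAF={paf:.2f}")
```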