881 results for "Disease severity"
Abstract:
Definition of disease phenotype is a necessary preliminary to research into genetic causes of a complex disease. Clinical diagnosis of migraine is currently based on diagnostic criteria developed by the International Headache Society. Previously, we examined the natural clustering of these diagnostic symptoms using latent class analysis (LCA) and found that a four-class model was preferred. However, the classes can be ordered such that all symptoms progressively intensify, suggesting that a single continuous variable representing disease severity may provide a better model. Here, we compare two models: item response theory and LCA, each constructed within a Bayesian context. A deviance information criterion is used to assess model fit. We phenotyped our population sample using these models, estimated heritability and conducted genome-wide linkage analysis using Merlin-qtl. LCA with four classes was again preferred. After transformation, phenotypic trait values derived from both models are highly correlated (correlation = 0.99) and consequently results from subsequent genetic analyses were similar. Heritability was estimated at 0.37, while multipoint linkage analysis produced genome-wide significant linkage to chromosome 7q31-q33 and suggestive linkage to chromosomes 1 and 2. We argue that such continuous measures are a powerful tool for identifying genes contributing to migraine susceptibility.
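The model-comparison step described above can be made concrete with a short sketch. The deviance information criterion is DIC = D̄ + pD, where D̄ is the posterior mean deviance and pD = D̄ − D(θ̄) is the effective number of parameters; the model with the lower DIC is preferred. All deviance values below are hypothetical, not from the study:

```python
def dic(deviance_samples, deviance_at_posterior_mean):
    """Deviance information criterion: DIC = D_bar + p_D, where
    p_D = D_bar - D(theta_bar) is the effective number of parameters.
    Lower DIC indicates the preferred model."""
    d_bar = sum(deviance_samples) / len(deviance_samples)
    p_d = d_bar - deviance_at_posterior_mean
    return d_bar + p_d

# Hypothetical posterior deviance draws for two competing models
lca_dic = dic([1198.0, 1200.0, 1202.0], 1190.0)  # 4-class LCA
irt_dic = dic([1213.0, 1215.0, 1217.0], 1205.0)  # continuous IRT model
print(lca_dic < irt_dic)  # True: the lower-DIC model (here, LCA) is preferred
```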
Abstract:
Purpose: The aim of this study was to characterize the clinical signs, symptoms, and ocular and systemic comorbidities in a large case series of contact lens-related microbial keratitis. Methods: Two hundred ninety-seven cases of contact lens-related microbial keratitis, aged between 15 and 64 years, were detected through surveillance of hospital- and community-based ophthalmic practitioners in Australia and New Zealand. Full clinical data were available for 190 cases, and 90 were interviewed by telephone. Clinical data included the size, location, and degree of anterior chamber response. Symptom data were available from the practitioner and from participant self-report. Associations between symptoms and disease severity were evaluated. Data on ocular and systemic disease were collected from participants and practitioners. The frequency of comorbidities was compared between the different severities of disease and to population norms. Results: More severe disease was associated with greater symptom severity, and pain was the most prevalent symptom reported. Ninety-one percent of cases showed progression of ocular symptoms after lens removal, and symptom progression was associated with all severities of disease. Twenty-five percent of cases reported prior episodes requiring emergency attention. Thyroid disease (p < 0.05) and self-reported poor health (p < 0.001) were more common in cases compared with age-matched population norms. Discussion: Information on the signs, symptoms, and comorbidities associated with contact lens-related microbial keratitis may be useful in patient education and for practitioners involved in the fitting of lenses and management of complications. Although pain was the most common symptom experienced, progression of symptoms despite lens removal was close to universal.
Poor general health, particularly respiratory disease and thyroid disease, was more common in cases than in the general population. This may prompt practitioners to recommend flexibility in wear schedules during periods of poor health, or the selection of a lower-risk wear schedule in at-risk patients.
Abstract:
Acute lower respiratory tract infections (ALRTIs) are a common cause of morbidity and mortality among children under 5 years of age worldwide, with pneumonia as the most severe manifestation. Although the incidence of severe disease varies both between individuals and countries, there is still no clear understanding of what causes this variation. Studies of community-acquired pneumonia (CAP) have traditionally not focused on viral causes of disease due to a paucity of diagnostic tools. However, with the emergence of molecular techniques, it is now known that viruses outnumber bacteria as the etiological agents of childhood CAP, especially in children under 2 years of age. The main objective of this study was to investigate viruses contributing to disease severity in cases of childhood ALRTI, using a two-year cohort study following 2014 infants and children enrolled in Bandung, Indonesia. A total of 352 nasopharyngeal washes collected from 256 paediatric ALRTI patients were used for analysis. A subset of samples was screened using a novel microarray pathogen detection method, which identified respiratory syncytial virus (RSV), human metapneumovirus (hMPV) and human rhinovirus (HRV) in the samples. Real-time RT-PCR was used both to confirm and to quantify viruses found in the nasopharyngeal samples. Viral copy numbers were determined and normalised to the number of human cells collected, using 18S rRNA. Molecular epidemiology was performed for RSV A and hMPV using sequences of the glycoprotein gene and the nucleoprotein gene, respectively, to determine genotypes circulating in this Indonesian paediatric cohort. This study found that HRV (119/352; 33.8%) was the most common virus detected as the cause of respiratory tract infections in this cohort, followed by RSV A (73/352; 20.7%), hMPV (30/352; 8.5%) and RSV B (12/352; 3.4%).
Co-infections (detection of two or more viruses) were found in 31 episodes (an episode being defined as infections occurring more than two weeks apart), accounting for 8.8% of the 352 samples tested, or 15.4% of the 201 episodes with at least one virus detected. RSV A genotypes circulating in this population were predominantly GA2, GA5 and GA7, while hMPV genotypes circulating were mainly A2a (27/30; 90.0%), B2 (2/30; 6.7%) and A1 (1/30; 3.3%). This study found no evidence of disease severity being associated either with a specific virus or viral strain, or with viral load. However, it did find a significant association between co-infection with RSV A and HRV and severe disease (P = 0.006), suggesting that this may be a novel cause of severe disease.
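The normalisation step described above, scaling viral copy number to the amount of human cellular material in each wash (estimated via 18S rRNA), can be sketched as follows. The per-10,000-cells convention and all numbers are hypothetical; the study's exact formula is not given in the abstract:

```python
def copies_per_10k_cells(viral_copies, human_cell_equivalents):
    """Normalise a viral load to the number of human cells collected in the
    nasopharyngeal wash (estimated from 18S rRNA), so specimens with different
    amounts of cellular material become comparable."""
    return 1e4 * viral_copies / human_cell_equivalents

# Hypothetical qPCR results for two specimens with equal raw viral counts
print(copies_per_10k_cells(500, 1_000_000))  # 5.0 copies per 10,000 cells
print(copies_per_10k_cells(500, 250_000))    # 20.0 - less cellular material, higher true load
```

The point of the normalisation is visible in the example: identical raw counts correspond to very different viral loads once sampling efficiency is accounted for.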
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the multiethnic, cultural, and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)], administered in no particular order. The total scores were not computed during the screening process, so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher-level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (see Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher-level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the nutrition screening and assessment tools currently used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community, and nutritional monitoring should continue beyond hospitalisation.
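The diagnostic-performance figures used to compare screening tools against the SGA reference standard come from a 2x2 table of screening result vs reference classification. A minimal sketch; the counts below are illustrative values chosen to reproduce the reported 84% sensitivity and 79% specificity, not the study's actual table:

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity and specificity of a screening tool against a reference
    standard (here, SGA-defined malnutrition).
    tp/fn: malnourished by SGA, screened positive/negative;
    fp/tn: well-nourished by SGA, screened positive/negative."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts (hypothetical), not taken from the study
sens, spec = diagnostic_performance(tp=83, fp=38, fn=16, tn=144)
print(round(sens, 2), round(spec, 2))  # 0.84 0.79
```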
Abstract:
Objective: The aim of this study was to demonstrate the potential of near-infrared (NIR) spectroscopy for categorizing cartilage degeneration induced in animal models. Method: Three models of osteoarthritic degeneration were induced in laboratory rats via one of the following methods: (i) meniscectomy (MSX); (ii) anterior cruciate ligament transection (ACLT); or (iii) intra-articular injection of monoiodoacetate (1 mg) (MIA), in the right knee joint, with 12 rats per model group. After 8 weeks, the animals were sacrificed and tibial knee joints were collected. A custom-made near-infrared (NIR) probe of diameter 5 mm was placed on the cartilage surface, and spectral data were acquired from each specimen in the wavenumber range 4,000-12,500 cm−1. Following spectral data acquisition, the specimens were fixed and Safranin-O staining was performed to assess disease severity based on the Mankin scoring system. Using multivariate statistical analysis based on principal component analysis and partial least squares regression, the spectral data were then related to the Mankin scores of the samples tested. Results: Mild to severe degenerative cartilage changes were observed in the subject animals. The ACLT models showed mild cartilage degeneration, the MSX models moderate, and the MIA models severe cartilage degenerative changes, both morphologically and histologically. Our results demonstrate that NIR spectroscopic information is capable of separating the cartilage samples into different groups relative to the severity of degeneration, with the NIR data correlating significantly with the Mankin score (R² = 88.85%). Conclusion: We conclude that NIR spectroscopy is a viable tool for evaluating articular cartilage health and physical properties such as change in thickness with degeneration.
Interleukin-13 promotes susceptibility to chlamydial infection of the respiratory and genital tracts
Abstract:
Chlamydiae are intracellular bacteria that commonly cause infections of the respiratory and genital tracts, which are major clinical problems. Infections are also linked to the aetiology of diseases such as asthma, emphysema and heart disease. The clinical management of infection is problematic and antibiotic resistance is emerging. Increased understanding of the immune processes involved in both clearance and immunopathology of chlamydial infection is critical for the development of improved treatment strategies. Here, we show that IL-13 was produced in the lungs of mice rapidly after Chlamydia muridarum (Cmu) infection and promoted susceptibility to infection. Wild-type (WT) mice had increased disease severity, bacterial load and associated inflammation compared to IL-13 deficient (−/−) mice as early as 3 days post infection (p.i.). Intratracheal instillation of IL-13 enhanced bacterial load in IL-13−/− mice. There were no differences in early IFN-γ and IL-10 expression between WT and IL-13−/− mice, and depletion of CD4+ T cells did not affect infection in IL-13−/− mice. Collectively, these data demonstrate a lack of CD4+ T cell involvement and a novel role for IL-13 in innate responses to infection. We also showed that IL-13 deficiency increased macrophage uptake of Cmu in vitro and in vivo. Moreover, the depletion of IL-13 during infection of lung epithelial cells in vitro decreased the percentage of infected cells and reduced bacterial growth. Our results suggest that enhanced IL-13 responses in the airways, such as those found in asthmatics, may promote susceptibility to chlamydial lung infection. Importantly, the role of IL-13 in regulating infection was not limited to the lung, as we showed that IL-13 also promoted susceptibility to Cmu genital tract infection.
Collectively, our findings demonstrate that innate IL-13 release promotes infection resulting in enhanced inflammation, and they have broad implications for the treatment of chlamydial infections and IL-13-associated diseases.
Abstract:
Gait freezing is an episodic arrest of locomotion due to an inability to take normal steps. Pedunculopontine nucleus stimulation is an emerging therapy proposed to improve gait freezing, even where refractory to medication. However, the efficacy and precise effects of pedunculopontine nucleus stimulation on Parkinsonian gait disturbance are not established. The clinical application of this new therapy is controversial, and it is unknown whether bilateral stimulation is more effective than unilateral. Here, in a double-blinded study using objective spatiotemporal gait analysis, we assessed the impact of unilateral and bilateral pedunculopontine nucleus stimulation on triggered episodes of gait freezing and on background deficits of unconstrained gait in Parkinson's disease. Under experimental conditions, while OFF medication, Parkinsonian patients with severe gait freezing, implanted with pedunculopontine nucleus stimulators below the pontomesencephalic junction, were assessed during three conditions: off stimulation, unilateral stimulation and bilateral stimulation. Results were compared to Parkinsonian patients without gait freezing matched for disease severity, and to healthy controls. Pedunculopontine nucleus stimulation improved objective measures of gait freezing, with bilateral stimulation more effective than unilateral. During unconstrained walking, Parkinsonian patients who experience gait freezing had reduced step length and increased step length variability compared to patients without gait freezing; however, these deficits were unchanged by pedunculopontine nucleus stimulation. Chronic pedunculopontine nucleus stimulation improved Freezing of Gait Questionnaire scores, reflecting a reduction of the freezing encountered in patients' usual environments and medication states.
This study provides objective, double-blinded evidence that in a specific subgroup of Parkinsonian patients, stimulation of a caudal pedunculopontine nucleus region selectively improves gait freezing but not background deficits in step length. Bilateral stimulation was more effective than unilateral.
Abstract:
Oral squamous cell carcinomas (OSCC) often arise from dysplastic lesions. The role of cancer stem cells in tumour initiation is widely accepted, yet the potential existence of pre-cancerous stem cells in dysplastic tissue has received little attention. Cell lines from oral diseases ranging in severity from dysplasia to malignancy provide an opportunity to investigate the involvement of stem cells in malignant progression from dysplasia. Stem cells are functionally defined by their ability to generate hierarchical tissue structures in concert with spatial regulation. Organotypic cultures readily display tissue hierarchy in vitro; hence, in this study, we compared hierarchical expression of stem cell-associated markers in dermis-based organotypic cultures of oral epithelial cells from normal tissue (OKF6-TERT2), mild dysplasia (DOK), severe dysplasia (POE-9n) and OSCC (PE/CA-PJ15). Expression of CD44, p75NTR, CD24 and ALDH was studied in monolayers by flow cytometry and in organotypic cultures by immunohistochemistry. Spatial regulation of CD44 and p75NTR was evident for organotypic cultures of normal (OKF6-TERT2) and dysplastic (DOK and POE-9n) cells but was lacking for OSCC (PE/CA-PJ15)-derived cells. Spatial regulation of CD24 was not evident. All monolayer cultures exhibited CD44, p75NTR, CD24 antigens and ALDH activity (ALDEFLUOR® assay), with a trend towards loss of population heterogeneity that mirrored disease severity. In monolayer, increased FOXA1 and decreased FOXA2 expression correlated with disease severity, but OCT3/4, Sox2 and NANOG did not. We conclude that dermis-based organotypic cultures provide an opportunity to investigate the mechanisms that underlie the loss of spatial regulation of stem cell markers seen with OSCC-derived cells.
Abstract:
Background & aims: The Australasian Nutrition Care Day Survey (ANCDS) ascertained whether malnutrition and poor food intake are independent risk factors for health-related outcomes in Australian and New Zealand hospital patients. Methods: Phase 1 recorded nutritional status (Subjective Global Assessment) and 24-h food intake (0, 25, 50, 75, 100% intake). Outcomes data (Phase 2) were collected 90 days post-Phase 1 and included length of hospital stay (LOS), readmissions and in-hospital mortality. Results: Of 3122 participants (47% females, 65 ± 18 years) from 56 hospitals, 32% were malnourished and 23% consumed ≤ 25% of the offered food. Malnourished patients had greater median LOS (15 days vs. 10 days, p < 0.0001) and readmission rates (36% vs. 30%, p = 0.001). Median LOS for patients consuming ≤ 25% of the food was higher than for those consuming ≥ 50% (13 vs. 11 days, p < 0.0001). The odds of 90-day in-hospital mortality were approximately twice as high for malnourished patients (CI: 1.09-3.34, p = 0.023) and for those consuming ≤ 25% of the offered food (CI: 1.13-3.51, p = 0.017), respectively. Conclusion: The ANCDS establishes that malnutrition and poor food intake are independently associated with in-hospital mortality in the Australian and New Zealand acute care setting.
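The "odds twice as high" result is the kind of figure an unadjusted 2x2 odds ratio yields before covariate adjustment; the adjusted version comes from logistic regression. A minimal sketch with hypothetical counts (not ANCDS data):

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    OR = (a/b) / (c/d) = ad / bc."""
    return (a * d) / (b * c)

# Hypothetical counts: in-hospital deaths among malnourished vs well-nourished
print(round(odds_ratio(40, 960, 21, 1008), 2))  # 2.0 - odds roughly doubled
```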
Abstract:
1. Essential hypertension occurs in people with an underlying genetic predisposition who subject themselves to adverse environmental influences. The number of genes involved is unknown, as is the extent to which each contributes to final blood pressure and the severity of the disease. 2. In the past, studies of potential candidate genes have been performed by association (case-control) analysis of unrelated individuals or linkage (pedigree or sib-pair) analysis of families. These studies have resulted in several positive findings but, as one may expect, also an enormous number of negative results. 3. In order to uncover the major genetic loci for essential hypertension, it is proposed that scanning the genome systematically in 100-200 affected sibships should prove successful. 4. This involves genotyping sets of hypertensive sibships to determine their complement of several hundred microsatellite polymorphisms. Those that are highly informative, by having a high heterozygosity, are most suitable. Also, the markers need to be spaced sufficiently evenly across the genome so as to ensure adequate coverage. 5. Tests are performed to determine increased segregation of alleles of each marker with hypertension. The analytical tools involve specialized statistical programs that can detect such differences. Non-parametric multipoint analysis is an appropriate approach. 6. In this way, loci for essential hypertension are beginning to emerge.
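The allele-sharing logic behind the non-parametric sib-pair test in point 5 can be sketched simply: under no linkage, affected sib pairs share marker alleles identical by descent (IBD) 50% of the time on average, so excess sharing at a marker suggests linkage. The IBD proportions below are hypothetical:

```python
def mean_ibd_sharing(ibd_proportions):
    """Mean proportion of alleles shared identical-by-descent across affected
    sib pairs at one marker. Values near 0.5 are expected under no linkage;
    a significant excess over 0.5 suggests linkage to the trait."""
    return sum(ibd_proportions) / len(ibd_proportions)

# Hypothetical per-pair IBD sharing (0, 0.5 or 1 allele shared, as proportions)
pairs = [0.5, 0.75, 0.5, 1.0, 0.5, 0.75]
print(mean_ibd_sharing(pairs) > 0.5)  # True: excess sharing at this marker
```

In practice the excess is assessed with a formal statistic over many markers (multipoint analysis), not a bare comparison to 0.5.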
Abstract:
Background: Few patients diagnosed with lung cancer are still alive 5 years after diagnosis. The aim of the current study was to conduct a 10-year review of a consecutive series of patients undergoing curative-intent surgical resection at the largest tertiary referral centre to identify prognostic factors. Methods: Case records of all patients operated on for lung cancer between 1998 and 2008 were reviewed. The clinical features and outcomes of all patients with non-small cell lung cancer (NSCLC) stage I-IV were recorded. Results: A total of 654 patients underwent surgical resection with curative intent during the study period. Median overall survival for the entire cohort was 37 months. The median age at operation was 66 years, with males accounting for 62.7%. Squamous cell type was the most common histological subtype, and lobectomies were performed in 76.5% of surgical resections. Pneumonectomy rates decreased significantly in the latter half of the study (25 vs. 16.3%), while sub-anatomical resection more than doubled (2 vs. 5%) (p < 0.005). Clinico-pathological characteristics associated with improved survival by univariate analysis included younger age, female sex, smaller tumour size, smoking status, lobectomy, lower T and N status and less advanced pathological stage. Age, gender, smoking status and tumour size, as well as T and N descriptors, emerged as independent prognostic factors by multivariate analysis. Conclusion: We identified several factors that predicted outcome for NSCLC patients undergoing curative-intent surgical resection. Survival rates in our series are comparable to those reported from other thoracic surgery centres. © 2012 Royal Academy of Medicine in Ireland.
Abstract:
Early diagnosis and the ability to predict the most relevant treatment option for individuals are essential to improve clinical outcomes for non-small cell lung cancer (NSCLC) patients. Adenocarcinoma (ADC), a subtype of NSCLC, is the single biggest cancer killer, and there is therefore an urgent need to identify minimally invasive biomarkers to enable early diagnosis. Recent studies, by ourselves and others, indicate that circulating miRNAs have potential as biomarkers. Here we applied global profiling approaches in serum from patients with ADC of the lung to explore whether miRNAs have potential as diagnostic biomarkers. This study involved RNA isolation from 80 sera specimens, including those from ADC patients (equal numbers of stages 1, 2, 3, and 4) and age- and gender-matched controls (n = 40 each). Six hundred and sixty-seven miRNAs were co-analyzed in these specimens using TaqMan low density arrays, with qPCR validation using individual miRNAs. Overall, approximately 390 and 370 miRNAs were detected in ADC and control sera, respectively. A group of 6 miRNAs, miR-30c-1* (AUC = 0.74; P < 0.002), miR-616 (AUC = 0.71; P = 0.001), miR-146b-3p (AUC = 0.82; P < 0.0001), miR-566 (AUC = 0.80; P < 0.0001), miR-550 (AUC = 0.72; P = 0.0006), and miR-939 (AUC = 0.82; P < 0.0001), was found to be present at substantially higher levels in ADC compared with control sera. Conversely, miR-339-5p and miR-656 were detected at substantially lower levels in ADC sera (co-analysis resulting in AUC = 0.6; P = 0.02). The differences in miRNA profiles identified support the potential of circulating miRNAs as diagnostic biomarkers for ADC. More extensive studies of ADC and control serum specimens are warranted to independently validate the potential clinical relevance of these miRNAs as minimally invasive biomarkers for ADC.
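The per-miRNA AUC values reported above are equivalent to the Mann-Whitney probability that a randomly chosen case has a higher expression level than a randomly chosen control (AUC = 0.5 means no discrimination, 1.0 perfect). A minimal sketch with toy expression values, not study data:

```python
def auc_from_scores(cases, controls):
    """AUC as the probability that a random case scores above a random control
    (Mann-Whitney U divided by n_cases * n_controls); ties count one half."""
    wins = 0.0
    for x in cases:
        for y in controls:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Toy serum miRNA expression levels (arbitrary units)
print(auc_from_scores([3.1, 2.8, 3.5, 2.2], [2.0, 2.5, 1.8, 2.9]))  # 0.8125
```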
Abstract:
PURPOSE The purpose of this study was to demonstrate the potential of near infrared (NIR) spectroscopy for characterizing the health and degenerative state of articular cartilage based on the components of the Mankin score. METHODS Three models of osteoarthritic degeneration induced in laboratory rats by anterior cruciate ligament (ACL) transection, meniscectomy (MSX), and intra-articular injection of monoiodoacetate (1 mg) (MIA) were used in this study. Degeneration was induced in the right knee joint; each model group consisted of 12 rats (N = 36). After 8 weeks, the animals were euthanized and knee joints were collected. A custom-made diffuse reflectance NIR probe of 5-mm diameter was placed on the tibial and femoral surfaces, and spectral data were acquired from each specimen in the wavenumber range of 4,000 to 12,500 cm−1. After spectral data acquisition, the specimens were fixed and safranin O staining (SOS) was performed to assess disease severity based on the Mankin scoring system. Using multivariate statistical analysis, with spectral preprocessing and wavelength selection techniques, the spectral data were then correlated to the structural integrity (SI), cellularity (CEL), and matrix staining (SOS) components of the Mankin score for all the samples tested. RESULTS ACL models showed mild cartilage degeneration, MSX models had moderate degeneration, and MIA models showed severe cartilage degenerative changes both morphologically and histologically. Our results reveal significant linear correlations between the NIR absorption spectra and the SI (R² = 94.78%), CEL (R² = 88.03%), and SOS (R² = 96.39%) parameters of all samples in the models. In addition, clustering of the samples according to their level of degeneration, with respect to the Mankin components, was also observed.
CONCLUSIONS NIR spectroscopic probing of articular cartilage can potentially provide critical information about the health of articular cartilage matrix in early and advanced stages of osteoarthritis (OA). CLINICAL RELEVANCE This rapid nondestructive method can facilitate clinical appraisal of articular cartilage integrity during arthroscopic surgery.
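The chemometric step in the two NIR studies above, relating preprocessed spectra to histological scores and reporting an R² for the fit, can be illustrated on synthetic data. The study used partial least squares regression with wavelength selection; ordinary least squares on a small synthetic spectrum serves here as a simplified stand-in:

```python
import numpy as np

# Synthetic stand-in for the spectra-to-score regression (the study used PLS;
# all data here are simulated, not measured spectra).
rng = np.random.default_rng(1)
n_specimens, n_wavenumbers = 30, 8
X = rng.normal(size=(n_specimens, n_wavenumbers))         # preprocessed absorbances
true_w = rng.normal(size=n_wavenumbers)                   # hidden spectral weights
y = X @ true_w + rng.normal(scale=0.1, size=n_specimens)  # Mankin-like score

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit
pred = X @ coef
ss_res = float(np.sum((y - pred) ** 2))
ss_tot = float(np.sum((y - y.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot                    # coefficient of determination
print(r2 > 0.9)  # a strong linear relation is recoverable from the spectra
```

PLS is preferred over plain least squares when the number of wavenumbers approaches or exceeds the number of specimens, which is the usual situation with full NIR spectra.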
Abstract:
Initial estimates of the burden of disease in South Africa in 2000 [1] have been revised on the basis of additional data to estimate the disability-adjusted life-years (DALYs) for single causes for the first time in South Africa. The findings highlight the fact that despite uncertainty in the estimates, they provide important information to guide public health responses to improve the health of the nation...