Abstract:
Parkinson’s disease is a common neurodegenerative disorder, and people with the disease have a higher risk of hospitalization than the general population. There is therefore a high likelihood of encountering a person with Parkinson’s disease in acute or critical care. Most people with Parkinson’s disease are over the age of 60 years and are likely to have other concurrent medical conditions. Parkinson’s disease is more likely to be the secondary diagnosis during hospital admission; the primary diagnosis may be another medical condition or a complication of Parkinson’s disease symptoms. Symptoms include motor symptoms, such as slowness of movement and tremor, and non-motor symptoms, such as depression, dysphagia, and constipation. There is a large degree of variation in the presence and degree of symptoms as well as in the rate of progression. A range of medications can be used to manage the motor and non-motor symptoms, and side effects can occur. Improper administration of medications can result in deterioration of the patient’s condition and potentially a life-threatening condition called neuroleptic malignant-like syndrome. Nutrients and delayed gastric emptying may also interfere with intestinal absorption of levodopa, the primary medication used for motor symptom management. Rates of protein-energy malnutrition can be up to 15% in people with Parkinson’s disease in the community, and these are likely to be higher in the acute or critical care setting. Nutrition-related care in this setting should utilize the Nutrition Care Process and take into account each individual’s Parkinson’s disease motor and non-motor symptoms, the severity of disease, limitations due to the disease, medical management regimen, and nutritional status when planning nutrition interventions. Special considerations may need to be taken into account in relation to meal and medication times and the administration of enteral feeding. Nutrition screening, assessment, and monitoring should occur during admission to minimize the effects of Parkinson's disease symptoms and to optimize nutrition-related outcomes.
Abstract:
Aberrant connectivity is implicated in many neurological and psychiatric disorders, including Alzheimer's disease and schizophrenia. However, other than a few disease-associated candidate genes, we know little about the degree to which genetics play a role in brain networks; we know even less about specific genes that influence brain connections. Twin and family-based studies can generate estimates of overall genetic influences on a trait, but genome-wide association scans (GWASs) can screen the genome for specific variants influencing the brain or risk for disease. To identify the heritability of various brain connections, we scanned healthy young adult twins with high-field, high-angular resolution diffusion MRI. We adapted GWASs to screen the brain's connectivity pattern, allowing us to discover genetic variants that affect the human brain's wiring. The association of connectivity with the SPON1 variant at rs2618516 on chromosome 11 (11p15.2) reached connectome-wide, genome-wide significance after stringent statistical corrections were enforced, and it was replicated in an independent subsample. rs2618516 was shown to affect brain structure in an elderly population with varying degrees of dementia. Older people who carried the connectivity variant had significantly milder clinical dementia scores and lower risk of Alzheimer's disease. As a post hoc analysis, we conducted GWASs on several organizational and topological network measures derived from the connectivity matrices and discovered variants in and around genes associated with autism (MACROD2), development (NEDD4), and mental retardation (UBE2A) that were significantly associated with connectivity. Connectome-wide, genome-wide screening offers substantial promise for discovering genes affecting brain connectivity and risk for brain diseases.
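The screening idea lends itself to a compact illustration: treat every network edge as a phenotype, regress it on every SNP's allele dosage, and correct for the number of edge-SNP tests. The sketch below uses synthetic data and plain Bonferroni correction; the sizes, variable names, and regression model are illustrative assumptions, not the study's actual pipeline (which included kinship modelling and an independent replication sample).

```python
# Minimal sketch of a connectome-wide, genome-wide screen: regress every
# network edge on every SNP's allele dosage and apply a stringent
# Bonferroni correction across both dimensions. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_edges, n_snps = 200, 50, 200

edges = rng.normal(size=(n_subjects, n_edges))           # connectivity values
dosage = rng.integers(0, 3, size=(n_subjects, n_snps))   # 0/1/2 minor alleles

alpha = 0.05 / (n_edges * n_snps)  # Bonferroni across edges x SNPs
hits = []
for j in range(n_snps):
    for e in range(n_edges):
        slope, intercept, r, p, se = stats.linregress(dosage[:, j], edges[:, e])
        if p < alpha:
            hits.append((j, e, p))
print(f"significant SNP-edge pairs: {len(hits)}")
```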
Abstract:
Introduction Presently, the severity of obstructive sleep apnea (OSA) is estimated based on the apnea-hypopnea index (AHI). Unfortunately, AHI does not provide information on the severity of individual obstruction events. Previously, the severity of individual obstruction events has been suggested to be related to the outcome of the disease. In this study, we incorporate this information into AHI and test whether this novel approach aids in discriminating the patients at highest risk. We hypothesize that the introduced adjusted AHI parameter provides a valuable supplement to AHI in the diagnosis of the severity of OSA. Methods This hypothesis was tested by means of a retrospective follow-up (mean ± SD follow-up time 198.2 ± 24.7 months) of 1,068 men originally referred for night polygraphy due to suspected OSA. After exclusion of the 264 patients using CPAP, the remaining 804 patients were divided into normal (AHI < 5) and OSA (AHI ≥ 5) categories based on conventional AHI and adjusted AHI. For a more detailed analysis, the patients were divided into normal, mild, moderate, and severe OSA categories based on conventional AHI and adjusted AHI. Subsequently, the mortality and cardiovascular morbidity in these groups were determined. Results Use of the severity of individual obstruction events to adjust AHI led to a significant rearrangement of patients between severity categories. Due to this rearrangement, the number of deceased patients diagnosed with OSA increased when adjusted AHI was used as the diagnostic index. Importantly, risk ratios of all-cause mortality and cardiovascular morbidity were higher in the moderate and severe OSA groups formed based on the adjusted AHI parameter than in those formed based on conventional AHI. Conclusions The adjusted AHI parameter was found to give valuable supplementary information to AHI and to potentially improve the recognition of OSA patients at the highest risk of mortality or cardiovascular morbidity.
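A toy contrast between conventional AHI and a severity-adjusted variant makes the category rearrangement concrete. The abstract does not give the published weighting formula, so the sketch below assumes each event is weighted by its duration relative to a nominal 10 s minimum; the category cut-offs are the standard AHI severity thresholds.

```python
# Conventional AHI counts events per hour of sleep; the assumed adjusted
# variant weights each event by its duration before dividing by sleep time.
def conventional_ahi(n_events: int, sleep_hours: float) -> float:
    return n_events / sleep_hours

def adjusted_ahi(event_durations_s: list[float], sleep_hours: float) -> float:
    weights = [d / 10.0 for d in event_durations_s]  # a 10 s event -> weight 1.0
    return sum(weights) / sleep_hours

def severity_category(ahi: float) -> str:
    if ahi < 5: return "normal"
    if ahi < 15: return "mild"
    if ahi < 30: return "moderate"
    return "severe"

durations = [12, 45, 30, 18, 60] * 8   # 40 events over the night, in seconds
hours = 7.0
print(severity_category(conventional_ahi(len(durations), hours)))  # mild
print(severity_category(adjusted_ahi(durations, hours)))           # moderate
```

With these made-up events the conventional index (about 5.7 events/h) classifies the night as mild OSA, while weighting long events shifts the same night into the moderate category, which is the kind of reclassification the study reports.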
Abstract:
Objectives: Quality of life (QOL) is reportedly poor in children with Crohn disease (CD) but improves with increasing disease duration. This article aims to detail QOL in a cohort of Australian children with CD in relation to disease duration, disease activity, and treatment. Materials and Methods: QOL, assessed using the IMPACT-III questionnaire, and disease activity, assessed using the Pediatric Crohn's Disease Activity Index (PCDAI), were available in 41 children with CD. For this cohort, a total of 186 measurements of both parameters were available. Results: QOL was found to be significantly lower, and disease activity significantly higher (F = 31.1, P = 0.00), in patients within 6 months of their diagnosis compared with those up to 2.5 years, up to 5 years, and beyond 5 years since diagnosis. Higher disease activity was associated with poorer QOL (r = -0.51, P = 0.00). Total QOL was highest in children on no medications and lowest in children on enteral nutrition. The PCDAI (t = -6.0, P = 0.00) was a significant predictor of QOL, with the clinical history (t = -6.9, P = 0.00) and examination (t = -2.9, P = 0.01) sections of the PCDAI significantly predicting QOL. Disease duration, age, and sex were neither related to nor significant predictors of QOL, but height z score and type of treatment approached significance. Conclusions: Children with CD within 6 months of their diagnosis have impaired QOL compared with those diagnosed beyond 6 months. These patients, along with those with growth impairment or ongoing elevated disease activity with abdominal pain, diarrhoea, and/or perirectal and extraintestinal complications, may benefit from regular assessments of QOL as part of their clinical treatment.
Abstract:
Background Up-to-date evidence about levels and trends in disease and injury incidence, prevalence, and years lived with disability (YLDs) is an essential input into global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013), we estimated these quantities for acute and chronic diseases and injuries for 188 countries between 1990 and 2013. Methods Estimates were calculated for disease and injury incidence, prevalence, and YLDs using GBD 2010 methods with some important refinements. Results for incidence of acute disorders and prevalence of chronic disorders are new additions to the analysis. Key improvements include expansion of the cause and sequelae list, updated systematic reviews, use of detailed injury codes, improvements to the Bayesian meta-regression method (DisMod-MR), and use of severity splits for various causes. An index of data representativeness, showing data availability, was calculated for each cause and impairment during three periods globally and at the country level for 2013. In total, 35,620 distinct sources of data were used and documented to calculate estimates for 301 diseases and injuries and 2,337 sequelae. The comorbidity simulation provides estimates of the number of sequelae experienced concurrently by individuals, by country, year, age, and sex. Disability weights were updated with the addition of new population-based survey data from four countries. Findings Disease and injury were highly prevalent; only a small fraction of individuals had no sequelae. Comorbidity rose substantially with age and in absolute terms from 1990 to 2013. Incident acute sequelae were predominantly infectious diseases and short-term injuries, with over 2 billion cases of upper respiratory infections and diarrhoeal disease episodes in 2013; a notable exception was tooth pain due to permanent caries, with more than 200 million incident cases in 2013. Conversely, leading chronic sequelae were largely attributable to non-communicable diseases, with prevalence estimates for asymptomatic permanent caries and tension-type headache of 2.4 billion and 1.6 billion, respectively. The distribution of the number of sequelae in populations varied widely across regions, with an expected relation between age and disease prevalence. YLDs for both sexes increased from 537.6 million in 1990 to 764.8 million in 2013 due to population growth and ageing, whereas the age-standardised rate decreased only slightly, from 114.87 per 1000 people to 110.31 per 1000 people, between 1990 and 2013. Leading causes of YLDs included low back pain and major depressive disorder, which were among the top ten causes of YLDs in every country. YLD rates per person, by major cause groups, indicated that the main drivers of increases were musculoskeletal, mental, and substance use disorders, neurological disorders, and chronic respiratory diseases; however, HIV/AIDS was a notable driver of increasing YLDs in sub-Saharan Africa. Also, the proportion of disability-adjusted life years due to YLDs increased globally from 21.1% in 1990 to 31.2% in 2013. Interpretation Ageing of the world's population is leading to a substantial increase in the numbers of individuals with sequelae of diseases and injuries. Rates of YLDs are declining much more slowly than mortality rates. The non-fatal dimensions of disease and injury will require more and more attention from health systems. The transition to non-fatal outcomes as the dominant source of burden of disease is occurring rapidly outside of sub-Saharan Africa. Our results can guide future health initiatives through examination of epidemiological trends and a better understanding of variation across countries.
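For readers unfamiliar with the YLD arithmetic behind figures like these, a simplified prevalence-based calculation is sketched below: prevalent cases times a disability weight, summed over sequelae, with an age-standardised rate formed against a reference age structure. All counts, weights, and age groups are toy assumptions; GBD's comorbidity microsimulation and uncertainty propagation are not reproduced.

```python
# Simplified prevalence-based YLD arithmetic: YLD = prevalent cases x
# disability weight, with an age-standardised rate computed against a
# reference population's age structure. Toy numbers only.
sequelae = [  # per age group: population and (cases, disability weight)
    {"age": "0-49", "pop": 60_000,
     "cases": {"tension-type headache": (9_000, 0.037)}},
    {"age": "50+", "pop": 40_000,
     "cases": {"low back pain": (8_000, 0.069)}},
]
standard_weights = {"0-49": 0.7, "50+": 0.3}  # reference age structure

total_ylds = 0.0
age_standardised_rate = 0.0
for group in sequelae:
    ylds = sum(cases * dw for cases, dw in group["cases"].values())
    total_ylds += ylds
    rate_per_1000 = 1000 * ylds / group["pop"]
    age_standardised_rate += standard_weights[group["age"]] * rate_per_1000

print(f"total YLDs: {total_ylds:.0f}")
print(f"age-standardised YLD rate: {age_standardised_rate:.2f} per 1000")
```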
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield; for the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was a significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons, attributed to soil moisture depletion and/or low growing-season rainfall. Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or of 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which in this experiment was mainly due to nitrogen accretion, can be maintained where short-duration lucerne leys are grown in rotations with wheat.
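The reported rainfall relations amount to simple per-100 mm linear coefficients. The sketch below expresses them as functions, assuming (as the abstract implies but does not state) that the relations are proportional across the observed rainfall range; function and variable names are illustrative.

```python
# The reported linear relations (per 100 mm of rainfall) for lucerne dry
# matter and nitrogen yield, expressed as a function under an assumed
# proportionality through the measured range.
def lucerne_yield(total_rain_mm: float, march_sept_rain_mm: float) -> dict:
    return {
        "dm_total_t_ha": 0.97 * total_rain_mm / 100,        # t/ha dry matter
        "n_total_kg_ha": 26 * total_rain_mm / 100,          # kg/ha nitrogen
        "dm_mar_sep_t_ha": 1.26 * march_sept_rain_mm / 100,
        "n_mar_sep_kg_ha": 36 * march_sept_rain_mm / 100,
    }

print(lucerne_yield(500, 300))
# e.g. 500 mm total rainfall -> ~4.85 t/ha dry matter and ~130 kg/ha nitrogen
```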
Abstract:
Alzheimer's disease (AD) is characterized by an impairment of the semantic memory responsible for processing meaning-related knowledge. This study was aimed at examining how Finnish-speaking healthy elderly subjects (n = 30) and mildly (n = 20) and moderately (n = 20) demented AD patients utilize semantic knowledge to perform a semantic fluency task, a method of studying semantic memory. In this task subjects are typically given 60 seconds to generate words belonging to the semantic category of animals. Successful task performance requires fast retrieval of subcategory exemplars in clusters (e.g., farm animals: 'cow', 'horse', 'sheep') and switching between subcategories (e.g., pets, water animals, birds, rodents). In this study, the scope of the task was extended to cover various noun and verb categories. The results indicated that, compared with normal controls, both mildly and moderately demented AD patients showed reduced word production, limited clustering and switching, narrowed semantic space, and an increase in errors, particularly perseverations. However, the size of the clusters, the proportion of clustered words, and the frequency and prototypicality of words remained relatively similar across the subject groups. Although the moderately demented patients showed a poorer overall performance than the mildly demented patients in the individual categories, the error pattern appeared unaffected by the severity of AD. The results indicate a semantically rather coherent performance, but less specific, effective, and flexible functioning of the semantic memory in mild and moderate AD patients. The findings are discussed in relation to recent theories of word production and semantic representation. Keywords: semantic fluency, clustering, switching, semantic category, nouns, verbs, Alzheimer's disease
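Clustering and switching scores of the kind analysed here can be computed mechanically once each response is assigned a subcategory. The sketch below illustrates one such scoring pass; the subcategory lookup table is an invented stand-in for the study's actual coding scheme.

```python
# Sketch of cluster/switch scoring for a semantic fluency response list:
# consecutive words from the same subcategory form a cluster, and each
# change of subcategory counts as a switch.
SUBCATEGORY = {  # illustrative coding, not the study's scheme
    "cow": "farm", "horse": "farm", "sheep": "farm",
    "cat": "pet", "dog": "pet",
    "trout": "water", "seal": "water",
}

def score_fluency(words: list[str]) -> dict:
    labels = [SUBCATEGORY.get(w, "unknown") for w in words]
    clusters, current = [], 1
    for prev, cur in zip(labels, labels[1:]):
        if cur == prev:
            current += 1          # word continues the current cluster
        else:
            clusters.append(current)
            current = 1           # subcategory switch starts a new cluster
    clusters.append(current)
    return {
        "words": len(words),
        "switches": len(clusters) - 1,
        "mean_cluster_size": sum(clusters) / len(clusters),
    }

print(score_fluency(["cow", "horse", "sheep", "cat", "dog", "trout"]))
# {'words': 6, 'switches': 2, 'mean_cluster_size': 2.0}
```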
Abstract:
Streptococcus pyogenes (group A streptococcus) is an important human pathogen, causing a wide array of infections ranging in severity. The majority of S. pyogenes infections are mild upper respiratory tract or skin infections. Severe, invasive infections, such as bacteraemia, are relatively rare, but constitute a major global burden with a high mortality. Certain streptococcal types are associated with more severe disease and higher mortality. Bacterial non-necrotizing cellulitis and erysipelas are localised infections of the skin, and although they are usually not life-threatening, they have a tendency to recur and therefore cause substantial morbidity. Despite several efforts aimed at developing an effective and safe vaccine against S. pyogenes infections, no vaccine is yet available. In this study, the epidemiology of invasive S. pyogenes infections in Finland was described over a decade of national, population-based surveillance. Recent trends in incidence, outcome and bacterial types were investigated. The beta-haemolytic streptococci causing cellulitis and erysipelas infections in Finland were studied in a case-control study. Bacterial isolates were characterised using both conventional and molecular typing methods, such as emm typing, which is the most widely used typing method for beta-haemolytic streptococci. The incidence of invasive S. pyogenes disease has shown an increasing trend during the past ten years in Finland, especially from 2006 onwards. Age- and sex-specific differences in the incidence rate were identified, with men having a higher incidence than women, especially among persons aged 45-64 years. In contrast, more infections occurred in women aged 25-34 years than in men of the same age. Seasonal patterns with occasional peaks during the midsummer and midwinter were observed. Differences in the predisposing factors and underlying conditions of patients may contribute to these distinctions. Case fatality associated with invasive S. pyogenes infections peaked in 2005 (12%) but remained at a reasonably low level (8% overall during 2004-2007) compared with that of other developed countries (mostly exceeding 10%). Changes in the prevalent emm types were associated with the observed increases in incidence and case fatality. In the case-control study, acute bacterial non-necrotizing cellulitis was caused predominantly by Streptococcus dysgalactiae subsp. equisimilis rather than S. pyogenes. The recurrent nature of cellulitis became evident. This study adds to our understanding of S. pyogenes infections in Finland and provides a basis for comparison with other countries and future trends. emm type surveillance and outcome analyses remain important for detecting changes in type distribution that might lead to increases in incidence and case fatality. Bacterial characterisation serves as a basis for disease pathogenesis studies and vaccine development.
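The age- and sex-specific comparisons above rest on standard surveillance arithmetic: cases per 100,000 person-years within each stratum. A minimal sketch with made-up counts and denominators:

```python
# Age- and sex-specific incidence rates as used in population-based
# surveillance: cases per 100,000 person-years per stratum.
# All counts below are invented for illustration.
def incidence_per_100k(cases: int, person_years: float) -> float:
    return 1e5 * cases / person_years

strata = {
    ("men", "45-64"):   (120, 600_000),
    ("women", "45-64"): (70, 620_000),
    ("women", "25-34"): (45, 340_000),
    ("men", "25-34"):   (30, 350_000),
}
for (sex, age), (cases, py) in strata.items():
    print(sex, age, round(incidence_per_100k(cases, py), 1))
```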
Abstract:
This research aimed to develop and evaluate pre- and postharvest management strategies to reduce stem end rot (SER) incidence and extend the saleable life of 'Carabao' mango fruits in the Southern Philippines. Preharvest management focused on the development and improvement of a fungicide spray program, while postharvest management aimed to develop alternative interventions aside from hot water treatment (HWT). In field evaluation, the systemic fungicides azoxystrobin (Amistar 25SC), tebuconazole (Folicur 25WP), carbendazim (Goldazim 500SC), difenoconazole (Score 250SC) and azoxystrobin+difenoconazole (Amistar Top) reduced blossom blight severity and improved fruit setting and retention, resulting in higher fruit yield, but failed to sufficiently suppress SER incidence. Based on these findings, an improved fungicide spray program was developed, taking into account the infection process of SER pathogens and fungicide resistance. Timely application of a protectant (mancozeb) and systemic fungicides (azoxystrobin, carbendazim and difenoconazole) during the most critical stages of mango flower and fruit development ensured higher harvestable fruit yield but only minimally lowered SER incidence. Control of SER was also achieved by employing postharvest treatment such as HWT (52-55°C for 10 min), which significantly prolonged the saleable life of mango fruits. However, extended hot water treatment (EHWT; 46°C pulp temperature for 15 min), rapid heat treatment (RHT; 59°C for 30-60 sec), fungicide dips and promising biological control agents failed to satisfactorily reduce SER and prolong saleable life. In contrast, the integration of the improved spray program as a preharvest management practice with postharvest treatments such as HWT and fungicide dips (azoxystrobin, 150-175 ppm; carbendazim, 312.5 ppm; and tebuconazole, 125-156 ppm) significantly reduced disease and extended marketable life by up to 8 days.
Abstract:
Background: Liver diseases in Australia are estimated to affect 6 million people, with a societal cost of $51 billion annually. Information about utilisation of specialist hepatology care is critical in informing policy makers about the requirements for delivery of hepatology-related health care. Aims: This study examined the etiology and severity of liver disease seen in a tertiary hospital hepatology clinic, as well as resource utilisation patterns. Methods: A longitudinal cohort study included consecutive patients booked into hepatology outpatient clinics during a 3-month period. Subsequent outpatient appointments for these patients over the following 12 months were then recorded. Results: During the initial 3-month period, 1471 appointments were scheduled with a hepatologist, 1136 of which were attended. Of the patients seen, 21% were “new cases”. Hepatitis B (HBV) was the most common disease etiology for new cases (37%). The proportion with advanced disease at presentation varied by etiology: HBV (5%), hepatitis C (HCV) (31%), non-alcoholic fatty liver disease (NAFLD) (46%) and alcoholic liver disease (ALD) (72%). Most patients (83%) attended multiple hepatology appointments, and a range of referral patterns for procedures, investigations and other specialty assessments was observed. Conclusions: There is a high prevalence of HBV in new case referrals. Patients with HCV, NAFLD and ALD have a high prevalence of advanced liver disease at referral, requiring ongoing surveillance for the development of decompensated liver disease and liver cancer. These findings, which describe patterns of health service utilisation among patients with liver disease, provide useful information for planning sustainable health service provision for this clinical population.
Abstract:
Background Chronic kidney disease (CKD) leads to a range of symptoms, which are often under-recognised, and little is known about the multidimensional symptom experience in advanced CKD. Objectives To examine (1) symptom burden at CKD stages 4 and 5 and across dialysis modalities, and (2) demographic and renal history correlates of symptom burden. Methods Using a cross-sectional design, a convenience sample of 436 people with CKD was recruited from three hospitals. The CKD Symptom Burden Index (CKD-SBI) was used to measure the prevalence, severity, distress and frequency of 32 symptoms. Demographic and renal history data were also collected. Results Of the sample, 75.5% were receiving dialysis (haemodialysis, n = 287; peritoneal dialysis, n = 42) and 24.5% were not undergoing dialysis (stage 4, n = 69; stage 5, n = 38). Participants reported an average of 13.01 ± 7.67 symptoms. Fatigue and pain were common and burdensome across all symptom dimensions. While approximately one-third of participants experienced sexual symptoms, when reported these symptoms were frequent, severe and distressing. Haemodialysis, older age and being female were independently associated with greater symptom burden. Conclusions In CKD, symptom burden is better understood when the multidimensional aspects of a range of physical and psychological symptoms are captured. Fatigue, pain and sexual dysfunction are key contributors to symptom burden; these symptoms are often under-recognised and warrant routine assessment. The CKD-SBI offers a valuable tool for renal clinicians to assess symptom burden, leading to the commencement of timely and appropriate interventions.
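The multidimensional structure described here (presence, severity, distress and frequency for each of 32 symptoms) maps naturally onto a small data model. A sketch in that spirit is below; the 0-10 rating ranges and per-dimension means are assumptions for illustration, not the published instrument's scoring rules.

```python
# Sketch of multidimensional symptom-burden scoring in the spirit of the
# CKD-SBI: each symptom is rated for presence, severity, distress and
# frequency, and summarised per dimension across reported symptoms.
from dataclasses import dataclass

@dataclass
class SymptomRating:
    present: bool
    severity: int   # assumed 0-10 scale
    distress: int   # assumed 0-10 scale
    frequency: int  # assumed 0-10 scale

def summarise(ratings: dict[str, SymptomRating]) -> dict:
    reported = {k: v for k, v in ratings.items() if v.present}
    n = max(len(reported), 1)  # guard against an empty symptom list
    return {
        "n_symptoms": len(reported),
        "mean_severity": sum(r.severity for r in reported.values()) / n,
        "mean_distress": sum(r.distress for r in reported.values()) / n,
        "mean_frequency": sum(r.frequency for r in reported.values()) / n,
    }

ratings = {
    "fatigue": SymptomRating(True, 7, 6, 8),
    "pain": SymptomRating(True, 6, 7, 5),
    "itching": SymptomRating(False, 0, 0, 0),
}
print(summarise(ratings))
```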
Abstract:
Inherited retinal diseases are the most common cause of vision loss among the working population in Western countries. It is estimated that ~1 of the people worldwide suffer from vision loss due to inherited retinal diseases. The severity of these diseases varies from partial vision loss to total blindness, and at the moment no effective cure exists. To date, nearly 200 loci for inherited retinal diseases have been mapped, including 140 cloned genes. By rough estimation, 50% of the retinal dystrophy genes still await discovery. In this thesis we aimed to study the genetic background of two inherited retinal diseases, X-linked cone-rod dystrophy and Åland Island eye disease. X-linked cone-rod dystrophy (CORDX) is characterized by progressive loss of visual function at school age or in early adulthood. Affected males show reduced visual acuity, photophobia, myopia, color vision defects, central scotomas, and variable changes in the fundus. The disease is genetically heterogeneous, and two disease loci, CORDX1 and CORDX2, were known prior to the present thesis work. CORDX1, located on Xp21.1-11.4, is caused by mutations in the RPGR gene. CORDX2 is located on Xq27-28, but the causative gene is still unknown. Åland Island eye disease (AIED), originally described in a family living in the Åland Islands, is a congenital retinal disease characterized by decreased visual acuity, fundus hypopigmentation, nystagmus, astigmatism, a red color vision defect, myopia, and defective night vision. AIED shares similarities with another retinal disease, congenital stationary night blindness (CSNB2). Mutations in the L-type calcium channel α1F-subunit gene, CACNA1F, are known to cause CSNB2, as well as AIED-like disease. The disease locus of the original AIED family maps to the same genetic interval as the CACNA1F gene, but efforts to reveal CACNA1F mutations in patients of the original AIED family have been unsuccessful. The specific aims of this study were to map the disease gene in a large Finnish family with X-linked cone-rod dystrophy and to identify the disease-causing genes in the patients of the Finnish cone-rod dystrophy family and the original AIED family. With the help of linkage and haplotype analyses, we localized the disease gene of the Finnish cone-rod dystrophy family to the Xp11.4-Xq13.1 region, and thus established a new X-linked cone-rod dystrophy locus, CORDX3. Mutation analyses of candidate genes revealed three novel CACNA1F gene mutations: IVS28-1 GCGTC>TGG in CORDX3 patients, a 425 bp deletion comprising exon 30 and flanking intronic regions in AIED patients, and IVS16+2T>C in an additional Finnish patient with a CSNB2-like phenotype. All three novel mutations altered splice sites of the CACNA1F gene and resulted in defective pre-mRNA splicing, suggesting altered or absent channel function as a disease mechanism. The analyses of CACNA1F mRNA also revealed novel alternative wild-type splice variants, which may enhance channel diversity or regulate the overall expression level of the channel. The results of our studies may be utilized in genetic counseling of the families, and they provide a basis for studies on the pathogenesis of these diseases. In the future, knowledge of the genetic defects may be used in the identification of specific therapies for the patients.
Abstract:
Background: The fecal neutrophil-derived proteins calprotectin and lactoferrin have proven useful surrogate markers of intestinal inflammation. The aim of this study was to compare fecal calprotectin and lactoferrin concentrations to clinically, endoscopically, and histologically assessed Crohn’s disease (CD) activity, and to explore the suitability of these proteins as surrogate markers of mucosal healing during anti-TNFα therapy. Furthermore, we studied changes in the number and expression of effector and regulatory T cells in bowel biopsy specimens during anti-TNFα therapy. Patients and methods: Adult CD patients referred for ileocolonoscopy for various reasons were recruited (n = 106 examinations in 77 patients; Study I). Clinical disease activity was assessed with the Crohn’s disease activity index (CDAI) and endoscopic activity with both the Crohn’s disease endoscopic index of severity (CDEIS) and the simple endoscopic score for Crohn’s disease (SES-CD). Stool samples for measurements of calprotectin and lactoferrin, and blood samples for CRP, were collected. For Study II, biopsy specimens were obtained from the ileum and the colon for histologic activity scoring. In prospective Study III, after baseline ileocolonoscopy, 15 patients received induction therapy with anti-TNFα agents, and endoscopic, histologic, and fecal-marker responses to therapy were evaluated at 12 weeks. For detecting changes in the number and expression of effector and regulatory T cells, biopsy specimens were taken from the most severely diseased lesions in the ileum and the colon (Study IV). Results: Endoscopic scores correlated significantly with fecal calprotectin and lactoferrin (p<0.001). Both fecal markers were significantly lower in patients with endoscopically inactive than with active disease (p<0.001). In detecting endoscopically active disease, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for calprotectin ≥200 μg/g were 70%, 92%, 94%, and 61%; for lactoferrin ≥10 μg/g they were 66%, 92%, 94%, and 59%. Accordingly, the sensitivity, specificity, PPV, and NPV for CRP >5 mg/l were 48%, 91%, 91%, and 48%. Fecal markers were significantly higher in active colonic (both p<0.001) or ileocolonic (calprotectin p=0.028, lactoferrin p=0.004) than in ileal disease. In ileocolonic or colonic disease, the colon histology score correlated significantly with fecal calprotectin (r=0.563) and lactoferrin (r=0.543). In patients receiving anti-TNFα therapy, median fecal calprotectin decreased from 1173 μg/g (range 88-15326) to 130 μg/g (13-1419) and lactoferrin from 105.0 μg/g (4.2-1258.9) to 2.7 μg/g (0.0-228.5), both p=0.001. The ratio of ileal IL-17+ cells to CD4+ cells decreased significantly during anti-TNFα treatment (p=0.047). The ratio of IL-17+ cells to Foxp3+ cells was higher in the patients’ baseline specimens than in their post-treatment specimens (p=0.038). Conclusions: For evaluation of CD activity based on endoscopic findings, fecal calprotectin and lactoferrin were more sensitive surrogate markers than CDAI and CRP. Fecal calprotectin and lactoferrin were significantly higher in endoscopically active disease than in endoscopic remission. In both ileocolonic and colonic disease, fecal markers correlated closely with histologic disease activity. In CD, these neutrophil-derived proteins thus seem to be useful surrogate markers of endoscopic activity. During anti-TNFα therapy, fecal calprotectin and lactoferrin decreased significantly. The anti-TNFα treatment was also reflected in a decreased IL-17/Foxp3 cell ratio, which may indicate an improved balance between effector and regulatory T cells with treatment.
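The sensitivity/specificity/PPV/NPV quadruples reported above all derive from a standard 2×2 confusion table at the stated cut-offs. The sketch below reproduces the arithmetic, with counts chosen to match the calprotectin ≥200 μg/g figures; these counts are a reconstruction for illustration, not the study's raw data.

```python
# Standard diagnostic metrics from a 2x2 confusion table: true/false
# positives and negatives of a marker cut-off against a reference standard
# (here, endoscopically active disease).
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts consistent with the reported calprotectin >= 200 ug/g
# figures (70%, 92%, 94%, 61%); not the actual study data.
print(diagnostic_metrics(tp=49, fp=3, fn=21, tn=33))
```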
Abstract:
Background. Cardiovascular disease (CVD) remains the most serious threat to life and health in industrialized countries. Atherosclerosis is the main underlying pathology associated with CVD, in particular coronary artery disease (CAD), ischaemic stroke, and peripheral arterial disease. Risk factors play an important role in initiating and accelerating the complex process of atherosclerosis. Most studies of risk factors have focused on the presence or absence of clinically defined CVD; less is known about the determinants of the severity and extent of atherosclerosis in symptomatic patients. Aims. To clarify the association between coronary and carotid artery atherosclerosis, and to study the determinants associated with these abnormalities with special regard to novel cardiovascular risk factors. Subjects and methods. Quantitative coronary angiography (QCA) and B-mode ultrasound were used to assess coronary and carotid artery atherosclerosis in 108 patients with clinically suspected CAD referred for elective coronary angiography. To evaluate the anatomic severity and extent of CAD, several QCA parameters were incorporated into indexes. These measurements reflected CAD severity, extent, and overall atheroma burden and were calculated for the entire coronary tree and separately for different coronary segments (i.e., left main, proximal, mid, and distal segments). Maximum and mean intima-media thickness (IMT) values of the carotid arteries were measured and expressed as mean aggregate values. Furthermore, the study design included extensive fasting blood samples, an oral glucose tolerance test, and an oral fat-load test performed in each participant. Results. Maximum and mean IMT values were significantly correlated with CAD severity, extent, and atheroma burden. There was heterogeneity in the associations between IMT and CAD indexes according to the anatomical location of CAD: maximum and mean IMT values, respectively, were correlated with QCA indexes for the mid and distal segments but not with those for the proximal segments of the coronary vessels. The values of paraoxonase-1 (PON1) activity and concentration, respectively, were lower in subjects with significant CAD, and there was a significant relationship between PON1 activity and concentration and coronary atherosclerosis assessed by QCA. PON1 activity was a significant determinant of the severity of CAD independently of HDL cholesterol. Neither PON1 activity nor concentration was associated with carotid IMT. The concentrations of triglycerides (TGs), triglyceride-rich lipoproteins (TRLs), oxidized LDL (oxLDL), and the cholesterol content of remnant lipoprotein particles (RLP-C) were significantly increased at 6 hours after intake of an oral fatty meal as compared with fasting values. The mean peak size of LDL remained unchanged 6 hours after the test meal. The correlations between total TGs, TRLs, and RLP-C in the fasting and postprandial states were highly significant. RLP-C correlated with oxLDL both in the fasting and in the fed state, and inversely with LDL size. In multivariate analysis, oxLDL was a determinant of the severity and extent of CAD. None of total TGs, TRLs, oxLDL, or LDL size was linked to carotid atherosclerosis. Insulin resistance (IR) was associated with increased severity and extent of coronary atherosclerosis and seemed to be a stronger predictor of coronary atherosclerosis in the distal parts of the coronary tree than in the proximal and mid parts. In the multivariate analysis, IR was a significant predictor of the severity of CAD. IR did not correlate with carotid IMT. Maximum and mean carotid IMT were higher in patients with the apoE4 phenotype compared with subjects with the apoE3 phenotype. Likewise, patients with the apoE4 phenotype had more severe and extensive CAD than individuals with the apoE3 phenotype. Conclusions. 1) There is an association between carotid IMT and the severity and extent of CAD. Carotid IMT seems to be a weaker predictor of coronary atherosclerosis in the proximal parts of the coronary tree than in the mid and distal parts. 2) PON1 activity has an important role in the pathogenesis of coronary atherosclerosis. More importantly, the study illustrates how the protective role of HDL could be modulated by its components, such that equivalent serum concentrations of HDL cholesterol may not equate with an equivalent potential protective capacity. 3) RLP-C in the fasting state is a good marker of postprandial TRLs. Circulating oxLDL increases in CAD patients postprandially. The highly significant positive correlation between postprandial TRLs and postprandial oxLDL suggests that the postprandial state creates oxidative stress. Our findings emphasize the fundamental role of LDL oxidation in the development of atherosclerosis even after inclusion of conventional CAD risk factors. 4) Disturbances in glucose metabolism are crucial in the pathogenesis of coronary atherosclerosis. In fact, subjects with IR are comparable with diabetic subjects in terms of the severity and extent of CAD. 5) ApoE polymorphism is involved in susceptibility to both carotid and coronary atherosclerosis.
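The abstract does not define the QCA indexes, so the sketch below shows only one plausible way per-segment stenosis readings could be aggregated into severity, extent, and atheroma-burden summaries; the definitions and numbers are assumptions for illustration, not the thesis's actual index formulas.

```python
# Hypothetical aggregation of per-segment QCA readings (percent diameter
# stenosis) into severity, extent, and atheroma-burden summaries.
segments = {  # illustrative stenosis values per coronary segment
    "left main": 10, "proximal LAD": 55, "mid LAD": 30,
    "distal LAD": 20, "proximal RCA": 0, "mid RCA": 45,
}

severity_index = max(segments.values())                   # worst lesion
extent_index = sum(s > 0 for s in segments.values()) / len(segments)
atheroma_burden = sum(segments.values()) / len(segments)  # mean stenosis

print(severity_index, round(extent_index, 2), round(atheroma_burden, 1))
```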
Abstract:
Chronic kidney disease (CKD) is a worldwide health problem, with adverse outcomes of cardiovascular disease and premature death. The ageing of populations, along with the growing prevalence of chronic diseases such as diabetes and hypertension, is leading to a worldwide increase in the number of CKD patients. It has become evident that inflammation plays an important role in the pathogenesis of atherosclerosis complications, and CKD patients have an increased risk of such complications (including myocardial infarction, sudden death due to cardiac arrhythmia, cerebrovascular accidents, and peripheral vascular disease). In line with this, oral and dental problems can be an important source of systemic inflammation, and a decline in oral health may potentially act as an early marker of systemic disease progression. This series of studies examined the oral health of CKD patients from predialysis, to dialysis and kidney transplantation in a 10-year follow-up study and in a cross-sectional study of predialysis CKD patients. Patients had clinical and radiographic oral and dental examinations, resting and stimulated saliva flow rates were measured, and the biochemical and microbiological composition of saliva was analyzed. Lifestyle and oral symptoms were recorded using a questionnaire, and blood parameters were collected from the hospital records. The hypothesis was that oral health status, symptoms, sensations, salivary flow rates and salivary composition vary in different renal failure stages and depend on the etiology of the kidney disease. No statistically significant differences were seen in the clinical parameters in the longitudinal study. However, some saliva parameters after renal transplantation were significantly improved compared with levels at the predialysis stage. The urea concentration of saliva was high in all stages. The salivary and plasma urea concentrations followed a similar trend, showing the lowest values in kidney transplant patients. Levels of immunoglobulin (Ig) A, G and M all decreased significantly after kidney transplantation. Increased concentrations of IgA, IgG and IgM may reflect disintegration of the oral epithelium and are usually markers of a poor general oral condition. In the cross-sectional investigation of predialysis CKD patients, we compared the oral health findings of diabetic nephropathy patients with those of patients with kidney disease other than diabetes. The results showed, for example, more dental caries and lower stimulated salivary flow rates in the diabetic patients. HbA1c values of the diabetic patients were significantly higher than those in the other kidney disease group. A statistically significant difference in the number of drugs used daily was observed between the diabetic nephropathy group and the other kidney disease group. In the logistic regression analyses, age was the principal explanatory factor for high salivary total protein concentration and for low unstimulated salivary flow. Poor dental health and the severity of periodontal disease seemed to be explanatory factors for high salivary albumin concentrations. Salivary urea levels were significantly linked with diabetic nephropathy and with serum urea concentrations. Contrary to our expectation, however, diabetic nephropathy did not seem to affect periodontal health more severely than the other kidney diseases. Although diabetes is known to be associated with xerostomia and other oral symptoms, it did not seem to increase the prevalence of oral discomfort. In summary, this series of studies has provided new information regarding the oral health of CKD patients. As expected, renal disease is reflected in oral symptoms and signs. Diabetic nephropathy, in particular, appears to require special attention in the oral health care of patients suffering from this disease.