35 results for 860[729.1].07[Sarduy]


Relevance: 100.00%

Publisher:

Abstract:

The phosphate mineral series eosphorite–childrenite, (Mn,Fe)Al(PO4)(OH)2·(H2O), has been studied using a combination of electron probe analysis and vibrational spectroscopy. Eosphorite is the manganese-rich mineral with lower iron content, whereas childrenite has higher iron and lower manganese content. The determined formulae of the two studied minerals are: (Mn0.72,Fe0.13,Ca0.01)(Al)1.04(PO4, OHPO3)1.07(OH1.89,F0.02)·0.94(H2O) for SAA-090 and (Fe0.49,Mn0.35,Mg0.06,Ca0.04)(Al)1.03(PO4, OHPO3)1.05(OH)1.90·0.95(H2O) for SAA-072. Raman spectroscopy enabled the observation of bands at 970 cm−1 and 1011 cm−1 assigned to monohydrogen phosphate, phosphate and dihydrogen phosphate units. Differences are observed in the areas of these peaks between the two minerals. Raman bands at 562 cm−1, 595 cm−1 and 608 cm−1 are assigned to the ν4 bending modes of the PO4, HPO4 and H2PO4 units; Raman bands at 405 cm−1, 427 cm−1 and 466 cm−1 are attributed to the ν2 modes of these units. Raman bands of the hydroxyl and water stretching modes are also observed. Vibrational spectroscopy enabled details of the molecular structure of the eosphorite mineral series to be determined.
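The band positions quoted above are Raman shifts in cm−1, i.e. offsets from the excitation line rather than absolute wavelengths. As a quick sketch of the conversion arithmetic (the 633 nm He-Ne excitation line below is an assumed illustration; the abstract does not state the laser used):

```python
# Convert a Raman shift (cm^-1) to the absolute wavelength (nm) of the
# Stokes-scattered light for a given excitation wavelength.
# The 633 nm He-Ne line is an assumed example, not taken from the study.

def raman_shift_to_wavelength(shift_cm1, excitation_nm):
    excitation_cm1 = 1e7 / excitation_nm        # nm -> wavenumber (cm^-1)
    scattered_cm1 = excitation_cm1 - shift_cm1  # Stokes scattering is red-shifted
    return 1e7 / scattered_cm1                  # wavenumber -> nm

lam = raman_shift_to_wavelength(shift_cm1=970, excitation_nm=633)
print(f"{lam:.1f} nm")
```

For the 970 cm−1 band this places the Stokes-scattered line near 674 nm under the assumed excitation.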

Relevance: 100.00%

Publisher:

Abstract:

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery
The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored Patient-Generated Subjective Global Assessment (PG-SGA) was used to assess nutritional status; anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken; and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease
The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score also differed significantly between the well-nourished and the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated the well-nourished from the malnourished were lack of appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in the community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease
There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools has been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for those under 65 years, ≤23.5 kg/m2 for those 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA is recommended primarily as a nutrition screening tool, the MNA-SF might be more appropriate, and it takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease
A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention
Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (INT vs SC: 18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg); the increase in protein intake was significant only in the SC group. When compared between the groups, the changes in intake were not significantly different.
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life improved over the 12 weeks as well, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
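The screening-tool evaluations in Phase I reduce to 2×2 arithmetic against the SGA gold standard. A minimal sketch of that arithmetic (the counts below are reconstructed to be consistent with the reported MNA-SF figures and the 19 malnourished participants; they are not taken from the thesis data):

```python
# Sensitivity and specificity of a nutrition screening tool against the
# SGA gold standard, computed from a 2x2 confusion table.
# Counts are illustrative reconstructions, not the thesis data.

def screening_metrics(tp, fn, fp, tn):
    """Return (sensitivity, specificity) as fractions."""
    sensitivity = tp / (tp + fn)  # malnourished correctly flagged
    specificity = tn / (tn + fp)  # well-nourished correctly passed
    return sensitivity, specificity

# 19 malnourished (SGA-B) and 106 well-nourished (SGA-A) participants
sn, sp = screening_metrics(tp=18, fn=1, fp=23, tn=83)
print(f"Sn = {sn:.1%}, Sp = {sp:.1%}")  # reproduces the MNA-SF 94.7% / 78.3%
```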

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Observational data suggested that supplementation with vitamin D could reduce the risk of infection, but trial data are inconsistent. OBJECTIVE: We aimed to examine the effect of oral vitamin D supplementation on antibiotic use. DESIGN: We conducted a post hoc analysis of data from the pilot D-Health trial, a randomized trial carried out in a general community setting between October 2010 and February 2012. A total of 644 Australian residents aged 60-84 y were randomly assigned to receive monthly doses of a placebo (n = 214) or 30,000 (n = 215) or 60,000 (n = 215) IU oral cholecalciferol for ≤12 mo. Antibiotics prescribed during the intervention period were ascertained by linkage with pharmacy records through the national health insurance scheme (Medicare Australia). RESULTS: People who were randomly assigned to 60,000 IU cholecalciferol had a nonsignificant 28% lower risk of having antibiotics prescribed at least once than did people in the placebo group (RR: 0.72; 95% CI: 0.48, 1.07). In analyses stratified by age, in subjects aged ≥70 y there was a significant reduction in antibiotic use in the high-dose vitamin D group compared with the placebo group (RR: 0.53; 95% CI: 0.32, 0.90), whereas there was no effect in participants <70 y old (RR: 1.07; 95% CI: 0.58, 1.97) (P-interaction = 0.1). CONCLUSION: Although this study was a post hoc analysis and statistically nonsignificant, it lends some support to the hypothesis that supplementation with 60,000 IU vitamin D/mo is associated with a lower risk of infection, particularly in older adults. The trial was registered at the Australian New Zealand Clinical Trials Registry (anzctr.org.au) as ACTRN12609001063202.
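A risk ratio such as the RR of 0.72 (95% CI: 0.48, 1.07) above is computed from arm-level counts, with the confidence interval formed on the log scale. A minimal sketch (the event counts below are hypothetical, since the abstract reports only the arm sizes):

```python
import math

# Risk ratio (RR) and 95% Wald confidence interval for a two-arm trial.
# a/n1 = people with the outcome / arm size in the treated arm;
# b/n0 = the same for placebo. Event counts here are hypothetical.

def risk_ratio(a, n1, b, n0, z=1.96):
    rr = (a / n1) / (b / n0)
    # standard error of log(RR) for binomial proportions
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio(a=36, n1=215, b=50, n0=214)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

With these illustrative counts the point estimate lands near the reported 0.72; the trial's actual interval depends on its real event counts.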

Relevance: 100.00%

Publisher:

Abstract:

Background Single nucleotide polymorphisms (SNPs) rs429358 (ε4) and rs7412 (ε2), both invoking changes in the amino-acid sequence of the apolipoprotein E (APOE) gene, have previously been tested for association with multiple sclerosis (MS) risk. However, none of these studies was sufficiently powered to detect modest effect sizes at acceptable type-I error rates. As both SNPs are only imperfectly captured on commonly used microarray genotyping platforms, their evaluation in the context of genome-wide association studies has been hindered until recently. Methods We genotyped 12 740 subjects hitherto not studied for their APOE status, imputed raw genotype data from 8739 subjects from five independent genome-wide association study datasets using the most recent high-resolution reference panels, and extracted genotype data for 8265 subjects from previous candidate gene assessments. Results Despite sufficient power to detect associations at genome-wide significance thresholds across a range of ORs, our analyses did not support a role of rs429358 or rs7412 in MS susceptibility. This included meta-analyses of the combined data across 13 913 MS cases and 15 831 controls (OR=0.95, p=0.259, and OR=1.07, p=0.0569, for rs429358 and rs7412, respectively). Conclusion Given the large sample size of our analyses, it is unlikely that the two APOE missense SNPs studied here exert any relevant effects on MS susceptibility.

Relevance: 100.00%

Publisher:

Abstract:

Background To explore the impact of geographical remoteness and area-level socioeconomic disadvantage on colorectal cancer (CRC) survival. Methods Multilevel logistic regression and Markov chain Monte Carlo simulations were used to analyze geographical variations in five-year all-cause and CRC-specific survival across 478 regions in Queensland, Australia, for 22,727 CRC cases aged 20–84 years diagnosed during 1997–2007. Results Area-level disadvantage and geographic remoteness were independently associated with CRC survival. After full multivariate adjustment (both levels), patients from remote areas (odds ratio [OR]: 1.24, 95%CrI: 1.07-1.42) and from more disadvantaged quintiles (OR = 1.12, 1.15, 1.20, 1.23 for Quintiles 4, 3, 2 and 1 respectively) had lower CRC-specific survival than those from major cities and the least disadvantaged areas. Similar associations were found for all-cause survival. Area disadvantage accounted for a substantial amount of the all-cause variation between areas. Conclusions We have demonstrated that the area-level inequalities in survival of colorectal cancer patients cannot be explained by the measured individual-level characteristics of the patients or their cancer, and that they remain after adjusting for cancer stage. Further research is urgently needed to clarify the factors that underlie the survival differences, including the importance of geographical differences in clinical management of CRC.

Relevance: 100.00%

Publisher:

Abstract:

Objectives Heatwaves can have significant health consequences resulting in increased mortality and morbidity. However, their impact on people living in tropical/subtropical regions remains largely unknown. This study assessed the impact of heatwaves on mortality and emergency hospital admissions (EHAs) from non-external causes (NEC) in Brisbane, a subtropical city in Australia. Methods We acquired daily data on weather, air pollution and EHAs for patients aged 15 years and over in Brisbane between January 1996 and December 2005, and on mortality between January 1996 and November 2004. A locally derived definition of heatwave (daily maximum ≥37°C for 2 or more consecutive days) was adopted. Case–crossover analyses were used to assess the impact of heatwaves on cause-specific mortality and EHAs. Results During heatwaves, there was a statistically significant increase in NEC mortality (OR 1.46; 95% CI 1.21 to 1.77), cardiovascular mortality (OR 1.89; 95% CI 1.44 to 2.48), diabetes mortality in those aged 75+ (OR 9.96; 95% CI 1.02 to 96.85), NEC EHAs (OR 1.15; 95% CI 1.07 to 1.23) and EHAs from renal diseases (OR 1.41; 95% CI 1.09 to 1.83). The elderly were found to be particularly vulnerable to heatwaves (eg, for NEC EHAs, OR 1.24 for 65–74-year-olds and 1.39 for those aged 75+). Conclusions Significant increases in NEC mortality and EHAs were observed during heatwaves in Brisbane where people are well accustomed to hot summer weather. The most vulnerable were the elderly and people with cardiovascular, renal or diabetic disease.
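The locally derived heatwave definition used above (daily maximum ≥37°C for 2 or more consecutive days) can be operationalised as a simple run-length check over a daily temperature series. A minimal sketch (the function name and sample temperatures are illustrative):

```python
# Flag heatwave days under the study's local definition: daily maximum
# temperature >= 37 C on 2 or more consecutive days. Every day in a
# qualifying run is a heatwave day; an isolated hot day is not.

def heatwave_days(tmax, threshold=37.0, min_run=2):
    flags = [False] * len(tmax)
    i = 0
    while i < len(tmax):
        if tmax[i] >= threshold:
            j = i
            while j < len(tmax) and tmax[j] >= threshold:
                j += 1                      # extend the hot run
            if j - i >= min_run:            # run long enough -> heatwave
                for k in range(i, j):
                    flags[k] = True
            i = j
        else:
            i += 1
    return flags

temps = [35.0, 37.5, 38.2, 36.9, 39.0, 34.0, 37.1]
print(heatwave_days(temps))  # only the 37.5/38.2 pair qualifies
```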

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Malaria remains a public health problem in the remote and poor areas of Yunnan Province, China. Yunnan faces an increasing risk of imported malaria infections from neighbouring countries along the Mekong river. This study aimed to identify the high-risk area of malaria transmission in Yunnan Province, and to estimate the effects of climatic variability on the transmission of Plasmodium vivax and Plasmodium falciparum in the identified area. METHODS We identified spatial clusters of malaria cases using spatial cluster analysis at the county level in Yunnan Province, 2005-2010, and estimated the weekly effects of climatic factors on P. vivax and P. falciparum based on a dataset of daily malaria cases and climatic variables. A distributed lag nonlinear model was used to estimate the impact of temperature, relative humidity and rainfall at up to 10-week lags on both types of malaria parasite after adjusting for seasonal and long-term effects. RESULTS The primary cluster area was identified along the China-Myanmar border in western Yunnan. A 1°C increase in minimum temperature was associated with an increased relative risk (RR) at lags of 4 to 9 weeks, with the highest effect at a lag of 7 weeks for P. vivax (RR = 1.03; 95% CI, 1.01, 1.05) and 6 weeks for P. falciparum (RR = 1.07; 95% CI, 1.04, 1.11); a 10-mm increment in rainfall was associated with increased RRs at lags of 2-4 weeks and 9-10 weeks, with the highest effect at 3 weeks for both P. vivax (RR = 1.03; 95% CI, 1.01, 1.04) and P. falciparum (RR = 1.04; 95% CI, 1.01, 1.06); and the RRs for a 10% rise in relative humidity were significant from lags of 3 to 8 weeks, with the highest RR of 1.24 (95% CI, 1.10, 1.41) for P. vivax at a 5-week lag. CONCLUSIONS Our findings suggest that the China-Myanmar border is a high-risk area for malaria transmission. Climatic factors appeared to be among the major determinants of malaria transmission in this area.
The estimated lag effects for the association between temperature and malaria are consistent with the life cycles of both mosquito vector and malaria parasite. These findings will be useful for malaria surveillance-response systems in the Mekong river region.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations, and none has investigated associations between access to specific, objectively measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65–84 years and living in the Perth metropolitan region from 2003–2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas was identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to and mix of commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40), were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail and financial services, and the mix of commercial destination types within the neighbourhood, were all unrelated to walking.
CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction, for example eating at a restaurant or church involvement, or provide opportunities for some incidental social contact, for example visiting the pharmacy or hairdresser, were the strongest predictors for walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.

Relevance: 100.00%

Publisher:

Abstract:

Introduction Clinical guidelines for the treatment of chronic low back pain suggest the use of supervised exercise. Motor control (MC) based exercise is widely used within clinical practice, but its efficacy is only equivalent to that of general exercise therapy. MC exercise targets the trunk musculature. Considering the mechanical links between the hip, pelvis and lumbar spine, surprisingly little focus has been placed on investigating the contribution of the hip musculature to lumbopelvic support. The purpose of this study was to compare the efficacy of two exercise programs for the treatment of non-specific low back pain (NSLBP). Methods Eighty individuals aged 18-65 years were randomized into two groups to participate in this trial. The primary outcome measures were self-reported pain intensity (0-100mm VAS) and percent disability (Oswestry Disability Index V2). Bilateral measures of hip strength (N/kg) and two-dimensional frontal plane mechanics (°) were the secondary outcomes. Outcomes were measured at baseline and following a six-week home-based exercise program including weekly sessions of real-time ultrasound imaging. Results Within-group comparisons revealed clinically meaningful reductions in pain for both groups: the MC exercise-only group (N=40, x̄=-20.9mm, 95%CI -25.7, -16.1) and the combined MC and hip exercise group (N=40, x̄=-24.9mm, 95%CI -30.8, -19.0). There was no statistical difference in the change in pain (x̄=-4.0mm, t=-1.07, p=0.29, 95%CI -11.5, 3.5) or disability (x̄=-0.3%, t=-0.19, p=0.85, 95%CI -3.5, 2.8) between groups. Conclusion Both exercise programs had similar and positive effects on NSLBP, which supports the use of home-based exercise programs with weekly supervised visits. However, the addition of specific hip strengthening exercises to an MC based exercise program did not result in significantly greater reductions in pain or disability. Trial Registration NCT01567566 Funding: Worker's Compensation Board Alberta Research Grant.

Relevance: 100.00%

Publisher:

Abstract:

The aim of this study was to elucidate the thermophysiological effects of wearing lightweight non-military overt and covert personal body armour (PBA) in a hot and humid environment. Eight healthy males walked on a treadmill for 120 min at 22% of their heart rate reserve in a climate chamber simulating 31 °C (60% RH), wearing either no armour (control), overt PBA or covert PBA in addition to a security guard uniform, in a randomised controlled crossover design. No significant difference between conditions at the end of each trial was observed in core temperature, heart rate or skin temperature (P > 0.05). Covert PBA produced a significantly greater body mass change (−1.81 ± 0.44%) compared to the control (−1.07 ± 0.38%, P = 0.009) and overt conditions (−1.27 ± 0.44%, P = 0.025). Although a greater change in body mass was observed after the covert PBA trial, based on the physiological outcome measures recorded, the heat strain encountered while wearing lightweight, non-military overt or covert PBA was negligible compared to no PBA.

Practitioner summary: The wearing of bullet-proof vests or body armour is a requirement of personnel engaged in a wide range of occupations, including police, security, customs and even journalists in theatres of war. This randomised controlled crossover study is the first to examine the thermophysiological effects of wearing lightweight non-military overt and covert personal body armour (PBA) in a hot and humid environment. We conclude that the heat strain encountered while wearing both overt and covert lightweight, non-military PBA was negligible compared to no PBA.

Relevance: 100.00%

Publisher:

Abstract:

Objectives To compare the efficacy of two exercise programs in reducing pain and disability for individuals with non-specific low back pain (NSLBP), and to examine the underlying mechanical factors related to pain and disability for individuals with NSLBP. Design A single-blind, randomized controlled trial. Methods Eighty participants were recruited from eleven community-based general medical practices and randomized into two groups completing either a lumbopelvic motor control exercise therapy program or a combined lumbopelvic motor control and progressive hip strengthening exercise therapy program. All participants received an education session, 6 rehabilitation sessions including real-time ultrasound training, and a home-based exercise program manual and log book. The primary outcomes were pain (0-100mm visual analogue scale) and disability (Oswestry Disability Index V2). The secondary outcomes were hip strength (N/kg) and two-dimensional frontal plane biomechanics (°) measured during the static Trendelenburg test and while walking. All outcomes were measured at baseline and at 6-week follow-up. Results There was no statistical difference in the change in pain (x̄=-4.0mm, t=-1.07, p=0.29, 95%CI -11.5, 3.5) or disability (x̄=-0.3%, t=-0.19, p=0.85, 95%CI -3.5, 2.8) between groups. Within-group comparisons revealed clinically meaningful reductions in pain for both Group One (x̄=-20.9mm, 95%CI -25.7, -16.1) and Group Two (x̄=-24.9mm, 95%CI -30.8, -19.0). Conclusion Both exercise programs had similar efficacy in reducing pain. The addition of hip strengthening exercises to a motor control exercise program does not appear to result in improved clinical outcomes for pain for individuals with non-specific low back pain.

Relevance: 100.00%

Publisher:

Abstract:

Purpose We designed a visual field test focused on the field utilized while driving to examine associations between field impairment and motor vehicle collision involvement in 2,000 drivers aged ≥70 years. Methods The "driving visual field test" involved measuring light sensitivity for 20 targets in each eye, extending 15° superiorly, 30° inferiorly, 60° temporally and 30° nasally. The target locations were selected on the basis that they fell within the field region utilized when viewing through the windshield of a vehicle or viewing the dashboard while driving. Monocular fields were combined into a binocular field based on the more sensitive point from each eye. Severe impairment in the overall field or a region was defined as average sensitivity in the lowest quartile. At-fault collision involvement for the five years prior to enrollment was obtained from state records. Poisson regression was used to calculate crude and adjusted rate ratios examining the association between field impairment and at-fault collision involvement. Results Drivers with severe binocular field impairment in the overall driving visual field had a 40% increased rate of at-fault collision involvement (RR 1.40, 95%CI 1.07-1.83). Impairment in the lower and left fields was associated with elevated collision rates (RR 1.40, 95%CI 1.07-1.82 and RR 1.49, 95%CI 1.15-1.92, respectively), whereas impairment in the upper and right field regions was not. Conclusions Results suggest that older drivers with severe impairment in the lower or left region of the driving visual field are more likely to have a history of at-fault collision involvement.
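The monocular-to-binocular combination described in the Methods is a pointwise "best sensitivity" rule across corresponding test locations. A minimal sketch (the decibel values below are invented for illustration):

```python
# Combine left- and right-eye visual field sensitivities (dB) into a
# binocular field by taking the more sensitive (higher) value at each of
# the spatially corresponding test points, as described for the driving
# visual field test. The sample values are illustrative only.

def binocular_field(left, right):
    if len(left) != len(right):
        raise ValueError("fields must test the same point locations")
    return [max(l, r) for l, r in zip(left, right)]

left = [28.0, 25.5, 30.1, 22.4]
right = [26.3, 27.0, 29.8, 24.9]
print(binocular_field(left, right))  # [28.0, 27.0, 30.1, 24.9]
```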

Relevance: 100.00%

Publisher:

Abstract:

Background: At present there are no large-scale nationally representative studies from Sri Lanka on the prevalence and associations of diabetic retinopathy (DR). The present study aims to evaluate the prevalence and risk factors for DR in a community-based, nationally representative sample of adults with self-reported diabetes mellitus from Sri Lanka. Methods: A cross-sectional community-based national study among 5,000 adults (≥18 years) was conducted in Sri Lanka, using a multi-stage stratified cluster sampling technique. An interviewer-administered questionnaire was used to collect data. Ophthalmological evaluation of patients with 'known' diabetes (previously diagnosed at a government hospital or by a registered medical practitioner) was done using indirect ophthalmoscopy. A binary logistic regression analysis was performed with 'presence of DR' as the dichotomous dependent variable and the other covariates as independent variables. Results: Crude prevalence of diabetes was 12.0% (n=536), of which 344 were patients with 'known' diabetes. Mean age was 56.4 ± 10.9 years and 37.3% were males. Prevalence of any degree of DR was 27.4% (males 30.5%, females 25.6%; p = 0.41). Among patients with DR, the majority had non-proliferative DR (93.4%), while 5.3% had maculopathy. Patients with DR had a significantly longer duration of diabetes than those without. In the binary logistic regression analysis in all adults, duration of diabetes (OR: 1.07), current smoking (OR: 1.67) and peripheral neuropathy (OR: 1.72) were all significantly associated with DR. Conclusions: Nearly one third of Sri Lankan adults with self-reported diabetes have retinopathy. DR was associated with diabetes duration, cigarette smoking and peripheral neuropathy. However, further prospective follow-up studies are required to establish causality for the identified risk factors.

Relevance: 100.00%

Publisher:

Abstract:

Background and purpose: The purpose of this study was to examine the feasibility of developing plasma biomarkers with predictive value for cerebral ischemic stroke before imaging evidence is acquired. Methods: Blood samples were obtained from 198 patients who attended our neurology department as emergencies - with symptoms of vertigo, numbness, limb weakness, etc. - within 4.5 h of symptom onset, before imaging evidence was obtained and before medical treatment. After the final diagnosis was made by MRI/DWI/MRA or CTA in the following 24-72 h, the cases were divided into a stroke group and a non-stroke group according to the imaging results. The levels of baseline plasma antithrombin III (AT-III), thrombin-antithrombin III complex (TAT), fibrinogen, D-dimer and high-sensitivity C-reactive protein (hsCRP) in the two groups were assayed. Results: The baseline AT-III level in the stroke group was 118.07 ± 26.22%, lower than that of the non-stroke group (283.83 ± 38.39%). The levels of TAT, fibrinogen and hsCRP were 7.24 ± 2.28 μg/L, 5.49 ± 0.98 g/L and 2.17 ± 1.07 mg/L, respectively, all higher than those of the non-stroke group (2.53 ± 1.23 μg/L, 3.35 ± 0.50 g/L, 1.82 ± 0.67 mg/L). All these P-values were less than 0.001. The D-dimer level was 322.57 ± 60.34 μg/L, slightly higher than that of the non-stroke group (305.76 ± 49.52 μg/L), but the P-value was 0.667. The sensitivities of AT-III, TAT, fibrinogen, D-dimer and hsCRP for predicting ischemic stroke tendency were 97.37%, 96.05%, 3.29%, 7.89%, but the specificities were 93.62%, 82.61%, 100% and 100%, respectively, and all the P-values were less than 0.001. High levels of D-dimer and hsCRP were mainly seen in the few cases with severe large-vessel infarction. Conclusions: Clinical manifestations of acute focal neurological deficits were associated with plasma AT-III and fibrinogen.
These tests might help the risk assessment of acute cerebral ischemic stroke and/or TIA with infarction tendency in the superacute stage before positive imaging evidence is obtained.

Relevance: 100.00%

Publisher:

Abstract:

- Introduction There is limited understanding of how young adults' driving behaviour varies according to long-term substance involvement. It is possible that regular users of amphetamine-type stimulants (ATS; i.e. ecstasy (MDMA) and methamphetamine) have a greater predisposition to engage in drink/drug driving than non-users. We compare offence rates, and self-reported drink/drug driving rates, for stimulant users and non-users in Queensland, and examine contributing factors. - Methods The Natural History Study of Drug Use is a prospective longitudinal study using population screening to recruit a probabilistic sample of ATS users and non-users aged 19-23 years. At the 4.5-year follow-up, consent was obtained to extract data from participants' Queensland driver records (ATS users: n=217, non-users: n=135). Prediction models of offence rates in stimulant users were developed, controlling for factors such as aggression and delinquency. - Results Stimulant users were more likely than non-users to have had a drink-driving offence (8.7% vs. 0.8%, p < 0.001). Further, about 26% of ATS users and 14% of non-users self-reported driving under the influence of alcohol during the last 12 months. Among stimulant users, drink-driving was independently associated with last-month high-volume alcohol consumption (incidence rate ratio (IRR): 5.70, 95% CI: 2.24-14.52), depression (IRR: 1.28, 95% CI: 1.07-1.52), low income (IRR: 3.57, 95% CI: 1.12-11.38), and male gender (IRR: 5.40, 95% CI: 2.05-14.21). - Conclusions Amphetamine-type stimulant use is associated with an increased long-term risk of drink-driving, due to a number of behavioural and social factors. Inter-sectoral approaches which target long-term behaviours may reduce offending rates.