32 results for 1.01
in Queensland University of Technology - ePrints Archive
Abstract:
Objective: To replicate and refine the reported association of ankylosing spondylitis (AS) with two nonsynonymous single nucleotide polymorphisms (nsSNPs) on chromosome 16q22.1. Methods: Firstly, 730 independent UK patients with AS were genotyped for rs9939768 and rs6979, and allele frequencies were compared with those of 2879 previously typed historic disease controls. Secondly, the two data sets were combined in meta-analyses. Finally, 5 tagging SNPs located between rs9939768 and rs6979 were analysed in 1604 cases and 1020 controls. Results: The association of rs6979 with AS was replicated, p=0.03, OR=1.14 (95% CI 1.01 to 1.28), and a trend for association with rs9939768 was detected, p=0.06, OR=1.25 (95% CI 0.99 to 1.57). Meta-analyses revealed association of both SNPs with AS, p=0.0008, OR=1.31 (95% CI 1.12 to 1.54) and p=0.0009, OR=1.15 (95% CI 1.06 to 1.23) for rs9939768 and rs6979, respectively. New associations with rs9033 and rs868213 (p=0.00002, OR=1.23, 95% CI 1.12 to 1.36 and p=0.00002, OR=1.45, 95% CI 1.22 to 1.72, respectively) were identified. Conclusions: The region on chromosome 16 replicated in the present work is of interest because a highly plausible candidate gene, tumour necrosis factor receptor type 1 (TNFR1)-associated death domain (TRADD), is located between rs9033 and rs868213. Additional work will be required to identify the primary genetic association(s) with AS.
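For context, the allele-frequency comparison behind odds ratios such as OR=1.14 (95% CI 1.01 to 1.28) can be sketched as follows. This is a minimal illustration, not the authors' analysis code; the allele counts are hypothetical placeholders chosen only to match the reported sample sizes (730 cases, 2879 controls).

```python
# Minimal sketch (not the authors' code): a case-control allele test of the
# kind used for rs6979, computing an odds ratio with a Wald 95% CI.
# The counts below are hypothetical placeholders, not data from the study.
import math

def allele_or_ci(case_a, case_b, ctrl_a, ctrl_b):
    """Odds ratio and 95% CI from allele counts (A = risk allele, B = other allele)."""
    or_ = (case_a * ctrl_b) / (case_b * ctrl_a)
    se_log_or = math.sqrt(1/case_a + 1/case_b + 1/ctrl_a + 1/ctrl_b)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

# Hypothetical allele counts: 730 cases (1460 alleles) and 2879 controls (5758 alleles)
print(allele_or_ci(case_a=620, case_b=840, ctrl_a=2200, ctrl_b=3558))
```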
Abstract:
(-)-CGP12177 is a non-conventional partial agonist that causes modest and transient increases of contractile force in human atrial trabeculae (Kaumann and Molenaar, 2008). These effects are markedly increased and maintained by inhibition of phosphodiesterase PDE3. As verified with recombinant receptors, the cardiostimulant effect of (-)-CGP12177 is mediated through a site at the beta1-adrenoceptor with lower affinity (beta1LAR) than the site through which (-)-CGP12177 antagonizes the effects of catecholamines (beta1HAR). However, a recent report proposed that the positive inotropic effects of CGP12177 are mediated through beta3-adrenoceptors (Skeberdis et al 2008). We therefore investigated whether the effects of (-)-CGP12177 on human atrial trabeculae are antagonized by the beta3-adrenoceptor-selective antagonist L-748,337 (1 microM). (-)-CGP12177 (200 nM) caused a stable increase in force which was significantly reduced by the addition of (-)-bupranolol (1 microM), P = 0.002 (basal 4.45 ± 0.78 mN, IBMX (PDE inhibitor) 5.47 ± 1.01 mN, (-)-CGP12177 9.34 ± 1.33 mN, (-)-bupranolol 5.79 ± 1.08 mN, n = 6), but was not affected by the addition of L-748,337 (1 microM), P = 0.12 (basal 4.48 ± 1.32 mN, IBMX 7.15 ± 2.28 mN, (-)-CGP12177 12.51 ± 3.71 mN, L-748,337 10.90 ± 3.49 mN, n = 6). Cumulative concentration-effect curves for (-)-CGP12177 were not shifted to the right by L-748,337 (1 microM). The -logEC50 (M) values of (-)-CGP12177 in the absence and presence of L-748,337 were 7.21 ± 0.09 and 7.41 ± 0.13, respectively (data from 25 trabeculae from 8 patients, P = 0.2). The positive inotropic effects of (-)-CGP12177 (IBMX present) were not antagonized by L-748,337 but were blunted by (-)-bupranolol (1 microM). The results rule out an involvement of beta3-adrenoceptors in the positive inotropic effects of (-)-CGP12177 in human right atrial myocardium and are consistent with mediation through beta1LAR. Kaumann A and Molenaar P (2008) Pharmacol Ther 118, 303-336. Skeberdis VA et al (2008) J Clin Invest 118, 3219-3227.
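The -logEC50 estimates quoted above come from fitting cumulative concentration-effect curves. A minimal sketch of such a fit, assuming a standard Hill (sigmoidal) model and using hypothetical force data rather than the authors' measurements, is shown below.

```python
# Minimal sketch (assumed, not the authors' analysis) of estimating -logEC50
# from a cumulative concentration-effect curve by fitting a Hill equation.
# The concentrations and force responses below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def hill(log_conc, bottom, top, log_ec50, hill_slope):
    """Sigmoidal concentration-effect model on a log10 molar concentration axis."""
    return bottom + (top - bottom) / (1 + 10 ** ((log_ec50 - log_conc) * hill_slope))

log_conc = np.log10([1e-9, 3e-9, 1e-8, 3e-8, 1e-7, 3e-7, 1e-6])   # mol/L
force_mn = np.array([4.5, 4.8, 5.6, 7.0, 8.6, 9.2, 9.4])          # contractile force, mN

params, _ = curve_fit(hill, log_conc, force_mn, p0=[4.5, 9.5, -7.2, 1.0])
print("-logEC50 =", -params[2])
```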
Abstract:
We investigated the limits at which blur due to defocus, crossed-cylinder astigmatism, and trefoil became noticeable, troublesome or objectionable. Black letter targets (0.1, 0.35 and 0.6 logMAR) were presented on white backgrounds. Subjects were cyclopleged and had effectively 5 mm pupils. Blur was induced with a deformable, adaptive-optics mirror operating under open-loop conditions. Mean defocus blur limits of six subjects with uncorrected intrinsic higher-order ocular aberrations ranged from 0.18 ± 0.08 D (noticeable blur criterion, 0.1 logMAR) to 1.01 ± 0.27 D (objectionable blur criterion, 0.6 logMAR). Crossed-cylinder astigmatic blur limits were approximately 90% of those for defocus, but with considerable meridional influences. In two of the subjects, the intrinsic aberrations of the eye were subsequently corrected before the defocus and astigmatic blur were added. This resulted in only minor reductions in their blur limits. When assessed with trefoil blur and corrected intrinsic ocular aberrations, the ratio of objectionable to noticeable blur limits in these two subjects was much higher for trefoil (3.5) than for defocus (2.5) and astigmatism (2.2).
Abstract:
Cooking skills are emphasized in nutrition promotion, but few population-based studies have examined their distribution among population subgroups or their relationship to dietary behavior. This study examined the relationships between confidence to cook, sociodemographic characteristics, and household vegetable purchasing. This cross-sectional study of 426 randomly selected households in Brisbane, Australia, used a validated questionnaire to assess household vegetable purchasing habits and the confidence to cook of the person who most often prepares food for these households. The mutually adjusted odds ratios (ORs) of lacking confidence to cook were assessed across a range of demographic subgroups using multiple logistic regression models. Similarly, mutually adjusted mean vegetable purchasing scores were calculated using multiple linear regression for different population groups and for respondents with varying confidence levels. Lacking confidence to cook using a variety of techniques was more common among respondents with less education (OR 3.30; 95% confidence interval [CI] 1.01 to 10.75) and less common among respondents who lived with minors (OR 0.22; 95% CI 0.09 to 0.53) or with other adults (OR 0.43; 95% CI 0.24 to 0.78). Lack of confidence to prepare vegetables was associated with being male (OR 2.25; 95% CI 1.24 to 4.08), low education (OR 6.60; 95% CI 2.08 to 20.91), lower household income (OR 2.98; 95% CI 1.02 to 8.72) and living with other adults (OR 0.53; 95% CI 0.29 to 0.98). Households bought a greater variety of vegetables on a regular basis when the main chef was confident to prepare them (difference: 18.60; 95% CI 14.66 to 22.54), was older (difference: 8.69; 95% CI 4.92 to 12.47), lived with at least one other adult (difference: 5.47; 95% CI 2.82 to 8.12) or lived with at least one minor (difference: 2.86; 95% CI 0.17 to 5.55). Cooking skills may contribute to socioeconomic differences in diet, and building these skills may be a useful strategy for promoting fruit and vegetable consumption, particularly among socioeconomically disadvantaged groups.
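A minimal sketch of the "mutually adjusted" logistic regression described above is given below. It is an assumed illustration using statsmodels with hypothetical column names and simulated data, not the study's analysis code.

```python
# Minimal sketch (assumed): odds of lacking confidence to cook modelled with
# all sociodemographic covariates entered together, so each OR is adjusted
# for the others. Column names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 426
df = pd.DataFrame({
    "lacks_confidence": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "low_education": rng.integers(0, 2, n),
    "low_income": rng.integers(0, 2, n),
    "lives_with_minors": rng.integers(0, 2, n),
    "lives_with_adults": rng.integers(0, 2, n),
})

model = smf.logit(
    "lacks_confidence ~ male + low_education + low_income + "
    "lives_with_minors + lives_with_adults",
    data=df,
).fit()

# Mutually adjusted odds ratios with 95% confidence intervals
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```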
Abstract:
Suicide has drawn much attention from both the scientific community and the public. Examining the impact of socio-environmental factors on suicide is essential for developing suicide prevention strategies and interventions, because it provides health authorities with important information for their decision-making. However, previous studies did not examine the impact of socio-environmental factors on suicide using a spatial analysis approach. The purpose of this study was to identify the patterns of suicide and to examine how socio-environmental factors impact on suicide over time and space at the Local Governmental Area (LGA) level in Queensland. The suicide data between 1999 and 2003 were collected from the Australian Bureau of Statistics (ABS). Socio-environmental variables at the LGA level included climate (rainfall, maximum and minimum temperature), Socio-Economic Indexes for Areas (SEIFA) and demographic variables (proportion of Indigenous population, unemployment rate, and proportions of population with low income and low education level). Climate data were obtained from the Australian Bureau of Meteorology. SEIFA and demographic variables were acquired from the ABS. A series of statistical and geographical information system (GIS) approaches were applied in the analysis. This study included two stages. The first stage used average annual data to view the spatial pattern of suicide and to examine the association between socio-environmental factors and suicide over space. The second stage examined the spatiotemporal pattern of suicide and assessed the socio-environmental determinants of suicide using more detailed seasonal data. In this research, 2,445 suicide cases were included: 1,957 males (80.0%) and 488 females (20.0%). In the first stage, we examined the spatial pattern and the determinants of suicide using 5-year aggregated data. Spearman correlations were used to assess associations between variables. A Poisson regression model was then applied in the multivariable analysis, as the occurrence of suicide is a small-probability event and this model fitted the data well. Suicide mortality varied across LGAs and was associated with a range of socio-environmental factors. The multivariable analysis showed that maximum temperature was significantly and positively associated with male suicide (relative risk [RR] = 1.03, 95% CI: 1.00 to 1.07). A higher proportion of Indigenous population was associated with more suicide in the male population (RR = 1.02, 95% CI: 1.01 to 1.03). There was a positive association between unemployment rate and suicide in both genders (male: RR = 1.04, 95% CI: 1.02 to 1.06; female: RR = 1.07, 95% CI: 1.00 to 1.16). No significant association was observed for rainfall, minimum temperature, SEIFA, or the proportions of population with low individual income and low educational attainment. In the second stage of this study, we undertook a preliminary spatiotemporal analysis of suicide using seasonal data. Firstly, we assessed the interrelations between variables. Secondly, a generalised estimating equations (GEE) model was used to examine the socio-environmental impact on suicide over time and space, as this model is well suited to analysing repeated longitudinal data (e.g., seasonal suicide mortality in a certain LGA) and it fitted the data better than other models (e.g., a Poisson model). The suicide pattern varied with season and LGA.
The north of Queensland had the highest suicide mortality rate in all seasons, while no suicide cases occurred in the southwest. The northwest had consistently higher suicide mortality in spring, autumn and winter. In other areas, suicide mortality varied between seasons. This analysis showed that maximum temperature was positively associated with suicide in the male population (RR = 1.24, 95% CI: 1.04 to 1.47) and the total population (RR = 1.15, 95% CI: 1.00 to 1.32). A higher proportion of Indigenous population was associated with more suicide in the total population (RR = 1.16, 95% CI: 1.13 to 1.19) and by gender (male: RR = 1.07, 95% CI: 1.01 to 1.13; female: RR = 1.23, 95% CI: 1.03 to 1.48). Unemployment rate was positively associated with total (RR = 1.40, 95% CI: 1.24 to 1.59) and female (RR = 1.09, 95% CI: 1.01 to 1.18) suicide. There was also a positive association between the proportion of population with low individual income and suicide in the total (RR = 1.28, 95% CI: 1.10 to 1.48) and male (RR = 1.45, 95% CI: 1.23 to 1.72) populations. Rainfall was positively associated with suicide only in the total population (RR = 1.11, 95% CI: 1.04 to 1.19). There was no significant association for minimum temperature, SEIFA, or the proportion of population with low educational attainment. The second stage is an extension of the first stage. Different temporal scales of data were used in the two stages (i.e., mean yearly data in the first stage and seasonal data in the second stage), but the results are generally consistent with each other. Compared with other studies, this research explored how the impact of a wide range of socio-environmental factors on suicide varied across different geographical units. Maximum temperature, proportion of Indigenous population, unemployment rate and proportion of population with low individual income were among the major determinants of suicide in Queensland. However, the influence of other factors (e.g., socio-cultural background, alcohol and drug use) on suicide cannot be ignored. An in-depth understanding of these factors is vital in planning and implementing suicide prevention strategies. Five recommendations for future research are derived from this study: (1) it is vital to acquire detailed personal information on each suicide case and relevant information about the population when assessing the key socio-environmental determinants of suicide; (2) Bayesian models could be applied to compare mortality rates and their socio-environmental determinants across LGAs in future research; (3) in LGAs with warm weather, a high proportion of Indigenous population and/or a high unemployment rate, concerted efforts need to be made to control and prevent suicide and other mental health problems; (4) the current surveillance, forecasting and early warning systems need to be strengthened to trace climate and socioeconomic change over time and space and its impact on population health; (5) it is necessary to evaluate and improve the facilities for mental health care, psychological consultation, and suicide prevention and control programs, especially in areas with low socio-economic status, high unemployment, extreme weather events and natural disasters.
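A minimal sketch of the second-stage model described above (a Poisson GEE with seasons repeated within LGAs) is given below. It is an assumed illustration using statsmodels with hypothetical variable names and simulated data, not the thesis code; a log-population offset is included so that exponentiated coefficients can be read as rate ratios.

```python
# Minimal sketch (assumed): seasonal suicide counts per LGA analysed with a
# Poisson GEE, repeated seasons clustered within LGAs, population as an
# exposure offset. Variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_lga, n_seasons = 125, 20          # e.g. 5 years x 4 seasons
n = n_lga * n_seasons
df = pd.DataFrame({
    "lga": np.repeat(np.arange(n_lga), n_seasons),
    "suicides": rng.poisson(1.0, n),
    "max_temp": rng.normal(28, 5, n),
    "unemployment": rng.normal(8, 2, n),
    "pct_indigenous": rng.normal(5, 3, n),
    "population": rng.integers(2_000, 100_000, n),
})

model = smf.gee(
    "suicides ~ max_temp + unemployment + pct_indigenous",
    groups="lga",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
    offset=np.log(df["population"]),
).fit()

# Rate ratios (relative risks) with 95% confidence intervals
print(np.exp(pd.concat([model.params, model.conf_int()], axis=1)))
```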
Abstract:
High-density living in inner-urban areas has been promoted to encourage the use of more sustainable modes of travel and to reduce greenhouse gas emissions. However, previous research presents mixed results on the relationship between living in proximity to transport systems and reduced car-dependency. This research examines inner-city residents' transportation practices and perceptions via 24 qualitative interviews with residents of high-density dwellings in inner-city Brisbane, Australia. Whilst participants consider public transport accessible and convenient, car use continues to be relied on for many journeys. Transportation choices are justified through complex definitions of convenience containing both utilitarian and psycho-social elements, with three key themes identified: time-efficiency, single versus multi-modal trips, and distance to and purpose of journey, as well as attitudinal, affective and symbolic elements related to transport mode use. Understanding the conceptions of transport convenience held by different segments of the transport user market, alongside other factors strongly implicated in travel mode choice, can ensure targeted improvements in sustainable transport service levels and infrastructure, as well as in information provision and behavioural change campaigns.
Abstract:
Introduction: Emergency prehospital medical care providers are frontline health workers during emergencies. However, little is known about their attitudes, perceptions, and likely behaviors during emergency conditions. Understanding these attitudes and behaviors is crucial to mitigating the psychological and operational effects of biohazard events such as pandemic influenza, and will support the business continuity of essential prehospital services. Problem: This study was designed to investigate the association between knowledge and attitudes regarding avian influenza and the likely behavioral responses of Australian emergency prehospital medical care providers in pandemic conditions. Methods: Using a reply-paid postal questionnaire, the knowledge and attitudes of a national, stratified, random sample of the Australian emergency prehospital medical care workforce in relation to pandemic influenza were investigated. In addition to knowledge and attitudes, there were five measures of anticipated behavior during pandemic conditions: (1) preparedness to wear personal protective equipment (PPE); (2) preparedness to change role; (3) willingness to work; and likely refusal to work with colleagues who were exposed to (4) known and (5) suspected influenza. Multiple logistic regression models were constructed to determine the independent predictors of each of the anticipated behaviors, while controlling for other relevant variables. Results: Almost half (43%) of the 725 emergency prehospital medical care personnel who responded to the survey indicated that they would be unwilling to work during pandemic conditions; one-quarter indicated that they would not be prepared to work in PPE; and one-third would refuse to work with a colleague exposed to a known case of pandemic human influenza. Willingness to work during a pandemic (OR = 1.41; 95% CI = 1.0-1.9) and willingness to change roles (OR = 1.44; 95% CI = 1.04-2.0) significantly increased with adequate knowledge about infectious agents generally. Refusal to work with exposed (OR = 0.48; 95% CI = 0.3-0.7) or potentially exposed (OR = 0.43; 95% CI = 0.3-0.6) colleagues significantly decreased with adequate knowledge about infectious agents. Confidence in the employer's capacity to respond appropriately to a pandemic significantly increased employee willingness to work (OR = 2.83; 95% CI = 1.9-4.1), willingness to change roles during a pandemic (OR = 1.52; 95% CI = 1.1-2.1) and preparedness to wear PPE (OR = 1.68; 95% CI = 1.1-2.5), and significantly decreased the likelihood of refusing to work with colleagues exposed to (suspected) influenza (OR = 0.59; 95% CI = 0.4-0.9). Conclusions: These findings indicate that education and training alone will not adequately prepare the emergency prehospital medical workforce for a pandemic. It is crucial to address the concerns of ambulance personnel, and the concerns they perceive among their partners, in order to maintain an effective prehospital emergency medical care service during pandemic conditions.
Abstract:
Background: Specialised disease management programmes for chronic heart failure (CHF) improve survival and quality of life and reduce healthcare utilisation. The overall efficacy of structured telephone support or telemonitoring as an individual component of a CHF disease management strategy remains inconclusive. Objectives: To review randomised controlled trials (RCTs) of structured telephone support or telemonitoring compared to standard practice for patients with CHF, in order to quantify the effects of these interventions over and above usual care. Search strategy: Databases (the Cochrane Central Register of Controlled Trials (CENTRAL), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment Database (HTA) on The Cochrane Library; MEDLINE; EMBASE; CINAHL; AMED; and Science Citation Index Expanded and Conference Citation Index on ISI Web of Knowledge) and various search engines were searched from 2006 to November 2008 to update a previously published non-Cochrane review. Bibliographies of relevant studies and systematic reviews and abstract conference proceedings were handsearched. No language limits were applied. Selection criteria: Only peer-reviewed, published RCTs comparing structured telephone support or telemonitoring to usual care of CHF patients were included. Unpublished abstract data were included in sensitivity analyses. The intervention or usual care could not include a home visit or more than the usual (four to six weeks) clinic follow-up. Data collection and analysis: Data were presented as risk ratios (RR) with 95% confidence intervals (CI). Primary outcomes included all-cause mortality and all-cause and CHF-related hospitalisations, which were meta-analysed using fixed effects models. Other outcomes included length of stay, quality of life, acceptability and cost, and these were described and tabulated. Main results: Twenty-five studies and five published abstracts were included. Of the 25 full peer-reviewed studies meta-analysed, 16 evaluated structured telephone support (5613 participants), 11 evaluated telemonitoring (2710 participants), and two tested both interventions (included in both counts). Telemonitoring reduced all-cause mortality (RR 0.66, 95% CI 0.54 to 0.81, P < 0.0001), with structured telephone support demonstrating a non-significant positive effect (RR 0.88, 95% CI 0.76 to 1.01, P = 0.08). Both structured telephone support (RR 0.77, 95% CI 0.68 to 0.87, P < 0.0001) and telemonitoring (RR 0.79, 95% CI 0.67 to 0.94, P = 0.008) reduced CHF-related hospitalisations. For both interventions, several studies improved quality of life, reduced healthcare costs and were acceptable to patients. Improvements in prescribing, patient knowledge and self-care, and New York Heart Association (NYHA) functional class were observed. Authors' conclusions: Structured telephone support and telemonitoring are effective in reducing the risk of all-cause mortality and CHF-related hospitalisations in patients with CHF; they improve quality of life, reduce costs, and improve evidence-based prescribing.
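The fixed-effects meta-analyses referred to above pool trial-level risk ratios by inverse-variance weighting on the log scale. A minimal sketch of that calculation is given below; the risk ratios and confidence intervals in the example are hypothetical, not trial data from the review.

```python
# Minimal sketch (not the review's code): fixed-effect, inverse-variance
# pooling of trial-level risk ratios, with each trial's standard error
# recovered from the width of its reported 95% CI.
import math

def fixed_effect_pool(rrs_with_ci):
    """Pool (RR, lower, upper) tuples into a single RR with a 95% CI."""
    weights, weighted_logs = [], []
    for rr, lo, hi in rrs_with_ci:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se**2
        weights.append(w)
        weighted_logs.append(w * math.log(rr))
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical trial-level risk ratios for all-cause mortality
print(fixed_effect_pool([(0.70, 0.50, 0.98), (0.62, 0.40, 0.96), (0.75, 0.55, 1.02)]))
```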
Abstract:
Background: Rapid weight gain in infancy is an important predictor of obesity in later childhood. Our aim was to determine which modifiable variables are associated with rapid weight gain in early life. Methods: Subjects were healthy infants enrolled in NOURISH, a randomised controlled trial evaluating an intervention to promote positive early feeding practices. This analysis used the birth and baseline data for NOURISH. Birthweight was collected from hospital records, and infants were weighed at the baseline assessment, when they were aged 4-7 months, before randomisation. Infant feeding practices and demographic variables were collected from the mother using a self-administered questionnaire. Rapid weight gain was defined as an increase in weight-for-age Z-score (using WHO standards) above 0.67 SD from birth to baseline assessment, which is interpreted clinically as crossing centile lines on a growth chart. Variables associated with rapid weight gain were evaluated using a multivariable logistic regression model. Results: Complete data were available for 612 infants (88% of the total sample recruited) with a mean (SD) age of 4.3 (1.0) months at baseline assessment. After adjusting for maternal age, smoking in pregnancy, BMI and education, and infant birthweight, age, gender and introduction of solid foods, the only two modifiable factors associated with rapid weight gain that attained statistical significance were formula feeding [OR = 1.72 (95% CI 1.01-2.94), P = 0.047] and feeding on schedule [OR = 2.29 (95% CI 1.14-4.61), P = 0.020]. Male gender and lower birthweight were non-modifiable factors associated with rapid weight gain. Conclusions: This analysis supports the contention that there is an association between formula feeding, feeding to schedule and weight gain in the first months of life. Mechanisms may include the actual content of formula milk (e.g., higher protein intake) or differences in feeding styles, such as feeding to schedule, which increase the risk of overfeeding. Trial Registration: Australian Clinical Trials Registry ACTRN12608000056392
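A minimal sketch of the rapid-weight-gain definition used above (a gain in weight-for-age Z-score from birth to baseline greater than 0.67 SD) is given below. The reference medians and SDs are hypothetical placeholders, not WHO standard values, and the simple z-score formula is a simplification of the WHO LMS method.

```python
# Minimal sketch (assumed, not the trial's code) of flagging rapid weight gain
# as a change in weight-for-age Z-score above 0.67 SD from birth to baseline.
from dataclasses import dataclass

@dataclass
class WhoReference:
    median_kg: float   # reference median weight for the infant's age/sex (hypothetical)
    sd_kg: float       # reference SD (a simplification of the WHO LMS method)

def weight_for_age_z(weight_kg: float, ref: WhoReference) -> float:
    return (weight_kg - ref.median_kg) / ref.sd_kg

def rapid_weight_gain(birth_w, birth_ref, baseline_w, baseline_ref, threshold=0.67):
    """True if the gain in weight-for-age Z-score exceeds the threshold."""
    delta_z = weight_for_age_z(baseline_w, baseline_ref) - weight_for_age_z(birth_w, birth_ref)
    return delta_z > threshold

# Hypothetical example: 3.2 kg at birth, 7.6 kg at the 4-month baseline assessment
print(rapid_weight_gain(3.2, WhoReference(3.3, 0.45), 7.6, WhoReference(7.0, 0.75)))
```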
Abstract:
Objective: To investigate the mental and general health of infertile women who had not sought medical advice for their recognized infertility and were therefore not represented in clinical populations. Design: Longitudinal cohort study. Setting: Population based. Patient(s): Participants in the Australian Longitudinal Study on Women's Health aged 28-33 years in 2006 who had ever tried to conceive or had been pregnant (n = 5,936). Intervention(s): None. Main Outcome Measure(s): Infertility, not seeking medical advice. Result(s): Compared with fertile women (n = 4,905), infertile women (n = 1,031) had higher odds of self-reported depression (odds ratio [OR] 1.20, 95% confidence interval [CI] 1.01-1.43), endometriosis (5.43, 4.01-7.36), polycystic ovary syndrome (9.52, 7.30-12.41), irregular periods (1.99, 1.68-2.36), type II diabetes (4.70, 1.79-12.37), or gestational diabetes (1.66, 1.12-2.46). Compared with infertile women who sought medical advice (n = 728), those who had not sought medical advice (n = 303) had higher odds of self-reported depression (1.67, 1.18-2.37), other mental health problems (3.14, 1.14-8.64), urinary tract infections (1.67, 1.12-2.49), heavy periods (1.63, 1.16-2.29), or a cancer diagnosis (11.33, 2.57-49.89). Infertile women who had or had not sought medical advice had similar odds of reporting an anxiety disorder or anxiety-related symptoms. Conclusion(s): Women with self-reported depression were unlikely to have sought medical advice for infertility. Depression and depressive symptoms may be barriers to seeking medical advice for infertility.
Duration-dependant response of mixed-method pre-cooling for intermittent-sprint exercise in the heat
Abstract:
This study examined the effects of pre-cooling duration on performance and neuromuscular function during self-paced intermittent-sprint shuttle running in the heat. Eight male team-sport athletes completed two 35-min bouts of intermittent-sprint shuttle running separated by a 15-min recovery on three separate occasions (33°C, 34% relative humidity). Mixed-method pre-cooling was applied for 20 min (COOL20), for 10 min (COOL10) or not at all (CONT), and was reapplied for 5 min mid-exercise. Performance was assessed via sprint times, percentage decline and shuttle-running distance covered. Maximal voluntary contractions (MVC), voluntary activation (VA) and evoked twitch properties were recorded pre- and post-intervention and mid- and post-exercise. Core temperature (Tc), skin temperature, heart rate, capillary blood metabolites, sweat losses, perceptual exertion and thermal stress were monitored throughout. Venous blood draws pre- and post-exercise were analyzed for markers of muscle damage and inflammation. Shuttle-running distance covered increased 5.2 ± 3.3% following COOL20 (P < 0.05), with no differences observed between COOL10 and CONT (P > 0.05). COOL20 aided the maintenance of mid- and post-exercise MVC (P < 0.05; d > 0.80), despite no between-condition differences in VA (P > 0.05). Pre-exercise Tc was reduced by 0.15 ± 0.13°C with COOL20 (P < 0.05; d > 1.10), and remained lower throughout both COOL20 and COOL10 compared to CONT (P < 0.05; d > 0.80). Pre-cooling reduced sweat losses by 0.4 ± 0.3 kg (P < 0.02; d > 1.15), with COOL20 0.2 ± 0.4 kg less than COOL10 (P = 0.19; d = 1.01). Increased pre-cooling duration lowered physiological demands during exercise heat stress and facilitated the maintenance of self-paced intermittent-sprint performance in the heat. Importantly, the dose-response interaction between pre-cooling and sustained neuromuscular responses may explain the improved exercise performance in hot conditions.
Abstract:
Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions that were used for surveillance of heart failure patients were included. Seven (41%) systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Five (29%) focused on telemonitoring. Four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In high quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI 0.29-0.96) to 0.88 (95% CI 0.76-1.01). High quality reviews also reported that remote monitoring reduced the relative risk of all-cause hospitalisations (RR 0.52, 95% CI 0.28-0.96 to RR 0.96, 95% CI 0.90-1.03) and heart failure-related hospitalisations (RR 0.72, 95% CI 0.64-0.81 to RR 0.79, 95% CI 0.67-0.94) and, as a consequence, healthcare costs. As the high quality reviews reported that remote monitoring reduced hospitalisations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on the one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.
Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0-78.0) years, median PD duration 6.75 (0.5-24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.
Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0-92.0) years, median PD duration 6.0 (0.0-31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (SGA-A 2 vs SGA-B 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly.
This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2-11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools have been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for under 65 years, ≤23.5 kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with a sensitivity (Sn) of 94.7% and a specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.
Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01-1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04-1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02-1.19), less anxiety (OR 0.90, 95% CI 0.82-0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07-1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01-0.26), more depressive symptoms (β=0.02, 95% CI 0.01-0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01-0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.
Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted.
Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms, or in "off" times or dyskinesias, in either group. Aspects of quality of life also improved over the 12 weeks, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
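As an illustration of how the Phase I screening tools could be scored against the SGA classification used as the gold standard, a minimal sketch of the sensitivity and specificity calculation is given below; the screening outcomes are hypothetical, not the thesis data.

```python
# Minimal sketch (assumed): sensitivity and specificity of a screening tool
# for flagging malnutrition, with SGA-B as the gold-standard positive class.
from typing import Iterable, Tuple

def sensitivity_specificity(results: Iterable[Tuple[bool, bool]]) -> Tuple[float, float]:
    """results: (tool_flags_malnourished, sga_says_malnourished) pairs."""
    tp = fn = tn = fp = 0
    for tool_positive, sga_positive in results:
        if sga_positive:
            tp += tool_positive
            fn += not tool_positive
        else:
            fp += tool_positive
            tn += not tool_positive
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening outcomes: (MNA-SF flag, SGA-B status) for 10 patients
pairs = [(True, True), (True, True), (False, True), (True, False), (False, False),
         (False, False), (True, False), (False, False), (False, False), (True, True)]
print(sensitivity_specificity(pairs))
```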
Abstract:
Objective: To evaluate the effectiveness of the 7-valent pneumococcal conjugate vaccine (PCV7) in preventing pneumonia, diagnosed radiologically according to World Health Organization (WHO) criteria, among indigenous infants in the Northern Territory of Australia. Methods: We conducted a historical cohort study of consecutive indigenous birth cohorts between 1 April 1998 and 28 February 2005. Children were followed up to 18 months of age. The PCV7 programme commenced on 1 June 2001. All chest X-rays taken within 3 days of any hospitalization were assessed. The primary endpoint was a first episode of WHO-defined pneumonia requiring hospitalization. Cox proportional hazards models were used to compare disease incidence. Findings: There were 526 pneumonia events among 10,600 children - an incidence of 3.3 per 1000 child-months; 183 episodes (34.8%) occurred before 5 months of age and 247 (47.0%) by 7 months. Of the children studied, 27% had received 3 doses of vaccine by 7 months of age. Hazard ratios for endpoint pneumonia were 1.01 for 1 versus 0 doses, 1.03 for 2 versus 0 doses, and 0.84 for 3 versus 0 doses. Conclusion: There was limited evidence that PCV7 reduced the incidence of radiologically confirmed pneumonia among Northern Territory indigenous infants, although there was a non-significant trend towards an effect after receipt of the third dose. These findings might be explained by a lack of timely vaccination and/or the occurrence of disease at an early age. Additionally, the relative contribution of vaccine-type pneumococcus to severe pneumonia in a setting where multiple other pathogens are prevalent may differ from that in other settings where vaccine efficacy has been clearly established.
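A minimal sketch of a Cox proportional hazards comparison by number of PCV7 doses is given below, using the lifelines package with simulated data and hypothetical column names; it treats dose count as a fixed covariate, whereas the study's exposure would in practice be handled more carefully (e.g., as time-varying).

```python
# Minimal sketch (assumed, not the study's code): Cox proportional hazards
# model of time to first WHO-defined pneumonia episode, with dummy variables
# for 1, 2 and 3 doses relative to 0 doses. Data and names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "followup_months": rng.uniform(1, 18, n),   # time to pneumonia or censoring
    "pneumonia": rng.integers(0, 2, n),         # 1 = WHO-defined pneumonia episode
    "doses": rng.integers(0, 4, n),             # PCV7 doses received (0-3)
})

# Dose indicators relative to the 0-dose reference group
dose_dummies = pd.get_dummies(df["doses"].astype("category"),
                              prefix="dose", drop_first=True).astype(float)
data = pd.concat([df[["followup_months", "pneumonia"]], dose_dummies], axis=1)

cph = CoxPHFitter()
cph.fit(data, duration_col="followup_months", event_col="pneumonia")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```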
Abstract:
Purpose: Inaccurate accommodation during nearwork and subsequent accommodative hysteresis may influence myopia development. Myopia is highly prevalent in Singapore; an untested theory is that Chinese children are prone to these accommodation characteristics. We measured the accuracy of accommodation responses during, and nearwork-induced transient myopia (NITM) after, periods spent reading Chinese and English texts. Methods: Refractions of 40 emmetropic and 43 myopic children were measured with a free-space autorefractor for four reading tasks of 10-minute duration: Chinese (SimSun, 10.5 points) and English (Times New Roman, 12 points) texts at 25 cm and 33 cm. Accuracy was obtained by subtracting the accommodation response from the accommodation demand. Nearwork-induced transient myopia was obtained by subtracting the pretask distance refraction from the posttask refraction, and regression was determined as the time for the posttask refraction to return to pretask levels. Results: There were significant, but small, effects of text type (Chinese, 0.97 ± 0.32 diopters [D] vs. English, 1.00 ± 0.37 D; F(1,1230) = 7.24, p = 0.007) and reading distance (33 cm, 1.01 ± 0.30 D vs. 25 cm, 0.97 ± 0.39 D; F(1,1230) = 7.74, p = 0.005) on accommodation accuracy across all participants. Accuracy was similar for emmetropic and myopic children across all reading tasks. Neither text type nor reading distance had significant effects on NITM or its regression. Myopes had greater NITM (by 0.07 D; F(1,81) = 5.05, p = 0.03) that took longer (by 50 s; F(1,81) = 31.08, p < 0.01) to dissipate. Conclusions: Reading Chinese text caused smaller accommodative lags than reading English text, but the small differences were not clinically significant. Myopic children had significantly greater NITM and longer regression than emmetropic children for both texts. Whether differences in NITM are a cause or consequence of myopia cannot be answered from this study.