757 results for Diet and nutrition
Abstract:
Background Delivering effective multiple health behavior interventions to large numbers of adults with chronic conditions via primary care settings is a public health priority. Purpose Within a 12-month, telephone-delivered diet and physical activity intervention with multiple behavioral outcomes, we examined the extent and co-variation of multiple health behavior change. Methods In a cluster-randomized trial, 434 patients with type 2 diabetes or hypertension were recruited from 10 general practices, which were randomized to receive telephone counseling or usual care. Results Those receiving telephone counseling were significantly more likely than those in usual care to make greater reductions in multiple behaviors after adjusting for baseline risk behaviors (OR 2.42; 95% CI 1.43, 4.11). Controlling for baseline risk and group allocation, making changes to physical activity or to fat, vegetable, or fiber intake was associated with making significantly more improvements in the other behaviors. Conclusions For patients with chronic conditions, telephone counseling can significantly improve multiple health behaviors, with behavioral changes tending to co-vary.
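The adjusted odds ratio reported above (OR 2.42; 95% CI 1.43, 4.11) comes from the trial's regression model. As a generic sketch of how an unadjusted odds ratio and its Wald confidence interval are derived from a 2x2 table (the counts below are hypothetical, not taken from this trial):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = intervention with outcome, b = intervention without,
    c = control with outcome, d = control without."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf method
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts only, chosen to illustrate the calculation
o, lo, hi = odds_ratio_ci(60, 40, 38, 62)
print(round(o, 2), round(lo, 2), round(hi, 2))
```

An OR above 1 whose confidence interval excludes 1, as in the trial's result, indicates significantly higher odds of the outcome in the intervention group.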
Abstract:
Background By 2025, it is estimated that approximately 1.8 million Australian adults (approximately 8.4% of the adult population) will have diabetes, with the majority having type 2 diabetes. Weight management via improved physical activity and diet is the cornerstone of type 2 diabetes management. However, the majority of weight loss trials in diabetes have evaluated short-term, intensive clinic-based interventions that, while producing short-term outcomes, have failed to address issues of maintenance and broad population reach. Telephone-delivered interventions have the potential to address these gaps. Methods/Design Using a two-arm randomised controlled design, this study will evaluate an 18-month, telephone-delivered, behavioural weight loss intervention focussing on physical activity, diet and behavioural therapy, versus usual care, with follow-up at 24 months. Three hundred adult participants, aged 20-75 years, with type 2 diabetes, will be recruited from 10 general practices via electronic medical records search. The Social-Cognitive Theory driven intervention involves a six-month intensive phase (4 weekly calls and 11 fortnightly calls) and a 12-month maintenance phase (one call per month). Primary outcomes, assessed at 6, 18 and 24 months, are: weight loss, physical activity, and glycaemic control (HbA1c), with weight loss and physical activity also measured at 12 months. Incremental cost-effectiveness will also be examined. Study recruitment began in February 2009, with final data collection expected by February 2013. Discussion This is the first study to evaluate the telephone as the primary method of delivering a behavioural weight loss intervention in type 2 diabetes.
The evaluation of maintenance outcomes (6 months following the end of intervention), the use of accelerometers to objectively measure physical activity, and the inclusion of a cost-effectiveness analysis will advance the science of broad reach approaches to weight control and health behaviour change, and will build the evidence base needed to advocate for the translation of this work into population health practice.
Abstract:
This study aimed to gauge the presence of markers of chronic disease as a basis for food and nutrition policy in correctional facilities. One hundred and twenty offenders, recruited from a Queensland Correctional Centre, provided informed consent and completed both dietary interviews and physical measurements. Mean age of the sample was 35.5 ± 12 years (range = 19–77 years); mean age of the total population (n = 945) was 32.8 ± 10 years (range = 19–80 years). Seventy-nine participants also provided fasting blood samples. Mean body mass index (BMI) was 27 ± 3.5 kg/m2, with 72% having a BMI > 25 kg/m2. Thirty-three percent were classified as overweight or obese using waist circumference (mean = 92 ± 10 cm). Mean blood pressure was 130 ± 14 mmHg systolic and 73 ± 10 mmHg diastolic. Twenty-four percent were classified as hypertensive, of whom three were on antihypertensive medication. Eighteen percent had elevated triglycerides, and 40% had unfavourable total cholesterol to HDL ratios. Homeostatic Model Assessment (HOMA) scores were calculated from fasting glucose and insulin. Four participants were insulin resistant, two of whom had known diabetes. Based on waist circumference (adjusted for ethnicity), blood lipids, blood pressure and plasma glucose, 25% (n = 20) were classified with metabolic syndrome. Eighty-four percent (n = 120) reported some physical activity each day, with 51% participating two or more times daily. Fifty-four percent reported smoking, with an additional 20% having smoked in the past. Findings suggest that waist circumference, rather than weight and BMI alone, should be used in this group to determine weight status. The data suggest that markers of chronic disease are present and that food and nutrition policy must reflect this. Further analysis is being completed to determine relevant policy initiatives.
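The HOMA scores mentioned above are conventionally computed from fasting glucose and insulin. A minimal sketch of the standard HOMA-IR formula alongside BMI (the example inputs are hypothetical, not participant data):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """Homeostatic Model Assessment of insulin resistance (HOMA-IR),
    standard formula: glucose (mmol/L) x insulin (uU/mL) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5

# Hypothetical example values, not drawn from the study sample
print(round(bmi(80, 1.72), 1))       # 80 kg at 1.72 m
print(round(homa_ir(5.5, 10.0), 2))  # fasting glucose 5.5 mmol/L, insulin 10 uU/mL
```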
Abstract:
Lower energy and protein intakes are well documented in patients on texture modified diets. In acute hospital settings, the provision of appropriate texture modified foods that meet industry standards is essential for patient safety and nutrition outcomes. The texture modified menu at an acute private hospital was evaluated against the hospital's own nutritional standards (NS) and Australian national standards (Dietitians Association of Australia and Speech Pathology Australia, 2007). The NS document portion sizes and nutritional requirements for each menu. Texture B and C menus were analysed qualitatively and quantitatively over 9 days of a 6 day cyclic menu for breakfast (n=4), lunch (n=34) and dinner (n=34). Results indicated a lack of portion control, as specified by the NS, across all meals, including breakfast (65–140%), soup (55–115%), meat (45–165%), vegetables (55–185%) and desserts (30–300%). Dilution factors and portion sizes influenced the protein and energy availability of the Texture B and C menus. While the Texture B menu provided more energy, neither menu met the NS. Limited dessert options on the Texture C menu restricted its ability to meet the protein NS. A lack of portion control and incorrectly modified menu items can compromise protein and energy intakes. Strategies to correct serving sizes and provide alternate protein sources were recommended. Suggestions included cost-effectively increasing the variety of foods to assist protein and energy intake, and the procurement of standardised equipment and visual aids to assist food preparation and presentation in accordance with texture modified guidelines and the NS.
Abstract:
1. Both dietary magnesium depletion and potassium depletion (confirmed by tissue analysis) were induced in rats which were then compared with rats treated with chlorothiazide (250 mg/kg diet) and rats on a control synthetic diet. 2. Brain and muscle intracellular pH was measured by using a surface coil and [31P]-NMR to measure the chemical shift of inorganic phosphate. pH was also measured in isolated perfused hearts from control and magnesium-deficient rats. Intracellular magnesium status was assessed by measuring the chemical shift of β-ATP in brain. 3. There was no evidence for magnesium deficiency in the chlorothiazide-treated rats on tissue analysis or on chemical shift of β-ATP in brain. Both magnesium and potassium deficiency, but not chlorothiazide treatment, were associated with an extracellular alkalosis. 4. Magnesium deficiency led to an intracellular alkalosis in brain, muscle and heart. Chlorothiazide treatment led to an alkalosis in brain. Potassium deficiency was associated with a normal intracellular pH in brain and muscle. 5. Magnesium depletion and chlorothiazide treatment produce intracellular alkalosis by unknown mechanism(s).
Abstract:
Background In Australia and other developed countries, there are consistent and marked socioeconomic inequalities in health. Diet is a major contributing factor to the poorer health of lower socioeconomic groups: the dietary patterns of disadvantaged groups are less consistent with dietary recommendations for the prevention of diet-related chronic diseases than those of their more advantaged counterparts. Part of the reason that lower socioeconomic groups have poorer diets may be their consumption of takeaway foods. These foods typically have nutrient contents that fail to comply with the dietary recommendations for the prevention of chronic disease and associated risk factors. A high level of takeaway food consumption may therefore negatively influence overall dietary intake and, consequently, lead to adverse health outcomes. Despite this, little attention has focused on the association between socioeconomic position (SEP) and takeaway food consumption, and the limited number of studies shows mixed results. Additionally, studies have been limited by considering only a narrow range of takeaway foods and by not examining how different socioeconomic groups make choices that are more (or less) consistent with dietary recommendations. While a large number of earlier studies have consistently reported that socioeconomically disadvantaged groups consume less fruit and vegetables, there is limited knowledge about the role of takeaway food in socioeconomic variations in fruit and vegetable intake. Furthermore, no known studies have investigated why there are socioeconomic differences in takeaway food consumption.
The aims of this study were to: examine takeaway food consumption and the types of takeaway food consumed (healthy and less healthy) by different socioeconomic groups; determine whether takeaway food consumption patterns explain socioeconomic variations in fruit and vegetable intake; and investigate the role of a range of psychosocial factors in explaining the association between SEP and takeaway food consumption and the choice of takeaway food. Methods This study used two cross-sectional population-based datasets: 1) the 1995 Australian National Nutrition Survey (NNS), conducted among a nationally representative sample of adults aged 25–64 years (N = 7319, 61% response rate); and 2) the Food and Lifestyle Survey (FLS), conducted by the candidate among randomly selected adults aged 25–64 years residing in Brisbane, Australia in 2009 (N = 903, 64% response rate). The FLS extended the NNS in several ways: by describing current socioeconomic differences in takeaway food consumption patterns, by formally assessing the mediating effect of takeaway food consumption on socioeconomic inequalities in fruit and vegetable intake, and by investigating whether (and which) psychosocial factors contributed to the observed socioeconomic variations in takeaway food consumption patterns. Results Approximately 32% of the NNS participants had consumed takeaway food in the previous 24 hours, and 38% of the FLS participants reported consuming takeaway food once a week or more. The results from analyses of the NNS and the FLS were somewhat mixed; however, disadvantaged groups were more likely to consume high levels of 'less healthy' takeaway food than their more advantaged counterparts. The lower fruit and vegetable intake among lower socioeconomic groups was partly mediated by their high consumption of 'less healthy' takeaway food.
Lower socioeconomic groups were more likely to have negative meal preparation behaviours and attitudes, and weaker health and nutrition-related beliefs and knowledge. Socioeconomic differences in takeaway food consumption were partly explained by meal preparation behaviours and attitudes, and these factors along with health and nutrition-related beliefs and knowledge appeared to contribute to the socioeconomic variations in choice of takeaway foods. Conclusion This thesis enhances our understanding of socioeconomic differences in dietary behaviours and the potential pathways by describing takeaway food consumption patterns by SEP, explaining the role of takeaway food consumption in socioeconomic inequalities in fruit and vegetable intake, and identifying the potential impact of psychosocial factors on socioeconomic differences in takeaway food consumption and the choice of takeaway food. Some important evidence is also provided for developing policies and effective intervention programs to improve the diet quality of the population, especially among lower socioeconomic groups. This thesis concludes with a discussion of a number of recommendations about future research and strategies to improve the dietary intake of the whole population, and especially among disadvantaged groups.
Mixed methods research approach to the development and review of competency standards for dietitians
Abstract:
Aim: Competency standards support a range of professional activities, including the accreditation of university courses. Reviewing these standards is essential to ensure universities continue to produce well equipped graduates who can meet the challenge of changing workforce requirements. This paper has two aims: a) to provide an overview of the methodological approaches utilised for the compilation and review of the Competency Standards for Dietetics, and b) to evaluate the Dietitians Association of Australia's Competency Standards and capture emerging and contemporary dietetic practice. Methods: A literature review of the methods used to develop Competency Standards for dietitians in Australia, including entry-level, advanced-level and DAA Fellow competencies and other specific areas of competency such as public health nutrition and nutrition education, is outlined and compared with other allied health professions. The mixed methods methodology used in the most recent review is described in more detail. Results: The history of Dietetic Competency Standards development and review in Australia is compared with dietetic Competency Standards internationally and within other health professions in Australia. The political context in which these standards have been developed in Australia, and which has determined their format, is also discussed. The results of the most recent Competency Standards review are reported to highlight emerging practice in Australia. Conclusion: The mixed methods approach used in this review provides rich data about contemporary dietetic practice. Our view supports a planned review of all Competency Standards to ensure practice informs education and credentialling, and we recommend the Dietitians Association of Australia consider this in future.
Abstract:
Context: Anti-Müllerian hormone (AMH) concentration reflects ovarian aging and is argued to be a useful predictor of age at menopause (AMP). It is hypothesized that AMH falling below a critical threshold corresponds to follicle depletion, which results in menopause. With this threshold, theoretical predictions of AMP can be made. Comparisons of such predictions with observed AMP from population studies support the role for AMH as a forecaster of menopause. Objective: The objective of the study was to investigate whether previous relationships between AMH and AMP are valid using a much larger data set. Setting: AMH was measured in 27 563 women attending fertility clinics. Study Design: From these data a model of age-related AMH change was constructed using a robust regression analysis. Data on AMP from subfertile women were obtained from the population-based Prospect-European Prospective Investigation into Cancer and Nutrition (Prospect-EPIC) cohort (n = 2249). By constructing a probability distribution of age at which AMH falls below a critical threshold and fitting this to Prospect-EPIC menopausal age data using maximum likelihood, such a threshold was estimated. Main Outcome: The main outcome was conformity between observed and predicted AMP. Results: To get a distribution of AMH-predicted AMP that fit the Prospect-EPIC data, we found the critical AMH threshold should vary among women in such a way that women with low age-specific AMH would have lower thresholds, whereas women with high age-specific AMH would have higher thresholds (mean 0.075 ng/mL; interquartile range 0.038–0.15 ng/mL). Such a varying AMH threshold for menopause is a novel and biologically plausible finding. AMH became undetectable (<0.2 ng/mL) approximately 5 years before the occurrence of menopause, in line with a previous report.
Conclusions: The conformity of the observed and predicted distributions of AMP supports the hypothesis that declining population averages of AMH are associated with menopause, making AMH an excellent candidate biomarker for AMP prediction. Further research will help establish the accuracy of AMH levels to predict AMP within individuals.
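The study's prediction logic, an AMH trajectory crossing a critical threshold, can be sketched under a simplifying assumption of exponential AMH decline. The study itself fitted a robust regression model of age-related AMH change; the decline rate and input values below are illustrative assumptions only:

```python
import math

def predicted_amp(amh_now, age_now, threshold, decline_rate):
    """Age at which AMH falls below `threshold`, assuming exponential decline:
    AMH(age) = amh_now * exp(-decline_rate * (age - age_now)).
    Setting AMH(amp) = threshold and solving for amp gives the prediction."""
    return age_now + math.log(amh_now / threshold) / decline_rate

# Illustrative inputs: a 35-year-old with AMH 1.5 ng/mL, using the reported
# mean threshold of 0.075 ng/mL and an assumed 15%/year decline rate
print(round(predicted_amp(1.5, 35.0, 0.075, 0.15), 1))
```

Note that in the study the threshold itself varied with a woman's age-specific AMH rather than being fixed, which this fixed-threshold sketch does not capture.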
Abstract:
Dear Editor We thank Dr Klek for his interest in our article and for giving us the opportunity to clarify our study and share our thoughts. Our study examined the prevalence of malnutrition in an acute tertiary hospital and tracked the outcomes prospectively.1 There are a number of reasons why we chose Subjective Global Assessment (SGA) to determine the nutritional status of patients. Firstly, we took the view that nutrition assessment tools should be used to determine nutrition status and diagnose the presence and severity of malnutrition, whereas the purpose of nutrition screening tools is to identify individuals who are at risk of malnutrition. Nutritional assessment, rather than screening, should be used as the basis for planning and evaluating nutrition interventions for those diagnosed with malnutrition. Secondly, SGA has been well accepted and validated as an assessment tool to diagnose the presence and severity of malnutrition in clinical practice.2, 3 It has been used in many studies as a valid prognostic indicator of a range of nutritional and clinical outcomes.4, 5, 6 On the other hand, the Malnutrition Universal Screening Tool (MUST)7 and Nutrition Risk Screening 2002 (NRS 2002)8 have been established as screening rather than assessment tools.
Abstract:
Alterations in cognitive function are characteristic of the aging process in humans and other animals. However, the nature of these age-related changes in cognition is complex and is likely to be influenced by interactions between genetic predispositions and environmental factors, resulting in dynamic fluctuations within and between individuals. These inter- and intra-individual fluctuations are evident both in so-called normal cognitive aging and at the onset of cognitive pathology. Mild Cognitive Impairment (MCI), thought to be a prodromal phase of dementia, represents perhaps the final opportunity to mitigate cognitive declines that may lead to terminal conditions such as dementia. The prognosis for people with MCI is mixed, with evidence suggesting that within 10 years of diagnosis many will remain stable, many will improve, and many will transition to dementia. If the characteristics of people who do not progress from MCI to dementia can be identified and replicated in others, it may be possible to reduce or delay dementia onset, thus reducing a growing personal and public health burden. Furthermore, if MCI onset can be prevented or delayed, the burden of cognitive decline in aging populations worldwide may be reduced. A cognitive domain that is sensitive to the effects of advancing age, and declines in which have been shown to presage the onset of dementia in MCI patients, is executive function. Moreover, environmental factors such as diet and physical activity have been shown to affect performance on tests of executive function. For example, improvements in executive function have been demonstrated as a result of increased aerobic and anaerobic physical activity and, although the evidence is not as strong, findings from dietary interventions suggest certain nutrients may preserve or improve executive functions in old age. These encouraging findings have been demonstrated in older adults with MCI and in their non-impaired peers.
However, there are some gaps in the literature that need to be addressed. For example, little is known about the effect on cognition of an interaction between diet and physical activity. Both are important contributors to health and wellbeing, and a growing body of evidence attests to their importance in the mental and cognitive health of aging individuals. Yet physical activity and diet are rarely considered together in the context of cognitive function. Little is also known about potential underlying biological mechanisms that might explain the physical activity/diet/cognition relationship. The first aim of this program of research was to examine the individual and interactive roles of physical activity and diet, specifically long chain n-3 polyunsaturated fatty acid (LCn3) consumption, as predictors of MCI status. The second aim was to examine executive function in MCI in the context of the individual and interactive effects of physical activity and LCn3. A third aim was to explore the role of immune and endocrine system biomarkers as possible mediators in the relationship between LCn3, physical activity and cognition. Study 1a was a cross-sectional analysis of MCI status as a function of the interaction between physical activity and erythrocyte proportions of LCn3. The marine-based LCn3s eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) have both received support in the literature as having cognitive benefits, although comparisons of the relative benefits of EPA and DHA, particularly in relation to the aetiology of MCI, are rare. Furthermore, a limited amount of research has examined the cognitive benefits of physical activity in terms of MCI onset. No studies have examined the potential interactive benefits of physical activity and either EPA or DHA. Eighty-four male and female adults aged 65 to 87 years, 50 with MCI and 34 without, participated in Study 1a.
A logistic binary regression was conducted with MCI status as a dependent variable, and the individual and interactive relationships between physical activity and either EPA or DHA as predictors. Physical activity was measured using a questionnaire and specific physical activity categories were weighted according to the metabolic equivalents (METs) of each activity to create a physical activity intensity index (PAI). A significant relationship was identified between MCI outcome and the interaction between the PAI and EPA; participants with a higher PAI and higher erythrocyte proportions of EPA were more likely to be classified as non-MCI than their less active peers with less EPA. Study 1b was a randomised control trial using the participants from Study 1a who were identified with MCI. Given the importance of executive function as a determinant of progression to more severe forms of cognitive impairment and dementia, Study 1b aimed to examine the individual and interactive effect of physical activity and supplementation with either EPA or DHA on executive function in a sample of older adults with MCI. Fifty male and female participants were randomly allocated to supplementation groups to receive 6-months of supplementation with EPA, or DHA, or linoleic acid (LA), a long chain polyunsaturated omega-6 fatty acid not known for its cognitive enhancing properties. Physical activity was measured using the PAI from Study 1a at baseline and follow-up. Executive function was measured using five tests thought to measure different executive function domains. Erythrocyte proportions of EPA and DHA were higher at follow-up; however, PAI was not significantly different. There was also a significant improvement in three of the five executive function tests at follow-up. However, regression analyses revealed that none of the variance in executive function at follow-up was predicted by EPA, DHA, PAI, the EPA by PAI interaction, or the DHA by PAI interaction. 
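The construction of the MET-weighted PAI and the interaction model in Study 1a can be sketched as follows. The MET weights, model coefficients, and inputs below are hypothetical placeholders (the abstract reports neither the questionnaire's weights nor fitted coefficients); the code illustrates only the model form:

```python
import math

# MET weights per activity category (illustrative values only; the study's
# questionnaire-specific weights are not reported in the abstract)
MET_WEIGHTS = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def pai(hours_per_week: dict) -> float:
    """Physical activity intensity index: MET-weighted activity hours per week."""
    return sum(MET_WEIGHTS[activity] * h for activity, h in hours_per_week.items())

def p_non_mci(pai_score: float, epa_pct: float,
              b0: float = -2.0, b_inter: float = 0.01) -> float:
    """Predicted probability of non-MCI from a logistic model containing a
    PAI x EPA interaction term. Coefficients are hypothetical, chosen only to
    illustrate the reported direction: higher PAI x EPA, higher odds of non-MCI."""
    logit = b0 + b_inter * pai_score * epa_pct
    return 1.0 / (1.0 + math.exp(-logit))

weekly = {"walking": 5, "moderate": 3, "vigorous": 1}
score = pai(weekly)  # 3.3*5 + 4.0*3 + 8.0*1 = 36.5 MET-hours/week
print(round(score, 1))
print(round(p_non_mci(score, epa_pct=1.2), 3))
```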
The absence of an effect may be due to a small sample resulting in limited power to find an effect, the lack of change in physical activity over time in terms of volume and/or intensity, or a combination of both reduced power and no change in physical activity. Study 2a was a cross-sectional study using cognitively unimpaired older adults to examine the individual and interactive effects of LCn3 and the PAI on executive function. Several possible explanations for the absence of an effect were identified. From this consideration of alternative explanations it was hypothesised that post-onset interventions with LCn3, either alone or in interaction with self-reported physical activity, may not be beneficial in MCI. Thus executive function responses to the individual and interactive effects of physical activity and LCn3 were examined in a sample of older male and female adults without cognitive impairment (n = 50). A further aim of Study 2a was to operationalise executive function using principal components analysis (PCA) of several executive function tests. This approach was used firstly as a data reduction technique to overcome the task impurity problem, and secondly to examine the executive function structure of the sample for evidence of de-differentiation. Two executive function components were identified as a result of the PCA (EF 1 and EF 2). However, EPA, DHA, the PAI, and the EPA by PAI and DHA by PAI interactions did not account for any variance in the executive function components in subsequent hierarchical multiple regressions. Study 2b was an exploratory correlational study designed to explore the possibility that immune and endocrine system biomarkers may act as mediators of the relationship between LCn3, the PAI, the interaction between LCn3 and the PAI, and executive functions.
Insulin-like growth factor-1 (IGF-1), an endocrine system growth hormone, and interleukin-6 (IL-6) an immune system cytokine involved in the acute inflammatory response, have both been shown to affect cognition including executive functions. Moreover, IGF-1 and IL-6 have been shown to be antithetical in so far as chronically increased IL-6 has been associated with reduced IGF-1 levels, a relationship that has been linked to age related morbidity. Further, physical activity and LCn3 have been shown to modulate levels of both IGF-1 and IL-6. Thus, it is possible that the cognitive enhancing effects of LCn3, physical activity or their interaction are mediated by changes in the balance between IL-6 and IGF-1. Partial and non-parametric correlations were conducted in a subsample of participants from Study 2a (n = 13) to explore these relationships. Correlations of interest did not reach significance; however, the coefficients were quite large for several relationships suggesting studies with larger samples may be warranted. In summary, the current program of research found some evidence supporting an interaction between EPA, not DHA, and higher energy expenditure via physical activity in differentiating between older adults with and without MCI. However, a RCT examining executive function in older adults with MCI found no support for increasing EPA or DHA while maintaining current levels of energy expenditure. Furthermore, a cross-sectional study examining executive function in older adults without MCI found no support for better executive function performance as a function of increased EPA or DHA consumption, greater energy expenditure via physical activity or an interaction between physical activity and either EPA or DHA. Finally, an examination of endocrine and immune system biomarkers revealed promising relationships in terms of executive function in non-MCI older adults particularly with respect to LCn3 and physical activity. 
Taken together, these findings demonstrate a potential benefit of increasing physical activity and LCn3 consumption, particularly EPA, in mitigating the risk of developing MCI. In contrast, no support was found for a benefit to executive function as a result of increased physical activity, LCn3 consumption or an interaction between physical activity and LCn3, in participants with and without MCI. These results are discussed with reference to previous findings in the literature including possible limitations and opportunities for future research.
Abstract:
Three native freshwater crayfish (Cherax) species are farmed in Australia: redclaw (Cherax quadricarinatus), marron (C. tenuimanus), and yabby (C. destructor). A lack of data on the specific nutrient requirements of each species has constrained the development of species-specific formulated diets; the current use of over-formulated feeds or expensive marine shrimp feeds therefore limits profitability. A number of studies have investigated nutritional requirements in redclaw, focusing on replacing expensive fish meal in formulated feeds with less expensive non-protein substitutes, including plant based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates such as cellulose as dietary nutrient sources. To date, studies have been limited to C. quadricarinatus and C. destructor, and no studies have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late-juveniles of each species were conducted separately in an automated recirculating culture system over 12 week cycles. Animals were fed either a test diet (TD) containing 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant, optimum levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p < 0.05) specific growth rate (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different (p > 0.05).
Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activity in all three species was significantly higher in animals fed the RD (Table 1). These results indicate that all three species utilize starch better than dietary soluble cellulose, and that inclusion of 20% soluble cellulose does not appear to have a significant negative effect on growth rate; survival, however, was reduced in C. quadricarinatus but not in C. tenuimanus or C. destructor.
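Specific growth rate (SGR), the outcome compared between diets above, is conventionally calculated in aquaculture as 100 × (ln W_final − ln W_initial) / days. A sketch of that standard formula with hypothetical weights (not trial data):

```python
import math

def specific_growth_rate(w_initial_g: float, w_final_g: float, days: int) -> float:
    """Specific growth rate in % body weight per day, using the standard
    aquaculture formula: 100 * (ln Wf - ln Wi) / t."""
    return 100.0 * (math.log(w_final_g) - math.log(w_initial_g)) / days

# Hypothetical juvenile crayfish growing from 5 g to 12 g over a 12-week (84-day) trial
print(round(specific_growth_rate(5.0, 12.0, 84), 3))
```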
Abstract:
Background: Gestational diabetes mellitus (GDM) is increasing, along with obesity and type 2 diabetes (T2DM), with Aboriginal and Torres Strait Islander people* in Australia particularly affected. GDM causes serious complications in pregnancy, birth, and the longer term, for both women and their infants. Women diagnosed with GDM have an eightfold risk of developing T2DM after pregnancy, compared with women who have not had GDM. Indigenous women have an even higher risk, at a younger age, and progress more quickly from GDM to T2DM, compared with non-Indigenous women. If left undetected and untreated, T2DM can lead to heart disease, stroke, renal disease, kidney failure, amputations and blindness. A GDM diagnosis offers a ‘window of opportunity’ for diabetes health interventions, and it is vital that acceptable and effective prevention, treatment, and post-pregnancy care are provided. Low rates of post-pregnancy screening for T2DM have been reported among non-Aboriginal women in Australia and among Indigenous women in other countries; however, data for Aboriginal women are scarce. Breastfeeding, a healthy diet, and exercise can also help to prevent T2DM, and together with T2DM screening are recommended elements of ‘post-pregnancy care’ for women with GDM. This paper describes the methods for a data linkage study to investigate rates of post-pregnancy care among women with GDM. Methods/Design: This retrospective cohort includes all women who gave birth at Cairns Base Hospital in Far North Queensland, Australia, from 2004 to 2010, coded as having GDM in the Cairns Base Hospital Clinical Coding system. Data linkage is being conducted with the Queensland Perinatal Data Collection and three laboratories. Hospital medical records are being reviewed to validate the accuracy of GDM case ascertainment, and to gather information on breastfeeding and the provision of dietary advice.
Multiple logistic regression is being used to compare post-pregnancy care between Aboriginal and non-Aboriginal women, while adjusting for other factors that may impact on post-pregnancy care. Survival analysis is being used to estimate rates of progression from GDM to T2DM. Discussion: There are challenges to collecting post-pregnancy data for women with GDM. However, research is urgently needed to ensure adequate post-pregnancy care is provided for women with GDM in Australia.
Abstract:
This article examines the recent emergence of cookbooks written for Aboriginal and Torres Strait Islander people in Australia. The cookbooks are health promotion initiatives, developed through a desire to improve the health status of Indigenous Australians. They focus on nutritious, family meals that can be cooked on a low budget. In this article, the authors argue that the cookbooks designed for Aboriginal and Torres Strait Islander people are developed within a Western paradigm of health and nutrition that subtly reinforces Western approaches to food and disregards traditional diets. While the authors recognize the value of the cookbooks as health promotion tools, they suggest that cookbooks centred around Indigenous foodways – with a focus on traditional ingredients and traditional cooking methods – may be more appropriate for improving the health of Indigenous people and helping Indigenous cultures to thrive. They advocate for a decolonizing approach to food and nutrition, that specifically promotes Indigenous traditions and culture, and incorporates traditional foodways into modern recipes.