675 results for Healthy Eating Index
Provincial mortality in South Africa, 2000 - priority-setting for now and a benchmark for the future
Abstract:
Background. Cause-of-death statistics are an essential component of health information. Despite improvements, underregistration and misclassification of causes make it difficult to interpret the official death statistics. Objective. To estimate consistent cause-specific death rates for the year 2000 and to identify the leading causes of death and premature mortality in the provinces. Methods. The total number of deaths and population size were estimated using the Actuarial Society of South Africa (ASSA2000) AIDS and demographic model. Cause-of-death profiles based on Statistics South Africa's 15% sample, adjusted for misclassification of deaths due to ill-defined causes and AIDS deaths due to indicator conditions, were applied to the total deaths by age and sex. Age-standardised rates and years of life lost were calculated using age weighting and discounting. Results. Life expectancy in KwaZulu-Natal and Mpumalanga is about 10 years lower than that in the Western Cape, the province with the lowest mortality rate. HIV/AIDS is the leading cause of premature mortality for all provinces. Mortality due to pre-transitional causes, such as diarrhoea, is more pronounced in the poorer and more rural provinces. In contrast, non-communicable disease mortality is similar across all provinces, although the cause profiles differ. Injury mortality rates are particularly high in provinces with large metropolitan areas and in Mpumalanga. Conclusion. The quadruple burden experienced in all provinces requires a broad range of interventions, including improved access to health care; ensuring that basic needs such as those related to water and sanitation are met; disease and injury prevention; and promotion of a healthy lifestyle. High death rates as a result of HIV/AIDS highlight the urgent need to accelerate the implementation of the treatment and prevention plan. In addition, there is an urgent need to improve the cause-of-death data system to provide reliable cause-of-death statistics at health district level.
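The abstract does not state the formula used; years of life lost with age weighting and discounting are conventionally computed with the Global Burden of Disease formulation sketched below, where a is the age at death, L the standard life expectancy at that age, r the discount rate and K, C, beta the age-weighting parameters. The usual parameter values (r = 0.03, beta = 0.04, C = 0.1658, K = 1) are an assumption here, not confirmed by the study.

\[
\mathrm{YLL}(a,L) = \frac{KCe^{ra}}{(r+\beta)^{2}}\Big[e^{-(r+\beta)(L+a)}\big(-(r+\beta)(L+a)-1\big) - e^{-(r+\beta)a}\big(-(r+\beta)a-1\big)\Big] + \frac{1-K}{r}\big(1-e^{-rL}\big)
\]

Setting K = 0 removes the age weights, leaving simple discounting of the remaining life expectancy, YLL = (1/r)(1 - e^{-rL}).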
Abstract:
Background Unlike leisure time physical activity, knowledge of the socioeconomic determinants of active transport is limited, research on this topic has produced mixed and inconsistent findings, and it remains unknown whether people's engagement in active transport declines as they age. This longitudinal study examined relationships between neighbourhood disadvantage, individual-level socioeconomic position and walking for transport (WfT) during mid- and early old-age (40-70 years). Three questions were addressed: (i) which socioeconomic groups walk for transport, (ii) does the amount of walking change over time as people age, and (iii) is the change socioeconomically patterned? Methods The data come from the HABITAT study of physical activity, a biennial multilevel longitudinal survey of 11,036 residents of 200 neighbourhoods in Brisbane, Australia. At each wave (2007, 2009 and 2011) respondents estimated the duration (minutes) of WfT in the previous 7 days. Neighbourhood disadvantage was measured using a census-derived index comprising 17 different socioeconomic components, and individual-level socioeconomic position was measured using education, occupation, and household income. The data were analysed using multilevel mixed-effects logistic and linear regression. Results The odds of being defined as a ‘never walker’ were significantly lower for residents of disadvantaged neighbourhoods, but significantly higher for the less educated, blue-collar employees, and members of lower income households. WfT declined significantly over time as people aged and the declines were more precipitous for older persons. Average minutes of WfT declined for all neighbourhoods and most socioeconomic groups; however, the declines were steeper for the retired and members of low income households. Conclusions Designing age-friendly neighbourhoods might slow or delay age-related declines in WfT and should be a priority. Steeper declines in WfT among residents of low income households may reflect their poorer health status and the impact of adverse socioeconomic exposures over the life course. Each of these declines represents a significant challenge to public health advocates, urban designers, and planners in their attempts to keep people active and healthy in their later years of life.
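As a rough illustration of the modelling approach described above (multilevel regression of WfT on socioeconomic position, with respondents nested in neighbourhoods), a minimal sketch using statsmodels is given below. The file name, column names and model specification are hypothetical placeholders, not the study's actual analysis.

# Minimal sketch of a multilevel (mixed-effects) linear model for minutes of
# walking for transport; respondents are nested in neighbourhoods.
# All variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("habitat_long.csv")  # hypothetical long format: one row per respondent per wave

# Random intercept for neighbourhood; the wave-by-income interaction asks
# whether the age-related decline in WfT is socioeconomically patterned.
model = smf.mixedlm(
    "wft_minutes ~ wave * household_income + education + occupation + neighbourhood_disadvantage + age",
    data=df,
    groups=df["neighbourhood_id"],
)
print(model.fit().summary())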
Abstract:
Objectives To estimate the burden of disease attributed to low fruit and vegetable intake by sex and age group in South Africa for the year 2000. Design The analysis follows the World Health Organization comparative risk assessment (CRA) methodology. Population-attributable fractions were calculated from South African prevalence data from dietary surveys and applied to the revised South African burden of disease estimates for 2000. A theoretical maximum distribution of 600 g per day for fruit and vegetable intake was chosen. Monte Carlo simulation-modelling techniques were used for uncertainty analysis. Setting South Africa. Subjects Adults ≥ 15 years. Outcome measures Mortality and disability-adjusted life years (DALYs), from ischaemic heart disease, ischaemic stroke, lung cancer, gastric cancer, colorectal cancer and oesophageal cancer. Results Low fruit and vegetable intake accounted for 3.2% of total deaths and 1.1% of the 16.2 million attributable DALYs. For both males and females the largest proportion of total years of healthy life lost attributed to low fruit and vegetable intake was for ischaemic heart disease (60.6% and 52.2%, respectively). Ischaemic stroke accounted for 17.8% of attributable DALYs for males and 32.7% for females. For the related cancers, the leading attributable DALYs for men and women were oesophageal cancer (9.8% and 7.0%, respectively) and lung cancer (7.8% and 4.7%, respectively). Conclusions A high intake of fruit and vegetables can make a significant contribution to decreasing mortality from certain diseases. The challenge lies in creating the environment that facilitates changes in dietary habits such as the increased intake of fruit and vegetables.
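The comparative risk assessment above rests on population-attributable (potential impact) fractions combined with Monte Carlo uncertainty analysis. The sketch below is a deliberately simplified illustration of that logic: it treats the whole population as sitting at a single mean intake and uses placeholder relative risks and intakes, not the study's inputs; only the 16.2 million total DALYs figure comes from the abstract.

# Simplified sketch of a potential impact fraction (PIF) for low fruit and
# vegetable intake with a Monte Carlo uncertainty interval. All inputs other
# than total DALYs are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def pif(mean_intake_g, rr_per_80g, theoretical_max_g=600.0):
    """PIF for shifting mean intake up to the theoretical-minimum-risk
    level (600 g/day), assuming a log-linear risk relationship."""
    shortfall = max(theoretical_max_g - mean_intake_g, 0.0)
    rr_current = rr_per_80g ** (shortfall / 80.0)  # relative risk at current intake
    return (rr_current - 1.0) / rr_current

# Monte Carlo propagation of uncertainty in the relative risk and mean intake.
n = 10_000
rr_draws = rng.lognormal(mean=np.log(1.3), sigma=0.05, size=n)   # RR per 80 g/day deficit (placeholder)
intake_draws = rng.normal(loc=250.0, scale=30.0, size=n)          # mean intake, g/day (placeholder)
total_dalys = 16_200_000                                           # total DALYs, from the abstract

attributable = np.array([pif(i, rr) for i, rr in zip(intake_draws, rr_draws)]) * total_dalys
print(np.percentile(attributable, [2.5, 50, 97.5]))  # attributable DALYs with 95% uncertainty interval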
Abstract:
There is considerable evidence for the efficacy of physical activity, diet and weight loss interventions in improving health outcomes for cancer survivors, but limited uptake into practice. Healthy Living after Cancer (HLaC) is an evidence-based, telephone-delivered lifestyle intervention targeting cancer survivors. This paper describes the translation of HLaC into practice in partnership with Australian state-based Cancer Councils.
Abstract:
Parental controlling feeding practices have been directly associated with maladaptive child eating behaviours, such as Eating in the Absence of Hunger (EAH). The aim of this study was to examine EAH in very young children (3-4 years old) and to investigate the association between maternal controlling feeding practices and energy intake from a standardised selection of snacks consumed ‘in the absence of hunger’. Thirty-seven mother-child dyads enrolled in the NOURISH RCT participated in a modified EAH protocol conducted in the child’s home. All children displayed EAH, despite 80% reporting being full or very full after completing lunch 15 minutes earlier. The relationships between maternal and child covariates, controlling feeding practices, and EAH were examined using non-parametric tests, stratified by child gender. For boys only, pressure to eat was positively associated with EAH. Neither restriction nor monitoring practices were associated with EAH in either boys or girls. Overall, the present findings suggest that gender differences in the relationship between maternal feeding practices and children’s eating behaviours emerge early and should be considered in future research and intervention design.
Abstract:
Abstract Background The purpose of this study was the development of a valid and reliable “Mechanical and Inflammatory Low Back Pain Index” (MIL) for the assessment of non-specific low back pain (NSLBP). This 7-item tool assists practitioners in determining whether symptoms are predominantly mechanical or inflammatory. Methods Participants (n = 170, 96 females, age = 38 ± 14 years) with NSLBP were referred to two Spanish physiotherapy clinics and completed the MIL and the following measures: the Roland Morris Questionnaire (RMQ), SF-12 and “Backache Index” (BAI) physical assessment test. For test-retest reliability, 37 consecutive patients were assessed at baseline and three days later during a non-treatment period. Face and content validity, practical characteristics, factor analysis, internal consistency, discriminant validity and convergent validity were assessed from the full sample. Results A total of 27 potential items that had been identified for inclusion were subsequently reduced to 11 by an expert panel. Four items were then removed due to cross-loading under confirmatory factor analysis, where a two-factor model yielded a good fit to the data (χ2 = 14.80, df = 13, p = 0.37, CFI = 0.98, and RMSEA = 0.029). The internal consistency was moderate (α = 0.68 for MLBP; 0.72 for ILBP), test-retest reliability high (ICC = 0.91; 95%CI = 0.88-0.93) and discriminant validity good for both MLBP (AUC = 0.74) and ILBP (AUC = 0.92). Convergent validity was demonstrated through similar but weak correlations between the ILBP and both the RMQ and BAI (r = 0.34, p < 0.001) and between the MLBP and BAI (r = 0.38, p < 0.001). Conclusions The MIL is a valid and reliable clinical tool for patients with NSLBP that discriminates between mechanical and inflammatory LBP. Keywords: Low back pain; Psychometric properties; Pain measurement; Screening tool; Inflammatory; Mechanical
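Several of the reliability figures above (e.g. α = 0.68 and 0.72) are internal-consistency coefficients. A minimal sketch of how Cronbach's alpha is computed from item-level responses is shown below; the data are random placeholders, not the study's responses.

# Cronbach's alpha from item-level responses
# (rows = participants, columns = questionnaire items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, shape (n_participants, n_items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
demo = rng.integers(0, 5, size=(170, 7)).astype(float)  # 170 respondents, 7 items (placeholder data)
print(round(cronbach_alpha(demo), 2))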
Abstract:
Background Symptom burden in chronic kidney disease (CKD) is poorly understood. To date, the majority of research focuses on single symptoms and there is a lack of suitable multidimensional symptom measures. The purpose of this study was to modify, translate, cross-culturally adapt and psychometrically analyse the Dialysis Symptom Index (DSI). Methods The study methods involved four phases: modification, translation, pilot-testing with a bilingual non-CKD sample and then psychometric testing with the target population. Content validity was assessed using an expert panel. Inter-rater agreement, test-retest reliability and Cronbach’s alpha coefficient were calculated to demonstrate reliability of the modified DSI. Discriminative and convergent validity were assessed to demonstrate construct validity. Results The content validity index during translation was 0.98. In the pilot study with 25 bilingual students, moderate to perfect agreement (Kappa statistic = 0.60-1.00) was found between the English and Arabic versions of the modified DSI. The main study recruited 433 patients with CKD stages 4 and 5. The modified DSI was able to discriminate between non-dialysis and dialysis groups (p < 0.001) and demonstrated convergent validity with domains of the Kidney Disease Quality of Life short form. Excellent test-retest reliability and internal consistency (Cronbach’s α = 0.91) were also demonstrated. Conclusion The Arabic version of the modified DSI demonstrated good psychometric properties, measures the multidimensional nature of symptoms and can be used to assess symptom burden at different stages of CKD. The modified instrument, renamed the CKD Symptom Burden Index (CKD-SBI), should encourage greater clinical and research attention to symptom burden in CKD.
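The English-Arabic agreement reported above is summarised with kappa statistics. A minimal item-level sketch using scikit-learn is given below; the responses are placeholder values, not the pilot-study data.

# Agreement between English and Arabic administrations of a single item,
# summarised with Cohen's kappa (placeholder responses).
from sklearn.metrics import cohen_kappa_score

english_responses = [0, 1, 1, 2, 0, 3, 1, 2, 2, 0]  # one item, 10 bilingual respondents (placeholder)
arabic_responses  = [0, 1, 1, 2, 0, 3, 1, 1, 2, 0]

print(cohen_kappa_score(english_responses, arabic_responses))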
Abstract:
Objective Explosive ordnance disposal (EOD) often requires technicians to wear multiple protective garments in challenging environmental conditions. The cumulative effect of increased metabolic cost coupled with decreased heat dissipation associated with these garments predisposes technicians to high levels of physiological strain. It has been proposed that a perceptual strain index (PeSI), using subjective ratings of thermal sensation and perceived exertion as surrogate measures of core body temperature and heart rate, may provide an accurate estimation of physiological strain. Therefore, this study aimed to determine whether the PeSI could estimate the physiological strain index (PSI) across a range of metabolic workloads and environments while wearing heavy EOD and chemical protective clothing. Methods Eleven healthy males wore an EOD and chemical protective ensemble while walking on a treadmill at 2.5, 4 and 5.5 km·h⁻¹ at 1% grade in environmental conditions equivalent to wet bulb globe temperatures (WBGT) of 21, 30 and 37 °C. WBGT conditions were randomly presented and a maximum of three randomised treadmill walking trials were completed in a single testing day. Trials ceased after a maximum of 60 min or on attainment of termination criteria. Pearson's correlation coefficient, a mixed linear model, absolute agreement and receiver operating characteristic (ROC) curves were used to determine the relationship between the PeSI and PSI. Results A significant moderate relationship between the PeSI and the PSI was observed [r = 0.77; p < 0.001; mean difference = 0.8 ± 1.1 a.u. (modified 95% limits of agreement −1.3 to 3.0)]. The ROC curves indicated that the PeSI had good predictive power when used with two single-threshold cut-offs to differentiate between low and high levels of physiological strain (area under the curve: PSI three cut-off = 0.936 and seven cut-off = 0.841). Conclusions These findings support the use of the PeSI for monitoring physiological strain while wearing EOD and chemical protective clothing. However, future research is needed to confirm the validity of the PeSI for active EOD technicians operating in the field.
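The abstract does not reproduce the PSI formula. The widely cited Moran et al. formulation is sketched below on the assumption, not confirmed here, that a similar form was used; inputs are core temperature (°C) and heart rate (bpm) at baseline and at time t.

# Physiological strain index (Moran-style formulation, assumed) on a 0-10 scale
# combining thermal and cardiovascular strain.
def physiological_strain_index(tc0, tc_t, hr0, hr_t):
    thermal = 5.0 * (tc_t - tc0) / (39.5 - tc0)
    cardiovascular = 5.0 * (hr_t - hr0) / (180.0 - hr0)
    return thermal + cardiovascular

# Example: core temperature rising from 37.0 to 38.4 degrees C and heart rate
# from 70 to 150 bpm gives a strain score of about 6.4.
print(round(physiological_strain_index(37.0, 38.4, 70, 150), 1))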
Numerical investigation of motion and deformation of a single red blood cell in a stenosed capillary
Abstract:
It is generally assumed that the influence of red blood cells (RBCs) is predominant in blood rheology. Healthy RBCs are highly deformable and can thus easily squeeze through the smallest capillaries, whose internal diameters are less than the cells' characteristic size. RBCs infected by malaria or affected by other diseases, on the other hand, are stiffer and less deformable, so it is harder for them to flow through the smallest capillaries. It is therefore very important to critically and realistically investigate the mechanical behavior of both healthy and infected RBCs, which is a current gap in knowledge. The motion and the steady-state deformed shape of an RBC depend on many factors, such as the geometrical parameters of the capillary through which blood flows, the membrane bending stiffness and the mean velocity of the blood flow. In this study, the motion and deformation of a single two-dimensional RBC in a stenosed capillary are explored using the smoothed particle hydrodynamics (SPH) method. An elastic spring network is used to model the RBC membrane, while the fluids inside and outside the RBC are treated as SPH particles. The effects of the RBC membrane stiffness (kb), the inlet pressure (P) and the geometrical parameters of the capillary on the motion and deformation of the RBC are studied. The deformation index, the RBC's mean velocity and the cell membrane energy are analysed as the cell passes through the stenosed capillary. The simulation results demonstrate that kb, P and the geometrical parameters of the capillary have a significant impact on the RBC's motion and deformation in the stenosed section.
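The abstract names a deformation index and a spring-network membrane energy without defining them; the sketch below uses common assumed forms (a deformation index based on the cell's streamwise length and transverse width, and a discrete cosine bending energy for a closed 2-D contour), which may differ from the paper's exact definitions.

# Deformation index and bending energy for a 2-D spring-network membrane,
# under assumed (not stated) definitions; geometry values are placeholders.
import numpy as np

def deformation_index(membrane_xy: np.ndarray) -> float:
    """DI = (L - W) / (L + W), with L and W the streamwise length and
    transverse width; membrane_xy has shape (n_nodes, 2)."""
    length = np.ptp(membrane_xy[:, 0])
    width = np.ptp(membrane_xy[:, 1])
    return (length - width) / (length + width)

def bending_energy(membrane_xy: np.ndarray, kb: float) -> float:
    """E_b = kb * sum(1 - cos(theta_i)), with theta_i the angle between
    neighbouring segments of a closed membrane contour."""
    seg = np.roll(membrane_xy, -1, axis=0) - membrane_xy        # edge vectors
    seg_next = np.roll(seg, -1, axis=0)
    cos_theta = (seg * seg_next).sum(axis=1) / (
        np.linalg.norm(seg, axis=1) * np.linalg.norm(seg_next, axis=1)
    )
    return kb * np.sum(1.0 - np.clip(cos_theta, -1.0, 1.0))

# Example: an elliptical membrane of 100 nodes (axes in micrometres, placeholder).
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
cell = np.column_stack([4.0 * np.cos(t), 2.0 * np.sin(t)])
print(deformation_index(cell), bending_energy(cell, kb=1e-19))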
Abstract:
Heavy metal pollution of sediments is a growing concern in most parts of the world, and numerous studies using a range of digestion methods and pollution indices to identify and estimate sediment contamination have been described in the literature. The current work provides a critical review of the more commonly used sediment digestion methods and identifies that weak acid digestion is more likely than other traditional digestion methods to provide guidance on elements that are likely to be bioavailable. This work also reviews common pollution indices and identifies the Nemerow Pollution Index as the most appropriate method for establishing overall sediment quality. Consequently, a modified pollution index that can lead to a more reliable understanding of whole-sediment quality is proposed. This modified pollution index is then tested against a number of existing studies and is demonstrated to give a reliable and rapid estimate of sediment contamination and quality.
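The abstract does not give the index formulas. The conventional Nemerow pollution index, sketched below, combines the mean and the maximum of the single-element indices PI_i = C_i / S_i (measured concentration over a background or guideline value); the modification proposed in the paper is not specified in the abstract and is not reproduced here. All concentrations and guideline values are placeholders.

# Conventional Nemerow pollution index for a sediment sample.
import numpy as np

def nemerow_index(concentrations: dict, guidelines: dict) -> float:
    pi = np.array([concentrations[m] / guidelines[m] for m in concentrations])
    return float(np.sqrt((pi.mean() ** 2 + pi.max() ** 2) / 2.0))

# Example with placeholder sediment concentrations and guideline values (mg/kg).
conc = {"Cd": 1.2, "Pb": 45.0, "Zn": 310.0, "Cu": 60.0}
guide = {"Cd": 0.8, "Pb": 50.0, "Zn": 200.0, "Cu": 65.0}
print(round(nemerow_index(conc, guide), 2))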
Abstract:
Background The Spine Functional Index (SFI) is a patient-reported outcome measure with sound clinimetric properties and clinical viability for the determination of whole-spine impairment. To date, no validated Turkish version is available. The purpose of this study was to cross-culturally adapt the SFI for Turkish-speaking patients (SFI-Tk) and to determine the psychometric properties of reliability, validity and factor structure in a Turkish population with spine musculoskeletal disorders. Methods The SFI English version was culturally adapted and translated into Turkish using a double forward and backward method according to established guidelines. Patients (n = 285, cervical = 129, lumbar = 151, cervical and lumbar region = 5, 73% female, age 45 ± 1) with spine musculoskeletal disorders completed the SFI-Tk at baseline and after a seven-day period for test-retest reliability. For criterion validity the Turkish version of the Functional Rating Index (FRI) was used, plus the Neck Disability Index (NDI) for cervical patients and the Oswestry Disability Index (ODI) for back patients. Additional psychometric properties were determined for internal consistency (Cronbach’s α), criterion validity and factor structure. Results There was a high degree of internal consistency (α = 0.85, item range 0.80-0.88) and test-retest reliability (r = 0.93, item range = 0.75-0.95). The factor analysis demonstrated a one-factor solution explaining 24.2% of total variance. Criterion validity with the ODI was high (r = 0.71, p < 0.001) while the FRI and NDI were fair (r = 0.52 and r = 0.58, respectively). The SFI-Tk showed no missing responses, with the ‘half-mark’ option used in 11.75% of total responses by 77.9% of participants. Measurement error determined from the SEM and MDC90 was 2.96% and 7.12%, respectively. Conclusions The SFI-Tk demonstrated a one-factor solution and is a reliable and valid instrument. The SFI-Tk consists of simple and easily understood wording and may be used to assess spine-region musculoskeletal disorders in Turkish-speaking patients.
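The measurement-error statistics quoted above are commonly defined as below (a standard formulation; the study's exact computation is not given in the abstract), where SD is the standard deviation of baseline scores, ICC the test-retest intraclass correlation coefficient, and z ≈ 1.65 corresponds to 90% confidence:

\[
\mathrm{SEM} = \mathrm{SD}\,\sqrt{1-\mathrm{ICC}}, \qquad
\mathrm{MDC}_{90} = 1.65 \times \sqrt{2} \times \mathrm{SEM}
\]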
Abstract:
Stress is implicated in the development and course of psychotic illness, but the factors that influence stress levels are not well understood. The aim of this study was to examine the impact of neuropsychological functioning and coping styles on perceived stress in people with first-episode psychosis (FEP) and healthy controls (HC). Thirty-four minimally treated FEP patients from the Early Psychosis Prevention and Intervention Centre, Melbourne, Australia, and 26 HC participants from a similar demographic area participated in the study. Participants completed a comprehensive neuropsychological test battery as well as the Coping Inventory for Stressful Situations (task-, emotion- and avoidance-focussed coping styles) and the Perceived Stress Scale (PSS). Linear regressions were used to determine the contribution of neuropsychological functioning and coping style to perceived stress in the two groups. In the FEP group, higher levels of emotion-focussed coping and lower levels of task-focussed coping were associated with elevated stress. Higher premorbid IQ and working memory were also associated with higher subjective stress. In the HC group, higher levels of emotion-focussed coping and, in contrast to the FEP group, lower premorbid IQ, working memory and executive functioning were associated with increased stress. Lower intellectual functioning may provide some protection against perceived stress in FEP.