963 results for Nutritional Physiological Phenomena
Abstract:
Does exercise promote weight loss? One of the key problems with studies assessing the efficacy of exercise as a method of weight management and obesity is that mean data are presented and the individual variability in response is overlooked. Recent data have highlighted the need to demonstrate and characterise the individual variability in response to exercise. Do people who exercise compensate for the increase in energy expenditure via compensatory increases in hunger and food intake? The authors address the physiological, psychological and behavioural factors potentially involved in the relationship between exercise and appetite, and identify the research questions that remain unanswered. A negative consequence of the phenomena of individual variability and compensatory responses has been the focus on those who lose little weight in response to exercise; this has been used unreasonably as evidence to suggest that exercise is a futile method of controlling weight and managing obesity. Most of the evidence suggests that exercise is useful for improving body composition and health. For example, when exercise-induced mean weight loss is <1.0 kg, significant improvements in aerobic capacity (+6.3 ml/kg/min), systolic (−6.00 mm Hg) and diastolic (−3.9 mm Hg) blood pressure, waist circumference (−3.7 cm) and positive mood still occur. However, people will vary in their responses to exercise; understanding and characterising this variability will help tailor weight loss strategies to suit individuals.
Abstract:
Background & Aims: Inadequate feeding assistance and mealtime interruptions during hospitalisation may contribute to malnutrition and poor nutritional intake in older people. This study aimed to implement and compare three interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. Methods: Pre-post study compared three mealtime assistance interventions: PM: Protected Mealtimes with multidisciplinary education; AIN: additional assistant-in-nursing (AIN) with dedicated meal role; PM+AIN: combined intervention. Dietary intake of 254 patients (pre: n=115, post: n=141; mean age 80±8) was visually estimated on a single day in the first week of hospitalisation and compared with estimated energy requirements. Assistance activities were observed and recorded. Results: Mealtime assistance levels significantly increased in all interventions (p<0.01). Post-intervention participants were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. Conclusions: Protected Mealtimes and additional AIN assistance (implemented alone or in combination) may produce modest improvements in nutritional intake. Targeted feeding assistance for certain patient groups holds promise; however, alternative strategies are required to address the complex problem of malnutrition in this population.
Abstract:
Aim: Maternal obesity is associated with increased risk of adverse outcomes for mothers and offspring. Strategies to better manage maternal obesity are urgently needed; however, there is little evidence to assist the development of nutrition interventions during antenatal care. The present study aimed to assess maternal weight gain and dietary intakes of overweight and obese women participating in an exercise trial. Results will assist the development of interventions for the management of maternal overweight and obesity. Methods: Fifty overweight and obese pregnant women receiving antenatal care were recruited and provided dietary and weight data at baseline (12 weeks), 28 weeks, 36 weeks gestation and 6 weeks post-partum. Data collected were compared with current nutritional and weight gain recommendations. Associations were assessed using Pearson's correlation coefficient, and ANOVA assessed dietary changes over time (significance set at P < 0.05). Results: Mean prepregnancy body mass index was 34.4 ± 6.6 kg/m2. Gestational weight gain was 10.6 ± 6 kg, with a wide range (−4.1 to 23.0 kg). Fifty-two percent of women gained excessive weight (>11.5 kg for overweight and >9 kg for obese women). Gestational weight gain correlated with post-partum weight retention (P < 0.001). Dietary intakes did not change significantly during pregnancy. No women achieved dietary fat or dietary iron recommendations, only 11% achieved adequate dietary folate, and 38% achieved adequate dietary calcium. Very few women achieved recommended food group servings for pregnancy, with 83% consuming excess servings of non-core foods. Conclusion: Results provide evidence that early intervention and personalised support for obese pregnant women may help achieve individualised goals for maternal weight gain and dietary adequacy, but this needs to be tested in a clinical setting.
Abstract:
This study examined physiological and performance effects of pre-cooling on medium-fast bowling in the heat. Ten medium-fast bowlers completed two randomised trials involving either cooling (mixed-methods) or control (no cooling) interventions before a 6-over bowling spell in 31.9±2.1°C and 63.5±9.3% relative humidity. Measures included bowling performance (ball speed, accuracy and run-up speeds), physical characteristics (global positioning system monitoring and counter-movement jump height), physiological variables (heart rate, core temperature, skin temperature and sweat loss), biochemical variables (serum markers of damage, stress and inflammation) and perceptual variables (perceived exertion and thermal sensation). Mean ball speed (114.5±7.1 vs. 114.1±7.2 km · h−1; P = 0.63; d = 0.09), accuracy (43.1±10.6 vs. 44.2±12.5 AU; P = 0.76; d = 0.14) and total run-up speed (19.1±4.1 vs. 19.3±3.8 km · h−1; P = 0.66; d = 0.06) did not differ between pre-cooling and control respectively; however, 20-m sprint speed between overs was 5.9±7.3% greater at Over 4 after pre-cooling (P = 0.03; d = 0.75). Pre-cooling reduced skin temperature after the intervention period (P = 0.006; d = 2.28), core temperature and pre-over heart rates throughout (P = 0.01−0.04; d = 0.96−1.74) and sweat loss by 0.4±0.3 kg (P = 0.01; d = 0.34). Mean rating of perceived exertion and thermal sensation were lower during pre-cooling trials (P = 0.004−0.03; d = 0.77−3.13). Despite no observed improvement in bowling performance, pre-cooling maintained between-over sprint speeds and blunted physiological and perceptual responses, easing the thermoregulatory demands of medium-fast bowling in hot conditions.
Abstract:
The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on a clay court. Sessions included both physical conditioning and technical elements led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning system device to measure distance covered during each session, whilst heart rate, countermovement jump distance and capillary blood measures of metabolites were measured before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were recorded following each session. Total duration and distance covered during each session were comparable (P>0.05; d<0.20). While forehand and backhand stroke volumes did not differ between sessions (P>0.05; d<0.30), large effects for increased unforced and forced errors were present on the hard court (P>0.05; d>0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P>0.05; d>0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P>0.05; d>0.90). In conclusion, training on clay courts results in trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
Abstract:
People with Parkinson’s disease (PD) are at higher risk of malnutrition due to PD symptoms and pharmacotherapy side effects. Poorer outcomes are associated with higher amounts of weight loss (>5%) and lower levels of fat free mass. When pharmacotherapy is no longer effective for symptom control, deep-brain stimulation (DBS) surgery may be considered. People with PD scheduled for DBS surgery were recruited from a Brisbane neurological clinic (n=11 out of 16). The Scale for Outcomes of Parkinson’s disease – Autonomic (SCOPA-AUT), Modified Constipation Assessment Scale (MCAS), and a 3-day food diary were mailed to participants’ homes for completion prior to hospital admission. During admission, the Patient-Generated Subjective Global Assessment (PG-SGA), weight, height and body composition were assessed. Mean (±s.d.) PD duration from diagnosis and time since occurrence of PD symptoms was 9.0 (±8.0) and 12 (±8.8) years, respectively. Five participants reported unintentional weight loss (average loss of 15.6%). PD duration, but not years since symptom onset, significantly predicted PG-SGA scores (β=4.2, t(8)=2.7, p<.05). Both were positively correlated with PG-SGA score (r = .667 and r = .587). On average, participants classified as well-nourished (SGA-A) (n=4) were younger, and had shorter disease durations, lower PG-SGA scores, and higher body mass index (BMI) and fat free mass index (FFMI) when compared to malnourished participants (SGA-B) (n=7). They also reported fewer non-motor symptoms on the SCOPA-AUT and MCAS. Three participants had previously received dietetic advice but not in relation to PD. These findings demonstrate that malnutrition remains unrecognised and untreated in this group despite unintentional weight loss and a high prevalence of malnutrition.
Abstract:
Objectives: People with Parkinson’s disease (PD) are at higher risk of malnutrition due to PD symptoms and pharmacotherapy side effects. When pharmacotherapy is no longer effective for symptom control, deep-brain stimulation (DBS) surgery may be considered. The aim of this study was to assess the nutritional status of people with PD who may be at higher risk of malnutrition related to unsatisfactory symptom management with optimised medical therapy. Design: This was an observational study using a convenience sample. Setting: Participants were seen during their hospital admission for DBS surgery. Participants: People with PD scheduled for DBS surgery were recruited from a Brisbane neurological clinic (n=15). Measurements: The Patient-Generated Subjective Global Assessment (PG-SGA), weight, height and body composition were assessed to determine nutritional status. Results: Six participants (40%) were classified as moderately malnourished (SGA-B). Eight participants (53%) reported previous unintentional weight loss (average loss of 13.3%). On average, participants classified as well-nourished (SGA-A) were younger, and had shorter disease durations, lower PG-SGA scores, and higher body mass index (BMI) and fat free mass index (FFMI) when compared to malnourished participants (SGA-B). Five participants had previously received dietetic advice but only one in relation to unintentional weight loss. Conclusion: Malnutrition remains unrecognised and untreated in this group despite unintentional weight loss and presence of nutrition impact symptoms. Improving nutritional status prior to surgery may improve surgical outcomes.
Abstract:
Background Undernutrition, weight loss and dehydration are major clinical issues for people with dementia in residential care, with excessive weight loss contributing to increased risk of frailty, immobility, illness and premature morbidity. This paper discusses a nutritional knowledge and attitudes survey conducted as part of a larger project focused on improving nutritional intake of people with dementia within a residential care facility in Brisbane, Australia. Aims The specific aims of the survey were to identify (i) knowledge of the nutritional needs of aged care facility residents; (ii) mealtime practices; and (iii) attitudes towards mealtime practices and organisation. Methods A survey based on those used in other healthcare settings was completed by 76 staff members. The survey included questions about nutritional knowledge, opinions of the food service, frequency of feeding assistance provided and feeding assessment practices. Results Nutritional knowledge scores ranged from 1 to 9 of a possible 10, with a mean score of 4.67. While 76% of respondents correctly identified risk factors associated with malnutrition in nursing home residents, only 38% of participants correctly identified the need for increased protein and energy in residents with pressure ulcers, and just 15% exhibited correct knowledge of fluid requirements. Further, while nutritional assessment was considered an important part of practice by 83% of respondents, just 53% indicated that they actually carried out such assessments. Identified barriers to promoting optimal nutrition included insufficient time to observe residents (56%); being unaware of residents' feeding issues (46%); poor knowledge of nutritional assessments (44%); and unappetising appearance of food served (57%). Conclusion An important step towards improving health and quality of life for residents of aged care facilities would be to enhance staff nutritional awareness and assessment skills. 
This should be carried out through increased attention to both preservice curricula and on-the-job training. Implications for practice The residential facility staff surveyed demonstrated low levels of nutrition knowledge, which reflects findings from the international literature. This has implications for the provision of responsive care to residents of these facilities and should be explored further.
Abstract:
Background & aims: One aim of the Australasian Nutrition Care Day Survey was to determine the nutritional status and dietary intake of acute care hospital patients. Methods: Dietitians from 56 hospitals in Australia and New Zealand completed a 24-h survey of nutritional status and dietary intake of adult hospitalised patients. Nutritional risk was evaluated using the Malnutrition Screening Tool. Participants ‘at risk’ underwent nutritional assessment using Subjective Global Assessment. Based on the International Classification of Diseases (Australian modification), participants were also deemed malnourished if their body mass index was <18.5 kg/m2. Dietitians recorded participants’ dietary intake at each main meal and snacks as 0%, 25%, 50%, 75%, or 100% of that offered. Results: 3122 patients (mean age: 64.6 ± 18 years) participated in the study. Forty-one percent of the participants were “at risk” of malnutrition. Overall malnutrition prevalence was 32%. Fifty-five percent of malnourished participants and 35% of well-nourished participants consumed ≤50% of the food during the 24-h audit. “Not hungry” was the most common reason for not consuming everything offered during the audit. Conclusion: Malnutrition and sub-optimal food intake are prevalent in acute care patients across hospitals in Australia and New Zealand and warrant appropriate interventions.
Abstract:
PURPOSE: This study examined the effects of overnight sleep deprivation on recovery following competitive rugby league matches. METHODS: Eleven male amateur rugby league players performed two competitive matches, followed by either a normal night's sleep (~8 h; CONT) or a sleep-deprived night (~0 h; SDEP) in a randomised fashion. Testing was conducted the morning of the match, immediately post-match, 2 h post-match and the next morning (16 h post-match). Measures included counter-movement jump (CMJ) distance, knee extensor maximal voluntary contraction (MVC), voluntary activation (VA), venous blood creatine kinase (CK) and C-reactive protein (CRP), perceived muscle soreness and a word-colour recognition cognitive function test. Percent change between post- and 16 h post-match was reported to determine the effect of the intervention the next morning. RESULTS: Large effects indicated a greater post- to 16 h post-match percentage decline in CMJ distance following SDEP compared to CONT (P=0.10-0.16; d=0.95-1.05). Similarly, the percentage change in incongruent word-colour reaction times was increased in SDEP trials (P=0.007; d=1.75). Measures of MVC did not differ between conditions (P=0.40-0.75; d=0.13-0.33), though trends for a larger percentage decline in VA were detected in SDEP (P=0.19; d=0.84). Further, large effects indicated higher CK and CRP responses 16 h post-match during SDEP compared to CONT (P=0.11-0.87; d=0.80-0.88). CONCLUSIONS: Sleep deprivation negatively affected recovery following a rugby league match, specifically impairing CMJ distance and cognitive function. Practitioners should promote adequate post-match sleep patterns or adjust training demands the next day to accommodate the altered physical and cognitive state following sleep deprivation.
Abstract:
Currently there is confusion about the value of using nutritional support to treat malnutrition and improve functional outcomes in chronic obstructive pulmonary disease (COPD). This systematic review and meta-analysis of randomised controlled trials (RCTs) aimed to clarify the effectiveness of nutritional support in improving functional outcomes in COPD. A systematic review identified 12 RCTs (n = 448) in stable COPD patients investigating the effects of nutritional support [dietary advice (1 RCT), oral nutritional supplements (ONS; 10 RCTs), enteral tube feeding (1 RCT)] versus control on functional outcomes. Meta-analysis of the changes induced by intervention found that whilst respiratory function (FEV1, lung capacity, blood gases) was unresponsive to nutritional support, both inspiratory and expiratory muscle strength (PImax +3.86 SE 1.89 cmH2O, P = 0.041; PEmax +11.85 SE 5.54 cmH2O, P = 0.032) and handgrip strength (+1.35 SE 0.69 kg, P = 0.05) were significantly improved, and associated with weight gains of ≥2 kg. Nutritional support produced significant improvements in quality of life in some trials, although meta-analysis was not possible. It also led to improved exercise performance and enhancement of exercise rehabilitation programmes. This systematic review and meta-analysis demonstrates that nutritional support in COPD results in significant improvements in a number of clinically relevant functional outcomes, complementing a previous review showing improvements in nutritional intake and weight.
Abstract:
In this thesis, I contribute to the study of how arrangements are made in social interaction. Using conversation analysis, I examine a corpus of 375 telephone calls between employees and clients of three Community Home Care (CHC) service agencies in metropolitan Adelaide, South Australia. My analysis of the CHC data corpus draws upon existing empirical findings within conversation analysis in order to generate novel findings about how people make arrangements with one another, and some of the attendant considerations that parties to such an activity can engage in: Prospective informings as remote proposals for a future arrangement – Focusing on how employees make arrangements with clients, I show how the employees in the CHC data corpus use ‘prospective informings’ to detail a future course of action that will involve the recipient of that informing. These informings routinely occasion a double-paired sequence, where informers pursue a response to their informing. This pursuit often occurs even after recipients have provided an initial response. This practice for making arrangements has been previously described by Houtkoop (1987) as ‘remote proposing.’ I develop Houtkoop’s analysis to show how an informing of a future arrangement can be recompleted, with response solicitation, as a proposal that is contingent upon a recipient’s acceptance. Participants’ understanding of references to non-present third parties – In the process of making arrangements, references are routinely made to non-present third parties. In the CHC data corpus, these third parties are usually care workers. Prior research (e.g., Sacks & Schegloff, 1979; Schegloff, 1996b) explains how the use of ‘recognitional references’ (such as the bare name ‘Kerry’), conveys to recipients that they should be able to locate the referent from amongst their acquaintances. 
Conversely, the use of ‘non-recognitional references’ (such as the description ‘a lady called Kerry’) conveys that recipients are unacquainted with the referent. I examine instances where the selection of a recognitional or non-recognitional reference form is followed by a recipient initiating repair on that reference. My analysis provides further evidence that the existing analytic account of these references corresponds to the way in which participants themselves make sense of them. My analysis also advances an understanding of how repair can be used, by recipients, to indicate the inappositeness of a prior turn. Post-possible-completion accounts – In a case study of a problematic interaction, I examine a misunderstanding that is not resolved within the repair space, the usual defence of intersubjectivity in interaction (cf. Schegloff, 1992b). Rather, I explore how the source of trouble is addressed, outside of the sequence of its production, with a ‘post-possible-completion account.’ This account specifies the basis of a misunderstanding and yet, unlike repair, does so without occasioning a revised response to a trouble-source turn. By considering various aspects of making arrangements in social interaction, I highlight some of the rich order that underpins the maintenance of human relationships across time. In the concluding section of this thesis I review this order, while also discussing practical implications of this analysis for CHC practice.
Abstract:
Objective: To determine the impact of a free-choice diet on nutritional intake and body condition of feral horses. Animals: Cadavers of 41 feral horses from 5 Australian locations. Procedures: Body condition score (BCS) was determined (scale of 1 to 9), and the stomach was removed from horses during postmortem examination. Stomach contents were analyzed for nutritional variables and macroelement and microelement concentrations. Data were compared among the locations and also compared with recommended daily intakes for horses. Results: Mean BCS varied by location; all horses were judged to be moderately thin. The BCS for males was 1 to 3 points higher than that of females. Amount of protein in the stomach contents varied from 4.3% to 14.9% and was significantly associated with BCS. Amounts of water-soluble carbohydrate and ethanol-soluble carbohydrate in stomach contents of feral horses from all 5 locations were higher than those expected for horses eating high-quality forage. Some macroelement and microelement concentrations were grossly excessive, whereas others were grossly deficient. There was no evidence of ill health among the horses. Conclusions and Clinical Relevance: Results suggested that the diet for several populations of feral horses in Australia appeared less than optimal. However, neither low BCS nor trace mineral deficiency appeared to affect survival of the horses. Additional studies on food sources in these regions, including analysis of water-soluble carbohydrate, ethanol-soluble carbohydrate, and mineral concentrations, are warranted to determine the provenance of such rich sources of nutrients. Determination of the optimal diet for horses may need revision.
Abstract:
Neutrophils serve as an intriguing model for the study of innate immune cellular activity induced by physiological stress. We measured changes in the transcriptome of circulating neutrophils following an experimental exercise trial (EXTRI) consisting of 1 h of intense cycling immediately followed by 1 h of intense running. Blood samples were taken at baseline, 3 h, 48 h, and 96 h post-EXTRI from eight healthy, endurance-trained, male subjects. RNA was extracted from isolated neutrophils. Differential gene expression was evaluated using Illumina microarrays and validated with quantitative PCR. Gene set enrichment analysis identified enriched molecular signatures chosen from the Molecular Signatures Database. Blood concentrations of muscle damage indexes, neutrophils, interleukin (IL)-6 and IL-10 were increased (P < 0.05) 3 h post-EXTRI. Upregulated groups of functionally related genes 3 h post-EXTRI included gene sets associated with the recognition of tissue damage, the IL-1 receptor, and Toll-like receptor (TLR) pathways (familywise error rate, P value < 0.05). The core enrichment for these pathways included TLRs, low-affinity immunoglobulin receptors, S100 calcium binding protein A12, and negative regulators of innate immunity, e.g., IL-1 receptor antagonist, and IL-1 receptor associated kinase-3. Plasma myoglobin changes correlated with neutrophil TLR4 gene expression (r = 0.74; P < 0.05). Neutrophils had returned to their nonactivated state 48 h post-EXTRI, indicating that their initial proinflammatory response was transient and rapidly counterregulated. This study provides novel insight into the signaling mechanisms underlying the neutrophil responses to endurance exercise, suggesting that their transcriptional activity was particularly induced by damage-associated molecular patterns, hypothetically originating from the leakage of muscle components into the circulation.
Abstract:
Human immunodeficiency virus (HIV), which leads to acquired immune deficiency syndrome (AIDS), reduces immune function, resulting in opportunistic infections and later death. Use of antiretroviral therapy (ART) increases chances of survival; however, there are concerns regarding fat re-distribution (lipodystrophy), which may encompass subcutaneous fat loss (lipoatrophy) and/or fat accumulation (lipohypertrophy) in the same individual. This problem has been linked to antiretroviral drugs (ARVs), mainly those in the class of protease inhibitors (PIs), in addition to older age and being female. An additional concern is that the problem coexists with the metabolic syndrome, even though nutritional status/body composition and lipodystrophy/metabolic syndrome remain poorly characterised in Uganda, where the use of ARVs is on the increase. In line with the literature, the overall aim of the study was to assess physical characteristics of HIV-infected patients using a comprehensive anthropometric protocol and to predict body composition based on these measurements and other standardised techniques. The other aim was to establish the existence of lipodystrophy, the metabolic syndrome, and associated risk factors. Thus, three studies were conducted on 211 (88 ART-naïve) HIV-infected, 15–49-year-old women, using a cross-sectional approach, together with a qualitative study of secondary information on patient HIV and medication status. In addition, face-to-face interviews were used to extract information concerning morphological experiences and lifestyle. The study revealed that participants were on average 34.1±7.65 years old, had lived 4.63±4.78 years with HIV infection and had spent 2.8±1.9 years receiving ARVs. Only 8.1% of participants were receiving PIs, and 26% of those receiving ART had ever changed drug regimen, 15.5% of whom changed drugs due to lipodystrophy.
Study 1 hypothesised that the mean nutritional status and predicted percent body fat values of study participants were within acceptable ranges; that they differed between participants receiving ARVs and HIV-infected ART-naïve participants; and that percent body fat estimated by anthropometric measures (BMI and skinfold thickness) and the BIA technique was not different from that predicted by the deuterium oxide dilution technique. Using the Body Mass Index (BMI), 7.1% of patients were underweight (<18.5 kg/m2) and 46.4% were overweight/obese (≥25.0 kg/m2). Based on waist circumference (WC), approximately 40% of the cohort was characterised as centrally obese. Moreover, the deuterium dilution technique showed that there was no between-group difference in total body water (TBW), fat mass (FM) or fat-free mass (FFM). However, the technique was the only approach to predict a between-group difference in percent body fat (p = .045), but with a very small effect size (0.021). Older age (β = 0.430, se = 0.089, p < .001), time spent receiving ARVs (β = 0.972, se = 0.089, p = .006), time with the infection (β = 0.551, se = 0.089, p < .001) and receiving ARVs (β = 2.940, se = 1.441, p = .043) were independently associated with percent body fat. Older age was the greatest single predictor of body fat. Furthermore, BMI gave better information than weight alone, in that mean percentage body fat per unit BMI (N = 192) was significantly higher in patients receiving treatment (1.11±0.31) vs. the exposed group (0.99±0.38, p = .025). For the assessment of obesity, percent fat measures did not greatly alter the accuracy of BMI as a measure for classifying individuals into the broad categories of underweight, normal and overweight.
Briefly, Study 1 revealed that there were more overweight/obese participants than in the general Ugandan population, that the problem was associated with ART status, and that BMI's broader classification categories were maintained when compared with the gold standard technique. Study 2 hypothesised that the presence of lipodystrophy in participants receiving ARVs was not different from that of HIV-infected ART-naïve participants. Results showed that 112 (53.1%) patients had experienced at least one morphological alteration, including lipohypertrophy (7.6%), lipoatrophy (10.9%), and mixed alterations (34.6%). The majority of these subjects (90%) were receiving ARVs; in fact, all patients receiving PIs reported lipodystrophy. Period spent receiving ARVs (t209 = 6.739, p < .001), being on ART (χ2 = 94.482, p < .001), receiving PIs (Fisher’s exact χ2 = 113.591, p < .001), recent CD4 count (t207 = 3.694, p < .001), time with HIV (t125 = 1.915, p = .045), as well as older age (t209 = 2.013, p = .045) were independently associated with lipodystrophy. Receiving ARVs was the greatest predictor of lipodystrophy (p < .001). In other analyses, aside from the subscapular skinfold (p = .004), there were no differences in the remaining skinfold sites or circumferences between participants with and without lipodystrophy. Similarly, there was no difference in waist:hip ratio (WHR) (p = .186) or waist:height ratio (WHtR) (p = .257) between participants with and without lipodystrophy. Further examination showed that none of the 4.1% of patients receiving stavudine (d4T) experienced lipoatrophy. However, 17.9% of patients receiving EFV, a non-nucleoside reverse transcriptase inhibitor (NNRTI), had lipoatrophy. Study 2 findings showed that the presence of lipodystrophy in participants receiving ARVs was in fact far higher than that of HIV-infected ART-naïve participants.
A final hypothesis was that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that of HIV-infected ART-naïve participants. Data showed that many patients (69.2%) lived with at least one feature of the metabolic syndrome based on the International Diabetes Federation (IDF, 2006) definition. However, there was no single anthropometric predictor of the components of the syndrome; the best anthropometric predictor varied as the component varied. The metabolic syndrome was diagnosed in 15.2% of the subjects, lower than commonly reported in this population, and was similar between the medicated and the exposed groups (χ2(1) = 0.018, p = .893). The syndrome was associated with older age (p = .031) and percent body fat (p = .012). In addition, participants with the syndrome were heavier according to BMI (p < .001), larger at the waist (p < .001) and abdomen (p < .001), and were at central obesity risk even when hip circumference (p < .001) and height (p < .001) were accounted for. In spite of those associations, results showed that the period with disease (p = .13), CD4 counts (p = .836), receiving ART (p = .442) or PIs (p = .678) were not associated with the metabolic syndrome. While the prevalence of the syndrome was highest amongst the older, larger and fatter participants, WC was the best predictor of the metabolic syndrome (p = .001). Another novel finding was that participants with the metabolic syndrome had greater arm muscle circumference (AMC) (p < .001) and arm muscle area (AMA) (p < .001), but the former was most influential. Accordingly, the easiest and cheapest indicator for assessing risk in this study sample was WC, should routine laboratory services not be feasible. In addition, the final study illustrated that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that of HIV-infected ART-naïve participants.