870 results for mean body length
Abstract:
Total hip arthroplasty (THA) has a proven clinical record for providing pain relief and return of function to patients with disabling arthritis. There are many successful options for femoral implant design and fixation. Cemented, polished, tapered femoral implants have been shown to have excellent results in national joint registries and long-term clinical series. These implants are usually 150 mm long at their lateral aspect. Because of this length, they cannot always be offered to patients with certain variations in femoral anatomy. Polished, tapered implants as short as 95 mm exist; however, their small proximal geometry (neck offset and body size) limits their use to patients of smaller stature. There is a group of patients for whom a shorter implant with a maintained proximal body size would be advantageous. A shorter implant also offers potential benefits in standard patient populations, such as reduced bone removal due to reduced reaming, favourable loading of the proximal femur, and the ability to revise into good proximal bone stock if required. These factors potentially make a shorter implant an option for all patient populations. The role of implant length in determining the stability of a cemented, polished, tapered femoral implant is not well defined in the literature. Before changes in implant design can be made, a better understanding of the role of each region in determining performance is required. The aim of this thesis was to describe how implant length affects the stability of a cemented, polished, tapered femoral implant. This was determined through an extensive body of laboratory testing. The major findings are that, for a given proximal body size, a reduction in implant length has no effect on the torsional stability of a polished, tapered design, while a small reduction in axial stability should be expected. These findings are important because the literature suggests that torsional stability is the major determinant of the long-term clinical performance of a THA system. Furthermore, a polished, tapered design is known to be forgiving of cement-implant interface micromotion because of its favourable wear characteristics. Together these findings suggest that a shorter polished, tapered implant may be well tolerated. The effect of a change in implant length on the geometric characteristics of a polished, tapered design was also determined and applied to the mechanical testing. Importantly, interface area does play a role in the stability of the system; however, it is the distribution of the interface, not the magnitude of the area, that defines stability. Taper angle (at least within the range of angles examined in this work) was shown not to be a determinant of axial or torsional stability. A range of implants was tested, comparing variations in length, neck offset and indication (primary versus cement-in-cement revision). At their manufactured length, the 125 mm implants performed similarly to their longer 150 mm counterparts, suggesting that they may be similarly well tolerated in the clinical environment. However, the slimmer cement-in-cement revision implant showed poorer mechanical performance, suggesting that its use in higher-demand patients may be hazardous. An implant length of 125 mm was shown to be stable, and the results suggest that a further reduction to 100 mm may be tolerated; however, further work is required.
A shorter implant with a maintained proximal body size would be useful for the group of patients who cannot access current standard-length implants because of variations in femoral anatomy. Extending the findings further, the similar function and potential benefits of a shorter implant make its application to all patients appealing.
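A hedged illustration of the geometry discussed in the abstract above: if the stem is idealised as a circular cone frustum (a deliberate simplification; real stems have complex cross-sections), shortening the implant while holding the proximal body size fixed changes both the cement-implant interface area and the taper angle. All dimensions below are hypothetical, chosen only to mirror the 150/125/100 mm lengths under discussion.

```python
import math

def frustum_interface(prox_radius_mm, dist_radius_mm, length_mm):
    """Lateral surface area and taper half-angle of a circular cone
    frustum -- a crude stand-in for a tapered femoral stem."""
    dr = prox_radius_mm - dist_radius_mm
    slant = math.hypot(dr, length_mm)                           # slant height, mm
    area = math.pi * (prox_radius_mm + dist_radius_mm) * slant  # mm^2
    half_angle = math.degrees(math.atan2(dr, length_mm))        # degrees
    return area, half_angle

# Hypothetical stems: same proximal body, same distal tip, three lengths.
for length in (150.0, 125.0, 100.0):
    area, angle = frustum_interface(prox_radius_mm=8.0,
                                    dist_radius_mm=3.0,
                                    length_mm=length)
    print(f"{length:5.0f} mm stem: interface ~{area:6.0f} mm^2, "
          f"taper half-angle ~{angle:.2f} deg")
```

Under this toy model, shortening reduces the interface area and slightly increases the taper angle, which is consistent with the thesis framing that area magnitude alone does not determine stability.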
Abstract:
Bactrocera dorsalis (Hendel) and B. papayae Drew & Hancock represent a closely related sibling species pair for which the biological species limits are unclear; i.e., it is uncertain whether they are truly two biological species, or one biological species that has been incorrectly split taxonomically. The geographic ranges of the two taxa are thought to abut or overlap on or around the Isthmus of Kra, a recognised biogeographic barrier located on the narrowest portion of the Thai Peninsula. We collected fresh material of B. dorsalis sensu lato (i.e., B. dorsalis sensu stricto + B. papayae) in a north-south transect down the Thai Peninsula, from areas regarded as being exclusively B. dorsalis s.s., across the Kra Isthmus, and into regions regarded as exclusively B. papayae. We carried out microsatellite analyses and took measurements of male genitalia and wing shape; both of the latter morphological tests have been used previously to separate these two taxa. No significant population structuring was found in the microsatellite analysis, and the results were consistent with an interpretation of a single, predominantly panmictic population. Both morphological datasets showed consistent, clinal variation along the transect, with no evidence of disjunction. No test provided evidence for historical vicariance driven by the Isthmus of Kra, and none of the three datasets supported the current taxonomy of two species. Rather, within and across the area of range overlap or abutment between the two species, only continuous morphological and genetic variation was recorded. Recognition that the morphological traits previously used to separate these taxa are continuous, and that there is no genetic evidence for population segregation in the region of suspected species overlap, is consistent with a growing body of literature that reports no evidence of biological differentiation between these taxa.
Abstract:
The time course of elongation and recovery of axial length associated with a 30-minute accommodative task was studied using optical low coherence reflectometry in a population of young adult myopic (n = 37) and emmetropic (n = 22) subjects. Ten of the 59 subjects were excluded from analysis due either to an inconsistent accommodative response or to incomplete anterior biometry data. Those subjects with valid data (n = 49) were found to exhibit a significant axial elongation immediately following the commencement of a 30-minute, 4 D accommodation task, which was sustained for the duration of the task and was evident to a lesser extent immediately following task cessation. During the accommodation task, on average, the myopic subjects exhibited 22 ± 34 µm, and the emmetropic subjects 6 ± 22 µm, of axial elongation; however, the differences in axial elongation between the myopic and emmetropic subjects were not statistically significant (p = 0.136). Immediately following the completion of the task, the myopic subjects still exhibited an axial elongation (mean magnitude 12 ± 28 µm) that was significantly greater (p < 0.05) than the change in axial length observed in the emmetropic subjects (mean change -3 ± 16 µm). Axial length had returned to baseline levels 10 minutes after completion of the accommodation task. The time for recovery from accommodation-induced axial elongation was greater in myopes, which may reflect differences in the biomechanical properties of the globe associated with refractive error. Changes in subfoveal choroidal thickness could be measured in 37 of the 59 subjects, and a small amount of choroidal thinning was observed during the accommodation task that was statistically significant in the myopic subjects (p < 0.05). These subfoveal choroidal changes could account for some, but not all, of the increase in axial length during accommodation.
Abstract:
Recent research indicates that brief periods (60 minutes) of monocular defocus lead to small but significant changes in human axial length. However, the effects of longer periods of defocus on the axial length of human eyes are unknown. We examined the influence of a 12-hour period of monocular myopic defocus on the natural daily variations occurring in the axial length and choroidal thickness of young adult emmetropes. A series of axial length and choroidal thickness measurements (collected at ~3-hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) was obtained for 13 emmetropic young adults over three consecutive days. The natural daily rhythms (Day 1, baseline day, no defocus), the daily rhythms with monocular myopic defocus (Day 2, defocus day, +1.50 DS spectacle lens over the right eye), and the recovery from any defocus-induced changes (Day 3, recovery day, no defocus) were all examined. Significant variations over the course of the day were observed in both axial length and choroidal thickness on each of the three measurement days (p<0.0001). The magnitude and timing of the daily variations in axial length and choroidal thickness were significantly altered with the monocular myopic defocus on day 2 (p<0.0001). Following the introduction of monocular myopic defocus, the daily peak in axial length occurred approximately 6 hours later, and the peak in choroidal thickness approximately 8.5 hours earlier, in the day compared with days 1 and 3 (no defocus). The mean amplitude (peak to trough) of change in axial length (0.030 ± 0.012 mm on day 1, 0.020 ± 0.010 mm on day 2 and 0.033 ± 0.012 mm on day 3) and choroidal thickness (0.030 ± 0.007 mm on day 1, 0.022 ± 0.006 mm on day 2 and 0.027 ± 0.009 mm on day 3) were also significantly different between the three days (both p<0.05). The introduction of monocular myopic defocus disrupts the daily variations in axial length and choroidal thickness of human eyes (in terms of both amplitude and timing), and these variations return to normal the following day, after removal of the defocus.
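The peak-to-trough amplitudes and peak times reported in the abstract above come from repeated measurements across the day. A minimal sketch of that bookkeeping, using made-up axial-length values at the ~3-hourly measurement times (illustrative only, not the study's data):

```python
import numpy as np

# Hypothetical axial-length readings (mm) at ~3-hourly sessions, 9 am-9 pm.
times_h = np.array([9.0, 12.0, 15.0, 18.0, 21.0])   # clock hours
axial_mm = np.array([23.650, 23.668, 23.655, 23.645, 23.640])

amplitude = axial_mm.max() - axial_mm.min()         # peak-to-trough, mm
peak_time = times_h[axial_mm.argmax()]              # clock hour of the peak
trough_time = times_h[axial_mm.argmin()]

print(f"amplitude: {amplitude * 1000:.0f} um, peak at {peak_time:.0f}:00, "
      f"trough at {trough_time:.0f}:00")
```

Comparing these per-day amplitudes and peak times across baseline, defocus and recovery days is what underlies the reported shifts in rhythm timing.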
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75, or 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03–3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54–4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
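The abstract above reports odds ratios from logistic regression with confounder adjustment. A minimal sketch of that style of analysis in Python with statsmodels, run on synthetic data (the variable names and effect sizes are invented for illustration and are not the ANCDS data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000

# Synthetic cohort, loosely shaped like the survey's variables.
df = pd.DataFrame({
    "malnourished": rng.integers(0, 2, n),
    "low_intake":   rng.integers(0, 2, n),   # consumed <= 25% of offered food
    "age":          rng.normal(65, 18, n),
    "male":         rng.integers(0, 2, n),
})
# Invented outcome process so the model has an effect to recover.
lin = -5.0 + 0.6 * df["malnourished"] + 1.0 * df["low_intake"] + 0.02 * df["age"]
df["died"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

fit = smf.logit("died ~ malnourished + low_intake + age + male", data=df).fit()
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs on the odds-ratio scale
```

Exponentiating the fitted coefficients is what turns the model output into the "1.8 times greater odds" style of statement quoted in the abstract.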
Abstract:
Wing length is a key character for essential behaviours related to bird flight, such as migration and foraging. In the present study, we initiate the search for the genes underlying wing length in birds by studying a long-distance migrant, the great reed warbler (Acrocephalus arundinaceus). In this species, wing length is an evolutionarily interesting trait with a pronounced latitudinal gradient and sex-specific selection regimes in local populations. We performed a quantitative trait locus (QTL) scan for wing length in great reed warblers using phenotypic, genotypic, pedigree and linkage map data from our long-term study population in Sweden. We applied the linkage analysis mapping method implemented in GRIDQTL (a new web-based software tool) and detected a genome-wide significant QTL for wing length on chromosome 2, which is, to our knowledge, the first QTL detected in wild birds. The QTL extended over 25 cM and accounted for a substantial part (37%) of the phenotypic variance of the trait. A genome scan for tarsus length (a body-size-related trait) did not show any signal, implying that the wing-length QTL on chromosome 2 was not associated with body size. Our results provide a first important step towards understanding the genetic architecture of avian wing length, and open opportunities to study the evolutionary dynamics of wing length at the locus level.
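The 37% figure above is the proportion of phenotypic variance attributed to the QTL. Pedigree-based interval mapping (as implemented in GRIDQTL) is considerably more involved, but the variance-explained idea can be sketched with a simple one-locus regression on synthetic data, where invented genotype classes shift mean wing length:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400

# Invented biallelic QTL genotypes (0/1/2 copies) and wing lengths (mm).
genotype = rng.integers(0, 3, n)
wing_mm = 95.0 + 1.5 * genotype + rng.normal(0, 2.0, n)
df = pd.DataFrame({"genotype": genotype, "wing_mm": wing_mm})

# The R-squared of phenotype ~ genotype class approximates the
# proportion of phenotypic variance explained by the locus.
fit = smf.ols("wing_mm ~ C(genotype)", data=df).fit()
print(f"proportion of phenotypic variance explained: {fit.rsquared:.2f}")
```

This is only the conceptual core; a real linkage scan estimates genotype probabilities along the chromosome from markers and pedigree before fitting any such model.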
Abstract:
Objective: To determine the impact of a free-choice diet on the nutritional intake and body condition of feral horses. Animals: Cadavers of 41 feral horses from 5 Australian locations. Procedures: Body condition score (BCS) was determined (scale of 1 to 9), and the stomach was removed from horses during postmortem examination. Stomach contents were analyzed for nutritional variables and macroelement and microelement concentrations. Data were compared among the locations and also compared with recommended daily intakes for horses. Results: Mean BCS varied by location; all horses were judged to be moderately thin. The BCS for males was 1 to 3 points higher than that for females. The amount of protein in the stomach contents varied from 4.3% to 14.9% and was significantly associated with BCS. Amounts of water-soluble carbohydrate and ethanol-soluble carbohydrate in the stomach contents of feral horses from all 5 locations were higher than those expected for horses eating high-quality forage. Some macroelement and microelement concentrations were grossly excessive, whereas others were grossly deficient. There was no evidence of ill health among the horses. Conclusions and Clinical Relevance: Results suggested that the diet of several populations of feral horses in Australia was less than optimal. However, neither low BCS nor trace mineral deficiency appeared to affect survival of the horses. Additional studies on food sources in these regions, including analysis of water-soluble carbohydrate, ethanol-soluble carbohydrate, and mineral concentrations, are warranted to determine the provenance of such rich sources of nutrients. Current recommendations for the optimal diet of horses may need revision.
Abstract:
This study sought to a) compare and contrast the effect of 2 commonly used cryotherapy treatments, 4 min of −110 °C whole body cryotherapy and 8 °C cold water immersion, on knee skin temperature and b) establish whether either protocol was capable of achieving a skin temperature (<13 °C) believed to be required for analgesic purposes. After ethics committee approval and written informed consent were obtained, 10 healthy males (26.5 ± 4.9 yr, 183.5 ± 6.0 cm, 90.7 ± 19.9 kg, 26.8 ± 5.0 kg/m², 23.0 ± 9.3% body fat; mean ± SD) participated in this randomised controlled crossover study. Skin temperature around the patellar region was assessed in both knees via non-contact, infrared thermal imaging and recorded pre-treatment, immediately post-treatment and every 10 min thereafter for 60 min. Compared to baseline, average, minimum and maximum skin temperatures were significantly reduced (p < 0.001) immediately post-treatment and at 10, 20, 30, 40, 50 and 60 min after both cooling modalities. Average and minimum skin temperatures were lower (p < 0.05) immediately after whole body cryotherapy (19.0 ± 0.9 °C) compared to cold water immersion (20.5 ± 0.6 °C). However, from 10 to 60 min post-treatment, the average, minimum and maximum skin temperatures were lower (p < 0.05) following the cold water treatment. Finally, neither protocol achieved the skin temperature believed to be required to elicit an analgesic effect.
Abstract:
Despite the increasing number of immigrants, there is a limited body of literature describing the use of hospital emergency department (ED) care by immigrants in Australia. This study aims to describe how immigrants from refugee source countries (IRSC) utilise ED care, compared to immigrants from the main English-speaking countries (MESC), immigrants from other countries (IOC) and the local population in Queensland. A retrospective analysis of a Queensland state-wide hospital ED dataset (ED Information System) from 1-1-2008 to 31-12-2010 was conducted. Our study showed that immigrants are not a homogeneous group. We found that IRSC are more likely to use interpreters in the ED (8.9%) compared to IOC. Furthermore, IRSC have a higher rate of ambulance use (odds ratio 1.2, 95% confidence interval (CI) 1.2–1.3), are less likely to be admitted to the hospital from the ED (odds ratio 0.7, 95% CI 0.7–0.8), and have a longer length of stay (LOS) in the ED (mean difference 33.0 minutes, 95% CI 28.8–37.2) compared to the Australian-born population. Our findings highlight the need to develop policies and educational interventions to ensure the equitable use of health services among vulnerable immigrant populations.
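Odds ratios with confidence intervals like those above can be computed directly from a 2x2 exposure-outcome table. A minimal sketch with invented counts (not the Queensland data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
       a = exposed with outcome,   b = exposed without outcome,
       c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: ambulance use among IRSC vs the Australian-born group.
or_, lo, hi = odds_ratio_ci(a=480, b=3520, c=41000, d=359000)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

In practice a study such as this one would adjust for covariates in a regression model rather than rely on the crude 2x2 calculation alone.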
Abstract:
Human immunodeficiency virus (HIV), which leads to acquired immune deficiency syndrome (AIDS), reduces immune function, resulting in opportunistic infections and, ultimately, death. Use of antiretroviral therapy (ART) increases the chances of survival; however, there are concerns regarding fat redistribution (lipodystrophy), which may encompass subcutaneous fat loss (lipoatrophy) and/or fat accumulation (lipohypertrophy) in the same individual. This problem has been linked to antiretroviral drugs (ARVs), mainly those in the class of protease inhibitors (PIs), as well as to older age and female sex. A further concern is that the problem coexists with the metabolic syndrome, yet nutritional status/body composition and the extent of lipodystrophy and the metabolic syndrome remain unclear in Uganda, where the use of ARVs is increasing. In line with the literature, the overall aim of the study was to assess the physical characteristics of HIV-infected patients using a comprehensive anthropometric protocol and to predict body composition based on these measurements and other standardised techniques. A further aim was to establish the existence of lipodystrophy, the metabolic syndrome, and associated risk factors. Thus, three studies were conducted on 211 (88 ART-naïve) HIV-infected women aged 15-49 years, using a cross-sectional approach, together with a qualitative study of secondary information on patient HIV and medication status. In addition, face-to-face interviews were used to extract information concerning morphological experiences and lifestyle. The study revealed that participants were on average 34.1±7.65 years old, had lived 4.63±4.78 years with HIV infection and had spent 2.8±1.9 years receiving ARVs. Only 8.1% of participants were receiving PIs, and 26% of those receiving ART had changed drug regimen at some point, 15.5% of whom changed drugs because of lipodystrophy. Study 1 hypothesised that the mean nutritional status and predicted percent body fat values of study participants were within acceptable ranges; that they differed between participants receiving ARVs and HIV-infected ART-naïve participants; and that percent body fat estimated by anthropometric measures (BMI and skinfold thickness) and the BIA technique did not differ from that predicted by the deuterium oxide dilution technique. Using the Body Mass Index (BMI), 7.1% of patients were underweight (<18.5 kg/m2) and 46.4% were overweight/obese (≥25.0 kg/m2). Based on waist circumference (WC), approximately 40% of the cohort was characterised as centrally obese. Moreover, the deuterium dilution technique showed no between-group difference in total body water (TBW), fat mass (FM) or fat-free mass (FFM). However, the technique was the only approach to detect a between-group difference in percent body fat (p = .045), albeit with a very small effect (0.021). Older age (β = 0.430, se = 0.089, p = .000), time spent receiving ARVs (β = 0.972, se = 0.089, p = .006), time with the infection (β = 0.551, se = 0.089, p = .000) and receiving ARVs (β = 2.940, se = 1.441, p = .043) were independently associated with percent body fat. Older age was the greatest single predictor of body fat. Furthermore, BMI gave better information than weight alone could, in that mean percentage body fat per unit BMI (N = 192) was significantly higher in patients receiving treatment (1.11±0.31) vs. the exposed group (0.99±0.38, p = .025).
For the assessment of obesity, percent fat measures did not greatly alter the accuracy of BMI as a measure for classifying individuals into the broad categories of underweight, normal and overweight. Briefly, Study 1 revealed that there were more overweight/obese participants than in the general Ugandan population, that the problem was associated with ART status, and that the broad BMI classification categories were maintained when compared with the gold-standard technique. Study 2 hypothesised that the presence of lipodystrophy in participants receiving ARVs was no different from that in HIV-infected ART-naïve participants. Results showed that 112 (53.1%) patients had experienced at least one morphological alteration, including lipohypertrophy (7.6%), lipoatrophy (10.9%) and mixed alterations (34.6%). The majority of these subjects (90%) were receiving ARVs; in fact, all patients receiving PIs reported lipodystrophy. Period spent receiving ARVs (t209 = 6.739, p = .000), being on ART (χ2 = 94.482, p = .000), receiving PIs (Fisher's exact χ2 = 113.591, p = .000), recent CD4 (T4) count (t207 = 3.694, p = .000), time with HIV (t125 = 1.915, p = .045) and older age (t209 = 2.013, p = .045) were independently associated with lipodystrophy. Receiving ARVs was the greatest predictor of lipodystrophy (p = .000). In further analysis, aside from the subscapular skinfold (p = .004), there were no differences in the remaining skinfold sites or the circumferences between participants with lipodystrophy and those without. Similarly, there was no difference in waist:hip ratio (WHR) (p = .186) or waist:height ratio (WHtR) (p = .257) between participants with lipodystrophy and those without. Further examination showed that none of the 4.1% of patients receiving stavudine (d4T) experienced lipoatrophy, whereas 17.9% of patients receiving EFV, a non-nucleoside reverse transcriptase inhibitor (NNRTI), had lipoatrophy. Study 2 findings showed that the presence of lipodystrophy in participants receiving ARVs was in fact far higher than in HIV-infected ART-naïve participants. A final hypothesis was that the prevalence of the metabolic syndrome in participants receiving ARVs was no different from that in HIV-infected ART-naïve participants. The data showed that many patients (69.2%) lived with at least one feature of the metabolic syndrome based on the International Diabetes Federation (IDF, 2006) definition. However, there was no single anthropometric predictor of the components of the syndrome; the best anthropometric predictor varied with the component. The metabolic syndrome was diagnosed in 15.2% of the subjects, lower than commonly reported in this population, and its prevalence was similar between the medicated and the exposed groups (χ2(1) = 0.018, p = .893). Moreover, the syndrome was associated with older age (p = .031) and percent body fat (p = .012). In addition, participants with the syndrome were heavier according to BMI (p = .000), larger at the waist (p = .000) and abdomen (p = .000), and were at central-obesity risk even when hip circumference (p = .000) and height (p = .000) were accounted for. In spite of these associations, results showed that period with the disease (p = .13), CD4 counts (p = .836), and receiving ART (p = .442) or PIs (p = .678) were not associated with the metabolic syndrome. While the prevalence of the syndrome was highest amongst the older, larger and fatter participants, WC was the best predictor of the metabolic syndrome (p = .001).
Another novel finding was that participants with the metabolic syndrome had greater arm muscle circumference (AMC) (p = .000) and arm muscle area (AMA) (p = .000), with the former being the more influential. Accordingly, the easiest and cheapest indicator for assessing risk in this study sample was WC, should routine laboratory services not be feasible. Finally, the third study illustrated that the prevalence of the metabolic syndrome in participants receiving ARVs was no different from that in HIV-infected ART-naïve participants.
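Study 1 in the abstract above compares BMI and skinfold estimates against the deuterium oxide dilution technique. A hedged sketch of the standard arithmetic behind that reference method, using textbook constants (a ~4% correction for the deuterium space overestimating total body water, and a 0.732 hydration fraction of fat-free mass) and invented subject values:

```python
def body_composition_deuterium(weight_kg, dilution_space_kg):
    """Two-compartment body composition from deuterium dilution.

    Textbook assumptions: the deuterium dilution space overestimates
    total body water by ~4%, and fat-free mass is ~73.2% water.
    Input values below are illustrative, not study data.
    """
    tbw = dilution_space_kg / 1.04   # total body water, kg
    ffm = tbw / 0.732                # fat-free mass, kg
    fm = weight_kg - ffm             # fat mass, kg
    pct_fat = 100.0 * fm / weight_kg
    return tbw, ffm, fm, pct_fat

def bmi_category(weight_kg, height_m):
    """Broad WHO BMI categories used in the abstract."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return bmi, "underweight"
    if bmi < 25.0:
        return bmi, "normal"
    return bmi, "overweight/obese"

# Hypothetical participant.
tbw, ffm, fm, pct_fat = body_composition_deuterium(68.0, 34.0)
bmi, cat = bmi_category(68.0, 1.62)
print(f"TBW {tbw:.1f} kg, FFM {ffm:.1f} kg, FM {fm:.1f} kg, "
      f"{pct_fat:.1f}% fat; BMI {bmi:.1f} ({cat})")
```

The study's finding that the broad BMI categories held up against the dilution method amounts to these two calculations usually landing people in the same class.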
Abstract:
Purpose: To investigate the influence of monocular hyperopic defocus on the normal diurnal rhythms in axial length and choroidal thickness of young adults. Methods: A series of axial length and choroidal thickness measurements (collected at ~3-hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) was obtained for 15 emmetropic young adults over three consecutive days. The natural diurnal rhythms (Day 1, no defocus), the diurnal rhythms with monocular hyperopic defocus (Day 2, –2.00 DS spectacle lens over the right eye), and the recovery from any defocus-induced changes (Day 3, no defocus) were examined. Results: Both axial length and choroidal thickness underwent significant diurnal changes on each of the three measurement days (p<0.0001). The introduction of monocular hyperopic defocus resulted in significant changes in the diurnal variations observed in both parameters (p<0.05). A significant (p<0.001) increase in the mean amplitude (peak to trough) of change in axial length (mean increase, 0.016 ± 0.005 mm) and choroidal thickness (mean increase, 0.011 ± 0.003 mm) was observed on day 2 with hyperopic defocus compared with the two 'no defocus' days (days 1 and 3). At the second measurement (mean time 12:10 pm) on the day with hyperopic defocus, the eye was significantly longer, by 0.012 ± 0.002 mm, compared with the other two days (p<0.05). No significant difference was observed over the three days in the average timing of the daily peaks in axial length (mean peak time 12:12 pm) or choroidal thickness (mean peak time 9:02 pm). Conclusions: The introduction of monocular hyperopic defocus resulted in a significant increase in the amplitude of the diurnal change in axial length and choroidal thickness, which returned to normal the following day, after removal of the blur stimulus.
Abstract:
Background: The pattern of protein intake following exercise may impact whole-body protein turnover and net protein retention. We determined the effects of different protein feeding strategies on protein metabolism in resistance-trained young men. Methods: Participants were randomly assigned to ingest either 80 g of whey protein as 8x10 g every 1.5 h (PULSE; n=8), 4x20 g every 3 h (intermediate, INT; n=7), or 2x40 g every 6 h (BOLUS; n=8) after an acute bout of bilateral knee extension exercise (4x10 repetitions at 80% maximal strength). Whole-body protein turnover (Q), synthesis (S), breakdown (B), and net balance (NB) were measured throughout 12 h of recovery by bolus ingestion of [15N]glycine with urinary [15N]ammonia enrichment as the collected end-product. Results: PULSE Q rates were greater than BOLUS (~19%, P<0.05), with a trend towards being greater than INT (~9%, P=0.08). Rates of S were 32% and 19% greater, and rates of B were 51% and 57% greater, for PULSE as compared to INT and BOLUS, respectively (P<0.05), with no difference between INT and BOLUS. There were no statistical differences in NB between groups (P=0.23); however, magnitude-based inferential statistics revealed likely small (mean effect ± 90% CI; 0.59 ± 0.87) and moderate (0.80 ± 0.91) increases in NB for PULSE and INT compared to BOLUS, and a possible small increase (0.42 ± 1.00) for INT vs. PULSE. Conclusion: We conclude that the pattern of ingested protein, and not only the total daily amount, can impact whole-body protein metabolism. Individuals aiming to maximize NB would likely benefit from repeated ingestion of moderate amounts of protein (~20 g) at regular intervals (~3 h) throughout the day.
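The [15N]glycine end-product method in the abstract above rests on simple bookkeeping: whole-body nitrogen flux Q is estimated from the 15N dose and its recovery in urinary ammonia, after which synthesis and breakdown follow from Q = S + excretion = B + intake. A minimal sketch with invented numbers (a hedged reading of the classic single-dose end-product formulation, not the paper's exact calculations):

```python
def end_product_flux(dose_15n_mg, ammonia_15n_mg, ammonia_n_g):
    """Whole-body nitrogen flux Q over the urine-collection period
    (end-product method). Assumption: the fraction of the 15N dose
    recovered in urinary ammonia equals the fraction of total N flux
    excreted as ammonia N."""
    return dose_15n_mg * ammonia_n_g / ammonia_15n_mg   # g N per period

# Invented 12 h values, purely illustrative.
Q = end_product_flux(dose_15n_mg=200.0,   # 15N given as [15N]glycine
                     ammonia_15n_mg=4.0,  # 15N recovered in urinary ammonia
                     ammonia_n_g=0.4)     # total ammonia N excreted

intake_n_g = 12.8     # dietary N over 12 h (e.g. 80 g protein / 6.25)
excreted_n_g = 10.0   # total urinary N over 12 h

S = Q - excreted_n_g  # synthesis:  Q = S + excretion
B = Q - intake_n_g    # breakdown:  Q = B + intake
NB = S - B            # net balance (= intake - excretion)
print(f"Q={Q:.1f}, S={S:.1f}, B={B:.1f}, NB={NB:.1f} g N / 12 h")
```

Comparing Q, S, B and NB computed this way across the PULSE, INT and BOLUS groups is what drives the between-group contrasts reported above.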
Abstract:
Background & aims: The confounding effect of disease on the outcomes of malnutrition using diagnosis-related groups (DRG) has never been studied in a multidisciplinary setting. This study aims to determine the impact of malnutrition on hospitalisation outcomes, controlling for DRG. Methods: Subjective Global Assessment was used to assess the nutritional status of 818 patients within 48 hours of admission. Prospective data were collected on cost of hospitalisation, length of stay (LOS), readmission and mortality up to 3 years post-discharge using National Death Register data. Mixed model analysis and conditional logistic regression matching by DRG were carried out to evaluate the association between nutritional status and outcomes, with the results adjusted for gender, age and race. Results: Malnourished patients (29%) had longer hospital stays (6.9±7.3 days vs. 4.6±5.6 days, p<0.001) and were more likely to be readmitted within 15 days (adjusted relative risk = 1.9, 95%CI 1.1–3.2, p=0.025). Within a DRG, the mean difference between the actual cost of hospitalisation and the average cost was greater for malnourished patients than for well-nourished patients (p=0.014). Mortality was higher in malnourished patients at 1 year (34% vs. 4.1%), 2 years (42.6% vs. 6.7%) and 3 years (48.5% vs. 9.9%); p<0.001 for all. Overall, malnutrition was a significant predictor of mortality (adjusted hazard ratio = 4.4, 95%CI 3.3–6.0, p<0.001). Conclusions: Malnutrition was evident in up to one third of inpatients and led to poor hospitalisation outcomes, even after matching for DRG. Strategies to prevent and treat malnutrition in the hospital and post-discharge are needed.
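The adjusted hazard ratio above comes from survival modelling of mortality over three years of follow-up. A minimal sketch of such an analysis using the lifelines library on synthetic data (column names and effect sizes are invented; the study's actual analysis also matched by DRG):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 800

# Synthetic cohort with 3-year (1095-day) follow-up.
df = pd.DataFrame({
    "malnourished": rng.integers(0, 2, n),
    "age":          rng.normal(65, 15, n),
    "male":         rng.integers(0, 2, n),
})
# Invented exponential survival times; malnutrition raises the hazard.
hazard = 0.0002 * np.exp(1.2 * df["malnourished"] + 0.02 * (df["age"] - 65))
t = rng.exponential(1.0 / hazard)
df["duration_days"] = np.minimum(t, 1095.0)   # censor at 3 years
df["died"] = (t <= 1095.0).astype(int)        # event indicator

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_days", event_col="died")
cph.print_summary()   # the exp(coef) column gives adjusted hazard ratios
```

The exponentiated coefficient for the malnutrition indicator plays the role of the "adjusted hazard ratio = 4.4" figure quoted in the abstract.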
Abstract:
Background: Foot ulcers are a leading cause of avoidable hospital admissions and lower extremity amputations. However, large clinical studies describing foot ulcer presentations in the ambulatory setting are limited. The aim of this descriptive observational paper is to report the characteristics of ambulatory foot ulcer patients managed across 13 of 17 Queensland Health & Hospital Services. Methods: Data on all foot ulcer patients registered with a Queensland High Risk Foot Form (QHRFF) were collected at their first consultation in 2012. Data are automatically extracted from each QHRFF into a Queensland high risk foot database. Descriptive statistics display age, sex, ulcer types and co-morbidities. Statewide clinical indicators of foot ulcer management are also reported. Results: Overall, 2,034 people presented with a foot ulcer in 2012. Mean age was 63 (±14) years and 67.8% were male. Co-morbidities included diabetes (85%), hypertension (49.7%), dyslipidaemia (39.2%), cardiovascular disease (25.6%), kidney disease (13.7%) and smoking (12.2%). Foot ulcer types included neuropathic (51.6%), neuro-ischaemic (17.8%), ischaemic (7.2%), post-surgical (6.6%) and other (16.8%); 31% were infected. Clinical indicator results revealed that 98% had their wound categorised, 51% received non-removable offloading, median ulcer healing time was 6 weeks and 37% had ulcer recurrence. Conclusion: This paper details the largest foot ulcer database reported in Australia. People presenting with foot ulcers appear predominantly older and male, with several co-morbidities. Encouragingly, it appears most patients are receiving best practice care. These results may be a factor in the significant reduction of Queensland diabetes foot-related hospitalisations and amputations recently reported.
Abstract:
Background: Whole body cryotherapy (WBC) is the therapeutic application of extremely cold air for a short duration. Minimal evidence is available for determining optimal exposure time. Purpose: To explore whether the length of WBC exposure induces differential changes in inflammatory markers, tissue oxygenation, skin and core temperature, and thermal sensation and comfort. Method: This study was a randomised crossover design with participants acting as their own controls. Fourteen male professional first-team Super League rugby players were exposed to 1, 2, and 3 minutes of WBC at -135°C. Testing took place the day after a competitive league fixture, with each exposure separated by seven days. Results: No significant changes were found in the inflammatory cytokine interleukin-6 (IL-6). Significant reductions (p<0.05) in deoxyhaemoglobin were found for gastrocnemius and vastus lateralis. In vastus lateralis, significant reductions (p<0.05) in oxyhaemoglobin and the tissue oxygenation index were demonstrated. Significant reductions (p<0.05) in skin temperature were recorded. No significant changes were recorded in core temperature. Significant reductions (p<0.05) in thermal sensation and comfort were recorded. Conclusion: Three brief exposures to WBC separated by 1 week are not sufficient to induce physiological changes in IL-6 or core temperature. There are, however, significant changes in tissue oxyhaemoglobin, deoxyhaemoglobin, the tissue oxygenation index, skin temperature and thermal sensation. We conclude that a 2-minute WBC exposure was the optimal exposure length at a temperature of -135°C and could be applied as the basis for future studies.
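The tissue oxygenation index reported above is derived from near-infrared spectroscopy (NIRS) signals. A minimal sketch of the standard ratio, with invented oxy-/deoxyhaemoglobin values (arbitrary concentration units, not the study's measurements):

```python
def tissue_oxygenation_index(oxy_hb, deoxy_hb):
    """TOI (%): oxyhaemoglobin as a share of total haemoglobin,
    the usual NIRS-derived saturation measure."""
    return 100.0 * oxy_hb / (oxy_hb + deoxy_hb)

# Hypothetical vastus lateralis readings before and after WBC exposure.
before = tissue_oxygenation_index(oxy_hb=42.0, deoxy_hb=18.0)
after = tissue_oxygenation_index(oxy_hb=36.0, deoxy_hb=19.5)
print(f"TOI before: {before:.1f}%, after: {after:.1f}%")
```

A fall in oxyhaemoglobin alongside a rise in deoxyhaemoglobin, as reported for vastus lateralis, necessarily lowers this index.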