963 results for Total Parenteral Nutrition


Relevance:

20.00%

Publisher:

Abstract:

The technique of femoral cement-in-cement revision is well established, but there are no previous series reporting its use on the acetabular side at the time of revision total hip arthroplasty. We describe the surgical technique and report the outcome of 60 consecutive cement-in-cement revisions of the acetabular component at a mean follow-up of 8.5 years (range 5-12 years). All had a radiologically and clinically well-fixed acetabular cement mantle at the time of revision. 29 patients died. No case was lost to follow-up. The 2 most common indications for acetabular revision were recurrent dislocation (77%) and to complement a femoral revision (20%). There were 2 cases of aseptic cup loosening (3.3%) requiring re-revision. No other hip was clinically or radiologically loose (96.7%) at latest follow-up. One case was re-revised for infection, 4 for recurrent dislocation and 1 for disarticulation of a constrained component. At 5 years, the Kaplan-Meier survival rate was 100% for aseptic loosening and 92.2% (95% CI 84.8-99.6%) with revision for all causes as the endpoint. These results support the use of the cement-in-cement revision technique in appropriate cases on the acetabular side. Theoretical advantages include preservation of bone stock, reduced operating time, reduced risk of complications and durable fixation.


Background: Food Frequency Questionnaires (FFQs) are commonly used in epidemiologic studies to assess long-term nutritional exposure. Because of wide variations in dietary habits across countries, an FFQ must be developed to suit the specific population. Sri Lanka is undergoing a nutritional transition, and diet-related chronic diseases are emerging as an important health problem. Currently, no FFQ has been developed for Sri Lankan adults. In this study, we developed an FFQ to assess the regular dietary intake of Sri Lankan adults. Methods: A nationally representative sample of 600 adults was selected by a multi-stage random cluster sampling technique and dietary intake was assessed by random 24-h dietary recall. Nutrient analysis of the FFQ required the selection of foods, development of recipes and application of these to cooked foods to develop a nutrient database. We constructed a comprehensive food list with units of measurement. A stepwise regression method was used to identify foods contributing a cumulative 90% of the variance in total energy and macronutrients. In addition, a series of photographs was included. Results: We obtained dietary data from 482 participants, and 312 different food items were recorded. Nutritionists grouped similar food items, resulting in a total of 178 items. After performing stepwise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected. Conclusion: We developed an FFQ and the related nutrient composition database for Sri Lankan adults. Culturally specific dietary tools are central to capturing the role of diet in chronic disease risk in Sri Lanka. The next step will involve verification of the FFQ's reproducibility and validity.
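
The stepwise selection described above can be sketched as a greedy forward regression that keeps adding the food whose inclusion most improves explained variance until 90% is reached. This is a minimal illustration on synthetic data; the food matrix, energy densities and loop structure are assumptions for demonstration, not the study's actual procedure.

```python
# Sketch of greedy forward (stepwise) selection: keep adding the food
# whose inclusion most improves R^2 until 90% of the variance in total
# energy intake is explained. All data below are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_subjects, n_foods = 200, 20
intake = rng.gamma(2.0, 50.0, size=(n_subjects, n_foods))  # g/day per food
energy_density = rng.uniform(0.5, 4.0, n_foods)            # kcal/g (assumed)
energy = intake @ energy_density + rng.normal(0, 50, n_subjects)

def r2_of(cols):
    """R^2 of a linear model of energy on the chosen food columns."""
    X = intake[:, cols]
    return LinearRegression().fit(X, energy).score(X, energy)

selected, remaining, r2 = [], list(range(n_foods)), 0.0
while remaining and r2 < 0.90:
    best = max(remaining, key=lambda j: r2_of(selected + [j]))
    selected.append(best)
    remaining.remove(best)
    r2 = r2_of(selected)

print(f"{len(selected)} foods explain R^2 = {r2:.3f} of energy variance")
```

In the study, an analogous procedure run for energy and each macronutrient reduced 178 candidate items to 93.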


Objective: The main aim of the present study was to identify food consumption in Sri Lankan adults based on serving characteristics. Design: Cross-sectional study. Fruits, vegetables, starch, meat, pulses, dairy products and added sugars in the diet were assessed, with portion sizes estimated using standard methods. Setting: Twelve randomly selected clusters from the Sri Lanka Diabetes and Cardiovascular Study. Subjects: Six hundred non-institutionalized adults. Results: The daily intake of fruit (0·43), vegetable (1·73) and dairy (0·39) portions was well below national recommendations. Only 3·5 % of adults consumed the recommended 5 portions of fruits and vegetables/d; over a third of the population consumed no dairy products and fewer than 1 % of adults consumed 2 portions/d. In contrast, Sri Lankan adults consumed over 14 portions of starch and 3·5 portions of added sugars daily. Almost 70 % of those studied exceeded the upper limit of the recommendations for starch intake. The total daily number of meat and pulse portions was 2·78. Conclusions: Dietary guidelines emphasize the importance of a balanced and varied diet; however, a substantial proportion of the Sri Lankan population studied failed to achieve such a recommendation. Nutrition-related diseases in the country may be closely correlated with unhealthy eating habits.


Background: Total hip arthroplasty (THA) is a commonly performed procedure and numbers are increasing with ageing populations. Among the most serious complications of THA are surgical site infections (SSIs), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.

Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection

Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts; to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences; and to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. Strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no use of antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty of model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.

Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared with using antibiotic prophylaxis alone, and there was a very low probability (<5%) that this strategy did not have the largest NMB. Not using antibiotic prophylaxis (No AP) or combining antibiotic prophylaxis with laminar air operating rooms (AP & LOR) resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.

Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
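The strategy comparison above rests on net monetary benefit, NMB = λ × QALYs − cost, evaluated for each arm of the decision model; the strategy with the highest NMB is preferred. The sketch below illustrates only this ranking logic: the willingness-to-pay value and all per-strategy cost/QALY figures are illustrative placeholders, not outputs of the thesis model.

```python
# Sketch of strategy ranking by net monetary benefit (NMB).
# NMB = WTP * QALYs - cost; the strategy with the highest NMB wins.
# All cost/QALY numbers below are illustrative placeholders.
WTP = 50_000  # willingness to pay per QALY (assumed reference value)

# strategy -> (expected cohort cost in $, expected cohort QALYs)
strategies = {
    "No AP":    (12_000_000, 25_000.0),
    "AP":       (10_400_000, 25_163.0),
    "AP & ABC": (10_900_000, 25_240.0),
    "AP & LOR": (13_500_000, 25_180.0),
}

def nmb(cost, qalys, wtp=WTP):
    """Net monetary benefit at the given willingness-to-pay threshold."""
    return wtp * qalys - cost

ranked = sorted(strategies, key=lambda s: nmb(*strategies[s]), reverse=True)
for s in ranked:
    print(f"{s:9s} NMB = ${nmb(*strategies[s]):,.0f}")
```

With these placeholder inputs the ordering mirrors the reported findings: AP & ABC ranks first and No AP last, because the modest extra cost of cement is outweighed by the QALYs gained from infections avoided.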


This article is a brief introduction to the total solar eclipse of Wednesday 14 November 2012, which will be visible in north Queensland along a narrow strip of land just 140 km wide in the vicinity of Cairns.


Background: Periurban agriculture refers to agricultural practice occurring in areas with mixed rural and urban features. It is responsible for 25% of the total gross value of economic production in Australia, despite comprising only 3% of the land used for agriculture. As populations grow and cities expand, they constantly absorb surrounding fringe areas, creating a new fringe further from the city and causing the periurban region to shift steadily outwards. Periurban regions are fundamental to the provision of fresh food to city populations, and residential (and industrial) expansion taking over agricultural land has been noted as a major worldwide concern. Another major concern around increasing urbanisation and the resultant decrease in periurban agriculture is its potential effect on food security. Food security is the availability of, or access to, nutritionally adequate, culturally relevant and safe foods in culturally appropriate ways; food insecurity occurs when access to or availability of these foods is compromised. There is an important level of connectedness between food security and food production, and a decrease in periurban agriculture may have adverse effects on food security. A decrease in local, seasonal produce may reduce the availability of products and increase cost, as food must travel greater distances, incurring extra costs at the consumer level. Currently, few Australian studies exist examining the change in periurban agriculture over time. Such information may prove useful for future health policy and interventions as well as infrastructure planning. The aim of this study is to investigate changes in periurban agriculture among the capital cities of Australia. Methods: We compared data pertaining to selected commodities from the Australian Bureau of Statistics 2000-01 and 2005-06 Agricultural Censuses. This survey is distributed online or via mail on a five-yearly basis to approximately 175,000 agricultural businesses to ascertain information on a range of factors, such as types of crops, livestock and land preparation practices. For the purpose of this study we compared the land used for total crops, and for cereal, oilseed, legume, fruit and vegetable crops separately. Data were analysed using repeated-measures ANOVA in SPSS. Results: Overall, the total area available for crops in urbanised areas of Australia increased slightly, by 1.8%. However, Sydney, Melbourne, Adelaide and Perth experienced decreases in the area available for fruit crops by 11%, 5% and 4% respectively. Furthermore, Brisbane and Perth experienced decreases in land available for vegetable crops by 28% and 14% respectively. Finally, Sydney, Adelaide and Perth experienced decreases in land available for cereal crops by 10-79%. Conclusions: These findings suggest that population increases and consequent urban sprawl may be producing a decrease in periurban agriculture, specifically for several core food groups including fruit and breads and grain-based foods. In doing so, access to or availability of these foods may be limited and their cost is likely to increase, which may compromise food security for certain sub-groups of the population.


Background: Studies on the relationship between performance and the design of the throwing frame have been limited, and Part I provided only a description of whole-body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in the F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Feet positioning was characterised by three-dimensional data on front and back foot position, as well as by spacing and orientation, corresponding to the distance between and the angle made by the two feet, respectively. Results: Only 4 of 30 feet positioning characteristics presented a correlation coefficient greater than 0.5: the feet spacing on the mediolateral and anteroposterior axes in the F34 class, and the back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provides key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
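
The screening rule in the Results — retaining only characteristics whose correlation with performance exceeds 0.5 — can be illustrated in a few lines. The data, variable names and effect sizes below are synthetic assumptions, not the championship data.

```python
# Sketch of the screening rule: keep feet-positioning characteristics
# whose correlation with throw distance exceeds |0.5|. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 48  # number of attempts, as in the study
distance = rng.normal(25.0, 3.0, n)  # throw distance (m), synthetic

characteristics = {
    # constructed to co-vary with distance:
    "feet_spacing_ML": 0.02 * distance + rng.normal(0.0, 0.05, n),
    # constructed to be unrelated to distance:
    "back_foot_AP": rng.normal(0.4, 0.1, n),
}

corr = {name: float(np.corrcoef(values, distance)[0, 1])
        for name, values in characteristics.items()}
strong = [name for name, r in corr.items() if abs(r) > 0.5]
print(strong)
```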


Purpose: To assess the effects of pre-cooling volume on neuromuscular function and performance in free-paced intermittent-sprint exercise in the heat. Methods: Ten male team-sport athletes completed four randomized trials involving an 85-min free-paced intermittent-sprint exercise protocol in 33°C and 33% relative humidity. Pre-cooling conditions included whole body (WB), head + hand (HH), head (H) and no cooling (CONT), applied for 20 min pre-exercise and 5 min mid-exercise. Maximal voluntary contractions (MVC) were assessed pre- and post-intervention and mid- and post-exercise. Exercise performance was assessed with sprint times, % decline and distances covered during free-paced bouts. Core (Tc) and skin (Tsk) temperatures, heart rate, perceived exertion and thermal stress were monitored throughout. Venous and capillary blood was analyzed for metabolite, muscle damage and inflammatory markers. Results: WB pre-cooling facilitated the maintenance of sprint times during the exercise protocol, with a reduced % decline (P=0.04). Mean and total hard running distances increased with pre-cooling by 12% compared to CONT (P<0.05); specifically, WB was 6-7% greater than HH (P=0.02) and H (P=0.001) respectively. No change was evident in mean voluntary or evoked force from pre- to post-exercise with WB and HH cooling (P>0.05). WB and HH cooling reduced Tc by 0.1-0.3°C compared to the other conditions (P<0.05). With WB cooling, Tsk was suppressed for the entire session (P=0.001). Heart rate responses following WB cooling were reduced (P=0.05; d=1.07) compared to CONT during exercise. Conclusion: A relationship between pre-cooling volume and exercise performance seems apparent, as larger surface-area coverage augmented subsequent free-paced exercise capacity in conjunction with greater suppression of physiological load. Maintenance of MVC with pre-cooling, despite increased work output, suggests a role for centrally mediated mechanisms in exercise pacing regulation and subsequent performance.


Objectives: The current study investigated the change in neuromuscular contractile properties following competitive rugby league matches and the relationship with physical match demands. Design: Eleven trained, male rugby league players participated in 2-3 amateur competitive matches (n = 30). Methods: Prior to, immediately (within 15 min) and 2 h post-match, players performed repeated counter-movement jumps (CMJ) followed by isometric tests on the right knee extensors for maximal voluntary contraction (MVC), voluntary activation (VA) and evoked twitch contractile properties of peak twitch force (Pt), rate of torque development (RTD), contraction duration (CD) and relaxation rate (RR). During each match, players wore 1 Hz Global Positioning Satellite devices to record match distances and speeds. Further, matches were filmed and underwent notational analysis for the number of total body collisions. Results: Total, high-intensity and very-high-intensity distances covered and mean speed were 5585 ± 1078 m, 661 ± 265 m, 216 ± 121 m and 75 ± 14 m min−1, respectively. MVC was significantly reduced immediately and 2 h post-match by 8 ± 11 and 12 ± 13% from pre-match (p < 0.05). Moreover, twitch contractile properties indicated a suppression of Pt, RTD and RR immediately post-match (p < 0.05). However, VA was not significantly altered from pre-match (90 ± 9%) to immediately post (89 ± 9%) or 2 h post (89 ± 8%) (p > 0.05). Correlation analyses indicated that total playing time (r = −0.50) and mean speed (r = −0.40) were moderately associated with the change in post-match MVC, while mean speed (r = 0.35) was moderately associated with VA. Conclusions: The present study highlights that the physical demands of competitive amateur rugby league result in disruption of peripheral contractile function, and that post-match voluntary torque suppression may be associated with match playing time and mean speed.
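
The effect sizes (d) reported in this and the surrounding studies are Cohen's d, computed from two group means and a pooled standard deviation. A minimal sketch, assuming hypothetical pre- and post-match MVC values:

```python
# Sketch of Cohen's d with a pooled standard deviation, the effect-size
# statistic reported in these studies. The MVC values are hypothetical.
import math

def cohens_d(a, b):
    """Cohen's d for two independent samples using the pooled SD."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

pre_mvc  = [520, 540, 560, 530, 550]  # pre-match knee-extensor MVC (N), assumed
post_mvc = [470, 500, 510, 480, 495]  # post-match MVC (N), assumed
d = cohens_d(pre_mvc, post_mvc)
print(f"Cohen's d = {d:.2f}")
```

By common convention, d ≈ 0.2 is a small effect, 0.5 medium and 0.8 or more large, which is why values above 0.8 in these abstracts are described as "large effects".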


This study examined the physiological and performance effects of pre-cooling on medium-fast bowling in the heat. Ten medium-fast bowlers completed two randomised trials involving either cooling (mixed-methods) or control (no cooling) interventions before a 6-over bowling spell in 31.9±2.1°C and 63.5±9.3% relative humidity. Measures included bowling performance (ball speed, accuracy and run-up speeds), physical characteristics (global positioning system monitoring and counter-movement jump height), physiological variables (heart rate, core temperature, skin temperature and sweat loss), biochemical variables (serum markers of damage, stress and inflammation) and perceptual variables (perceived exertion and thermal sensation). Mean ball speed (114.5±7.1 vs. 114.1±7.2 km · h−1; P = 0.63; d = 0.09), accuracy (43.1±10.6 vs. 44.2±12.5 AU; P = 0.76; d = 0.14) and total run-up speed (19.1±4.1 vs. 19.3±3.8 km · h−1; P = 0.66; d = 0.06) did not differ between pre-cooling and control respectively; however, 20-m sprint speed between overs was 5.9±7.3% greater at Over 4 after pre-cooling (P = 0.03; d = 0.75). Pre-cooling reduced skin temperature after the intervention period (P = 0.006; d = 2.28), core temperature and pre-over heart rates throughout (P = 0.01-0.04; d = 0.96-1.74), and sweat loss by 0.4±0.3 kg (P = 0.01; d = 0.34). Mean rating of perceived exertion and thermal sensation were lower during pre-cooling trials (P = 0.004-0.03; d = 0.77-3.13). Despite no observed improvement in bowling performance, pre-cooling maintained between-over sprint speeds and blunted physiological and perceptual responses, easing the thermoregulatory demands of medium-fast bowling in hot conditions.


This study investigated the effects of alcohol ingestion on lower-body strength and power, and on physiological and cognitive recovery, following competitive Rugby League matches. Nine male rugby players participated in two matches, each followed by one of two randomized interventions: a control or an alcohol ingestion session. Four hours post-match, participants consumed either beverages containing a total of 1 g of ethanol per kg bodyweight (vodka and orange juice; ALC) or a calorie- and taste-matched non-alcoholic beverage (orange juice; CONT). Pre-, post-, 2 h post- and 16 h post-match measures of countermovement jump (CMJ), maximal voluntary contraction (MVC), voluntary activation (VA), the damage and stress markers creatine kinase (CK), C-reactive protein (CRP), cortisol and testosterone analysed from venous blood, and cognitive function (modified Stroop test) were determined. Alcohol resulted in large effects for decreased CMJ height (−2.35 ± 8.14 and −10.53 ± 8.36% decrement for CONT and ALC respectively; P=0.15, d=1.40), without changes in MVC (P=0.52, d=0.70) or VA (P=0.15, d=0.69). Furthermore, alcohol resulted in a significant slowing of total time in a cognitive test (P=0.04, d=1.59), whilst exhibiting large effects for detriments in congruent reaction time (P=0.19, d=1.73). Despite large effects for increased cortisol during recovery following alcohol ingestion (P=0.28, d=1.44), post-match alcohol consumption did not unduly affect testosterone (P=0.96, d=0.10), CK (P=0.66, d=0.70) or CRP (P=0.75, d=0.60). It appears that alcohol consumption during the evening following competitive matches may have detrimental effects on peak power and cognitive recovery the next morning. Accordingly, practitioners should be aware of the potential detrimental effects of alcohol consumption on recovery and provide alcohol awareness to athletes at post-match functions.


This investigation examined the physiological and performance effects of cooling on the recovery of medium-fast bowlers in the heat. Eight medium-fast bowlers completed two randomised trials, each involving two sessions completed on consecutive days (Session 1: 10 overs; Session 2: 4 overs) in 31 ± 3°C and 55 ± 17% relative humidity. Recovery interventions (mixed-method cooling vs. control) were administered for 20 min after Session 1. Measures included bowling performance (ball speed, accuracy, run-up speeds), physical demands (global positioning system, counter-movement jump), physiological variables (heart rate, core temperature, skin temperature, sweat loss), biochemical variables (creatine kinase, C-reactive protein) and perceptual variables (perceived exertion, thermal sensation, muscle soreness). Mean ball speed was higher after cooling in Session 2 (118.9 ± 8.1 vs. 115.5 ± 8.6 km · h−1; P = 0.001; d = 0.67), reducing the decline in ball speed between sessions (0.24 vs. −3.18 km · h−1; P = 0.03; d = 1.80). Large effects indicated higher accuracy in Session 2 after cooling (46.0 ± 11.2 vs. 39.4 ± 8.6 arbitrary units [AU]; P = 0.13; d = 0.93) without affecting total run-up speed (19.0 ± 3.1 vs. 19.0 ± 2.5 km · h−1; P = 0.97; d = 0.01). Cooling reduced core temperature, skin temperature and thermal sensation throughout the intervention (P = 0.001-0.05; d = 1.31-5.78) and attenuated creatine kinase (P = 0.04; d = 0.56) and muscle soreness at 24 h (P = 0.03; d = 2.05). Accordingly, mixed-method cooling can reduce thermal strain after a 10-over spell and improve markers of muscular damage and discomfort while maintaining medium-fast bowling performance on consecutive days in hot conditions.


The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on a clay court. Sessions included both physical conditioning and technical elements as led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning satellite device to measure distance covered during each session, whilst heart rate, countermovement jump distance and capillary blood measures of metabolites were measured before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were recorded following each session. Total duration and distance covered during each session were comparable (P>0.05; d<0.20). While forehand and backhand stroke volumes did not differ between sessions (P>0.05; d<0.30), large effects for increased unforced and forced errors were present on the hard court (P>0.05; d>0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P>0.05; d>0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P>0.05; d>0.90). In conclusion, training on clay courts results in trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.


In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well documented. Other risk factors, including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety, have also been suggested to contribute to poor nutritional status. In Parkinson's disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition. However, the risks for malnutrition in this population are not well understood. The current study's aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data on age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke's Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with an average age of 70.2±9.3 (range 35-92) years and an average time since diagnosis of 7.3±5.9 (range 0-31) years. Average body mass index (BMI) was 26.0±5.5 kg/m². Of these, 15% (n=19) were malnourished (SGA-B). Multivariate logistic regression analysis revealed that older age (OR=1.16, CI=1.02-1.31), more depressive symptoms (OR=1.26, CI=1.07-1.48), lower levels of anxiety (OR=0.90, CI=0.82-0.99) and higher LDED per kg body weight (OR=1.57, CI=1.14-2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes, and proactively addressing its risk factors can help prevent declines in nutritional status. In the current study, people with PD who were older, had more depressive symptoms or took greater amounts of levodopa per kg body weight were at increased malnutrition risk.
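
The multivariate logistic regression step — fitting a binary malnutrition outcome on several predictors and exponentiating the coefficients to obtain odds ratios — can be sketched as follows. The data are synthetic: the variable names mirror the study, but the generating coefficients are assumptions built into the simulation, not the study's estimates.

```python
# Sketch of a multivariate logistic regression yielding odds ratios for
# malnutrition risk factors. Synthetic data; predictor names mirror the
# study, but the underlying coefficients are assumed for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 125
age = rng.normal(70, 9, n)      # years
bdi = rng.normal(10, 5, n)      # depressive symptom score
lded_kg = rng.normal(8, 3, n)   # levodopa equivalent dose per kg (mg/kg)

# Outcome generated so that age, BDI and LDED/kg each raise risk.
logit = -14 + 0.12 * age + 0.2 * bdi + 0.3 * lded_kg
malnourished = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, bdi, lded_kg])
model = LogisticRegression(C=1e6, max_iter=1000).fit(X, malnourished)  # ~unpenalized
odds_ratios = np.exp(model.coef_[0])
for name, or_ in zip(["age", "BDI", "LDED/kg"], odds_ratios):
    print(f"OR per unit increase in {name}: {or_:.2f}")
```

An odds ratio above 1 (e.g. OR=1.26 per BDI point in the study) means each unit increase in that predictor multiplies the odds of malnutrition by that factor, holding the other predictors constant.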


The aim of this study was to identify which outcome measures or quality indicators are being used to evaluate advanced and new roles in nine allied health professions, and whether these measures evaluate outcomes of interest to the patient, the clinician or the healthcare provider. A systematic search strategy was used: medical and allied health databases were searched, relevant articles were extracted, and studies reporting at least 1 outcome measure were evaluated. A total of 106 articles were identified that described advanced roles; however, only 23 of these described an outcome measure in sufficient detail to be included for review. The majority of the reported measures fit into the economic and process categories. The most frequently reported patient-related outcome was satisfaction, assessed by surveys, while measures of patient health outcomes were infrequently reported. It is unclear from the studies evaluated whether new models of allied healthcare can be shown to be as safe and effective as traditional care for a given procedure. Outcome measures chosen to evaluate these services often reflect organizational need rather than patient outcomes. Organizations need to ensure that high-quality performance measures are chosen to evaluate the success of new health service innovations, and there needs to be a move away from in-house-style surveys that add little or no valid evidence as to the effect of a new innovation. More importance needs to be placed on patient outcomes as a measure of the quality of allied health interventions.