552 results for Patient-generated outcome measures


Relevance: 100.00%

Abstract:

Objectives: People with Parkinson’s disease (PD) are at higher risk of malnutrition due to PD symptoms and the side effects of pharmacotherapy. When pharmacotherapy is no longer effective for symptom control, deep brain stimulation (DBS) surgery may be considered. The aim of this study was to assess the nutritional status of people with PD who may be at higher risk of malnutrition because of unsatisfactory symptom management with optimised medical therapy. Design: This was an observational study using a convenience sample. Setting: Participants were seen during their hospital admission for DBS surgery. Participants: People with PD scheduled for DBS surgery were recruited from a Brisbane neurological clinic (n=15). Measurements: The Patient-Generated Subjective Global Assessment (PG-SGA), weight, height and body composition were assessed to determine nutritional status. Results: Six participants (40%) were classified as moderately malnourished (SGA-B). Eight participants (53%) reported previous unintentional weight loss (average loss of 13.3%). On average, participants classified as well-nourished (SGA-A) were younger and had shorter disease durations, lower PG-SGA scores, and higher body mass index (BMI) and fat-free mass index (FFMI) than malnourished participants (SGA-B). Five participants had previously received dietetic advice, but only one in relation to unintentional weight loss. Conclusion: Malnutrition remains unrecognised and untreated in this group despite unintentional weight loss and the presence of nutrition impact symptoms. Improving nutritional status prior to surgery may improve surgical outcomes.
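The anthropometric indices used above (BMI, FFMI, percentage weight loss) follow standard formulas; a minimal sketch in Python, with illustrative values rather than study data:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight divided by height squared (kg/m^2)."""
    return weight_kg / height_m ** 2

def ffmi(fat_free_mass_kg: float, height_m: float) -> float:
    """Fat-free mass index: fat-free mass divided by height squared (kg/m^2)."""
    return fat_free_mass_kg / height_m ** 2

def percent_weight_loss(usual_kg: float, current_kg: float) -> float:
    """Unintentional weight loss as a percentage of usual body weight."""
    return (usual_kg - current_kg) / usual_kg * 100

# Illustrative values only, not participant data
print(round(bmi(70, 1.75), 1))                  # 22.9
print(round(ffmi(55, 1.75), 1))                 # 18.0
print(round(percent_weight_loss(80, 69.36), 1)) # 13.3
```

FFMI is reported alongside BMI because it separates changes in fat-free mass from changes in total body weight.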

Relevance: 100.00%

Abstract:

Objective In Parkinson's disease (PD), risk factors commonly reported for malnutrition in other populations frequently occur, yet few studies have explored which of these factors are of particular importance in PD. The aim was to identify the determinants of nutritional status in people with Parkinson's disease (PWP). Methods Community-dwelling PWP (>18 years) were recruited (n = 125; 73M/52F; Mdn 70 years). Self-report assessments included the Beck Depression Inventory (BDI), Spielberger Trait Anxiety Inventory (STAI), Scales for Outcomes in Parkinson's disease – Autonomic (SCOPA-AUT), Modified Constipation Assessment Scale (MCAS) and Freezing of Gait Questionnaire (FOG-Q). Information about age, PD duration, medications, co-morbid conditions and living situation was obtained. The Addenbrooke's Cognitive Examination (ACE-R) and Unified Parkinson's Disease Rating Scale (UPDRS) parts II and III were performed. Nutritional status was assessed using the Subjective Global Assessment (SGA) as part of the scored Patient-Generated Subjective Global Assessment (PG-SGA). Results Nineteen participants (15%) were malnourished (SGA-B). The median PG-SGA score was 3. More of the malnourished were elderly (84% vs. 71%) and more had severe disease (H&Y: 21% vs. 5%). UPDRS II and UPDRS III scores and levodopa equivalent daily dose (LEDD)/body weight (mg/kg) were significantly higher in the malnourished (Mdn 18 vs. 15; 20 vs. 15; 10.1 vs. 7.6, respectively). Regression analyses revealed older age at diagnosis, higher LEDD/body weight (mg/kg), greater UPDRS III score, lower STAI score and higher BDI score as significant predictors of malnutrition (SGA-B). Living alone and higher BDI and UPDRS III scores were significant predictors of a higher log-adjusted PG-SGA score. Conclusions In this sample of PWP, the rate of malnutrition was higher than that previously reported in the general community. Nutrition screening should occur regularly in those with more severe disease and depression.
Community support should be provided to PWP living alone. Dopaminergic medication should be reviewed with body weight changes.

Relevance: 100.00%

Abstract:

Background: Malnutrition before and during chemotherapy is associated with poor treatment outcomes. The risk of cancer-related malnutrition is exacerbated by common nutrition impact symptoms during chemotherapy, such as nausea, diarrhoea and mucositis. Aim of presentation: To describe the prevalence of malnutrition/malnutrition risk in two samples of patients treated in a quaternary-level chemotherapy unit. Research design: Cross-sectional survey. Sample 1: Patients ≥65 years prior to chemotherapy treatment (n=175). Instrument: Nurse-administered Malnutrition Screening Tool to screen for malnutrition risk and body mass index (BMI). Sample 2: Patients ≥18 years receiving chemotherapy (n=121). Instrument: Dietitian-administered Patient-Generated Subjective Global Assessment to assess malnutrition, malnutrition risk and BMI. Findings, Sample 1: 93/175 (53%) of older patients were at risk of malnutrition prior to chemotherapy; 27 (15%) were underweight (BMI <21.9) and 84 (48%) were overweight (BMI >27). Findings, Sample 2: 31/121 patients (26%) were malnourished; 12 (10%) had intake-limiting nausea or vomiting; 22 (20%) reported significant weight loss; and 20 (18%) required improved nutritional symptom management during treatment. Thirteen participants with malnutrition/nutrition impact symptoms (35%) had no dietitian contact; the majority of these participants were overweight. Implications for nursing: Patients with, or at risk of, malnutrition before and during chemotherapy can be overlooked, particularly if they are overweight. Older patients seem particularly at risk. Nurses can easily and quickly identify risk with regular use of the Malnutrition Screening Tool and refer patients to expert dietetic support, to ensure optimal treatment outcomes.

Relevance: 100.00%

Abstract:

Background Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study’s aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods A single-site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with three hundred outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria: ≥18 years, receiving anticancer treatment, able to provide written consent. Patients completed the Malnutrition Screening Tool (MST). Nutritional status was assessed using the PG-SGA. Data for the automated screening system, including body mass index (BMI) and weight records dating back up to six months, were extracted from the pharmacy software program Charm. Results The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system, predicted malnutrition with 56.52% sensitivity, 35.43% specificity, a 13.68% positive predictive value and an 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions Both the automated screening system and the MST fell short of the accepted professional standards for sensitivity (80%) and specificity (60%) when compared to the PG-SGA. Although the MST remains the better predictor of malnutrition in this setting, uptake of this tool in the Oncology Day Care Unit remains challenging.
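The accuracy statistics quoted above come from a standard 2×2 contingency table against the PG-SGA reference. A sketch, with cell counts back-calculated so that they reproduce the MST figures reported above (the exact table is an assumption):

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Screening-tool performance against a reference standard (here, PG-SGA)."""
    return {
        "sensitivity": tp / (tp + fn),  # malnourished correctly flagged
        "specificity": tn / (tn + fp),  # well-nourished correctly passed
        "ppv": tp / (tp + fp),          # flagged patients truly malnourished
        "npv": tn / (tn + fn),          # passed patients truly well-nourished
    }

# Counts chosen to match the reported MST results (n=300, prevalence 17%)
stats = diagnostic_accuracy(tp=36, fp=76, fn=15, tn=173)
print({k: round(v * 100, 2) for k, v in stats.items()})
# sensitivity 70.59, specificity 69.48, ppv 32.14, npv 92.02
```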

Relevance: 100.00%

Abstract:

Background & Aims Nutrition screening and assessment enable early identification of malnourished people and those at risk of malnutrition. Appropriate assessment tools assist with informing and monitoring nutrition interventions. Tool choice needs to be appropriate to the population and setting. Methods Community-dwelling people with Parkinson’s disease (>18 years) were recruited. Body mass index (BMI) was calculated from weight and height. Participants were classified as underweight according to World Health Organisation (WHO) (≤18.5 kg/m2) and age-specific (<65 years, ≤18.5 kg/m2; ≥65 years, ≤23.5 kg/m2) cut-offs. The Mini-Nutritional Assessment (MNA) screening (MNA-SF) and total assessment scores were calculated. The Patient-Generated Subjective Global Assessment (PG-SGA), including the Subjective Global Assessment (SGA), was performed. The sensitivity, specificity, positive predictive value, negative predictive value and weighted kappa statistic of each of the above compared to the SGA were determined. Results The median age of the 125 participants was 70.0 (35-92) years. Age-specific BMI (Sn 68.4%, Sp 84.0%) performed better than WHO (Sn 15.8%, Sp 99.1%) categories. The MNA-SF performed better (Sn 94.7%, Sp 78.3%) than both BMI categorisations for screening purposes. The MNA had higher specificity but lower sensitivity than the PG-SGA (MNA Sn 84.2%, Sp 87.7%; PG-SGA Sn 100.0%, Sp 69.8%). Conclusions BMI lacks sensitivity to identify malnourished people with Parkinson’s disease and should be used with caution. The MNA-SF may be a better screening tool in people with Parkinson’s disease. The PG-SGA performed well and may assist with informing and monitoring nutrition interventions. Further research should be conducted to validate screening and assessment tools in Parkinson’s disease.
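The WHO versus age-specific underweight cut-offs described in the Methods can be expressed as a simple classifier; a minimal sketch (the function name is illustrative):

```python
def underweight(bmi: float, age_years: int, age_specific: bool = True) -> bool:
    """Classify underweight using the cut-offs described above:
    WHO: <= 18.5 kg/m^2 for all adults;
    age-specific: <= 18.5 under 65 years, <= 23.5 at 65 years and over."""
    if age_specific and age_years >= 65:
        return bmi <= 23.5
    return bmi <= 18.5

print(underweight(22.0, 70))                      # True under the age-specific cut-off
print(underweight(22.0, 70, age_specific=False))  # False under the WHO cut-off
```

The higher cut-off for older adults is what gives the age-specific categorisation its better sensitivity at some cost in specificity, as the results above illustrate.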

Relevance: 100.00%

Abstract:

Purpose Paper-based nutrition screening tools can be challenging to implement in the ambulatory oncology setting. The aim of this study was to determine the validity of the Malnutrition Screening Tool (MST) and a novel, automated nutrition screening system compared to a ‘gold standard’ full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA). Methods An observational, cross-sectional study was conducted in an outpatient oncology day treatment unit (ODTU) within an Australian tertiary health service. Eligibility criteria were as follows: ≥18 years, receiving outpatient anticancer treatment and English literate. Patients self-administered the MST. A dietitian assessed nutritional status using the PG-SGA, blinded to the MST score. Automated screening system data were extracted from an electronic oncology prescribing system. This system used weight loss over the 3 to 6 weeks prior to the most recent weight record, or age-categorised body mass index (BMI), to identify nutritional risk. Sensitivity and specificity against the PG-SGA (malnutrition) were calculated using contingency tables and receiver operating characteristic (ROC) curves. Results There were a total of 300 oncology outpatients (51.7% male, 58.6 ± 13.3 years). The area under the curve (AUC) for weight loss alone was 0.69, with a cut-off value of ≥1% weight loss yielding 63% sensitivity and 76.7% specificity. The MST (score ≥2) resulted in 70.6% sensitivity and 69.5% specificity, AUC 0.77. Conclusions Both the MST and the automated method fell short of the accepted professional standard for sensitivity (~≥80%) derived from the PG-SGA. Further investigation into other automated nutrition screening options and the most appropriate parameters available electronically is warranted to support targeted service provision.
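The AUC reported above summarises discrimination across all possible cut-offs. A minimal sketch of a rank-based AUC, which is equivalent to the area under the ROC curve, using invented illustrative data rather than the study's:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via rank comparison: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative data only: percentage weight loss vs. malnutrition (1 = malnourished)
loss = [0.0, 0.5, 2.5, 2.0, 3.5, 0.2, 4.0, 0.1]
maln = [0,   0,   0,   1,   1,   0,   1,   0]
print(round(roc_auc(loss, maln), 2))  # 0.93
```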

Relevance: 100.00%

Abstract:

Background Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items, because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer’s core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS of a proposed new cancer-specific MAUI.
Our findings suggest that CFA should be recommended generally when deriving a preference-based measure from a HRQOL measure that has an established domain structure.

Relevance: 100.00%

Abstract:

Background: Quality of life is poorer in Parkinson’s disease than in other conditions and than in the general population without Parkinson’s disease. Malnutrition also results in poorer quality of life. This study aimed to determine the relationship between quality of life and nutritional status. Methods: Community-dwelling people with Parkinson’s disease >18 years old were recruited. The Patient-Generated Subjective Global Assessment (PG-SGA) assessed nutritional status. The Parkinson’s Disease Questionnaire 39 (PDQ-39) measured quality of life. Phase I was cross-sectional. Participants who were malnourished in Phase I were eligible for a nutrition intervention phase and randomised into two groups: standard care (SC), with provision of nutrition education materials only, and intervention (INT), with individualised dietetic advice and regular weekly follow-up. Data were collected at baseline, 6 weeks, and 12 weeks. Results: Phase I consisted of 120 people who completed the PDQ-39. Phase II consisted of 9 people in the SC group and 10 in the INT group. In Phase I, quality of life was poorer in the malnourished, particularly for the mobility and activities of daily living domains. There was a significant correlation between PG-SGA and PDQ-39 scores (Phase I, rs = 0.445, p < .001; Phase II, rs = .426, p = .002). In Phase II, no significant difference in the PDQ-39 total or sub-scores was observed between the INT and SC groups; however, there was a significant improvement in the emotional well-being domain for the entire group, χ2(2) = 8.84, p = .012. Conclusions: Malnourished people with Parkinson’s disease had poorer quality of life than the well-nourished, and improvements in nutritional status resulted in quality of life improvements. Attention to nutritional status is therefore an important component of quality of life and of the total care of people with Parkinson’s disease.
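The rs values above are Spearman rank correlations: the Pearson correlation computed on ranks, with tied values receiving their average rank. A self-contained sketch with invented illustrative scores (not the study data):

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    assigning average ranks to ties."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                  # extend over a run of tied values
            avg = (i + j) / 2 + 1       # average rank for the tied run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative only: higher PG-SGA (worse nutrition) with higher PDQ-39 (worse QoL)
pg_sga = [2, 3, 5, 8, 10, 4]
pdq39 = [10, 14, 20, 30, 28, 12]
print(round(spearman(pg_sga, pdq39), 2))  # 0.89
```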

Relevance: 100.00%

Abstract:

Background Patient-relevant outcome measures are essential for high-quality clinical research, and quality-of-life (QoL) tools are the current standard. Currently, there is no validated children's acute cough-specific QoL questionnaire. Objective The objective of this study was to develop and validate the Parent-proxy Children's Acute Cough-specific QoL Questionnaire (PAC-QoL). Methods Using focus groups, a 48-item PAC-QoL questionnaire was developed and later reduced to 16 items by using the clinical impact method. Parents of children with a current acute cough (<2 weeks) at enrollment completed 2 validated cough score measures, the preliminary 48-item PAC-QoL, and 3 other questionnaires (the State Trait Anxiety Inventory [STAI], the Short-Form 8-item 24-hour recall Health Survey [SF-8], and the Depression, Anxiety, and Stress 21-item Scale [DASS21]). All measures were repeated on days 3 and 14. Results The median age of the 155 children enrolled was 2.3 years (interquartile range, 1.3-4.6). Median cough duration at enrollment was 3 days (interquartile range, 2-5). The reduced 16-item scale had high internal consistency (Cronbach α = 0.95). Evidence for repeatability and criterion validity was shown by significant correlations between the domains and total PAC-QoL scores and the SF-8 (r = −0.36 and −0.51), STAI (r = −0.27 and −0.39), and DASS21 (r = −0.32 and −0.41) scales on days 0 and 3, respectively. The final PAC-QoL questionnaire was sensitive to change over time, with changes significantly relating to changes in cough score measures (P < .001). Conclusion The 16-item PAC-QoL is a reliable and valid outcome measure that assesses QoL related to childhood acute cough at a given time point and reflects changes in acute cough-specific QoL over time.
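The internal-consistency figure above is Cronbach's alpha, computed from the individual item variances and the variance of the total score. A sketch with a toy 3-item example (not PAC-QoL data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` is a list of
    columns, one per questionnaire item, each a list of respondent scores."""
    def var(v):  # population variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    k = len(items)
    # Total score per respondent, summed across items
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Toy 3-item, 4-respondent example only
items = [[1, 2, 3, 4], [1, 2, 3, 3], [2, 2, 4, 4]]
print(round(cronbach_alpha(items), 2))  # 0.96
```

Values near 1 mean the items co-vary strongly, i.e. they appear to measure the same underlying construct.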

Relevance: 100.00%

Abstract:

Various policies, plans, and initiatives have been implemented to provide safe, quality, and culturally competent care to patients within Queensland’s healthcare system. Several models of maternity care are available in Queensland, ranging from standard public care to private midwifery care. The current study aimed to determine whether identifying as Culturally or Linguistically Diverse (CALD) was associated with the perceived safety, quality, and cultural competency of maternity care from a consumer perspective, and to identify specific needs and preferences of CALD maternity care consumers. Secondary analysis of data collected in the Having a Baby in Queensland Survey 2012 was used to compare the experiences of 655 CALD women to those of 4049 non-CALD women in Queensland, Australia, across three stages of maternity care: pregnancy, labour and birth, and after birth. After adjustment for the model of maternity care received and socio-demographic characteristics, CALD women were significantly more likely than non-CALD women to experience suboptimal staff technical competence in pregnancy, suboptimal overall perceived safety in pregnancy and labour/birth, and suboptimal interpersonal sensitivity in pregnancy and labour/birth. Approximately 50% of CALD women did not have a choice about using a translator or interpreter, or about the gender of their care provider, during labour and birth. Thirteen themes of preferences and needs of CALD maternity care consumers based on ethnicity, cultural beliefs, or traditions were identified; however, these were rarely met. The findings imply that CALD women in Queensland experience disadvantageous maternity care with regard to perceived staff technical competence, safety, and interpersonal sensitivity, and receive care that lacks cultural competence. Improved access to support persons, continuity and choice of carer, and staff availability and training is recommended.

Relevance: 40.00%

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to this data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers, calculation of moving averages, and data summarisation and data abstraction. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads).
Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with the misclassification rate and the area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy but markedly increase complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) MR being 9.8 and the raw time-series summary data (dataset A) being 9.92. However, for all time-series-only datasets, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-alone datasets, but models derived from these subsets are of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method. For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) being 8.85 and dataset RF_F (time-segmented time-series variables and RF) being 9.09.
The models based on counts of outliers and counts of data points outside the normal range (Dataset RF_E), and on derived variables based on time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (Dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR of 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant. The addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables when compared to the use of risk factors alone is similar to recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used as model input.
In the absence of risk factor input, the use of time-series variables after outlier removal and time series variables based on physiological variable values’ being outside the accepted normal range is associated with some improvement in model performance.
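The Kappa statistic and misclassification rate used for model evaluation above can both be computed from a confusion matrix; a sketch with an illustrative (invented) unbalanced matrix:

```python
def kappa(tp, fp, fn, tn):
    """Cohen's kappa: observed agreement between predicted and actual
    classes, corrected for the agreement expected by chance -- useful when
    the class distribution is unbalanced, as in this study."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    # Chance agreement from the marginal totals of the confusion matrix
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (observed - expected) / (1 - expected)

def misclassification_rate(tp, fp, fn, tn):
    """Proportion of cases assigned to the wrong class."""
    return (fp + fn) / (tp + fp + fn + tn)

# Illustrative confusion matrix only (unbalanced classes)
print(round(kappa(tp=30, fp=10, fn=20, tn=140), 2))              # 0.57
print(round(misclassification_rate(30, 10, 20, 140) * 100, 2))   # 15.0
```

A classifier that always predicts the majority class can achieve a low misclassification rate on unbalanced data yet has a kappa of zero, which is why the study reports both.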

Relevance: 40.00%

Abstract:

- Background Falls are the most frequent adverse events reported in hospitals. We examined the effectiveness of individualised falls-prevention education for patients, supported by training and feedback for staff, delivered as a ward-level programme. - Methods Eight rehabilitation units in general hospitals in Australia participated in this stepped-wedge, cluster-randomised study, undertaken during a 50-week period. Units were randomly assigned to intervention or control groups by use of computer-generated random allocation sequences. Patients admitted to the units during the study with a Mini-Mental State Examination (MMSE) score of more than 23/30 received individualised education, based on principles of health behaviour change, from a trained health professional, in addition to usual care. We provided information about patients' goals, feedback about the ward environment, and perceived barriers to engagement in falls-prevention strategies to staff, who were trained to support the uptake of strategies by patients. The coprimary outcome measures were the rate of falls per 1000 patient-days and the proportion of patients who were fallers. All analyses were by intention to treat. This trial is registered with the Australian New Zealand Clinical Trials Registry, number ACTRN12612000877886. - Findings Between Jan 13 and Dec 27, 2013, 3606 patients were admitted to the eight units (n=1983 control period; n=1623 intervention period). There were fewer falls (n=196, 7·80/1000 patient-days vs n=380, 13·78/1000 patient-days; adjusted rate ratio 0·60 [robust 95% CI 0·42–0·94], p=0·003), injurious falls (n=66, 2·63/1000 patient-days vs n=131, 4·75/1000 patient-days; 0·65 [robust 95% CI 0·42–0·88], p=0·006), and fallers (n=136 [8·38%] vs n=248 [12·51%]; adjusted odds ratio 0·55 [robust 95% CI 0·38–0·81], p=0·003) in the intervention group compared with the control group.
There was no significant difference in length of stay (intervention median 11 days [IQR 7–19], control 10 days [6–18]). - Interpretation Individualised patient education combined with staff training and feedback, added to usual care, reduced the rates of falls and injurious falls in older patients in hospital rehabilitation units.
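The fall rates above are expressed per 1000 patient-days. A sketch reproducing the reported rates; the patient-day denominators are back-calculated from the published rates, so they are approximate assumptions:

```python
def rate_per_1000(events: int, patient_days: int) -> float:
    """Event rate expressed per 1000 patient-days of exposure."""
    return events / patient_days * 1000

# Fall counts from the trial above; patient-days back-calculated (approximate)
intervention = rate_per_1000(196, 25128)   # ~7.80 falls per 1000 patient-days
control = rate_per_1000(380, 27576)        # ~13.78 falls per 1000 patient-days
print(round(intervention, 2), round(control, 2))
# Crude rate ratio; the trial reports an adjusted rate ratio of 0.60
print(round(intervention / control, 2))
```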

Relevance: 30.00%

Abstract:

OBJECTIVE Malnutrition is common among peritoneal dialysis (PD) patients. Reduced nutrient intake contributes to this. It has long been assumed that this reflects disturbed appetite. We set out to define the appetite profiles of a group of PD patients using a novel technique. DESIGN Prospective, cross-sectional comparison of PD patients versus controls. SETTING Teaching hospital dialysis unit. PATIENTS 39 PD patients and 42 healthy controls. INTERVENTION Visual analog ratings were recorded at hourly intervals to generate daily profiles for hunger and fullness. Summary statistics were generated to compare the groups. Food intake was measured using 3-day dietary records. MAIN OUTCOME MEASURES Hunger and fullness profiles. Derived hunger and fullness scores. RESULTS Controls demonstrated peaks of hunger before mealtimes, with fullness scores peaking after meals. The PD profiles had much reduced premeal hunger peaks. A postmeal reduction in hunger was evident, but the rest of the trace was flat. The PD fullness profile was also flatter than in the controls. Mean scores were similar despite the marked discrepancy in the profiles. The PD group had lower peak hunger and less diurnal variability in their hunger scores. They also demonstrated much less change in fullness rating around mealtimes, while the mean and peak fullness scores were little different. The reported nutrient intake was significantly lower for PD. CONCLUSION The data suggest that PD patients normalize their mean appetite perception at a lower level of nutrient intake than controls, suggesting that patient-reported appetite may be misleading in clinical practice. There is a loss of the usual daily variation for the PD group, which may contribute to their reduced food intake. The technique described here could be used to assess the impact of interventions upon the abnormal PD appetite profile.

Relevance: 30.00%

Abstract:

Background: Efforts to prevent the development of overweight and obesity have increasingly focused on the early years of the life course, as we recognise that both metabolic and behavioural patterns are often established within the first few years of life. Randomised controlled trials (RCTs) of interventions are even more powerful when, with forethought, they are synthesised into an individual patient data (IPD) prospective meta-analysis (PMA). An IPD PMA is a unique research design in which several trials are identified for inclusion in an analysis before any of the individual trial results become known, and the data are provided for each randomised patient. This methodology minimises the publication and selection bias often associated with a retrospective meta-analysis by allowing hypotheses, analysis methods and selection criteria to be specified a priori. Methods/Design: The Early Prevention of Obesity in CHildren (EPOCH) Collaboration was formed in 2009. The main objective of the EPOCH Collaboration is to determine whether early intervention for childhood obesity has an impact on body mass index (BMI) z-scores at age 18-24 months. Additional research questions will focus on whether early intervention has an impact on children’s dietary quality, TV viewing time, duration of breastfeeding and parenting styles. This protocol includes the hypotheses, inclusion criteria and outcome measures to be used in the IPD PMA. The sample size of the combined dataset at final outcome assessment (approximately 1800 infants) will allow greater precision when exploring differences in the effect of early intervention with respect to pre-specified participant- and intervention-level characteristics. Discussion: Finalisation of the data collection procedures and analysis plans will be complete by the end of 2010. Data collection and analysis will occur during 2011-2012, and results should be available by 2013. Trial registration number: ACTRN12610000789066

Relevance: 30.00%

Abstract:

Objective: To identify agreement levels between conventional longitudinal evaluation of change (post–pre) and patient-perceived change (post–then test) in health-related quality of life. Design: A prospective cohort investigation with two assessment points (baseline and six-month follow-up) was implemented. Setting: Community rehabilitation setting. Subjects: Frail older adults accessing community-based rehabilitation services. Intervention: Nil as part of this investigation. Main measures: Conventional longitudinal change in health-related quality of life was considered the difference between standard EQ-5D assessments completed at baseline and follow-up. To evaluate patient-perceived change, a ‘then test’ was also completed at the follow-up assessment. This required participants to report (from their current perspective) how they believe their health-related quality of life was at baseline (using the EQ-5D). Patient-perceived change was considered the difference between the ‘then test’ and standard follow-up EQ-5D assessments. Results: The mean (SD) age of participants was 78.8 (7.3) years. Of the 70 participants, 62 (89%) data sets were complete and included in the analysis. Agreement between conventional (post–pre) and patient-perceived (post–then test) change was low to moderate (EQ-5D utility intraclass correlation coefficient (ICC) = 0.41; EQ-5D visual analogue scale (VAS) ICC = 0.21). Neither approach inferred greater change than the other (utility P = 0.925, VAS P = 0.506). Mean (95% confidence interval (CI)) conventional change in EQ-5D utility and VAS was 0.140 (0.045, 0.236) and 8.8 (3.3, 14.3) respectively, while patient-perceived change was 0.147 (0.055, 0.238) and 6.4 (1.7, 11.1) respectively. Conclusions: Substantial disagreement exists between conventional longitudinal evaluation of change in health-related quality of life and patient-perceived change (as measured using a then test) within individuals.
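The two change scores compared above differ only in which baseline is subtracted from the follow-up score; a minimal sketch with illustrative EQ-5D utility values (not study data):

```python
def conventional_change(pre: float, post: float) -> float:
    """Conventional longitudinal change: follow-up minus baseline (post - pre)."""
    return post - pre

def perceived_change(then: float, post: float) -> float:
    """Patient-perceived change: follow-up minus the 'then test' score
    (baseline re-rated retrospectively at follow-up)."""
    return post - then

# Illustrative EQ-5D utility scores only
pre, post, then = 0.60, 0.74, 0.59
print(round(conventional_change(pre, post), 2))  # 0.14
print(round(perceived_change(then, post), 2))    # 0.15
```

When the retrospective 'then' rating differs from the original baseline (response shift), the two change scores diverge even though the follow-up score is identical, which is the disagreement the study quantifies.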