971 results for p-Groups
Abstract:
PURPOSE: We used gene microarray analysis to compare the global expression profile of genes involved in adaptation to training in skeletal muscle from chronically strength-trained (ST), endurance-trained (ET), and untrained control subjects (Con). METHODS: Resting skeletal muscle samples were obtained from the vastus lateralis of 20 subjects (Con n = 7, ET n = 7, ST n = 6; trained [TR] groups >8 yr specific training). Total RNA was extracted from tissue for two-color microarray analysis and quantitative (Q)-PCR. Trained subjects were characterized by performance measures of peak oxygen uptake (V̇O2peak) on a cycle ergometer and maximal concentric and eccentric leg strength on an isokinetic dynamometer. RESULTS: Two hundred and sixty-three genes were differentially expressed in trained subjects (ET + ST) compared with Con (P < 0.05), whereas 21 genes were different between ST and ET (P < 0.05). These results were validated by reverse transcriptase polymerase chain reaction for six differentially regulated genes (EIFSJ, LDHB, LMO4, MDH1, SLC16A7, and UTRN). Manual cluster analyses revealed significant regulation of genes involved in muscle structure and development in TR subjects compared with Con (P < 0.05), and expression correlated with measures of performance (P < 0.05). ET had increased, whereas ST had decreased, expression of gene clusters related to mitochondrial/oxidative capacity (P ≤ 0.05). These mitochondrial gene clusters correlated with V̇O2peak (P < 0.05). V̇O2peak also correlated with expression of gene clusters that regulate fat and carbohydrate oxidation (P < 0.05). CONCLUSION: We demonstrate that chronic training subtly coregulates numerous genes from important functional groups, which may be part of the long-term adaptive response to repeated training stimuli.
Abstract:
OBJECTIVE: To optimize the animal model of liver injury that best represents the pathological characteristics of the dampness-heat jaundice syndrome of traditional Chinese medicine. METHODS: Liver injury was induced in model rats by alpha-naphthylisothiocyanate (ANIT) and carbon tetrachloride (CCl(4)), respectively, and the effects of Yinchenhao Decoction (YCHD), a Chinese medical formula of proven clinical efficacy for treating the dampness-heat jaundice syndrome, on the two liver injury models were evaluated by analyzing the serum levels of alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), malondialdehyde (MDA), total bilirubin (T-BIL), superoxide dismutase (SOD) and glutathione peroxidase (GSH-PX), as well as the ratio of liver weight to body weight. The experimental data were analyzed by the principal component analytical method of pattern recognition. RESULTS: The ratio of liver weight to body weight was significantly elevated in the ANIT and CCl(4) groups compared with that in the normal control (P<0.01). The contents of ALT and T-BIL were significantly higher in the ANIT group than in the normal control (P<0.05, P<0.01), and the levels of AST, ALT and ALP were significantly elevated in the CCl(4) group relative to those in the normal control (P<0.01). In the YCHD group, the increase in AST, ALT and ALP levels was significantly reduced (P<0.05, P<0.01), but with no significant increase in serum T-BIL. In the CCl(4)-intoxicated group, the MDA content was significantly increased, and SOD and GSH-PX activities significantly decreased, compared with those in the normal control group (P<0.01). The increase in MDA induced by CCl(4) was significantly reduced by YCHD (P<0.05). CONCLUSION: YCHD showed significant effects in preventing the progression of liver injury induced by CCl(4), and the most suitable animal model for the dampness-heat jaundice syndrome may be the one induced by CCl(4).
Abstract:
An analytical evaluation of the higher ac harmonic components derived from large amplitude Fourier transformed voltammetry is provided for the reversible oxidation of ferrocenemethanol (FcMeOH) and the oxidation of uric acid by an EEC mechanism in a pH 7.4 phosphate buffer at a glassy carbon (GC) electrode. The small background current in the analytically optimal fifth harmonic is predominantly attributed to faradaic current associated with the presence of electroactive functional groups on the GC electrode surface, rather than to the capacitive current which dominates the background in the dc and first three ac harmonics. The detection limits for the dc and the first to fifth harmonic ac components are 1.9, 5.89, 2.1, 2.5, 0.8, and 0.5 µM for FcMeOH, respectively, using a sine wave modulation of 100 mV at 21.46 Hz and a dc sweep rate of 111.76 mV s−1. Analytical performance then progressively deteriorates in the sixth and higher harmonics. For the determination of uric acid, the capacitive background current was enhanced and the reproducibility lowered by the presence of surface-active uric acid, but the rapid overall 2e− rather than 1e− electron transfer process gives rise to a significantly enhanced fifth harmonic faradaic current, which enabled a detection limit of 0.3 µM to be achieved, similar to that reported using chemically modified electrodes. Resolution of overlapping voltammetric signals for a mixture of uric acid and dopamine is also achieved using the higher fourth or fifth harmonic components, under very low background current conditions. The use of the higher fourth and fifth harmonics, which exhibit highly favorable faradaic-to-background (noise) current ratios, should therefore be considered in analytical applications where the electron transfer rate is fast.
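The harmonic separation underlying this technique can be sketched numerically: Fourier transform the sampled total current, keep a narrow band around n times the modulation frequency, and inverse transform to recover that harmonic. This is a minimal illustration, assuming a digitally sampled current; apart from the 21.46 Hz modulation frequency quoted in the abstract, all signal parameters and amplitudes below are invented for the demonstration.

```python
import numpy as np

f_mod = 21.46          # sine modulation frequency, Hz (from the abstract)
fs = 2000.0            # sampling rate, Hz (assumed for this sketch)
t = np.arange(0, 4.0, 1.0 / fs)

# Toy current: fundamental plus progressively weaker harmonics, plus noise.
rng = np.random.default_rng(0)
i_total = sum((0.5 ** n) * np.sin(2 * np.pi * n * f_mod * t) for n in range(1, 7))
i_total += 0.01 * rng.standard_normal(t.size)

def extract_harmonic(signal, n, f0, fs, half_width=2.0):
    """Band-select the n-th harmonic (n*f0 +/- half_width Hz) in the
    frequency domain and return its time-domain component via inverse FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    mask = np.abs(freqs - n * f0) <= half_width
    return np.fft.irfft(spectrum * mask, n=signal.size)

# The recovered fifth-harmonic amplitude should approach 0.5**5 ~ 0.031.
fifth = extract_harmonic(i_total, 5, f_mod, fs)
print(np.max(np.abs(fifth)))
```

In practice the diagnostic value of the higher harmonics comes from exactly this separation: capacitive background concentrates in the dc and low harmonics, so the band around the fifth harmonic is nearly background-free.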
Abstract:
The charge transfer-mediated surface enhanced Raman scattering (SERS) of crystal violet (CV) molecules chemically conjugated between partially polarized silver nanoparticles and optically smooth gold and silver substrates has been studied under off-resonant conditions. Tyrosine molecules were used as a reducing agent to convert silver ions into silver nanoparticles, where oxidised tyrosine caps the silver nanoparticle surface with its semiquinone group. This binding through the quinone group facilitates charge transfer and results in partially oxidised silver. This establishes a chemical link between the silver nanoparticles and the CV molecules, where the positively charged central carbon of CV can bind to the terminal carboxylate anion of the oxidised tyrosine molecules. After drop-casting Ag nanoparticles bound with CV molecules, it was found that the free terminal amine groups tend to bind with the underlying substrates. Significantly, only those CV molecules that were chemically conjugated between the partially polarised silver nanoparticles and the underlying gold or silver substrates showed SERS under off-resonant conditions. The importance of partial charge transfer at the nanoparticle/capping agent interface, and of the resultant conjugation of CV molecules, to off-resonant SERS was confirmed by using gold nanoparticles prepared in a similar manner. In this case the capping agent binds to the nanoparticle through the amine group, which does not facilitate charge transfer from the gold nanoparticle, and under these conditions SERS enhancement in the sandwich configuration was not observed.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom, and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed specifically to address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital: (1) AIN-only: an additional assistant-in-nursing (AIN) with a dedicated nutrition role; (2) PM-only: a multidisciplinary approach to meals, including Protected Mealtimes; and (3) PM+AIN: a combined intervention of the AIN plus the multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention with a pre-intervention group.
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
Abstract:
Background & aims The confounding effect of disease on the outcomes of malnutrition using diagnosis-related groups (DRG) has never been studied in a multidisciplinary setting. This study aims to determine the impact of malnutrition on hospitalisation outcomes, controlling for DRG. Methods Subjective Global Assessment was used to assess the nutritional status of 818 patients within 48 hours of admission. Prospective data were collected on cost of hospitalisation, length of stay (LOS), readmission and mortality up to 3 years post-discharge using National Death Register data. Mixed model analysis and conditional logistic regression matching by DRG were carried out to evaluate the association between nutritional status and outcomes, with the results adjusted for gender, age and race. Results Malnourished patients (29%) had longer hospital stays (6.9±7.3 days vs. 4.6±5.6 days, p<0.001) and were more likely to be readmitted within 15 days (adjusted relative risk = 1.9, 95% CI 1.1–3.2, p=0.025). Within a DRG, the mean difference between the actual cost of hospitalisation and the average cost was greater for malnourished than for well-nourished patients (p=0.014). Mortality was higher in malnourished patients at 1 year (34% vs. 4.1%), 2 years (42.6% vs. 6.7%) and 3 years (48.5% vs. 9.9%); p<0.001 for all. Overall, malnutrition was a significant predictor of mortality (adjusted hazard ratio = 4.4, 95% CI 3.3–6.0, p<0.001). Conclusions Malnutrition was evident in up to one third of inpatients and led to poor hospitalisation outcomes, even after matching for DRG. Strategies to prevent and treat malnutrition in the hospital and post-discharge are needed.
Abstract:
Over the past few decades a major paradigm shift has occurred in the conceptualisation of chronic pain as a complex multidimensional phenomenon. Yet pain experienced by individuals with a primary disability continues to be understood largely from a traditional biomedical model, despite its inherent limitations. This is reflected in the body of literature on the topic, which is primarily driven by positivist assumptions and the search for etiologic pain mechanisms. Conversely, little is known about the experiences of, and meanings attributed to, disability-related pain. Thus the purpose of this paper is to discuss the use of focus group methodology in elucidating the meanings and experiences of this population. Here, a distinction is made between the method of the focus group and focus group research as methodology. Typically, the focus group is presented as a seemingly atheoretical method of research. Drawing on research undertaken on the impact of chronic pain in people with multiple sclerosis, this paper seeks to theorise the focus group, arguing the methodological congruence of focus group research and the study of pain experience. It is argued that the contributions of group interaction and shared experiences in focus group discussions produce data and insights less accessible through more structured research methods. It is concluded that a biopsychosocial perspective on chronic pain may only ever be appreciated when the person-in-context is the unit of investigation.
Abstract:
Traditionally, infectious diseases and under-nutrition have been considered major health problems in Sri Lanka, with little attention paid to obesity and associated non-communicable diseases (NCDs). However, the recent Sri Lanka Diabetes and Cardiovascular Study (SLDCS) reported epidemic levels of obesity, diabetes and metabolic syndrome. Moreover, obesity-associated NCDs are the leading cause of death in Sri Lanka, and there is an exponential increase in hospitalisation due to NCDs, adversely affecting the development of the country. Despite Sri Lanka having a very high prevalence of NCDs and associated mortality, little is known about the causative factors for this burden. It is widely believed that the global NCD epidemic is associated with recent lifestyle changes, especially dietary factors. In the absence of sufficient data on dietary habits in Sri Lanka, successful interventions to manage these serious health issues would not be possible. In view of the current situation, this dietary survey was undertaken to assess the intakes of energy, macro-nutrients and selected other nutrients with respect to socio-demographic characteristics and the nutritional status of Sri Lankan adults, with a particular focus on obesity. Another aim of this study was to develop and validate a culturally specific food frequency questionnaire (FFQ) to assess dietary risk factors of NCDs in Sri Lankan adults. Data were collected from a subset of the national SLDCS using a multi-stage, stratified, random sampling procedure (n=500). However, data collection in the SLDCS was affected by the prevailing civil war, which resulted in no data being collected from the Northern and Eastern provinces. To obtain a nationally representative sample, additional subjects (n=100) were later recruited from the two provinces using similar selection criteria.
Ethical approval for this study was obtained from the Ethical Review Committee, Faculty of Medicine, University of Colombo, Sri Lanka, and informed consent was obtained from the subjects before data were collected. Dietary data were obtained using the 24-h Dietary Recall (24HDR) method. Subjects were asked to recall all foods and beverages consumed over the previous 24-hour period. Respondents were probed for the types of foods and food preparation methods. For the FFQ validation study, a 7-day weighed diet record (7-d WDR) was used as the reference method. All foods recorded in the 24HDR were converted into grams, and then intakes of energy and nutrients were analysed using NutriSurvey 2007 (EBISpro, Germany), modified for Sri Lankan food recipes. Socio-demographic details and body weight perception were collected with an interviewer-administered questionnaire. BMI was calculated, and overweight (BMI ≥23 kg/m²), obesity (BMI ≥25 kg/m²) and abdominal obesity (men: WC ≥90 cm; women: WC ≥80 cm) were categorized according to Asia-Pacific anthropometric cut-offs. SPSS v16 for Windows and Minitab v10 were used for statistical analyses. From a total of 600 eligible subjects, 491 (81.8%) participated, of whom 34.5% (n=169) were males. Subjects were well distributed among different socio-economic parameters. A total of 312 different food items were recorded, and nutritionists grouped similar food items, which resulted in a total of 178 items. After performing step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected. Seventy-seven subjects completed (response rate = 65%) the FFQ and the 7-d WDR.
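The Asia-Pacific anthropometric classification used in the study can be made concrete with a short sketch. The function names and example values are illustrative, not from the study; only the cut-offs (overweight BMI ≥23, obesity BMI ≥25, abdominal obesity WC ≥90 cm for men / ≥80 cm for women) come from the abstract.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_category(bmi_value):
    """Asia-Pacific BMI cut-offs as described in the study."""
    if bmi_value >= 25:
        return "obese"
    if bmi_value >= 23:
        return "overweight"
    return "not overweight"

def abdominally_obese(waist_cm, sex):
    """Waist-circumference cut-off: 90 cm for men, 80 cm for women."""
    cutoff = 90 if sex == "male" else 80
    return waist_cm >= cutoff

print(bmi_category(bmi(70, 1.65)))      # 70 / 1.65**2 ~ 25.7 -> prints obese
print(abdominally_obese(85, "female"))  # 85 >= 80 -> prints True
```

Note that these cut-offs are lower than the WHO international thresholds (25 and 30 kg/m²), which is why the reported overweight/obesity prevalences are not comparable with studies using the international classification.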
Estimated mean (SD) energy intake from the FFQ (1794±398 kcal) and the 7-d WDR (1698±333 kcal) differed significantly (P<0.001), owing to a significant overestimation of carbohydrate (~10 g/d, P<0.001) and, to some extent, fat (~5 g/d, NS). Significant positive correlations were found between the FFQ and the 7-d WDR for energy (r = 0.39), carbohydrate (r = 0.47), protein (r = 0.26), fat (r = 0.17) and dietary fibre (r = 0.32). Bland-Altman plots indicated fairly good agreement between the methods, with no relationship between bias and average intake for any nutrient examined. The findings from the nutrition survey showed that, on average, Sri Lankan adults consumed over 14 portions of starch/d; moreover, males consumed 5 more portions of cereal than females. Sri Lankan adults consumed on average 3.56 portions of added sugars/d. Mean daily intakes of fruit (0.43 portions) and vegetables (1.73 portions) were well below the minimum dietary recommendations (fruits 2 portions/d; vegetables 3 portions/d); total fruit and vegetable intake was 2.16 portions/d. Daily consumption of meat or alternatives was 1.75 portions, and the sum of meat and pulses was 2.78 portions/d. Starchy foods were consumed by all participants, and over 88% met the minimum daily recommendations. Importantly, nearly 70% of adults exceeded the maximum daily recommendation for starch (11 portions/d), and a considerable proportion, particularly men, consumed larger numbers of starch servings daily. More than 12% of men consumed over 25 starch servings/d. In contrast to their starch consumption, participants reported very low intakes of other food groups. Only 11.6%, 2.1% and 3.5% of adults consumed the minimum daily recommended servings of vegetables, fruits, and fruits and vegetables combined, respectively. Six out of ten adult Sri Lankans sampled did not consume any fruit.
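The Bland-Altman agreement analysis used to validate the FFQ against the 7-d WDR amounts to computing, per subject, the difference between the two methods, then its mean (the bias) and the 95% limits of agreement (bias ± 1.96 SD). A minimal sketch on synthetic energy intakes (the values are generated to loosely resemble the reported means, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
ref = rng.normal(1700, 330, size=77)      # synthetic 7-d WDR energy, kcal/d
ffq = ref + rng.normal(95, 250, size=77)  # synthetic FFQ with positive bias

def bland_altman(a, b):
    """Mean bias (a - b) and 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

bias, (lo, hi) = bland_altman(ffq, ref)
print(round(bias, 1), round(lo, 1), round(hi, 1))
```

"No relationship between bias and average intake" is typically checked by regressing the differences on the pairwise means `(a + b) / 2` and testing whether the slope differs from zero.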
Milk and dairy consumption was extremely low; over a third of the population did not consume any dairy products, and less than 1% of adults consumed 2 portions of dairy/d. A quarter of Sri Lankans did not report consumption of meat or pulses. Regarding protein consumption, 36.2% attained the minimum Sri Lankan recommendation for protein, and significantly more men than women achieved the recommendation of ≥3 servings of meat or alternatives daily (men 42.6%, women 32.8%; P<0.05). Over 70% of energy was derived from carbohydrates (males: 72.8±6.4%, females: 73.9±6.7%), followed by fat (males: 19.9±6.1%, females: 18.5±5.7%) and protein (males: 10.6±2.1%, females: 10.9±5.6%). The average intake of dietary fibre was 21.3 g/d for males and 16.3 g/d for females. Nutritional intake differed significantly by ethnicity, area of residence, education level and BMI category. Similarly, dietary diversity was significantly associated with several socio-economic parameters among Sri Lankan adults. Adults with BMI ≥25 kg/m² and abdominally obese adults had the highest diet diversity values. The age-adjusted prevalences (95% confidence intervals) of overweight, obesity and abdominal obesity among Sri Lankan adults were 17.1% (13.8-20.7), 28.8% (24.8-33.1) and 30.8% (26.8-35.2), respectively. Men, compared with women, were less often overweight (14.2% [9.4-20.5] versus 18.5% [14.4-23.3], P = 0.03), less often obese (21.0% [14.9-27.7] versus 32.7% [27.6-38.2], P < 0.05) and less often abdominally obese (11.9% [7.4-17.8] versus 40.6% [35.1-46.2], P < 0.05). Although the prevalence of obesity has reached epidemic levels, body weight misperception was common among Sri Lankan adults. Two-thirds of overweight males and 44.7% of overweight females considered themselves to be "about the right weight". Over one third of both male and female obese subjects perceived themselves as "about the right weight" or "underweight".
Nearly 32% of centrally obese men and women perceived their waist circumference to be about right. Of those who perceived themselves as overweight or very overweight (n = 154), only 63.6% had tried to lose weight (n = 98), and only a quarter had sought advice from professionals (n = 39). A number of important conclusions can be drawn from this research project. Firstly, the newly developed FFQ is an acceptable tool for assessing the nutrient intake of Sri Lankans and will assist proper categorization of individuals by dietary exposure. Secondly, a substantial proportion of the Sri Lankan population does not consume a varied and balanced diet, which is suggestive of a close association between the nutrition-related NCDs in the country and unhealthy eating habits. Moreover, dietary diversity is positively associated with several socio-demographic characteristics and with obesity among Sri Lankan adults. Lastly, although obesity is a major health issue among Sri Lankan adults, body weight misperception was common among underweight, healthy-weight, overweight and obese adults in Sri Lanka. Over two-thirds of overweight and one-third of obese Sri Lankan adults believe that they are in the "right weight" or "underweight" categories.
Abstract:
Aim There is a growing population of people with cancer who experience physiological and psychological effects that persist long after treatment is complete. Interventions that enhance survivors' self-management abilities might help offset these effects. The aim of this pilot study was to develop, implement and evaluate interventions tailored to assist patients to manage post-treatment health issues effectively. Method In this pre-post intervention cohort study, participants were recruited on completion of cancer treatment. Participants recruited pre-implementation, who received usual care, comprised the control group. Participants recruited later formed the intervention group. In the intervention group, the Cancer Care Coordinator developed an individualised, structured Cancer Survivor Self-management Care Plan. Participants were interviewed on completion of treatment (baseline) and at three months. Assessments concerned health needs (CaSUN), self-efficacy in adjusting and coping with cancer, and health-related quality of life (FACT-B or FACT-C). The impact of the intervention was determined by independent t-tests of change scores. Results The intervention (n = 32) and control (n = 35) groups were comparable on demographic and clinical characteristics. Sample mean age was 54 ± 10 years. Cancer diagnoses were breast (82%) and colorectal (18%). Statistically significant differences (p < 0.05) indicated improvement in the intervention group for: (a) functional well-being, from the FACIT (Control: M = −0.69, SE = 0.91; Intervention: M = 3.04, SE = 1.13); and (b) self-efficacy in maintaining social relationships (Control: M = −0.33, SE = 0.33; Intervention: M = 0.62, SE = 0.27). No significant differences were found in health needs, other subscales of quality of life, the extent and number of strategies used in coping and adjusting to cancer, or other domains of self-efficacy.
Conclusions While the results should be interpreted with caution, due to the non-randomised nature of the study and the small sample size, they indicate that the potential benefits of tailored self-management interventions warrant further investigation in this context.
Abstract:
Diagnosis threat is a psychosocial factor that has been proposed to contribute to poor outcomes following mild traumatic brain injury (mTBI). This threat is thought to impair the cognitive test performance of individuals with mTBI because of negative injury stereotypes. University students (N = 45, 62.2% female) with a history of mTBI were randomly allocated to a diagnosis threat (DT; n = 15), reduced threat (DT-reduced; n = 15) or neutral (n = 15) group. The reduced threat condition invoked a positive stereotype (i.e., that people with mTBI can perform well on cognitive tests). All participants were given neutral instructions before they completed baseline tests of: (a) objective cognitive function across a number of domains; (b) psychological symptoms; and (c) post-concussion syndrome (PCS) symptoms, including self-reported cognitive and emotional difficulties. Participants then received either neutral, DT or DT-reduced instructions before repeating the tests. Results were analyzed using separate mixed-model ANOVAs, one for each dependent measure. The only significant result was for the 2 × 3 ANOVA on an objective test of attention/working memory (Digit Span), p < .05, such that the DT-reduced group performed better than the other groups, which did not differ from each other. Although not consistent with predictions or earlier DT studies, the absence of group differences on most tests fits with several recent DT findings. The results of this study suggest that it is timely to reconsider the role of DT as a unique contributor to poor mTBI outcome.
Abstract:
Objective To analyze the epidemiological trend of hepatitis B from 1990 to 2007 in Shandong province, and to identify the high-risk population so as to explore further control strategies. Methods Based on the routinely reported hepatitis B incidence data and demographic data for Shandong province, overall, sex-specific and age-specific incidence rates of hepatitis B were calculated and statistically analyzed using a simple linear regression model. Results The total number of hepatitis B cases was 437 094, and the annual average morbidity was 27.32 per 100 000 population from 1990 to 2007. The incidence in men (38.42 per 100 000) was higher than that in women (15.83 per 100 000). The annual incidence rate of hepatitis B showed an increasing trend for the whole population, while a decreasing trend was present for children aged 0-9 years over the past 18 years, indicating that the average age of onset has shifted toward older ages. Conclusion Young adult men are the high-risk group for hepatitis B. For the prevention of hepatitis B, hepatitis B vaccination should be extended to other groups, especially the high-risk population, on the basis of improving the immunization coverage rate for newborns.
Abstract:
The high burden of parental concern in children with chronic cough has been well documented. Acute cough in children (lasting less than 2 weeks) also has a significant impact on families, reflected by the number of doctor visits for cough. Currently there is no validated acute cough specific quality of life (QOL) measure for children. The objective of this study was to develop and validate an acute cough specific QOL questionnaire (PAC-QOL) for paediatric use. Here we present our data on item selection. Methods Two independent focus groups were conducted to determine relevant items. Parents discussed the impact of their child's current or previous episodes of acute cough on their child, themselves and their family functioning. Transcripts were analyzed to determine whether discussions had reached an item saturation point. Items were also compared against our previously validated parent-centred children's chronic cough specific QOL questionnaire (PC-QOL), which was used as a model. The newly developed acute cough specific QOL questionnaire is designed to assess how frequently parents experience feelings and worry related to their child's acute cough, using a 24-h time-point reference. Results Newly identified acute cough specific items include parental worry around whether or not they should take their child to a doctor or emergency department, and the frequency of seeking assistance from friends and family. Conclusions While there are similarities between the items identified for acute and chronic cough, there are distinct features. Further data will be collected for item reduction and validation of this children's acute cough specific QOL questionnaire.
Abstract:
Objectives In Aboriginal and Torres Strait Islander peoples in Queensland, to (a) determine the disease burden of common chronic lung diseases and (b) identify areas of need with respect to lung health services. Methods Literature reviews and analyses of hospitalisation and mortality data were used to describe disease epidemiology and available programs and services. Key stakeholder interviews and an online survey of health professionals were used to evaluate lung health services across the state and to identify services, needs and gaps. Results Morbidity and mortality from respiratory diseases in the Indigenous population are substantially higher than in the non-Indigenous population across all age groups and regions. There are inadequate clinical services and resources to address disease prevention, detection, intervention and management in an evidence-based and culturally acceptable fashion. There is a lack of culturally appropriate educational resources and management programs, insufficient access to appropriately engaged Indigenous health professionals, a lack of multi-disciplinary specialist outreach teams, fragmented information systems and inadequate coordination of care. Conclusions Major initiatives are required at all levels of the healthcare system to adequately address service provision for Indigenous Queenslanders with lung diseases, including high quality research to investigate the causes of poor lung health, which are likely to be multifactorial.
Resumo:
Objective: To determine the burden of hospitalised, radiologically confirmed pneumonia (World Health Organization protocol) in Northern Territory (NT) Indigenous children. Design, setting and participants: Historical, observational study of all hospital admissions, for any diagnosis, of NT-resident Indigenous children aged >= 29 days and < 5 years, 1 April 1997 to 31 March 2005. Intervention: All chest radiographs taken during these admissions, regardless of diagnosis, were assessed for pneumonia in accordance with the WHO protocol. Main outcome measure: The primary outcome was endpoint consolidation (dense fluffy consolidation [alveolar infiltrate] of a portion of a lobe or the entire lung) present on a chest radiograph within 3 days of hospitalisation. Results: We analysed data on 24 115 hospitalised episodes of care for 9492 children and 13 683 chest radiographs. The average annual cumulative incidence of endpoint consolidation was 26.6 per 1000 population per year (95% CI, 25.3-27.9): 57.5 per 1000 per year in infants aged 1-11 months, 38.3 per 1000 per year in those aged 12-23 months, and 13.3 per 1000 per year in those aged 24-59 months. In all age groups, rates of endpoint consolidation in children in the arid southern region of the NT were about twice those of children in the tropical northern region. Conclusion: The rates of severe pneumonia in hospitalised NT Indigenous children are among the highest reported in the world. Reducing this unacceptable burden of disease should be a national health priority.
Resumo:
It is known that bioscience is perceived to be difficult and causes anxiety in undergraduate nursing students; yet commencing students' perceptions of bioscience are not known. Therefore, the aim of this study was to ascertain incoming students' perceptions, knowledge and approaches to learning bioscience. Incoming students to the Bachelor of Nursing completed a questionnaire prior to undertaking bioscience. Two hundred and seventy-three students completed the questionnaire, which explored their expectations, preconceptions of bioscience content, approaches to learning bioscience, and its relationship to clinical practice. Participant ages ranged from 17 to 53 years (mean 23 years), and 78% of students had completed at least one secondary school science subject, of whom 60% had studied biology. Overall, students' preconceptions included anxiety about studying bioscience, the belief that bioscience would be difficult and harder than nursing subjects, and that more content would be required for bioscience than for nursing subjects. ANOVA was used to examine the effects of secondary school science and age on student responses. A significant effect of secondary school science was found for the perceptions that science in school is advantageous for bioscience (p = 0.010), understanding what bioscience entails (p = 0.002), needing to study science prior to the start of the semester (p = 0.009), and that bioscience is considered difficult (p = 0.029). A significant effect of age was found for exams being more difficult than other assessments (p < 0.001) and for being able to see the relevance of nursing when reaching the workplace (p = 0.011). The findings also indicated that perceptions and associated anxieties related to bioscience were present in commencing students, similar to those reported previously in established student groups.
This strongly suggests that faculties should attempt to dispel preconceptions about bioscience and provide targeted, improved support to ease nursing students' transition into their bioscience studies.