961 results for ADMINISTERED MORPHINE
Abstract:
Background: Quality mental health care for adults with an intellectual disability (ID) depends upon the availability of appropriately trained and experienced psychiatrists. There have been few surveys of psychiatrists working with this population. Methods: This Australian study obtained psychiatrists' attitudes to and perceptions of the mental health needs of adults with an ID. Training needs were also sought. The survey instrument was a purpose-designed, 28-item self-administered questionnaire featuring multiple-choice and open-ended questions. Results: The majority of psychiatrists expressed concerns about treatment of this group, describing unmet needs. A total of 75% considered that antipsychotics were overused to control aggression, and 34% of psychiatrists were reluctant to treat adults with an ID. In total, 85% agreed that mental health in ID should be offered as a training option for psychiatric registrars, and that specialized mental health services would provide a high standard of care for this population. Conclusions: Broad concerns are raised regarding pathways to mental health care for adults with an ID in Australia. An Australia-wide training strategy needs to be developed. Partnerships between the mental health, disability and community services that serve the mental health needs of this population should actively seek to engage psychiatrists.
Abstract:
The conversion of one-way polyethylene terephthalate (PET) bottles into reusable bottles helps reduce environmental burden. Recently, the Ministry of the Environment in Japan began discussing the introduction of reusable bottles. One of the barriers to introducing the new type of bottle is consumer unwillingness to accept refilled reusable bottles. We administered questionnaires to consumers in a pilot test of reusable PET bottles, organized to analyze the demand for these products. To increase the demand for refilled reusable bottles, it is necessary to supply refilled reusable bottles that are acceptable to consumers who are concerned about container flaws and stains.
Abstract:
The introduction of online delivery platforms such as learning management systems (LMS) in tertiary education has changed the methods and modes of curriculum delivery and communication. While course evaluation methods have also changed from paper-based, in-class-administered methods to largely online-administered methods, the data collection instruments have remained unchanged. This paper reports on a small exploratory study of two tertiary-level courses. The study investigated why the design of the instruments and the methods used to administer surveys in the courses are ineffective given the intrinsic characteristics of online learning. It reviewed students' response rates for the conventional evaluations of the courses over an eight-year period. It then compared a newly developed online evaluation with the conventional methods over a two-year period. The results showed that response rates with the new evaluation method increased by more than 80% from the average of the conventional evaluations (below 30%), and that students' written feedback was more detailed and comprehensive than in the conventional evaluations. The study demonstrated the possibility that LMS-based learning evaluation can be effective and efficient in terms of the quality of students' participation and engagement in their learning, and for an integrated pedagogical approach in an online learning environment.
Abstract:
This study quantifies the motivators and barriers to bikeshare program usage in Australia. An online survey was administered to a sample of annual members of Australia’s two bikeshare programs, based in Brisbane and Melbourne, to assess motivations for joining the schemes. Non-members of the programs were also sampled in order to identify current barriers to joining bikeshare. Spatial analysis from Brisbane revealed that residential and work locations of non-members were more geographically dispersed than those of bikeshare members. An analysis of bikeshare usage in Melbourne showed a strong relationship between docking station usage and location in areas with relatively less accessible public transit. The most influential barriers to bikeshare use related to motorized travel being too convenient and docking stations not being sufficiently close to home, work and other frequented destinations. The findings suggest that bikeshare programs may attract increased membership by ensuring travel times are competitive with motorized travel, for example through efficient bicycle routing and priority progression, by expanding docking station locations, and by increasing the level of convenience associated with scheme use. Convenience considerations may include strategic location of docking stations, ease of signing up and integration with public transport.
Abstract:
Background
With dwindling malaria cases in Bhutan in recent years, the government of Bhutan has made plans for malaria elimination by 2016. This study aimed to determine coverage, use and ownership of long-lasting insecticidal nets (LLINs), as well as the prevalence of asymptomatic malaria at a single time-point, in four sub-districts of Bhutan.
Methods
A cross-sectional study was carried out in August 2013. Structured questionnaires were administered to a single respondent in each household (HH) in four sub-districts. Four members from 25 HH, randomly selected from each sub-district, were tested using rapid diagnostic tests (RDT) for asymptomatic Plasmodium falciparum and Plasmodium vivax infection. Multivariable logistic regression models were used to identify factors associated with LLIN use and maintenance.
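As an illustration of the multivariable logistic regression described in this Methods paragraph, the following is a minimal sketch in Python using statsmodels; the data file and predictor columns (llin_use, wash_frequency, hh_size, sub_district) are hypothetical placeholders, not the study's actual variables or code.

```python
# Minimal sketch of a multivariable logistic regression for LLIN use.
# Column names are hypothetical; the study's actual variables are not given here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("llin_survey.csv")  # hypothetical household-level survey data

model = smf.logit(
    "llin_use ~ wash_frequency + hh_size + C(sub_district)",
    data=df,
).fit()

print(model.summary())        # coefficients on the log-odds scale
print(np.exp(model.params))   # odds ratios for each predictor
```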
Results
All blood samples from 380 participants tested negative for Plasmodium infections. A total of 1,223 HH (92.5% of total HH) were surveyed for LLIN coverage and use. Coverage of LLINs was 99.0% (1,203/1,223 HH). Factors associated with decreased odds of sleeping under an LLIN included: washing LLINs
Abstract:
Surgical site infections following caesarean section are a serious and costly adverse event for Australian hospitals. In the United Kingdom, 9% of women are diagnosed with a surgical site infection following caesarean section, either in hospital or post-discharge (Wloch et al 2012, Ward et al 2008). Additional staff time, pharmaceuticals and health supplies, and increased length of stay or readmission to hospital are often required (Henman et al 2012). Part of my PhD investigated the economics of preventing post-caesarean infection. This paper summarises a review of relevant infection prevention strategies. Administering antibiotic prophylaxis 15 to 60 minutes pre-incision, rather than post cord-clamping, is probably the most important infection prevention strategy for caesarean section (Smaill and Gyte 2010, Liu et al 2013, Dahlke et al 2013). However, the timing of antibiotic administration is reportedly inconsistent in Australian hospitals. Clinicians may be taking advice from the influential, but outdated, RANZCOG and United States Centers for Disease Control and Prevention guidelines (Royal Australian and New Zealand College of Obstetricians and Gynaecologists 2011, Mangram et al 1999). A number of other important international clinical guidelines, including Australia's NHMRC guidelines, recommend universal prophylactic antibiotics pre-incision for caesarean section (National Health and Medical Research Council 2010, National Collaborating Centre for Women's and Children's Health 2008, Anderson et al 2008, National Collaborating Centre for Women's and Children's Health 2011, Bratzler et al 2013, American College of Obstetricians and Gynecologists 2011a, Antibiotic Expert Group 2010). We need to ensure women receive pre-incision antibiotic prophylaxis, particularly as nurses and midwives play a significant role in managing an infection that may result from sub-optimal practice. It is acknowledged more explicitly now that nurses and midwives can influence prescribing and administration of antibiotics through informal approaches (Edwards et al 2011). Methods such as surgical safety checklists are a more formal way for nurses and midwives to ensure that antibiotics are administered pre-incision (American College of Obstetricians and Gynecologists 2011b). Nurses and midwives can also be directly responsible for other infection prevention strategies, such as instructing women not to remove pubic hair in the month before the expected date of delivery and wound management education (Ng et al 2013). Potentially more costly but effective strategies include using a chlorhexidine gluconate (CHG) sponge preoperatively (in addition to the usual operating room skin preparation) and vaginal cleansing with a povidone-iodine solution (Riley et al 2012, Rauk 2010, Haas, Morgan, and Contreras 2013).
Abstract:
Background Musculoskeletal conditions and insufficient physical activity have substantial personal and economic costs among contemporary aging societies. This study examined the age distribution, comorbid health conditions, body mass index (BMI), self-reported physical activity levels, and health-related quality of life of patients accessing ambulatory hospital clinics for musculoskeletal disorders. The study also investigated whether comorbidity, BMI, and self-reported physical activity were associated with patients’ health-related quality of life after adjusting for age as a potential confounder. Methods A cross-sectional survey was undertaken in three ambulatory hospital clinics for musculoskeletal disorders. Participants (n=224) reported their reason for referral, age, comorbid health conditions, BMI, physical activity levels (Active Australia Survey), and health-related quality of life (EQ-5D). Descriptive statistics and linear modeling were used to examine the associations between age, comorbidity, BMI, intensity and duration of physical activity, and health-related quality of life. Results The majority of patients (n=115, 51.3%) reported two or more comorbidities. In addition to other musculoskeletal conditions, common comorbidities included depression (n=41, 18.3%), hypertension (n=40, 17.9%), and diabetes (n=39, 17.4%). Approximately one-half of participants (n=110, 49.1%) self-reported insufficient physical activity to meet minimum recommended guidelines and 150 (67.0%) were overweight (n=56, 23.2%), obese (n=64, 28.6%), severely obese (n=16, 7.1%), or very severely obese (n=14, 6.3%), with a higher proportion of older patients affected. A generalized linear model indicated that, after adjusting for age, self-reported physical activity was positively associated (z=4.22, P<0.001), and comorbidities were negatively associated (z=-2.67, P<0.01) with patients’ health-related quality of life. Conclusion Older patients were more frequently affected by undesirable clinical attributes of comorbidity, obesity, and physical inactivity. However, findings from this investigation are compelling for the care of patients of all ages. Potential integration of physical activity behavior change or other effective lifestyle interventions into models of care for patients with musculoskeletal disorders is worthy of further investigation.
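The generalized linear model reported above (health-related quality of life regressed on self-reported physical activity and comorbidities, adjusting for age) could be specified roughly as follows; the column names, data file and Gaussian family are assumptions for illustration rather than the authors' exact specification.

```python
# Sketch of a GLM relating EQ-5D utility to physical activity and comorbidity
# count, adjusted for age. Column names and the Gaussian family are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("ambulatory_clinic_survey.csv")  # hypothetical patient-level data

glm = smf.glm(
    "eq5d_utility ~ activity_minutes + n_comorbidities + age",
    data=df,
    family=sm.families.Gaussian(),
).fit()

print(glm.summary())  # reports z statistics of the form quoted in the abstract
```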
Abstract:
Coffee is one of the most widely consumed beverages in the world and has a number of potential health benefits. Coffee may influence energy expenditure and energy intake, which in turn may affect body weight. However, the influence of coffee and its constituents – particularly caffeine – on appetite remains largely unexplored. The objective of this study was to examine the impact of coffee consumption (with and without caffeine) on appetite sensations, energy intake, gastric emptying, and plasma glucose between breakfast and lunch meals. In a double-blind, randomised crossover design, participants (n = 12, 9 women; mean ± SD age and BMI: 26.3 ± 6.3 y and 22.7 ± 2.2 kg·m−2) completed 4 trials: placebo (PLA), decaffeinated coffee (DECAF), caffeine (CAF), and caffeine with decaffeinated coffee (COF). Participants were given a standardised breakfast labelled with 13C-octanoic acid, 225 mL of treatment beverage, and a capsule containing either caffeine or placebo. Two hours later, another 225 mL of the treatment beverage and a capsule were administered. Four and a half hours after breakfast, participants were given access to an ad libitum meal for determination of energy intake. Between meals, participants provided exhaled breath samples for determination of gastric emptying, venous blood samples, and ratings of appetite sensations. Energy intake was not significantly different between the trials (means ± SD, p > 0.05; Placebo: 2118 ± 663 kJ; Decaf: 2128 ± 739 kJ; Caffeine: 2287 ± 649 kJ; Coffee: 2016 ± 750 kJ). Other than main effects of time (p < 0.05), no significant differences were detected for appetite sensations or plasma glucose between treatments (p > 0.05). Gastric emptying was not significantly different across trials (p > 0.05). No significant effects of decaffeinated coffee, caffeine or their combination were detected. However, the effects of caffeine and/or coffee consumption on the regulation of energy balance over longer periods warrant further investigation.
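For a crossover design like this, the per-treatment energy intakes could be compared with a repeated-measures ANOVA; the sketch below uses statsmodels' AnovaRM with hypothetical column names and is not the analysis reported by the authors.

```python
# Repeated-measures ANOVA of ad libitum energy intake across the four
# treatments (PLA, DECAF, CAF, COF). Column names are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("energy_intake_long.csv")  # one row per participant x treatment

res = AnovaRM(
    data=df,
    depvar="energy_intake_kj",
    subject="participant",
    within=["treatment"],
).fit()

print(res)  # F test for the within-subject treatment effect
```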
Abstract:
Dendrimers have potential for delivering chemotherapeutic drugs to solid tumours via the enhanced permeation and retention (EPR) effect. The impact of conjugating hydrophobic anticancer drugs to hydrophilic PEGylated dendrimer surfaces, however, has not been fully investigated. The current study therefore characterised the effect on dendrimer disposition of conjugating α-carboxyl-protected methotrexate (MTX) to a series of PEGylated 3H-labelled poly-L-lysine dendrimers ranging in size from generation 3 (G3) to 5 (G5) in rats. Dendrimers contained 50% surface PEG and 50% surface MTX. Conjugation of MTX generally increased plasma clearance when compared to conjugation with PEG alone. Conversely, increasing generation reduced clearance, increased metabolic stability and reduced renal elimination of the administered radiolabel. For constructs with molecular weights >20 kDa, increasing the molecular weight of conjugated PEG also reduced clearance and enhanced metabolic stability, but had only a minimal effect on renal elimination. Tissue distribution studies revealed retention of the MTX-conjugated smaller (G3–G4) PEG570 dendrimers (or their metabolic products) in the kidneys. In contrast, the larger G5 dendrimer was concentrated more in the liver and spleen. The G5 PEG1100 dendrimer was also shown to accumulate in solid Walker 256 and HT1080 tumours, and comparative disposition data in both rats (1 to 2% dose/g in tumour) and mice (11% dose/g in tumour) are presented. The results of this study further illustrate the potential utility of biodegradable PEGylated poly-L-lysine dendrimers as long-circulating vectors for the delivery and tumour-targeting of hydrophobic drugs.
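Plasma clearance, the disposition parameter compared across dendrimer generations above, is often estimated non-compartmentally as dose divided by the area under the plasma concentration–time curve; the snippet below is a generic sketch with illustrative numbers only, not data or methods from this study.

```python
# Generic non-compartmental estimate of plasma clearance: CL = dose / AUC.
# All concentration-time values below are illustrative placeholders.
import numpy as np

time_h = np.array([0.083, 0.5, 1, 2, 4, 8, 24, 48])              # sampling times (h)
conc = np.array([42.0, 35.1, 30.4, 24.8, 17.9, 11.2, 3.6, 1.1])  # plasma concentration

auc = np.trapz(conc, time_h)   # trapezoidal AUC over the sampling interval
dose = 100.0                   # administered dose, in units consistent with conc

clearance = dose / auc         # clearance; units follow those of dose and AUC
print(f"AUC = {auc:.1f}, CL = {clearance:.3f}")
```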
Abstract:
Child behaviour management is crucial to successful treatment of atopic dermatitis. This study tested relationships between parents’ self-efficacy, outcome expectations, and self-reported task performance when caring for a child with atopic dermatitis. Using a cross-sectional study design, a community-based convenience sample of 120 parents participated in pilot-testing of the Child Eczema Management Questionnaire, a self-administered questionnaire which appraises parents’ self-efficacy, outcome expectations, and self-reported task performance when managing atopic dermatitis. Overall, parents’ self-reported confidence and success with performing routine management tasks were greater than for managing their child’s symptoms and behaviour. There was a positive relationship between time since diagnosis and self-reported performance of routine management tasks; however, success with managing the child’s symptoms and behaviour did not improve with illness duration. Longer time since diagnosis was also associated with more positive outcome expectations of performing tasks that involved others in the child’s care (i.e. healthcare professionals, or the child themselves). This study provides the foundation for further research examining relationships between child, parent, and family psychosocial variables, parent management of atopic dermatitis, and child health outcomes. Improved understanding of these relationships will assist healthcare providers to better support parents and families caring for children with atopic dermatitis.
Abstract:
The chubby baby who eats well is desirable in our culture. Perceived low weight gains and feeding concerns are common reasons mothers seek advice in the early years. In contrast, childhood obesity is a global public health concern. Use of coercive feeding practices, prompted by maternal concern about weight, may disrupt a child’s innate self-regulation of energy intake, promoting overeating and overweight. This study describes predictors of maternal concern about her child undereating/becoming underweight and of her feeding practices. Mothers in the control group of the NOURISH and South Australian Infants Dietary Intake studies (n = 332) completed a self-administered questionnaire when the child was aged 12–16 months. Weight-for-age z-score (WAZ) was derived from weight measured by study staff. Mean age (SD) was 13.8 (1.3) months, mean WAZ (SD) was 0.58 (0.86), and 49% were male. WAZ and two questions describing food refusal were combined in a structural equation model with four items from the Infant Feeding Questionnaire (IFQ) to form the factor ‘Concern about undereating/weight’. Structural relationships were drawn between concern and the IFQ factors ‘awareness of infant’s hunger and satiety cues’, ‘use of food to calm infant’s fussiness’ and ‘feeding infant on a schedule’, resulting in a model of acceptable fit. Lower WAZ and higher frequency of food refusal predicted higher maternal concern. Higher maternal concern was associated with lower awareness of infant cues (r = −.17, p = .01) and greater use of food to calm (r = .13, p = .03). In a cohort of healthy children, maternal concern about undereating and underweight was associated with practices that have the potential to disrupt self-regulation.
Abstract:
Purpose Food refusal is part of normal toddler development, due to an innate ability to self-regulate energy intake and the onset of neophobia. For parents, this ‘fussy’ stage causes great concern, prompting the use of coercive feeding practices which ignore a child’s own hunger and satiety cues, promoting overeating and overweight. This analysis defines characteristics of the ‘good eater’ using latent variable structural equation modelling and examines the relationship with maternal perception of her child as a fussy eater. Methods Mothers in the control group of the NOURISH and South Australian Infants Dietary Intake studies (n=332) completed a self-administered questionnaire when the child was aged 12–16 months, describing refusal of familiar and unfamiliar foods and maternal perception of the child as fussy/not fussy. Weight-for-age z-score (WAZ) was derived from weight measured by study staff. Questionnaire items and WAZ were combined in AMOS to represent the latent variable the ‘good eater’. Results/findings Mean age (SD) of children was 13.8 (1.3) months, mean WAZ (SD) was 0.58 (0.86), and 49% were male. The ‘good eater’ was represented by higher WAZ and a child who hardly ever refuses food, hardly ever refuses familiar food, and is willing to eat unfamiliar foods (χ²/df = 2.80, GFI = .98, RMSEA = .07 (.03–.12), CFI = .96). The ‘good eater’ was inversely associated with maternal perception of her child as a fussy eater (β = −.64, p < .05). Conclusions Toddlers displaying characteristics of a ‘good eater’ are not perceived as fussy, but these characteristics, especially higher WAZ, may be undesirable in the context of obesity prevention. Clinicians can promote food refusal as normal and even desirable in healthy young children.
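The latent-variable model described above (WAZ and the food-refusal items loading on a ‘good eater’ factor, regressed on perceived fussiness) could be sketched in Python with the semopy package, assuming its lavaan-style model syntax; the model string, column names and data file are hypothetical placeholders, not the authors' AMOS specification.

```python
# Sketch of the 'good eater' latent-variable model using semopy. Item names
# are hypothetical placeholders for the questionnaire items and WAZ.
import pandas as pd
import semopy

model_desc = """
good_eater =~ waz + refuses_food_rev + refuses_familiar_rev + eats_unfamiliar
good_eater ~ perceived_fussy
"""

df = pd.read_csv("nourish_control_items.csv")  # hypothetical item-level data

model = semopy.Model(model_desc)
model.fit(df)

print(model.inspect())            # factor loadings and the structural path estimate
print(semopy.calc_stats(model))   # fit indices (chi-square/df, CFI, RMSEA, ...)
```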
Abstract:
A mother’s perception of her child’s weight may be more important in determining how she feeds her child than the child’s actual weight status. Use of controlling feeding practices, prompted by perceptions and concerns about weight, may disrupt the child’s innate self-regulation of energy intake. This can promote overeating and overweight (Costanzo & Woody, 1985). This study describes mothers’ perceptions of their child’s weight relative to the child’s actual weight. Mothers in the control group of NOURISH (n=276) were asked to describe their child as underweight, normal weight, or somewhat/very overweight via a self-administered questionnaire when children were aged 12–16 months (Daniels et al, 2009). The child’s weight and length were measured by study staff. At assessment, mean age (SD) was 13.7 (1.3) months, mean weight-for-age z-score (SD) was 0.6 (0.8) (WHO standards, 2008), and 51% were male. Twenty-seven children were perceived as underweight (10%) and twelve children were perceived as overweight (4%). ANOVA revealed significant differences in weight-for-age z-scores across the categories of weight perception, mean (SD) −0.2 (0.5), 0.6 (0.8) and 1.8 (0.7) for underweight, normal weight and overweight respectively, F(4, 288) = 15.6, p < .001. Based on WHO criteria, only one of the 27 children was correctly perceived as underweight (WHO 2008). Similarly, while 12 children were perceived as overweight, 88 were actually overweight/at risk. At group level, children of mothers who perceived their child as underweight were indeed leaner. However, at the individual level mothers could not accurately describe their child’s weight, tending to over-identify underweight and to perceive overweight children as normal weight.
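The one-way ANOVA comparing weight-for-age z-scores across the maternal weight-perception categories can be outlined with scipy; the group arrays below are placeholder values, not the study data.

```python
# One-way ANOVA of weight-for-age z-scores across maternal weight-perception
# groups (underweight / normal weight / overweight). Arrays are placeholders.
import numpy as np
from scipy.stats import f_oneway

waz_perceived_under = np.array([-0.6, -0.1, 0.2, -0.4])      # placeholder values
waz_perceived_normal = np.array([0.3, 0.7, 0.9, 0.5, 0.6])
waz_perceived_over = np.array([1.2, 1.9, 2.1])

f_stat, p_value = f_oneway(
    waz_perceived_under, waz_perceived_normal, waz_perceived_over
)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```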
Abstract:
For Construction - Architectural drawing package. House designed to AS3959:2009 Bushfire Attack Level 40. QUT-Client Agreement 100% Research (HERDC definition of Research).
Abstract:
Deoxyribonucleic acid (DNA) extraction has considerably evolved since it was initially performed back in 1869. It is the first step required for many of the available downstream applications used in the field of molecular biology. Whole blood samples are one of the main sources used to obtain DNA, and there are many different protocols available to perform nucleic acid extraction on such samples. These methods vary from very basic manual protocols to more sophisticated methods included in automated DNA extraction protocols. Based on the wide range of available options, it would be ideal to determine the ones that perform best in terms of cost-effectiveness and time efficiency. We have reviewed DNA extraction history and the most commonly used methods for DNA extraction from whole blood samples, highlighting their individual advantages and disadvantages. We also searched current scientific literature to find studies comparing different nucleic acid extraction methods, to determine the best available choice. Based on our research, we have determined that there is not enough scientific evidence to support one particular DNA extraction method from whole blood samples. Choosing a suitable method is still a process that requires consideration of many different factors, and more research is needed to validate choices made at facilities around the world.