961 results for ADMINISTERED MORPHINE
Abstract:
OBJECTIVE: The present study evaluates the prehospital care of paediatric burn patients in Queensland (QLD). As first aid (FA) treatment has been shown to affect burn progression and outcome, the FA treatment and the risk of associated hypothermia in paediatric patients were specifically examined in the context of paramedic management of burn patients. METHODS: Data were retrospectively collected from electronic ambulance response forms (eARFs) for paediatric burn patients (0-5 years) attended by the Queensland Ambulance Service (QAS) from 2008 to 2010. Data were collected from 117 eARFs for incidents occurring within the Brisbane, Townsville and Cairns regions. RESULTS: Initial FA measures were recorded in 77.8% of cases, with cool running water administered as FA in 56.4% of cases. The duration of FA was recorded in 29.9% of reports and was significantly shorter for patients in Northern QLD (median = 10 min, n = 10) than in Brisbane (median = 15 min, n = 18), P = 0.005. Patient temperatures were recorded significantly more often in Brisbane than in other regions (P = 0.041); however, only 24.8% of all patients had documented temperature readings. Six patients (5% of all patients) were recorded as having temperatures ≤ 36.0°C. Burnaid™ was the most commonly used dressing and was applied to 55.6% of all patients; however, it was applied with a variety of different outer dressings. Brisbane paramedics applied Burnaid significantly less often (44.3%) than paramedics from Northern QLD (72.7%) and Far Northern QLD (60.9%), P = 0.025. CONCLUSIONS: Despite FA and patient temperature being important prognostic factors for burn patients, paramedic documentation of these was often incomplete, and there was no consistent use of burns dressings.
Abstract:
The fate of two widely used antibiotics, oxytetracycline and oxolinic acid, in a fish pond was simulated using a computational model. The VDC model, originally designed to predict pesticide fate and transport in paddy fields, was modified to account for the differences between a pond and a paddy as well as between the behaviors of fish and rice plants. The pond conditions were set to follow typical South East Asian aquaculture practice. The two antibiotics were administered to the animals in the pond through medicated feed over a period of 5 days, as in actual practice. Concentrations of oxytetracycline in pond water were higher than those of oxolinic acid at the beginning of the simulation. The dissipation rate of oxytetracycline was also higher, as it is more readily available for degradation in the water. Over the long term, oxolinic acid was present at higher concentrations than oxytetracycline in both pond water and pond sediment. The simulated results are expected to be conservative and can be useful for lower-tier assessment of the exposure risk of veterinary medicines in the aquaculture industry, but more data are needed for complete validation of the model.
Abstract:
The stability of five illicit drug markers in wastewater was tested under different sewer conditions using laboratory-scale sewer reactors. Wastewater was spiked with deuterium-labelled analogues of cocaine, benzoylecgonine, methamphetamine, MDMA and 6-acetylmorphine to avoid interference from the native compounds already present in the wastewater matrix. The sewer reactors were operated at 20 °C and pH 7.5, and wastewater was sampled at 0, 0.25, 0.5, 1, 2, 3, 6, 9 and 12 h to measure the transformation/degradation of these marker compounds. The results showed that while methamphetamine, MDMA and benzoylecgonine were stable in the sewer reactors, cocaine and 6-acetylmorphine degraded quickly. Their degradation rates were significantly higher than the values reported for wastewater alone (without biofilms). All the degradation processes followed first-order kinetics. Benzoylecgonine and morphine were also formed from the degradation of cocaine and 6-acetylmorphine, respectively, with stable formation rates throughout the test. These findings suggest that, in sewage epidemiology, it is essential to have relevant information about the sewer system (i.e. type of sewer, hydraulic retention time) in order to accurately back-estimate the consumption of illicit drugs. More research is required into detailed sewer conditions (e.g. temperature, pH and the ratio of biofilm area to wastewater volume) to identify their effects on the fate of illicit drug markers in sewer systems.
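Since the abstract states that degradation followed first-order kinetics, a minimal sketch of how a rate constant and an in-sewer correction factor might be derived from such reactor data is shown below; all concentration values and the hydraulic retention time are invented for illustration, not taken from the study.

```python
import numpy as np

# Sampling times used in the sewer-reactor experiments (hours).
t = np.array([0, 0.25, 0.5, 1, 2, 3, 6, 9, 12])

# Hypothetical relative concentrations of a labile marker (fraction of
# the spiked amount remaining) -- illustrative values only.
c = np.array([1.00, 0.93, 0.87, 0.76, 0.58, 0.44, 0.20, 0.09, 0.04])

# First-order kinetics: C(t) = C0 * exp(-k*t), so ln(C) is linear in t.
slope, ln_c0 = np.polyfit(t, np.log(c), 1)
k = -slope
half_life = np.log(2) / k
print(f"rate constant k = {k:.3f} 1/h, half-life = {half_life:.2f} h")

# Back-estimation: a marker measured at the treatment plant after a
# hydraulic retention time (HRT) of, say, 4 h underestimates what was
# discharged by exp(-k*HRT); scale measured loads up accordingly.
hrt = 4.0
print(f"multiply measured loads by {np.exp(k * hrt):.2f} for a 4 h HRT")
```

This is why the authors stress knowing the hydraulic retention time: without it, the exponential loss term cannot be corrected for.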
Abstract:
Objective Child maltreatment is a problem that has long been recognized in the northern hemisphere and in high-income countries. Recent work has highlighted the nearly universal nature of the problem in other countries but demonstrated the lack of comparability of studies because of variations in the definitions and measures used. The International Society for the Prevention of Child Abuse and Neglect has developed instrumentation that may be used for cross-cultural and cross-national benchmarking by local investigators. Design and sampling The instrument design began with a team of experts in Brisbane in 2004. A large bank of questions was subjected to two rounds of Delphi review to develop the fielded version of the instrument. Convenience samples included approximately 120 parent respondents with children under the age of 18 in each of six countries (697 in total). Results This paper presents an instrument that measures parental behaviors directed at children and reports data from pilot work in six countries and seven languages. Patterns of response revealed few missing values, and the distributions of responses were generally similar across the six countries. Subscales performed well in terms of internal consistency, with Cronbach's alpha in the very good range (0.77–0.88), with the exception of the neglect and sexual abuse subscales. Results varied by child age and gender in the expected directions, but with large variations among the samples. About 15% of children were shaken, 24% were hit on the buttocks with an object, and 37% were spanked. Reports of choking and smothering were made by 2% of parents. Conclusion These pilot data demonstrate that the instrument is well tolerated and captures variations in, and potentially harmful forms of, child discipline. Practice implications The ISPCAN Child Abuse Screening Tool – Parent Version (ICAST-P) has been developed as a survey instrument to be administered to parents for the assessment of child maltreatment in a multi-national and multi-cultural context. It was developed with broad input from international experts and subjected to Delphi review, translation, and pilot testing in six countries. The results of the Delphi study and pilot testing are presented. This study demonstrates that a single instrument can be used across a broad range of cultures and languages with low rates of missing data and moderate to high internal consistency.
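As a point of reference for the internal-consistency figures quoted above (Cronbach's alpha 0.77–0.88), here is a minimal sketch of the standard alpha computation; the 8×4 response matrix is hypothetical, not ICAST-P pilot data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from 8 parents on a 4-item subscale
# (0-3 frequency ratings) -- illustrative only.
scores = np.array([
    [0, 1, 0, 1],
    [2, 2, 3, 2],
    [1, 1, 1, 0],
    [3, 2, 3, 3],
    [0, 0, 1, 0],
    [2, 3, 2, 2],
    [1, 2, 1, 1],
    [3, 3, 2, 3],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```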
Abstract:
Background Domestic violence against women is a major public health problem and a violation of women's human rights. Health professionals could play an important role in screening for victims, but from the evidence to date it is unclear whether they play an active role in identifying them. Objectives To develop a reliable and valid instrument to measure health professionals' attitudes towards identifying female victims of domestic violence (DV). Methods A preliminary questionnaire was constructed in accordance with established guidelines, using the Theory of Planned Behaviour (Ajzen, 1975), to measure health professionals' attitudes towards identifying female victims of DV. An expert panel was used to establish content validity. Focus groups with health professionals (N = 5) from the target population were conducted to confirm face validity. A pilot study (N = 30 nurses and doctors) was undertaken to assess the feasibility and reliability of the questionnaire. The questionnaire was also administered a second time after one week to check test-retest stability. Results Feedback from the expert panel and the group discussions confirmed that the questionnaire had content and face validity. Cronbach's alpha values for all the items were greater than 0.7. Strong correlations between the direct and indirect measures confirmed that the indirect measures were well constructed. High test-retest correlations confirmed that the measures were reliable in the sense of temporal stability. Significance This tool has the potential to be used by researchers to expand the knowledge base in this important area.
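A minimal sketch of the test-retest check described above: temporal stability is commonly quantified as the Pearson correlation between the two administrations one week apart. The scores below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical attitude scores for 10 clinicians at baseline and one
# week later -- illustrative values only.
time1 = np.array([4.2, 3.8, 4.5, 2.9, 3.6, 4.1, 3.3, 4.8, 3.0, 4.4])
time2 = np.array([4.0, 3.9, 4.4, 3.1, 3.5, 4.3, 3.2, 4.7, 3.2, 4.5])

# Test-retest reliability as the Pearson correlation between the two
# administrations; values near 1 indicate good temporal stability.
r, p = pearsonr(time1, time2)
print(f"test-retest r = {r:.2f} (p = {p:.3f})")
```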
Abstract:
Driving while sleepy is regarded as a substantial crash risk factor. Reducing the risk of sleep-related crashes rests predominantly on the driver's awareness of the signs commonly experienced when sleepy, such as yawning, frequent eye blinks, and difficulty keeping the eyes open. However, the relationship between the signs of sleepiness and risky sleepy-driving behaviours is largely unknown. The current study sought to examine the relationships between drivers' experiences of the signs of sleepiness, risky sleepy-driving behaviours, and associated demographic, work and sleep-related factors. In total, 1,608 participants completed a questionnaire, administered via telephone interview, that assessed their experiences and behaviours when driving while sleepy. The results revealed that a number of demographic, work and sleep-related factors were associated with experiencing signs of sleepiness when driving. Signs of sleepiness were also found to mediate the relationship between continuing to drive while sleepy and having a sleep-related close-call event. A subgroup analysis based on age (under 30 versus 30 years or older) found that younger drivers were more likely to continue to drive when sleepy despite experiencing more signs of sleepiness. The results suggest participants had considerable experience with the signs of sleepiness and with driving while sleepy. This research can inform the content of driver education campaigns regarding the importance of recognising the signs of sleepiness; educating all drivers about the dangers of driving while experiencing these signs is an important road safety goal.
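The mediation finding above (signs of sleepiness mediating the link between continuing to drive while sleepy and sleep-related close calls) is the kind of result typically obtained with a product-of-coefficients analysis. Below is a minimal sketch on simulated data; the variable coding and effect sizes are assumptions, and the study's actual modelling may have differed.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Simulated data -- illustrative only. X: tendency to continue driving
# while sleepy, M: signs of sleepiness experienced, Y: close-call events.
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # mediator depends on X
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome depends mostly on M

# Product-of-coefficients mediation: a (X -> M) times b (M -> Y given X).
a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
b = fit_y.params[2]       # effect of M on Y controlling for X
direct = fit_y.params[1]  # direct effect of X on Y

print(f"indirect effect a*b = {a * b:.3f}, direct effect = {direct:.3f}")
```

An indirect effect (a*b) that is substantial relative to the direct effect is what supports a mediation claim of this kind.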
Abstract:
Objectives: The incidence and mortality of traumatic brain injury (TBI) have increased rapidly in the last decade in China. Appropriate ambulance service can significantly reduce the case-fatality rate of TBI. This study aimed to explore the factors (age, gender, education level, clinical experience, professional title, organization, specialty before prehospital care, and training frequency) that could influence prehospital doctors' knowledge and practices in TBI management in Hubei Province, China. Methods: A cross-sectional questionnaire survey was conducted in two cities in Hubei Province. The self-administered questionnaire consisted of demographic information and questions about prehospital TBI management. Independent-samples t-tests and one-way ANOVA were used to analyze group differences in average scores across demographic characteristics. General linear regression was used to explore factors associated with prehospital TBI management. Results: A total of 56 questionnaires were handed out and 52 (93%) were returned. Participants received the lowest scores in TBI treatment (0.64; SD = 0.08) and the highest scores in TBI assessment (0.80; SD = 0.14). According to the regression model, education level was positively associated with the score for TBI identification (P = .019); participants who worked in the emergency department (ED; P = .011) or formerly practiced internal medicine (P = .009) tended to score lower in TBI assessment; participants' scores in TBI treatment were positively associated with training frequency (P = .011); and no statistically significant associated factor was found for overall TBI management. Conclusion: This study described the current state of prehospital TBI management. Prehospital doctors' knowledge and practices in TBI management were quantified and the underlying influential factors explored. The results indicate that an appropriate continuing medical education (CME) program could improve the quality of ambulance service in China.
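For the group comparisons named in the Methods (independent-samples t-test and one-way ANOVA), a minimal sketch is shown below; the three training-frequency groups and their scores are simulated for illustration and are not the survey's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical TBI-management scores (0-1) for doctors grouped by
# training frequency -- illustrative values only.
rare = rng.normal(0.62, 0.08, 15)
occasional = rng.normal(0.66, 0.08, 20)
frequent = rng.normal(0.72, 0.08, 17)

# Independent-samples t-test between two groups.
t, p_t = stats.ttest_ind(rare, frequent)
print(f"t = {t:.2f}, p = {p_t:.3f}")

# One-way ANOVA across all three groups.
f, p_f = stats.f_oneway(rare, occasional, frequent)
print(f"F = {f:.2f}, p = {p_f:.3f}")
```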
Abstract:
Objective. The heritability of disease activity and of function in ankylosing spondylitis (AS) has been estimated at 0.51 and 0.63 (i.e., 51% and 63%), respectively. We examined the concordance of disease severity among family members in terms of disease activity, function, radiological change, prevalence of iritis, and juvenile onset. Methods. Disease activity and functional impairment due to AS were studied using the Bath AS Disease Activity Index (BASDAI) and Bath AS Functional Index (BASFI) self-administered questionnaires; radiographic involvement was measured using the Bath AS Radiology Index (BASRI) scale. Familial correlation of BASDAI and BASFI was assessed in 406 families with 2 or more cases, using the program PAP. Parent-child and sibling-sibling concordance for iritis and juvenile AS (JAS) were also studied in these families. Heritability of radiological disease severity based on the BASRI was assessed in 29 families containing 60 affected individuals using the program SOLAR. Results. Correlations between parent-child pairs for disease activity and function were 0.07 for both. Correlations between sibling pairs for disease activity and function were 0.27 and 0.36, respectively. The children of AS parents with iritis were more likely to develop iritis [27/71 (38%)] than the children of AS parents without iritis [13/70 (19%)] (p = 0.01). Parents with JAS were more likely to have children with JAS [17/30 (57%)] than parents without JAS [34/111 (30%)] (p = 0.002). The heritability of radiological disease severity based on the BASRI was 0.62. Conclusion. While the correlation in severity between parent and child is poor, siblings do resemble each other in severity, supporting the findings of segregation studies indicating significant genetic dominance in the heritable component of disease activity. Significant parent-child concordance for iritis and juvenile disease onset suggests that there are genetic risk factors for these traits independent of those determining the risk of AS itself. The finding of significant heritability of radiological change (BASRI) provides objective support for the observed heritability of the questionnaire-assessed disease severity scores, BASDAI and BASFI.
Abstract:
The objective of the current study was to investigate the mechanism by which the corpus luteum (CL) of the monkey undergoes desensitization to luteinizing hormone following exposure to increasing concentrations of human chorionic gonadotrophin (hCG), as occurs in pregnancy. Female bonnet monkeys were injected intramuscularly with increasing doses of hCG or deglycosylated hCG (dghCG) beginning on day 6 or day 12 of the luteal phase, for 10, 4 or 2 days. The day of the oestrogen surge was considered day 0 of the luteal phase. Luteal cells obtained from the CL of these animals were incubated with hCG (2 and 200 pg/ml) or dbcAMP (2.5, 25 and 100 μM) for 3 h at 37°C, and the progesterone secreted was estimated. Corpora lutea of normally cycling monkeys on day 10/16/22 of the luteal phase were used as controls. In addition, the in vivo response to hCG and dghCG was assessed by determining serum steroid profiles following their administration. hCG (15-90 IU), but not dghCG (15-90 IU), treatment in vivo significantly (P < 0.05) elevated serum progesterone and oestradiol levels. Serum progesterone, however, could not be maintained at an elevated level by continuous treatment with hCG (from days 6-15), the progesterone level declining beyond day 13 of the luteal phase. Administering low doses of hCG (15-90 IU/day) from days 6-9, or high doses (600 IU/day) on days 8 and 9 of the luteal phase, resulted in a significant increase (about 10-fold over the corresponding incubated controls, P < 0.005) in the ability of luteal cells to synthesize progesterone in vitro. The luteal cells of the treated animals responded to dbcAMP (P < 0.05) but not to hCG added in vitro. The in vitro response of luteal cells to added hCG was inhibited by 0, 50 and 100% when the animals were injected with low (15-90 IU) or medium (100 IU) doses of dghCG between days 6-9 of the luteal phase, or high doses (600 IU on days 8 and 9), respectively; such treatment had no effect on the responsiveness of the cells to dbcAMP. Luteal cell responsiveness to dbcAMP in vitro was, however, blocked if hCG was administered for 10 days beginning on day 6 of the luteal phase. Although short-term hCG treatment during the late luteal phase (days 12-15) had no effect on luteal function, 10-day treatment beginning on day 12 of the luteal phase resulted in a regain of in vitro responsiveness to both hCG (P < 0.05) and dbcAMP (P < 0.05), suggesting that luteal rescue can occur even at this late stage. In conclusion, desensitization of the CL appears to be governed by the dose of hCG/dghCG and the period for which the CL is exposed to it. That desensitization is due to receptor occupancy is indicated by the facts that (i) it can be achieved by giving a larger dose of hCG over a 2-day period instead of a lower dose for a longer (4- to 10-day) period, and (ii) the effect can largely be reproduced by using dghCG instead of hCG to block the receptor sites. It appears that to achieve desensitization to dbcAMP as well, it is necessary to expose the luteal cells to a relatively high dose of hCG for more than 4 days.
Abstract:
Activation of midbrain dopamine systems is thought to be critically involved in the addictive properties of abused substances. Drugs of abuse increase dopamine release in the nucleus accumbens and dorsal striatum, which are the target areas of mesolimbic and nigrostriatal dopamine pathways, respectively. Dopamine release in the nucleus accumbens is thought to mediate the attribution of incentive salience to rewards, and dorsal striatal dopamine release is involved in habit formation. In addition, changes in the function of prefrontal cortex (PFC), the target area of mesocortical dopamine pathway, may skew information processing and memory formation such that the addict pays an abnormal amount of attention to drug-related cues. In this study, we wanted to explore how long-term forced oral nicotine exposure or the lack of catechol-O-methyltransferase (COMT), one of the dopamine metabolizing enzymes, would affect the functioning of these pathways. We also wanted to find out how the forced nicotine exposure or the lack of COMT would affect the consumption of nicotine, alcohol, or cocaine. First, we studied the effect of forced chronic nicotine exposure on the sensitivity of dopamine D2-like autoreceptors in microdialysis and locomotor activity experiments. We found that the sensitivity of these receptors was unchanged after forced oral nicotine exposure, although an increase in the sensitivity was observed in mice treated with intermittent nicotine injections twice daily for 10 days. Thus, the effect of nicotine treatment on dopamine autoreceptor sensitivity depends on the route, frequency, and time course of drug administration. Second, we investigated whether the forced oral nicotine exposure would affect the reinforcing properties of nicotine injections. The chronic nicotine exposure did not significantly affect the development of conditioned place preference to nicotine. In the intravenous self-administration paradigm, however, the nicotine-exposed animals self-administered nicotine at a lower unit dose than the control animals, indicating that their sensitivity to the reinforcing effects of nicotine was enhanced. Next, we wanted to study whether the Comt gene knock-out animals would be a suitable model to study alcohol and cocaine consumption or addiction. Although previous work had shown male Comt knock-out mice to be less sensitive to the locomotor-activating effects of cocaine, the present study found that the lack of COMT did not affect the consumption of cocaine solutions or the development of cocaine-induced place preference. However, the present work did find that male Comt knock-out mice, but not female knock-out mice, consumed ethanol more avidly than their wild-type littermates. This finding suggests that COMT may be one of the factors, albeit not a primary one, contributing to the risk of alcoholism. Last, we explored the effect of COMT deficiency on dorsal striatal, accumbal, and prefrontal cortical dopamine metabolism under no-net-flux conditions and under levodopa load in freely-moving mice. The lack of COMT did not affect the extracellular dopamine concentrations under baseline conditions in any of the brain areas studied. In the prefrontal cortex, the dopamine levels remained high for a prolonged time after levodopa treatment in male, but not female, Comt knock-out mice. COMT deficiency induced accumulation of 3,4-dihydroxyphenylacetic acid, which increased further under levodopa load. 
Homovanillic acid was not detectable in Comt knock-out animals either under baseline conditions or after levodopa treatment. Taken together, the present results show that although forced chronic oral nicotine exposure affects the reinforcing properties of self-administered nicotine, it is not an addiction model itself. COMT seems to play a minor role in dopamine metabolism and in the development of addiction under baseline conditions, indicating that dopamine function in the brain is well-protected from perturbation. However, the role of COMT becomes more important when the dopaminergic system is challenged, such as by pharmacological manipulation.
Abstract:
Cigarette smoking is, in developed countries, the leading cause of premature death. The main addictive compound in tobacco smoke is nicotine, which binds in the brain to neuronal nicotinic acetylcholine receptors (nAChRs). These have been implicated in addiction, but also in several neurological disorders including Alzheimer's and Parkinson's diseases, Tourette's syndrome, attention-deficit hyperactivity disorder (ADHD), schizophrenia, pain, depression, and autosomal-dominant nocturnal frontal lobe epilepsy; all of which makes nAChRs an intriguing target of study. Chronic treatment with nicotine increases the number of nAChRs in the brain (upregulation) and changes their functionality. Changes in the properties of nAChRs are likely to occur in smokers as well, since they are exposed to nicotine for long periods. Several nAChR subtypes likely play a role in the formation of nicotine addiction by participating in the release of dopamine in the striatum. The aim of this study was to clarify, at the cellular level, the changes in nAChR characteristics resulting from chronic nicotine treatment. SH-SY5Y cells, which endogenously express several nAChR subtypes, and SH-EP1-hα7 cells, transfected with the α7 nAChR subunit gene, were treated chronically with nicotine. The localisation of α7 and β2 subunits was studied with confocal and electron microscopy. The functionality of nAChRs was studied with calcium fluorometry. The effects of long-term treatment with opioid compounds on nAChRs were studied by means of ligand binding. Confocal microscopy showed that in SH-SY5Y cells, α7 and β2 subunits formed clusters, unlike in SH-EP1-hα7 cells, where α7 nAChRs were distributed more diffusely. The majority of nAChR subunits localised to the endoplasmic reticulum (ER). The isomers of methadone acted as agonists at α7 nAChRs. Acute morphine challenge also stimulated nAChRs. Chronic treatment with methadone or morphine led to an increased number of nAChRs. In animal studies, mice received nicotine for 7 weeks. Electron microscopic analysis of the localisation of nAChRs in the striatum showed that α7 and β2 nAChR subunits localised synaptically, extrasynaptically, and intracellularly, with the majority localising extrasynaptically. Chronic nicotine treatment increased the number of nAChR subunits at all locations studied. These results suggest that α7 nAChRs and β2 subunit-containing nAChRs respond differently to chronic nicotine treatment. This may indicate that the functional balance of the various nAChR subtypes controlling dopamine release is altered as a result of chronic nicotine treatment. Compounds binding to both opioid and nACh receptors may be of clinical importance.
Abstract:
The bentiromide test, using plasma p-aminobenzoic acid, was evaluated as an indirect test of pancreatic insufficiency in young children between 2 months and 4 years of age. To determine the optimal test method, the following were examined: (a) the best dose of bentiromide (15 mg/kg or 30 mg/kg); (b) the optimal sampling time for plasma p-aminobenzoic acid; and (c) the effect of coadministration of a liquid meal. Sixty-nine children (1.6 ± 1.0 years) were studied, including 34 controls with normal fat absorption and 35 patients (34 with cystic fibrosis) with fat maldigestion due to pancreatic insufficiency. Control and pancreatic-insufficient subjects were studied in three age-matched groups: (a) low-dose bentiromide (15 mg/kg) with clear fluids; (b) high-dose bentiromide (30 mg/kg) with clear fluids; and (c) high-dose bentiromide with a liquid meal. Plasma p-aminobenzoic acid was determined at 0, 30, 60, and 90 minutes and then hourly for 6 hours. The dose effect of bentiromide with clear liquids was evaluated. High-dose bentiromide best discriminated between control and pancreatic-insufficient subjects, owing to a higher peak plasma p-aminobenzoic acid level in controls, but sensitivity and specificity remained poor. High-dose bentiromide with a liquid meal produced a delayed increase in plasma p-aminobenzoic acid in the control subjects, probably caused by retarded gastric emptying. In the pancreatic-insufficient subjects, however, use of a liquid meal resulted in significantly lower plasma p-aminobenzoic acid levels at all time points; plasma p-aminobenzoic acid at 2 and 3 hours completely discriminated between control and pancreatic-insufficient patients. Evaluation of the data by the area under the time-concentration curve failed to improve test results. In conclusion, the bentiromide test is a simple, clinically useful means of detecting pancreatic insufficiency in young children, but the higher dose administered with a liquid meal is recommended.
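The area under the time-concentration curve mentioned above is conventionally computed with the trapezoidal rule over the stated sampling schedule (0, 30, 60, and 90 minutes, then hourly to 6 hours). A minimal sketch follows; the concentration values are invented for illustration, not the study's data.

```python
import numpy as np

# Sampling times (hours) matching the schedule in the abstract, and
# hypothetical plasma p-aminobenzoic acid concentrations (ug/mL).
t = np.array([0, 0.5, 1.0, 1.5, 2, 3, 4, 5, 6])
c = np.array([0, 1.2, 2.8, 3.5, 3.1, 2.2, 1.4, 0.8, 0.5])

# Area under the time-concentration curve by the trapezoidal rule.
auc = np.trapezoid(c, t)  # use np.trapz on NumPy versions before 2.0
print(f"AUC(0-6 h) = {auc:.2f} ug*h/mL")
```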
Abstract:
The purpose of this research is to extend understanding of how Black and White South African consumers' causal attributions for major household appliance performance failures affect their anger and subsequent complaint behaviour. A survey was administered to Black and White South African consumers who were dissatisfied with the performance of a major household appliance. Respondents resided in a major metropolitan area. The findings showed that, compared with Whites, Black South Africans reported a low but significantly higher external locus of causality and lower perceived control, and experienced a higher level of anger regarding the product failure. The level of anger determined the decision to take complaint action, but racial group determined the type of action taken. Blacks complained more actively to retailers and engaged more in private complaint action than Whites. These findings may indicate that Black South Africans are developing a more individualistic orientation as consumers. Researchers should therefore consider the effect of cultural swapping when researching consumer behaviour in multi-cultural countries. Implications for retailers in terms of complaint handling are indicated.
Abstract:
Purpose This research investigates whether applying a community-based social marketing principle, namely increasing the visibility of a target behaviour in the community, can change the social norms surrounding that behaviour. Design/methodology/approach A repeated-measures quasi-experimental design was employed to evaluate the Victorian Health Promotion Foundation's Walk to School 2013 programme, which increases the visibility of walking to and from school through programme participation in order to promote active transportation for primary school children. The target population for the survey was caregivers of primary school children aged 5-12 years. The final sample size across the three online surveys administered was 102 respondents. Findings The results suggest that the programme increased caregivers' perceptions that children in their community walked to and from school and that walking to and from school is socially acceptable. Originality/value The study contributes to addressing the recent call for research examining the relationship between community-based social marketing principles and programme outcomes. Further, the results provide insight for enhancing the social norms approach, which has traditionally relied on changing social norms exclusively through media campaigns.
Abstract:
The immune response against Salmonella is multi-faceted, involving both the innate and the adaptive immune system. The characterization of specific Salmonella antigens that induce an immune response could contribute critically to the development of epitope-based vaccines for Salmonella. We sought to identify protective T cell epitopes of Salmonella, as cell-mediated immunity conferred by CD8+ T cells is the subset most crucial for protective immunity against Salmonella. Because secreted proteins are known to induce cell-mediated immunity better than cell-surface and cytosolic antigens, we analyzed all the GenBank-annotated Salmonella pathogenicity island 1 and 2 secreted proteins of Salmonella enterica serovar Typhimurium (S. typhimurium) and S. enterica serovar Typhi (S. typhi). They were subjected to BIMAS and SYFPEITHI analysis to map MHC-I and MHC-II binding epitopes. The large profile of possible T cell epitopes obtained from the two classes of secreted proteins was tabulated, and using a scoring system that considers binding affinity and promiscuity of binding to more than one allele, SopB and SifB were chosen for experimental confirmation in a murine immunization model. The entire SopB and SifB genes were cloned into DNA vaccine vectors and administered along with live attenuated Salmonella. SopB vaccination reduced the bacterial burden of organs about 5-fold on days 4 and 8 after challenge with virulent Salmonella, proving to be a more efficient vaccination strategy than live attenuated bacteria alone.
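The abstract does not spell out the exact scoring rule used to shortlist SopB and SifB, only that it combined binding affinity with promiscuity across alleles. The sketch below shows one plausible way such a ranking could work; the peptides, alleles, scores, and threshold are all hypothetical, and real inputs would come from BIMAS/SYFPEITHI output tables.

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    peptide: str
    allele: str
    score: float  # predicted binding score (higher = stronger binder)

# Hypothetical predictions for a few peptides -- illustrative only.
predictions = [
    Prediction("FAYGIKHYL", "H2-Kb", 120.0),
    Prediction("FAYGIKHYL", "H2-Db", 45.0),
    Prediction("SIINFEKLV", "H2-Kb", 200.0),
    Prediction("QLRNTTVAL", "H2-Kb", 15.0),
    Prediction("QLRNTTVAL", "H2-Db", 18.0),
    Prediction("QLRNTTVAL", "H2-Ld", 22.0),
]

THRESHOLD = 10.0  # assumed minimum score to count as a binder

def rank(preds):
    """Rank peptides by promiscuity (alleles bound above threshold),
    then by best binding score."""
    by_peptide = {}
    for p in preds:
        by_peptide.setdefault(p.peptide, []).append(p)
    ranking = []
    for pep, hits in by_peptide.items():
        n_binders = sum(h.score >= THRESHOLD for h in hits)
        best = max(h.score for h in hits)
        ranking.append((pep, n_binders, best))
    return sorted(ranking, key=lambda r: (r[1], r[2]), reverse=True)

for pep, n_alleles, best in rank(predictions):
    print(f"{pep}: binds {n_alleles} allele(s), best score {best:.0f}")
```

Under this assumed rule, a peptide binding several alleles moderately can outrank a single-allele strong binder, which is consistent with the stated emphasis on promiscuity.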