Abstract:
Background: Concerns exist worldwide about a decrease in the physical activity levels (PALs) of children and a concurrent increase in childhood obesity. However, the exact relation between these two parameters has yet to be fully defined in children. Objective: This study examined, in 47 children aged 5–10.5 y (mean age 8.4 ± 0.9 y), the relation between habitual physical activity, minutes spent in moderate, vigorous and hard intensity activity, and body composition parameters. Design: Total energy expenditure (TEE) was calculated using the doubly labelled water technique and basal metabolic rate (BMR) was predicted from Schofield's equations. PAL was determined as PAL = TEE/BMR. Time spent in moderate, vigorous and hard intensity activity was determined by accelerometry, using the Tritrac-R3D. Body fatness and body mass index (BMI) were used as the two measures of body composition. Results: Body fat and BMI were significantly inversely correlated with PAL (r = -0.43, P = 0.002 and r = -0.45, P = 0.001). Times spent in vigorous activity and hard activity were significantly correlated with percentage body fat (r = -0.44, P = 0.004 and r = -0.39, P = 0.014), but not with BMI. Children in the top tertiles for both vigorous activity and hard activity had significantly lower body fat percentages than those in the middle and lowest tertiles. Moderate intensity activity was not correlated with measures of body composition. Conclusions: As well as showing a significant relation between PAL and body composition, these data suggest that there may be a threshold of physical activity intensity that influences body fatness. In light of world trends showing increasing childhood obesity, this study supports the need to further investigate the importance of physical activity for children.
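The PAL calculation described in the abstract above (PAL = TEE/BMR, with BMR predicted from Schofield's equations) can be illustrated with a minimal sketch. This is not the authors' analysis code: the weight-based Schofield coefficients below are the commonly cited values for children aged 3-10 y and should be checked against the original tables, and the example inputs are hypothetical.

```python
# Minimal sketch of PAL = TEE / BMR with a Schofield-style BMR prediction.
# Weight-based coefficients for children aged 3-10 y (MJ/day); commonly cited
# values -- verify against the original Schofield tables before any real use.
SCHOFIELD_3_10 = {"male": (0.095, 2.110), "female": (0.085, 2.033)}

def schofield_bmr_mj(weight_kg: float, sex: str) -> float:
    """Predicted basal metabolic rate (MJ/day) for a 3-10-year-old."""
    slope, intercept = SCHOFIELD_3_10[sex]
    return slope * weight_kg + intercept

def physical_activity_level(tee_mj_per_day: float, bmr_mj_per_day: float) -> float:
    """PAL = total energy expenditure / basal metabolic rate."""
    return tee_mj_per_day / bmr_mj_per_day

# Hypothetical example: an 8-year-old boy weighing 28 kg whose doubly
# labelled water TEE is 7.5 MJ/day.
bmr = schofield_bmr_mj(28.0, "male")
print(round(physical_activity_level(7.5, bmr), 2))  # ~1.57
```

Because the predicted BMR sits in the denominator, any bias in the prediction equation propagates directly into the PAL estimate.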
Abstract:
Mineralogical, hydrochemical and S isotope data were used to constrain the hydrogeochemical processes that produce acid mine drainage from sulfidic waste at the historic Mount Morgan Au–Cu mine, and the factors controlling the concentrations of SO4 and environmentally hazardous metals in the nearby Dee River in Queensland, Australia. Some highly contaminated acid waters, with metal contents up to several orders of magnitude greater than the Australia–New Zealand environmental standards, bypass the water management system at the site and drain into the adjacent Dee River. Mine drainage precipitates at Mt. Morgan were classified into four major groups and were identified as hydrous sulfates and hydroxides of Fe and Al with various contents of other metals. These minerals contain adsorbed or mineralogically bound metals that are released into the water system after rainfall events. Sulfate in open pit water and collection sumps generally has a narrow range of S isotope compositions (δ34S = 1.8–3.7‰) that is comparable to the orebody sulfides and makes S isotopes useful for tracing SO4 back to its source. The higher δ34S values for the No. 2 Mill Diesel sump may be attributed to a difference in source. Dissolved SO4 in the river above the mine influence and 20 km downstream shows distinctly heavier isotope compositions (δ34S = 5.4–6.8‰). The Dee River downstream of the mine is enriched in 34S (δ34S = 2.8–5.4‰) compared with mine drainage, possibly as a result of bacterial SO4 reduction in the weir pools and in the water bodies within the river channel. The SO4 and metals attenuate downstream by a combination of dilution with the receiving waters, SO4 reduction, and the precipitation of Fe and Al sulfates and hydroxides. It is suggested here that in subtropical Queensland, with its distinct wet and dry seasons, temporary reducing environments in the river play an important role in S isotope systematics.
Abstract:
This review reflects the state of the art in the study of contact and dynamic phenomena occurring in cold roll forming. The importance of taking these phenomena into account stems from the significant machine time and tooling costs spent on replacing worn-out forming rolls and on equipment adjustment in cold roll forming. Predictive modelling of the tool wear caused by contact and dynamic phenomena can reduce production losses in this technological process.
Abstract:
Physiological and kinematic data were collected from elite under-19 rugby union players to provide a greater understanding of the physical demands of rugby union. Heart rate, blood lactate and time-motion analysis data were collected from 24 players (mean ± s: body mass 88.7 ± 9.9 kg, height 185 ± 7 cm, age 18.4 ± 0.5 years) during six competitive premiership fixtures. Six players were chosen at random from each of four groups: props and locks, back row forwards, inside backs, outside backs. Heart rate records were classified based on the percentage of time spent in four zones (>95%, 85–95%, 75–84%, <75% HRmax). Blood lactate concentration was measured periodically throughout each match, and movements were classified as standing, walking, jogging, cruising, sprinting, utility, rucking/mauling and scrummaging. The heart rate data indicated that props and locks (58.4%) and back row forwards (56.2%) spent significantly more time in high exertion (85–95% HRmax) than inside backs (40.5%) and outside backs (33.9%) (P < 0.001). Inside backs (36.5%) and outside backs (38.5%) spent significantly more time in moderate exertion (75–84% HRmax) than props and locks (22.6%) and back row forwards (19.8%) (P < 0.05). Outside backs (20.1%) spent significantly more time in low exertion (<75% HRmax) than props and locks (5.8%) and back row forwards (5.6%) (P < 0.05). Mean blood lactate concentration did not differ significantly between groups (range: 4.67 mmol·l⁻¹ for outside backs to 7.22 mmol·l⁻¹ for back row forwards; P < 0.05). The motion analysis data indicated that outside backs (5750 m) covered a significantly greater total distance than either props and locks or back row forwards (4400 and 4080 m, respectively; P < 0.05). Inside backs and outside backs covered significantly greater distances walking (1740 and 1780 m, respectively; P < 0.001), in utility movements (417 and 475 m, respectively; P < 0.001) and sprinting (208 and 340 m, respectively; P < 0.001) than either props and locks or back row forwards (walking: 1000 and 991 m; utility movements: 106 and 154 m; sprinting: 72 and 94 m, respectively). Outside backs covered a significantly greater distance sprinting than inside backs (340 vs 208 m; P < 0.001). Forwards maintained a higher level of exertion than backs, due to more constant motion and a large involvement in static high-intensity activities. Mean blood lactate concentrations of 4.8–7.2 mmol·l⁻¹ indicated a need for 'lactate tolerance' training to improve hydrogen ion buffering and facilitate removal following high-intensity efforts. Furthermore, the large distances (4.2–5.6 km) covered during, and the intermittent nature of, match-play indicated a need for sound aerobic conditioning in all groups (particularly backs) to minimize fatigue and facilitate recovery between high-intensity efforts.
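The heart-rate classification described above, the percentage of match time spent in four %HRmax zones, reduces to binning a series of heart-rate samples. The sketch below is a minimal illustration rather than the authors' time-motion software; it assumes evenly spaced samples and an individually determined HRmax, and the sample values are hypothetical.

```python
# Minimal sketch: percentage of time in four %HRmax zones (>95%, 85-95%,
# 75-84%, <75%), assuming evenly spaced heart-rate samples.
def percent_time_in_zones(hr_samples, hr_max):
    zones = {">95%": 0, "85-95%": 0, "75-84%": 0, "<75%": 0}
    for hr in hr_samples:
        pct = 100.0 * hr / hr_max
        if pct > 95:
            zones[">95%"] += 1
        elif pct >= 85:
            zones["85-95%"] += 1
        elif pct >= 75:
            zones["75-84%"] += 1
        else:
            zones["<75%"] += 1
    n = len(hr_samples)
    return {zone: 100.0 * count / n for zone, count in zones.items()}

# Hypothetical 5-s samples for a player with a measured HRmax of 198 beats/min.
samples = [182, 190, 175, 168, 191, 193, 160, 186, 178, 195]
print(percent_time_in_zones(samples, hr_max=198))
```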
Abstract:
Spending by aid agencies on emergencies has quadrupled over the last decade, to over US$ 6 billion. To date, cost-effectiveness has seldom been considered in the prioritization and evaluation of emergency interventions. The sheer volume of resources spent on humanitarian aid and the chronicity of many humanitarian interventions call for more attention to be paid to the issue of 'value for money'. In this paper we present data from a major humanitarian crisis, an epidemic of visceral leishmaniasis (VL) in war-torn Sudan. The special circumstances provided us, in retrospect, with unusually accurate data on excess mortality, the costs of the intervention and its effects, thus allowing us to express cost-effectiveness as the cost per Disability Adjusted Life Year (DALY) averted. The cost-effectiveness ratio of US$ 18.40 per DALY (uncertainty range US$ 13.53 to US$ 27.63) places the treatment of VL in Sudan among health interventions considered 'very good value for money' (interventions costing less than US$ 25 per DALY). We discuss the usefulness of this analysis to the internal management of the VL programme, the procurement of funds for the programme and, more generally, priority setting in humanitarian relief interventions. We feel that evaluations of emergency interventions could more often attempt cost-effectiveness analyses, including the use of DALYs, provided that the outcomes of these analyses are seen in the broad context of the emergency situation and its consequences for the affected population. This paper provides a first contribution to what is hoped to become an international database of cost-effectiveness studies using health outcomes such as the DALY.
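The headline figure above is a cost-effectiveness ratio: total programme cost divided by DALYs averted, with DALYs averted derived from the excess deaths prevented and the years of healthy life each death would have cost. Below is a minimal sketch with hypothetical numbers and none of the discounting or age weighting used in the full DALY methodology.

```python
# Minimal sketch of a cost-per-DALY-averted calculation with hypothetical
# inputs; the published analysis also applied the discounting and age
# weighting of the full DALY methodology.
def dalys_averted(deaths_averted: float, life_years_lost_per_death: float) -> float:
    """Undiscounted years of life gained from averted deaths (YLL only)."""
    return deaths_averted * life_years_lost_per_death

def cost_per_daly(total_cost_usd: float, dalys: float) -> float:
    return total_cost_usd / dalys

dalys = dalys_averted(deaths_averted=1000, life_years_lost_per_death=30)
print(round(cost_per_daly(total_cost_usd=550_000, dalys=dalys), 2))  # 18.33
```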
Abstract:
Recent studies have demonstrated a link in young populations between unemployment and ill health. The purpose of this study was to correlate mortality with employment status in two cohorts of young Australian males, aged 17-25 years, from 1984 to 1988. Two youth cohorts, an initially unemployed sample (n = 1424 males) and a population sample (n = 4573 males), were surveyed annually throughout the study period. Those lost to follow-up during the survey period were matched with death registries across Australia. Employment status was determined from weekly diaries and death certificates and was designated as: employed or student; unemployed; not in the work force (excluding students). Conditional logistic regression, using age- and cohort-matched cases (deaths) and controls (alive), was used to estimate the odds ratio (OR) of dying with regard to employment status, taking into account potential confounders such as ethnicity, aboriginality, educational attainment, pre-existing health problems, socio-economic status of parents, and other factors. Twenty-three male survey respondents were positively matched to death registry records. Compared with those employed or students (referent group), significantly elevated ORs were found for those neither in the workforce nor students, for all-cause mortality, external-cause mortality, and external-cause mortality other than suicide. Odds ratios were adjusted for age, survey cohort, ethnicity, pre-existing physical and mental health status, education level, and socio-economic status of parent(s). A statistically significant increasing linear trend in the odds ratios of male mortality for most cause groups was found across the employment categories, from those employed or students (lowest ORs), through those unemployed, to those not in the workforce (highest ORs). Suicide was higher, but not statistically significantly so, in those unemployed or not in the workforce. Suicide was also associated, though not significantly, with the respondent not living with their parents at 14 years of age. No association was found between mortality and past unemployment experience, as measured by the length of time spent unemployed or the number of spells of unemployment experienced during the survey. The results of this study underscore the elevated risk to survival in young males as a consequence of being neither employed nor a student. (C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
Objective: We compared service consumption, continuity of care and risk of readmission in a record linkage follow-up study of cohorts of patients with schizophrenia and related disorders in Victoria (Australia) and in Groningen (The Netherlands). These areas are interesting to compare because mental health care in each is at a different stage of deinstitutionalization: more beds are available in Groningen and more community resources are available in Victoria. Method: The cohorts were followed for 4 years after discharge from inpatient services, using record linkage data available in the psychiatric case registers of both areas. Survival analysis was used to study continuity of care and risk of readmission. Results: Available indicators showed a higher level of continuity of care in Victoria. While the relative risk of readmission was the same in both areas and was not affected by aftercare contact after discharge, the number of days spent in hospital was much higher in the Groningen register area. Conclusion: These findings provide further support for earlier reports that the risk of readmission is predominantly affected by attributes of mental illness. However, the duration of admissions is strongly affected by service system variables, including the provision of continuity of care.
Abstract:
Background: The introduction of community care in psychiatry is widely thought to have resulted in more offending among the seriously mentally ill. This view affects public policy towards, and public perceptions of, such people. We investigated the association between the introduction of community care and the pattern of offending in patients with schizophrenia in Victoria, Australia. Methods: We established patterns of offending from criminal records in two groups of patients with schizophrenia, over their lifetime to date and in the 10 years after their first hospital admission. One group was first admitted in 1975, before major deinstitutionalisation in Victoria; the second group was first admitted in 1985, when community care was becoming the norm. Each patient was matched to a control by age, sex, and place of residence to allow for changing patterns of offending over time in the wider community. Findings: Compared with controls, significantly more of those with schizophrenia were convicted at least once for all categories of criminal offending except sexual offences (relative risk of offending in 1975 = 3.5 [95% CI 2.0-5.5], p = 0.001; in 1985 = 3.0 [1.9-4.9], p = 0.001). Among men, more offences were committed in the 1985 group than in the 1975 group, but this was matched by a similar increase in convictions among the community controls. Those with schizophrenia who had also received treatment for substance abuse accounted for a disproportionate amount of offending. Analysis of admission data for the patients and the total population of admissions with schizophrenia showed that, although there had been an increase of 74 days per annum spent in the community for the study population as a whole, first admissions spent only 1 more day in the community in 1985 compared with 1975. Interpretation: Increased rates of criminal conviction among those with schizophrenia over the last 20 years are consistent with change in the pattern of offending in the general community. Deinstitutionalisation does not adequately explain such change. Mental-health services should aim to reduce the raised rates of criminal offending associated with schizophrenia, but turning the clock back on community care is unlikely to contribute towards any positive outcome.
Abstract:
In gastropod mollusks, neuroendocrine cells in the anterior ganglia have been shown to regulate growth and reproduction. As a first step toward understanding the molecular mechanisms underlying the regulation of these physiological processes in the tropical abalone Haliotis asinina, we have identified sets of POU, Sox, and Pax transcription factor genes that are expressed in these ganglia. Using highly degenerate oligonucleotide primers designed to anneal to conserved codons in each of these gene families, we amplified by reverse transcriptase-polymerase chain reaction two POU genes (HasPOU-III and HasPOU-IV), two Sox genes (HasSox-B and HasSox-C), and two Pax genes (HasPax-258 and HasPax-6). Analyses with gene-specific primers indicated that these six genes are expressed in the cerebral and pleuropedal ganglia of both reproductively active and spent adults, in a number of sensory structures, and in a subset of other adult tissues.
Abstract:
Background. International research indicates that blue-collar employees typically exhibit lower rates of leisure-time physical activity. While lack of time and work demands are commonly reported barriers to activity, the extent to which time at work mediates the relationship between occupation and leisure-time physical activity is unclear. This study investigated the association between occupation, time spent in paid employment, and participation in leisure-time physical activity. Methods. This was a secondary analysis of cross-sectional data from the 1995 Australian Health Survey, focusing on employed persons aged 18-64 years (n = 24,454). Occupation was coded as per the Australian Standard Classification of Occupations and collapsed into three categories (professional, white-collar, blue-collar). Hours worked was categorized into eight levels, ranging from 1-14 to more than 50 h per week. Participation in leisure-time physical activity was categorized as either insufficient or sufficient for health, consistent with recommended levels of energy expenditure (1600 MET-min/fortnight). The relationship between occupation, hours worked, and leisure-time physical activity was examined using logistic regression. Analyses were conducted separately for males and females, and the results are presented as a series of models that successively adjust for a range of potential covariates: age, living arrangement, smoking status, body mass index, and self-reported health. Results. Individuals in blue-collar occupations were approximately 50% more likely to be classified as insufficiently active. This occupational variability in leisure-time physical activity was not explained by hours worked. There was a suggested relationship between hours worked and leisure-time physical activity; however, this differed between men and women and was difficult to interpret. Conclusions. Occupational variability in leisure-time physical activity cannot be explained by hours worked. Therefore, reports that work constitutes a barrier to participation should be explored further. Identification of the factors contributing to occupational variability in leisure-time physical activity will add to our understanding of why population subgroups differ in their health risk profiles, and will assist in the development of health promotion strategies to reduce rates of sedentariness and health inequalities. (C) 2000 American Health Foundation and Academic Press.
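The activity classification above converts self-reported sessions into MET-minutes per fortnight and compares the total against the 1600 MET-min threshold. The sketch below is a minimal illustration; the MET values assigned to each activity type are illustrative assumptions, not the survey's own scoring rules.

```python
# Minimal sketch: classify leisure-time physical activity as sufficient or
# insufficient against a 1600 MET-min/fortnight threshold.
# MET values below are illustrative assumptions, not the survey's scoring.
MET_VALUES = {"walking": 3.5, "moderate": 5.0, "vigorous": 7.5}

def met_minutes(sessions):
    """sessions: iterable of (activity_type, minutes) tuples over a fortnight."""
    return sum(MET_VALUES[kind] * minutes for kind, minutes in sessions)

def is_sufficiently_active(sessions, threshold=1600.0):
    return met_minutes(sessions) >= threshold

fortnight = [("walking", 150), ("moderate", 90), ("vigorous", 60)]
print(met_minutes(fortnight), is_sufficiently_active(fortnight))  # 1425.0 False
```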
Abstract:
Background: Between 1998 and 1999, a burden of disease assessment was carried out in Victoria, Australia, applying and improving on the methods of the Global Burden of Disease Study. This paper describes the methods and results of the calculation of the burden due to 22 mental disorders, adding 14 conditions not included in previous burden of disease estimates. Methods: The National Survey of Mental Health and Wellbeing provided recent data on the occurrence of the major adult mental disorders in Australia. Data from international studies and expert advice further contributed to the construction of disease models describing each condition in terms of incidence, average duration and level of severity, with adjustments for comorbidity with other mental disorders. Disability weights for the time spent in different states of mental ill health were borrowed mainly from a study in the Netherlands, supplemented by weights derived in a local extrapolation exercise. Results: Mental disorders were the third largest group of conditions contributing to the burden of disease in Victoria, ranking behind cancers and cardiovascular diseases. Depression was the greatest cause of disability in both men and women. Eight other mental disorders in men and seven in women ranked among the top twenty causes of disability. Conclusions: Insufficient information on the natural history of many of the mental disorders, the limited information on the validity of mental disorder diagnoses in community surveys, and considerable differences between ICD-10 and DSM-IV defined diagnoses were the main concerns about the accuracy of the estimates. Similar and often greater concerns have been raised in relation to the estimation of the burden from common non-fatal physical conditions such as asthma, diabetes and osteoarthritis. In comparison, psychiatric epidemiology can boast greater scientific rigour in setting standards for population surveys.
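The disease models described above combine incidence, average duration and a disability weight for time spent in each severity state. In the simplest undiscounted form, the years lived with disability (YLD) contributed by a condition are incident cases × average duration × disability weight, summed over severity states. A minimal sketch with hypothetical numbers follows.

```python
# Minimal sketch of an undiscounted YLD calculation for one disorder:
# sum over severity states of incident cases x duration (years) x disability
# weight. All numbers are hypothetical, for illustration only.
def years_lived_with_disability(states):
    """states: iterable of (incident_cases, duration_years, disability_weight)."""
    return sum(cases * duration * weight for cases, duration, weight in states)

hypothetical_states = [
    (5000, 0.5, 0.14),  # mild episodes
    (3000, 0.5, 0.35),  # moderate episodes
    (1000, 0.5, 0.76),  # severe episodes
]
print(years_lived_with_disability(hypothetical_states))  # 1255.0
```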
Abstract:
Purpose. To conduct a controlled trial of traditional and problem-based learning (PBL) methods of teaching epidemiology. Method. All second-year medical students (n = 136) at The University of Western Australia Medical School were offered the chance to participate in a randomized controlled trial of teaching methods for an epidemiology course. Students who consented to participate (n = 80) were randomly assigned to either a PBL or a traditional course. Students who did not consent or did not return the consent form (n = 56) were assigned to the traditional course. Students in both streams took identical quizzes and exams. These scores, a collection of semi-quantitative feedback from all students, and a qualitative analysis of interviews with a convenience sample of six students from each stream were compared. Results. There was no significant difference in performance on quizzes or exams between PBL and traditional students. Students using PBL reported a stronger grasp of epidemiologic principles, enjoyed working with a group, and, at the end of the course, were more enthusiastic about epidemiology and its professional relevance to them than were students in the traditional course. PBL students worked more steadily during the semester but spent only marginally more time on the epidemiology course overall. Interviews corroborated these findings. Non-consenting students were older (p < 0.02) and more likely to come from non-English-speaking backgrounds (p < 0.005). Conclusions. PBL provides an academically equivalent but personally far richer learning experience. The adoption of PBL approaches to medical education makes it important to study whether PBL presents particular challenges for students whose first language is not the language of instruction.
Abstract:
Scototaxis, the preference for dark environments over bright ones, is an index of anxiety in zebrafish. In this work, we analyzed avoidance of the white compartment through the spatiotemporal pattern of exploratory behavior (time spent in the white compartment of the apparatus and shuttle frequency between compartments) and a swimming ethogram (thigmotaxis, freezing and burst swimming in the white compartment) in four experiments. In Experiment 1, we demonstrate that spatiotemporal measures of white avoidance and locomotion do not habituate during a single 15-min session. In Experiments 2 and 3, we demonstrate that locomotor activity habituates to repeated exposures to the apparatus, regardless of whether the inter-trial interval is 15 min or 24 h; however, no habituation of white avoidance was observed in either experiment. In Experiment 4, we confined animals for three 15-min sessions in the white compartment prior to recording spatiotemporal and ethogram measures in a standard preference test. After these forced exposures, white avoidance and locomotor activity showed no differences relative to non-confined animals, but burst swimming, thigmotaxis and freezing in the white compartment were all decreased. These results suggest that neither avoidance of the white compartment nor approach to the black compartment accounts for the behavior of zebrafish in the scototaxis test. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The effect of intraseptal injections of lidocaine before a first or a second session in the elevated plus-maze, in a test-retest paradigm, was investigated. In addition to gross session analyses, a minute-by-minute analysis of the sessions was used to evaluate both anxiety and memory. Lidocaine injections before the test session produced increases in the frequency of entries, time spent and distance run in the open arms without affecting activity in the closed arms. During the retest session, saline- and lidocaine-treated rats exhibited increased indices of anxiety, and lidocaine-treated rats exhibited decreased closed-arm entries. The minute-by-minute analysis showed a faster decrease in anxiety-related behaviors during the test session in saline- than in lidocaine-treated rats, and a significant decrease in closed-arm exploration in saline-treated rats but not in lidocaine-treated ones. Lidocaine injection before the retest session produced increases in the frequency of entries, time spent and distance run in the open arms in the second session when compared with saline-treated rats. Minute-by-minute analysis showed an increase in the time spent in the open arms by lidocaine-treated animals at the beginning of the retest session in comparison with saline-treated animals, and a significant decrease in closed-arm exploration by both groups. These results suggest that inactivation of the medial septum by lidocaine affects the expression of unconditioned and conditioned forms of anxiety in the elevated plus-maze and, to a lesser extent, the acquisition and retention of spatial information. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The present work aimed to evaluate the effects of social separation for 14 days (chronic stress) and of withdrawal from a 14-day treatment with diazepam (acute stress) on the exploratory behaviour of male rats in the elevated plus-maze and on serotonin (5-hydroxytryptamine) turnover in different brain structures. Social separation had an anxiogenic effect, evidenced by fewer entries into, and less time spent on, the open arms of the elevated plus-maze. Separation also selectively increased 5-hydroxytryptamine turnover in the hippocampus and median raphe nucleus. Diazepam withdrawal had a similar anxiogenic effect in grouped animals and increased 5-hydroxytryptamine turnover in the same brain structures. Chronic treatment with imipramine during the 14 days of separation prevented the behavioural and neurochemical changes caused by social separation. It is suggested that the increase in anxiety produced by both acute and chronic stress is mediated by activation of the median raphe nucleus-hippocampal 5-hydroxytryptamine pathway.