154 results for Month
Abstract:
Complete biological nutrient removal (BNR) in a single-tank, sequencing batch reactor (SBR) process is demonstrated here at full scale on a typical domestic wastewater. The unique feature of the UniFed process is the introduction of the influent into the settled sludge blanket during the settling and decant periods of the SBR operation. This achieves suitable conditions for denitrification and anaerobic phosphate release, which is critical to successful biological phosphorus removal. It also achieves a selector effect, which helps in generating a compact, well-settling biomass in the reactor. The results of this demonstration show that it is possible to achieve well over 90% removal of COD, nitrogen and phosphorus in such a process. Effluent quality achieved over a six-month operating period directly after commissioning was: 29 mg/l COD, 0.5 mg/l NH4-N, 1.5 mg/l NOx-N and 1.5 mg/l PO4-P (50%-iles of daily samples). During an 8-day, intensive sampling period, the effluent BOD5 was
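As a minimal sketch of the removal-efficiency arithmetic behind the figures above: the influent concentrations used here are assumed placeholders for illustration, and only the effluent medians are taken from the abstract.

# Hypothetical sketch: percentage removal across a BNR SBR.
# Influent values are assumed; effluent medians (50%-iles) are from the abstract.
influent = {"COD": 450.0, "NH4-N": 45.0, "PO4-P": 16.0}   # mg/L, assumed
effluent = {"COD": 29.0, "NH4-N": 0.5, "PO4-P": 1.5}      # mg/L, from abstract

for analyte, c_in in influent.items():
    c_out = effluent[analyte]
    removal = 100.0 * (c_in - c_out) / c_in
    print(f"{analyte}: {removal:.1f}% removal")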
Abstract:
Objective: To determine the feasibility, safety and effectiveness of a structured clinical pathway for stratification and management of patients presenting with chest pain and classified as having intermediate risk of adverse cardiac outcomes in the subsequent six months. Design: Prospective clinical audit. Participants and setting: 630 consecutive patients who presented to the emergency department of a metropolitan tertiary care hospital between January 2000 and June 2001 with chest pain and intermediate-risk features. Intervention: Use of the Accelerated Chest Pain Assessment Protocol (ACPAP), as advocated by the Management of unstable angina guidelines - 2000 from the National Heart Foundation and the Cardiac Society of Australia and New Zealand. Main outcome measure: Adverse cardiac events during six-month follow-up. Results: 409 patients (65%) were reclassified as low risk and discharged at a mean of 14 hours after assessment in the chest pain unit. None had missed myocardial infarctions, while three (1%) had cardiac events at six months (all elective revascularisation procedures, with no readmissions with acute coronary syndromes). Another 110 patients (17%) were reclassified as high risk, and 21 (19%) of these had cardiac events (mainly revascularisations) by six months. Patients who were unable to exercise or had non-diagnostic exercise stress test results (equivocal risk) had an intermediate cardiac event rate (8%). Conclusions: This study validates use of ACPAP. The protocol eliminated missed myocardial infarction; allowed early, safe discharge of low-risk patients; and led to early identification and management of high-risk patients.
Abstract:
The purpose of this investigation was to assess changes in total energy expenditure (TEE), body weight (BW) and body composition following a peripheral blood stem cell transplant and following participation in a 3-month, moderate-intensity, mixed-type exercise programme. The doubly labelled and singly labelled water methods were used to measure TEE and total body water (TBW). Body weight and TBW were then used to calculate percentage body fat (%BF), and fat and fat-free mass (FFM). TEE and body composition measures were assessed pretransplant (PI), immediately post-transplant (PII) and 3 months post-PII (PIII). Following PII, 12 patients were divided equally into a control group (CG) or an exercise intervention group (EG). While there was no change in TEE between pre- and post-transplant, BW (P
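The body-composition step described above can be sketched as follows. The 0.732 hydration fraction of fat-free mass is the conventional assumption for water-dilution methods, and the sample inputs are invented, not data from the study.

# Hypothetical sketch: body composition from body weight and total body water (TBW),
# as in an isotope-dilution approach. Assumes fat-free mass is ~73.2% water.
HYDRATION_FRACTION = 0.732

def body_composition(body_weight_kg: float, tbw_kg: float):
    ffm = tbw_kg / HYDRATION_FRACTION          # fat-free mass (kg)
    fat_mass = body_weight_kg - ffm            # fat mass (kg)
    pct_body_fat = 100.0 * fat_mass / body_weight_kg
    return ffm, fat_mass, pct_body_fat

ffm, fat, pbf = body_composition(body_weight_kg=78.0, tbw_kg=42.0)  # invented inputs
print(f"FFM {ffm:.1f} kg, fat {fat:.1f} kg, %BF {pbf:.1f}%")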
Abstract:
Purpose: The purpose of this investigation was to evaluate the impact of undertaking peripheral blood stem cell transplantation (PBST) on T-cell number and function, and to determine the role of a mixed-type, moderate-intensity exercise program in facilitating the recovery of T-cell number and function. Methods: Immunological measures of white blood cell, lymphocyte, CD3(+), CD4(+), and CD8(+) counts, and CD3(+) cell function were assessed pretransplant (PI), immediately posttransplant (PII), and 1 month (I1), 2 months (I2) and 3 months (PIII) posttransplant. After PII, 12 patients were divided equally into a control group (CG) or an exercise intervention group (EG). Results: Lower total T-cell, helper T-cell, and suppressor T-cell counts (P < 0.01), as well as lower T-cell function (P < 0.01), when compared with normative data, were found at PI. More specifically, 88% of the group had CD3(+), CD4(+), and CD8(+) counts that were more than 40%, 20%, and 50% below normal at PI, respectively. Undertaking a PBST caused further adverse changes to the total leukocyte, lymphocyte, CD3(+), CD4(+) and CD8(+) counts, and the helper/suppressor ratio. Although CD8(+) counts had returned to normal by PIII, CD3(+), CD4(+), and the CD4(+)/CD8(+) ratio remained significantly lower than normative data (P < 0.01), with 66%, 100%, and 100% of the subject group, respectively, reporting counts and ratios below the normal range. Conclusion: The PBST patients were immunocompromised before undertaking the transplant, and the transplant procedure imposed further adverse changes to the leukocyte and lymphocyte counts. The leukocyte and CD8(+) counts returned to normal within 3 months posttransplant; however, the other immunological parameters assessed demonstrated a delayed recovery. Although participation in the exercise program did not facilitate a faster immune cell recovery, neither did it hinder or delay recovery.
Abstract:
Objective: To review the outcome of acute liver failure (ALF) and the effect of liver transplantation in children in Australia. Methodology: A retrospective review was conducted of all paediatric patients referred with acute liver failure between 1985 and 2000 to the Queensland Liver Transplant Service, a paediatric liver transplant centre based at the Royal Children's Hospital, Brisbane, that is one of three paediatric transplant centres in Australia. Results: Twenty-six patients were referred with ALF. Four patients did not require transplantation and recovered with medical therapy, while two were excluded because of irreversible neurological changes and died. Of the 20 patients considered for transplant, three refused for social and/or religious reasons, leaving 17 patients listed for transplantation. One patient recovered spontaneously and one died before receiving a transplant. There were 15 transplants, of which 40% (6/15) were in children < 2 years old. Sixty-seven per cent (10/15) survived > 1 month after transplantation, and forty per cent (6/15) survived more than 6 months after transplant. There were only four long-term survivors after transplant for ALF (27%). Overall, 27% (6/22) of patients referred with ALF survived. Of the 16 patients who died, 44% (7/16) died of neurological causes, mostly cerebral oedema, although two patients transplanted for valproate hepatotoxicity died of neurological disease despite good graft function. Conclusions: Irreversible neurological disease remains a major cause of death in children with ALF. We recommend better patient selection, and early referral and transfer to a transplant centre before the onset of irreversible neurological disease, to optimize the outcome of children transplanted for ALF.
Abstract:
The objectives of this study are to (1) quantify prior cardiopulmonary resuscitation (CPR) training in households of patients presenting to the Emergency Department (ED) with or without chest pain or ischaemic heart disease (IHD); (2) evaluate the willingness of household members to undertake CPR training; and (3) identify potential barriers to the learning and provision of bystander CPR. A cross-sectional study was conducted by surveying patients presenting to the ED of a metropolitan teaching hospital over a 6-month period. Two in five households of patients presenting with chest pain or IHD had prior training in CPR. This was no higher than for households of patients presenting without chest pain or IHD. Just under two in three households of patients presenting with chest pain or IHD were willing to participate in future CPR classes. Potential barriers to learning CPR included lack of information on CPR classes, perceived lack of intellectual and/or physical capability to learn CPR, and concern about causing anxiety in the person at risk of cardiac arrest. Potential barriers to CPR provision included the cardiac arrest victim being unknown to the rescuer and fear of infection. The ED provides an opportunity for increasing family and community capacity for bystander intervention through referral to appropriate training. (C) 2003 Published by Elsevier Science Ireland Ltd.
Abstract:
Objective: To evaluate the benefits of coordinating community services through the Post-Acute Care (PAC) program in older patients after discharge from hospital. Design: Prospective, multicentre, randomised controlled trial with six months of follow-up and blinded outcome measurement. Setting: Four university-affiliated metropolitan general hospitals in Victoria. Participants: All patients aged 65 years and over who were discharged between August 1998 and October 1999 and required community services after discharge. Interventions: Participants were randomly allocated to receive the services of a Post-Acute Care (PAC) coordinator (intervention) versus usual discharge planning (control). Main outcome measures: Comparison of quality of life and carer stress at one month post-discharge, mortality, hospital readmissions, use of community services, and community and hospital costs over the six months post-discharge. Results: 654 patients were randomised, and 598 were included in the analysis (311 in the PAC group and 287 in the control group). There was no difference in mortality between the groups (both 6%), but overall quality-of-life scores at one-month follow-up were significantly greater in the PAC group. There was no difference in unplanned readmissions, but PAC patients used significantly fewer hospital bed-days in the six months after discharge (mean, 3.0 days; 95% CI, 2.1-3.9) than control patients (5.2 days; 95% CI, 3.8-6.7). Total costs (including hospitalisation, community services and the intervention) were lower in the PAC group than in the control group (mean difference, $1545; 95% CI, $11-$3078). Conclusions: The PAC program is beneficial in the transition from hospital to the community in older patients.
Abstract:
In the 1996 baseline surveys of the Australian Longitudinal Study of Women's Health (ALSWH), 36.1% of mid-age women (45-50) and 35% of older women (70-75) reported leaking urine. This study aimed to investigate (a) the range of self-management strategies used to deal with urinary incontinence (UI); (b) the reasons why many women who report leaking urine do not seek help for UI; and (c) the types of health professionals consulted and treatment provided, and perceptions of satisfaction with these, among a sample of women in each age group who reported leaking urine 'often' at baseline. Five hundred participants were randomly selected from women in each of the mid-age and older cohorts of the ALSWH who had reported leaking urine often in a previous survey. Details about UI (frequency, severity, and situations), self-management behaviors and help-seeking for UI, types of health professional consulted, recommended treatment for the problem, and satisfaction with the service provided by health care professionals and the outcomes of recommended treatments were sought through a self-report mailed follow-up survey. Most respondents had leaked urine in the last month (94% and 91% of mid-age and older women, respectively), and 72.2% and 73.1% of mid-age and older women, respectively, had sought help or advice about their UI. In both age groups, the likelihood of having sought help significantly increased with severity of incontinence. The most common reasons for not seeking help were that the women felt they could manage the problem themselves or they did not consider it to be a problem. Many women in both cohorts had employed avoidance techniques in an attempt to prevent leaking urine, including reducing their liquid consumption, going to the toilet 'just in case', and rushing to the toilet the minute they felt the need to go. Strategies are needed to inform women who experience UI of more effective management techniques and the possible health risks associated with commonly used avoidance behaviors. There may be a need to better publicize existing incontinence services and improve access to these services for women of all ages.
Abstract:
Aims: The objectives of the current study were (1) to measure type and severity of urinary leakage and (2) to investigate the association between these factors and age-related life events and conditions in three groups of Australian women with a history of urinary leakage. Methods: Five hundred participants were randomly selected from women in the young (aged 18-22 in 1996), mid-age (45-50), and older (70-75) cohorts of the Australian Longitudinal Study of Women's Health (ALSWH) who had reported leaking urine in the 1996 baseline survey. Details about leaking urine (frequency, severity, situations) and associated factors (pregnancy, childbirth, body mass index [BMI]) were sought through self-report mailed follow-up surveys in 1999. Results & Conclusions: Response rates were 50%, 83%, and 80% in the young, mid-age, and older women, respectively. Most women confirmed that they had leaked urine in the past month, and the majority of these were cases of mixed incontinence. Incontinence severity tended to increase with BMI for women of all ages, and increased severity scores were associated with having urine that burns or stings. Additional independent risk factors for increasing incontinence severity were heavy smoking in young women, past or present use of hormone replacement therapy in older women, and BMI and history of hysterectomy in mid-age women. (C) 2003 Wiley-Liss, Inc.
Abstract:
A research program on atmospheric boundary layer processes and local wind regimes in complex terrain was conducted in the vicinity of Lake Tekapo in the southern Alps of New Zealand, during two 1-month field campaigns in 1997 and 1999. The effects of the interaction of thermal and dynamic forcing were of specific interest, with a particular focus on the interaction of thermal forcing of differing scales. The rationale and objectives of the field and modeling program are described, along with the methodology used to achieve them. Specific research aims include improved knowledge of the role of surface forcing associated with varying energy balances across heterogeneous terrain, thermal influences on boundary layer and local wind development, and dynamic influences of the terrain through channeling effects. Data were collected using a network of surface meteorological and energy balance stations, radiosonde and pilot balloon soundings, tethered balloon and kite-based systems, sodar, and an instrumented light aircraft. These data are being used to investigate the energetics of surface heat fluxes, the effects of localized heating/cooling and advective processes on atmospheric boundary layer development, and dynamic channeling. A complementary program of numerical modeling includes application of the Regional Atmospheric Modeling System (RAMS) to case studies characterizing typical boundary layer structures and airflow patterns observed around Lake Tekapo. Some initial results derived from the special observation periods are used to illustrate progress made to date. In spite of the difficulties involved in obtaining good data and undertaking modeling experiments in such complex terrain, initial results show that surface thermal heterogeneity has a significant influence on local atmospheric structure and wind fields in the vicinity of the lake. This influence occurs particularly in the morning. However, dynamic channeling effects and the larger-scale thermal effect of the mountain region frequently override these more local features later in the day.
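As a rough illustration of the surface energy balance bookkeeping behind the flux-station analysis mentioned above, the sketch below closes Rn = H + LE + G; all values are invented placeholders, not Lake Tekapo observations.

# Hypothetical sketch: closing a surface energy balance of the kind measured
# by surface energy balance stations. All flux values are assumed.
net_radiation = 550.0      # Rn, W m-2 (assumed)
sensible_heat = 210.0      # H, W m-2 (assumed)
latent_heat = 260.0        # LE, W m-2 (assumed)
ground_heat = 60.0         # G, W m-2 (assumed)

residual = net_radiation - (sensible_heat + latent_heat + ground_heat)
bowen_ratio = sensible_heat / latent_heat
print(f"Energy balance residual: {residual:.0f} W m-2")
print(f"Bowen ratio: {bowen_ratio:.2f}")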
Abstract:
Time motion analysis is extensively used to assess the demands of team sports. At present there is only limited information on the reliability of measurements using this analysis tool. The aim of this study was to establish the reliability of an individual observer's time motion analysis of rugby union. Ten elite level rugby players were individually tracked in Southern Hemisphere Super 12 matches using a digital video camera. The video footage was subsequently analysed by a single researcher on two occasions one month apart. The test-retest reliability was quantified as the typical error of measurement (TEM) and rated as either good (<5% TEM), moderate (5-10% TEM) or poor (>10% TEM). The total time spent in the individual movements of walking, jogging, striding, sprinting, static exertion and being stationary had moderate to poor reliability (5.8-11.1% TEM). The frequency of individual movements had good to poor reliability (4.3-13.6% TEM), while the mean duration of individual movements had moderate reliability (7.1-9.3% TEM). For the individual observer in the present investigation, time motion analysis was shown to be moderately reliable as an evaluation tool for examining the movement patterns of players in competitive rugby. These reliability values should be considered when assessing the movement patterns of rugby players within competition.
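A minimal sketch of the typical-error calculation and the rating bands above. The paired test-retest values are invented, and the common formula TEM = SD(differences)/sqrt(2), expressed as a coefficient of variation, is assumed to be the one intended.

# Hypothetical sketch: typical error of measurement (TEM) from two analyses
# of the same footage, expressed as a percentage of the mean and rated with
# the bands above (<5% good, 5-10% moderate, >10% poor). Data are invented.
import statistics
from math import sqrt

trial1 = [62.0, 48.5, 71.2, 55.0, 80.3, 44.1]   # e.g. seconds spent jogging, analysis 1
trial2 = [58.4, 50.1, 69.0, 59.2, 77.5, 47.3]   # same players, analysis 2

diffs = [a - b for a, b in zip(trial1, trial2)]
tem = statistics.stdev(diffs) / sqrt(2)
grand_mean = statistics.mean(trial1 + trial2)
tem_pct = 100.0 * tem / grand_mean

rating = "good" if tem_pct < 5 else "moderate" if tem_pct <= 10 else "poor"
print(f"TEM = {tem_pct:.1f}% ({rating})")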
Abstract:
A grazing trial was conducted to quantify N cycling in degraded Leucaena leucocephala (leucaena)-Brachiaria decumbens (signal grass) pastures grown on an acid, infertile, podzolic soil in south-east Queensland. Nitrogen accumulation and cycling in leucaena-signal grass pastures were evaluated for 9 weeks until all of the leucaena on offer (mean 600 kg edible dry matter (EDM)/ha, 28% of total pasture EDM) was consumed. Nitrogen pools in the grass, leucaena, soil, cattle liveweight, faeces and urine were estimated. The podzolic soil (pH 4.8-5.9) was found to be deficient in P, Ca and K. Leucaena leaf tissues contained deficient levels of N, P and Ca. Grass tissues were deficient in N and P. Grazing was found to cycle 65% of N on offer in pasture herbage. However, due to the effect of the plant nutrient imbalances described above, biological N fixation by leucaena contributed only 15 kg/ha N to the pasture system over the 9-month regrowth period, of which 13 kg/ha N was cycled. Cattle retained 1.8 kg/ha N (8% of total N consumed) in body tissue and the remainder was excreted in dung and urine in approximately equal proportions. Mineral soil N concentrations did not change significantly (-3.5 kg/ha N) over the trial period. The ramifications of grazing and fertiliser management strategies, and implications for pasture rundown and sustainability are discussed.
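The nitrogen bookkeeping reported above can be illustrated with a small budget sketch. The 1.8 kg/ha retained and the 8% retention figure come from the abstract; the implied intake and the even dung/urine split are back-calculated assumptions for illustration only.

# Hypothetical sketch of the animal nitrogen budget arithmetic described above.
n_retained = 1.8                      # kg N/ha retained in liveweight (from abstract)
retention_fraction = 0.08             # 8% of N consumed (from abstract)

n_consumed = n_retained / retention_fraction          # implied intake (assumed derivation)
n_excreted = n_consumed - n_retained
n_dung = n_urine = n_excreted / 2.0                   # "approximately equal proportions"

print(f"Implied N intake: {n_consumed:.1f} kg/ha")
print(f"Excreted in dung: {n_dung:.1f} kg/ha, in urine: {n_urine:.1f} kg/ha")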
Abstract:
The influence of complex plaque morphology on the extent of demand-induced ischemia in unselected patients is not well defined. We sought to investigate the functional significance of lesion morphology in patients who underwent coronary angiography and dobutamine stress echocardiography (DSE). Angiography and DSE were performed within a 6-month period (mean 1 +/- 1 month) in 196 patients. Angiographic assessments involved quantification of stenosis severity, assessment of the extent of jeopardized myocardium, and categorization of plaque morphology according to the Ambrose classification. DSE was interpreted by separate investigators with respect to wall motion score index (WMSI) and number of coronary territories involved. A general linear model was constructed to assess the independent contribution of patient characteristics and angiographic and DSE results with respect to extent of ischemic myocardium. Complex lesion morphology was seen in 62 patients (32%). Patients with complex lesions were more likely to have had prior myocardial infarction (p < 0.001) and be current smokers (p = 0.03). During angiography, they exhibited a trend toward a greater number of diseased vessels, had a greater coronary jeopardy score (p < 0.001) and more frequent collateral flow (p = 0.03). During echocardiography, patients had a higher stress WMSI (p < 0.001) and were more likely to show ischemia in all 3 arterial territories (p < 0.01). On multivariate regression, the coronary artery jeopardy score and the presence of complex plaque morphology were independent predictors of the extent of ischemic myocardium (R² = 34%, p < 0.001). Thus, patients with complex plaque morphology are older, more likely to smoke, and more likely to have had prior myocardial infarction. They exhibit more extensive disease with higher coronary jeopardy scores and a higher resting and peak stress WMSI. Despite these differences, complex plaque morphology remains an independent predictor of the extent of ischemia during stress. (C) 2003 by Excerpta Medica, Inc.
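A hedged sketch of the kind of general linear model described above, fitted with statsmodels on simulated data. The variable names, coefficients and data are illustrative assumptions; the abstract does not give the actual model specification.

# Hypothetical sketch: a linear model relating extent of ischemic myocardium
# to coronary jeopardy score and complex plaque morphology. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 196
jeopardy_score = rng.integers(0, 13, size=n).astype(float)   # assumed score range
complex_plaque = rng.integers(0, 2, size=n).astype(float)    # 0/1 indicator
ischemic_extent = 0.8 * jeopardy_score + 2.5 * complex_plaque + rng.normal(0, 3, n)

X = sm.add_constant(np.column_stack([jeopardy_score, complex_plaque]))
model = sm.OLS(ischemic_extent, X).fit()
print(model.summary())   # coefficients, p-values and R-squared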
Abstract:
Measurement of Health-Related Quality of Life (HRQoL) of the elderly requires instruments with demonstrated sensitivity, reliability, and validity, particularly with the increasing proportion of older people entering the health care system. This article reports the psychometric properties of the 12-item Assessment of Quality of Life (AQoL) instrument in chronically ill community-dwelling elderly people with an 18-month follow-up. Comparator instruments included the SF-36 and the OARS. Construct validity of the AQoL was strong when examined via factor analysis and convergent and divergent validity against other scales. Receiver Operating Characteristic (ROC) curve analyses and relative efficiency estimates indicated the AQoL is sensitive, responsive, and had the strongest predictive validity for nursing home entry. It was also sensitive to economic prediction over the follow-up. Given these robust psychometric properties and the brevity of the scale, the AQoL appears to be a suitable instrument for epidemiologic studies where HRQoL and utility data are required from elderly populations. (C) 2003 Elsevier Science Inc. All rights reserved.
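A minimal sketch of a ROC analysis of the kind reported above, using scikit-learn on invented scores; the utility values and nursing-home outcomes are simulated, not study data.

# Hypothetical sketch: ROC analysis of a quality-of-life utility score as a
# predictor of nursing home entry. Scores and outcomes are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
entered_nursing_home = rng.integers(0, 2, size=200)
# Assume lower utility among those who later entered care (illustrative only).
aqol_utility = rng.normal(0.70, 0.15, size=200) - 0.15 * entered_nursing_home

auc = roc_auc_score(entered_nursing_home, -aqol_utility)   # lower score = higher risk
fpr, tpr, thresholds = roc_curve(entered_nursing_home, -aqol_utility)
print(f"AUC for predicting nursing home entry: {auc:.2f}")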
Abstract:
CONTEXT: Despite more than 2 decades of outcomes research after very preterm birth, clinicians remain uncertain about the extent to which neonatal morbidities predict poor long-term outcomes of extremely low-birth-weight (ELBW) infants. OBJECTIVE: To determine the individual and combined prognostic effects of bronchopulmonary dysplasia (BPD), ultrasonographic signs of brain injury, and severe retinopathy of prematurity (ROP) on 18-month outcomes of ELBW infants. DESIGN: Inception cohort assembled for the Trial of Indomethacin Prophylaxis in Preterms (TIPP). SETTING AND PARTICIPANTS: A total of 910 infants with birth weights of 500 to 999 g who were admitted to 1 of 32 neonatal intensive care units in Canada, the United States, Australia, New Zealand, and Hong Kong between 1996 and 1998 and who survived to a postmenstrual age of 36 weeks. MAIN OUTCOME MEASURES: Combined end point of death or survival to 18 months with 1 or more of cerebral palsy, cognitive delay, severe hearing loss, and bilateral blindness. RESULTS: Each of the neonatal morbidities was similarly and independently correlated with a poor 18-month outcome. Odds ratios were 2.4 (95% confidence interval [CI], 1.8-3.2) for BPD, 3.7 (95% CI, 2.6-5.3) for brain injury, and 3.1 (95% CI, 1.9-5.0) for severe ROP. In children who were free of BPD, brain injury, and severe ROP, the rate of poor long-term outcomes was 18% (95% CI, 14%-22%). Corresponding rates with any 1, any 2, and all 3 neonatal morbidities were 42% (95% CI, 37%-47%), 62% (95% CI, 53%-70%), and 88% (95% CI, 64%-99%), respectively. CONCLUSION: In ELBW infants who survive to a postmenstrual age of 36 weeks, a simple count of 3 common neonatal morbidities strongly predicts the risk of later death or neurosensory impairment.
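The morbidity-count result above lends itself to a simple lookup sketch: the rates are the point estimates quoted in the abstract, while the function itself is an illustrative convenience, not a published scoring tool.

# Hypothetical sketch: rate of death or neurosensory impairment at 18 months
# by the number of neonatal morbidities (BPD, brain injury, severe ROP) present,
# using the point estimates quoted above.
RATE_BY_MORBIDITY_COUNT = {0: 0.18, 1: 0.42, 2: 0.62, 3: 0.88}

def poor_outcome_rate(bpd: bool, brain_injury: bool, severe_rop: bool) -> float:
    count = sum([bpd, brain_injury, severe_rop])
    return RATE_BY_MORBIDITY_COUNT[count]

print(poor_outcome_rate(bpd=True, brain_injury=False, severe_rop=True))  # 0.62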