306 results for Assessing Climatic Risk
Abstract:
Objectives: Resternotomy is a common part of cardiac surgical practice. Associated with resternotomy are the risks of cardiac injury and catastrophic hemorrhage, with consequent elevated morbidity and mortality in the operating room or during the postoperative period. The technique of direct vision resternotomy is safe and has few, if any, serious cardiac injuries. The technique, the reduced need for groin cannulation, and the overall low operative mortality and morbidity are the focus of this retrospective analysis. Methods: The records of 495 patients undergoing 546 resternotomies over a 21-year period to January 2000 were reviewed. All consecutive reoperations by the one surgeon comprised patients over the age of 20 at first resternotomy: M:F 343:203, mean age 57 years (range 20 to 85, median 60). The mean NYHA grade was 2.3 [67 patients class I, 273 class II, 159 class III, 43 class IV, and 4 class V], with elective reoperation in 94.6%. Cardiac injury was graded into five groups, and the incidence of and reasons for groin cannulation were estimated. The morbidity and mortality resulting from the reoperation and resternotomy were assessed. Results: The hospital/30-day mortality was 2.9% (95% CI: 1.6%-4.4%) (16 deaths) over the 21 years. First (481), second (53), and third (12) resternotomies produced 307 uncomplicated technical reopenings, 203 slower but uncomplicated procedures, 9 minor superficial cardiac lacerations, and no moderate or severe cardiac injuries. Direct vision resternotomy is crystallized into the principle that only adhesions that are visualized from below are divided and only sternal bone that is freed of adhesions is sawn. Groin exposure was never performed prophylactically for resternotomy.
Fourteen patients (2.6%) had such cannulation for aortic dissection/aneurysm (9 patients), excessive sternal adherence of cardiac structures (3 patients), presurgery cardiac arrest (1 patient), and high aortic cannulation desired but not possible (1 patient). The average postoperative blood loss was 594 mL (95% CI: 558-631) in the first 12 hours. The rate of return to the operating room for control of excessive bleeding was 2% (11 patients). Blood transfusion was given in 65% of the resternotomy procedures over the 21 years (mean 854 mL, 95% CI: 765-945 mL) and in 41% over the last 5 years. Conclusions: The technique of direct vision resternotomy has been associated with zero moderate or major cardiac injuries/catastrophic hemorrhage at reoperation. Few patients have required groin cannulation. In the postoperative period, there were acceptable blood loss and transfusion rates, reduced morbidity, and moderately low mortality for this potentially high-risk group.
Abstract:
In a primary analysis of a large, recently completed randomized trial of 915 high-risk patients undergoing major abdominal surgery, we found no difference in outcome between patients receiving perioperative epidural analgesia and those receiving IV opioids, apart from the incidence of respiratory failure. Therefore, we performed a selected number of predetermined subgroup analyses to identify specific types of patients who may have derived benefit from epidural analgesia. We found no difference in outcome between epidural and control groups in subgroups at increased risk of respiratory or cardiac complications or undergoing aortic surgery, nor in a subgroup with failed epidural block (all P > 0.05). There was a small reduction in the duration of postoperative ventilation (geometric mean [SD]: control group, 0.3 [6.5] h, versus epidural group, 0.2 [4.8] h, P = 0.048). No differences were found in length of stay in intensive care or in the hospital. There was no relationship between the frequency of use of epidural analgesia in routine practice outside the trial and benefit from epidural analgesia in the trial. We found no evidence that perioperative epidural analgesia significantly influences major morbidity or mortality after major abdominal surgery.
Abstract:
1. Ice-volume-forced glacial-interglacial cyclicity is the major cause of global climate variation within the late Quaternary period. Within the Australian region, this variation is expressed predominantly as oscillations in moisture availability. Glacial periods were substantially drier than today, with restricted distribution of mesic plant communities, shallow or ephemeral water bodies and extensive aeolian dune activity. 2. Superimposed on this cyclicity in Australia is a trend towards drier and/or more variable climates within the last 350 000 years. This trend may have been initiated by changes in atmospheric and ocean circulation resulting from Australia's continued movement into the Southeast Asian region, involving the onset or intensification of the El Niño-Southern Oscillation system and a reduction in summer monsoon activity. 3. Increased biomass burning, stemming originally from increased climatic variability and later enhanced by the activities of indigenous people, resulted in more open and sclerophyllous vegetation, increased salinity and a further reduction in water availability. 4. Past records combined with recent observations suggest that the degree of environmental variability will increase and the drying trend will be enhanced in the foreseeable future, regardless of the extent or nature of human intervention.
Abstract:
In this paper, it is shown that, for a wide range of risk-averse generalized expected utility preferences, independent risks are complementary, contrary to the results for expected utility preferences satisfying conditions such as proper and standard risk aversion.
Abstract:
Four pollen and charcoal records derived from marine cores around the northern perimeter of Australia are examined to provide a regional picture of patterns, causes and impacts of climate change over the last 100-300 ka. The availability of radiocarbon dates and oxygen isotope records for the cores provides primary chronological control. Spectral analysis of components of these records demonstrates an overall importance of Milankovitch frequencies, with clear glacial-interglacial cyclicity dominated by variation in precipitation. In addition, a number of pollen taxa, as well as charcoal particles, exhibit a 30 ka frequency that is considered, from its relationship with biomass burning and with results of past modelling, to reflect changes in the intensity of El Niño-Southern Oscillation (ENSO) variability. Pollen components of all records show a decline, frequently stepwise, in more fire-sensitive vegetation and its replacement with more fire-tolerant vegetation. There is some evidence that this trend is linked to an onset or general increase in ENSO activity, and perhaps also to variation in monsoon activity dating from about 300 ka BP that was caused by changes to oceanic circulation within the Indonesian region. The trend may have accelerated within the last 45 ka due to burning by indigenous people.
Abstract:
Background and Purpose: This study was undertaken to better clarify the risks associated with cigarette smoking and subarachnoid hemorrhage (SAH). Methods: The study included 432 incident cases of SAH frequency-matched to 473 community SAH-free controls to determine dose-dependent associations of active and passive smoking (at home) and smoking cessation with SAH. Results: Compared with never smokers not exposed to passive smoking, the adjusted odds ratio for SAH among current smokers was 5.0 (95% confidence interval [CI], 3.1 to 8.1); for past smokers, 1.2 (95% CI, 0.8 to 2.0); and for passive smokers, 0.9 (95% CI, 0.6 to 1.5). Current and lifetime exposures showed a clear dose-dependent effect, and risks appeared more prominent in women and for aneurysmal SAH. Approximately 1 in 3 cases of SAH could be attributed to current smoking, but risk declines quickly after smoking cessation, even among heavy smokers. Conclusions: A strong positive association was found between cigarette smoking and SAH, especially for aneurysmal SAH and in women; the excess risk is virtually eliminated within a few years of smoking cessation. Large opportunities exist for preventing SAH through smoking avoidance and cessation programs.
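The adjusted odds ratios quoted above come from standard case-control methodology. As a minimal sketch of how an unadjusted odds ratio and its log-based (Woolf) 95% confidence interval are computed from a 2x2 table (the counts below are purely illustrative, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-based) 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts only, chosen for illustration
or_, lo, hi = odds_ratio_ci(60, 20, 40, 80)
print(f"OR = {or_:.1f} (95% CI, {lo:.1f} to {hi:.1f})")
```

The published estimates are additionally adjusted for confounders (via logistic regression), which this two-by-two sketch does not attempt.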
Abstract:
Background and Purpose: Limited information exists on the long-term prognosis after first-ever stroke. We aimed to determine the absolute frequency of first recurrent stroke and disability and the relative frequency of recurrent stroke over 10 years after first-ever stroke in Perth, Western Australia. Methods: For a 12-month period beginning February 1989, all individuals with suspected acute stroke or transient ischemic attack who lived in a geographically defined and representative region of Perth were registered prospectively. Patients with a definite first-ever stroke were followed up 10 years after the index event. Results: Over 10 years of follow-up, the cumulative risk of a first recurrent stroke was 43% (95% confidence interval [CI], 34 to 51). After the first year after first-ever stroke, the average annual risk of recurrent stroke was approximately 4%. Case fatality at 30 days after first recurrent stroke was 41%, significantly greater than the 30-day case fatality after first-ever stroke (22%) (P=0.003). For 30-day survivors of first-ever stroke, the 10-year cumulative risk of death or new institutionalization was 79% (95% CI, 73 to 85) and of death or new disability was 87% (95% CI, 81 to 92). Conclusions: Over 10 years of follow-up, the risk of first recurrent stroke is 6 times greater than the risk of first-ever stroke in the general population of the same age and sex, almost one half of survivors remain disabled, and one seventh require institutional care. Effective strategies for prevention of stroke need to be implemented early, monitored frequently, and maintained long term after first-ever stroke.
Abstract:
The prevalence of colonization with the anaerobic intestinal spirochaetes Brachyspira aalborgi and Brachyspira pilosicoli was investigated in humans (n = 316) and dogs (n = 101) living on three tea estates in Assam, India. Colonization was detected using PCR on DNA from faeces. Nineteen (6%) human faecal samples contained B. aalborgi DNA, 80 (25.3%) contained B. pilosicoli DNA, and 10 (3.2%) contained DNA from both species. One canine sample contained DNA from B. pilosicoli. Significant factors for B. aalborgi colonization in logistic regression were: infection of family members with B. aalborgi (P < 0.001), being a resident of Balipara (P = 0.03), and use of water treatment (P = 0.03). For B. pilosicoli, significant factors were: other family members being positive for B. pilosicoli (P < 0.001), water obtained from a well (P = 0.006), water treatment (P = 0.03), and not having visited a doctor in the previous 12 months (P = 0.03).
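The risk-factor screen above uses logistic regression, in which each factor's coefficient is a log odds ratio for colonization. A self-contained toy sketch of the idea, fitting a single binary exposure by gradient ascent on synthetic data (not the study's model, data, or software):

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit y ~ sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic cohort: exposure (x=1) triples the odds of colonization,
# i.e. the true slope is ln(3)
random.seed(0)
xs = [random.randint(0, 1) for _ in range(2000)]
ys = []
for x in xs:
    p = 1.0 / (1.0 + math.exp(-(-1.0 + math.log(3) * x)))
    ys.append(1 if random.random() < p else 0)

b0, b1 = fit_logistic(xs, ys)
print(f"estimated odds ratio for exposure: {math.exp(b1):.2f}")
```

A real analysis such as the one reported would include several covariates at once and use standard statistical software; the sketch only shows where the odds ratios come from.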
Abstract:
This study examined whether people born in other countries had higher rates of death and hospitalization due to road crashes than people born in Australia. Data on deaths due to road crashes that occurred in the whole of Australia between 1994 and 1997, and on hospitalizations that occurred in the state of New South Wales, Australia, between 1 July 1995 and 30 June 1997, were analyzed. The rates of death and hospitalization, adjusted for age and area of residence, were calculated using population data from the 1996 Australian census. The study categorized people born in other countries according to the language (English speaking, non-English speaking) and the road convention (left-hand side, right-hand side) of their country of birth. Australia has a left-hand side driving convention. The study found that drivers born in other countries had rates of death or hospitalization due to road trauma equal to or below those of Australian-born drivers. In contrast, pedestrians born in other countries, especially older pedestrians, had higher rates of death and hospitalization due to road crashes. Pedestrians aged 60 years or more born in non-English-speaking countries where traffic travels on the right-hand side of the road had risks about twice those of Australian-born pedestrians in the same age group.
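Rates "adjusted for age", as reported above, are commonly obtained by direct standardization: each age stratum's rate is weighted by the reference population's share of that stratum (here the 1996 census serves as the reference). A minimal sketch with invented numbers:

```python
def age_standardized_rate(stratum_rates, standard_pop):
    """Directly standardized rate: weight each age-stratum rate by the
    standard population's share of that stratum."""
    total = sum(standard_pop)
    return sum(r * n for r, n in zip(stratum_rates, standard_pop)) / total

# Hypothetical deaths per 100,000 by age band, with hypothetical
# census counts per band (0-29, 30-59, 60+ years)
rates = [2.0, 5.0, 20.0]
census = [8_000_000, 7_000_000, 3_000_000]

print(f"age-standardized rate: {age_standardized_rate(rates, census):.2f} per 100,000")
```

Standardizing both groups (Australian-born and overseas-born) against the same census weights makes their rates comparable despite different age structures.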
Abstract:
OBJECTIVE To determine the prevalence, intensity and associated risk factors for infection with Ascaris, hookworms and Trichuris in three tea-growing communities in Assam, India. METHODS Single faecal samples were collected from 328 individuals and subjected to centrifugal flotation and the Kato-Katz quantitation technique, and the prevalence and intensity of infection with each parasite were calculated. Associations between parasite prevalence and intensity and host and environmental factors were then assessed using both univariate and multivariate analysis. RESULTS The overall prevalence of Ascaris was 38% [95% confidence interval (CI): 33, 43], and the individual prevalence of hookworm and Trichuris was 43% (95% CI: 38, 49). The strongest predictors of the intensity of one or more geohelminths in multiple regression (P ≤ 0.10) were socioeconomic status, age, household crowding, level of education, religion, use of footwear when outdoors, defecation practices, pig ownership and water source. CONCLUSION Universal blanket treatment with broad-spectrum anthelmintics, together with promotion of scholastic and health education and improvements in sanitation, is recommended for helminth control in the communities under study.
Abstract:
Objective: Existing evidence suggests that family interventions can be effective in reducing relapse rates in schizophrenia and related conditions. Despite this, such interventions are not routinely delivered in Australian mental health services. The objective of the current study is to investigate the incremental cost-effectiveness ratios (ICERs) of introducing three types of family interventions, namely: behavioural family management (BFM); behavioural intervention for families (BIF); and multiple family groups (MFG) into current mental health services in Australia. Method: The ICER of each of the family interventions is assessed from a health sector perspective, including the government, persons with schizophrenia and their families/carers, using a standardized methodology. A two-stage approach is taken to the assessment of benefit. The first stage involves a quantitative analysis based on disability-adjusted life years (DALYs) averted. The second stage involves application of 'second filter' criteria (including equity, strength of evidence, feasibility and acceptability to stakeholders) to the results. The robustness of the results is tested using multivariate probabilistic sensitivity analysis. Results: In descending order of cost-effectiveness, the interventions are BIF (A$8000 per DALY averted), followed by MFG (A$21 000 per DALY averted) and lastly BFM (A$28 000 per DALY averted). The inclusion of time costs makes BFM more cost-effective than MFG. Variation of the discount rate has no effect on conclusions. Conclusions: All three interventions are considered 'value-for-money' within an Australian context. This conclusion needs to be tempered against the methodological challenge of converting clinical outcomes into a generic economic outcome measure (DALY). Issues surrounding the feasibility of routinely implementing such interventions need to be addressed.
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of League Tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
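The Monte Carlo uncertainty intervals described above follow a simple recipe: sample cost and DALYs averted from their assumed distributions, form the ratio, and read off percentiles. A sketch with purely illustrative parameters (the distributions and values below are assumptions, not the study's inputs):

```python
import random

def icer_uncertainty_interval(n=100_000, seed=1):
    """Simulate cost (A$) and DALYs averted; return the median cost per
    DALY and a 95% uncertainty interval (2.5th/97.5th percentiles)."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n):
        cost = rng.gauss(2_000_000, 300_000)    # hypothetical program cost
        dalys = max(rng.gauss(250, 40), 1e-6)   # hypothetical DALYs averted
        ratios.append(cost / dalys)
    ratios.sort()
    return (ratios[n // 2],
            ratios[int(0.025 * n)],
            ratios[int(0.975 * n)])

median, lo, hi = icer_uncertainty_interval()
print(f"A${median:,.0f} per DALY (95% UI: A${lo:,.0f}-A${hi:,.0f})")
```

In practice the simulation would propagate uncertainty in every model input (efficacy, adherence, unit costs, disability weights), but the percentile-of-simulated-ratios mechanism is the same.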