801 results for Hold-up risk
Abstract:
The renewed concern in assessing the risks and consequences of technological hazards in industrial and urban areas continues to emphasize the development of local-scale consequence analysis (CA) modelling tools able to predict short-term pollution episodes and exposure effects on humans and the environment in the event of accidents involving hazardous gases (hazmat). In this context, the main objective of this thesis is the development and validation of the EFfects of Released Hazardous gAses (EFRHA) model. This modelling tool is designed to simulate the outflow and atmospheric dispersion of heavy and passive hazmat gases in complex and built-up areas, and to estimate the exposure consequences of short-term pollution episodes in accordance with regulatory/safety threshold limits. The model comprises five main modules built on up-to-date methods: meteorological, terrain, source term, dispersion, and effects. Accident scenarios with different initial physical states can be examined. The dispersion module, considered the core of the developed tool, uses a shallow-layer modelling approach capable of accounting for the main influence of obstacles on hazmat gas dispersion. Model validation includes qualitative and quantitative analyses of the main outputs, comparing modelled results against measured and/or modelled databases. The preliminary analysis of the meteorological and source term modules against outputs from extensively validated models shows a consistent description of ambient conditions and of the variation of the hazmat gas release. Dispersion results are compared against measured observations in obstructed and unobstructed areas for different release and dispersion scenarios. The performance validation exercise yielded acceptable agreement, showing a reasonable numerical representation of the measured features.
In general, quality metrics are within or close to the acceptance limits recommended for ‘non-CFD models’, demonstrating the model's capability to reasonably predict the accidental release and atmospheric dispersion of hazmat gases in industrial and urban areas. The EFRHA model was also applied to a particular case study, the Estarreja Chemical Complex (ECC), for a set of accidental release scenarios within a CA scope. The results show the magnitude of potential effects on the surrounding populated area and the influence of the type of accident and of the environment on the main outputs. Overall, the present thesis shows that the EFRHA model can be used as a straightforward tool to support CA studies in the scope of training and planning, and also to support decision-making and emergency response in the event of accidental hazmat gas releases in industrial and built-up areas.
Abstract:
Seismic risk evaluation of built-up areas involves analysis of the level of earthquake hazard of the region, building vulnerability, and exposure. Within this approach to defining seismic risk, building vulnerability assessment assumes great importance, not only because of the obvious physical consequences in the eventual occurrence of a seismic event, but also because it is one of the few aspects in which engineering research can intervene. In fact, rigorous vulnerability assessment of existing buildings and the implementation of appropriate retrofitting solutions can help to reduce the levels of physical damage, loss of life, and the economic impact of future seismic events. Vulnerability studies of urban centres should be developed with the aim of identifying building fragilities and reducing seismic risk. As part of the rehabilitation of the historic city centre of Coimbra, a complete identification and inspection survey of old masonry buildings has been carried out. The main purpose of this research is to discuss vulnerability assessment methodologies, particularly first-level methods, through the proposal and development of a method previously used to determine the level of vulnerability, in the assessment of physical damage and its relationship with seismic intensity.
Abstract:
Cardiovascular diseases are the leading cause of death in Portugal, as in the other Western countries. Several factors increase the risk of their occurrence; these do not usually appear in isolation, tending instead to cluster in the individual. This coexistence results in a combined effect larger than that expected from the sum of the individual effects. Global cardiovascular risk is defined as the probability of developing a cardiovascular event over a given period of time (generally 10 years). The purpose of global cardiovascular risk assessment is to identify the individuals who should be counseled and receive treatment to prevent cardiovascular disease, as well as to establish the level of therapeutic aggressiveness.
Abstract:
Background: Diabetes mellitus is one of the major causes of chronic morbidity and loss of quality of life, and its prevalence seems set to increase in the coming decades. The overall prevalence of diabetes in Portugal in 2010, according to the latest National Observatory of Diabetes Report, was 12.4%, corresponding to a total of approximately 991 thousand individuals aged between 20 and 79 years. The level of control of diabetes mellitus, as measured by glycosylated haemoglobin A1c (HbA1c), influences the long-term risk of macrovascular and microvascular complications. Given the frequent association of diabetes with hypertension, dyslipidemia, and overweight, managing these risk factors is a crucial part of diabetes control.
Abstract:
Background World Health Organization hand hygiene guidelines state that if electric hand dryers are used, they should not aerosolize pathogens. Previous studies have investigated the dispersal by different hand-drying devices of chemical indicators, fungi, and bacteria on the hands. This study assessed the aerosolization and dispersal of virus on the hands to determine any differences between hand-drying devices in their potential to contaminate other occupants of public washrooms and the washroom environment. Methods A suspension of MS2, a bacteriophage of Escherichia coli, was used to artificially contaminate the hands of participants prior to using three different hand-drying devices: a jet air dryer, a warm air dryer, and a paper towel dispenser. Virus was detected by plaque formation on agar plates layered with the host bacterium. Vertical dispersal of virus was assessed at a fixed distance (0.4 m) and over a range of heights (0.0–1.8 m) from the floor. Horizontal dispersal was assessed at distances of up to three metres from the hand-drying devices. Virus aerosolization and dispersal were also assessed at different times up to 15 minutes after use by means of air sampling at two distances (0.1 and 1.0 m) and at a position behind and offset from each of the hand-drying devices. Results Over a range of heights, the jet air dryer was shown to produce over 60 times greater vertical dispersal of virus from the hands than a warm air dryer and over 1300 times greater than paper towels, the maximum being detected between 0.6 and 1.2 metres from the floor. Horizontal dispersal of virus by the jet air dryer was over 20 times greater than a warm air dryer and over 190 times greater than paper towels, virus being detected at distances of up to three metres.
Air sampling at three different positions from the hand-drying devices 15 minutes after use showed that the jet air dryer produced over 50 times greater viral contamination of the air than a warm air dryer and over 110 times greater than paper towels. Conclusions Due to their high air speed, jet air dryers aerosolize and disperse more virus over a range of heights, greater distances, and for longer times than other hand-drying devices. If hands are inadequately washed, they have a greater potential to contaminate other occupants of a public washroom and the washroom environment. Main messages: Jet air dryers with claimed air speeds of over 600 kph have a greater potential than warm air dryers or paper towels to aerosolize and disperse viruses on the hands of users. The choice of hand-drying device should be carefully considered. Jet air dryers may increase the risk of transmission of human viruses, such as norovirus, particularly if hand washing is inadequate.
Abstract:
Coal contains trace elements and naturally occurring radionuclides such as 40K, 232Th, and 238U. When coal is burned, minerals, including most of the radionuclides, do not burn and become concentrated in the ash at several times their content in the coal. Usually, a small fraction of the fly ash produced (2-5%) is released into the atmosphere. The activities released depend on many factors (concentration in the coal, ash and inorganic matter content of the coal, combustion temperature, ratio between bottom and fly ash, filtering system). Therefore, marked differences should be expected between the by-products produced and the amount of activity discharged (per unit of energy produced) by different coal-fired power plants. The effects of these releases on the environment through ground deposition have received some attention, but the results of these studies are not unanimous and cannot be taken as a generic conclusion for all coal-fired power plants. In this study, dispersion modelling of natural radionuclides was carried out to assess the impact of continuous atmospheric releases from a selected coal plant. The natural radioactivity of the coal and the fly ash was measured, and the dispersion was modelled with a Gaussian plume, estimating the activity concentration at different heights up to a distance of 20 km in several wind directions. External and internal doses (inhalation and ingestion) and the resulting risk were calculated for the population living within 20 km of the coal plant. On average, the effective dose is lower than the ICRP's limit and the risk is lower than the U.S. EPA's limit; therefore, in this situation, the considered exposure does not pose any risk. However, when considering dispersion in the prevailing wind direction, these values become significant owing to increases of 75% and 44% in the 232Th and 226Ra concentrations, respectively.
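The type of Gaussian plume screening estimate described above can be sketched as follows. This is an illustrative implementation, not the study's code; the Briggs rural coefficients for neutral stability (class D), the ground-reflection term, and all numeric inputs are assumptions for demonstration.

```python
import math

def briggs_sigma(x):
    """Briggs rural dispersion coefficients for stability class D (assumed),
    as functions of downwind distance x in metres; returns (sigma_y, sigma_z)."""
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    return sigma_y, sigma_z

def plume_concentration(q, u, h, x, y, z):
    """Steady-state Gaussian plume concentration with ground reflection.

    q: source strength (e.g. Bq/s), u: wind speed (m/s),
    h: effective release height (m), x: downwind distance (m),
    y: crosswind offset (m), z: receptor height (m).
    Returns concentration in the same activity units per m^3.
    """
    sy, sz = briggs_sigma(x)
    lateral = math.exp(-y**2 / (2 * sy**2))
    # Reflection at the ground is modelled with a mirror source at -h.
    vertical = (math.exp(-(z - h)**2 / (2 * sz**2))
                + math.exp(-(z + h)**2 / (2 * sz**2)))
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical
```

With hypothetical inputs (1 MBq/s release, 5 m/s wind, 100 m stack), evaluating `plume_concentration` on a grid of downwind distances and wind sectors yields the activity concentration field from which inhalation and ground-deposition doses would then be derived.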
Abstract:
Background: Little is known about the risk of progression to hazardous alcohol use in people currently drinking within safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. Methods: A prospective cohort study of adult general practice attendees in six European countries and Chile, followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and below 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking, defined by an AUDIT score >= 8 in men and >= 5 in women. Results: 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome, and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedges' g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and a Hedges' g of 0.68 (95% CI 0.57, 0.78). Conclusions: The predictAL risk model for the development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in the prevention of alcohol misuse.
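The c-index reported above is the probability that the model ranks a randomly chosen attendee who went on to hazardous drinking above a randomly chosen attendee who did not (ties counting half). A minimal sketch of that statistic for a binary outcome, using hypothetical predicted scores rather than the study's data, could look like this:

```python
from itertools import product

def c_index(scores, outcomes):
    """Concordance index for a binary outcome.

    scores: predicted risk scores; outcomes: 1 for cases (developed
    hazardous drinking), 0 for non-cases. Returns the fraction of
    case/non-case pairs where the case scores higher, counting ties as 0.5.
    """
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    controls = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = list(product(cases, controls))
    if not pairs:
        raise ValueError("need at least one case and one non-case")
    wins = sum(1.0 if c > n else 0.5 if c == n else 0.0 for c, n in pairs)
    return wins / len(pairs)
```

A c-index of 0.5 corresponds to chance-level discrimination and 1.0 to perfect ranking, which is why values such as 0.839 (Europe) and 0.781 (Chile) indicate useful but imperfect separation of future hazardous drinkers from safe drinkers.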
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology/Molecular Biology
Abstract:
Introduction: We previously reported the results of a phase II study of patients with newly diagnosed primary CNS lymphoma (PCNSL) treated with autologous peripheral blood stem-cell transplantation (aPBSCT) and response-adapted whole brain radiotherapy (WBRT). The purpose of this report is to update the initial results and provide long-term data regarding overall survival, prognostic factors, and the risk of treatment-related neurotoxicity. Methods: A long-term follow-up was conducted on surviving primary central nervous system lymphoma patients treated according to the "OSHO-53 study", which was initiated by the Ostdeutsche Studiengruppe Hämatologie-Onkologie. Between August 1999 and October 2004, twenty-three patients with an average age of 55 and a median Karnofsky performance score of 70% were enrolled and received high-dose methotrexate (HD-MTX) on days 1 and 10. In case of at least a partial remission (PR), high-dose busulfan/thiotepa (HD-BuTT) followed by aPBSCT was performed. Patients without response to induction or without complete remission (CR) after HD-BuTT received WBRT. All patients (n=8) who were alive in 2011 were contacted, and the Mini Mental State Examination (MMSE) and the EORTC QLQ-C30 were administered. Results: Eight patients are still alive, with a median follow-up of 116.9 months (range 79-141). One of them suffered a late relapse eight and a half years after the initial diagnosis of PCNSL; another suffers from a gall bladder carcinoma. Both patients are alive: the one with the relapse of PCNSL has finished rescue therapy and remains under observation, and the one with gall bladder carcinoma is still under therapy. The MMSE and QLQ-C30 showed impressive results in the patients who were not irradiated. Only one of the irradiated patients is still alive, with a clear neurologic deficit but acceptable quality of life. Conclusions: Long-term follow-up of the patients included in the OSHO-53 study shows an overall survival of 30 percent. When WBRT can be avoided, no long-term neurotoxicity has been observed and the patients benefit from excellent quality of life. Induction chemotherapy with two cycles of HD-MTX should be intensified to improve the unsatisfactory overall survival of 30 percent.
Abstract:
BACKGROUND: Clinical scores may help physicians to better assess the individual risk/benefit of oral anticoagulant therapy. We aimed to externally validate and compare the prognostic performance of 7 clinical prediction scores for major bleeding events during oral anticoagulation therapy. METHODS: We followed 515 adult patients taking oral anticoagulants and recorded the first major bleeding event over a 12-month follow-up period. The performance of each score in predicting the risk of major bleeding, and the physician's subjective assessment of bleeding risk, were compared using the C statistic. RESULTS: The cumulative incidence of a first major bleeding event during follow-up was 6.8% (35/515). According to the 7 scoring systems, the proportions of major bleeding ranged from 3.0% to 5.7% for low-risk, 6.7% to 9.9% for intermediate-risk, and 7.4% to 15.4% for high-risk patients. The overall predictive accuracy of the scores was poor, with C statistics ranging from 0.54 to 0.61 and not significantly different from each other (P=.84). Only the Anticoagulation and Risk Factors in Atrial Fibrillation score performed slightly better than would be expected by chance (C statistic, 0.61; 95% confidence interval, 0.52-0.70). The performance of the scores was not statistically better than physicians' subjective risk assessments (C statistic, 0.55; P=.94). CONCLUSION: The performance of 7 clinical scoring systems in predicting major bleeding events in patients receiving oral anticoagulation therapy was poor and not better than physicians' subjective assessments.
Abstract:
Preterm children born before 32 weeks of gestation represent 1% of the annual births in Switzerland and are the most at risk of neurodevelopmental disabilities. Neurological surveillance is therefore implemented in the neonatal units, and multidisciplinary neurodevelopmental follow-up is offered to all our preterm patients. The follow-up clinics of the University hospitals in Lausanne and Geneva follow the Swiss guidelines for follow-up. At each appointment an extended history is taken, a neurological examination is performed, and a standardized developmental test is administered. These examinations, which take place between the ages of 3 months and 9 years, allow the early identification and treatment of developmental disorders frequent in this population, such as motor, cognitive, or behavioral disorders, as well as the monitoring of the quality of neonatal care.
Abstract:
A computerized handheld procedure is presented in this paper. It is intended as a complementary database tool to enhance prospective risk analysis in the field of occupational health. The Pendragon Forms software (version 3.2) was used to implement acquisition procedures on Personal Digital Assistants (PDAs) and to transfer data to a computer in MS-Access format. The proposed data acquisition strategy relies on the risk assessment method practiced at the Institute of Occupational Health Sciences (IST). It involves the use of a systematic hazard list and semi-quantitative risk assessment scales. A set of 7 modular forms has been developed to cover the basic needs of field audits. Despite the minor drawbacks observed, the results obtained so far show that handhelds are adequate to support field risk assessment and follow-up activities. Further improvements must still be made to increase the tool's effectiveness and field adequacy.
Abstract:
BACKGROUND: Recommendations for statin use for the primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy obtained with three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes: the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology (ESC) guidelines. With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD >20%, and with SCORE as a 10-year risk of fatal CVD ≥5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increasing statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: Participants classified at high risk (both genders) were 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% at high risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but moderate with ESC. From a population perspective, full compliance with the ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS, and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
Abstract:
BACKGROUND: Data from prospective cohort studies regarding the association between subclinical hyperthyroidism and cardiovascular outcomes are conflicting. We aimed to assess the risks of total and coronary heart disease (CHD) mortality, CHD events, and atrial fibrillation (AF) associated with endogenous subclinical hyperthyroidism among all available large prospective cohorts. METHODS: Individual data on 52 674 participants were pooled from 10 cohorts. Coronary heart disease events were analyzed in 22 437 participants from 6 cohorts with available data, and incident AF was analyzed in 8711 participants from 5 cohorts. Euthyroidism was defined as a thyrotropin level between 0.45 and 4.49 mIU/L and endogenous subclinical hyperthyroidism as a thyrotropin level lower than 0.45 mIU/L with normal free thyroxine levels, after excluding those receiving thyroid-altering medications. RESULTS: Of 52 674 participants, 2188 (4.2%) had subclinical hyperthyroidism. During follow-up, 8527 participants died (including 1896 from CHD), 3653 of 22 437 had CHD events, and 785 of 8711 developed AF. In age- and sex-adjusted analyses, subclinical hyperthyroidism was associated with increased total mortality (hazard ratio [HR], 1.24; 95% CI, 1.06-1.46), CHD mortality (HR, 1.29; 95% CI, 1.02-1.62), CHD events (HR, 1.21; 95% CI, 0.99-1.46), and AF (HR, 1.68; 95% CI, 1.16-2.43). Risks did not differ significantly by age, sex, or preexisting cardiovascular disease and were similar after further adjustment for cardiovascular risk factors, with attributable risk ranging from 14.5% for total mortality to 41.5% for AF in those with subclinical hyperthyroidism. Risks for CHD mortality and AF (but not other outcomes) were higher for a thyrotropin level lower than 0.10 mIU/L compared with a thyrotropin level between 0.10 and 0.44 mIU/L (for both, P value for trend, .03).
CONCLUSION: Endogenous subclinical hyperthyroidism is associated with increased risks of total mortality, CHD mortality, and incident AF, with the highest risks of CHD mortality and AF when the thyrotropin level is lower than 0.10 mIU/L.
Abstract:
BACKGROUND: Mild cognitive impairment (MCI) has been defined as a transitional state between normal aging and dementia. In many cases, MCI represents an early stage of developing cognitive impairment. Patients diagnosed with MCI do not meet the criteria for dementia, as their general intellect and everyday activities are preserved, although minor changes in instrumental activities of daily living (ADL) may occur. However, they may exhibit significant behavioral and psychological signs and symptoms (BPS), also frequently observed in patients with Alzheimer's disease (AD). Hence, we asked to what extent specific BPS are associated with cognitive decline in participants with MCI or AD. METHODS: Our sample consisted of 164 participants, including 46 patients with amnestic (single- or multi-domain) MCI and 54 patients with AD, as well as 64 control participants without cognitive disorders. Global cognitive performance, BPS, and ADL were assessed using validated clinical methods at baseline and at two-year follow-up. RESULTS: The variability of BPS over the follow-up period was more pronounced in the MCI group than in patients with AD: some BPS improved, others occurred newly or worsened, while others still remained unchanged. Moreover, specific changes in BPS were associated with rapid deterioration of the global cognitive level in MCI patients. In particular, an increase in euphoria, eating disorders, and aberrant motor behavior, as well as worsened sleep quality, predicted a decline in cognitive functioning. CONCLUSIONS: Our findings confirm a higher variability of BPS over time in the MCI group than in AD patients. Moreover, our results provide evidence of associations between specific BPS and cognitive decline in the MCI group that might suggest a risk of conversion of individuals with amnestic MCI to AD.