979 results for fire hazard analysis
Abstract:
Background: Polyphenols may lower the risk of cardiovascular disease (CVD) and other chronic diseases due to their antioxidant and anti-inflammatory properties, as well as their beneficial effects on blood pressure, lipids and insulin resistance. However, no previous epidemiological studies have evaluated the relationship between the intake of total polyphenols and polyphenol subclasses and overall mortality. Our aim was to evaluate whether polyphenol intake is associated with all-cause mortality in subjects at high cardiovascular risk. Methods: We used data from the PREDIMED study, a 7,447-participant, parallel-group, randomized, multicenter, controlled five-year feeding trial aimed at assessing the effects of the Mediterranean diet in the primary prevention of cardiovascular disease. Polyphenol intake was calculated by matching food consumption data from repeated food frequency questionnaires (FFQ) with the Phenol-Explorer database on the polyphenol content of each reported food. Hazard ratios (HR) and 95% confidence intervals (CI) between polyphenol intake and mortality were estimated using time-dependent Cox proportional hazards models. Results: Over an average of 4.8 years of follow-up, we observed 327 deaths. After multivariate adjustment, we found a 37% relative reduction in all-cause mortality comparing the highest versus the lowest quintile of total polyphenol intake (HR = 0.63; 95% CI 0.41 to 0.97; P for trend = 0.12). Among the polyphenol subclasses, stilbenes and lignans were significantly associated with reduced all-cause mortality (HR = 0.48; 95% CI 0.25 to 0.91; P for trend = 0.04 and HR = 0.60; 95% CI 0.37 to 0.97; P for trend = 0.03, respectively), with no significant associations apparent for the remaining subclasses (flavonoids or phenolic acids). Conclusions: Among high-risk subjects, those who reported a high polyphenol intake, especially of stilbenes and lignans, showed a reduced risk of overall mortality compared to those with lower intakes. These results may be useful for determining the optimal polyphenol intake or the specific food sources of polyphenols that may reduce the risk of all-cause mortality.
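The hazard ratios above come from covariate-adjusted Cox proportional hazards regression comparing intake quintiles. The sketch below is only a minimal illustration of that general approach using the lifelines library; the file, column names (total_polyphenol_mg_d, followup_years, death, age, sex, bmi) and the covariate set are hypothetical placeholders, and the study's time-dependent modelling and full adjustment set are not reproduced.

```python
# Minimal sketch: Cox proportional hazards model with exposure quintiles
# (reference = lowest quintile). All column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical file, one row per participant; covariates assumed numeric

# Split the exposure into quintiles and dummy-code them, dropping q1 as the reference.
df["quintile"] = pd.qcut(df["total_polyphenol_mg_d"], 5, labels=["q1", "q2", "q3", "q4", "q5"])
model_df = pd.get_dummies(
    df[["followup_years", "death", "quintile", "age", "sex", "bmi"]],
    columns=["quintile"], drop_first=True, dtype=int,
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_years", event_col="death")
cph.print_summary()  # exp(coef) of quintile_q5 is the adjusted HR for the highest vs. lowest quintile
```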
Abstract:
Objectives: To examine the safety and effectiveness of cobalt-chromium everolimus eluting stents compared with bare metal stents. Design: Individual patient data meta-analysis of randomised controlled trials. Cox proportional regression models stratified by trial, containing random effects, were used to assess the impact of stent type on outcomes. Hazard ratios with 95% confidence intervals for outcomes were reported. Data sources and study selection: Medline, Embase, and the Cochrane Central Register of Controlled Trials. Randomised controlled trials that compared cobalt-chromium everolimus eluting stents with bare metal stents were selected. The principal investigators whose trials met the inclusion criteria provided data for individual patients. Primary outcomes: The primary outcome was cardiac mortality. Secondary endpoints were myocardial infarction, definite stent thrombosis, definite or probable stent thrombosis, target vessel revascularisation, and all cause death. Results: The search yielded five randomised controlled trials, comprising 4896 participants. Compared with patients receiving bare metal stents, participants receiving cobalt-chromium everolimus eluting stents had a significant reduction in cardiac mortality (hazard ratio 0.67, 95% confidence interval 0.49 to 0.91; P=0.01), myocardial infarction (0.71, 0.55 to 0.92; P=0.01), definite stent thrombosis (0.41, 0.22 to 0.76; P=0.005), definite or probable stent thrombosis (0.48, 0.31 to 0.73; P<0.001), and target vessel revascularisation (0.29, 0.20 to 0.41; P<0.001) at a median follow-up of 720 days. There was no significant difference in all cause death between groups (0.83, 0.65 to 1.06; P=0.14). Findings remained unchanged in multivariable regression after adjustment for the acuity of clinical syndrome (for instance, acute coronary syndrome v stable coronary artery disease), diabetes mellitus, female sex, use of glycoprotein IIb/IIIa inhibitors, and dual antiplatelet treatment for up to one year v longer duration. Conclusions: This meta-analysis offers evidence that, compared with bare metal stents, the use of cobalt-chromium everolimus eluting stents improves global cardiovascular outcomes, including cardiac survival, myocardial infarction, and overall stent thrombosis.
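In an individual patient data meta-analysis of this kind, the Cox model is stratified by trial so that each trial keeps its own baseline hazard. A hedged lifelines sketch of that stratification is given below; the column names (days_to_event, cardiac_death, ees, trial) are assumptions for illustration, and the random-effects component described in the abstract is not reproduced here.

```python
# Minimal sketch: pooled individual-patient-data Cox model stratified by trial,
# comparing everolimus eluting stents (ees=1) with bare metal stents (ees=0).
# Column names are hypothetical; the paper's random-effects term is omitted.
import pandas as pd
from lifelines import CoxPHFitter

ipd = pd.read_csv("ipd_pooled.csv")  # hypothetical file, one row per randomised patient

cph = CoxPHFitter()
cph.fit(
    ipd[["days_to_event", "cardiac_death", "ees", "trial"]],
    duration_col="days_to_event",
    event_col="cardiac_death",
    strata=["trial"],  # separate baseline hazard per trial
)
cph.print_summary()  # exp(coef) for ees approximates the pooled hazard ratio
```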
Abstract:
In this master's thesis, a probabilistic risk analysis of external hazards was carried out for the pool-based interim storage facility for spent nuclear fuel located at the Olkiluoto nuclear power plant. Probabilistic risk analysis (PRA) is a widely used approach for identifying and assessing risks at nuclear power plants. The purpose of the work was to produce a completely new external-hazards PRA, since no comparable risk assessments in this research area had previously been carried out in Finland. A further motivation for the assessment is the heightened role of external hazards in the safety of spent nuclear fuel interim storage, brought into focus by natural disasters that have occurred around the world. The structure of the PRA was based on a methodology created at the beginning of the study. The analysis rests on the identification of possible external hazards, excluding intentional man-made damage. Based on the occurrence frequencies and damage potential of the identified external hazards, each hazard was either screened out using the screening criteria defined in the study or analysed in more detail. The results show that data on very rarely occurring external hazards are incomplete. Most of these very rare external hazards have never occurred, and will most likely never occur, within the area affecting Olkiluoto or even in Finland. For example, the roles of lightning strikes and oil exposure, and their effects on the availability of various components, are known only with considerable uncertainty. The results of the study can be considered significant as a whole, because they point out the external hazards whose effects should be investigated in more detail. More detailed knowledge of very rarely occurring external hazards would refine the estimates of initiating event frequencies.
Abstract:
Mammary gland tumors are the most common type of tumor in bitches, but research on survival time after diagnosis is scarce. The purpose of this study was to investigate the relationship between survival time after mastectomy and a number of clinical and morphological variables. Data were collected retrospectively on bitches with mammary tumors seen at the Small Animal Surgery Clinic Service at the University of Brasília. All subjects had undergone mastectomy. Survival analysis was conducted using Cox's proportional hazards method. Of the 139 subjects analyzed, 68 died and 71 survived until the end of the study (64 months). Mean age was 11.76 years (SD = 2.71), and 53.84% were small dogs. 76.92% of the tumors were malignant, and 65.73% had both thoracic and inguinal glands affected. Survival time in months was associated with age (hazard rate ratio [HRR] = 1.23, p-value = 1.4×10⁻⁴), animal size (HRR between giant and small animals = 2.61, p-value = 0.02), nodule size (HRR = 1.09, p-value = 0.03), histological type (HRR between solid carcinoma and carcinoma in a mixed tumor = 2.40, p-value = 0.02), time between diagnosis and surgery (TDS, with HRR = 1.21, p-value = 2.7×10⁻¹⁵), and the interaction TDS*follow-up time (HRR = 0.98, p-value = 1.6×10⁻¹¹). The present study is one of the few on this subject. Several important covariates were evaluated, and age, animal size, nodule size, histological type, TDS and the TDS*follow-up time interaction were identified as significantly associated with survival time.
Abstract:
Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis. These challenges include survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey and register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey and register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Both measurement errors in spell durations and in spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
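The idea behind IPCW is to weight each spell that remains under observation by the inverse of its estimated probability of remaining uncensored, so that attrition related to observed characteristics no longer distorts the risk set. The sketch below is a deliberately simplified, single-weight version of that idea (full IPCW uses time-dependent weights from a censoring model); the file, covariates and column names are illustrative assumptions, not the FI ECHP variables.

```python
# Minimal IPCW sketch: weight each spell by the inverse of its estimated
# probability of remaining uncensored, then fit a weighted Kaplan-Meier curve.
# Variable names and the logistic censoring model are illustrative assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter
from sklearn.linear_model import LogisticRegression

spells = pd.read_csv("unemployment_spells.csv")       # hypothetical file, one row per spell
X = spells[["age", "education", "prior_spells"]]      # predictors of attrition/censoring
censored = (spells["event"] == 0).astype(int)         # 1 = censored (e.g. attrited before spell end)

# Model the probability of being censored and build inverse-probability-of-censoring weights.
p_censored = LogisticRegression(max_iter=1000).fit(X, censored).predict_proba(X)[:, 1]
spells["ipcw"] = 1.0 / (1.0 - p_censored).clip(min=0.05)  # truncate tiny probabilities

kmf = KaplanMeierFitter()
kmf.fit(spells["duration_months"], event_observed=spells["event"], weights=spells["ipcw"])
print(kmf.survival_function_.head())                  # weighted survival curve for spell durations
```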
Abstract:
Occupational stress is becoming a major issue on both the corporate and social agenda. In industrialized countries, there have been quite dramatic changes in working conditions during the last decade, caused by economic, social and technical development. As a consequence, people at work today are exposed to high quantitative and qualitative demands as well as hard competition driven by the global economy. A recent report says that ailments due to work-related stress are likely to cost India's exchequer around 72,000 crores between 2009 and 2015. Though India is a fast-developing country, it is yet to create facilities to mitigate the adverse effects of work stress; moreover, only limited efforts have been made to assess work-related stress. In the absence of well-defined standards for assessing work-related stress in India, an attempt is made in this direction to develop factors for the evaluation of work stress. Accordingly, with the help of the existing literature and in consultation with safety experts, seven factors for the evaluation of work stress were developed. An instrument (questionnaire) was developed using these seven factors for the evaluation of work stress. The validity and unidimensionality of the questionnaire were ensured by confirmatory factor analysis, and its reliability was ensured before administration. While analyzing the relationship between the variables, it was noted that no relationship exists between them, and hence the above factors are treated as independent factors/variables for the purpose of the research. Initially, five profit-making manufacturing industries under the public sector in the state of Kerala were selected for the study. The influence of the factors responsible for work stress was analyzed in these industries. The industries were classified into two types, namely chemical and heavy engineering, based on the product manufactured and the work environment, and the analysis was carried out further for these two categories. The variation of work stress with the age, designation and experience of the employees was analyzed by means of one-way ANOVA. Further, three different types of work stress models, namely factor modelling, structural equation modelling and multinomial logistic regression modelling, were developed to analyze the association of the factors responsible for work stress. All these models were found to be equally good at predicting work stress. The present study indicates that work stress exists among employees in public sector industries in Kerala. Employees in the 40-45 year age group and the 15-20 year experience group had relatively higher work demand, low job control and low support at work. Low job control was noted at lower designation levels, particularly at the worker level in these industries. Hence the instrument developed using the seven factors, namely demand, control, manager support, peer support, relationship, role and change, can be effectively used for the evaluation of work stress in industries.
Abstract:
A data centre is a centralized repository, either physical or virtual, for the storage, management and dissemination of data and information organized around a particular body, and it is the nerve centre of the present IT revolution. Data centres are expected to serve uninterruptedly round the year, and to perform these functions they consume enormous amounts of energy in the present scenario. Tremendous growth in demand from the IT industry has made it customary to develop newer technologies for the better operation of data centres. Energy conservation activities in data centres mainly concentrate on the air conditioning system, since it is the major mechanical sub-system and consumes a considerable share of the total power consumption of the data centre. The data centre energy metric is best represented by power utilization efficiency (PUE), which is defined as the ratio of the total facility power to the IT equipment power. Its value will be greater than one, and a large PUE value indicates that the sub-systems draw more power from the facility and that the performance of the data centre will be poor from the standpoint of energy conservation. PUE values of 1.4 to 1.6 are achievable by proper design and management techniques. Optimizing the air conditioning system brings an enormous opportunity to bring down the PUE value. The air conditioning system can be optimized by two approaches, namely thermal management and air flow management. Thermal management systems have now been introduced by some companies, but they are highly sophisticated and costly and have not yet received much attention in common rules of thumb.
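Since PUE is simply total facility power divided by IT equipment power, a value of 1.0 would mean every watt drawn reaches the IT load. A small sketch of that arithmetic follows; the power figures are made up for illustration only.

```python
# PUE = total facility power / IT equipment power (values below are made up).
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

it_load = 500.0              # kW drawn by servers, storage and network gear
cooling_and_losses = 250.0   # kW for air conditioning, UPS losses, lighting, etc.
print(pue(it_load + cooling_and_losses, it_load))  # 1.5, inside the 1.4-1.6 band cited above
```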
Abstract:
Reliability analysis is a well-established branch of statistics that deals with the statistical study of different aspects of the lifetimes of a system of components. As pointed out earlier, a major part of the theory and applications in connection with reliability analysis has been discussed based on measures defined in terms of the distribution function. In the beginning chapters of the thesis, we have described some attractive features of quantile functions and the relevance of their use in reliability analysis. Motivated by the works of Parzen (1979), Freimer et al. (1988) and Gilchrist (2000), who indicated the scope of quantile functions in reliability analysis, and as a follow-up of the systematic study in this connection by Nair and Sankaran (2009), in the present work we have tried to extend their ideas to develop the necessary theoretical framework for lifetime data analysis. In Chapter 1, we have given the relevance and scope of the study and a brief outline of the work we have carried out. Chapter 2 of this thesis is devoted to the presentation of various concepts and brief reviews of them, which are useful for the discussions in the subsequent chapters. In the introduction of Chapter 4, we have pointed out the role of ageing concepts in reliability analysis and in identifying life distributions. In Chapter 6, we have studied the first two L-moments of residual life and their relevance in various applications of reliability analysis. We have shown that the first L-moment of the residual function is equivalent to the vitality function, which has been widely discussed in the literature. In Chapter 7, we have defined the percentile residual life in reversed time (RPRL) and derived its relationship with the reversed hazard rate (RHR). We have discussed the characterization problem for the RPRL and demonstrated with an example that the RPRL for a given percentile does not determine the distribution uniquely.
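For readers unfamiliar with the quantile-based formulation, the basic objects referred to above can be written as follows, using the standard definitions from the quantile-based reliability literature (cf. Nair and Sankaran, 2009); the notation here is a sketch and not necessarily that of the thesis.

```latex
% Quantile function, quantile density function, and hazard quantile function,
% which plays the role of the hazard rate on the u-scale.
\[
  Q(u) = \inf\{x : F(x) \ge u\}, \qquad
  q(u) = \frac{\mathrm{d}Q(u)}{\mathrm{d}u}, \qquad
  H(u) = \frac{1}{(1-u)\,q(u)}, \qquad 0 < u < 1 .
\]
```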
Abstract:
This work identifies the importance of plenum pressure for the performance of the data centre. The methodology presently followed in industry treats the pressure drop across the tile as a dependent variable, but it is shown in this work that it is the single independent variable responsible for the entire flow dynamics in the data centre, and any design or assessment procedure must consider the pressure difference across the tile as the primary independent variable. This concept is further explained through studies on the effect of dampers on the flow characteristics. The dampers were found to introduce an additional pressure drop, thereby reducing the effective pressure drop across the tile. The effect of a damper is to change the flow in both quantitative and qualitative respects, but only the quantitative effect is considered when the damper is used as an aid for capacity control. Results from the present study suggest that the use of dampers in data centres should be avoided and that well-designed tiles which give the required flow rates should be used at the appropriate locations. In the present study, the effect of hot air recirculation is studied under suitable assumptions. It identifies that the pressure drop across the tile is the dominant parameter which governs recirculation. The rack suction pressure of the hardware, along with the pressure drop across the tile, determines the point of recirculation in the cold aisle. The positioning of hardware in the racks plays an important role in controlling the recirculation point. The present study is thus helpful in the design of data centre air flow based on the theory of jets. The air flow can be modelled both quantitatively and qualitatively based on the results.
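For context, perforated-tile airflow is commonly related to the plenum-to-room pressure difference through a generic orifice-type relation. The sketch below uses that relation only as an illustration of why the tile pressure drop acts as the governing variable; the discharge coefficient, open area and air density are illustrative assumptions, not values from this study.

```python
# Generic orifice-flow relation often used for perforated floor tiles:
# Q = Cd * A_open * sqrt(2 * dP / rho). Parameter values are illustrative only.
from math import sqrt

def tile_airflow_m3s(dp_pa: float, open_area_m2: float = 0.09,
                     cd: float = 0.6, rho: float = 1.2) -> float:
    """Volumetric airflow (m^3/s) through one tile for a pressure drop dp_pa (Pa)."""
    return cd * open_area_m2 * sqrt(2.0 * dp_pa / rho)

# Example: doubling the tile pressure drop raises the flow only by sqrt(2) (about 41%).
print(tile_airflow_m3s(10.0), tile_airflow_m3s(20.0))
```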
Abstract:
In the present study, the effect of hot air recirculation is studied under suitable assumptions. It identifies that the pressure drop across the tile is the dominant parameter which governs recirculation. The rack suction pressure of the hardware, along with the pressure drop across the tile, determines the point of recirculation in the cold aisle. The positioning of hardware in the racks plays an important role in controlling the recirculation point. The present study is thus helpful in the design of data centre air flow based on the theory of jets. The air flow can be modelled both quantitatively and qualitatively based on the results.
Abstract:
What is the relationship between the type of training combatants receive upon recruitment into an armed group and their propensity to abuse civilians in civil war? Does military training or political training prevent or exacerbate the victimization of civilians by armed non-state actors? While the literature on civilian victimization has expanded rapidly, few studies have examined the correlation between abuse of civilians and the modes of training that illegal armed actors receive. Using a simple formal model, we develop hypotheses regarding this connection and argue that while military training should not decrease the probability that a combatant engages in civilian abuse, political training should. We test these hypotheses using a new survey consisting of a representative sample of approximately 1,500 demobilized combatants from the Colombian conflict, which we match with department-level data on civilian casualties. The empirical analysis confirms our hypotheses about the connection between training and civilian abuse, and the results are robust to adding a full set of controls at both the department and the individual level.
Abstract:
Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies. This included use of sequential time-restricted harvest blocks (created for Woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis, but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviate the most from simulated natural patterns, and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
Abstract:
Office returns in the City of London are more volatile than in other UK markets. This volatility may reflect fluctuations in capital flows associated with changing patterns of ownership and the growing linkage between real estate and financial markets in the City. Using current and historical data, patterns of ownership in the City are investigated. They reveal that overseas ownership has grown markedly since 1985, that owners are predominantly FIRE sector firms and that there are strong links between ownership and occupation. This raises concerns about future volatility and systemic risk.
Abstract:
Since 1999, the National Commission for the Knowledge and Use of Biodiversity (CONABIO) in Mexico has been developing and managing the “Operational program for the detection of hot-spots using remote sensing techniques”. This program uses images from the MODerate resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites and from the Advanced Very High Resolution Radiometer of the National Oceanic and Atmospheric Administration (NOAA-AVHRR), which are operationally received through the Direct Readout (DR) station at CONABIO. This allows near-real-time monitoring of fire events in Mexico and Central America. In addition to the detection of active fires, the locations of hot spots are classified with respect to vegetation type, accessibility, and risk to Nature Protection Areas (NPA). Beyond the fast detection of fires, further analysis is necessary because of the considerable effects of forest fires on biodiversity and human life. This fire impact assessment is crucial to support the needs of resource managers and policy makers for adequate fire recovery and restoration actions. CONABIO attempts to meet these requirements by providing post-fire assessment products as part of the management system, in particular satellite-based burnt area mapping. This paper provides an overview of the main components of the operational system and presents an outlook on future activities and system improvements, especially the development of a burnt area product. A special focus is also placed on fire occurrence within the NPAs of Mexico.