44 results for Cloud Computing, Risk Assessment, Security, Framework

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

As a consequence of flood impacts, communities inhabiting mountain areas are increasingly affected by considerable damage to infrastructure and property. The design of effective flood risk mitigation strategies and their subsequent implementation is crucial for sustainable development in mountain areas. The assessment of the dynamic evolution of flood risk is the pillar of any subsequent planning process targeted at reducing the expected adverse consequences of the hazard impact. Given these premises, firstly, a comprehensive method to derive flood hazard process scenarios for well-defined areas at risk is presented. Secondly, conceptualisations of static and dynamic flood risk assessment are provided. These are based on formal schemes for computing the risk mitigation performance of the devised mitigation strategies within the framework of economic cost-benefit analysis. In this context, techniques for quantifying the expected losses induced by the identified flood impacts are provided.
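
The risk formulation sketched in this abstract (scenario-based expected losses weighed against mitigation costs) can be illustrated with a short numerical example. The Python sketch below computes an expected annual damage from a set of hazard scenarios and the benefit-cost ratio of a mitigation strategy; all probabilities, losses, and costs are hypothetical placeholders, not values from the study.

```python
# Sketch: expected annual damage (EAD) and benefit-cost ratio of a mitigation
# strategy, following the generic formulation risk = probability x consequence.
# All numbers below are hypothetical placeholders, not values from the study.

def expected_annual_damage(scenarios):
    """Sum of annual exceedance probability x expected loss over all scenarios."""
    return sum(p * loss for p, loss in scenarios)

# Hazard scenarios as (annual probability, expected loss in monetary units).
baseline = [(0.10, 2.0e6), (0.02, 8.0e6), (0.005, 2.5e7)]

# Same scenarios after a (hypothetical) mitigation measure reduces the losses.
mitigated = [(0.10, 0.5e6), (0.02, 3.0e6), (0.005, 1.5e7)]

annualised_mitigation_cost = 2.0e5  # e.g. construction cost spread over lifetime

risk_reduction = expected_annual_damage(baseline) - expected_annual_damage(mitigated)
benefit_cost_ratio = risk_reduction / annualised_mitigation_cost

print(f"EAD baseline : {expected_annual_damage(baseline):,.0f}")
print(f"EAD mitigated: {expected_annual_damage(mitigated):,.0f}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```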

Relevance:

100.00%

Publisher:

Abstract:

Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimal amount of computing and network resources to use to ensure that the performance requirements of all of their applications are met. They are also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
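
To make the idea of SLA-driven scaling concrete, the Python sketch below shows a simple threshold-based rule of the kind such a platform could derive from an SLA performance guarantee. The thresholds, metric names, and scale-in margin are illustrative assumptions, not the algorithms evaluated in the thesis.

```python
# Sketch of an SLA-driven horizontal scaling rule for one distributed service.
# Thresholds and metric names are illustrative assumptions, not the thesis algorithms.

from dataclasses import dataclass

@dataclass
class ScalingRule:
    sla_response_ms: float   # SLA-defined response-time guarantee
    min_vms: int = 1
    max_vms: int = 20
    headroom: float = 0.8    # scale out before the guarantee is actually violated

    def decide(self, current_vms: int, avg_response_ms: float) -> int:
        """Return the new VM count for the service tier."""
        if avg_response_ms > self.headroom * self.sla_response_ms:
            return min(current_vms + 1, self.max_vms)      # scale out
        if avg_response_ms < 0.4 * self.sla_response_ms:
            return max(current_vms - 1, self.min_vms)      # scale in
        return current_vms                                   # keep current size

rule = ScalingRule(sla_response_ms=200.0)
print(rule.decide(current_vms=3, avg_response_ms=180.0))  # -> 4 (scale out)
print(rule.decide(current_vms=3, avg_response_ms=60.0))   # -> 2 (scale in)
```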

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the work reported here is to test reliable molecular profiles using routinely processed formalin-fixed paraffin-embedded (FFPE) tissues from participants of the clinical trial BIG 1-98 with a median follow-up of 60 months.

Relevance:

100.00%

Publisher:

Abstract:

Because of the development of modern transportation facilities, an ever-rising number of individuals, including many patients with preexisting diseases, visit high-altitude locations (>2500 m). High-altitude exposure triggers a series of physiologic responses intended to maintain adequate tissue oxygenation. Even in normal subjects, there is enormous interindividual variability in these responses, which may be further amplified by environmental factors such as cold temperature, low humidity, exercise, and stress. These adaptive mechanisms, although generally tolerated by most healthy subjects, may induce major problems in patients with preexisting cardiovascular diseases in whom the functional reserves are already limited. Pre-exposure assessment of patients helps to minimize risk and detect contraindications to high-altitude exposure. Moreover, the great variability and unpredictability of the adaptive response should encourage physicians counseling such patients to adopt a cautious approach. Here, we will briefly review how high-altitude adjustments may interfere with and aggravate or decompensate preexisting cardiovascular diseases. Moreover, we will provide practical recommendations on how to investigate and counsel patients with cardiovascular disease who wish to travel to high-altitude locations.

Relevance:

100.00%

Publisher:

Abstract:

Aim: To investigate the association of the Periodontal Risk Assessment (PRA) model categories with periodontitis recurrence and tooth loss during supportive periodontal therapy (SPT) and to explore the role of patient compliance. Material and Methods: In a retrospective cohort, PRA was performed for 160 patients after active periodontal therapy (APT) and after 9.5 ± 4.5 years of SPT. The recurrence of periodontitis and tooth loss were analysed according to the patient's risk profile (low, moderate or high) after APT and compliance with SPT. The association of risk factors with tooth loss and recurrence of periodontitis was investigated using logistic regression analysis. Results: Periodontitis recurred in 18.2% of patients with a low-risk profile, 42.2% of patients with a moderate-risk profile and 49.2% of patients with a high-risk profile after APT. During SPT, 1.61 ± 2.8 teeth per patient were lost. High-risk profile patients lost significantly more teeth (2.59 ± 3.9) than patients with moderate- (1.02 ± 1.8) or low-risk profiles (1.18 ± 1.9) (Kruskal–Wallis test, p=0.0229). Patients with erratic compliance lost significantly more teeth (3.11 ± 4.5) than patients compliant with SPT (1.07 ± 1.6) (Kruskal–Wallis test, p=0.0067). Conclusions: In multivariate logistic regression analysis, a high-risk patient profile according to the PRA model at the end of APT was associated with recurrence of periodontitis. Another significant factor for recurrence of periodontitis was an SPT duration of more than 10 years.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: The survival of patients admitted to an emergency department is determined by the severity of acute illness and the quality of care provided. The high number and the wide spectrum of severity of illness of admitted patients make an immediate assessment of all patients unrealistic. The aim of this study is to evaluate a scoring system based on readily available physiological parameters immediately after admission to an emergency department (ED) for the purpose of identifying at-risk patients. Methods: This prospective observational cohort study includes 4,388 consecutive adult patients admitted via the ED of a 960-bed tertiary referral hospital over a period of six months. The occurrence of each of seven potential vital sign abnormalities (threat to airway, abnormal respiratory rate, oxygen saturation, systolic blood pressure, heart rate, low Glasgow Coma Scale and seizures) was collected and summed to generate the vital sign score (VSS). VSSinitial was defined as the VSS in the first 15 minutes after admission, and VSSmax as the maximum VSS throughout the stay in the ED. The occurrence of single vital sign abnormalities in the first 15 minutes, VSSinitial and VSSmax were evaluated as potential predictors of hospital mortality. Results: Logistic regression analysis identified all evaluated single vital sign abnormalities except seizures and abnormal respiratory rate as independent predictors of hospital mortality. Increasing VSSinitial and VSSmax were significantly correlated with hospital mortality (odds ratio (OR) 2.80, 95% confidence interval (CI) 2.50 to 3.14, P < 0.0001 for VSSinitial; OR 2.36, 95% CI 2.15 to 2.60, P < 0.0001 for VSSmax). The predictive power of the VSS was highest if collected in the first 15 minutes after ED admission (log-rank Chi-square 468.1, P < 0.0001 for VSSinitial; log-rank Chi-square 361.5, P < 0.0001 for VSSmax). Conclusions: Vital sign abnormalities and the VSS collected in the first minutes after ED admission can identify patients at risk of an unfavourable outcome.
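
The score construction is simple enough to show in a short sketch: each of the seven abnormalities named above contributes one point, and the points are summed. In the Python sketch below the numeric cut-offs used to flag an abnormality are hypothetical examples, not the study's exact definitions.

```python
# Sketch of the vital sign score (VSS): one point per abnormal vital sign,
# summed over the seven items named in the abstract. The numeric cut-offs
# below are illustrative assumptions, not the study's exact definitions.

def vital_sign_score(threatened_airway, respiratory_rate, spo2,
                     systolic_bp, heart_rate, gcs, seizure):
    abnormalities = [
        threatened_airway,                       # threat to airway
        not (10 <= respiratory_rate <= 24),      # abnormal respiratory rate (/min)
        spo2 < 90,                               # abnormal oxygen saturation (%)
        systolic_bp < 90,                        # abnormal systolic blood pressure (mmHg)
        not (50 <= heart_rate <= 110),           # abnormal heart rate (/min)
        gcs < 15,                                # low Glasgow Coma Scale
        seizure,                                 # seizure on admission
    ]
    return sum(abnormalities)

# Example patient: tachypnoeic, hypotensive, GCS 13 -> VSS of 3
print(vital_sign_score(False, 28, 95, 85, 100, 13, False))
```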

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: We aimed to assess the predictive value of the SYNTAX score (SXscore) for major adverse cardiac events in the all-comers population of the LEADERS (Limus Eluted from A Durable versus ERodable Stent coating) trial. BACKGROUND: The SXscore has been shown to be an effective predictor of clinical outcomes in patients with multivessel disease undergoing percutaneous coronary intervention. METHODS: The SXscore was prospectively collected in 1,397 of the 1,707 patients enrolled in the LEADERS trial (patients with prior surgical revascularization were excluded). Post hoc analysis was performed by stratifying clinical outcomes at 1-year follow-up according to 1 of 3 SXscore tertiles. RESULTS: The 1,397 patients were divided into tertiles based on the SXscore as follows: SXscore ≤ 8 (SXlow), SXscore > 8 and ≤ 16 (SXmid), and SXscore > 16 (SXhigh) (n=461). At 1-year follow-up, major adverse cardiac event-free survival was significantly lower in the highest SXscore tertile (SXlow=92.2%, SXmid=91.1%, and SXhigh=84.6%; p<0.001). Death occurred in 1.5% of SXlow patients, 2.1% of SXmid patients, and 5.6% of SXhigh patients (hazard ratio [HR]: 1.97, 95% confidence interval [CI]: 1.29 to 3.01; p=0.002). The myocardial infarction rate tended to be higher in the SXhigh group. Target vessel revascularization was 11.3% in the SXhigh group compared with 6.3% and 7.8% in the SXlow and SXmid groups, respectively (HR: 1.38, 95% CI: 1.1 to 1.75; p=0.006). The composite of cardiac death, myocardial infarction, and clinically indicated target vessel revascularization was 7.8%, 8.9%, and 15.4% in the SXlow, SXmid, and SXhigh groups, respectively (HR: 1.47, 95% CI: 1.19 to 1.81; p<0.001). CONCLUSIONS: The SXscore, when applied to an all-comers patient population treated with drug-eluting stents, may allow prospective risk stratification of patients undergoing percutaneous coronary intervention. (LEADERS Trial Limus Eluted From A Durable Versus ERodable Stent Coating; NCT00389220).

Relevance:

100.00%

Publisher:

Abstract:

Exposure to combination antiretroviral therapy (cART) can lead to important metabolic changes and an increased risk of coronary heart disease (CHD). Computerized clinical decision support systems have been advocated to improve the management of patients at risk for CHD, but it is unclear whether such systems reduce patients' risk for CHD.

Relevance:

100.00%

Publisher:

Abstract:

With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. Such a biopharmaceutical process FMEA has a wide range of applications. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to the resulting risk ratings, process parameters can be ranked by importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
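
A minimal numerical illustration of how such ratings are typically combined: in a classical FMEA, each failure mode receives occurrence, severity, and detectability scores on the 1-to-10 scale, and their product (the risk priority number) is used to rank failure modes. The process parameters and ratings in the Python sketch below are invented for illustration only and are not taken from the proposed rating table.

```python
# Sketch of FMEA risk ranking: risk priority number (RPN) = occurrence x severity
# x detectability, each rated on a 1-to-10 scale. Parameters and ratings are
# invented examples, not entries from the guideline's rating table.

failure_modes = [
    # (process parameter / failure mode, occurrence, severity, detectability)
    ("Bioreactor pH excursion",         4, 8, 3),
    ("Chromatography column overload",  3, 6, 5),
    ("Buffer preparation error",        6, 4, 2),
    ("Filter integrity failure",        2, 9, 4),
]

ranked = sorted(
    ((name, o * s * d) for name, o, s, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
```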

Relevance:

100.00%

Publisher:

Abstract:

Seventeen polycyclic aromatic hydrocarbons (PAHs) were studied in surface waters (including the particulate phase) from the Chenab River, Pakistan; concentrations ranged from 289 to 994 and 437 to 1290 ng L-1 in summer and winter (2007-09), respectively. Concentrations of PAHs by ring number followed the trend: 3-rings > 2-rings > 4-rings > 5-rings > 6-rings. Possible sources of PAHs were identified by calculating indicative ratios, pointing to petrogenic sources of PAHs in the urban and sub-urban regions and pyrogenic sources in the agricultural region. Factor analysis based on principal component analysis identified the origins of PAHs as industrial activities, coal and trash burning in agricultural areas, and municipal waste disposal from surrounding urban and sub-urban areas entering the riverine ecosystem via open drains. Water quality guidelines and toxic equivalent factors highlighted the potential risk of low-molecular-weight PAHs to the aquatic life of the Chenab River. The estimated flux of PAH contaminants from the Chenab River to the Indus River was >50 tons/year.
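
As a rough illustration of two of the quantitative steps mentioned above, the Python sketch below computes a benzo[a]pyrene toxic equivalent concentration from individual PAH concentrations and toxic equivalency factors, and an annual riverine flux from a mean concentration and discharge. The concentrations, TEF values, and discharge are hypothetical assumptions, and only a few of the seventeen PAHs are shown.

```python
# Sketch of two quantitative steps from the abstract: a benzo[a]pyrene toxic
# equivalent (TEQ) concentration and an annual riverine flux. All concentrations,
# TEF values, and the river discharge are illustrative assumptions.

# Individual PAH concentrations in ng/L (hypothetical; only 4 of 17 PAHs shown)
concentrations_ng_l = {
    "naphthalene": 250.0,
    "phenanthrene": 180.0,
    "benz[a]anthracene": 40.0,
    "benzo[a]pyrene": 12.0,
}

# Illustrative toxic equivalency factors relative to benzo[a]pyrene
tef = {
    "naphthalene": 0.001,
    "phenanthrene": 0.001,
    "benz[a]anthracene": 0.1,
    "benzo[a]pyrene": 1.0,
}

teq_ng_l = sum(concentrations_ng_l[pah] * tef[pah] for pah in concentrations_ng_l)

# Annual flux = mean total PAH concentration x river discharge
mean_total_pah_ng_l = 900.0          # hypothetical mean of the reported totals
discharge_m3_s = 2000.0              # hypothetical mean river discharge
seconds_per_year = 365 * 24 * 3600

flux_tons_per_year = (mean_total_pah_ng_l * 1e-9      # ng -> g
                      * discharge_m3_s * 1000          # m3 -> litres
                      * seconds_per_year) / 1e6        # g -> tons

print(f"BaP toxic equivalent: {teq_ng_l:.1f} ng/L")
print(f"Estimated PAH flux:   {flux_tons_per_year:.0f} tons/year")
```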

Relevance:

100.00%

Publisher:

Abstract:

The evolution of Next Generation Networks, especially wireless broadband access technologies such as Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX), has increased the number of "all-IP" networks across the world. The enhanced capabilities of these access networks have spearheaded the cloud computing paradigm, where end-users aim to have services accessible anytime and anywhere. Service availability is also tied to the end-user device, where one of the major constraints is battery lifetime. It is therefore necessary to assess and minimize the energy consumed by end-user devices, given its significance for the user-perceived quality of cloud computing services. In this paper, an empirical methodology for measuring the energy consumption of network interfaces is proposed. Using this methodology, an experimental evaluation of energy consumption in three different cloud computing access scenarios (including WiMAX) was performed. The empirical results show the impact of accurate network-interface state management and application-level network design on energy consumption. Additionally, the outcomes can be used in further software-based models to optimize energy consumption and increase the Quality of Experience (QoE) perceived by end-users.
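
The measurement idea underlying such a methodology reduces to attributing the time a network interface spends in each power state to a per-state power draw and summing the products. The Python sketch below shows this with assumed power figures; the state names and wattages are hypothetical placeholders, not the values measured in the paper.

```python
# Sketch of estimating network-interface energy use from per-state residence times:
# E = sum over states of (power in state x time in state). The power figures below
# are hypothetical placeholders, not the values measured in the paper.

POWER_W = {            # assumed average power draw per interface state (watts)
    "idle":     0.8,
    "receive":  1.4,
    "transmit": 1.9,
}

def energy_joules(residence_times_s: dict) -> float:
    """Energy consumed given seconds spent in each interface state."""
    return sum(POWER_W[state] * t for state, t in residence_times_s.items())

# Example: a 60-second access to a cloud service over a WiMAX-like interface
session = {"idle": 40.0, "receive": 15.0, "transmit": 5.0}
e = energy_joules(session)
print(f"Energy for session: {e:.1f} J "
      f"(average power {e / sum(session.values()):.2f} W)")
```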