33 results for values-driven management
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Cloud Computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints makes it challenging to maintain optimal resource control. Furthermore, conflicting objectives in the management of cloud infrastructure and distributed applications can lead to SLA violations and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as input to the cloud management system (CMS), increasing the efficiency of resource allocation as well as that of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimised resource allocation in infrastructure clouds. Third, we describe a method for discovering relations between the performance indicators of services belonging to distributed applications, and for using these relations to build scaling rules that a CMS can apply for automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms that optimally scale systems composed of VMs under given SLA performance constraints. All presented research was implemented and tested using enterprise distributed applications.
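The SLA-constrained VM-scaling idea described in this abstract can be illustrated with a minimal, hypothetical sketch. The function name, the margins and the single-step scaling policy are illustrative assumptions, not the dissertation's actual algorithms:

```python
def scale_decision(avg_response_ms, sla_target_ms, vm_count,
                   scale_out_margin=1.0, scale_in_margin=0.6,
                   min_vms=1, max_vms=20):
    """Return a new VM count based on an SLA response-time objective.

    Scale out when the observed response time exceeds the SLA target;
    scale in when there is comfortable headroom below it.
    """
    if avg_response_ms > sla_target_ms * scale_out_margin:
        return min(vm_count + 1, max_vms)   # SLA at risk: add a VM
    if avg_response_ms < sla_target_ms * scale_in_margin:
        return max(vm_count - 1, min_vms)   # headroom: remove a VM
    return vm_count                         # within band: no change
```

A real controller would add smoothing and cool-down periods to avoid oscillating between the two branches; this sketch only shows the SLA-as-input principle.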
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from instantiating service VMs in the correct order with an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources for ensuring that the performance requirements of all of their applications are met. They are also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, infrastructure providers are interested in optimally provisioning virtual resources onto the available physical infrastructure so that their operational costs are minimized while the performance of tenants' applications is maximized.
Motivated by the complexities associated with managing and scaling distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
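The genetic-algorithm allocation mentioned in this abstract can be sketched in toy form: each chromosome maps one VM to one host, and the fitness combines two illustrative objectives (hosts used and capacity overload). The encoding, operators and weights here are assumptions for illustration, not the thesis's actual criteria:

```python
import random

def fitness(assign, vm_cpu, host_cap):
    """Lower is better: number of hosts used, plus a heavy
    penalty for any host whose CPU demand exceeds capacity."""
    load = {}
    for vm, host in enumerate(assign):
        load[host] = load.get(host, 0) + vm_cpu[vm]
    overload = sum(max(0, l - host_cap) for l in load.values())
    return len(load) + 100 * overload

def evolve(vm_cpu, n_hosts, host_cap, pop_size=40, gens=200, seed=1):
    """Evolve VM-to-host assignments (one gene per VM)."""
    rnd = random.Random(seed)
    pop = [[rnd.randrange(n_hosts) for _ in vm_cpu]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda a: fitness(a, vm_cpu, host_cap))
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rnd.sample(survivors, 2)
            cut = rnd.randrange(1, len(vm_cpu))  # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rnd.random() < 0.2:               # mutation: move one VM
                child[rnd.randrange(len(child))] = rnd.randrange(n_hosts)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda a: fitness(a, vm_cpu, host_cap))
```

The weighted-sum fitness is the simplest way to combine multiple criteria; a Pareto-based multi-objective GA would keep the objectives separate instead.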
Abstract:
We describe a system for performing SLA-driven management and orchestration of distributed infrastructures composed of services supporting mobile computing use cases. In particular, we focus on a Follow-Me Cloud scenario in which mobile users access cloud-enabled services. We combine an SLA-driven approach to infrastructure optimization with forecast-based preventive actions against performance degradation and with pattern detection for supporting mobile cloud infrastructure management. We present our system's information model and architecture, including the algorithmic support and the proposed scenarios for system evaluation.
Abstract:
Recent advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing environmental conditions and numbers of users, application performance might suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system for controlling the complexity of scaling applications composed of multiple services, using mechanisms based on the fulfillment of SLAs. We present how service monitoring information can be used in conjunction with service level objectives, predictions, and correlations between performance indicators for optimizing the allocation of services belonging to distributed applications. We validate our models using experiments and simulations involving a distributed enterprise information system. We show how discovering correlations between application performance indicators can serve as a basis for creating refined service level objectives, which can then be used for scaling the application and improving its overall performance under similar conditions.
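The idea of turning correlations between performance indicators into refined service level objectives can be illustrated with a simple sketch. The indicator names, the correlation cutoff and the rule-derivation heuristic are assumptions for illustration, not the paper's method:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def derive_scaling_rule(indicator, response_ms, slo_ms, min_corr=0.8):
    """If a monitored indicator correlates strongly with response
    time, derive a threshold on the indicator itself: the smallest
    observed indicator value at which the SLO was violated."""
    if pearson(indicator, response_ms) < min_corr:
        return None                  # indicator not predictive enough
    violations = [i for i, r in zip(indicator, response_ms) if r > slo_ms]
    return min(violations) if violations else None
```

A CMS could then scale out when the indicator crosses the derived threshold, acting before the response-time SLO itself is violated.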
Abstract:
There may be a considerable gap between the LDL cholesterol (LDL-C) and blood pressure (BP) goal values recommended by guidelines and the results achieved in daily practice.
Fluctuation phenotyping based on daily fraction of exhaled nitric oxide values in asthmatic children
Abstract:
Fraction of exhaled nitric oxide (Feno), a marker of airway inflammation, has been proposed to be useful for asthma management, but conclusions are inconsistent. This might be due to the failure of mean statistics to characterize individual variability in Feno values, which is possibly a better indicator of asthma control than single measurements.
Abstract:
BACKGROUND: Little information on the management and long-term follow-up of patients with biallelic mutations in the chloride channel gene CLCNKB is available. METHODS: Long-term follow-up was evaluated from 5.0 to 24 years (median, 14 years) after diagnosis in 13 patients with homozygous (n = 10) or compound heterozygous (n = 3) mutations. RESULTS: Medical treatment at the last follow-up control included supplementation with potassium in 12 patients and sodium in 2 patients, and treatment with indomethacin in 9 patients. At the end of follow-up, body height was -2.0 standard deviation score or less in 6 patients; 2 of these patients had growth hormone deficiency. Body weight (
Abstract:
PURPOSE: Antiretroviral therapy (ART) may induce metabolic changes and increase the risk of coronary heart disease (CHD). Based on a health care system approach, we investigated predictors of normalization of dyslipidemia in HIV-infected individuals receiving ART. METHOD: Individuals included in the study were registered in the Swiss HIV Cohort Study (SHCS), had dyslipidemia but were not on lipid-lowering medication, were on potent ART for ≥ 3 months, and had ≥ 2 follow-up visits. Dyslipidemia was defined as two consecutive total cholesterol (TC) values above recommended levels. Predictors of achieving treatment goals for TC were assessed using Cox models. RESULTS: The analysis included 958 individuals with a median follow-up of 2.3 years (IQR 1.2-4.0). 454 patients (47.4%) achieved TC treatment goals. In adjusted analyses, variables significantly associated with a lower hazard of reaching TC treatment goals were as follows: older age (compared to 18-37 year olds: hazard ratio [HR] 0.62 for 45-52 year olds, 95% CI 0.47-0.82; HR 0.40 for 53-85 year olds, 95% CI 0.29-0.54), diabetes (HR 0.39, 95% CI 0.26-0.59), history of coronary heart disease (HR 0.27, 95% CI 0.10-0.71), higher baseline TC (HR 0.78, 95% CI 0.71-0.85), baseline triple-nucleoside regimen (HR 0.12 compared to PI-only regimen, 95% CI 0.07-0.21), longer time on a PI-only regimen (HR 0.39, 95% CI 0.33-0.46), longer time on an NNRTI-only regimen (HR 0.35, 95% CI 0.29-0.43), and longer time on a PI/NNRTI regimen (HR 0.34, 95% CI 0.26-0.43). Switching the ART regimen when viral load was undetectable was associated with a higher hazard of reaching TC treatment goals (HR 1.48, 95% CI 1.14-1.91). CONCLUSION: In SHCS participants on ART, several ART-related and non-ART-related epidemiological factors were associated with insufficient control of dyslipidemia. Control of dyslipidemia in ART recipients must be further improved.
Abstract:
BACKGROUND: Simultaneous pancreas/kidney transplantation (SPK) should be the procedure of choice for (pre)uremic patients with type 1 diabetes. All standard immunosuppressive protocols for SPK include a calcineurin inhibitor. Both calcineurin inhibitors, cyclosporine (CyA) and probably also tacrolimus (FK506), are associated with the occurrence of cholelithiasis due to their metabolic side effects. PATIENTS AND METHODS: We evaluated the prevalence of cholelithiasis in 83 kidney/pancreas-transplanted type 1 diabetic patients (46 males, 37 females, mean age 42.8 +/- 7.5 years) by conventional B-mode ultrasound 5 years after transplantation. 56 patients received CyA (group 1) and 27 received tacrolimus (group 2) as the first-line immunosuppressive drug. Additional immunosuppression consisted of steroids, azathioprine or mycophenolate mofetil. Additionally, laboratory analyses of cholestasis parameters (gamma-GT and alkaline phosphatase) were performed. RESULTS: In total, 23 patients (28%) had gallstones and 52 patients (62%) had a completely normal gallbladder. In eight patients (10%) a cholecystectomy had been performed before or during transplantation because of already known gallstones. No stones in the bile ducts (choledocholithiasis) were detected. In group 2 the number of patients with gallstones was slightly lower (22%) than in group 1 (30%), but without statistical significance. Cholestasis parameters were not increased and HbA1c values were normal in both groups of patients. CONCLUSION: The 28% prevalence of biliary disease in kidney/pancreas-transplanted type 1 diabetic patients is increased in comparison to the general population (10-15%). Lithogenicity under tacrolimus seems to be lower than under cyclosporine-based immunosuppressive treatment. We recommend regular sonographic examinations to detect acute or chronic cholecystitis, which may develop occultly in these patients, as early as possible.
Abstract:
Rapid diagnostic tests (RDT) are sometimes recommended to improve the home-based management of malaria. The accuracy of an RDT for the detection of clinical malaria and the presence of malarial parasites has recently been evaluated in a high-transmission area of southern Mali. During the same study, the cost-effectiveness of a 'test-and-treat' strategy for the home-based management of malaria (based on an artemisinin-based combination therapy, ACT) was compared with that of a 'treat-all' strategy. Overall, 301 patients, of all ages, each of whom had been considered a presumptive case of uncomplicated malaria by a village healthworker, were checked with a commercial RDT (Paracheck-Pf). The sensitivity, specificity, and positive and negative predictive values of this test, compared with the results of microscopy and two different definitions of clinical malaria, were then determined. The RDT was found to be 82.9% sensitive (with a 95% confidence interval of 78.0%-87.1%) and 78.9% (63.9%-89.7%) specific compared with the detection of parasites by microscopy. In the detection of clinical malaria, it was 95.2% (91.3%-97.6%) sensitive and 57.4% (48.2%-66.2%) specific compared with a general practitioner's diagnosis of the disease, and 100.0% (94.5%-100.0%) sensitive but only 30.2% (24.8%-36.2%) specific when compared against the fulfillment of the World Health Organization's (2003) research criteria for uncomplicated malaria. Among children aged 0-5 years, the cost of the 'test-and-treat' strategy, per episode, was about twice that of the 'treat-all' strategy (US$1.0 vs. US$0.5). In older subjects, however, the two strategies were equally costly (approximately US$2/episode). In conclusion, for children aged 0-5 years in a high-transmission area of sub-Saharan Africa, use of the RDT was not cost-effective compared with the presumptive treatment of malaria with an ACT. In older patients, use of the RDT did not reduce costs.
The question remains whether either of the strategies investigated can be made affordable for the affected population.
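The sensitivity, specificity and predictive values reported above follow directly from a 2x2 confusion matrix. A minimal sketch with illustrative counts (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # positives among the diseased
        "specificity": tn / (tn + fp),  # negatives among the healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Note that sensitivity and specificity are properties of the test, while the predictive values also depend on disease prevalence in the tested population.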
Abstract:
The physiology and current knowledge about gestational diabetes that led to the adoption of new diagnostic criteria and blood glucose target levels during pregnancy by the Swiss Society for Endocrinology and Diabetes are reviewed. The 6th International Workshop Conference on Gestational Diabetes Mellitus in Pasadena (2008) defined new diagnostic criteria based on the results of the HAPO trial. These criteria were presented during the ADA congress in New Orleans in 2009. According to the new criteria there is no need for screening, but all pregnant women have to be tested with a 75 g oral glucose tolerance test between the 24th and 28th week of pregnancy. The new diagnostic values are very similar to those previously adopted by the ADA, with the exception that only one out of three values has to be elevated in order to make the diagnosis of gestational diabetes. Due to this important difference it is very likely that gestational diabetes will be diagnosed more frequently in the future. The diagnostic criteria are: fasting plasma glucose ≥ 5.1 mmol/l, 1-hour value ≥ 10.0 mmol/l, or 2-hour value ≥ 8.5 mmol/l. Based on current knowledge and randomized trials it is much more difficult to define glucose target levels during pregnancy. This difficulty has led to many different recommendations issued by diabetes societies. The Swiss Society of Endocrinology and Diabetes follows the arguments of the International Diabetes Federation (IDF) that blood glucose self-monitoring itself lacks precision and that there are very few randomized trials. Therefore, the target levels have to be easy to remember and might be slightly different in mmol/l or mg/dl. The Swiss Society for Endocrinology and Diabetes adopts the tentative target values of the IDF, with fasting plasma glucose values < 5.3 mmol/l and 1- and 2-hour postprandial (after the end of the meal) values of < 8.0 and < 7.0 mmol/l, respectively.
The last part of these recommendations deals with the therapeutic options during pregnancy (nutrition, physical exercise and pharmaceutical treatment). If the target values are not met despite lifestyle changes, which is the case in approximately 25% of patients, pharmaceutical treatment is required. Insulin therapy is still the preferred treatment option, but metformin (and, as an exception, glibenclamide) can be used if there are major hurdles to the initiation of insulin therapy.
Abstract:
Lichens are a key component of forest biodiversity. However, a comprehensive study analyzing lichen species richness in relation to several management types, extending over different regions and forest stages and including information on site conditions, is missing for temperate European forests. In three German regions (Schwäbische Alb, Hainich-Dün, Schorfheide-Chorin), the so-called Biodiversity Exploratories, we studied lichen species richness in 631 forest plots of 400 m² comprising different management types (unmanaged, selection cutting, deciduous and coniferous age-class forests resulting from clear cutting or shelterwood logging), various stand ages, and site conditions typical for large parts of temperate Europe. We analyzed how lichen species richness responds to management and habitat variables (standing biomass, cover of deadwood, cover of rocks). We found strong regional differences, with the highest lichen species richness in the Schwäbische Alb, probably driven by regional differences in former air pollution, precipitation, and habitat variables. Overall, unmanaged forests harbored 22% more threatened lichen species than managed age-class forests. In general, total, corticolous, and threatened lichen species richness did not differ among management types of deciduous forests. However, in the Schwäbische Alb region, deciduous forests had 61% more lichen species than coniferous forests, as well as 279% more threatened and 76% more corticolous lichen species. Old deciduous age classes were richer in corticolous lichen species than young ones, while old coniferous age classes were poorer than young ones. Overall, our findings highlight the importance of stand continuity for conservation.
To increase total and threatened lichen species richness we suggest (1) conserving unmanaged forests, (2) promoting silvicultural methods assuring stand continuity, (3) conserving old trees in managed forests, (4) promoting stands of native deciduous tree species instead of coniferous plantations, and (5) increasing the amount of deadwood in forests.