27 results for Service level objective

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.


Relevance:

90.00%

Publisher:

Abstract:

Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd.
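The abstract does not spell out the EERM's allocation algorithm, but the idea of revenue maximisation across service level agreements can be sketched as a toy greedy allocator. This is a hypothetical illustration only: the function name, SLA names, CPU requirements, and revenue figures are all invented, not taken from SORMA.

```python
# Hypothetical sketch only: the abstract does not give the EERM's algorithm.
# Revenue maximisation across SLAs is illustrated here as a greedy allocation
# of a single resource (CPU units) by revenue per unit; names and numbers
# are invented.
def allocate(capacity, slas):
    """slas: list of (name, cpu_units_needed, revenue); returns accepted names."""
    accepted, used = [], 0
    # Prefer the SLAs with the highest revenue per CPU unit.
    for name, cpu, revenue in sorted(slas, key=lambda s: s[2] / s[1], reverse=True):
        if used + cpu <= capacity:
            accepted.append(name)
            used += cpu
    return accepted

slas = [("A", 4, 40), ("B", 3, 36), ("C", 5, 25)]
chosen = allocate(8, slas)
```

A real economically enhanced resource manager would also have to weigh SLA penalty clauses and opportunity cost, which a single revenue figure cannot capture.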

Relevance:

80.00%

Publisher:

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization. The idea is to see what the overheads of para-virtualization are, and to look at the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib1 benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) used by Cloud providers, and logging is a means of assessing the services bought and used from commercial providers. In this paper we assess the virtualization systems on three different machines: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by the privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each Virtual Machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly.
Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application; i.e. the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines; i.e. these guest operating systems are aware that they are running on a virtual machine, and provide near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made to the guest operating system to enable better performance. In this kind of virtualization, the guest operating system is aware that it is running on virtualized hardware and not on the bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed. It has been shown [0] that para-virtualization does not impose significant performance overhead in high-performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The apparent improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation.
In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly it will be necessary to observe application performance, both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for Cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
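The overhead calculation implied above is simply the relative slowdown of a benchmark under para-virtualization versus bare metal. A minimal sketch, where the benchmark names and timings are placeholders rather than measurements from the paper:

```python
# Illustrative only: overhead is the relative slowdown of a benchmark under
# para-virtualization versus bare metal. The benchmark names and timings are
# placeholders, not measurements from the paper.
def overhead_pct(bare_metal_s, virtualized_s):
    return 100.0 * (virtualized_s - bare_metal_s) / bare_metal_s

# (bare-metal seconds, para-virtualized seconds) per benchmark
runs = {"linpack": (120.0, 126.0), "fft": (80.0, 84.0)}
overheads = {name: overhead_pct(b, v) for name, (b, v) in runs.items()}
```

The same calculation, repeated with monitoring and logging enabled, would isolate their additional cost.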

Relevance:

80.00%

Publisher:

Abstract:

CONTEXT: The link between long-haul air travel and venous thromboembolism is the subject of continuing debate. It remains unclear whether the reduced cabin pressure and oxygen tension in the airplane cabin create an increased risk compared with seated immobility at ground level. OBJECTIVE: To determine whether hypobaric hypoxia, which may be encountered during air travel, activates hemostasis. DESIGN, SETTING, AND PARTICIPANTS: A single-blind, crossover study, performed in a hypobaric chamber, to assess the effect of an 8-hour seated exposure to hypobaric hypoxia on hemostasis in 73 healthy volunteers, which was conducted in the United Kingdom from September 2003 to November 2005. Participants were screened for factor V Leiden G1691A and prothrombin G20210A mutation and were excluded if they tested positive. Blood was drawn before and after exposure to assess activation of hemostasis. INTERVENTIONS: Individuals were exposed alternately (> or =1 week apart) to hypobaric hypoxia, similar to the conditions of reduced cabin pressure during commercial air travel (equivalent to atmospheric pressure at an altitude of 2438 m), and normobaric normoxia (control condition; equivalent to atmospheric conditions at ground level, circa 70 m above sea level). MAIN OUTCOME MEASURES: Comparative changes in markers of coagulation activation, fibrinolysis, platelet activation, and endothelial cell activation. RESULTS: Changes were observed in some hemostatic markers during the normobaric exposure, attributed to prolonged sitting and circadian variation. However, there were no significant differences between the changes in the hypobaric and the normobaric exposures. 
For example, the median difference in change between the hypobaric and normobaric exposure was 0 ng/mL for thrombin-antithrombin complex (95% CI, -0.30 to 0.30 ng/mL); -0.02 [corrected] nmol/L for prothrombin fragment 1 + 2 (95% CI, -0.03 to 0.01 nmol/L); 1.38 ng/mL for D-dimer (95% CI, -3.63 to 9.72 ng/mL); and -2.00% for endogenous thrombin potential (95% CI, -4.00% to 1.00%). CONCLUSION: Our findings do not support the hypothesis that hypobaric hypoxia, of the degree that might be encountered during long-haul air travel, is associated with prothrombotic alterations in the hemostatic system in healthy individuals at low risk of venous thromboembolism.
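The trial's primary comparison, the median within-subject difference in change between the two exposures, can be illustrated with a few invented paired values (these are not data from the study):

```python
import statistics

# Illustration of the outcome measure: the within-subject difference in
# change between the hypobaric and normobaric exposures, summarised by its
# median. All values here are invented, not data from the study.
hypo_change = [0.10, -0.05, 0.20, 0.00, 0.15]   # change under hypobaric hypoxia
normo_change = [0.12, -0.02, 0.18, 0.01, 0.15]  # change under normobaric control
diffs = [h - n for h, n in zip(hypo_change, normo_change)]
median_diff = statistics.median(diffs)
```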

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: The English Improving Access to Psychological Therapies (IAPT) initiative aims to make evidence-based psychological therapies for depression and anxiety disorder more widely available in the National Health Service (NHS). 32 IAPT services based on a stepped care model were established in the first year of the programme. We report on the reliable recovery rates achieved by patients treated in the services and identify predictors of recovery at patient level, service level, and as a function of compliance with National Institute for Health and Care Excellence (NICE) treatment guidelines. METHOD: Data from 19,395 patients who were clinical cases at intake, attended at least two sessions, had at least two outcome scores and had completed their treatment during the period were analysed. Outcome was assessed with the patient health questionnaire depression scale (PHQ-9) and the anxiety scale (GAD-7). RESULTS: Data completeness was high for a routine cohort study. Over 91% of treated patients had paired (pre-post) outcome scores. Overall, 40.3% of patients were reliably recovered at post-treatment, 63.7% showed reliable improvement and 6.6% showed reliable deterioration. Most patients received treatments that were recommended by NICE. When a treatment not recommended by NICE was provided, recovery rates were reduced. Service characteristics that predicted higher reliable recovery rates were: a high average number of therapy sessions; higher step-up rates among individuals who started with low-intensity treatment; larger services; and a larger proportion of experienced staff. CONCLUSIONS: Compliance with the IAPT clinical model is associated with enhanced rates of reliable recovery.
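"Reliable improvement" and "reliable recovery" are conventionally based on the Jacobson-Truax reliable change index. The abstract does not give IAPT's exact thresholds, so the standard deviation and reliability figures in this sketch are placeholders:

```python
import math

# Sketch of the Jacobson-Truax reliable change index (RCI), the usual basis
# for "reliable improvement" on scales such as the PHQ-9. The abstract does
# not give IAPT's thresholds; sd_pre and reliability below are placeholders.
def rci(pre, post, sd_pre, reliability):
    se = sd_pre * math.sqrt(1.0 - reliability)  # standard error of measurement
    s_diff = math.sqrt(2.0) * se                # SE of the difference score
    return (pre - post) / s_diff                # positive = symptom improvement

change = rci(pre=18, post=8, sd_pre=5.0, reliability=0.87)
reliably_improved = change > 1.96  # conventional 95% criterion
```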

Relevance:

40.00%

Publisher:

Abstract:

Quantifying the effect of the seawater density changes on sea level variability is of crucial importance for climate change studies, as the sea level cumulative rise can be regarded as both an important climate change indicator and a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, the global and regional steric sea level changes are estimated and compared from an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are initially compared with a satellite-derived (altimetry minus gravimetry) dataset for a short period (2003–2010). The ensemble mean exhibits a significant high correlation at both global and regional scale, and the ensemble of ocean reanalyses outperforms that of objective analyses, in particular in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain for the inter-annual trends. Within the extended intercomparison period that spans the altimetry era (1993–2010), we find that the ensemble of reanalyses and objective analyses are in good agreement, and both detect a trend of the global steric sea level of 1.0 and 1.1 ± 0.05 mm/year, respectively. However, the spread among the products of the halosteric component trend exceeds the mean trend itself, questioning the reliability of its estimate. This is related to the scarcity of salinity observations before the Argo era. Furthermore, the impact of deep ocean layers is non-negligible on the steric sea level variability (22 and 12 % for the layers below 700 and 1500 m of depth, respectively), although the small deep ocean trends are not significant with respect to the products spread.
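A trend such as the ~1.0 mm/year global steric rise quoted above is just the least-squares slope of an anomaly time series. A minimal sketch with a synthetic series (not reanalysis data):

```python
# Minimal sketch: a steric sea level trend (mm/year) is the least-squares
# slope of an anomaly time series. The series below is synthetic (an exact
# 1.0 mm/year rise), not reanalysis data.
def ols_trend(years, values_mm):
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values_mm) / n
    num = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values_mm))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den  # mm per year

years = list(range(1993, 2011))             # the 1993-2010 altimetry era
series = [1.0 * (y - 1993) for y in years]  # synthetic anomalies in mm
trend = ols_trend(years, series)
```

In the intercomparison itself, the spread of such trend estimates across the ensemble of products is as informative as the trend itself.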

Relevance:

30.00%

Publisher:

Abstract:

Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP - Department of Energy ( DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences for the spatial distribution of mean intensities show that the ECMWF reanalysis is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15. 
Also shown are results based on using the same techniques to diagnose the tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.

Relevance:

30.00%

Publisher:

Abstract:

A new objective climatology of polar lows in the Nordic (Norwegian and Barents) seas has been derived from a database of diagnostics of objectively identified cyclones spanning the period January 2000 to April 2004. There are two distinct parts to this study: the development of the objective climatology and a characterization of the dynamical forcing of the polar lows identified. Polar lows are an intense subset of polar mesocyclones. Polar mesocyclones are distinguished from other cyclones in the database as those that occur in cold air outbreaks over the open ocean. The difference between the wet-bulb potential temperature at 700 hPa and the sea surface temperature (SST) is found to be an effective discriminator between the atmospheric conditions associated with polar lows and other cyclones in the Nordic seas. A verification study shows that the objective identification method is reliable in the Nordic seas region. After demonstrating success at identifying polar lows using the above method, the dynamical forcing of the polar lows in the Nordic seas is characterized. Diagnostics of the ratio of mid-level vertical motion attributable to quasi-geostrophic forcing from upper and lower levels (U/L ratio) are used to determine the prevalence of a recently proposed category of extratropical cyclogenesis, type C, for which latent heat release is crucial to development. Thirty-one percent of the objectively identified polar low events (36 from 115) exceeded the U/L ratio of 4.0, previously identified as a threshold for type C cyclones. There is a contrast between polar lows to the north and south of the Nordic seas. In the southern Norwegian Sea, the population of polar low events is dominated by type C cyclones. These possess strong convection and weak low-level baroclinicity. Over the Barents and northern Norwegian seas, the well-known cyclogenesis types A and B dominate. These possess stronger low-level baroclinicity and weaker convection.
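The type C classification step described above reduces to thresholding the U/L ratio at 4.0. A sketch with invented ratios (the study itself found 36 of 115 events above the threshold):

```python
# Sketch of the classification step: a polar-low event is flagged as type C
# when its U/L ratio (upper- over lower-level quasi-geostrophic forcing of
# mid-level ascent) exceeds 4.0. The ratios below are invented; the study
# itself found 36 of 115 events above the threshold.
def count_type_c(ul_ratios, threshold=4.0):
    n_c = sum(1 for r in ul_ratios if r > threshold)
    return n_c, n_c / len(ul_ratios)

ratios = [5.2, 1.1, 0.8, 6.4, 3.9, 4.5, 2.0, 0.5]
n_type_c, frac_type_c = count_type_c(ratios)
```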

Relevance:

30.00%

Publisher:

Abstract:

Objective The Medicines Use Review (MUR) community pharmacy service was introduced in 2005 to enhance patient empowerment but the service has not been taken up as widely as expected. We investigated the depiction of the patient–pharmacist power relationship within MUR patient information leaflets. Methods We identified 11 MUR leaflets including the official Department of Health MUR booklet and through discourse analysis examined the way language and imagery had been used to symbolise and give meaning to the MUR service, especially the portrayal of the patient–pharmacist interactions and the implied power relations. Results A variety of terminology was used to describe the MUR, a service that aimed ultimately to produce more informed patients through the information imparted by knowledgeable, skilled pharmacists. Conclusion The educational role of the MUR overshadowed the intended patient empowerment that would take place with a true concordance-centred approach. Although patient empowerment was implied, this was within the boundaries of the biomedical model with the pharmacist as the expert provider of medicines information. Practice implications If patient empowerment is to be conveyed this needs to be communicated to patients through consistent use of language and imagery that portrays the inclusivity intended.

Relevance:

30.00%

Publisher:

Abstract:

Introduction Health promotion (HP) aims to enhance good health while preventing ill-health at three levels of activity: primary (preventative), secondary (diagnostic) and tertiary (management).1 It can range from simple provision of health education to ongoing support, but the effectiveness of HP is ultimately dependent on its ability to influence change. HP as part of the Community Pharmacy Contract (CPC) aims to increase public knowledge and target 'hard-to-reach' individuals by focusing mainly on primary and tertiary HP. The CPC does not include screening programmes (secondary HP) as a service. Coronary heart disease (CHD) is a significant cause of morbidity and mortality in the UK. While there is evidence to support the effectiveness of some community pharmacy HP strategies in CHD, there is a paucity of research in relation to screening services.2 Against this background, Alliance Pharmacy introduced a free CHD risk screening programme to provide tailored HP advice as part of a participant–pharmacist consultation. The aim of this study is to report on the CHD risk levels of participants and to provide a qualitative indication of consultation outcomes. Methods Case records for 12 733 people who accessed a free CHD risk screening service between August 2004 and April 2006 offered at 217 community pharmacies were obtained. The service involved initial self-completion of the Healthy Heart Assessment (HHA) form and measurement of height, weight, body mass index, blood pressure, total cholesterol and high-density lipoprotein levels by pharmacists to calculate CHD risk.3 Action taken by pharmacists (lifestyle advice, statin recommendation or general practitioner (GP) referral) and qualitative statements of advice were recorded, and a copy provided to the participants. The service did not include follow-up of participants. All participants consented to taking part in evaluations of the service.
Ethical committee scrutiny was not required for this service development evaluation. Results Case records for 10 035 participants (3658 male) were evaluable; 5730 (57%) were at low CHD risk (<15%); 3636 (36%) at moderate-to-high CHD risk (≥15%); and 669 (7%) had existing heart disease. A significantly higher proportion of male (48% versus 30% female) participants were at moderate-to-high risk of CHD (chi-square test; P < 0.005). A range of outcomes resulted from consultations. Lifestyle advice was provided irrespective of participants' CHD risk or existing disease. In the moderate-to-high-risk group, of which 52% received prescribed medication, lifestyle advice was recorded for 62%, 16% were referred and 34% were advised to have a re-assessment. Statin recommendations were made in 1% of all cases. There was evidence of supportive and motivational statements in the advice recorded. Discussion Pharmacists were able to identify individuals' level of CHD risk and provide them with bespoke advice. Identification of at-risk participants did not automatically result in referrals or statin recommendation. One-third of those accessing the screening service had moderate-to-high risk of CHD, a significantly higher proportion of whom were men. It is not known whether these individuals had been previously exposed to HP, but presumably by accessing this service they may have contemplated change. As the effectiveness of HP advice will depend, among other factors, on its ability to influence change, future consultations may need to explore patients' attitude towards change in relation to the Transtheoretical Model4 to better tailor HP advice. The high uptake of the service by those at moderate-to-high CHD risk indicates a need for this type of screening programme in community pharmacy, perhaps specifically to reach men, who access medical services less.
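The risk-group percentages reported in the results can be reproduced directly from the counts given in the text:

```python
# Reproducing the risk-group percentages from the counts given in the text
# (10 035 evaluable case records).
counts = {
    "low CHD risk (<15%)": 5730,
    "moderate-to-high CHD risk (>=15%)": 3636,
    "existing heart disease": 669,
}
total = sum(counts.values())
percentages = {group: round(100 * n / total) for group, n in counts.items()}
```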

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate cognitive behavioural therapy for psychosis (CBTp) delivered by non-expert therapists, using CBT-relevant measures. Methods: Participants (N=74) were randomised into immediate therapy or waiting list control groups. The therapy group was offered six months of therapy and followed up three months later. The waiting list group received therapy after waiting nine months (becoming the delayed therapy group). Results: Depression improved in the combined therapy group at both the end of therapy and follow-up. Other significant effects were found in only one of the two therapy groups (positive symptoms; cognitive flexibility; uncontrollability of thoughts) or at one of the two timepoints (end of therapy: PANSS general symptoms, anxiety, suicidal ideation, social functioning, resistance to voices; follow-up: power beliefs about voices, negative symptoms). There was no difference in costs between the groups. Conclusions: The only robust improvement was in depression. Nevertheless, there were further encouraging but modest improvements in both emotional and cognitive variables, in addition to psychotic symptoms.

Relevance:

30.00%

Publisher:

Abstract:

Farming systems research is a multi-disciplinary, holistic approach to solving the problems of small farms. Small and marginal farmers are the core of the Indian rural economy, constituting 0.80 of the total farming community but possessing only 0.36 of the total operational land. The declining trend of per capita land availability poses a serious challenge to the sustainability and profitability of farming. Under such conditions, it is appropriate to integrate land-based enterprises such as dairy, fishery, poultry, duckery, apiary, and field and horticultural cropping within the farm, with the objective of generating adequate income and employment for these small and marginal farmers under a set of farm constraints and varying levels of resource availability and opportunity. The integration of different farm enterprises can be achieved with the help of a linear programming model. For the current review, integrated farming systems models were developed, by way of illustration, for the marginal, small, medium and large farms of eastern India using linear programming. Risk analyses were carried out for different levels of income and enterprise combinations. The fishery enterprise was shown to be less risk-prone, whereas the crop enterprise involved greater risk. In general, the degree of risk increased with the increasing level of income. With increases in farm income and risk level, the resource use efficiency increased. Medium and large farms proved to be more profitable than small and marginal farms, with a higher level of resource use efficiency and return per Indian rupee (Rs) invested. Among the different enterprises of integrated farming systems, a chain of interaction and resource flow was observed. In order to make farming profitable and improve resource use efficiency at the farm level, the synergy among interacting components of farming systems should be exploited.
In the process of technology generation, transfer and other developmental efforts at the farm level (contrary to the discipline- and commodity-based approaches, which have a tendency to be piecemeal and in isolation), it is desirable to place a whole-farm scenario before the farmers to enhance their farm income, thereby motivating them towards more efficient and sustainable farming.
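The review's actual linear programming formulation is not reproduced in the abstract. As a self-contained stand-in, a brute-force search over a toy version of the same problem (maximise income from enterprise units under a land constraint) shows the shape of the optimisation; enterprise names, land requirements, and incomes here are invented.

```python
from itertools import product

# Toy stand-in for the review's LP model: choose how many units of each
# enterprise to run so that total income is maximised subject to a land
# constraint. All coefficients are invented for illustration.
enterprises = {           # name: (hectares per unit, income per unit)
    "dairy": (0.5, 30),
    "fishery": (1.0, 45),
    "poultry": (0.25, 12),
}
land_limit = 2.0  # hectares available on the farm

names = list(enterprises)
best_income, best_plan = 0, {}
for units in product(range(5), repeat=len(names)):  # 0..4 units of each
    land = sum(u * enterprises[n][0] for u, n in zip(units, names))
    income = sum(u * enterprises[n][1] for u, n in zip(units, names))
    if land <= land_limit and income > best_income:
        best_income, best_plan = income, dict(zip(names, units))
```

A real model would use an LP solver with continuous decision variables, labour and capital constraints, and risk terms, but the objective-plus-constraints structure is the same.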

Relevance:

30.00%

Publisher:

Abstract:

Earlier studies suggest that age is positively associated with job satisfaction, while others use length of service, or tenure, as a predictor of job satisfaction levels. This article examines whether age and tenure are individual determinants of satisfaction, or whether there is an interaction between the two. The results indicate that employee age is not significantly associated with overall job satisfaction level, but that tenure is. There is also a significant relationship between tenure and facets of satisfaction (job, pay and fringe benefits), but the effect of tenure on satisfaction is significantly modified by age.
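The age-by-tenure interaction can be made concrete: in a regression with an interaction term, the tenure slope varies with age. The coefficients below are illustrative only, chosen so that the tenure effect weakens at higher ages, mirroring the moderation the article reports.

```python
# Illustrative sketch of moderation: an age x tenure interaction term lets
# the effect of tenure on satisfaction depend on age. Coefficients here are
# invented, not estimates from the article.
def predicted_satisfaction(age, tenure, b0, b_age, b_tenure, b_interact):
    return b0 + b_age * age + b_tenure * tenure + b_interact * age * tenure

# With a negative interaction, one extra year of tenure adds less
# satisfaction for older employees than for younger ones:
coefs = dict(b0=50, b_age=0, b_tenure=1.0, b_interact=-0.01)
young_slope = (predicted_satisfaction(30, 6, **coefs)
               - predicted_satisfaction(30, 5, **coefs))
old_slope = (predicted_satisfaction(60, 6, **coefs)
             - predicted_satisfaction(60, 5, **coefs))
```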