29 results for Key performance indicators
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Key performance features of a miniature laser ablation time-of-flight mass spectrometer designed for in situ investigations of the chemical composition of planetary surfaces are presented. This mass spectrometer is well suited for elemental and isotopic analysis of raw solid materials with high sensitivity and high spatial resolution. In this study, ultraviolet laser radiation with irradiances suitable for ablation (< 1 GW/cm2) is used to achieve stable ion formation and low sample consumption. In comparison to our previous laser ablation studies at infrared wavelengths, several improvements to the experimental setup have been made, which allow accurate control over the experimental conditions and good reproducibility of measurements. Current performance evaluations indicate significant improvements to several instrumental figures of merit. The mass scale is calibrated to a mass accuracy (Δm/m) of about 100 ppm, and a typical mass resolution (m/Δm) of ~600 is achieved at the lead mass peaks. At lower laser irradiances, the mass resolution improves to about m/Δm ~900 for lead and is limited by the 3 ns laser pulse duration. The effective dynamic range of the instrument was enhanced from about 6 decades, determined in a previous study, to more than 8 decades at present. Current studies show high sensitivity in the detection of both metallic and non-metallic elements: abundances down to tens of ppb can be measured together with their isotopic patterns. Because the experimental parameters, e.g. laser characteristics, ion-optical settings and sample position, are under strict computer control, measurements can be performed with high reproducibility. Copyright © 2012 John Wiley & Sons, Ltd.
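For orientation, the figures of merit quoted above can be restated as peak widths and calibration tolerances at the lead isotopes (m ≈ 208 u); the LaTeX lines below only rework the reported numbers and add no new data.

```latex
% Peak widths implied by the quoted resolutions at the Pb isotopes (m ~ 208 u)
\frac{m}{\Delta m} \approx 600 \;\Rightarrow\; \Delta m \approx \frac{208\,\mathrm{u}}{600} \approx 0.35\,\mathrm{u},
\qquad
\frac{m}{\Delta m} \approx 900 \;\Rightarrow\; \Delta m \approx 0.23\,\mathrm{u}

% Calibration tolerance implied by the ~100 ppm mass accuracy
\frac{\Delta m}{m} \approx 10^{-4} \;\Rightarrow\; \Delta m \approx 208\,\mathrm{u}\times 10^{-4} \approx 0.02\,\mathrm{u}

% An effective dynamic range of 8 decades corresponds to a signal ratio of 10^{8}
```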
Abstract:
Purpose – A growing body of literature points to the importance of public service motivation (PSM) for the performance of public organizations. The purpose of this paper is to assess the method predominantly used for studying this linkage by comparing the findings it yields without and with a correction suggested by Brewer (2006), which removes the common-method bias arising from employee-specific response tendencies. Design/methodology/approach – First, the authors conduct a systematic review of published empirical research on the effects of PSM on performance and show that all studies found have been conducted at the individual level. Performance indicators in all but three studies were obtained by surveying the same employees who were also asked about their PSM. Second, the authors conduct an empirical analysis. Using survey data from 240 organizational units within the Swiss federal government, the paper compares results from an individual-level analysis (comparable to existing research) to two analyses where the data are aggregated to the organizational level, one without and one with the correction for common-method bias suggested by Brewer (2006). Findings – Looking at the Attraction to Policy-Making dimension of PSM, there is an interesting contrast: While this variable is positively correlated with performance in both the individual-level analysis and the aggregated data analysis without the correction for common-method bias, it is not statistically associated with performance in the aggregated data analysis with the correction. Originality/value – The analysis is the first to assess the robustness of the performance-PSM linkage to a correction for common-method bias. The findings place the validity of at least one part of the individual-level linkage between PSM and performance into question.
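The abstract does not spell out the correction itself; one way to remove employee-specific response tendencies, in the spirit of the correction attributed to Brewer (2006), is to aggregate to the unit level while drawing the predictor and the outcome from disjoint halves of each unit's respondents, so that no employee contributes to both measures. A minimal sketch of that split-sample idea, with hypothetical column names, might look as follows.

```python
import numpy as np
import pandas as pd

def aggregate_with_split(df, unit_col="unit", psm_col="psm", perf_col="performance",
                         correct=True, seed=0):
    """Aggregate employee survey data to the unit level.

    With correct=False both unit means use all respondents (common-method
    bias remains); with correct=True PSM is averaged over one random half
    of each unit's respondents and performance over the other half.
    Column names are hypothetical placeholders, not the study's variables.
    """
    rng = np.random.default_rng(seed)
    rows = []
    for unit, grp in df.groupby(unit_col):
        if correct and len(grp) >= 2:
            idx = rng.permutation(len(grp))
            half_a, half_b = idx[: len(grp) // 2], idx[len(grp) // 2:]
            psm = grp.iloc[half_a][psm_col].mean()
            perf = grp.iloc[half_b][perf_col].mean()
        else:
            psm, perf = grp[psm_col].mean(), grp[perf_col].mean()
        rows.append({unit_col: unit, psm_col: psm, perf_col: perf})
    return pd.DataFrame(rows)

# Unit-level correlation, without and with the split-sample correction:
# corr_naive = aggregate_with_split(survey, correct=False)[["psm", "performance"]].corr()
# corr_split = aggregate_with_split(survey, correct=True)[["psm", "performance"]].corr()
```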
Abstract:
Individuals differ in their preference for processing information on the basis of taxonomic, feature-based similarity, or thematic, relation-based similarity. These differences, which have been investigated in a recently emerging research stream in cognitive psychology, affect innovative behavior and thus constitute an important antecedent of individual performance in research and development (R&D) that has been overlooked so far in the literature on innovation management. To fill this research gap, survey and test data from the employees of a multinational information technology services firm are used to examine the relationship between thematic thinking and R&D professionals' individual performance. A moderated mediation model is applied to investigate the proposed relationships of thematic thinking and individual-level performance indicators. Results show a positive relationship between thematic thinking and innovativeness, as well as individual job performance. While the results do not support the postulated moderation of the innovativeness–job performance relationship by employees' political skill, they show that the relationship between thematic thinking and job performance is fully mediated by R&D professionals' innovativeness. The present study is thus the first to reveal a positive relationship between thematic thinking and innovative performance.
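The abstract names a moderated mediation model but not its estimation details; a minimal regression-based sketch of such a model, with hypothetical column names (thematic, innov, perf, polskill) and a simple bootstrap for the indirect effect, could look like this.

```python
import numpy as np
import statsmodels.formula.api as smf

def moderated_mediation(df, n_boot=2000, seed=1):
    """Product-of-coefficients sketch: thematic thinking -> innovativeness ->
    job performance, with political skill moderating the second path.
    Column names are hypothetical placeholders; this is not the paper's model.
    """
    a = smf.ols("innov ~ thematic", data=df).fit().params["thematic"]
    full = smf.ols("perf ~ innov * polskill + thematic", data=df).fit()
    b = full.params["innov"]  # effect of innov at polskill = 0 (center polskill beforehand)

    # Percentile bootstrap for the indirect effect a*b
    rng = np.random.default_rng(seed)
    boots = []
    for _ in range(n_boot):
        s = df.sample(len(df), replace=True, random_state=int(rng.integers(1 << 31)))
        a_i = smf.ols("innov ~ thematic", data=s).fit().params["thematic"]
        b_i = smf.ols("perf ~ innov * polskill + thematic", data=s).fit().params["innov"]
        boots.append(a_i * b_i)
    low, high = np.percentile(boots, [2.5, 97.5])
    return {"indirect": a * b, "ci_95": (low, high),
            "moderation": full.params["innov:polskill"]}
```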
Abstract:
We present a climate analysis of nine unique Swiss Alpine new snow series that have been newly digitized. The stations cover different altitudes (450–1860 m asl), and all time series span more than 100 years (one from 1864 to 2009). In addition, data from 71 stations for the last 50–80 years for new snow and snow depth are analysed to obtain a more complete picture of Swiss Alpine snow variability. Important snow climate indicators such as new snow sums (NSS), maximum new snow (MAXNS) and days with snowfall (DWSF) are calculated, and their variability and trends analysed. Series of days with snow pack (DWSP) ≥ 1 cm are reconstructed with useful quality for six stations using the daily new snow, local temperature and precipitation data. Our results reveal large decadal variability with phases of low and high values for NSS, DWSF and DWSP. For most stations, NSS, DWSF and DWSP show the lowest values recorded and unprecedented negative trends in the late 1980s and 1990s. For MAXNS, however, no clear trends and smaller decadal variability are found, but very large MAXNS values (>60 cm) have not occurred since the year 2000. The fraction of NSS and DWSP in different seasons (autumn, winter and spring) has changed only slightly over the ∼150 year record. Some decreases, most likely attributable to temperature changes in the last 50 years, are found for spring, especially for NSS at low stations. Both the NSS and DWSP snow indicators show a trend reversal in most recent years (since 2000), especially at low and medium altitudes. This is consistent with the recent ‘plateauing’ (i.e. slight relative decrease) of mean winter temperature in Switzerland and illustrates how important decadal variability is in understanding the trends in key snow indicators.
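For readers wanting to reproduce indicators of this kind from daily station data, a minimal pandas sketch is shown below; the column names and the September-to-August season definition are assumptions, not specifications from the paper.

```python
import pandas as pd

def snow_indicators(daily, snow_col="new_snow_cm", depth_col="snow_depth_cm"):
    """Compute seasonal snow indicators from a daily DataFrame indexed by date.

    NSS   - new snow sum (cm) per season
    MAXNS - maximum daily new snow (cm) per season
    DWSF  - days with snowfall (new snow > 0 cm) per season
    DWSP  - days with snowpack (snow depth >= 1 cm) per season
    A season here runs September..August (an assumption for this sketch).
    """
    # Label each day with the year in which its season starts
    season = daily.index.year - (daily.index.month < 9).astype(int)
    grouped = daily.groupby(season)
    return pd.DataFrame({
        "NSS": grouped[snow_col].sum(),
        "MAXNS": grouped[snow_col].max(),
        "DWSF": grouped[snow_col].agg(lambda s: int((s > 0).sum())),
        "DWSP": grouped[depth_col].agg(lambda s: int((s >= 1).sum())),
    })
```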
Abstract:
Recent advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing environmental conditions and numbers of users, application performance might suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system for controlling the complexity of scaling applications composed of multiple services, using mechanisms based on fulfillment of SLAs. We present how service monitoring information can be used in conjunction with service level objectives, predictions, and correlations between performance indicators for optimizing the allocation of services belonging to distributed applications. We validate our models using experiments and simulations involving a distributed enterprise information system. We show how discovering correlations between application performance indicators can be used as a basis for creating refined service level objectives, which can then be used for scaling the application and improving the application's overall performance under similar conditions.
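As an illustration of how correlations between performance indicators might be turned into refined service level objectives, the following sketch fits a simple linear relation between each strongly correlated internal indicator and a user-facing SLO metric and inverts it to obtain a scaling threshold; metric names and the linear mapping are assumptions rather than the paper's actual model.

```python
import numpy as np
import pandas as pd

def refined_slo(metrics, slo_metric="response_time_ms", slo_limit=500.0, min_corr=0.8):
    """Derive refined SLO thresholds for internal indicators that correlate
    strongly with the user-facing SLO metric.

    For every indicator whose Pearson correlation with the SLO metric exceeds
    min_corr, fit a straight line and invert it to find the indicator value
    at which the SLO limit would be reached.  Metric names and the linear
    mapping are illustrative assumptions.
    """
    corr = metrics.corr()[slo_metric]
    thresholds = {}
    for col in metrics.columns:
        if col == slo_metric or abs(corr[col]) < min_corr:
            continue
        slope, intercept = np.polyfit(metrics[col], metrics[slo_metric], deg=1)
        if slope != 0:
            thresholds[col] = (slo_limit - intercept) / slope
    return thresholds

# Example: trigger scaling when a correlated indicator crosses its derived threshold
# thresholds = refined_slo(monitoring_df)
# if latest["cpu_util"] > thresholds.get("cpu_util", float("inf")):
#     scale_out()
```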
Abstract:
Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict requirements in terms of performance. It is often the case that such applications have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies in the CloudSim simulator, using data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities. We then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
Abstract:
Cloud Computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, objectives of managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints lead to challenges for maintaining optimal resource control. Furthermore, conflicting objectives in the management of cloud infrastructure and distributed applications might lead to violations of SLAs and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as an input to the cloud management system, increasing the efficiency of allocating resources as well as that of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimised resource allocation in infrastructure clouds. Third, we describe a method for discovering relations between the performance indicators of services belonging to distributed applications and then using these relations for building scaling rules that a cloud management system (CMS) can use for automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms, which optimally scale systems composed of VMs based on given SLA performance constraints. All presented research works were implemented and tested using enterprise distributed applications.
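The abstract does not reproduce the two VM-scaling algorithms; as a point of reference, a generic reactive, SLA-driven scaling step of the kind such algorithms typically improve upon can be sketched as follows (thresholds and metric names are assumptions).

```python
from dataclasses import dataclass

@dataclass
class ScalingDecision:
    action: str   # "scale_out", "scale_in" or "hold"
    vms: int      # target number of VMs

def sla_scaling_step(current_vms, response_time_ms, sla_limit_ms=500.0,
                     headroom=0.7, min_vms=1, max_vms=32):
    """One step of a reactive, SLA-driven VM scaling policy.

    Scale out when the measured response time exceeds the SLA limit, scale in
    when it is comfortably below limit * headroom, otherwise hold.  This is a
    generic baseline sketch, not one of the dissertation's algorithms.
    """
    if response_time_ms > sla_limit_ms and current_vms < max_vms:
        return ScalingDecision("scale_out", current_vms + 1)
    if response_time_ms < sla_limit_ms * headroom and current_vms > min_vms:
        return ScalingDecision("scale_in", current_vms - 1)
    return ScalingDecision("hold", current_vms)

# decision = sla_scaling_step(current_vms=4, response_time_ms=620.0)
```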
Abstract:
This paper presents a multifactor approach for performance assessment of Water Users Associations (WUAs) in Uzbekistan in order to identify the drivers of improved and efficient WUA performance. The study was carried out in the Fergana Valley, where the WUAs were created along the South Fergana Main Canal during the last 10 years. The farmers and employees of 20 WUAs were questioned about the WUAs' activities, and quantitative and qualitative data were obtained. These data formed the basis for calculating 36 indicators divided into 6 groups: water supply, technical conditions, economic conditions, social and cultural conditions, organizational conditions, and information conditions. All indicators were assessed with a differentiated point system, adjusted for the subjectivity of several of them, giving a maximum total score of 250 points per association. The WUAs of the Fergana Valley scored between 145 and 219 points, which reflects a highly diverse level of WUA performance in the region. The analysis of the indicators revealed that the key to a WUA's success lies in the organizational and institutional conditions, including participatory factors and the awareness of both farmers and employees about the work of the WUA. The research showed that low WUA performance is consistently explained by poor technical and economic conditions along with weak organizational and information-dissemination conditions. Technical and economic conditions are difficult to improve immediately because they are cost-based and cost-induced. However, it is possible to improve the organizational conditions and to strengthen the institutional basis via formal and informal institutions, which will gradually lead to improvement of the economic and technical conditions of WUAs. Farmers should be involved in WUA governance and in the process of making common decisions and solving common problems together through proper institutions. Their awareness can also be improved by additional training to increase farmers' agronomic and irrigation knowledge, teaching them water-saving technologies and acquainting them with the use of water-measuring equipment, so as to bring reliable water supply, transparent budgeting, and adequate as well as equitable water allocation to the water users.
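A minimal sketch of the composite scoring described above is given below; the per-indicator point maxima are placeholders, since the abstract only reports the six groups, the 36 indicators and the overall maximum of 250 points.

```python
def wua_score(points_by_group, max_by_group):
    """Aggregate WUA indicator points into group scores and a total score.

    points_by_group: dict mapping group name -> list of awarded points
    max_by_group:    dict mapping group name -> list of maximum points
    Only the group structure and the 250-point overall maximum come from the
    abstract; the per-indicator maxima are placeholders chosen by the analyst.
    """
    group_scores = {group: sum(pts) for group, pts in points_by_group.items()}
    total = sum(group_scores.values())
    maximum = sum(sum(pts) for pts in max_by_group.values())
    return group_scores, total, 100.0 * total / maximum

# Hypothetical usage (two of the six groups shown, abbreviated point lists):
# scores, total, pct = wua_score(
#     {"water_supply": [5, 7, 6], "organizational": [8, 9]},
#     {"water_supply": [10, 10, 10], "organizational": [10, 10]},
# )
```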
Abstract:
Morbidity and mortality of myocardial infarction remain significant, with the resulting left ventricular function presenting as a major determinant of clinical outcome. Protecting the myocardium against ischemia-reperfusion injury has become a major therapeutic goal, and the identification of key signaling pathways has paved the way for various interventions, but until now with disappointing results. This article describes the recently discovered role of G-protein-coupled receptor kinase-2 (GRK2), which is known to critically influence the development and progression of heart failure, in acute myocardial injury. It focuses on potential applications of the GRK2 peptide inhibitor βARKct in ischemic myocardial injury and on the use of GRK2 as a biomarker in acute myocardial infarction, and discusses the challenges of translating GRK2 inhibition as a cardioprotective strategy into a possible future clinical application.
Abstract:
The Long Term Evolution (LTE) cellular technology is expected to extend the capacity and improve the performance of current 3G cellular networks. Among the key mechanisms in LTE responsible for traffic management is the packet scheduler, which handles the allocation of resources to active flows in both the frequency and time dimensions. This paper investigates, for various scheduling schemes, how they affect the inter-cell interference characteristics and how the interference in turn affects the users' performance. A special focus in the analysis is on the impact of flow-level dynamics resulting from random user behaviour. For this we use a hybrid analytical/simulation approach which enables fast evaluation of flow-level performance measures. Most interestingly, our findings show that the scheduling policy significantly affects the inter-cell interference pattern, but that the scheduler-specific pattern has little impact on flow-level performance.
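The abstract does not name the scheduling schemes compared; as a generic illustration of frequency/time resource allocation by an LTE packet scheduler, a proportional-fair assignment of resource blocks for one transmission interval might be sketched as follows.

```python
import numpy as np

def proportional_fair_allocation(rates, avg_throughput, n_rb):
    """Allocate n_rb resource blocks among active flows for one TTI.

    rates:          array (flows x resource blocks) of achievable rates
    avg_throughput: array of exponentially smoothed past throughputs per flow
    Each resource block goes to the flow maximizing rate / average throughput,
    the classic proportional-fair metric.  This is a generic illustration, not
    one of the paper's schemes.
    """
    allocation = np.full(n_rb, -1)
    for rb in range(n_rb):
        metric = rates[:, rb] / np.maximum(avg_throughput, 1e-9)
        allocation[rb] = int(np.argmax(metric))
    return allocation

# Example: 3 flows, 6 resource blocks, random channel rates
# rng = np.random.default_rng(0)
# alloc = proportional_fair_allocation(rng.random((3, 6)), np.ones(3), 6)
```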
Abstract:
Carnitine is an amino acid derivative that plays a key role in energy metabolism. Endogenous carnitine is found in its free form or esterified with acyl groups of several chain lengths. Quantification of carnitine and acylcarnitines is of particular interest for research and for screening for metabolic disorders. We developed a method with online solid-phase extraction coupled to high-performance liquid chromatography and tandem mass spectrometry to quantify carnitine and three acylcarnitines of different polarity (acetylcarnitine, octanoylcarnitine, and palmitoylcarnitine). Plasma samples were deproteinized with methanol, loaded on a cation exchange trapping column, and separated on a reversed-phase C8 column using heptafluorobutyric acid as an ion-pairing reagent. Considering the endogenous nature of the analytes, we quantified with the standard addition method and with external deuterated standards. Solid-phase extraction and separation were achieved within 8 min. Recoveries of carnitine and acylcarnitines were between 98 and 105 %. Both quantification methods were equally accurate (all values within 84 to 116 % of target concentrations) and precise (day-to-day variation of less than 18 %) for all carnitine species and concentrations analyzed. The method was used successfully for the determination of carnitine and acylcarnitines in different human samples. In conclusion, we present a method for simultaneous quantification of carnitine and acylcarnitines with rapid sample work-up. This approach requires small sample volumes and a short analysis time, and it can be applied to the determination of acylcarnitines other than those tested. The method is useful for applications in research and clinical routine.
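As a reminder of how standard addition quantification works in practice, the sketch below fits a line through the responses of spiked aliquots and extrapolates to zero signal; the numbers in the usage example are made up and are not the paper's data.

```python
import numpy as np

def standard_addition_concentration(added_conc, signals):
    """Estimate an endogenous analyte concentration by standard addition.

    added_conc: spiked concentrations added to aliquots of the same sample
                (first entry typically 0 for the unspiked aliquot)
    signals:    measured instrument responses for those aliquots
    A straight line is fitted and extrapolated to zero signal; the magnitude
    of the x-intercept estimates the endogenous concentration.  Values and
    units in the example are illustrative only.
    """
    slope, intercept = np.polyfit(added_conc, signals, deg=1)
    return -intercept / slope if slope != 0 else float("nan")

# Hypothetical example (µmol/L added vs. peak-area ratio):
# c0 = standard_addition_concentration([0, 10, 20, 40], [0.52, 0.81, 1.10, 1.68])
# -> roughly 18 µmol/L endogenous carnitine in this made-up data
```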
Abstract:
Three different fissure preparation procedures were tested and compared to the non-invasive approach using a conventional unfilled sealant and a flowable composite. Eighty permanent molars were selected and divided into 4 groups of 20 teeth each. All the teeth were split into 2 halves, and the exposed fissures were photographed under a microscope (35x) before and after being prepared using the following methods: (I) Er:YAG laser (KEY Laser, KaVo), 600 mJ pulse energy, 6 Hz; (II) diamond bur; (III) Er:YAG laser (KEY Laser, KaVo), 200 mJ pulse energy, 4 Hz; (IV) control group: powder jet cleaner (Prophyflex, KaVo, Germany). The pre- and post-images were superimposed in order to evaluate the amount of hard tissue removed. Ten teeth in each group were then acid etched and sealed with an unfilled sealant (Delton opaque, Dentsply), while the remaining 10 teeth were acid etched, primed and bonded (Prime & Bond NT, Dentsply) and sealed with a flowable composite (X-flow, DeTrey, Dentsply). Material penetration and microleakage were evaluated after thermocycling (5000 cycles) and staining with 5% methylene blue. ANOVA and Mann-Whitney tests were applied for statistical analysis. The laser at 600 mJ and the diamond bur eliminated the greatest amount of hard tissue. The control teeth presented the least microleakage, whether sealed with Delton or X-flow. A correlation between material penetration and microleakage could not be statistically confirmed. Mechanical preparation prior to fissure sealing did not enhance the final performance of the sealant.
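For completeness, the two statistical tests named above can be run with scipy as sketched below; the inputs are hypothetical, since the abstract does not reproduce the measured data.

```python
from scipy import stats

def compare_groups(removal_by_group, leakage_a, leakage_b):
    """Statistical comparisons of the kind named above (illustrative only).

    removal_by_group: list of per-group arrays of removed hard tissue
                      -> one-way ANOVA across the four preparation groups
    leakage_a/b:      ordinal microleakage scores for two sealing materials
                      -> Mann-Whitney U test
    The data passed in are hypothetical placeholders.
    """
    f_stat, p_anova = stats.f_oneway(*removal_by_group)
    u_stat, p_mwu = stats.mannwhitneyu(leakage_a, leakage_b, alternative="two-sided")
    return {"anova_p": p_anova, "mannwhitney_p": p_mwu}
```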