958 results for Performance indices


Relevance: 30.00%

Abstract:

Purpose. To assess, in a sample of normal, keratoconic, and keratoconus (KC) suspect eyes, the performance of a set of new topographic indices computed directly from the digitized images of the Placido rings. Methods. This comparative study comprised a total of 124 eyes of 106 patients from the ophthalmic clinics Vissum Alicante and Vissum Almería (Spain), divided into three groups: control group (50 eyes), KC group (50 eyes), and KC suspect group (24 eyes). In all cases, a comprehensive examination was performed, including corneal topography with a Placido-based CSO topography system. Clinical outcomes were compared among groups, along with the discriminating performance of the proposed irregularity indices. Results. Significant differences at the 0.05 level were found in the values of the indices among groups by means of the Mann-Whitney-Wilcoxon nonparametric test and the Fisher exact test. Additional statistical methods, such as receiver operating characteristic (ROC) analysis and K-fold cross-validation, confirmed the capability of the indices to discriminate among the three groups. Conclusions. Direct analysis of the digitized images of the Placido mires projected on the cornea is a valid and effective tool for the detection of corneal irregularities. Although based only on data from the anterior surface of the cornea, the new indices performed well even when applied to the KC suspect eyes. They combine simplicity of calculation with high sensitivity in detecting corneal irregularity and can therefore be used as supplementary criteria for diagnosing and grading KC, complementing current keratometric classifications.
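As a rough illustration of the validation approach described above (ROC analysis plus K-fold cross-validation of a discriminating index), the following sketch uses hypothetical index values and scikit-learn; the variable names, group sizes, and data are assumptions, not the study's actual indices.

```python
# Sketch: ROC analysis and stratified K-fold cross-validation of a single
# discriminating index (hypothetical data, not the study's measurements).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
# Hypothetical irregularity index: controls low, KC suspects higher on average
index_control = rng.normal(1.0, 0.3, 50)
index_suspect = rng.normal(1.8, 0.4, 24)
X = np.concatenate([index_control, index_suspect]).reshape(-1, 1)
y = np.concatenate([np.zeros(50), np.ones(24)])  # 0 = control, 1 = KC suspect

# Area under the ROC curve of the raw index as a discriminator
print("AUC of the raw index:", round(roc_auc_score(y, X.ravel()), 3))

# 5-fold cross-validated AUC of a logistic classifier built on the index
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(), X, y, cv=cv, scoring="roc_auc")
print("cross-validated AUC: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```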

Relevance: 30.00%

Abstract:

Purpose: The impact of acute weight loss on rowing performance was assessed when generous nutrient intake was provided in the 2 h of recovery after making weight. Methods: Competitive rowers (N = 17) completed four ergometer trials, each separated by 48 h. Two trials were performed after a 4% body mass loss in the previous 24 h (WT) and two were performed with no weight restrictions, that is, unrestricted (UNR). In addition, two trials (1 × WT, 1 × UNR) were in a thermoneutral environment (NEUTRAL: mean ± SD 21.1 ± 0.7°C, 29.0 ± 4.5% RH) and two were in the heat (HOT: 32.4 ± 0.4°C, 60.4 ± 2.7% RH). Trials were performed in a counterbalanced fashion according to a Latin square design. Aggressive nutritional recovery strategies (WT: 2.3 g·kg⁻¹ carbohydrate, 34 mg·kg⁻¹ Na, 28.4 mL·kg⁻¹ fluid; UNR: ad libitum) were employed in the 2 h after weigh-in. Results: Both WT (mean 2.1 s, 95% CI 0.7-3.4 s; P = 0.003) and HOT (4.1 s, 2.7-5.4 s; P < 0.001) compromised 2000-m time-trial performance. Whereas WT resulted in hypohydration, the associated reduction in plasma volume explained only part of the performance compromise observed (0.2 s for every 1% decrement). Moreover, WT did not influence core temperature or indices of cardiovascular function. Conclusions: Acute weight loss compromised performance despite generous nutrient intake in recovery, although the effect was small. Performance decrements were further exacerbated when exercise was performed in the heat.
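For readers unfamiliar with the counterbalancing scheme mentioned above, the sketch below generates a cyclic Latin square for the four trial conditions; the condition labels are taken from the abstract, but the assignment itself is only an illustrative assumption about how such a design could be constructed.

```python
# Sketch: a cyclic 4x4 Latin square for counterbalancing the four trial
# conditions across groups of rowers (illustrative only).
conditions = ["WT-NEUTRAL", "WT-HOT", "UNR-NEUTRAL", "UNR-HOT"]

def latin_square(items):
    """Each row is one group's trial order; each condition appears exactly
    once per row and once per column (cyclic construction)."""
    n = len(items)
    return [[items[(row + col) % n] for col in range(n)] for row in range(n)]

for group, order in enumerate(latin_square(conditions), start=1):
    print(f"group {group}: " + " -> ".join(order))
```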

Relevance: 30.00%

Abstract:

Objective. To explore the relationship between leadership effectiveness and health-care trust performance, taking into account external quality measures and the number of patient complaints, and to examine the role of care quality climate as a mediator. Design. We developed scales for rating leadership effectiveness and care quality climate. We then drew upon UK national indices of health-care trust performance: Commission for Health Improvement star ratings, Clinical Governance Review ratings, and the number of patient complaints per thousand. We conducted statistical analyses to examine relationships between the predictor and outcome variables. Setting. The study is based on 86 hospital trusts run by the National Health Service (NHS) in the UK. The data collection is part of an annual staff survey commissioned by the NHS to explore the quality of working life. Participants. A total of 17,949 employees were randomly surveyed (41% of the total sample). Results. Leadership effectiveness is associated with higher Clinical Governance Review ratings and Commission for Health Improvement star ratings for our sample (β = 0.42, P < 0.05 and β = 0.37, P < 0.05, respectively) and with fewer patient complaints (β = -0.57, P < 0.05). In addition, 98% of the relationship between leadership and patient complaints is explained by care quality climate. Conclusions. The results offer insight into how non-clinical leadership may foster performance outcomes for health-care organizations. A frequently neglected measure, patient complaints, may be a valid indicator to consider when assessing leadership and quality in a health-care context.
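The mediation claim above (care quality climate explaining most of the leadership-complaints link) can be illustrated with a simple regression-based mediation check; the sketch below uses simulated data and statsmodels, and all variable names and effect sizes are assumptions rather than the study's data.

```python
# Sketch: regression-based mediation check (Baron & Kenny style) on simulated
# data: leadership -> care quality climate -> patient complaints.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 86  # one observation per trust, mirroring the study's sample size
leadership = rng.normal(0, 1, n)
climate = 0.7 * leadership + rng.normal(0, 0.5, n)   # assumed mediator path
complaints = -0.6 * climate + rng.normal(0, 0.5, n)  # assumed outcome path

def slopes(y, X_cols):
    """OLS slopes of y on the given predictors (intercept added)."""
    X = sm.add_constant(np.column_stack(X_cols))
    return sm.OLS(y, X).fit().params[1:]

total = slopes(complaints, [leadership])[0]            # total effect (c path)
direct = slopes(complaints, [leadership, climate])[0]  # direct effect (c' path)
print("total effect:", round(total, 3))
print("direct effect controlling for climate:", round(direct, 3))
print("proportion mediated:", round(1 - direct / total, 2))
```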

Relevance: 30.00%

Abstract:

In this paper, we develop an index and an indicator of productivity change that can be used with negative data. For that purpose the range directional model (RDM), a particular case of the directional distance function, is used for computing efficiency in the presence of negative data. We use RDM efficiency measures to arrive at a Malmquist-type index, which can reflect productivity change, and we use RDM inefficiency measures to arrive at a Luenberger productivity indicator, and we relate the two. The productivity index and indicator are developed relative to a fixed meta-technology, and so they are referred to as a meta-Malmquist index and a meta-Luenberger indicator. We also address the fact that VRS technologies are used for computing the productivity index and indicator (a requirement under negative data), which raises issues relating to the interpretability of the index. We illustrate how the meta-Malmquist index can be used not only for comparing the performance of a unit in two time periods, but also for comparing the performance of two different units at the same or different time periods. The proposed approach is then applied to a sample of bank branches where negative data were involved. The paper shows how the approach yields information on performance from a variety of perspectives that management can use.
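As a rough sketch of how an RDM efficiency score can be computed under variable returns to scale (the building block of the meta-Malmquist index described above), the code below solves the standard range directional model as a linear program with scipy; the input-output data are invented, and this illustrates the general RDM formulation rather than the paper's exact implementation.

```python
# Sketch: range directional model (RDM) efficiency under VRS, solved as an LP.
# Handles negative data because the direction is the range to the ideal point.
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: columns are DMUs (e.g., bank branches); output may be negative.
X = np.array([[10.0, 8.0, 12.0, 9.0]])   # inputs  (m x n)
Y = np.array([[ 5.0, -2.0,  7.0, 3.0]])  # outputs (s x n), negative values allowed

def rdm_efficiency(o):
    """Max beta s.t. sum(l*X) <= X_o - beta*Rx_o, sum(l*Y) >= Y_o + beta*Ry_o,
    sum(l) = 1, l >= 0.  beta is the inefficiency; RDM efficiency = 1 - beta."""
    n = X.shape[1]
    Rx = X[:, o] - X.min(axis=1)          # input ranges to the ideal point
    Ry = Y.max(axis=1) - Y[:, o]          # output ranges to the ideal point
    # variables: [beta, lambda_1..lambda_n]; maximise beta -> minimise -beta
    c = np.concatenate([[-1.0], np.zeros(n)])
    A_ub = np.vstack([
        np.hstack([Rx.reshape(-1, 1), X]),    # beta*Rx + sum(l*X) <= X_o
        np.hstack([Ry.reshape(-1, 1), -Y]),   # beta*Ry - sum(l*Y) <= -Y_o
    ])
    b_ub = np.concatenate([X[:, o], -Y[:, o]])
    A_eq = np.hstack([[[0.0]], np.ones((1, n))])  # VRS: sum(lambda) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (1 + n))
    return 1.0 - res.x[0]

for o in range(X.shape[1]):
    print(f"DMU {o}: RDM efficiency = {rdm_efficiency(o):.3f}")
```

A Malmquist-type index would then be built from ratios of such scores evaluated against a fixed meta-technology in different periods; that step is omitted here.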

Relevance: 30.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of people's ability to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system and identify the reasons for the differences between theoretical expectations and operational performance. The methodology is to hypothesise all the possible factors that could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices: Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis: theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempt to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention in an automatic stock control algorithm, Buyers can produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
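To make the two control indices concrete, here is a small simulation of a periodic-review, order-up-to stock policy that reports Service Level and Average Stock Value; the demand distribution, lead time, unit value, and policy parameters are all assumed for illustration and are not taken from the system studied.

```python
# Sketch: periodic-review order-up-to policy, reporting the two control
# indices discussed above (Service Level and Average Stock Value).
import numpy as np

rng = np.random.default_rng(42)
periods, order_up_to, lead_time, unit_value = 1000, 120, 2, 3.50  # assumptions

on_hand, pipeline = order_up_to, [0] * lead_time
demand_met = demand_total = stock_value_sum = 0.0

for t in range(periods):
    on_hand += pipeline.pop(0)              # receive the oldest outstanding order
    demand = rng.poisson(40)                # assumed daily demand
    sold = min(on_hand, demand)             # unmet demand is lost
    on_hand -= sold
    demand_met += sold
    demand_total += demand
    pipeline.append(order_up_to - on_hand - sum(pipeline))  # order up to target
    stock_value_sum += on_hand * unit_value

print(f"Service Level      : {100 * demand_met / demand_total:.1f}%")
print(f"Average Stock Value: {stock_value_sum / periods:.2f}")
```

In the spirit of the study, one could then perturb the model (forecast bias, parameter errors) or emulate Buyer interventions and observe how much each factor moves these two indices.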

Relevance: 30.00%

Abstract:

This study is concerned with the durability of cement stabilised minestone (CSM). Minestone is dominated by the clay-bearing mudrocks and shales of the Coal Measures. Consequently, engineering problems are often encountered because these rocks are likely to undergo volume change and degradation when exposed to fluctuations in moisture content. In addition, iron sulphides (chiefly pyrite) are frequently present in minestone as diagenetic minerals which, on excavation, have the potential to oxidise and form sulphate minerals. The oxidation of sulphides may in itself contribute to volume increase in pyritic rocks, and the sulphate minerals may combine with the products of cement hydration to produce further expansion. The physical and chemical properties of a wide range of minestones are determined, and attempts are made to correlate these with the engineering performance of cement stabilised specimens subjected to short-term immersion in water. Criteria based on these raw-material indices are proposed with a view to eliminating unsuitable minestones. A long-term durability study is also described, in which the geochemical stability of pyrite in CSM was examined, together with the role played by the sulphur-bearing mineralogy in determining the engineering performance of CSMs exposed to conditions of increased moisture availability. A number of disrupted CSM pavements that were examined are also discussed.
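For reference, the oxidation pathway alluded to above is commonly written as the following overall reaction (standard geochemistry, not a result of this study); the sulphate released can subsequently react with calcium and the aluminate phases of hydrated cement to form expansive products such as gypsum and ettringite.

```latex
% Overall oxidation of pyrite by oxygen in the presence of water
\[
\mathrm{FeS_2} + \tfrac{7}{2}\,\mathrm{O_2} + \mathrm{H_2O}
  \;\longrightarrow\; \mathrm{Fe^{2+}} + 2\,\mathrm{SO_4^{2-}} + 2\,\mathrm{H^+}
\]
```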

Relevance: 30.00%

Abstract:

Pavement performance is one of the most important components of a pavement management system. Prediction of the future performance of a pavement section is important in programming maintenance and rehabilitation needs. Models for predicting pavement performance have traditionally been developed on the basis of traffic and age. The purpose of this research is to extend the use of adaptive logic networks (ALN), a relatively new approach, to pavement performance prediction modeling. Adaptive logic networks have recently emerged as an effective alternative to artificial neural networks for machine learning tasks. The ALN predictive methodology is applicable to a wide variety of contexts, including prediction of roughness-based indices, composite rating indices, and individual pavement distresses. The ALN program requires key information about a pavement section, including the current distress indices, pavement age, climate region, traffic, and other variables, to predict yearly performance values into the future. This research investigates the effect of different learning rates of the ALN in pavement performance modeling. The approach can be used at both the network and project level for predicting the long-term performance of a road network. Results indicate that the ALN approach is well suited for pavement performance prediction modeling and shows a significant improvement over results obtained from other artificial intelligence approaches.
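Since ALN software is not widely standardized, the sketch below only illustrates how an ALN-style model represents a piecewise-linear prediction surface by combining linear functions with MIN/MAX nodes; the coefficients, inputs (age, traffic), and output scale are hypothetical, and no training is performed.

```python
# Sketch: evaluating an ALN-style piecewise-linear surface.  An ALN combines
# linear functions with MIN (logical AND) and MAX (logical OR) nodes, giving a
# continuous piecewise-linear approximation of the target function.
import numpy as np

linear_pieces = [
    # each row: (intercept, coef_age, coef_traffic) -- hypothetical weights
    np.array([95.0, -1.2, -0.8]),
    np.array([90.0, -0.6, -1.5]),
    np.array([80.0, -0.3, -0.4]),
]

def aln_predict(age, traffic_msa):
    """Predict a condition index from a single MIN node over linear pieces;
    deeper ALNs nest several MIN groups under a MAX node for non-concave shapes."""
    x = np.array([1.0, age, traffic_msa])
    planes = np.array([w @ x for w in linear_pieces])
    return planes.min()

print("predicted condition index:", round(aln_predict(age=5, traffic_msa=2.0), 1))
```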

Relevance: 30.00%

Abstract:

There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for the computation of these indices and to how the use of such methods may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation and attribute weighting involved in their computation. The integrated model is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach developed in this dissertation was automated through an IT artifact that was designed, developed and evaluated based on the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. In order to evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured in per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and our multi-method-based rankings, than higher-income country groups.
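The sensitivity idea described above can be illustrated with a toy example that recomputes a composite index under a few alternative normalization and aggregation choices and compares the resulting rankings; the data, the scenario set, and the agreement measure below are all assumptions for illustration and are far simpler than the 115 scenarios used in the dissertation.

```python
# Sketch: how normalization and aggregation choices change a composite ranking.
# Toy data: 5 hypothetical countries scored on 3 sub-indicators.
import numpy as np
from scipy.stats import rankdata, spearmanr

scores = np.array([
    [0.9, 0.4, 0.7],
    [0.6, 0.8, 0.5],
    [0.3, 0.9, 0.9],
    [0.8, 0.2, 0.6],
    [0.5, 0.5, 0.4],
])

def minmax(x):  # rescale each indicator to [0, 1]
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

def zscore(x):  # standardize each indicator
    return (x - x.mean(axis=0)) / x.std(axis=0)

scenarios = {
    "minmax + arithmetic": minmax(scores).mean(axis=1),
    "minmax + geometric":  np.exp(np.log(minmax(scores) + 1e-9).mean(axis=1)),
    "zscore + arithmetic": zscore(scores).mean(axis=1),
}

ranks = {name: rankdata(-idx) for name, idx in scenarios.items()}
for name, r in ranks.items():
    print(name, "ranks:", r.astype(int))

rho, _ = spearmanr(ranks["minmax + arithmetic"], ranks["zscore + arithmetic"])
print("rank agreement between two scenarios (Spearman rho):", round(rho, 2))
```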

Relevance: 30.00%

Abstract:

This study examines the performance of two geomagnetic index series, and of series synthesized from a semi-empirical model of magnetospheric currents, in explaining the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data, for the 2007 to 2014 period, from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA) at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of storm-time disturbance (Dst) and ring current (RC) indices and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data to be compared with proxies, but with much less complexity. For the two stations in Europe, we obtain an indication that NOC models could be able to separate ionospheric and magnetospheric contributions. The Dst and RC series explain the four observatory H series successfully, with mean significant correlation coefficients from 0.5 to 0.6 during low geomagnetic activity (K less than 4) and from 0.6 to 0.7 on geomagnetically active days (K greater than or equal to 4). With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric/ionospheric ratio in the QD variation is smaller, a dominantly ionospheric QD contribution can be removed, and the TS05 simulations are the best proxy; and Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions to the QD variation cannot be differentiated and correlations with the TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool for classifying storms according to their main sources. For Coimbra and Panagyurishte in particular, where the ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and for the interpretation of past SW events.
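Natural orthogonal component analysis is essentially an empirical orthogonal function (principal component) decomposition of the day-by-hour matrix of H values; the sketch below extracts and removes a leading QD mode from synthetic data via SVD and then correlates the residual with a stand-in index series. All data here are simulated, and this is only a schematic of the preprocessing step, not the authors' code.

```python
# Sketch: extract the quiet daily (QD) variation as the leading NOC/EOF mode of
# the day-by-hour matrix of H, then correlate the residual with an index proxy.
import numpy as np

rng = np.random.default_rng(7)
days, hours = 365, 24
t = np.arange(hours)

qd_shape = 10 * np.sin(2 * np.pi * t / 24)              # assumed QD waveform (nT)
qd_amp = 1 + 0.3 * rng.standard_normal(days)             # day-to-day QD modulation
disturbance = -3 * rng.gamma(1.0, 1.0, size=(days, 1))   # stand-in storm-time signal
H = qd_amp[:, None] * qd_shape + disturbance + rng.normal(0, 1, (days, hours))

# NOC / EOF step: here the leading singular mode captures the common daily (QD)
# waveform, because the simulated storm-time signal is roughly constant over a day
U, S, Vt = np.linalg.svd(H, full_matrices=False)
qd_model = np.outer(U[:, 0] * S[0], Vt[0])               # rank-1 QD reconstruction
residual = H - qd_model                                  # irregular variation

# Correlate the daily-mean residual with a stand-in (Dst-like) index series
index_proxy = disturbance.ravel() + rng.normal(0, 1, days)
r = np.corrcoef(residual.mean(axis=1), index_proxy)[0, 1]
print("correlation of QD-removed H with the index proxy:", round(r, 2))
```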

Relevance: 30.00%

Abstract:

This paper presents a one-dimensional exploratory study that compares the socially responsible companies included in the Spanish sustainability index, FTSE4Good Ibex, with the rest of the indices in the IBEX family. The aim is to use different economic variables to establish whether there are differences in economic performance. Parametric testing was used to study whether there are differences between the two types of companies. The results demonstrate that there are no statistically significant differences in economic performance between the two groups. The study confirms that companies with good practices are as profitable as the rest, but it also shows that their economic-financial behaviour is no better as a result of inclusion in the sustainability index.
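As an illustration of the kind of parametric test described (comparing an economic variable between index members and the rest), the sketch below runs a two-sample t-test on simulated return-on-assets figures; the data, group sizes, and variable choice are invented and do not come from the study.

```python
# Sketch: two-sample t-test comparing a profitability measure between firms in
# a sustainability index and the rest (simulated data, illustrative only).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
roa_index_members = rng.normal(0.06, 0.03, 25)   # hypothetical ROA values
roa_other_firms   = rng.normal(0.06, 0.03, 30)

t_stat, p_value = ttest_ind(roa_index_members, roa_other_firms, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value would be consistent with the paper's finding of no
# statistically significant difference between the two groups.
```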

Relevance: 30.00%

Abstract:

The initiation of the Benguela upwelling has been dated to the late Miocene, but estimates of its sea surface temperature evolution are not available. This study presents data from Ocean Drilling Program (ODP) Site 1085 recovered from the southern Cape Basin. Samples of the middle Miocene to Pliocene were analyzed for alkenone-based (UK'37, SSTUK) and glycerol dialkyl glycerol tetraether (GDGT) based (TEX86, TempTEX) water temperature proxies. In concordance with global cooling during the Miocene, SSTUK and TempTEX exhibit declines of about 8°C and 16°C, respectively. The temperature trends suggest an inflow of cold Antarctic waters triggered by Antarctic ice sheet expansion and intensification of Southern Hemisphere southeasterly winds. A temperature offset between the two proxies developed with the onset of upwelling, which can be explained by differences in habitat: alkenone-producing phytoplankton live in the euphotic zone and record sea surface temperatures, while GDGT-producing Thaumarchaeota are displaced to colder subsurface waters in upwelling-influenced areas and record subsurface water temperatures. We suggest that variations in subsurface water temperatures were driven by advection of cold Antarctic waters and thermocline adjustments that were due to changes in North Atlantic deep water formation. A decline in surface temperatures, an increased offset between the temperature proxies, and an increase in primary productivity suggest the establishment of the Benguela upwelling at 10 Ma. During the Messinian Salinity Crisis, between 7 and 5 Ma, surface and subsurface temperature estimates became similar, likely because of a strong reduction in Atlantic overturning circulation, while high total organic carbon contents suggest a "biogenic bloom." In the Pliocene, the offset between the temperature estimates and the cooling trend were reestablished.
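For readers unfamiliar with the two temperature proxies, their standard definitions from the paleoceanographic literature (not results of this study) are reproduced below; both ratios increase with growth temperature and are converted to temperature estimates through published calibrations.

```latex
% Alkenone unsaturation index, based on di- and tri-unsaturated C37 alkenones
\[
U^{K'}_{37} \;=\; \frac{[\mathrm{C}_{37:2}]}{[\mathrm{C}_{37:2}] + [\mathrm{C}_{37:3}]}
\]

% TEX86 index, based on isoprenoid GDGTs (Cren' = crenarchaeol regioisomer)
\[
\mathrm{TEX}_{86} \;=\;
  \frac{[\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}
       {[\mathrm{GDGT\text{-}1}] + [\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}
\]
```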