31 results for Ankle Measurement Evaluation


Relevance: 30.00%

Abstract:

As a research group with no commercial interest in any macular pigment optical density (MPOD) measurement devices or nutritional supplements, we feel that we were well-placed to carry out an independent clinical assessment of the reliability of the MPS 9000 (Tinsley Precision Instruments, Redhill, Surrey, UK). Our study was prompted by the fact that we could not find any reported coefficient of repeatability value within the literature, and none was provided by the manufacturer.1 We had planned to use this instrument in our own research studies investigating the impact of nutritional supplementation on MPOD. For this purpose, we needed …
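For reference, the statistic the study needed is conventionally defined, in the Bland–Altman sense, as

$$ \mathrm{CoR} = 1.96\sqrt{2}\, s_w \approx 2.77\, s_w, $$

where $s_w$ is the within-subject standard deviation of repeated measurements; 95% of test–retest differences are expected to lie within $\pm\mathrm{CoR}$. This is the standard definition, given here for context rather than quoted from the study.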

Relevance: 30.00%

Abstract:

The aim of this study was to determine whether an ophthalmophakometric technique could offer a feasible means of investigating ocular component contributions to residual astigmatism in human eyes. Current opinion was gathered on the prevalence, magnitude and source of residual astigmatism. It emerged that a comprehensive evaluation of the astigmatic contributions of the eye's internal ocular surfaces and their respective axial separations (effectivity) had not been carried out to date. An ophthalmophakometric technique was developed to measure astigmatism arising from the internal ocular components. Procedures included the measurement of refractive error (infra-red autorefractometry), anterior corneal surface power (computerised video keratography), axial distances (A-scan ultrasonography) and the powers of the posterior corneal surface in addition to both surfaces of the crystalline lens (multi-meridional still flash ophthalmophakometry). Computing schemes were developed to yield the required biometric data. These included (1) calculation of crystalline lens surface powers in the absence of Purkinje images arising from its anterior surface, (2) application of meridional analysis to derive spherocylindrical surface powers from notional powers calculated along four pre-selected meridians, (3) application of astigmatic decomposition and vergence analysis to calculate contributions to residual astigmatism of ocular components with obliquely related cylinder axes, and (4) calculation of the effect of random experimental errors on the calculated ocular component data. A complete set of biometric measurements was taken from both eyes of 66 undergraduate students. Effectivity due to corneal thickness made the smallest cylinder power contribution (up to 0.25 DC) to residual astigmatism, followed by contributions of the anterior chamber depth (up to 0.50 DC) and crystalline lens thickness (up to 1.00 DC). In each case astigmatic contributions were predominantly direct. More astigmatism arose from the posterior corneal surface (up to 1.00 DC) and both crystalline lens surfaces (up to 2.50 DC). The astigmatic contributions of the posterior corneal and lens surfaces were found to be predominantly inverse, whilst direct astigmatism arose from the anterior lens surface. Very similar results were found for right versus left eyes and males versus females. Repeatability was assessed on 20 individuals. The ophthalmophakometric method was found to be prone to considerable accumulated experimental error. However, these errors are random in nature, so that group-averaged data were found to be reasonably repeatable. A further confirmatory study on 10 individuals demonstrated that biometric measurements made with and without cycloplegia did not differ significantly.
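Computing scheme (2), meridional analysis, can be illustrated with standard power-vector algebra: a notional power measured along meridian θ of a spherocylinder satisfies P(θ) = M + J0 cos 2θ + J45 sin 2θ, so four meridians over-determine the three unknowns and a least-squares fit recovers the spherocylindrical power. A minimal sketch (textbook algebra, not the thesis's own programs; the meridians and powers are illustrative):

```python
import numpy as np

def spherocyl_from_meridians(theta_deg, powers):
    """Recover (sphere, cylinder, axis) from notional powers measured
    along several meridians, via least-squares fit of the power-vector
    model P(theta) = M + J0*cos(2*theta) + J45*sin(2*theta)."""
    t = np.radians(theta_deg)
    A = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    M, J0, J45 = np.linalg.lstsq(A, np.asarray(powers), rcond=None)[0]
    C = -2.0 * np.hypot(J0, J45)             # cylinder (negative-cyl form)
    axis = 0.5 * np.degrees(np.arctan2(J45, J0)) % 180.0
    S = M - C / 2.0                          # sphere
    return S, C, axis

# Four pre-selected meridians (illustrative), powers in dioptres
print(spherocyl_from_meridians([0, 45, 90, 135], [43.00, 43.50, 44.00, 43.50]))
# -> approximately (44.00, -1.00, 90.0)
```

With exactly four meridians the system is over-determined by one equation, which is what allows random measurement error to be averaged down rather than propagated directly.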

Relevance: 30.00%

Abstract:

The present thesis evaluates various aspects of videokeratoscopes, which are becoming increasingly popular in the investigation of corneal topography. The accuracy and repeatability of these instruments have been assessed mainly using spherical surfaces; few studies, however, have assessed the performance of videokeratoscopes in measuring convex aspheric surfaces. Using two videokeratoscopes, the accuracy and repeatability of measurements of twelve aspheric surfaces is determined. Overall, the accuracy and repeatability of both instruments were acceptable; however, progressively flatter surfaces introduced greater errors in measurement. The possible reasons for these errors are discussed. The corneal surface is a biological structure lubricated by the precorneal tear film. The effects of variations in the tear film on the repeatability of videokeratoscopes have not previously been determined for peripheral corneal measurements. The repeatability of two commercially available videokeratoscopes is assessed and is found to depend on the point of measurement on the corneal surface: typically, the superior and nasal meridians exhibit the poorest repeatability. It is suggested that interference from the ocular adnexa is responsible for the reduced repeatability; this localised reduction in repeatability will occur for all videokeratoscopes. Further, comparison of the keratometers and videokeratoscopes used shows that measurements from these instruments are not interchangeable. The final stage of this thesis evaluates the performance of new algorithms. The characteristics of a new videokeratoscope are described, and this videokeratoscope is used to test the accuracy of the new algorithms on twelve aspheric surfaces. The new algorithms are more accurate in determining the shape of aspheric surfaces than those currently proposed.

Relevance: 30.00%

Abstract:

Off-highway motive plant equipment is costly in capital outlay and maintenance. To reduce these overheads and increase site safety and work rate, a technique for assessing and limiting the velocity of such equipment is required. Owing to the extreme environmental conditions met on such sites, conventional velocity measurement techniques are inappropriate. Ogden Electronics Limited was formed specifically to manufacture a motive plant safety system incorporating a speed sensor and sanction unit; to date, the only such commercial unit available. However, problems plague the reliability, accuracy and mass production of this unit. This project assesses the company's existing product and, in conjunction with an appreciation of the company's history and structure, concludes that the unit is unsuited to its intended application. Means of improving the measurement accuracy and longevity of the unit, commensurate with the company's limited resources and experience, are proposed, both for immediate retrofit and for longer-term use. This information is presented as a series of internal reports for the company. The off-highway environment is examined and, in conjunction with an evaluation of means of obtaining a returned signal, comparisons of processing techniques, and on-site gathering of previously unavailable data, preliminary designs for an alternative product are drafted. Theoretical aspects are covered by a literature review of ground-pointing radar, vehicular radar and velocity measuring systems; this review establishes and collates the body of knowledge in areas previously considered unrelated. Based upon this work, a new design is proposed which is suitable for incorporation into the existing company product range. Following production engineering of the design, five units were constructed, tested and evaluated on site. After extended field trials, this design has shown itself to possess greater accuracy, reliability and versatility than the existing sensor, at a lower unit cost.
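For context, the physics behind a ground-pointing Doppler speed sensor of this kind is the standard relation below (textbook radar theory, not a formula taken from the thesis):

$$ v = \frac{c\, f_d}{2 f_0 \cos\theta}, $$

where $f_d$ is the measured Doppler shift, $f_0$ the transmitted frequency, $c$ the speed of light, and $\theta$ the angle between the antenna boresight and the direction of travel. On rough off-highway ground, vehicle pitch perturbs $\theta$ and feeds directly into velocity error, one reason conventional speed-measurement techniques struggle in this environment.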

Relevance: 30.00%

Abstract:

This work is concerned with the development of techniques for the evaluation of large-scale highway schemes, with particular reference to the assessment of their costs and benefits in the context of the current transport planning (T.P.P.) process. It has been carried out in close cooperation with West Midlands County Council, although its application and results are applicable elsewhere. The background to highway evaluation and its development in recent years is described, and the emergence of a number of deficiencies in current planning practice noted. One deficiency in particular stood out: that stemming from inadequate methods of scheme generation, and the research has concentrated upon improving this stage of appraisal, to ensure that subsequent stages of design, assessment and implementation are based upon a consistent and responsive foundation. Deficiencies of scheme evaluation were found to stem from inadequately developed appraisal methodologies, which suffer from difficulties of valuation, measurement and aggregation of the disparate variables that characterise highway evaluation. A failure to respond to local policy priorities was also noted. A 'problem'-based rather than 'goals'-based approach to scheme generation was taken, as it represented the current and foreseeable resource allocation context more realistically. Techniques with potential for highway problem-based scheme generation, and which would work within a series of practical and theoretical constraints, were reviewed; multivariate analysis, and classical factor analysis in particular, was selected because it offered considerable application to the existing difficulties of valuation, measurement and aggregation. Computer programs were written to adapt classical factor analysis to the requirements of T.P.P. highway evaluation, using it to derive a limited number of factors which described the extensive quantity of highway problem data. From this, a series of composite problem scores for 1979 was derived for a case study area of south Birmingham, based upon the factorial solutions, and used to assess highway sites in terms of local policy issues. The methodology was assessed in the light of its ability to describe highway problems in both aggregate and disaggregate terms, to guide scheme design, to coordinate with current scheme evaluation methods, and in general to improve upon current appraisal. Analysis of the results was both subjective, in 'common-sense' terms, and statistical, assessing the changes in problem definition, distribution and priorities that emerged. Overall, the technique was found to improve upon current scheme generation methods in all respects, and in particular in overcoming the problems of valuation, measurement and aggregation without recourse to unsubstantiated and questionable assumptions. A number of remaining deficiencies are outlined, and a series of research priorities described which need to be reviewed in the light of current and future evaluation needs.
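The factor-analytic core of the scheme-generation method can be sketched briefly: standardise the highway-problem variables, extract a small number of factors from their correlation matrix, and score each site on those factors to give composite problem scores. A simplified sketch (principal-component extraction standing in for classical factor analysis; the variables and data are hypothetical, not the thesis's own):

```python
import numpy as np

def composite_problem_scores(X, n_factors=2):
    """Standardise indicator columns, extract the leading factors of the
    correlation matrix, and return loadings plus composite site scores."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)        # standardise
    R = np.corrcoef(Z, rowvar=False)                        # correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)                    # ascending order
    order = np.argsort(eigvals)[::-1][:n_factors]
    loadings = eigvecs[:, order] * np.sqrt(eigvals[order])  # factor loadings
    scores = Z @ eigvecs[:, order]                          # site scores per factor
    return loadings, scores

# Hypothetical data: rows = highway sites, cols = problem indicators
# (e.g. accidents, delay, noise, severance)
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
loadings, scores = composite_problem_scores(X)
print(loadings.shape, scores.shape)   # (4, 2) (30, 2)
```

The appeal for the aggregation problem is that the factor scores combine incommensurable variables through their observed correlation structure rather than through externally imposed monetary weights.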

Relevance: 30.00%

Abstract:

AIM: To determine the validity and reliability of corneal curvature and non-invasive tear break-up time (NITBUT) measurements using the Oculus Keratograph. METHOD: One hundred eyes of 100 patients had their corneal curvature assessed with the Keratograph and the Nidek ARKT TonorefII. NITBUT was then measured objectively with the Keratograph's Tear Film Scan software and subjectively with the Keeler Tearscope. The Keratograph measurements of corneal curvature and NITBUT were repeated to test reliability. The ocular surface disease index (OSDI) questionnaire was completed to quantify ocular comfort. RESULTS: The Keratograph consistently measured significantly flatter corneal curvatures than the ARKT (MSE difference: +1.83 ± 0.44 D), but was repeatable (p > 0.05). Keratograph NITBUT measurements were significantly lower than observation with the Tearscope (by 12.35 ± 7.45 s; p < 0.001) and decreased on subsequent measurement (by -1.64 ± 6.03 s; p < 0.01). The Keratograph measures the first time the tears break up anywhere on the cornea, with 63% of subjects having NITBUTs < 5 s and a further 22% having readings between 5 and 10 s. The Tearscope results correlated better with patients' symptoms (r = -0.32) than the Keratograph (r = -0.19). CONCLUSIONS: The Keratograph requires a calibration offset to be comparable to other keratometry devices. Its current software detects very early tear film changes, recording significantly lower NITBUT values than conventional subjective assessment. Adjustments to the instrument's software have the potential to enhance the value of Keratograph objective measures in clinical practice.

Relevance: 30.00%

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
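For reference, the crisp additive model that the paper fuzzifies is the standard formulation below (textbook DEA notation, not reproduced from the paper); the α-level approach then replaces the crisp data $x_{ij}, y_{rj}$ with interval bounds at each α-cut:

$$
\begin{aligned}
\max_{\lambda,\, s^-,\, s^+} \quad & \sum_{i=1}^{m} s_i^- + \sum_{r=1}^{s} s_r^+ \\
\text{s.t.} \quad & \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^- = x_{io}, \qquad i = 1,\dots,m, \\
& \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^+ = y_{ro}, \qquad r = 1,\dots,s, \\
& \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j,\, s_i^-,\, s_r^+ \ge 0.
\end{aligned}
$$

The DMU under evaluation (subscript $o$) is additive-efficient exactly when all optimal slacks are zero.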

Relevance: 30.00%

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included.

The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labour, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied.

Ioannis E. Tsolas applies DEA to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, bootstrapping is deployed to address the uncertainty surrounding DEA point estimates, providing bias-corrected estimates and confidence intervals. The author finds from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations.

Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs. Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the oil refineries in the sample could improve their efficiency further.

H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model.

Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analysing cost-only efficiency. The second includes (inverse) quality, modelling total customer minutes lost as an input. The third is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritisation. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means with verbal, numerical, and visual representation of their preferences. A total of 14 evaluation criteria were considered, classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology.

Finally, Borge Hess applies stochastic frontier analysis to examine the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant changes in a firm's efficiency following an acquisition, and only weak evidence for efficiency improvements caused by the new shareholder. The author also discovers that parent companies appear not to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company.

To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
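Most of the papers above rest on solving one linear programme per DMU. A minimal sketch of the input-oriented CCR efficiency score with scipy (generic DEA in envelopment form, not any single paper's model; the data are hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (envelopment form) efficiency of DMU o.
    X: (m, n) inputs, Y: (s, n) outputs, columns = DMUs.
    min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vector: [theta, lam]
    # X @ lam - theta * x_o <= 0  and  -Y @ lam <= -y_o
    A_ub = np.block([[-X[:, [o]], X], [np.zeros((s, 1)), -Y]])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical example: 2 inputs, 1 output, 4 DMUs
X = np.array([[2.0, 3.0, 6.0, 4.0], [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```

A score of 1 marks a frontier unit; scores below 1 give the proportional input contraction available to an inefficient unit, which is the quantity the Malmquist, bootstrap, and DEA-COLS analyses above build upon.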

Relevance: 30.00%

Abstract:

Objective: To independently evaluate the impact of the second phase of the Health Foundation's Safer Patients Initiative (SPI2) on a range of patient safety measures. Design: A controlled before and after design. Five substudies: survey of staff attitudes; review of case notes from high risk (respiratory) patients in medical wards; review of case notes from surgical patients; indirect evaluation of hand hygiene by measuring hospital use of handwashing materials; measurement of outcomes (adverse events, mortality among high risk patients admitted to medical wards, patients' satisfaction, mortality in intensive care, rates of hospital acquired infection). Setting: NHS hospitals in England. Participants: Nine hospitals participating in SPI2 and nine matched control hospitals. Intervention: The SPI2 intervention was similar to SPI1, with somewhat modified goals, a slightly longer intervention period, and a smaller budget per hospital. Results: One of the scores (organisational climate) showed a significant (P=0.009) difference in rate of change over time, which favoured the control hospitals, though the difference was only 0.07 points on a five-point scale. Results of the explicit case note reviews of high risk medical patients showed that certain practices improved over time in both control and SPI2 hospitals (and none deteriorated), but there were no significant differences between control and SPI2 hospitals. Monitoring of vital signs improved across control and SPI2 sites. This temporal effect was significant for monitoring the respiratory rate at both the six hour (adjusted odds ratio 2.1, 99% confidence interval 1.0 to 4.3; P=0.010) and 12 hour (2.4, 1.1 to 5.0; P=0.002) periods after admission. There was no significant effect of SPI2 for any of the measures of vital signs. Use of a recommended system for scoring the severity of pneumonia improved from 1.9% (1/52) to 21.4% (12/56) of control patients and from 2.0% (1/50) to 41.7% (25/60) of SPI2 patients. This temporal change was significant (7.3, 1.4 to 37.7; P=0.002), but the difference in difference was not significant (2.1, 0.4 to 11.1; P=0.236). There were no notable or significant changes in the pattern of prescribing errors, either over time or between control and SPI2 hospitals. Two items of medical history taking (exercise tolerance and occupation) showed significant improvement over time, across both control and SPI2 hospitals, but no additional SPI2 effect. The holistic review showed no significant changes in error rates either over time or between control and SPI2 hospitals. The explicit case note review of perioperative care showed that adherence rates for two of the four perioperative standards targeted by SPI2 were already good at baseline, exceeding 94% for antibiotic prophylaxis and 98% for deep vein thrombosis prophylaxis. Intraoperative monitoring of temperature improved over time in both groups, but not significantly (1.8, 0.4 to 7.6; P=0.279), and there were no additional effects of SPI2. A dramatic rise in consumption of soap and alcohol hand rub was similar in control and SPI2 hospitals (P=0.760 and P=0.889, respectively), as was the corresponding decrease in rates of Clostridium difficile and meticillin resistant Staphylococcus aureus infection (P=0.652 and P=0.693, respectively). Mortality rates of medical patients included in the case note reviews in control hospitals increased from 17.3% (42/243) to 21.4% (24/112), while in SPI2 hospitals they fell from 10.3% (24/233) to 6.1% (7/114) (P=0.043).
Fewer than 8% of deaths were classed as avoidable; changes in proportions could not explain the divergence of overall death rates between control and SPI2 hospitals. There was no significant difference in the rate of change in mortality in intensive care. Patients' satisfaction improved in both control and SPI2 hospitals on all dimensions, but again there were no significant changes between the two groups of hospitals. Conclusions: Many aspects of care are already good or improving across the NHS in England, suggesting considerable improvements in quality across the board. These improvements are probably due to contemporaneous policy activities relating to patient safety, including those with features similar to the SPI, and the emergence of professional consensus on some clinical processes. This phenomenon might have attenuated the incremental effect of the SPI, making it difficult to detect. Alternatively, the full impact of the SPI might be observable only in the longer term. The conclusion of this study could have been different if concurrent controls had not been used.
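The unadjusted temporal effect for, say, the pneumonia-scoring measure can be reproduced directly from the quoted counts. Note the paper's figures (7.3, 1.4 to 37.7) come from adjusted, pooled models, so a raw calculation like the sketch below (hypothetical helper, standard Wald interval on the log scale) will not match them exactly:

```python
import math

def odds_ratio_ci(a, n1, b, n2, z=2.576):   # z = 2.576 for a 99% CI
    """Unadjusted odds ratio for event rates a/n1 (after) vs b/n2 (before),
    with a Wald confidence interval computed on the log scale."""
    a2, b2 = n1 - a, n2 - b                  # non-events in each period
    or_ = (a / a2) / (b / b2)
    se = math.sqrt(1/a + 1/a2 + 1/b + 1/b2)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Pneumonia severity scoring in SPI2 hospitals: 1/50 before, 25/60 after
print(odds_ratio_ci(25, 60, 1, 50))
```

The difference-in-difference estimate reported in the abstract is then the ratio of the SPI2 temporal odds ratio to the control one, tested on the log scale.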

Relevance: 30.00%

Abstract:

Eight otherwise healthy diabetic volunteers took a daily antioxidant supplement consisting of vitamin E (200 IU), vitamin C (250 mg) and α-lipoic acid (90 mg) for a period of 6 weeks. Diabetic dapsone hydroxylamine-mediated methaemoglobin formation and resistance to erythrocytic thiol depletion were compared with those of age- and sex-matched non-diabetic subjects. At time zero, methaemoglobin formation in the non-diabetic subjects was greater at all four time points compared with that of the diabetic subjects. Resistance to glutathione depletion was initially greater in non-diabetic compared with diabetic samples. Halfway through the study (3 weeks), there were no differences between the two groups in methaemoglobin formation, and thiol depletion in the diabetic samples was now lower than in the non-diabetic samples at 10 and 20 min. At 6 weeks, diabetic erythrocytic thiol levels remained greater than those of non-diabetics. HbA1c values were significantly reduced in the diabetic subjects at 6 weeks compared with time zero values. At 10 weeks, 4 weeks after the end of supplementation, the diabetic HbA1c values had significantly increased to the point where they were not significantly different from the time zero values. Total antioxidant status (TAS) measurement indicated that diabetic plasma antioxidant capacity was significantly improved during antioxidant supplementation. Conversion of α-lipoic acid to dihydrolipoic acid (DHLA) in vivo led to potent interference in a standard fructosamine assay kit, negating its use in this study. This report suggests that triple antioxidant therapy in diabetic volunteers attenuates the in vitro experimental oxidative stress of methaemoglobin formation and reduces haemoglobin glycation in vivo. © 2003 Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.
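For readers unfamiliar with the psychometrics, the reliability of a multi-item scale of this kind is typically summarised by an internal-consistency coefficient such as Cronbach's α. A minimal sketch of that calculation (the generic formula applied to simulated data, not the study's own analysis):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for items: (n_respondents, n_items) responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated 20-item scale: responses share one latent factor plus noise
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.8, size=(200, 20))
print(round(cronbach_alpha(responses), 2))
```

Values around 0.7 or above are conventionally read as acceptable internal consistency for a new scale.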

Relevance: 30.00%

Abstract:

The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
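To make the Gage R&R ANOVA step concrete: for a crossed study with p parts, o operators and r repeats, the variance components fall out of the two-way ANOVA expected mean squares. A minimal sketch of that textbook decomposition with simulated data (not the paper's hybrid methodology itself); within such a framework, the square root of the R&R component would enter the uncertainty budget as a standard uncertainty:

```python
import numpy as np

def gauge_rr(y):
    """Crossed Gage R&R via two-way ANOVA.
    y: array (p parts, o operators, r replicates) of measurements."""
    p, o, r = y.shape
    m = y.mean()
    mp, mo, mpo = y.mean(axis=(1, 2)), y.mean(axis=(0, 2)), y.mean(axis=2)
    ss_p = o * r * ((mp - m) ** 2).sum()
    ss_o = p * r * ((mo - m) ** 2).sum()
    ss_po = r * ((mpo - mp[:, None] - mo[None, :] + m) ** 2).sum()
    ss_e = ((y - mpo[:, :, None]) ** 2).sum()
    ms_p, ms_o = ss_p / (p - 1), ss_o / (o - 1)
    ms_po, ms_e = ss_po / ((p - 1) * (o - 1)), ss_e / (p * o * (r - 1))
    v_rep = ms_e                                   # repeatability (equipment)
    v_po = max(0.0, (ms_po - ms_e) / r)            # part-by-operator interaction
    v_opr = max(0.0, (ms_o - ms_po) / (p * r))     # reproducibility (operator)
    v_part = max(0.0, (ms_p - ms_po) / (o * r))    # part-to-part variation
    return {"repeatability": v_rep, "reproducibility": v_opr,
            "interaction": v_po, "part": v_part,
            "GRR": v_rep + v_opr + v_po}

# Hypothetical in-line data: 10 parts x 3 operators x 2 repeats
rng = np.random.default_rng(2)
y = (rng.normal(size=(10, 1, 1)) + 0.1 * rng.normal(size=(1, 3, 1))
     + 0.05 * rng.normal(size=(10, 3, 2)))
print(gauge_rr(y))
```

The point of the hybrid approach is that these components, once estimated from routine in-line data, can sit alongside the other uncertainty contributors (calibration, environment, drift) rather than being reported as a stand-alone %GRR figure.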

Relevance: 30.00%

Abstract:

Evaluating the reliability of a manufacturing process is a crucial task in the product development process. Process reliability is a measure of the production capability of a reconfigurable manufacturing system (RMS): it serves as an integrated performance indicator of the production process under specified technical constraints, including time, cost and quality. An integrated framework for evaluating manufacturing process reliability alongside the product development process is presented. A mathematical model and an algorithm based on the universal generating function (UGF) are developed for calculating the reliability of the manufacturing process with respect to task intensity and process capacity, which are both independent random variables. Rework strategies for the RMS are then analysed under different task intensities on the basis of process reliability, and the optimisation of rework strategies based on process reliability is discussed.
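A sketch of the u-function idea the model builds on may help: each stage's capacity distribution is a u-function $u(z) = \sum_k p_k z^{g_k}$, serial stages compose with a min operator, and process reliability is $P(\text{capacity} \ge \text{task intensity})$. The composition operator and the numbers below are illustrative, not taken from the paper:

```python
from collections import defaultdict
from itertools import product

def compose(*ufuncs, op=min):
    """Compose u-functions (dicts: capacity -> probability); the capacity
    of a series of stages is the minimum of the stage capacities."""
    out = defaultdict(float)
    for states in product(*(u.items() for u in ufuncs)):
        caps, probs = zip(*states)
        prob = 1.0
        for q in probs:
            prob *= q
        out[op(caps)] += prob
    return dict(out)

def reliability(u_capacity, u_demand):
    """P(process capacity >= task intensity), both discrete and random."""
    return sum(pc * pd
               for c, pc in u_capacity.items()
               for d, pd in u_demand.items() if c >= d)

# Illustrative two-stage RMS, capacities in parts per hour
stage1 = {100: 0.9, 60: 0.1}
stage2 = {90: 0.8, 50: 0.2}
demand = {80: 0.7, 55: 0.3}
print(reliability(compose(stage1, stage2), demand))
```

Rework enters such a model by modifying the stage u-functions (effective capacity falls as rework load rises), which is what makes reliability a natural objective for choosing among rework strategies.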

Relevance: 30.00%

Abstract:

Indicators are widely used by organizations as a way of evaluating, measuring and classifying organizational performance. As part of performance evaluation systems, indicators are often shared or compared across internal sectors or with other organizations. However, indicators can be vague and imprecise, and can also lack semantics, making comparisons with other indicators difficult. This paper therefore presents a knowledge model based on an ontology that can be used to represent indicators semantically and generically, dealing with their imprecision and vagueness and thus facilitating better comparison. Semantic technologies are shown to be suitable for this solution, as they are able to represent the complex data involved in indicator comparison.
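To illustrate the kind of representation proposed, an indicator can be published as an ontology individual carrying its semantics (label, unit, measured concept) alongside its value. A minimal sketch with rdflib; the namespace and property names are hypothetical, invented for illustration rather than taken from the paper's ontology:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS, XSD

IND = Namespace("http://example.org/indicator#")   # hypothetical vocabulary

g = Graph()
g.bind("ind", IND)

# Declare the class and one semantically described indicator instance
g.add((IND.Indicator, RDF.type, RDFS.Class))
g.add((IND.avgRepairTime, RDF.type, IND.Indicator))
g.add((IND.avgRepairTime, RDFS.label, Literal("Average repair time")))
g.add((IND.avgRepairTime, IND.unit, Literal("hours")))
g.add((IND.avgRepairTime, IND.measures, Literal("maintenance performance")))
g.add((IND.avgRepairTime, IND.value, Literal(4.2, datatype=XSD.decimal)))

print(g.serialize(format="turtle"))
```

Because unit and measured concept travel with the value, two organizations' indicators can be matched on their semantics before their numbers are compared, which is the comparison problem the paper targets.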
