943 results for Quality models


Relevance: 30.00%

Abstract:

The purpose of this Master's thesis was to create and develop two customer satisfaction models for initiating and implementing customer satisfaction measurement in the target company. The work is based on an analysis of the current satisfaction processes and on the theoretical part of the thesis, which discusses in detail the issues that should be taken into account in customer satisfaction measurement and in the measurement process. The purpose of the models created in this work is to help the target company make better use of customer satisfaction measurement results, both in its business and among its customers. One objective of the work was also to find and recommend a suitable measurement tool for the target company. Based on the theory and the analysis, both customer satisfaction models were created to meet the needs of the target units. Once the external aspects, such as measurement methods, measurement instruments, questionnaires and respondent groups, had been defined, the focus turned to analysing and exploiting the results, which is emphasised in a customer-oriented organisation. The work also considered the significance and advantages of a unified customer satisfaction process in the target company.

Relevance: 30.00%

Abstract:

The importance of software to modern society is growing continuously. Many software projects struggle with staying on schedule, maintaining high productivity and achieving sufficiently high quality. Large investments have been made in software process improvement in order to minimise these problems. These investments have been motivated by the assumption that the capability of the software development process directly determines product quality. The purpose of this study was to examine the possibilities of software process improvement. Existing models, techniques and methodologies for software development and software process improvement were presented, their applicability was analysed, and a recommendation on their use was given.

Relevance: 30.00%

Abstract:

Healthcare accreditation models generally include indicators related to healthcare employees' perceptions (e.g. satisfaction, career development, and health safety). During the accreditation process, organizations are asked to demonstrate the methods with which assessments are made. However, none of the models provides a standardized system for the assessment of employees. In this study, we analyzed the psychometric properties of an instrument for the assessment of nurses' perceptions as indicators of human capital quality in healthcare organizations. The Human Capital Questionnaire was applied to a sample of 902 nurses in four European countries (Spain, Portugal, Poland, and the UK). Exploratory factor analysis identified six factors: satisfaction with leadership, identification and commitment, satisfaction with participation, staff well-being, career development opportunities, and motivation. The results showed the validity and reliability of the questionnaire, which, when applied in healthcare organizations, provides a better understanding of nurses' perceptions and is a parsimonious instrument for assessment and organizational accreditation. From a practical point of view, improving the quality of human capital, by analyzing nurses' and other healthcare employees' perceptions, is related to workforce empowerment.
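
The factor-analytic workflow described here can be illustrated with generic tooling. The sketch below is not the study's code; the file name, item columns and per-factor item blocks are assumptions. It runs an exploratory factor analysis with varimax rotation and shows one common reliability check (Cronbach's alpha) for the items of a single factor.

```python
# Hypothetical sketch of the analysis type described above (not the study's code).
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("hcq_responses.csv")       # assumed: one column per questionnaire item
X = (items - items.mean()) / items.std()       # standardize the item responses

fa = FactorAnalysis(n_components=6, rotation="varimax").fit(X)
loadings = pd.DataFrame(fa.components_.T, index=items.columns)
print(loadings.round(2))                       # which items load on which of the six factors

def cronbach_alpha(block: pd.DataFrame) -> float:
    """Classical reliability estimate for the items forming one factor."""
    k = block.shape[1]
    item_var = block.var(axis=0, ddof=1).sum()
    total_var = block.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```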

Relevance: 30.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods make use of approximate flow simulation to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning from a subset of realizations the relationship between proxy and exact curves.
In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty. The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
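
The error-model construction lends itself to a compact sketch. The following is a simplified stand-in, not the thesis implementation: curves are assumed to be discretized on a common time grid, ordinary PCA on the discretized curves approximates FPCA, and a linear map is learned from proxy to exact principal-component scores on the training subset.

```python
# Simplified sketch of a functional error model (assumptions noted above).
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def build_error_model(proxy_sub, exact_sub, n_pc=3):
    """proxy_sub, exact_sub: (n_train, n_times) responses for the training subset."""
    pca_p = PCA(n_pc).fit(proxy_sub)           # stand-in for FPCA on proxy curves
    pca_e = PCA(n_pc).fit(exact_sub)           # stand-in for FPCA on exact curves
    reg = LinearRegression().fit(pca_p.transform(proxy_sub),
                                 pca_e.transform(exact_sub))
    return pca_p, pca_e, reg

def predict_exact(proxy_all, pca_p, pca_e, reg):
    """Correct every approximate response to predict the 'expected' exact curve."""
    scores = reg.predict(pca_p.transform(proxy_all))
    return pca_e.inverse_transform(scores)
```

In a two-stage MCMC setting, a function like predict_exact would supply the cheap preliminary evaluation used to accept or reject a proposal before the exact flow model is run.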

Relevance: 30.00%

Abstract:

BACKGROUND: quality of life (QoL) is a subjective perception whose components may vary in importance between individuals. Little is known about which domains of QoL older people deem most important. OBJECTIVE: this study investigated, in community-dwelling older people, the relationships between the importance given to the domains defining their QoL and socioeconomic, demographic and health status. METHODS: data were compiled from older people enrolled in the Lc65+ cohort study and two additional, population-based, stratified random samples (n = 5,300). Principal components analysis (PCA) was used to determine the underlying domains among 28 items that participants defined as important to their QoL. The components extracted were used as dependent variables in multiple linear regression models to explore their associations with socioeconomic, demographic and health status. RESULTS: PCA identified seven domains that older persons considered important to their QoL, in order of importance (highest to lowest): feeling of safety, health and mobility, autonomy, close entourage, material resources, esteem and recognition, and social and cultural life. Six and five of these domains of importance were significantly associated with education and depressive symptoms, respectively. The importance of material resources was significantly associated with a good financial situation (β = 0.16, P = 0.011), as was close entourage with living with others (β = 0.20, P = 0.007) and health and mobility with age (β = -0.16, P = 0.014). CONCLUSION: the importance older people give to the domains of their QoL appears strongly related to their actual resources and experienced losses. These findings may help clinicians, researchers and policy makers better adapt strategies to individuals' needs.
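
As a rough illustration of the two-step analysis (PCA on the importance items, then linear regression of component scores on respondent characteristics), here is a hypothetical sketch; the file and variable names are invented, and the component ordering is an assumption.

```python
# Hypothetical sketch of the PCA-then-regression pipeline (not the study's code).
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.decomposition import PCA

df = pd.read_csv("lc65_importance.csv")        # hypothetical data file
items = df.filter(like="item_")                # the 28 QoL-importance items (assumed naming)
df["material_resources"] = PCA(n_components=7).fit_transform(items)[:, 4]  # assumed ordering

fit = smf.ols("material_resources ~ good_finances + age + lives_alone", data=df).fit()
print(fit.params)                              # e.g. the association with financial situation
```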

Relevance: 30.00%

Abstract:

Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and the quality of such methods can therefore depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always yield biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. Comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased while the same level of accuracy was maintained. To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
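
A generic positional conservation score of the kind discussed here can be computed from per-column residue frequencies. The entropy-based score below is a common textbook measure, not the specific scores developed in the thesis.

```python
# Minimal sketch: per-column Shannon-entropy conservation (a generic measure).
import math
from collections import Counter

def positional_conservation(column, alphabet_size=20):
    """1.0 = fully conserved column, 0.0 = maximally variable (gaps ignored)."""
    counts = Counter(residue for residue in column if residue != "-")
    n = sum(counts.values())
    if n == 0:
        return 0.0
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - entropy / math.log2(alphabet_size)

alignment = ["MKLV-", "MKIV-", "MRLVA"]        # toy alignment, one string per sequence
scores = [positional_conservation(col) for col in zip(*alignment)]
```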

Relevance: 30.00%

Abstract:

This paper describes high-quality journals in Brazil and Spain, with an emphasis on the distribution models used. It presents their general characteristics (age, type of publisher, and theme) and analyzes the distribution model by studying the type of format (print or digital), the type of access (open access or subscription), and the technology platform used. The 549 journals analyzed (249 in Brazil and 300 in Spain) are included in the 2011 Web of Science (WoS) and Scopus databases. Data on each journal were collected directly from their websites between March and October 2012. Brazil has an almost fully open access distribution model (97%) in which few journals require payment by authors, thanks to the cultural, financial, operational, and technological support provided by public agencies. In Spain, open access journals account for 55% of the total and have also received support from public agencies, although to a lesser extent. These results show that there are systems of support for open access in scientific journals other than the "author pays" model advocated by the Finch report for the United Kingdom.

Relevance: 30.00%

Abstract:

In this article, the results of a modified SERVQUAL questionnaire (Parasuraman et al., 1991) are reported. The modifications consisted in substituting questionnaire items particularly suited to a specific service (banking) and context (county of Girona, Spain) for the original, rather general and abstract items. These modifications led to more interpretable factors which accounted for a higher percentage of item variance. The data were submitted to various structural equation models, which made it possible to conclude that the questionnaire contains items with a high measurement quality with respect to five identified dimensions of service quality, which differ from those specified by Parasuraman et al. and are specific to the banking service. The two dimensions relating to the behaviour of employees have the greatest predictive power on overall quality and satisfaction ratings, which enables managers to use a low-cost reduced version of the questionnaire to monitor quality on a regular basis. It was also found that satisfaction and overall quality were perfectly correlated, showing that customers do not perceive these concepts as distinct.
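
The reduced monitoring instrument suggested at the end can be mimicked with a plain regression of overall quality on the two employee-behaviour dimension scores. This toy sketch is not the article's structural equation model, and all column names are invented.

```python
# Toy illustration only: regression stand-in for the reduced SERVQUAL instrument.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("servqual_girona.csv")        # hypothetical factor-score data
fit = smf.ols("overall_quality ~ employee_behaviour_1 + employee_behaviour_2",
              data=df).fit()
print(fit.params, fit.rsquared)                # predictive power of the two dimensions
```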

Relevance: 30.00%

Abstract:

The objective of this paper was to show the potential additional insight that results from adding greenhouse gas (GHG) emissions to plant performance evaluation criteria, such as the effluent quality (EQI) and operational cost (OCI) indices, when evaluating (plant-wide) control/operational strategies in wastewater treatment plants (WWTPs). The proposed GHG evaluation is based on a set of comprehensive dynamic models that estimate the most significant potential on-site and off-site sources of CO2, CH4 and N2O. The study calculates and discusses the changes in EQI, OCI and the emission of GHGs as a consequence of varying the following four process variables: (i) the set point of aeration control in the activated sludge section; (ii) the removal efficiency of total suspended solids (TSS) in the primary clarifier; (iii) the temperature in the anaerobic digester; and (iv) the control of the flow of anaerobic digester supernatants coming from sludge treatment. Based upon the assumptions built into the model structures, simulation results highlight the potential undesirable effects of increased GHG production when carrying out local energy optimization of the aeration system in the activated sludge section and energy recovery from the anaerobic digester. Although off-site CO2 emissions may decrease, the effect is counterbalanced by increased N2O emissions, especially since N2O has a roughly 300-fold stronger greenhouse effect than CO2. The reported results emphasize the importance and usefulness of using multiple evaluation criteria to compare and evaluate (plant-wide) control strategies in a WWTP for more informed operational decision making.
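
The scenario sweep over the four process variables can be pictured as a simple evaluation loop. The sketch below is schematic: run_plant_model stands in for a plant-wide simulator (e.g. a BSM2-type model extended with CO2/CH4/N2O source terms), and all set-point values are chosen for illustration only.

```python
# Schematic scenario sweep; run_plant_model is a placeholder, not a real simulator.
from itertools import product

def run_plant_model(do, tss, ad_temp, reject_ctrl):
    # Stand-in for the plant-wide simulator; would return the effluent
    # quality index, operational cost index and total GHG emissions.
    return 0.0, 0.0, 0.0

do_setpoints   = [1.0, 1.5, 2.0]     # aeration DO set point [g O2/m3] (illustrative)
tss_removal    = [0.4, 0.5, 0.6]     # primary clarifier TSS removal efficiency
ad_temps       = [33.0, 35.0, 37.0]  # anaerobic digester temperature [degC]
reject_control = [False, True]       # controlled return of digester supernatants

results = []
for do, tss, temp, ctrl in product(do_setpoints, tss_removal, ad_temps, reject_control):
    eqi, oci, ghg = run_plant_model(do, tss, temp, ctrl)
    results.append({"DO": do, "TSS": tss, "T_AD": temp, "ctrl": ctrl,
                    "EQI": eqi, "OCI": oci, "GHG": ghg})
```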

Relevance: 30.00%

Abstract:

BACKGROUND: Population aging is closely related to the high prevalence of chronic conditions in developed countries. In this context, health care policies aim to increase life span cost-effectively while maintaining quality of life and functional ability. There is still, however, a need for further understanding of how chronic conditions affect these health aspects. The aim of this paper is to assess the individual and combined impact of chronic physical and mental conditions on quality of life and disability in Spain, and secondly to show gender trends. METHODS: Cross-sectional data were collected from the COURAGE study. A total of 3,625 participants over 50 years old from Spain were included. Crude and adjusted multiple linear regressions were conducted to detect associations between individual chronic conditions and disability, and between chronic conditions and quality of life. Separate models were used to assess the influence of the number of diseases on the same variables. Additional analogous regressions were performed for males and females. RESULTS: All chronic conditions except hypertension were statistically associated with poor results in quality of life and disability. Depression, anxiety and stroke were found to have the greatest impact on outcomes. The number of chronic conditions was associated with substantially lower quality of life [β for 4+ diseases: -18.10 (-20.95, -15.25)] and greater disability [β for 4+ diseases: 27.64 (24.99, 30.29)]. In general, women suffered from higher rates of multimorbidity and poorer results in quality of life and disability. CONCLUSIONS: Chronic conditions impact greatly on quality of life and disability in the older Spanish population, especially when co-occurring diseases are added. Multimorbidity considerations should be a priority in the development of future health policies focused on quality of life and disability. Further studies would benefit from an expanded selection of diseases. Policies should also deal with gender idiosyncrasy in certain cases.
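
The adjusted models reporting a β for "4+ diseases" correspond to regressions with a categorical disease count. The hypothetical sketch below illustrates the form of such a model; the COURAGE variable names are not reproduced here, so all column names are assumptions.

```python
# Hypothetical sketch of an adjusted regression with a categorical disease count.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("courage_spain.csv")          # hypothetical file and columns
df["n_conditions"] = pd.cut(df["disease_count"], [-1, 0, 1, 2, 3, 99],
                            labels=["0", "1", "2", "3", "4+"])
fit = smf.ols("quality_of_life ~ C(n_conditions) + age + sex + education",
              data=df).fit()
print(fit.params.filter(like="n_conditions"))  # e.g. the coefficient for "4+"
```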

Relevance: 30.00%

Abstract:

The analysis of efficiency and productivity in banking has received a great deal of attention for almost three decades now. However, most of the literature to date has not explicitly accounted for risk when measuring efficiency. We propose an analysis of profit efficiency taking into account how the inclusion of a variety of bank risk measures might bias efficiency scores. Our measures of risk are partly inspired by the literature on earnings management and earnings quality, keeping in mind that loan loss provisions, as a generally accepted proxy for risk, can be adjusted to manage earnings and regulatory capital. We also consider some variants of traditional models of profit efficiency where different regimes are stipulated so that financial institutions can be evaluated in different dimensions—i.e., prices, quantities, or prices and quantities simultaneously. We perform this analysis on the Spanish banking industry, whose institutions have been deeply affected by the current international financial crisis, and where re-regulation is taking place. Our results can be explored in multiple dimensions but, in general, they indicate that the impact of earnings management on profit efficiency is of less magnitude than what might a priori be expected, and that on the whole, savings banks have performed less well than commercial banks. However, savings banks are adapting to the new regulatory scenario and rapidly catching up with commercial banks, especially in some dimensions of performance.

Relevance: 30.00%

Abstract:

Electricity distribution network operation (NO) models are challenged as they are expected to continue to undergo changes during the coming decades in the fairly developed and regulated Nordic electricity market. Network asset managers are to adapt to competitive techno-economical business models regarding the operation of increasingly intelligent distribution networks. Factors driving the changes toward new business models within network operation include: increased investments in distributed automation (DA), regulative frameworks for annual profit limits and quality through outage cost, increasing end-customer demands, climatic changes and increasing use of data system tools, such as the Distribution Management System (DMS). The doctoral thesis addresses the questions of a) whether there exist conditions and qualifications for competitive markets within electricity distribution network operation and b) if so, the identification of limitations and required business mechanisms. This doctoral thesis aims to provide an analytical business framework, primarily for electric utilities, for the evaluation and development of dedicated network operation models to meet future market dynamics within network operation. In the thesis, the generic build-up of a business model has been addressed through the use of the strategic business hierarchy levels of mission, vision and strategy for defining the strategic direction of the business, followed by the planning, management and process execution levels of enterprise strategy execution. Research questions within electricity distribution network operation are addressed at the specified hierarchy levels. The results of the research represent interdisciplinary findings in the areas of electrical engineering and production economics. The main scientific contributions include further development of extended transaction cost economics (TCE) for governance decisions within electricity networks and validation of the usability of the methodology for the electricity distribution industry. Moreover, the DMS benefit evaluations in the thesis, based on outage cost calculations, propose theoretical maximum benefits of DMS applications equalling roughly 25% of the annual outage costs and 10% of the respective operative costs in the case electric utility. Hence, the annual measurable theoretical benefits from the use of DMS applications are considerable. The theoretical results in the thesis are generally validated by surveys and questionnaires.
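
The quoted DMS benefit figures translate into a simple back-of-the-envelope calculation. The cost inputs below are illustrative assumptions, not the case utility's numbers, and the two percentages are read as the same theoretical maximum benefit expressed against two different baselines.

```python
# Illustrative reading of the reported DMS benefit ceilings (inputs assumed).
annual_outage_cost    = 2_000_000   # EUR/year, illustrative assumption
annual_operative_cost = 5_000_000   # EUR/year, illustrative assumption

print(0.25 * annual_outage_cost)     # benefit ceiling expressed vs. outage costs
print(0.10 * annual_operative_cost)  # the same ceiling expressed vs. operative costs
```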

Relevance: 30.00%

Abstract:

Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, be easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable monetary loss or the endangerment of lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish certain techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to leverage the quality of a system via metrics and measurements, as well as generic refinement patterns applied to models and specifications. We argue that these can facilitate the process of creating software systems by, e.g., controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and a strategy towards quality improvement.

Relevance: 30.00%

Abstract:

Quality is not only free but it can be a profit maker. Every dollar that is not spent on doing things wrong becomes a dollar right on the bottom line. The main objective of this thesis is to give an answer to how the cost of poor quality can be measured in a theoretically correct way. Different calculation methods for the cost of poor quality are presented and discussed in order to give a comprehensive picture of the measurement process. The second objective is to utilise the knowledge from the literature review and apply it when creating a method for measuring the cost of poor quality in supplier performance rating. The literature review indicates that the P-A-F model together with the ABC methodology provides a means for quality cost calculation. These models answer what should be measured and how the measurement should be carried out. However, when product or service quality costs are incurred because a quality characteristic deviates from its target value, the quality loss function (QLF) seems to be the most appropriate methodology for quality cost calculation. These methodologies were applied when creating a quality cost calculation method for supplier performance ratings.
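
The QLF referenced above is usually Taguchi's quadratic loss L(y) = k(y - T)^2, with k calibrated from the cost of a deviation at the tolerance limit. A minimal sketch, with illustrative numbers:

```python
def quality_loss(y: float, target: float, a0: float, d0: float) -> float:
    """Taguchi loss L(y) = k (y - T)^2 with k = A0 / d0^2, where A0 is the
    cost incurred when the characteristic deviates by d0 from target T."""
    k = a0 / d0 ** 2
    return k * (y - target) ** 2

# Assumed example: rework costs 12 EUR at a 0.5 mm deviation; a unit that
# is 0.2 mm off target then carries an expected loss of 1.92 EUR.
print(quality_loss(10.2, target=10.0, a0=12.0, d0=0.5))
```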

Relevance: 30.00%

Abstract:

The aim of this study was to compare hydrographically conditioned digital elevation models (HCDEMs) generated from data of the VNIR (Visible Near Infrared) sensor of ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), from SRTM (Shuttle Radar Topography Mission), and from IBGE topographic maps at a scale of 1:50,000, processed in a Geographic Information System (GIS), aiming at the morphometric characterization of watersheds. The São Bartolomeu River sub-basin was taken as the study case, and its morphometric characteristics were obtained from the HCDEMs. Root Mean Square Error (RMSE) and cross-validation were the statistical indices used to evaluate the quality of the HCDEMs. The percentage differences in the morphometric parameters obtained from the three data sets were less than 10%, except for the mean slope (21%). In general, good agreement was observed between the HCDEMs generated from remote sensing data and from the IBGE maps. The result for the ASTER HCDEM was slightly better than that for the SRTM HCDEM. The ASTER HCDEM was more accurate than the SRTM HCDEM in basins with high altitudes and rugged terrain, presenting an altimetry frequency distribution closest to the IBGE HCDEM, considered the standard in this study.
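
The RMSE used to grade the HCDEMs reduces to a one-line comparison of co-registered elevation grids. A minimal numpy sketch, assuming the grids have been resampled to a common resolution and NaN marks nodata cells:

```python
import numpy as np

def dem_rmse(dem_test: np.ndarray, dem_ref: np.ndarray) -> float:
    """RMSE of a test DEM against the reference (here, the IBGE HCDEM)."""
    return float(np.sqrt(np.nanmean((dem_test - dem_ref) ** 2)))

# dem_aster, dem_srtm and dem_ibge would be co-registered elevation arrays,
# e.g. read from GeoTIFFs (hypothetical file names) with a raster library.
```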