882 results for the least squares distance method


Relevância:

100.00%

Publicador:

Resumo:

The international literature on the factors driving related-party transactions is concentrated in the United Kingdom, the United States and Asia, and Brazil remains a little-investigated setting. This study investigates both the determinants of related-party contracts and the impact of those transactions on the performance of Brazilian firms. Recent studies of the determinants of related-party transactions (RPTs), and of their effects on firm performance, have drawn on the two perspectives set out by Gordon, Henry and Palia (2004): (a) the conflict-of-interest view, under which RPTs harm minority shareholders by enabling the controlling (majority) shareholders to expropriate their wealth; and (b) the efficient-transaction view, under which RPTs can benefit firms by serving their underlying economic objectives. This study adopts the conflict-of-interest perspective, grounded in agency theory and in the fact that the Brazilian setting is characterized by concentrated ownership structures and by an emerging-market legal environment with weak protection of minority shareholders. The research was operationalized on an initial sample of 70 companies with shares listed on the BM&FBovespa, observed over the period 2010 to 2012. Related-party contracts were identified and quantified in two ways, following the methodologies of Kohlbeck and Mayhew (2004; 2010) and Silveira, Prado and Sasso (2009). The main determinants investigated were proxies capturing the effects of corporate governance mechanisms and the legal environment, firm performance, divergences between control rights and cash-flow rights, and excess executive compensation. Control variables were also added to isolate intrinsic firm characteristics. In the econometric analyses, models were estimated by Poisson, pooled cross-section (pooled OLS) and logit methods. Estimation was carried out by ordinary least squares (OLS) and, to increase the robustness of the econometric estimates, instrumental variables estimated by the generalized method of moments (GMM) were used. The evidence indicates that the factors investigated affect the various RPT measures of the sampled firms differently. Related-party contracts were found, in general, to be harmful to firms, affecting their performance negatively; this performance is improved by the presence of effective corporate governance mechanisms. The results for the impact of governance measures and intrinsic firm characteristics on firm performance are robust to endogeneity, based on the instrumental-variable regressions.
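The instrumental-variables step described above can be sketched numerically. The toy model below uses simulated data and a single illustrative instrument (none of the variables correspond to the study's actual proxies); it shows why plain OLS is biased when a regressor is endogenous and how a two-stage least-squares (2SLS) projection recovers the true coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated endogeneity: the structural error u is correlated with x,
# so plain OLS is biased; the instrument z shifts x but affects y only
# through x. All coefficients here are illustrative.
z = rng.normal(size=n)                       # instrument
u = rng.normal(size=n)                       # structural error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # endogenous regressor
y = 1.0 + 2.0 * x + u                        # true slope = 2.0

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

# OLS: beta = (X'X)^{-1} X'y -- biased here because cov(x, u) != 0
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# 2SLS: project X onto the instrument set, then regress y on the projection
Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_2sls = np.linalg.lstsq(Xhat, y, rcond=None)[0]

print(beta_ols[1], beta_2sls[1])
```

With exact identification and homoskedastic errors, the GMM estimator used in the study reduces, roughly, to this 2SLS projection; GMM generalizes it by optimally weighting several moment conditions.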


This dissertation investigates the effects of the installation and the characteristics of the Fiscal Council (Conselho Fiscal) and the Audit Committee on the quality of accounting information in Brazil. The characteristics studied were the independence and the qualifications of their members. The proxies for accounting information quality were value relevance, timeliness and conditional conservatism. The sample comprised Brazilian companies listed on the São Paulo Stock, Commodities and Futures Exchange (BM&FBovespa), with annual liquidity above 0.001, over the period 2010 to 2013. Data were collected from the Comdinheiro database and from the companies' Reference Forms (Formulários de Referência), available on the websites of the Brazilian Securities Commission (CVM) and the BM&FBovespa. The information-quality models were adapted to the methodological design and estimated by ordinary least squares (OLS), with robust standard errors clustered by firm. The results revealed effects of the installation of the bodies analyzed on the proxies for accounting information quality. Installation of the Fiscal Council had a positive impact on the value relevance of book value, while installation of the Audit Committee had a positive impact on the value relevance of earnings. These results may indicate a difference in the focus of these bodies: protecting the entity's assets on behalf of shareholders (Fiscal Council) versus ensuring more reliable numbers on management performance (Audit Committee). In parallel, the results for permanent installation of the Fiscal Council indicated the strength of this body as a control mechanism, compared with installation only at shareholders' request; implementation of the "Conselho Fiscal Turbinado" (enhanced Fiscal Council), by contrast, proved ineffective in controlling accounting information quality. In the analysis of member characteristics, the independence of Audit Committee members affected the value relevance of earnings, while the independence of the Fiscal Council affected the value relevance of book value and conditional conservatism (timely recognition of economic losses). These associations were stronger when Fiscal Council members were independent of the controlling shareholders. In the analysis of member qualifications, positive evidence was found of a relationship between the value relevance of book value and a higher proportion of Fiscal Council members qualified in business fields (Accounting, Management and Economics). Conditional conservatism increased as the qualifications of Fiscal Council members converged toward Accounting. The results for Audit Committee member qualifications showed value relevance of earnings in the presence of at least one accountant and with a higher proportion of members qualified in Accounting or in business fields, and this was stronger as the qualifications of Audit Committee members converged toward Accounting.


A fast and direct surface plasmon resonance (SPR) method for the kinetic analysis of the interactions between peptide antigens and immobilised monoclonal antibodies (mAb) has been established. Protocols have been developed to overcome the problems posed by the small size of the analytes (< 1600 Da). The interactions were well described by a simple 1:1 bimolecular interaction and the rate constants were self-consistent and reproducible. The key features for the accuracy of the kinetic constants measured were high buffer flow rates, medium antibody surface densities and high peptide concentrations. The method was applied to an extensive analysis of over 40 peptide analogues towards two distinct anti-FMDV antibodies, providing data in total agreement with previous competition ELISA experiments. Eleven linear 15-residue synthetic peptides, reproducing all possible combinations of the four replacements found in foot-and-mouth disease virus (FMDV) field isolate C-S30, were evaluated. The direct kinetic SPR analysis of the interactions between these peptides and three anti-site A mAbs suggested additivity in all combinations of the four relevant mutations, which was confirmed by parallel ELISA analysis. The four-point mutant peptide (A15S30) reproducing site A from the C-S30 strain was the least antigenic of the set, in disagreement with previously reported studies with the virus isolate. Increasing peptide size from 15 to 21 residues did not significantly improve antigenicity. Overnight incubation of A15S30 with mAb 4C4 in solution showed a marked increase in peptide antigenicity not observed for other peptide analogues, suggesting that conformational rearrangement could lead to a stable peptide-antibody complex. In fact, peptide cyclization clearly improved antigenicity, confirming an antigenic reversion in a multiply substituted peptide. 
Solution NMR studies of both linear and cyclic versions of the antigenic loop of FMDV C-S30 showed that structural features previously correlated with antigenicity were more pronounced in the cyclic peptide. Twenty-six synthetic peptides, corresponding to all possible combinations of five single-point antigenicity-enhancing replacements in the GH loop of FMDV C-S8c1, were also studied. SPR kinetic screening of these peptides was not possible due to problems mainly related to the high mAb affinities displayed by these synthetic antigens. Solution affinity SPR analysis was employed and the affinities displayed were generally comparable to or even higher than those corresponding to the C-S8c1 reference peptide A15. The NMR characterisation of one of these multiple mutants in solution showed that it had a conformational behaviour quite similar to that of the native sequence A15, and the X-ray diffraction crystallographic analysis of the peptide–mAb 4C4 complex showed paratope–epitope interactions identical to all FMDV peptide–mAb complexes studied so far. Key residues for these interactions are those directly involved in epitope–paratope contacts (141Arg, 143Asp, 146His) as well as residues able to stabilise a particular peptide global folding. A quasi-cyclic conformation is held up by a hydrophobic cavity defined by residues 138, 144 and 147 and by other key intrapeptide hydrogen bonds, delineating an open turn at positions 141, 142 and 143 (corresponding to the Arg-Gly-Asp motif).
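The simple 1:1 bimolecular interaction model underlying the SPR kinetic analysis can be written as dR/dt = ka·C·(Rmax − R) − kd·R, where R is the sensor response, C the analyte concentration, and ka, kd the association and dissociation rate constants. A minimal forward simulation, with illustrative parameter values rather than fitted constants from the study:

```python
import numpy as np

# Illustrative (not fitted) kinetic parameters for a 1:1 binding model
ka, kd = 1e5, 1e-2        # association (1/(M*s)) and dissociation (1/s)
C = 1e-7                  # analyte concentration (M)
Rmax = 100.0              # surface binding capacity (response units)

# Euler integration of dR/dt = ka*C*(Rmax - R) - kd*R over 600 s
dt, steps = 0.01, 60000
R = 0.0
for _ in range(steps):
    R += dt * (ka * C * (Rmax - R) - kd * R)

# Steady-state response predicted analytically by the same model
R_eq = ka * C * Rmax / (ka * C + kd)
print(R, R_eq)
```

Fitting ka and kd to measured sensorgrams at several concentrations is what yields the self-consistent rate constants the abstract refers to.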


Copyright: © 2014 Aranda et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Background information: The incorporation of distance learning activities by institutions of higher education is considered an important contribution to creating new opportunities for teaching, in both initial and continuing training. In Medicine and Nursing, papers illustrating the adaptation of technological components and teaching methods are prolific; in the Pharmaceutical Education area, however, examples are scarce. In that sense, this project demonstrates the implementation and assessment of a b-learning strategy for Therapeutics using a “case based learning” approach. Setting: Academic Pharmacy. Methods: This is an exploratory study involving 2nd-year students of the Pharmacy Degree at the School of Allied Health Sciences of Oporto. The study population consists of 61 students, divided into groups of 3-4 elements. The b-learning model was implemented over a period of 8 weeks. Results: A b-learning environment and digital learning objects were successfully created and implemented. Collaboration and assessment techniques were carefully developed to ensure the active participation and fair assessment of all students. Moodle records show consistent student activity during the assignments. E-portfolios were also developed using Wikispaces, which promoted reflective writing and clinical reasoning. Conclusions: Our exploratory study suggests that the “case based learning” method can be successfully combined with technological components to create and maintain a feasible online learning environment for the teaching of therapeutics.


OBJECTIVE To evaluate the cross-cultural validity of the Demand-Control Questionnaire, comparing the original Swedish questionnaire with the Brazilian version. METHODS We compared data from 362 Swedish and 399 Brazilian health workers. Confirmatory and exploratory factor analyses were performed to test structural validity, using the robust weighted least squares mean- and variance-adjusted (WLSMV) estimator. Construct validity was evaluated through hypothesis testing, by inspecting the mean score distribution of the scale dimensions according to sociodemographic variables and social support at work. RESULTS The confirmatory and exploratory factor analyses supported the instrument in three dimensions (for both Swedes and Brazilians): psychological demands, skill discretion and decision authority. The best-fit model was achieved by including an error correlation between "work fast" and "work intensely" (psychological demands) and removing the item "repetitive work" (skill discretion). Hypothesis testing showed that workers with a university degree had higher scores on skill discretion and decision authority, and that those with high levels of social support at work had lower scores on psychological demands and higher scores on decision authority. CONCLUSIONS The results supported equivalent dimensional structures across the two culturally different work contexts. Skill discretion and decision authority formed two distinct dimensions, and the item "repetitive work" should be removed.


BACKGROUND: While the pharmaceutical industry keeps an eye on plasmid DNA production for new-generation gene therapies, real-time monitoring techniques for plasmid bioproduction are as yet unavailable. This work shows the possibility of in situ monitoring of plasmid production in Escherichia coli cultures using a near-infrared (NIR) fiber optic probe. RESULTS: Partial least squares (PLS) regression models based on the NIR spectra were developed for predicting bioprocess critical variables such as the concentrations of biomass, plasmid, carbon sources (glucose and glycerol) and acetate. In order to achieve robust models able to predict the performance of plasmid production processes independently of the composition of the cultivation medium, the cultivation strategy (batch versus fed-batch) and the E. coli strain used, three strategies were adopted, using: (i) E. coli DH5 cultures conducted under different media compositions and culture strategies (batch and fed-batch); (ii) engineered E. coli strains, MG1655 endA recA pgi and MG1655 endA recA, grown on the same medium and culture strategy; (iii) diverse E. coli strains, over batch and fed-batch cultivations and using different media compositions. PLS models showed high accuracy for predicting all variables in the three groups of cultures. CONCLUSION: NIR spectroscopy combined with PLS modeling provides a fast, inexpensive and contamination-free technique for accurately monitoring plasmid bioprocesses in real time, independently of the medium composition, cultivation strategy and E. coli strain used.
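The kind of PLS calibration described above can be sketched in a few lines. The fragment below implements PLS1 via the classical NIPALS recursion and applies it to synthetic "spectra"; the data, dimensions and component count are placeholders, not the study's NIR measurements:

```python
import numpy as np

# Synthetic calibration set: two latent constituents mixed into noisy
# high-dimensional "spectra" (stand-ins for NIR data).
rng = np.random.default_rng(1)
n, p = 80, 200                          # samples x wavelengths
conc = rng.uniform(0, 10, size=(n, 2))  # latent constituent concentrations
pure = rng.normal(size=(2, p))          # pure-component "spectra"
X = conc @ pure + 0.05 * rng.normal(size=(n, p))
y = conc[:, 0]                          # calibrate on the first constituent

def pls1_nipals(X, y, n_components):
    """Minimal PLS1 (NIPALS) regression: returns the coefficient vector B."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                   # weight vector from X-y covariance
        w = w / np.linalg.norm(w)
        t = Xc @ w                      # score vector
        tt = t @ t
        p_load = Xc.T @ t / tt          # X loadings
        c = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p_load)   # deflate X and y
        yc = yc - c * t
        W.append(w); P.append(p_load); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)  # B = W (P'W)^{-1} q

B = pls1_nipals(X, y, n_components=2)
y_hat = (X - X.mean(axis=0)) @ B + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))
```

In practice the number of components is chosen by cross-validation, and separate models are built per target variable (biomass, plasmid, glucose, glycerol, acetate).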


The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and the subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises ICA applicability to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Considering the linear mixing model, hyperspectral observations are in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performances. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications—namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces positivity and constant sum sources (full additivity) constraints. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates the spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief resume of ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the ICA and IFA limitations in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
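Under the linear mixing model with known endmember signatures, the constrained least-squares unmixing mentioned above has a closed form for the sum-to-one constraint. A minimal sketch with placeholder signatures (the nonnegativity constraint handled by full FCLS solvers is omitted here):

```python
import numpy as np

# Placeholder endmember signatures and a synthetic mixed pixel; in a real
# application M would hold library or extracted spectra.
rng = np.random.default_rng(2)
bands, ends = 50, 3
M = rng.uniform(0, 1, size=(bands, ends))   # endmember signatures (bands x p)
a_true = np.array([0.6, 0.3, 0.1])          # true abundances, summing to 1
y = M @ a_true + 0.001 * rng.normal(size=bands)

# Unconstrained least squares, then the Lagrange-multiplier correction
# that enforces the full-additivity (sum-to-one) constraint exactly.
MtM_inv = np.linalg.inv(M.T @ M)
a_ls = MtM_inv @ M.T @ y
ones = np.ones(ends)
lam = (ones @ a_ls - 1.0) / (ones @ MtM_inv @ ones)
a_scls = a_ls - MtM_inv @ ones * lam

print(a_scls.round(3), a_scls.sum())
```

The Dirichlet-source model sketched in the text enforces the same positivity and full-additivity constraints statistically, rather than per-pixel as done here.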


In the present study we examine the potential use of oligonucleotide probes to characterize Neisseria meningitidis serotypes without the use of monoclonal antibodies (MAbs). Antigenic diversity of the PorB protein forms the basis of the serotyping method. However, the current panel of MAbs underestimates PorB variability by at least 50%, presumably because reagents for several PorB variable regions (VRs) are lacking, or because a number of VR variants are not recognized by serotype-defining MAbs. We analyzed the use of oligonucleotide probes to characterize serotypes 10 and 19 of N. meningitidis. The porB gene sequence of the prototype strain of serotype 10 was determined, aligned with 7 other porB sequences from different serotypes, and analyses of the individual VRs were performed. The results of DNA probes 21U (VR1-A) and 615U (VR3-B), tested against 72 N. meningitidis strains, confirm that VR1 type A and VR3 type B encode the epitopes for serotype-defining MAbs 19 and 10, respectively. The use of probes for characterizing serotypes can potentially cover 100% of the PorB VR diversity. It is a simple and rapid method, especially useful for the analysis of large numbers of samples.
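Computationally, probe-based typing reduces to checking whether short oligonucleotide sequences occur in each strain's porB sequence. A toy sketch of that matching step (the probe and strain sequences below are invented, not the real 21U/615U probes):

```python
# Hypothetical probes and strain sequences, for illustration only.
probes = {"21U": "ACGTTGCA", "615U": "TTGACCGT"}
strains = {
    "strain1": "GGGACGTTGCAAAATTTCCC",
    "strain2": "CCCTTGACCGTAAAGGGTTT",
    "strain3": "AAAACCCCGGGGTTTTAAAA",
}

# A strain is "typed" by the set of probes whose sequence it contains.
typing = {name: [p for p, seq in probes.items() if seq in s]
          for name, s in strains.items()}
print(typing)
```

Real probe design must also control hybridization specificity (melting temperature, mismatches), which simple substring matching does not capture.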


Submitted in partial fulfillment of the requirements for the degree of PhD in Mathematics, in the Speciality of Statistics, at the Faculdade de Ciências e Tecnologia


Over recent decades, the assessments issued by rating agencies have grown in importance, becoming a decisive factor in investors' decision-making. Debt issuers are also strongly affected by changes in the classifications these agencies assign. This investigation seeks, on the one hand, to understand whether these agencies have the power to influence the evolution of public debt and what their role in the financial market is; on the other, to identify the determinants of Portuguese public debt and to carry out a percentile analysis with the aim of assigning it a rating. To analyze the factors that may influence public debt, the methodology used is a multiple linear regression estimated by ordinary least squares (OLS), initially comprising eleven independent variables, with public debt as the dependent variable, for the period from 1996 to 2013. Several tests were applied to the initial model in order to arrive at the most explanatory specification possible. We were also able to identify an inverse relationship between the rating assigned by these agencies and the evolution of public debt, in the sense that in periods when the rating falls, debt growth is steeper. It was not, however, possible to assign a rating to the public debt through percentile analysis.
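A multiple linear regression estimated by OLS, of the kind used in this investigation, can be sketched as follows; the data below are simulated placeholders, not the actual Portuguese public-debt series or its eleven candidate regressors:

```python
import numpy as np

# Simulated annual series: three illustrative regressors, 18 observations
# (mirroring a 1996-2013 sample length), plus an intercept.
rng = np.random.default_rng(3)
n = 18
X = rng.normal(size=(n, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = 4.0 + X @ beta_true + 0.1 * rng.normal(size=n)

# OLS fit via least squares on the design matrix with an intercept column
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# R-squared as a rough measure of explanatory power
resid = y - A @ beta_hat
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(beta_hat.round(2), round(r2, 3))
```

Specification tests (dropping insignificant regressors, checking residual diagnostics) would then refine this initial model, as described in the abstract.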


Toxocariasis is a worldwide public-health problem that poses major risks to children, who may accidentally ingest embryonated eggs of Toxocara. The objectives of this study were to investigate the occurrence of anti-Toxocara spp. antibodies in children and adolescents and the variables that may be involved, as well as environmental contamination by Toxocara spp. eggs, in urban recreation areas of the north central mesoregion of Paraná State, Brazil. From June 2005 to March 2007, a total of 376 blood samples were collected by the Public Health Service from children and adolescents one to 12 years old, of both genders. Samples were analyzed by the indirect ELISA method for detection of anti-Toxocara antibodies. Serum samples were previously absorbed with Ascaris suum antigens and considered positive with a reactivity index > 1. Soil samples were analyzed from all of the public squares and schools located in the four evaluated municipalities that had sand surfaces (n = 19) or lawns (n = 15). Of the 376 serum samples, 194 (51.6%) were positive. The seroprevalence rate was substantially higher among children aged one to five years (p = 0.001) and six to eight years (p = 0.022). The clinical signs and symptoms investigated did not show a statistically significant difference between seropositive and seronegative individuals (p > 0.05). In 76.5% of the investigated recreation places, Toxocara eggs were detected in at least one of the five collected samples. Recreation areas in public schools were 2.8 times more contaminated than those in public squares. It is important to institute educational programs to inform families and educators, as well as to improve the sanitary control of animals and the cleaning of areas intended for recreation, in order to prevent toxocariasis.


In health-related research it is common to have multiple outcomes of interest in a single study. These outcomes are often analysed separately, ignoring the correlation between them. One would expect a multivariate approach to be a more efficient alternative to individual analyses of each outcome. Surprisingly, this is not always the case. In this article we discuss different settings of linear models and compare the multivariate and univariate approaches. We show that for linear regression models the estimates of the regression parameters associated with covariates shared across the outcomes are the same for the multivariate and univariate models, while for outcome-specific covariates the multivariate model performs better in terms of efficiency.
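The equality claimed for shared covariates can be checked numerically: when the same design matrix is shared across all outcomes, a joint multivariate least-squares fit and separate univariate fits give identical coefficient estimates. The simulated data below are for illustration only:

```python
import numpy as np

# Two correlated outcomes generated from the same design matrix X
rng = np.random.default_rng(4)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
B_true = np.array([[1.0, -1.0], [2.0, 0.5], [-0.5, 3.0]])
E = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
Y = X @ B_true + E

# Joint (multivariate) least squares on the outcome matrix Y
B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Separate univariate fits, one outcome at a time
B_sep = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0]
                         for j in range(2)])

print(np.allclose(B_joint, B_sep))
```

The efficiency gain the article describes appears only for outcome-specific covariates, where the multivariate model can exploit the residual correlation between outcomes (as in seemingly unrelated regressions).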


The Keystone XL pipeline plays a major role in transporting Canadian oil to the USA. Its function is to decrease the dependency of the American oil industry on other countries, and it will help to limit external debt. The proposed pipeline seeks the most suitable route, one that cannot damage agricultural land and natural water resources such as the Ogallala Aquifer. Using Geographic Information System (GIS) techniques, the route suggested in this study achieved highly accurate results, which will support the use of least-cost analysis in similar future studies. The route analysis comprises different weighted overlay surfaces, each influenced by various criteria (slope, geology, population and land use). The resulting least-cost-path routes for each weighted overlay surface were compared with the originally proposed pipeline, and each computed route was more effective than the proposed Keystone XL pipeline.
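A least-cost-path computation of the kind GIS packages run over a weighted overlay surface can be sketched as Dijkstra's algorithm on a cost raster; the toy 4x4 grid below stands in for the study's real overlay surfaces:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a 4-connected cost raster; cell costs include the start."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue                      # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the route
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# Toy cost raster: low-cost cells (1) thread between high-cost barriers (9),
# mimicking a weighted overlay of slope, geology, population and land use.
grid = [[1, 1, 9, 1],
        [9, 1, 9, 1],
        [9, 1, 1, 1],
        [9, 9, 9, 1]]
path, total = least_cost_path(grid, (0, 0), (3, 3))
print(total, path)
```

In a real analysis each raster cell's cost comes from the weighted overlay of the criteria listed above, and 8-connected moves with diagonal distance weighting are usually preferred.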


Objective: Quality of life was measured using the EQ-5D index for Portugal and a Self-Assessed Ranking of Health (SARH), to understand which patients suffer the greater decrease in quality of life: diabetic or hypertensive patients. Method: Using the National Health Survey (NHS), two analyses were conducted on 5649 respondents. The EQ-5D index was calculated by matching questions in the NHS with its dimensions. The SARH was calculated based on a specific question in the NHS. Results: No differences between the diseases appear when the EQ-5D index is used. Using the SARH, type 1 diabetics suffer the most, while hypertensive patients suffer the least.