991 results for value-mapping
Abstract:
Information about the Improving Transition Outcomes statewide resource-mapping product.
Abstract:
BACKGROUND: Replicative phenotypic HIV resistance testing (rPRT) uses recombinant infectious virus to measure viral replication in the presence of antiretroviral drugs. Owing to its high sensitivity in detecting viral minorities and its power to dissect complex viral resistance patterns and mixed virus populations, rPRT might help to improve HIV resistance diagnostics, particularly for patients with multiple drug failures. The aim was to investigate whether adding rPRT to genotypic resistance testing (GRT), compared to GRT alone, is beneficial for obtaining a virological response in heavily pre-treated HIV-infected patients. METHODS: Patients with resistance tests between 2002 and 2006 were followed within the Swiss HIV Cohort Study (SHCS). We assessed patients' virological success after their antiretroviral therapy was switched following resistance testing. Multilevel logistic regression models with SHCS centre as a random effect were used to investigate the association between the type of resistance test and virological response (HIV-1 RNA <50 copies/mL or a ≥1.5 log reduction). RESULTS: Of 1158 individuals with resistance tests, 221 with GRT+rPRT and 937 with GRT were eligible for analysis. Overall virological response rates were 85.1% for GRT+rPRT and 81.4% for GRT. In the subgroup of patients with >2 previous failures, the odds ratio (OR) for virological response of GRT+rPRT compared to GRT was 1.45 (95% CI 1.00-2.09). Multivariate analyses indicate a significant improvement with GRT+rPRT compared to GRT alone (OR 1.68, 95% CI 1.31-2.15). CONCLUSIONS: In heavily pre-treated patients, rPRT-based resistance information adds benefit, contributing to a higher rate of treatment success.
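The multilevel model described above can be sketched in code. The following is a minimal, hypothetical illustration of a logistic regression with a random intercept per cohort centre, using the Bayesian mixed GLM in statsmodels; the variable names and synthetic data are placeholders, not the SHCS data or the authors' actual analysis.

```python
# Sketch: logistic regression with a random intercept per centre.
# All names and data are hypothetical stand-ins for the abstract's model.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n = 400
data = pd.DataFrame({
    "grt_rprt": rng.integers(0, 2, n),   # 1 = GRT+rPRT, 0 = GRT alone
    "centre": rng.integers(0, 7, n),     # cohort centre (random effect)
})
# Simulate a binary response with a positive test-type effect plus centre noise
centre_eff = rng.normal(0, 0.5, 7)
logit = 1.2 + 0.5 * data["grt_rprt"] + centre_eff[data["centre"]]
data["response"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = BinomialBayesMixedGLM.from_formula(
    "response ~ grt_rprt",           # fixed effect: type of resistance test
    {"centre": "0 + C(centre)"},     # random intercept for each centre
    data,
)
fit = model.fit_vb()                 # variational Bayes fit
print(fit.summary())
```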
Abstract:
Introduction. Nowadays the concept of ontology (an explicit specification of a conceptualization [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always share the same conceptualization, which justifies the existence of multiple ontologies even when they deal with the same domain of discourse. Ontology mapping has proved to be a good solution for resolving, or at least minimizing, the interoperability problem between such agents. Ontology mapping is the process in which semantic relations are specified between entities of a source and a target ontology at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology.

Motivation. In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in the domain or in the ontology's own objectives, together with the consistent management of those changes [Stojanovic, 2004]; it can at times leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies, and there are two options: (i) generate a new mapping document (demanding in time and computational resources) or (ii) adapt the existing mapping document, correcting invalid semantic relations and creating new relations where necessary (less demanding in time and computational resources, but highly dependent on information about the changes that were made). The main objective of this work is the analysis, specification and development of a mapping-document evolution process that reflects the changes made during ontology evolution.

Context. This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that supports the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, which is applied to the Semantic Bridges (Property and Concept Bridges) to define how source instances are transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO containing semantic relations between entities of the source and target ontologies.

Mapping evolution process. The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect changes in the mapped ontologies, preserving as far as possible the semantics of the specified relations.

If the source and/or target ontologies change, some semantic relations may become invalid and new relations may be required, so this process comprises two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires discovering and computing similarities between entities and specifying relations in accordance with the SBO ontology/language; these phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit, and the (semi-)automatic ontology mapping process is described in [Silva, 2004]. Correcting invalid SBO entities requires good knowledge of the SBO ontology/language, of its entities and relations, and of all its constraints, i.e. of its structure and semantics. The procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity and (iii) correcting them in the best possible way. In this phase, information from the ontology evolution process was used to improve the quality of the whole process.

Conclusions. Beyond the mapping evolution process itself, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, ontology evolution and ontology mapping, broadening horizons and sharpening awareness of the complexity of the problem, which makes it possible to anticipate new challenges for the future.
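The bridge-correction sub-process lends itself to a small sketch. The following hypothetical Python fragment shows one way invalid bridges and unmapped new entities could be detected after ontology evolution; the class and function names are illustrative and do not reproduce the actual SBO/MAFRA Toolkit API.

```python
# Sketch: detect bridges invalidated by ontology evolution, and new
# source entities that still need a bridge. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class ConceptBridge:
    source_concept: str
    target_concept: str

def invalid_bridges(bridges, source_entities, target_entities):
    """Return bridges whose endpoints no longer exist in the evolved ontologies."""
    return [b for b in bridges
            if b.source_concept not in source_entities
            or b.target_concept not in target_entities]

def unmapped_entities(bridges, source_entities):
    """Source entities (e.g. newly added ones) not covered by any bridge."""
    mapped = {b.source_concept for b in bridges}
    return source_entities - mapped

bridges = [ConceptBridge("src:Person", "tgt:Human"),
           ConceptBridge("src:Car", "tgt:Automobile")]
source = {"src:Person", "src:Vehicle"}           # "src:Car" was renamed
target = {"tgt:Human", "tgt:Automobile"}
print(invalid_bridges(bridges, source, target))  # bridge for src:Car is invalid
print(unmapped_entities(bridges, source))        # src:Vehicle needs a new bridge
```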
Abstract:
Perceptual maps have been used for decades by market researchers to shed light on the similarity between brands in terms of a set of attributes, to position consumers relative to brands in terms of their preferences, or to study how demographic and psychometric variables relate to consumer choice. Invariably these maps are two-dimensional and static. As we enter the era of electronic publishing, the possibilities for dynamic graphics are opening up. We demonstrate the usefulness of introducing motion into perceptual maps through four examples. The first example shows how a perceptual map can be viewed in three dimensions, and the second one moves between two analyses of the data that were collected according to different protocols. In a third example we move from the best view of the data at the individual level to one which focuses on between-group differences in aggregated data. A final example considers the case when several demographic variables or market segments are available for each respondent, showing an animation with increasingly detailed demographic comparisons. These examples of dynamic maps use several data sets from marketing and social science research.
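As a rough illustration of motion in perceptual maps, the sketch below linearly interpolates brand coordinates between two hypothetical two-dimensional configurations and animates the transition with matplotlib; the coordinates and brand labels are invented, and the paper's own examples are not reproduced here.

```python
# Sketch: animate a perceptual map morphing between two configurations.
# The two maps and brand names are made-up placeholders.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

brands = ["A", "B", "C", "D"]
map1 = np.array([[0.0, 1.0], [1.0, 0.0], [-1.0, 0.2], [0.3, -0.8]])
map2 = np.array([[0.4, 0.6], [0.8, 0.5], [-0.9, -0.3], [-0.2, -1.0]])

fig, ax = plt.subplots()
ax.set_xlim(-1.5, 1.5)
ax.set_ylim(-1.5, 1.5)
points = ax.scatter(map1[:, 0], map1[:, 1])
labels = [ax.text(x, y, b) for b, (x, y) in zip(brands, map1)]

def update(frame):
    t = frame / 49                     # interpolation parameter in [0, 1]
    pos = (1 - t) * map1 + t * map2    # recompute the map for this frame
    points.set_offsets(pos)
    for lab, (x, y) in zip(labels, pos):
        lab.set_position((x, y))
    return [points, *labels]

anim = FuncAnimation(fig, update, frames=50, interval=40)
plt.show()
```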
Abstract:
Research on achievement goal promotion at university has shown that performance-approach goals are perceived as a means to succeed at university (high social utility) but are not appreciated (low social desirability). We argue that such a paradox could explain why research has detected that performance-approach goals consistently predict academic grades. First-year psychology students answered a performance-approach goal scale under standard, social desirability and social utility instructions. Participants' grades were recorded at the end of the semester. Results showed that the relationship between performance-approach goals and grades was inhibited by the increase of these goals' social desirability and facilitated by the increase of their social utility, revealing that the predictive validity of performance-approach goals depends on social value.
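The moderation pattern reported above is the kind of result a regression with interaction terms would reveal. The following sketch, with entirely synthetic data and hypothetical variable names (pag, sd, su), shows the general form of such an analysis; it is not the authors' model.

```python
# Sketch: moderation of the goals-grades link by social desirability (sd)
# and social utility (su), via interaction terms. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({"pag": rng.normal(size=n),   # performance-approach goals
                   "sd": rng.normal(size=n),
                   "su": rng.normal(size=n)})
df["grade"] = (10 + 0.3 * df.pag - 0.2 * df.pag * df.sd
               + 0.4 * df.pag * df.su + rng.normal(size=n))

model = smf.ols("grade ~ pag * sd + pag * su", data=df).fit()
print(model.summary())  # inspect pag:sd (inhibition) and pag:su (facilitation)
```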
Value of sTREM-1, procalcitonin and CRP as laboratory parameters for postmortem diagnosis of sepsis.
Abstract:
OBJECTIVES: Triggering receptor expressed on myeloid cells-1 (TREM-1) was reported to be up-regulated in various inflammatory diseases as well as in bacterial sepsis. Increased cell-surface TREM-1 expression was also shown to result in marked plasma elevation of the soluble form of this molecule (sTREM-1) in patients with bacterial infections. In this study, we investigated sTREM-1, procalcitonin and C-reactive protein in postmortem serum in a series of sepsis-related fatalities and control individuals who underwent medico-legal investigations. sTREM-1 was also measured in pericardial fluid and urine. METHODS: Two study groups were prospectively formed, a sepsis-related fatalities group and a control group. The sepsis-related fatalities group consisted of sixteen forensic autopsy cases. Eight of these had a documented clinical diagnosis of sepsis in vivo. The control group consisted of sixteen forensic autopsy cases with various causes of death. RESULTS: Postmortem serum sTREM-1 concentrations were higher in the sepsis group with a mean value of 173.6 pg/ml in septic cases and 79.2 pg/ml in control individuals. The cutoff value of 90 pg/ml provided the best sensitivity and specificity. Pericardial fluid sTREM-1 values were higher in the septic group, with a mean value of 296.7 pg/ml in septic cases and 100.9 pg/ml in control individuals. The cutoff value of 135 pg/ml provided the best sensitivity and specificity. Mean urine sTREM-1 concentration was 102.9 pg/ml in septic cases and 89.3 pg/ml in control individuals. CONCLUSIONS: Postmortem serum sTREM-1, individually considered, did not provide better sensitivity and specificity than procalcitonin in detecting sepsis. However, simultaneous assessment of procalcitonin and sTREM-1 in postmortem serum can be of help in clarifying contradictory postmortem findings. sTREM-1 determination in pericardial fluid can be an alternative to postmortem serum in those situations in which biochemical analyses are required and blood collected during autopsy proves insufficient.
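As an aside, choosing a cutoff that "provides the best sensitivity and specificity" is commonly done by maximising Youden's J statistic (sensitivity + specificity - 1). The sketch below illustrates the idea on synthetic sTREM-1-like values; the numbers are invented and do not come from the study.

```python
# Sketch: pick the biomarker cutoff maximising Youden's J.
# The pg/ml values below are synthetic stand-ins, not study data.
import numpy as np

septic   = np.array([120, 150, 95, 210, 180, 140, 300, 170.])  # pg/ml
controls = np.array([60, 85, 70, 95, 110, 55, 80, 90.])

def sens_spec(cutoff):
    sens = np.mean(septic >= cutoff)   # true positives / septic cases
    spec = np.mean(controls < cutoff)  # true negatives / controls
    return sens, spec

cutoffs = np.unique(np.concatenate([septic, controls]))
best = max(cutoffs, key=lambda c: sum(sens_spec(c)) - 1)  # Youden's J
print(best, sens_spec(best))
```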
Abstract:
The paper contrasts empirically the results of alternative methods for estimating the value and the depreciation of mineral resources. The historical data of Mexico and Venezuela, covering the period from the 1920s to the 1980s, are used to contrast the results of several methods: the present value method, the net price method, the user cost method and the imputed income method. The paper establishes that the net price and the user cost are not competing methods as such, but alternative adjustments to different scenarios of closed and open economies. The results prove that the biases of the methods, as commonly described in the theoretical literature, only hold under the most restricted scenario of constant rents over time. It is argued that the difference between what is expected to happen and what actually did happen is for the most part due to a missing variable, namely technological change. This is an important caveat to the recommendations made based on these models.
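To make the contrast concrete, the sketch below computes depreciation under the net price method (the whole resource rent) and under El Serafy's user cost method, where only the fraction R/(1+r)^(n+1) of a constant rent R counts as depreciation, with n the remaining years of extraction and r the discount rate. The figures are invented; the paper's historical data are not reproduced.

```python
# Sketch: two depreciation methods for mineral resources, with made-up numbers.
def net_price_depreciation(price, unit_cost, quantity):
    return (price - unit_cost) * quantity           # total resource rent

def user_cost_depreciation(rent, r, n):
    return rent / (1 + r) ** (n + 1)                # El Serafy's user cost

rent = net_price_depreciation(price=30.0, unit_cost=18.0, quantity=1e6)
print(rent)                                         # net price: 12,000,000
print(user_cost_depreciation(rent, r=0.05, n=20))   # user cost: ~4,300,000
```

The gap between the two figures illustrates why the methods are adjustments to different scenarios rather than direct competitors, and why the comparison is sensitive to the constant-rent assumption.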
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite-difference-type approximation scheme is used on a coarse grid of low-discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
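A minimal sketch of the two numerical ingredients, under the assumption that scipy and scikit-learn stand in for the paper's own implementation: a value function is tabulated on a coarse Sobol (low-discrepancy) grid and then evaluated at intermediate points by polynomial regression. The "value function" here is a placeholder test function, not a solved control problem.

```python
# Sketch: tabulate values on a Sobol grid, regress, evaluate elsewhere.
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

d = 5
grid = qmc.Sobol(d, scramble=True, seed=0).random(256)  # coarse Sobol grid
v = np.exp(-np.sum(grid**2, axis=1))                    # placeholder values

model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
model.fit(grid, v)                                      # regression surrogate

x_new = np.full((1, d), 0.5)                            # intermediate point
print(model.predict(x_new), np.exp(-d * 0.25))          # regression vs exact
```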
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
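A compact numpy sketch of weighted log-ratio analysis as described: logarithms are taken, the matrix is double-centred with row and column masses as weights, and a weighted SVD yields the map coordinates. The 3x4 table is a made-up example, not the linguistic or archaeological data analysed in the paper.

```python
# Sketch: weighted log-ratio analysis (spectral mapping) of a small table.
import numpy as np

N = np.array([[10., 20., 30., 5.],
              [ 4., 12., 22., 8.],
              [25.,  6., 15., 9.]])
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)            # row and column masses

L = np.log(P)
# double-centre with the masses as weights: subtract weighted row means
# and weighted column means, add back the weighted grand mean
Lc = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c

S = np.sqrt(r)[:, None] * Lc * np.sqrt(c)[None, :]
U, s, Vt = np.linalg.svd(S, full_matrices=False)
rows = (U * s) / np.sqrt(r)[:, None]           # principal row coordinates
cols = (Vt.T * s) / np.sqrt(c)[:, None]        # principal column coordinates
print(rows[:, :2])                             # 2-D log-ratio map of rows
print(cols[:, :2])
```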
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted log-ratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
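One such parametrization can be sketched concretely. Under the assumption that the Box-Cox transform (x^a - 1)/a is used as the linking parameter (it tends to log(x) as a approaches 0, connecting an analysis of the raw table to a log-ratio analysis), the fragment below recomputes a weighted map frame by frame as a shrinks, which is the essence of the "movie"; the data and step count are invented.

```python
# Sketch: "movie" frames linking methods via a Box-Cox power parameter a.
import numpy as np

def weighted_map(X, r, c):
    # weighted double-centring followed by weighted SVD; 2-D row coordinates
    Xc = X - (X @ c)[:, None] - (r @ X)[None, :] + r @ X @ c
    U, s, _ = np.linalg.svd(np.sqrt(r)[:, None] * Xc * np.sqrt(c),
                            full_matrices=False)
    return (U * s)[:, :2] / np.sqrt(r)[:, None]

N = np.array([[10., 20., 30., 5.],
              [ 4., 12., 22., 8.],
              [25.,  6., 15., 9.]])
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses

for a in np.linspace(1.0, 0.01, 5):   # shrink the power in small steps
    frame = weighted_map((P**a - 1) / a, r, c)   # Box-Cox-transformed table
    print(f"a = {a:.2f}")
    print(frame)
```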
Abstract:
This paper proposes an argument that explains incumbency advantage without resorting to the collective irresponsibility of legislatures. For that purpose, we exploit the informational value of incumbency: incumbency conveys to voters information about governing politicians that is not available for challengers. Because there are many reasons for high reelection rates other than incumbency status, we propose a measure of incumbency advantage that improves on the use of pure reelection success. We also study the relationship between incumbency advantage and ideological and selection biases. An important implication of our analysis is that the literature linking incumbency and legislature irresponsibility most likely provides an overestimation of the latter.
Abstract:
How much information does an auctioneer want bidders to have in a private value environment? We address this question using a novel approach to ordering information structures, based on the property that in private value settings more information leads to a more disperse distribution of buyers' updated expected valuations. We define the class of precision criteria following this approach and different notions of dispersion, and relate them to existing criteria of informativeness. Using supermodular precision, we obtain three results: (1) a more precise information structure yields a more efficient allocation; (2) the auctioneer provides less than the efficient level of information, since more information increases bidder informational rents; (3) there is a strategic complementarity between information and competition, so that both the socially efficient and the auctioneer's optimal choice of precision increase with the number of bidders, and both converge as the number of bidders goes to infinity.
Abstract:
This paper examines the value of connections between German industry and the Nazi movement in early 1933. Drawing on previously unused contemporary sources about management and supervisory board composition and stock returns, we find that one out of seven firms, and a large proportion of the biggest companies, had substantive links with the National Socialist German Workers Party. Firms supporting the Nazi movement experienced unusually high returns, outperforming unconnected ones by 5% to 8% between January and March 1933. These results are not driven by sectoral composition and are robust to alternative estimators and definitions of affiliation.
Abstract:
To provide quantitative support to handwriting evidence evaluation, a new method was developed based on the computation of a likelihood ratio within a Bayesian framework. In the present paper, the methodology is briefly described and applied to data collected within a simulated case of a threatening letter. Fourier descriptors are used to characterise the shape of loops of handwritten characters "a", which are compared: 1) with reference characters "a" of the true writer of the threatening letter, and then 2) with characters "a" of a writer who did not write the threatening letter. The findings show that the probabilistic methodology correctly supports either the hypothesis of authorship or the alternative hypothesis. Further developments will enable the handwriting examiner to use this methodology as helpful assistance in assessing the strength of evidence in handwriting casework.
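The Fourier-descriptor step can be sketched as follows: the sampled (x, y) outline of a loop is treated as a complex signal, its FFT is taken, and the coefficient magnitudes are normalised to remove position, scale and rotation. The elliptical contour below is a synthetic stand-in for a handwritten loop, not the study's data.

```python
# Sketch: Fourier descriptors of a closed loop contour.
import numpy as np

t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
contour = 3 * np.cos(t) + 1j * 1.5 * np.sin(t)   # sampled loop outline

coeffs = np.fft.fft(contour)
coeffs[0] = 0                                    # drop position (translation)
desc = np.abs(coeffs) / np.abs(coeffs[1])        # scale/rotation invariant
print(desc[:6])                                  # low-order shape descriptors
```

Descriptor vectors extracted this way from questioned and reference characters could then feed a distance-based likelihood-ratio computation of the kind the paper describes.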