525 results for Contingency
Abstract:
A joint distribution of two discrete random variables with finite support can be displayed as a two-way table of probabilities adding to one. Assume that this table has n rows and m columns and that all probabilities are non-null. Such a table can be seen as an element of the simplex of n · m parts. In this context, the marginals are identified as compositional amalgams and the conditionals (rows or columns) as subcompositions. Also, simplicial perturbation appears as Bayes' theorem. Moreover, the Euclidean elements of the Aitchison geometry of the simplex can be translated into the table of probabilities: subspaces, orthogonal projections, distances. Two important questions are addressed: (a) given a table of probabilities, which is the nearest independent table to the initial one? (b) which is the largest orthogonal projection of a row onto a column, or, equivalently, how much of the information in a row is explained by a column, thus explaining the interaction? To answer these questions, three orthogonal decompositions are presented: (1) by columns and a row-wise geometric marginal, (2) by rows and a column-wise geometric marginal, (3) by independent two-way tables and fully dependent tables representing row-column interaction. An important result is that the nearest independent table is the product of the two (row- and column-wise) geometric marginal tables. A corollary is that, in an independent table, the geometric marginals conform with the traditional (arithmetic) marginals. These decompositions can be compared with standard log-linear models. Key words: balance, compositional data, simplex, Aitchison geometry, composition, orthonormal basis, arithmetic and geometric marginals, amalgam, dependence measure, contingency table
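The closing result above (the nearest independent table is the closed product of the geometric marginal tables) can be sketched numerically. This is a minimal illustration with an invented 2×3 table, not the paper's own code; which marginal is taken along which axis is a notational choice here:

```python
import numpy as np

def closure(x):
    """Rescale a positive vector or table so its entries sum to one."""
    return x / x.sum()

# illustrative 2 x 3 table of probabilities (all entries positive)
P = closure(np.array([[0.10, 0.20, 0.05],
                      [0.15, 0.30, 0.20]]))

# geometric marginals: component-wise geometric means, then closure
row_gm = closure(np.exp(np.log(P).mean(axis=0)))  # length-m profile over columns
col_gm = closure(np.exp(np.log(P).mean(axis=1)))  # length-n profile over rows

# nearest independent table in the Aitchison sense: closed outer product
P_indep = closure(np.outer(col_gm, row_gm))

# corollary: in the independent table, the arithmetic marginals factor it
assert np.allclose(P_indep, np.outer(P_indep.sum(axis=1), P_indep.sum(axis=0)))
```

Because both geometric marginal vectors are closed, their outer product already sums to one, and its arithmetic row and column sums reproduce the geometric marginals, matching the corollary stated in the abstract.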
Abstract:
There is insufficient evidence of the usefulness of dengue diagnostic tests under routine conditions. We sought to analyse how physicians are using dengue diagnostics in order to inform research and development. Subjects attending 14 health institutions in an endemic area of Colombia who either had a clinical diagnosis of dengue or for whom a dengue test was ordered were included in the study. Patterns of test use are described herein. Factors associated with the ordering of dengue diagnostic tests were identified using contingency tables, nonparametric tests and logistic regression. A total of 778 subjects were diagnosed with dengue by the treating physician, of whom 386 (49.5%) were tested for dengue. Another 491 dengue tests were ordered for subjects whose primary diagnosis was not dengue. Severe dengue classification [odds ratio (OR) 2.2; 95% confidence interval (CI) 1.1-4.5], emergency consultation (OR 1.9; 95% CI 1.4-2.5) and month of the year (OR 3.1; 95% CI 1.7-5.5) were independently associated with the ordering of dengue tests. Dengue tests were used both to rule in and to rule out the diagnosis. The latter use is not justified by the sensitivity of current rapid dengue diagnostic tests. Ordering of dengue tests appears to depend on a combination of factors, including physician and institutional preferences, as well as other patient and epidemiological factors.
Abstract:
Introduction: Increased respiratory pattern variability is associated with improved oxygenation. Pressure support (PS) is a widely used partial-assist mechanical ventilation (MV) mode, in which each breathing cycle is initiated by flow or pressure variation at the airway due to patient inspiratory effort. Neurally adjusted ventilatory assist (NAVA) is relatively new and uses the electrical activity of the diaphragm (Eadi) to deliver ventilatory support proportional to the patient's inspiratory demand. We hypothesize that respiratory variability should be greater with NAVA compared with PS. Methods: Twenty-two patients underwent 20 minutes of PS followed by 20 minutes of NAVA. Flow and Eadi curves were used to obtain tidal volume (Vt) and ∫Eadi for 300 to 400 breaths in each patient. Patient-specific cumulative distribution functions (CDF) show the percentage of Vt and ∫Eadi within a clinically defined (±10%) variability band for each patient. Values are normalized to patient-specific medians for direct comparison. Variability in Vt (outcome) is thus expressed in terms of variability in ∫Eadi (demand) on the same plot. Results: Variability in Vt relative to variability in ∫Eadi is significantly greater for NAVA than PS (P = 0.00012). Hence, greater variability in outcome Vt is obtained for a given demand in ∫Eadi under NAVA, as illustrated in Figure 1 for a typical patient. A Fisher 2 × 2 contingency analysis showed that 45% of patients under NAVA had Vt variability in equal proportion to ∫Eadi variability, versus 0% for PS (P < 0.05). Conclusions: NAVA yields greater variability in tidal volume relative to ∫Eadi demand, and a better match between Vt and ∫Eadi. These results indicate that NAVA could achieve improved oxygenation compared with PS when sufficient underlying variability in ∫Eadi is present, due to its ability to achieve higher tidal volume variability from a given variability in ∫Eadi.
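The ±10% variability band used in the analysis above can be sketched as follows. The tidal volumes here are simulated (lognormal, with invented parameters), not data from the study; the sketch only shows how a per-patient "fraction within the band" is computed after normalizing to the median:

```python
import numpy as np

rng = np.random.default_rng(0)
# simulated tidal volumes (mL) for one patient; distribution is illustrative
vt = rng.lognormal(mean=np.log(400.0), sigma=0.15, size=350)

vt_norm = vt / np.median(vt)                       # normalize to the patient-specific median
in_band = np.mean(np.abs(vt_norm - 1.0) <= 0.10)   # fraction of breaths inside the ±10% band
```

Comparing this fraction for Vt (outcome) against the same quantity for ∫Eadi (demand) is what the abstract's CDF plots express.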
Abstract:
OBJECTIVES: To study the ways of managing HIV risk within male homosexual steady relationships (gay couples), including factors associated with consistent condom use during anal sex with the steady partner. METHOD: An anonymous and standardized questionnaire completed by a convenience sample of homosexuals in Switzerland in 1997 (n = 1097). Information on the couple was provided by the 74% (n = 786) of male respondents who reported having a steady partner in the past 12 months. Data were analysed by contingency tables and logistic regression. RESULTS: Different ways of managing HIV risk were reported: negotiated safety (both HIV negative, condoms abandoned) was chosen by one quarter of the couples, but the most frequent solution was reliance on condoms for anal sex, chosen by more than four in 10. Altogether 84% of couples exhibited safe management of HIV risk within their partnership. The 16% of couples showing inadequate management of HIV risk within the couple mostly relied on questionable assumptions about past or present risks. A total of 74% of couples had spoken about managing HIV risk with possible casual partners. Reported behaviour with the steady partner and with casual partners was highly consistent with the claimed strategies chosen to manage HIV risk. Consistent condom use with the steady partner was mostly associated with variables characterizing the relationship: initial 2 years of the relationship, discordant or unknown serological HIV status, non-exclusivity. CONCLUSION: Gay couples manage HIV risk in a variety of ways. Most strategies provide adequate protection with casual partners, but leave gaps in protection between the steady partners themselves.
Abstract:
This study aimed to verify the association between self-care ability and sociodemographic factors of people with spinal cord injury (SCI). It was a cross-sectional study, conducted in 2012, in all 58 Basic Health Units of Natal/RN, Brazil. Seventy-three subjects completed a sociodemographic form and the Self-Care Agency Scale. Statistical analyses were performed using SPSS, including Cronbach's alpha, chi-square, Fisher's and contingency coefficient tests. The Cronbach's alpha was 0.788. The results showed that sex (p = 0.028), religion (p < 0.001), education (p = 0.046), current age (p = 0.027), time since SCI (p = 0.020) and SCI type (p = 0.012) were variables associated with the self-care ability of the subjects. It was concluded that sociodemographic factors may interfere with the self-care ability of persons with SCI, and nurses should consider this aspect during the execution of the nursing process.
Abstract:
We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both the principle of subcompositional coherence and that of distributional equivalence.
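The common computational core described above (an SVD of an appropriately transformed matrix) can be sketched for the margin-weighted logratio case. This is a sketch in the spirit of the ratio map, with an invented count table; it is not the authors' implementation:

```python
import numpy as np

def weighted_logratio_svd(N):
    """Margin-weighted, double-centered log transform followed by an SVD (sketch)."""
    P = N / N.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses (margins)
    L = np.log(P)
    # double-centering of the log matrix with the margin weights
    Lc = L - r @ L - (L @ c)[:, None] + r @ L @ c
    S = np.sqrt(r)[:, None] * Lc * np.sqrt(c)
    return np.linalg.svd(S, compute_uv=False)

# invented 3 x 3 contingency table of counts
N = np.array([[25.0, 10.0, 15.0],
              [ 5.0, 20.0, 10.0],
              [10.0,  5.0, 30.0]])
sv = weighted_logratio_svd(N)
```

The weighted double-centering forces one singular value to zero (the trivial dimension), just as the centering step does in correspondence analysis.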
Abstract:
The change of accounting standards that occurred in 2009 altered the paradigm for recognizing and measuring assets. Although the nature of the operations remains present in the accounting process, much has changed in view of the substance of the information and its economic reality. The case of concession contracts is a good example: in some cases, assets that were recognized as tangible fixed assets under the previous standards are currently recognized as intangible assets. The main objective of this study is to analyse the concept of concession contracts, as well as the procedures for their recognition, measurement and disclosure in the financial statements. Since they are considered intangible assets (the entity in fact ends up holding a "right" to exploit a particular asset), the accounting treatment follows Financial Reporting Standard No. 6 – Intangible Assets. Concession contracts have specific characteristics, and for this reason the IASB issued an interpretation (IFRIC 12) to clarify their accounting treatment. In the absence of such an interpretative standard in the national framework, national companies facing this reality find themselves obliged, on a supplementary basis, to resort to the international accounting standards to resolve the matter. This is the case of ELECTRA for the assets assigned to distribution. The study therefore addresses this issue: it presents a theoretical framework, analyses the main aspects of recognition under the two national accounting frameworks (the former Plano Nacional de Contabilidade and the current Sistema de Normalização Contabilística e de Relato Financeiro), and concludes by using information from ELECTRA, SARL to illustrate this process of accounting recognition.
Abstract:
Power transformations of positive data tables, prior to applying the correspondence analysis algorithm, are shown to open up a family of methods with direct connections to the analysis of log-ratios. Two variations of this idea are illustrated. The first approach is simply to power the original data and perform a correspondence analysis; this method is shown to converge to unweighted log-ratio analysis as the power parameter tends to zero. The second approach is to apply the power transformation to the contingency ratios, that is, the values in the table relative to expected values based on the marginals; this method converges to weighted log-ratio analysis, or the spectral map. Two applications are described: first, a matrix of population genetic data which is inherently two-dimensional, and second, a larger cross-tabulation with higher dimensionality, from a linguistic analysis of several books.
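The convergence claim in the first approach can be illustrated numerically: after double-centering, a Box-Cox-style rescaled power of the table approaches the double-centered log matrix as the power tends to zero, since P**α = exp(α log P) ≈ 1 + α log P for small α and centering removes the constant. The table values here are invented, and this is only a sketch of the limiting behaviour, not the full correspondence analysis algorithm:

```python
import numpy as np

def double_center(M):
    """Subtract row and column means, add back the grand mean."""
    return M - M.mean(axis=0) - M.mean(axis=1)[:, None] + M.mean()

# invented positive data table
P = np.array([[0.12, 0.18, 0.10],
              [0.25, 0.05, 0.30]])

Z_log = double_center(np.log(P))
errors = []
for alpha in (1.0, 0.1, 0.01):
    Z_alpha = double_center(P ** alpha) / alpha   # Box-Cox-style rescaling
    errors.append(np.abs(Z_alpha - Z_log).max())
# errors shrink as alpha tends to zero
```

The shrinking errors show the powered, centered table approaching the log-ratio configuration, which is the mechanism behind the stated convergence of the powered correspondence analysis to log-ratio analysis.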
Abstract:
This work is based on the analysis of unemployment data for Cabo Verde in the years 2006 and 2008, using information from the INE and IEFP databases. Starting from the analysis of the data under study, we seek to describe and put in perspective methodologies that cover qualitative and quantitative variables with positive social meaning for the society of this country. After the introduction in Chapter 1, Chapter 2 presents an exploratory analysis of the unemployment data for Cabo Verde for 2006 and 2008. Chapter 3 studies associations between variables using contingency-table methodology, through tests of independence and tests of homogeneity, and the analysis of measures of association. The variables used are essentially age group, gender and year. Chapter 4 is devoted to the study of log-linear models for contingency tables, and the work closes with a presentation of the main conclusions.
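The independence tests on contingency tables mentioned above can be sketched with a hand-rolled Pearson chi-square statistic. The counts below are invented for illustration only:

```python
import numpy as np

def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for a two-way table."""
    table = np.asarray(table, dtype=float)
    # expected counts under independence: outer product of margins over the total
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    df = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, df

# invented 2 x 2 table of counts (e.g. unemployment status by gender)
stat, df = chi_square_independence([[30, 20],
                                    [20, 30]])
```

Comparing `stat` against a chi-square quantile with `df` degrees of freedom gives the usual test of independence.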
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
Abstract:
Although correspondence analysis is now widely available in statistical software packages and applied in a variety of contexts, notably the social and environmental sciences, there are still some misconceptions about this method as well as unresolved issues which remain controversial to this day. In this paper we hope to settle these matters, namely (i) the way CA measures variance in a two-way table and how to compare variances between tables of different sizes, (ii) the influence, or rather lack of influence, of outliers in the usual CA maps, (iii) the scaling issue and the biplot interpretation of maps, (iv) whether or not to rotate a solution, and (v) the statistical significance of results.
Abstract:
The debate over what metaphors mean is a constant in several areas of contemporary thought. From literary studies to cognitive science and linguistics, metaphor has been interpreted as an instance of language fundamental to understanding not only how we communicate, but how our minds work. In turn, the idea of a contingency of language and communicative action, in which the individuals of a linguistic community lay the foundations of their development, is gaining ground toward a more practical and truthful understanding of social progress. On these assumptions, the reflection developed in these pages aims to re-discuss some notions and theories about metaphor and its relations with the concepts of "truth" and "meaning", trying to situate it within the frameworks of use and cultural conversation in the contingency of our language.
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison of intense events that produced much damage to people and territory is performed. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method allows quantification of the spatial shift forecast error and identification of the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
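The contingency-table skill scores used for QPF verification can be sketched as follows. The four cell counts (hits, false alarms, misses, correct negatives) are invented numbers; POD, FAR and the equitable threat score (ETS) are standard verification definitions, not quantities taken from the study:

```python
def verification_scores(hits, false_alarms, misses, correct_negs):
    """Standard 2x2 contingency-table scores for yes/no precipitation forecasts."""
    n = hits + false_alarms + misses + correct_negs
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    # hits expected by chance, used by the equitable threat score
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return pod, far, ets

# invented counts for one model run and one rainfall threshold
pod, far, ets = verification_scores(30, 10, 20, 40)
```

Scores like these, computed per model and per threshold, are what allow the misses and false alarms of each run to be mapped onto regions, as described above.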
Abstract:
Hannah Arendt's storytelling: an approach to understanding contingency
Abstract:
In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared with traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical. We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process, and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter is more efficient while offering similar results. We also proposed a framework for comparing forecasting methods, which identifies the relevant constraints on the basis of contingency factors. In conclusion, our research opens a new field of application for prediction markets and should help hasten their adoption by enterprises.