869 results for Error-correcting codes (Information theory)
Abstract:
Background. The use of hospital discharge administrative data (HDAD) has been recommended for automating, improving, or even substituting for population-based cancer registries. The frequency of false positive and false negative cases, however, makes local validation advisable. Methods. The aim of this study was to detect newly diagnosed, false positive and false negative cases of cancer from hospital discharge claims, using four Spanish population-based cancer registries as the gold standard. Prostate cancer was used as a case study. Results. A total of 2286 incident cases of prostate cancer registered in 2000 were used for validation. In the most sensitive algorithm (that using five diagnostic codes), sensitivity estimates ranged from 14.5% (95% CI 10.3-19.6) to 45.7% (95% CI 41.4-50.1). In the most predictive algorithm (that using five diagnostic and five surgical codes), positive predictive value estimates ranged from 55.9% (95% CI 42.4-68.8) to 74.3% (95% CI 67.0-80.6). The most frequent source of false positives was prevalent cases wrongly counted as newly diagnosed cancers, accounting for 61.1% to 82.3% of false positive cases. The most frequent source of false negatives was cases not attended in hospital settings, accounting for 34.4% to 69.7% of false negative cases in the most predictive algorithm. Conclusions. HDAD might be a helpful tool for cancer registries to reach their goals. The findings suggest that, for automating cancer registries, algorithms combining diagnoses and procedures are the best option. However, for cancer surveillance purposes, in cancers such as prostate cancer in which care is not exclusively hospital-based, combining inpatient and outpatient information will be required.
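As an illustration of the validation arithmetic described above, the following minimal sketch computes sensitivity and positive predictive value for a claims-based case-finding algorithm against a registry gold standard. The case identifiers, counts, and overlap are hypothetical, not the study's data.

```python
# Minimal sketch: validating claims-derived incident cancer cases against a
# registry gold standard. Case identifiers and counts are hypothetical.

def validate(claims_cases: set, registry_cases: set) -> dict:
    """Compare cases flagged by a hospital-discharge algorithm with the
    population-based registry (gold standard)."""
    true_pos = claims_cases & registry_cases      # found by both
    false_pos = claims_cases - registry_cases     # flagged by claims only
    false_neg = registry_cases - claims_cases     # missed by claims
    sensitivity = len(true_pos) / len(registry_cases)
    ppv = len(true_pos) / len(claims_cases)
    return {"sensitivity": sensitivity, "ppv": ppv,
            "false_positives": len(false_pos), "false_negatives": len(false_neg)}

# Hypothetical example: the registry lists 2286 incident cases; the algorithm
# flags a partially overlapping set of discharge records.
registry = {f"case{i}" for i in range(2286)}
claims = {f"case{i}" for i in range(1000, 3000)}
print(validate(claims, registry))
```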
Abstract:
This paper presents the design and implementation of QRP, an open-source proof-of-concept authentication system that provides two-factor authentication by combining a password with a camera-equipped mobile phone acting as an authentication token. QRP is extremely secure, as all sensitive information stored and transmitted is encrypted, yet it is also an easy-to-use and cost-efficient solution. QRP is portable and can be used securely on untrusted computers. Finally, QRP is able to authenticate successfully even when the phone is offline.
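The abstract does not spell out the protocol, but a challenge-response exchange of the kind described, in which the phone can answer while offline, could look like the sketch below. The HMAC construction, key provisioning, and function names are illustrative assumptions, not QRP's actual design.

```python
# Illustrative challenge-response flow (not the exact QRP protocol): the server
# issues a random challenge that would be shown as a QR code, the phone answers
# with an HMAC computed from a shared secret, and the server verifies it
# alongside the user's password.
import hmac, hashlib, secrets

SHARED_SECRET = secrets.token_bytes(32)   # provisioned on the phone at enrolment

def server_issue_challenge() -> bytes:
    return secrets.token_bytes(16)        # would be encoded as a QR code on screen

def phone_respond(challenge: bytes) -> str:
    # Works offline: only the locally stored secret is needed.
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).hexdigest()

def server_verify(challenge: bytes, response: str) -> bool:
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = server_issue_challenge()
response = phone_respond(challenge)       # user types this short code back
print(server_verify(challenge, response)) # True
```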
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately three-fold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting of the two-part calibration model. Moreover, the extent of error adjustment is influenced by the number and forms of covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
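A minimal sketch of the two-part calibration idea, using synthetic data rather than EPIC: a binary model for whether any of the food was consumed on the recall day, a continuous model for the amount among consumers, and the product of the two predictions as the calibrated intake. The variable names and simulated data are assumptions for illustration only.

```python
# Two-part calibration sketch on synthetic data: part 1 models the probability
# of any consumption on a recall day, part 2 models the amount given
# consumption; calibrated intake = predicted probability * predicted amount.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(0)
n = 5000
ffq = rng.gamma(2.0, 50.0, n)                                   # self-reported intake
consumed = rng.random(n) < 1 / (1 + np.exp(-(ffq - 100) / 50))  # any intake on recall day
amount = np.where(consumed, 0.5 * ffq + rng.normal(0, 20, n), 0.0)
amount = np.clip(amount, 0, None)                               # 24-hour recall with excess zeroes

X = ffq.reshape(-1, 1)                                          # calibration covariates

# Part 1: probability of consumption on the recall day
p_model = LogisticRegression().fit(X, consumed)
p_hat = p_model.predict_proba(X)[:, 1]

# Part 2: expected amount among consumers only
a_model = LinearRegression().fit(X[consumed], amount[consumed])
a_hat = np.clip(a_model.predict(X), 0, None)

calibrated_intake = p_hat * a_hat   # would replace ffq in the disease model
print(calibrated_intake[:5])
```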
Abstract:
This document contains a report and summary of the field research activities in a rural community of rice farmers in Kampot province, Cambodia, in 2011, which I conducted within the context of my PhD research at ICTA-UAB (Institute of Environmental Science and Technology, Autonomous University of Barcelona, Spain). The purpose of the field research was to gather data for a MuSIASEM analysis (Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism) at the village and household level, in order to analyze the multidimensional challenges that small farmers may face nowadays within the context of global rural change and declining access to land. While the literature on MuSIASEM offers a great variety of theoretical explanations and practical applications, there is little information available for students regarding the practical steps required to carry out a MuSIASEM analysis at the local level. Within this context, this report not only documents the field research design and data collection methods, but also provides a general overview of organizational and preparatory aspects, including some personal reflections, that one may face when preparing and conducting field research for a MuSIASEM analysis. In summary, this document serves three objectives: (i) to ensure methodological transparency for future work based on the data collected during field research, (ii) to share my personal experience of the preparatory and practical steps required for field research and data collection for a MuSIASEM analysis at the local level, and (iii) to make more detailed background information on the case study village available to interested readers.
Abstract:
The radiation distribution function used by Domínguez and Jou [Phys. Rev. E 51, 158 (1995)] has recently been modified by Domínguez-Cascante and Faraudo [Phys. Rev. E 54, 6933 (1996)]. However, in these studies neither distribution was written in terms of directly measurable quantities. Here a solution to this problem is presented, and we also propose an experiment that may make it possible to determine the distribution function of nonequilibrium radiation experimentally. The results derived do not depend on a specific distribution function for the matter content of the system.
Abstract:
Second-order Møller-Plesset (MP2) and Becke-3-Lee-Yang-Parr (B3LYP) calculations have been used to compare the geometrical parameters, hydrogen-bonding properties, vibrational frequencies, and relative energies of several X- and X+ hydrogen peroxide complexes. The geometries and interaction energies were corrected for the basis set superposition error (BSSE) in all the complexes (1-5) using the full counterpoise method, yielding small BSSE values for the 6-311+G(3df,2p) basis set used. The calculated interaction energies correspond to medium-to-strong hydrogen-bonding systems (1-3) and strong electrostatic interactions (4 and 5). The molecular interactions have been characterized using the atoms in molecules (AIM) theory and by analysis of the vibrational frequencies. The minima on the BSSE-counterpoise-corrected potential-energy surface (PES) have been determined as described by S. Simón, M. Duran, and J. J. Dannenberg, and the results were compared with the uncorrected PES.
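For reference, the standard (Boys-Bernardi) counterpoise correction that such BSSE estimates refer to evaluates each monomer energy in the full dimer basis at the dimer geometry:

```latex
% Counterpoise-corrected interaction energy for a dimer AB: every term is
% computed at the geometry the fragments adopt in the complex, and the
% monomer energies include the ghost basis functions of the partner.
\begin{equation}
  \Delta E_{\mathrm{int}}^{\mathrm{CP}}
    = E_{AB}^{AB}(AB) \;-\; E_{A}^{AB}(AB) \;-\; E_{B}^{AB}(AB)
\end{equation}
% Superscript: basis set used; subscript: fragment computed; parentheses: geometry.
```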
Abstract:
A comparison of the local effects of the basis set superposition error (BSSE) on the electron densities and energy components of three representative H-bonded complexes was carried out. The electron densities were obtained with Hartree-Fock and density functional theory versions of the chemical Hamiltonian approach (CHA) methodology. It was shown that the effects of the BSSE were common to all complexes studied. The electron density difference maps and the chemical energy component analysis (CECA) confirmed that the local effects of the BSSE were different when diffuse functions were present in the calculations.
Abstract:
For single-user MIMO communication with uncoded and coded QAM signals, we propose bit and power loading schemes that rely only on channel distribution information at the transmitter. To that end, we develop the relationship between the average bit error probability at the output of a ZF linear receiver and the bit rates and powers allocated at the transmitter. This relationship, and the fact that a ZF receiver decouples the MIMO parallel channels, allow leveraging bit loading algorithms already existing in the literature. We solve dual bit rate maximization and power minimization problems and present performance results that illustrate the gains of the proposed scheme with respect to a non-optimized transmission.
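Since the ZF receiver decouples the MIMO channel into parallel subchannels, existing bit loading algorithms can be applied per subchannel. The sketch below shows a generic greedy (Hughes-Hartogs-style) loading of that kind; the SNR gap, channel gains, and bit budget are illustrative assumptions, and this is not necessarily the paper's exact algorithm.

```python
# Greedy bit loading over the parallel subchannels exposed by a ZF receiver:
# each bit goes to the subchannel whose next bit costs the least extra power.
import math

def greedy_bit_loading(gains, target_bits, snr_gap=6.0, max_bits_per_ch=8):
    """gains: per-subchannel gain-to-noise ratios (linear scale).
    Allocates target_bits one at a time and returns (bits, powers)."""
    bits = [0] * len(gains)
    powers = [0.0] * len(gains)
    for _ in range(target_bits):
        best, best_cost = None, math.inf
        for i, g in enumerate(gains):
            if bits[i] >= max_bits_per_ch:
                continue
            # Power for b bits of QAM approx. (2**b - 1) * snr_gap / g,
            # so the incremental cost of one more bit is the difference.
            cost = ((2 ** (bits[i] + 1) - 1) - (2 ** bits[i] - 1)) * snr_gap / g
            if cost < best_cost:
                best, best_cost = i, cost
        bits[best] += 1
        powers[best] += best_cost
    return bits, powers

bits, powers = greedy_bit_loading(gains=[3.2, 1.1, 0.4, 2.5], target_bits=10)
print(bits, [round(p, 2) for p in powers])
```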
Abstract:
In the assessment of medical malpractice, imaging methods can be used to document crucial morphological findings that argue for or against an iatrogenically caused injury. The clarification of deaths in this context can be usefully supported by postmortem imaging (primarily native computed tomography, angiography, and magnetic resonance imaging). Postmortem imaging offers significant additional information compared with autopsy in the detection of iatrogenic air embolisms and in the documentation of misplaced medical devices before dissection, which carries an inherent risk of dislocating them. Postmortem imaging also supplies additional information in the search for sources of bleeding and in the documentation of perfusion after cardiovascular surgery. Key criteria for the decision to perform postmortem imaging can be obtained from the necessary preliminary inspection of the clinical documentation.
Abstract:
Drawing on Social Representations Theory, this study investigates focalisation and anchoring during the diffusion of information concerning the Large Hadron Collider (LHC), the particle accelerator at the European Organisation for Nuclear Research (CERN). We hypothesised that people focus on striking elements of the message while abandoning others, that the nature of the initial information affects the diffusion of information, and that information is anchored in prior attitudes toward CERN and science. A serial reproduction experiment with two generations and four chains of reproduction, diffusing controversial versus descriptive information about the LHC, shows a reduction of information through generations, the persistence of terminology regarding the controversy, and a decrease in other elements for participants exposed to polemical information. Concerning anchoring, positive attitudes toward CERN and science increase the use of expert terminology unrelated to the controversy. This research highlights the relevance of a social representational approach to the public understanding of science.
Abstract:
Returns to scale to capital and the strength of capital externalities play a key role in the empirical predictions and policy implications of different growth theories. We show that both can be identified with individual wage data, and we implement our approach at the city level using US Census data on individuals in 173 cities for 1970, 1980, and 1990. Estimation takes into account fixed effects, endogeneity of capital accumulation, and measurement error. We find no evidence of human or physical capital externalities, but we do find decreasing aggregate returns to capital: returns to scale to physical and human capital are around 80 percent. We also find strong complementarities between human capital and labor, and substantial total employment externalities.
Abstract:
We argue that the main barrier to an integrated international interbankmarket is the existence of asymmetric information between differentcountries, which may prevail in spite of monetary integration or successfulcurrency pegging. In order to address this issue, we study the scope forinternational interbank market integration with unsecured lending whencross-country information is noisy. We find not only that an equilibriumwith integrated markets need not always exist, but also that when it does,the integrated equilibrium may coexist with one of interbank marketsegmentation. Therefore, market deregulation, per se, does not guaranteethe emergence of an integrated interbank market. The effect of a repo marketwhich, a priori, was supposed to improve efficiency happens to be morecomplex: it reduces interest rate spreads and improves upon the segmentationequilibrium, but\ it may destroy the unsecured integrated equilibrium, sincethe repo market will attract the best borrowers. The introduction of othertransnational institutional arrangements, such as multinational banking,correspondent banking and the existence of "too-big-to-fail" banks mayreduce cross country interest spreads and provide more insurance againstcountry wide liquidity shocks. Still, multinational banking, as theintroduction of repos, may threaten the integrated interbank marketequilibrium.
Abstract:
Small sample properties are of fundamental interest when only limited data is available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by the lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses.
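As a small illustration of the quantity being compared (not of the game-theoretic minimization itself), the sketch below estimates by Monte Carlo the type II error of a simple one-sided test for a difference between two means with outcomes bounded in [0, 1]. The test statistic, sample size, and alternative are illustrative assumptions, not the paper's construction.

```python
# Monte Carlo estimate of the type II error of a fixed-sample, one-sided test
# for a difference between two means with outcomes bounded in [0, 1].
import numpy as np

rng = np.random.default_rng(1)

def type_ii_error(n=30, delta=0.15, z_crit=1.645, reps=20_000):
    """P(fail to reject H0: equal means) when the true gap in means is `delta`."""
    misses = 0
    for _ in range(reps):
        x = rng.beta(2, 2, n)                        # mean 0.5, support [0, 1]
        m = 0.5 + delta                              # alternative mean for group two
        y = rng.beta(4 * m, 4 * (1 - m), n)          # mean m, support [0, 1]
        se = np.sqrt(x.var(ddof=1) / n + y.var(ddof=1) / n)
        if (y.mean() - x.mean()) / se <= z_crit:     # one-sided z-test at ~5% level
            misses += 1
    return misses / reps

print(type_ii_error())
```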
Abstract:
We introduce two ways of comparing information structures, say ${\cal I}$ and ${\cal J}$. First, we say that ${\cal I}$ is richer than ${\cal J}$ when, for every compact game $G$, all correlated equilibrium distributions of $G$ induced by ${\cal J}$ are also induced by ${\cal I}$. Second, we say that ${\cal J}$ is faithfully reproducible from ${\cal I}$ when all the players can compute from their information in ${\cal I}$ ``new information'' that they could have received from ${\cal J}$. We prove that ${\cal I}$ is richer than ${\cal J}$ if and only if ${\cal J}$ is faithfully reproducible from ${\cal I}$.