969 results for Point Data


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Identification of a Primary Care Physician (PCP) by older patients is considered essential for the coordination of care, but the extent to which identified PCPs are general practitioners or specialists is unknown. This study described older patients' experiences with their PCP and tested the hypothesis of differences between patients who identify a specialist as their PCP (SP PCP) and those who turn to a general practitioner (GP PCP). METHODS: In 2012, a cross-sectional postal survey on care was conducted in the population aged 68 years and over of the canton of Vaud. Data were provided by 2,276 participants in the ongoing Lausanne cohort 65+ (Lc65+), a study of those born between 1934 and 1943, and by 998 persons from an additional sample drawn to include the population outside Lausanne or born before 1934. RESULTS: Participants expressed favourable perceptions, at rates exceeding 75% for most items. However, only 38% to 51% responded positively regarding out-of-hours availability, easy access and home visits, the likelihood of prescribing expensive medication if needed, and doctors' awareness of over-the-counter drugs. 12.0% had an SP PCP; of these, 95.9% were specialised in a discipline implying training in internal medicine. Bivariate and multivariate analyses showed no significant differences between the GP and SP PCP groups regarding perceptions of accessibility/availability, the doctor-patient relationship, information and continuity of care, prevention, spontaneous use of the emergency department, or ambulatory care utilisation. CONCLUSIONS: Older patients' experiences were mostly positive, despite some gaps in reported hearing and memory testing and colorectal cancer screening. We found no differences between the GP and SP PCP groups.


BACKGROUND & AIMS: Recently, genetic variation in MICA (lead single nucleotide polymorphism [SNP] rs2596542) was identified by a genome-wide association study (GWAS) as associated with hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) in Japanese patients. In the present study, we sought to determine whether this SNP is also predictive of HCC development in the Caucasian population. METHODS: An extended region around rs2596542 was genotyped in 1,924 HCV-infected patients from the Swiss Hepatitis C Cohort Study (SCCS). Pair-wise correlation between key SNPs was calculated in both the Japanese and European populations (HapMap3: CEU and JPT). RESULTS: Surprisingly, the minor allele A of rs2596542 in proximity to MICA appeared to have a protective impact on HCC development in Caucasians, an inverse association compared to the one observed in the Japanese population. Detailed fine-mapping analyses revealed a new SNP in HCP5 (rs2244546) upstream of MICA as a strong predictor of HCV-related HCC in the SCCS (univariable p = 0.027; multivariable p = 0.0002, odds ratio = 3.96, 95% confidence interval = 1.90-8.27). This newly identified SNP had a similarly directed effect on HCC in both the Caucasian and Japanese populations, suggesting that rs2244546 may tag a putative true variant better than the originally identified SNPs. CONCLUSIONS: Our data confirm the MICA/HCP5 region as a susceptibility locus for HCV-related HCC and identify rs2244546 in HCP5 as a novel tagging SNP. In addition, our data exemplify the need for conducting meta-analyses of cohorts of different ethnicities in order to fine-map GWAS signals.


Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
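The gradual-deformation perturbation mentioned above admits a very compact sketch: two independent Gaussian realizations are combined with weights cos(theta) and sin(theta), so the result honours the same multi-Gaussian prior for every theta, while a small theta yields only a gentle perturbation of the current model. A minimal illustration, using a plain uncorrelated Gaussian vector in place of a geostatistically simulated field (all names and numbers are illustrative, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(42)

# Two independent standard-normal "model realizations"
m1 = rng.standard_normal(10_000)
m2 = rng.standard_normal(10_000)

def gradual_deformation(m_current, m_fresh, theta):
    """Combine two realizations; because cos^2 + sin^2 = 1, the result
    is again standard normal for any theta, so the prior statistics are
    preserved while theta tunes the perturbation strength."""
    return m_current * np.cos(theta) + m_fresh * np.sin(theta)

m_new = gradual_deformation(m1, m2, 0.1)        # small theta => gentle proposal
print(round(m_new.std(), 2))                    # stays ~1.0
print(round(np.corrcoef(m1, m_new)[0, 1], 3))   # close to cos(0.1)
```

In an MCMC context, each proposal would draw a fresh unconditional realization and a step size theta; accepting or rejecting with the usual Metropolis rule then targets the Bayesian posterior.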


Sex allocation data in eusocial Hymenoptera (ants, bees and wasps) provide an excellent opportunity to assess the effectiveness of kin selection, because queens and workers differ in their relatedness to females and males. The first studies on sex allocation in eusocial Hymenoptera compared population sex investment ratios across species. Female-biased investment in monogyne (= with single-queen colonies) populations of ants suggested that workers manipulate sex allocation according to their higher relatedness to females than males (relatedness asymmetry). However, several factors may confound these comparisons across species. First, variation in relatedness asymmetry is typically associated with major changes in breeding system and life history that may also affect sex allocation. Secondly, the relative cost of females and males is difficult to estimate across sexually dimorphic taxa, such as ants. Thirdly, each species in the comparison may not represent an independent data point, because of phylogenetic relationships among species. Recently, stronger evidence that workers control sex allocation has been provided by intraspecific studies of sex ratio variation across colonies. In several species of eusocial Hymenoptera, colonies with high relatedness asymmetry produced mostly females, in contrast to colonies with low relatedness asymmetry which produced mostly males. Additional signs of worker control were found by investigating proximate mechanisms of sex ratio manipulation in ants and wasps. However, worker control is not always effective, and further manipulative experiments will be needed to disentangle the multiple evolutionary factors and processes affecting sex allocation in eusocial Hymenoptera.


The PHA coordinated Northern Ireland's participation in ECDC's National Prevalence Survey on Hospital-Acquired Infections and on Antimicrobial Use. Hospitals in Northern Ireland participated in data collection between May and June 2012. This report provides a snapshot of the levels of hospital-acquired infections (HAI) and of antimicrobial use (AMU) in hospitals in Northern Ireland during 2012. There have been three previous HAI point prevalence surveys (PPS); the last was carried out in 2006. The surveys are difficult to compare directly because the data were collected in different ways, but after making allowances there was an overall drop in HAI prevalence of 18% from 2006 to 2012. The PPS data collection was undertaken by hospital teams between May and June 2012 (one hospital deferred data collection until September 2012 because of a move to a new hospital); 16 hospitals surveyed 3,992 eligible patients. The median age of all patients was 66 years. A total of 383 (10%) children under 16 years of age were surveyed. Key results from this survey:
The prevalence of HAI was 4.2%: 166 patients were diagnosed with an active HAI, with 3 patients having more than one infection.
Comparing ward specialties, HAI prevalence was highest for patients in adult intensive care units (ICUs) at 9.1%, followed by care of the elderly wards at 5.7%.
The most common types of HAI were respiratory infections (including pneumonia and infections of the lower respiratory tract; 27.9% of all infections), surgical site infections (18.9%) and urinary tract infections (UTI; 11.8%).
Since the last PPS in 2006 there has been a reduction in MRSA infections, from 0.9% of the hospital population to less than 0.1%, and a five-fold reduction in C. difficile infections (from 1.1% to 0.2%).
The prevalence of antimicrobial use was 29.5%. Most antibiotic use (60%) in hospitals was in patients receiving treatment for infections which commenced in the community. Eleven per cent of surgical prophylaxis was prescribed for longer than one day.


High-precision U-Pb zircon and Ar-40/Ar-39 mica geochronological data on metagranodiorites, metagranites and mica schists from north and central Evia island (Greece) are presented in this study. U-Pb zircon ages range from 308 to 1912 Ma and indicate prolonged magmatic activity in the Late Carboniferous; the Proterozoic ages represent inherited cores within younger crystals. Muscovite Ar-40/Ar-39 plateau ages of 288 to 297 Ma are interpreted as cooling ages of the magmatic bodies and metamorphic host rocks under upper greenschist to epidote-amphibolite metamorphic conditions. The multistage magmatism had a duration between 308 and 319 Ma, but some older intrusions, as well as metamorphic events, cannot be excluded. Geochemical analyses and zircon typology indicate calc-alkaline affinities for the granites of central Evia and alkaline to calc-alkaline characteristics for the metagranodiorites from the northern part of the island. The new data point towards the SE continuation, into Evia and the Cyclades, of a Variscan continental crust already recognised in northern Greece (Pelagonian basement). The Late Carboniferous magmatism is viewed as a result of northward subduction of the Paleotethys under the Eurasian margin.


It is well known that dichotomizing continuous data decreases statistical power when the goal is to test for a statistical association between two variables. Modern researchers, however, focus not only on statistical significance but also on an estimate of the "effect size" (i.e., the strength of association between the variables) to judge whether a significant association is also clinically relevant. In this article, we are interested in the consequences of dichotomizing continuous data for the value of an effect size in some classical settings. It turns out that the conclusions differ depending on whether a correlation or an odds ratio is used to summarize the strength of association: whereas the value of a correlation is typically decreased by a factor of pi/2 after each dichotomization, the value of an odds ratio is at the same time raised to the power 2. From a descriptive statistical point of view, it is thus not clear whether dichotomizing continuous data leads to a decrease or to an increase in the effect size, as illustrated using a data set investigating the relationship between motor and intellectual functions in children and adolescents.
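Both effects are easy to reproduce numerically. For a bivariate normal sample dichotomized at both medians, the resulting (phi) correlation is (2/pi)*arcsin(rho), i.e. markedly smaller than rho, while the odds ratio computed from the same 2x2 table moves well away from 1. A simulation sketch (the sample size and rho are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.5, 200_000

# Bivariate normal sample with correlation rho
x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T
r_cont = np.corrcoef(x, y)[0, 1]

# Dichotomize both variables at their (true) medians
xb, yb = (x > 0).astype(int), (y > 0).astype(int)
r_dich = np.corrcoef(xb, yb)[0, 1]   # approx (2/pi)*arcsin(rho) ~ 0.33

# Odds ratio from the resulting 2x2 table
a = np.sum((xb == 1) & (yb == 1))
b = np.sum((xb == 1) & (yb == 0))
c = np.sum((xb == 0) & (yb == 1))
d = np.sum((xb == 0) & (yb == 0))
odds_ratio = (a * d) / (b * c)       # approx 4 here, far from "no effect"

print(round(r_cont, 2), round(r_dich, 2), round(odds_ratio, 1))
```

So the same dichotomization that shrinks the correlation from 0.5 to about 0.33 produces an odds ratio of about 4, illustrating the article's point that the two effect-size summaries move in opposite directions.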


Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
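The simplex operations this line of work relies on (closure, perturbation, and the centred logratio transform used when studying covariance structure) can be sketched in a few lines; the three-part composition below is invented for illustration:

```python
import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (map into the unit simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred logratio transform: log of each part over the geometric mean."""
    lx = np.log(closure(x))
    return lx - lx.mean()

def perturb(x, p):
    """Perturbation, the group operation of the simplex: componentwise product, reclosed."""
    return closure(np.asarray(x, dtype=float) * np.asarray(p, dtype=float))

comp = closure([48.5, 30.2, 21.3])   # hypothetical 3-part major-oxide composition
change = closure([1.2, 1.0, 0.8])    # a perturbational change
print(clr(comp))                      # clr coefficients always sum to 0
print(perturb(comp, change))          # still a composition summing to 1
```

The clr coefficients summing to zero is exactly the constraint that makes standard multivariate tests need the "special construction" the paper mentions.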


PURPOSE: Studies of diffuse large B-cell lymphoma (DLBCL) are typically evaluated by using a time-to-event approach with relapse, re-treatment, and death commonly used as the events. We evaluated the timing and type of events in newly diagnosed DLBCL and compared patient outcome with reference population data. PATIENTS AND METHODS: Patients with newly diagnosed DLBCL treated with immunochemotherapy were prospectively enrolled onto the University of Iowa/Mayo Clinic Specialized Program of Research Excellence Molecular Epidemiology Resource (MER) and the North Central Cancer Treatment Group NCCTG-N0489 clinical trial from 2002 to 2009. Patient outcomes were evaluated at diagnosis and in the subsets of patients achieving event-free status at 12 months (EFS12) and 24 months (EFS24) from diagnosis. Overall survival was compared with age- and sex-matched population data. Results were replicated in an external validation cohort from the Groupe d'Etude des Lymphomes de l'Adulte (GELA) Lymphome Non Hodgkinien 2003 (LNH2003) program and a registry based in Lyon, France. RESULTS: In all, 767 patients with newly diagnosed DLBCL who had a median age of 63 years were enrolled onto the MER and NCCTG studies. At a median follow-up of 60 months (range, 8 to 116 months), 299 patients had an event and 210 patients had died. Patients achieving EFS24 had an overall survival equivalent to that of the age- and sex-matched general population (standardized mortality ratio [SMR], 1.18; P = .25). This result was confirmed in 820 patients from the GELA study and registry in Lyon (SMR, 1.09; P = .71). Simulation studies showed that EFS24 has comparable power to continuous EFS when evaluating clinical trials in DLBCL. CONCLUSION: Patients with DLBCL who achieve EFS24 have a subsequent overall survival equivalent to that of the age- and sex-matched general population. EFS24 will be useful in patient counseling and should be considered as an end point for future studies of newly diagnosed DLBCL.


Objectives. To study the utility of the Mini-Cog test for detection of patients with cognitive impairment (CI) in primary care (PC). Methods. We pooled data from two phase III studies conducted in Spain. Patients with complaints or suspicion of CI were consecutively recruited by PC physicians. The cognitive diagnosis was performed by an expert neurologist, after formal neuropsychological evaluation. The Mini-Cog score was calculated post hoc, and its diagnostic utility was evaluated and compared with the utility of the Mini-Mental State (MMS), the Clock Drawing Test (CDT), and the sum of the MMS and the CDT (MMS + CDT) using the area under the receiver operating characteristic curve (AUC). The best cut points were obtained on the basis of diagnostic accuracy (DA) and kappa index. Results. A total sample of 307 subjects (176 CI) was analyzed. The Mini-Cog displayed an AUC (±SE) of 0.78 ± 0.02, which was significantly inferior to the AUC of the CDT (0.84 ± 0.02), the MMS (0.84 ± 0.02), and the MMS + CDT (0.86 ± 0.02). The best cut point of the Mini-Cog was 1/2 (sensitivity 0.60, specificity 0.90, DA 0.73, and kappa index 0.48 ± 0.05). Conclusions. The utility of the Mini-Cog for detection of CI in PC was very modest, clearly inferior to the MMS or the CDT. These results do not permit recommendation of the Mini-Cog in PC.
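The cut-point figures reported above fit together arithmetically; with a 2x2 table reconstructed approximately from the reported sensitivity (0.60 of 176 impaired patients) and specificity (0.90 of the remaining 131), the stated diagnostic accuracy of 0.73 falls out directly. The exact counts below are hypothetical, not taken from the paper:

```python
# Approximate 2x2 table at the Mini-Cog 1/2 cut point, reconstructed from
# the reported sensitivity and specificity (hypothetical counts)
tp, fn = 106, 70    # cognitively impaired patients: test-positive / test-negative
tn, fp = 118, 13    # non-impaired patients: test-negative / test-positive

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
diagnostic_accuracy = (tp + tn) / (tp + fn + tn + fp)
print(round(sensitivity, 2), round(specificity, 2), round(diagnostic_accuracy, 2))
```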


Spatial data on species distributions are available in two main forms, point locations and distribution maps (polygon ranges and grids). The first are often temporally and spatially biased, and too discontinuous, to be useful (untransformed) in spatial analyses. A variety of modelling approaches are used to transform point locations into maps. We discuss the attributes that point location data and distribution maps must satisfy in order to be useful in conservation planning. We recommend that before point location data are used to produce and/or evaluate distribution models, the dataset should be assessed under a set of criteria, including sample size, age of data, environmental/geographical coverage, independence, accuracy, time relevance and (often forgotten) representation of areas of permanent and natural presence of the species. Distribution maps must satisfy additional attributes if used for conservation analyses and strategies, including minimizing commission and omission errors, credibility of the source/assessors and availability for public screening. We review currently available databases for mammals globally and show that they are highly variable in complying with these attributes. The heterogeneity and weakness of spatial data seriously constrain their utility to global and also sub-global scale conservation analyses.


As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts, and thus the metric properties, should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
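The multiplicative replacement just described has a one-line core: rounded zeros become a small value delta, and the non-zero parts are shrunk multiplicatively by the total mass handed to the zeros, which leaves all ratios among non-zero parts (and hence subcompositional covariance structure) untouched. A sketch for a composition closed to 1 (the delta below is an arbitrary illustrative choice, not a recommendation):

```python
import numpy as np

def multiplicative_replacement(x, delta):
    """Replace zeros in a composition (summing to 1) by delta and rescale
    the non-zero parts multiplicatively so the result still sums to 1.
    Sketch of the multiplicative strategy; delta is the user-chosen small
    value (often tied to the detection limit)."""
    x = np.asarray(x, dtype=float)
    zeros = x == 0
    r = x.copy()
    r[zeros] = delta
    r[~zeros] = x[~zeros] * (1 - delta * zeros.sum())
    return r

comp = np.array([0.6, 0.3, 0.1, 0.0])       # one rounded zero
r = multiplicative_replacement(comp, 0.005)
print(r, r.sum())                           # sums to 1 again
print(r[0] / r[1], comp[0] / comp[1])       # ratios of non-zero parts preserved
```

Preserving those ratios is exactly the property the additive replacement lacks, which is why the multiplicative version keeps the covariance structure of zero-free subcompositions intact.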


Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. 
Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
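The Bayesian MCMC machinery used in such studies can be illustrated with a deliberately tiny toy problem: a single unknown parameter controls radar velocity through a made-up petrophysical link, and a Metropolis sampler recovers it from noisy ZOP traveltimes. Everything below (geometry, the velocity-parameter link, the noise level) is invented for illustration and is not the study's actual forward model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model: a ZOP traveltime is borehole separation / radar velocity,
# where theta stands in for a hydraulic (VGM-type) parameter that controls
# water content and hence velocity.
distance = 5.0                        # m between boreholes (hypothetical)

def forward(theta):
    velocity = 0.06 + 0.04 * theta    # m/ns, hypothetical petrophysical link
    return distance / velocity        # traveltime in ns

theta_true, sigma = 0.5, 0.5          # truth and noise std (ns)
data = forward(theta_true) + sigma * rng.standard_normal(20)

def log_post(theta):
    if not 0.0 < theta < 1.0:         # uniform prior on (0, 1)
        return -np.inf
    resid = data - forward(theta)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Plain Metropolis random walk
theta, samples = 0.2, []
lp = log_post(theta)
for _ in range(20_000):
    prop = theta + 0.05 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5_000:])      # discard burn-in
print(round(post.mean(), 2))          # posterior mean close to theta_true
```

The spread of `post` is the posterior uncertainty; the structural-error caveat in the abstract is precisely that this spread is only trustworthy insofar as the forward model itself is adequate.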


HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know if this variance is controlled by the genotype of the host or that of the virus because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check for the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking into account the transmission chain when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.


Members of the human APOBEC3 family of editing enzymes can inhibit various mobile genetic elements. APOBEC3A (A3A) can block the retrotransposon LINE-1 and the parvovirus adeno-associated virus type 2 (AAV-2) but does not inhibit retroviruses. In contrast, APOBEC3G (A3G) can block retroviruses but has only limited effects on AAV-2 or LINE-1. What dictates this differential target specificity remains largely undefined. Here, we modeled the structure of A3A based on its homology with the C-terminal domain of A3G and further compared the sequence of human A3A to those of 11 nonhuman primate orthologues. We then used these data to perform a mutational analysis of A3A, examining its ability to restrict LINE-1, AAV-2, and foreign plasmid DNA and to edit a single-stranded DNA substrate. The results revealed an essential functional role for the predicted single-stranded DNA-docking groove located around the A3A catalytic site. Within this region, amino acid differences between A3A and A3G are predicted to affect the shape of the polynucleotide-binding groove. Correspondingly, transferring some of these A3A residues to A3G endows the latter protein with the ability to block LINE-1 and AAV-2. These results suggest that the target specificity of APOBEC3 family members is partly defined by structural features influencing their interaction with polynucleotide substrates.