916 results for Bayes Estimator
Abstract:
The R-package “compositions” is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, now containing facilities to work with and represent ilr bases built from balances, and an elaborate subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a “regular” subcomposition (where all parts are actually observed and the datum behaves typically) and a “problematic” subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset.
Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
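As an aside (not code from the package itself: “compositions” is an R package, and this sketch is in Python with hypothetical toy data), one ilr “balance” coordinate of the kind the package works with can be written as:

```python
import numpy as np

def balance(x, num_idx, den_idx):
    """One ilr 'balance' coordinate contrasting two groups of parts:
    b = sqrt(r*s/(r+s)) * ln( g(x[num]) / g(x[den]) ),
    where g() is the geometric mean and r, s are the group sizes."""
    r, s = len(num_idx), len(den_idx)
    g_num = np.exp(np.mean(np.log(x[num_idx])))
    g_den = np.exp(np.mean(np.log(x[den_idx])))
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

x = np.array([0.2, 0.5, 0.3])       # a hypothetical 3-part composition
print(balance(x, [0, 1], [2]))      # balance of parts {1,2} against {3}
```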
Abstract:
A joint distribution of two discrete random variables with finite support can be displayed as a two-way table of probabilities adding to one. Assume that this table has n rows and m columns and all probabilities are non-null. This kind of table can be seen as an element in the simplex of n · m parts. In this context, the marginals are identified as compositional amalgams, and conditionals (rows or columns) as subcompositions. Also, simplicial perturbation appears as Bayes' theorem. However, the Euclidean elements of the Aitchison geometry of the simplex can also be translated into the table of probabilities: subspaces, orthogonal projections, distances. Two important questions are addressed: a) given a table of probabilities, which is the nearest independent table to the initial one? b) which is the largest orthogonal projection of a row onto a column? or, equivalently, which is the information in a row explained by a column, thus explaining the interaction? To answer these questions three orthogonal decompositions are presented: (1) by columns and a row-wise geometric marginal, (2) by rows and a column-wise geometric marginal, (3) by independent two-way tables and fully dependent tables representing row-column interaction. An important result is that the nearest independent table is the product of the two (row and column)-wise geometric marginal tables. A corollary is that, in an independent table, the geometric marginals conform with the traditional (arithmetic) marginals. These decompositions can be compared with standard log-linear models.
Key words: balance, compositional data, simplex, Aitchison geometry, composition, orthonormal basis, arithmetic and geometric marginals, amalgam, dependence measure, contingency table
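A minimal numerical sketch of the quoted result about the nearest independent table, using a hypothetical 2 × 3 table of probabilities (Python, not from the paper):

```python
import numpy as np

def closure(t):
    """Rescale a positive array so its entries sum to one."""
    return t / t.sum()

def nearest_independent_table(P):
    """Product of the row-wise and column-wise geometric marginals,
    closed to sum 1 (the result quoted in the abstract)."""
    gm_rows = np.exp(np.log(P).mean(axis=1))   # geometric mean of each row
    gm_cols = np.exp(np.log(P).mean(axis=0))   # geometric mean of each column
    return closure(np.outer(gm_rows, gm_cols))

P = closure(np.array([[0.20, 0.10, 0.05],
                      [0.05, 0.30, 0.30]]))
print(nearest_independent_table(P))
```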
Abstract:
BACKGROUND Challenges exist in the clinical diagnosis of drug-induced liver injury (DILI) and in obtaining information on hepatotoxicity in humans. OBJECTIVE (i) To develop a unified list that combines drugs incriminated in well-vetted or adjudicated DILI cases from many recognized sources and drugs that have been subjected to serious regulatory actions due to hepatotoxicity; and (ii) to supplement the drug list with data on reporting frequencies of liver events in the WHO individual case safety report database (VigiBase). DATA SOURCES AND EXTRACTION (i) Drugs identified as causes of DILI at three major DILI registries; (ii) drugs identified as causes of drug-induced acute liver failure (ALF) in six different data sources, including major ALF registries and previously published ALF studies; and (iii) drugs identified as being subjected to serious governmental regulatory actions due to their hepatotoxicity in Europe or the US were collected. The reporting frequency of adverse events was determined using VigiBase, computed as the Empirical Bayes Geometric Mean (EBGM) with 90% confidence interval for two customized terms, 'overall liver injury' and 'ALF'. An EBGM of ≥2 was considered a disproportional increase in reporting frequency. The identified drugs were then characterized in terms of regional divergence, published case reports, serious regulatory actions, and reporting frequency of 'overall liver injury' and 'ALF' calculated from VigiBase. DATA SYNTHESIS After excluding herbs, supplements and alternative medicines, a total of 385 individual drugs were identified; 319 drugs were identified in the three DILI registries, 107 from the six ALF registries (or studies) and 47 drugs were subjected to suspension or withdrawal in the US or Europe due to their hepatotoxicity. The identified drugs varied significantly between Spain, the US and Sweden. Of the 319 drugs identified in the DILI registries of adjudicated cases, 93.4% were found in published case reports, 1.9% were suspended or withdrawn due to hepatotoxicity and 25.7% were also identified in the ALF registries/studies. In VigiBase, 30.4% of the 319 drugs were associated with disproportionally higher reporting frequency of 'overall liver injury' and 83.1% were associated with at least one reported case of ALF. CONCLUSIONS This newly developed list of drugs associated with hepatotoxicity and the multifaceted analysis on hepatotoxicity will aid in causality assessment and clinical diagnosis of DILI and will provide a basis for further characterization of hepatotoxicity.
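For orientation only: EBGM values are produced by the Multi-item Gamma Poisson Shrinker applied to VigiBase, which is not reproduced here. The sketch below (Python, hypothetical counts) computes only the raw observed/expected reporting ratio that EBGM shrinks toward 1:

```python
def relative_reporting_ratio(n_drug_event, n_drug, n_event, n_total):
    """Raw observed/expected reporting ratio, RR = N / E.

    E is the count expected if drug and event were independent in the
    report database; EBGM is a Bayesian shrinkage of this kind of ratio
    (Multi-item Gamma Poisson Shrinker), not reproduced in this sketch."""
    expected = n_drug * n_event / n_total
    return n_drug_event / expected

# Hypothetical counts: 40 reports pairing the drug with 'overall liver injury',
# 2,000 reports for the drug overall, 50,000 liver-injury reports,
# 5,000,000 reports in the database.
print(relative_reporting_ratio(40, 2_000, 50_000, 5_000_000))  # -> 2.0
```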
Abstract:
BACKGROUND AUDIPOC is a nationwide clinical audit that describes the characteristics, interventions and outcomes of patients admitted to Spanish hospitals because of an exacerbation of chronic obstructive pulmonary disease (ECOPD), assessing the compliance of these parameters with current international guidelines. The present study describes hospital resources, hospital factors related to case recruitment variability, patients' characteristics, and adherence to guidelines. METHODOLOGY/PRINCIPAL FINDINGS An organisational database was completed by all participant hospitals recording resources and organisation. Over an 8-week period 11,564 consecutive ECOPD admissions to 129 Spanish hospitals covering 70% of the Spanish population were prospectively identified. At hospital discharge, 5,178 patients (45% of eligible) were finally included, and thus constituted the audited population. Audited patients were reassessed 90 days after admission for survival and readmission rates. Wide variability was observed for most variables, for hospital adherence to guidelines, and for readmission and death rates. Median inpatient mortality was 5% (across-hospital range 0-35%). Among discharged patients, 37% required readmission (0-62%) and 6.5% died (0-35%). The overall mortality rate was 11.6% (0-50%). Hospital size and complexity and aspects related to hospital COPD awareness were significantly associated with case recruitment. Clinical management most often complied with diagnosis and treatment recommendations but rarely (<50%) addressed guidance on healthy life-styles. CONCLUSIONS/SIGNIFICANCE The AUDIPOC study highlights the large across-hospital variability in resources and organization of hospitals, patient characteristics, process of care, and outcomes. The study also identifies resources and organizational characteristics associated with the admission of COPD cases, as well as aspects of daily clinical care amenable to improvement.
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single and fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help to avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution for a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued since quantitative information such as peak area and height data are not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is the number of contributors, and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
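A rough sketch of the probabilistic option, under strong simplifying assumptions (alleles drawn independently from population frequencies, no drop-out or drop-in, no subpopulation correction); the locus, allele labels, frequencies and prior are hypothetical and the code is not the authors' implementation:

```python
from itertools import combinations

def prob_exact_allele_set(freqs, n_contributors):
    """P(the 2N alleles drawn i.i.d. show exactly this set of alleles),
    via inclusion-exclusion over subsets of the observed set."""
    k = 2 * n_contributors
    alleles = list(freqs)
    total = 0.0
    for r in range(len(alleles) + 1):
        for subset in combinations(alleles, r):
            p = sum(freqs[a] for a in subset)
            total += (-1) ** (len(alleles) - r) * p ** k
    return total

def posterior_n(freqs_by_locus, candidate_ns, prior):
    """Posterior over the number of contributors N (qualitative model:
    allele presence only, no peak heights), via Bayes' theorem."""
    post = {}
    for n in candidate_ns:
        like = 1.0
        for freqs in freqs_by_locus:
            like *= prob_exact_allele_set(freqs, n)
        post[n] = prior[n] * like
    z = sum(post.values())
    return {n: p / z for n, p in post.items()}

# Hypothetical single-locus example: three observed alleles with these
# population frequencies; uniform prior over N in {2, 3, 4}.
freqs_by_locus = [{"a": 0.10, "b": 0.25, "c": 0.15}]
prior = {2: 1 / 3, 3: 1 / 3, 4: 1 / 3}
print(posterior_n(freqs_by_locus, [2, 3, 4], prior))
```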
Abstract:
The phylogeny and phylogeography of the Old World wood mice (subgenus Sylvaemus, genus Apodemus, Muridae) are well-documented. Nevertheless, the distributions of species, such as A. fulvipectus and A. ponticus remain dubious, as well as their phylogenetic relationships with A. sylvaticus. We analysed samples of Apodemus spp. across Europe using the mitochondrial cytochrome-b gene (cyt-b) and compared the DNA and amino-acid compositions of previously published sequences. The main result stemming from this study is the presence of a well-differentiated lineage of Sylvaemus including samples of various species (A. sylvaticus, A. fulvipectus, A. ponticus) from distant locations, which were revealed to be nuclear copies of the mitochondrial cyt-b. The presence of this cryptic pseudogene in published sequences is supported by different pathways. This has led to important errors in previous molecular trees and hence to partial misinterpretations in the phylogeny of Apodemus.
Abstract:
Background: Glutathione (GSH) dysregulation at the gene, protein and functional levels observed in schizophrenia patients, and schizophrenia-like anomalies in GSH-deficit experimental models, suggest that genetic glutathione synthesis impairments represent one major risk factor for the disease (Do et al., 2009). In a randomized, double-blind, placebo-controlled, add-on clinical trial of 140 patients, the GSH precursor N-Acetyl-Cysteine (NAC, 2 g/day, 6 months) significantly improved the negative symptoms and reduced side effects due to antipsychotics (Berk et al., 2008). In a subset of patients (n=7), NAC (2 g/day, 2 months, cross-over design) also improved auditory evoked potentials, the NMDA-dependent mismatch negativity (Lavoie et al., 2008). Methods: To determine whether increased GSH levels would modulate the topography of functional brain connectivity, we applied a multivariate phase synchronization (MPS) estimator (Knyazeva et al., 2008) to dense-array EEGs recorded during rest with eyes closed at the protocol onset, the point of crossover, and at its end. Results: The whole-head imaging revealed a specific synchronization landscape in the NAC compared to the placebo condition. In particular, NAC increased MPS over frontal and left temporal regions in a frequency-specific manner. The topography and direction of MPS changes were similar and robust in all 7 patients. Moreover, these changes correlated with the changes in Liddle's disorganization score, thus linking EEG synchronization to the improvement of the clinical picture. Conclusions: The data suggest an important pathway towards new therapeutic strategies that target GSH dysregulation in schizophrenia. They also show the utility of MPS mapping as a marker of treatment efficacy.
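For illustration only, a generic multichannel phase synchronization index (instantaneous phases from the Hilbert transform, mean resultant length across channels) can be sketched as below; this is not the specific MPS estimator of Knyazeva et al. (2008), and the signals are simulated:

```python
import numpy as np
from scipy.signal import hilbert

def phase_sync_index(signals):
    """Generic multichannel phase synchronization index in [0, 1]:
    time-averaged resultant length of the across-channel phase vectors.

    signals: (n_channels, n_samples) array of band-passed EEG."""
    phases = np.angle(hilbert(signals, axis=1))               # instantaneous phases
    resultant = np.abs(np.mean(np.exp(1j * phases), axis=0))  # per time point
    return resultant.mean()

# Simulated demo: three noisy channels sharing a 10 Hz component.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 250)                                  # 2 s at 250 Hz
common = np.sin(2 * np.pi * 10 * t)
signals = np.vstack([common + 0.3 * rng.standard_normal(t.size) for _ in range(3)])
print(phase_sync_index(signals))
```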
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006) but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
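A minimal sketch of the grid-search idea behind a two-regime threshold correlation fit (Python, simulated data; not the authors' estimator or code):

```python
import numpy as np

def tcc_fit(z, s, grid):
    """Two-regime threshold correlation sketch: pick the threshold on the
    transition variable s that maximizes a Gaussian likelihood, with each
    regime's correlation matrix given by the sample correlation matrix.

    z:    (T, k) standardized returns (unit variance),
    s:    (T,) observable transition variable,
    grid: candidate thresholds (e.g. quantiles of s)."""
    def loglik(block, R):
        Rinv = np.linalg.inv(R)
        _, logdet = np.linalg.slogdet(R)
        quad = np.einsum("ti,ij,tj->t", block, Rinv, block)
        return -0.5 * (len(block) * logdet + quad.sum())

    best = None
    for c in grid:
        lo, hi = z[s <= c], z[s > c]
        if min(len(lo), len(hi)) < z.shape[1] + 1:
            continue                     # need enough observations per regime
        ll = loglik(lo, np.corrcoef(lo.T)) + loglik(hi, np.corrcoef(hi.T))
        if best is None or ll > best[0]:
            best = (ll, c, np.corrcoef(lo.T), np.corrcoef(hi.T))
    return best                          # (loglik, threshold, R_low, R_high)

rng = np.random.default_rng(0)
z = rng.standard_normal((500, 3))        # hypothetical standardized returns
s = rng.standard_normal(500)             # e.g. a lagged volatility proxy
ll, c, R_lo, R_hi = tcc_fit(z, s, np.quantile(s, np.linspace(0.2, 0.8, 13)))
print("chosen threshold:", round(c, 3))
```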
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
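A bare-bones sketch of the second kind of estimator described above (ilr coordinates plus a normal kernel with a full bandwidth matrix), with hypothetical compositions and bandwidth; it returns the density of the ilr scores and omits the Jacobian needed to express it on the simplex:

```python
import numpy as np
from scipy.stats import multivariate_normal

def ilr_basis(D):
    """Orthonormal (D, D-1) contrast matrix for a default ilr basis."""
    V = np.zeros((D, D - 1))
    for j in range(1, D):
        V[:j, j - 1] = 1.0 / j
        V[j, j - 1] = -1.0
        V[:, j - 1] *= np.sqrt(j / (j + 1))
    return V

def ilr(X, V):
    clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)
    return clr @ V

def kde_ilr(X, H):
    """KDE on ilr coordinates with a full bandwidth matrix H; the value
    returned is the density of the ilr scores (Jacobian to the simplex omitted)."""
    V = ilr_basis(X.shape[1])
    Y = ilr(X, V)
    def density(x_new):
        y = ilr(np.atleast_2d(x_new), V)[0]
        return np.mean([multivariate_normal(mean=yi, cov=H).pdf(y) for yi in Y])
    return density

# Hypothetical 3-part compositions and a (non-diagonal) bandwidth matrix.
X = np.array([[0.2, 0.5, 0.3], [0.3, 0.4, 0.3], [0.1, 0.6, 0.3], [0.25, 0.45, 0.30]])
H = np.array([[0.05, 0.01], [0.01, 0.04]])
f = kde_ilr(X, H)
print(f([0.2, 0.5, 0.3]))
```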
Abstract:
BACKGROUND The objective of this research was to evaluate data from a randomized clinical trial that tested injectable diacetylmorphine (DAM) and oral methadone (MMT) for substitution treatment, using a multi-domain dichotomous index, with a Bayesian approach. METHODS Sixty-two long-term, socially excluded heroin injectors not benefiting from available treatments were randomized to receive either DAM or MMT for 9 months in Granada, Spain. Forty-four participants completed the study, and end-of-study data were obtained for 50. Participants were determined to be responders or non-responders using a multi-domain outcome index accounting for their physical and mental health and psychosocial integration, used in a previous trial. Data were analyzed with Bayesian methods, using information from a similar study conducted in The Netherlands to select a priori distributions. On adding the data from the present study to update the a priori information, the distribution of the difference in response rates was obtained and used to build credibility intervals and relevant probability computations. RESULTS In the experimental group (n = 27), the rate of responders to treatment was 70.4% (95% CI 53.2–87.6), and in the control group (n = 23), it was 34.8% (95% CI 15.3–54.3). The probability of success in the experimental group using the a posteriori distributions was higher after a proper sensitivity analysis. Almost the whole distribution of the rate difference (diacetylmorphine minus methadone) was located to the right of zero, indicating the superiority of the experimental treatment. CONCLUSION The present analysis suggests a clinical superiority of injectable diacetylmorphine compared to oral methadone in the treatment of severely affected heroin injectors not benefiting sufficiently from the available treatments. TRIAL REGISTRATION Current Controlled Trials ISRCTN52023186.
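The Beta-binomial update behind such an analysis can be sketched as follows; the prior parameters standing in for the Dutch trial are hypothetical, while the responder counts are those reported above (Python, Monte Carlo):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Beta priors standing in for the earlier Dutch trial
# (the actual a priori parameters used in the paper are not given here).
prior_dam = (12, 8)              # diacetylmorphine arm
prior_mmt = (8, 12)              # methadone arm

# Responder counts reported in the abstract (responders / analyzed).
dam_resp, dam_n = 19, 27         # ~70.4 %
mmt_resp, mmt_n = 8, 23          # ~34.8 %

# Conjugate update: Beta(a + successes, b + failures), then Monte Carlo draws.
post_dam = rng.beta(prior_dam[0] + dam_resp, prior_dam[1] + dam_n - dam_resp, 100_000)
post_mmt = rng.beta(prior_mmt[0] + mmt_resp, prior_mmt[1] + mmt_n - mmt_resp, 100_000)

diff = post_dam - post_mmt
print("95% credibility interval for the rate difference:", np.percentile(diff, [2.5, 97.5]))
print("P(DAM better than MMT):", (diff > 0).mean())
```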
Abstract:
The aim of this work is to present the multicentric project AMCAC, whose objective is to describe the geographical distribution of mortality from all causes in the census groups of the provincial capitals of Andalusia and Catalonia during 1992-2002 and 1994-2000, respectively, and to study the relationship between the sociodemographic characteristics of the census groups and mortality. This is an ecological study in which the analytical unit is the census group. The data correspond to 298,731 individuals (152,913 men and 145,818 women) who died during the study periods in the towns of Almeria, Barcelona, Cadiz, Cordoba, Girona, Granada, Huelva, Jaen, Lleida, Malaga, Seville and Tarragona. The dependent variable is the number of deaths observed per census group. The independent variables are the percentages of unemployment, illiteracy and manual workers. The smoothed relative risk will be estimated, and the associations between the sociodemographic characteristics of the census groups and mortality will be studied, for each town and each sex using the Besag-York-Mollié model. Dissemination of the results will help to improve and broaden knowledge about the population's health, and will provide an important starting point for establishing the influence of contextual variables on the health of urban populations.
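The Besag-York-Mollié model mentioned above is commonly written as follows (a standard formulation, not necessarily the exact specification adopted by the project):

```latex
O_i \sim \mathrm{Poisson}(E_i \,\theta_i), \qquad
\log \theta_i = \alpha + \boldsymbol{\beta}^{\top}\mathbf{x}_i + u_i + v_i ,
```

where O_i and E_i are the observed and expected deaths in census group i, x_i collects the sociodemographic covariates (unemployment, illiteracy, manual workers), u_i is a spatially structured (conditional autoregressive) random effect, v_i is unstructured heterogeneity, and theta_i is the smoothed relative risk.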
Abstract:
A variety of behavioural traits have substantial effects on the gene dynamics and genetic structure of local populations. The mating system is a plastic trait that varies with environmental conditions in the domestic cat (Felis catus), allowing an intraspecific comparison of the impact of this feature on the genetic characteristics of the population. To assess the potential effect of the heterogeneity of males' contribution to the next generation on variance effective size, we applied the ecological approach of Nunney & Elam (1994), based upon a demographic and behavioural study, and the genetic 'temporal methods' of Waples (1989) and Berthier et al. (2002) using microsatellite markers. The two cat populations studied were nearly closed, similar in size and survival parameters, but differed in their mating system. Immigration appeared extremely restricted in both cases due to environmental and social constraints. As expected, the ratio of effective size to census number (Ne/N) was higher in the promiscuous cat population (harmonic mean = 42%) than in the polygynous one (33%) when Ne was calculated from the ecological method. Only the genetic results based on Waples' estimator were consistent with the ecological results, but they failed to detect an effect of the mating system. Results based on the estimator of Berthier et al. (2002) were extremely variable, with Ne sometimes exceeding census size. Such low reliability in the genetic results deserves attention for conservation purposes.
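For reference, the temporal method is usually quoted in the form below (a commonly cited sampling-plan II version; the exact variant used in the study may differ):

```latex
\hat{N}_e \;=\; \frac{t}{\,2\!\left[\hat{F} - \dfrac{1}{2S_0} - \dfrac{1}{2S_t}\right]} ,
```

where F-hat is the standardized variance in allele frequencies between samples taken t generations apart, and S_0 and S_t are the two sample sizes.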
Abstract:
Reports of triatomine infestation in urban areas have increased. We analysed the spatial distribution of infestation by triatomines in the urban area of Diamantina, in the state of Minas Gerais, Brazil. Triatomines were obtained by community-based entomological surveillance. Spatial patterns of infestation were analysed by Ripley’s K function and the kernel density estimator. The normalised difference vegetation index (NDVI) and land cover derived from satellite imagery were compared between infested and uninfested areas. A total of 140 adults of four species were captured (100 Triatoma vitticeps, 25 Panstrongylus geniculatus, 8 Panstrongylus megistus, and 7 Triatoma arthurneivai specimens). In total, 87.9% were captured within domiciles. Infection by trypanosomes was observed in 19.6% of 107 examined insects. The spatial distributions of T. vitticeps, P. geniculatus, T. arthurneivai, and trypanosome-positive triatomines were clustered, occurring mainly in peripheral areas. NDVI values were statistically higher in areas infested by T. vitticeps and P. geniculatus. Buildings infested by these species were located closer to open fields, whereas infestations of P. megistus and T. arthurneivai were closer to bare soil. Human occupation and modification of natural areas may be involved in triatomine invasion, exposing the population to these vectors.
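A naive sketch of Ripley's K for a planar point pattern (no edge correction; coordinates simulated, not the study's capture data):

```python
import numpy as np

def ripley_k(points, radii, area):
    """Naive Ripley's K estimate for a 2-D point pattern (no edge correction).

    K(r) = area / (n*(n-1)) * number of ordered pairs closer than r.
    Values above pi*r^2 suggest clustering at scale r."""
    pts = np.asarray(points, float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.array([area * (d < r).sum() / (n * (n - 1)) for r in radii])

# Hypothetical capture coordinates (metres) inside a 1000 m x 1000 m window:
# one cluster plus a uniform background.
rng = np.random.default_rng(1)
cluster = rng.normal(loc=[300, 700], scale=40, size=(30, 2))
background = rng.uniform(0, 1000, size=(30, 2))
pts = np.vstack([cluster, background])
for r, k in zip([50, 100, 200], ripley_k(pts, [50, 100, 200], 1000 * 1000)):
    print(f"r={r:>3} m  K(r)={k:9.1f}  CSR reference={np.pi * r**2:9.1f}")
```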
Abstract:
A graphical processing unit (GPU) is a hardware device normally used to manipulate computer memory for the display of images. GPU computing is the practice of using a GPU device for scientific or general purpose computations that are not necessarily related to the display of images. Many problems in econometrics have a structure that allows for successful use of GPU computing. We explore two examples. The first is simple: repeated evaluation of a likelihood function at different parameter values. The second is a more complicated estimator that involves simulation and nonparametric fitting. We find speedups from 1.5 up to 55.4 times, compared to computations done on a single CPU core. These speedups can be obtained with very little expense, energy consumption, and time dedicated to system maintenance, compared to equivalent performance solutions using CPUs. Code for the examples is provided.
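The first example (repeated likelihood evaluation at different parameter values) has an embarrassingly parallel structure. The sketch below shows it in plain NumPy with a hypothetical Gaussian model; the same array code maps onto a GPU through numpy-compatible libraries such as CuPy or jax.numpy (an illustration, not the code accompanying the paper):

```python
import numpy as np

def normal_loglik_grid(data, mus, sigmas):
    """Evaluate a Gaussian log-likelihood at many (mu, sigma) pairs at once.

    Each parameter pair is independent of the others, which is exactly the
    structure that benefits from a GPU: replace `np` with a GPU array
    library exposing a numpy-like interface and the code is unchanged."""
    x = data[:, None]                         # (n_obs, 1)
    mu, sig = mus[None, :], sigmas[None, :]   # (1, n_params)
    ll = -0.5 * np.log(2 * np.pi * sig**2) - (x - mu) ** 2 / (2 * sig**2)
    return ll.sum(axis=0)                     # one log-likelihood per pair

rng = np.random.default_rng(0)
data = rng.normal(1.0, 2.0, size=10_000)      # simulated sample
mus = np.linspace(0.5, 1.5, 101)              # grid of candidate means
sigmas = np.full(101, 2.0)                    # sigma held at its true value
grid_ll = normal_loglik_grid(data, mus, sigmas)
print("best mu on the grid:", mus[grid_ll.argmax()])
```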
Abstract:
BACKGROUND Low back pain and its associated incapacitating effects constitute an important healthcare and socioeconomic problem, as well as being one of the main causes of disability among adults of working age. The prevalence of non-specific low back pain is very high among the general population, and 60-70% of adults are believed to have suffered this problem at some time. Nevertheless, few randomised clinical trials have been made of the efficacy and efficiency of acupuncture with respect to acute low back pain. The present study is intended to assess the efficacy of acupuncture for acute low back pain in terms of the improvement reported on the Roland Morris Questionnaire (RMQ) on low back pain incapacity, to estimate the specific and non-specific effects produced by the technique, and to carry out a cost-effectiveness analysis. METHODS/DESIGN Randomised four-branch controlled multicentre prospective study designed to compare semi-standardised real acupuncture, sham acupuncture (acupuncture at non-specific points), placebo acupuncture and conventional treatment. The patients are blinded to the real, sham and placebo acupuncture treatments. Patients in the sample present symptoms of non-specific acute low back pain, with a case history of 2 weeks or less, and will be selected from working-age patients, whether in paid employment or not, referred by General Practitioners from Primary Healthcare Clinics to the four clinics participating in this study. In order to assess the primary and secondary outcome measures, the patients will be requested to fill in a questionnaire before the randomisation and again at 3, 12 and 48 weeks after starting the treatment. The primary outcome measure will be the clinically relevant improvement (CRI) at 3 weeks after randomisation. We define CRI as a reduction of 35% or more in the RMQ results. DISCUSSION This study is intended to obtain further evidence on the effectiveness of acupuncture for acute low back pain and to isolate the specific and non-specific effects of the treatment.