844 results for Divergence estimation


Relevance: 20.00%

Abstract:

Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. In a great variety of situations, the WML substantially improves on the initial estimator, both in terms of finite-sample mean squared error and in terms of bias under contamination. Moreover, the performance of the WML remains rather stable when the MDE is changed, even though the MDEs themselves behave very differently. Two applications of the WML to real data are considered; in both, the need for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
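A minimal sketch of the two-phase idea for a Poisson model: a minimum Hellinger distance estimate serves as the initial robust fit, observations that are implausible under that fit receive zero weight, and a weighted MLE is then computed. The Poisson model, the Hellinger disparity, and the hard probability cutoff are illustrative assumptions; the adaptive weighting scheme of the thesis is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def hellinger_mde(x):
    """Initial robust fit: minimum Hellinger distance estimate for a Poisson model."""
    support = np.arange(x.max() + 1)
    emp = np.bincount(x, minlength=support.size) / x.size
    def disparity(lam):
        return np.sum((np.sqrt(emp) - np.sqrt(poisson.pmf(support, lam))) ** 2)
    return minimize_scalar(disparity, bounds=(1e-6, x.max() + 1.0), method="bounded").x

def weighted_mle(x, lam0, cutoff=1e-3):
    """Second phase: give zero weight to observations implausible under the initial fit,
    then compute the weighted Poisson MLE (closed form: weighted mean)."""
    w = (poisson.pmf(x, lam0) >= cutoff).astype(float)  # hard 0/1 weights (simplification)
    return np.sum(w * x) / np.sum(w)

rng = np.random.default_rng(0)
x = np.concatenate([rng.poisson(3.0, 95), np.full(5, 25)])  # 5 gross outliers
lam0 = hellinger_mde(x)
print(lam0, weighted_mle(x, lam0), x.mean())  # x.mean() is the corrupted plain MLE
```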

Relevance: 20.00%

Abstract:

Genetic diversity of contemporary domesticated species is shaped by both natural and human-driven processes. However, little is yet known about how domestication has shaped the variation of fruit tree species. In this study, we reconstruct the recent evolutionary history of the domesticated almond tree, Prunus dulcis, around the Mediterranean basin, using a combination of nuclear and chloroplast microsatellites [i.e. simple sequence repeats (SSRs)] to investigate patterns of genetic diversity. Whereas conservative chloroplast SSRs show one widespread haplotype and rare, locally distributed variants, nuclear SSRs show a pattern of isolation by distance, with clines of diversity from the east to the west of the Mediterranean basin, and Bayesian genetic clustering reveals substantial longitudinal genetic structure. Both kinds of markers thus support a single domestication event on the eastern side of the Mediterranean basin. In addition, model-based estimation places the timing of genetic divergence among these clusters within the Holocene, a result compatible with human-mediated dispersal of the almond tree out of its centre of origin. Still, the detection of region-specific alleles suggests that gene flow from relictual wild preglacial populations (in North Africa) or from wild counterparts (in the Near East) could account for a fraction of the diversity observed.

Relevance: 20.00%

Abstract:

A new formula for glomerular filtration rate estimation in the pediatric population from 2 to 18 years of age has been developed by the University Unit of Pediatric Nephrology. This Quadratic formula, accessible online, allows pediatricians to adjust drug dosages and/or follow up renal function more precisely and easily.

Relevance: 20.00%

Abstract:

Salmonid populations of many rivers are rapidly declining. One possible explanation is that habitat fragmentation increases genetic drift and reduces the populations' potential to adapt to changing environmental conditions. We measured the genetic and eco-morphological diversity of brown trout (Salmo trutta) in a Swiss stream system, using multivariate statistics and Bayesian clustering. We found large genetic and phenotypic variation within only 40 km of stream length. Eighty-eight percent of all pairwise F(ST) comparisons and 50% of the population comparisons in body shape were significant. High success rates of population assignment tests confirmed the distinctiveness of populations in both genotype and phenotype. Spatial analysis revealed that divergence increased with waterway distance, the number of weirs, and stretches of poor habitat between sampling locations, but effects of isolation-by-distance and habitat fragmentation could not be fully disentangled. Stocking intensity varied between streams but did not appear to erode genetic diversity within populations. A lack of association between phenotypic and genetic divergence points to a role of local adaptation or phenotypically plastic responses to habitat heterogeneity. Indeed, body shape could be largely explained by topographic stream slope, and variation in overall phenotype matched the flow regimes of the respective habitats.

Relevance: 20.00%

Abstract:

Comparison of donor-acceptor electronic couplings calculated within two-state and three-state models suggests that the two-state treatment can provide unreliable estimates of V_da because it neglects multistate effects. We show that in most cases accurate values of the electronic coupling in a π stack, where donor and acceptor are separated by a bridging unit, can be obtained as \(\tilde{V}_{da} = \frac{(E_2 - E_1)\,\mu_{12}}{R_{da}} + \frac{(2E_3 - E_1 - E_2)\,2\,\mu_{13}\,\mu_{23}}{R_{da}^{2}}\), where E_1, E_2, and E_3 are the adiabatic energies of the ground, charge-transfer, and bridge states, respectively, μ_ij is the transition dipole moment between states i and j, and R_da is the distance between the planes of the donor and acceptor. In this expression, based on the generalized Mulliken-Hush approach, the first term corresponds to the coupling derived within a two-state model, whereas the second term is the superexchange correction accounting for the bridge effect. The formula is extended to bridges consisting of several subunits. The influence of the donor-acceptor energy mismatch on the excess charge distribution, adiabatic dipole and transition moments, and electronic couplings is examined. A diagnostic is developed to determine whether the two-state approach can be applied. Based on numerical results, we show that the superexchange correction considerably improves estimates of the donor-acceptor coupling derived within a two-state approach. In most cases where the two-state scheme fails, the formula gives reliable results that are in good agreement (within 5%) with the data of the three-state generalized Mulliken-Hush model.
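A minimal numerical sketch of the corrected coupling formula above. The energies, transition dipoles, and donor-acceptor distance below are illustrative placeholders (atomic units assumed), not values from the paper.

```python
def coupling_two_state(E1, E2, mu12, R_da):
    """Two-state generalized Mulliken-Hush estimate (first term of the formula)."""
    return (E2 - E1) * mu12 / R_da

def coupling_corrected(E1, E2, E3, mu12, mu13, mu23, R_da):
    """Two-state estimate plus the superexchange (bridge) correction."""
    correction = (2 * E3 - E1 - E2) * 2 * mu13 * mu23 / R_da**2
    return coupling_two_state(E1, E2, mu12, R_da) + correction

# Illustrative values only (atomic units assumed).
E1, E2, E3 = 0.00, 0.02, 0.15        # adiabatic energies: ground, charge-transfer, bridge
mu12, mu13, mu23 = 0.5, 1.2, 1.1     # transition dipole moments
R_da = 12.8                          # donor-acceptor plane separation

print(coupling_two_state(E1, E2, mu12, R_da))
print(coupling_corrected(E1, E2, E3, mu12, mu13, mu23, R_da))
```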

Relevance: 20.00%

Abstract:

The clinical demand for a device to monitor Blood Pressure (BP) in ambulatory scenarios with minimal use of inflation cuffs is increasing. Based on the so-called Pulse Wave Velocity (PWV) principle, this paper introduces and evaluates a novel concept of BP monitor that can be fully integrated within a chest sensor. After a preliminary calibration, the sensor provides non-occlusive beat-by-beat estimations of Mean Arterial Pressure (MAP) by measuring the Pulse Transit Time (PTT) of arterial pressure pulses travelling from the ascending aorta towards the subcutaneous vasculature of the chest. In a cohort of 15 healthy male subjects, a total of 462 simultaneous readings consisting of reference MAP and chest PTT were acquired. Each subject was recorded on three different days: D, D+3 and D+14. Overall, the implemented protocol induced MAP values ranging from 80 ± 6 mmHg at baseline to 107 ± 9 mmHg during isometric handgrip maneuvers. Agreement between reference and chest-sensor MAP values was tested using the intraclass correlation coefficient (ICC = 0.78) and Bland-Altman analysis (mean error = 0.7 mmHg, standard deviation = 5.1 mmHg). The cumulative percentage of MAP values provided by the chest sensor falling within ±5 mmHg of the reference MAP readings was 70%, within ±10 mmHg it was 91%, and within ±15 mmHg it was 98%. These results indicate that the chest sensor complies with the British Hypertension Society (BHS) requirements for Grade A BP monitors when applied to MAP readings. Grade A performance was maintained even two weeks after the initial subject-dependent calibration. In conclusion, this paper introduces a sensor and a calibration strategy to perform MAP measurements at the chest. The encouraging performance of the presented technique paves the way towards an ambulatory-compliant, continuous and non-occlusive BP monitoring system.
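The abstract does not give the PTT-to-MAP conversion used by the sensor; the sketch below assumes a commonly used inverse-PTT calibration model, MAP ≈ a/PTT + b, fitted per subject from a few cuff readings. The model form, function names, and numbers are illustrative assumptions only.

```python
import numpy as np

def fit_calibration(ptt_s, map_mmhg):
    """Fit the assumed model MAP = a / PTT + b by ordinary least squares."""
    A = np.column_stack([1.0 / np.asarray(ptt_s), np.ones(len(ptt_s))])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(map_mmhg, dtype=float), rcond=None)
    return a, b

def estimate_map(ptt_s, a, b):
    """Beat-by-beat MAP estimate from chest pulse transit time."""
    return a / np.asarray(ptt_s) + b

# Hypothetical per-subject calibration: cuff MAP (mmHg) vs. measured PTT (seconds).
a, b = fit_calibration([0.085, 0.080, 0.072], [82, 90, 105])
print(estimate_map([0.083, 0.075], a, b))  # MAP estimates for two new beats
```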

Relevance: 20.00%

Abstract:

Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are given primary focus because they allow the scientist to combine prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics, such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
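A minimal illustration of the kind of Bayesian parameter estimation discussed: a Beta prior on the probability of background GSR presence is updated with survey counts via the standard conjugate rule. The prior and the counts are invented for illustration and are not taken from the paper.

```python
from scipy import stats

# Assumed Beta prior on the proportion of the population carrying background GSR particles.
alpha_prior, beta_prior = 1.0, 9.0     # weak prior belief that background presence is rare

# Hypothetical survey data: 3 of 120 sampled individuals showed background GSR particles.
k, n = 3, 120

# Conjugate update: posterior is Beta(alpha + k, beta + n - k).
posterior = stats.beta(alpha_prior + k, beta_prior + (n - k))
print(posterior.mean())                # posterior estimate of the background rate
print(posterior.interval(0.95))        # 95% credible interval
```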

Relevance: 20.00%

Abstract:

Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
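For readers unfamiliar with the model class, the sketch below writes out the S-system rate law and fits its parameters to a noisy time course by a naive bounded least-squares fit on a two-variable toy system. This generic fit stands in for, and is not, the Newton-flow approach described in the abstract; on real data it would typically need tighter bounds or regularization.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def s_system(x, t, alpha, g, beta, h):
    """S-system rate law: dx_i/dt = alpha_i * prod_j x_j**g_ij - beta_i * prod_j x_j**h_ij."""
    x = np.maximum(x, 1e-9)                       # keep the power laws well defined
    return alpha * np.prod(x**g, axis=1) - beta * np.prod(x**h, axis=1)

def unpack(p, n):
    alpha, beta = p[:n], p[n:2 * n]
    g = p[2 * n:2 * n + n * n].reshape(n, n)
    h = p[2 * n + n * n:].reshape(n, n)
    return alpha, g, beta, h

def residuals(p, t, data):
    traj = odeint(s_system, data[0], t, args=unpack(p, data.shape[1]))
    return (traj - data).ravel()

# Toy two-variable S-system and a synthetic noisy time course.
t = np.linspace(0, 5, 40)
true = np.r_[[2.0, 1.5], [1.0, 1.2], [0.0, -0.5, 0.4, 0.0], [0.6, 0.0, 0.0, 0.7]]
data = odeint(s_system, [0.5, 1.0], t, args=unpack(true, 2))
data += np.random.default_rng(1).normal(0, 0.01, data.shape)

fit = least_squares(residuals, x0=np.r_[np.ones(4), np.zeros(8)],
                    bounds=(-3, 3), args=(t, data))
print(fit.x.round(2))                             # estimated rate constants and kinetic orders
```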

Relevance: 20.00%

Abstract:

To date, state-of-the-art seismic material parameter estimates from multi-component sea-bed seismic data are based on the assumption that the sea-bed consists of a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments that are characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations of the parameter estimation by their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, while the corresponding Q-estimates seem to be biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this is expected to be of interest and importance for civil and ocean engineering purposes.
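A minimal sketch of the substitution mentioned above, assuming the common constant-Q convention in which a real elastic modulus M is replaced by the complex modulus M(1 + i/Q). The sediment properties below are placeholders, not values from the study.

```python
import numpy as np

def viscoelastic_modulus(M, Q):
    """Replace a real elastic modulus by its complex viscoelastic equivalent, M * (1 + i/Q)."""
    return M * (1.0 + 1j / Q)

# Placeholder soft-sediment properties: S-wave modulus (Pa), density (kg/m^3), quality factor.
mu, rho, Q_s = 4.0e7, 1800.0, 8.0

mu_visc = viscoelastic_modulus(mu, Q_s)
vs_complex = np.sqrt(mu_visc / rho)   # complex S-wave velocity
print(vs_complex)                     # the imaginary part encodes the attenuation
```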

Relevance: 20.00%

Abstract:

A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data, and application is made to the yeast chromosome III sequence and to C. elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
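A minimal sketch of the decomposition step, assuming the per-window coding statistic has already been computed: a two-component Gaussian mixture (fitted here with scikit-learn rather than the paper's own procedure) is fitted to the window scores, and the mixing weight of the high-scoring component estimates the coding fraction. The scores below are simulated placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated window scores standing in for a real coding statistic
# (e.g. codon-usage or hexamer scores computed over fixed-size windows).
rng = np.random.default_rng(0)
scores = np.concatenate([
    rng.normal(-1.0, 0.8, 700),   # noncoding-like windows
    rng.normal(1.5, 0.6, 300),    # coding-like windows
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(scores)
coding_component = np.argmax(gmm.means_.ravel())   # the component with the higher mean
coding_fraction = gmm.weights_[coding_component]
print(round(coding_fraction, 3))                   # estimated fraction of coding windows
```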

Relevance: 20.00%

Abstract:

The epithelial sodium channel (ENaC) regulates sodium reabsorption in the principal cells of the collecting duct of the nephron and is essential for the maintenance of Na+ balance and blood pressure. ENaC is regulated by hormones such as aldosterone and vasopressin, and by serine proteases. The functional ENaC channel expressed at the cell surface is a heteromultimeric complex composed of the homologous α, β and γ subunits. Several functional and biochemical studies have provided evidence that ENaC is a heterotetramer formed by 2α, 1β and 1γ subunits. Recently, a channel homologue of ENaC, the acid-sensing ion channel ASIC1, was crystallized as a homotrimer. This discrepancy in the subunit composition of these two channels of the same family motivated us to revisit the subunit oligomerization of the purified functional αβγ ENaC channel complex. His6-tagged ENaC α, β and γ subunits were expressed in Xenopus laevis oocytes. The three ENaC subunits co-purify on Ni2+-NTA agarose beads as an αβγ ENaC complex. On Western blot, the ENaC subunits show the typical post-translational modifications associated with a functional channel. Using differentially tagged ENaC subunits, we could demonstrate that two different α ENaC subunits co-purify with the β and γ subunits, whereas only a single β and a single γ are detected in the ENaC complex. Comparison of the mass of the αβγ ENaC complex on Western blot under non-reducing conditions with different dimeric, trimeric and tetrameric ENaC concatemers indicates that the ENaC channel complex is a heterotetramer made of 2α, 1β and 1γ ENaC subunits. Our results will certainly not provide the last word on the subunit stoichiometry of the ENaC/ASIC channels, but will hopefully prompt a re-evaluation of the cASIC1 crystal structure with respect to its functional relevance.

Relevance: 20.00%

Abstract:

Human arteries affected by atherosclerosis are characterized by altered wall viscoelastic properties. The possibility of noninvasively assessing arterial viscoelasticity in vivo would significantly contribute to the early diagnosis and prevention of this disease. This paper presents a noniterative technique to estimate the viscoelastic parameters of a vascular wall Zener model. The approach requires the simultaneous measurement of flow variations and wall displacements, which can be provided by suitable ultrasound Doppler instruments. Viscoelastic parameters are estimated by fitting the theoretical constitutive equations to the experimental measurements using an ARMA parameter approach. The accuracy and sensitivity of the proposed method are tested using reference data generated by numerical simulations of arterial pulsation, in which the physiological conditions and the viscoelastic parameters of the model can be suitably varied. The estimated values quantitatively agree with the reference values, showing that the only parameter affected by changing the physiological conditions is viscosity, whose relative error was about 27% even when a poor signal-to-noise ratio was simulated. Finally, the feasibility of the method is illustrated through three measurements made under different flow regimes on a cylindrical vessel phantom, yielding a mean parameter estimation error of 25%.
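A simplified sketch of fitting a Zener (standard linear solid) model to sampled stress and strain waveforms; it uses a plain linear least-squares rearrangement of the constitutive relation rather than the paper's ARMA formulation, and the waveforms and parameter values are invented for illustration.

```python
import numpy as np

def fit_zener(t, strain, stress):
    """Fit the standard-linear-solid (Zener) relation
         stress + tau * d(stress)/dt = E0 * strain + E1 * d(strain)/dt
       by linear least squares on sampled waveforms."""
    d_strain = np.gradient(strain, t)
    d_stress = np.gradient(stress, t)
    A = np.column_stack([strain, d_strain, -d_stress])
    (E0, E1, tau), *_ = np.linalg.lstsq(A, stress, rcond=None)
    return E0, E1, tau

# Synthetic pulsatile waveforms standing in for wall strain and transmural pressure.
t = np.linspace(0, 1, 500)
strain = 0.05 * (1 + np.sin(2 * np.pi * t))
E0_true, E1_true, tau_true = 8e4, 2e3, 0.05
stress = np.zeros_like(t)
for i in range(1, len(t)):                     # forward-Euler integration of the Zener ODE
    dt = t[i] - t[i - 1]
    d_strain = (strain[i] - strain[i - 1]) / dt
    d_stress = (E0_true * strain[i - 1] + E1_true * d_strain - stress[i - 1]) / tau_true
    stress[i] = stress[i - 1] + dt * d_stress

print(fit_zener(t, strain, stress))            # should recover values close to the true ones
```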

Relevance: 20.00%

Abstract:

This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome, as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
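A minimal 1D sketch of the forward Eulerian integration step described above: a time-varying velocity field (an analytic stand-in rather than a sum of B-spline kernels) is integrated to recover the displacement of material points. Purely illustrative; the field and step sizes are assumptions.

```python
import numpy as np

def velocity(x, t):
    """Non-stationary velocity field; a smooth analytic stand-in for the
    sum of spatiotemporal B-spline kernels used by TDFFD."""
    return 0.5 * np.sin(2 * np.pi * t) * np.exp(-x**2)

def integrate_displacement(x0, t_grid):
    """Forward Euler integration of the velocity field: the displacement of each
    material point is the accumulated motion along its trajectory."""
    x = np.array(x0, dtype=float)
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        x = x + (t1 - t0) * velocity(x, t0)
    return x - np.asarray(x0)                  # displacement field at the final time

points = np.linspace(-1.0, 1.0, 5)             # material points at the reference frame
print(integrate_displacement(points, np.linspace(0.0, 0.5, 100)))
```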