953 results for quasi-likelihood
Abstract:
Dynamic morphological transitions in thin-layer electrodeposits obtained from copper sulphate solutions have been studied. The chemical composition of the electrodeposits indicates that the transitions appear as a consequence of the competition between copper and cuprous oxide formation. In addition, Ohmic control of the process is verified at the initial stages of deposit growth. As the deposit develops further, gravity-induced convection currents play a role in the control of the whole process and affect the position of these transitions.
Abstract:
The influence of an inert electrolyte (sodium sulfate) on quasi-two-dimensional copper electrodeposition from a nondeaerated aqueous copper sulfate solution has been analyzed. The different morphologies for a fixed concentration of CuSO4 have been classified in a diagram in terms of the applied potential and the inert electrolyte concentration. The main conclusion is that the well-known Ohmic model for the homogeneous growth regime extends to copper sulfate solutions with small amounts of sodium sulfate. Moreover, we have observed the formation of fingerlike deposits at large applied potential and inert electrolyte concentration values, before hydrogen evolution becomes the main electrode reaction.
Abstract:
The present paper focuses on the analysis and discussion of a likelihood ratio (LR) development for propositions at a hierarchical level known in the context as 'offence level'. Existing literature on the topic has considered LR developments for so-called offender-to-scene transfer cases. These settings involve, in their simplest form, a single stain found on a crime scene, but with possible uncertainty about the degree to which that stain is relevant (i.e. that it has been left by the offender). Extensions to multiple stains or multiple offenders have also been reported. The purpose of this paper is to discuss a development of a LR for offence-level propositions when case settings involve potential transfer in the opposite direction, i.e. victim/scene-to-offender transfer. This setting has not previously been considered. The rationale behind the proposed LR is illustrated through graphical probability models (i.e. Bayesian networks). The role of various uncertain parameters is investigated through sensitivity analyses as well as simulations.
Abstract:
We study the response of regional employment and nominal wages to trade liberalization, exploiting the natural experiment provided by the opening of Central and Eastern European markets after the fall of the Iron Curtain in 1990. Using data for Austrian municipalities, we examine differential pre- and post-1990 wage and employment growth rates between regions bordering the formerly communist economies and interior regions. If the 'border regions' are defined narrowly, within a band of less than 50 kilometers, we can identify statistically significant liberalization effects on both employment and wages. While wages responded earlier than employment, the employment effect over the entire adjustment period is estimated to be around three times as large as the wage effect. The implied slope of the regional labor supply curve can be replicated in an economic geography model that features obstacles to labor migration due to immobile housing and to heterogeneous locational preferences.
Abstract:
Two likelihood ratio (LR) approaches are presented to evaluate the strength of evidence of MDMA tablet comparisons. The first one is based on a more 'traditional' comparison of MDMA tablets using distance measures (e.g., Pearson correlation distance or Euclidean distance). In this approach, LRs are calculated using the distribution of distances between tablets of the same batch and that between tablets of different batches. The second approach is based on methods used in some other fields of forensic comparison. Here LRs are calculated based on the distribution of values of MDMA tablet characteristics within a specific batch and across all batches. The data used in this paper must be seen as examples to illustrate both methods. In future research the methods can be applied to other and more complex data. In this paper, the methods and their results are discussed, considering their performance in evidence evaluation and several practical aspects. With respect to evidence in favor of the correct hypothesis, the second method proved to be better than the first one: the LRs in same-batch comparisons are generally higher than with the first method, and the LRs in different-batch comparisons are generally lower. On the other hand, for operational purposes (where quick information is needed), the first method may be preferred because it is less time-consuming. With this method a model has to be estimated only once in a while, which means that only a few measurements have to be done, whereas with the second method more measurements are needed because a new model has to be estimated each time.
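The first, distance-based approach can be sketched as follows. This is a minimal illustration with synthetic distance data and a kernel-density model; the distributions, seed, and tablet profiles are invented for the example and are not the paper's data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def correlation_distance(x, y):
    # Pearson correlation distance between two tablet profiles: 1 - r(x, y).
    return 1.0 - np.corrcoef(x, y)[0, 1]

# Synthetic training distances (hypothetical stand-ins for casework data):
# same-batch comparisons tend to yield small distances, different-batch larger.
same_batch_dists = np.clip(rng.normal(0.05, 0.02, 500), 0, None)
diff_batch_dists = np.clip(rng.normal(0.40, 0.10, 500), 0, None)

# Model each distance distribution with a kernel density estimate.
f_same = gaussian_kde(same_batch_dists)
f_diff = gaussian_kde(diff_batch_dists)

def likelihood_ratio(d):
    # LR = p(distance | same batch) / p(distance | different batches).
    return float(f_same(d)[0] / f_diff(d)[0])

# Two similar (hypothetical) tablet profiles give a small distance,
# so the LR supports the same-batch hypothesis.
tablet_a = rng.normal(1.0, 0.3, 10)
tablet_b = tablet_a + rng.normal(0.0, 0.02, 10)
d = correlation_distance(tablet_a, tablet_b)
print(likelihood_ratio(d) > 1.0)     # True
print(likelihood_ratio(0.45) < 1.0)  # True
```

A large observed distance would fall in the bulk of the different-batch distribution instead, pushing the LR below one.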
Abstract:
Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights in a weighted-sum fusion of score combinations, by means of a likelihood model. The maximum likelihood estimation is posed as a linear programming problem. The scores are derived from GMM classifiers, each working on a different feature extractor. Our experimental results assessed the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of scores to be fused, as does the simplex method for linear programming.
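The weighted-sum fusion itself is straightforward to sketch. The snippet below uses synthetic match scores and fixed illustrative weights; the paper's actual contribution, estimating the weights by maximum likelihood posed as a linear program, is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical match scores from two classifiers (e.g. two GMM front-ends):
# genuine trials score higher than impostor trials, with different noise levels.
genuine_1 = rng.normal(1.0, 0.6, 1000)
impostor_1 = rng.normal(0.0, 0.6, 1000)
genuine_2 = rng.normal(1.0, 0.3, 1000)
impostor_2 = rng.normal(0.0, 0.3, 1000)

def accuracy(genuine, impostor, threshold=0.5):
    # Fraction of trials classified correctly at a fixed decision threshold.
    return 0.5 * ((genuine > threshold).mean() + (impostor <= threshold).mean())

# Weighted-sum fusion: s = w1*s1 + w2*s2, with weights summing to one.
# These weights are illustrative only, chosen to favor the less noisy classifier.
w1, w2 = 0.25, 0.75
fused_gen = w1 * genuine_1 + w2 * genuine_2
fused_imp = w1 * impostor_1 + w2 * impostor_2

print("best single classifier:", accuracy(genuine_2, impostor_2))
print("weighted-sum fusion:   ", accuracy(fused_gen, fused_imp))
```

With sensible weights the fused score variance is lower than either classifier's alone, which is the mechanism the weight-estimation method exploits.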
Abstract:
An efficient procedure for the blind inversion of a nonlinear Wiener system is proposed. We show that the problem can be expressed as a problem of blind source separation in nonlinear mixtures, for which a solution has recently been proposed. Based on a quasi-nonparametric relative gradient descent, the proposed algorithm can perform efficiently even in the presence of hard distortions.
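A Wiener system is a linear filter followed by a memoryless nonlinearity, and blind inversion must recover the source from the output alone. The sketch below only simulates the forward model and undoes the (here known) nonlinear stage to make the structure concrete; the filter, nonlinearity, and signal are hypothetical, and the blind separation algorithm itself is not implemented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Forward model of a Wiener system: an unknown linear filter h followed by
# an invertible memoryless nonlinearity f. All parameters here are invented.
s = rng.uniform(-1.0, 1.0, 5000)        # i.i.d. source signal
h = np.array([1.0, 0.6, 0.2])           # linear stage (FIR filter)
x = np.convolve(s, h, mode="same")      # filtered signal
y = np.tanh(x)                          # hard memoryless distortion

# The inverse system mirrors the forward one: a nonlinearity g ~ f^{-1}
# followed by a deconvolution filter. With f known in this toy setting,
# g = arctanh, and the nonlinear stage is undone exactly:
x_hat = np.arctanh(np.clip(y, -0.999999, 0.999999))
print(np.allclose(x_hat, x, atol=1e-6))  # True
```

In the blind setting neither h nor f is known, which is why the paper recasts the inversion as nonlinear blind source separation.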
Abstract:
Well developed experimental procedures currently exist for retrieving and analyzing particle evidence from hands of individuals suspected of being associated with the discharge of a firearm. Although analytical approaches (e.g. automated Scanning Electron Microscopy with Energy Dispersive X-ray (SEM-EDS) microanalysis) allow the determination of the presence of elements typically found in gunshot residue (GSR) particles, such analyses provide no information about a given particle's actual source. Possible origins for which scientists may need to account are a primary exposure to the discharge of a firearm or a secondary transfer due to a contaminated environment. In order to approach such sources of uncertainty in the context of evidential assessment, this paper studies the construction and practical implementation of graphical probability models (i.e. Bayesian networks). These can assist forensic scientists in making the issue tractable within a probabilistic perspective. The proposed models focus on likelihood ratio calculations at various levels of detail as well as case pre-assessment.
Abstract:
The restricted maximum likelihood is preferred by many to the full maximum likelihood for estimation with variance component and other random coefficient models, because the variance estimator is unbiased. It is shown that this unbiasedness is accompanied in some balanced designs by an inflation of the mean squared error. An estimator of the cluster-level variance that is uniformly more efficient than the full maximum likelihood is derived. Estimators of the variance ratio are also studied.
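The unbiasedness/MSE trade-off can be seen already in the simplest special case, estimating the variance of an i.i.d. normal sample, where REML reduces to the usual divide-by-(n-1) estimator and full ML divides by n. This Monte Carlo sketch (with invented parameter values; the paper's random-coefficient setting is more general) illustrates the point:

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimate sigma^2 of a normal sample two ways: REML (divide by n-1, unbiased)
# versus full ML (divide by n, biased downward but with smaller MSE).
sigma2, n, reps = 4.0, 10, 200_000
samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
reml = ss / (n - 1)   # unbiased estimator
ml = ss / n           # biased estimator

def mse(est):
    # Mean squared error around the true variance.
    return ((est - sigma2) ** 2).mean()

print(abs(reml.mean() - sigma2) < abs(ml.mean() - sigma2))  # True: REML nearly unbiased
print(mse(ml) < mse(reml))                                  # True: yet ML has smaller MSE
```

Analytically, MSE(ML) = sigma^4 (2n-1)/n^2 while MSE(REML) = 2 sigma^4/(n-1), and the former is smaller for every n, which is the inflation the abstract refers to.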
Abstract:
Motivation: The comparative analysis of gene gain and loss rates is critical for understanding the role of natural selection and adaptation in shaping gene family sizes. Studying complete genome data from closely related species allows accurate estimation of gene family turnover rates. Current methods and software tools, however, are not well designed for dealing with certain kinds of functional elements, such as microRNAs or transcription factor binding sites. Results: Here, we describe BadiRate, a new software tool to estimate family turnover rates, as well as the number of elements in internal phylogenetic nodes, by likelihood-based methods and parsimony. It implements two stochastic population models, which provide the appropriate statistical framework for testing hypotheses, such as lineage-specific gene family expansions or contractions. We have assessed the accuracy of BadiRate by computer simulations, and have also illustrated its functionality by analyzing a representative empirical dataset.
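A common stochastic model for gene family turnover of the kind such tools build on is a linear birth-death process, in which each family member is independently gained (duplicated) at rate beta and lost at rate delta. The Gillespie-style sketch below uses invented rates and is not BadiRate's actual code or model; it simulates family sizes along one branch and checks them against the analytical mean n0 * exp((beta - delta) * t):

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_family_size(n0, birth, death, t_end):
    """Gillespie simulation of a linear birth-death process: each of the n
    current family members is duplicated at per-gene rate `birth` and lost
    at per-gene rate `death`, until time t_end or extinction."""
    n, t = n0, 0.0
    while n > 0:
        total_rate = n * (birth + death)
        t += rng.exponential(1.0 / total_rate)
        if t >= t_end:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

# Mean family size after time t should track n0 * exp((birth - death) * t).
n0, birth, death, t = 5, 0.3, 0.1, 2.0
sizes = [simulate_family_size(n0, birth, death, t) for _ in range(20_000)]
print(np.mean(sizes))   # close to 5 * exp(0.4), about 7.46
```

Likelihood-based turnover estimation inverts this logic: it finds the gain and loss rates that make the observed family sizes at the tips of a phylogeny most probable under such a process.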
Abstract:
This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing-error-detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér–Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB) [which is only reached by decision-directed (DD) methods]. It is shown that the CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/No.