898 results for Bayesian fusion
Abstract:
A methodology for defining favorable areas in petroleum and mineral exploration is applied, which consists of weighting the exploratory variables in order to characterize their importance as exploration guides. The exploration data are spatially integrated in the selected area to establish the association between variables and deposits, and the relationships among the distribution, topology, and indicator pattern of all variables. Two methods of statistical analysis were compared. The first is Weights of Evidence Modeling, a conditional probability approach (Agterberg, 1989a), and the second is Principal Components Analysis (Pan, 1993). In the conditional method, the favorability estimate is based on the probability of joint occurrence of deposit and variable, with the weights defined as natural logarithms of likelihood ratios. In the multivariate analysis, the cells that contain deposits are selected as control cells, and the weights are determined by eigendecomposition, being given by the coefficients of the eigenvector associated with the system's largest eigenvalue. The two weighting techniques and complementary procedures were tested in two case studies: 1. the Recôncavo Basin, Northeast Brazil (for petroleum), and 2. the Itaiacoca Formation of the Ribeira Belt, Southeast Brazil (for Pb-Zn Mississippi Valley Type deposits). The applied methodology proved easy to use and of great assistance in predicting favorability over large areas, particularly in the initial phase of exploration programs. © 1998 International Association for Mathematical Geology.
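As a concrete illustration of the conditional method, the sketch below computes the weights of evidence for one binary exploration variable from unit-cell counts, with the weights taken as natural logarithms of likelihood ratios as described above. The function name and the counts are hypothetical, not taken from the paper.

```python
import math

def weights_of_evidence(n_cells, n_deposit, n_pattern, n_overlap):
    """Weights of evidence for one binary exploration variable.

    n_cells   -- total unit cells in the study area
    n_deposit -- cells containing a known deposit
    n_pattern -- cells where the indicator pattern is present
    n_overlap -- cells with both the pattern and a deposit
    """
    # Conditional probabilities of the pattern given deposit / no deposit
    p_b_d  = n_overlap / n_deposit
    p_b_nd = (n_pattern - n_overlap) / (n_cells - n_deposit)
    w_plus  = math.log(p_b_d / p_b_nd)               # weight where pattern present
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))   # weight where pattern absent
    contrast = w_plus - w_minus                      # overall spatial association
    return w_plus, w_minus, contrast

# Hypothetical counts: 10,000 cells, 50 deposits, pattern covers 1,200 cells,
# 30 of which host deposits.
print(weights_of_evidence(10_000, 50, 1_200, 30))
```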
Abstract:
The advent of molecular markers has created opportunities for a better understanding of quantitative inheritance and for developing novel strategies for genetic improvement of agricultural species, using information on quantitative trait loci (QTL). A QTL analysis relies on accurate genetic marker maps. At present, most statistical methods used for map construction ignore the fact that molecular data may be read with error. Often, however, there is ambiguity about some marker genotypes. A Bayesian MCMC approach for inferences about a genetic marker map when random miscoding of genotypes occurs is presented, and simulated and real data sets are analyzed. The results suggest that unless there is strong reason to believe that genotypes are ascertained without error, the proposed approach provides more reliable inference on the genetic map.
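To make the miscoding idea concrete, here is a minimal sketch assuming a single error rate shared across markers: an observed genotype equals the true one with probability 1 - eps and is read as some other code otherwise. With the true genotypes known (here simulated), the conjugate Beta update for eps is immediate; the full method instead treats the true genotypes as latent and samples them, together with the map, by MCMC. All names and numbers are illustrative.

```python
import random

random.seed(1)
eps_true, codes = 0.05, ("AA", "Aa", "aa")
truth = [random.choice(codes) for _ in range(500)]
# Miscoding model: keep the true code with prob. 1 - eps, otherwise misread
observed = [g if random.random() > eps_true
            else random.choice([c for c in codes if c != g])
            for g in truth]

mismatches = sum(o != t for o, t in zip(observed, truth))
a, b = 1 + mismatches, 1 + len(truth) - mismatches   # Beta(1, 1) prior
print(f"posterior mean miscoding rate: {a / (a + b):.3f}")
```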
Abstract:
In this article we describe a feature extraction algorithm for pattern classification based on Bayesian decision boundaries and pruning techniques. The proposed method is capable of optimizing MLP neural classifiers by retaining only those neurons in the hidden layer that really contribute to correct classification. We also propose a method that defines a plausible number of neurons in the hidden layer based on stem-and-leaf plots of the training samples. Experimental investigation reveals the efficiency of the proposed method. © 2002 IEEE.
Abstract:
Linear mixed effects models have been widely used in the analysis of data where responses are clustered around some random effects, so that it is not reasonable to assume independence between observations in the same cluster. In most biological applications, it is assumed that the distributions of the random effects and of the residuals are Gaussian, which makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions for robust inference are described. Specific distributions examined include univariate and multivariate versions of the Student-t, the slash, and the contaminated normal. A Bayesian framework is adopted, and Markov chain Monte Carlo is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
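The data-augmentation trick the last sentence refers to has a compact form in the Student-t case: each residual is normal with variance sigma^2 / w_i, and the latent weight w_i has a Gamma full conditional that downweights outliers. A minimal sketch of one Gibbs weight update, with nu and sigma^2 fixed at illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
nu, sigma2 = 4.0, 1.0
residuals = rng.standard_t(df=nu, size=10) * np.sqrt(sigma2)
residuals[0] = 8.0                      # plant one gross outlier

# One Gibbs update of the mixing weights given the current residuals:
# w_i | e_i ~ Gamma((nu + 1)/2, rate = (nu + e_i^2 / sigma2)/2)
w = rng.gamma(shape=(nu + 1) / 2, scale=2.0 / (nu + residuals**2 / sigma2))
print(np.round(w, 2))   # the planted outlier typically gets the smallest weight
```

Small weights flag suspect observations, which is how these models double as outlier-detection devices.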
Abstract:
Samples with a composition of 40InF3-20ZnF2-5MCl-xBaF2-ySrF2, where M = Na, Li and x + y = 35 mol%, were prepared. The thermal properties related to the Ba/Sr ratio and to the remaining chlorine content in the glasses were studied. Thermal stability is improved by the addition of chlorine; however, the chlorine concentration is regulated by the sublimation of indium fluorides, which takes place at about 600°C. Indium fluorides are formed during glass fusion. The mechanisms of chlorine sublimation were studied. © 2005 Akadémiai Kiadó, Budapest.
Abstract:
Membrane fusion is an essential step in the entry of enveloped viruses into their host cells, triggered by conformational changes in viral glycoproteins. We have demonstrated previously that modification of vesicular stomatitis virus (VSV) with diethylpyrocarbonate (DEPC) abolished the conformational changes in the VSV glycoprotein and the fusion reaction catalyzed by the virus. In the present study, we evaluated whether treatment with DEPC was able to inactivate the virus. Infectivity and viral replication were abolished by viral treatment with 0.5 mM DEPC. The mortality profile and inflammatory response in the central nervous system indicated that G protein modification with DEPC eliminates the ability of the virus to cause disease. In addition, DEPC treatment did not alter the conformational integrity of the surface proteins of inactivated VSV, as demonstrated by transmission electron microscopy and competitive ELISA. Taken together, our results suggest a potential use of histidine (His) modification for the development of a new process of viral inactivation based on fusion inhibition. © 2006 Elsevier B.V. All rights reserved.
Abstract:
Biometrics is one of the biggest trends in human identification, and the fingerprint is the most widely used biometric. However, considering automatic fingerprint recognition a completely solved problem is a common mistake. The most popular and extensively used methods, the minutiae-based ones, do not perform well on poor-quality images or when only a small area of overlap exists between the template and query images. The use of multibiometrics is considered one of the keys to overcoming these weaknesses and improving the accuracy of biometric systems. This paper presents the fusion of a minutiae-based and a ridge-based fingerprint recognition method at rank, decision, and score level. The fusion techniques implemented led to a reduction of the Equal Error Rate by 31.78% (from 4.09% to 2.79%) and a decrease of six positions in the rank needed to reach Correct Retrieval (from rank 8 to rank 2) when assessed on the FVC2002-DB1A database. © 2008 IEEE.
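Of the three fusion levels mentioned, score-level fusion is the simplest to sketch: normalize each matcher's scores to a common range, then combine them. The min-max normalization plus sum rule below is one standard recipe, not necessarily the exact scheme used in the paper; the raw scores are hypothetical.

```python
def min_max(scores):
    """Rescale a matcher's raw scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def sum_rule(minutiae_scores, ridge_scores, w=0.5):
    """Score-level fusion of two matchers via a weighted sum."""
    m, r = min_max(minutiae_scores), min_max(ridge_scores)
    return [w * a + (1 - w) * b for a, b in zip(m, r)]

# Hypothetical raw similarity scores for five candidate templates
print(sum_rule([12, 55, 33, 70, 41], [0.2, 0.9, 0.4, 0.8, 0.5]))
```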
Abstract:
The objective of this experiment was to test in vitro embryo production (IVP) as a tool to estimate fertility performance in zebu bulls using Bayesian inference. Oocytes were matured and fertilized in vitro using sperm cells from three different zebu bulls (V, T, and G). The three bulls presented similar results with regard to pronuclear formation and blastocyst formation rates; however, the cleavage rates differed between bulls. The conception rates estimated from the combined cleavage and blastocyst formation data were very similar to the true conception rates observed for the same bulls after a fixed-time artificial insemination program. Moreover, even when we used cleavage rate data only or blastocyst formation data only, the estimated conception rates were still close to the true conception rates. We conclude that Bayesian inference is an effective statistical procedure for estimating in vivo bull fertility from IVP data. © 2011 Mateus José Sudano et al.
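The abstract does not spell out the model, but the style of inference can be illustrated with a generic Beta-Binomial sketch: treat each bull's IVP outcome (e.g. blastocyst formation) as binomial and summarize the posterior of the underlying rate. All counts below are hypothetical, not the experiment's data.

```python
import numpy as np

rng = np.random.default_rng(42)
bulls = {"V": (38, 100), "T": (45, 100), "G": (41, 100)}   # (successes, oocytes)

for bull, (k, n) in bulls.items():
    # Beta(1, 1) prior + binomial likelihood -> Beta posterior, sampled directly
    draws = rng.beta(1 + k, 1 + n - k, size=20_000)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    print(f"bull {bull}: posterior mean {draws.mean():.3f}, "
          f"95% CrI [{lo:.3f}, {hi:.3f}]")
```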
Abstract:
A Bayesian nonparametric model for Taguchi's on-line quality monitoring procedure for attributes is introduced. The proposed model extends the original single-shift setting to the more realistic situation of gradual quality deterioration, and allows the incorporation of an expert's opinion on the production process. Based on the number of inspections carried out until a defective item is found, the Bayesian update of the distribution function that represents the increasing sequence of defective fractions during a cycle is performed, with a mixture of Dirichlet processes as the prior distribution. Bayes estimates for the relevant quantities are also obtained. © 2012 Elsevier B.V.
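For readers unfamiliar with the prior ingredient, the sketch below draws one (truncated) Dirichlet process realization by stick-breaking; the paper's mixture-of-Dirichlet-processes construction builds on draws of this kind. The base measure standing in for expert opinion on defective fractions is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(11)
alpha, K = 2.0, 100                       # concentration, truncation level

# Stick-breaking: weight_k = beta_k * prod_{j<k}(1 - beta_j), beta_k ~ Beta(1, alpha)
betas = rng.beta(1, alpha, size=K)
weights = betas * np.concatenate(([1.0], np.cumprod(1 - betas)[:-1]))
atoms = rng.beta(2, 50, size=K)           # base measure G0: small defective fractions
print(f"largest weight {weights.max():.3f}, total mass kept {weights.sum():.3f}")
```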
Abstract:
The use of saturated two-level designs is very popular, especially in industrial applications where the cost of experiments is high. Standard classical approaches are not appropriate for analyzing data from saturated designs, since only the main factor effects can be estimated, leaving no degrees of freedom to estimate the variance of the error. In this paper, we propose the use of empirical Bayesian procedures to obtain inferences from data collected with saturated designs. The proposed methodology is illustrated with a simulated data set. © 2013 Growing Science Ltd. All rights reserved.
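The abstract does not detail the procedure; as one well-known empirical Bayes treatment of the same problem, the sketch below scores each estimated effect from a saturated design with a Box-Meyer-style posterior probability of being active, plugging in a robust scale estimate taken from the effects themselves. The prior constants and the data are illustrative, and the paper's exact method may differ.

```python
import math

def activity_posterior(effects, alpha=0.2, k=10.0):
    """Each effect is 'active' with prior probability alpha; active effects
    have k times the standard deviation of inert ones. The inert-group scale
    is plugged in from the median absolute effect (an empirical Bayes shortcut)."""
    s = sorted(abs(e) for e in effects)
    sigma = 1.5 * s[len(s) // 2]          # robust plug-in scale
    def phi(x, sd):                       # normal density, mean zero
        return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    return [alpha * phi(e, k * sigma) /
            (alpha * phi(e, k * sigma) + (1 - alpha) * phi(e, sigma))
            for e in effects]

# Hypothetical contrasts from a saturated 2^(7-4) design
effects = [0.3, -0.5, 7.9, 0.2, -6.1, 0.4, -0.1]
print([round(p, 3) for p in activity_posterior(effects)])
```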
Abstract:
Wireless Sensor Networks (WSNs) can be used to monitor hazardous and inaccessible areas. In these situations, the power supply (e.g. battery) of each node cannot be easily replaced. One solution to deal with the limited capacity of current power supplies is to deploy a large number of sensor nodes, since the lifetime and dependability of the network will increase through cooperation among nodes. Applications on WSNs may also have other concerns, such as meeting temporal deadlines on message transmissions and maximizing the quality of information. Data fusion is a well-known technique that can be useful for enhancing data quality and maximizing WSN lifetime. In this paper, we propose an approach that allows the implementation of parallel data fusion techniques in IEEE 802.15.4 networks. One of the main advantages of the proposed approach is that it enables a trade-off between different user-defined metrics through the use of a genetic machine learning algorithm. Simulations and field experiments performed in different communication scenarios show significant improvements when compared with, for instance, the Gur Game approach or the implementation of conventional periodic communication techniques over IEEE 802.15.4 networks. © 2013 Elsevier B.V. All rights reserved.
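A generic sketch of the genetic-algorithm idea, not the paper's implementation: each chromosome encodes which nodes report, and fitness trades off energy cost against fused information quality via user weights. The encoding, fitness terms, and constants are all illustrative assumptions.

```python
import random

random.seed(3)
N_NODES, POP, GENS = 8, 30, 200
w_energy, w_quality = 0.5, 0.5            # user-defined trade-off weights

def fitness(ch):
    energy = sum(ch) / N_NODES            # fraction of nodes reporting
    quality = 1 - (1 - 0.6) ** sum(ch)    # redundant reports fuse better
    return w_quality * quality - w_energy * energy

pop = [[random.randint(0, 1) for _ in range(N_NODES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                      # keep the best chromosomes
    children = []
    while len(children) < POP - len(elite):
        a, b = random.sample(elite, 2)
        cut = random.randrange(1, N_NODES)
        child = a[:cut] + b[cut:]         # one-point crossover
        if random.random() < 0.1:         # mutation: flip one gene
            i = random.randrange(N_NODES)
            child[i] ^= 1
        children.append(child)
    pop = elite + children

best = max(pop, key=fitness)
print(best, round(fitness(best), 3))
```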
Abstract:
The exponential-logarithmic distribution is a new lifetime distribution with decreasing failure rate and interesting applications in the biological and engineering sciences; a Bayesian analysis of its parameters is therefore desirable. Bayesian estimation requires the selection of prior distributions for all parameters of the model. In this case, researchers usually seek a prior that carries little information on the parameters, allowing the data to be very informative relative to the prior. Assuming some noninformative prior distributions, we present a Bayesian analysis using Markov chain Monte Carlo (MCMC) methods. Jeffreys' prior is derived for the parameters of the exponential-logarithmic distribution and compared with other common priors such as the beta, gamma, and uniform distributions. We show through a simulation study that the maximum likelihood estimate may not exist except under restrictive conditions. In addition, the posterior density is sometimes bimodal when an improper prior density is used. © 2013 Copyright Taylor and Francis Group, LLC.
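A minimal random-walk Metropolis sketch of the MCMC step, assuming the Tahmasbi-Rezaei parameterization of the exponential-logarithmic density f(x; p, beta) = beta(1-p)e^(-beta x) / [(-ln p)(1 - (1-p)e^(-beta x))] and flat priors (improper for beta), one of the noninformative choices discussed; Jeffreys' prior would add a prior term to the acceptance ratio. The data here are simulated stand-ins.

```python
import numpy as np

rng = np.random.default_rng(7)

def log_lik(x, p, beta):
    """Exponential-logarithmic log-likelihood; -inf outside the support."""
    if not (0 < p < 1 and beta > 0):
        return -np.inf
    z = (1 - p) * np.exp(-beta * x)
    return np.sum(np.log(beta) + np.log(1 - p) - beta * x
                  - np.log(1 - z) - np.log(-np.log(p)))

x = rng.exponential(scale=2.0, size=100)   # simulated lifetime sample

p, beta, chain = 0.5, 0.5, []
ll = log_lik(x, p, beta)
for _ in range(20_000):
    # Random-walk proposals; flat priors make the ratio a likelihood ratio
    p_new = p + 0.05 * rng.standard_normal()
    b_new = beta + 0.05 * rng.standard_normal()
    ll_new = log_lik(x, p_new, b_new)
    if np.log(rng.uniform()) < ll_new - ll:
        p, beta, ll = p_new, b_new, ll_new
    chain.append((p, beta))

print(np.mean(chain[5_000:], axis=0))      # posterior means after burn-in
```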
Abstract:
We investigate the possibility of New Physics affecting the Standard Model (SM) Higgs sector. An effective Lagrangian with dimension-six operators is used to capture the effect of New Physics. We carry out a global Bayesian inference analysis, considering the recent LHC data set including all available correlations, as well as results from the Tevatron. Trilinear gauge boson couplings and electroweak precision observables are also taken into account. The case of tensorial couplings of the weak bosons is closely examined, and NLO QCD corrections are taken into account in the deviations we predict. We consider two scenarios: one where the coefficients of all the dimension-six operators are essentially unconstrained, and one where a certain subset is loop suppressed. In both scenarios, we find that large deviations from some of the SM Higgs couplings can still be present, assuming New Physics arising at 3 TeV. In particular, we find that a significantly reduced coupling of the Higgs to the top quark is possible and slightly favored by searches for Higgs production in association with top quark pairs. The total width of the Higgs boson is only weakly constrained and can vary between 0.7 and 2.7 times the Standard Model value within the 95% Bayesian credible interval (BCI). We also observe sizeable effects induced by New Physics contributions to the tensorial couplings. In particular, the Higgs boson decay width into Zγ can be enhanced by up to a factor of 12 within the 95% BCI. © 2013 SISSA.