38 results for Bayesian statistical decision theory
in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
In this article we describe a feature extraction algorithm for pattern classification based on Bayesian decision boundaries and pruning techniques. The proposed method optimizes MLP neural classifiers by retaining only those neurons in the hidden layer that really contribute to correct classification. We also propose a method that determines a plausible number of hidden-layer neurons from stem-and-leaf plots of the training samples. Experimental investigation demonstrates the efficiency of the proposed method. © 2002 IEEE.
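The stem-and-leaf idea can be illustrated with a minimal sketch. The abstract does not give the exact rule, so the mapping below from occupied stems to a hidden-layer size is an illustrative assumption, not the paper's actual method:

```python
from collections import defaultdict

def stem_and_leaf(values, stem_div=10):
    """Group integer samples into a stem-and-leaf table: stem -> sorted leaves."""
    table = defaultdict(list)
    for v in sorted(values):
        table[v // stem_div].append(v % stem_div)
    return dict(table)

samples = [23, 25, 31, 34, 34, 47, 52, 53, 58, 61]
plot = stem_and_leaf(samples)           # {2: [3, 5], 3: [1, 4, 4], ...}
# Illustrative assumption: one hidden neuron per occupied stem.
n_hidden = len(plot)
```

Here five stems are occupied, so the heuristic would suggest five hidden neurons for this toy sample.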
Abstract:
Using the functional integral formalism for the statistical generating functional in statistical (finite-temperature) quantum field theory, we prove the equivalence of many-photon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. As an illustration, we calculate the one-loop polarization operators in both theories and demonstrate their coincidence.
Abstract:
We contrast four distinct versions of the BCS-Bose statistical crossover theory according to the form assumed for the electron-number equation that accompanies the BCS gap equation. The four versions correspond to explicitly accounting for two-hole (2h) as well as two-electron (2e) Cooper pairs (CPs), or both in equal proportions, or only either kind. This follows from a recent generalization of the Bose-Einstein condensation (GBEC) statistical theory that includes not boson-boson interactions but rather 2e- and (without loss of generality) 2h-CPs interacting with unpaired electrons and holes in a single-band model that is easily converted into a two-band model. The GBEC theory is essentially an extension of the Friedberg-Lee 1989 BEC theory of superconductors that excludes 2h-CPs. It can thus recover, when the numbers of 2h- and 2e-CPs in both BE-condensed and non-condensed states are separately equal, the BCS gap equation for all temperatures and couplings as well as the zero-temperature BCS (rigorous-upper-bound) condensation energy for all couplings. But ignoring either 2h- or 2e-CPs it can do neither. In particular, only half the BCS condensation energy is obtained in the two crossover versions that ignore either kind of CPs. We show how critical temperatures Tc from the original BCS-Bose crossover theory in 2D require unphysically large couplings for the Cooper/BCS model interaction in order to differ significantly from the Tc values of ordinary BCS theory (where the number equation is replaced by the assumption that the chemical potential equals the Fermi energy). (c) 2007 Published by Elsevier B.V.
Abstract:
In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(.) about one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are applicable only to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite et al. (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(.); instead, we use nonparametric Bayesian inference, modelling f(.) by a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs to obtain the posterior distribution for f(.). Theoretical properties of joint and marginal priors are derived and numerical illustrations to demonstrate our approach are given. (C) 2010 Elsevier B.V. All rights reserved.
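The nonparametric idea — treating the elicited summaries as data and conditioning a Gaussian-process prior on them — can be sketched in one dimension. The squared-exponential kernel, length-scale, and noise level below are illustrative assumptions, not the paper's actual model:

```python
import math

def rbf(x, y, ell=1.0):
    """Squared-exponential covariance between two 1-D inputs."""
    return math.exp(-0.5 * ((x - y) / ell) ** 2)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (fine for tiny systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(xs, fs, x_star, ell=1.0, noise=1e-6):
    """Posterior mean of a GP at x_star, conditioned on elicited values fs at xs."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ell) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, list(fs))          # alpha = K^{-1} f
    return sum(alpha[j] * rbf(x_star, xs[j], ell) for j in range(n))
```

With a small noise term the posterior mean nearly interpolates the elicited points and smoothly fills in between them, which is the mechanism the paper exploits to recover a full density from a handful of summaries.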
Abstract:
Cancer is the second leading cause of death in Brazil, and according to statistics disclosed by the National Cancer Institute of Brazil (INCA), 466,730 new cases of cancer were forecast for 2008. The analysis of tumour tissues of various types together with patients' clinical data, genetic profiles, disease characteristics and epidemiological data may lead to more precise diagnoses, providing more effective treatments. In this work we present a clinical decision support system for cancer, which manages a relational database containing information on tumour tissues and their location in freezers, patients and medical forms. Furthermore, some problems encountered are discussed, such as database integration and the adoption of a standard for describing topography and morphology. The dynamic report-generation functionality, which displays data in table and graph formats according to the user's configuration, is also discussed. © ACM 2008.
Abstract:
Monte Carlo simulations of water-dimethylformamide (DMF) mixtures were performed in the isothermal-isobaric ensemble at 298.15 K and 1 atm. The intermolecular interaction energy was calculated using the classical 6-12 Lennard-Jones pairwise potential plus a Coulomb term. The TIP4P model was used for simulating water molecules, and a six-site model previously optimised by us was used to represent DMF. The potential energy for the water-DMF interaction was obtained via standard geometric combining rules using the original potential parameters for the pure liquids. The radial distribution functions calculated for water-DMF mixtures show well-characterised hydrogen bonds between the oxygen site of DMF and the hydrogen of water. A structureless correlation curve was observed for the interaction between the hydrogen site of the carbonyl group and the oxygen site of water. Hydration effects on the stabilisation of the DMF molecule in aqueous solution have been investigated using statistical perturbation theory. The results show that the energetic changes involved in the hydration process are not strong enough to stabilise a configuration of DMF other than the planar one.
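The site-site energy described above — a 6-12 Lennard-Jones term plus a Coulomb term, with geometric combining rules for both sigma and epsilon — can be sketched as follows. The site parameters and the kcal/mol Coulomb constant are illustrative; the actual TIP4P and DMF parameters are not reproduced here:

```python
import math

COULOMB_K = 332.06  # kcal*Angstrom/(mol*e^2), a common molecular-mechanics convention

def pair_energy(sites_a, sites_b):
    """Interaction energy between two rigid molecules, summed over site pairs.
    Each site is (x, y, z, sigma, epsilon, charge); geometric combining rules
    are applied to both sigma and epsilon."""
    e = 0.0
    for xa, ya, za, sa, ea, qa in sites_a:
        for xb, yb, zb, sb, eb, qb in sites_b:
            r = math.dist((xa, ya, za), (xb, yb, zb))
            sig = math.sqrt(sa * sb)
            eps = math.sqrt(ea * eb)
            sr6 = (sig / r) ** 6
            e += 4.0 * eps * (sr6 ** 2 - sr6) + COULOMB_K * qa * qb / r
    return e
```

As a sanity check, two neutral sites with sigma = epsilon = 1 placed at the Lennard-Jones minimum distance 2^(1/6)*sigma give an energy of exactly -epsilon.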
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these approaches naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error.
The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
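The statistical design step — estimating the best window filter from pairs of ideal and observed signals — can be sketched for binary 1-D signals. The majority-vote plug-in rule below is a standard estimator of the optimal filter, used here as an illustrative assumption rather than the paper's specific procedure:

```python
from collections import defaultdict

def design_window_filter(observed, ideal, w=3):
    """For each observed window pattern, output the majority ideal value
    seen with that pattern across the training signal pairs."""
    votes = defaultdict(lambda: defaultdict(int))
    h = w // 2
    for obs, ide in zip(observed, ideal):
        for i in range(h, len(obs) - h):
            win = tuple(obs[i - h:i + h + 1])
            votes[win][ide[i]] += 1
    return {win: max(counts, key=counts.get) for win, counts in votes.items()}

def apply_filter(filt, signal, w=3, default=0):
    """Slide the window over the signal; unseen patterns fall back to `default`."""
    h = w // 2
    out = list(signal)
    for i in range(h, len(signal) - h):
        out[i] = filt.get(tuple(signal[i - h:i + h + 1]), default)
    return out
```

Trained on noisy/clean pairs where isolated 1s are noise, the designed filter learns to remove single-sample spikes, which is exactly the kind of window operator the design framework produces.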
Abstract:
Several statistical models can be used for assessing genotype × environment interaction (GEI) and studying genotypic stability. The objectives of this research were to show how (i) to use Bayesian methodology for computing Shukla's phenotypic stability variance and (ii) to incorporate prior information on the parameters for better estimation. Potato [Solanum tuberosum subsp. andigenum (Juz. & Bukasov) Hawkes], wheat (Triticum aestivum L.), and maize (Zea mays L.) multi-environment trials (MET) were used for illustrating the application of the Bayes paradigm. The potato trial included 15 genotypes, but prior information for just three genotypes was used. The wheat trial used prior information on all 10 genotypes included in the trial, whereas for the maize trial, noninformative priors for the nine genotypes were used. Concerning the posterior distribution of the genotypic means, the maize MET with 20 sites gave less dispersed posterior distributions than did the other METs, which included fewer environments. The Bayesian approach allows the use of other statistical strategies, such as the truncated normal distribution (used in this study). When analyzing grain yield, a lower bound of zero and an upper bound set by the researcher's experience can be used. The Bayesian paradigm offers plant breeders the possibility of computing the probability of a genotype being the best performer. The results of this study show that although some genotypes may have a very low probability of being the best in all sites, they have a relatively good chance of being among the five highest-yielding genotypes.
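Given posterior draws of the genotype means (e.g. from an MCMC run), the probabilities mentioned above reduce to simple counting over draws. A minimal sketch, using made-up toy numbers rather than the trial data:

```python
def prob_best(draws):
    """Fraction of posterior draws in which each genotype has the highest mean.
    draws: list of draws, each a list of genotype means."""
    n_g = len(draws[0])
    wins = [0] * n_g
    for row in draws:
        wins[row.index(max(row))] += 1
    return [w / len(draws) for w in wins]

def prob_top_k(draws, k):
    """Fraction of draws in which each genotype ranks among the k highest."""
    n_g = len(draws[0])
    hits = [0] * n_g
    for row in draws:
        for g in sorted(range(n_g), key=lambda g: row[g], reverse=True)[:k]:
            hits[g] += 1
    return [h / len(draws) for h in hits]
```

A genotype can score low on prob_best yet high on prob_top_k, which is precisely the "rarely the best, but reliably among the top five" pattern the study reports.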
Abstract:
We prove the equivalence of many-gluon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. The proof is based on the functional integral formulation for the statistical generating functional in a finite-temperature quantum field theory. As an illustration, we calculate one-loop polarization operators in both theories and show that their expressions indeed coincide.
Abstract:
Enterprises need continuous product development activities to remain competitive in the marketplace. Their product development process (PDP) must manage stakeholders' needs - technical, financial, legal, and environmental aspects, customer requirements, corporate strategy, etc. - making it a multidisciplinary and strategic issue. An approach that uses real options to support the decision-making process in PDP phases is taken. The real option valuation method is often presented as an alternative to the conventional net present value (NPV) approach. It is based on the same principles as financial options: the right, with no obligation, to buy or sell financial assets (mostly stocks) at a predetermined price. In PDP, a multi-period approach takes into account the flexibility of, for instance, being able to postpone prototyping and design decisions while waiting for more information about technologies, customer acceptance, funding, etc. In the present article, the state of the art of real options theory is surveyed and a model for using real options in PDP is proposed, so that financial aspects can be properly considered at each project phase of product development. The conclusion is that such a model can provide more robustness to the decision processes within the PDP.
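The value of the flexibility to defer a decision can be illustrated with a textbook binomial-lattice valuation of an option to invest. The parameters and the risk-neutral setup below are illustrative assumptions, not the model proposed in the paper:

```python
def defer_option_value(v0, cost, u, d, r, steps):
    """Value of the option to invest (pay `cost` for a project worth V) at
    maturity, on a recombining binomial lattice with up/down factors u, d
    and per-period rate r, using standard risk-neutral valuation."""
    q = (1 + r - d) / (u - d)  # risk-neutral up probability
    # Project payoffs at the final nodes (option is exercised only if positive).
    vals = [max(v0 * u ** j * d ** (steps - j) - cost, 0.0) for j in range(steps + 1)]
    # Roll back through the lattice, discounting expected values.
    for _ in range(steps):
        vals = [(q * vals[j + 1] + (1 - q) * vals[j]) / (1 + r)
                for j in range(len(vals) - 1)]
    return vals[0]
```

For a project worth 100 with an investment cost of 110, the static NPV is -10 and the project would be rejected, yet a one-period option to defer (u = 1.3, d = 0.8, r = 5%) is worth about 9.5, which is the kind of gap between NPV and option value that motivates the real-options approach.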