843 results for Inference Technique
Abstract:
An efficient new Bayesian inference technique is employed to study critical properties of the Ising linear perceptron and for signal detection in code division multiple access (CDMA). The approach is based on a recently introduced message passing technique for densely connected systems. Here we study both critical and non-critical regimes. Results obtained in the non-critical regime give rise to a highly efficient signal detection algorithm in the context of CDMA, while in the critical regime one observes a first-order transition line that ends in a continuous phase transition point. Finite-size effects are also studied. © 2006 Elsevier B.V. All rights reserved.
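The abstract does not give the update equations; as a rough, hedged sketch of the kind of iterative soft detector that message passing for densely connected CDMA systems reduces to, the snippet below runs soft parallel interference cancellation (tanh updates) on a synchronous CDMA model y = S b + noise with BPSK symbols. The system size, noise level, and damping factor are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synchronous CDMA model: y = S @ b + noise, BPSK symbols b in {-1, +1}.
K, N, sigma = 16, 32, 0.4                                 # users, spreading length, noise std (assumed)
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)     # random binary spreading codes, unit norm
b = rng.choice([-1.0, 1.0], size=K)                       # transmitted symbols
y = S @ b + sigma * rng.standard_normal(N)

# Iterative soft interference cancellation (in the spirit of message passing for
# densely connected systems): each user cancels the soft estimates of the other
# users and applies a tanh to the resulting matched-filter statistic.
b_soft = np.zeros(K)
for _ in range(30):
    new = np.empty(K)
    for k in range(K):
        residual = y - S @ b_soft + S[:, k] * b_soft[k]   # cancel all users except k
        new[k] = np.tanh(S[:, k] @ residual / sigma**2)
    b_soft = 0.5 * b_soft + 0.5 * new                     # damping for stability (assumption)

b_hat = np.sign(b_soft)
print("bit errors:", int(np.sum(b_hat != b)))
```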
Abstract:
Data obtained by finely sampling a continuous process (a random field) can be represented as images. A statistical test for detecting a difference between two images can be viewed as a collection of tests in which each pixel is compared with the corresponding pixel of the other image. One then needs a method to control the type I error rate over the whole collection of tests, such as the Bonferroni correction or control of the false discovery rate (FDR). Data-analysis methods have been developed in medical imaging, mainly by Keith Worsley, that use the geometry of random fields to construct a global statistical test over an entire image. The idea is to use the expected Euler characteristic of the excursion set, above a given threshold, of the random field underlying the sample to determine the probability that the random field exceeds that same threshold under the null hypothesis (topological inference). We review some notions about random fields, in particular isotropy (the covariance function between two points of the field depends only on the distance separating them). We discuss two methods for analyzing anisotropic fields. The first consists in warping the field and then using the intrinsic volumes and the Euler characteristic densities. The second uses the Lipschitz-Killing curvatures instead. We then carry out a study of the level and power of topological inference in comparison with the Bonferroni correction. Finally, we use topological inference to describe the evolution of climate change over the territory of Québec between 1991 and 2100, using simulated temperature data published by Ouranos's climate simulation team (Équipe Simulations climatiques) based on the Canadian Regional Climate Model.
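As a worked illustration of the topological threshold described above (not the thesis's own computation), the sketch below evaluates the expected Euler characteristic of the excursion set of a unit-variance Gaussian field, E[EC(A_u)] = Σ_d L_d ρ_d(u), and compares the resulting family-wise p-value with a Bonferroni bound; the Lipschitz-Killing curvatures L_d and the pixel count are assumed values.

```python
import numpy as np
from scipy.stats import norm

def ec_densities(u):
    """Euler characteristic densities of a unit-variance Gaussian field, d = 0, 1, 2."""
    rho0 = norm.sf(u)
    rho1 = np.exp(-u**2 / 2) / (2 * np.pi)
    rho2 = u * np.exp(-u**2 / 2) / (2 * np.pi) ** 1.5
    return np.array([rho0, rho1, rho2])

def expected_ec(u, lkc):
    """E[EC of the excursion set above u] = sum_d L_d * rho_d(u) (EC heuristic)."""
    return float(np.dot(lkc, ec_densities(u)))

# Illustrative search region: Lipschitz-Killing curvatures (L0, L1, L2) and pixel count.
lkc = np.array([1.0, 40.0, 300.0])   # assumed values; in practice estimated from the data
n_pixels = 100 * 100

for u in (3.0, 3.5, 4.0, 4.5):
    p_topological = expected_ec(u, lkc)               # approximate family-wise p-value
    p_bonferroni = min(1.0, n_pixels * norm.sf(u))
    print(f"u = {u:.1f}: EC-based p ≈ {p_topological:.4f}, Bonferroni p ≈ {p_bonferroni:.4f}")
```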
Abstract:
Adapting the methodology of Osler and Chang (1995), this work empirically evaluates the profitability of investment strategies based on identifying the Head-and-Shoulders technical analysis chart pattern in the Brazilian stock market. To this end, several investment strategies conditional on the identification of Head-and-Shoulders patterns (in both their standard and inverted forms) by a computerized algorithm were defined over daily price series of 47 stocks from January 1994 to August 2006. To test the predictive power of each strategy, confidence intervals were constructed using the Bootstrap sampling-inference technique, consistent with the null hypothesis that, based on historical data alone, it is not possible to create strategies with positive returns. More specifically, the average returns obtained by each strategy on the stock price series were compared with those obtained by the same strategies applied to 1,000 artificial price series per stock, generated according to two widely used stock-price models: Random Walk and E-GARCH. Overall, the results show that it is possible to create strategies conditional on the occurrence of Head-and-Shoulders patterns with positive returns, indicating that these patterns capture, in historical stock price series, signals about future price movements that are explained neither by a Random Walk nor by an E-GARCH model. However, once fees and transaction costs are taken into account, depending on their magnitude, these conclusions hold only for the pattern in its inverted form.
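The pattern-detection algorithm itself is not reproduced here; as a hedged sketch of the simulation-based inference step only, the snippet below compares a placeholder trading-rule statistic on an observed price series with its distribution over artificial random-walk series sharing the empirical drift and volatility. The E-GARCH benchmark and the Head-and-Shoulders detector are omitted; `strategy_return` is a hypothetical stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)

def strategy_return(prices):
    """Placeholder trading rule (NOT the Head-and-Shoulders detector): mean next-day
    log return after days whose return exceeds one standard deviation."""
    r = np.diff(np.log(prices))
    signal = r[:-1] > r.std()
    return r[1:][signal].mean() if signal.any() else 0.0

# Observed daily prices (synthetic stand-in for a real stock series).
prices = 100 * np.exp(np.cumsum(0.0003 + 0.02 * rng.standard_normal(1500)))
observed = strategy_return(prices)

# Null benchmark: the same statistic on artificial random-walk series that share
# the empirical drift and volatility of the observed series.
r = np.diff(np.log(prices))
mu, sd, n_sim = r.mean(), r.std(), 1000
sim_stats = np.array([
    strategy_return(prices[0] * np.exp(np.cumsum(mu + sd * rng.standard_normal(len(r)))))
    for _ in range(n_sim)
])

lo, hi = np.percentile(sim_stats, [2.5, 97.5])   # 95% interval under the null model
print(f"observed = {observed:.5f}, null 95% interval = [{lo:.5f}, {hi:.5f}]")
print("reject null (pattern carries information)" if not (lo <= observed <= hi) else "cannot reject null")
```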
Abstract:
The goal of this thesis is to create a Frame-Oriented Design Model that, bridging the Outside World and the implemented system's Internal Model of the World, reduces the amount of knowledge lost when reality is formalized in Knowledge Bases (KB). The model diminishes the loss of knowledge when formalizing a KB and brings the frame formalism closer to the Outside World because: 1. It creates a theory that standardizes the concept of frame at the formalization level, establishing a set of syntactic and semantic constraints that prevent the Knowledge Engineer (KE) from defining forbidden elements, or making undue use of them, during formalization. 2. The formalism's expressiveness is increased by associating with each property of a class frame an additional parameter that symbolizes the representativeness of the property within the concept. This parameter, and the inference techniques that work with it, allow the KE to introduce into the Formalized Model knowledge that existed in reality but was previously left out when building the knowledge base. 3. A matching technique is proposed that works with the uncertain knowledge present in the domain. This matching technique uses the representativeness of the properties in the class frames and the degree of certainty of the properties of the entities to calculate a matching value and thus determine to what extent the selected class frames are consistent with the description of the current situation given by an entity. 4. It provides new inference techniques based on property transfer and modifies existing ones. Property transfer over "ad hoc" relations defined by the KE when building the system is a new inference technique, independent of and complementary to the property transfer traditionally called Inheritance (transfer of properties between parents and children). This new technique has been called Donation, that is, transfer of properties between unrelated frames. 5. It improves control of the procedural knowledge defined in the frames by introducing object-oriented concepts. As a practical contribution, an environment for building frame-based Knowledge-Based Systems (KBS) has been constructed, incorporating all the new concepts of the theoretical model of the thesis. There is a certain nesting involved: the environment itself consists of frames that allow any frame-based KBS to be formalized. The environment allows the KE to formalize knowledge bases automatically and to validate the domain knowledge during the formalization stage instead of having to wait until the KB has been implemented. All of this leads to describing the Frame-Oriented Design Model as a bridge that brings together and connects the Outside World with the Internal Model associated with reality and implemented on a computer, thereby reducing the various losses of knowledge that, although they do not occur simultaneously when building Knowledge-Based Systems, do coexist within them.
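The thesis's exact matching formula is not stated in the abstract; as a minimal sketch of the idea, the snippet below combines the representativeness of each class-frame property with the certainty of the corresponding entity property into a normalized matching value. The weighting scheme and the example frames are assumptions for illustration only.

```python
# Minimal sketch of frame matching with representativeness and certainty.
# The class frame assigns each property a representativeness in [0, 1]; the
# entity (observed situation) assigns each property a certainty in [0, 1].
def matching_value(class_frame, entity):
    """Representativeness-weighted certainty, normalized over the class frame's properties."""
    total_weight = sum(class_frame.values())
    score = sum(rep * entity.get(prop, 0.0) for prop, rep in class_frame.items())
    return score / total_weight if total_weight else 0.0

# Hypothetical class frame "engine-failure" and an observed entity description.
engine_failure = {"oil_pressure_low": 0.9, "temperature_high": 0.7, "vibration": 0.4}
observation = {"oil_pressure_low": 0.8, "temperature_high": 0.5}

print(f"matching value: {matching_value(engine_failure, observation):.2f}")
```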
Abstract:
Supply chain formation (SCF) is the process of determining the set of participants and exchange relationships within a network with the goal of setting up a supply chain that meets some predefined social objective. Many proposed solutions for the SCF problem rely on centralized computation, which presents a single point of failure and can also lead to problems with scalability. Decentralized techniques that aid supply chain emergence offer a more robust and scalable approach by allowing participants to deliberate between themselves about the structure of the optimal supply chain. Current decentralized supply chain emergence mechanisms are only able to deal with simplistic scenarios in which goods are produced and traded in single units only and without taking into account production capacities or input-output ratios other than 1:1. In this paper, we demonstrate the performance of a graphical inference technique, max-sum loopy belief propagation (LBP), in a complex multiunit supply chain emergence scenario which models additional constraints such as production capacities and input-to-output ratios. We also provide results demonstrating the performance of LBP in dynamic environments, where the properties and composition of participants are altered as the algorithm is running. Our results suggest that max-sum LBP produces consistently strong solutions on a variety of network structures in a multiunit problem scenario, and that performance tends not to be affected by on-the-fly changes to the properties or composition of participants.
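The paper's supply-chain encoding is not reproduced here; the sketch below only illustrates the max-sum message updates on a toy two-variable factor graph (a producer quantity and a consumer quantity coupled by a ratio-enforcing factor), which is the inference primitive the abstract relies on. All utilities and the 1:1 coupling are illustrative assumptions.

```python
import numpy as np

# Toy max-sum example: producer output quantity q_p and consumer input quantity q_c,
# each taking integer values 0..3 (capacity 3). Unary utilities encode production cost
# and consumption value; the pairwise factor enforces a 1:1 input-to-output ratio by
# heavily penalizing mismatched quantities. All numbers are illustrative assumptions.
vals = np.arange(4)
u_prod = -1.0 * vals                    # producing more costs more
u_cons = 2.5 * vals                     # consuming more is worth more
pair = np.where(vals[:, None] == vals[None, :], 0.0, -100.0)   # q_p must equal q_c

# Max-sum messages (exact on this tree-structured graph; LBP iterates these on loopy graphs).
m_prod_to_f = u_prod                                      # variable -> factor
m_cons_to_f = u_cons
m_f_to_cons = (pair + m_prod_to_f[:, None]).max(axis=0)   # factor -> variable (max over q_p)
m_f_to_prod = (pair + m_cons_to_f[None, :]).max(axis=1)   # factor -> variable (max over q_c)

q_p = int(np.argmax(u_prod + m_f_to_prod))
q_c = int(np.argmax(u_cons + m_f_to_cons))
print(f"max-sum assignment: produce {q_p}, consume {q_c}, "
      f"total utility {u_prod[q_p] + u_cons[q_c] + pair[q_p, q_c]:.1f}")
```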
Abstract:
We consider a class of intelligent systems deployed on board anthropocentric objects that provide the crew with recommendations on the rational behavior of the anthropocentric object in typical operational situations. We refer to this class of intelligent systems as onboard real-time advisory expert systems. Here, we present a formal model of the object domain, procedures for acquiring knowledge about the object domain, and the semantic structure of the basic functional units of onboard real-time advisory expert systems for typical situations. The practically important stages of developing and improving knowledge bases for onboard real-time advisory expert systems for typical situations are also considered.
Abstract:
Continuing developments in science and technology mean that the amounts of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. Includes self-contained introductions to probability and decision theory. Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models. Features implementation of the methodology with reference to commercial and academically available software. Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases. Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning. Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them. Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background. Includes a foreword by Ian Evett. The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
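As a minimal, self-contained illustration of the kind of probabilistic reasoning such networks automate (not an example from the book), the snippet below evaluates a two-node network, source hypothesis → reported match, by direct enumeration and reports the likelihood ratio and posterior; the probabilities are invented and carry no forensic meaning.

```python
# Two-node Bayesian network: H (suspect is the source) -> E (reported trace match).
# Probabilities are illustrative assumptions only.
p_h = 0.01                      # prior P(H = true)
p_e_given_h = 0.95              # P(match | suspect is the source)
p_e_given_not_h = 0.02          # P(match | someone else is the source), i.e. random-match probability

# Enumeration: P(H | E) = P(E | H) P(H) / P(E).
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior = p_e_given_h * p_h / p_e
likelihood_ratio = p_e_given_h / p_e_given_not_h   # the usual forensic likelihood ratio for the findings

print(f"likelihood ratio = {likelihood_ratio:.1f}")
print(f"posterior P(H | match) = {posterior:.3f}  (prior was {p_h})")
```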
Abstract:
In this paper we propose the use of the independent component analysis (ICA) [1] technique for improving the classification rate of decision trees and multilayer perceptrons [2], [3]. Using ICA in the preprocessing stage makes the structure of both classifiers simpler and therefore improves their generalization properties. The hypothesis behind the proposed preprocessing is that an ICA transform maps the feature space into a space where the components are independent and aligned with the axes, and hence better suited to the way a decision tree is constructed. Inferring the weights of a multilayer perceptron also becomes easier, because the gradient search in weight space follows independent trajectories. The result is that the classifiers are less complex, and on some databases the error rate is lower. The idea is also applicable to regression.
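A minimal sketch of the proposed preprocessing using scikit-learn, comparing a decision tree on raw features with the same tree after an ICA transform; the dataset, number of components, and hyperparameters are placeholders rather than those used in the paper.

```python
from sklearn.datasets import load_wine
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)

# Baseline: decision tree on the raw features.
tree = DecisionTreeClassifier(random_state=0)

# Proposed variant: ICA first, so the tree's axis-aligned splits act on (nearly)
# independent components.
ica_tree = make_pipeline(
    StandardScaler(),
    FastICA(n_components=8, random_state=0, max_iter=1000),
    DecisionTreeClassifier(random_state=0),
)

for name, model in [("raw features", tree), ("ICA-preprocessed", ica_tree)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:>18}: mean accuracy {scores.mean():.3f}")
```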
Abstract:
In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
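As a rough sketch of the split-sample idea only (not the paper's full finite-sample procedure), the snippet below conditions on the odd-indexed observations of a simulated AR(1) series and fits the resulting two-sided autoregression of each even-indexed observation on its two neighbours by OLS, to which standard t and F inference can then be applied. The lag structure and sample split are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulate an AR(1) series y_t = 0.6 y_{t-1} + e_t (placeholder for real data).
T, phi = 400, 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Split-sample / intercalary-independence idea: condition on the odd-indexed
# observations and regress each even-indexed y_t on its two neighbours
# (a two-sided autoregression), then use standard OLS inference.
t_even = np.arange(2, T - 1, 2)
Y = y[t_even]
X = sm.add_constant(np.column_stack([y[t_even - 1], y[t_even + 1]]))

ols = sm.OLS(Y, X).fit()
print(ols.summary(xname=["const", "y[t-1]", "y[t+1]"]))
```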
Abstract:
We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
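A minimal sketch of the projection step under stated assumptions: given a Wald-type 95% confidence ellipsoid for the free parameters of a toy nonlinear "model solution", the projected confidence interval for an endogenous variable is its minimum and maximum over parameter values inside that set. The toy model, point estimate, and covariance are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

def endogenous(theta):
    """Toy nonlinear 'model solution': an outcome variable as a function of two parameters."""
    a, b = theta
    return a * np.exp(b)            # stand-in for solving a CGE model

# Point estimate and covariance of the free parameters (assumed known here).
theta_hat = np.array([2.0, 0.30])
cov = np.array([[0.04, 0.001],
                [0.001, 0.0025]])
crit = chi2.ppf(0.95, df=2)         # 95% Wald ellipsoid: (t - t_hat)' V^{-1} (t - t_hat) <= crit
cov_inv = np.linalg.inv(cov)

# Projection: evaluate the model over parameter draws and keep those inside the
# confidence set; the min and max of the endogenous variable over that set form
# a (conservative) confidence interval whose validity is unaffected by nonlinearity.
draws = theta_hat + rng.multivariate_normal(np.zeros(2), 4 * cov, size=20000)
inside = np.einsum("ij,jk,ik->i", draws - theta_hat, cov_inv, draws - theta_hat) <= crit
values = np.array([endogenous(t) for t in draws[inside]])
print(f"projected 95% CI for the endogenous variable: [{values.min():.3f}, {values.max():.3f}]")
```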
Abstract:
We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin’s q and to a model of academic performance.
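As a compact sketch of the instrument-substitution approach in the Anderson-Rubin spirit (not the paper's full treatment of generated regressors or split samples), the snippet tests H0: β = β0 by regressing y − β0·x on the instruments and checking their joint significance, then inverts the test over a grid of β0 to obtain a confidence set whose validity does not depend on instrument strength. The data are simulated placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated placeholder data: x is endogenous, z1 and z2 are instruments.
n, beta_true = 200, 1.0
z = rng.standard_normal((n, 2))
v = rng.standard_normal(n)
x = z @ np.array([0.5, 0.3]) + v                        # first stage
y = beta_true * x + 0.8 * v + rng.standard_normal(n)    # structural equation (endogeneity via v)

def ar_pvalue(beta0):
    """Anderson-Rubin style test of H0: beta = beta0 via an F-test of the instruments
    in a regression of y - beta0 * x on a constant and the instruments."""
    resid = y - beta0 * x
    ols = sm.OLS(resid, sm.add_constant(z)).fit()
    # Joint nullity of the two instrument coefficients (positions 1 and 2).
    return float(np.squeeze(ols.f_test(np.eye(3)[1:]).pvalue))

# Invert the test over a grid of beta0 values to obtain a 95% confidence set.
grid = np.linspace(0.0, 2.0, 201)
accepted = [b for b in grid if ar_pvalue(b) > 0.05]
print(f"AR 95% confidence set (grid): [{min(accepted):.2f}, {max(accepted):.2f}]")
```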
Abstract:
The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified asymptotically justified versions of the MMC method are also proposed and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
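A minimal numerical sketch of the MC p-value and its maximized (MMC) variant; the test statistic, data-generating process, and nuisance-parameter grid below are placeholders. The MC p-value counts simulated statistics at least as large as the observed one; the MMC p-value takes the largest such value over the nuisance-parameter grid, so the test rejects only if it rejects for every admissible nuisance value.

```python
import numpy as np

rng = np.random.default_rng(0)

def statistic(sample):
    """Placeholder test statistic for H0: mean = 0 (deliberately not scale-invariant)."""
    return abs(sample.mean()) * np.sqrt(len(sample))

def mc_pvalue(s0, simulate, n_rep=999):
    """Monte Carlo p-value of Dwass/Barnard: (1 + #{S_i >= s0}) / (n_rep + 1)."""
    sims = np.array([statistic(simulate()) for _ in range(n_rep)])
    return (1 + np.sum(sims >= s0)) / (n_rep + 1)

data = rng.standard_normal(30) + 0.4          # observed sample (placeholder)
s0 = statistic(data)

# When the null distribution is free of nuisance parameters, a single mc_pvalue call
# suffices; when a nuisance parameter (here, an unknown error scale) remains under H0,
# the maximized MC (MMC) p-value takes the sup over a grid of admissible values.
nuisance_grid = [0.5, 1.0, 2.0]
p_mmc = max(
    mc_pvalue(s0, lambda scale=scale: scale * rng.standard_normal(len(data)))
    for scale in nuisance_grid
)
print(f"observed statistic = {s0:.2f}, MMC p-value = {p_mmc:.3f}")
```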
Abstract:
Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to control completely the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
Abstract:
Bayesian inference has been used to determine rigorous estimates of hydroxyl radical concentrations ([OH]) and air mass dilution rates (K), averaged following air masses between linked observations of nonmethane hydrocarbons (NMHCs) spanning the North Atlantic during the Intercontinental Transport and Chemical Transformation (ITCT)-Lagrangian-2K4 experiment. The Bayesian technique obtains a refined (posterior) distribution of a parameter given data related to the parameter through a model and prior beliefs about the parameter distribution. Here, the model describes hydrocarbon loss through OH reaction and mixing with a background concentration at rate K. The Lagrangian experiment provides direct observations of hydrocarbons at two time points, removing assumptions regarding composition or sources upstream of a single observation. The [OH] estimates are sharpened by using many hydrocarbons with different reactivities and accounting for their variability and measurement uncertainty. A novel technique is used to construct prior background distributions of many species, described by the variation of a single parameter. This exploits the high correlation of species, related by the first principal component of many NMHC samples. The Bayesian method obtains posterior estimates of [OH], K, and the background parameter following each air mass. Median [OH] values are typically between 0.5 and 2.0 × 10⁶ molecules cm⁻³, but are elevated to between 2.5 and 3.5 × 10⁶ molecules cm⁻³ in low-level pollution. A comparison of [OH] estimates from absolute NMHC concentrations and from NMHC ratios assuming zero background (the “photochemical clock” method) shows similar distributions but reveals a systematic high bias in the estimates from ratios. Estimates of K are ∼0.1 day⁻¹ but show more sensitivity to the prior distribution assumed.
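As a schematic of the Bayesian machinery only (a single hydrocarbon rather than the paper's multi-species treatment), the sketch below computes a grid posterior over [OH] and the dilution rate K from one pair of linked observations, using the loss-plus-mixing model dC/dt = −k_OH[OH]C − K(C − C_b) with an assumed background, rate constant, and Gaussian measurement error. All numerical values are invented for illustration.

```python
import numpy as np

# Decay-plus-dilution model for one NMHC between two linked (Lagrangian) observations:
#   dC/dt = -k_oh * OH * C - K * (C - C_b)
# with an assumed constant background C_b. Closed-form solution over the transit time dt.
def predicted_c2(c1, oh, K, k_oh, c_b, dt):
    loss = k_oh * oh + K
    c_eq = K * c_b / loss                      # equilibrium concentration under mixing only
    return c_eq + (c1 - c_eq) * np.exp(-loss * dt)

# Illustrative numbers (assumptions, not from the paper).
k_oh = 4.0e-12        # cm^3 molecule^-1 s^-1, assumed OH rate constant of the chosen NMHC
c_b = 20.0            # assumed background mixing ratio (pptv)
c1, c2_obs = 150.0, 60.0   # upstream and downstream observations (pptv)
sigma = 5.0           # Gaussian measurement error (pptv)
dt = 2 * 24 * 3600.0  # two days of transit, in seconds

# Grid posterior over OH (molecules cm^-3) and K (s^-1), flat priors on the grid.
oh_grid = np.linspace(0.1e6, 5e6, 200)
K_grid = np.linspace(0.0, 0.3 / 86400.0, 150)        # 0 to 0.3 per day
OH, K = np.meshgrid(oh_grid, K_grid, indexing="ij")
loglik = -0.5 * ((c2_obs - predicted_c2(c1, OH, K, k_oh, c_b, dt)) / sigma) ** 2
post = np.exp(loglik - loglik.max())
post /= post.sum()

oh_mean = (post.sum(axis=1) * oh_grid).sum()
K_mean = (post.sum(axis=0) * K_grid).sum()
print(f"posterior mean OH ≈ {oh_mean:.2e} molecules cm^-3, K ≈ {K_mean * 86400:.2f} day^-1")
```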
Abstract:
Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the Hemodynamic Response Function (HRF) and to locate activated regions of the human brain when specific tasks are performed. This paper develops new methodologies that substantially improve both parametric and nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
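The paper's specific estimators cannot be reconstructed from the abstract; as a generic illustration of HRF estimation in an event-related design, the sketch below recovers a finite-impulse-response (FIR) estimate of the HRF by least squares from a stimulus onset vector and a simulated BOLD series. The TR, FIR length, drift term, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated event-related design: binary stimulus onsets on a TR grid.
n_scans, tr, fir_len = 400, 2.0, 12          # 12 FIR bins = 24 s of HRF (assumed)
onsets = np.zeros(n_scans)
onsets[rng.choice(n_scans - fir_len, size=40, replace=False)] = 1.0

# A canonical-looking "true" HRF (difference of gamma-like terms, for simulation only).
t = np.arange(fir_len) * tr
true_hrf = t**5 * np.exp(-t) / 120 - 0.1 * t**8 * np.exp(-t) / 40320

# BOLD signal = stimulus convolved with the true HRF, plus a linear drift and noise.
bold = np.convolve(onsets, true_hrf)[:n_scans]
bold += 0.05 * np.arange(n_scans) / n_scans + 0.1 * rng.standard_normal(n_scans)

# FIR design matrix: one column per post-stimulus lag, plus intercept and linear drift.
X = np.column_stack([np.roll(onsets, lag) for lag in range(fir_len)])
for lag in range(fir_len):                   # zero out wrap-around introduced by np.roll
    X[:lag, lag] = 0.0
X = np.column_stack([X, np.ones(n_scans), np.arange(n_scans) / n_scans])

beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
hrf_hat = beta[:fir_len]
print("estimated HRF peak at lag (s):", t[int(np.argmax(hrf_hat))])
```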