74 results for Pseudo-Bayesian Design
Abstract:
This paper proposes a common and tractable framework for analyzing different definitions of fixed and random effects in a constant-slope, variable-intercept model. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or whether (iii) correlation between effects and regressors is allowed, when the same information on effects is introduced into all estimation methods, the resulting slope estimator is also the same across methods. If different methods produce different results, it is ultimately because different information is being used for each method.
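As a hedged numerical illustration of this equivalence (not taken from the paper; the simulated data and variable names are invented), the dummy-variable estimator that treats the effects as parameters and the within estimator that sweeps them out return the same slope when they use the same information:

```python
import numpy as np

# Illustrative simulation: y_it = alpha_i + beta * x_it + e_it
rng = np.random.default_rng(0)
n_groups, t_periods, beta = 20, 8, 1.5
g = np.repeat(np.arange(n_groups), t_periods)            # group index per observation
alpha = rng.normal(0, 2, n_groups)                        # group-specific intercepts
x = rng.normal(0, 1, n_groups * t_periods) + alpha[g]     # regressor correlated with effects
y = alpha[g] + beta * x + rng.normal(0, 0.5, x.size)

# (i) Effects treated as parameters: least squares with one dummy per group (LSDV)
D = np.eye(n_groups)[g]                                   # dummy matrix
X_lsdv = np.column_stack([x, D])
beta_lsdv = np.linalg.lstsq(X_lsdv, y, rcond=None)[0][0]

# (ii) Effects swept out: within (demeaned) estimator
x_dm = x - np.bincount(g, weights=x)[g] / t_periods
y_dm = y - np.bincount(g, weights=y)[g] / t_periods
beta_within = (x_dm @ y_dm) / (x_dm @ x_dm)

print(beta_lsdv, beta_within)   # identical up to floating-point error
```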
Abstract:
The proposal to work on this final project came after several discussions held with Dr. Elzbieta Malinowski Gadja, who in 2008 published the book entitled Advanced Data Warehouse Design: From Conventional to Spatial and Temporal Applications (Data-Centric Systems and Applications). The project was carried out under the technical supervision of Dr. Malinowski, and the direct beneficiary was the University of Costa Rica (UCR), where Dr. Malinowski is a professor at the Department of Computer Science and Informatics. The purpose of this project was twofold: first, to translate chapter III of said book with the intention of generating educational material for the use of the UCR and, second, to venture into the field of technical translation related to data warehouses. For the first component, the goal was to generate a final product that would eventually serve as an educational tool for the post-graduate courses of the UCR. For the second component, this project allowed me to acquire new skills and put into practice techniques that have helped me not only to perform better in my current job as an Assistant Translator of the Inter-American Development Bank (IDB), but also to use them in similar projects. The process was lengthy and required thorough research and constant communication with the author. The investigation focused on the search for terms and definitions to prepare the glossary, which was the basis to start the translation project. The translation process itself was carried out in phases, so that comments and corrections by the author could be taken into account in subsequent stages. Later, based on the glossary and the translated text, the illustrations that had been created in the Visio software were translated. In addition to the technical revision by the author, professor Carme Mangiron was in charge of revising the non-technical text. The result was a high-quality document that is currently used as reference and study material by the Department of Computer Science and Informatics of the University of Costa Rica.
Abstract:
This paper presents a case study that explores the advantages that can be derived from the use of a design support system during the design of wastewater treatment plants (WWTP). With this objective in mind, a simplified but plausible WWTP design case study has been generated with KBDS, a computer-based support system that maintains a historical record of the design process. The study shows how, by employing such a historical record, it is possible to: (1) rank different design proposals responding to a design problem; (2) study the influence of changing the weight of the arguments used in the selection of the most adequate proposal; (3) take advantage of keywords to assist the designer in the search for specific items within the historical records; (4) evaluate automatically the compliance of alternative design proposals with respect to the design objectives; (5) verify the validity of previous decisions after the modification of the current constraints or specifications; (6) re-use the design records when upgrading an existing WWTP or when designing similar facilities; (7) generate documentation of the decision-making process; and (8) associate a variety of documents as annotations to any component in the design history. The paper also shows one possible future role of design support systems as they outgrow their current reactive role as repositories of historical information and start to proactively support the generation of new knowledge during the design process.
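Points (1) and (2) can be illustrated with a minimal sketch that is not the KBDS system itself; the proposals, arguments, scores, and weights below are invented. Proposals are ranked by a weighted sum of argument scores, and changing the weights can change the winning proposal:

```python
# Hypothetical design proposals scored against weighted design arguments
proposals = {
    "extended_aeration": {"effluent_quality": 8, "capital_cost": 4, "footprint": 3},
    "conventional_as":   {"effluent_quality": 6, "capital_cost": 7, "footprint": 6},
    "sbr":               {"effluent_quality": 7, "capital_cost": 6, "footprint": 8},
}

def rank(weights):
    """Rank proposals by the weighted sum of their argument scores."""
    scores = {name: sum(weights[arg] * s for arg, s in args.items())
              for name, args in proposals.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank({"effluent_quality": 0.5, "capital_cost": 0.3, "footprint": 0.2}))
# Changing the argument weights can change which proposal is selected:
print(rank({"effluent_quality": 0.2, "capital_cost": 0.5, "footprint": 0.3}))
```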
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
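A minimal sketch of the MLE (expectation-maximization) iteration referred to above, started from a uniform field as the abstract prescribes; the system matrix and counts are toy values and this is not the FMAPE implementation:

```python
import numpy as np

# Toy emission-tomography problem: counts y ~ Poisson(A @ lam_true)
rng = np.random.default_rng(1)
A = rng.uniform(0, 1, (40, 16))           # system (projection) matrix, detectors x pixels
lam_true = rng.uniform(1, 10, 16)         # true activity image (flattened)
y = rng.poisson(A @ lam_true)             # measured counts

# MLE-EM (MLEM) iteration; uniform initial image, as discussed for MLE and FMAPE
lam = np.full(16, y.sum() / A.sum())      # uniform field
sens = A.sum(axis=0)                      # sensitivity of each pixel
for _ in range(200):
    proj = A @ lam                        # expected counts under the current image
    lam = lam / sens * (A.T @ (y / np.maximum(proj, 1e-12)))

# lam is now "feasible" in the abstract's sense: its forward projection
# is consistent with the Poisson-distributed data
print(np.round(lam, 2))
```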
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on using a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
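The space-variant idea can be sketched as follows, with the caveat that the segmentation labels, regions, and hyperparameter values below are invented and the actual FMAPE update is not reproduced: each segmented region receives its own hyperparameter, producing a per-pixel map that replaces the single constant used in a conventional reconstruction.

```python
import numpy as np

# Toy segmentation of a 6x6 image: label 0 = background region,
# label 1 = an extended region, label 2 = a small "star" region
labels = np.zeros((6, 6), dtype=int)
labels[2:5, 2:5] = 1
labels[0, 5] = 2

# Hyperparameter chosen per region (e.g. by feasibility tests or cross-validation)
beta_per_region = {0: 0.05, 1: 0.5, 2: 5.0}

# Space-variant hyperparameter map: one value per pixel
beta_map = np.vectorize(beta_per_region.get)(labels)

def weighted_entropy_prior(image, beta_map):
    """Entropy-type prior term in which each pixel's contribution is weighted
    by its own hyperparameter instead of a single global constant."""
    p = image / image.sum()
    return -(beta_map * p * np.log(np.maximum(p, 1e-12))).sum()

print(weighted_entropy_prior(np.ones((6, 6)), beta_map))
```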
Abstract:
When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at a reasonable length. We have decided to concentrate on image and noise models and on the algorithms used to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. Then the prior models used to restore astronomical images are examined. We describe the algorithms used to find the restoration for the most common combinations of degradation and image models. Then we comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.
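As a hedged, self-contained example of one common degradation/restoration pair treated in such reviews (known Gaussian blur with additive Gaussian noise, restored with a Wiener-type filter; the point-spread function and noise level are invented):

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered Gaussian point-spread function, normalized to unit sum."""
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

rng = np.random.default_rng(2)
image = np.zeros((64, 64))
image[30:34, 30:34] = 100.0                 # toy "star field" with one bright source
psf = gaussian_psf(image.shape, sigma=2.0)

# Degradation model: blurred = image (*) psf + Gaussian noise (circular convolution via FFT)
H = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * H)) + rng.normal(0, 1.0, image.shape)

# Wiener-type restoration: F_hat = conj(H) / (|H|^2 + K) * G, with K ~ noise-to-signal ratio
K = 0.01
G = np.fft.fft2(blurred)
restored = np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + K) * G))
print(float(restored.max()), float(image.max()))
```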
Abstract:
We describe the motivation, design, and implementation of the CORNISH survey, an arcsecond-resolution radio continuum survey of the inner Galactic plane at 5 GHz using the Very Large Array (VLA). It is a blind survey coordinated with the northern Spitzer GLIMPSE I region covering 10°
Abstract:
A method of making a multiple matched filter which allows the recognition of different characters in successive planes under simple conditions is proposed. The generation of the filter is based on recording on the same plate the Fourier transforms of the different patterns to be recognized, each of which is affected by a different spherical phase factor because the patterns have been placed at different distances from the lens. This is proved by means of experiments with a triple filter which allows satisfactory recognition of three characters.
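A digital analogue of the matched-filter operation can be sketched as follows (the toy "character", its placement, and the array sizes are invented; this only illustrates that Fourier-domain correlation with the conjugate spectrum of a stored pattern produces a recognition peak at the pattern's location):

```python
import numpy as np

def make_char(seed):
    """Toy 8x8 binary 'character' pattern."""
    return (np.random.default_rng(seed).random((8, 8)) > 0.5).astype(float)

scene = np.zeros((64, 64))
char_a = make_char(10)
scene[20:28, 40:48] = char_a                      # place the character in the scene

# Digital matched filter: multiply the scene spectrum by the conjugate of the
# pattern spectrum, then inverse-transform (i.e. cross-correlation)
F_scene = np.fft.fft2(scene)
F_filter = np.conj(np.fft.fft2(char_a, s=scene.shape))
correlation = np.real(np.fft.ifft2(F_scene * F_filter))

peak = np.unravel_index(np.argmax(correlation), correlation.shape)
print(peak)   # (20, 40): the correlation peak marks where the stored pattern sits
```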
Abstract:
One of the main questions to solve when analysing geographically aggregated information is the design of territorial units adjusted to the objectives of the study. This is related to the reduction of the effects of the Modifiable Areal Unit Problem (MAUP). In this paper an optimisation model to solve regionalisation problems is proposed. This model seeks to reduce the disadvantages found in previous work on automated regionalisation tools.
Abstract:
The aim of this paper is twofold. First, we study the determinants of economic growth among a wide set of potential variables for the Spanish provinces (NUTS3). Among others, we include various types of private, public and human capital in the group of growth factors. Also, we analyse whether Spanish provinces have converged in economic terms in recent decades. The second objective is to obtain cross-section and panel data parameter estimates that are robust to model specification. For this purpose, we use a Bayesian Model Averaging (BMA) approach. Bayesian methodology constructs parameter estimates as a weighted average of linear regression estimates for every possible combination of included variables. The weight of each regression estimate is given by the posterior probability of each model.
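A hedged sketch of the averaging step described here, not the authors' code: the data are simulated, the candidate regressors are invented, and the posterior model probabilities are approximated with BIC weights. Every subset of regressors is fit, and the coefficient estimates are averaged with weights given by each model's approximate posterior probability:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
n, names = 60, ["priv_capital", "pub_capital", "human_capital"]
X = rng.normal(size=(n, 3))
y = 0.8 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)   # toy growth rate

bics, betas = [], []
for k in range(len(names) + 1):
    for subset in combinations(range(len(names)), k):
        Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = np.sum((y - Xs @ coef) ** 2)
        bics.append(n * np.log(rss / n) + Xs.shape[1] * np.log(n))   # model fit + penalty
        full = np.zeros(len(names))
        full[list(subset)] = coef[1:]                                # zero for excluded vars
        betas.append(full)

# Posterior model probabilities approximated by BIC weights
w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()
bma_beta = (w[:, None] * np.array(betas)).sum(axis=0)
for name, b in zip(names, bma_beta):
    print(f"{name}: {b:.3f}")
```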
Abstract:
InAlAs/InGaAs/InP-based high electron mobility transistor devices have been structurally and electrically characterized using transmission electron microscopy and Raman spectroscopy and by measuring Hall mobilities. The InGaAs lattice-matched channels, with an In molar fraction of 53%, grown at temperatures lower than 530 °C exhibit alloy decomposition driving an anisotropic InGaAs surface roughness oriented along [1-10]. Conversely, lattice-mismatched channels with an In molar fraction of 75% do not present this lateral decomposition but a strain-induced roughness, whose strength increases as the channel growth temperature increases beyond 490 °C. In both cases the presence of the roughness implies low and anisotropic Hall mobilities of the two-dimensional electron gas.
Abstract:
We design optimal band pass filters for electrons in semiconductor heterostructures, under a uniform applied electric field. The inner cells are chosen to provide a desired transmission window. The outer cells are then designed to transform purely incoming or outgoing waves into Bloch states of the inner cells. The transfer matrix is interpreted as a conformal mapping in the complex plane, which allows us to write constraints on the outer cell parameters, from which physically useful values can be obtained.
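A hedged illustration of the transfer-matrix bookkeeping that underlies such designs, for a generic one-dimensional piecewise-constant potential without the applied field; the barrier heights and widths are invented and this is not the authors' band-pass construction. The total transfer matrix is the product of the interface matrices, and the transmission follows from its elements:

```python
import numpy as np

HBAR2_OVER_2M = 3.81  # eV·Å² (free-electron value; a real device would use the effective mass)

def k_of(E, V):
    """Complex wavevector (Å^-1) in a region with potential V (eV) at energy E (eV)."""
    return np.sqrt((E - V) / HBAR2_OVER_2M + 0j)

def transmission(E, potentials, widths):
    """Plane-wave transmission through piecewise-constant regions between V = 0 leads."""
    V_all = [0.0] + list(potentials) + [0.0]              # attach left/right leads
    x_all = np.concatenate([[0.0], np.cumsum(widths)])    # interface positions
    M = np.eye(2, dtype=complex)
    for i, x0 in enumerate(x_all):
        k1, k2 = k_of(E, V_all[i]), k_of(E, V_all[i + 1])
        r = k1 / k2
        # Interface matrix from matching the wavefunction and its derivative at x0
        step = 0.5 * np.array([
            [(1 + r) * np.exp(1j * (k1 - k2) * x0), (1 - r) * np.exp(-1j * (k1 + k2) * x0)],
            [(1 - r) * np.exp(1j * (k1 + k2) * x0), (1 + r) * np.exp(-1j * (k1 - k2) * x0)],
        ])
        M = step @ M
    # Wave incoming from the left only: transmitted amplitude = det(M) / M[1, 1]
    return abs(np.linalg.det(M) / M[1, 1]) ** 2

# Toy double-barrier "cell": shows a resonant transmission window between the barriers
potentials = [0.3, 0.0, 0.3]      # eV
widths     = [20.0, 50.0, 20.0]   # Å
for E in (0.01, 0.02, 0.05, 0.1):
    print(E, round(transmission(E, potentials, widths), 3))
```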
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
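A hedged numerical check of the Grover statement (standard closed-form Grover probabilities, not code from the paper): after each iteration up to the optimal one, the probability distribution over computational-basis states majorizes the previous one, i.e. its sorted partial sums are never smaller:

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """True if distribution p majorizes q: sorted partial sums of p dominate those of q."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(cp >= cq - tol))

def grover_distribution(n_items, step):
    """Probability distribution over basis states after `step` Grover iterations
    with a single marked item (standard closed-form amplitudes)."""
    theta = np.arcsin(1.0 / np.sqrt(n_items))
    p_marked = np.sin((2 * step + 1) * theta) ** 2
    p_rest = (1.0 - p_marked) / (n_items - 1)
    return np.concatenate([[p_marked], np.full(n_items - 1, p_rest)])

N = 64
optimal = int(np.floor(np.pi / (4 * np.arcsin(1 / np.sqrt(N)))))
for k in range(optimal):
    p_next = grover_distribution(N, k + 1)
    p_now = grover_distribution(N, k)
    print(k, "->", k + 1, majorizes(p_next, p_now))   # True at every step up to the optimum
```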