924 results for indirect inference


Relevance:

20.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

20.00%

Publisher:

Abstract:

Causal inference with a continuous treatment is a relatively under-explored problem. In this dissertation, we adopt the potential outcomes framework. Potential outcomes are responses that would be seen for a unit under all possible treatments. In an observational study where the treatment is continuous, the potential outcomes are an uncountably infinite set indexed by treatment dose. We parameterize this unobservable set as a linear combination of a finite number of basis functions whose coefficients vary across units. This leads to new techniques for estimating the population average dose-response function (ADRF). Some techniques require a model for the treatment assignment given covariates, some require a model for predicting the potential outcomes from covariates, and some require both. We develop these techniques using a framework of estimating functions, compare them to existing methods for continuous treatments, and simulate their performance in a population where the ADRF is linear and the models for the treatment and/or outcomes may be misspecified. We also extend the comparisons to a data set of lottery winners in Massachusetts. Next, we describe the methods and functions in the R package causaldrf using data from the National Medical Expenditure Survey (NMES) and Infant Health and Development Program (IHDP) as examples. Additionally, we analyze the National Growth and Health Study (NGHS) data set and deal with the issue of missing data. Lastly, we discuss future research goals and possible extensions.
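
Since the abstract names the concrete modeling step, a minimal sketch may help: the hypothetical Python below simulates data with a linear ADRF, fits the outcome-model variant of the idea (a finite basis in the dose plus covariates), and averages the fitted curves over units. It illustrates the basis-expansion idea only, not the dissertation's estimating-function methods or the causaldrf API.

```python
import numpy as np

# Toy outcome-model ADRF estimate: model E[Y | T, X] with a finite basis
# in the dose T, then average the fitted dose-response curves over units.
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))                          # covariates
T = X @ np.array([0.5, -0.3]) + rng.normal(size=n)   # continuous dose
Y = 1.0 + 2.0 * T + X @ np.array([1.0, 1.0]) + rng.normal(size=n)

def basis(t):
    """Finite basis in the dose; a linear basis [1, t] for this example."""
    return np.column_stack([np.ones_like(t), t])

# Outcome regression Y ~ basis(T) + X by least squares
design = np.column_stack([basis(T), X])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)

# ADRF(t): average the predicted outcome at dose t over the empirical X
for t in np.linspace(T.min(), T.max(), 5):
    mu = basis(np.full(n, t)) @ coef[:2] + X @ coef[2:]
    print(f"t = {t:+.2f}  ADRF ~= {mu.mean():.2f}")
```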

Relevance:

20.00%

Publisher:

Abstract:

During our earlier research, it was recognised that, for an indirect genetic algorithm approach using a decoder to be successful, the decoder has to strike a balance between being an optimiser in its own right and finding feasible solutions. Previously this balance was achieved manually. Here we extend this work by presenting an automated approach in which the genetic algorithm itself, while solving the problem, sets the weights that balance these components. As a result, we were able to solve a complex and non-linear scheduling problem better than with a standard direct genetic algorithm implementation.
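
As a rough illustration of the mechanism (a toy problem with assumed details, not the authors' scheduling application), a chromosome can carry both a permutation and the decoder weight that trades cost against constraint violation, so the search tunes the balance while solving:

```python
import random

# Minimal sketch: each chromosome holds a permutation plus a feasibility
# weight w; the greedy decoder scores placements by cost + w * violation,
# so the GA evolves the cost/feasibility balance alongside the solution.
# cost() and violation() are toy placeholders, not the paper's problem.
random.seed(1)
N = 8

def cost(item, slot):
    return abs(item - slot)                     # placeholder cost term

def violation(item, slot):
    return 1.0 if slot > N // 2 else 0.0        # placeholder constraint term

def decode(perm, w):
    """Assign each item, in chromosome order, to its best remaining slot."""
    free, schedule = set(range(N)), {}
    for item in perm:
        slot = min(free, key=lambda s: cost(item, s) + w * violation(item, s))
        free.remove(slot)
        schedule[item] = slot
    return schedule

def fitness(chrom):
    perm, w = chrom
    return sum(cost(i, s) for i, s in decode(perm, w).items())

pop = [(random.sample(range(N), N), random.uniform(0.1, 10.0))
       for _ in range(20)]
for _ in range(50):                 # evolve permutation and weight together
    pop.sort(key=fitness)
    perm, w = pop[0]
    child = perm[:]
    i, j = random.sample(range(N), 2)
    child[i], child[j] = child[j], child[i]           # swap mutation
    pop[-1] = (child, w * random.uniform(0.8, 1.25))  # perturb the balance
print("best objective:", fitness(pop[0]))
```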

Relevance:

20.00%

Publisher:

Abstract:

In-situ observations of the size and shape of particles in Arctic cirrus are less common than those in mid-latitude and tropical cirrus, leaving considerable uncertainty about the contributions of small ice crystals (maximum dimension D < 50 μm) to the mass and radiative properties that affect radiative forcing. In-situ measurements of small ice crystals in Arctic cirrus were made during the Indirect and Semi-Direct Aerosol Campaign (ISDAC) in April 2008, during transits of the National Research Council of Canada Convair-580 between Fairbanks and Barrow, Alaska, and during the Mixed Phase Arctic Cloud Experiment (MPACE) in October 2004, with the University of North Dakota (UND) Citation over Barrow, Alaska. Concentrations of small ice crystals with D < 50 μm from a Cloud and Aerosol Spectrometer (CAS), a Cloud Droplet Probe (CDP), a Forward Scattering Spectrometer Probe (FSSP), and a two-dimensional stereo probe (2DS) were compared as functions of the concentrations of crystals with D > 100 μm measured by a Cloud Imaging Probe (CIP) and the 2DS, in order to assess whether the shattering of large ice crystals on protruding components of the different probes artificially amplified the measured concentrations of small ice crystals. The dependence of the probe comparison on other variables, such as the CIP N>100 (number concentration of particles with D > 100 μm), temperature, relative humidity with respect to ice (RHice), dominant habit from the Cloud Particle Imager (CPI), aircraft roll, pitch, true air speed, and angle of attack, was examined to understand potential causes of discrepancies between probe concentrations. Data collected by these probes were also compared against data collected by a CAS, CDP, and CIP during the Tropical Warm Pool-International Cloud Experiment (TWP-ICE) and by a CAS and 2DS during the Tropical Composition, Cloud and Climate Coupling (TC4) missions. During ISDAC, the CAS and FSSP both overestimated concentrations of small ice crystals compared to the CDP and 2DS by 1-2 orders of magnitude, and the overestimation increased with the concentration of large crystals from the CIP (N>100 > 0.1 L-1). There was an unexplained discrepancy in concentrations of small crystals between the CDP and 2DS during ISDAC. In addition, the average ratios N3-50,CAS/N3-50,CDP, N3-50,FSSP096/N3-50,CDP, N3-50,CAS/N3-50,FSSP096, N10-50,CDP/N10-50,2DS, and N10-50,FSSP096/N10-50,2DS depended strongly on RHice. Continued studies are needed to understand the discrepancies between these probes.

Relevance:

20.00%

Publisher:

Abstract:

In physics, one attempts to infer the rules governing a system given only the results of imperfect measurements. Hence, microscopic theories may be effectively indistinguishable experimentally. We develop an operationally motivated procedure to identify the corresponding equivalence classes of states, and argue that the renormalization group (RG) arises from the inherent ambiguities associated with the classes: one encounters flow parameters as, e.g., a regulator, a scale, or a measure of precision, which specify representatives in a given equivalence class. This provides a unifying framework and reveals the role played by information in renormalization. We validate this idea by showing that it justifies the use of low-momenta n-point functions as statistically relevant observables around a Gaussian hypothesis. These results enable the calculation of distinguishability in quantum field theory. Our methods also provide a way to extend renormalization techniques to effective models which are not based on the usual quantum-field formalism, and elucidate the relationships between various types of RG.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis was to examine strategy formation in indirect procurement. The current literature on purchasing strategy has mainly focused on direct procurement, whereas strategy formation in indirect procurement has received much less attention. Nevertheless, weak economic times have been pushing companies to find savings elsewhere as well. The theoretical part of the thesis focused on three main subjects: indirect procurement, purchasing strategy, and category management. This thesis was a case study whose initiative arose from the company's need to develop its indirect procurement. The objectives of the empirical part were to identify the current state of the indirect procurement function and to resolve its biggest issues. The analysis was based on multiple managerial interviews and a spend analysis covering ten different countries. Based on these analyses and the theoretical findings, a framework and recommendations for the case company were developed.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new type of genetic algorithm for the set covering problem. It differs from previous evolutionary approaches first because it is an indirect algorithm, i.e. the actual solutions are found by an external decoder function. The genetic algorithm itself provides this decoder with permutations of the solution variables and other parameters. Second, it will be shown that results can be further improved by adding another indirect optimisation layer. The decoder does not directly seek out low-cost solutions but instead aims for good exploitable solutions, which are then post-optimised by a hill-climbing algorithm. Although seemingly more complicated, we will show that this three-stage approach has advantages in terms of solution quality, speed, and adaptability to new types of problems over more direct approaches. Extensive computational results are presented and compared to the latest evolutionary and other heuristic approaches on the same data instances.
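
A minimal sketch of the three-stage idea on a toy set covering instance follows; the instance, the greedy decoder, and the redundancy-removing hill climber are all illustrative assumptions rather than the paper's exact components:

```python
import random

# Toy instance: 12 rows to cover, 20 columns, each with a cost. One row is
# forced into each column so the instance is guaranteed to be coverable.
random.seed(0)
UNIVERSE = set(range(12))
COLUMNS = {}
for c in range(20):
    rows = set(random.sample(sorted(UNIVERSE), 4)) | {c % 12}
    COLUMNS[c] = (rows, random.randint(1, 9))  # column -> (rows, cost)

def decode(perm):
    """Greedy decoder: add columns in chromosome order until rows are covered."""
    covered, chosen = set(), []
    for c in perm:
        if COLUMNS[c][0] - covered:
            chosen.append(c)
            covered |= COLUMNS[c][0]
        if covered == UNIVERSE:
            break
    return chosen

def hill_climb(chosen):
    """Post-optimiser: drop any column whose rows the others already cover."""
    for c in sorted(chosen, key=lambda col: -COLUMNS[col][1]):
        others = [d for d in chosen if d != c]
        if others and set().union(*(COLUMNS[d][0] for d in others)) == UNIVERSE:
            chosen = others
    return chosen

def total_cost(chosen):
    return sum(COLUMNS[c][1] for c in chosen)

# Stand-in for the GA loop: sample permutations, decode, post-optimise.
best = min((hill_climb(decode(random.sample(range(20), 20)))
            for _ in range(200)), key=total_cost)
print(sorted(best), total_cost(best))
```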

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a Genetic Algorithms approach to a manpower-scheduling problem arising at a major UK hospital. Although Genetic Algorithms have been successfully used for similar problems in the past, they always had to overcome the limitations of the classical Genetic Algorithms paradigm in handling the conflict between objectives and constraints. The approach taken here is to use an indirect coding based on permutations of the nurses, and a heuristic decoder that builds schedules from these permutations. Computational experiments based on 52 weeks of live data are used to evaluate three different decoders with varying levels of intelligence, and four well-known crossover operators. Results are further enhanced by introducing a hybrid crossover operator and by making use of simple bounds to reduce the size of the solution space. The results reveal that the proposed algorithm is able to find high quality solutions and is both faster and more flexible than a recently published Tabu Search approach.
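
Because the encoding is a permutation of the nurses, crossover must preserve permutation validity. The sketch below shows order crossover (OX), one simple variant of the well-known permutation operators such studies compare; the heuristic decoder that turns a permutation into a roster is problem-specific and omitted here.

```python
import random

def order_crossover(p1, p2):
    """Copy a slice from p1, fill the remaining positions in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    hole = set(p1[a:b])
    filler = [g for g in p2 if g not in hole]   # p2 order, minus the slice
    return filler[:a] + p1[a:b] + filler[a:]

random.seed(4)
nurses1 = [0, 1, 2, 3, 4, 5, 6, 7]              # two parent orderings
nurses2 = [7, 6, 5, 4, 3, 2, 1, 0]
print(order_crossover(nurses1, nurses2))        # still a valid permutation
```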

Relevance:

20.00%

Publisher:

Abstract:

Background: In sub-Saharan African countries, the chance of a child dying before the age of five years is high. Ethiopia is no exception, although its rate has been decreasing over the years. Methods: The 2000, 2005, and 2011 Ethiopian Demographic and Health Survey results were used for this work. The purpose of the study is to detect the pattern of under-five child mortality over time. An indirect child mortality estimation technique was adopted to examine the under-five child mortality trend in Ethiopia. Results: The results reveal the trend of under-five child mortality in Ethiopia, which shows a decline over the study period. Conclusion: The study reflects the positive correlation between mother and child survival that holds in almost any population, and documents the declining trend of under-five mortality in Ethiopia over time.
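
For context, a Brass-type indirect estimate converts survey tabulations of children ever born and children surviving, by mother's age group, into probabilities of dying by exact childhood ages. The sketch below shows only the shape of that calculation; the survey numbers and the multipliers are invented placeholders, not values from this study.

```python
# Illustrative Brass-style conversion: proportion of children dead by
# maternal age group -> q(x), the probability of dying by exact age x.
# All numbers below, including the multipliers k, are placeholders.
AGE_GROUPS = ["15-19", "20-24", "25-29", "30-34", "35-39"]
ceb       = [0.3, 1.2, 2.5, 3.6, 4.4]   # avg children ever born per woman
surviving = [0.27, 1.1, 2.3, 3.3, 4.0]  # avg children still alive
k         = [1.1, 1.05, 1.0, 1.0, 1.0]  # illustrative multipliers
x         = [1, 2, 3, 5, 10]            # exact age indexed by each group

for grp, c, s, ki, xi in zip(AGE_GROUPS, ceb, surviving, k, x):
    d = (c - s) / c                     # proportion of children dead, D(i)
    q = ki * d                          # q(x); q(5) is under-five mortality
    print(f"mothers {grp}: D = {d:.3f} -> q({xi}) = {q:.3f}")
```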

Relevance:

20.00%

Publisher:

Abstract:

Wind energy is one of the most promising and fastest growing sectors of energy production. Wind is an ecologically friendly and relatively cheap energy resource, available for development in practically all corners of the world where the wind blows. Today, wind power is broadly developed in the Scandinavian countries. Three important challenges concerning sustainable development, i.e. energy security, climate change, and energy access, make a compelling case for large-scale utilization of wind energy. In Finland, according to the climate and energy strategy adopted in 2008, electricity generated by wind farms should reach 6-7% of total national consumption by 2020 [1]. The main challenges associated with wind energy production are the harsh operating conditions that often accompany turbine operation in northern climates, and poor accessibility for maintenance and service. One of the major problems requiring a solution is the icing of turbine structures. Icing reduces the performance of wind turbines and, over a long cold period, can significantly affect the reliability of the power supply. In order to predict and control power performance, the process of ice accretion has to be carefully tracked. There are two ways to detect icing: directly, with special ice-detection instruments, or indirectly, from characteristics of turbine performance. One such indirect method for ice detection and power-loss estimation is proposed and used in this paper, and its results are compared with those obtained directly from ice sensors. The data used were measured at the Muukko wind farm in southeast Finland during the project 'Wind power in cold climate and complex terrain', carried out in 9/2013 - 8/2015 with the partners Lappeenranta University of Technology, Alstom Renovables España S.L., TuuliMuukko, and TuuliSaimaa.
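
A minimal sketch of how such an indirect detector can be structured (assumed thresholds and logic, not the method evaluated in the paper): learn a reference power curve from ice-free operation, then flag icing when measured power stays persistently below the curve at temperatures where icing is plausible.

```python
def expected_power(wind_speed, curve):
    """Expected power for the nearest wind-speed bin of the reference curve."""
    return curve.get(round(wind_speed), 0.0)

def detect_icing(samples, curve, deficit=0.15, t_max=2.0, run=6):
    """Flag icing after `run` consecutive samples whose relative power
    deficit exceeds `deficit` while air temperature is <= t_max (deg C)."""
    streak = 0
    for wind, power, temp in samples:
        exp = expected_power(wind, curve)
        low = exp > 0 and (exp - power) / exp > deficit
        streak = streak + 1 if (low and temp <= t_max) else 0
        if streak >= run:
            return True
    return False

# Reference curve (kW per 1 m/s bin, from ice-free data) and fake samples:
curve = {4: 100.0, 5: 200.0, 6: 350.0, 7: 550.0, 8: 800.0}
iced = [(6.0, 250.0, -3.0)] * 8   # about 29% below the 350 kW expectation
print(detect_icing(iced, curve))  # -> True
```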

Relevance:

20.00%

Publisher:

Abstract:

Phylogenetic inference consists in the search for an evolutionary tree that explains, as well as possible, the genealogical relationships of a set of species. Phylogenetic analysis has a large number of applications in areas such as biology, ecology, and paleontology. Several criteria have been defined for inferring phylogenies, among them maximum parsimony and maximum likelihood. The first tries to find the phylogenetic tree that minimizes the number of evolutionary steps needed to describe the evolutionary history among species, while the second tries to find the tree with the highest probability of producing the observed data under an evolutionary model. The search for a phylogenetic tree can be formulated as a multi-objective optimization problem, which aims to find trees that satisfy both the parsimony and likelihood criteria simultaneously, as far as possible. Because these criteria conflict, there is no single optimal solution (a single tree) but a set of compromise solutions, known as the Pareto-optimal set. Evolutionary algorithms are nowadays used with success to find these solutions. These algorithms are a family of approximate techniques inspired by the process of natural selection, and they usually find high-quality solutions to convoluted optimization problems. They work by manipulating a set of trial solutions (trees, in the case of phylogeny) with operators: some exchange information between solutions, simulating DNA crossover, while others apply random modifications, simulating mutation. The result of these algorithms is an approximation to the Pareto-optimal set, which can be shown in a graph so that the expert in the problem (the biologist, in the case of inference) can choose the compromise solution of greatest interest. For multi-objective optimization applied to phylogenetic inference, there is an open-source software tool, MO-Phylogenetics, designed to solve inference problems with both classic and state-of-the-art evolutionary algorithms.

REFERENCES
[1] C.A. Coello Coello, G.B. Lamont, D.A. van Veldhuizen. Evolutionary Algorithms for Solving Multi-Objective Problems. Springer, August 2007.
[2] C. Zambrano-Vega, A.J. Nebro, J.F. Aldana-Montes. MO-Phylogenetics: a phylogenetic inference software tool with multi-objective evolutionary metaheuristics. Methods in Ecology and Evolution. In press, February 2016.
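
As a small illustration of the Pareto-dominance idea above (toy scores, not real phylogenies): a tree survives the filter if no other tree is at least as good on both objectives and strictly better on one, with parsimony steps minimized and log-likelihood maximized.

```python
def dominates(a, b):
    """a dominates b: no worse on both objectives and strictly better on one.
    A score is a pair (parsimony_steps, log_likelihood)."""
    no_worse = a[0] <= b[0] and a[1] >= b[1]
    better = a[0] < b[0] or a[1] > b[1]
    return no_worse and better

def pareto_front(scored):
    """Keep the candidates that no other candidate dominates."""
    return [(t, s) for t, s in scored
            if not any(dominates(s2, s) for _, s2 in scored)]

trees = [("T1", (110, -4520.0)), ("T2", (115, -4500.0)),
         ("T3", (120, -4480.0)), ("T4", (125, -4470.0)),
         ("T5", (130, -4600.0))]          # T5 is dominated by all the others
print([t for t, _ in pareto_front(trees)])  # -> ['T1', 'T2', 'T3', 'T4']
```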