989 results for Aeroelasticity, Optimization, Uncertainty


Relevance: 20.00%

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data-sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately.

2. We evaluated how uncertainty in georeferences and the associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment, where models were calibrated with original, accurate data, and (2) an error treatment, where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate by a random amount drawn from a normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions.

3. Locational error in occurrences reduced model performance in three of these regions; nevertheless, relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best-performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors.

4. Synthesis and applications. To use the vast array of occurrence data that currently exists for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
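The error treatment in point 2 amounts to jittering each occurrence coordinate with zero-mean Gaussian noise whose standard deviation is 5 km. A minimal sketch of that degradation step, assuming projected coordinates in metres (the function and variable names are illustrative, not taken from the study):

import numpy as np

def degrade_occurrences(coords_m, sd_km=5.0, seed=None):
    """Add zero-mean Gaussian locational error to occurrence coordinates.

    coords_m : (n, 2) array of x/y coordinates in metres (projected CRS).
    sd_km    : standard deviation of the simulated error in kilometres.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(loc=0.0, scale=sd_km * 1000.0, size=coords_m.shape)
    return coords_m + noise

# Example: three occurrence records, jittered to simulate georeferencing error.
occurrences = np.array([[512_300.0, 4_201_150.0],
                        [498_760.0, 4_195_020.0],
                        [530_420.0, 4_210_880.0]])
degraded = degrade_occurrences(occurrences, sd_km=5.0, seed=42)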

Relevance: 20.00%

Abstract:

This paper studies optimal monetary policy in a framework that explicitly accounts for policymakers' uncertainty about the channels of transmission of oil prices into the economy. More specifically, I examine the robust response to the real price of oil that US monetary authorities would have been recommended to implement in the period 1970-2009, had they used the approach proposed by Cogley and Sargent (2005b) to incorporate model uncertainty and learning into policy decisions. In this context, I investigate the extent to which regulators' changing beliefs over different models of the economy play a role in the policy selection process. The main conclusion of this work is that, in the specific environment under analysis, one of the underlying models dominates the optimal interest rate response to oil prices. This result persists even when alternative assumptions on the models' priors change the pattern of the relative posterior probabilities, and can thus be attributed to the presence of model uncertainty itself.
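The Cogley-Sargent approach mentioned above weights model-specific policy prescriptions by the policymaker's evolving posterior probabilities over the competing models. A minimal, illustrative sketch of that weighting step (the models, posterior probabilities and response coefficients below are placeholders, not estimates from the paper):

# Hypothetical posterior probabilities over three competing models of
# oil-price transmission, updated as new data arrive (placeholder values).
posteriors = {"model_A": 0.55, "model_B": 0.30, "model_C": 0.15}

# Hypothetical model-specific optimal interest-rate responses to a 1%
# increase in the real price of oil (placeholder values).
oil_response = {"model_A": 0.08, "model_B": 0.02, "model_C": 0.12}

# Bayesian model averaging: posterior-weighted interest-rate response.
averaged_response = sum(posteriors[m] * oil_response[m] for m in posteriors)
print(f"posterior-weighted response to oil prices: {averaged_response:.3f}")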

Relevance: 20.00%

Abstract:

The objectives of the present study were to optimize the protocol of mouse immunization with Paracoccidioides brasiliensis antigens (Rifkind's protocol) and to test the modulating effect of cyclophosphamide (Cy) on the delayed hypersensitivity response (DHR) of immunized animals. Experiments were carried out using one to four immunizing doses of either crude particulate P. brasiliensis antigen or yeast-cell antigen, followed by a DHR test four or seven days after the last immunizing dose. The data demonstrated that a single immunizing dose already elicited a response; higher DHR indices were obtained with two or three immunizing doses; and there were no differences between the DHR indices of animals challenged four or seven days after the last dose. Overall, inoculation of two or three doses of the yeast-cell antigen, which is easier to prepare, together with the DHR test at day 4, simplifies the original Rifkind immunization protocol and shortens the duration of the experiments. The modulating effect of Cy on the DHR was assayed by administering 2.5, 20 or 100 mg/kg body weight at seven-day intervals, starting four days before the first immunizing dose. Only the treatment with 2.5 mg/kg Cy increased the DHR indices; treatment with 100 mg/kg Cy inhibited the DHR, whereas 20 mg/kg Cy did not affect the DHR indices. The results suggest an immunostimulating effect of a low dose of Cy on the DHR of mice immunized with P. brasiliensis antigens.

Relevance: 20.00%

Abstract:

To determine the diagnostic accuracy of physicians' prior probability estimates of serious infection in critically ill neonates and children, we conducted a prospective cohort study in 2 intensive care units. Using available clinical, laboratory, and radiographic information, 27 physicians provided 2567 probability estimates for 347 patients (follow-up rate, 92%). The median probability estimate of infection increased from 0% (i.e., no antibiotic treatment or diagnostic work-up for sepsis), to 2% on the day preceding initiation of antibiotic therapy, to 20% at initiation of antibiotic treatment (P<.001). At initiation of treatment, predictions discriminated well between episodes subsequently classified as proven infection and episodes ultimately judged unlikely to be infection (area under the curve, 0.88). Physicians also showed a good ability to predict blood culture-positive sepsis (area under the curve, 0.77). Treatment and testing thresholds were derived from the provided predictions and treatment rates. Physicians' prognoses regarding the presence of serious infection were remarkably precise. Studies investigating the value of new tests for diagnosis of sepsis should establish that they add incremental value to physicians' judgment.
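Discrimination of the kind reported above is measured by the area under the ROC curve computed from the physicians' probability estimates and the eventual classification of each episode. A minimal sketch of that calculation on made-up data (the estimates and labels are illustrative only):

from sklearn.metrics import roc_auc_score

# Illustrative probability estimates (0-1) given at initiation of antibiotic
# treatment, and eventual classification (1 = proven infection, 0 = unlikely).
estimates = [0.02, 0.20, 0.65, 0.10, 0.90, 0.35, 0.05, 0.80]
outcomes  = [0,    0,    1,    0,    1,    1,    0,    1]

# Area under the ROC curve: the probability that a randomly chosen infected
# episode received a higher estimate than a randomly chosen non-infected one.
auc = roc_auc_score(outcomes, estimates)
print(f"AUC = {auc:.2f}")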

Relevance: 20.00%

Abstract:

In this project, the satellite-to-aircraft link of a global aeronautical communications system has been analysed and optimized. This new system, called ANTARES, is designed to connect aircraft with ground stations through a satellite. It is an initiative involving official aviation institutions such as ECAC, developed as a European collaboration of universities and companies. The work carried out in the project covers essentially three aspects: the design and analysis of resource management; the suitability of using error correction at the link layer and, where needed, the design of a preliminary coding option; and, finally, the study and analysis of the effect of co-channel interference in multibeam systems. All of these topics are considered only for the forward link. The project first presents the overall characteristics of the system and then focuses on and analyses the topics mentioned, in order to provide results and draw conclusions.
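The third topic above, co-channel interference in multibeam systems, essentially compares the carrier power received on the serving beam with the aggregate power leaked by other beams reusing the same frequency. A minimal sketch of that carrier-to-interference (C/I) calculation (the powers are illustrative, not ANTARES link-budget values):

import numpy as np

def carrier_to_interference_db(carrier_dbw, interferer_dbw):
    """C/I in dB: serving-beam carrier power vs. the sum of co-channel
    interfering powers (all powers given in dBW at the receiver)."""
    interference_w = np.sum(10.0 ** (np.asarray(interferer_dbw) / 10.0))
    carrier_w = 10.0 ** (carrier_dbw / 10.0)
    return 10.0 * np.log10(carrier_w / interference_w)

# Example: serving beam at -120 dBW, three co-channel beams leaking power.
print(carrier_to_interference_db(-120.0, [-135.0, -138.0, -140.0]))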

Relevance: 20.00%

Abstract:

This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
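The alternative simple rules referred to above add interest-rate smoothing (persistence) and a direct response to the real price of oil to a standard Taylor rule. A minimal sketch of such a rule (the coefficient values are illustrative placeholders, not those evaluated in the paper):

def policy_rate(i_prev, inflation, output_gap, oil_price_change,
                rho=0.8, phi_pi=1.5, phi_y=0.5, phi_oil=0.1, r_star=2.0):
    """Simple rule with persistence (rho) and a response to the real oil price.

    Setting rho = 0 and phi_oil = 0 recovers a standard Taylor rule.
    All arguments are in percentage points; the coefficients are placeholders.
    """
    taylor_part = r_star + phi_pi * inflation + phi_y * output_gap
    return rho * i_prev + (1.0 - rho) * taylor_part + phi_oil * oil_price_change

# Example: previous rate 3%, inflation 2%, output gap -1%, oil price up 10%.
print(policy_rate(3.0, 2.0, -1.0, 10.0))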

Relevance: 20.00%

Abstract:

We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
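The distinguishing feature of the FMG/OPT algorithm described above is that the coarse-grid correction is treated as a search direction and scaled with a line search on the fine grid. A schematic sketch of that idea, stripped of the optical-flow specifics (the transfer operators and the coarse solver are assumed to be supplied by the caller):

import numpy as np

def coarse_correction_step(f, x, restrict, prolong, coarse_solve):
    """One FMG/OPT-style step: solve (approximately) on the coarse grid,
    prolong the correction, and scale it with a backtracking line search.

    f            : fine-grid objective, f(x) -> float
    restrict     : fine -> coarse transfer operator
    prolong      : coarse -> fine transfer operator
    coarse_solve : returns an (approximate) coarse-grid minimizer given a start
    """
    x_coarse = restrict(x)
    correction = prolong(coarse_solve(x_coarse) - x_coarse)  # search direction

    # Backtracking line search along the prolonged coarse-grid correction.
    step, fx = 1.0, f(x)
    while step > 1e-8 and f(x + step * correction) >= fx:
        step *= 0.5
    return x + step * correction if step > 1e-8 else x

# Toy example: quadratic objective on a 4-variable "fine" grid, with a
# 2-variable "coarse" representation (averaging/duplication as transfers).
f = lambda x: float(np.sum((x - np.arange(4)) ** 2))
restrict = lambda x: x.reshape(2, 2).mean(axis=1)
prolong = lambda xc: np.repeat(xc, 2)
coarse_solve = lambda xc: np.array([0.5, 2.5])  # minimizer of the restricted problem
print(coarse_correction_step(f, np.zeros(4), restrict, prolong, coarse_solve))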

Relevance: 20.00%

Abstract:

Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
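One simple way to quantify genotyping error from the sample replicates described above is the mismatch rate between the two replicate calls at loci genotyped in both, under the expectation of identical genotypes. A minimal sketch of that calculation (the locus names and genotype calls are illustrative; this is not the Stacks workflow itself):

def snp_error_rate(replicate_a, replicate_b):
    """Mismatch rate between genotype calls of two replicates of the same
    sample, ignoring loci missing in either replicate.

    replicate_a, replicate_b : dicts mapping locus ID -> genotype call
    (e.g. 'A/G'); any disagreement is counted as a genotyping error.
    """
    shared = [loc for loc in replicate_a if loc in replicate_b]
    if not shared:
        return float("nan")
    mismatches = sum(replicate_a[loc] != replicate_b[loc] for loc in shared)
    return mismatches / len(shared)

# Illustrative calls at three shared loci of one replicated sample.
rep1 = {"locus_1": "A/G", "locus_2": "C/C", "locus_3": "T/T"}
rep2 = {"locus_1": "A/G", "locus_2": "C/T", "locus_3": "T/T"}
print(snp_error_rate(rep1, rep2))  # 1 mismatch over 3 loci, roughly 0.33

Scoring this error rate (together with the number of informative loci retained) across runs with different assembly parameter values is one way the replicate-based optimization described above could be carried out.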

Relevance: 20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance: 20.00%

Abstract:

Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π_2^P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
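The NP-complete decision problem mentioned at the start of this abstract, whether a given pebble distribution can move a pebble onto a target vertex, can be checked by brute force on small graphs by trying every sequence of pebbling moves. A minimal sketch of that check (not the linear-optimization machinery of the Weight Function Lemma):

def can_reach(graph, pebbles, target, _seen=None):
    """Brute-force check of the pebbling reachability problem: can some
    sequence of pebbling moves place a pebble on `target`?

    graph   : dict vertex -> iterable of neighbours (undirected)
    pebbles : dict vertex -> non-negative pebble count
    """
    if pebbles.get(target, 0) > 0:
        return True
    if _seen is None:
        _seen = set()
    state = tuple(sorted(pebbles.items()))
    if state in _seen:          # this distribution was already explored
        return False
    _seen.add(state)
    for v, n_pebbles in pebbles.items():
        if n_pebbles >= 2:
            for u in graph[v]:
                nxt = dict(pebbles)
                nxt[v] -= 2                   # two pebbles leave v ...
                nxt[u] = nxt.get(u, 0) + 1    # ... one arrives at u (one is the toll)
                if can_reach(graph, nxt, target, _seen):
                    return True
    return False

# Example: on the path a-b-c, four pebbles on a suffice to reach c (4 -> 2 -> 1).
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(can_reach(path, {"a": 4, "b": 0, "c": 0}, "c"))  # True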

Relevance: 20.00%

Abstract:

This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
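A common way to implement the biased random behavior described above is to sort a heuristic's candidate moves best-first and then sample among them with a skewed, non-uniform distribution (for example a truncated geometric one), rather than always taking the top candidate. A minimal sketch on a toy knapsack instance (the distribution choice and the parameter beta are illustrative assumptions, not necessarily those of the paper):

import math
import random

def biased_random_choice(sorted_candidates, beta=0.3, rng=random):
    """Pick from a list sorted best-first using a truncated geometric
    distribution, so better-ranked candidates are more likely but any
    candidate can still be chosen; beta controls the bias strength."""
    n = len(sorted_candidates)
    idx = int(math.log(rng.random()) / math.log(1.0 - beta)) % n
    return sorted_candidates[idx]

def randomized_greedy_knapsack(items, capacity, beta=0.3, rng=random):
    """Classical greedy-by-ratio knapsack heuristic with biased randomization:
    instead of always taking the best value/weight ratio, sample it with a
    skewed distribution, so repeated runs yield diverse good solutions."""
    remaining = sorted(items, key=lambda it: it[1] / it[2], reverse=True)  # (name, value, weight)
    chosen, load = [], 0
    while remaining:
        candidates = [it for it in remaining if load + it[2] <= capacity]
        if not candidates:
            break
        pick = biased_random_choice(candidates, beta, rng)
        chosen.append(pick)
        load += pick[2]
        remaining.remove(pick)
    return chosen

items = [("a", 10, 5), ("b", 7, 4), ("c", 6, 3), ("d", 3, 2)]
print(randomized_greedy_knapsack(items, capacity=9))

Because each run samples a different sequence of good candidates, repeating the construction many times quickly yields a pool of alternative good solutions without any tuning beyond the single bias parameter.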

Relevance: 20.00%

Abstract:

The paper develops a stability theory for the optimal value and the optimal set mapping of optimization problems posed in a Banach space. The problems considered have an arbitrary number of inequality constraints involving lower semicontinuous (not necessarily convex) functions and one closed abstract constraint set. The perturbations considered lead to problems of the same type as the nominal one (with the same space of variables and the same number of constraints), in which the abstract constraint set can also be perturbed. The spaces of functions involved in the problems (objective and constraints) are equipped with the metric of uniform convergence on bounded sets, while the space of closed sets is endowed, consistently, with the Attouch-Wets topology. The paper examines, in a unified way, the lower and upper semicontinuity of the optimal value function, and the closedness and the lower and upper semicontinuity (in the sense of Berge) of the optimal set mapping. This paper can be seen as a second part of the stability theory presented in [17], where we studied the stability of the feasible set mapping (completed here with the analysis of the Lipschitz-like property).
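Schematically, the parametric problems covered by this stability theory can be written in the following form, where X is the Banach space of variables, T is an arbitrary index set, f and the g_t are lower semicontinuous (not necessarily convex) functions, and C is the closed abstract constraint set (a sketch of the setting, not notation taken verbatim from the paper):

\inf_{x \in X} \; f(x) \quad \text{subject to} \quad g_t(x) \le 0 \;\; (t \in T), \qquad x \in C.

Perturbations replace f and the g_t (measured by uniform convergence on bounded sets) and the set C (measured in the Attouch-Wets topology), while keeping the same variable space and the same index set T.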

Relevance: 20.00%

Abstract:

This paper presents the Juste-Neige system for predicting the snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow-cover management in order to i) reduce the production cost of artificial snow and thus improve the profit margin of the companies managing the ski resorts; and ii) reduce water and energy consumption, and thereby the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days ahead. On these maps, the areas most exposed to snow erosion are highlighted. The software proceeds in three steps: i) interpolation of snow height measurements with a neural network; ii) local meteorological forecasts for every ski resort; iii) simulation of the impact caused by skiers using a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
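Step i) of the pipeline above, interpolating scattered snow-height measurements with a neural network, can be sketched with a small regression model; the sketch below uses scikit-learn's MLPRegressor on made-up measurements and illustrates only the idea, not the Juste-Neige implementation:

import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative snow-height measurements: (x, y) position in metres along a
# ski run and measured snow height in centimetres (made-up values).
positions = np.array([[0, 0], [50, 10], [120, 40], [200, 80], [260, 120]], dtype=float)
heights_cm = np.array([85.0, 78.0, 60.0, 52.0, 40.0])

# Interpolate the measurements with a small neural network.
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(positions, heights_cm)

# Predict the snow height at unmeasured points of the run.
grid = np.array([[25, 5], [160, 60]], dtype=float)
print(net.predict(grid))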