902 results for Deterministic imputation
Abstract:
The prevalence of hypertension in African Americans (AAs) is higher than in other US groups; yet, few genome-wide association studies (GWASs) have been performed in AAs. Among people of European descent, GWASs have identified genetic variants at 13 loci that are associated with blood pressure. It is unknown if these variants confer susceptibility in people of African ancestry. Here, we examined genome-wide and candidate gene associations with systolic blood pressure (SBP) and diastolic blood pressure (DBP) using the Candidate Gene Association Resource (CARe) consortium consisting of 8591 AAs. Genotypes included genome-wide single-nucleotide polymorphism (SNP) data utilizing the Affymetrix 6.0 array with imputation to 2.5 million HapMap SNPs and candidate gene SNP data utilizing a 50K cardiovascular gene-centric array (ITMAT-Broad-CARe [IBC] array). For Affymetrix data, the strongest signal for DBP was rs10474346 (P = 3.6 × 10⁻⁸), located near GPR98 and ARRDC3. For SBP, the strongest signal was rs2258119 in C21orf91 (P = 4.7 × 10⁻⁸). The top IBC association for SBP was rs2012318 (P = 6.4 × 10⁻⁶) near SLC25A42, and for DBP it was rs2523586 (P = 1.3 × 10⁻⁶) near HLA-B. None of the top variants replicated in additional AA (n = 11 882) or European-American (n = 69 899) cohorts. We replicated previously reported European-American blood pressure SNPs in our AA samples (SH2B3, P = 0.009; TBX3-TBX5, P = 0.03; and CSK-ULK3, P = 0.0004). These genetic loci represent the best evidence of genetic influences on SBP and DBP in AAs to date. More broadly, this work supports the notion that blood pressure among AAs is a trait with genetic underpinnings but also with significant complexity.
Abstract:
We present an analytic and numerical study of the effects of external fluctuations in active media. Our analytical methodology transforms the initial stochastic partial differential equations into an effective set of deterministic reaction-diffusion equations. As a result we are able to explain and make quantitative predictions on the systematic and constructive effects of the noise, for example, target patterns created out of noise and traveling or spiral waves sustained by noise. Our study includes the case of realistic noises with temporal and spatial structures.
Abstract:
Aim Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach to community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location The western Swiss Alps. Methods We propose a new approach to process probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict and assess the uncertainty in the predictions of community properties. We test the utility of our novel approach against a traditional threshold-based approach. We used mountain butterfly communities spanning a large elevation gradient as a case study and evaluated the ability of our approach to model species richness and phylogenetic diversity of communities. Results S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, syndromes of environmental filtering. The prediction accuracy of community properties varies along the environmental gradient: variability in predictions of species richness was higher at low elevation, while it was lower for phylogenetic diversity. Our approach allowed mapping the variability in species richness and phylogenetic diversity projections. Main conclusion Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers of the uncertainty in the modelling results.
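The contrast between the traditional threshold rule and the probabilistic stacking described above can be sketched in a few lines: treating species occurrences as independent Bernoulli trials gives the expected richness and its variance directly. The probabilities below are illustrative, not the study's data.

```python
def threshold_richness(probs, t=0.5):
    """Traditional S-SDM: count species whose occurrence probability exceeds t."""
    return sum(p > t for p in probs)

def probabilistic_richness(probs):
    """Probabilistic S-SDM: expected richness and its variance,
    summing independent Bernoulli means and variances."""
    mean = sum(probs)
    var = sum(p * (1 - p) for p in probs)
    return mean, var

# Occurrence probabilities of four species at one site (illustrative):
site = [0.9, 0.6, 0.4, 0.1]
print(threshold_richness(site))      # 2
print(probabilistic_richness(site))
```

The variance term is what allows mapping the uncertainty of richness projections rather than a single thresholded count.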
Abstract:
In social insects, workers perform a multitude of tasks, such as foraging, nest construction, and brood rearing, without central control of how work is allocated among individuals. It has been suggested that workers choose a task by responding to stimuli gathered from the environment. Response-threshold models assume that individuals in a colony vary in the stimulus intensity (response threshold) at which they begin to perform the corresponding task. Here we highlight the limitations of these models with respect to colony performance in task allocation. First, we show with analysis and quantitative simulations that the deterministic response-threshold model constrains the workers' behavioral flexibility under some stimulus conditions. Next, we show that the probabilistic response-threshold model fails to explain precise colony responses to varying stimuli. Both of these limitations would be detrimental to colony performance when dynamic and precise task allocation is needed. To address these problems, we propose extensions of the response-threshold model by adding variables that weigh stimuli. We test the extended response-threshold model in a foraging scenario and show in simulations that it results in an efficient task allocation. Finally, we show that response-threshold models can be formulated as artificial neural networks, which consequently provide a comprehensive framework for modeling task allocation in social insects.
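The probabilistic response-threshold function discussed above is commonly written as T(s) = s^n / (s^n + θ^n): a worker with threshold θ engages in a task with this probability at stimulus level s. A minimal sketch (threshold values, stimulus levels, and steepness n = 2 are illustrative assumptions):

```python
import random

def response_probability(stimulus, threshold, n=2):
    """Probability T(s) = s^n / (s^n + theta^n) that a worker with
    response threshold theta takes up the task at stimulus level s."""
    return stimulus**n / (stimulus**n + threshold**n)

def colony_response(thresholds, stimulus):
    """One stochastic draw of which workers engage, given their thresholds."""
    return [random.random() < response_probability(stimulus, th)
            for th in thresholds]

# At s = theta the engagement probability is exactly 1/2; workers with
# low thresholds respond more readily than high-threshold workers.
p_low = response_probability(5.0, threshold=1.0)
p_high = response_probability(5.0, threshold=10.0)
```

Inter-individual variation in θ is what produces division of labor in these models; the paper's extensions add stimulus-weighting variables on top of this basic form.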
Abstract:
We study the effects of external noise in a one-dimensional model of front propagation. Noise is introduced through the fluctuations of a control parameter leading to a multiplicative stochastic partial differential equation. Analytical and numerical results for the front shape and velocity are presented. The linear-marginal-stability theory is found to increase its range of validity in the presence of external noise. As a consequence noise can stabilize fronts not allowed by the deterministic equation.
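A front equation with multiplicative noise in a control parameter, of the kind studied above, can be integrated with an explicit Euler-Maruyama scheme. This sketch uses an FKPP-type reaction term and illustrative parameters, not the paper's specific model:

```python
import math
import random

def step(u, dx, dt, D=1.0, a=1.0, noise=0.1):
    """One Euler-Maruyama step of du = (D u_xx + a u (1 - u)) dt + noise * u dW,
    i.e. fluctuations multiplying the field (control-parameter noise).
    Boundary values are held fixed."""
    n = len(u)
    new = u[:]
    for i in range(1, n - 1):
        lap = (u[i + 1] - 2 * u[i] + u[i - 1]) / dx**2
        drift = D * lap + a * u[i] * (1 - u[i])
        dW = random.gauss(0.0, math.sqrt(dt))
        new[i] = u[i] + drift * dt + noise * u[i] * dW
    return new

# Front initial condition: invaded state on the left, unstable state on the right.
u = [1.0] * 50 + [0.0] * 50
for _ in range(100):
    u = step(u, dx=0.5, dt=0.01)
```

Tracking the front position over many noise realizations is how the shift in front velocity relative to the deterministic (noise = 0) equation would be measured.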
Abstract:
BACKGROUND: Raltegravir (RAL) achieved remarkable virologic suppression rates in randomized clinical trials, but to date efficacy data and factors for treatment failures in a routine clinical care setting are limited. METHODS: First, factors associated with a switch to RAL were identified with a logistic regression including patients from the Swiss HIV Cohort Study with a history of three-class failure (n = 423). Second, predictors of virologic outcome were identified in an intent-to-treat analysis including all patients who received RAL. Last observation carried forward imputation was used to determine the week 24 response rate (HIV-1 RNA ≥ 50 copies/mL). RESULTS: The predominant factor associated with a switch to RAL in patients with suppressed baseline RNA was a regimen containing enfuvirtide [odds ratio 41.9 (95% confidence interval: 11.6-151.6)]. Efficacy analysis showed an overall response rate of 80.9% (152/188), whereas 71.8% (84/117) and 95.8% (68/71) showed viral suppression when stratified for detectable and undetectable RNA at baseline, respectively. Overall, CD4 cell counts increased significantly by 42 cells/μL (P < 0.001). Characteristics of failures were a genotypic sensitivity score of the background regimen ≤ 1, very low RAL plasma concentrations, poor adherence, and high viral load at baseline. CONCLUSIONS: Virologic suppression rates in our routine clinical care setting were promising and comparable with data from previously published randomized controlled trials.
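Last observation carried forward (LOCF) imputation, as used here for the week 24 endpoint, simply carries a patient's most recent observed value into later missing visits. A minimal sketch (the viral-load series is a made-up illustration, not study data):

```python
def locf(values):
    """Replace each missing value (None) with the most recently observed
    value. Leading missing values remain None."""
    last = None
    imputed = []
    for v in values:
        if v is not None:
            last = v
        imputed.append(last)
    return imputed

# A patient's viral-load series with two missed visits at the end:
series = [1200, None, 300, None, None]
print(locf(series))  # [1200, 1200, 300, 300, 300]
```

LOCF is a deterministic imputation rule; its known limitation is that it assumes the last measurement would have persisted unchanged, which can bias response rates in either direction.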
Abstract:
Soil slope instability concerning highway infrastructure is an ongoing problem in Iowa, as slope failures endanger public safety and continue to result in costly repair work. While in the past extensive research has been conducted on slope stability investigations and analysis, this current research study consists of field investigations addressing both the characterization and reinforcement of such slope failures. While Volume I summarizes the research methods and findings of this study, Volume II provides procedural details for incorporating an infrequently used testing technique, borehole shear tests, into practice. Fifteen slopes along Iowa highways were investigated, including thirteen slides (failed slopes), one unfailed slope, and one proposed embankment slope (the Sugar Creek Project). The slopes are mainly composed of either clay shale or glacial till, and are generally gentle and of small scale, with slope angles ranging from 11 deg to 23 deg and heights ranging from 6 to 23 m. Extensive field investigations and laboratory tests were performed for each slope. Field investigations included surveys of slope geometry, borehole drilling, soil sampling, in-situ Borehole Shear Testing (BST), and ground water table measurement. Laboratory investigations mainly comprised ring shear tests, basic soil property tests (grain size analysis and Atterberg limits tests), mineralogy analyses, soil classifications, and natural water content and density measurements on representative soil samples from each slope. Extensive direct shear tests and a few triaxial compression tests and unconfined compression tests were also performed on undisturbed soil samples for the Sugar Creek Project. Based on the results of field and lab investigations, slope stability analysis was performed on each of the slopes to determine the possible factors resulting in the slope failures or to evaluate the potential slope instabilities using limit equilibrium methods.
Deterministic slope analyses were performed for all the slopes. A probabilistic slope analysis and sensitivity study were also performed for the slope of the Sugar Creek Project. Results indicate that while the in-situ test rapidly provides effective shear strength parameters of soils, some training may be required for effective and appropriate use of the BST. Also, it is primarily intended for testing cohesive soils and can produce erroneous results in gravelly soils. Additionally, the quality of boreholes affects test results, and disturbance to borehole walls should be minimized before performing the test. A final limitation of widespread borehole shear testing may be its limited availability, as only about four to six test devices are currently in use in Iowa. Based on the data gathered in the field testing, reinforcement investigations are continued in Volume III.
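In its simplest form, a deterministic limit-equilibrium analysis reduces to an infinite-slope factor of safety, FS = [c' + (γh cos²β − u) tan φ'] / (γh sin β cos β). A minimal sketch with illustrative parameter values, not data from the Iowa sites:

```python
import math

def infinite_slope_fs(c, gamma, h, beta_deg, phi_deg, u=0.0):
    """Factor of safety for an infinite slope (effective-stress form).
    c: effective cohesion (kPa), gamma: unit weight (kN/m^3),
    h: depth to slip surface (m), beta_deg: slope angle (deg),
    phi_deg: effective friction angle (deg), u: pore pressure (kPa)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma * h * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * h * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Illustrative glacial-till slope; FS < 1 would indicate instability.
fs = infinite_slope_fs(c=10.0, gamma=19.0, h=5.0, beta_deg=20.0, phi_deg=25.0)
```

A probabilistic analysis of the kind applied to the Sugar Creek Project would treat c, φ, and u as random variables and report the probability that FS falls below 1.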
Abstract:
The theoretical aspects and the associated software of a bioeconomic model for Mediterranean fisheries are presented. The first objective of the model is to reproduce the bioeconomic conditions in which the fisheries occur. The model is, perforce, multispecies and multigear. The main management procedure is effort limitation. The model also incorporates the usual fishermen's strategy of increasing efficiency to obtain increased fishing mortality while maintaining the nominal effort. This is modelled by means of a function relating efficiency (or technological progress) to the capital invested in the fishery and time. A second objective is to simulate alternative management strategies. The model allows the operation of technical and economic management measures in the presence of different kinds of events. Both deterministic and stochastic simulations can be performed. An application of this tool to the hake fishery off Catalonia is presented, considering the other species caught and the different gears used. Several alternative management measures are tested and their consequences for the stock and the economy of fishermen are analysed.
Abstract:
We describe the version of the GPT planner to be used in the planning competition. This version, called mGPT, solves MDPs specified in the PPDDL language by extracting and using different classes of lower bounds, along with various heuristic-search algorithms. The lower bounds are extracted from deterministic relaxations of the MDP where alternative probabilistic effects of an action are mapped into different, independent, deterministic actions. The heuristic-search algorithms, on the other hand, use these lower bounds for focusing the updates and delivering a consistent value function over all states reachable from the initial state with the greedy policy.
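The deterministic relaxation described above, mapping each alternative probabilistic effect to its own independent deterministic action (an "all-outcomes" determinization), can be sketched as follows; the action representation is an assumption for illustration, not mGPT's internal encoding:

```python
def determinize(action_name, probabilistic_effects):
    """Split an action with alternative probabilistic effects into
    independent deterministic actions, one per outcome.
    probabilistic_effects: list of (probability, effect) pairs;
    probabilities are dropped in the relaxation."""
    return [(f"{action_name}_outcome{i}", effect)
            for i, (_prob, effect) in enumerate(probabilistic_effects)]

# An action that succeeds with p = 0.8 and fails with p = 0.2
# becomes two deterministic actions:
acts = determinize("move", [(0.8, {"at": "B"}), (0.2, {"at": "A"})])
print(acts)
```

Because the relaxed problem can always pick the most favorable outcome, its optimal cost lower-bounds the expected cost of the original MDP, which is what makes it usable as an admissible heuristic.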
Abstract:
Planning with partial observability can be formulated as a non-deterministic search problem in belief space. The problem is harder than classical planning, as keeping track of beliefs is harder than keeping track of states, and searching for action policies is harder than searching for action sequences. In this work, we develop a framework for partial observability that avoids these limitations and leads to a planner that scales up to larger problems. For this, the class of problems is restricted to those in which 1) the non-unary clauses representing the uncertainty about the initial situation are invariant, and 2) variables that are hidden in the initial situation do not appear in the body of conditional effects, which are all assumed to be deterministic. We show that such problems can be translated in linear time into equivalent fully observable non-deterministic planning problems, and that a slight extension of this translation renders the problem solvable by means of classical planners. The whole approach is sound and complete provided that, in addition, the state space is connected. Experiments are also reported.
Abstract:
The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable, and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is polynomial-time solvable for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that for any discrete-choice model the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.
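Under the MNL model referenced above, the probability of purchasing product j from an offered set S is v_j / (v_0 + Σ_{k∈S} v_k), where v_0 is the no-purchase weight. A minimal sketch (the preference weights are illustrative):

```python
def mnl_purchase_prob(weights, no_purchase_weight, offered):
    """MNL purchase probabilities over an offered assortment.
    weights: dict product -> preference weight v_j;
    offered: sequence of products in the assortment S."""
    denom = no_purchase_weight + sum(weights[j] for j in offered)
    return {j: weights[j] / denom for j in offered}

probs = mnl_purchase_prob({"a": 2.0, "b": 1.0}, no_purchase_weight=1.0,
                          offered=["a", "b"])
print(probs)  # {'a': 0.5, 'b': 0.25}
```

The dependence of every product's probability on the whole offered set is what couples the assortment decisions in CDLP and makes its column-generation (separation) step hard when segments' consideration sets overlap.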
Abstract:
Geophysical techniques can help to bridge the inherent gap with regard to spatial resolution and the range of coverage that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy towards hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider to be particularly suitable for local-scale studies characterized by high-resolution and high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, allow accounting for a wide variety of data and constraints of differing resolution and hardness, and thus have the potential of providing, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach provides significant advancements in the way that the larger-scale deterministic information resolved by the hydrogeophysical data can be accounted for, which represents an inherently problematic, and as of yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure.
Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
Abstract:
BACKGROUND: Nucleoside reverse transcriptase inhibitors (NRTIs) are often administered in salvage therapy even if genotypic resistance tests (GRTs) indicate high-level resistance, but little is known about the benefit of these additional NRTIs. METHODS: The effect of <2 compared with 2 NRTIs on viral suppression (HIV-1 RNA < 50 copies/mL) at week 24 was studied in salvage patients receiving raltegravir. Intent-to-treat and per-protocol analyses were performed; last observation carried forward imputation was used to deal with missing information. Logistic regressions were weighted to create a pseudopopulation in which the probability of receiving <2 and 2 NRTIs was unrelated to baseline factors predicting treatment response. RESULTS: One hundred thirty patients were included, of whom 58.5% (n = 76) received <2 NRTIs. NRTIs were often replaced by other drug classes. Patients with 2 NRTIs received fewer additional drug classes than patients with <2 NRTIs [median (IQR): 1 (1-2) compared with 2 (1-2), Wilcoxon P < 0.001]. The activity of non-NRTI treatment components was lower in the 2 NRTIs group than in the <2 NRTIs group [median (IQR) genotypic sensitivity score: 2 (1.5-2.5) compared with 2.5 (2-3), Wilcoxon P < 0.001]. The administration of <2 NRTIs was associated with a worse viral suppression rate at week 24. The odds ratios were 0.34 (95% confidence interval: 0.13 to 0.89, P = 0.027) and 0.19 (95% confidence interval: 0.05 to 0.79, P = 0.023) when performing the last observation carried forward and the per-protocol approach, respectively. CONCLUSIONS: Our findings showed that partially active or inactive NRTIs contribute to treatment response, and thus the use of 2 NRTIs in salvage regimens that include raltegravir seems warranted.
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. In order to achieve this goal, we rely on Monte Carlo Simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity due to the fact that it is not constrained by any previous assumption and relies on well-tested heuristics.
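The transformation described above, estimating deterministic processing times by Monte Carlo sampling and then evaluating job orders with a deterministic makespan computation, can be sketched as follows. The processing-time distributions and the exhaustive comparison of orders are illustrative assumptions, not the authors' exact heuristic:

```python
import random

def expected_times(sample_fns, n_samples=1000):
    """Monte Carlo estimate of expected processing times per (job, machine).
    sample_fns: for each job, a list of callables sampling one time per machine."""
    return [[sum(fn() for _ in range(n_samples)) / n_samples for fn in job]
            for job in sample_fns]

def makespan(times, order):
    """Permutation flow-shop makespan for a given job order."""
    m = len(times[0])
    finish = [0.0] * m                      # completion time on each machine
    for j in order:
        for k in range(m):
            start = max(finish[k], finish[k - 1] if k > 0 else 0.0)
            finish[k] = start + times[j][k]
    return finish[-1]

# Two jobs, two machines, exponential processing times (illustrative):
jobs = [[lambda: random.expovariate(1.0), lambda: random.expovariate(0.5)],
        [lambda: random.expovariate(0.8), lambda: random.expovariate(1.0)]]
det = expected_times(jobs)
best = min([(0, 1), (1, 0)], key=lambda order: makespan(det, order))
```

In the actual methodology a deterministic heuristic (rather than enumeration) would sequence the jobs on the estimated times, and the resulting schedule could then be re-evaluated by simulation on the stochastic times.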