985 results for capture-recapture models
Abstract:
The paper presents a new model based on the basic Maximum Capture model, MAXCAP. The new Chance Constrained Maximum Capture model introduces a stochastic threshold constraint, which recognises the fact that a facility can be open only if a minimum level of demand is captured. A metaheuristic based on the MAX-MIN Ant System and a tabu search procedure is presented to solve the model. This is the first time that the MAX-MIN Ant System has been adapted to solve a location problem. Computational experience and an application to a 55-node network are also presented.
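For readers unfamiliar with the metaheuristic mentioned above: the MAX-MIN Ant System differs from the basic ant system in that only the best ant deposits pheromone and trail values are clamped to a fixed interval. A minimal sketch of one trail update, assuming solutions are sets of candidate site indices (all names, scores, and parameter values here are illustrative, not taken from the paper):

```python
def mmas_step(tau, solutions, scores, rho=0.1, tau_min=0.01, tau_max=1.0):
    """One MAX-MIN Ant System pheromone update: evaporate all trails,
    let only the best solution deposit, and clamp to [tau_min, tau_max]."""
    best = max(range(len(solutions)), key=lambda i: scores[i])
    for j in range(len(tau)):
        tau[j] *= (1.0 - rho)                        # evaporation
        if j in solutions[best]:                     # best-only deposit
            tau[j] += rho * scores[best]
        tau[j] = min(tau_max, max(tau_min, tau[j]))  # MMAS trail limits
    return tau
```

In a location setting each trail `tau[j]` would bias ants toward opening candidate site `j` in the next construction round.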
Abstract:
A new direction of research in Competitive Location theory incorporates theories of Consumer Choice Behavior in its models. Following this direction, this paper studies the importance of consumer behavior with respect to distance or transportation costs in the optimality of locations obtained by traditional Competitive Location models. To do this, it considers different ways of defining a key parameter in the basic Maximum Capture model (MAXCAP). This parameter reflects various ways of taking distance into account, based on several Consumer Choice Behavior theories. For each model, the optimal locations are computed, along with the deviation in captured demand when the optimal locations of the other models are used instead of the true ones. A metaheuristic based on GRASP and a tabu search procedure is presented to solve all the models. Computational experience and an application to a 55-node network are also presented.
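The key parameter above governs how demand is split by distance. As an illustration of two common Consumer Choice Behavior assumptions, all-or-nothing capture by the closest facility versus Huff-style proportional capture, and not the paper's exact definitions:

```python
def captured_demand(demand, dist, new_fac, rule="closest"):
    """Demand captured by an entrant facility under two consumer-choice rules.
    demand[i]: buying power at node i; dist[i][j]: distance from node i to
    facility j; new_fac: index of the entrant's facility (illustrative)."""
    total = 0.0
    for i, w in enumerate(demand):
        d = dist[i]
        if rule == "closest":            # all-or-nothing: nearest facility wins
            total += w if d[new_fac] == min(d) else 0.0  # ties count as capture
        elif rule == "proportional":     # Huff-style 1/d attraction split
            attr = [1.0 / dj for dj in d]
            total += w * attr[new_fac] / sum(attr)
    return total
```

Running both rules on the same network shows how the "optimal" site can shift with the behavioral assumption, which is the paper's central point.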
Abstract:
In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem. In the original MCP, market capture is determined by lower traveling distances or lower traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. This problem is hard to solve using standard optimization techniques. Metaheuristics are shown to offer accurate results within acceptable computing times.
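One plausible way to combine the two cost components is to add the expected M/M/1 waiting time to the travel time; the paper's exact queueing formulation may differ, so treat this as an assumption-laden sketch:

```python
def total_cost(travel_time, arrival_rate, service_rate):
    """Perceived cost = travel time + expected M/M/1 time in system.
    (An illustrative queueing choice, not necessarily the paper's model.)"""
    if arrival_rate >= service_rate:
        return float("inf")                           # unstable queue: infinite wait
    wait = 1.0 / (service_rate - arrival_rate)        # mean M/M/1 sojourn time
    return travel_time + wait

def best_facility(travel, lam, mu):
    """Customers choose the facility with the lowest combined cost."""
    costs = [total_cost(t, l, m) for t, l, m in zip(travel, lam, mu)]
    return min(range(len(costs)), key=lambda j: costs[j]), costs
```

Note the feedback that makes the problem hard: captured demand raises `arrival_rate`, which raises waiting time, which in turn changes capture.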
Abstract:
Most facility location decision models ignore the fact that, for a facility to survive, it needs a minimum demand level to cover costs. In this paper we present a decision model for a firm that wishes to enter a spatial market where several competitors are already located. This market is such that each outlet must reach a demand threshold level in order to survive. The firm wishes to know where to locate its outlets so as to maximize its market share while taking the threshold level into account. It may happen that, due to this new entrance, some competitors will not be able to meet the threshold and will therefore disappear. A formulation is presented together with a heuristic solution method and computational experience.
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. The paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
Abstract:
Understanding the structure of interphase chromosomes is essential to elucidate the regulatory mechanisms of gene expression. In recent years, high-throughput DNA sequencing has expanded the power of chromosome conformation capture (3C) methods, which provide information about the reciprocal spatial proximity of chromosomal loci. Since 2012, it has been known that the entire chromatin of interphase chromosomes is organized into regions with a strongly increased frequency of internal contacts. These regions, with an average size of ∼1 Mb, were named topological domains. More recent studies demonstrated the presence of unconstrained supercoiling in interphase chromosomes. Using Brownian dynamics simulations, we show here that including supercoiling in models of topological domains can reproduce, and thus offer possible explanations for, several experimentally observed characteristics of interphase chromosomes, such as their complex contact maps.
Abstract:
Aim Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach to community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location The western Swiss Alps. Methods We propose a new approach to processing probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict community properties and assess the uncertainty in those predictions. We test the utility of our novel approach against a traditional threshold-based approach. We used mountain butterfly communities spanning a large elevation gradient as a case study and evaluated the ability of our approach to model the species richness and phylogenetic diversity of communities. Results S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, a syndrome of environmental filtering. The prediction accuracy of community properties varies along the environmental gradient: variability in predictions of species richness was higher at low elevation, while for phylogenetic diversity it was lower. Our approach allowed mapping the variability in species richness and phylogenetic diversity projections. Main conclusion Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers of the uncertainty in the modelling results.
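The contrast between the probabilistic and threshold-based approaches can be sketched for a single site: drawing each species' presence as a Bernoulli trial on its SDM occurrence probability yields a whole richness distribution, whereas thresholding yields a single count. A toy illustration, not the authors' implementation:

```python
import random

def richness_distribution(probs, n_draws=1000, seed=0):
    """Probabilistic S-SDM stacking for one site: each draw samples every
    species' presence from its occurrence probability, giving a
    distribution of species richness rather than one number."""
    rng = random.Random(seed)
    return [sum(rng.random() < p for p in probs) for _ in range(n_draws)]

def threshold_richness(probs, threshold=0.5):
    """Traditional approach: count species whose probability exceeds a cutoff."""
    return sum(p >= threshold for p in probs)
```

The spread of the sampled distribution is what the authors map as prediction variability along the elevation gradient.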
Abstract:
This paper discusses uncertainties in model projections of summer drying in the Euro-Mediterranean region related to errors and uncertainties in the simulation of the summer NAO (SNAO). The SNAO is the leading mode of summer SLP variability in the North Atlantic/European sector and modulates precipitation not only in the vicinity of the SLP dipole (northwest Europe) but also in the Mediterranean region. An analysis of CMIP3 models is conducted to determine the extent to which models reproduce the signature of the SNAO and its impact on precipitation and to assess the role of the SNAO in the projected precipitation reductions. Most models correctly simulate the spatial pattern of the SNAO and the dry anomalies in northwest Europe that accompany the positive phase. The models also capture the concurrent wet conditions in the Mediterranean, but the amplitude of this signal is too weak, especially in the east. This error is related to the poor simulation of the upper-level circulation response to a positive SNAO, namely the observed trough over the Balkans that creates potential instability and favors precipitation. The SNAO is generally projected to trend upwards in CMIP3 models, leading to a consistent signal of precipitation reduction in NW Europe, but the intensity of the trend varies greatly across models, resulting in large uncertainties in the magnitude of the projected drying. In the Mediterranean, because the simulated influence of the SNAO is too weak, no precipitation increase occurs even in the presence of a strong SNAO trend, reducing confidence in these projections.
Abstract:
The aim of this work is to compare two families of mathematical models for their respective capability to capture the statistical properties of real electricity spot market time series. The first family is ARMA-GARCH models and the second is mean-reverting Ornstein-Uhlenbeck models. Both have been applied to two price series of the Nordic Nord Pool spot market for electricity, namely the System prices and the DenmarkW prices. The parameters of both models were calibrated from the real time series. After carrying out simulations with optimal models from both families, we conclude that neither ARMA-GARCH models nor conventional mean-reverting Ornstein-Uhlenbeck models, even when calibrated optimally with real electricity spot market price or return series, capture the statistical characteristics of the real series. In the case of less spiky behavior (System prices), however, the mean-reverting Ornstein-Uhlenbeck model can be seen to have partially succeeded in this task.
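The mean-reverting benchmark above can be simulated with a standard Euler-Maruyama discretisation of the Ornstein-Uhlenbeck SDE dX = θ(μ − X)dt + σdW (parameter names are illustrative; the paper's calibration procedure is not reproduced here):

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, n, seed=0):
    """Euler-Maruyama path of a mean-reverting Ornstein-Uhlenbeck process:
    each step adds a pull toward mu of strength theta plus Gaussian noise."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        x = path[-1]
        dw = rng.gauss(0.0, math.sqrt(dt))            # Brownian increment
        path.append(x + theta * (mu - x) * dt + sigma * dw)
    return path
```

Spot-price spikes are precisely what this Gaussian-noise process cannot reproduce, which matches the paper's negative conclusion for the spikier DenmarkW series.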
Abstract:
This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited by using modelling tools. Techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before building expensive pilot plants. Post-combustion calcium looping is a developing carbon capture process which utilizes fluidized bed technology with lime as a sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory scale unit and a 1.7 MWth pilot scale unit, and was used to design a conceptual 250 MWth industrial scale unit. Valuable information was gathered on the behaviour of the small scale laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial sized unit, the selection of particle size, and operability in different load scenarios.
Abstract:
Immunoglobulin E (IgE) and mast cells are believed to play important roles in allergic inflammation. However, their contributions to the pathogenesis of human asthma have not been clearly established. Significant progress has been made recently in our understanding of airway inflammation and airway hyperresponsiveness through studies of murine models of asthma and genetically engineered mice. Some of the studies have provided significant insights into the role of IgE and mast cells in the allergic airway response. In these models mice are immunized systemically with soluble protein antigens and then receive an antigen challenge through the airways. Bronchoalveolar lavage fluid from mice with allergic airway inflammation contains significant amounts of IgE. The IgE can capture the antigen presented to the airways and the immune complexes so formed can augment allergic airway response in a high-affinity IgE receptor (FcepsilonRI)-dependent manner. Previously, there were conflicting reports regarding the role of mast cells in murine models of asthma, based on studies of mast cell-deficient mice. More recent studies have suggested that the extent to which mast cells contribute to murine models of asthma depends on the experimental conditions employed to generate the airway response. This conclusion was further supported by studies using FcepsilonRI-deficient mice. Therefore, IgE-dependent activation of mast cells plays an important role in the development of allergic airway inflammation and airway hyperresponsiveness in mice under specific conditions. The murine models used should be of value for testing inhibitors of IgE or mast cells for the development of therapeutic agents for human asthma.
Abstract:
The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second one studies reaction systems, a modeling framework with its rationale built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic in the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modeling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modeling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state, and periodicity, to enable model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties ranges from P through NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
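The single-step semantics of reaction systems is compact enough to sketch directly: a reaction (reactants, inhibitors, products) is enabled on a state when all its reactants are present and none of its inhibitors is, and the next state is the union of the products of the enabled reactions. A minimal illustration, not the thesis' formalism:

```python
def rs_result(reactions, state):
    """One step of a reaction system. Each reaction is a triple
    (reactants, inhibitors, products) of sets of species; unused species
    vanish, since the next state contains only produced species."""
    state = set(state)
    nxt = set()
    for reactants, inhibitors, products in reactions:
        if set(reactants) <= state and not (set(inhibitors) & state):
            nxt |= set(products)
    return nxt
```

Note the facilitation/inhibition mechanisms from the abstract appear explicitly as the two conditions in the `if`, and there is no bookkeeping of concentrations.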
Abstract:
This Master's Thesis analyses the effectiveness of different hedging models for the BRICS (Brazil, Russia, India, China, and South Africa) countries. Hedging performance is examined by comparing two dynamic hedging models to the conventional OLS regression-based model. The dynamic hedging models employed are Constant Conditional Correlation (CCC) GARCH(1,1) and Dynamic Conditional Correlation (DCC) GARCH(1,1) with Student's t-distribution. In order to capture both the Great Moderation and the latest financial crisis, the sample period extends from 2003 to 2014. To determine whether the dynamic models outperform the conventional one, the reduction of portfolio variance for in-sample data with contemporaneous hedge ratios is first determined, and then the holding period of the portfolios is extended to one and two days. In addition, the accuracy of the hedge ratio forecasts is examined on the basis of out-of-sample variance reduction. The results are mixed and suggest that dynamic hedging models may not provide enough benefits to justify the more demanding estimation and daily portfolio adjustment. In this sense, the results are consistent with the existing literature.
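The static benchmark in such studies is the OLS minimum-variance hedge ratio h* = Cov(s, f)/Var(f), with performance judged by the share of spot-return variance the hedge removes. A minimal sketch of both quantities (the CCC/DCC-GARCH ratios, by contrast, are re-estimated as conditional moments each day):

```python
def ols_hedge_ratio(spot, fut):
    """Minimum-variance hedge ratio h* = Cov(s, f) / Var(f) from return series."""
    n = len(spot)
    ms, mf = sum(spot) / n, sum(fut) / n
    cov = sum((s - ms) * (f - mf) for s, f in zip(spot, fut)) / n
    var = sum((f - mf) ** 2 for f in fut) / n
    return cov / var

def variance_reduction(spot, fut, h):
    """Share of spot-return variance removed by shorting h futures per unit spot."""
    n = len(spot)
    def var(x):
        m = sum(x) / n
        return sum((v - m) ** 2 for v in x) / n
    hedged = [s - h * f for s, f in zip(spot, fut)]
    return 1.0 - var(hedged) / var(spot)
```

Comparing this in-sample and out-of-sample variance reduction against the dynamic models' is exactly the evaluation design the abstract describes.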
Abstract:
In this paper, we test a version of the conditional CAPM with respect to a local market portfolio, proxied by the Brazilian stock index, during the 1976-1992 period. We also test a conditional APT model by using the difference between the 30-day rate (Cdb) and the overnight rate as a second factor, in addition to the market portfolio, in order to capture the large inflation risk present during this period. The conditional CAPM and APT models are estimated by the Generalized Method of Moments (GMM) and tested on a set of size portfolios created from a total of 25 securities traded on the Brazilian markets. The inclusion of this second factor proves to be crucial for the appropriate pricing of the portfolios.
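An unconditional two-factor version of the pricing regression above, r = α + β₁·r_m + β₂·spread + ε, can be fitted by OLS as a stand-in sketch (the paper estimates the conditional models by GMM, which is not reproduced here; all data and names are illustrative):

```python
def two_factor_betas(y, x1, x2):
    """OLS estimates for y = a + b1*x1 + b2*x2 via the normal equations,
    solved with Gaussian elimination (partial pivoting). Returns
    [alpha, beta_market, beta_spread] for the illustrative factor model."""
    n = len(y)
    X = [[1.0, x1[i], x2[i]] for i in range(n)]
    # Build the normal equations X'X b = X'y
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
         for r in range(3)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]
    # Forward elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef
```

Pricing the portfolios "appropriately" then amounts to the second beta being significantly different from zero when the spread factor is included.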