8 results for Shadow and Highlight Invariant Algorithm.
Abstract:
The purpose of this article is to characterize dynamic optimal harvesting trajectories that maximize discounted utility assuming an age-structured population model, in the same line as Tahvonen (2009). The main novelty of our study is that it uses, as the age-structured population model, the standard stochastic cohort framework applied in Virtual Population Analysis for fish stock assessment. This allows us to compare optimal harvesting in a discounted economic context with the standard reference points used by fisheries agencies for long-term management plans (e.g. Fmsy). Our main findings are the following. First, the optimal steady state is characterized, and sufficient conditions that guarantee its existence and uniqueness for the general case of n cohorts are given. It is also proved that the optimal steady state coincides with the traditional target Fmsy when the utility function to be maximized is the yield and the discount rate is zero. Second, an algorithm is developed that computes an optimal path easily driving the resource to the steady state. Third, the algorithm is applied to the Northern Stock of hake. Results show that management plans based exclusively on traditional reference targets such as Fmsy may drive the fishery's economic results far from the optimum.
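As a rough, self-contained illustration of the zero-discount result stated above (the optimal steady state coinciding with Fmsy), the following Python sketch locates Fmsy by grid search over the equilibrium yield of a toy deterministic cohort model. All parameter values (natural mortality, weights at age, recruitment) are invented for illustration, and the Baranov catch equation stands in for the paper's stochastic VPA framework.

# Hedged sketch, not the paper's algorithm: equilibrium yield of a toy
# deterministic n-cohort model under a constant fishing mortality F, and a
# grid search for Fmsy, the F that maximizes equilibrium yield.
import numpy as np

M = 0.2                                          # natural mortality, assumed equal at all ages
weights = np.array([0.1, 0.4, 0.9, 1.5, 2.1])    # mean weight at age (kg), hypothetical
R = 1000.0                                       # constant recruitment (fish entering age 0)

def equilibrium_yield(F):
    """Equilibrium yield (kg) via the Baranov catch equation at each age."""
    Z = F + M                                    # total mortality at every age
    N = R * np.exp(-Z * np.arange(len(weights))) # equilibrium abundance at age
    catch = (F / Z) * (1.0 - np.exp(-Z)) * N     # Baranov catch equation
    return float(np.sum(catch * weights))

Fs = np.linspace(0.0, 1.5, 301)
Fmsy = Fs[np.argmax([equilibrium_yield(F) for F in Fs])]
print(f"Fmsy ~ {Fmsy:.3f}, MSY ~ {equilibrium_yield(Fmsy):.1f} kg")

With a positive discount rate the objective weights the whole harvesting path over time rather than the steady state alone, which is where the optimal policy and Fmsy part ways.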
Abstract:
In this thesis we propose a new approach to deduction methods for temporal logic. Our proposal is based on an inductive definition of eventualities that differs from the usual one. On the basis of this non-customary inductive definition of eventualities, we first provide dual systems of tableaux and sequents for Propositional Linear-time Temporal Logic (PLTL). Then, we adapt the deductive approach introduced by means of these dual tableau and sequent systems to the resolution framework and present a clausal temporal resolution method for PLTL. Finally, we make use of this new clausal temporal resolution method to establish logical foundations for declarative temporal logic programming languages. The key issue in deduction systems for temporal logic is dealing with eventualities and with the hidden invariants that may prevent their fulfillment. Different ways of addressing this issue can be found in the literature on deduction systems for temporal logic. Traditional tableau systems for temporal logic generate an auxiliary graph in a first pass. Then, in a second pass, unsatisfiable nodes are pruned; in particular, the second pass must check whether the eventualities are fulfilled. The one-pass tableau calculus introduced by S. Schwendimann requires additional bookkeeping of information in order to detect cyclic branches that contain unfulfilled eventualities. In traditional sequent calculi for temporal logic, the issue of eventualities and hidden invariants is tackled by means of inference rules (mainly, invariant-based rules or infinitary rules) that complicate their automation. A remarkable consequence of using either a two-pass approach based on auxiliary graphs or a one-pass approach that requires additional handling of information in the tableau framework, and either invariant-based rules or infinitary rules in the sequent framework, is that the classical correspondence between tableaux and sequents fails to carry over to temporal logic. In this thesis, we first provide a one-pass tableau method, TTM, that instead of a graph builds a cyclic tree to decide whether a set of PLTL-formulas is satisfiable. In TTM, tableaux are classical-like: for unsatisfiable sets of formulas, TTM produces tableaux whose leaves contain a formula and its negation, whereas for satisfiable sets of formulas, TTM builds tableaux in which each fully expanded open branch characterizes a collection of models for the set of formulas in the root. The tableau method TTM is complete and yields a decision procedure for PLTL. This tableau method is directly associated with a one-sided sequent calculus called TTC. Since TTM is free from all the structural rules that hinder the mechanization of deduction, e.g. weakening and contraction, the resulting sequent calculus TTC is also free from this kind of structural rules; in particular, TTC is free of any kind of cut, including invariant-based cut. From the deduction system TTC, we obtain a two-sided sequent calculus GTC that preserves all these freeness properties and is finitary, sound and complete for PLTL. Therefore, we show that the classical correspondence between tableaux and sequent calculi can be extended to temporal logic. The most fruitful approach in the literature on resolution methods for temporal logic, which started with the seminal paper of M. Fisher, deals with PLTL and requires the generation of invariants for performing resolution on eventualities.
In this thesis, we present a new approach to resolution for PLTL. The main novelty of our approach is that we do not generate invariants for performing resolution on eventualities. Our method is based on the dual methods of tableaux and sequents for PLTL mentioned above. Our resolution method involves translation into a clausal normal form that is a direct extension of classical CNF. We first show that any PLTL-formula can be transformed into this clausal normal form. Then, we present our temporal resolution method, called TRS-resolution, which extends classical propositional resolution. Finally, we prove that TRS-resolution is sound and complete. In fact, it terminates for any input formula, deciding its satisfiability, and hence gives rise to a new decision procedure for PLTL. In the field of temporal logic programming, the declarative proposals that provide a completeness result do not allow eventualities, whereas the proposals that follow the imperative future approach either restrict the use of eventualities or deal with them by calculating an upper bound based on the small model property for PLTL. In the latter, when the length of a derivation reaches the upper bound, the derivation is given up and backtracking is used to try another possible derivation. In this thesis we present a declarative propositional temporal logic programming language, called TeDiLog, that combines the temporal and disjunctive paradigms in Logic Programming. We establish the logical foundations of our proposal by formally defining operational and logical semantics for TeDiLog and by proving their equivalence. Since TeDiLog is, syntactically, a sublanguage of PLTL, the logical semantics of TeDiLog is supported by PLTL logical consequence. The operational semantics of TeDiLog is based on TRS-resolution. TeDiLog allows both eventualities and always-formulas to occur in clause heads and also in clause bodies. To the best of our knowledge, TeDiLog is the first declarative temporal logic programming language that achieves this degree of expressiveness. Since the tableau method presented in this thesis detects that the fulfillment of an eventuality is prevented by a hidden invariant without checking for it by means of an extra process, since our finitary sequent calculi do not include invariant-based rules, and since our resolution method dispenses with invariant generation, we say that our deduction methods are invariant-free.
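For readers unfamiliar with the classical core that TRS-resolution extends, the following minimal Python sketch implements plain propositional resolution on clause sets. The temporal operators, the clausal normal form and the eventuality handling of TRS-resolution are not modeled here; this only shows the base inference the thesis builds on.

# Clauses are frozensets of literals; a literal is a (name, polarity) pair.
def resolve(c1, c2):
    """All resolvents of two clauses on complementary literals."""
    out = []
    for (name, pol) in c1:
        if (name, not pol) in c2:
            out.append((c1 - {(name, pol)}) | (c2 - {(name, not pol)}))
    return out

def saturate(clauses):
    """Return True iff the empty clause is derivable, i.e. the set is unsatisfiable."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in map(frozenset, resolve(a, b)):
                    if not r:           # empty clause derived
                        return True
                    if r not in clauses:
                        new.add(r)
        if not new:                     # fixpoint reached without the empty clause
            return False
        clauses |= new

# {p} and {~p} resolve to the empty clause: unsatisfiable.
print(saturate([{("p", True)}, {("p", False)}]))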
Abstract:
The learning of probability distributions from data is a ubiquitous problem in the fields of Statistics and Artificial Intelligence. During the last decades, several algorithms have been proposed for learning probability distributions based on decomposable models, due to their advantageous theoretical properties. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k, which controls the complexity of the model. Unfortunately, the problem of learning a maximum likelihood decomposable model given a maximum clique size is NP-hard for k > 2. In this work, we propose a family of algorithms which approximates this problem with a computational complexity of O(k · n^2 log n) in the worst case, where n is the number of random variables involved. The structures of the decomposable models that solve the maximum likelihood problem are called maximal k-order decomposable graphs. Our proposals, called fractal trees, construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy based on the particular features of this type of structure. Additionally, we propose a prune-and-graft procedure which transforms a maximal k-order decomposable graph into another one with higher likelihood. We have implemented two particular fractal tree algorithms, called parallel fractal tree and sequential fractal tree, which can be considered a natural extension of Chow and Liu's algorithm from k = 2 to arbitrary values of k. Both algorithms have been compared against other efficient approaches in artificial and real domains, and they have shown competitive behavior on the maximum likelihood problem. Due to their low computational complexity, they are especially recommended for high-dimensional domains.
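The k = 2 base case cited above, Chow and Liu's algorithm, is a maximum spanning tree over pairwise empirical mutual information. The following Python sketch implements that base case on toy binary data; the fractal-tree construction for k > 2 is not reproduced here.

import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information of two discrete samples."""
    mi = 0.0
    for vx in np.unique(x):
        for vy in np.unique(y):
            pxy = np.mean((x == vx) & (y == vy))
            px, py = np.mean(x == vx), np.mean(y == vy)
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))
    return mi

def max_spanning_tree(W):
    """Kruskal's algorithm on a symmetric weight matrix (maximum spanning tree)."""
    n = W.shape[0]
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]   # path compression
            u = parent[u]
        return u
    tree = []
    for u, v in sorted(combinations(range(n), 2), key=lambda e: -W[e]):
        ru, rv = find(u), find(v)
        if ru != rv:                         # edge joins two components: keep it
            parent[ru] = rv
            tree.append((u, v))
    return tree

rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(500, 4))     # toy binary data, 4 variables
data[:, 1] = data[:, 0]                      # force a strong dependence X0 -- X1

n_vars = data.shape[1]
W = np.zeros((n_vars, n_vars))
for i, j in combinations(range(n_vars), 2):
    W[i, j] = W[j, i] = mutual_information(data[:, i], data[:, j])

print("Chow-Liu tree edges:", max_spanning_tree(W))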
Abstract:
48 p.
Abstract:
In this work we extend to the multistage case two recent risk averse measures for two-stage stochastic programs based on first- and second-order stochastic dominance constraints induced by mixed-integer linear recourse. Additionally, we consider Time Stochastic Dominance (TSD) along a given horizon. Given the dimensions of medium-sized problems augmented by the new variables and constraints required by those risk measures, it is unrealistic to solve the problem to optimality by plain use of MIP solvers in a reasonable computing time; instead, decomposition algorithms of some type should be used. We present an extension of our Branch-and-Fix Coordination algorithm, named BFC-TSD, in which special treatment is given to the cross-scenario-group constraints that link variables from different scenario groups. A broad computational experience is reported, comparing the risk neutral approach with the tested risk averse strategies. The performance of the new version of the BFC algorithm versus the plain use of a state-of-the-art MIP solver is also reported.
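As a sketch of where those "new variables and constraints" come from, the following is one common way first- and second-order stochastic dominance constraints are modeled in the stochastic programming literature; the notation is ours and the paper's exact formulation may differ. Let f(x, ξ^s) be the cost under scenario s with probability p^s, and let (η_i, β_i) and (η_i, e_i) be benchmark threshold pairs:

\begin{align*}
\text{first-order SD:} \quad & f(x,\xi^s) \le \eta_i + M\,\theta_i^s, \qquad \theta_i^s \in \{0,1\}, \qquad \sum_s p^s\,\theta_i^s \le \beta_i,\\
\text{second-order SD:} \quad & v_i^s \ge f(x,\xi^s) - \eta_i, \qquad v_i^s \ge 0, \qquad \sum_s p^s\,v_i^s \le e_i,
\end{align*}

where M is a sufficiently large constant. The binary variables θ_i^s and the shortfall variables v_i^s, replicated over every scenario and benchmark, are what blow up the model size and motivate decomposition.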
Abstract:
Sex workers are traditionally considered important vectors of transmission of sexually transmitted infections (STIs). The role of clients is commonly overlooked, partly due to the lack of evidence on clients' position in the sexual network created by commercial sex. Contrasting the diffusion importance of sex workers and their clients in the map of their sexual encounters in two Web-mediated communities, we find that, from a diffusion perspective, clients are as important as sex workers. Their diffusion importance is closely linked to the geography of the sexual encounters: as a result of different movement patterns, travelling clients shorten network distances between distant network neighborhoods, thereby facilitating contagion between them more than sex workers do, and they are found more often in the core of the network, through which they can contribute to the persistence of STIs in the community. These findings position clients among the key actors and highlight the role of human mobility in the transmission of STIs in commercial sexual networks.
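The two network notions the abstract relies on can be made concrete with standard graph measures. The following Python sketch uses networkx on a small invented worker/client encounter graph, proxying diffusion importance by betweenness centrality and core position by the k-core number; neither the graph nor the choice of proxies comes from the paper.

import networkx as nx

G = nx.Graph()
# Invented worker (w*) / client (c*) encounter graph; c0 plays the role of a
# "travelling" client bridging two otherwise distant neighborhoods.
G.add_edges_from([
    ("w0", "c0"), ("w1", "c0"), ("w2", "c0"), ("w3", "c0"),
    ("w0", "c1"), ("w1", "c2"), ("w1", "c3"), ("w2", "c4"),
    ("w3", "c5"), ("w4", "c5"), ("w4", "c6"), ("w5", "c6"), ("w5", "c7"),
])

betweenness = nx.betweenness_centrality(G)   # proxy for diffusion importance
cores = nx.core_number(G)                    # k-core position of each node
print("most central node:", max(betweenness, key=betweenness.get))
print("core numbers:", cores)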
Abstract:
Hyperventilation, which is common in both in-hospital and out-of-hospital cardiac arrest, decreases coronary and cerebral perfusion, contributing to poorer survival rates in both animals and humans. Current resuscitation guidelines recommend continuous monitoring of exhaled carbon dioxide (CO2) during cardiopulmonary resuscitation (CPR) and emphasize good quality of CPR, including ventilations at 8-10 min^-1. Most commercial monitors/defibrillators incorporate methods to compute the respiratory rate based on capnography, since it shows fluctuations caused by ventilations. Chest compressions may induce artifacts in this signal, making the calculation of the respiratory rate difficult. Nevertheless, the accuracy of these methods during CPR has not been documented yet. The aim of this project is to analyze whether the capnogram is reliable for computing the ventilation rate during CPR. A total of 91 episodes, 63 out-of-hospital cardiac arrest episodes (first database) and 28 in-hospital cardiac arrest episodes (second database), were used to develop an algorithm to detect ventilations in the capnogram, the final aim being to provide an accurate ventilation rate for feedback purposes during CPR. Two graphical user interfaces were developed to ease the analysis, and another two were adapted to carry out this project. These interfaces facilitate the management of the databases and the calculation of the algorithm's accuracy. In the first database, the gold standard was every ventilation marked by visual inspection of both the impedance signal, which shows fluctuations with every ventilation, and the capnography signal. In the second database, the volume of the respiratory flow signal was used as the gold standard to mark ventilation instants, since it is not affected by chest compressions. The capnogram was preprocessed to remove high frequency noise, and the first difference was computed to define the onset of inspiration and expiration. Then, morphological features were extracted and a decision algorithm was built on those features to detect ventilation instants. Finally, the ventilation rate was calculated from the detected instants of ventilation. According to the results obtained in this project, the capnogram can be reliably used to give feedback on ventilation rate, and therefore on hyperventilation, in a resuscitation scenario.
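The processing chain described above (low-pass filtering, first difference, detection of inspiration onsets, rate from the detected instants) can be sketched in Python on a synthetic capnogram as follows. The signal, filter order and thresholds are illustrative assumptions, not the project's validated algorithm.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                    # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
# Synthetic square-wave capnogram at ~10 ventilations/min (mmHg), plus noise.
co2 = 38 * (np.sin(2 * np.pi * t / 6.0) > 0)
co2 = co2 + np.random.default_rng(1).normal(0, 1.0, t.size)

b, a = butter(2, 10.0 / (fs / 2))             # 10 Hz low-pass removes HF noise
smooth = filtfilt(b, a, co2)

d = np.diff(smooth)                           # first difference of the capnogram
onsets = np.flatnonzero(d < -2.0)             # sharp CO2 drop ~ onset of inspiration
# Collapse consecutive samples of the same drop into single ventilation instants
# (gaps under 1 s belong to the same ventilation).
vent_idx = onsets[np.insert(np.diff(onsets) > fs, 0, True)]
rate = 60.0 * len(vent_idx) / (t[-1] - t[0])  # ventilations per minute
print(f"detected ventilation rate: {rate:.1f} /min")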
Abstract:
Background: Intratumor heterogeneity may be responsible for the unpredictably aggressive clinical behavior that some clear cell renal cell carcinomas display. This clinical uncertainty may be caused by insufficient sampling, which leaves foci of high-grade tumor areas out of the histological analysis. Although molecular approaches are providing important information on renal intratumor heterogeneity, a treatment of this topic from the practicing pathologist's perspective is still pending. Methods: Four distant tumor areas of 40 organ-confined clear cell renal cell carcinomas were selected for histopathological and immunohistochemical evaluation. Tumor size, cell type (clear/granular), Fuhrman's grade, and staging, as well as immunostaining for Snail, ZEB1, Twist, Vimentin, E-cadherin, beta-catenin, PTEN, p-Akt, p110 alpha, and SETD2, were analyzed for intratumor heterogeneity using a classification and regression tree algorithm. Results: Cell type and Fuhrman's grade were heterogeneous in 12.5 % and 60 % of the tumors, respectively. If cell type was homogeneous (clear cell), then the tumors were low-grade in 88.57 % of cases. Immunostaining heterogeneity was significant in the series and ranged from 15 % for p110 alpha to 80 % for Snail. When Snail immunostaining was homogeneous, the tumor was histologically homogeneous in 100 % of cases; if Snail was heterogeneous, the tumor was heterogeneous in 75 % of cases. Average tumor diameter was 4.3 cm; tumors larger than 3.7 cm were heterogeneous for Vimentin immunostaining in 72.5 % of cases. Tumors displaying negative immunostaining for both ZEB1 and Twist were low grade in 100 % of cases. Conclusions: Intratumor heterogeneity is a common event in clear cell renal cell carcinoma and can be monitored by immunohistochemistry in routine practice. Snail seems particularly useful for identifying intratumor heterogeneity. The suitability of current sampling protocols in renal cancer is discussed.
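To show the mechanics of the classification and regression tree analysis named in Methods, the following Python sketch fits a shallow decision tree on fabricated immunostaining indicators. The data and the induced splits are invented solely to illustrate the method; they are not the study's results.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 40                                        # 40 tumors, matching the study design
snail_heterogeneous = rng.integers(0, 2, n)   # 1 = heterogeneous Snail staining
tumor_diameter_cm = rng.uniform(2.0, 7.0, n)
X = np.column_stack([snail_heterogeneous, tumor_diameter_cm])
# Synthetic outcome loosely mimicking the reported Snail association.
y = (snail_heterogeneous & (rng.random(n) < 0.75)).astype(int)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["snail_heterogeneous", "tumor_diameter_cm"]))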