10 results for propositional linear-time temporal logic
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F(P)‖_∞ of the map F(P) that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ‖F(P)‖_q for q ∈ [1,∞]. Of these, the best known minimization problem is for the energy ‖F(P)‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F(P)‖_q, q ∈ [1,∞), is identical to that for ‖F(P)‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ‖F(P)‖_1 minimization problem also solves the one for ‖F(P)‖_q with q ∈ [1,∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ‖F(P)‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F(P)‖_q-minimization problems converge to a solution of the ‖F(P)‖_∞-minimization problem (the identity ‖F(P)‖_∞ = lim_{q→∞} ‖F(P)‖_q alone is not enough to deduce this). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. It concentrates on comparing the algorithms' actual running times (as opposed to the provable worst-case bounds), as well as the influence of the choice of seeds on the output.
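A small numerical sketch of the norm relationships described above, using hypothetical boundary weights rather than data from the paper: since Σ w(e)^q is the ℓ_1 energy after replacing w by w^q, a GC_sum-style solver handles every finite q, and as q grows the ‖F(P)‖_q-minimizer settles on the max-norm (ℓ∞) minimizer.

```python
import numpy as np

# Hypothetical boundary-weight sets for three candidate segmentations P1..P3.
# Each array holds the weights w(e) of the edges on the boundary of P.
cuts = {
    "P1": np.array([5.0, 1.0, 1.0, 1.0]),   # one heavy edge, several light ones
    "P2": np.array([3.0, 3.0, 1.0]),        # medium edges, smallest l_1 energy
    "P3": np.array([2.5, 2.5, 2.5]),        # smallest maximum edge weight
}

def lq_energy(w, q):
    """l_q norm of the boundary-weight map F(P)."""
    return np.sum(w ** q) ** (1.0 / q)

for q in [1, 2, 4, 16, 64]:
    best = min(cuts, key=lambda name: lq_energy(cuts[name], q))
    print(f"q={q:>3}: minimizer of ||F(P)||_q is {best}")

# The max-norm (q = infinity) minimizer, computed directly:
best_inf = min(cuts, key=lambda name: cuts[name].max())
print(f"q=inf: minimizer of ||F(P)||_inf is {best_inf}")

# Note: sum(w**q) is the l_1 energy after replacing w by w**q, so any
# l_1 minimizer (e.g. min-cut/max-flow) also solves the l_q problem.
```

On these weights the ℓ_1 minimizer is P2, while for every q ≥ 2 the minimizer is already P3, the max-norm minimizer, illustrating the convergence claim.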
Abstract:
Background: Decreasing costs of DNA sequencing have made prokaryotic draft genome sequences increasingly common. A contig scaffold is an ordering of the contigs, each in the correct orientation. A scaffold can help genome comparisons and guide gap closure efforts. One popular technique for obtaining contig scaffolds is to map the contigs onto a reference genome. However, rearrangements that may exist between the query and reference genomes can result in incorrect scaffolds if they are not taken into account. Large-scale inversions are common rearrangement events in prokaryotic genomes, and even in draft genomes it is possible to detect their presence given sufficient sequencing coverage and a sufficiently close reference genome. Results: We present a linear-time algorithm that can generate a set of contig scaffolds for a draft genome sequence, represented as contigs, given a reference genome. The algorithm is aimed at prokaryotic genomes and relies on the presence of matching sequence patterns between the query and reference genomes that can be interpreted as the result of large-scale inversions; we call these patterns inversion signatures. Our algorithm is capable of correctly generating a scaffold if at least one member of every inversion signature pair is present in the contigs and no inversion signatures have been overwritten in evolution. The algorithm is also capable of generating scaffolds in the presence of any kind of inversion, although in this general case there is no guarantee that all scaffolds in the scaffold set will be correct. We compare the performance of SIS, the program that implements the algorithm, to seven other scaffold-generating programs. The results of our tests show that SIS has overall better performance. Conclusions: SIS is a new easy-to-use tool for generating contig scaffolds, available both stand-alone and as a web server. The good performance of SIS in our tests adds evidence that large-scale inversions are widespread in prokaryotic genomes.
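A highly simplified, hypothetical sketch of the reference-mapping idea behind scaffolders of this kind (this is not the SIS algorithm itself, which additionally reasons about inversion signatures): order contigs by the position of their best alignment on the reference and take the alignment strand as the contig orientation.

```python
# Hypothetical contig alignments: (contig_id, reference_start, strand).
# In practice these would come from a whole-genome aligner.
alignments = [
    ("ctg3", 120500, "+"),
    ("ctg1", 2300,   "-"),
    ("ctg2", 48700,  "+"),
]

def naive_scaffold(alns):
    """Order contigs by reference coordinate; strand gives orientation.

    This ignores rearrangements: an inversion between query and
    reference would misplace contigs, which is exactly the failure
    mode that inversion signatures are designed to catch.
    """
    return [(cid, strand) for cid, _, strand in sorted(alns, key=lambda a: a[1])]

print(naive_scaffold(alignments))
# [('ctg1', '-'), ('ctg2', '+'), ('ctg3', '+')]
```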
Abstract:
Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in time linear in the number of nodes if there is a single observed node. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.
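As a rough intuition for the single-observed-node regime, here is a minimal sketch of linear-time exact inference on a chain-shaped Bayesian network, a special case of a polytree. It uses ordinary point probabilities only; the qualitative and interval assessments that make SQPNs harder are deliberately omitted, so this illustrates the linear-time pass, not the paper's algorithm.

```python
import numpy as np

# Chain X1 -> X2 -> ... -> Xn with binary nodes and a shared CPT.
n = 50
cpt = np.array([[0.9, 0.1],    # P(X_{k+1} | X_k = 0)
                [0.2, 0.8]])   # P(X_{k+1} | X_k = 1)

def query_last_given_first(evidence_x1):
    """P(Xn | X1 = evidence): one forward pass over the chain,
    i.e. O(n) work for a single observed node."""
    belief = np.zeros(2)
    belief[evidence_x1] = 1.0
    for _ in range(n - 1):
        belief = belief @ cpt   # one matrix-vector product per edge
    return belief

print(query_last_given_first(0))   # P(Xn | X1 = 0)
```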
Abstract:
This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to this kind of problem is to include a non-linear cost constraint in the control problem; the control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem, which for high-order systems can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality (LMI) problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with conventional robust MPC and tested through the simulation of a reactor system from the process industry.
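For readers unfamiliar with the LMI machinery, here is a minimal example, not the paper's robust-MPC formulation: certifying stability of a discrete-time system x(k+1) = A x(k) by finding P > 0 with AᵀPA − P < 0, posed and solved as a semidefinite feasibility problem with cvxpy. The robust-MPC LMIs of the paper belong to the same problem class and are solved by the same kind of solver.

```python
import cvxpy as cp
import numpy as np

# A stable discrete-time system (illustrative numbers).
A = np.array([[0.8, 0.1],
              [0.0, 0.7]])

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),                      # P positive definite
               A.T @ P @ A - P << -eps * np.eye(2)]       # Lyapunov LMI
prob = cp.Problem(cp.Minimize(0), constraints)            # pure feasibility
prob.solve(solver=cp.SCS)

print(prob.status)   # 'optimal' => the LMI is feasible, the system is stable
print(P.value)
```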
Abstract:
As part of the AMAZE-08 campaign during the wet season in the rainforest of central Amazonia, an ultraviolet aerodynamic particle sizer (UV-APS) was operated for continuous measurements of fluorescent biological aerosol particles (FBAP). In the coarse particle size range (>1 µm) the campaign median and quartiles of FBAP number and mass concentration were 7.3×10^4 m^-3 (4.0-13.2×10^4 m^-3) and 0.72 µg m^-3 (0.42-1.19 µg m^-3), respectively, accounting for 24% (11-41%) of total particle number and 47% (25-65%) of total particle mass. During the five-week campaign in February-March 2008 the concentration of coarse-mode Saharan dust particles was highly variable. In contrast, FBAP concentrations remained fairly constant over the course of weeks and had a consistent daily pattern, peaking several hours before sunrise, suggesting that the observed FBAP was dominated by nocturnal spore emission. This conclusion was supported by the consistent FBAP number size distribution peaking at 2.3 µm, also attributed to fungal spores and mixed biological particles by scanning electron microscopy (SEM), light microscopy and biochemical staining. A second primary biological aerosol particle (PBAP) mode between 0.5 and 1.0 µm was also observed by SEM, but exhibited little fluorescence and no true fungal staining. This mode may have consisted of single bacterial cells, brochosomes, various fragments of biological material, and small Chromalveolata (Chromista) spores. Particles liquid-coated with mixed organic-inorganic material constituted a large fraction of the observations, and these coatings contained salts likely of primary biological origin. We provide key support for the suggestion that real-time laser-induced fluorescence (LIF) techniques using 355 nm excitation provide size-resolved concentrations of FBAP as a lower limit for the atmospheric abundance of biological particles in a pristine environment. We also show some limitations of using the instrument for ambient monitoring of weakly fluorescent particles <2 µm. Our measurements confirm that primary biological particles, fungal spores in particular, are an important fraction of supermicron aerosol in the Amazon and may contribute significantly to hydrological cycling, especially when coated by mixed inorganic material.
Abstract:
We analyzed the effectiveness of linear short- and long-term variability time-domain parameters, an index of sympatho-vagal balance (SDNN/RMSSD) and entropy in differentiating fetal heart rate patterns (fHRPs) on fetal heart rate (fHR) series of 5, 3 and 2 min duration reconstructed from 46 fetal magnetocardiograms. Gestational age (GA) varied from 21 to 38 weeks. fHRPs were classified based on the fHR standard deviation. In sleep states, we observed that vagal influence increased with GA, and that entropy significantly increased, and SDNN/RMSSD significantly decreased, with GA, demonstrating that the prevalence of vagal activity accompanying autonomic nervous system maturation may be associated with increased sleep state complexity. In active wakefulness, we observed a significant negative (positive) correlation of short-term (long-term) variability parameters with SDNN/RMSSD. ANOVA statistics demonstrated that long-term irregularity and the standard deviation of normal-to-normal beat intervals (SDNN) best differentiated among fHRPs. Our results confirm that short- and long-term variability parameters are useful for differentiating between quiet and active states, and that entropy improves the characterization of sleep states. All measures differentiated fHRPs more effectively on very short HR series, as a result of the high temporal resolution of fMCG and of the intrinsic timescales of the events that originate the different fHRPs.
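The time-domain parameters named above have standard definitions, shown here on a synthetic RR-interval series (the entropy measures, which require more code, are omitted): SDNN is the standard deviation of normal-to-normal intervals (long-term variability), RMSSD the root mean square of successive differences (short-term, beat-to-beat variability), and their ratio the sympatho-vagal balance index used in the study.

```python
import numpy as np

# Synthetic fetal RR-interval series in milliseconds (illustrative only).
rng = np.random.default_rng(0)
rr = 430 + 15 * rng.standard_normal(300)

sdnn = np.std(rr, ddof=1)                    # long-term variability
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # short-term variability
ratio = sdnn / rmssd                         # sympatho-vagal balance index

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, SDNN/RMSSD = {ratio:.2f}")
```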
Abstract:
Objective. Mortality from asthma has varied among countries during the last several decades. This study aimed to identify temporal trends of asthma mortality in Brazil from 1980 to 2010. Method. We analyzed 6840 deaths of patients aged 5-34 years that occurred in Brazil with asthma as the underlying cause. We applied a log-linear model using Poisson regression to identify peaks and trends. We also calculated point estimates and 95% confidence intervals (95% CI) for the annual percent change (APC) in mortality rates, and the average annual percent change (AAPC) for 2001-2010. Results. A decline was observed from 1980 to 1992 [APC = -3.4 (-5.0 to -1.8)], followed by a nonsignificant rise until 1996 [APC = 6.8 (-1.4 to 15.6)], and a new downward trend from 1997 to 2010 [APC = -2.7 (-3.9 to -1.6)]. The APCs varied according to age strata: 5-14 years from 1980 to 2010 [-0.3 (-1.1 to 0.5)]; 15-24 years from 1980 to 1991 [-2.1 (-5.0 to 0.9)], from 1992 to 1996 [6.8 (-6.7 to 22.2)], and from 1997 to 2010 [-3.9 (-5.7 to -2.0)]; 25-34 years from 1980 to 1992 [-2.5 (-4.6 to -0.3)], from 1993 to 1995 [12.0 (-21.1 to 59.1)], and from 1996 to 2010 [-1.7 (-3.0 to -0.4)]. The AAPC from 2001 to 2010 was -1.7 (-3.0 to -0.4); the decline for this period was significant for patients over 15 years old, women, and those living in the Southeast region. Conclusion. Asthma mortality rates in Brazil have been declining since the late 1990s.
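A sketch of how an APC is obtained from a log-linear Poisson model, on synthetic data: fitting log(rate) = β₀ + β₁·year gives APC = 100·(e^β₁ − 1). The counts below are simulated with a decline chosen to mimic roughly -2.7% per year; the paper's analysis additionally locates the joinpoints between trend segments, which this single-segment sketch omits.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic deaths and population at risk for one trend segment.
years = np.arange(1997, 2011)
pop = np.full(years.size, 5_000_000)                    # hypothetical denominator
true_rate = 8e-6 * np.exp(-0.027 * (years - years[0]))  # ~ -2.7%/year decline
deaths = np.random.default_rng(1).poisson(true_rate * pop)

# Poisson regression of counts on calendar time with a log-population offset.
X = sm.add_constant(years - years[0])
model = sm.GLM(deaths, X, family=sm.families.Poisson(),
               offset=np.log(pop)).fit()

beta = model.params[1]
lo, hi = model.conf_int()[1]
print(f"APC = {100*(np.exp(beta)-1):.1f}% "
      f"(95% CI {100*(np.exp(lo)-1):.1f} to {100*(np.exp(hi)-1):.1f})")
```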
Abstract:
In this paper, we consider the stochastic optimal control problem of discrete-time linear systems subject to Markov jumps and multiplicative noises under two criteria. The first is an unconstrained mean-variance trade-off performance criterion over time, and the second is a minimum-variance criterion over time with constraints on the expected output. We present explicit conditions for the existence of an optimal control strategy for these problems, generalizing previous results in the literature. We conclude the paper with a numerical example of a multi-period portfolio selection problem with regime switching, in which it is desired to minimize the sum of the variances of the portfolio over time under the restriction of keeping the expected value of the portfolio above minimum values specified by the investor.
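To see the shape of the constrained criterion, here is a static, single-period simplification: minimize portfolio variance subject to a floor on expected value. The paper treats the much richer multi-period problem with Markov regime switching and multiplicative noises, none of which appear in this toy sketch; the returns and covariance below are hypothetical.

```python
import cvxpy as cp
import numpy as np

mu = np.array([0.05, 0.08, 0.12])            # hypothetical expected returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])       # hypothetical covariance matrix
r_min = 0.07                                 # investor-specified minimum return

w = cp.Variable(3)                           # portfolio weights
prob = cp.Problem(cp.Minimize(cp.quad_form(w, Sigma)),   # portfolio variance
                  [mu @ w >= r_min,          # expected-value floor
                   cp.sum(w) == 1, w >= 0])  # fully invested, no shorting
prob.solve()
print(w.value, prob.value)
```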
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) mission is dedicated to measuring temporal variations of the Earth's gravity field. In this study, the Stokes coefficients made available by the Groupe de Recherche en Géodésie Spatiale (GRGS) at 10-day intervals were converted into equivalent water height (EWH) for a ~4-year period in the Amazon basin (July 2002 to May 2006). The seasonal amplitudes of the EWH signal are the largest on the surface of the Earth and reach ~1250 mm at the basin's center. The error budget represents ~130 mm of EWH, including formal errors on the Stokes coefficients, leakage errors (12-21 mm) and spectrum truncation (10-15 mm). A comparison between in situ river level time series measured at 233 ground-based hydrometric stations (HS) in the Amazon basin and vertically-integrated EWH derived from GRACE is carried out in this paper. Although EWH and HS measure different water bodies, in most cases a high correlation (up to ~80%) is detected between the HS and EWH series at the same site. This correlation allows linear relationships to be adjusted between the in situ and GRACE-based series for the major tributaries of the Amazon River. The regression coefficients decrease from upstream to downstream along the rivers, reaching the theoretical value of 1 at the Amazon's mouth in the Atlantic Ocean. The variation of the regression coefficients with distance from the estuary is analysed for the largest rivers in the basin. In a second step, a classification of the proportionality between the in situ and GRACE time series is proposed.
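A minimal sketch of the station-by-station comparison: correlate an in situ river level (HS) series with the GRACE-derived EWH series at the same site and fit the linear relationship whose slope is the regression coefficient discussed above. The two series here are synthetic stand-ins with an annual cycle, not GRGS or gauge data.

```python
import numpy as np

t = np.arange(0, 4, 10 / 365.25)                 # ~10-day sampling over 4 years
hs = 4.0 * np.sin(2 * np.pi * t)                 # synthetic river stage, m
ewh = 0.9 * hs + 0.15 * np.random.default_rng(2).standard_normal(t.size)

r = np.corrcoef(hs, ewh)[0, 1]                   # correlation between the series
slope, intercept = np.polyfit(hs, ewh, 1)        # EWH ~ slope * HS + intercept
print(f"correlation = {r:.2f}, regression coefficient = {slope:.2f}")
```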
Abstract:
The main objective of this work is to present an efficient method for phasor estimation based on a compact Genetic Algorithm (cGA) implemented in a Field Programmable Gate Array (FPGA). To validate the proposed method, an Electrical Power System (EPS) simulated with the Alternative Transients Program (ATP) provides the data used by the cGA; this data is as close as possible to the actual data an EPS would provide. Real-life situations such as islanding, sudden load increases and permanent faults were considered. The implementation aims to exploit the inherent parallelism of Genetic Algorithms in a compact and optimized way, making them an attractive option for practical real-time estimation in Phasor Measurement Units (PMUs).
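For intuition about the technique, here is a software sketch of a standard binary compact GA estimating the amplitude and phase of a 60 Hz phasor from waveform samples. The paper's cGA runs in FPGA hardware against ATP-generated data; the encoding, parameters and signal below are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, f0, n_samp = 1920.0, 60.0, 32                   # 32 samples per 60 Hz cycle
t = np.arange(n_samp) / fs
signal = 1.8 * np.cos(2 * np.pi * f0 * t + 0.7)     # "measured" waveform

BITS = 16                                           # 8 bits amplitude + 8 bits phase

def decode(bits):
    amp = int("".join(map(str, bits[:8])), 2) / 255 * 4.0            # [0, 4]
    phase = int("".join(map(str, bits[8:])), 2) / 255 * 2*np.pi - np.pi  # [-pi, pi)
    return amp, phase

def fitness(bits):
    amp, phase = decode(bits)
    return -np.sum((signal - amp * np.cos(2*np.pi*f0*t + phase)) ** 2)

p = np.full(BITS, 0.5)                              # cGA probability vector
N = 200                                             # virtual population size
for _ in range(20000):
    a = (rng.random(BITS) < p).astype(int)          # sample two individuals
    b = (rng.random(BITS) < p).astype(int)
    winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
    p += (winner - loser) / N                       # shift p toward the winner
    p = np.clip(p, 0.0, 1.0)

print(decode((p > 0.5).astype(int)))                # typically near (1.8, 0.7)
```

The cGA keeps only the probability vector p instead of a full population, which is what makes it attractive for a compact hardware realization.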