937 results for temporal-logic model
Abstract:
Processes of European integration and growing consumer scrutiny of public services have placed the spotlight on the traditional French model of public/private interaction in the urban services domain. This article discusses recent debates within France over the institutionalised approach to local public/private partnership, and presents case-study evidence from three urban agglomerations of a possible divergence from this approach. Drawing on the work of the French academic Dominique Lorrain, whose historical-institutionalist accounts of the French model are perhaps the most comprehensive and best known, the article develops two hypotheses of institutional change: one from the historical-institutionalist perspective of institutional stability and persistence, and the other from an explicitly sociological perspective that emphasises the legitimating benefits of following appropriate rules of conduct. It argues that further study of the French model as an institution offers valuable empirical insight into processes of institutional change and persistence. © 2004 Taylor & Francis Ltd.
Abstract:
The paper presents a new network-flow interpretation of Łukasiewicz's logic, based on models with increased effectiveness. The results show that the presented network-flow models can, in principle, handle multivalued logics with more than three variable states, i.e. with a finite set of states in the interval from 0 to 1. The described models make it possible to formulate various logical functions. If the values of the arc-flow functions obtained from one model are used as input data for other models, then more sophisticated logical structures can be interpreted successfully in Łukasiewicz's logic. The models also allow Łukasiewicz's logic to be studied with the effective, specialised methods of network-flow programming; in particular, the specific properties and results pertaining to the traffic capacity of the network arcs can be exploited. Based on the introduced network-flow approach, other multivalued logics can be interpreted as well, such as those of E. Post, L. Brauer, and Kolmogorov.
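The abstract does not reproduce the network-flow formulation itself, but the multivalued operations it encodes are the standard Łukasiewicz connectives on truth values in [0, 1]. A minimal sketch:

```python
# Łukasiewicz connectives on truth values in [0, 1] -- the multivalued
# operations the network-flow models are said to encode. The flow
# formulation itself is not given in the abstract, so only the
# underlying logic is shown here.

def luk_not(x):
    # Negation: 1 - x
    return 1.0 - x

def luk_and(x, y):
    # Strong conjunction (Lukasiewicz t-norm): max(0, x + y - 1)
    return max(0.0, x + y - 1.0)

def luk_or(x, y):
    # Strong disjunction (bounded sum): min(1, x + y)
    return min(1.0, x + y)

def luk_implies(x, y):
    # Lukasiewicz implication: min(1, 1 - x + y)
    return min(1.0, 1.0 - x + y)

# Three-valued fragment {0, 1/2, 1}; any finite chain of states in
# [0, 1] works the same way, matching the "more than three states" claim.
three_valued = [0.0, 0.5, 1.0]
and_table = {(x, y): luk_and(x, y) for x in three_valued for y in three_valued}
```

Restricting the truth values to any finite chain in [0, 1] gives the finite-valued logics the abstract refers to.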
Abstract:
Perception of simultaneity and temporal order is studied with simultaneity judgment (SJ) and temporal-order judgment (TOJ) tasks. In the former, observers report whether presentation of two stimuli was subjectively simultaneous; in the latter, they report which stimulus was subjectively presented first. SJ and TOJ tasks typically give discrepant results, which has prompted the view that performance is mediated by different processes in each task. We examined these discrepancies through a model that yields psychometric functions whose parameters characterize the timing, decisional, and response processes involved in SJ and TOJ tasks. We analyzed 12 data sets from published studies in which both tasks had been used in within-subjects designs, all of which had reported differences in performance across tasks. Fitting the model jointly to data from both tasks, we tested the hypothesis that common timing processes sustain simultaneity and temporal-order judgments, with differences in performance arising from task-dependent decisional and response processes. The results supported this hypothesis, also showing that model psychometric functions account for aspects of SJ and TOJ data that classical analyses overlook. Implications for research on perception of simultaneity and temporal order are discussed.
Abstract:
Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks in their binary SJ2 or ternary SJ3 variants. In all cases, two stimuli are presented with some temporal delay, and observers judge the order of presentation. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or also jointly for the three tasks (for common cases in which two or even the three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine is included that obtains performance measures from the fitted functions. An R package for Windows and source code of the MATLAB and R routines are available as Supplementary Files.
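The independent-channels model described above (exponential arrival latencies, a trichotomous decision space) can be illustrated with a small Monte Carlo sketch. All parameter names and default values below are illustrative, not the routines' actual interface:

```python
import random

def simulate_sj2(soa, lam1=1/30, lam2=1/30, tau1=0.0, tau2=0.0,
                 delta=50.0, n=20000, seed=1):
    """Monte Carlo sketch of an independent-channels timing model:
    each stimulus's arrival latency is exponential (rate lam) plus a
    fixed processing delay tau; the pair is judged 'simultaneous'
    (binary SJ2 response) when the arrival-time difference falls
    within the resolution window +/- delta (all times in ms).
    Parameters and defaults are illustrative assumptions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        t1 = tau1 + rng.expovariate(lam1)        # channel-1 arrival time
        t2 = soa + tau2 + rng.expovariate(lam2)  # channel-2 arrival, offset by SOA
        if abs(t2 - t1) <= delta:
            hits += 1
    return hits / n

# The proportion of "simultaneous" responses should peak near SOA = 0
# and fall off at large stimulus-onset asynchronies.
p_sync = simulate_sj2(0.0)
p_async = simulate_sj2(300.0)
```

The fitting routines the abstract describes work with the closed-form psychometric functions of this model rather than simulation; the sketch only makes the generative assumptions concrete.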
Abstract:
Research on the perception of temporal order uses either temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks, in both of which two stimuli are presented with some temporal delay and observers must judge the order of presentation. Results generally differ across tasks, raising concerns about whether they measure the same processes. We present a model including sensory and decisional parameters that places these tasks in a common framework that allows studying their implications on observed performance. TOJ tasks imply specific decisional components that explain the discrepancy of results obtained with TOJ and SJ tasks. The model is also tested against published data on audiovisual temporal-order judgments, and the fit is satisfactory, although model parameters are more accurately estimated with SJ tasks. Measures of latent point of subjective simultaneity and latent sensitivity are defined that are invariant across tasks by isolating the sensory parameters governing observed performance, whereas decisional parameters vary across tasks and account for observed differences across them. Our analyses concur with other evidence advising against the use of TOJ tasks in research on perception of temporal order.
Abstract:
Investigating the variability of Agulhas leakage, the volume transport of water from the Indian Ocean to the South Atlantic Ocean, is highly relevant due to its potential contribution to the Atlantic Meridional Overturning Circulation, as well as to the global circulation of heat and salt and hence to global climate. Quantifying Agulhas leakage is challenging due to the non-linear nature of this process; current observations are insufficient to estimate its variability, and ocean models all have biases in this region, even at high resolution. An Eulerian threshold integration method is developed to examine the mechanisms of Agulhas leakage variability in six ocean model simulations of varying resolution. This intercomparison, based on the circulation and thermohaline structure at the Good Hope line, a transect to the southwest of the southern tip of Africa, is used to identify features that are robust regardless of the model used, and takes into account the thermohaline biases of each model. When determined by a passive tracer method, 60% of the magnitude of Agulhas leakage is captured, along with more than 80% of its temporal fluctuations, suggesting that the method is appropriate for investigating the variability of Agulhas leakage. In all simulations but one, the major driver of variability is associated with mesoscale features passing through the section. High-resolution (<1/10 deg.) hindcast models agree on the temporal (2–4 cycles per year) and spatial (300–500 km) scales of these features, corresponding to observed Agulhas Rings. Coarser-resolution models (<1/4 deg.) reproduce similar time scales of Agulhas leakage variability in spite of their difficulties in representing Agulhas ring properties. A coarse-resolution climate model (2 deg.) does not resolve the spatio-temporal mechanism of Agulhas leakage variability and is hence expected to underestimate the contribution of the Agulhas Current System to climate variability.
Mining and Verification of Temporal Events with Applications in Computer Micro-Architecture Research
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
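The FOLCSL part of the framework synthesizes programs that read an event trace and check that the model's invariants hold. A minimal hand-rolled analogue of such a checker, for one common invariant shape ("every request event is eventually followed by a matching response event"), might look like this; the event names and trace format are illustrative, not the dissertation's actual schema:

```python
def check_response_invariant(trace, request="fetch", response="commit"):
    """Sketch of a trace-checking program of the kind the FOLCSL
    translator synthesizes: verify that every `request` event in the
    trace is eventually followed by a `response` event carrying the
    same instruction id. Returns the ids that violate the invariant.
    Event names ("fetch"/"commit") and the (kind, id) tuple format
    are hypothetical."""
    pending = set()
    for kind, instr_id in trace:
        if kind == request:
            pending.add(instr_id)     # obligation opened
        elif kind == response:
            pending.discard(instr_id) # obligation discharged
    return sorted(pending)            # unmet obligations = violations

good_trace = [("fetch", 1), ("fetch", 2), ("commit", 1), ("commit", 2)]
bad_trace = [("fetch", 1), ("fetch", 2), ("commit", 1)]
```

A real synthesized checker would be generated from a first-order specification rather than hand-written, and would handle arbitrary invariants, but the input/output contract (trace in, violation signal out) is the same.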
Abstract:
In this paper, we present a fuzzy approach to the Reed-Frost model for epidemic spreading that takes into account uncertainties in the diagnosis of the infection. The heterogeneity of the infected group is based on the clinical signals of the individuals (symptoms, laboratory exams, medical findings, etc.), which are incorporated into the dynamics of the epidemic. The infectivity level is time-varying, and the classification of the individuals is performed through fuzzy relations. Simulations of a real problem, with data from a viral epidemic in a children's daycare, are performed, and the results are compared with a stochastic Reed-Frost generalization.
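For context, the classical Reed-Frost chain that the paper fuzzifies is a simple discrete-time recursion; a sketch, with an illustrative fuzzy membership function standing in for the paper's (unspecified) fuzzy relations:

```python
def reed_frost(s0, c0, p, steps):
    """Classical Reed-Frost chain in its deterministic expectation form:
    C_{t+1} = S_t * (1 - (1 - p)^{C_t}),  S_{t+1} = S_t - C_{t+1},
    where S = susceptibles, C = cases, p = per-contact infection
    probability. Returns the case counts over time."""
    s, c = float(s0), float(c0)
    history = [c]
    for _ in range(steps):
        new_c = s * (1.0 - (1.0 - p) ** c)
        s -= new_c
        c = new_c
        history.append(c)
    return history

def fuzzy_infectivity(signal, low=0.2, high=0.8):
    """Illustrative fuzzy membership mapping a clinical-signal score in
    [0, 1] to an infectivity level (a linear ramp between `low` and
    `high`). The paper's actual fuzzy relations over symptoms and exams
    are not reproduced here; this only shows the shape of the idea."""
    if signal <= low:
        return 0.0
    if signal >= high:
        return 1.0
    return (signal - low) / (high - low)
```

In the fuzzy variant, each infected individual contributes to transmission in proportion to a membership value like the one above, instead of the all-or-nothing infectivity of the classical chain.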
Abstract:
The dengue virus has a single-stranded positive-sense RNA genome of approximately 10,700 nucleotides, with a single open reading frame that encodes three structural (C, prM, and E) and seven nonstructural (NS1, NS2A, NS2B, NS3, NS4A, NS4B, and NS5) proteins. It possesses four antigenically distinct serotypes (DENV 1-4). Many phylogenetic studies address particularities of the different serotypes using convenience samples that are not conducive to a spatio-temporal analysis in a single urban setting. We describe the pattern of spread of distinct lineages of DENV-3 circulating in Sao Jose do Rio Preto, Brazil, during 2006. Blood samples from patients presenting dengue-like symptoms were collected for DENV testing. We performed M-N-PCR using primers based on NS5 for virus detection and identification. The fragments were purified from PCR mixtures and sequenced. The positive dengue cases were geo-coded. To type the sequenced samples, 52 reference sequences were aligned. The resulting dataset was used for iterative phylogenetic reconstruction under the maximum likelihood criterion. The best demographic model, the rate of growth, the rate of evolutionary change, and the Time to Most Recent Common Ancestor (TMRCA) were estimated, and the basic reproductive rate during the epidemics was estimated. We obtained sequences from 82 patients among 174 blood samples and were able to geo-code 46 sequences. The alignment generated a 399-nucleotide-long dataset with 134 taxa. The phylogenetic analysis indicated that all samples were of DENV-3 and related to strains circulating on the isle of Martinique in 2000-2001. Sixty DENV-3 isolates from Sao Jose do Rio Preto formed a monophyletic group (lineage 1), closely related to the remaining 22 isolates (lineage 2). We assume that these lineages appeared before 2006 on different occasions.
By transforming the inferred exponential growth rates into the basic reproductive rate, we obtained R0 = 1.53 for lineage 1 and R0 = 1.13 for lineage 2. Under the exponential model, the TMRCA of lineage 1 dated to 1 year, and that of lineage 2 to 3.4 years, before the last sampling. The possibility of inferring spatio-temporal dynamics from genetic data has generally been little explored, and it may shed light on DENV circulation. The use of both geographically and temporally structured phylogenetic data provided a detailed view of the spread of at least two dengue viral strains in a populated urban area.
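The abstract does not specify which growth-rate-to-R0 transform was used; one standard choice, shown here purely for illustration with a hypothetical generation interval (not the study's value), is the linear approximation R0 = 1 + r·D:

```python
def r0_from_growth_rate(r, generation_time):
    """Convert an exponential growth rate r (per year) into a basic
    reproductive rate via the linear approximation R0 = 1 + r * D,
    where D is the mean generation interval (years). This is one
    standard estimator, not necessarily the one used in the study;
    the inputs below are hypothetical."""
    return 1.0 + r * generation_time

# Hypothetical numbers purely for illustration:
r0_example = r0_from_growth_rate(r=1.0, generation_time=0.2)
```

More refined transforms exist (e.g. using the full generation-interval distribution), but all share this input/output shape.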
Abstract:
This study proposes a simplified mathematical model to describe the processes occurring in an anaerobic sequencing batch biofilm reactor (ASBBR) treating lipid-rich wastewater. The reactor, subjected to rising organic loading rates, contained biomass immobilized on cubic polyurethane foam matrices and was operated at 32 ± 2 °C, using 24-h batch cycles. In the adaptation period, the reactor was fed a synthetic substrate for 46 days and operated without agitation. When agitation was raised to 500 rpm, the organic loading rate (OLR) was increased from 0.3 g COD·L⁻¹·day⁻¹ to 1.2 g COD·L⁻¹·day⁻¹. The ASBBR was then fed fat-rich wastewater (dairy wastewater) over an operation period of 116 days, during which four operational conditions (OCs) were tested: 1.1 ± 0.2 g COD·L⁻¹·day⁻¹ (OC1), 4.5 ± 0.4 g COD·L⁻¹·day⁻¹ (OC2), 8.0 ± 0.8 g COD·L⁻¹·day⁻¹ (OC3), and 12.1 ± 2.4 g COD·L⁻¹·day⁻¹ (OC4). The bicarbonate alkalinity (BA)/COD supplementation ratio was 1:1 at OC1, 1:2 at OC2, and 1:3 at OC3 and OC4. Total COD removal efficiencies were higher than 90%, with constant production of bicarbonate alkalinity, in all OCs tested. After the process reached stability, temporal profiles of substrate consumption were obtained. A simplified first-order model was fitted to these experimental data, allowing the inference of kinetic parameters. A simplified mathematical model correlating soluble COD with volatile fatty acids (VFA) was also proposed, and through it the consumption rates of intermediate products such as propionic and acetic acids were inferred. The results showed that the microbial consortium worked properly and high efficiencies were obtained, even at high initial substrate concentrations, which led to the accumulation of intermediate metabolites and caused low specific consumption rates.
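A simplified first-order substrate-decay model of the kind fitted above can be sketched as follows; the functional form is the standard one (exponential decay toward a residual concentration), while the fitting procedure shown (log-linear least squares) is an assumption, since the abstract does not describe it:

```python
import math

def first_order_conc(t, c0, c_res, k):
    """First-order substrate-decay model:
    C(t) = C_res + (C0 - C_res) * exp(-k * t),
    with C0 the initial and C_res the residual substrate concentration
    (e.g. g COD/L) and k the first-order rate constant (1/h)."""
    return c_res + (c0 - c_res) * math.exp(-k * t)

def fit_k(times, concs, c0, c_res):
    """Estimate k by least squares on the linearised form
    ln((C - C_res) / (C0 - C_res)) = -k * t  (a line through the
    origin). A sketch only; the study's actual fitting procedure is
    not detailed in the abstract."""
    xs, ys = [], []
    for t, c in zip(times, concs):
        if t > 0 and c > c_res:
            xs.append(t)
            ys.append(math.log((c - c_res) / (c0 - c_res)))
    # slope through the origin minimising sum (y + k*x)^2
    return -sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Round-trip check on synthetic data generated from the model itself
# (hypothetical values, not the study's measurements):
times = [1, 2, 4, 8, 12, 24]
concs = [first_order_conc(t, c0=4.0, c_res=0.3, k=0.25) for t in times]
k_hat = fit_k(times, concs, c0=4.0, c_res=0.3)
```

The same fitted curve then yields the temporal substrate-consumption profiles from which the kinetic parameters are read off.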
Abstract:
Context. About 2/3 of the Be stars present the so-called V/R variations, a phenomenon characterized by the quasi-cyclic variation in the ratio between the violet and red emission peaks of the HI emission lines. These variations are generally explained by global oscillations in the circumstellar disk forming a one-armed spiral density pattern that precesses around the star with a period of a few years. Aims. This paper presents self-consistent models of polarimetric, photometric, spectrophotometric, and interferometric observations of the classical Be star zeta Tauri. The primary goal is to conduct a critical quantitative test of the global oscillation scenario. Methods. Detailed three-dimensional, NLTE radiative transfer calculations were carried out using the radiative transfer code HDUST. The most up-to-date research on Be stars was used as input for the code in order to include a physically realistic description for the central star and the circumstellar disk. The model adopts a rotationally deformed, gravity darkened central star, surrounded by a disk whose unperturbed state is given by a steady-state viscous decretion disk model. It is further assumed that this disk is in vertical hydrostatic equilibrium. Results. By adopting a viscous decretion disk model for zeta Tauri and a rigorous solution of the radiative transfer, a very good fit of the time-average properties of the disk was obtained. This provides strong theoretical evidence that the viscous decretion disk model is the mechanism responsible for disk formation. The global oscillation model successfully fitted spatially resolved VLTI/AMBER observations and the temporal V/R variations in the H alpha and Br gamma lines. This result convincingly demonstrates that the oscillation pattern in the disk is a one-armed spiral. Possible model shortcomings, as well as suggestions for future improvements, are also discussed.
Abstract:
Currently there is a trend toward expansion of the area cropped with sugarcane (Saccharum officinarum L.), driven by an increase in the world demand for biofuels due to economic, environmental, and geopolitical issues. Although sugarcane is traditionally harvested by burning dried leaves and tops, unburned, mechanized harvest has been progressively adopted. Process-based models are useful in understanding the effects of plant litter on soil C dynamics. The objective of this work was to use the CENTURY model to evaluate the effect of sugarcane residue management on the temporal dynamics of soil C. The approach taken was to parameterize the CENTURY model for the sugarcane crop, simulate the temporal dynamics of soil C, validate the model against field experiment data, and finally make long-term predictions regarding soil C. The main focus of this work was the comparison of soil C stocks between the burned and unburned litter management systems, but the effects of mineral fertilizer and organic residue applications were also evaluated. The simulations were performed with data from experiments of different durations, from 1 to 60 yr, in Goiana and Timbauba, Pernambuco, and Pradopolis, Sao Paulo, all in Brazil, and in Mount Edgecombe, KwaZulu-Natal, South Africa. It was possible to simulate the temporal dynamics of soil C (R² = 0.89). The predictions made with the model revealed a long-term trend toward higher soil C stocks under unburned management. This increase is conditioned by factors such as climate, soil texture, time of adoption of the unburned system, and N fertilizer management.