940 results for Temporal constraints analysis


Relevance:

40.00%

Publisher:

Abstract:

The Twitter network has been called the most widely used microblogging application today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium for expressing sentiment and opinion. It is also a notable medium for information dissemination, including breaking news on diverse issues, since its launch in 2007. Many organisations, individuals and even government bodies follow activities on the network to learn how their audience reacts to tweets that affect them. Postings on Twitter (known as tweets) can be used to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is to include a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics, and consequently the dynamics of tweets. We call our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and 'dead' rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
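The rule-dynamics idea above can be illustrated with a minimal sketch: mine hashtag co-occurrence rules in two time windows and compare the rule sets. The support threshold, the toy tweets and the three labels used here are illustrative assumptions, not the paper's actual TRCM parameters (which also cover emerging and unexpected rules).

```python
# Minimal sketch of TRCM-style rule-change detection between two time windows.
# Thresholds and toy data are illustrative assumptions, not the paper's values.
from itertools import combinations
from collections import Counter

def mine_rules(tweets, min_support=2):
    """Return hashtag pairs (as frozensets) co-occurring in >= min_support tweets."""
    counts = Counter()
    for tags in tweets:
        for pair in combinations(sorted(set(tags)), 2):
            counts[frozenset(pair)] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

def classify_dynamics(old_rules, new_rules):
    """Label rules as 'new' (later window only), 'dead' (earlier window only)
    or 'unchanged' (present in both)."""
    return {
        "new": new_rules - old_rules,
        "dead": old_rules - new_rules,
        "unchanged": old_rules & new_rules,
    }

week1 = [["#news", "#election"], ["#news", "#election"],
         ["#music", "#live"], ["#music", "#live"]]
week2 = [["#news", "#election"], ["#news", "#election"],
         ["#sports", "#final"], ["#sports", "#final"]]

dynamics = classify_dynamics(mine_rules(week1), mine_rules(week2))
print(dynamics["dead"])  # the #music/#live rule disappeared in week 2
```

A full implementation would compare rules (antecedent/consequent with confidence values), not bare co-occurring pairs, but the window-to-window set comparison is the same.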

Relevance:

40.00%

Publisher:

Abstract:

The kinematic expansion history of the universe is investigated using the 307 type Ia supernovae from the Union Compilation set. Three simple parameterizations for the deceleration parameter (constant, linear and abrupt transition) and two models explicitly parametrized by the cosmic jerk parameter (constant and variable) are considered. Likelihood and Bayesian analyses are employed to find best-fit parameters and to compare the models among themselves and with the flat ΛCDM model. Analytical expressions and estimates are given for the present-day deceleration and cosmic jerk parameters (q₀ and j₀) and for the transition redshift (z_t) from a past phase of cosmic deceleration to the current phase of acceleration. All models characterize an accelerated expansion of the universe today and largely indicate that it was decelerating in the past, with a transition redshift around 0.5. The cosmic jerk is not strongly constrained by the present supernova data. For the most realistic kinematic models, the 1σ confidence limits imply the following ranges: q₀ ∈ [-0.96, -0.46], j₀ ∈ [-3.2, -0.3] and z_t ∈ [0.36, 0.84], which are compatible with the ΛCDM predictions q₀ = -0.57 ± 0.04, j₀ = -1 and z_t = 0.71 ± 0.08. We find that even very simple kinematic models describe the data as well as the concordance ΛCDM model, and that the current observations are not powerful enough to discriminate among all of them.
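For the linear parameterization mentioned above, q(z) = q0 + q1·z, the transition redshift follows directly from q(z_t) = 0, giving z_t = -q0/q1. A short sketch; the parameter values are illustrative (q0 taken from the quoted ΛCDM fit, q1 = 1 is our assumption), not fitted values from the paper.

```python
# Sketch of the linear deceleration-parameter model q(z) = q0 + q1*z.
# Setting q(z_t) = 0 gives the deceleration-to-acceleration transition
# redshift z_t = -q0/q1. Parameter values below are illustrative only.
def q(z, q0, q1):
    """Linear parameterization of the deceleration parameter."""
    return q0 + q1 * z

def transition_redshift(q0, q1):
    """Redshift where expansion switches from decelerating (q > 0)
    to accelerating (q < 0); requires q0 < 0 < q1."""
    return -q0 / q1

q0, q1 = -0.57, 1.0           # q0 from the quoted LCDM fit; q1 assumed
zt = transition_redshift(q0, q1)
print(f"z_t = {zt:.2f}")      # lands inside the quoted range [0.36, 0.84]
```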

Relevance:

40.00%

Publisher:

Abstract:

Several accounts put forth to explain the flash-lag effect (FLE) rely mainly on either spatial or temporal mechanisms. Here we investigated the relationship between these mechanisms through psychophysical and theoretical approaches. In a first experiment we assessed the magnitudes of the FLE and of temporal-order judgments performed under identical visual stimulation. The results were interpreted by means of simulations of an artificial neural network, which was also employed to make predictions concerning the FLE. The model predicted that a spatio-temporal mislocalisation would emerge from two moving stimuli, one continuous and one abrupt-onset. Additionally, a straightforward prediction of the model was that the magnitude of this mislocalisation should be task-dependent, increasing when the abrupt-onset moving stimulus switches from serving as a temporal marker only to serving as both a temporal and a spatial marker. Our findings confirmed the model's predictions and point to an indissoluble interplay between spatial facilitation and processing delays in the FLE.

Relevance:

40.00%

Publisher:

Abstract:

The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. This increase in design complexity has created a need for more efficient verification tools that incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied or not. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimuli-dependent. Hence, to ensure that the critical situation is taken into account, one must exercise all possible input patterns, which is obviously infeasible given the complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only the circuit topology to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may yield overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false paths. Functional timing analysis, in turn, considers not only circuit topology but also the temporal and functional relations between circuit elements.
Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques, and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates, which is the central concern of this thesis. In addition, and as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
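The topological analysis described above reduces to a longest-path computation on the circuit's graph. A minimal sketch; the gate names and delays form a made-up toy circuit, and a functional analyzer would additionally prune false (non-sensitizable) paths.

```python
# Minimal sketch of topological timing analysis: the combinational block is
# a directed acyclic graph whose edges carry gate delays, and the critical
# delay is the longest source-to-output path. Toy circuit, illustrative delays.
from functools import lru_cache

# edges[u] = list of (successor, delay-through-u-to-successor) pairs
edges = {
    "in": [("g1", 2), ("g2", 3)],
    "g1": [("g3", 2)],
    "g2": [("g3", 1)],
    "g3": [("out", 1)],
    "out": [],
}

@lru_cache(maxsize=None)
def arrival(node):
    """Longest accumulated delay from 'node' to any primary output."""
    succ = edges[node]
    if not succ:
        return 0
    return max(delay + arrival(v) for v, delay in succ)

print(arrival("in"))  # critical (topological) delay of the toy circuit: 5
```

Memoization makes this linear in the number of edges; without false-path analysis the result is an upper bound on the true critical delay, which is exactly the pessimism the thesis attributes to topological analyzers.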

Relevance:

40.00%

Publisher:

Abstract:

The primary objective of this study was to estimate the amount of gas not emitted into the air in areas cultivated with sugarcane (Saccharum officinarum) that were mechanically harvested. CBERS-2/CCD satellite images of northwestern São Paulo State, from 08-13-2004, 08-14-2005, 08-15-2006 and 08-16-2007, were processed using the Geographic Information System (GIS) IDRISI 15.0. Areas of interest (the mechanically harvested sugarcane fields) were identified and quantified based on the spectral response of the bands studied. From these data, the amount of gas not emitted was evaluated according to the estimate equation proposed by the Intergovernmental Panel on Climate Change (IPCC). The results of 396.65 km² (5.91% for 2004), 447.56 km² (6.67% for 2005), 511.54 km² (7.62% for 2006) and 474.60 km² (7.07% for 2007), calculated from a total sugarcane area of 6,710.89 km², showed a significant increase of mechanical harvesting in the study area and a reduction of gas emissions of more than 300,000 t yr⁻¹.
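The area fractions quoted above can be recomputed directly from the reported figures. Only the percentages are checked here; the avoided-emission total depends on the IPCC estimate equation, which the abstract does not reproduce, so it is not modeled.

```python
# Recompute the mechanically harvested share of the total sugarcane area
# from the figures reported in the abstract (km^2 per year).
total_km2 = 6710.89
harvested = {2004: 396.65, 2005: 447.56, 2006: 511.54, 2007: 474.60}

shares = {year: round(100 * area / total_km2, 2)
          for year, area in harvested.items()}
print(shares)  # matches the percentages quoted in the abstract
```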

Relevance:

40.00%

Publisher:

Abstract:

Constrained systems in quantum field theories call for a careful study of the diverse classes of constraints and for consistency checks on their temporal evolution. Here we study the functional structure of the free electromagnetic and pure Yang-Mills fields in front-form coordinates with the null-plane gauge condition. It is seen that in this framework we can deal with stricto sensu physical fields.

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

The Precambrian crystalline basement of southeast Brazil is affected by many Phanerozoic reactivations of shear zones that developed at the end of the Neoproterozoic, during the Brasiliano orogeny. To correlate these reactivations with specific tectonic events, a multidisciplinary study was carried out, involving geology, paleostress and structural analysis of faults, combined with apatite fission-track methods, along the northeastern border of the Paraná basin in southeast Brazil. The results show that the study area consists of three main tectonic domains, which record different episodes of uplift and fault reactivation. These faults were brittle in character and produced multiple generations of fault products, such as pseudotachylytes and ultracataclasites, foliated cataclasites and fault gouges. Based on geological evidence and fission-track data, an uplift of basement rocks and related tectonic subsidence, with consequent deposition in the Paraná basin, were modeled. The reactivations of the basement record successive uplift events during the Phanerozoic, dated via corrected fission-track ages at 387 ± 50 Ma (Ordovician), 193 ± 19 Ma (Triassic), 142 ± 18 Ma (Jurassic), 126 ± 11 Ma (Early Cretaceous), 89 ± 10 Ma (Late Cretaceous) and 69 ± 10 Ma (Late Cretaceous). These results indicate differential uplift of tectonic domains of basement units, probably related to Paraná basin subsidence. Six major sedimentary units (supersequences), deposited with their bounding unconformities, seem to have a close relationship with the orogenic events during the evolution of southwestern Gondwana. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance:

40.00%

Publisher:

Abstract:

The formation of sulfated zirconia films from a sol-gel derived aqueous suspension is subjected to double optical monitoring during batch dip coating. Interpretation of interferometric patterns, previously obscured by a variable refractive index, is now made possible by adding a direct real-time measurement of that index with a polarimetric technique. Significant sensitivity of the resulting physical-thickness and refractive-index curves (uncertainties of ±7 nm and ±0.005, respectively) to temporal film evolution is shown under different withdrawal speeds. As a first contribution to the quantitative understanding of temporal film formation with varying nanostructure during dip coating, a detailed analysis is directed at the stage of the process dominated by mass drainage, whose simple modeling with a temporal t^(-1/2) dependence is verified experimentally. © 2006 Elsevier B.V. All rights reserved.
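The drainage-stage scaling verified above can be written as h(t) = A·t^(-1/2). A minimal sketch; the prefactor A and the sample times are illustrative assumptions, not the paper's measured values.

```python
# Sketch of the drainage-dominated stage of dip coating, with film
# thickness modeled as h(t) = A * t**(-1/2). Prefactor A is illustrative.
def thickness(t, A=100.0):
    """Film thickness (nm, say) under the t^(-1/2) drainage model."""
    return A * t ** -0.5

# Under this law, quadrupling the elapsed time halves the thickness:
h1, h4 = thickness(1.0), thickness(4.0)
print(h1, h4)  # 100.0 50.0
```

This fixed ratio h(t)/h(4t) = 2 is the kind of signature that lets the t^(-1/2) dependence be checked against the measured thickness curves.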

Relevance:

40.00%

Publisher:

Abstract:

This paper reports on a process to validate a revised version of a system for coding classroom discourse in foreign-language lessons, a context in which the dual role of language (as content and as means of communication) and the speakers' specific pedagogical aims lead to a certain degree of ambiguity in language analysis. The language used by teachers and students has been extensively studied, and a framework of concepts concerning classroom discourse is well established. Models for coding classroom language need, however, to be revised when they are applied to specific research contexts. The application and revision of an initial framework can lead to the development of earlier models and to the redefinition of previously established categories of analysis, which then have to be validated. The procedures followed to validate a coding system are reported here as guidelines for conducting research under similar circumstances. The advantages of using instruments that incorporate two types of data, quantitative measures and qualitative information from raters' metadiscourse, are discussed. It is suggested that such a procedure can contribute to the validation process itself, towards attaining reliability of research results, and can also indicate some constraints of the adopted research methodology.
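One common quantitative measure in validation studies of coding systems is chance-corrected inter-rater agreement; Cohen's kappa is a standard choice. Both the statistic and the toy codings below are our assumptions for illustration; the abstract does not name the specific measures used.

```python
# Cohen's kappa for two raters applying a coding scheme to the same turns.
# The category labels and codings are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["question", "question", "feedback", "explain", "explain", "feedback"]
b = ["question", "question", "feedback", "explain", "feedback", "feedback"]
print(round(cohens_kappa(a, b), 3))  # 0.75
```

Values near 1 indicate agreement well above chance; ambiguous categories of the kind the abstract describes typically surface as systematic disagreements that qualitative rater metadiscourse can then explain.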

Relevance:

40.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)