942 results for Critical clearing time


Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Recent research into sea ice friction has focussed on ways to provide a model which maintains much of the clarity and simplicity of Amontons' law, yet also accounts for memory effects. One promising avenue of research has been to adapt the rate- and state-dependent models which are prevalent in rock friction. In such models it is assumed that there is some fixed critical slip displacement, which is effectively a measure of the displacement over which memory effects might be considered important. Here we show experimentally that a fixed critical slip displacement is not a valid assumption in ice friction, whereas a constant critical slip time appears to hold across a range of parameters and scales. As a simple rule of thumb, memory effects persist to a significant level for 10 s. We then discuss the implications of this finding for modelling sea ice friction and for our understanding of friction in general.
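For contrast, the rate- and state-dependent framework the authors adapt can be sketched with the standard Dieterich "aging" law, in which memory decays over a fixed critical slip displacement D_c, so the memory time scales as D_c/V. All parameter values below are illustrative, not from these experiments:

```python
import math

# Standard rate-and-state friction (Dieterich "aging" law), as adapted
# from rock friction; parameter values are illustrative only, not taken
# from the sea-ice experiments described above.
MU0, A, B = 0.6, 0.010, 0.015    # reference friction and rate/state constants
V0 = 1e-6                        # reference slip velocity (m/s)
DC = 1e-5                        # critical slip displacement (m)

def friction(v, theta):
    """Rate-and-state friction coefficient at velocity v and state theta."""
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def evolve(v, theta, dt):
    """One explicit Euler step of the aging law d(theta)/dt = 1 - v*theta/DC."""
    return theta + dt * (1.0 - v * theta / DC)

# Velocity step V0 -> 10*V0: the state (memory) variable relaxes over a
# slip of order DC, i.e. over a *time* of order DC/v that shrinks as v
# grows -- a fixed critical slip displacement implies a velocity-dependent
# memory time, which is what the experiments above argue against for ice.
v = 10 * V0
theta = DC / V0                  # steady state at the old velocity
t, dt = 0.0, 1e-3
while abs(v * theta / DC - 1.0) > 1e-3:   # run until the new steady state
    theta = evolve(v, theta, dt)
    t += dt
print(f"mu={friction(v, theta):.4f}; memory decayed after ~{t:.2f} s "
      f"(timescale DC/v = {DC / v:.2f} s)")
```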

Relevance:

90.00%

Publisher:

Abstract:

Critical real-time embedded (CRTE) systems require safe and tight worst-case execution time (WCET) estimations to provide the required safety levels and keep costs low. However, CRTE systems require increasing performance to satisfy the needs of existing and new features. Such performance can only be achieved by means of more aggressive hardware architectures, which are much harder to analyze from a WCET perspective. The main features considered include cache memories and multi-core processors. Thus, although such features provide higher performance, current WCET analysis methods are unable to provide tight WCET estimations. In fact, WCET estimations become worse than for simpler and less powerful hardware. The main reason is that hardware behavior is deterministic but unknown and, therefore, worst-case behavior must be assumed most of the time, leading to large WCET estimations. The purpose of this project is to develop new hardware designs, together with WCET analysis tools, able to provide tight and safe WCET estimations. To do so, those pieces of hardware whose behavior is not easily analyzable, due to lack of accurate information during WCET analysis, will be enhanced to produce a probabilistically analyzable behavior. Thus, even if the worst-case behavior cannot be removed, its probability can be bounded, and hence a safe and tight WCET can be provided for a particular safety level, in line with the safety levels of the remaining components of the system. During the first year of the project we developed most of the evaluation infrastructure, as well as the hardware techniques to analyze cache memories. During the second year those techniques were evaluated, and new purely-software techniques were developed.
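The probabilistic-analyzability idea can be illustrated with a toy randomized cache: once hardware placement is randomized, execution time becomes a random variable whose tail can be bounded at a chosen exceedance probability. The cache geometry, latencies, and the plain empirical quantile below are assumptions for the sketch, not the project's actual design:

```python
import random

# Toy measurement-based probabilistic timing analysis: random cache
# placement makes execution time a random variable, so a WCET bound can
# be attached to an exceedance probability instead of assuming an
# unknown deterministic worst case.  All numbers here are invented.
random.seed(1)
N_LINES, N_ACCESSES = 16, 200
HIT, MISS = 1, 10          # access latencies in cycles

def run_once():
    """Execution time of one run: 24 addresses randomly placed into a
    16-line cache, accessed cyclically (simple conflict model)."""
    placement = [random.randrange(N_LINES) for _ in range(24)]
    cache = [None] * N_LINES
    cycles = 0
    for i in range(N_ACCESSES):
        addr = i % 24
        line = placement[addr]
        if cache[line] == addr:
            cycles += HIT
        else:
            cycles += MISS
            cache[line] = addr
    return cycles

times = sorted(run_once() for _ in range(2000))
# pWCET estimate at exceedance probability 1e-3 (an empirical quantile
# here; real analyses fit extreme-value theory to the tail).
pwcet = times[int(0.999 * len(times))]
print("observed max:", times[-1], " pWCET(1e-3):", pwcet)
```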

Relevance:

90.00%

Publisher:

Abstract:

We consider the critical short-time evolution of magnetic and droplet-percolation order parameters for the Ising model in two and three dimensions, through Monte Carlo simulations with the (local) heat-bath method. We find qualitatively different dynamic behaviors for the two types of order parameters. More precisely, we find that the percolation order parameter does not have a power-law behavior as encountered for the magnetization, but develops a scale (related to the relaxation time to equilibrium) in the Monte Carlo time. We argue that this difference is due to the difficulty in forming large clusters at the early stages of the evolution. Our results show that, although the descriptions in terms of magnetic and percolation order parameters may be equivalent in the equilibrium regime, greater care must be taken to interpret percolation observables at short times. In particular, this concerns the attempts to describe the dynamics of the deconfinement phase transition in QCD using cluster observables.
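The (local) heat-bath dynamics used in such short-time studies can be sketched as follows; this is a minimal single-sample illustration of the update rule, with lattice size and run length chosen for speed rather than for actually measuring short-time exponents:

```python
import math, random

# Minimal heat-bath dynamics for the 2D Ising model at the exact
# critical temperature, the setting of short-time critical dynamics
# studies.  Real simulations average many samples on larger lattices.
random.seed(7)
L = 32
T_C = 2.0 / math.log(1.0 + math.sqrt(2.0))   # exact 2D critical temperature
BETA = 1.0 / T_C

def local_field(s, i, j):
    """Sum of the four nearest-neighbour spins (periodic boundaries)."""
    return (s[(i + 1) % L][j] + s[(i - 1) % L][j]
            + s[i][(j + 1) % L] + s[i][(j - 1) % L])

def heat_bath_sweep(s):
    """One sweep: each spin is redrawn as +1 with its local Boltzmann
    probability, independently of its previous value (heat-bath rule)."""
    for i in range(L):
        for j in range(L):
            h = local_field(s, i, j)
            p_up = 1.0 / (1.0 + math.exp(-2.0 * BETA * h))
            s[i][j] = 1 if random.random() < p_up else -1

def magnetization(s):
    return sum(sum(row) for row in s) / (L * L)

# Small initial magnetization: random spins with a slight upward bias.
spins = [[1 if random.random() < 0.52 else -1 for _ in range(L)]
         for _ in range(L)]
history = [magnetization(spins)]
for _ in range(10):                 # short Monte Carlo time
    heat_bath_sweep(spins)
    history.append(magnetization(spins))
print(["%.3f" % m for m in history])
```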

Relevance:

90.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

90.00%

Publisher:

Abstract:

The precise timing of events in the brain has consequences for intracellular processes, synaptic plasticity, integration and network behaviour. Pyramidal neurons, the most widespread excitatory neurons of the neocortex, have multiple spike initiation zones, which interact via dendritic and somatic spikes actively propagating in all directions within the dendritic tree. For these neurons, therefore, both the location and timing of synaptic inputs are critical. The time window within which the backpropagating action potential can influence dendritic spike generation has been extensively studied in layer 5 neocortical pyramidal neurons of rat somatosensory cortex. Here, we re-examine this coincidence detection window for pyramidal cell types across the rat somatosensory cortex in layers 2/3, 5 and 6. We find that the time window for optimal interaction is widest and shifted in layer 5 pyramidal neurons relative to cells in layers 6 and 2/3. Inputs arriving at the same times and locations will therefore differentially affect spike-timing dependent processes in the different classes of pyramidal neurons.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to determine the critical irrigation time for common bean (Phaseolus vulgaris L. cv. Carioca) using infrared thermometry. Five treatments were analyzed: canopy temperature differences between plants and a well-watered control of about 1, 2, 3, 4, and 5±0.5ºC. Physiological variables and plant growth were analyzed to establish the best time to irrigate. There was a significant linear correlation between the index and stomatal resistance, transpiration rate, and leaf water potential. Although a significant linear correlation was found between the index and mean values of total dry matter, absolute growth rate, and leaf area index, no correlation was found with other growth indices such as relative growth rate, net assimilation rate, and leaf area ratio. Plants irrigated when their canopy temperature was 3±0.5ºC above the control had their mean relative growth rate increased by up to 59.7%, yielding 2,260.2 kg ha-1, with a reduction of 38.0% in the amount of water used. Plants irrigated when their canopy temperature was 4±0.5ºC above the control yielded 1,907.6 kg ha-1, although their mean relative growth rate was 4.0% below the control. These results show that the best moment to irrigate common bean is when its canopy temperature is between 3 and 4±0.5ºC above the control.
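As a practical reading of this result, an irrigation controller would compare canopy and control temperatures against the reported 3ºC threshold. The threshold choice and the sample readings below are illustrative, not data from the experiment:

```python
# Decision rule based on the reported result: irrigate when the canopy
# is about 3 °C (or more) warmer than a well-watered control.  The
# threshold and the readings below are assumptions for this sketch.
IRRIGATION_THRESHOLD_C = 3.0

def needs_irrigation(canopy_temp_c, control_temp_c,
                     threshold_c=IRRIGATION_THRESHOLD_C):
    """True when the canopy-minus-control difference reaches the threshold."""
    return (canopy_temp_c - control_temp_c) >= threshold_c

# Hypothetical (canopy, control) readings in °C: differences 1.4, 2.4, 4.1.
readings = [(29.8, 28.4), (30.7, 28.3), (32.1, 28.0)]
flags = [needs_irrigation(c, w) for c, w in readings]
print(flags)  # → [False, False, True]
```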

Relevance:

80.00%

Publisher:

Abstract:

Indices that report how stable or unstable a contingency is in an electrical power system have been the object of several studies in the last decades. In some approaches, indices are obtained from time-domain simulation; others explore the calculation of the stability margin from the so-called direct methods, or even by neural networks. The goal is always to obtain a fast and reliable way of analysing the large disturbances that might occur on power systems. A fast classification into stable and unstable, as a function of transient stability, is crucial for dynamic security analysis. Any good proposition for analysing contingencies must present some important features: classification of contingencies; precision and reliability; and computational efficiency. Indices obtained from time-domain simulations have been used to classify contingencies as stable or unstable. These indices are based on the concepts of coherence, transient energy conversion between kinetic and potential energy, and three dot products of state variables. The classification of contingencies using the indices individually is not reliable, since the performance of these indices varies with each simulated condition. However, collapsing these indices into a single one can improve the analysis significantly. In this paper, we present the results of an approach to filter contingencies by a simple classification into stable, unstable or marginal cases. This classification is performed from composite indices obtained from step-by-step simulation over a time period equal to the clearing time plus 0.5 second. Contingencies originally classified as stable or unstable do not require this extra simulation. The methodology requires an initial effort to obtain the values of the classification intervals and the weights. This is performed once for each power system and can then be used in different operating conditions and for different contingencies.
No misclassification occurred in any of the tests, i.e., we detected no stable case classified as unstable or vice versa. The methodology is thus well suited, as it allows a rapid conclusion about the stability of the system for the majority of the contingencies (stable or unstable cases). The tests, results and discussions are presented using two power systems: (1) the IEEE17 system, composed of 17 generators, 162 buses and 284 transmission lines; and (2) a South Brazilian system configuration, with 10 generators, 45 buses and 71 lines.
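The composite-index filtering scheme described above can be sketched as follows; the individual index values, weights, and interval bounds are invented for illustration, since the paper derives them once per power system:

```python
# Sketch of contingency filtering: several time-domain indices are
# collapsed into one composite index by a weighted sum, then compared
# against pre-computed intervals to classify the contingency.  Weights,
# bounds, and index values below are illustrative, not from the paper.
WEIGHTS = {"coherence": 0.4, "energy_conversion": 0.35, "dot_products": 0.25}
STABLE_BELOW, UNSTABLE_ABOVE = 0.35, 0.65   # classification intervals

def composite(indices):
    """Weighted sum of the individual stability indices."""
    return sum(WEIGHTS[name] * value for name, value in indices.items())

def classify(indices):
    c = composite(indices)
    if c < STABLE_BELOW:
        return "stable"       # no extra simulation needed
    if c > UNSTABLE_ABOVE:
        return "unstable"     # no extra simulation needed
    return "marginal"         # simulate up to clearing time + 0.5 s

cases = [
    {"coherence": 0.1, "energy_conversion": 0.2, "dot_products": 0.15},
    {"coherence": 0.9, "energy_conversion": 0.8, "dot_products": 0.7},
    {"coherence": 0.5, "energy_conversion": 0.5, "dot_products": 0.5},
]
print([classify(c) for c in cases])  # → ['stable', 'unstable', 'marginal']
```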

Relevance:

80.00%

Publisher:

Abstract:

We present results of our numerical study of the critical dynamics of percolation observables for the two-dimensional Ising model. We consider the (Monte Carlo) short-time evolution of the system with small initial magnetization and heat-bath dynamics. We find qualitatively different dynamic behaviors for the magnetization M and for Ω, the so-called strength of the percolating cluster, which is the order parameter of the percolation transition. More precisely, we obtain a (leading) exponential form for Ω as a function of the Monte Carlo time t, to be compared with the power-law increase encountered for M at short times. Our results suggest that, although the descriptions in terms of magnetic or percolation order parameters may be equivalent in the equilibrium regime, greater care must be taken to interpret percolation observables at short times.

Relevance:

80.00%

Publisher:

Abstract:

The effect of the ionosphere on the signals of Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) and the proposed European Galileo, is dependent on the ionospheric electron density, given by its Total Electron Content (TEC). Ionospheric time-varying density irregularities may cause scintillations, which are fluctuations in phase and amplitude of the signals. Scintillations occur more often at equatorial and high latitudes. They can degrade navigation and positioning accuracy and may cause loss of signal tracking, disrupting safety-critical applications, such as marine navigation and civil aviation. This paper addresses the results of initial research carried out on two fronts that are relevant to GNSS users if they are to counter ionospheric scintillations, i.e. forecasting and mitigating their effects. On the forecasting front, the dynamics of scintillation occurrence were analysed during the severe ionospheric storm that took place on the evening of 30 October 2003, using data from a network of GPS Ionospheric Scintillation and TEC Monitor (GISTM) receivers set up in Northern Europe. Previous results [1] indicated that GPS scintillations in that region can originate from ionospheric plasma structures from the American sector. In this paper we describe experiments that enabled confirmation of those findings. On the mitigation front we used the variance of the output error of the GPS receiver DLL (Delay Locked Loop) to modify the least squares stochastic model applied by an ordinary receiver to compute position. This error was modelled according to [2], as a function of the S4 amplitude scintillation index measured by the GISTM receivers. An improvement of up to 21% in relative positioning accuracy was achieved with this technique.
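The mitigation step can be sketched as a weighted least-squares position fix in which the assumed pseudorange variance grows with the measured S4 index, so scintillating satellites are down-weighted. The geometry matrix, variance model, and S4 values below are toy stand-ins for the DLL tracking-error model of the paper's reference [2]:

```python
import numpy as np

# Sketch of scintillation mitigation via the stochastic model: a
# satellite with strong amplitude scintillation (high S4) is assigned a
# larger range-error variance and hence less weight in the fix.
def range_variance(s4, sigma0=0.5):
    """Assumed pseudorange error variance (m^2), a toy monotone
    function of S4 standing in for the DLL tracking-error model."""
    return sigma0 ** 2 * (1.0 + 10.0 * s4 ** 2)

# Linearised positioning model rho = G x + e (toy geometry: three
# line-of-sight components plus a receiver-clock column).
G = np.array([[ 0.3,  0.5, 0.81, 1.0],
              [-0.6,  0.2, 0.77, 1.0],
              [ 0.1, -0.7, 0.70, 1.0],
              [ 0.5, -0.5, 0.71, 1.0],
              [-0.4, -0.4, 0.82, 1.0]])
x_true = np.array([1.0, -2.0, 0.5, 3.0])
s4 = np.array([0.1, 0.1, 0.8, 0.1, 0.1])   # one badly scintillating satellite
rng = np.random.default_rng(0)
rho = G @ x_true + rng.normal(0.0, np.sqrt(range_variance(s4)))

W = np.diag(1.0 / range_variance(s4))       # weight = inverse variance
x_wls = np.linalg.solve(G.T @ W @ G, G.T @ W @ rho)
x_ols = np.linalg.lstsq(G, rho, rcond=None)[0]
print("WLS error:", np.linalg.norm(x_wls - x_true),
      "OLS error:", np.linalg.norm(x_ols - x_true))
```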

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

Histograms of Oriented Gradients (HOG) provide excellent results in object detection and verification. However, their demanding processing requirements limit their applicability in some critical real-time scenarios, such as video-based on-board vehicle detection systems. In this work, an efficient HOG configuration for pose-based on-board vehicle verification is proposed, which alleviates both the processing requirements and the required feature vector length without reducing classification performance. The impact of some critical configuration and processing parameters on classification is analyzed in depth to propose a baseline efficient descriptor. Based on the analysis of its cells' contribution to classification, new view-dependent cell-configuration patterns are proposed, resulting in reduced descriptors which provide an excellent balance between performance and computational requirements, yielding higher verification rates than other works in the literature.
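The building block behind such descriptors, a per-cell gradient orientation histogram, can be sketched as follows. This uses the common baseline HOG configuration (9 unsigned-orientation bins, 8x8-pixel cells, no block normalisation), not the reduced view-dependent descriptor proposed in the work:

```python
import numpy as np

# Minimal HOG cell descriptor: per-cell histograms of gradient
# orientation weighted by gradient magnitude.  Parameters follow the
# common baseline configuration; block normalisation is omitted.
def hog_cells(image, cell=8, bins=9):
    gy, gx = np.gradient(image.astype(float))        # row/col gradients
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0     # unsigned orientation
    h, w = image.shape
    ch, cw = h // cell, w // cell
    bin_idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    hist = np.zeros((ch, cw, bins))
    for i in range(ch):
        for j in range(cw):
            sl = np.s_[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            hist[i, j] = np.bincount(bin_idx[sl].ravel(),
                                     weights=mag[sl].ravel(),
                                     minlength=bins)
    return hist.reshape(-1)      # feature vector: ch * cw * bins values

img = np.zeros((32, 32))
img[:, 16:] = 1.0                # vertical edge -> horizontal gradient
feat = hog_cells(img)
print(feat.shape)  # → (144,) i.e. 4 x 4 cells x 9 bins
```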

Relevance:

80.00%

Publisher:

Abstract:

The Accelerating Moment Release (AMR) preceding earthquakes with magnitude above 5 in Australia that occurred during the last 20 years was analyzed to test the Critical Point Hypothesis. Twelve earthquakes in the catalog were chosen based on a criterion for the number of nearby events. Results show that seven sequences with numerous events recorded leading up to the main earthquake exhibited accelerating moment release. Two occurred near in time and space to other earthquakes preceded by AMR. The remaining three sequences had very few events in the catalog, so the lack of AMR detected in the analysis may be related to catalog incompleteness. Spatio-temporal scanning of AMR parameters shows that 80% of the areas in which AMR occurred experienced large events. In areas of similar background seismicity with no large events, 10 out of 12 cases exhibit no AMR, and two others are false alarms where AMR was observed but no large event followed. The relationship between AMR and Load-Unload Response Ratio (LURR) was also studied. Both methods predict similar critical region sizes; however, the critical point time using AMR is slightly earlier than the time of the critical point LURR anomaly.
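AMR analyses commonly quantify acceleration by comparing a power-law time-to-failure fit of cumulative Benioff strain against a straight-line fit (the "c-value", with c < 1 indicating acceleration). The sketch below applies that idea to a synthetic catalogue; the magnitude-energy relation is the standard one, while the exponent, failure time, and catalogue itself are illustrative assumptions:

```python
import numpy as np

# Sketch of the AMR diagnostic on a synthetic accelerating catalogue:
# compare rms errors of a linear fit and a power-law time-to-failure
# fit of cumulative Benioff strain.  Exponent m, failure time t_f, and
# the catalogue are invented for illustration.
def benioff_strain(magnitudes):
    # Benioff strain = sqrt(seismic energy), with log10 E = 1.5 M + 4.8
    return np.sqrt(10.0 ** (1.5 * np.asarray(magnitudes) + 4.8))

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 20.0, 60))                     # event times
mags = 3.0 + 1.5 * (t / 20.0) ** 2 + rng.normal(0, 0.1, t.size)
cum = np.cumsum(benioff_strain(mags))                       # accelerating

def rms(y, yfit):
    return np.sqrt(np.mean((y - yfit) ** 2))

lin = np.polyval(np.polyfit(t, cum, 1), t)                  # linear fit
t_f, m = 20.5, 0.3                       # assumed failure time & exponent
basis = (t_f - t) ** m                   # cum ~ A - B*(t_f - t)^m is linear
pw = np.polyval(np.polyfit(basis, cum, 1), basis)           # in this basis

c = rms(cum, pw) / rms(cum, lin)
print("c-value:", round(c, 3), "-> acceleration" if c < 1 else "-> no AMR")
```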

Relevance:

80.00%

Publisher:

Abstract:

The published requirements for accurate measurement of heat transfer at the interface between two bodies have been reviewed. A strategy for reliable measurement has been established, based on the depth of the temperature sensors in the medium, on the inverse method parameters and on the time response of the sensors. Sources of both deterministic and stochastic errors have been investigated and a method to evaluate them has been proposed, with the help of a normalisation technique. The key normalisation variables are the duration of the heat input and the maximum heat flux density. An example of the application of this technique in the field of high-pressure die casting is demonstrated. The normalisation study, coupled with prior determination of the heat input duration, makes it possible to determine the optimum location for the sensors, along with an acceptable sampling rate and the thermocouples' critical response time (as well as any filter characteristics). Results from the gauge are used to assess the suitability of the initial design choices. In particular, the unavoidable response time of the thermocouples is estimated by comparison with the normalised simulation. (c) 2006 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

The dynamics of drop formation and pinch-off have been investigated for a series of low viscosity elastic fluids possessing similar shear viscosities, but differing substantially in elastic properties. On initial approach to the pinch region, the viscoelastic fluids all exhibit the same global necking behavior that is observed for a Newtonian fluid of equivalent shear viscosity. For these low viscosity dilute polymer solutions, inertial and capillary forces form the dominant balance in this potential flow regime, with the viscous force being negligible. The approach to the pinch point, which corresponds to the point of rupture for a Newtonian fluid, is extremely rapid in such solutions, with the sudden increase in curvature producing very large extension rates at this location. In this region the polymer molecules are significantly extended, causing a localized increase in the elastic stresses, which grow to balance the capillary pressure. This prevents the necked fluid from breaking off, as would occur in the equivalent Newtonian fluid. Instead, a cylindrical filament forms in which elastic stresses and capillary pressure balance, and the radius decreases exponentially with time. A (0+1)-dimensional finitely extensible nonlinear elastic dumbbell theory incorporating inertial, capillary, and elastic stresses is able to capture the basic features of the experimental observations. Before the critical "pinch time" t_p, an inertial-capillary balance leads to the expected 2/3-power scaling of the minimum radius with time: R_min ~ (t_p - t)^(2/3). However, the diverging deformation rate results in large molecular deformations and a rapid crossover to an elastocapillary balance for times t > t_p. In this region, the filament radius decreases exponentially with time, R_min ~ exp[(t_p - t)/lambda_1], where lambda_1 is the characteristic time constant of the polymer molecules.
Measurements of the relaxation times of polyethylene oxide solutions of varying concentrations and molecular weights obtained from high speed imaging of the rate of change of filament radius are significantly higher than the relaxation times estimated from Rouse-Zimm theory, even though the solutions are within the dilute concentration region as determined using intrinsic viscosity measurements. The effective relaxation times exhibit the expected scaling with molecular weight but with an additional dependence on the concentration of the polymer in solution. This is consistent with the expectation that the polymer molecules are in fact highly extended during the approach to the pinch region (i.e., prior to the elastocapillary filament thinning regime) and subsequently as the filament is formed they are further extended by filament stretching at a constant rate until full extension of the polymer coil is achieved. In this highly extended state, intermolecular interactions become significant, producing relaxation times far above theoretical predictions for dilute polymer solutions under equilibrium conditions. (C) 2006 American Institute of Physics
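The exponential thinning law gives a direct route from imaging data to a relaxation time: the slope of ln R_min against time is -1/lambda_1. A sketch with synthetic radius data follows; the relaxation time, initial radius, and noise level are assumed values, not measurements from this work:

```python
import numpy as np

# In the elastocapillary regime R_min(t) ~ exp[(t_p - t)/lambda_1], so
# for times t measured from t_p the radius decays as exp(-t/lambda_1)
# and lambda_1 follows from a linear fit of ln R against t.  The
# "measurements" below are synthetic stand-ins for high-speed imaging.
lambda_1 = 2.0e-3                        # assumed relaxation time, s
t = np.linspace(0.0, 8.0e-3, 40)         # times after the pinch time t_p
R0 = 1.0e-4                              # assumed radius at t_p, m
noise = 1.0 + 0.01 * np.random.default_rng(5).normal(size=t.size)
R = R0 * np.exp(-t / lambda_1) * noise   # noisy exponential thinning

slope, _ = np.polyfit(t, np.log(R), 1)   # ln R = ln R0 - t/lambda_1
lambda_est = -1.0 / slope
print(f"fitted relaxation time: {lambda_est * 1e3:.2f} ms")
```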