939 results for Time analysis
Abstract:
A study of the chemical transformations of cork during heat treatment was made using colour variation and FTIR analysis. The cork-enriched fractions from Quercus cerris bark were subjected to isothermal heating in the temperature range 150–400 °C for treatment times from 5 to 90 min. Mass loss ranged from 3% (90 min at 150 °C) to 71% (60 min at 350 °C). FTIR showed that hemicelluloses were thermally degraded first, while suberin remained the most heat-resistant component. The change in CIE-Lab parameters was rapid for low-intensity treatments where no significant mass loss occurred (at 150 °C, L* decreased from the initial 51.5 to 37.3 after 20 min). The decrease in all colour parameters continued with temperature until they remained substantially constant above 40% mass loss. The thermally induced mass loss could be modelled using colour analysis, which is applicable to monitoring the production of heat-expanded insulation agglomerates.
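The colour-based monitoring described above rests on distances in CIE-Lab space. As a minimal sketch, the CIE76 colour difference ΔE between two Lab triplets is the Euclidean distance between them; the a* and b* values below are hypothetical placeholders, since the abstract reports only L*:

```python
import math

def delta_e_cie76(lab1, lab2):
    # CIE76 colour difference: Euclidean distance in CIE-Lab space.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# L* values from the abstract (51.5 -> 37.3 at 150 degC after 20 min);
# a* and b* are hypothetical placeholders set to zero, so the difference
# here is driven entirely by the L* change.
de = delta_e_cie76((51.5, 0.0, 0.0), (37.3, 0.0, 0.0))
```

A calibration curve of ΔE against measured mass loss would then let colour readings stand in for gravimetric measurements during production monitoring.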
Abstract:
An improved class of Boussinesq systems of arbitrary order, using a wave-surface-elevation and velocity-potential formulation, is derived. Dissipative effects and wave generation due to a time-dependent varying seabed are included; thus, high-order source functions are considered. To reduce the system order while maintaining some dispersive characteristics of the higher-order models, an extra O(μ^(2n+2)) term (n ∈ ℕ) is included in the velocity potential expansion. We introduce a nonlocal continuous/discontinuous Galerkin FEM with inner penalty terms to compute the numerical solutions of the improved fourth-order models. The spatial variables are discretised using continuous P2 Lagrange elements. A predictor-corrector scheme initialised by an explicit Runge-Kutta method is used for the time integration. Moreover, a CFL-type condition is deduced for the linear problem with constant bathymetry. To demonstrate the applicability of the model, several test cases are considered. Improved stability is achieved.
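A CFL-type condition of the kind mentioned above ties the admissible time step to the grid spacing and the wave celerity. The sketch below shows only the generic shallow-water form dt ≤ C·dx/√(g·h) under the constant-bathymetry assumption; the article derives its own, model-specific condition, so this is an illustration of the concept rather than that result:

```python
import math

def cfl_dt(dx, depth, g=9.81, courant=0.5):
    # Generic CFL-type time-step bound over constant bathymetry:
    # dt <= courant * dx / c, with linear long-wave celerity c = sqrt(g * depth).
    return courant * dx / math.sqrt(g * depth)

# Example: a 1 m grid over 10 m of water at Courant number 0.5.
dt = cfl_dt(1.0, 10.0)
```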
Abstract:
Dissertation presented as a partial requirement for obtaining the degree of Master in Statistics and Information Management
Abstract:
In this paper, a modified version of the classical Van der Pol oscillator is proposed, introducing fractional-order time derivatives into the state-space model. The resulting fractional-order Van der Pol oscillator is analyzed in the time and frequency domains using phase portraits, spectral analysis and bifurcation diagrams. The fractional-order dynamics are illustrated through numerical simulations of the proposed schemes using approximations to fractional-order operators. Finally, the analysis is extended to the forced Van der Pol oscillator.
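One common approximation to fractional-order operators (the abstract does not say which one the authors use) is the Grünwald-Letnikov scheme. The sketch below is a hypothetical illustration rather than the paper's method: it builds the GL coefficients by the standard recurrence and uses them in an explicit step of a fractional-order Van der Pol system; for α = 1 it reduces exactly to forward Euler.

```python
def gl_coeffs(alpha, n):
    # Grunwald-Letnikov coefficients (-1)^k * binom(alpha, k),
    # via the recurrence c_k = c_{k-1} * (1 - (alpha + 1) / k).
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def simulate_fvdp(alpha=0.9, mu=1.0, h=0.01, steps=500, x0=0.1, y0=0.0):
    # Explicit GL discretisation of the fractional Van der Pol system
    #   D^alpha x = y,  D^alpha y = mu * (1 - x^2) * y - x.
    # The full history is kept (no short-memory truncation) for clarity.
    c = gl_coeffs(alpha, steps + 1)
    xs, ys = [x0], [y0]
    for n in range(1, steps + 1):
        fx = ys[-1]
        fy = mu * (1.0 - xs[-1] ** 2) * ys[-1] - xs[-1]
        hx = sum(c[k] * xs[n - k] for k in range(1, n + 1))
        hy = sum(c[k] * ys[n - k] for k in range(1, n + 1))
        xs.append(h ** alpha * fx - hx)
        ys.append(h ** alpha * fy - hy)
    return xs, ys
```

For α = 1 the coefficients collapse to (1, -1, 0, 0, …), so the history sum reduces to the previous state and each update becomes a plain Euler step, which is a convenient sanity check on the implementation.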
Abstract:
Arquivos de Medicina 1998; 12(4): 246-248
Abstract:
Work presented within the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Engineering Physics
Abstract:
Applied Physics B: Lasers and Optics, vol. 71
Abstract:
The computations performed by the brain ultimately rely on the functional connectivity between neurons embedded in complex networks. It is well known that the neuronal connections, the synapses, are plastic, i.e. the contribution of each presynaptic neuron to the firing of a postsynaptic neuron can be independently adjusted. The modulation of effective synaptic strength can occur on time scales ranging from tens or hundreds of milliseconds, to tens of minutes or hours, to days, and may involve pre- and/or postsynaptic modifications. These mechanisms are generally believed to underlie learning and memory; hence, it is fundamental to understand their consequences for the behavior of neurons. (...)
Abstract:
In the current context of serious climate change, where the increased frequency of some extreme events can raise the number of periods prone to high-intensity forest fires, the National Forest Authority often implements, in several Portuguese forest areas, a regular set of measures to control the amount of available fuel mass (PNDFCI, 2008). In the present work we present a preliminary analysis of the consequences of prescribed fire measures, used to control fuel mass, for soil recovery, in particular in terms of water retention capacity, organic matter content, pH and iron content. This work is part of a larger study (Meira-Castro, 2009(a); Meira-Castro, 2009(b)). Given the established practice of data collection, embodied in multidimensional matrices of n columns (variables under analysis) by p rows (areas sampled at different depths), and the quantitative nature of the data in this study, we chose a methodological approach based on multivariate statistical analysis, in particular Principal Component Analysis (PCA) (Góis, 2004). The experiments were carried out on a soil cover over a natural site of andalusitic schist in Gramelas, Caminha, NW Portugal, which had remained untouched by prescribed burning for four years and was subjected to prescribed fire in March 2008. The soil samples were collected from five different plots at six different time periods. The methodological option adopted allowed us to identify the most relevant relational structures within the n variables, within the p samples, and in both sets at the same time (Garcia-Pereira, 1990). Consequently, in addition to the traditional outputs produced by PCA, we analysed the influence of both sampling depth and geomorphological environment on the behavior of all variables involved.
Abstract:
Measurements in civil engineering load tests usually require considerable time and complex procedures. Measurements are therefore usually constrained by the number of sensors, resulting in a restricted monitored area. Image processing analysis is an alternative that enables measurement of the complete area of interest with a simple and effective setup. In this article, photo sequences taken during load-displacement tests were captured by a digital camera and processed with image correlation algorithms. Three different image processing algorithms were applied to real images taken from tests on PVC and Plexiglas specimens. The data obtained from the image processing algorithms were also compared with the data from physical sensors. Complete displacement and strain maps were obtained. Results show that the accuracy of the measurements obtained by photogrammetry is equivalent to that of the physical sensors, but with much less equipment and fewer setup requirements. © 2015 Computer-Aided Civil and Infrastructure Engineering.
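The principle behind image-correlation displacement measurement can be sketched with a brute-force, integer-pixel search: slide a reference patch over the deformed image and keep the shift with the smallest mismatch. The article's algorithms, and real DIC codes, add subpixel interpolation and smoothing; the function below is only a hypothetical illustration of the core idea.

```python
def best_shift(ref, cur, max_shift=2):
    # Integer-pixel image correlation: find the (dy, dx) shift minimising
    # the mean squared difference between a reference patch `ref` and the
    # current image `cur` (both lists of lists of grey values).
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, count = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < len(cur) and 0 <= xx < len(cur[0]):
                        err += (ref[y][x] - cur[yy][xx]) ** 2
                        count += 1
            if count and err / count < best_err:
                best_err, best = err / count, (dy, dx)
    return best
```

Repeating this search for a grid of patches yields the displacement map; differentiating that map gives the strain field.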
Abstract:
This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) obtained from the analysis of water samples collected monthly at seven monitoring sites over five campaigns (February, May, June, August, and September 2006). The results for some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to the classification of Leça River water quality as “bad” or “very bad”. PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all the sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful for optimizing water quality monitoring networks, defining the minimum number of sites and their locations. Thus, these tools can support appropriate management decisions.
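For just two variables, the PCA used in studies like the ones above can be written in closed form: centre the data, build the 2x2 covariance matrix, and read off the angle of the first principal axis and its explained-variance share. The sketch below is a generic illustration, not tied to either study's dataset:

```python
import math

def first_pc_2d(xs, ys):
    # Closed-form PCA for two variables: sample covariance matrix
    # [[a, b], [b, c]], first-axis angle theta = atan2(2b, a - c) / 2,
    # and the fraction of variance explained by the first component.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) ** 2 for x in xs) / (n - 1)
    c = sum((y - my) ** 2 for y in ys) / (n - 1)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    theta = 0.5 * math.atan2(2.0 * b, a - c)
    m, d = (a + c) / 2.0, math.hypot((a - c) / 2.0, b)
    lam1, lam2 = m + d, m - d  # eigenvalues, lam1 >= lam2
    return theta, lam1 / (lam1 + lam2)
```

With more than two variables (as in the monitoring matrices above) the same idea applies, but the eigen-decomposition is done numerically rather than analytically.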
Abstract:
This study analysed 22 strawberry and soil samples, collected over the course of 2 years, to compare the residue profiles from organic farming with those from integrated pest management practices in Portugal. For sample preparation, we used the citrate-buffered version of the quick, easy, cheap, effective, rugged, and safe (QuEChERS) method. We applied three different analytical methods: (1) 27 pesticides were targeted using LC-MS/MS; (2) 143 were targeted using low-pressure GC-tandem mass spectrometry (LP-GC-MS/MS); and (3) more than 600 pesticides were screened in a targeted and untargeted approach using comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC × GC-TOF-MS). The analyses of the shared samples by the different methods were compared; the results were similar, providing satisfactory confirmation of both positive and negative findings. No pesticides were found in the organically farmed samples. In samples from integrated pest management practices, nine pesticides were determined and confirmed to be present, ranging from 2 μg kg−1 for fluazifop-P-butyl to 50 μg kg−1 for fenpropathrin. Concentrations of residues in strawberries were below European maximum residue limits.
Abstract:
A retrospective survey of 473 cases of snake bite admitted to a Brazilian teaching hospital from 1984 to 1990 revealed 91 cases of bite without envenoming and/or caused by non-venomous snakes. In 17 of these cases the snake was identified, and one patient was bitten by a snake-like reptile (Amphisbaena mertensii). In 43 cases the diagnosis was made on clinical grounds (fang marks in the absence of signs of envenoming). The other 30 cases were patients who complained of being bitten but showed no sign of envenoming or fang mark. Most cases occurred in men (66; 73%), in the 10-19 years age group (26; 29%), in the lower limbs (51/74; 69%), between 6 a.m. and 2 p.m. (49; 61%), and in the month of April (16; 18%). One patient bitten by Philodryas olfersii developed severe local pain, swelling and redness at the site of the bite, with normal clotting time. The patient bitten by Drymarchon corais was misdiagnosed as having been bitten by a snake of the genus Bothrops, was given the specific antivenom, and developed anaphylaxis. One patient bitten by Sibynomorphus mikanii presented prolonged clotting time and was also given antivenom as a case of Bothrops bite. Correct identification of venomous snakes by physicians is necessary to provide correct treatment to victims of snake bite, avoiding unnecessary distress to the patient and overprescription of antivenom, which may eventually cause severe untoward effects.
Abstract:
Hard real-time multiprocessor scheduling has seen, in recent years, the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system’s processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based “task-splitting” scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thus also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms in consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
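The idea of letting a schedulability test guide task assignment can be illustrated with a deliberately simple sketch: first-fit bin packing in which a task is accepted on a processor only while a naive utilisation-based test still passes. This is a hypothetical illustration of test-guided assignment, not the S-EKG/NPS-F splitting rules or the exact tests developed in the article:

```python
def first_fit_assign(utilizations, num_cpus, cap=1.0):
    # First-fit task assignment guided by a simple utilisation-based
    # schedulability test: a task fits on a CPU if the CPU's total
    # utilisation stays at or below `cap`. Returns the CPU index chosen
    # for each task, or None if some task fits nowhere.
    bins = [0.0] * num_cpus
    assignment = []
    for u in utilizations:
        for cpu in range(num_cpus):
            if bins[cpu] + u <= cap + 1e-12:  # the "schedulability test"
                bins[cpu] += u
                assignment.append(cpu)
                break
        else:
            return None  # deemed unschedulable by this naive test
    return assignment
```

In the semi-partitioned schemes discussed above, a task that fails such a test is not simply rejected: it is split across processors into time slots, which is precisely where the more sophisticated (and overhead-aware) analysis becomes necessary.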