955 results for THRESHOLD
Abstract:
Objective: The purpose of this study was to investigate the effects of different manual techniques on cervical ranges of motion and pressure pain sensitivity in subjects with latent trigger point of the upper trapezius muscle. Methods: One hundred seventeen volunteers, with a unilateral latent trigger point on upper trapezius due to computer work, were randomly divided into 5 groups: ischemic compression (IC) group (n = 24); passive stretching group (n = 23); muscle energy technique group (n = 23); and 2 control groups, wait-and-see group (n = 25) and placebo group (n = 22). Cervical spine range of movement was measured using a cervical range of motion instrument, and pressure pain sensitivity by means of an algometer and a visual analog scale. Outcomes were assessed pretreatment, immediately and 24 hours after the intervention, and 1 week later by a blind researcher. A 4 × 5 mixed repeated-measures analysis of variance was used to examine the effects of the intervention, and the Cohen d coefficient was used to estimate effect sizes. Results: A group-by-time interaction was detected in all variables (P < .01), except contralateral rotation. The immediate effect sizes of the contralateral flexion, ipsilateral rotation, and pressure pain threshold were large for the 3 experimental groups. Nevertheless, after 24 hours and 1 week, only the IC group maintained the effect size. Conclusions: Manual techniques on upper trapezius with latent trigger point seemed to improve the cervical range of motion and the pressure pain sensitivity. These effects persisted after 1 week in the IC group. (J Manipulative Physiol Ther 2013;xx:1-10)
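The effect sizes reported above rely on the Cohen d coefficient. As a minimal illustration (not the paper's analysis code, and using fabricated toy data), Cohen's d for two independent samples can be computed from the pooled standard deviation:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d for two independent samples, using the pooled standard
    deviation. An illustrative helper, not the paper's analysis code."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Fabricated toy data standing in for pre/post pressure pain thresholds (kg/cm^2)
pre = [2.1, 2.3, 2.0, 2.4, 2.2]
post = [3.0, 3.2, 2.9, 3.3, 3.1]
print(round(cohens_d(post, pre), 2))  # → 5.69
```

By the usual convention, |d| ≥ 0.8 counts as a large effect, which is the sense in which the abstract reports large immediate effect sizes.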
Abstract:
This paper studies the effects of the diffusion of a General Purpose Technology (GPT) that spreads first within the developed North country of its origin, and then to a developing South country. In the general equilibrium growth model developed here, each final good can be produced by one of two technologies. Each technology is characterized by a specific labor input complemented by a specific set of intermediate goods, which are enhanced periodically by Schumpeterian R&D activities. When quality reaches a threshold level, a GPT arises in one of the technologies and spreads first to the other technology within the North. Then, it propagates to the South, following a similar sequence. Since diffusion is uneven, both within and between countries, the GPT produces successive changes in the direction of technological knowledge and in inter- and intra-country wage inequality. Through this mechanism, the different observed paths of wage inequality can be accommodated.
Abstract:
Final Master's project for obtaining the Master's degree in Chemical and Biological Engineering - Chemical Processes
Abstract:
Desertification is a critical issue for Mediterranean drylands. Climate change is expected to aggravate its extension and severity by reinforcing the biophysical driving forces behind desertification processes: hydrology, vegetation cover and soil erosion. The main objective of this thesis is to assess the vulnerability of Mediterranean watersheds to climate change, by estimating impacts on desertification drivers and the watersheds’ resilience to them. To achieve this objective, a modeling framework capable of analyzing the processes linking climate and the main drivers is developed. The framework couples different models adapted to different spatial and temporal scales. A new model for the event scale is developed, the MEFIDIS model, with a focus on the particular processes governing Mediterranean watersheds. Model results are compared with desertification thresholds to estimate resilience. This methodology is applied to two contrasting study areas: the Guadiana and the Tejo watersheds, which currently have a semi-arid and a humid climate, respectively.
The main conclusions taken from this work can be summarized as follows:
• hydrological processes show a high sensitivity to climate change, leading to a significant decrease in runoff and an increase in temporal variability;
• vegetation processes appear to be less sensitive, with negative impacts for agricultural species and forests, and positive impacts for Mediterranean species;
• changes to soil erosion processes appear to depend on the balance between changes to surface runoff and vegetation cover, itself governed by the relationship between changes to temperature and rainfall;
• as the magnitude of climate change increases, desertification thresholds are surpassed sequentially, starting with the watersheds’ ability to sustain current water demands, followed by the vegetation support capacity;
• the most important thresholds appear to be a temperature increase of +3.5 to +4.5 °C and a rainfall decrease of -10 to -20%;
• rainfall changes beyond this threshold could lead to severe water stress even if current water uses are moderated, with droughts occurring in 1 out of 4 years;
• temperature changes beyond this threshold could lead to a decrease in agricultural yield accompanied by an increase in soil erosion for croplands;
• combined changes of temperature and rainfall beyond the thresholds could shift both systems towards a more arid state, leading to severe water stress and significant changes to the support capacity for current agriculture and natural vegetation in both study areas.
Abstract:
While the earliest deadline first (EDF) algorithm is known to be optimal as a uniprocessor scheduling policy, its implementation comes at a cost in terms of complexity. Fixed task-priority algorithms, on the other hand, have lower complexity but a higher likelihood of task sets being declared unschedulable, when compared to EDF. Various attempts have been undertaken to increase the chances of proving a task set schedulable with similarly low complexity. In some cases, this was achieved by modifying applications to limit preemptions, at the cost of flexibility. In this work, we explore several variants of a concept to limit interference by locking down the ready queue at certain instants. The aim is to increase the prospects of schedulability of a given task system, without compromising on complexity or flexibility, when compared to the regular fixed task-priority algorithm. As a final contribution, a new preemption threshold assignment algorithm is provided which is less complex and more straightforward than the previous method available in the literature.
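For context, the fixed task-priority schedulability tests that this line of work builds on are typically based on response-time analysis. The sketch below implements the classic iterative response-time test (a standard textbook technique, not the paper's ready-queue-locking variants or its preemption threshold assignment algorithm); the task parameters are hypothetical:

```python
def rta_schedulable(tasks):
    """Classic response-time analysis for fixed task-priority preemptive
    scheduling: iterate R = C_i + sum over higher-priority tasks j of
    ceil(R / T_j) * C_j to a fixed point. tasks: (C, T, D) tuples with
    worst-case execution time C, period T, deadline D, highest priority
    first. Returns True iff every task meets its deadline."""
    for i, (ci, ti, di) in enumerate(tasks):
        r = ci
        while r <= di:
            # Interference from all higher-priority tasks in a window of length r
            r_next = ci + sum(-(-r // tj) * cj for cj, tj, _ in tasks[:i])
            if r_next == r:
                break  # fixed point reached: worst-case response time
            r = r_next
        if r > di:
            return False
    return True

print(rta_schedulable([(1, 4, 4), (1, 5, 5), (2, 10, 10)]))  # → True
print(rta_schedulable([(2, 4, 4), (2, 5, 5), (3, 10, 10)]))  # → False
```

The `-(-r // tj)` idiom is integer ceiling division, counting how many jobs of a higher-priority task can be released within the response window.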
Abstract:
Formaldehyde is used worldwide for diverse purposes, from industrial applications to health laboratory work, which reflects the economic importance of this chemical agent. Consequently, many people are exposed to formaldehyde environmentally and/or occupationally. Considering the latter, occupational exposure limits, limit values, and indoor guidelines have been recommended based on threshold mechanisms. Formaldehyde is classified by the International Agency for Research on Cancer (IARC) as carcinogenic to humans (group 1), since a wide range of epidemiological studies in occupational exposure settings have suggested possible links between the concentration and duration of exposure and elevated risks of nasopharyngeal cancer, other cancers and, more recently, leukemia. There are, however, different classifications: the U.S. EPA, for example, classified formaldehyde as a B1 compound, a probable human carcinogen under conditions of unusually high or prolonged exposure, on the basis of limited evidence in humans but sufficient evidence in animals. Formaldehyde genotoxicity is well known: it is a direct-acting genotoxic compound positively associated with almost all genetic endpoints evaluated in bacteria, yeast, fungi, plants, insects, nematodes, and cultured mammalian cells. Many human biomonitoring studies associate occupational formaldehyde exposure with genomic instability and, consequently, possible health effects. Besides the link with cancer, other pathologies and symptoms are associated with formaldehyde exposure, namely respiratory disorders such as asthma, and allergic contact dermatitis. Nowadays, there are efforts to reduce formaldehyde exposure, namely indoors: Europe and the United States have developed stricter regulation regarding formaldehyde emissions from materials containing this agent.
Despite the regulations and restrictions, formaldehyde remains difficult to eliminate or substitute, making biomonitoring an important tool to control possible future health effects.
Abstract:
Unstabilized rammed earth is a recyclable, economical, and eco-friendly building material, used in the past and still applied today. Traditionally, its use was based on long empirical knowledge of the local materials. Because this knowledge was mostly lost or is no longer sufficient, many countries have produced normative documents to allow the assessment of rammed earth soils. With the aim of contributing to the refinement of these normative requirements, this article presents a research work that included: (i) collection of unstabilized rammed earth samples from six constructions in Portugal; (ii) a literature survey of normative and complementary documents to identify the most mentioned key properties, the test procedures, and the corresponding threshold limits; and (iii) a discussion of the test procedures and of the threshold limits in the light of the experimental results. The analyzed properties are particle size distribution, maximum particle size, plasticity, compaction, linear shrinkage, organic content, and salt content. The work highlights the advantages of taking the characteristics of existing constructions into account as a basis for establishing and further refining consistent threshold values. In particular, it shows that it is essential to adjust the requirements to the specificities of local materials.
Abstract:
Waste oil recycling companies play a very important role in our society. Competition among companies is tough, and process optimization is essential for survival. By equipping oil containers with a level monitoring system that periodically reports the level and alerts when it reaches a preset threshold, oil recycling companies are able to streamline the oil collection process and thus reduce operating costs while maintaining the quality of service. This paper describes the development of this level monitoring system by a team of four students from different engineering backgrounds and nationalities. The team conducted a study of the state of the art, drew up marketing and sustainable development plans and, finally, designed and implemented a prototype that continuously measures the container content level and sends an alert message as soon as it reaches the preset capacity.
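The threshold-alert logic described above can be sketched as follows; the 80% default and the message format are hypothetical, since the abstract does not detail the prototype's sensor or messaging interfaces:

```python
def check_level(level_pct, threshold_pct=80.0):
    """Return an alert message once the measured fill level reaches the
    preset threshold, else None. A minimal sketch of the alerting logic;
    the threshold value and message format are hypothetical."""
    if level_pct >= threshold_pct:
        return f"ALERT: container at {level_pct:.0f}% (threshold {threshold_pct:.0f}%)"
    return None

print(check_level(85))  # → ALERT: container at 85% (threshold 80%)
print(check_level(40))  # → None
```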
Abstract:
Master's degree in Occupational Safety and Hygiene
Abstract:
The morphological and structural modifications induced in sapphire by surface treatment with femtosecond laser radiation were studied. Single-crystal sapphire wafers cut parallel to the (0 1 2) planes were treated with 560 fs, 1030 nm wavelength laser radiation using wide ranges of pulse energy and repetition rate. Self-ordered periodic structures with an average spatial periodicity of approximately 300 nm were observed for fluences slightly higher than the ablation threshold. For higher fluences the interaction was more disruptive, and extensive fracture, exfoliation, and ejection of ablation debris occurred. Four types of particles were found in the ablation debris: (a) spherical nanoparticles about 50 nm in diameter; (b) composite particles between 150 and 400 nm in size; (c) rounded resolidified particles about 100-500 nm in size; and (d) angular particles presenting a lamellar structure and deformation twins. The study of these particles by selected area electron diffraction showed that the spherical nanoparticles and the composite particles are amorphous, while the resolidified droplets and the angular particles present a crystalline α-alumina structure, the same as the original material. Taking into consideration the existing ablation theories, it is proposed that the spherical nanoparticles are directly emitted from the surface in the ablation plume, while resolidified droplets are emitted in the liquid phase as a result of the ablation process in the low intensity regime, and by exfoliation in the high intensity regime. Nanoparticle clusters are formed by nanoparticle coalescence in the cooling ablation plume.
Abstract:
We propose a blind method to detect interference in GNSS signals whereby the algorithms do not require knowledge of the interference or channel noise features. A sample covariance matrix is constructed from the received signal and its eigenvalues are computed. The generalized likelihood ratio test (GLRT) and the condition number test (CNT) are developed and compared in the detection of sinusoidal and chirp jamming signals. A computationally efficient decision threshold is proposed for the CNT.
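A minimal sketch of the condition number test idea, here for a hypothetical 2-channel receiver with real-valued samples (the paper's GLRT and its derived decision threshold are not reproduced):

```python
import math, random

def condition_number_test(x1, x2, threshold):
    """Condition number test (CNT) sketch: declare interference present
    when the ratio of largest to smallest eigenvalue of the 2x2 sample
    covariance matrix exceeds the decision threshold. The 2-channel
    real-valued setup and the threshold value are illustrative
    assumptions, not the paper's formulation."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    c11 = sum((a - m1) ** 2 for a in x1) / n
    c22 = sum((b - m2) ** 2 for b in x2) / n
    c12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / n
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix
    tr, det = c11 + c22, c11 * c22 - c12 ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return (tr / 2 + disc) / (tr / 2 - disc) > threshold

random.seed(1)
noise1 = [random.gauss(0, 1) for _ in range(2000)]
noise2 = [random.gauss(0, 1) for _ in range(2000)]
tone = [5 * math.sin(0.3 * k) for k in range(2000)]  # sinusoidal jammer
jam1 = [n + s for n, s in zip(noise1, tone)]
jam2 = [n + s for n, s in zip(noise2, tone)]
print(condition_number_test(noise1, noise2, 2.0))  # noise only → False
print(condition_number_test(jam1, jam2, 2.0))      # jammer present → True
```

With noise alone the covariance matrix is close to a scaled identity, so its eigenvalues are nearly equal; a strong jammer concentrates power in one eigenvector and inflates the condition number, which is what the test exploits.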
Abstract:
The phase diagram of a simple model with two patches of type A and ten patches of type B (2A10B) on the face-centred cubic lattice has been calculated by simulations and theory. Assuming that there is no interaction between the B patches, the behavior of the system can be described in terms of the ratio of the AB and AA interactions, r. Our results show that, similarly to what happens for related off-lattice and two-dimensional lattice models, the liquid-vapor phase equilibria exhibit reentrant behavior for some values of the interaction parameters. However, for the model studied here the liquid-vapor phase equilibria occur for values of r lower than 1/3, a threshold value which was previously thought to be universal for 2AnB models. In addition, the theory predicts that below r = 1/3 (and above a new condensation threshold which is < 1/3) the reentrant liquid-vapor equilibria are so extreme that they exhibit a closed loop with a lower critical point, a very unusual behavior in single-component systems. An order-disorder transition is also observed at higher densities than the liquid-vapor equilibria, which shows that the liquid-vapor reentrancy occurs in an equilibrium region of the phase diagram. These findings may have implications for the understanding of the condensation of dipolar hard spheres, given the analogy between that system and the 2AnB models considered here.
Abstract:
The dynamics of catalytic networks have been widely studied over the last decades because of their implications in several fields, such as prebiotic evolution, virology, neural networks, immunology, or ecology. One of the most studied mathematical frameworks for catalytic networks was initially formulated in the context of prebiotic evolution, by means of the hypercycle theory. The hypercycle is a set of self-replicating species able to catalyze other replicator species within a cyclic architecture. Hypercyclic organization might arise from a quasispecies as a way to increase the information content beyond the so-called error threshold. The catalytic coupling between replicators makes all the species behave like a single and coherent evolutionary multimolecular unit. The inherent nonlinearities of catalytic interactions are responsible for the emergence of several types of dynamics, among them chaos. In this article we begin with a brief review of the hypercycle theory, focusing on its evolutionary implications as well as on the different dynamics associated with different types of small catalytic networks. Then we study the properties of chaotic hypercycles with error-prone replication using symbolic dynamics theory, characterizing, by means of the theory of topological Markov chains, the topological entropy and the periods of the orbits of unimodal-like iterated maps obtained from the strange attractor. We focus our study on some key parameters responsible for the structure of the catalytic network: mutation rates, autocatalytic and cross-catalytic interactions.
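For a topological Markov chain, the topological entropy mentioned above is the logarithm of the spectral radius of the 0/1 transition matrix. A small illustrative sketch (not the paper's symbolic-dynamics pipeline) estimates it by power iteration on two classic examples:

```python
import math

def topological_entropy(A, iters=200):
    """Topological entropy of a topological Markov chain: log of the
    spectral radius of the 0/1 transition matrix A, estimated by power
    iteration. Illustrative only; the paper derives its transition
    matrices from unimodal-like maps of the strange attractor."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)          # growth factor per iteration
        v = [x / lam for x in w]
    return math.log(lam)

# Full shift on two symbols: entropy log 2
full_shift = [[1, 1], [1, 1]]
# Golden-mean shift (no two consecutive 1s): entropy log((1 + sqrt 5)/2)
golden = [[1, 1], [1, 0]]
print(round(topological_entropy(full_shift), 4))  # → 0.6931
print(round(topological_entropy(golden), 4))      # → 0.4812
```

A positive topological entropy of this kind is the standard signature of chaotic symbolic dynamics, which is what the article characterizes for the hypercycle attractor.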
Abstract:
The effect of monopolar and bipolar shaped pulses on the additional yield of apple juice extraction is evaluated. The applied electric field strength, pulse width, and number of pulses are assessed for both pulse types, and divergences are analyzed. Electric field strength is varied from 100 to 1300 V/cm, pulse width from 20 to 300 µs, and the number of pulses from 10 to 200, at a frequency of 200 Hz. Two pulse trains separated by 1 s are applied to apple cubes. Results are plotted against reference untreated samples for all assays. Specific energy consumption is calculated for each experiment, as well as the qualitative apple juice indicators of total soluble dry matter and absorbance at 390 nm wavelength. Bipolar pulses demonstrated higher efficiency, and specific energy consumption has a threshold beyond which higher inputs of energy do not result in higher juice extraction when the electric field is varied. Total soluble dry matter and absorbance results do not show significant differences between the application of monopolar and bipolar pulses, but all values are inside the limits proposed for apple juice intended for human consumption.
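As an illustration of the specific energy consumption calculation, the sketch below assumes rectangular pulses with constant voltage and current (a hypothetical helper with made-up numbers; the paper's exact formula and instrument readings are not given):

```python
def specific_energy_kj_per_kg(field_v_per_cm, gap_cm, current_a,
                              pulse_width_s, n_pulses, mass_kg):
    """Specific energy input of a pulsed electric field treatment,
    computed as n * U * I * tau / m. Hypothetical helper assuming
    rectangular pulses with constant voltage and current."""
    voltage = field_v_per_cm * gap_cm          # applied voltage, V
    energy_j = n_pulses * voltage * current_a * pulse_width_s
    return energy_j / mass_kg / 1000.0         # kJ per kg of product

# e.g. 1000 V/cm across a 2 cm gap, 10 A, 100 µs pulses, 100 pulses, 0.1 kg
print(round(specific_energy_kj_per_kg(1000, 2, 10, 100e-6, 100, 0.1), 3))  # → 2.0
```

Plotting such values against juice yield is the kind of comparison behind the threshold observation: past a certain specific energy, additional input no longer increases extraction.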
Abstract:
Additional apple juice extraction with pulsed electric field (PEF) pretreated apple cubes, compared with control samples, is evaluated. Monopolar and bipolar shaped pulses are compared, and their effect is studied under variation of the electric field, pulse width, and number of pulses. Electric field strength is varied from 100 V/cm to 1300 V/cm, pulse width from 20 µs to 300 µs, and the number of pulses from 10 to 200, at a frequency of 200 Hz. Two pulse trains separated by 1 s are applied to all samples. Bipolar pulses showed higher apple juice yields for all studied parameters. The specific energies consumed were calculated, and a threshold where higher energy inputs do not increase juice yield is found for a number of the parameters used. The qualitative parameters of total soluble matter (Brix) and absorbance at 390 nm wavelength were determined for each sample, and the results show no substantial differences between PEF pre-treated and control samples.