933 results for: Multifactorial scale of locus of control
Abstract:
Chagas disease (CD) is a parasitic infection that originated in the Americas and is caused by Trypanosoma cruzi. In recent years, the disease has spread to countries in North America, Asia and Europe through the migration of Latin Americans. In the Brazilian Amazon, CD transmission is endemic, especially in the Rio Negro region, where an occupational hazard has been described for piaçaveiros (piassaba gatherers). In the State of Amazonas, the first chagasic infection was reported in 1977, and the first acute CD case was recorded in 1980. After initiatives to integrate acute CD diagnostics with the malaria laboratory network, reports of acute CD cases have increased. Most of these cases are associated with oral transmission through the consumption of contaminated food. Chronic cases have also been diagnosed, mostly in the indeterminate form; these were detected by serological surveys in cardiology outpatient clinics and during blood donor screening. Considering that the control mechanisms adopted in Brazil's classic transmission areas are not fully applicable in the Amazon, it is important to understand the behavior of the disease in this region, in both acute and chronic cases. The pursuit of control measures for the Amazon region should therefore be a priority, given that CD represents a challenge to preserving the way of life of the Amazon's inhabitants.
Abstract:
OBJECTIVES: To evaluate the use of inhaled nitric oxide (NO) in the management of persistent pulmonary hypertension of the newborn. METHODS: Computerized bibliographic search of MEDLINE, CURRENT CONTENTS and LILACS covering the period from January 1990 to March 1998, plus review of the references of all papers found on the subject. Only randomized clinical trials comparing nitric oxide with conventional treatment were included. OUTCOMES STUDIED: death, requirement for extracorporeal membrane oxygenation (ECMO), systemic oxygenation, central nervous system complications, and development of chronic pulmonary disease. The methodological quality of the studies was evaluated with a 13-point quality score system. RESULTS: For infants without congenital diaphragmatic hernia, inhaled NO did not change mortality (typical odds ratio: 1.04; 95% CI: 0.6 to 1.8); the need for ECMO was reduced (relative risk: 0.73; 95% CI: 0.60 to 0.90), and oxygenation was improved (PaO2 by a mean of 53.3 mm Hg; 95% CI: 44.8 to 61.4; oxygenation index by a mean of -12.2; 95% CI: -14.1 to -9.9). For infants with congenital diaphragmatic hernia, mortality, requirement for ECMO, and oxygenation were not changed. For all infants, central nervous system complications and the incidence of chronic pulmonary disease did not change. CONCLUSIONS: Inhaled NO improves oxygenation and reduces the requirement for ECMO only in newborns with persistent pulmonary hypertension who do not have diaphragmatic hernia. The risks of central nervous system complications and chronic pulmonary disease were not affected by inhaled NO.
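For readers unfamiliar with the effect measures quoted above, the sketch below shows how an odds ratio with a Woolf-type (log-normal) 95% confidence interval is computed from a 2x2 table. The counts are invented for illustration only; they are not taken from the review.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf (log-normal) 95% CI.
    a = treated events, b = treated non-events,
    c = control events, d = control non-events."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts only (not from the review): 30 deaths among 120
# NO-treated infants versus 29 deaths among 118 control infants.
or_, lo, hi = odds_ratio_ci(30, 90, 29, 89)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A confidence interval that straddles 1, as here, corresponds to the review's finding of no mortality difference.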
Abstract:
If widespread deforestation in the Amazon results in a reduced evaporative water flux, then either the decrease in evaporation is compensated locally by reduced rainfall, or the changed moisture balance expresses itself downwind in the as-yet undisturbed forest. The question of where rain will occur is crucial. It is suggested that the appearance of clouds and the occurrence of rainout are governed primarily by the interplay of local meteorological and physical-geography parameters with the atmospheric stability structure, except for a few well-defined periods when rain is dominated by large-scale atmospheric instability. This means that the study of these phenomena (local heat balances, cloud formation mechanisms, vertical atmospheric stability, etc.) must be made at the scale of the cloud size, a few tens of kilometers at most.
Abstract:
Purpose – The purpose of this paper is to explore the impact of corporate volunteering on employee bonding and to understand the barriers to, and motivations for, participation in these events. In contrast to other studies, the participants volunteer in their spare time without expecting any financial reward. Design/methodology/approach – Employees (n = 3,951) of a logistics company participated in the study, based on an online questionnaire with six items and open questions. The employee sample was divided into three groups depending on the frequency of participation in volunteering events. Findings – Significant differences in bonding were found between the three groups. In addition, the relevance of control variables such as gender, age and job level was established. Furthermore, a moderating effect of motivation was found. The results were interpreted within the broader context that ties together motivation theory, organizational identification and social exchange theory.
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. The available data therefore provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests in order to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each type of test, there is still a need for a global methodology that combines information from different sources and allows inference upon that information in a decision process. In this scope, the present work presents the implementation of a probabilistic framework for the safety assessment of existing timber elements. The methodology combines information gathered at different scales and allows the structural assessment of existing timber elements, with the possibility of inference and updating of their mechanical properties through Bayesian methods. The framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
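The Bayesian updating step at the heart of such a framework can be illustrated with a minimal conjugate normal-normal sketch: a prior belief about a mechanical property (e.g. from NDT-based correlations) is combined with a few direct measurements. The prior, measurements and assumed scatter below are hypothetical, not values from the case study.

```python
import math

def normal_update(prior_mu, prior_sd, data, noise_sd):
    """Conjugate normal-normal Bayesian update of a material property.
    prior_mu/prior_sd: prior belief (e.g. from NDT correlations);
    data: direct measurements; noise_sd: assumed measurement scatter."""
    n = len(data)
    prior_prec = 1 / prior_sd ** 2            # precision = 1 / variance
    data_prec = n / noise_sd ** 2
    post_prec = prior_prec + data_prec        # precisions add
    post_mu = (prior_prec * prior_mu
               + data_prec * (sum(data) / n)) / post_prec
    return post_mu, math.sqrt(1 / post_prec)

# Hypothetical values: an NDT correlation suggests a bending strength of
# 40 MPa +/- 8; three destructive tests on sister beams gave 32, 35 and
# 30 MPa, with an assumed test scatter of about 5 MPa.
mu, sd = normal_update(40.0, 8.0, [32.0, 35.0, 30.0], 5.0)
print(f"posterior mean {mu:.1f} MPa, sd {sd:.1f} MPa")
```

The posterior mean is pulled toward the test data and its standard deviation shrinks, which is exactly the inference-and-updating behaviour the abstract describes.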
Abstract:
The main feature of the so-called multiproblem families is the persistence over time of a set of problems, in various areas of individual functioning, across several family members. This research study aims: a) to identify and characterise the major health problems faced by the members of these families; b) to explore the perceived relevance of these problems; c) to explore the perceived effectiveness of the health care interventions received by respondents; and d) to explore the level of control perceived over these problems.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons, employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons, using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
Purpose: To study the relationship among end-of-day (EOD) dryness intensity, corneal sensitivity and blink rate in soft contact lens (CL) wearers. Methods: Thirty-eight soft CL wearers (25 women and 13 men; mean age 27.1 ± 7.2 years) were enrolled. EOD dryness was assessed using a scale of 0–5 (0, none, to 5, very intense). Mechanical and thermal (heat and cold) sensitivity were measured using a Belmonte's gas esthesiometer. The blink rate was recorded with a video camera while subjects wore a hydrogel CL and watched a film for 90 min in a controlled environmental chamber. Results: A significant inverse correlation was found between EOD dryness and mechanical sensitivity (r = −0.39; p = 0.02); however, there were no significant correlations between EOD dryness and thermal sensitivity. A significant correlation (r = 0.56; p < 0.001) was also observed between EOD dryness and blink rate, but no correlations were found between blink rate and mechanical or thermal sensitivity. Conclusions: CL wearers with higher corneal sensitivity to mechanical stimulation reported more EOD dryness with habitual CL wear. Moreover, subjects reporting more EOD dryness had an increased blink rate during wear of a standard CL type. The increased blink rate could act to improve the ocular surface environment and relieve symptoms.
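The r values reported above are plain Pearson correlation coefficients. A minimal sketch of the computation is shown below on invented dryness/blink-rate pairs, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical data: higher EOD dryness scores (0-5 scale) paired with
# higher blink rates (blinks per minute).
dryness = [0, 1, 1, 2, 3, 3, 4, 5]
blinks = [8, 10, 9, 14, 15, 13, 18, 21]
print(f"r = {pearson_r(dryness, blinks):.2f}")
```

A positive r, as in this toy example, mirrors the study's finding that subjects reporting more dryness also blinked more often.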
Abstract:
Within a research project on «academic excellence in the state school», this paper is a contribution to the sociological reflection on the cultural and organisational characteristics of the school and its relationship with the academic success of students. The data we present stem from a case study underway at a secondary school in the north of Portugal, referring to the universe of students that since 2003 have distinguished themselves for achieving grades equal to or greater than 18 (on a scale of 0 to 20) and have thus been included in the school’s Framework of Excellence. From a contextual approach to this educational practice, we focused on the cultural characteristics of the school/subject as analytical support for the study of school and non-school dimensions in their mutual connections. To this end, we used the information from document analysis and data collected from a questionnaire survey administered to more than two-thirds of the students included in the above-mentioned Framework of Excellence. Subsequently, we will use the data from this survey to understand the extent to which academic excellence is perceived as an indivisible social construction of the school’s political and organisational matrix, particularly in terms of the educational and teaching guidelines adopted by the management body. We will conclude by questioning the meaning of the school’s management policies regarding the emphasis on educational outcomes, with particular focus on the representations of excellent students in the processes of school leadership, teaching organisation, school merit and justice.
Abstract:
The clinical imaging methods currently available do not provide highly detailed information about the location and severity of axonal injury, or about the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality, based on diffusion technology [2], that allows the degree of axon damage to be visualized and quantified directly, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. The lack of a phantom able to properly mimic the human brain hinders the testing, calibration and validation of these medical imaging techniques. Most research in this area fails on key points, such as reproducing brain fibers at their true size and enabling quick, easy reproduction of phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account in materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is part of a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between axons and the thousands of hollow multifilaments in a fibrous arrangement such as a yarn, low-twist polypropylene multifilament yarns were selected for this development.
In this sense, the extruded hollow filaments were analysed in a scanning electron microscope to characterize their main dimensions and shape. In order to approximate the dimensional scale of human axons, five types of polypropylene yarns with different linear densities (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the cross-section of the polypropylene filaments was reduced in the drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density increases the size of the filament cross-section. With the increase in the structural orientation of the filaments induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps in creating a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
Abstract:
OBJECTIVE: The aim of this study was to investigate the Ile349Val polymorphism of the alcohol dehydrogenase gene ADH1C among individuals with alcohol dependence syndrome (ADS) attending Alcoholics Anonymous (AA) meetings. METHODS: A total of 120 subjects residing in the city of Rio de Janeiro participated in this study. Subjects were divided into two groups: 54 individuals with ADS and 66 individuals who declared not having any alcohol dependence (control group). DNA was extracted from mouth epithelial cells by the phenol-chloroform method and then submitted to amplification by polymerase chain reaction (PCR). RESULTS: Our results did not show differences between the genotypes of control individuals and ADS subjects. Nevertheless, we found increased rates of alcoholism in the families of ADS subjects as compared to controls. CONCLUSIONS: Our results did not show any genotype difference in the ADH1C gene when control and AA genotypes are compared.
Abstract:
OBJECTIVE: To report a case and discuss the use of psychodynamic psychotherapy (PD-P) to treat individuals at ultra-high risk (UHR) of psychosis. METHODS: An individual at UHR was followed up for 24 months. The baseline evaluation included a psychiatric interview, the Structured Interview for Prodromal Symptoms (SIPS), the Scale of Prodromal Symptoms (SOPS), and a neuropsychological assessment. He underwent weekly sessions of PD-P for 12 months and was followed up for a further 12 months after the end of PD-P. Evaluations were performed at baseline and at the 6-, 12-, and 24-month follow-ups. No medication was prescribed during the 24-month follow-up. RESULTS: The prodromal symptoms remitted. The initial total score on the SIPS/SOPS was 37 points. After the first 12 months of PD-P, the SIPS/SOPS score fell to 12 points and remained stable at the 24-month follow-up. There was also a slight improvement in performance on the neuropsychological evaluations. CONCLUSION: This case report suggests that PD-P can reduce prodromal symptoms; nevertheless, a better understanding of the specificity and efficacy of PD-P as a treatment option for UHR individuals is needed.
Abstract:
Objective: To test the potential mediating effect of psychosomatic symptoms on the relationship between parents' history of childhood physical victimization and their current risk for child physical maltreatment. Methods: Data from the Portuguese National Representative Study of the Psychosocial Context of Child Abuse and Neglect were used. Nine hundred and twenty-four parents completed the Childhood History Questionnaire, the Psychosomatic Scale of the Brief Symptom Inventory, and the Child Abuse Potential Inventory. Results: Mediation analysis revealed that the total effect of childhood physical victimization on child maltreatment risk was significant. The direct effect from parents' history of childhood physical victimization to their current maltreatment risk remained significant once parents' psychosomatic symptoms were added to the model, indicating that the increase in psychosomatic symptomatology partially mediated the increase in parents' current child maltreatment risk. Discussion: The mediation analysis identified parents' psychosomatic symptomatology as a causal pathway through which parents' childhood history of physical victimization exerts its effect on increased child maltreatment risk. Somatization-related alterations in stress and emotional regulation are discussed as a potential theoretical explanation of our findings. A cumulative risk perspective is also discussed in order to elucidate the mechanisms that contribute to the intergenerational continuity of child physical maltreatment.
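The mediation decomposition described above (total effect = direct effect + indirect effect through the mediator) can be sketched with ordinary least squares and the product-of-coefficients method. The scores below are invented for illustration; they are not data from the Portuguese study.

```python
def mediation(x, m, y):
    """Ordinary-least-squares mediation decomposition (product of
    coefficients): splits the total effect of x on y into a direct
    effect and an indirect effect through the mediator m."""
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    smm = sum((mi - mm) ** 2 for mi in m)
    sxm = sum((xi - mx) * (mi - mm) for xi, mi in zip(x, m))
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    smy = sum((mi - mm) * (yi - my) for mi, yi in zip(m, y))
    a = sxm / sxx                              # path x -> m
    det = sxx * smm - sxm ** 2
    direct = (smm * sxy - sxm * smy) / det     # x -> y, controlling for m
    b = (sxx * smy - sxm * sxy) / det          # m -> y, controlling for x
    total = sxy / sxx                          # x -> y, ignoring m
    return total, direct, a * b                # indirect effect = a * b

# Hypothetical scores: x = childhood physical victimization,
# m = psychosomatic symptoms, y = current maltreatment risk.
x = [0, 1, 2, 2, 3, 4, 4, 5]
m = [1, 1, 3, 2, 4, 4, 5, 6]
y = [2, 3, 5, 4, 7, 8, 8, 11]
total, direct, indirect = mediation(x, m, y)
print(f"total {total:.2f} = direct {direct:.2f} + indirect {indirect:.2f}")
```

In linear OLS the identity total = direct + indirect holds exactly; partial mediation, as reported in the abstract, corresponds to both components being non-zero.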
Abstract:
OBJECTIVE: To assess the influence of the quality of sleep on the physiological nocturnal drop in blood pressure during ambulatory blood pressure monitoring. METHODS: We consecutively assessed ambulatory blood pressure monitoring, the degree of tolerance for the examination, and the quality of sleep in 168 patients with hypertension or with a suspected "white-coat" effect. The blood pressure fall during sleep, a specific questionnaire, and a visual analog scale of tolerance for ambulatory blood pressure monitoring were used to assess usual sleep and sleep on the day of the examination. Two specialists in sleep disturbances classified the patients into 2 groups: those with normal sleep and those with abnormal sleep. RESULTS: Fifty-nine (35%) patients comprised the abnormal sleep group. Findings regarding the quality of sleep on the day of ambulatory blood pressure monitoring differed from those regarding the quality of sleep on a usual day, as follows, respectively: total duration of sleep (-12.4±4.7 versus -42.2±14.9 minutes, P=0.02), sleep latency (0.4±2.7 versus 17±5.1 minutes, P<0.001), number of awakenings (0.1±0.1 versus 1.35±0.3 times, P<0.001), and tolerance for ambulatory blood pressure monitoring (8±0.2 versus 6.7±0.35, P=0.035). An abnormal drop in blood pressure during sleep occurred in 20 (18%) patients in the normal sleep group and in 14 (24%) patients in the abnormal sleep group, P=0.53. CONCLUSION: Ambulatory blood pressure monitoring causes sleep disturbances in some patients, and a positive association between quality of sleep and tolerance for the examination was observed.
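The nocturnal drop discussed above is conventionally expressed as the percent fall in mean blood pressure from waking to sleep, with a fall of at least 10% usually labelled a "dipper" pattern. A minimal sketch on hypothetical readings (the threshold is the conventional one, not a value from this study):

```python
def nocturnal_dip(day_sbp, night_sbp):
    """Percent fall in mean systolic pressure from waking to sleep.
    A fall of >= 10% is conventionally called a 'dipper' pattern."""
    day_mean = sum(day_sbp) / len(day_sbp)
    night_mean = sum(night_sbp) / len(night_sbp)
    return 100 * (day_mean - night_mean) / day_mean

# Hypothetical ABPM systolic readings (mm Hg):
day = [138, 142, 135, 140, 137, 144]
night = [120, 118, 124, 121]
dip = nocturnal_dip(day, night)
print(f"dip = {dip:.1f}% -> {'dipper' if dip >= 10 else 'non-dipper'}")
```

A blunted or absent fall computed this way is what the abstract calls an "abnormal drop in blood pressure during sleep."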
Abstract:
The purpose of this study was to evaluate the determinism of the AS-Interface network and the three main families of control systems that may use it, namely PLC, PC and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system; care must therefore be exercised if the determinism of a control system is not to be compromised. This study introduces a new concept of using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems that were tested demonstrated extremely deterministic responses, but when a large number of iterations were introduced in the user program, the mean control system latency was much too great for an AS-I network. Thus the PLC was found to be unsuitable for an AS-I network if a large, complex user program is required. The PC systems that were tested were non-deterministic and had latencies of variable duration. These latencies became extremely exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control system latencies, and as such are unsuitable for implementation with an AS-I network. The RTOS system that was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations were introduced in the user program. The RTOS system that was tested is capable of providing a suitably deterministic control system response, even when an extremely large, complex user program is required.
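The Cpk process capability metric proposed above has a standard definition, Cpk = min((USL - mu) / (3 * sigma), (mu - LSL) / (3 * sigma)), which assumes approximately normal data. A minimal sketch on hypothetical latency samples follows; the specification limits and readings are invented, not from the study.

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: how comfortably the latency
    distribution fits within the specification limits [lsl, usl].
    Assumes the samples are approximately normally distributed."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical control-system latencies (ms) against a 0-10 ms window.
latencies = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.3]
print(f"Cpk = {cpk(latencies, 0.0, 10.0):.2f}")
```

A Cpk well above 1 indicates latencies comfortably inside the specification; note that, as the abstract points out for the PC systems, the index is not meaningful for non-normal latency distributions.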