17 results for Warning devices
in Scielo Saúde Pública - SP
Abstract:
OBJECTIVE To analyze whether sociodemographic, occupational, and health-related data are associated with the use of hearing protection devices at work, according to gender. METHODS A cross-sectional study was conducted in 2006, using a random sample of 2,429 workers, aged 18 to 65 years, from residential sub-areas in Salvador, BA, Northeastern Brazil. Questionnaires were used to obtain sociodemographic, occupational, and health-related data. Workers who reported that they worked in places where they needed to shout in order to be heard were considered to be exposed to noise. Exposed workers were asked whether they used hearing protection devices, and if so, how frequently. Analyses were conducted according to gender, with estimates of the prevalence of the use of hearing protection devices, prevalence ratios, and their respective 95% confidence intervals. RESULTS A total of 12.3% of study subjects reported that they were exposed to noise while working. Prevalence of the use of hearing protection devices was 59.3% for men and 21.4% for women. Men from higher socioeconomic levels (PR = 1.47; 95%CI 1.14;1.90) and who had had previous audiometric tests (PR = 1.47; 95%CI 1.15;1.88) were more likely to use hearing protection devices. For women, greater perceived safety was associated with the use of protection devices (PR = 2.92; 95%CI 1.34;6.34). This perception was specifically related to the presence of supervisors committed to safety (PR = 2.09; 95%CI 1.04;4.21), the existence of clear rules to prevent workplace injuries (PR = 2.81; 95%CI 1.41;5.59), and whether workers were informed about workplace safety (PR = 2.42; 95%CI 1.23;4.76). CONCLUSIONS There is a gender bias in the use of hearing protection devices that is less favorable to women. The use of such devices among women is positively influenced by their perception of a safe workplace, suggesting that gender should be considered as a factor in hearing conservation programs.
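For reference, a prevalence ratio such as PR = 1.47 (95%CI 1.14;1.90) can be computed from a two-by-two table with a log-scale Wald confidence interval, as in the minimal sketch below; the cell counts are hypothetical, and the study may well have used adjusted estimates rather than this crude calculation.

```python
import numpy as np
from scipy import stats

def prevalence_ratio(a, n1, c, n0, alpha=0.05):
    """Crude prevalence ratio with a log-scale Wald CI.
    a: device users among the index group (size n1);
    c: device users among the reference group (size n0)."""
    p1, p0 = a / n1, c / n0
    pr = p1 / p0
    se = np.sqrt((1 - p1) / a + (1 - p0) / c)  # SE of log(PR)
    z = stats.norm.ppf(1 - alpha / 2)
    return pr, pr * np.exp(-z * se), pr * np.exp(z * se)

# Hypothetical counts, for illustration only
pr, lo, hi = prevalence_ratio(a=89, n1=150, c=45, n0=112)
```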
Abstract:
PURPOSE: Infection is the leading complication of long-term central venous catheters, and its incidence may vary according to catheter type. The objective of this study was to compare the frequency and probability of infection between two types of long-term intravenous devices. METHODS: Retrospective study in 96 onco-hematology patients with partially implanted catheters (n = 55) or completely implanted ones (n = 42). Demographic data and catheter care were similar in both groups. Infection incidence and infection-free survival were used for the comparison of the two devices. RESULTS: Over a median follow-up time of 210 days, the catheter-related infection incidence was 0.2102/100 catheter-days for the partially implanted devices and 0.0045/100 catheter-days for the completely implanted devices; the incidence rate ratio was 46.7 (95% CI = 6.2 to 348.8). The 1-year infection-free survival rate was 45% versus 97%, and the 1-year survival rate free from catheter removal due to infection was 42% versus 97%, for partially and totally implanted catheters, respectively (P < .001 for both comparisons). CONCLUSION: In the present study, the infection risk was lower in completely implanted devices than in partially implanted ones.
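The incidence figures above follow directly from event counts and catheter-days at risk; the sketch below shows the arithmetic, with an incidence rate ratio and a log-scale Wald interval under a Poisson assumption. The event counts are hypothetical stand-ins chosen to roughly reproduce the abstract's figures, not the study's data.

```python
import math

def rate_per_100_catheter_days(events, catheter_days):
    return 100 * events / catheter_days

def incidence_rate_ratio(e1, t1, e0, t0, z=1.96):
    """IRR with a log-scale Wald CI, assuming Poisson-distributed counts."""
    irr = (e1 / t1) / (e0 / t0)
    se = math.sqrt(1 / e1 + 1 / e0)  # SE of log(IRR)
    return irr, irr * math.exp(-z * se), irr * math.exp(z * se)

# Hypothetical counts for illustration only
irr, lo, hi = incidence_rate_ratio(e1=21, t1=9990, e0=1, t0=22200)
```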
Abstract:
OBJECTIVE: To determine the technical procedures and criteria used by Brazilian physicians for measuring blood pressure and diagnosing hypertension. METHODS: A questionnaire with 5 questions about practices and behaviors regarding blood pressure measurement and the diagnosis of hypertension was sent to 25,606 physicians in all Brazilian regions through a mailing list. The responses were compared with the recommendations of a specific consensus statement and analyzed descriptively. RESULTS: Of the 3,621 (14.1%) responses obtained, 57% were from the southeastern region of Brazil. The following items were reported: use of an aneroid device by 67.8%; use of a mercury column device by 14.6%; 11.9% of the participants never calibrated their devices; 35.7% calibrated the devices at intervals < 1 year; 85.8% measured blood pressure in 100% of the medical visits; 86.9% measured blood pressure more than once and on more than one occasion. For the diagnosis of hypertension, 55.7% considered the patient's age, and only one third relied on consensus statements. CONCLUSION: Although both practices were frequent, the frequency fell short of what was expected, and some contradictions were found between the diagnostic criterion for hypertension and the number of blood pressure measurements. The results suggest that, to reach the great majority of medical professionals, dissemination of consensus statements and techniques for blood pressure measurement should go beyond the boundaries of medical events and specialized journals.
Abstract:
Background: To raise awareness of the diagnosis of the 22q11.2 deletion syndrome (22q11.2DS) in patients with congenital heart disease (CHD). Objective: To describe the main CHDs, as well as phenotypic, metabolic, and immunological findings, in a series of 60 patients diagnosed with 22q11.2DS. Methods: The study included 60 patients with 22q11.2DS evaluated between 2007 and 2013 (M:F = 1.3; age range 14 days to 20 years and 3 months) at a pediatric reference center for primary immunodeficiencies. The diagnosis was established by detection of the 22q11.2 microdeletion using FISH (n = 18) and/or MLPA (n = 42), in association with clinical and laboratory information. Associated CHDs, progression of phenotypic facial features, hypocalcemia, and immunological changes were analyzed. Results: CHDs were detected in 77% of the patients, the most frequent type being tetralogy of Fallot (38.3%). Surgical correction of CHD was performed in 34 patients. Craniofacial dysmorphisms were detected in 41 patients: elongated face (60%) and/or elongated nose (53.3%), narrow palpebral fissures (50%), dysplastic, overfolded ears (48.3%), thin lips (41.6%), elongated fingers (38.3%), and short stature (36.6%). Hypocalcemia was detected in 64.2% and decreased parathyroid hormone (PTH) levels in 25.9%. Decreases in total lymphocyte, CD4, and CD8 counts were present in 40%, 53.3%, and 33.3%, respectively. Hypogammaglobulinemia was detected in one patient and decreased concentrations of immunoglobulin M (IgM) in two other patients. Conclusion: Suspicion of 22q11.2DS should be raised in all patients with CHD associated with hypocalcemia and/or facial dysmorphisms, considering that many of these changes may evolve with age. The 22q11.2 microdeletion should be confirmed by molecular testing in all patients.
Abstract:
Risk factor surveillance is a complementary tool of morbidity and mortality surveillance that improves the likelihood that public health interventions are implemented in a timely fashion. The aim of this study was to identify population predictors of malaria outbreaks in endemic municipalities of Colombia with the goal of developing an early warning system for malaria outbreaks. We conducted a multiple-group, exploratory, ecological study at the municipal level. Each of the 290 municipalities with endemic malaria that we studied was classified according to the presence or absence of outbreaks. The measurement of variables was based on historic registries and logistic regression was performed to analyse the data. Altitude above sea level [odds ratio (OR) 3.65, 95% confidence interval (CI) 1.34-9.98], variability in rainfall (OR 1.85, 95% CI 1.40-2.44) and the proportion of inhabitants over 45 years of age (OR 0.17, 95% CI 0.08-0.38) were factors associated with malaria outbreaks in Colombian municipalities. The results suggest that environmental and demographic factors could have a significant ability to predict malaria outbreaks on the municipal level in Colombia. To advance the development of an early warning system, it will be necessary to adjust and standardise the collection of required data and to evaluate the accuracy of the forecast models.
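The predictors above are reported as odds ratios from a logistic regression at the municipal level; a minimal sketch of that kind of model is shown below, assuming hypothetical file and column names (the study's actual variables and data sources are not reproduced here).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical municipal-level dataset; column names are illustrative only:
# outbreak (0/1), altitude, rainfall_variability, prop_over_45
df = pd.read_csv("municipalities.csv")

X = sm.add_constant(df[["altitude", "rainfall_variability", "prop_over_45"]])
fit = sm.Logit(df["outbreak"], X).fit()

# Express coefficients as odds ratios with 95% CIs, as reported in the abstract
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```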
Abstract:
This hybrid study, combining technological production and methodological research, aimed to establish associations between the data and information that are part of a Computerized Nursing Process according to ICNP® Version 1.0, indicators of patient safety, and quality of care. Based on the guidelines of the Agency for Healthcare Research and Quality and the American Association of Critical-Care Nurses for the expansion of warning systems, five warning systems were developed: potential for iatrogenic pneumothorax, potential for care-related infections, potential for suture dehiscence in patients after abdominal or pelvic surgery, potential for loss of vascular access, and potential for endotracheal extubation. The warning systems are a continuous computerized resource covering essential situations that promote patient safety, and they provide a way to stimulate clinical reasoning and support the clinical decision making of nurses in intensive care.
Abstract:
OBJECTIVE: To analyze the care implemented by the nursing team to promote the safety of adult patients and to prevent skin and mucosal lesions associated with the presence of invasive lower airway devices. METHOD: A descriptive, exploratory study with a qualitative and quantitative approach, conducted in the adult inpatient units of a hospital in the western frontier region of Rio Grande do Sul. The study subjects were nurses, nursing technicians, and nursing assistants. RESULTS: A total of 118 professionals were interviewed. We highlight the specific care observed with endotracheal tubes and tracheostomies, the management and assessment of the cuff, and the criteria used for secretion aspiration. CONCLUSION: Nursing work in direct patient care remains superficial, with a difference in perception among nursing technicians, especially those working in the intensive care unit, who showed greater command of and insight into the patient's clinical status.
Abstract:
The objective of this work was to compare fungicide application timings for the control of sooty blotch and flyspeck (SBFS) of 'Fuji' apples in Rio Grande do Sul state, Brazil. The following treatments were evaluated over two growing seasons: two warning system-based spray treatments (a modified version of the Brown-Sutton-Hartmann system) with captan plus thiophanate-methyl, with or without summer pruning; two calendar/rain-based spray treatments with captan or a mixture of captan plus thiophanate-methyl; fungicide spray timing based on local integrated pest management (IPM) for the control of summer diseases; and an unsprayed check. Sooty blotch and flyspeck incidence over time and their severity at harvest were evaluated. The highest number of sprays was required by the calendar/rain-based treatments (eight and seven sprays in the sequential years). The warning system recommended five and three sprays in the sequential years, which led to the highest SBFS control efficacy, expressed by the reduced initial inoculum and disease progress rate. Summer pruning enhanced SBFS control efficacy, especially by suppressing SBFS signs, which tended to be restricted to the peduncle region of the fruit. Sooty blotch and flyspeck can be managed both with calendar-based and grower-based IPM practices in Brazil, but fewer sprays are required when the warning system is used.
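Warning systems of the Brown-Sutton-Hartmann type trigger sprays from accumulated leaf wetness rather than from the calendar; a minimal sketch of that logic is below. The 175-hour threshold and the reset-after-spray behavior are illustrative assumptions, not the paper's calibrated, modified system.

```python
def spray_schedule(hourly_wetness, threshold_hours=175):
    """Illustrative cumulative leaf-wetness spray trigger in the spirit of
    SBFS warning systems. hourly_wetness is a sequence of booleans (leaf wet
    during that hour); the threshold is an assumption, not a published value.
    The counter restarts after each recommended spray."""
    accumulated, sprays = 0, []
    for hour, wet in enumerate(hourly_wetness):
        accumulated += wet
        if accumulated >= threshold_hours:
            sprays.append(hour)   # recommend a fungicide application now
            accumulated = 0       # restart accumulation after spraying
    return sprays
```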
Abstract:
Wheat (Triticum aestivum L.) blast, caused by Pyricularia grisea, is a new disease in Brazil, and no resistant cultivars are available. Interactions between temperature and wetness duration have been used in many early warning systems. Hence, growth chamber experiments were carried out to assess the effect of different temperatures (10, 15, 20, 25, 30 and 35°C) and durations of spike wetness (0, 5, 10, 15, 20, 25, 30, 35 and 40 hours) on the intensity of blast in cultivar BR23. Each temperature constituted a separate experiment, with the durations of wetness as the treatments. The highest blast intensity was observed at 30°C and increased as the duration of the wetting period increased, while the lowest occurred at 25°C and 10 hours of spike wetness. Regardless of the temperature, no symptoms occurred when the wetting period was less than 10 hours, but at 25°C and a 40-h wetting period blast intensity exceeded 85%. The variation in blast intensity as a function of temperature is explained by a generalized beta model, and as a function of the duration of spike wetness by the Gompertz model. Disease intensity was modeled as a function of both temperature and the duration of spike wetness, and the resulting equation provided a precise description of the response of P. grisea to these variables. This model was used to construct tables that can be used to predict the intensity of P. grisea wheat blast based on the temperature and duration of wheat spike wetness observed in the field.
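As a sketch of how such a combined model can be expressed, the functions below multiply a generalized beta response in temperature by a Gompertz response in wetness duration. All parameter values are placeholders for illustration, not the paper's fitted coefficients.

```python
import numpy as np

def beta_response(T, Tmin=10.0, Tmax=35.0, a=1.9, b=1.2):
    """Generalized beta temperature response, zero at and beyond (Tmin, Tmax).
    Shape parameters a and b are illustrative placeholders."""
    T = np.clip(np.asarray(T, dtype=float), Tmin, Tmax)
    r = (T - Tmin) ** a * (Tmax - T) ** b
    return r / (Tmax - Tmin) ** (a + b)  # crude scaling so values stay in [0, 1]

def gompertz_response(W, B=8.0, k=0.12):
    """Gompertz response to spike-wetness duration W (hours); B, k illustrative."""
    return np.exp(-B * np.exp(-k * np.asarray(W, dtype=float)))

def blast_intensity(T, W):
    """Relative blast intensity as the product of the two responses."""
    return beta_response(T) * gompertz_response(W)
```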
Abstract:
Data available in the literature were used to develop a warning system for bean angular leaf spot and anthracnose, caused by Phaeoisariopsis griseola and Colletotrichum lindemuthianum, respectively. The model is based on environmental conditions favorable to the infectious process, namely continuous leaf wetness duration and mean air temperature during this subphase of the pathogen-host relationship cycle. Equations published by DALLA PRIA (1977) describing the interaction of those two factors on disease severity were used. An Excel spreadsheet was used to calculate the leaf wetness period needed to cause different infection probabilities at different temperature ranges. These data were employed to elaborate critical period tables used to program a computerized electronic device that records leaf wetness duration and mean temperature and automatically displays the daily disease severity value (DDSV) for each disease. The model should be validated in field experiments under natural infection, in which the daily disease severity sum (DDSS) should be identified as a criterion to indicate the beginning and the interval of fungicide applications to control both diseases.
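In the same spirit as those critical period tables, the sketch below maps a day's mean temperature and continuous leaf wetness duration to a daily disease severity value. The table entries here are invented placeholders, not the published DALLA PRIA values.

```python
# Illustrative critical-period table: for each temperature range (°C), an
# ascending list of (minimum wetness hours, DDSV) steps. Placeholder values.
CRITICAL_TABLE = {
    (15, 20): [(6, 1), (12, 2), (24, 3)],
    (20, 25): [(4, 1), (8, 2), (16, 3)],
}

def daily_severity(mean_temp, wetness_hours):
    """Return the DDSV for one day given its mean temperature during the
    wetness period and the continuous leaf wetness duration (hours)."""
    for (lo, hi), steps in CRITICAL_TABLE.items():
        if lo <= mean_temp < hi:
            ddsv = 0
            for min_hours, value in steps:
                if wetness_hours >= min_hours:
                    ddsv = value  # keep the highest step reached
            return ddsv
    return 0  # temperature outside the tabulated ranges
```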
Abstract:
Most warning systems for plant disease control are based on weather models dependent on the relationships between leaf wetness duration and mean air temperature in this period, considering the target disease intensity. For the development of a warning system to control grapevine downy mildew, the equation generated by Lalancette et al. (7) was used. This equation was employed to elaborate a critical period table and program a computerized device, which records, through electronic sensors, leaf wetness duration and mean temperature in this period and automatically calculates the daily value of the probability of infection occurrence. The system was validated at Embrapa Uva e Vinho, in Bento Gonçalves - RS, during the growing seasons 2000/01, 2002/03 and 2003/2004, using the grape cultivar Isabel. The conventional system used by local growers was compared with the new warning system by using different cumulative daily disease severity values (CDDSV) as the criterion to schedule fungicide application and reapplication. In experiments conducted in 2003/04, a CDDSV of 12-14 showed promise for scheduling the first spraying and the interval between fungicide applications, reducing by 37.5% the number of applications and maintaining the same control efficiency in leaves and bunches as the conventional system.
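A minimal sketch of that scheduling criterion: accumulate daily disease severity values and recommend a spray when the running sum crosses a threshold. The threshold of 13 is an arbitrary midpoint of the 12-14 range reported above, and the reset-after-spray behavior is an assumption.

```python
def schedule_sprays(daily_dsv, threshold=13):
    """Illustrative CDDSV-based spray scheduler: sum daily disease severity
    values and recommend a fungicide application when the cumulative value
    reaches the threshold; the sum restarts after each application."""
    cddsv, spray_days = 0, []
    for day, dsv in enumerate(daily_dsv):
        cddsv += dsv
        if cddsv >= threshold:
            spray_days.append(day)  # recommend spraying on this day
            cddsv = 0
    return spray_days
```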
Warning system based on a theoretical-experimental study of the dispersion of soluble pollutants in rivers
Abstract:
Information about the transport and dispersion capacity of soluble pollutants in natural streams is important in the management of water resources, especially in planning preventive measures to minimize the problems caused by accidental or intentional discharges, in public health and in economic activities that depend on the use of water. Considering this importance, this study aimed to develop a warning system for rivers, based on experimental techniques using tracers and on analytical equations of the one-dimensional transport of conservative soluble pollutants, to support decision-making in the management of water resources. The system, developed in the JAVA programming language with a MySQL database, can predict the travel time of pollutant clouds from a spill point and graphically displays the temporal distribution of the concentrations of the passing clouds at a particular location downstream from the release point.
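A minimal sketch of the kind of analytical one-dimensional transport equation such a system can build on, assuming an instantaneous release of a conservative pollutant into a uniform channel (the classic advection-dispersion solution for a slug injection; the parameter values are placeholders, not the system's calibrated inputs):

```python
import numpy as np

def concentration(x, t, M=10.0, A=25.0, U=0.5, DL=15.0):
    """Analytical 1-D advection-dispersion solution (valid for t > 0) for an
    instantaneous release of mass M (kg) of a conservative pollutant at
    x = 0, t = 0:
        C(x, t) = M / (A * sqrt(4*pi*DL*t)) * exp(-(x - U*t)**2 / (4*DL*t))
    A: cross-sectional area (m^2), U: mean velocity (m/s),
    DL: longitudinal dispersion coefficient (m^2/s); all values illustrative."""
    t = np.asarray(t, dtype=float)
    return M / (A * np.sqrt(4 * np.pi * DL * t)) * np.exp(
        -((x - U * t) ** 2) / (4 * DL * t)
    )

# Travel time of the cloud peak to a point 5 km downstream: the peak moves
# at the mean velocity U, so arrival is roughly x / U seconds.
peak_arrival = 5000.0 / 0.5
```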
Abstract:
Purpose To evaluate the compliance and degree of satisfaction of nulligravida (never pregnant) and parous (having already given birth) women who are using intrauterine devices (IUDs). Methods A cross-sectional cohort study was conducted comparing nulligravida and parous women who had had an IUD inserted between July 2009 and November 2011. A total of 84 nulligravida women and 73 parous women were included. Interviews were conducted by telephone with the women who agreed to participate. Statistical analysis was performed with Student's t-test and the Mann-Whitney test for numeric variables; Pearson's chi-square test to test associations; and, whenever pertinent, Fisher's exact test for categorical variables. A survival curve was constructed to estimate the likelihood of each woman continuing the use of the IUD. A significance level of 5% was established. Results When compared with parous women, nulligravida women had a higher education level (median: 12 vs. 10 years). No statistically significant differences were found between the nulligravida and parous women with respect to information on the use of the IUD, prior use of other contraceptive methods, the reason for having chosen the IUD as the current contraceptive method, reasons for discontinuing its use and adverse effects, compliance, and degree of satisfaction. The two groups did not show any difference in terms of continued use of the IUD (p = 0.4). Conclusion There was no difference in compliance, degree of satisfaction, or continued use of IUDs between nulligravida and parous women, suggesting that IUD use may be recommended for women who have never been pregnant.
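The continuation analysis described above is typically a Kaplan-Meier survival curve; a minimal sketch using the lifelines package is below, with hypothetical file and column names.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical dataset: months_of_use, discontinued (1 = stopped the IUD,
# 0 = censored), group ("nulligravida" or "parous"); names illustrative only.
df = pd.read_csv("iud_followup.csv")

kmf = KaplanMeierFitter()
for group, sub in df.groupby("group"):
    # Fit a continuation (survival) curve per group and report its median
    kmf.fit(sub["months_of_use"], event_observed=sub["discontinued"], label=group)
    print(group, kmf.median_survival_time_)
```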
Abstract:
The inferior colliculus is a primary relay for the processing of auditory information in the brainstem. The inferior colliculus is also part of the so-called brain aversion system, as animals learn to switch off electrical stimulation of this structure. The purpose of the present study was to determine whether associative learning occurs between the aversion induced by electrical stimulation of the inferior colliculus and visual and auditory warning stimuli. Rats implanted with electrodes in the central nucleus of the inferior colliculus were placed inside an open field, and thresholds for the escape response to electrical stimulation of the inferior colliculus were determined. The rats were then placed inside a shuttle-box and submitted to a two-way avoidance paradigm. Electrical stimulation of the inferior colliculus at the escape threshold (98.12 ± 6.15 µA, peak-to-peak) was used as negative reinforcement, and light or tone as the warning stimulus. Each session consisted of 50 trials and was divided into two segments of 25 trials in order to determine the learning rate of the animals during the sessions. The rats learned to avoid the inferior colliculus stimulation when light was used as the warning stimulus (latencies of 13.25 ± 0.60 s and 8.63 ± 0.93 s, and frequencies of 12.5 ± 2.04 and 19.62 ± 1.65, in the first and second halves of the sessions, respectively; P < 0.01 in both cases). No significant changes in latencies (14.75 ± 1.63 and 12.75 ± 1.44 s) or frequencies of responses (8.75 ± 1.20 and 11.25 ± 1.13) were seen when tone was used as the warning stimulus (P > 0.05 in both cases). Taken together, the present results suggest that rats learn to avoid inferior colliculus stimulation when light is used as the warning stimulus. However, this learning process does not occur when the neutral stimulus used is an acoustic one. Electrical stimulation of the inferior colliculus may disturb the transmission of the stimulus to be conditioned from the inferior colliculus to higher brain structures such as the amygdala.
Abstract:
Previous studies have shown that saccadic eye responses, but not manual responses, are sensitive to the kind of warning signal used, with visual onsets producing longer saccadic latencies than visual offsets. The aim of the present study was to determine the effects of distinct warning signals on manual latencies and to test the premise that the onset interference does not, in fact, occur for manual responses. A second objective was to determine whether the magnitude of the warning effects could be modulated by contextual procedures. Three experimental conditions based on the kind of warning signal used (visual onset, visual offset and auditory warning) were run in two different contexts (blocked and non-blocked). Eighteen participants were asked to respond to an imperative stimulus that occurred some milliseconds (0, 250, 500 or 750 ms) after the warning signal. The experiment consisted of three experimental sessions of 240 trials, in which all the variables were counterbalanced. The data showed that visual onsets produced longer manual latencies than visual offsets in the non-blocked context (275 vs 261 ms; P < 0.001). This interference was obtained, however, only for short intervals between the warning and the stimulus, and was abolished when the blocked context was used (256 vs 255 ms; P = 0.789). These results are discussed in terms of bottom-up and top-down interactions, mainly those related to the role of attentional processing in canceling out competitive interactions and suppressive influences of a distractor on the relevant stimulus.