976 results for RISK INDICATORS
Resumo:
Journal of Environmental Management, nº 82 p. 410–432
Resumo:
RESUMO - A exposição a formaldeído é reconhecidamente um dos mais importantes factores de risco presentes nos laboratórios hospitalares de anatomia patológica. Neste contexto ocupacional, o formaldeído é utilizado em solução, designada comummente por formol. Trata-se de uma solução comercial de formaldeído, normalmente diluída a 10%, sendo pouco onerosa e, por esse motivo, a eleita para os trabalhos de rotina em anatomia patológica. A solução é utilizada como fixador e conservante do material biológico, pelo que as peças anatómicas a serem processadas são previamente impregnadas. No que concerne aos efeitos para a saúde do formaldeído, os efeitos locais parecem apresentar um papel mais importante comparativamente com os efeitos sistémicos, devido à sua reactividade e rápido metabolismo nas células da pele, tracto gastrointestinal e pulmões. Da mesma forma, a localização das lesões corresponde principalmente às zonas expostas às doses mais elevadas deste agente químico, ou seja, o desenvolvimento dos efeitos tóxicos dependerá mais da intensidade da dose externa do que da duração da exposição. O efeito do formaldeído no organismo humano mais facilmente detectável é a acção irritante, transitória e reversível sobre as mucosas dos olhos e aparelho respiratório superior (naso e orofaringe), o que acontece em geral para exposições frequentes e superiores a 1 ppm. Doses elevadas são citotóxicas e podem conduzir a degenerescência e necrose das mucosas e epitélios. No que concerne aos efeitos cancerígenos, a primeira avaliação efectuada pela International Agency for Research on Cancer data de 1981, actualizada em 1982, 1987, 1995 e 2004, considerando-o como um agente cancerígeno do grupo 2A (provavelmente carcinogénico). No entanto, a mais recente avaliação, em 2006, considera o formaldeído no Grupo 1 (agente carcinogénico) com base na evidência de que a exposição a este agente é susceptível de causar cancro nasofaríngeo em humanos. 
Constituiu objectivo principal deste estudo caracterizar a exposição profissional a formaldeído nos laboratórios hospitalares de anatomia patológica Portugueses. Pretendeu-se, ainda, descrever os fenómenos ambientais da contaminação ambiental por formaldeído e explorar eventuais associações entre variáveis. Considerou-se uma amostra de 10 laboratórios hospitalares de anatomia patológica, avaliada a exposição dos três grupos profissionais por comparação com os dois referenciais de exposição e, ainda, conhecidos os valores de concentração máxima em 83 actividades. Foram aplicados simultaneamente dois métodos distintos de avaliação ambiental: um dos métodos (Método 1) fez uso de um equipamento de leitura directa com o princípio de medição por Photo Ionization Detection, com uma lâmpada de 11,7 eV e, simultaneamente, realizou-se o registo da actividade. Este método disponibilizou dados para o referencial de exposição da concentração máxima; o outro método (Método 2) traduziu-se na aplicação do método NIOSH 2541, implicando o uso de bombas de amostragem eléctricas de baixo caudal e posterior processamento analítico das amostras por cromatografia gasosa. Este método, por sua vez, facultou dados para o referencial de exposição da concentração média ponderada. As estratégias de medição de cada um dos métodos e a definição dos grupos de exposição existentes neste contexto ocupacional, designadamente os Técnicos de Anatomia Patológica, os Médicos Anatomo-Patologistas e os Auxiliares, foram possíveis através da informação disponibilizada pelas técnicas de observação da actividade da análise (ergonómica) do trabalho. Estudaram-se diversas variáveis independentes, nomeadamente a temperatura ambiente e a humidade relativa, a solução de formaldeído utilizada, as condições de ventilação existentes e o número médio de peças processadas por dia em cada laboratório. 
Para a recolha de informação sobre estas variáveis foi preenchida, durante a permanência nos laboratórios estudados, uma Grelha de Observação e Registo. Como variáveis dependentes seleccionaram-se três indicadores de contaminação ambiental, designadamente o valor médio das concentrações superiores a 0,3 ppm em cada laboratório, a Concentração Média Ponderada obtida para cada grupo de exposição e o Índice do Tempo de Regeneração de cada laboratório. Os indicadores foram calculados e definidos através dos dados obtidos pelos dois métodos de avaliação ambiental aplicados. Baseada no delineado pela Universidade de Queensland, foi ainda aplicada uma metodologia de avaliação do risco de cancro nasofaríngeo nas 83 actividades estudadas de modo a definir níveis semi-quantitativos de estimação do risco. Para o nível de Gravidade considerou-se a informação disponível em literatura científica que define eventos biológicos adversos, relacionados com o modo de acção do agente químico e os associa com concentrações ambientais de formaldeído. Para o nível da Probabilidade utilizou-se a informação disponibilizada pela análise (ergonómica) de trabalho que permitiu conhecer a frequência de realização de cada uma das actividades estudadas. A aplicação simultânea dos dois métodos de avaliação ambiental resultou na obtenção de resultados distintos, mas não contraditórios, no que concerne à avaliação da exposição profissional a formaldeído. Para as actividades estudadas (n=83) verificou-se que cerca de 93% dos valores são superiores ao valor limite de exposição definido para a concentração máxima (VLE-CM=0,3 ppm). O “exame macroscópico” foi a actividade mais estudada e onde se verificou a maior prevalência de resultados superiores ao valor limite (92,8%). O valor médio mais elevado da concentração máxima (2,04 ppm) verificou-se no grupo de exposição dos Técnicos de Anatomia Patológica. 
No entanto, a maior amplitude de resultados observou-se no grupo dos Médicos Anatomo-Patologistas (0,21 ppm a 5,02 ppm). No que respeita ao referencial da Concentração Média Ponderada, todos os valores obtidos nos 10 laboratórios estudados para os três grupos de exposição foram inferiores ao valor limite de exposição definido pela Occupational Safety and Health Administration (TLV-TWA=0,75 ppm). Verificou-se associação estatisticamente significativa entre o número médio de peças processadas por laboratório e dois dos três indicadores de contaminação ambiental utilizados, designadamente o valor médio das concentrações superiores a 0,3 ppm (p=0,009) e o Índice do Tempo de Regeneração (p=0,001). Relativamente à temperatura ambiente não se observou associação estatisticamente significativa com nenhum dos indicadores de contaminação ambiental utilizados. A humidade relativa apresentou uma associação estatisticamente significativa apenas com o indicador de contaminação ambiental da Concentração Média Ponderada de dois grupos de exposição, nomeadamente com os Médicos Anatomo-Patologistas (p=0,02) e os Técnicos de Anatomia Patológica (p=0,04). A aplicação da metodologia de avaliação do risco nas 83 actividades estudadas permitiu verificar que, em cerca de um terço (35%), o risco foi classificado como (pelo menos) elevado e, ainda, constatar que 70% dos laboratórios apresentou pelo menos 1 actividade com a classificação de risco elevado. Da aplicação dos dois métodos de avaliação ambiental e das informações obtidas para os dois referenciais de exposição pode concluir-se que o referencial mais adequado é a Concentração Máxima por estar associado ao modo de actuação do agente químico. 
Acresce, ainda, que um método de avaliação ambiental, como o Método 1, que permite o estudo das concentrações de formaldeído e simultaneamente a realização do registo da actividade, disponibiliza informações pertinentes para a intervenção preventiva da exposição por permitir identificar as actividades com a exposição mais elevada, bem como as variáveis que a condicionam. As peças anatómicas apresentaram-se como a principal fonte de contaminação ambiental por formaldeído neste contexto ocupacional. Trata-se de um aspecto de particular interesse, na medida em que a actividade desenvolvida neste contexto ocupacional e, em particular na sala de entradas, é centrada no processamento das peças anatómicas. Dado não se perspectivar a curto prazo a eliminação do formaldeído, devido ao grande número de actividades que envolvem ainda a utilização da sua solução comercial (formol), pode concluir-se que a exposição a este agente neste contexto ocupacional específico é preocupante, carecendo de uma intervenção rápida com o objectivo de minimizar a exposição e prevenir os potenciais efeitos para a saúde dos trabalhadores expostos. ---------------- ABSTRACT - Exposure to formaldehyde is recognized as one of the most important risk factors present in hospital anatomy and pathology laboratories. In this occupational setting, formaldehyde is used in solution, commonly known as formalin: a commercial formaldehyde solution, typically diluted to 10%, which is inexpensive and is therefore the choice for routine work in anatomy and pathology. The solution is applied as a fixative and preservative of biological material. Regarding the health effects of formaldehyde, local effects appear to play a more important role than systemic effects, owing to its reactivity and rapid metabolism in the cells of the skin, gastrointestinal tract and lungs. Likewise, the location of lesions corresponds mainly to the areas exposed to the highest doses, and the development of toxic effects depends more on the intensity of the external dose than on the duration of exposure. 
The most easily detectable effect of formaldehyde on the human body is its irritating action, transient and reversible, on the mucous membranes of the eyes and upper respiratory tract (nose and throat), which generally occurs with frequent exposures above 1 ppm. High doses are cytotoxic and can lead to degeneration and necrosis of mucous membranes and epithelia. With regard to carcinogenic effects, the first assessment performed by the International Agency for Research on Cancer, in 1981, updated in 1982, 1987, 1995 and 2004, classified formaldehyde in Group 2A (probably carcinogenic). However, the most recent evaluation, in 2006, classifies formaldehyde as carcinogenic (Group 1), based on evidence that exposure to this agent is likely to cause nasopharyngeal cancer in humans. The main objective of this study was to characterize occupational exposure to formaldehyde in hospital anatomy and pathology laboratories in Portugal, as well as to describe the phenomena of environmental contamination by formaldehyde and to explore possible associations between variables. A sample of 10 hospital pathology laboratories was considered; the exposure of three professional groups was assessed against two exposure metrics, and ceiling concentrations were determined for 83 activities. Two different environmental assessment methods were applied simultaneously: one method (Method 1) used direct-reading equipment measuring by Photo Ionization Detection, with an 11.7 eV lamp, while the activity was simultaneously described and filmed. This method provided data on the ceiling concentration for each activity studied (TLV-C). In the other method (Method 2), air sampling and formaldehyde analysis were performed according to NIOSH method 2541, using low-flow electric sampling pumps followed by gas chromatography. This method provided data on the time-weighted average exposure concentration (TLV-TWA). The measurement and sampling strategies of each method, and the definition of the exposure groups in this occupational setting (Technicians, Pathologists and Assistants), were based on information provided by the ergonomic analysis of work activities. 
Several independent variables were studied, including ambient temperature and relative humidity, the formaldehyde solution used, the ventilation conditions, and the mean number of anatomic specimens processed per day in each laboratory. To record information on these variables, an Observation and Registration Grid was completed during the stay in each laboratory. Three environmental contamination indicators were selected as dependent variables: the mean value of concentrations exceeding 0.3 ppm in each laboratory, the time-weighted average concentration obtained for each exposure group, and each laboratory's Regeneration Time Index. These indicators were calculated from the data obtained by the two environmental assessment methods. Based on a proposal from the University of Queensland, a methodology for assessing nasopharyngeal cancer risk was also applied to the 83 activities studied in order to obtain semi-quantitative risk levels. For the Severity level, information available in the scientific literature was used that defines adverse biological events related to the chemical agent's mode of action and associates them with environmental formaldehyde concentrations. For the Probability level, information provided by the ergonomic work analysis was used to determine the frequency of each activity. The two environmental assessment methods provided different, but not contradictory, results regarding the evaluation of occupational exposure to formaldehyde. For the studied activities (n=83), about 93% of the values were above the exposure limit value set for ceiling concentration in Portugal (VLE-CM = 0.3 ppm). The "macroscopic exam" was the most studied activity and showed the highest prevalence of results above 0.3 ppm (92.8%). The highest mean ceiling concentration (2.04 ppm) was obtained in the Technicians exposure group, but the widest range of results was observed in the Pathologists group (0.21 ppm to 5.02 ppm). 
Concerning Method 2, the results for the three exposure groups were all below the limit value set by the Occupational Safety and Health Administration (TLV-TWA = 0.75 ppm). There was a statistically significant association between the mean number of anatomic specimens processed per day in each laboratory and two of the three environmental contamination indicators used, namely the mean of concentrations exceeding 0.3 ppm (p=0.009) and the Regeneration Time Index (p=0.001). Temperature was not statistically associated with any of the environmental contamination indicators used. Relative humidity had a statistically significant association only with one environmental contamination indicator, the time-weighted average concentration, specifically for the Pathologists group (p=0.02) and the Technicians group (p=0.04). The risk assessment performed on the 83 studied activities showed that in about one third (35%) the risk was classified as (at least) high, and that 70% of the laboratories had at least one activity rated as high risk. The application of the two environmental assessment methods, together with the information obtained for the two exposure metrics, leads to the conclusion that the most appropriate exposure metric is the ceiling concentration, because it is associated with formaldehyde's mode of action. Moreover, an environmental assessment method such as Method 1, which allows formaldehyde concentrations to be studied and related to the activity, provides relevant information for preventive intervention, since it identifies the activities with the highest exposure as well as the variables that condition it. Anatomic specimens were the main source of formaldehyde contamination in this occupational setting, which is of particular interest because the activity, particularly in the grossing room, is centred on the processing of anatomic specimens. 
Since there is no short-term prospect of eliminating formaldehyde, given the large number of activities that still involve the use of its commercial solution (formalin), it can be concluded that exposure to this agent in this particular occupational setting is a cause for concern, requiring rapid intervention to minimize exposure and prevent potential health effects in exposed workers.
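The two exposure metrics discussed in the abstract above (a ceiling value and a time-weighted average) can be sketched in a few lines. This is a minimal illustration with hypothetical sampling intervals, not the study's data; the function name and values are ours:

```python
# Illustrative sketch: computing a time-weighted average (TWA) formaldehyde
# concentration from interval samples and comparing it with a TWA limit and
# a ceiling limit. All values below are hypothetical.

def twa(samples):
    """samples: list of (concentration_ppm, duration_hours) tuples.
    Returns the 8-hour time-weighted average in ppm."""
    total = sum(c * t for c, t in samples)
    return total / 8.0  # normalise over the 8-hour working day

# Hypothetical sampling intervals for one worker's shift
intervals = [(1.2, 0.5), (0.4, 2.0), (0.1, 5.5)]  # (ppm, hours)

average = twa(intervals)
# Using interval averages as a rough proxy for short-term peaks:
ceiling_exceeded = any(c > 0.3 for c, _ in intervals)  # VLE-CM = 0.3 ppm

print(round(average, 3))   # TWA in ppm
print(average <= 0.75)     # within the 0.75 ppm TWA limit?
print(ceiling_exceeded)    # any interval above the ceiling value?
```

Note how the two metrics can disagree, as in the study: a shift can stay below the TWA limit while individual activities exceed the ceiling value.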
Resumo:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Resumo:
Schistosomiasis mansoni in the Serrano village, municipality of Cururupu, state of Maranhão, Brazil, is a widely spread disease. The PECE (Program for the Control of Schistosomiasis), undertaken since 1979, has reduced the prevalence of S. mansoni infection and the hepatosplenic form of the disease. Nevertheless, although piped water is available in 84% of the households, prevalence remains above 20%. In order to identify other risk factors responsible for the persistence of high prevalence levels, a cross-sectional survey was carried out in a systematic sample of 294 people of varying ages. Socioeconomic, environmental and demographic variables, and water contact patterns were investigated. Fecal samples were collected and analyzed by the Kato-Katz technique. Prevalence of S. mansoni infection was 24.1%, higher among males (35.5%) and among those aged 10-19 years (36.6%). The risk factors identified in the univariable analysis were water contacts for vegetable extraction (Risk Ratio - RR = 2.92), crossing streams (RR = 2.55), bathing (RR = 2.35), fishing (RR = 2.19), hunting (RR = 2.17), cattle breeding (RR = 2.04), manioc culture (RR = 1.90) and leisure (RR = 1.56). After controlling for confounding variables with a proportional hazards model, the risks remained higher for males, vegetable extraction, bathing in rivers and water contact in rivers or in periodically inundated parts of riverine woodland (swamplands).
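The crude risk ratios reported above come from 2×2 comparisons of exposed vs. unexposed groups. A minimal sketch, with hypothetical counts rather than the survey's data:

```python
# Sketch of a crude risk ratio (RR) for one water-contact exposure,
# as in a univariable analysis. Counts below are hypothetical.

def risk_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """RR = risk of infection in the exposed / risk in the unexposed."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# e.g. infection among people with vs. without a given water contact
rr = risk_ratio(30, 60, 24, 140)  # hypothetical counts
print(round(rr, 2))  # → 2.92
```

An RR above 1 indicates that the exposure is associated with a higher risk of infection; confounding is then addressed, as in the study, with a multivariable model.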
Resumo:
A clinical trial pilot study, double-blinded, randomized, and placebo-controlled, to assess the effectiveness of oral doxycycline (200 mg, single dose) in preventing leptospirosis after high exposure to potentially contaminated water was performed in São Paulo, SP, Brazil. Confirmed cases were defined as those with leptospiral IgM antibody and symptoms; asymptomatic cases were those presenting IgM antibodies but no symptoms; and suspected cases were individuals with symptoms but no IgM antibody. Forty subjects were given doxycycline and 42 were given placebo. In the drug-treated group there were 2 confirmed cases, 11 asymptomatic cases, and 6 suspected cases. In the placebo group there were 5 confirmed, 6 asymptomatic, and 5 suspected cases. Even though we found a protective association of doxycycline for confirmed leptospirosis cases (RR = 2.3) and seroconversion only (RR = 2.0), the association was not statistically significant because of the small number of individuals enrolled in this pilot study. We observed that 22% of the volunteers already had IgM antibodies to leptospirosis at the first sampling. Finally, the attack rates for confirmed, asymptomatic, and suspected cases of leptospirosis were 8.5%, 22%, and 13%, respectively, in this population.
Resumo:
Background: A growing body of research suggests that vitamin D might play an important role in overall health. No data exist on vitamin D intake for the Azorean adolescent population. The purpose of this study was to assess vitamin D intake and investigate a possible association between vitamin D intake and cardiometabolic risk factors in Azorean adolescents. Methods: A cross-sectional school-based study was conducted on 496 adolescents (288 girls) aged 15–18 years from the Azorean Islands, Portugal. Anthropometric measurements (waist circumference and height), blood pressure (systolic), and plasma biomarkers [fasting glucose, insulin, total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C), and triglycerides (TGs)] were measured to assess metabolic risk. Homeostasis model assessment (HOMA), TC-to-HDL-C ratio, and waist-to-height ratio were calculated. For each of these variables, a Z-score was computed by age and sex. A metabolic risk score was constructed by summing the Z-scores of all individual risk factors. High risk was considered when the individual had a score ≥1 standard deviation (SD) above the mean. Vitamin D intake was assessed with a semiquantitative food frequency questionnaire. Participants were classified into quartiles of vitamin D intake. Logistic regression was used to determine odds ratios for high cardiometabolic risk scores after adjusting for total energy intake, pubertal stage, fat mass percentage, and cardiorespiratory fitness. Results: Mean (SD) vitamin D intake was 5.8 (6.5) µg/day, and 9.1% of Azorean adolescents achieved the estimated average requirement of vitamin D (10 µg/day or 400 IU). Logistic regression showed that the odds ratio for a high cardiometabolic risk score was 3.35 [95% confidence interval (CI) 1.28–8.75] for adolescents in the lowest vitamin D intake quartile in comparison with those in the highest vitamin D intake quartile, even after adjustment for confounders. 
Conclusion: A lower level of vitamin D intake was associated with worse metabolic profile among Azorean adolescents.
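The composite risk score described above (summing per-factor Z-scores and flagging individuals at least 1 SD above the mean) can be illustrated as follows; all measurements below are hypothetical, not the study's data:

```python
# Sketch of a composite cardiometabolic risk score: sum within-group
# Z-scores of individual risk factors and flag "high risk" at >= 1 SD
# of the summed score. Measurements are hypothetical.
from statistics import mean, stdev

def z_scores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Hypothetical measurements for 6 adolescents (columns) on 3 risk factors
factors = [
    [90, 88, 95, 102, 110, 99],      # e.g. waist circumference, cm
    [4.1, 3.8, 4.5, 5.2, 5.9, 4.9],  # e.g. TC/HDL-C ratio
    [70, 72, 80, 95, 101, 85],       # e.g. fasting glucose, mg/dL
]

per_factor_z = [z_scores(f) for f in factors]
risk_score = [sum(col) for col in zip(*per_factor_z)]  # one score per subject

cutoff = mean(risk_score) + stdev(risk_score)  # 1 SD above the mean score
high_risk = [score >= cutoff for score in risk_score]
print(high_risk)
```

In the real study the Z-scores are computed by age and sex within the full sample; here a single small group stands in for that stratification.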
Resumo:
Epidemiologic studies have reported an inverse association between dairy product consumption and cardiometabolic risk factors in adults, but this relation is relatively unexplored in adolescents. We hypothesized that a higher dairy product intake is associated with lower cardiometabolic risk factor clustering in adolescents. To test this hypothesis, a cross-sectional study was conducted with 494 adolescents aged 15 to 18 years from the Azorean Archipelago, Portugal. We measured fasting glucose, insulin, total cholesterol, high-density lipoprotein cholesterol, triglycerides, systolic blood pressure, body fat, and cardiorespiratory fitness. We also calculated homeostatic model assessment and total cholesterol/high-density lipoprotein cholesterol ratio. For each one of these variables, a z score was computed using age and sex. A cardiometabolic risk score (CMRS) was constructed by summing up the z scores of all individual risk factors. High risk was considered to exist when an individual's score was at least 1 SD above the mean. Diet was evaluated using a food frequency questionnaire, and the intake of total dairy (including milk, yogurt, and cheese), milk, yogurt, and cheese was categorized as low (equal to or below the median of the total sample) or "appropriate" (above the median of the total sample). The association between dairy product intake and CMRS was evaluated using separate logistic regressions, and the results were adjusted for confounders. Adolescents with high milk intake had a lower CMRS compared with those with low intake (10.6% vs 18.1%, P = .018). Adolescents with appropriate milk intake were less likely to have a high CMRS than those with low milk intake (odds ratio, 0.531; 95% confidence interval, 0.302-0.931). No association was found between CMRS and total dairy, yogurt, or cheese intake. Only milk intake seems to be inversely related to CMRS in adolescents.
Resumo:
Centre of Research, Education, Innovation and Intervention in Sport, Faculty of Sport, University of Porto, Portugal. Background: Regarding children aged ≤10 years, only a few international studies have been conducted to determine the prevalence of and risk factors for back pain. Although other studies on older Portuguese children point to a prevalence between 17% and 39%, none exists for this specific age group. Thus, this study was conducted to establish the prevalence of and risk factors for back pain in schoolchildren aged 7–10 years. Methods: A cross-sectional survey among 637 children was conducted. A self-rating questionnaire was used to assess the prevalence and duration of back pain, life habits, school absence, medical treatments and limitation of activities. For posture assessment, photographic records with bio-photogrammetric analysis were used to obtain data on head, acromion and pelvic alignment, horizontal alignment of the scapulae, vertical alignment of the trunk and vertical body alignment. Results: Postural problems were found in 25.4% of the children, especially in the 8- and 9-year-old groups. Back pain occurred in 12.7%, with the highest values among the 7- and 10-year-old children. The probability of back pain increased 7 times when the children had a history of school absences, 4.3 times when they experienced sleeping difficulties, 4.4 times when school furniture was uncomfortable, 4.7 times if the children perceived an occurrence of parental back pain and 2.5 times when children presented incorrect posture. Conclusions: The combination of school absences, parental pain, sleeping difficulties, inappropriate school furniture and postural deviations in the sagittal and frontal planes supports a multifactorial aetiology of back pain.
Risk Acceptance in the Furniture Sector: Analysis of Acceptance Level and Relevant Influence Factors
Resumo:
Risk acceptance has been broadly discussed in relation to hazardous risk activities and/or technologies. A better understanding of risk acceptance in occupational settings is also important; however, studies on this topic are scarce. It seems important to understand the level of risk that stakeholders consider sufficiently low, how stakeholders form their opinion about risk, and why they adopt a certain attitude toward risk. Accordingly, the aim of this study is to examine risk acceptance in regard to occupational accidents in furniture industries. The safety climate analysis was conducted through the application of the Safety Climate in Wood Industries questionnaire. Judgments about risk acceptance, trust, risk perception, benefit perception, emotions, and moral values were measured. Several models were tested to explain occupational risk acceptance. The results showed that the level of risk acceptance decreased as the risk level increased. High-risk and death scenarios were assessed as unacceptable. Risk perception, emotions, and trust had an important influence on risk acceptance. Safety climate was correlated with risk acceptance and other variables that influence risk acceptance. These results are important for the risk assessment process in terms of defining risk acceptance criteria and strategies to reduce risks.
Resumo:
This study aims to analyse the relationship between safety climate and the level of risk acceptance, as well as its relationship with workplace safety performance. The sample includes 14 companies and 403 workers. The safety climate assessment was performed by the application of a Safety Climate in Wood Industries questionnaire, and safety performance was assessed with a checklist. Judgements about risk acceptance were measured through questionnaires together with four other variables: trust, risk perception, benefit perception and emotion. Safety climate was found to be correlated with workgroup safety performance, and it also plays an important role in workers' risk acceptance levels. Risk acceptance tends to be lower when the safety climate scores of workgroups are high, and subsequently, their safety performance is better. These findings seem to be relevant, as they provide Occupational Safety and Health practitioners with a better understanding of workers' risk acceptance levels and of the differences among workgroups.
Resumo:
We compared the indirect immunofluorescence assay (IFA) with Western blot (Wb) as a confirmatory method to detect anti-retrovirus antibodies (HIV-1 and HTLV-I/II). Positive and negative HIV-1 and HTLV-I/II serum samples from different risk populations were studied. Sensitivity, specificity, positive and negative predictive values, and the kappa index were calculated to assess the efficiency of IFA versus Wb. The following cell lines were used as a source of viral antigens: H9 (HTLV-IIIB); MT-2 and MT-4 (persistently infected with HTLV-I) and MO-T (persistently infected with HTLV-II). Sensitivity and specificity for HIV-1 were 96.80% and 98.60% respectively, while positive and negative predictive values were 99.50% and 92.00% respectively. No differences were found in HIV IFA performance between the various populations studied. As for the IFA HTLV system, the sensitivity and specificity values were 97.91% and 100% respectively, with positive and negative predictive values of 100% and 97.92%. Moreover, the sensitivity of the IFA for HTLV-I/II proved to be higher when the samples were tested simultaneously against both antigens (HTLV-I-MT-2 and HTLV-II-MO-T). The overall IFA efficiency for HIV-1 and HTLV-I/II-MT-2 antibody detection proved to be very satisfactory, with an excellent correlation with Wb (kappa indexes 0.93 and 0.98, respectively). These results confirm that IFA is a sensitive and specific alternative method for the confirmatory diagnosis of HIV-1 and HTLV-I/II infection in populations at different levels of risk of acquiring the infection, and suggest that IFA could be included in the serologic diagnostic algorithm.
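The validity measures reported above (sensitivity, specificity, predictive values and kappa) are all derived from a 2×2 table of IFA results against the Wb reference. A sketch with hypothetical counts, not the study's data:

```python
# Sketch: deriving sensitivity, specificity, predictive values and
# Cohen's kappa for a test (IFA) against a reference (Wb) from a 2x2
# table. tp/fp/fn/tn counts below are hypothetical.

def screen_metrics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)   # true positives among reference-positives
    spec = tn / (tn + fp)   # true negatives among reference-negatives
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    # Cohen's kappa: agreement beyond chance
    po = (tp + tn) / n
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, kappa

sens, spec, ppv, npv, kappa = screen_metrics(121, 1, 4, 70)  # hypothetical
print(round(sens, 3), round(spec, 3), round(kappa, 2))
```

A kappa close to 1, as reported in the abstract, indicates near-perfect agreement between the two assays.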
Resumo:
This study was carried out to obtain baseline data concerning the epidemiology of American visceral leishmaniasis and Chagas disease in an indigenous population with whom the government is starting a dwelling improvement programme. Information was collected from 242 dwellings (1,440 people) by means of house-to-house interviews about socio-economic and environmental factors associated with Leishmania chagasi and Trypanosoma cruzi transmission risk. A leishmanin skin test was applied to 385 people, and 454 blood samples were collected on filter paper in order to detect L. chagasi antibodies by ELISA and IFAT and T. cruzi antibodies by ELISA. T. cruzi seroprevalence was 8.7% by ELISA; L. chagasi seroprevalence was 4.6% and 5.1% by IFAT and ELISA, respectively. ELISA sensitivity and specificity for L. chagasi antibodies were 57% and 97.5% respectively, as compared with the IFAT. Leishmanin skin test positivity was 19%. L. chagasi infection prevalence, defined as a positive result in the three immunodiagnostic tests, was 17.1%. Additionally, 2.7% of the population studied was positive for both L. chagasi and T. cruzi, suggesting a possible cross-reaction. L. chagasi and T. cruzi seropositivity increased with age, while no association with gender was observed. Age (p<0.007), number of inhabitants (p<0.05), floor material (p<0.03) and recognition of the vector (p<0.01) were associated with T. cruzi infection, whilst age (p<0.007) and dwelling improvement (p<0.02) were associated with L. chagasi infection. It is necessary to evaluate the long-term impact of the dwelling improvement programme on these parasitic infections in this community.
Resumo:
Based on the report for Project III of the PhD programme on Technology Assessment and prepared for the Winter School that took place at Universidade Nova de Lisboa, Caparica Campus on the 6th and 7th of December 2010.
Resumo:
A case-control study was conducted to identify risk factors for death from tetanus in the State of Pernambuco, Brazil. Information was obtained from medical records of 152 cases and 152 controls, admitted to the tetanus unit in the State University Hospital, in Recife, from 1990 to 1995. Variables were grouped in three different sets. Crude and adjusted odds ratios, p-values and 95% confidence intervals were estimated. Variables selected in the multivariate analysis in each set were controlled for the effect of those selected in the others. All factors related to the disease progression - incubation period, time elapsed between the occurrence of the first tetanus symptom and admission, and period of onset - showed a statistically significant association with death from tetanus. Similarly, signs and/or symptoms occurring on admission or in the following 24 hours (second set): reflex spasms, neck stiffness, respiratory signs/symptoms and respiratory failure requiring artificial ventilation (third set) were associated with death from tetanus even when adjusted for the effect of the others.
Resumo:
The objective of this study was to evaluate the prevalence and risk factors associated with HCV infection in a group of HIV-seropositive patients. We analyzed the medical records of 1,457 patients. All patients were tested for HCV infection by third-generation ELISA. Whenever possible, a sample of the positive patients was also tested for HCV by PCR. HCV-positive patients were analyzed according to their risk factors for both infections. The prevalence of anti-HCV-positive patients was 17.7% (258 patients). Eighty-two (82) of these patients were also tested by PCR, and 81 were positive (98%). One hundred fifty-one (58.5%) were intravenous drug users (IDU); 42 (16.3%) were sexual partners of HIV patients; 23 (8.9%) were homosexual males; 12 (4.7%) had received blood transfusion; 61 (17.5%) had promiscuous sexual habits; 14 (5.4%) denied any risk factor; 12 (4.7%) were sexual partners of IDU. Two hundred four patients mentioned only one risk factor. Among them, 28 (10.9%) were sexual partners of HIV-positive patients. Although intravenous drug use was the most important risk factor for co-infection, sexual transmission seemed to contribute to the high HCV seroprevalence in this group of patients.