958 results for STANDARD-RISK
Abstract:
Indigent and congregate-living populations are highly susceptible to disease and pose an elevated risk of disease transmission to family, friends, and persons providing services to them. The adoption of basic infection control, personal hygiene, safe food handling, and simple engineering practices will reduce the risk of infectious disease transmission to, from, and among these populations.

The provision of social services, health promotion activities, and other support services to indigent and congregate-living populations is an important function of many public health-related governmental, community-based, and medical care provider agencies.

In the interest of protecting the health of indigent and congregate-living populations, of personnel from organizations serving them, and of the general community, an educational intervention is warranted to prevent the spread of blood-borne, air-borne, food-borne, and close-contact infectious diseases.

An educational presentation covering general best practices and standard guidelines was delivered to staff from a community-based organization specializing in housing, health education, foodstuffs and meals, and support services for disabled, low-income, homeless, and HIV-infected individuals. Pre- and post-tests were administered to measure knowledge pertinent to controlling the spread of infectious diseases among homeless shelter-living clients and between clients and the organization's staff.

Comparing pre-test and post-test results revealed areas of knowledge already held by staff and other areas in which staff would benefit from additional educational seminars and training.
Abstract:
Despite the availability of hepatitis B vaccine for over two decades, drug users and other high-risk adult populations have experienced low vaccine coverage. Poor compliance has limited efforts to reduce transmission of hepatitis B infection in this population. Evidence suggests that the immunological response in drug users is impaired compared to the general population, in terms of both lower seroprotection rates and lower antibody levels.

The current study investigated the effectiveness of the multi-dose hepatitis B vaccine and compared the effects of the standard and accelerated vaccine schedules in a not-in-treatment, drug-using adult population in Houston, USA.

A population of drug users from two Houston communities, susceptible to hepatitis B, was sampled through outreach workers and referral methodology. Subjects were randomized either to the standard hepatitis B vaccine schedule (0, 1, 6 months) or to an accelerated schedule (0, 1, 2 months). Antibody levels were measured in the laboratory at various time points. Participants were followed for two years, and seroconversion rates were calculated to determine immune response.

A four-percentage-point difference in overall compliance was observed between the standard (73%) and accelerated (77%) schedules. Logistic regression analyses showed that drug users living on the streets were twice as likely not to complete all three vaccine doses (p = 0.028), and current speedball use was also associated with non-completion (p = 0.002). In the multivariate analysis, completion of all three vaccinations was also correlated with older age. Drug users on the accelerated schedule were 26% more likely to achieve completion, although this factor was only marginally significant (p = 0.085).

Cumulative adequate protective response was attained by 65% of the HBV-susceptible subgroup by 12 months and was identical for the standard and accelerated schedules.
Excess protective response (>=100 mIU/mL) occurred more frequently at the later time point for the standard schedule (36% at 12 months compared to 14% at six months), while the greater proportion of excess protective response for the accelerated schedule occurred earlier (34% at six months compared to 18% at 12 months). Seroconversion at the adequate protective response level of 10 mIU/mL was reached more quickly by the accelerated schedule group (62% vs. 49%), and with a higher mean titer (104.8 vs. 64.3 mIU/mL), when measured at six months. Multivariate analyses indicated a 63% increased risk of non-response with older age and confirmed an accelerating decline in immune response to vaccination after age 40 (p = 0.001). Injecting more than daily was also strongly associated with the risk of non-response (p = 0.016).

The substantial increase in the seroprotection rate at six months may be worth the trade-off against the faster decline in antibody titer, and the accelerated schedule is recommended for enhancing compliance and seroconversion. Using the accelerated schedule with the primary objective of increasing compliance and seroconversion rates during the six months after the first dose may confer early protective immunity and reduce the HBV vulnerability of drug users who continue, or have recently initiated, high-risk drug use and sexual behaviors.
Abstract:
During this cross-sectional study, both quantitative and qualitative research methods were used to elucidate the role that household environment and sanitation play in the nutritional status of children in a rural Honduran community. Anthropometric measurements were taken as measures of nutritional status among children under five years of age, while interviews regarding the household environment were conducted with their primary caregivers. Community participatory activities were conducted with primary caregivers, and water quality testing results were analyzed for E. coli contamination. Anthropometric results were compared using the 1977 NCHS Growth Charts and the 2006 WHO Child Growth Standard to examine the implications of adopting the new WHO standard. The references showed generally good or excellent agreement between z-score categories, except among height-for-age classifications for males 24-35.9 months and weight-for-age classifications for males older than 24 months. Comparing the proportions of stunted, underweight, and wasted children, the WHO standard generally yielded higher proportions of stunting, lower proportions of underweight, and higher proportions of overweight. Logistic regression was used to determine which household and sanitation factors most influenced children's growth. Results suggest that relying on a spring, stream, or other surface water as the sole primary source of drinking water is a significant risk factor for stunting. A protective association was seen between the household wealth index and stunting. Through participatory activities, the community provided insight into health issues important for improving child health. These activities yielded findings that can be harnessed as a powerful resource to unify efforts for change. The qualitative findings were triangulated with the quantitative interview and water testing results to provide intervention recommendations for the community and its primary health care clinic.
Recommendations include educating the community on best water consumption practices and encouraging the completion of at least some primary education for primary caregivers to improve child health. It is recommended that a community health worker program be developed to support and implement community interventions that improve water use and household sanitation behaviors and to encourage community involvement in targeting and guiding successful interventions.
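The stunting, underweight, and wasting classifications above come from comparing z-scores against fixed cut-offs. A minimal sketch of that classification step, assuming the conventional ±2 SD cut-offs used with the 2006 WHO standard (the function name and example z-scores are illustrative, not taken from the study's data):

```python
# Hypothetical sketch: flagging child growth indicators from z-scores
# using the conventional -2 SD / +2 SD cut-offs applied with the
# 2006 WHO Child Growth Standard. Values are illustrative only.

def classify_growth(haz, waz, whz):
    """Return growth-status flags from height-for-age (haz),
    weight-for-age (waz) and weight-for-height (whz) z-scores."""
    return {
        "stunted": haz < -2.0,      # low height-for-age
        "underweight": waz < -2.0,  # low weight-for-age
        "wasted": whz < -2.0,       # low weight-for-height
        "overweight": whz > 2.0,    # high weight-for-height
    }

# Example: a child with low height-for-age but normal weight indicators
child = classify_growth(haz=-2.4, waz=-1.1, whz=0.3)
```

Because the NCHS reference and the WHO standard assign slightly different z-scores to the same measurements, the same cut-offs can yield the shifted prevalence proportions the study reports.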
Abstract:
Background. Polyomavirus reactivation is common in solid-organ transplant recipients who are given immunosuppressive medications as the standard of care. Previous studies have shown that polyomavirus infection can lead to allograft failure in as many as 45% of affected patients. Hypothesis. Ubiquitous polyomaviruses, when reactivated by post-transplant immunosuppressive medications, may lead to impaired renal function and possibly lower survival. Study Overview. A secondary analysis of data was conducted on a prospective longitudinal study of subjects who were at least 18 years of age and were recipients of liver and/or kidney transplants at Mayo Clinic Scottsdale, Arizona. Methods. DNA extractions of blinded urine and blood specimens of transplant patients, collected at Mayo Clinic during routine transplant visits, were performed at Baylor College of Medicine using Qiagen kits. Virologic assays included testing DNA samples for specific polyomavirus sequences using QPCR technology. De-identified demographic and clinical patient data were merged with laboratory data, and statistical analysis was performed using Stata 10. Results. Seventy-six patients enrolled in the study were followed for 3.9 years post-transplantation. The prevalence of BK virus and JC virus urinary excretion was 30% and 28%, respectively. A significant association was observed between JC virus excretion and kidney as the transplanted organ (P = 0.039, Pearson chi-square test). Median urinary JCV viral loads were two logs higher than those of BKV. Patients who excreted both BKV and JCV appeared to have the worst renal function, with a mean creatinine clearance of 71.6 mL per minute. A survival disadvantage was observed for dual shedders of BKV and JCV (log-rank test, p = 0.09); two of the five dual shedders died during the study period. Liver transplant and male sex were identified as potential risk factors for JC virus activation in renal and liver transplant recipients.
All patients tested negative for SV40, and no association was observed between polyomavirus excretion and type of immunosuppressive medication (tacrolimus, mycophenolate mofetil, cyclosporine, and sirolimus). Conclusions. Polyomavirus reactivation was common after solid-organ transplantation and may be associated with impaired renal function. Male sex and JCV infection may be potential risk factors for viral reactivation; these findings should be confirmed in larger studies.
Abstract:
Trauma is a leading cause of death worldwide and is thus a major public health concern. Improving current resuscitation strategies may help to reduce morbidity and mortality from trauma, and clinical research plays an important role in addressing these issues. This thesis is a secondary analysis of data collected for a randomized clinical trial being conducted at Ben Taub General Hospital. The trial is designed to compare a hypotensive resuscitation strategy to standard fluid resuscitation for the early treatment of trauma patients in hemorrhagic shock. This thesis examines the clinical outcomes of the first 90 subjects enrolled in the study, with the primary aim of assessing the safety of hypotensive resuscitation in the trauma population.

Patients in hemorrhagic shock who required emergent surgery were randomized to one of two arms of the study. Those in the experimental (LMAP) arm were managed with a hypotensive resuscitation strategy in which the target mean arterial pressure was 50 mmHg. Those in the control (HMAP) arm were managed with standard fluid resuscitation to a target mean arterial pressure of 65 mmHg. Patients were followed for 30 days. Mortality, post-operative complications, and other clinical data were prospectively gathered by the Ben Taub surgical staff and then secondarily analyzed for this thesis.

Subjects in the LMAP group had significantly lower early post-operative mortality than those in the HMAP group. Thirty-day mortality was also lower in the LMAP group, although this difference did not reach statistical significance. There were no statistically significant differences between the two groups with regard to the development of ischemic, hematologic, or infectious complications, length of hospitalization, length of ICU stay, or duration of mechanical ventilation.

Based upon the data presented in this thesis, hypotensive resuscitation appears to be a safe strategy for use in the trauma population.
Specifically, hypotensive resuscitation reduced the risk of early post-operative death from coagulopathic bleeding and did not result in an increased risk of ischemic or other post-operative complications. The preliminary results described in this thesis provide convincing evidence supporting the continued investigation and use of hypotensive resuscitation in the trauma setting.
Abstract:
Complex diseases, such as cancer, are caused by various genetic and environmental factors and their interactions. Joint analysis of these factors and their interactions would increase the power to detect risk factors but is statistically challenging. Bayesian generalized linear models using Student-t prior distributions on coefficients are a novel method for simultaneously analyzing genetic factors, environmental factors, and interactions. I performed simulation studies using three different disease models and demonstrated that the variable selection performance of Bayesian generalized linear models is comparable to that of Bayesian stochastic search variable selection, itself an improvement over standard variable selection methods. I further evaluated the variable selection performance of Bayesian generalized linear models using different numbers of candidate covariates and different sample sizes, and provide a guideline for the sample size required to achieve high variable selection power with Bayesian generalized linear models across different numbers of candidate covariates.

Polymorphisms in folate metabolism genes and nutritional factors have previously been associated with lung cancer risk. In this study, I simultaneously analyzed 115 tag SNPs in folate metabolism genes, 14 nutritional factors, and all possible gene-nutrient interactions from 1239 lung cancer cases and 1692 controls using Bayesian generalized linear models stratified by never, former, and current smoking status. SNPs in MTRR were significantly associated with lung cancer risk across never, former, and current smokers. In never smokers, three SNPs in TYMS and three gene-nutrient interactions, including an interaction between SHMT1 and vitamin B12, an interaction between MTRR and total fat intake, and an interaction between MTR and alcohol use, were also identified as associated with lung cancer risk. These lung cancer risk factors are worthy of further investigation.
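As a hedged illustration of why a Student-t prior helps, the toy example below finds a one-dimensional MAP estimate for a logistic model by grid search. The data, prior scale, and degrees of freedom are all invented; in this toy dataset the likelihood alone has no finite maximizer (quasi-separation), and the heavy-tailed prior is what pins the coefficient down:

```python
import math

# Toy sketch (not the study's method or data): MAP estimation for a
# one-covariate logistic model under a Student-t shrinkage prior.

def log_t_prior(beta, df=1.0, scale=1.0):
    """Log density (up to a constant) of a Student-t prior on beta."""
    return -0.5 * (df + 1.0) * math.log(1.0 + (beta / scale) ** 2 / df)

def log_likelihood(beta, xs, ys):
    """Bernoulli log-likelihood of a logistic model with one covariate."""
    ll = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-beta * x))
        ll += math.log(p) if y else math.log(1.0 - p)
    return ll

# Invented data with quasi-separation: the likelihood alone increases
# monotonically in beta, so the MLE is infinite; the t prior regularizes.
xs = [-2, -1, -1, 0, 0, 1, 1, 2]
ys = [0, 0, 0, 0, 1, 1, 1, 1]

grid = [i / 100.0 for i in range(-500, 501)]
beta_map = max(grid, key=lambda b: log_likelihood(b, xs, ys) + log_t_prior(b))
```

The same shrinkage idea extends coefficient-by-coefficient to the high-dimensional setting (many SNPs, nutrients, and interactions) that the abstract describes, where weak effects are pulled toward zero while strong ones survive.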
Abstract:
Bisphosphonates represent a unique class of drugs that effectively treat and prevent a variety of bone-related disorders, including metastatic bone disease and osteoporosis. High tolerance and high efficacy rates quickly established bisphosphonates as the standard of care for bone-related diseases. However, in the early 2000s, case reports began to surface linking bisphosphonates with osteonecrosis of the jaw (ONJ). Since that time, subsequent studies have corroborated the link. However, as with most disease states, many factors can contribute to disease onset. The aim of this study was to determine which comorbid factors presented an increased risk of developing ONJ in cancer patients.

Using a case-control design, investigators used a combination of ICD-9 codes and chart review to identify confirmed cases of ONJ at The University of Texas M. D. Anderson Cancer Center (MDACC). Each case was then matched to five controls based on age, gender, race/ethnicity, and primary cancer diagnosis. Data querying and chart review provided information on the variables of interest: bisphosphonate exposure, glucocorticoid exposure, smoking history, obesity, and diabetes. Statistical analysis was conducted using PASW (Predictive Analytics Software) Statistics, Version 18 (SPSS Inc., Chicago, Illinois).

One hundred twelve (112) cases were identified as confirmed cases of ONJ. Variables were first screened with univariate logistic regression (significance at p < .05); significant variables were included in the final conditional logistic regression model. Concurrent use of bisphosphonates and glucocorticoids (OR, 18.60; CI, 8.85 to 39.12; p < .001), current smoking (OR, 2.52; CI, 1.21 to 5.25; p = .014), and presence of diabetes (OR, 1.84; CI, 1.06 to 3.20; p = .030) were found to increase the risk of developing ONJ.
Obesity was not significantly associated with ONJ development.

In this study, cancer patients who received bisphosphonates concurrently with glucocorticoids as part of their therapeutic regimen were found to have an 18-fold increase in their risk of developing ONJ. Other contributing factors included smoking and diabetes. Further studies examining the concurrent use of glucocorticoids and bisphosphonates may strengthen these correlations.
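The univariate screening step described above amounts to estimating odds ratios with confidence intervals. A minimal sketch using a 2x2 exposure table and a Wald interval (the counts below are made up for demonstration and are not the study's data):

```python
import math

# Illustrative sketch: odds ratio and Wald 95% CI from a 2x2 table,
# the univariate counterpart of the logistic models described above.
# Counts are invented for demonstration.

def odds_ratio_ci(a, b, c, d):
    """a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical exposure table
or_, lo, hi = odds_ratio_ci(30, 20, 50, 100)
```

Matched designs like this study's (five controls per case) then move to conditional logistic regression, which respects the matching rather than pooling all subjects into one table.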
Abstract:
This thesis project is motivated by the problem of using observational data to draw inferences about causal relationships in observational epidemiology research when controlled randomization is not applicable. The instrumental variable (IV) method is one statistical tool for overcoming this problem, and a Mendelian randomization study uses genetic variants as IVs in genetic association studies. In this thesis, the IV method, as well as standard logistic and linear regression models, is used to investigate the causal association between risk of pancreatic cancer and circulating levels of soluble receptor for advanced glycation end-products (sRAGE). Higher levels of serum sRAGE were found to be associated with a lower risk of pancreatic cancer in a previous observational study (255 cases and 485 controls). However, such a novel association may be biased by unknown confounding factors. In a case-control study, we aimed to use the IV approach to confirm or refute this observation in the subset of study subjects for whom genotyping data were available (178 cases and 177 controls). A two-stage IV analysis using generalized method of moments-structural mean models (GMM-SMM) was conducted, and the relative risk (RR) was calculated. In the first-stage analysis, we found that the single nucleotide polymorphism (SNP) rs2070600 of the receptor for advanced glycation end-products (AGER) gene meets all three general assumptions for a genetic IV in examining the causal association between sRAGE and risk of pancreatic cancer: the variant allele was associated with lower levels of sRAGE, and it was associated neither with risk of pancreatic cancer nor with the confounding factors. It was a potentially strong IV (F statistic = 29.2). However, in the second-stage analysis, the GMM-SMM model failed to converge due to non-concavity, probably because of the small sample size.
Therefore, the IV analysis could not support the causality of the association between serum sRAGE levels and risk of pancreatic cancer. Nevertheless, these analyses suggest that rs2070600 is a potentially good genetic IV for testing the causality between risk of pancreatic cancer and sRAGE levels. A larger sample size is required to conduct a credible IV analysis.
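The two-stage IV logic can be sketched with simulated data and the simplest IV estimator, the Wald ratio; this is a deliberate simplification of the GMM-SMM machinery described above, and all data and effect sizes below are invented:

```python
import random

# Toy Mendelian-randomization sketch: a genotype g instruments for an
# exposure x whose effect on outcome y is confounded by unmeasured u.
# Everything here is simulated; beta_true is the causal effect we hope
# the IV estimator recovers despite confounding.

random.seed(1)
n = 5000
beta_true = 0.5

g = [random.randint(0, 2) for _ in range(n)]        # genotype (instrument)
u = [random.gauss(0, 1) for _ in range(n)]          # unmeasured confounder
x = [0.4 * gi + ui + random.gauss(0, 1)             # exposure
     for gi, ui in zip(g, u)]
y = [beta_true * xi + ui + random.gauss(0, 1)       # outcome
     for xi, ui in zip(x, u)]

def slope(z, w):
    """Ordinary least-squares slope of w regressed on z."""
    mz = sum(z) / len(z)
    mw = sum(w) / len(w)
    num = sum((zi - mz) * (wi - mw) for zi, wi in zip(z, w))
    den = sum((zi - mz) ** 2 for zi in z)
    return num / den

ols_estimate = slope(x, y)               # confounded: biased away from 0.5
iv_estimate = slope(g, y) / slope(g, x)  # Wald ratio: near 0.5
```

The Wald ratio divides the instrument-outcome association by the instrument-exposure association; because the genotype is independent of the confounder, the bias that inflates the naive OLS slope cancels out.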
Abstract:
1. With the global increase in CO2 emissions, there is a pressing need for studies aimed at understanding the effects of ocean acidification on marine ecosystems. Several studies have reported that exposure to CO2 impairs chemosensory responses of juvenile coral reef fishes to predators. Moreover, one recent study pointed to impaired responses of reef fish to auditory cues that indicate risky locations. These studies suggest that altered behaviour following exposure to elevated CO2 is caused by a systemic effect at the neural level. 2. The goal of our experiment was to test whether juvenile damselfish Pomacentrus amboinensis exposed to different levels of CO2 would respond differently to a potential threat: the sight of a large novel coral reef fish, a spiny chromis, Acanthochromis polyacanthus, placed in a watertight bag. 3. Juvenile damselfish exposed to 440 (current-day control), 550 or 700 µatm CO2 did not differ in their response to the chromis. However, fish exposed to 850 µatm showed reduced antipredator responses; they failed to show the same reduction in foraging, activity and area use in response to the chromis. Moreover, they moved closer to the chromis and lacked the bobbing behaviour typically displayed by juvenile damselfishes in threatening situations. 4. Our results are the first to suggest that responses to visual cues of risk may be impaired by CO2, and they provide strong evidence that the multi-sensory effects of CO2 stem from systemic effects at the neural level.
Abstract:
This study analyzes households' attitudes toward flood risk in Cotonou, with the aim of identifying whether or not they are willing to leave flood-prone zones. Attitudes toward the management of waste and wastewater are also analyzed. The data used in this study were obtained from two sources: a survey implemented during March 2011 on one hundred and fifty randomly selected households living in flood-prone areas of Cotonou, and the Benin Living Standard Survey of 2006 (the part relative to Cotonou, covering 1,586 households). Climate data were also used. A multinomial probability model is used for the econometric analysis of attitudes toward flood risk, while attitudes toward the management of waste and wastewater are analyzed with a simple logit model. The results show that 55.3% of households agreed to go elsewhere, while 44.7% refused, for reasons including being better off where they are (10.67%), proximity to their activities (19.33%), and the view that the best solution is to build protective infrastructure against floods, together with attachment to the family house (14.67%). The authorities should rethink their policy alternatives, for example by building socio-economic housing outside Cotonou and offering it to the households living in areas prone to inundation. Moreover, access to formal education has to be reinforced.
Abstract:
Three methodologies to assess As bioaccessibility were evaluated using playground soil collected from 16 playgrounds in Madrid, Spain: two (the Simplified Bioaccessibility Extraction Test, SBET, and hydrochloric acid extraction, HCl) assess gastric-only bioaccessibility, and the third (the Physiologically Based Extraction Test, PBET) evaluates mouth-gastric-intestinal bioaccessibility. Aqua regia-extractable (pseudo-total) As contents, which are routinely employed in risk assessments, were used as the reference to establish the following bioaccessibility percentages: SBET, 63.1%; HCl, 51.8%; PBET, 41.6%; the highest values were associated with the gastric-only extractions. For Madrid playground soils, characterised by a very uniform, weakly alkaline pH and low Fe oxide and organic matter contents, the statistical analysis of the results indicates that, in contrast with other studies, the highest percentage of As in the samples was bound to carbonates and/or present as calcium arsenate. As opposed to the As bound to Fe oxides, this As is readily released in the gastric environment as the carbonate matrix is decomposed and calcium arsenate is dissolved, but some of it is subsequently sequestered in unavailable forms as the pH is raised to 5.5 to mimic intestinal conditions. The HCl extraction can be used as a simple and reliable (i.e. low residual standard error) proxy for the more expensive, time-consuming, and error-prone PBET methodology. The HCl method would essentially halve the estimate of carcinogenic risk for children playing in Madrid playground soils, providing a more representative value of the associated risk than the pseudo-total concentrations used at present.
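The bioaccessibility percentages reported above are each extraction's yield expressed as a fraction of the aqua regia-extractable (pseudo-total) content. A sketch with made-up concentrations (the numbers below are not the study's measurements):

```python
# Illustrative sketch: percent bioaccessibility is the fraction of the
# pseudo-total As released by a given extraction. Concentrations are
# invented for demonstration.

def pct_bioaccessible(extracted_mg_kg, pseudo_total_mg_kg):
    """Bioaccessible fraction, as a percentage of pseudo-total content."""
    return 100.0 * extracted_mg_kg / pseudo_total_mg_kg

# Hypothetical soil sample: 12.6 mg/kg released by a gastric extraction
# out of 20.0 mg/kg pseudo-total As
pct = pct_bioaccessible(extracted_mg_kg=12.6, pseudo_total_mg_kg=20.0)
```

This is why substituting a bioaccessibility-based concentration for the pseudo-total roughly halves the carcinogenic risk estimate when the bioaccessible fraction is near 50%, as with the HCl extraction here.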
Abstract:
The prevalence of vitamin B12 deficiency is very common in elderly people and can reach values as high as 40.5% of the population. It can result from the interaction of several factors. Vitamin B12 deficiencies have been associated with neurological and cognitive deterioration, haematological abnormalities, and cardiovascular diseases that have an important influence on the health of the elderly and their quality of life. The problems arising from the lack of data on these issues need to be addressed. The main objective of this thesis was to analyse the evolution of vitamin B12 status and related parameters, lipid and haematological profiles, and their relationship to health risk factors and to functional and cognitive status over one year, and to determine the effect of an oral supplement of 500 μg of cyanocobalamin over a short period of 28 days. An additional objective was to analyse the possible effects of medication intake on vitamin B status. Three studies were performed: a) a one-year longitudinal follow-up with four measurement points; b) an intervention study providing an oral liquid supplement of 500 μg of cyanocobalamin for 28 days; and c) an analysis of the possible effect of medication intake on vitamin B status using the ATC classification of medicines. The participants for these studies were recruited from nursing homes for the elderly in the Region of Madrid. Sixty elders (mean age 84 ± 7 y; 19 men and 41 women) were recruited for Study I and 64 elders (mean age 82 ± 7 y; 24 men and 40 women) for Study II. For Study III, baseline data from the initially recruited participants of the first two studies were used. Informed consent was obtained from all participants or their guardians. The studies were approved by the Ethical Committee of the University of Granada.
Blood samples were obtained at each examination date and analyzed for serum cobalamin, holoTC, serum and RBC folate, and total homocysteine according to standard laboratory procedures. The haematological parameters analyzed were haematocrit, haemoglobin, and MCV. For the lipid profile, TG, total cholesterol, and LDL- and HDL-cholesterol were analyzed. Anthropometric measures (BMI, skinfolds [triceps and subscapular], waist girth, and waist-to-hip ratio), functional tests (hand grip, arm and leg strength tests, static balance), and the MMSE were obtained or administered by trained personnel. The vitamin B12 supplement of Study II was administered with breakfast, and medication intake was taken from the residents' anamnesis. Data were analyzed by parametric and non-parametric statistics depending on the obtained data. Comparisons were made using the appropriate ANOVAs or non-parametric tests. Pearson partial correlations with the variable "time" as control were used to define the associations of the analyzed parameters. The results showed that: A) Over one year, with regard to vitamin B status, serum cobalamin decreased, serum folate and mean corpuscular volume increased significantly, and total homocysteine concentrations were stable. Regarding the blood lipid profile, triglycerides increased and HDL-cholesterol decreased significantly. Among the selected anthropometric measurements, waist circumference increased significantly. No significant changes were observed for the remaining parameters. B) The prevalence of hyperhomocysteinemia was high in the elderly studied, ranging from 60% to 90% over the year depending on the cut-off used for classification. LDL-cholesterol values were high, especially among women, and showed a tendency to increase over the year. Results of the balance test showed a deficiency and a tendency to decrease, indicating that the population studied is at high risk for falls.
Lower extremity muscular function was deficient and showed a tendency to decrease. A highly significant relationship was observed between the triceps skinfold and the blood lipid profile. C) Low cobalamin concentrations correlated significantly with low MMSE scores in the elderly studied. No correlations were observed between vitamin B12 status and functional parameters. D) Regarding vitamin B12 status, holo-transcobalamin seems to be more sensitive for diagnosis: 5-10% of the elderly had a deficiency using serum cobalamin as the criterion, whereas 45-52% had a deficiency using serum holo-transcobalamin as the criterion. E) 500 μg of cyanocobalamin administered orally for 28 days significantly improved vitamin B12 status and significantly decreased total homocysteine concentrations in the institutionalized elderly. No effect of the intervention was observed on functional or cognitive parameters. F) The relative improvement (%) in vitamin B12 status was higher when using serum holo-transcobalamin as the criterion than when using serum cobalamin. G) Antianemic drug intake normalized cobalamin levels, urologic drugs and corticosteroids normalized serum folate levels, and psychoanaleptics normalized holo-transcobalamin levels. Drugs treating pulmonary obstruction significantly increased total homocysteine concentrations. H) The daily mean drug intake was 5.1. Fifty-nine percent of the elderly took medication belonging to 5 or more different ATC groups. The most prevalent were psycholeptic (53%), antacid (53%), and antithrombotic (47%) drugs.
Abstract:
All activities of an organization involve risks that should be managed. The risk management process aids decision making by taking account of uncertainty and the possibility of future events or circumstances (intended or unintended) and their effects on agreed objectives. With that idea, a new ISO standard has been drawn up: the recently issued ISO 31010 provides a structured process that identifies how objectives may be affected and analyses risk in terms of consequences and their probabilities before deciding whether further treatment is required. In this lecture, that ISO standard is adapted to open pit blasting operations, focusing on environmental effects that can be properly managed. The technique used is Fault Tree Analysis (FTA), applied across all possible scenarios, giving blasting professionals the tools to identify, analyze, and manage the environmental effects of blasting operations. The lecture can also help to minimize each effect through case-by-case study. This paper can likewise be useful to project managers and occupational health and safety (OH&S) departments, because blasting operations can be evaluated and compared with one another to determine the risks that should be managed in different case studies. The environmental effects studied are ground vibrations, flyrock, and air overpressure (airblast). Blasting operations are sometimes carried out near populated areas, where environmental effects may impose several limitations on the use of explosives. In such cases, where these factors approach certain limits, national standards and regulations have to be applied.
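The core FTA calculation combines independent basic-event probabilities through AND and OR gates to obtain a top-event probability. A minimal sketch (the event structure and probabilities below are illustrative, not taken from the lecture or any real blasting study):

```python
# Minimal fault-tree sketch: top-event probability from independent
# basic events combined with AND / OR gates. Event names and
# probabilities are invented for illustration.

def p_and(*ps):
    """AND gate: all inputs must occur (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """OR gate: at least one input occurs (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event "excessive ground vibration reaches dwellings":
# (charge-per-delay too high AND poor confinement) OR wrong delay timing
p_top = p_or(p_and(0.10, 0.20), 0.05)
```

In a real tree each environmental effect (ground vibration, flyrock, airblast) would get its own top event, and the gate structure is what lets different blasting scenarios be compared quantitatively.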
Abstract:
Allostatic load (AL) has been proposed as a new conceptualization of cumulative biological burden exacted on the body through attempts to adapt to life's demands. Using a multisystem summary measure of AL, we evaluated its capacity to predict four categories of health outcomes, 7 years after a baseline survey of 1,189 men and women age 70–79. Higher baseline AL scores were associated with significantly increased risk for 7-year mortality as well as declines in cognitive and physical functioning and were marginally associated with incident cardiovascular disease events, independent of standard socio-demographic characteristics and baseline health status. The summary AL measure was based on 10 parameters of biological functioning, four of which are primary mediators in the cascade from perceived challenges to downstream health outcomes. Six of the components are secondary mediators reflecting primarily components of the metabolic syndrome (syndrome X). AL was a better predictor of mortality and decline in physical functioning than either the syndrome X or primary mediator components alone. The findings support the concept of AL as a measure of cumulative biological burden.
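One common way such a multisystem AL summary score is operationalized is as a count of biomarkers falling in their high-risk range. A hedged sketch of that counting step (the biomarker names, cut-offs, and risk directions below are illustrative placeholders, not the study's actual parameters or quartile values):

```python
# Illustrative sketch: an allostatic-load-style summary score as a count
# of biomarkers in their high-risk range. All names and cut-offs are
# hypothetical placeholders.

def allostatic_load(values, risk_cutoffs, higher_is_risky):
    """Count biomarkers in their high-risk range.

    All three arguments are dicts keyed by biomarker name:
    measured values, risk cut-offs, and whether high values are risky.
    """
    score = 0
    for name, v in values.items():
        cut = risk_cutoffs[name]
        if higher_is_risky[name]:
            score += v >= cut   # flagged when at or above the cut-off
        else:
            score += v <= cut   # flagged when at or below the cut-off
    return score

# Hypothetical person with three flagged biomarkers
person = {"systolic_bp": 152, "hdl_chol": 34, "waist_hip": 0.95}
cutoffs = {"systolic_bp": 148, "hdl_chol": 37, "waist_hip": 0.94}
direction = {"systolic_bp": True, "hdl_chol": False, "waist_hip": True}
score = allostatic_load(person, cutoffs, direction)
```

Summing across systems this way is what distinguishes AL from any single-component measure, and it is why the summary score can outperform the syndrome X or primary mediator components alone.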
Abstract:
Introduction: Standard precautions (SP) are a set of measures intended to minimize the risk of occupational transmission of pathogens, and their use by health professionals, above all nurses, is indispensable. However, non-adherence to SP is a problem widely discussed throughout the world. Although several Brazilian studies have aimed to assess adherence to SP, considerable weakness has been observed in the construction and validation of the instruments used to evaluate this construct. Objective: To carry out the cultural adaptation and validation of the Compliance with Standard Precautions Scale (CSPS) for Brazilian nurses. Methods: This is a methodological study for the adaptation and validation of the CSPS. The scale comprises 20 items with four response options and is intended to assess adherence to SP. The adaptation process consisted of translation, consensus among judges, back-translation, and semantic validation. The first step was translation from the original language into Brazilian Portuguese. After review by a committee of seven judges, the consensus version obtained in the previous step was translated back into the source language. The psychometric properties of the instrument were evaluated, considering face and content validity, construct validity, and reliability. The Brazilian Portuguese version of the CSPS (CSPS-PB) was administered to a sample of 300 nurses providing patient care in a hospital in the city of São Paulo/SP. Reliability was assessed through internal consistency (Cronbach's alpha) and test-retest (intraclass correlation coefficient, ICC). For construct validity, comparison between different groups, exploratory factor analysis, and confirmatory factor analysis under the structural equation modeling (SEM) framework were used. IBM® SPSS 19.0 software was used.
Confirmatory factor analysis used the Analysis of Moment Structures module (IBM® SPSS AMOS). Parallel analysis used the RanEigen syntax. The significance level adopted was α = 0.05. All ethical requirements were met. Results: Translation by sworn translators ensured the quality of the process. Face and content validation allowed pertinent and necessary modifications to meet the criteria of conceptual, idiomatic, cultural, and semantic equivalence. Internal consistency was α = 0.61, indicating satisfactory reliability. The ICC indicated an almost perfect correlation of 0.87 for the test-retest two weeks after the first administration, demonstrating satisfactory stability. Construct validity showed that the CSPS-PB was able to discriminate mean adherence to SP between distinct groups with respect to age (F = 5.15, p ≤ 0.01), length of clinical experience (F = 8.9, p < 0.001), and having received training (t = 2.48, p ≤ 0.01). In the confirmatory factor analysis, the model was under-identified. Exploratory factor analysis indicated that all items had adequate factor loadings (≥ 0.30), and four factors were identified by parallel analysis. The total variance explained was 35.48%. Conclusion: The CSPS-PB is an adequate, reliable, and valid instrument to measure adherence to SP among Brazilian nurses.