63 results for time and risk preferences
in Scielo Saúde Pública - SP
Abstract:
In order to estimate the incidence of and risk factors for developing tuberculosis, the clinical charts of a retrospective cohort of 281 HIV-positive adults, who were notified to the AIDS Program of the Health Department of Brasilia in 1998, were reviewed in 2003. All the patients were treatment-naive regarding antiretroviral therapy at the time of inclusion in the cohort. Twenty-nine patients were identified as having tuberculosis at the start of the study. Thirteen incident tuberculosis cases were identified during the 60 months of follow-up, with an incidence density rate of 1.24/100 person-years. Tuberculosis incidence was highest among patients with baseline CD4+ T-lymphocyte counts < 200 cells/µl who were not using antiretroviral therapy (incidence = 5.47; 95% CI = 2.73 to 10.94). Multivariate analysis showed that baseline CD4+ T-lymphocyte counts < 200 cells/µl (adjusted hazard ratio [AHR] = 5.09; 95% CI = 1.27 to 20.37; p = 0.02) and non-use of antiretroviral therapy (AHR = 12.17; 95% CI = 2.6 to 56.90; p = 0.001) were independently associated with increased risk of tuberculosis.
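An incidence density rate such as the 1.24/100 person-years reported above is simply the event count divided by accumulated follow-up time. A minimal sketch (the person-year denominator is not given in the abstract and is back-calculated here purely for illustration):

```python
# Incidence density: events per 100 person-years of follow-up.

def incidence_density(cases: int, person_years: float, per: float = 100.0) -> float:
    """Return events per `per` person-years."""
    return cases / person_years * per

# ~1,048 person-years is a hypothetical denominator chosen so that the
# 13 incident tuberculosis cases reproduce the reported 1.24/100 PY.
rate = incidence_density(13, 1048.4)
print(round(rate, 2))  # 1.24
```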
Abstract:
A hemodialysis population in Central Brazil was screened by polymerase chain reaction (PCR) and serological methods to assess the prevalence of hepatitis C virus (HCV) infection and to investigate associated risk factors. All hemodialysis patients (n=428) were interviewed in eight dialysis units in Goiânia city. Blood samples were collected and serum samples screened for anti-HCV antibodies by an enzyme-linked immunosorbent assay (ELISA). Positive samples were retested for confirmation with a line immunoassay (LIA). All samples were also tested for HCV RNA by PCR. An overall prevalence of 46.7% (95% CI: 42-51.5) was found, ranging from 20.7% (95% CI: 8.8-38.1) to 90.4% (95% CI: 79.9-96.4) depending on the dialysis unit. Of the 428 patients, 185 were found to be seropositive by ELISA, and 167 were confirmed positive by LIA, resulting in an anti-HCV prevalence of 39%. A total of 131 patients were HCV RNA-positive. HCV viremia was present in 63.5% of the anti-HCV-positive patients and in 10.3% of the anti-HCV-negative patients. Univariate analysis of risk factors showed that the number of previous blood transfusions, transfusion of blood before mandatory screening for anti-HCV, length of time on hemodialysis, and treatment in multiple units were associated with HCV positivity. However, multivariate analysis revealed that blood transfusion before screening for anti-HCV and length of time on hemodialysis were significantly associated with HCV infection in this population. These data suggest that nosocomial transmission may play a role in the spread of HCV in the dialysis units studied. In addition to anti-HCV screening, HCV RNA detection is necessary for the diagnosis of HCV infection in hemodialysis patients.
Abstract:
A hemodialysis population from a dialysis unit in the city of Recife, Northeastern Brazil, was screened to assess the prevalence of hepatitis C virus (HCV) infection and to investigate the associated risk factors. Hemodialysis patients (n = 250) were interviewed and serum samples tested for anti-HCV antibodies by enzyme-linked immunosorbent assay (ELISA). All samples were also tested for HCV RNA by reverse transcriptase nested polymerase chain reaction (RT-nested PCR). Of the 250 patients, 21 (8.4%) were found to be seropositive by ELISA, and 19 (7.6%) were HCV RNA-positive. HCV viraemia was present in 90.5% of the anti-HCV-positive patients. The predominant genotype was HCV 1a (8/19), followed by 3a (7/19) and 1b (4/19). None of the anti-HCV-negative patients were shown to be viraemic by PCR. Univariate analysis of risk factors showed that time spent on hemodialysis, the number of blood transfusions and a blood transfusion before November 1993 were associated with HCV positivity. However, multivariate analysis revealed that blood transfusion before November 1993 was significantly associated with HCV infection in this population. A low prevalence was encountered in this center; however, prospective studies are necessary to confirm these findings.
Abstract:
A survey was conducted among the hemodialysis units of the city of Campo Grande, located in the state of Mato Grosso do Sul in the Mid-west region of Brazil, with the aim of investigating the prevalence, risk factors, and genotypes of hepatitis C virus (HCV) infection. A total of 163 patients were interviewed in five dialysis units. Serum samples were screened for anti-HCV. Positive samples were tested for HCV RNA and genotyped. The prevalence of anti-HCV was 11% (95% CI: 6.8-17.1). A history of transfusion with blood that was not screened for anti-HCV and length of time on hemodialysis were associated with HCV infection. HCV RNA was detected in 12 samples: ten were of genotype 1, subtypes 1a (75%) and 1b (8.3%), and two were of genotype 3, subtype 3a (16.7%).
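Prevalence figures like the 11% (95% CI: 6.8-17.1) above are binomial proportions with a confidence interval. The abstracts do not state which interval method was used; a Wilson score interval, one common choice, is sketched below (its bounds will differ slightly from exact binomial intervals):

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 18 of 163 patients anti-HCV-positive (~11%); 18 is an illustrative count
lo, hi = wilson_ci(18, 163)
print(f"{lo:.3f}-{hi:.3f}")  # roughly 0.071-0.168
```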
Abstract:
Chagas heart disease (CHD) results from infection with the protozoan parasite Trypanosoma cruzi and is the leading cause of infectious myocarditis worldwide. It poses a substantial public health burden due to high morbidity and mortality. CHD is also the most serious and frequent manifestation of chronic Chagas disease and appears in 20-40% of infected individuals between 10-30 years after the original acute infection. In recent decades, numerous clinical and experimental investigations have shown that a low-grade but incessant parasitism, along with an accompanying immunological response [either parasite-driven (most likely) or autoimmune-mediated], plays an important role in producing myocardial damage in CHD. At the same time, primary neuronal damage and microvascular dysfunction have been described as ancillary pathogenic mechanisms. Conduction system disturbances, atrial and ventricular arrhythmias, congestive heart failure, systemic and pulmonary thromboembolism and sudden cardiac death are the most common clinical manifestations of chronic Chagas cardiomyopathy. Management of CHD aims to relieve symptoms, identify markers of unfavourable prognosis and treat those individuals at increased risk of disease progression or death. This article reviews the pathophysiology of myocardial damage, discusses the value of current risk stratification models and proposes an algorithm to guide mortality risk assessment and therapeutic decision-making in patients with CHD.
Abstract:
Clinical and laboratory risk factors for death from visceral leishmaniasis (VL) are relatively known, but quantitative real-time polymerase chain reaction (qPCR) might assess the role of parasite load in determining clinical outcome. The aim of this study was to identify risk factors, including parasite load in peripheral blood, for VL poor outcome among children. This prospective cohort study evaluated children aged ≤ 12 years old with VL diagnosis at three times: pre-treatment (T0), during treatment (T1) and post-treatment (T2). Forty-eight patients were included and 16 (33.3%) met the criteria for poor outcome. Age ≤ 12 months [relative risk (RR) 3.51; 95% confidence interval (CI) 1.89-6.52], tachydyspnoea (RR 3.46; 95% CI 2.19-5.47), bacterial infection (RR 3.08; 95% CI 1.27-7.48), liver enlargement (RR 3.00; 95% CI 1.44-6.23) and low serum albumin (RR 7.00; 95% CI 1.80-27.24) were identified as risk factors. qPCR was positive in all patients at T0 and the parasite DNA was undetectable in 76.1% of them at T1 and in 90.7% at T2. There was no statistical association between parasite load at T0 and poor outcome.
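The relative risks above compare the cumulative incidence of poor outcome between exposed and unexposed children, with a confidence interval computed on the log scale. A sketch with hypothetical counts (the abstract gives only the final estimates, not the underlying 2x2 tables):

```python
import math

def relative_risk(a: int, n_exp: int, c: int, n_unexp: int, z: float = 1.96):
    """RR and 95% CI: a cases among n_exp exposed, c cases among n_unexp unexposed."""
    rr = (a / n_exp) / (c / n_unexp)
    se = math.sqrt(1 / a - 1 / n_exp + 1 / c - 1 / n_unexp)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# hypothetical data: 8/12 infants <= 12 months vs 8/36 older children
rr, lo, hi = relative_risk(8, 12, 8, 36)
print(round(rr, 2))  # 3.0
```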
Abstract:
Few data are available on the prevalence and risk factors of Chlamydophila abortus infection in goats in Brazil. A cross-sectional study was carried out to determine the flock-level prevalence of C. abortus infection in goats from the semiarid region of Paraíba State, Northeast region of Brazil, as well as to identify risk factors associated with the infection. Flocks were randomly selected and a pre-established number of female goats > 12 months old were sampled in each of these flocks. A total of 975 serum samples from 110 flocks were collected, and a structured questionnaire focusing on risk factors for C. abortus infection was given to each farmer at the time of blood collection. For serological diagnosis, the complement fixation test (CFT) was performed using the C. abortus S26/3 strain as antigen. Flock-level factors for C. abortus prevalence were tested using a multivariate logistic regression model. Fifty-five flocks out of 110 presented at least one seropositive animal, an overall prevalence of 50.0% (95% CI: 40.3%, 59.7%). Ninety-one of the 975 dairy goats examined were seropositive with titers > 32, a frequency of 9.3%. Lending bucks for breeding (odds ratio = 2.35; 95% CI: 1.04-5.33) and a history of abortions (odds ratio = 3.06; 95% CI: 1.37-6.80) were associated with increased flock prevalence.
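The flock-level odds ratios above are cross-product ratios from 2x2 exposure-outcome tables (the multivariate model adjusts them further). A sketch with hypothetical cell counts, since the abstract reports only the final estimates:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR and 95% CI for a 2x2 table:
    a = exposed positive flocks, b = exposed negative,
    c = unexposed positive,      d = unexposed negative."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# hypothetical counts, not the study's data
or_, lo, hi = odds_ratio(20, 15, 35, 40)
```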
Abstract:
The survival of hemodialysis patients is likely to be influenced not only by well-known risk factors like age and comorbidity, but also by changes in dialysis technology and practices accumulated over time. We compared the survival curves, dialysis routines and some risk factors of two groups of patients admitted to a Brazilian maintenance hemodialysis program during two consecutive decades: March 1977 to December 1986 (group 1, N = 162) and January 1987 to June 1997 (group 2, N = 237). The median treatment time was 22 months (range 1-198). Survival curves were constructed using the Kaplan-Meier method and compared using the log-rank method. The Cox proportional hazards regression model was used to investigate the variables most strongly associated with outcome. The most important changes in dialysis routine and in patient care during the total period of observation were the progressive increase in the dose of dialysis delivered, the prohibition of potassium-free dialysate, the use of bicarbonate as a buffer and the upgrading of the dialysis equipment. There were no significant differences between the survival curves of the two groups. Survival rates at 1, 5 and 10 years were 84, 53 and 29%, respectively, for group 1 and 77, 42 and 21% for group 2. Patients in group 1 were younger (45.5 ± 15.2 vs 55.2 ± 15.9 years, P<0.001) and had a lower prevalence of diabetes (11.1 vs 27.4%, P<0.001) and of cardiovascular disease (9.3 vs 20.7%, P<0.001). According to the Cox multivariate model, only age (hazard ratio (HR) 1.04, confidence interval (CI) 1.03-1.05, P<0.001) and diabetes (HR 2.55, CI 1.82-3.58, P<0.001) were independent predictors of mortality for the whole group. Patients in group 2 had a lower prevalence of sudden death (19.1 vs 9.7%, P<0.001). After adjusting for age, diabetes and other mortality risk factors, the risk of death was 17% lower in group 2, although this difference was not statistically significant.
We conclude that the negative effects of advanced age and of a higher frequency of comorbidity on the survival of group 2 patients were probably offset by improvements in patient care and in the quality and dose of dialysis delivered, so that the survival curves did not change significantly over time.
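The Kaplan-Meier survival rates quoted above are products of conditional survival probabilities at each observed death time. A minimal pure-Python sketch on toy data (not the study's records):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator.
    times: follow-up durations; events: 1 = death, 0 = censored.
    Returns a list of (event_time, S(t)) pairs at each death time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        # group all subjects sharing this follow-up time
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk  # conditional survival at time t
            curve.append((t, s))
        at_risk -= removed
    return curve

# toy cohort: deaths at 2 and 5 months, censoring at 3 and 7
print(kaplan_meier([2, 3, 5, 7], [1, 0, 1, 0]))  # [(2, 0.75), (5, 0.375)]
```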
Abstract:
INTRODUCTION: Steroid-resistant idiopathic nephrotic syndrome (SRINS) in children is one of the leading causes of progression to chronic kidney disease stage V (CKD V)/end-stage renal disease (ESRD). OBJECTIVE: The aim of this retrospective study is to evaluate the efficacy of immunosuppressive drugs (IS) and to identify risk factors for progression to ESRD in this population. METHODS: Clinical and biochemical variables at presentation, early or late steroid resistance, histological pattern and response to cyclosporine A (CsA) and cyclophosphamide (CP) were reviewed in 136 children with SRINS. The analyzed outcome was progression to ESRD. Univariate as well as multivariate Cox regression analyses were performed. RESULTS: Median age at onset was 5.54 years (0.67-17.22) and median follow-up time was 6.1 years (0.25-30.83). Early steroid resistance was observed in 114 patients and late resistance in 22. Resistance to CP and CsA was 62.9% and 35%, respectively. At last follow-up, 57 patients had reached ESRD. The renal survival rate was 71.5%, 58.4%, 55.3%, 35.6% and 28.5% at 5, 10, 15, 20 and 25 years, respectively. Univariate analysis demonstrated that older age at onset, early steroid resistance, hematuria, hypertension, focal segmental glomerulosclerosis (FSGS), and resistance to IS were risk factors for ESRD. The Cox proportional-hazards regression identified CsA resistance and FSGS as the only predictors of ESRD. CONCLUSION: Our findings showed that CsA resistance and FSGS were risk factors for ESRD.
Abstract:
OBJECTIVE: To analyze the association between dietary patterns and oral cancer. METHODS: The study, part of a Latin American multicenter hospital-based case-control study, was conducted in São Paulo, Southeastern Brazil, between November 1998 and March 2002 and included 366 incident cases of oral cancer and 469 controls, frequency-matched with cases by sex and age. Dietary data were collected using a food frequency questionnaire. The risk associated with the intake of food groups defined a posteriori through factor analysis (called factors) was assessed. The first factor, labeled "prudent," was characterized by the intake of vegetables, fruit, cheese, and poultry. The second factor, "traditional," consisted of the intake of rice, pasta, pulses, and meat. The third factor, "snacks," was characterized by the intake of bread, butter, salami, cheese, cakes, and desserts. The fourth, "monotonous," was inversely associated with the intake of fruit, vegetables and most other food items. Factor scores for each retained component were calculated for cases and controls. After categorization of factor scores into tertiles according to the distribution of controls, odds ratios and 95% confidence intervals were calculated using unconditional multiple logistic regression. RESULTS: The "traditional" factor showed an inverse association with cancer (OR=0.51; 95% CI: 0.32; 0.81, p-value for trend 0.14), whereas the "monotonous" factor was positively associated with the outcome (OR=1.78; 95% CI: 1.78; 2.85, p-value for trend <0.001). CONCLUSIONS: The study data suggest that the traditional Brazilian diet, consisting of rice and beans plus moderate amounts of meat, may confer protection against oral cancer, independently of other risk factors such as alcohol intake and smoking.
Abstract:
OBJECTIVE To estimate the incidence and identify risk factors for intimate partner violence during postpartum. METHODS This prospective cohort study was conducted with women, aged between 18-49 years, enrolled in the Brazilian Family Health Strategy in Recife, Northeastern Brazil, between 2005 and 2006. Of the 1,057 women interviewed during pregnancy and postpartum, 539 women, who did not report violence before or during pregnancy, were evaluated. A theoretical-conceptual framework was built with three levels of factors hierarchically ordered: women’s and partners’ sociodemographic and behavioral characteristics, and relationship dynamics. Incidence and risk factors of intimate partner violence were estimated by Poisson regression. RESULTS The incidence of violence during postpartum was 9.3% (95%CI 7.0;12.0). Isolated psychological violence was the most common (4.3%; 95%CI 2.8;6.4). The overlapping of psychological with physical violence occurred in 3.3% (95%CI 2.0;5.3) and with physical and/or sexual violence in almost 2.0% (95%CI 0.8;3.0) of cases. The risk of partner violence during postpartum was increased for women with a low level of education (RR = 2.6; 95%CI 1.3;5.4), without their own income (RR = 1.7; 95%CI 1.0;2.9) and those who perpetrated physical violence against their partner without being assaulted first (RR = 2.0; 95%CI 1.2;3.4), had a very controlling partner (RR = 2.5; 95%CI 1.1;5.8), and had frequent fights with their partner (RR = 1.7; 95%CI 1.0;2.9). CONCLUSIONS The high incidence of intimate partner violence during postpartum and its association with the quality of the couple’s relationship demonstrate the need for public policies that promote conflict mediation and enable forms of empowerment for women to address the cycle of violence.
Abstract:
Results of an HIV prevalence study conducted in hemophiliacs from Belo Horizonte, Brazil, are presented. History of exposure to acellular blood components was determined for the five-year period prior to entry into the study, which occurred during 1986 and 1987. Patients with coagulation disorders (hemophilia A = 132, hemophilia B = 16, and coagulation disorders other than hemophilia = 16) were transfused with locally produced liquid cryoprecipitate, lyophilized cryoprecipitate imported from São Paulo (Brazil), and factor VIII and IX imported from Rio de Janeiro (Brazil), Europe, and the United States. Thirty-six (22%) tested HIV seropositive. Univariate and multivariate analyses (logistic model) demonstrated that the risk of HIV infection during the study period was associated with the total units of acellular blood components transfused. In addition, the proportional contribution of the individual components to the total acellular units transfused, namely an increase in the proportions of factor VIII/IX and lyophilized cryoprecipitate, was found to be associated with HIV seropositivity. This analysis suggests that not only was the total amount of units an important determinant of HIV infection, but the risk was also associated with the specific blood component transfused.
Abstract:
Data concerning HCV infection in Central Brazil are rare. Upon testing 2,350 voluntary blood donors from this region, we found anti-HCV prevalence rates of 2.2% by a second-generation ELISA and 1.4% after confirmation by a line immunoassay. Antibodies against the core, NS4, and NS5 antigens of HCV were detected in 81.8%, 72.7%, and 57.5%, respectively, of the positive samples in the line immunoassay. HCV viremia was present in 76.6% of the anti-HCV-positive blood donors. A relationship was observed between PCR positivity and serum reactivity in recognizing different HCV antigens in the line immunoassay. The majority of the positive donors had a history of previous parenteral exposure. While the combination of ALT > 50 IU/l and anti-HBc positivity does not appear to be a good surrogate marker for HCV infection, the use of both ALT and anti-HCV tests is indicated in the screening of Brazilian blood donors.
Abstract:
Schistosomiasis mansoni is widely spread in the Serrano village, municipality of Cururupu, state of Maranhão, Brazil. The PECE (Program for the Control of Schistosomiasis), under way since 1979, has reduced the prevalence of S. mansoni infection and of the hepatosplenic form of the disease. Nevertheless, although piped water is available in 84% of the households, prevalence remains above 20%. In order to identify other risk factors responsible for the persistence of high prevalence levels, a cross-sectional survey was carried out in a systematic sample of 294 people of varying ages. Socioeconomic, environmental and demographic variables, and water contact patterns were investigated. Fecal samples were collected and analyzed by the Kato-Katz technique. Prevalence of S. mansoni infection was 24.1%, higher among males (35.5%) and among those aged 10-19 years (36.6%). The risk factors identified in the univariable analysis were water contact for vegetable extraction (risk ratio - RR = 2.92), crossing streams (RR = 2.55), bathing (RR = 2.35), fishing (RR = 2.19), hunting (RR = 2.17), cattle breeding (RR = 2.04), manioc culture (RR = 1.90) and leisure (RR = 1.56). After controlling for confounding variables with a proportional hazards model, the risks remained higher for males, vegetable extraction, bathing in rivers and water contact in rivers or in periodically inundated parts of riverine woodland (swamplands).
Abstract:
This study was carried out in order to obtain baseline data concerning the epidemiology of American visceral leishmaniasis and Chagas disease in an indigenous population with whom the government is starting a dwelling improvement programme. Information was collected from 242 dwellings (1,440 people) by means of house-to-house interviews about socio-economic and environmental factors associated with Leishmania chagasi and Trypanosoma cruzi transmission risk. A leishmanin skin test was applied to 385 people, and 454 blood samples were collected on filter paper in order to detect L. chagasi antibodies by ELISA and IFAT and T. cruzi antibodies by ELISA. T. cruzi seroprevalence was 8.7% by ELISA; L. chagasi seroprevalence was 4.6% and 5.1% by IFAT and ELISA, respectively. ELISA sensitivity and specificity for L. chagasi antibodies were 57% and 97.5%, respectively, as compared to the IFAT. Leishmanin skin test positivity was 19%. L. chagasi infection prevalence, defined as a positive result in the three immunodiagnostic tests, was 17.1%. Additionally, 2.7% of the population studied was positive for both L. chagasi and T. cruzi, suggesting a possible cross-reaction. L. chagasi and T. cruzi seropositivity increased with age, while no association with gender was observed. Age (p<0.007), number of inhabitants (p<0.05), floor material (p<0.03) and recognition of the vector (p<0.01) were associated with T. cruzi infection, whilst age (p<0.007) and dwelling improvement (p<0.02) were associated with L. chagasi infection. It is necessary to evaluate the long-term impact of the dwelling improvement programme on these parasitic infections in this community.