910 results for High-income population


Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

This study aimed to determine the frequency of Chlamydia trachomatis (CT) infection among high-risk Brazilian women and to evaluate its association with vaginal flora patterns. This was a cross-sectional study performed in an outpatient clinic of Bauru State Hospital, São Paulo, Brazil. A total of 142 women were included from 2006 to 2008. Inclusion criteria were dyspareunia, pain during bimanual examination, excessive cervical mucus, cervical ectopy, or three or more episodes of abnormal vaginal flora (AVF) in the year before enrollment. Endocervical CT testing was performed by PCR. Vaginal swabs were collected for microscopic assessment of the microbial flora pattern. Gram-stained smears were classified as normal, intermediate, or bacterial vaginosis (BV), with recognition of Candida sp. morphotypes. Wet-mount smears were used for detection of Trichomonas vaginalis and aerobic vaginitis (AV). Thirty-four of 142 women (23.9%) tested positive for CT. AVF was found in 50 (35.2%) cases. The most frequent type of AVF was BV (17.6%). CT was strongly associated with the presence of AV (n = 7, 4.9%, P = 0.018), but not with BV (n = 25, 17.6%, P = 0.80) or intermediate flora (n = 18, 12.7%, P = 0.28). A high rate of chlamydial infection was found in this population. Chlamydia infection is associated with aerobic vaginitis.
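
The abstract reports P values for the association between CT and each flora pattern but does not name the test used. As a rough illustration only, the sketch below runs a Fisher's exact test (a common choice for sparse 2×2 tables) on hypothetical cell counts whose margins match the abstract (142 women, 34 CT-positive, 7 with AV); the interior counts are invented.

```python
# Illustrative only: hypothetical 2x2 table for CT infection vs aerobic vaginitis (AV).
# Margins match the abstract (34 CT-positive of 142 women, 7 women with AV);
# the interior cell counts are invented for the sketch.
from scipy.stats import fisher_exact

#            AV present  AV absent
table = [[5,          29],   # CT positive (hypothetical split)
         [2,         106]]   # CT negative (hypothetical split)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```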

Relevance:

100.00%

Publisher:

Abstract:

The anesthesia-related cardiac arrest (CA) rate is a quality indicator for improving patient safety in the perioperative period. A systematic review with meta-analysis of the worldwide literature on anesthesia-related CA rates has not yet been performed. This study aimed to analyze global data on anesthesia-related and perioperative CA rates according to a country's Human Development Index (HDI) and over time. In addition, we compared anesthesia-related and perioperative CA rates in low- and high-income countries in 2 time periods. A systematic review was performed using electronic databases to identify studies reporting anesthesia-related and/or perioperative CA rates in patients undergoing anesthesia. Meta-regression and proportional meta-analysis were performed with 95% confidence intervals (CIs) to evaluate global data on anesthesia-related and perioperative CA rates according to a country's HDI and over time, and to compare anesthesia-related and perioperative CA rates by HDI status (low HDI vs high HDI) and by time period (pre-1990s vs 1990s-2010s), respectively. Fifty-three studies from 21 countries assessing 11.9 million anesthetic administrations were included. Meta-regression showed that anesthesia-related (slope: -3.5729; 95% CI: -6.6306 to -0.5152; P = 0.024) and perioperative (slope: -2.4071; 95% CI: -4.0482 to -0.7659; P = 0.005) CA rates decreased with increasing HDI, but not with time. Meta-analysis showed that, per 10,000 anesthetics, anesthesia-related and perioperative CA rates declined in high-HDI countries (from 2.3 [95% CI: 1.2-3.7] before the 1990s to 0.7 [95% CI: 0.5-1.0] in the 1990s-2010s, P < 0.001; and from 8.1 [95% CI: 5.1-11.9] before the 1990s to 6.2 [95% CI: 5.1-7.4] in the 1990s-2010s, P < 0.001, respectively). In low-HDI countries, anesthesia-related CA rates did not change significantly (9.2 [95% CI: 2.0-21.7] before the 1990s to 4.5 [95% CI: 2.4-7.2] in the 1990s-2010s, P = 0.14), whereas perioperative CA rates increased significantly (16.4 [95% CI: 1.5-47.1] before the 1990s to 19.9 [95% CI: 10.9-31.7] in the 1990s-2010s, P = 0.03). Both anesthesia-related and perioperative CA rates decrease with increasing HDI but not with time. There is a clear and consistent reduction in anesthesia-related and perioperative CA rates in high-HDI countries, but an increase in perioperative CA rates without significant change in anesthesia-related CA rates in low-HDI countries when comparing the 2 time periods.
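
For readers unfamiliar with meta-regression of event rates on a country-level covariate, here is a minimal sketch in Python using statsmodels. The five "studies" and the weighting by study size are invented assumptions; the review's actual models and data are not reproduced here.

```python
# Minimal meta-regression sketch: per-10,000 cardiac-arrest rates regressed on HDI,
# weighting larger studies more heavily. All numbers below are invented placeholders.
import numpy as np
import statsmodels.api as sm

events = np.array([12, 8, 30, 5, 22])                        # hypothetical CA counts
n_anaesthetics = np.array([60_000, 90_000, 40_000, 70_000, 25_000])
hdi = np.array([0.92, 0.89, 0.55, 0.95, 0.48])               # hypothetical country HDI

rate_per_10k = events / n_anaesthetics * 10_000
weights = n_anaesthetics                                     # simplistic precision proxy

X = sm.add_constant(hdi)
fit = sm.WLS(rate_per_10k, X, weights=weights).fit()
print(fit.params)   # a negative slope means lower CA rates at higher HDI
```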

Relevance:

100.00%

Publisher:

Abstract:

The lower tuberculosis incidence reported in human immunodeficiency virus (HIV)-positive individuals receiving combined antiretroviral therapy (cART) is difficult to interpret causally. Furthermore, the role of unmasking immune reconstitution inflammatory syndrome (IRIS) is unclear. We aim to estimate the effect of cART on tuberculosis incidence in HIV-positive individuals in high-income countries.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Highly active antiretroviral therapy (HAART) is being scaled up in developing countries. We compared baseline characteristics and outcomes during the first year of HAART between HIV-1-infected patients in low-income and high-income settings. METHODS: 18 HAART programmes in Africa, Asia, and South America (low-income settings) and 12 HIV cohort studies from Europe and North America (high-income settings) provided data on 4810 and 22,217 treatment-naive adult patients starting HAART, respectively. All patients from high-income settings and 2725 (57%) patients from low-income settings were actively followed up and included in survival analyses. FINDINGS: Compared with high-income countries, patients starting HAART in low-income settings had lower CD4 cell counts (median 108 cells per μL vs 234 cells per μL), were more likely to be female (51% vs 25%), and were more likely to start treatment with a non-nucleoside reverse transcriptase inhibitor (NNRTI) (70% vs 23%). At 6 months, the median number of CD4 cells gained (106 cells per μL vs 103 cells per μL) and the percentage of patients reaching HIV-1 RNA levels lower than 500 copies/mL (76% vs 77%) were similar. Mortality was higher in low-income settings (124 deaths during 2236 person-years of follow-up) than in high-income settings (414 deaths during 20,532 person-years). The adjusted hazard ratio (HR) of mortality comparing low-income with high-income settings fell from 4.3 (95% CI 1.6-11.8) during the first month to 1.5 (0.7-3.0) during months 7-12. The provision of treatment free of charge in low-income settings was associated with lower mortality (adjusted HR 0.23; 95% CI 0.08-0.61). INTERPRETATION: Patients starting HAART in resource-poor settings have increased mortality rates in the first months on therapy, compared with those in developed countries. Timely diagnosis and assessment of treatment eligibility, coupled with free provision of HAART, might reduce this excess mortality.
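
The mortality comparison above rests on an adjusted proportional-hazards model. Below is a minimal sketch of that kind of model using the lifelines library on simulated data; the covariates, effect sizes, and follow-up times are invented, and the real analysis adjusted for more factors than shown here.

```python
# Sketch of a Cox proportional-hazards comparison of mortality by income setting.
# All data are simulated placeholders; only the structure of the analysis is illustrated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
low_income = rng.integers(0, 2, n)                          # 1 = low-income setting
baseline_cd4 = rng.normal(180 - 60 * low_income, 50, n)     # lower CD4 in low-income
months = rng.exponential(24 / (1 + 0.8 * low_income), n)    # follow-up time
death = rng.integers(0, 2, n)                               # 1 = death observed

df = pd.DataFrame({"months": months, "death": death,
                   "low_income": low_income, "baseline_cd4": baseline_cd4})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")       # other columns = covariates
cph.print_summary()                                         # exp(coef) is the hazard ratio
```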

Relevance:

100.00%

Publisher:

Abstract:

We examined the incidence of and risk factors for tuberculosis during the first year of highly active antiretroviral therapy in low-income (4540 patients) and high-income (22,217 patients) countries. Although incidence was much higher in low-income countries, the reduction in the incidence of tuberculosis associated with highly active antiretroviral therapy was similar: the rate ratio for months 7-12 versus months 1-3 was 0.48 (95% confidence interval, 0.36-0.64) in low-income countries and 0.36 (95% confidence interval, 0.26-0.50) in high-income countries. A low CD4 cell count at the start of therapy was the most important risk factor in both settings.
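
The rate ratios quoted above compare incidence in two follow-up windows. A back-of-envelope version of that calculation, with a Wald confidence interval on the log scale, is sketched below; the event and person-year counts are hypothetical, not the cohort data.

```python
# Incidence rate ratio with a Wald 95% CI, comparing months 7-12 with months 1-3.
# Counts and person-years below are hypothetical placeholders.
import numpy as np

events_late, py_late = 40, 2000.0      # months 7-12 (hypothetical)
events_early, py_early = 80, 1900.0    # months 1-3 (hypothetical)

rr = (events_late / py_late) / (events_early / py_early)
se_log_rr = np.sqrt(1 / events_late + 1 / events_early)
lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(f"rate ratio = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```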

Relevance:

100.00%

Publisher:

Abstract:

Background Few data exist on tuberculosis (TB) incidence according to time from HIV seroconversion in high-income countries, or on whether rates following initiation of combination antiretroviral treatment (cART) differ from those soon after seroconversion. Methods Data on individuals with well-estimated dates of HIV seroconversion were used to analyse post-seroconversion TB rates, ending at the earliest of 1 January 1997, death, or last clinic visit. TB rates were also estimated following cART initiation, ending at the earliest of death or last clinic visit. Poisson models were used to examine the effect of current and past levels of immunosuppression on TB risk after cART initiation. Results Of 19 815 individuals at risk during 1982–1996, TB incidence increased from 5.89/1000 person-years (PY) (95% CI 3.77 to 8.76) in the first year after seroconversion to 10.56 (4.83 to 20.04, p=0.01) at 10 years. Among 11 178 TB-free individuals initiating cART, the TB rate in the first year after cART initiation was 4.23/1000 PY (3.07 to 5.71) and dropped thereafter, remaining constant from year 2 onwards and averaging 1.64/1000 PY (1.29 to 2.05). Current CD4 count was inversely associated with TB rates, whereas nadir CD4 count was not associated with TB rates after adjustment for current CD4 count and HIV-RNA at cART initiation. Conclusions TB risk increases with duration of HIV infection in the absence of cART. Following cART initiation, TB incidence rates were lower than levels immediately following seroconversion. Implementation of current recommendations to prevent TB in early HIV infection could be beneficial.
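
The Poisson models mentioned in the Methods are rate models with person-time as an offset. The sketch below shows that structure with statsmodels on simulated data; the variable names and effect sizes are assumptions, and the published analysis adjusted for more than the two CD4 terms shown.

```python
# Poisson rate model with log(person-years) offset, as a structural sketch of the
# TB-rate analysis. All rows are simulated; coefficients have no scientific meaning.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "tb_cases": rng.poisson(0.02, n),             # events per person
    "person_years": rng.uniform(0.5, 5.0, n),     # follow-up time
    "current_cd4": rng.normal(450, 150, n),
    "nadir_cd4": rng.normal(200, 100, n),
})

fit = smf.glm("tb_cases ~ current_cd4 + nadir_cd4", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
print(fit.summary())   # exponentiated coefficients are rate ratios
```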

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE To describe the CD4 cell count at the start of combination antiretroviral therapy (cART) in low-income (LIC), lower middle-income (LMIC), upper middle-income (UMIC), and high-income (HIC) countries. METHODS Patients aged 16 years or older starting cART in a clinic participating in a multicohort collaboration spanning 6 continents (International epidemiological Databases to Evaluate AIDS and ART Cohort Collaboration) were eligible. Multilevel linear regression models were adjusted for age, gender, and calendar year; missing CD4 counts were imputed. RESULTS In total, 379,865 patients from 9 LIC, 4 LMIC, 4 UMIC, and 6 HIC were included. In LIC, the median CD4 cell count at cART initiation increased by 83% from 80 to 145 cells/μL between 2002 and 2009. Corresponding increases in LMIC, UMIC, and HIC were from 87 to 155 cells/μL (76% increase), 88 to 135 cells/μL (53%), and 209 to 274 cells/μL (31%). In 2009, compared with LIC, median counts were 13 cells/μL [95% confidence interval (CI): -56 to +30] lower in LMIC, 22 cells/μL (-62 to +18) lower in UMIC, and 112 cells/μL (+75 to +149) higher in HIC. They were 23 cells/μL (95% CI: +18 to +28 cells/μL) higher in women than men. Median counts were 88 cells/μL (95% CI: +35 to +141 cells/μL) higher in countries with an estimated national cART coverage >80%, compared with countries with <40% coverage. CONCLUSIONS Median CD4 cell counts at the start of cART increased 2000-2009 but remained below 200 cells/μL in LIC and MIC and below 300 cells/μL in HIC. Earlier start of cART will require substantial efforts and resources globally.
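
The multilevel linear regression described here can be sketched as a mixed-effects model with a random intercept per country. The simulated data, variable names, and the omission of the multiple-imputation step are all simplifications for illustration.

```python
# Mixed-effects sketch: CD4 at cART start modelled on year, age and sex,
# with a random intercept for country. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "cd4": rng.normal(180, 90, n).clip(0),
    "year": rng.integers(2002, 2010, n),
    "age": rng.normal(35, 10, n),
    "female": rng.integers(0, 2, n),
    "country": rng.integers(0, 20, n),     # 20 hypothetical countries
})

fit = smf.mixedlm("cd4 ~ year + age + female", data=df, groups=df["country"]).fit()
print(fit.summary())
```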

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND The CD4 cell count or percent (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower middle-income, 4 upper middle-income countries, and 1 high-income country (United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures for lower middle-income countries were from 77% to 66% and for upper middle-income countries from 75% to 58%. In the United States, the percentage decreased from 42% to 19% during the period 1996 to 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent morbidity and mortality associated with immunodeficiency must remain a global public health priority.
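
The trend estimates above come from generalized additive mixed models. As a much simpler stand-in, the sketch below fits a logistic model with a spline in calendar year to simulated data; it omits the random effects and the multiple imputation used in the actual analysis, so it only illustrates the idea of a smooth calendar-year trend.

```python
# Simplified stand-in for a smooth calendar-year trend in the prevalence of severe
# immunodeficiency at cART start: logistic regression with a B-spline in year.
# Random effects and multiple imputation are omitted; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 3000
year = rng.integers(2004, 2011, n)                         # 2004-2010
age = rng.uniform(0, 16, n)
p_severe = np.clip(0.76 - 0.02 * (year - 2004), 0.05, 0.95)
severe = rng.binomial(1, p_severe)

df = pd.DataFrame({"severe": severe, "year": year, "age": age})
fit = smf.logit("severe ~ bs(year, df=3) + age", data=df).fit(disp=False)
print(fit.predict(pd.DataFrame({"year": [2004, 2010], "age": [5.0, 5.0]})))
```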

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Recommendations have differed nationally and internationally with respect to the best time to start antiretroviral therapy (ART). We compared effectiveness of three strategies for initiation of ART in high-income countries for HIV-positive individuals who do not have AIDS: immediate initiation, initiation at a CD4 count less than 500 cells per μL, and initiation at a CD4 count less than 350 cells per μL. METHODS We used data from the HIV-CAUSAL Collaboration of cohort studies in Europe and the USA. We included 55 826 individuals aged 18 years or older who were diagnosed with HIV-1 infection between January, 2000, and September, 2013, had not started ART, did not have AIDS, and had CD4 count and HIV-RNA viral load measurements within 6 months of HIV diagnosis. We estimated relative risks of death and of death or AIDS-defining illness, mean survival time, the proportion of individuals in need of ART, and the proportion of individuals with HIV-RNA viral load less than 50 copies per mL, as would have been recorded under each ART initiation strategy after 7 years of HIV diagnosis. We used the parametric g-formula to adjust for baseline and time-varying confounders. FINDINGS Median CD4 count at diagnosis of HIV infection was 376 cells per μL (IQR 222-551). Compared with immediate initiation, the estimated relative risk of death was 1·02 (95% CI 1·01-1·02) when ART was started at a CD4 count less than 500 cells per μL, and 1·06 (1·04-1·08) with initiation at a CD4 count less than 350 cells per μL. Corresponding estimates for death or AIDS-defining illness were 1·06 (1·06-1·07) and 1·20 (1·17-1·23), respectively. Compared with immediate initiation, the mean survival time at 7 years with a strategy of initiation at a CD4 count less than 500 cells per μL was 2 days shorter (95% CI 1-2) and at a CD4 count less than 350 cells per μL was 5 days shorter (4-6). 7 years after diagnosis of HIV, 100%, 98·7% (95% CI 98·6-98·7), and 92·6% (92·2-92·9) of individuals would have been in need of ART with immediate initiation, initiation at a CD4 count less than 500 cells per μL, and initiation at a CD4 count less than 350 cells per μL, respectively. Corresponding proportions of individuals with HIV-RNA viral load less than 50 copies per mL at 7 years were 87·3% (87·3-88·6), 87·4% (87·4-88·6), and 83·8% (83·6-84·9). INTERPRETATION The benefits of immediate initiation of ART, such as prolonged survival and AIDS-free survival and increased virological suppression, were small in this high-income setting with relatively low CD4 count at HIV diagnosis. The estimated beneficial effect on AIDS is less than in recently reported randomised trials. Increasing rates of HIV testing might be as important as a policy of early initiation of ART. FUNDING National Institutes of Health.
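
The parametric g-formula used here handles time-varying treatment and confounding over seven years, which is well beyond a short sketch. The code below shows only the single-time-point standardisation idea at its core: fit an outcome model, predict for everyone under each strategy, and average. Everything in it (data, variables, effect sizes) is invented.

```python
# Single-time-point standardisation (the core idea behind the parametric g-formula):
# fit an outcome model, then average predictions with treatment set for everyone.
# All data and effects are simulated; this is not the study's time-varying analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 5000
cd4 = rng.normal(376, 150, n)
art = rng.integers(0, 2, n)                                  # 1 = immediate ART
p = 1 / (1 + np.exp(2.0 + 0.004 * (cd4 - 376) + 0.3 * art))  # hypothetical risk model
event = rng.binomial(1, p)                                   # death or AIDS-defining illness

df = pd.DataFrame({"cd4": cd4, "art": art, "event": event})
outcome_model = smf.logit("event ~ art + cd4", data=df).fit(disp=False)

risk_immediate = outcome_model.predict(df.assign(art=1)).mean()
risk_deferred = outcome_model.predict(df.assign(art=0)).mean()
print(f"standardised risk ratio ~ {risk_immediate / risk_deferred:.2f}")
```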

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared 3 CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12 month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
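
Inverse-probability weighting, as used above, reweights individuals by the inverse probability of the strategy they actually followed. The sketch below collapses the problem to a single baseline choice between two monitoring frequencies on simulated data, which is far simpler than the study's time-varying design; all names and numbers are assumptions.

```python
# Inverse-probability-weighting sketch: compare virologic failure risk between a
# 6-monthly and a 3-monthly monitoring strategy, weighting by the inverse probability
# of the observed strategy given baseline CD4. Simulated data, single baseline choice.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 4000
cd4 = rng.normal(500, 150, n)
p_six = 1 / (1 + np.exp(-(cd4 - 500) / 200))           # higher CD4 -> less frequent checks
six_monthly = rng.binomial(1, p_six)
p_fail = np.clip(0.08 - 0.01 * six_monthly + 0.00005 * (500 - cd4), 0.01, 0.99)
failure = rng.binomial(1, p_fail)

df = pd.DataFrame({"cd4": cd4, "six_monthly": six_monthly, "failure": failure})

# Model the probability of the observed strategy given baseline CD4, then weight.
ps = smf.logit("six_monthly ~ cd4", data=df).fit(disp=False).predict(df)
weights = np.where(df["six_monthly"] == 1, 1 / ps, 1 / (1 - ps))

mask = df["six_monthly"] == 1
risk_6m = np.average(df.loc[mask, "failure"], weights=weights[mask])
risk_3m = np.average(df.loc[~mask, "failure"], weights=weights[~mask])
print(f"weighted risk ratio (6-monthly vs 3-monthly) ~ {risk_6m / risk_3m:.2f}")
```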

Relevance:

100.00%

Publisher:

Abstract:

Background: Early detection of melanoma has been encouraged in Queensland for many years, yet little is known about the patterns of detection and the way in which they relate to tumor thickness. Objective: Our purpose was to describe current patterns of melanoma detection in Queensland. Methods: This was a population-based study, comprising 3772 Queensland residents diagnosed with a histologically confirmed melanoma between 2000 and 2003. Results: Almost half (44.0%) of the melanomas were detected by the patients themselves, with physicians detecting one fourth (25.3%) and partners one fifth (18.6%). Melanomas detected by doctors were more likely to be thin (<0.75 mm) than those detected by the patient or other layperson. Melanomas detected during a deliberate skin examination were thinner than those detected incidentally. Limitations: Although a participation rate of 78% was achieved, as in any survey, nonresponse bias cannot be completely excluded, and the ability of the results to be generalized to other geographical areas is unknown. Conclusion: There are clear differences in the depth distribution of melanoma in terms of method of detection and who detects the lesions; these differences are consistent with, but do not automatically lead to, the conclusion that promoting active methods of detection may be beneficial. (J Am Acad Dermatol 2006;54:783-92.)

Relevance:

100.00%

Publisher:

Abstract:

Cadmium has been widely used in various industries for the past fifty years, with current world production standing at around 16,755 tonnes per year. Very little cadmium is ever recycled and the ultimate fate of all cadmium is the environment. In view of reports that cadmium in the environment is increasing, this thesis aims to identify population groups 'at risk' of receiving dietary intakes of cadmium up to or above the current Food and Agricultural Organisation/World Health Organisation maximum tolerable intake of 70 μg/day. The study involves the investigation of one hundred households (260 individuals) who grow a large proportion of their vegetable diet in garden soils in the Borough of Walsall, part of an urban/industrial area in the United Kingdom. Measurements were made of the cadmium levels in atmospheric deposition, soil, house dust, diet and urine from the participants. Atmospheric deposition of cadmium was found to be comparable with other urban/industrial areas in the European Community, with deposition rates as high as 209 g ha⁻¹ yr⁻¹. The garden soils of the study households were found to contain up to 33 mg kg⁻¹ total cadmium, eleven times the highest level usually found in agricultural soils. Dietary intakes of cadmium by the residents from food were calculated to be as high as 68 μg/day. It is suggested that with intakes from other sources, such as air, adventitious ingestion, smoking and occupational exposure, total intakes of cadmium may reach or exceed the FAO/WHO limit. Urinary excretion of cadmium amongst a non-smoking, non-occupationally exposed sub-group of the study population was found to be significantly higher than that of a similar urban population who did not rely on home-produced vegetables. The results from this research indicate that present levels of cadmium in urban/industrial areas can increase dietary intakes and body burdens of cadmium. As cadmium serves no useful biological function and has been found to be highly toxic, it is recommended that policy measures to reduce human exposure on the European scale be considered.
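
The thesis argument about total exposure is essentially additive arithmetic against the 70 μg/day FAO/WHO ceiling. The sketch below reproduces that arithmetic with the dietary figure from the abstract and invented placeholder values for the other exposure routes.

```python
# Back-of-envelope total cadmium intake vs the FAO/WHO maximum tolerable intake.
# The dietary figure comes from the abstract; the other routes are hypothetical.
FAO_WHO_LIMIT_UG_PER_DAY = 70.0
dietary = 68.0                       # upper dietary estimate reported above (ug/day)
other_routes = {"inhalation": 0.5, "adventitious_ingestion": 2.0, "smoking": 3.0}

total = dietary + sum(other_routes.values())
verdict = "exceeds" if total > FAO_WHO_LIMIT_UG_PER_DAY else "is below"
print(f"estimated total intake {total:.1f} ug/day {verdict} "
      f"the {FAO_WHO_LIMIT_UG_PER_DAY:.0f} ug/day limit")
```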