913 results for Constant Relative Risk Aversion


Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To assess the influence of recipient's and donor's factors as well as surgical events on the occurrence of reperfusion injury after lung transplantation. DESIGN AND SETTING: Retrospective study in the surgical intensive care unit (ICU) of a university hospital. METHODS: We collected data on 60 lung transplantation donor/recipient pairs from June 1993 to May 2001, and compared the demographic, peri- and postoperative variables of patients who experienced reperfusion injury (35%) and those who did not. RESULTS: The occurrence of high systolic pulmonary pressure immediately after transplantation and/or its persistence during the first 48 h after surgery was associated with reperfusion injury, independently of preoperative values. Reperfusion injury was associated with difficult hemostasis during transplantation (p = 0.03). Patients with reperfusion injury were more likely to require the administration of catecholamines during the first 48 h after surgery (p = 0.014). Extubation was delayed (p = 0.03) and the relative odds of ICU mortality were significantly greater (OR 4.8, 95% CI: 1.06, 21.8) in patients with reperfusion injury. Our analysis confirmed that preexisting pulmonary hypertension increased the incidence of reperfusion injury (p < 0.01). CONCLUSIONS: Difficulties in perioperative hemostasis were associated with reperfusion injury. Occurrence of reperfusion injury was associated with postoperative systolic pulmonary hypertension, longer mechanical ventilation and higher mortality. Whether early recognition and treatment of pulmonary hypertension during transplantation can prevent the occurrence of reperfusion injury needs to be investigated.


BACKGROUND AND PURPOSE: Time delays from stroke onset to arrival at the hospital are the main obstacle to widespread use of thrombolysis. To decrease these delays, educational campaigns try to inform the general public how to act optimally in case of stroke. To determine the content of such a campaign, we assessed stroke knowledge in our population. METHODS: Stroke knowledge was assessed by means of a closed-ended questionnaire administered to 422 randomly chosen inhabitants of Bern, Switzerland. RESULTS: Knowledge of stroke warning signs (WS) was classified as good in 64.7%. Good knowledge of stroke risk factors (RF) was noted in 6.4%, and 4.2% knew both the WS and the RF of stroke, indicating a very good global knowledge of stroke. Only 8.3% recognized TIA as symptoms of stroke resolving within 24 hours, and only 2.8% identified TIA as a disease requiring immediate medical help. In multivariate analysis, female sex, advancing age, and having an afflicted relative were associated with good knowledge of WS (p = 0.048, p < 0.001 and p = 0.043, respectively). Good knowledge of RF was related to university education (p < 0.001). Good knowledge of TIA did not depend on age, sex, level of education or having an afflicted relative. CONCLUSIONS: The study brings to light relevant deficits of stroke knowledge in our population. Only a small number of participants could recognize TIA as stroke-related symptoms resolving completely within 24 hours. Only a third of the surveyed persons would seek immediate medical help in case of TIA. The information obtained will be used in the development of future educational campaigns.


Standard procedures for forecasting flood risk (Bulletin 17B) assume annual maximum flood (AMF) series are stationary, meaning the distribution of flood flows is not significantly affected by climatic trends/cycles, or anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on planning and management of water resources and relevant infrastructure. Research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology which yields forecasts of flood risk that reflect climatic influences on flood magnitude. To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns—including El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO)—are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, results do indicate that temperature is highly correlated with the timing of flood peaks. 
Despite consideration of watersheds classified as unimpaired, analyses also suggest that identified change-points in AMF series are due to dam construction, and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time. Results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g., AMO, ENSO, NAO, and PDO) with lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern, as well as possible impacts of coupling of the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and an improved understanding of physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal based statistical model, incorporating both climatic variations and human impacts, for flood risk over longer planning horizons (e.g., 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
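The regression step described above can be sketched in a few lines: model the mean (and spread) of the log-transformed flows as a linear function of lagged climate indices, then convert a forecast climate state into a flood quantile. All numbers below are simulated stand-ins for one station's record, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for one station: 60 years of annual maximum
# floods (log scale) driven by lagged ENSO and AMO index values.
n_years = 60
enso = rng.normal(0.0, 1.0, n_years)
amo = rng.normal(0.0, 0.3, n_years)
log_q = 6.0 + 0.4 * enso - 0.2 * amo + rng.normal(0.0, 0.2, n_years)

# Site-specific multivariate regression: conditional mean of the
# log-flows as a linear function of the climate indices.
X = np.column_stack([np.ones(n_years), enso, amo])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
sigma = (log_q - X @ beta).std(ddof=3)  # residual standard deviation

# One-year-ahead forecast of the 100-year flood (1% annual exceedance)
# for a forecast climate state, assuming conditionally log-normal
# flows; 2.326 is the standard normal 99th percentile.
x_next = np.array([1.0, 1.2, -0.1])
q100 = float(np.exp(x_next @ beta + 2.326 * sigma))
```

The same machinery extends to modeling the standard deviation itself as a function of the indices, which is the second half of the proposed Bulletin 17B modification.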


BACKGROUND: At a mean follow-up of 3.1 years, twenty-seven consecutive repairs of massive rotator cuff tears yielded good and excellent clinical results despite a retear rate of 37%. Patients with a retear had improvement over the preoperative state, but those with a structurally intact repair had a substantially better result. The purpose of this study was to reassess the same patients to determine the long-term functional and structural results. METHODS: At a mean follow-up interval of 9.9 years, twenty-three of the twenty-seven patients returned for a review and were examined clinically, radiographically, and with magnetic resonance imaging with use of a methodology identical to that used at 3.1 years. RESULTS: Twenty-two of the twenty-three patients remained very satisfied or satisfied with the result. The mean subjective shoulder value was 82% (compared with 80% at 3.1 years). The mean relative Constant score was 85% (compared with 83% at 3.1 years). The retear rate was 57% at 9.9 years (compared with 37% at 3.1 years; p = 0.168). Patients with an intact repair had a better result than those with a failed reconstruction with respect to the mean absolute Constant score (81 compared with 64 points, respectively; p = 0.015), mean relative Constant score (95% and 77%; p = 0.002), and mean strength of abduction (5.5 and 2.6 kg; p = 0.007). The mean retear size had increased from 882 to 1164 mm² (p = 0.016). Supraspinatus and infraspinatus muscle fatty infiltration had increased (p = 0.004 and 0.008, respectively). Muscles with torn tendons preoperatively showed more fatty infiltration than muscles with intact tendons preoperatively, regardless of repair integrity. Shoulders with a retear had a significantly higher mean acromion index than those without retear (0.75 and 0.65, respectively; p = 0.004).
CONCLUSIONS: Open repair of massive rotator cuff tears yielded clinically durable, excellent results with high patient satisfaction at a mean of almost ten years postoperatively. Conversely, fatty muscle infiltration of the supraspinatus and infraspinatus progressed, and the retear size increased over time. The preoperative integrity of the tendon appeared to be protective against muscle deterioration. A wide lateral extension of the acromion was identified as a previously unknown risk factor for retearing.


Osteotomies of the proximal femur for hip joint conditions are normally done at the intertrochanteric or subtrochanteric level. Intra-articular osteotomies would be more direct and therefore allow a more powerful correction with little or no undesired side correction. However, concerns about the risk of vascular damage and osteonecrosis of the femoral head have so far effectively excluded this technique from practical use. Based on detailed knowledge of the vascular anatomy of the proximal femur, an approach to safely dislocate the femoral head has been described and successfully performed. Experience as well as further studies of femoral head perfusion allowed a substantial extension of this approach, with subperiosteal exposure of the circumference of the femoral neck under constant intraoperative control of the blood supply to the head. Using the extended retinacular soft-tissue flap, four surgical techniques (relative neck lengthening, subcapital realignment in slipped capital femoral epiphysis, true femoral neck osteotomy, and femoral head reduction osteotomy) evolved or became safer with respect to perfusion of the femoral head. The extended retinacular soft-tissue flap offers the technical and biologic possibility for a new class of intra-articular procedures. Although meticulous execution of the surgical steps is important, the procedures have a high level of safety for femoral head perfusion.


Contracts paying a guaranteed minimum rate of return and a fraction of a positive excess rate, which is specified relative to a benchmark portfolio, are closely related to unit-linked life-insurance products and can be considered as alternatives to direct investment in the underlying benchmark. They contain an embedded power option, and the key issue is the tractable and realistic hedging of this option, in order to rigorously justify valuation by arbitrage arguments and prevent the guarantees from becoming uncontrollable liabilities to the issuer. We show how to determine the contract parameters conservatively and implement robust risk-management strategies.
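As a simplified illustration (not the paper's valuation): if the excess over the guarantee were shared linearly on prices rather than rates, the contract would decompose into a zero-coupon bond plus call options struck at the guarantee level, which can be priced in a constant-volatility Black-Scholes world. The power-option feature of the actual contract arises precisely because the sharing is specified on rates; all parameter values below are invented:

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def contract_value(s0, g, alpha, sigma, r, t):
    """Present value of a one-period contract paying the guaranteed
    amount s0*exp(g*t) plus a fraction alpha of the benchmark's
    positive excess over that guarantee: a bond plus alpha call
    options struck at the guarantee level (Black-Scholes assumed)."""
    k = s0 * math.exp(g * t)  # guarantee level = strike
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    call = s0 * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)
    bond = k * math.exp(-r * t)  # discounted guarantee
    return bond + alpha * call

value = contract_value(s0=100.0, g=0.01, alpha=0.8, sigma=0.2, r=0.03, t=1.0)
```

Setting the participation fraction alpha conservatively keeps the embedded option's value, and hence the issuer's hedging cost, bounded, which is the spirit of the paper's parameter-selection argument.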


Meat and meat products can be contaminated with different species of bacteria resistant to various antimicrobials. The human health risk posed by emerging antimicrobial resistance in a given type of meat or meat product depends on (i) the prevalence of contamination with resistant bacteria, (ii) the human health consequences of an infection with a specific bacterium resistant to a specific antimicrobial, and (iii) the consumption volume of the product. The objective of this study was to compare the risk for consumers arising from their exposure to antibiotic-resistant bacteria from meat of four different types (chicken, pork, beef and veal), distributed in four different product categories (fresh meat, frozen meat, dried raw meat products and heat-treated meat products). A semi-quantitative risk assessment model, evaluating each food chain step, was built in order to obtain an estimated score for the prevalence of Campylobacter spp., Enterococcus spp. and Escherichia coli in each product category. To assess human health impact, nine combinations of bacterial species and antimicrobial agents were considered based on a published risk profile. The combination of the prevalence at retail, the human health impact and the amount of meat or product consumed provided the relative proportion of total risk attributed to each category of product, resulting in a high, medium or low human health risk. According to the results of the model, chicken (mostly fresh and frozen meat) contributed 6.7% of the overall risk in the highest category and pork (mostly fresh meat and dried raw meat products) contributed 4.0%. The contributions of beef and veal were 0.4% and 0.1%, respectively. The results were tested and discussed for single-parameter changes of the model. This risk assessment was a useful tool for targeting antimicrobial resistance monitoring to those meat product categories where the expected risk for public health was greatest.
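The structure of the semi-quantitative model, which combines prevalence at retail, human health impact, and consumption volume into a relative risk share per product category, can be sketched as follows. The scores are invented placeholders on a 0-1 scale, not the study's inputs:

```python
# Invented illustrative scores; the study derived its prevalence
# scores from a food-chain model for Campylobacter spp.,
# Enterococcus spp. and Escherichia coli.
categories = {
    "chicken, fresh/frozen": {"prevalence": 0.8, "impact": 0.9, "consumption": 0.7},
    "pork, fresh/dried-raw": {"prevalence": 0.5, "impact": 0.7, "consumption": 0.9},
    "beef": {"prevalence": 0.2, "impact": 0.6, "consumption": 0.8},
    "veal": {"prevalence": 0.2, "impact": 0.6, "consumption": 0.1},
}

# Relative risk contribution = prevalence x impact x consumption,
# normalised so the shares across categories sum to 100%.
raw = {name: s["prevalence"] * s["impact"] * s["consumption"]
       for name, s in categories.items()}
total = sum(raw.values())
share = {name: 100.0 * value / total for name, value in raw.items()}
```

With placeholder scores like these, a product consumed in volume with high prevalence (fresh chicken) dominates the total risk even when per-infection severity is comparable across categories, which mirrors the ranking the study reports.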


Background Few data exist on tuberculosis (TB) incidence according to time from HIV seroconversion in high-income countries, or on whether rates following initiation of combination antiretroviral treatment (cART) differ from those soon after seroconversion. Methods Data on individuals with well estimated dates of HIV seroconversion were used to analyse post-seroconversion TB rates, with follow-up ending at the earliest of 1 January 1997, death or last clinic visit. TB rates were also estimated following cART initiation, with follow-up ending at the earliest of death or last clinic visit. Poisson models were used to examine the effect of current and past level of immunosuppression on TB risk after cART initiation. Results Of 19 815 individuals at risk during 1982–1996, TB incidence increased from 5.89/1000 person-years (PY) (95% CI 3.77 to 8.76) in the first year after seroconversion to 10.56 (4.83 to 20.04, p=0.01) at 10 years. Among 11 178 TB-free individuals initiating cART, the TB rate in the first year after cART initiation was 4.23/1000 PY (3.07 to 5.71) and dropped thereafter, remaining constant from year 2 onwards and averaging 1.64/1000 PY (1.29 to 2.05). Current CD4 count was inversely associated with TB rates, while nadir CD4 count was not associated with TB rates after adjustment for current CD4 count and HIV-RNA at cART initiation. Conclusions TB risk increases with duration of HIV infection in the absence of cART. Following cART initiation, TB incidence rates were lower than levels immediately following seroconversion. Implementation of current recommendations to prevent TB in early HIV infection could be beneficial.
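The rates quoted above are Poisson incidence rates on a per-1000-person-year scale. A minimal helper reproducing that scale, with a standard log-scale approximation for the 95% CI (the paper's Poisson models are more elaborate), looks like this:

```python
import math

def incidence_rate(events: int, person_years: float):
    """Incidence rate per 1000 person-years with an approximate 95% CI
    computed on the log scale (SE of a log Poisson rate ~ 1/sqrt(events))."""
    rate = 1000.0 * events / person_years
    se_log = 1.0 / math.sqrt(events)
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)

# Hypothetical counts for one follow-up interval (not the study's data):
rate, lo, hi = incidence_rate(events=40, person_years=9500.0)
```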


OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern, Southern and Eastern Europe, and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was developed from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0 to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcome for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
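Under the proportional hazards assumption, the per-unit relative hazard compounds multiplicatively across the score range, which is broadly consistent with the reported drop in 1-year mortality. A quick check (the survival transform below is an illustrative approximation, not the paper's adjusted model):

```python
# Relative hazard of 0.73 per 1-unit increase in the HCI score (from
# the adjusted Cox model) compounds over a 4-unit score difference.
rh_per_unit = 0.73
rh_4_vs_0 = rh_per_unit ** 4  # fraction of the reference hazard

# Illustrative check: under proportional hazards, S_high = S_low ** RH.
# With 1-year mortality of 39% at score 0, this implies roughly 13%
# mortality at score >= 4, the same order as the observed 9%.
surv_low = 1.0 - 0.39
mortality_high = 1.0 - surv_low ** rh_4_vs_0
```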


Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9×10⁻⁴). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to that of hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
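A genetic risk score of this kind is typically the sum over SNPs of the risk-allele count weighted by the published per-allele log odds ratio. A sketch with simulated weights and genotypes (the study used the 23 GWAS-derived CAD SNPs and their reported effect sizes):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-ins: per-allele odds ratios for 23 CAD-associated
# SNPs and risk-allele counts (0/1/2) for a handful of individuals.
n_snps, n_people = 23, 5
weights = np.log(rng.uniform(1.05, 1.30, n_snps))  # log per-allele ORs
genotypes = rng.integers(0, 3, size=(n_people, n_snps))

# Weighted genetic risk score: sum of allele counts times log ORs.
scores = genotypes @ weights

# The study compared the top quartile of the score distribution
# ("unfavorable genetic background") against the rest.
top_quartile = scores >= np.quantile(scores, 0.75)
```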


BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with risk of bias items. Small studies generally give imprecisely estimated treatment effects, so study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators for the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
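Using study variance as a size surrogate amounts to regressing observed effects on their variances and checking for a trend. A toy version with simulated trials in which imprecise studies exaggerate the new treatment's apparent benefit (data and coefficients are invented; the paper's Bayesian network meta-regression is far richer):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated log odds ratios for 40 trials: a built-in small-study
# effect makes high-variance (small) trials report more benefit
# (more negative log OR) for the new intervention.
n_trials = 40
variance = rng.uniform(0.01, 0.5, n_trials)
log_or = -0.2 - 0.8 * variance + rng.normal(0.0, 0.05, n_trials)

# Meta-regression of effect size on study variance; a clearly
# negative slope flags exaggeration in imprecise studies (cf. the
# ratio of odds ratios of 1.83 reported across the 32 networks).
X = np.column_stack([np.ones(n_trials), variance])
intercept, slope = np.linalg.lstsq(X, log_or, rcond=None)[0]
```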


INFLUENCE OF ANCHORING ON MISCARRIAGE RISK PERCEPTION ASSOCIATED WITH AMNIOCENTESIS
Publication No. ___________
Regina Nuccio, BS
Supervisory Professor: Claire N. Singletary, MS, CGC

Amniocentesis is the most common invasive procedure performed during pregnancy (Eddleman et al., 2006). One important factor that women consider when making a decision about amniocentesis is the risk of miscarriage associated with the procedure. People use heuristics such as anchoring, the use of a prior belief about the magnitude of a risk as a frame of reference when synthesizing new information, to better understand the risks they encounter in their lives. This study aimed to determine women's perception of the miscarriage risk associated with amniocentesis before and after a genetic counseling session, and to determine which factors are most likely to anchor that perception. Most women perceived the risk as low or average pre-counseling and were likely to indicate the numeric risk of amniocentesis as <1%. A higher percentage of patients correctly identified the numeric risk as <1% post-counseling than pre-counseling. However, for the majority of patients (60%), the feeling about the risk did not change after the genetic counseling session, regardless of how they perceived the risk before discussing amniocentesis with a genetic counselor. Those whose risk perception did change showed a decreased risk perception (p < 0.0001). Of the multitude of factors studied, only two showed significance: having a friend or relative with a personal or family history of a genetic disorder was associated with a lower risk perception (p = 0.001), and already having a child was associated with a lower risk perception (p = 0.038).
The lack of significant factors may reflect the uniqueness of each patient’s heuristic framework and reinforces the importance of genetic counseling to elucidate individual concerns.


Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers, but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses.
Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. Current estimates of the magnitude of effect of obesity and physical inactivity on risk of coronary mortality may seriously underestimate the true effect due to inadequate handling of cigarette smoking.
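The crude death rates implied by the follow-up figures can be checked directly; this is simple arithmetic on the abstract's numbers, not a re-analysis:

```python
# 705 total deaths (319 from CHD) over 40,136 person-years of follow-up.
person_years = 40_136
total_deaths, chd_deaths = 705, 319

total_rate = 1000.0 * total_deaths / person_years  # deaths per 1000 PY
chd_rate = 1000.0 * chd_deaths / person_years      # CHD deaths per 1000 PY
```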


BACKGROUND The objective of the present investigation was to assess the baseline mortality-adjusted 10-year survival of rectal cancer patients. METHODS Ten-year survival was analyzed in 771 consecutive American Joint Committee on Cancer (AJCC) stage I-IV rectal cancer patients undergoing open resection between 1991 and 2008, using risk-adjusted Cox proportional hazard regression models adjusting for population-based baseline mortality. RESULTS The median follow-up of patients alive was 8.8 years. The 10-year relative, overall, and cancer-specific survival rates were 66.5% [95% confidence interval (CI) 61.3-72.1], 48.7% (95% CI 44.9-52.8), and 66.4% (95% CI 62.5-70.5), respectively. In the entire patient sample (stage I-IV), 47.3% of all deaths during the 10-year period were related to rectal cancer; in patients with stage I-III disease, the figure was 33.6%. For patients with AJCC stage I rectal cancer, the 10-year overall survival was 96% and did not differ significantly from that of an average population matched for gender, age, and calendar year (p = 0.151). For the more advanced tumor stages, however, survival was significantly impaired (p < 0.001). CONCLUSIONS Retrospective investigations of survival after rectal cancer resection should adjust for baseline mortality because a large fraction of deaths is not cancer related. Stage I rectal cancer patients, compared to patients with more advanced disease stages, have a relative survival close to 100% and can thus be considered cured. Using this relative-survival approach, the real public health burden caused by rectal cancer can reliably be analyzed and reported.
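Relative survival is the ratio of the cohort's observed survival to the expected survival of a demographically matched general population; the expected survival used below is back-solved from the abstract's figures purely for illustration:

```python
def relative_survival(observed: float, expected: float) -> float:
    """Relative survival: observed survival in the patient cohort
    divided by the expected survival of a gender-, age-, and
    calendar-year-matched general population."""
    return observed / expected

# 10-year overall survival of 48.7% against an assumed ~73.2% expected
# background survival reproduces roughly the reported 66.5%.
rs = relative_survival(0.487, 0.732)
```

This is why a large share of observed deaths can be non-cancer-related while relative survival stays high: the denominator absorbs the mortality the matched background population would have experienced anyway.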


BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, including 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I² < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared to those at higher risk of bias (p for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship.
CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.
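The second stage of a two-stage IPD meta-analysis pools the per-study adjusted log hazard ratios under a random-effects model. A compact DerSimonian-Laird sketch, with illustrative inputs rather than the study's estimates:

```python
import math

def dl_pool(log_effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study log hazard
    ratios; returns the pooled HR and its 95% CI on the ratio scale."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Three hypothetical studies reporting aHRs near 1.5 for DMPA use:
hr, lo, hi = dl_pool([math.log(1.5), math.log(1.4), math.log(1.6)],
                     [0.02, 0.03, 0.025])
```

When the estimated between-study variance tau² is near zero, as the mild heterogeneity (I² < 50%) reported above suggests, the random-effects result collapses toward the fixed-effect inverse-variance average.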