957 results for Poisson regression analysis
Abstract:
BACKGROUND AND PURPOSE An inverse relationship between onset-to-door time (ODT) and door-to-needle time (DNT) in stroke thrombolysis has been reported from various registries. We analyzed this relationship and other determinants of DNT in dedicated stroke centers. METHODS Prospectively collected data of consecutive ischemic stroke patients from 10 centers who received IV thrombolysis within 4.5 hours from symptom onset were merged (n=7106). DNT was analyzed as a function of demographic and prehospital variables using regression analyses, and change over time was considered. RESULTS In 6348 eligible patients with known treatment delays, median DNT was 42 minutes and decreased steeply every year (P<0.001). A median DNT of 55 minutes was observed in patients with ODT ≤30 minutes, whereas it declined for patients presenting within the last 30 minutes of the 3-hour time window (median, 33 minutes) and of the 4.5-hour time window (20 minutes). For ODT within the first 30 minutes of the extended time window (181-210 minutes), DNT increased to 42 minutes. DNT was stable for ODT of 30 to 150 minutes (40-45 minutes). We found a weak inverse overall correlation between ODT and DNT (R(2)=-0.12; P<0.001), but it was strong in patients treated between 3 and 4.5 hours (R(2)=-0.75; P<0.001). ODT was independently inversely associated with DNT (P<0.001) in regression analysis. Octogenarians and women tended to have longer DNT. CONCLUSIONS DNT has decreased steeply over recent years in dedicated stroke centers; however, significant oscillations of in-hospital treatment delays occurred at both ends of the time window. This suggests that further improvements can be achieved, particularly in the elderly.
Abstract:
Aims: The aim of this study was to identify predictors of adverse events among patients with ST-elevation myocardial infarction (STEMI) undergoing contemporary primary percutaneous coronary intervention (PCI). Methods and results: Individual data of 2,655 patients from two primary PCI trials (EXAMINATION, N=1,504; COMFORTABLE AMI, N=1,161) with identical endpoint definitions and event adjudication were pooled. Predictors of all-cause death or any reinfarction and definite stent thrombosis (ST) and target lesion revascularisation (TLR) outcomes at one year were identified by multivariable Cox regression analysis. Killip class III or IV was the strongest predictor of all-cause death or any reinfarction (OR 5.11, 95% CI: 2.48-10.52), definite ST (OR 7.74, 95% CI: 2.87-20.93), and TLR (OR 2.88, 95% CI: 1.17-7.06). Impaired left ventricular ejection fraction (OR 4.77, 95% CI: 2.10-10.82), final TIMI flow 0-2 (OR 1.93, 95% CI: 1.05-3.54), arterial hypertension (OR 1.69, 95% CI: 1.11-2.59), age (OR 1.68, 95% CI: 1.41-2.01), and peak CK (OR 1.25, 95% CI: 1.02-1.54) were independent predictors of all-cause death or any reinfarction. Allocation to treatment with DES was an independent predictor of a lower risk of definite ST (OR 0.35, 95% CI: 0.16-0.74) and any TLR (OR 0.34, 95% CI: 0.21-0.54). Conclusions: Killip class remains the strongest predictor of all-cause death or any reinfarction among STEMI patients undergoing primary PCI. DES use independently predicts a lower risk of TLR and definite ST compared with BMS. The COMFORTABLE AMI trial is registered at: http://www.clinicaltrials.gov/ct2/show/NCT00962416. The EXAMINATION trial is registered at: http://www.clinicaltrials.gov/ct2/show/NCT00828087.
Abstract:
OBJECTIVE To investigate the long-term prognostic implications of coronary calcification in patients undergoing percutaneous coronary intervention for obstructive coronary artery disease. METHODS Patient-level data from 6296 patients enrolled in seven clinical drug-eluting stent trials were analysed by an independent academic research organisation (Cardialysis, Rotterdam, The Netherlands) to identify the presence of severe coronary calcification on angiographic images. Clinical outcomes at 3-year follow-up, including all-cause mortality, death-myocardial infarction (MI), and the composite end-point of all-cause death-MI-any revascularisation, were compared between patients with and without severe calcification. RESULTS Severe calcification was detected in 20% of the studied population. Patients with severe lesion calcification were less likely to have undergone complete revascularisation (48% vs 55.6%, p<0.001) and had increased mortality compared with those without severely calcified arteries (10.8% vs 4.4%, p<0.001). The event rate was also higher in patients with severely calcified lesions for the combined end-points death-MI (22.9% vs 10.9%; p<0.001) and death-MI-any revascularisation (31.8% vs 22.4%; p<0.001). On multivariate Cox regression analysis, including the Syntax score, the presence of severe coronary calcification was an independent predictor of poor prognosis (HR 1.33, 95% CI 1.00 to 1.77, p=0.047 for death; 1.23, 95% CI 1.02 to 1.49, p=0.031 for death-MI; and 1.18, 95% CI 1.01 to 1.39, p=0.042 for death-MI-any revascularisation), but it was not associated with an increased risk of stent thrombosis. CONCLUSIONS Patients with severely calcified lesions have worse clinical outcomes than those without severe coronary calcification. Severe coronary calcification appears to be an independent predictor of worse prognosis, and should be considered a marker of advanced atherosclerosis.
Abstract:
BACKGROUND Heat periods during recent years were associated with excess hospitalization and mortality rates, especially in the elderly. We intended to study whether prolonged warmth/heat periods are associated with an increased prevalence of disorders of serum sodium and potassium and increased hospital mortality. METHODS In this cross-sectional analysis, all patients admitted to the Department of Emergency Medicine of a large tertiary care facility between January 2009 and December 2010 with measurements of serum sodium were included. Demographic data along with detailed data on diuretic medication, length of hospital stay, and hospital mortality were obtained for all patients. Data on daily temperatures (maximum, mean, minimum) and humidity were retrieved from Meteo Swiss. RESULTS A total of 22,239 patients were included in the study. Five periods with temperatures exceeding 25 °C for 3 to 5 days and 2 periods with temperatures exceeding 25 °C for more than 5 days were noted. Additionally, 2 periods of 3 to 5 days with daily temperatures exceeding 30 °C were noted during the study period. We found a significantly increased prevalence of hyponatremia during heat periods. However, in the Cox regression analysis, prolonged heat was not associated with the prevalence of disorders of serum sodium or potassium. Admission during a heat period was an independent predictor of hospital mortality. CONCLUSIONS Although we found an increased prevalence of hyponatremia during heat periods, no convincing connection could be found for hypernatremia or disorders of serum potassium.
Abstract:
Children living near highways are exposed to higher concentrations of traffic-related carcinogenic pollutants. Several studies reported an increased risk of childhood cancer associated with traffic exposure, but the published evidence is inconclusive. We investigated whether cancer risk is associated with proximity of residence to highways in a nation-wide cohort study including all children aged <16 years from Swiss national censuses in 1990 and 2000. Cancer incidence was investigated in time to event analyses (1990-2008) using Cox proportional hazards models and incidence density analyses (1985-2008) using Poisson regression. Adjustments were made for socio-economic factors, ionising background radiation and electromagnetic fields. In time to event analysis based on 532 cases the adjusted hazard ratio for leukaemia comparing children living <100 m from a highway with unexposed children (≥500 m) was 1.43 (95% CI 0.79, 2.61). Results were similar in incidence density analysis including 1367 leukaemia cases (incidence rate ratio (IRR) 1.57; 95% CI 1.09, 2.25). Associations were similar for acute lymphoblastic leukaemia (IRR 1.64; 95% CI 1.10, 2.43) and stronger for leukaemia in children aged <5 years (IRR 1.92; 95% CI 1.22, 3.04). Little evidence of association was found for other tumours. Our study suggests that young children living close to highways are at increased risk of developing leukaemia.
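For a single binary exposure, the incidence density (Poisson regression) analysis described above reduces to a simple incidence rate ratio with a Wald confidence interval. A minimal pure-Python sketch follows; the case counts and person-years are hypothetical illustrations, not the study's data:

```python
from math import log, exp, sqrt

def irr_wald_ci(cases_exp, py_exp, cases_unexp, py_unexp, z=1.96):
    """Incidence rate ratio (IRR) with a Wald 95% CI.

    Equivalent to a Poisson regression of case counts on one binary
    exposure indicator with log person-years as offset:
    IRR = exp(beta), SE(log IRR) = sqrt(1/cases_exp + 1/cases_unexp).
    """
    irr = (cases_exp / py_exp) / (cases_unexp / py_unexp)
    se = sqrt(1.0 / cases_exp + 1.0 / cases_unexp)
    lo = exp(log(irr) - z * se)
    hi = exp(log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 30 cases over 80,000 person-years in the exposed
# group versus 240 cases over 1,000,000 person-years in the reference group.
irr, lo, hi = irr_wald_ci(30, 80_000, 240, 1_000_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

With real data, covariate adjustment (socio-economic factors, background radiation) would require a full Poisson GLM rather than this two-group shortcut.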
Abstract:
Background Protein-energy malnutrition (PEM) is common in people with end-stage kidney disease (ESKD) undergoing maintenance haemodialysis (MHD) and correlates strongly with mortality. To this day, there is no gold standard for detecting PEM in patients on MHD. Aim of Study The aim of this study was to evaluate whether Nutritional Risk Screening 2002 (NRS-2002), handgrip strength measurement, mid-upper arm muscle area (MUAMA), triceps skin fold measurement (TSF), serum albumin, normalised protein catabolic rate (nPCR), Kt/V and eKt/V, dry body weight, body mass index (BMI), age, and time since start of MHD are relevant for assessing PEM in patients on MHD. Methods The predictive value of the selected parameters for mortality, and for mortality or weight loss of more than 5%, was assessed. Quantitative data analysis of the 12 parameters in the same patients on MHD in autumn 2009 (n = 64) and spring 2011 (n = 40), with paired statistical analysis and multivariate logistic regression analysis, was performed. Results Paired data analysis showed significant reductions in dry body weight, BMI, and nPCR. Kt/Vtot did not change; eKt/V and handgrip strength measurements were significantly higher in spring 2011. No changes were detected in TSF, serum albumin, NRS-2002, and MUAMA. Serum albumin was shown to be the only predictor of death and of the combined endpoint “death or weight loss of more than 5%”. Conclusion We now screen patients biannually for serum albumin, nPCR, Kt/V, handgrip measurement of the shunt-free arm, dry body weight, age, and time since initiation of MHD.
Abstract:
OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the time period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
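Stratified propensity score methods like those above are often summarised with a Mantel-Haenszel-type pooled estimator across strata. A minimal sketch using a pooled rate ratio over hypothetical propensity-score strata (the study itself reports Cox model hazard ratios, not this simplified estimator):

```python
def mh_rate_ratio(strata):
    """Mantel-Haenszel pooled rate ratio across strata.

    strata: list of (d1, t1, d0, t0) tuples -- deaths and person-time
    in the resected (1) and non-resected (0) groups per stratum.
    """
    num = sum(d1 * t0 / (t1 + t0) for d1, t1, d0, t0 in strata)
    den = sum(d0 * t1 / (t1 + t0) for d1, t1, d0, t0 in strata)
    return num / den

# Two hypothetical propensity-score strata (all numbers illustrative)
strata = [
    (40, 1000.0, 90, 800.0),   # stratum 1: deaths/person-years by group
    (25, 600.0, 70, 500.0),    # stratum 2
]
print(round(mh_rate_ratio(strata), 2))
```

Pooling within strata of the propensity score is what controls confounding by treatment selection; comparing crude rates across the whole cohort would not.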
Abstract:
PURPOSE The aim of this study was to analyze the patient pool referred to a specialty clinic for implant surgery over a 3-year period. MATERIALS AND METHODS All patients receiving dental implants between 2008 and 2010 at the Department of Oral Surgery and Stomatology were included in the study. As primary outcome parameters, the patients were analyzed according to the following criteria: age, sex, systemic diseases, and indication for therapy. For the inserted implants, the type of surgical procedure, the types of implants placed, postsurgical complications, and early failures were recorded. A logistic regression analysis was performed to identify possible local and systemic risk factors for complications. As a secondary outcome, data regarding demographics and surgical procedures were compared with the findings of a historic study group (2002 to 2004). RESULTS A total of 1,568 patients (792 women and 776 men; mean age, 52.6 years) received 2,279 implants. The most frequent indication was a single-tooth gap (52.8%). Augmentative procedures were performed in 60% of the cases. Tissue-level implants (72.1%) were more frequently used than bone-level implants (27.9%). Regarding dimensions of the implants, a diameter of 4.1 mm (59.7%) and a length of 10 mm (55.0%) were most often utilized. An early failure rate of 0.6% was recorded (13 implants). Patients were older and received more implants in the maxilla, and the complexity of surgical interventions had increased when compared to the patient pool of 2002 to 2004. CONCLUSION Implant therapy performed in a surgical specialty clinic utilizing strict patient selection and evidence-based surgical protocols showed a very low early failure rate of 0.6%.
Abstract:
BACKGROUND Non-steroidal anti-inflammatory drugs (NSAIDs) are the backbone of osteoarthritis pain management. We aimed to assess the effectiveness of different preparations and doses of NSAIDs on osteoarthritis pain in a network meta-analysis. METHODS For this network meta-analysis, we considered randomised trials comparing any of the following interventions: NSAIDs, paracetamol, or placebo, for the treatment of osteoarthritis pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the reference lists of relevant articles for trials published between Jan 1, 1980, and Feb 24, 2015, with at least 100 patients per group. The prespecified primary and secondary outcomes were pain and physical function, and were extracted in duplicate for up to seven timepoints after the start of treatment. We used an extension of multivariable Bayesian random effects models for mixed multiple treatment comparisons with a random effect at the level of trials. For the primary analysis, a random walk of first order was used to account for multiple follow-up outcome data within a trial. Preparations that used different total daily doses were considered separately in the analysis. To assess a potential dose-response relation, we used preparation-specific covariates, assuming linearity on log relative dose. FINDINGS We identified 8973 manuscripts from our search, of which 74 randomised trials with a total of 58 556 patients were included in this analysis. 23 nodes concerning seven different NSAIDs or paracetamol with a specific daily dose of administration, or placebo, were considered. All preparations, irrespective of dose, improved point estimates of pain symptoms when compared with placebo.
For six interventions (diclofenac 150 mg/day, etoricoxib 30 mg/day, 60 mg/day, and 90 mg/day, and rofecoxib 25 mg/day and 50 mg/day), the probability that the difference from placebo is at or below a prespecified minimum clinically important effect for pain reduction (effect size [ES] -0·37) was at least 95%. Among maximally approved daily doses, diclofenac 150 mg/day (ES -0·57, 95% credibility interval [CrI] -0·69 to -0·46) and etoricoxib 60 mg/day (ES -0·58, -0·73 to -0·43) had the highest probability of being the best intervention, both with 100% probability of reaching the minimum clinically important difference. Treatment effects increased as drug dose increased, but corresponding tests for a linear dose effect were significant only for celecoxib (p=0·030), diclofenac (p=0·031), and naproxen (p=0·026). We found no evidence that treatment effects varied over the duration of treatment. Model fit was good, and between-trial heterogeneity and inconsistency were low in all analyses. All trials were deemed to have a low risk of bias for blinding of patients. Effect estimates did not change in sensitivity analyses with two additional statistical models and accounting for methodological quality criteria in meta-regression analysis. INTERPRETATION On the basis of the available data, we see no role for single-agent paracetamol for the treatment of patients with osteoarthritis, irrespective of dose. We provide sound evidence that diclofenac 150 mg/day is the most effective NSAID available at present, in terms of improving both pain and function. Nevertheless, in view of the safety profile of these drugs, physicians need to consider our results together with all known safety information when selecting the preparation and dose for individual patients. FUNDING Swiss National Science Foundation (grant number 405340-104762) and Arco Foundation, Switzerland.
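The probability of reaching the minimum clinically important difference can be roughly reproduced from the published point estimate and credibility interval, under the assumption (mine, not the authors') that the posterior is approximately normal and the CrI symmetric. A sketch using the diclofenac 150 mg/day figures quoted in the abstract:

```python
from math import erf, sqrt

def prob_at_or_below(threshold, mean, lower, upper, z=1.96):
    """P(effect <= threshold) for a normal approximation of the
    posterior, with sd recovered from the width of a 95% interval."""
    sd = (upper - lower) / (2 * z)
    return 0.5 * (1 + erf((threshold - mean) / (sd * sqrt(2))))

# Diclofenac 150 mg/day: ES -0.57 (95% CrI -0.69 to -0.46); the minimum
# clinically important effect for pain reduction is an ES of -0.37.
p = prob_at_or_below(-0.37, -0.57, -0.69, -0.46)
print(f"{p:.4f}")
```

The result is essentially 1, consistent with the abstract's statement of a 100% probability of reaching the minimum clinically important difference for this dose.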
Abstract:
Purpose. To examine the association between living in proximity to Toxics Release Inventory (TRI) facilities and the incidence of childhood cancer in the State of Texas. Design. This is a secondary data analysis utilizing the publicly available Toxics Release Inventory (TRI), maintained by the U.S. Environmental Protection Agency, which lists the facilities that release any of the 650 TRI chemicals. Total childhood cancer cases and childhood cancer rates (ages 0-14 years) by county for the years 1995-2003 were taken from the Texas Cancer Registry, available at the Texas Department of State Health Services website. Setting: This study was limited to the child population of the State of Texas. Method. Analysis was done using Stata version 9 and SPSS version 15.0. SaTScan was used for geographical spatial clustering of childhood cancer cases based on county centroids, using the Poisson clustering algorithm, which adjusts for population density. Pictorial maps were created using MapInfo Professional version 8.0. Results. One hundred and twenty-five counties had no TRI facilities in their region, while 129 counties had at least one TRI facility. An increasing trend in the number of facilities and total disposal was observed except for the highest category based on cancer rate quartiles. A linear regression analysis using log transformations of the number of facilities and total disposal to predict cancer rates was computed; however, neither variable was found to be a significant predictor. Seven significant geographical spatial clusters of counties with high childhood cancer rates (p<0.05) were indicated. Binomial logistic regression categorizing the cancer rate into two groups (≤150 and >150) indicated an odds ratio of 1.58 (CI 1.127, 2.222) for the natural log of the number of facilities. Conclusion.
We used a unique methodology combining GIS and spatial clustering techniques with existing statistical approaches to examine the association between living in proximity to TRI facilities and the incidence of childhood cancer in the State of Texas. Although a concrete association was not indicated, further studies examining specific TRI chemicals are required. This information can enable researchers and the public to identify potential concerns, gain a better understanding of potential risks, and work with industry and government to reduce toxic chemical use, disposal, or other releases and the risks associated with them. TRI data, in conjunction with other information, can be used as a starting point in evaluating exposures and risks.
Abstract:
Objective. The purpose of this study is to provide a holistic depiction of behavioral and environmental factors contributing to risky sexual behaviors among predominantly high school educated, low-income African Americans residing in urban areas of Houston, TX, utilizing the Theory of Gender and Power, Situational/Environmental Variables Theory, and Sexual Script Theory. Methods. A cross-sectional study was conducted via questionnaires among 215 Houston-area residents, of whom 149 were women and 66 were men. Measures used to assess behaviors of the population included a history of homelessness; use of crack/cocaine among several other illicit drugs; the type of sexual partner; age of participant; age of most recent sex partner; whether participants sought health care in the last 12 months; knowledge of partners' other sexual activities; symptoms of depression; and places where partners were met. To determine the risk of sexual encounters, a risk index employing the variables used to assess condom use was created, categorizing sexual encounters as unsafe or safe. Results. Variables meeting the significance level of p<.15 in the bivariate analysis for each theory were entered into a binary logistic regression analysis. The block for each theory was significant, suggesting that the grouping assignments of each variable by theory were significantly associated with unsafe sexual behaviors. Within the regression analysis, variables such as sex for drugs/money, low income, and crack use demonstrated an absolute effect size of at least 1, indicating that these variables had a significant effect on unsafe sexual behavioral practices. Conclusions. Variables assessing behavior and environment demonstrated a significant effect when categorized by relation to the designated theories.
Abstract:
Southeast Texas, including Houston, has a large presence of industrial facilities and has been documented to have poorer air quality and significantly higher cancer rates than the remainder of Texas. Given citizens’ concerns in this 4th largest city in the U.S., Mayor Bill White recently partnered with the UT School of Public Health to determine methods to evaluate the health risks of hazardous air pollutants (HAPs). Sexton et al. (2007) published a report that strongly encouraged analytic studies linking these pollutants with health outcomes. In response, we set out to complete the following aims: 1. determine the optimal exposure assessment strategy to assess the association between childhood cancer rates and increased ambient levels of benzene and 1,3-butadiene (in an ecologic setting) and 2. evaluate whether census tracts with the highest levels of benzene or 1,3-butadiene have higher incidence of childhood lymphohematopoietic cancer compared with census tracts with the lowest levels of benzene or 1,3-butadiene, using Poisson regression. The first aim was achieved by evaluating the usefulness of four data sources: geographic information systems (GIS) to identify proximity to point sources of industrial air pollution, industrial emission data from the U.S. EPA’s Toxic Release Inventory (TRI), routine monitoring data from the U.S. EPA Air Quality System (AQS) from 1999-2000 and modeled ambient air levels from the U.S. EPA’s 1999 National Air Toxic Assessment Project (NATA) ASPEN model. Further, once these four data sources were evaluated, we narrowed them down to two: the routine monitoring data from the AQS for the years 1998-2000 and the 1999 U.S. EPA NATA ASPEN modeled data. We applied kriging (spatial interpolation) methodology to the monitoring data and compared the kriged values to the ASPEN modeled data. Our results indicated poor agreement between the two methods. Relative to the U.S. 
EPA ASPEN modeled estimates, relying on kriging to classify census tracts into exposure groups would have caused a great deal of misclassification. To address the second aim, we additionally obtained childhood lymphohematopoietic cancer data for 1995-2004 from the Texas Cancer Registry. The U.S. EPA ASPEN modeled data were used to estimate ambient levels of benzene and 1,3-butadiene in separate Poisson regression analyses. All data were analyzed at the census tract level. We found that census tracts with the highest benzene levels had elevated rates of all leukemia (rate ratio (RR) = 1.37; 95% confidence interval (CI), 1.05-1.78). Among census tracts with the highest 1,3-butadiene levels, we observed an RR of 1.40 (95% CI, 1.07-1.81) for all leukemia. We detected no associations between benzene or 1,3-butadiene levels and childhood lymphoma incidence. This study is the first to examine this association in Harris and surrounding counties in Texas and is among the first to correlate monitored levels of HAPs with childhood lymphohematopoietic cancer incidence, evaluating several analytic methods in an effort to determine the most appropriate approach to test this association. Despite the recognized weaknesses of ecologic analyses, our analysis suggests an association between childhood leukemia and hazardous air pollution.
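Agreement between two exposure classifications of the same units (for example, census tracts assigned to exposure quartiles by kriged monitoring data versus ASPEN-modelled data) is commonly quantified with Cohen's kappa. A minimal sketch with hypothetical quartile assignments (the study reports "poor agreement" but does not state which agreement statistic was used):

```python
def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal frequencies."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in cats
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical exposure-quartile assignments for ten census tracts
kriged = [1, 1, 2, 2, 3, 3, 4, 4, 1, 2]
aspen  = [1, 2, 2, 3, 3, 4, 4, 4, 2, 2]
print(round(cohen_kappa(kriged, aspen), 2))
```

Kappa near 0 means chance-level agreement; values below roughly 0.6 are usually read as the kind of disagreement that produces substantial exposure misclassification.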
Abstract:
We conducted a nested case-control study to determine the significant risk factors for developing encephalitis from West Nile virus (WNV) infection. The purpose of this research project was to expand the previously published Houston study of 2002–2004 patients to include data on Houston patients from four additional years (2005–2008), to determine whether this larger sample size revealed any differences in the risk factors associated with the more severe outcomes of WNV infection, encephalitis and death. A re-analysis of the risk factors for encephalitis and death was conducted on all patients from 2002–2008 and was the focus of this research. Retrospective medical chart reviews were completed for the 265 confirmed WNV hospitalized patients; 153 patients had encephalitis (WNE), and 112 had either viral syndrome with fever (WNF) or meningitis (WNM); a total of 22 patients died. Univariate logistic regression analyses of demographic, comorbidity, and social risk factors were conducted in a similar manner as in the previous study to determine the risk factors for developing encephalitis from WNV. A multivariate model was developed using model-building strategies for the multivariate logistic regression analysis. The hypothesis of this study was that additional risk factors would be shown to be significant with the increased sample size. This analysis, with a greater sample size and increased power, supports the hypothesis: additional risk factors were statistically associated with the more severe outcomes of WNV infection (WNE or death).
Based on univariate logistic regression results, these data showed that although age 20–44 years was statistically significant as a protective factor against developing WNE in the original study, it lost significance in the expanded sample. This study showed chronic alcohol abuse to be a significant WNE risk factor, although it was not significant in the original analysis. Other WNE risk factors that were significant in this analysis but not in the original analysis were cancer not in remission for more than 5 years, history of stroke, and chronic renal disease. When comparing the two analyses with death as an outcome, two risk factors that were significant in the original analysis but not in the expanded dataset analysis were diabetes mellitus and immunosuppression. Three risk factors significant in this expanded analysis but not in the original study were illicit drug use, heroin or opiate use, and injection drug use. However, in the multiple logistic regression models, the same independent risk factors for developing encephalitis (age and history of hypertension, including drug-induced hypertension) were consistent in both studies.
Abstract:
The ascertainment and analysis of adverse reactions to investigational agents present a significant challenge because of the infrequency of these events, their subjective nature, and the low priority of safety evaluations in many clinical trials. A one-year review of antibiotic trials published in medical journals demonstrates the lack of standards for identifying and reporting these potentially fatal conditions. This review also illustrates the low probability of observing and detecting rare events in typical clinical trials, which include fewer than 300 subjects. Uniform standards for ascertainment and reporting are suggested, including operational definitions of study subjects. Meta-analysis of selected antibiotic trials using multivariate regression analysis indicates that meaningful conclusions may be drawn from data from multiple studies that are pooled in a scientifically rigorous manner.
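Pooling effect estimates across trials "in a scientifically rigorous manner" is classically done by inverse-variance weighting. A minimal fixed-effect sketch; the per-trial estimates (e.g. log risk ratios) and variances below are hypothetical, and the abstract's own analysis used multivariate regression rather than this simple pooling:

```python
from math import sqrt

def fixed_effect_pool(estimates, variances):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI.

    Each trial is weighted by the reciprocal of its variance, so
    larger, more precise trials dominate the pooled estimate.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical log risk ratios and variances from three antibiotic trials
est, lo, hi = fixed_effect_pool([0.10, 0.25, 0.18], [0.04, 0.09, 0.02])
print(round(est, 3), round(lo, 3), round(hi, 3))
```

A random-effects variant would add a between-trial variance component to each weight, which matters when the trials are as heterogeneous as the review describes.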
Abstract:
Purpose. This project was designed to describe the association between wasting and CD4 cell counts in HIV-infected men in order to better understand the role of wasting in the progression of HIV infection. Methods. Baseline and prevalence data were collected from a cross-sectional survey of 278 HIV-infected men seen at the Houston Veterans Affairs Medical Center Special Medicine Clinic from June 1, 1991 to January 1, 1994. A follow-up study was conducted among those at risk to investigate the incidence of wasting and the association between wasting and low CD4 cell counts. Wasting was described by four methods: Z-scores for age-, sex-, and height-adjusted weight; sex- and age-adjusted mid-arm muscle circumference (MAMC); fat-free mass (FFM); and a ratio of extra-cellular mass (ECM) to body-cell mass (BCM) >1.20. FFM, ECM, and BCM were estimated from bioelectrical impedance analysis. MAMC was calculated from triceps skinfold and mid-arm circumference. The relationship between wasting and covariates was examined with logistic regression in the cross-sectional study and with Poisson regression in the follow-up study. The association between death and wasting was examined with Cox regression. Results. The prevalence of wasting ranged from 5% (weight and ECM:BCM) to almost 14% (MAMC and FFM) among the 278 men examined. The odds of wasting associated with a baseline CD4 cell count <200 were significant for each method but weight, and ranged from 4.6 to 12.7. Use of antiviral therapy was significantly protective by MAMC, FFM, and ECM:BCM (OR ≈ 0.2), whereas the need for antibacterial therapy was a risk (OR 3.1, 95% CI 1.1-8.7). The average incidence of wasting ranged from 4 to 16 per 100 person-years among the approximately 145 men followed for 160 person-years. A low CD4 cell count seemed to increase the risk of wasting, but statistical significance was not reached.
The effect of the small sample size on the power to detect a significant association should be considered. Wasting by MAMC and FFM was significantly associated with death after adjusting for baseline serum albumin concentration and CD4 cell count. Conclusions. Wasting by MAMC and FFM was strongly associated with baseline CD4 cell counts in both the prevalence and incidence studies, and both methods were strong predictors of death. Of the two, MAMC is convenient, has available reference population data, and may be the most appropriate for assessing the nutritional status of HIV-infected men.