995 results for multivariate analysis
Abstract:
Soil physical quality is an important factor for the sustainability of agricultural systems. Thus, the aim of this study was to evaluate soil physical properties and soil organic carbon in a Typic Acrudox under an integrated crop-livestock-forest system. The experiment was carried out in Mato Grosso do Sul, Brazil. Treatments consisted of seven systems: integrated crop-livestock-forest with 357 trees ha-1 and pasture height of 30 cm (CLF357-30); integrated crop-livestock-forest with 357 trees ha-1 and pasture height of 45 cm (CLF357-45); integrated crop-livestock-forest with 227 trees ha-1 and pasture height of 30 cm (CLF227-30); integrated crop-livestock-forest with 227 trees ha-1 and pasture height of 45 cm (CLF227-45); integrated crop-livestock with pasture height of 30 cm (CL30); integrated crop-livestock with pasture height of 45 cm (CL45); and native vegetation (NV). Soil properties were evaluated at the 0-10 and 10-20 cm depths. All grazing treatments increased bulk density (ρb) and penetration resistance (PR), and decreased total porosity and macroporosity, compared to NV. The values of ρb (1.18-1.47 Mg m-3), macroporosity (0.14-0.17 m³ m-3) and PR (0.62-0.81 MPa) at the 0-10 cm depth were not restrictive to plant growth. The change in land use from NV to CL or CLF decreased soil organic carbon (SOC) and the soil organic carbon pool (SOCpool). All grazing treatments had a similar SOCpool at the 0-10 cm depth, lower than that of NV (17.58 Mg ha-1).
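The soil organic carbon pool (SOCpool) figures above follow from SOC concentration, bulk density, and layer thickness via a standard unit conversion; a minimal sketch (the input values below are hypothetical, not the study's data):

```python
def soc_pool_mg_ha(soc_g_kg, bulk_density_mg_m3, depth_m):
    """SOC pool (Mg ha^-1) for one soil layer.

    soc_g_kg: SOC concentration (g C per kg soil)
    bulk_density_mg_m3: bulk density (Mg m^-3)
    depth_m: layer thickness (m)
    """
    # soil mass per ha = rho_b * depth * 10,000 m^2 ha^-1; the g/kg -> Mg/Mg
    # conversion contributes 1e-3, so the combined factor is 10
    return soc_g_kg * bulk_density_mg_m3 * depth_m * 10.0

# e.g., 12 g kg^-1 SOC at rho_b = 1.4 Mg m^-3 over the 0-10 cm layer
pool = soc_pool_mg_ha(12.0, 1.4, 0.10)  # 16.8 Mg C ha^-1
```

Comparing such per-layer pools across treatments is how the NV-versus-grazing differences above are expressed.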
Abstract:
Glutathione (GSH) dysregulation at the gene, protein, and functional levels has been observed in schizophrenia patients. Together with disease-like anomalies in GSH deficit experimental models, it suggests that such redox dysregulation can play a critical role in altering neural connectivity and synchronization, and thus possibly causing schizophrenia symptoms. To determine whether increased GSH levels would modulate EEG synchronization, N-acetyl-cysteine (NAC), a glutathione precursor, was administered to patients in a randomized, double-blind, crossover protocol for 60 days, followed by placebo for another 60 days (or vice versa). We analyzed whole-head topography of the multivariate phase synchronization (MPS) for 128-channel resting-state EEGs that were recorded at the onset, at the point of crossover, and at the end of the protocol. In this proof of concept study, the treatment with NAC significantly increased MPS compared to placebo over the left parieto-temporal, the right temporal, and the bilateral prefrontal regions. These changes were robust both at the group and at the individual level. Although MPS increase was observed in the absence of clinical improvement at a group level, it correlated with individual change estimated by Liddle's disorganization scale. Therefore, significant changes in EEG synchronization induced by NAC administration may precede clinically detectable improvement, highlighting its possible utility as a biomarker of treatment efficacy. TRIAL REGISTRATION: ClinicalTrials.gov NCT01506765.
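Multivariate phase synchronization extends pairwise phase locking across many channels; as an illustration of the underlying quantity only, here is the standard bivariate phase-locking value computed from Hilbert-transform phases (synthetic signals, not the study's 128-channel EEG pipeline):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    # instantaneous phases via the analytic signal
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    # magnitude of the mean phase-difference vector: 1 = perfect locking
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# two 10 Hz sines with a constant phase offset stay tightly locked
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
locked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                             np.sin(2 * np.pi * 10 * t + 0.5))
```

The multivariate statistic used in the study aggregates such phase relations over electrode clusters rather than single pairs.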
Abstract:
BACKGROUND: Pathological complete response (pCR) following chemotherapy is strongly associated with both breast cancer subtype and long-term survival. Within a phase III neoadjuvant chemotherapy trial, we sought to determine whether the prognostic implications of pCR, TP53 status and treatment arm (taxane versus non-taxane) differed between intrinsic subtypes. PATIENTS AND METHODS: Patients were randomized to receive either six cycles of anthracycline-based chemotherapy or three cycles of docetaxel followed by three cycles of epirubicin/docetaxel (T-ET). pCR was defined as no evidence of residual invasive cancer (or very few scattered tumour cells) in primary tumour and lymph nodes. We used a simplified intrinsic subtype classification, as suggested by the 2011 St Gallen consensus. Interactions between pCR, TP53 status, treatment arm and intrinsic subtype on event-free survival (EFS), distant metastasis-free survival (DMFS) and overall survival (OS) were studied using a landmark approach and two-step multivariate analyses. RESULTS: Sufficient data for pCR analyses were available in 1212 (65%) of 1856 patients randomized. pCR occurred in 222 of 1212 (18%) patients: 37 of 496 (7.5%) luminal A, 22 of 147 (15%) luminal B/HER2 negative, 51 of 230 (22%) luminal B/HER2 positive, 43 of 118 (36%) HER2 positive/non-luminal, and 69 of 221 (31%) triple negative (TN). The prognostic effect of pCR on EFS did not differ between subtypes; pCR was an independent predictor of better EFS [hazard ratio (HR) = 0.40, P < 0.001 in favour of pCR], DMFS (HR = 0.32, P < 0.001) and OS (HR = 0.32, P < 0.001). Chemotherapy arm was an independent predictor only for EFS (HR = 0.73, P = 0.004 in favour of T-ET). The interaction between TP53, intrinsic subtypes and survival outcomes only approached statistical significance for EFS (P = 0.1). CONCLUSIONS: pCR is an independent predictor of favourable clinical outcomes in all molecular subtypes in a two-step multivariate analysis.
CLINICALTRIALSGOV: EORTC 10994/BIG 1-00 Trial registration number NCT00017095.
Abstract:
Purpose: Primary bone lymphoma (PBL) accounts for less than 1% of all malignant lymphomas, and 4-5% of all extra-nodal lymphomas. In this study, the disease profile, outcome, and prognostic factors were assessed in patients with stage I and II PBL. Patients and Methods: Thirteen Rare Cancer Network (RCN) institutions enrolled 116 consecutive patients with PBL treated between 1987 and 2008. Inclusion criteria were age >16 years, stage I or II, minimum 6 months of follow-up, and biopsy-proven confirmation of non-Hodgkin's lymphoma (NHL). Eighty-seven patients underwent chemoradiotherapy (CXRT), 15 radiotherapy (RT) without (13) or with (2) surgery, and 14 chemotherapy (CXT) without (9) or with (5) surgery. Median RT dose was 40 Gy (range: 4-60). The median number of CXT cycles was 6 (range: 2-8). Median follow-up was 41 months (range: 6-242). Results: The overall response rate at the end of treatment was 91% (CR 74%, PR 17%). Local recurrence or progression was observed in 12 (10%) patients, and systemic recurrence in 17 (15%). Causes of death included disease progression in 21, unrelated causes in 5, CXT-related toxicity in 1, and second primary cancer in 2 patients. The 5-yr overall survival (OS), lymphoma-specific survival (LSS), and local control (LC) were 76%, 78%, and 92%, respectively. In univariate analyses (log-rank test), favorable prognostic factors for OS were age <50 years (P=0.008), international prognostic index (IPI) score ≤1 (P=0.009), high-grade histology (P=0.04), CXRT (P=0.05), CXT (P=0.0004), complete response (CR) (P<0.0001), number of CXT cycles (≥6) (P=0.01), and RT dose >40 Gy (P=0.005). All above-mentioned parameters were also significant for LSS except age and number of chemotherapy cycles. For LC, only CR and stage I were favorable factors. In multivariate analysis, IPI score, RT dose, complete response, and chemotherapy independently influenced the outcome (OS and LSS).
Complete response at the end of treatment was the only predictive factor for LC. Six patients developed grade 3 or higher toxicities according to Common Terminology Criteria for Adverse Events (CTCAE) v3.0. Conclusion: This large multicenter study confirms the relatively good prognosis of early-stage PBL treated with combined CXRT. Local control was excellent, while systemic failures were rare. An adequate RT dose (40 Gy or more) and a complete CXT regimen (≥6 cycles) were associated with better outcome.
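Survival figures such as the 5-yr OS/LSS/LC above are actuarial (Kaplan-Meier) estimates; for readers unfamiliar with the mechanics, a bare-bones estimator on toy data might look like this (not the study's code):

```python
import numpy as np

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; events: 1=event, 0=censored."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    curve, s = [], 1.0
    for u in np.unique(times[events == 1]):
        at_risk = int(np.sum(times >= u))        # subjects still under observation
        deaths = int(np.sum((times == u) & (events == 1)))
        s *= 1.0 - deaths / at_risk              # multiply conditional survival
        curve.append((float(u), s))
    return curve

# toy follow-up (months): events at 1, 2 and 4; one censoring at 3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
# -> [(1.0, 0.75), (2.0, 0.5), (4.0, 0.0)]
```

The log-rank tests quoted in the abstract then compare two such curves for equality.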
Abstract:
BACKGROUND: Little is known on the prevalence of multimorbidity (MM) in the general population. We aimed to assess the prevalence of MM using measured or self-reported data in the Swiss population. METHODS: Cross-sectional, population-based study conducted between 2003 and 2006 in the city of Lausanne, Switzerland, and including 3714 participants (1967 women) aged 35 to 75 years. Clinical evaluation was conducted by thoroughly trained nurses or medical assistants and the psychiatric evaluation by psychologists or psychiatrists. For psychiatric conditions, two definitions were used: either based on the participant's statements, or on psychiatric evaluation. MM was defined as presenting ≥2 morbidities out of a list of 27 (self-reported - definition A, or measured - definition B) or as the Functional Comorbidity Index (FCI) using measured data - definition C. RESULTS: The overall prevalence and (95% confidence interval) of MM was 34.8% (33.3%-36.4%), 56.3% (54.6%-57.9%) and 22.7% (21.4%-24.1%) for definitions A, B and C, respectively. Prevalence of MM was higher in women (40.2%, 61.7% and 27.1% for definitions A, B and C, respectively, vs. 28.7%, 50.1% and 17.9% in men, p < 0.001); Swiss nationals (37.1%, 58.8% and 24.8% for definitions A, B and C, respectively, vs. 31.4%, 52.3% and 19.7% in foreigners, all p < 0.001); elderly (>65 years: 67.0%, 70.0% and 36.7% for definitions A, B and C, respectively, vs. 23.6%, 50.2% and 13.8% for participants <45 years, p < 0.001); participants with lower educational level; former smokers and obese participants. Multivariate analysis confirmed most of these associations: odds ratio (95% Confidence interval) 0.55 (0.47-0.64), 0.61 (0.53-0.71) and 0.51 (0.42-0.61) for men relative to women for definitions A, B and C, respectively; 1.27 (1.09-1.49), 1.29 (1.11-1.49) and 1.41 (1.17-1.71) for Swiss nationals relative to foreigners, for definitions A, B and C, respectively. 
Conversely, no difference was found for educational level for definitions A and B and abdominally obese participants for all definitions. CONCLUSIONS: Prevalence of MM is high in the Lausanne population, and varies according to the definition or the data collection method.
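The prevalence intervals above are consistent with a normal-approximation (Wald) confidence interval for a proportion; a quick sketch, using a case count back-calculated from the reported 34.8% of n=3714 (so the count itself is an assumption):

```python
import math

def prevalence_ci(cases, n, z=1.96):
    # Wald 95% confidence interval for a proportion
    p = cases / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# ~34.8% of 3714 participants reproduces roughly the reported 33.3%-36.4%
p, lo, hi = prevalence_ci(1292, 3714)
```

For prevalences far from 0.5 or small n, a Wilson interval would be the safer choice, but at this sample size the two nearly coincide.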
Abstract:
OBJECTIVE: Evaluation of the quantitative antibiogram as an epidemiological tool for the prospective typing of methicillin-resistant Staphylococcus aureus (MRSA), and comparison with ribotyping. METHODS: The method is based on the multivariate analysis of inhibition zone diameters of antibiotics in disk diffusion tests. Five antibiotics were used (erythromycin, clindamycin, cotrimoxazole, gentamicin, and ciprofloxacin). Ribotyping was performed using seven restriction enzymes (EcoRV, HindIII, KpnI, PstI, EcoRI, SfuI, and BamHI). SETTING: 1,000-bed tertiary university medical center. RESULTS: During a 1-year period, 31 patients were found to be infected or colonized with MRSA. Cluster analysis of antibiogram data showed nine distinct antibiotypes. Four antibiotypes were isolated from multiple patients (2, 4, 7, and 13, respectively). Five additional antibiotypes were isolated from the remaining five patients. When analyzed with respect to the epidemiological data, the method was found to be equivalent to ribotyping. Among 206 staff members who were screened, six were carriers of MRSA. Both typing methods identified concordant MRSA types in staff members and in the patients under their care. CONCLUSIONS: The quantitative antibiogram was found to be equivalent to ribotyping as an epidemiological tool for typing of MRSA in our setting. Thus, this simple, rapid, and readily available method appears to be suitable for the prospective surveillance and control of MRSA for hospitals that do not have molecular typing facilities and in which MRSA isolates are not uniformly resistant or susceptible to the antibiotics tested.
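The quantitative antibiogram rests on clustering isolates by their inhibition-zone diameters; a sketch of that idea with hierarchical clustering on made-up diameters (mm) for six isolates against the five antibiotics listed (values and the 10 mm cut are illustrative assumptions only):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# rows = isolates; columns = ERY, CLI, SXT, GEN, CIP zone diameters (mm)
zones = np.array([
    [20, 22, 18, 25, 24],
    [21, 22, 19, 24, 24],
    [20, 21, 18, 25, 23],
    [6,  8,  5, 10,  9],
    [7,  8,  6, 10,  9],
    [6,  9,  5, 11,  9],
], dtype=float)

# average-linkage clustering on Euclidean distances between diameter profiles
tree = linkage(pdist(zones), method="average")
# cut the dendrogram: isolates merging below 10 mm share an antibiotype
antibiotypes = fcluster(tree, t=10.0, criterion="distance")
```

Here the first three isolates fall into one antibiotype and the last three into another, mimicking how distinct MRSA clusters emerge from disk-diffusion data.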
Abstract:
Despite numerous studies on the lower limit of soil and its contact with saprolite layers, much work remains to standardize the identification and annotation of these variables in the field. In shallow soils, appropriately noting these limits or contacts is essential for determining their behavior and potential use. The aims of this study were to identify and define, in the field, the contact and/or transition zone between soil and saprolite in profiles of an Alisol derived from fine sandstone and siltstone/claystone in subtropical southern Brazil, and to subsequently validate the field observations through multivariate analysis of laboratory analytical data. In the six Alisol profiles evaluated, the sequence of horizons found was A, Bt, C, and Cr, where C was considered part of the soil due to its pedogenetic structure, and Cr was considered saprolite due to its rock structure. The morphological properties that were determined in the field and that differed between the B and C horizons and the Cr layer were color, structure, texture, and fragments of saprolite. According to the test of means, the properties that support the inclusion of the C horizon as part of the soil are sand, clay, water-dispersible clay, silt/clay ratio, macroporosity, total porosity, resistance to penetration, cation exchange capacity, Fe extracted by DCB, Al, H+Al, and cation exchange capacity of clay. The properties that support the C horizon as a transition zone are silt, Ca, total organic C, and Fe extracted by ammonium oxalate. Discriminant analysis indicated differences among the three horizons evaluated.
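The discriminant analysis that separated the Bt/C horizons from the Cr layer can be illustrated, in spirit, by a nearest-centroid classifier over measured soil properties (the two features and all numbers below are synthetic, not the profile data):

```python
import numpy as np

def fit_centroids(X, y):
    # mean property vector per horizon label
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(centroids, sample):
    # assign the sample to the horizon whose centroid is nearest
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))

# hypothetical (clay %, silt/clay ratio) measurements for C and Cr samples
X = np.array([[10.0, 1.0], [11.0, 1.2], [30.0, 5.0], [29.0, 5.1]])
y = np.array(["C", "C", "Cr", "Cr"])
centroids = fit_centroids(X, y)
```

A full linear discriminant analysis additionally weights directions by within-class covariance, but the centroid picture captures why well-separated property means let field calls be validated numerically.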
Abstract:
BACKGROUND: To compare the prognostic relevance of Masaoka and Müller-Hermelink classifications. METHODS: We treated 71 patients with thymic tumors at our institution between 1980 and 1997. Complete follow-up was achieved in 69 patients (97%), with a mean follow-up time of 8.3 years (range, 9 months to 17 years). RESULTS: Masaoka stage I was found in 31 patients (44.9%), stage II in 17 (24.6%), stage III in 19 (27.6%), and stage IV in 2 (2.9%). The 10-year overall survival rate was 83.5% for stage I, 100% for stage IIa, 58% for stage IIb, 44% for stage III, and 0% for stage IV. The disease-free survival rates were 100%, 70%, 40%, 38%, and 0%, respectively. Histologic classification according to Müller-Hermelink found medullary tumors in 7 patients (10.1%), mixed in 18 (26.1%), organoid in 14 (20.3%), cortical in 11 (15.9%), well-differentiated thymic carcinoma in 14 (20.3%), and endocrine carcinoma in 5 (7.3%), with 10-year overall survival rates of 100%, 75%, 92%, 87.5%, 30%, and 0%, respectively, and 10-year disease-free survival rates of 100%, 100%, 77%, 75%, 37%, and 0%, respectively. Medullary, mixed, and well-differentiated organoid tumors were correlated with stage I and II, and well-differentiated thymic carcinoma and endocrine carcinoma with stage III and IV (p < 0.001). Multivariate analysis showed age, gender, myasthenia gravis, and postoperative adjuvant therapy not to be significant predictors of overall and disease-free survival after complete resection, whereas the Müller-Hermelink and Masaoka classifications were independent significant predictors for overall (p < 0.05) and disease-free survival (p < 0.004; p < 0.0001). CONCLUSIONS: The consideration of staging and histology in thymic tumors has the potential to improve recurrence prediction and patient selection for combined treatment modalities.
Abstract:
PURPOSE: Squamous cell carcinoma of the larynx with subglottic extension (sSCC) is a rare entity described as carrying a poor prognosis. The aim of this study was to analyze outcomes and the feasibility of larynx preservation in sSCC patients. PATIENTS AND METHODS: Between 1996 and 2012, 197 patients with sSCC were treated at our institution and included in the analysis. Stage III-IV tumors accounted for 76%. Patients received surgery (62%), radiotherapy (RT) (18%), or induction chemotherapy (CT) (20%) as front-line therapy. RESULTS: The 5-year actuarial overall survival (OS), locoregional control (LRC), and distant control rates were 59% (95% CI 51-68), 83% (95% CI 77-89), and 88% (95% CI 83-93), respectively, with a median follow-up of 54.4 months. There was no difference in OS or LRC according to front-line treatment, or between primary subglottic cancer and glotto-supraglottic cancers with subglottic extension. In the multivariate analysis, age >60 years and positive N stage were the only predictors of OS (HR 2.0, 95% CI 1.2-3.6; HR 1.9, 95% CI 1-3.5, respectively). A lower LRC was observed for T3 patients receiving a larynx-preservation protocol as compared with those receiving front-line surgery (HR 14.1, 95% CI 2.5-136.7; p = 0.02); however, no difference in ultimate LRC was observed according to the first therapy when including T3 patients who underwent salvage laryngectomy (p = 0.6). In patients receiving a larynx-preservation protocol, the 5-year larynx-preservation rate was 55% (95% CI 43-68), with 36% in T3 patients. The 5-year larynx-preservation rate was 81% (95% CI 65-96) and 35% (95% CI 20-51) for patients who received RT or induction CT as front-line treatment, respectively. CONCLUSION: Outcomes of sSCC are comparable with those of other laryngeal cancers when managed with modern therapeutic options. Larynx-preservation protocols could be a suitable option in T1-T2 (RT or chemo-RT) and selected T3 sSCC patients (induction CT).
Abstract:
Many forested areas have been converted to intensive agricultural use to satisfy food, fiber, and forage production for a growing world population. There is great interest in evaluating forest conversion to cultivated land because this conversion adversely affects several soil properties. We examined soil microbial, physical, and chemical properties in an Oxisol (Latossolo Vermelho distrófico) of southern Brazil 24 years after forest conversion to a perennial crop with coffee or to annual grain crops (maize and soybeans) under conventional tillage or no-tillage. One goal was to determine which soil quality parameters seemed most sensitive to change. A second goal was to test the hypothesis that no-tillage optimized preservation of soil quality indicators in annual cropping systems on converted land. Land use significantly affected microbial biomass and its activity, C and N mineralization, and aggregate stability by depth. Cultivated sites had lower microbial biomass and mineralizable C and N than a forest used as control. The forest and no-tillage sites had higher microbial biomass and mineralizable C and N than the conventional tillage site, and their metabolic quotients were 65 and 43% lower, respectively. Multivariate analysis of soil microbial properties showed a clear separation among treatments, displaying a gradient from conventional tillage to forest. Although the soil at the coffee site was less disturbed and had a high organic C content, microbial activity was low, probably due to greater soil acidity and Al toxicity. Under annual cropping, microbial activity in no-tillage was double that under conventional tillage management. The greater microbial activity in the forest and no-tillage sites may be attributed, at least partially, to lower soil disturbance. Reducing soil disturbance is important for soil C sequestration and microbial activity, although control of soil pH and Al toxicity is also essential to keep soil microbial activity high.
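The metabolic quotient (qCO2) cited above is, by standard definition, basal respiration per unit of microbial biomass C, and the "65 and 43% lower" comparison is a simple relative difference; a sketch with hypothetical values (not the study's measurements):

```python
def metabolic_quotient(basal_respiration, microbial_biomass_c):
    # qCO2: CO2-C respired per unit microbial biomass C
    # (e.g., mg CO2-C g^-1 Cmic h^-1); lower = more efficient community
    return basal_respiration / microbial_biomass_c

def percent_lower(reference, value):
    # how much lower `value` is than `reference`, in percent
    return 100.0 * (reference - value) / reference

# hypothetical: conventional tillage qCO2 = 2.0, forest = 0.7
reduction = percent_lower(2.0, 0.7)  # 65.0
```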
Abstract:
BACKGROUND: The risk of falls is the most commonly cited reason for not providing oral anticoagulation, although the risk of bleeding associated with falls on oral anticoagulants is still debated. We aimed to evaluate whether patients on oral anticoagulation with high falls risk have an increased risk of major bleeding. METHODS: We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants. The outcome was the time to a first major bleed within a 12-month follow-up period adjusted for age, sex, alcohol abuse, number of drugs, concomitant treatment with antiplatelet agents, and history of stroke or transient ischemic attack. RESULTS: Among the 515 enrolled patients, 35 patients had a first major bleed during follow-up (incidence rate: 7.5 per 100 patient-years). Overall, 308 patients (59.8%) were at high risk of falls, and these patients had a nonsignificantly higher crude incidence rate of major bleeding than patients at low risk of falls (8.0 vs 6.8 per 100 patient-years, P=.64). In multivariate analysis, a high falls risk was not statistically significantly associated with the risk of a major bleed (hazard ratio 1.09; 95% confidence interval, 0.54-2.21). Overall, only 3 major bleeds occurred directly after a fall (incidence rate: 0.6 per 100 patient-years). CONCLUSIONS: In this prospective cohort, patients on oral anticoagulants at high risk of falls did not have a significantly increased risk of major bleeds. These findings suggest that being at risk of falls is not a valid reason to avoid oral anticoagulants in medical patients.
Abstract:
RATIONALE: Many sources of conflict exist in intensive care units (ICUs). Few studies recorded the prevalence, characteristics, and risk factors for conflicts in ICUs. OBJECTIVES: To record the prevalence, characteristics, and risk factors for conflicts in ICUs. METHODS: One-day cross-sectional survey of ICU clinicians. Data on perceived conflicts in the week before the survey day were obtained from 7,498 ICU staff members (323 ICUs in 24 countries). MEASUREMENTS AND MAIN RESULTS: Conflicts were perceived by 5,268 (71.6%) respondents. Nurse-physician conflicts were the most common (32.6%), followed by conflicts among nurses (27.3%) and staff-relative conflicts (26.6%). The most common conflict-causing behaviors were personal animosity, mistrust, and communication gaps. During end-of-life care, the main sources of perceived conflict were lack of psychological support, absence of staff meetings, and problems with the decision-making process. Conflicts perceived as severe were reported by 3,974 (53%) respondents. Job strain was significantly associated with perceiving conflicts and with greater severity of perceived conflicts. Multivariate analysis identified 15 factors associated with perceived conflicts, of which 6 were potential targets for future intervention: staff working more than 40 h/wk, more than 15 ICU beds, caring for dying patients or providing pre- and postmortem care within the last week, symptom control not ensured jointly by physicians and nurses, and no routine unit-level meetings. CONCLUSIONS: Over 70% of ICU workers reported perceived conflicts, which were often considered severe and were significantly associated with job strain. Workload, inadequate communication, and end-of-life care emerged as important potential targets for improvement.
Abstract:
BACKGROUND: Atazanavir-associated hyperbilirubinemia can cause premature discontinuation of atazanavir and avoidance of its initial prescription. We used genomewide genotyping and clinical data to characterize determinants of atazanavir pharmacokinetics and hyperbilirubinemia in AIDS Clinical Trials Group protocol A5202. METHODS: Plasma atazanavir pharmacokinetics and indirect bilirubin concentrations were characterized in HIV-1-infected patients randomized to atazanavir/ritonavir-containing regimens. A subset had genomewide genotype data available. RESULTS: Genomewide assay data were available from 542 participants, of whom 475 also had data on estimated atazanavir clearance and relevant covariates available. Peak bilirubin concentration and relevant covariates were available for 443 participants. By multivariate analysis, higher peak on-treatment bilirubin levels were found to be associated with the UGT1A1 rs887829 T allele (P=6.4×10), higher baseline hemoglobin levels (P=4.9×10), higher baseline bilirubin levels (P=6.7×10), and slower plasma atazanavir clearance (P=8.6×10). For peak bilirubin levels greater than 3.0 mg/dl, the positive predictive value of a baseline bilirubin level of 0.5 mg/dl or higher with hemoglobin concentrations of 14 g/dl or higher was 0.51, which increased to 0.85 with rs887829 TT homozygosity. For peak bilirubin levels of 3.0 mg/dl or lower, the positive predictive value of a baseline bilirubin level less than 0.5 mg/dl with a hemoglobin concentration less than 14 g/dl was 0.91, which increased to 0.96 with rs887829 CC homozygosity. No polymorphism predicted atazanavir pharmacokinetics at genomewide significance. CONCLUSION: Atazanavir-associated hyperbilirubinemia is best predicted by considering UGT1A1 genotype, baseline bilirubin level, and baseline hemoglobin level in combination. Use of ritonavir as a pharmacokinetic enhancer may have abrogated genetic associations with atazanavir pharmacokinetics.
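The predictive values reported above reduce to the usual definition PPV = TP / (TP + FP) over the patients flagged by the combined genotype-and-baseline rule; a minimal sketch with made-up flags and outcomes (not trial data):

```python
def positive_predictive_value(predicted, outcome):
    # PPV = true positives / all subjects the predictor flags as positive
    tp = sum(1 for p, o in zip(predicted, outcome) if p and o)
    fp = sum(1 for p, o in zip(predicted, outcome) if p and not o)
    return tp / (tp + fp)

# hypothetical: 4 patients, rule flags the first three, 2 of them truly
# exceed the bilirubin threshold
ppv = positive_predictive_value([1, 1, 1, 0], [1, 1, 0, 1])  # 2/3
```

Adding rs887829 genotype to the rule, as in the abstract, works by shrinking the false-positive count among flagged patients, which is exactly what raises the PPV from 0.51 to 0.85.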
Abstract:
Background and Purpose-Demographic changes will result in a rapid increase of patients age ≥90 years (nonagenarians), but little is known about outcomes in these patients after intravenous thrombolysis (IVT) for acute ischemic stroke. We aimed to assess safety and functional outcome in nonagenarians treated with IVT and to compare the outcomes with those of patients age 80 to 89 years (octogenarians). Methods-We analyzed prospectively collected data of 284 consecutive stroke patients age ≥80 years treated with IVT in 7 Swiss stroke units. Presenting characteristics, favorable outcome (modified Rankin scale [mRS] 0 or 1), mortality at 3 months, and symptomatic intracranial hemorrhage (SICH) using the National Institute of Neurological Disorders and Stroke (NINDS) and Safe Implementation of Thrombolysis in Stroke-Monitoring Study (SITS-MOST) criteria were compared between nonagenarians and octogenarians. Results-As compared with octogenarians (n=238; mean age, 83 years), nonagenarians (n=46; mean age, 92 years) were more often women (70% versus 54%; P=0.046) and had lower systolic blood pressure (161 mm Hg versus 172 mm Hg; P=0.035). Patients age ≥90 years less often had a favorable outcome and had higher mortality than patients age 80 to 89 years (14.3% versus 30.2%; P=0.034; and 45.2% versus 22.1%; P=0.002, respectively), while more nonagenarians than octogenarians experienced a SICH (SICH per NINDS criteria, 13.3% versus 5.9%; P=0.106; SICH per SITS-MOST criteria, 13.3% versus 4.7%; P=0.037). Multivariate adjustment identified age ≥90 years as an independent predictor of mortality (P=0.017). Conclusions-Our study suggests less favorable outcomes in nonagenarians as compared with octogenarians after IVT for ischemic stroke, and it demands careful selection for treatment, unless randomized controlled trials yield more evidence for IVT in very old stroke patients. (Stroke. 2011;42:1967-1970.)
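Comparisons such as 14.3% versus 30.2% favorable outcome are classically tested with a two-proportion z-test; a self-contained version (the counts below are toy numbers, not the study's data):

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled SE)."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value

z, p = two_proportion_z(10, 100, 30, 100)  # 10% vs 30%
```

For the small nonagenarian subgroup, an exact (Fisher) test would be preferable, but the z-test conveys the structure of the group comparisons above.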
Abstract:
BACKGROUND: Some physicians are still concerned about the safety of treatment at home of patients with acute deep venous thrombosis (DVT). METHODS: We used data from the RIETE (Registro Informatizado de la Enfermedad TromboEmbólica) registry to compare the outcomes in consecutive outpatients with acute lower limb DVT according to initial treatment at home or in the hospital. A propensity score-matching analysis was carried out with a logistic regression model. RESULTS: As of December 2012, 13,493 patients had been enrolled. Of these, 4456 (31%) were treated at home. Patients treated at home were more likely to be male and younger and to weigh more; they were less likely than those treated in the hospital to have chronic heart failure, lung disease, renal insufficiency, anemia, recent bleeding, immobilization, or cancer. During the first week of anticoagulation, 27 patients (0.20%) suffered pulmonary embolism (PE), 12 (0.09%) recurrent DVT, and 51 (0.38%) major bleeding; 80 (0.59%) died. When only patients treated at home were considered, 12 (0.27%) had PE, 4 (0.09%) had recurrent DVT, 6 (0.13%) bled, and 4 (0.09%) died (no fatal PE, 3 fatal bleeds). After propensity analysis, patients treated at home had a similar rate of venous thromboembolism recurrences and a lower rate of major bleeding (odds ratio, 0.4; 95% confidence interval, 0.1-1.0) or death (odds ratio, 0.2; 95% confidence interval, 0.1-0.7) within the first week compared with those treated in the hospital. CONCLUSIONS: In outpatients with DVT, home treatment was associated with a better outcome than treatment in the hospital. These data may help safely treat more DVT patients at home.
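The propensity-score matching used in the RIETE analysis can be sketched end to end: fit a logistic model for P(home treatment | covariates), then greedily pair treated and control subjects whose scores fall within a caliper. Everything below (the gradient-descent logistic fit, the 0.05 caliper, the synthetic data) is an illustrative assumption, not the registry's actual method:

```python
import numpy as np

def propensity_scores(X, t, iters=2000, lr=0.1):
    """Logistic regression by gradient descent: estimated P(treated | X)."""
    X1 = np.c_[np.ones(len(X)), X]          # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - t) / len(t)   # mean log-loss gradient step
    return 1.0 / (1.0 + np.exp(-X1 @ w))

def greedy_match(ps, t, caliper=0.05):
    """Pair each treated subject with the nearest unmatched control."""
    treated = np.where(t == 1)[0]
    control = list(np.where(t == 0)[0])
    pairs = []
    for i in treated:
        if not control:
            break
        j = min(control, key=lambda k: abs(ps[k] - ps[i]))
        if abs(ps[j] - ps[i]) <= caliper:   # only accept close matches
            pairs.append((i, j))
            control.remove(j)               # match without replacement
    return pairs

# synthetic cohort: treatment probability depends on two covariates
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 2))
logits = 0.8 * X[:, 0] - 0.5 * X[:, 1]
t = (rng.random(200) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

ps = propensity_scores(X, t)
pairs = greedy_match(ps, t)
```

Outcomes (recurrent VTE, major bleeding, death) are then compared within the matched pairs, which is what yields the adjusted odds ratios quoted above.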