865 results for layoff hazard rates


Relevance: 20.00%

Abstract:

Purpose: To evaluate rates of visual field progression in eyes with optic disc hemorrhages and the effect of intraocular pressure (IOP) reduction on these rates. Design: Observational cohort study. Participants: The study included 510 eyes of 348 patients with glaucoma who were recruited from the Diagnostic Innovations in Glaucoma Study (DIGS) and followed for an average of 8.2 years. Methods: Eyes were followed annually with clinical examination, standard automated perimetry visual fields, and optic disc stereophotographs. The presence of optic disc hemorrhages was determined on the basis of masked evaluation of optic disc stereophotographs. Evaluation of rates of visual field change during follow-up was performed using the visual field index (VFI). Main Outcome Measures: The evaluation of the effect of optic disc hemorrhages on rates of visual field progression was performed using random coefficient models. Estimates of rates of change for individual eyes were obtained by best linear unbiased prediction (BLUP). Results: During follow-up, 97 (19%) of the eyes had at least 1 episode of disc hemorrhage. The overall rate of VFI change in eyes with hemorrhages was significantly faster than in eyes without hemorrhages (-0.88%/year vs. -0.38%/year, respectively; P < 0.001). The difference in rates of visual field loss pre- and post-hemorrhage was significantly related to the reduction of IOP in the post-hemorrhage period compared with the pre-hemorrhage period (r = -0.61; P < 0.001). Each 1 mmHg of IOP reduction was associated with a difference of 0.31%/year in the rate of VFI change. Conclusions: There was a beneficial effect of treatment in slowing rates of progressive visual field loss in eyes with optic disc hemorrhage. Further research should elucidate the reasons why some patients with hemorrhages respond well to IOP reduction and others seem to continue to progress despite a significant reduction in IOP levels. Financial Disclosure(s): Proprietary or commercial disclosure may be found after the references. Ophthalmology 2010;117:2061-2066. © 2010 by the American Academy of Ophthalmology.
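The random-coefficients analysis described above can be sketched as a linear mixed model: each eye gets its own intercept and VFI slope, and the empirical Bayes estimates of the random effects play the role of BLUPs. A minimal illustration in Python with statsmodels, assuming a hypothetical long-format table with columns eye_id, years, vfi, and hemorrhage (none of these names come from the paper):

```python
# Minimal sketch (not the authors' code) of a random-coefficients model for
# rates of VFI change, assuming a hypothetical long-format CSV with columns:
# eye_id, years (follow-up time), vfi (visual field index, %), hemorrhage (0/1).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vfi_longitudinal.csv")  # hypothetical input file

# Random intercept and random slope for time, per eye; the fixed-effect
# interaction years:hemorrhage estimates the difference in VFI slopes
# between eyes with and without disc hemorrhages.
model = smf.mixedlm(
    "vfi ~ years * hemorrhage",
    data=df,
    groups=df["eye_id"],
    re_formula="~years",
)
result = model.fit()
print(result.summary())

# Empirical Bayes estimates of each eye's random effects serve as BLUPs:
# eye-specific slope = fixed slope + that eye's random slope deviation.
blups = result.random_effects  # dict: eye_id -> Series including "years"
```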

Relevance: 20.00%

Abstract:

PURPOSE. To evaluate and compare rates of change in neuroretinal rim area (RA) and retinal nerve fiber layer thickness (RNFLT) measurements in glaucoma patients, those with suspected glaucoma, and normal subjects observed over time. METHODS. In this observational cohort study, patients recruited from two longitudinal studies (Diagnostic Innovations in Glaucoma Study, DIGS, and African Descent and Glaucoma Evaluation Study, ADAGES) were observed with standard achromatic perimetry (SAP), optic disc stereophotographs, confocal scanning laser ophthalmoscopy (HRT-3; Heidelberg Engineering, Heidelberg, Germany), and scanning laser polarimetry (GDx-VCC; Carl Zeiss Meditec, Inc., Dublin, CA). Glaucoma progression was determined by the Guided Progression Analysis software for SAP and by masked assessment of serial optic disc stereophotographs by expert graders. Random-coefficients models were used to evaluate rates of change in average RNFLT and global RA measurements and their relationship with glaucoma progression. RESULTS. At baseline, 194 (31%) eyes were glaucomatous, 347 (55%) had suspected glaucoma, and 88 (14%) were normal. Forty-six (9%) eyes showed progression by SAP and/or stereophotographs during an average follow-up of 3.3 (±0.7) years. The average rate of decline for RNFLT measurements was significantly higher in the progressing group than in the non-progressing group (-0.65 vs. -0.11 μm/y, respectively; P < 0.001), whereas RA decline was not significantly different between these groups (-0.0058 vs. -0.0073 mm²/y, respectively; P = 0.727). The areas under the receiver operating characteristic (ROC) curves used to discriminate progressing versus nonprogressing eyes were 0.811 and 0.507 for the rates of change in the RNFLT and RA, respectively (P < 0.001). CONCLUSIONS. The ability to discriminate eyes with progressing glaucoma by SAP and/or stereophotographs from stable eyes was significantly greater for RNFLT than for RA measurements. (Invest Ophthalmol Vis Sci. 2010;51:3531-3539) DOI: 10.1167/iovs.09-4350
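The AUC comparison at the end of the abstract reduces to scoring each eye by its estimated rate of change and asking how well that score separates progressing from stable eyes. A small sketch with scikit-learn on simulated data; the slopes are drawn around the group means reported above, but the spread, sample, and variable names are all invented:

```python
# Sketch of the discrimination analysis on simulated data; group-mean slopes
# come from the abstract, everything else is an assumption.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
progressed = rng.binomial(1, 0.09, n)          # ~9% progressing, as reported
rnflt_rate = rng.normal(np.where(progressed == 1, -0.65, -0.11), 0.30)
ra_rate = rng.normal(np.where(progressed == 1, -0.0058, -0.0073), 0.02)

# More-negative slope = faster loss, so negate the slopes so that larger
# scores point toward progression.
auc_rnflt = roc_auc_score(progressed, -rnflt_rate)
auc_ra = roc_auc_score(progressed, -ra_rate)
print(f"AUC RNFLT rate: {auc_rnflt:.3f}   AUC RA rate: {auc_ra:.3f}")
```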

Relevance: 20.00%

Abstract:

OBJECTIVES We sought to assess the prognostic value and risk classification improvement of contemporary single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI) for predicting all-cause mortality. BACKGROUND Myocardial perfusion is a strong estimator of prognosis. Evidence published to date has neither established the added prognostic value of SPECT-MPI nor defined an approach to improve classification of risk in women from a developing nation. METHODS A total of 2,225 women referred for SPECT-MPI were followed for a mean period of 3.7 ± 1.4 years. SPECT-MPI results were classified as abnormal based on the presence of any perfusion defect. Abnormal scans were further classified as showing mild/moderate reversible, severe reversible, partially reversible, or fixed perfusion defects. Risk estimates for incident mortality were categorized as <1%/year, 1% to 2%/year, and >2%/year using Cox proportional hazards models. Risk-adjusted models incorporated clinical risk factors, left ventricular ejection fraction (LVEF), and perfusion variables. RESULTS All-cause death occurred in 139 patients. SPECT-MPI significantly risk-stratified the population; patients with abnormal scans had significantly higher death rates than patients with normal scans (13.1% vs. 4.0%, respectively; p < 0.001). Cox analysis demonstrated that, after adjusting for clinical risk factors and LVEF, SPECT-MPI improved model discrimination (integrated discrimination index = 0.009; p = 0.02), added significant incremental prognostic information (global chi-square increased from 87.7 to 127.1; p < 0.0001), and improved risk prediction (net reclassification improvement = 0.12; p = 0.005). CONCLUSIONS SPECT-MPI added significant incremental prognostic information to clinical and left ventricular functional variables while enhancing the ability to classify this Brazilian female population into low- and high-risk categories of all-cause mortality. (J Am Coll Cardiol Img 2011;4:880-8) © 2011 by the American College of Cardiology Foundation
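The <1%/year, 1% to 2%/year, and >2%/year categories can be reproduced in outline by fitting a Cox model and converting each patient's predicted 1-year survival into an annualized mortality risk. A hedged sketch with the lifelines library (not the authors' software; the file and column names are assumptions):

```python
# Sketch of the risk-stratification step: fit a Cox model, then bin each
# woman's predicted annualized mortality risk at 1%/year and 2%/year.
# Hypothetical columns: years, death, age, diabetes, lvef, perfusion_abnormal.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("spect_mpi.csv")  # hypothetical input file
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="death",
        formula="age + diabetes + lvef + perfusion_abnormal")

# Predicted probability of surviving 1 year -> annualized risk estimate.
one_year_surv = cph.predict_survival_function(df, times=[1.0]).T[1.0]
annual_risk = 1.0 - one_year_surv
risk_band = pd.cut(annual_risk, bins=[0, 0.01, 0.02, 1.0],
                   labels=["<1%/y", "1-2%/y", ">2%/y"])
print(risk_band.value_counts())
```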

Relevance: 20.00%

Abstract:

Background: Perioperative complications following robotic-assisted radical prostatectomy (RARP) have been reported in recent series. Few studies, however, have used standardized systems to classify surgical complications, and that inconsistency has hampered accurate comparisons between different series or surgical approaches. Objective: To assess trends in the incidence of, and to classify, perioperative surgical complications following RARP in 2500 consecutive patients. Design, setting, and participants: We analyzed 2500 patients who underwent RARP for treatment of clinically localized prostate cancer (PCa) from August 2002 to February 2009. Data were prospectively collected in a customized database and retrospectively analyzed. Intervention: All patients underwent RARP performed by a single surgeon. Measurements: Complications were classified using the Clavien grading system. To evaluate trends in complications and radiologic anastomotic leaks, we compared eight groups of 300 patients each, categorized according to the surgeon's experience (number of cases). Results and limitations: Our median operative time was 90 min (interquartile range [IQR]: 75-100 min). The median estimated blood loss was 100 ml (IQR: 100-150 ml). Our conversion rate was 0.08%, comprising two procedures converted to standard laparoscopy due to robot malfunction. One hundred and forty complications were observed in 127 patients (5.08%). The following percentages of patients presented graded complications: grade 1, 2.24%; grade 2, 1.8%; grade 3a, 0.08%; grade 3b, 0.48%; grade 4a, 0.40%. There were no cases of multiple organ dysfunction or death (grades 4b and 5). There were significant decreases in the overall complication rate (p = 0.0034) and in the number of anastomotic leaks (p < 0.001) as the surgeon's experience increased. Conclusions: RARP is a safe option for treatment of clinically localized PCa, presenting low complication rates in experienced hands. Although the robotic system provides the surgeon with enhanced vision and dexterity, proficiency is only accomplished with consistent surgical volume; complication rates demonstrated a tendency to decrease as the surgeon's experience increased. © 2010 European Association of Urology. Published by Elsevier B.V. All rights reserved.
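One simple way to test the learning-curve effect reported here is a trend test: regress the complication indicator on case order and check the sign and significance of the slope. The authors compared eight sequential groups of cases; logistic regression on case order, shown below, is a plainly swapped-in alternative, and the table and column names are hypothetical:

```python
# Sketch of a learning-curve trend test, assuming a per-patient table with
# hypothetical columns: case_number (1..2500) and complication (0/1).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rarp_cases.csv")  # hypothetical input file
fit = smf.logit("complication ~ case_number", data=df).fit()
print(fit.summary())
# A negative, significant case_number coefficient indicates declining
# complication odds as experience accumulates; the same regression can be
# run with anastomotic leak as the outcome.
```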

Relevance: 20.00%

Abstract:

Our purpose was to retrospectively compare controlled ovarian hyperstimulation (COH) in IVF cycles with administration of hCG on day 1 of menses (D1-hCG) against cycles in women not receiving hCG on day 1 of menses (Control). Data on maternal age, endocrine profile, amount of rFSH required, embryo characteristics, and implantation and pregnancy rates were recorded for comparison between D1-hCG (n = 36) and Control (n = 64). The dose of rFSH required to accomplish COH was significantly lower in D1-hCG. Following ICSI, more top-quality embryos were available for transfer per patient in D1-hCG, and biochemical pregnancy rates per transfer were significantly higher in D1-hCG. Significantly higher implantation and ongoing pregnancy rates per embryo transfer were observed in D1-hCG (64%) compared with Control (41%). Administration of D1-hCG prior to COH reduces rFSH use, enhances oocyte developmental competence to obtain top-quality embryos, and improves implantation and ongoing pregnancy rates. At present it is not clear whether the benefit is related to producing an embryo that is more likely to implant or a more receptive uterus, or is merely fortuitous and related to the relatively low power of the study.
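The headline comparison of ongoing pregnancy rates (64% vs. 41%) is a 2×2 contingency problem. A minimal sketch with Fisher's exact test; the counts below are back-calculated from the reported percentages and group sizes, so they are approximations, not the study's raw data:

```python
# Sketch of the between-group rate comparison with Fisher's exact test.
# Counts are reconstructed assumptions: ~64% of 36 and ~41% of 64 transfers.
from scipy.stats import fisher_exact

#          [pregnant, not pregnant]
d1_hcg  = [23, 13]   # ~64% of 36 (assumed denominators)
control = [26, 38]   # ~41% of 64 (assumed denominators)

odds_ratio, p_value = fisher_exact([d1_hcg, control])
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```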

Relevance: 20.00%

Abstract:

BACKGROUND: Transanal endoscopic microsurgery may represent an appropriate diagnostic and therapeutic procedure in selected patients with distal rectal cancer following neoadjuvant chemoradiation. Even though this procedure has been associated with low rates of postoperative complications, patients undergoing neoadjuvant chemoradiation seem to be at increased risk for suture line dehiscence. In this setting, we compared the clinical outcomes of patients undergoing transanal endoscopic microsurgery with and without neoadjuvant chemoradiation. METHODS: Thirty-six consecutive patients were treated by transanal endoscopic microsurgery at a single institution. Twenty-three patients underwent local excision after neoadjuvant chemoradiation therapy for rectal adenocarcinoma, and 13 patients underwent local excision without any neoadjuvant treatment for benign and malignant rectal tumors. Chemoradiation therapy included 50.4 to 54 Gy and 5-fluorouracil-based chemotherapy. All patients underwent transanal endoscopic microsurgery with primary closure of the rectal defect. Complications (immediate and late) and readmission rates were compared between groups. RESULTS: Overall, median hospital stay was 2 days. The immediate (30-day) complication rate was 44% for grade II/III complications. Patients undergoing neoadjuvant chemoradiation therapy were more likely to develop grade II/III immediate complications (56% vs. 23%; P = .05). Overall, the 30-day readmission rate was 30%. Wound dehiscence was significantly more frequent among patients undergoing neoadjuvant chemoradiation therapy (70% vs. 23%; P = .03). Patients undergoing neoadjuvant chemoradiation therapy were at significantly higher risk of requiring readmission (43% vs. 7%; P = .02). CONCLUSION: Transanal local excision using an endoscopic microsurgical approach may result in significant postoperative morbidity, wound dehiscence, and readmission, in particular because of rectal pain secondary to wound dehiscence. In this setting, the benefits of this minimally invasive approach, whether for diagnostic or therapeutic purposes, are restricted to highly selected patients who can potentially avoid a major operation but will still face a significantly morbid and painful procedure.
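The readmission comparison (43% vs. 7%) can also be expressed as a relative risk with a Wald confidence interval. A short sketch; the counts are reconstructed from the reported percentages and group sizes (23 and 13 patients), so treat them as approximate:

```python
# Sketch of a relative-risk calculation for readmission after neoadjuvant
# chemoradiation; counts are assumptions back-calculated from the abstract.
import math

a, n1 = 10, 23   # readmitted / total, chemoradiation group (~43%)
b, n2 = 1, 13    # readmitted / total, no-neoadjuvant group (~7%)

rr = (a / n1) / (b / n2)
# Standard error of log(RR) for two independent binomial proportions.
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```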

Relevance: 20.00%

Abstract:

The incidence of heart failure (HF) in diabetes is rising, both in the presence and in the absence of CHD. Prospective population-based studies can help describe the relationship between HbA1c, a measure of glycaemic control, and HF risk. We studied the incidence of HF hospitalisation or death among 1,827 participants in the Atherosclerosis Risk in Communities (ARIC) study with diabetes and no evidence of HF at baseline. Cox proportional hazards models included age, sex, race, education, health insurance status, alcohol consumption, BMI and WHR, and major CHD risk factors (BP level and medications, LDL- and HDL-cholesterol levels, and smoking). In this population of persons with diabetes, crude HF incidence rates per 1,000 person-years were lower in the absence of CHD (incidence rate 15.5 for CHD-negative vs. 56.4 for CHD-positive; p < 0.001). The adjusted HR of HF for each 1% higher HbA1c was 1.17 (95% CI 1.11-1.25) in the non-CHD group and 1.20 (95% CI 1.04-1.40) in the CHD group. When the analysis was limited to HF cases occurring in the absence of prevalent or incident CHD (during follow-up), the adjusted HR remained 1.20 (95% CI 1.11-1.29). These data suggest HbA1c is an independent risk factor for incident HF in persons with diabetes, with or without CHD. Long-term clinical trials of tight glycaemic control should quantify the impact of different treatment regimens on HF risk reduction.
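A Cox hazard ratio reported per 1% of HbA1c scales multiplicatively: for a k% higher HbA1c the hazard ratio is exp(k·β) = HR^k, and the confidence limits scale the same way. A tiny sketch using the non-CHD estimate from the abstract:

```python
# Scaling a per-unit Cox hazard ratio to larger exposure differences.
# HR per 1% HbA1c (non-CHD group) and its CI are taken from the abstract.
hr, lo, hi = 1.17, 1.11, 1.25
for k in (1, 2, 3):
    print(f"+{k}% HbA1c: HR = {hr**k:.2f} "
          f"(95% CI {lo**k:.2f}-{hi**k:.2f})")
```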

Relevance: 20.00%

Abstract:

Hantaviruses are rodent-borne bunyaviruses that infect the Arvicolinae, Murinae, and Sigmodontinae subfamilies of Muridae. The rate of molecular evolution in the hantaviruses has previously been estimated at approximately 10⁻⁷ nucleotide substitutions per site, per year (substitutions/site/year), based on the assumption of codivergence, and hence shared divergence times, with their rodent hosts. If substantiated, this would make the hantaviruses among the slowest evolving of all RNA viruses. However, as hantaviruses replicate with an RNA-dependent RNA polymerase, with error rates in the region of one mutation per genome replication, this low rate of nucleotide substitution is anomalous. Here, we use a Bayesian coalescent approach to estimate the rate of nucleotide substitution from serially sampled gene sequence data for hantaviruses known to infect each of the 3 rodent subfamilies: Araraquara virus (Sigmodontinae), Dobrava virus (Murinae), Puumala virus (Arvicolinae), and Tula virus (Arvicolinae). Our results reveal that hantaviruses exhibit short-term substitution rates of 10⁻² to 10⁻⁴ substitutions/site/year and so are within the range exhibited by other RNA viruses. The disparity between this substitution rate and that estimated assuming rodent-hantavirus codivergence suggests that the codivergence hypothesis may need to be reevaluated.
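The paper estimates rates with a Bayesian coalescent method (typically run in software such as BEAST); a much simpler heuristic with the same goal, shown here purely as an illustration, is root-to-tip regression, where the slope of root-to-tip genetic distance against sampling year estimates substitutions/site/year. The input values below are invented:

```python
# Root-to-tip regression sketch for serially sampled sequences: regress
# root-to-tip genetic distance on sampling year; the slope is the rate.
# All numbers are illustrative, not from the paper.
import numpy as np

sampling_year = np.array([1985.0, 1992.0, 1999.0, 2004.0, 2007.0])
root_to_tip = np.array([0.021, 0.024, 0.027, 0.029, 0.030])  # subs/site

slope, intercept = np.polyfit(sampling_year, root_to_tip, 1)
print(f"estimated rate: {slope:.2e} substitutions/site/year")
```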

Relevance: 20.00%

Abstract:

Historically, the cure rate model has been used for modeling time-to-event data in which a significant proportion of patients are assumed to be cured of illnesses including breast cancer, non-Hodgkin lymphoma, leukemia, prostate cancer, melanoma, and head and neck cancer. Perhaps the most popular type of cure rate model is the mixture model introduced by Berkson and Gage [1]. In this model, it is assumed that a certain proportion of the patients are cured, in the sense that they do not present the event of interest during a long period of time and can be regarded as immune to the cause of failure under study. In this paper, we propose a general hazard model which accommodates comprehensive families of cure rate models as particular cases, including the model proposed by Berkson and Gage. The maximum likelihood estimation procedure is discussed. A simulation study analyzes the coverage probabilities of the asymptotic confidence intervals for the parameters. A real data set on children exposed to HIV by vertical transmission illustrates the methodology.
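In the Berkson and Gage mixture model, the population survival function is S(t) = π + (1 − π)S₀(t), where π is the cured fraction and S₀ is the survival function of the uncured; the implied hazard tends to zero as t grows, producing the plateau characteristic of cure-rate data. A minimal maximum-likelihood sketch under an assumed exponential baseline with right censoring (the paper's proposed model is more general; this only illustrates the mixture structure):

```python
# Berkson-Gage mixture cure model with an assumed exponential baseline:
# S(t) = pi + (1 - pi) * exp(-lambda * t), fitted by ML with censoring.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, t, event):
    pi = expit(params[0])          # cured fraction, constrained to (0, 1)
    lam = np.exp(params[1])        # baseline rate, constrained positive
    s0 = np.exp(-lam * t)
    surv = pi + (1 - pi) * s0      # population survival
    dens = (1 - pi) * lam * s0     # density contributed by the uncured
    return -np.sum(event * np.log(dens) + (1 - event) * np.log(surv))

# Simulated data: 40% cured (never fail), uncured fail at rate 0.5,
# administrative censoring at t = 10.
rng = np.random.default_rng(1)
n = 500
cured = rng.random(n) < 0.4
t = np.where(cured, np.inf, rng.exponential(1 / 0.5, n))
event = (t <= 10).astype(float)
t = np.minimum(t, 10.0)

fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(t, event))
print("cured fraction:", expit(fit.x[0]), " rate:", np.exp(fit.x[1]))
```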

Relevance: 20.00%

Abstract:

Our objective was to describe the prevalence of low concentrations of retinol, beta-carotene, and vitamin E in a group of human immunodeficiency virus (HIV)-infected Latin American children and a comparison group of HIV-exposed, uninfected children. Our hypothesis was that the rates of low concentrations of these micronutrients would be higher in the HIV-infected group than in the HIV-exposed, uninfected group. This was a cross-sectional substudy of a larger cohort study at clinical pediatric HIV centers in Latin America. Serum levels of micronutrients were measured by high-performance liquid chromatography in the first stored sample obtained after each child's first birthday. Low concentrations of retinol, beta-carotene, and vitamin E were defined as serum levels below 0.70, 0.35, and 18.0 μmol/L, respectively. The population for this analysis was 336 children (124 HIV-infected, 212 HIV-exposed, uninfected) aged 1 year or older and younger than 4 years. Rates of low concentrations were 74% for retinol, 27% for beta-carotene, and 89% for vitamin E. These rates were not affected by HIV status. Among the HIV-infected children, those treated with antiretrovirals were less likely to have retinol deficiency, but no other HIV-related factors correlated with low micronutrient serum levels. Low concentrations of retinol, beta-carotene, and vitamin E are very common in children exposed to HIV living in Brazil, Argentina, and Mexico, regardless of HIV-infection status. Published by Elsevier Inc.
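Given the cutoffs above, the prevalence computation is simply the fraction of serum values below each threshold. A small sketch, assuming a hypothetical DataFrame with one column per micronutrient in μmol/L (the cutoffs are the ones stated in the abstract):

```python
# Sketch of prevalence-below-cutoff rates; file and column names are
# assumptions, cutoff values come from the abstract.
import pandas as pd

df = pd.read_csv("micronutrients.csv")  # hypothetical input file
cutoffs = {"retinol": 0.70, "beta_carotene": 0.35, "vitamin_e": 18.0}
for col, cut in cutoffs.items():
    rate = (df[col] < cut).mean()  # proportion of children below cutoff
    print(f"{col}: {rate:.0%} below {cut} umol/L")
```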

Relevance: 20.00%

Abstract:

Background Occupational risk due to airborne disease challenges healthcare institutions. Environmental measures are effective, but their cost-effectiveness is still debatable, and most capacity planning is based on occupancy rates. Better indices to plan and evaluate capacity are needed. Goal To evaluate the impact of installing an exclusively dedicated respiratory isolation room (EDRIR) in a tertiary emergency department (ED), determined by a time-to-reach-facility method. Methods A group of patients in need of respiratory isolation was first identified: group I (2004; 29 patients; 44.1 ± 3.4 years). The occupancy rate and time intervals (arrival to diagnosis, diagnosis to respiratory isolation indication, and indication to effective isolation) were determined, and it was estimated that adding an EDRIR would have a significant impact on the time to isolation. After implementing the EDRIR, a second group of patients was gathered in the same period of the year: group II (2007; 50 patients; 43.4 ± 1.8 years). Demographic and functional parameters were recorded to evaluate time to isolation. Cox proportional hazards models adjusted for age, gender, and in-hospital respiratory isolation room availability were obtained. Results Implementing the EDRIR decreased the time from arrival to indication of respiratory isolation (27.5 ± 9.3 vs. 3.7 ± 2.0; p = 0.018) and from indication to effective respiratory isolation (13.3 ± 3.0 vs. 2.94 ± 1.06; p = 0.003), but not the duration of respiratory isolation or the total hospital stay. The impact on crude isolation rates was highly significant (8.9 vs. 75.4 per 100,000 patients; p < 0.001). The HR for effective respiratory isolation was 26.8 (95% CI 7.42 to 96.9; p < 0.001) in favor of 2007. Conclusion Implementing an EDRIR in a tertiary ED significantly reduced the time to respiratory isolation.
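The time-to-isolation endpoint is a standard time-to-event setup. The paper reports adjusted Cox models; as a simpler companion, a Kaplan-Meier comparison of the 2004 and 2007 groups with a log-rank test can be sketched with lifelines (file and column names are assumptions):

```python
# Kaplan-Meier sketch of time to effective respiratory isolation, pre- vs.
# post-EDRIR. Hypothetical columns: hours (time to isolation or censoring),
# isolated (1 if isolation was reached), year (2004 or 2007).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("ed_isolation.csv")  # hypothetical input file
pre = df[df["year"] == 2004]
post = df[df["year"] == 2007]

kmf = KaplanMeierFitter()
for name, grp in (("pre-EDRIR", pre), ("post-EDRIR", post)):
    kmf.fit(grp["hours"], event_observed=grp["isolated"], label=name)
    print(name, "median time to isolation:", kmf.median_survival_time_)

lr = logrank_test(pre["hours"], post["hours"],
                  event_observed_A=pre["isolated"],
                  event_observed_B=post["isolated"])
print("log-rank p =", lr.p_value)
```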