35 results for design factors
Abstract:
OBJECTIVE To determine the frequency of and risk factors for complications associated with casts in horses. DESIGN Multicenter retrospective case series. ANIMALS 398 horses with a half-limb or full-limb cast treated at 1 of 4 hospitals. PROCEDURES Data collected from medical records included age, breed, sex, injury, limb affected, time from injury to hospital admission, surgical procedure performed, type of cast (bandage cast [BC; fiberglass tape applied over a bandage] or traditional cast [TC; fiberglass tape applied over polyurethane resin-impregnated foam]), limb position in cast (flexed, neutral, or extended), and complications. Risk factors for cast complications were identified via multiple logistic regression. RESULTS Cast complications were detected in 197 of 398 (49%) horses (18/53 [34%] horses with a BC and 179/345 [52%] horses with a TC). Of the 197 horses with complications, 152 (77%) had clinical signs of complications prior to cast removal; the most common clinical signs were increased lameness severity and visibly detectable soft tissue damage. Cast sores were the most common complication (179/398 [45%] horses). Casts broke in 20 (5%) horses. Three (0.8%) horses developed a bone fracture attributable to casting. Median time to detection of complications was 12 days and 8 days for horses with TCs and BCs, respectively. Complications developed in 71%, 48%, and 47% of horses with the casted limb in a flexed, neutral, and extended position, respectively. For horses with TCs, hospital, limb position in the cast, and sex were significant risk factors for development of cast complications. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that 49% of horses with a cast developed cast complications.
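The abstract above identifies risk factors via multiple logistic regression. As a rough illustration of that kind of screening, here is a minimal Python sketch using statsmodels; the data frame, column names, and model specification are hypothetical and the data are simulated, not the study's actual records or model.

```python
# Hypothetical sketch: screening cast-complication risk factors with
# multiple logistic regression (statsmodels). Column names are invented
# for illustration; the study's actual model specification is not given here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per horse: binary outcome plus candidate categorical predictors.
df = pd.DataFrame({
    "complication":  np.random.binomial(1, 0.49, 400),
    "cast_type":     np.random.choice(["BC", "TC"], 400),
    "limb_position": np.random.choice(["flexed", "neutral", "extended"], 400),
    "sex":           np.random.choice(["mare", "gelding", "stallion"], 400),
    "hospital":      np.random.choice(["A", "B", "C", "D"], 400),
})

model = smf.logit(
    "complication ~ C(cast_type) + C(limb_position) + C(sex) + C(hospital)",
    data=df,
).fit()

print(model.summary())
print(np.exp(model.params))  # exponentiated coefficients = odds ratios
```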
Abstract:
BACKGROUND Advanced lower extremity peripheral artery disease (PAD), whether presenting as acute limb ischemia (ALI) or chronic critical limb ischemia (CLI), is associated with high rates of cardiovascular ischemic events, amputation, and death. Past research has focused on strategies of revascularization, but few data are available that prospectively evaluate the impact of key process-of-care factors (spanning pre-admission, acute hospitalization, and post-discharge) that might contribute to improving short- and long-term health outcomes. METHODS/DESIGN The FRIENDS registry is designed to prospectively evaluate a range of patient and health system care delivery factors that might serve as future targets for efforts to improve limb and systemic outcomes for patients with ALI or CLI. This hypothesis-driven registry was designed to evaluate the contributions of: (i) pre-hospital limb ischemia symptom duration, (ii) use of leg revascularization strategies, and (iii) use of risk-reduction pharmacotherapies, as pre-specified factors that may affect amputation-free survival. Sequential patients would be included at an index "vascular specialist-defined" ALI or CLI episode, and patients excluded only for non-vascular etiologies of limb threat. Data including baseline demographics, functional status, co-morbidities, pre-hospital time segments, and use of medical therapies; hospital-based use of revascularization strategies, time segments, and pharmacotherapies; and rates of systemic ischemic events (e.g., myocardial infarction, stroke, hospitalization, and death) and limb ischemic events (e.g., hospitalization for revascularization or amputation) will be recorded during a minimum of one year of follow-up. DISCUSSION The FRIENDS registry is designed to evaluate the potential impact of key factors that may contribute to adverse outcomes for patients with ALI or CLI. Definition of new "health system-based" therapeutic targets could then become the focus of future interventional clinical trials for individuals with advanced PAD.
Abstract:
OBJECTIVE: Altered microbiota composition, changes in immune responses and impaired intestinal barrier functions are observed in IBD. Most of these features are controlled by proteases and their inhibitors to maintain gut homeostasis. Unrestrained or excessive proteolysis can lead to pathological gastrointestinal conditions. The aim was to validate the identified protease IBD candidates from a previously performed systematic review through a genetic association study and functional follow-up. DESIGN: We performed a genetic association study in a large multicentre cohort of patients with Crohn's disease (CD) and UC from five European IBD referral centres, in a total of 2320 CD patients, 2112 UC patients and 1796 healthy controls. Subsequently, we did an extensive functional assessment of the candidate genes to explore their causality in IBD pathogenesis. RESULTS: Ten single nucleotide polymorphisms (SNPs) in four genes were significantly associated with CD: CYLD, USP40, APEH and USP3. CYLD was the most significant gene, with the intronically located rs12324931 the strongest associated SNP (pFDR=1.74e-17, OR=2.24 (1.83 to 2.74)). Five SNPs in four genes were significantly associated with UC: USP40, APEH, DAG1 and USP3. CYLD, as well as some of the other associated genes, is part of the ubiquitin proteasome system (UPS). We therefore determined whether the IBD-associated adherent-invasive Escherichia coli (AIEC) can modulate UPS functioning. Infection of intestinal epithelial cells with the AIEC LF82 reference strain modulated UPS turnover by reducing poly-ubiquitin conjugate accumulation, increasing 26S proteasome activities and decreasing protein levels of the NF-κB regulator CYLD. This resulted in IκB-α degradation and NF-κB activation. This activity was very important for the pathogenicity of AIEC, since decreased CYLD resulted in an increased ability of AIEC LF82 to replicate intracellularly. CONCLUSIONS: Our results reveal the UPS, and CYLD specifically, as an important contributor to IBD pathogenesis, which is favoured by both genetic and microbial factors.
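The association results above are reported as FDR-adjusted p-values (pFDR). A minimal sketch of the standard Benjamini-Hochberg adjustment is shown below, assuming invented raw p-values; the study's own multiple-testing procedure may differ in detail.

```python
# Hedged sketch of the Benjamini-Hochberg false-discovery-rate adjustment
# that lies behind association p-values reported as pFDR. The raw p-values
# below are invented for illustration only.
from statsmodels.stats.multitest import multipletests

raw_p = [1.2e-19, 3.4e-5, 0.0007, 0.01, 0.04, 0.2, 0.6]   # hypothetical SNP p-values
reject, p_fdr, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")

for p, q, sig in zip(raw_p, p_fdr, reject):
    print(f"raw p = {p:.2e}  pFDR = {q:.2e}  significant = {sig}")
```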
Abstract:
Swiss aquaculture farms were assessed according to their risk of acquiring or spreading viral haemorrhagic septicaemia (VHS) and infectious haematopoietic necrosis (IHN). Risk factors for the introduction and spread of VHS and IHN were defined and assessed using published data and expert opinions. Among the 357 aquaculture farms identified in Switzerland, 49.3% were categorised as high risk, 49.0% as medium risk and 1.7% as low risk. According to the new Directive 2006/88/EC for aquaculture of the European Union, the frequency of farm inspections must be derived from their risk levels. A sensitivity analysis showed that water supply and fish movements were highly influential on the output of the risk assessment regarding the introduction of VHS and IHN. Fish movements were also highly influential on the risk assessment output regarding the spread of these diseases.
Abstract:
Objectives: In fast ball sports like beach volleyball, decision-making skills are a determining factor for excellent performance. The current investigation aimed to identify factors that influence the decision-making process in top-level beach volleyball defense in order to find relevant aspects for further research. For this reason, focused interviews with top players in international beach volleyball were conducted and analyzed with respect to decision-making characteristics. Design: Nineteen world-tour beach volleyball defense players, including seven Olympic or world champions, were interviewed, focusing on decision-making factors, gaze behavior, and interactions between the two. Methods: Verbal data were analyzed by inductive content analysis according to Mayring (2008). This approach allows categories to emerge from the interview material itself instead of forcing data into preset classifications and theoretical concepts. Results: The data analysis showed that, for top-level beach volleyball defense, decision making depends on opponent specifics, external context, situational context, opponent's movements, and intuition. Information on gaze patterns and visual cues revealed general tendencies indicating optimal gaze strategies that support excellent decision making. Furthermore, the analysis highlighted interactions between gaze behavior, visual information, and domain-specific knowledge. Conclusions: The present findings provide information on visual perception, domain-specific knowledge, and interactions between the two that are relevant for decision making in top-level beach volleyball defense. The results can be used to inform sports practice and to further untangle relevant mechanisms underlying decision making in complex game situations.
Abstract:
BACKGROUND The impact of prognostic factors in T1G3 non-muscle-invasive bladder cancer (BCa) patients is critical for proper treatment decision making. OBJECTIVE To assess prognostic factors in patients who received bacillus Calmette-Guérin (BCG) as initial intravesical treatment of T1G3 tumors and to identify a subgroup of high-risk patients who should be considered for more aggressive treatment. DESIGN, SETTING, AND PARTICIPANTS Individual patient data were collected for 2451 T1G3 patients from 23 centers who received BCG between 1990 and 2011. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Using Cox multivariable regression, the prognostic importance of several clinical variables was assessed for time to recurrence, progression, BCa-specific survival, and overall survival (OS). RESULTS AND LIMITATIONS With a median follow-up of 5.2 yr, 465 patients (19%) progressed, 509 (21%) underwent cystectomy, and 221 (9%) died because of BCa. In multivariable analyses, the most important prognostic factors for progression were age, tumor size, and concomitant carcinoma in situ (CIS); the most important prognostic factors for BCa-specific survival and OS were age and tumor size. Patients were divided into four risk groups for progression according to the number of adverse factors among age ≥ 70 yr, size ≥ 3 cm, and presence of CIS. Progression rates at 10 yr ranged from 17% to 52%. BCa-specific death rates at 10 yr were 32% in patients ≥ 70 yr with tumor size ≥ 3 cm and 13% otherwise. CONCLUSIONS T1G3 patients ≥ 70 yr with tumors ≥ 3 cm and concomitant CIS should be treated more aggressively because of the high risk of progression. PATIENT SUMMARY Although the majority of T1G3 patients can be safely treated with intravesical bacillus Calmette-Guérin, there is a subgroup of T1G3 patients with age ≥ 70 yr, tumor size ≥ 3 cm, and concomitant CIS who have a high risk of progression and thus require aggressive treatment.
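The four progression risk groups described above are formed simply by counting adverse factors (age ≥ 70 yr, tumor size ≥ 3 cm, concomitant CIS). A small illustrative helper is sketched below; the thresholds come from the abstract, but the function itself is hypothetical.

```python
# Illustrative scoring of the abstract's progression risk groups: one point
# each for age >= 70 yr, tumour size >= 3 cm, and concomitant CIS, giving
# four groups (0, 1, 2, or 3 adverse factors). Thresholds are taken from the
# abstract; the grouping function is a hypothetical sketch, not the study's code.
def t1g3_progression_group(age_yr: float, tumour_size_cm: float, has_cis: bool) -> int:
    score = 0
    score += age_yr >= 70          # adverse factor 1
    score += tumour_size_cm >= 3   # adverse factor 2
    score += has_cis               # adverse factor 3
    return score                   # 0 (lowest risk) to 3 (highest risk)

print(t1g3_progression_group(72, 3.5, True))   # 3 -> highest-risk group
print(t1g3_progression_group(65, 2.0, False))  # 0 -> lowest-risk group
```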
Abstract:
OBJECTIVE A number of factors limit the effectiveness of current aortic arch studies in assessing optimal neuroprotection strategies, including insufficient patient numbers, heterogeneous definitions of clinical variables, multiple technical strategies, inadequate reporting of surgical outcomes and a lack of collaborative effort. We have formed an international coalition of centres to provide more robust investigations into this topic. METHODS High-volume aortic arch centres were identified from the literature and contacted for recruitment. A Research Steering Committee of expert arch surgeons was convened to oversee the direction of the research. RESULTS The International Aortic Arch Surgery Study Group has been formed by 41 arch surgeons from 10 countries to better evaluate patient outcomes after aortic arch surgery. Several projects, including the establishment of a multi-institutional retrospective database, randomized controlled trials and a prospectively collected database, are currently underway. CONCLUSIONS Such a collaborative effort will herald a turning point in the surgical management of aortic arch pathologies and will provide better-powered analyses to assess the impact of varying surgical techniques on mortality and morbidity, identify predictors for neurological and operative risk, formulate and validate risk predictor models and review long-term survival outcomes and quality of life after arch surgery.
Abstract:
This thesis consists of four essays on the design and disclosure of compensation contracts. Essays 1, 2 and 3 focus on behavioral aspects of mandatory compensation disclosure rules and of contract negotiations in agency relationships. The three experimental studies develop psychology-based theory and present results that deviate from standard economic predictions. Furthermore, the results of Essays 1 and 2 also have implications for firms' discretion in how to communicate their top management's incentives to the capital market. Essay 4 analyzes the role of fairness perceptions in the evaluation of executive compensation. For this purpose, two surveys targeting representative eligible voters as well as investment professionals were conducted.

Essay 1 investigates the role of the detailed 'Compensation Discussion and Analysis', which is part of the Securities and Exchange Commission's 2006 regulation, in investors' evaluations of executive performance. Compensation disclosure complying with this regulation clarifies the relationship between realized reported compensation and the underlying performance measures and their target achievement levels. The experimental findings suggest that the salient presentation of executives' incentives inherent in the 'Compensation Discussion and Analysis' makes investors' performance evaluations less outcome dependent. Therefore, investors' judgment and investment decisions might be less affected by noisy environmental factors that drive financial performance. The results also suggest that fairness perceptions of compensation contracts are essential for investors' performance evaluations, in that more transparent disclosure increases the perceived fairness of compensation and the performance evaluation of managers who are not responsible for a bad financial performance. These results have important practical implications, as firms might choose to communicate their top management's incentive compensation more transparently in order to benefit from less volatile expectations about their future performance.

Similar to the first experiment, the experiment described in Essay 2 addresses the question of more transparent compensation disclosure. However, unlike the first experiment, the second experiment does not analyze the effect of a more salient presentation of contract information but the informational effect of contract information itself. For this purpose, the experiment tests two conditions in which the assessment of the compensation contracts' incentive compatibility, which determines executive effort, is either possible or not. On the one hand, the results suggest that the quality of investors' expectations about executive effort is improved; on the other hand, investors might over-adjust their prior expectations about executive effort when confronted with an unexpected financial performance and under-adjust if the financial performance confirms their prior expectations. Therefore, in the experiment, more transparent compensation disclosure does not lead to more accurate overall judgments of executive effort and even lowers the processing quality of outcome information. These results add to the literature on disclosure, which predominantly advocates more transparency. The findings of the experiment, however, identify decreased information processing quality as a relevant disclosure cost category. Firms might therefore carefully evaluate the additional costs and benefits of more transparent compensation disclosure. Together with the results from the experiment in Essay 1, the two experiments on compensation disclosure imply that firms should focus on their discretion in how to present their compensation disclosure in order to benefit from investors' improved fairness perceptions and the spillover of these perceptions onto performance evaluation.

Essay 3 studies the behavioral effects of contextual factors in recruitment processes that do not affect the employer's or the applicant's bargaining power from a standard economic perspective. In particular, the experiment studies two common characteristics of recruitment processes: pre-contractual competition among job applicants and job applicants' non-binding effort announcements as they might be made during job interviews. Despite the standard economic irrelevance of these factors, the experiment develops theory regarding their behavioral effects on employees' subsequent effort provision and the employers' contract design choices. The experimental findings largely support the predictions. More specifically, the results suggest that firms can benefit from increased effort and, therefore, may generate higher profits. Further, firms may seize a larger share of the employment relationship's profit by highlighting the competitive aspects of the recruitment process and by requiring applicants to make announcements about their future effort.

Finally, Essay 4 studies the role of fairness perceptions in the public evaluation of executive compensation. Although economic criteria for the design of incentive compensation generally do not make restrictive recommendations with regard to the amount of compensation, fairness perceptions might be relevant from the perspective of firms and standard setters, because behavioral theory has identified fairness as an important determinant of individuals' judgment and decisions. However, although fairness concerns about executive compensation are often stated in the popular media and even in the literature, evidence on the meaning of fairness in the context of executive compensation is scarce and ambiguous. In order to inform practitioners and standard setters whether fairness concerns are exclusive to non-professionals or relevant for investment professionals as well, the two surveys presented in Essay 4 aim to find commonalities in the opinions of representative eligible voters and investment professionals. The results suggest that fairness is an important criterion for both groups. In particular, exposure to risk in the form of the variable compensation share is a criterion shared by both groups: the higher the assumed variable share, the higher the compensation amount perceived as fair. However, to a large extent, opinions on executive compensation depend on personality characteristics, and to some extent, investment professionals' perceptions deviate systematically from those of non-professionals. The findings imply that firms might benefit from emphasizing the riskiness of their managers' variable pay components and are therefore also in line with those of Essay 1.
Abstract:
BACKGROUND Anthelmintic drugs have been widely used in sheep as a cost-effective means for gastro-intestinal nematode (GIN) control. However, growing anthelmintic resistance (AHR) has created a compelling need to identify evidence-based management recommendations that reduce the risk of further development and impact of AHR. OBJECTIVE To identify, critically assess, and synthesize available data from primary research on factors associated with AHR in sheep. METHODS Publications reporting original observational or experimental research on selected factors associated with AHR in sheep GINs and published after 1974 were identified through two processes. Three electronic databases (PubMed, Agricola, CAB) and Web of Science (a collection of databases) were searched for potentially relevant publications. Additional publications were identified through consultation with experts, manual search of references of included publications and conference proceedings, and information solicited from small ruminant practitioner list-serves. Two independent investigators screened abstracts for relevance. Relevant publications were assessed for risk of systematic bias. Where sufficient data were available, random-effects meta-analyses (MAs) were performed to estimate the pooled odds ratio (OR) and 95% confidence intervals (CIs) of AHR for factors reported in ≥2 publications. RESULTS Of the 1712 abstracts screened for eligibility, 131 were deemed relevant for full publication review. Thirty publications describing 25 individual studies (15 observational studies, 7 challenge trials, and 3 controlled trials) were included in the qualitative synthesis and assessed for systematic bias. Unclear (i.e. not reported, or unable to assess) or high risk of selection bias and confounding bias was found in 93% (14/15) and 60% (9/15) of the observational studies, respectively, while unclear risk of selection bias was identified in all of the trials. Ten independent studies were included in the quantitative synthesis, and MAs were performed for five factors. Only high frequency of treatment was a significant risk factor (OR=4.39; 95% CI=1.59, 12.14), while the remaining 4 variables were marginally significant: mixed-species grazing (OR=1.63; 95% CI=0.66, 4.07); flock size (OR=1.02; 95% CI=0.97, 1.07); use of long-acting drug formulations (OR=2.85; 95% CI=0.79, 10.24); and drench-and-shift pasture management (OR=4.08; 95% CI=0.75, 22.16). CONCLUSIONS While there is abundant literature on the topic of AHR in sheep GINs, few studies have explicitly investigated the association between putative risk or protective factors and AHR. Consequently, several of the current recommendations on parasite management are not evidence-based. Moreover, many of the studies included in this review had a high or unclear risk of systematic bias, highlighting the need to improve study design and/or reporting of future research carried out in this field.
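Pooled estimates such as OR = 4.39 (95% CI 1.59, 12.14) come from random-effects meta-analysis. The sketch below shows a DerSimonian-Laird pooling of per-study log odds ratios, assuming invented study-level inputs rather than the review's extracted data.

```python
# Hedged sketch of a random-effects (DerSimonian-Laird) pooled odds ratio,
# the kind of calculation behind the review's pooled ORs and 95% CIs.
# The per-study numbers below are invented for illustration.
import numpy as np

# Each study contributes a log(OR) and its variance (e.g. from 2x2 cell counts).
log_or = np.log(np.array([3.2, 5.1, 4.8]))
var = np.array([0.40, 0.55, 0.35])

w_fixed = 1.0 / var
fixed_mean = np.sum(w_fixed * log_or) / np.sum(w_fixed)
q = np.sum(w_fixed * (log_or - fixed_mean) ** 2)           # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(log_or) - 1)) / c)                # between-study variance

w_rand = 1.0 / (var + tau2)
pooled = np.sum(w_rand * log_or) / np.sum(w_rand)
se = np.sqrt(1.0 / np.sum(w_rand))

print(np.exp(pooled))                                       # pooled OR
print(np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))  # 95% CI
```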
Abstract:
OBJECTIVES The aim of this study was to identify common risk factors for patient-reported medical errors across countries. In country-level analyses, differences in risks associated with error between health care systems were investigated. The joint effects of risks on error-reporting probability were modelled for hypothetical patients with different health care utilization patterns. DESIGN Data from the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in 11 Countries. SETTING Representative population samples of 11 countries were surveyed (total sample = 19,738 adults). Utilization of health care, coordination of care problems and reported errors were assessed. Regression analyses were conducted to identify risk factors for patients' reports of medical, medication and laboratory errors across countries and in country-specific models. RESULTS Error was reported by 11.2% of patients but with marked differences between countries (range: 5.4-17.0%). Poor coordination of care was reported by 27.3%. The risk of patient-reported error was determined mainly by health care utilization: emergency care (OR = 1.7, P < 0.001), hospitalization (OR = 1.6, P < 0.001) and the number of providers involved (OR three doctors = 2.0, P < 0.001) are important predictors. Poor care coordination is the single most important risk factor for reporting error (OR = 3.9, P < 0.001). Country-specific models yielded common and country-specific predictors for self-reported error. For high utilizers of care, the probability that errors are reported rises up to P = 0.68. CONCLUSIONS Safety remains a global challenge affecting many patients throughout the world. Large variability exists in the frequency of patient-reported error across countries. To learn from others' errors is not only essential within countries but may also prove a promising strategy internationally.
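The abstract models joint effects of several risk factors on the probability of reporting an error (up to P = 0.68 for high utilizers). The following sketch illustrates the underlying arithmetic of combining odds ratios on the log-odds scale and converting back to a probability; the baseline probability is an assumption, and only the odds ratios are taken from the abstract.

```python
# Illustration of combining odds ratios on the log-odds scale for a
# hypothetical high utilizer of care. The baseline probability is invented;
# only the odds ratios come from the abstract, so the printed value is not
# expected to reproduce the study's P = 0.68.
import math

baseline_prob = 0.05                        # assumed baseline, not from the study
log_odds = math.log(baseline_prob / (1 - baseline_prob))

odds_ratios = {"emergency care": 1.7, "hospitalization": 1.6,
               "three doctors": 2.0, "poor coordination": 3.9}
for or_value in odds_ratios.values():
    log_odds += math.log(or_value)          # multiplying odds = adding log-odds

prob = 1 / (1 + math.exp(-log_odds))        # inverse-logit back to a probability
print(round(prob, 2))
```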
Abstract:
BACKGROUND Repeated hospitalizations are frequent toward the end of life, where each admission should be an opportunity to initiate advance care planning for high-risk patients. OBJECTIVE To identify the risk factors for having a 30-day potentially avoidable readmission due to end-of-life care issues among all medical patients. DESIGN Nested case-control study. SETTING/PATIENTS All 10,275 consecutive discharges from any medical service of an academic tertiary medical center in Boston, Massachusetts, between July 1, 2009, and June 30, 2010. MEASUREMENTS A random sample of all the potentially avoidable 30-day readmissions was independently reviewed by 9 trained physicians to identify those due to end-of-life issues. RESULTS Among the 534 30-day potentially avoidable readmission cases reviewed, 80 (15%) were due to an end-of-life care issue. In multivariable analysis, the following risk factors were significantly associated with a 30-day potentially avoidable readmission due to end-of-life care issues: number of admissions in the previous 12 months (odds ratio [OR]: 1.10 per admission, 95% confidence interval [CI]: 1.02-1.20), neoplasm (OR: 5.60, 95% CI: 2.85-10.98), opiate medications at discharge (OR: 2.29, 95% CI: 1.29-4.07), and Elixhauser comorbidity index (OR: 1.16 per 5-point increase, 95% CI: 1.10-1.22). The discrimination of the model (C statistic) was 0.85. CONCLUSIONS In a medical population, we identified 4 main risk factors that were significantly associated with 30-day potentially avoidable readmission due to end-of-life care issues, producing a model with very good to excellent discrimination. Patients with these risk factors might benefit from palliative care consultation prior to discharge in order to improve end-of-life care and possibly reduce unnecessary rehospitalizations.
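The readmission model's discrimination is summarized by a C statistic of 0.85. The snippet below illustrates, on simulated data, how a C statistic (equivalently, the area under the ROC curve) can be computed from predicted risks and observed outcomes; it does not use the study's data or model.

```python
# Sketch of computing a model's discrimination (C statistic / ROC AUC)
# from predicted risks and observed outcomes. The data here are simulated
# toy values, not the study's predictions.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.binomial(1, 0.15, 500)                                   # ~15% events
risk = np.clip(0.15 + 0.4 * y_true + rng.normal(0, 0.15, 500), 0, 1)  # toy risk scores

print(round(roc_auc_score(y_true, risk), 2))                          # C statistic
```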
Abstract:
Background: Virtual patients (VPs) are increasingly used to train clinical reasoning. So far, no validated evaluation instruments for VP design are available. Aims: We examined the validity of an instrument for assessing the perception of VP design by learners. Methods: Three sources of validity evidence were examined: (i) Content was examined based on theory of clinical reasoning and an international VP expert team. (ii) The response process was explored in think-aloud pilot studies with medical students and in content analyses of free text questions accompanying each item of the instrument. (iii) Internal structure was assessed by exploratory factor analysis (EFA) and inter-rater reliability by generalizability analysis. Results: Content analysis was reasonably supported by the theoretical foundation and the VP expert team. The think-aloud studies and analysis of free text comments supported the validity of the instrument. In the EFA, using 2547 student evaluations of a total of 78 VPs, a three-factor model showed a reasonable fit with the data. At least 200 student responses are needed to obtain a reliable evaluation of a VP on all three factors. Conclusion: The instrument has the potential to provide valid information about VP design, provided that many responses per VP are available.
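The internal-structure evidence above rests on an exploratory factor analysis of student ratings. A minimal sketch with simulated Likert-style responses and a three-factor model is shown below; the real instrument's items and the study's estimation and rotation details are not reproduced.

```python
# Minimal sketch of an exploratory factor analysis on questionnaire item
# responses, loosely analogous to the internal-structure check described in
# the abstract. Data are simulated; item content and sample are invented.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_students, n_items = 2547, 12
responses = rng.integers(1, 6, size=(n_students, n_items)).astype(float)  # Likert 1-5

fa = FactorAnalysis(n_components=3)   # three-factor model, as in the abstract
fa.fit(responses)
print(fa.components_.shape)           # (3, n_items): loadings of items on factors
```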
Abstract:
OBJECTIVE: To evaluate the incidence of colic and risk factors for colic in equids hospitalized for ocular disease. DESIGN: Retrospective observational study. ANIMALS: 337 equids (317 horses, 19 ponies, and 1 donkey) hospitalized for ocular disease. PROCEDURES: Medical records of equids hospitalized for > 24 hours for treatment of ocular disease between January 1997 and December 2008 were reviewed. Information from only the first hospitalization was used for equids that were hospitalized for ocular disease on more than 1 occasion. Information gathered included the signalment, the type of ocular lesion and the treatment administered, and any colic signs recorded during hospitalization as well as the severity, presumptive diagnosis, and treatment of the colic. Statistical analysis was used to identify any risk factors for colic in equids hospitalized for ocular disease. RESULTS: 72 of 337 (21.4%) equids hospitalized for ocular disease had signs of colic during hospitalization. Most equids (59.7% [43/72]) had mild signs of colic, and most (87.5% [63/72]) were treated medically. Ten of 72 (13.9%) equids with colic had a cecal impaction. Risk factors for colic in equids hospitalized for ocular disease were age (0 to 1 year and ≥ 21 years) and an increased duration of hospitalization (≥ 8 days). CONCLUSIONS AND CLINICAL RELEVANCE: There was a high incidence of colic in equids hospitalized with ocular disease in this study. Findings from this study may help identify equids at risk for development of colic and thereby help direct implementation of prophylactic measures.
Abstract:
OBJECTIVE The natural course of chronic hepatitis C varies widely. To improve the profiling of patients at risk of developing advanced liver disease, we assessed the relative contribution of factors for liver fibrosis progression in hepatitis C. DESIGN We analysed 1461 patients with chronic hepatitis C with an estimated date of infection and at least one liver biopsy. Risk factors for accelerated fibrosis progression rate (FPR), defined as ≥0.13 Metavir fibrosis units per year, were identified by logistic regression. Examined factors included age at infection, sex, route of infection, HCV genotype, body mass index (BMI), significant alcohol drinking (≥20 g/day for ≥5 years), HIV coinfection and diabetes. In a subgroup of 575 patients, we assessed the impact of single nucleotide polymorphisms previously associated with fibrosis progression in genome-wide association studies. Results were expressed as attributable fraction (AF) of risk for accelerated FPR. RESULTS Age at infection (AF 28.7%), sex (AF 8.2%), route of infection (AF 16.5%) and HCV genotype (AF 7.9%) contributed to accelerated FPR in the Swiss Hepatitis C Cohort Study, whereas significant alcohol drinking, HIV coinfection, diabetes and BMI did not. In genotyped patients, variants at rs9380516 (TULP1), rs738409 (PNPLA3), rs4374383 (MERTK) (AF 19.2%) and rs910049 (major histocompatibility complex region) significantly added to the risk of accelerated FPR. Results were replicated in three additional independent cohorts, and a meta-analysis confirmed the role of age at infection, sex, route of infection, HCV genotype, rs738409, rs4374383 and rs910049 in accelerating FPR. CONCLUSIONS Most factors accelerating liver fibrosis progression in chronic hepatitis C are unmodifiable.
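Results in this abstract are expressed as attributable fractions (AF). As a generic reference point, the sketch below implements Levin's population attributable fraction from an exposure prevalence and a relative risk; both inputs are invented, and the study's model-based AF estimator may differ.

```python
# Hedged illustration of a population attributable fraction (Levin's formula),
# the general quantity behind statements such as "age at infection (AF 28.7%)".
# The exposure prevalence and relative risk below are invented numbers, and the
# study's own AF estimator (e.g. model-based) may differ.
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

print(round(attributable_fraction(prevalence=0.4, relative_risk=2.5), 3))  # 0.375
```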
Abstract:
BACKGROUND Urinary incontinence or the inability to void spontaneously after ileal orthotopic bladder substitution is a frequent finding in female patients. OBJECTIVE To evaluate how hysterectomy and nerve sparing affect functional outcomes and whether these relate to pre- and postoperative urethral pressure profile (UPP) results. DESIGN, SETTING, AND PARTICIPANTS Prospectively performed pre- and postoperative UPPs of 73 female patients who had undergone cystectomy and bladder substitution were correlated with postoperative voiding and continence status. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Outcome analyses were performed with the Kruskal-Wallis test, the Wilcoxon-Mann-Whitney test, or two-group post hoc testing with the Bonferroni correction. Chi-square or Fisher exact tests were applied for the categorical data. RESULTS AND LIMITATIONS Of postoperatively continent or hypercontinent patients, 22 of 43 (51.2%) had the uterus preserved; of incontinent patients, only 4 of 30 (13.3%, p<0.01) had the uterus preserved. Of postoperatively continent or hypercontinent patients, 27 of 43 patients (62.8%) had bilateral and 15 of 43 (34.9%) had unilateral attempted nerve sparing. In incontinent patients, 11 of 30 (36.7%) had bilateral and 16 of 30 (53.3%) had unilateral attempted nerve sparing (p=0.02). When compared with postoperatively incontinent patients, postoperatively continent patients had a longer functional urethral length (median: 32 mm vs 24 mm; p<0.001), a higher postoperative urethral closing pressure at rest (56 cm H2O vs 35 cm H2O; p<0.001) as well as a higher preoperative urethral closing pressure at rest (74 cm H2O vs 47.5 cm H2O; p=0.01). The main limitation was the limited number of patients. CONCLUSIONS In female patients undergoing radical cystectomy and bladder substitution, preservation of the uterus and attempted nerve sparing result in better functional outcomes. The preoperative UPPs correlate with postoperative voiding and continence status and may predict which patients are at a higher risk of functional failure after bladder substitution. PATIENT SUMMARY If preservation of the urethra's innervation is not possible during cystectomy, poor functional results with bladder substitutes are likely.