126 results for patient reported outcomes


Relevance: 30.00%

Abstract:

Introduction: Poor nutritional status among older people is well documented, with 40% of older people reported as malnourished on hospital admission. Poor nutrition contributes to increased infection rates, longer hospital stays, poorer patient outcomes and death. In this study, we assessed the ‘nutrition narrative’ from older hospital patients, together with nutrition knowledge among nursing and medical staff and students.
Methods: The study used a convenience sample of older people (n = 30, mean age 82 years) in two large, geographically separate city hospitals. Mentally alert, consenting patients gave a recorded ‘nutrition narrative’ to give a sense of how they felt their nutritional needs were being met in hospital. Main themes were identified using a grounded analysis framework. Focus groups were recruited from medical/nursing teachers and students to assess their working knowledge of nutrition and of the nutritional needs of the older patient group.
Results: Analysis of the ‘nutrition narrative’ suggested several themes: (i) staff should listen to patients' needs/wishes in discussion with patients and their family members; (ii) staff should continue to encourage and promote a positive eating experience; (iii) staff should monitor food eaten/not eaten and weigh patients more regularly. The focus groups with medical and nursing students suggested limited knowledge about nutritional care of older people and little understanding of roles or cross-talk about nutrition across the multidisciplinary groups.
Conclusions: The ‘nutrition narrative’ themes suggested that the nutritional experience of older people in hospital can, and must, be improved. Nursing and medical staff need better basic knowledge of nutrition and nutritional assessment, and an improved understanding of the roles of the various multidisciplinary staff and of hospital catering pathways. Care professionals need to prioritise patient nutrition much more highly and to recognise nutritional care as integral to patient healing and recovery.

Relevance: 30.00%

Abstract:

Achalasia is a neurodegenerative motility disorder of the oesophagus resulting in deranged oesophageal peristalsis and loss of lower oesophageal sphincter function. Historically, annual achalasia incidence rates were believed to be low, at approximately 0.5-1.2 per 100,000. More recent reports suggest that annual incidence rates have risen to 1.6 per 100,000 in some populations. The aetiology of achalasia is still unclear but is likely to be multi-factorial. Suggested causes include environmental or viral exposures resulting in inflammation of the oesophageal myenteric plexus, which elicits an autoimmune response. The risk of achalasia may be elevated in a sub-group of genetically susceptible people. Improvements in the diagnosis of achalasia, through the introduction of high-resolution manometry with pressure topography plotting, have resulted in the development of a novel classification system for achalasia. This classification system can evaluate patient prognosis and predict responsiveness to treatment. There is currently much debate over whether pneumatic dilatation is superior to Heller's myotomy in the treatment of achalasia. A recent comparative study found equal efficacy, suggesting that patient preference and local expertise should guide the choice. Although achalasia is a relatively rare condition, it carries a risk of complications, including aspiration pneumonia and oesophageal cancer. The risk of both squamous cell carcinoma and adenocarcinoma of the oesophagus is believed to be significantly increased in patients with achalasia; however, the absolute excess risk is small. It is therefore currently unknown whether a surveillance programme in achalasia patients would be effective or cost-effective.

Relevance: 30.00%

Abstract:

Biological dose escalation through stereotactic ablative radiotherapy (SABR) holds the promise of improved patient convenience, system capacity and tumor control, with decreased cost and side effects. The objective of this prospective study is to report its toxicities and biochemical and pathologic outcomes.

Relevance: 30.00%

Abstract:

AIMS: To investigate the potential dosimetric and clinical benefits predicted by using four-dimensional computed tomography (4DCT) compared with 3DCT in the planning of radical radiotherapy for non-small cell lung cancer.

MATERIALS AND METHODS:
Twenty patients were planned using free-breathing 4DCT and then retrospectively delineated on three-dimensional helical scan sets (3DCT). Beam arrangement and total dose (55 Gy in 20 fractions) were matched for the 3D and 4D plans. Plans were compared for differences in planning target volume (PTV) geometry and in normal tissue complication probability (NTCP) for organs at risk, using dose-volume histograms. Tumour control probability and NTCP were modelled using the Lyman-Kutcher-Burman (LKB) model. This was compared with a predictive clinical algorithm (Maastro), which is based on patient characteristics, including age, performance status, smoking history, lung function, tumour staging and concomitant chemotherapy, to predict survival and toxicity outcomes. Potential therapeutic gains were investigated by applying isotoxic dose escalation to both plans using constraints for mean lung dose (18 Gy), maximum oesophageal dose (70 Gy) and maximum spinal cord dose (48 Gy).
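For context, the Lyman-Kutcher-Burman model mentioned above reduces the dose-volume histogram to a generalised equivalent uniform dose (gEUD) and passes it through a probit function. A standard formulation, given here for orientation rather than taken from this abstract (TD50, m and n are fitted organ-specific parameters), is:

```latex
% LKB NTCP model: v_i is the fractional organ volume receiving dose D_i;
% gEUD collapses the DVH to a single equivalent dose, and NTCP is the
% standard normal cumulative probability evaluated at t.
\[
\mathrm{gEUD} = \Bigl(\sum_i v_i\, D_i^{1/n}\Bigr)^{n}, \qquad
t = \frac{\mathrm{gEUD} - TD_{50}}{m\, TD_{50}}, \qquad
\mathrm{NTCP} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\, dx .
\]
```

In this form, lowering the dose to an organ at risk lowers gEUD and hence the predicted NTCP, which is the mechanism behind the dosimetric gains reported below.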

RESULTS:
4DCT-based plans had smaller PTVs, lower doses to organs at risk and lower predicted NTCP rates on LKB modelling (P < 0.006). The clinical algorithm showed no difference between the groups in predicted 2-year survival and dyspnoea rates, but did predict lower oesophageal toxicity with the 4DCT plans (P = 0.001). There was no correlation between LKB modelling and the clinical algorithm for lung toxicity or survival. Dose escalation was possible in 15/20 cases, with a mean increase in dose by a factor of 1.19 (10.45 Gy) using 4DCT compared with 3DCT plans.

CONCLUSIONS:
4DCT can theoretically improve the therapeutic ratio and permit dose escalation, based on dosimetric parameters and mathematical modelling. However, when individual patient characteristics are incorporated, this gain may be less evident in terms of survival and dyspnoea rates. 4DCT allows the potential for isotoxic dose escalation, which may lead to improved local control and better overall survival.

Relevance: 30.00%

Abstract:

The present study investigated the longitudinal relationship between alcohol consumption at age 13 and alcohol use outcomes at age 16. Alcohol-specific measures were frequency of drinking, amount consumed at last use and alcohol-related harms. Self-report data were gathered from 1113 high school students at T1 and 981 students at T2. Socio-demographic data were gathered, as was information on context of use, alcohol-related knowledge and attitudes, four domains of aggression and delay reward discounting. Results indicated that any consumption of alcohol at T1, even supervised consumption, was associated with significantly poorer outcomes at T2. In other words, compared with those still abstinent at age 13, those engaging in alcohol use in any context reported significantly more frequent drinking, more alcohol-related harms and more units consumed at last use at age 16. Results also support the relationship between higher levels of physical aggression at T1 and a greater likelihood of more problematic alcohol use behaviours at T2. The findings support other evidence suggesting that abstinence in early adolescence has better longitudinal outcomes than supervised consumption of alcohol. These results suggest support for current guidance on adolescent drinking in the United Kingdom (UK).

Relevance: 30.00%

Abstract:

Systematic reviews have considerable potential to provide evidence-based data to aid clinical decision-making. However, there is growing recognition that trials involving mechanical ventilation lack consistency in the definition and measurement of ventilation outcomes, creating difficulties in combining data for meta-analyses. To address this inconsistency, international standards for trial registration and for clinical trial protocols have published recommendations, effectively setting the “gold standard” for reporting trial outcomes. In this Critical Care Perspective, we review the problems resulting from inconsistent outcome definitions and inconsistent reporting of outcomes (outcome sets). We present data highlighting the variability of the most commonly reported ventilation outcome definitions. Ventilation outcomes reported in trials over the last 6 years typically fall into four domains: measures of ventilator dependence; adverse outcomes; mortality; and resource use. We highlight the need, first, for agreement on outcome definitions and, second, for a minimum core outcome set for trials involving mechanical ventilation. A minimum core outcome set would not restrict trialists from measuring additional outcomes, but would overcome problems of variability in outcome selection, measurement and reporting, thereby enhancing comparisons across trials.

Relevance: 30.00%

Abstract:

Background: The emerging field of microneedle-based minimally invasive patient monitoring and diagnosis is reviewed. Microneedle arrays consist of rows of micron-scale projections attached to a solid support. They have been widely investigated for transdermal drug and vaccine delivery applications since the late 1990s. However, researchers and clinicians have recently realized the great potential of microneedles for extraction of skin interstitial fluid and, less commonly, blood, for enhanced monitoring of patient health.

Methods: We reviewed the journal and patent literature, summarized the findings, and provided technical insights and critical analysis.

Results: We describe the basic concepts in detail and extensively review the work performed to date.

Conclusions: It is our view that microneedles will have an important role to play in clinical management of patients and will ultimately improve therapeutic outcomes for people worldwide.

Relevance: 30.00%

Abstract:

Purpose: To investigate, for the first time, the influence of pharmacist intervention and the use of a patient information leaflet on self-application of hydrogel-forming microneedle arrays by human volunteers without the aid of an applicator device.
Methods: A patient information leaflet was drafted and a pharmacist counselling strategy was devised. Twenty human volunteers applied 11 × 11 arrays of 400 μm hydrogel-forming microneedles to their own skin, following the instructions provided. Skin barrier function disruption was assessed using transepidermal water loss measurements and optical coherence tomography, and the results were compared with those obtained when more experienced researchers applied the microneedles to the volunteers or to themselves.
Results: Volunteer self-application of the 400 μm microneedle design resulted in an approximately 30% increase in skin transepidermal water loss, which was not significantly different from that seen when the more experienced researchers applied the microneedles to themselves or to the volunteers. Optical coherence tomography showed that self-application of microneedles of the same density but different heights (400 μm, 600 μm and 900 μm) led to percentage penetration depths of approximately 75%, 70% and 60%, respectively, although the diameter of the micropores created remained relatively constant at approximately 200 μm. Transepidermal water loss progressively increased with increasing height of the applied microneedles and these data, like those for penetration depth, were consistent regardless of applicant.
Conclusion: We have shown that hydrogel-forming microneedle arrays can be applied successfully and reproducibly by human volunteers given appropriate instruction. If these outcomes can be extrapolated to the general patient population, then bespoke microneedle applicator devices may not be necessary, potentially enhancing patient compliance.

Relevance: 30.00%

Abstract:

This programme of research aimed to understand the extent to which current UK medical graduates are prepared for practice. Commissioned by the General Medical Council, we conducted: (1) a Rapid Review of the literature between 2009 and 2013; (2) narrative interviews with a range of stakeholders; and (3) longitudinal audio-diaries with Foundation Year 1 (F1) doctors. The Rapid Review (RR) resulted in data from 81 manuscripts being extracted and mapped against a coding framework (including outcomes from Tomorrow's Doctors (2009) (TD09)). A narrative synthesis of the data was undertaken. Narrative interviews were conducted with 185 participants from 8 stakeholder groups: F1 trainees, newly registered trainee doctors, clinical educators, undergraduate and postgraduate deans and foundation programme directors, other healthcare professionals, employers, policy and government representatives, and patient and public representatives. Longitudinal audio-diaries were recorded by 26 F1 trainees over 4 months. The data were analysed thematically and mapped against TD09. Together, these data shed light on how preparedness for practice is conceptualised and measured, how prepared UK medical graduates are for practice, the effectiveness of transition interventions, and the currently debated issue of bringing full registration forward to align with medical students’ graduation. Preparedness for practice was conceptualised as both a long- and short-term venture that included personal readiness as well as knowledge, skills and attitudes. It has mainly been researched using self-report measures of generalised incidents, which have been shown to be problematic. In terms of transition interventions: assistantships were found to be valuable and efficacious when proactive students acted as team members; shadowing is effective when undertaken close to the employment/setting of the F1 post; and induction is generally effective but of inconsistent quality.
The August transition was highlighted in our interview and audio-diary data, where F1s felt unprepared, particularly for the step-change in responsibility, workload, degree of multitasking and understanding where to go for help. Evidence of preparedness for specific tasks, skills and knowledge was contradictory: trainees are well prepared for some practical procedures but not others, and reasonably well prepared for history taking and full physical examinations, but mostly unprepared for adopting a holistic understanding of the patient, involving patients in their care, safe and legal prescribing, diagnosing and managing complex clinical conditions and providing immediate care in medical emergencies. Evidence for preparedness for the interactional and interpersonal aspects of practice was inconsistent, with some studies in the RR suggesting graduates were prepared for team working and communicating with colleagues and patients, but other studies contradicting this. Interview and audio-diary data highlight concerns around F1s' preparedness for communicating with angry or upset patients and relatives, breaking bad news, communicating with the wider team (including interprofessionally) and handover communication. There was some evidence in the RR to suggest that graduates were unprepared for dealing with error and safety incidents and lack an understanding of how the clinical environment works. Interview and audio-diary data back this up, adding that F1s are also unprepared for understanding the financial aspects of healthcare. In terms of being personally prepared, RR, interview and audio-diary evidence is mixed on graduates’ preparedness for identifying their own limitations, but all data point to graduates’ difficulties in the domain of time management.
In terms of personal and situational demographic factors, the RR found that gender did not typically predict perceptions of preparedness, but graduates from more recent cohorts, graduate-entry students, graduates from problem-based learning courses, UK-educated graduates and graduates with an integrated degree reported feeling better prepared. The longitudinal audio-diaries provided insights into the preparedness journey of F1s. There seems to be a general trend towards trainees feeling more confident and competent as they gain more experience. However, this development was not necessarily linear, as challenging circumstances (e.g. a new specialty, new colleagues, lack of staffing) sometimes made trainees feel unprepared for situations in which they had previously indicated preparedness.

Relevance: 30.00%

Abstract:

Introduction: Streptococcus bovis can lead to bacteraemia, septicaemia, and ultimately endocarditis. The objective of this study was to evaluate the long-term implications of S. bovis endocarditis on cardiac morbidity and mortality. 

Methods: A retrospective cohort study was performed between January 2000 and March 2009 to assess all patients diagnosed with S. bovis bacteraemia from the Belfast Health and Social Care Trust. The primary end-point for cardiac investigations was the presence of endocarditis. Secondary end-points included referral for cardiac surgery and overall mortality. 

Results: Sixty-one positive S. bovis blood cultures from 43 patients were included. Following echocardiography, seven patients were diagnosed with infective endocarditis (16.3% of patients); four patients (9.3%) had native valve involvement while three (7.0%) had prosthetic valve infection. Five of these seven patients had more than one positive S. bovis culture (71.4%). Three had significant valve dysfunction that warranted surgical repair/replacement, one of whom was unfit for surgery. There was a 100% recurrence rate amongst the valve replacement patients (n = 2), and six patients with endocarditis had colorectal pathology. Patients with endocarditis had long-term survival similar to that of patients with non-endocarditic bacteraemia (57.1% alive vs. 50% of non-endocarditis patients, p = 0.73).

Conclusion: Streptococcus bovis endocarditis patients tended to have pre-existing valvular heart disease, and those with prosthetic heart valves had higher surgical intervention and relapse rates. These patients experienced a higher rate of co-existing colorectal pathology but currently have reasonable long-term outcomes. This may suggest that they represent a patient population that merits consideration for an early surgical strategy to maximise long-term results; however, further evaluation is warranted. © 2013 The Japanese Association for Thoracic Surgery.

Relevance: 30.00%

Abstract:

OBJECTIVE: Cancer survivors (CSs) are at risk of developing late effects (LEs) associated with the disease and its treatment. This paper compares the health status, care needs and use of health services by CSs with LEs and CSs without LEs.

METHODS: Cancer survivors (n = 613) were identified via the Northern Ireland Cancer Registry and invited to participate in a postal survey that was administered by their general practitioner. The survey assessed self-reported LEs, health status, health service use and unmet care needs. A total of 289 (47%) CSs responded to the survey, and 93% of respondents completed an LEs scale.

RESULTS: Forty-one per cent (111/269) of CSs reported LEs. Survivors with and without LEs were comparable in terms of age and gender. The LEs group reported a significantly greater number of co-morbidities, lower physical and mental health scores, greater overall health service use and more unmet needs. Unadjusted logistic regression analysis found that cancer site, time since diagnosis and treatment were significantly associated with reporting of LEs. After controlling for all other variables, CSs who received combination therapies were over two and a half times more likely to report LEs than CSs who received single treatments (OR = 2.63, 95% CI = 1.32-5.25).
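As a reminder of how an odds ratio of this kind is computed, the sketch below derives an OR and a Wald 95% confidence interval from a 2×2 exposure-outcome table. The counts used are hypothetical, chosen only for illustration; they are not the survey's actual data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)                # cross-product ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts (combination therapy vs single treatment, LEs yes/no):
or_, lower, upper = odds_ratio_ci(60, 50, 51, 108)
print(f"OR = {or_:.2f}, 95% CI = {lower:.2f}-{upper:.2f}")
```

Note that a Wald interval on the log scale is the simplest approach; the adjusted OR reported in the abstract would instead come from the fitted logistic regression model.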

CONCLUSIONS: The CS population with LEs comprises a particularly vulnerable group of survivors who have multiple health care problems and needs and who require tailored care plans that take account of LEs and their impact on health-related quality of life.

Relevance: 30.00%

Abstract:

Objective To prospectively evaluate and quantify the efficacy of cadaveric fascia lata (CFL) as an allograft material in pubovaginal sling placement to treat stress urinary incontinence (SUI).

Patients and methods Thirty-one women with SUI (25 type II and six type III; mean age 63 years, range 40-75) had a CFL pubovaginal sling placed transvaginally. The operative time, blood loss, surgical complications and mean hospital stay were all documented. Before and at 4 months and 1 year after surgery each patient completed a 3-day voiding diary and validated voiding questionnaires (functional inquiry into voiding habits, Urogenital Distress Inventory and Incontinence Impact Questionnaire, including visual analogue scales).

Results The mean (range) operative time was 71 (50-120) min, blood loss 78.7 (20-250) mL and hospital stay 1.2 (1-2) days; there were no surgical complications. Over the mean follow-up of 13.5 months, complete resolution of SUI was reported by 29 (93%) patients. Overactive bladder symptoms were present in 23 (74%) patients before surgery, 21 (68%) at 4 months and two (6%) at 1 year. Eighty per cent of patients with low (<15 cmH2O) voiding pressures before surgery required self-catheterization afterward, as did 36% at 4 months, but only one (3%) at 1 year. Twenty-four (77%) patients needed to adopt specific postures to facilitate voiding. After surgery there was a significant reduction in daytime frequency, leakage episodes and pad use (P < 0.05). The severity of leak and storage symptoms was also significantly less (P < 0.002), whilst the severity of obstructive symptoms remained unchanged. Mean subjective levels of improvement were 69% at 4 months and 85% at 1 year, with corresponding objective satisfaction levels of 61% and 69%, respectively. At 1 year, approximately 80% of the patients said they would undergo the procedure again and/or recommend it to a friend.

Conclusion Placing a pubovaginal sling of CFL allograft is a highly effective, safe surgical approach for resolving SUI, with a short operative time and rapid recovery. Storage symptoms are significantly improved, and subjective improvement and satisfaction rates are high.

Relevance: 30.00%

Abstract:

Objective To compare the long-term outcome of artificial urinary sphincter (AUS) implantation after prostatectomy in patients with and without a history of previous irradiation.

Patients and methods The study included 98 men (mean age 68 years) with urinary incontinence after prostatectomy for prostate cancer (85 radical, 13 transurethral resection) who had an AUS implanted. Twenty-two of the patients had received adjuvant external beam irradiation before AUS implantation. Over a mean (range) follow-up of 46 (5-118) months, the complication and surgical revision rates were recorded and compared between irradiated and unirradiated patients. The two groups were also compared for the resolution of incontinence and satisfaction, assessed using a questionnaire.

Results Overall, surgical revision was similarly common in irradiated (36%) and unirradiated (24%) patients. After activation of the AUS, urethral atrophy, infection and erosion requiring surgical revision were more common in irradiated patients (41% vs 11%; P < 0.05); 70% of patients reported a significant improvement in continence, regardless of previous irradiation. Patient satisfaction remained high, with >80% of patients stating that they would undergo surgery again and/or recommend it to others, despite previous irradiation and/or the need for surgical revision.

Conclusions Despite higher complication and surgical revision rates in patients who have an AUS implanted and a history of previous irradiation, long-term continence and patient satisfaction appear not to be adversely affected.

Relevance: 30.00%

Abstract:

PURPOSE: Arteriovenous fistulae (AVFs) are the preferred option for vascular access, as they are associated with lower mortality in hemodialysis patients than arteriovenous grafts (AVGs) or central venous catheters (CVCs). We sought to assess whether vascular access outcomes for surgical trainees are comparable to those of fully trained surgeons.

METHODS: A prospectively collected database of patients was created and information recorded regarding patient demographics, past medical history, preoperative investigations, grade of operating surgeon, type of AVF formed, primary AVF function, cumulative AVF survival and functional patency.

RESULTS: One hundred and sixty-two patients were identified as having had vascular access procedures during the 6-month study period and 143 were included in the final analysis. Secondary AVF patency was established in 123 (86%) of these AVFs and 89 (62.2%) were used for dialysis. There was no significant difference in survival of AVFs according to training status of the surgeon (log-rank χ² = 0.506, p = 0.477) or type of AVF (log-rank χ² = 0.341, p = 0.559). Patency rates of successful AVFs at 1 and 2 years were 60.9% and 47.9%, respectively.

CONCLUSION: We have demonstrated in this prospective study that there are no significant differences in outcomes of primary AVFs formed by fully trained surgeons versus surgical trainees. Creation of a primary AVF represents an excellent training platform for intermediate stage surgeons across general and vascular surgical specialties.

Relevance: 30.00%

Abstract:

INTRODUCTION: Intravenous sedation is the most commonly used method of sedation for the provision of adult dental care. However, disparity exists in the pre-operative fasting times used for patients throughout the United Kingdom.

AIMS: The aims of the study were to obtain information on the effects of existing extended pre-operative fasting regimens, to canvas patient opinions on the fasting process, and to record their positive and negative experiences associated with it.

METHODS: A prospective cross-sectional descriptive study using survey methodology was conducted of adult patients attending a dental hospital for operative treatment under intravenous sedation. Sixty-four questionnaires were distributed over a four-month period, beginning 2nd October 2007.

RESULTS: The surveyed patient pool consisted of 38 females and 14 males with a mean age of 32.4 years. The response rate achieved was 81.2%. Seventy-one per cent of patients indicated that they normally consumed something for breakfast, the most common items being tea and toast. Fifty-one per cent of patients indicated that they would wish to eat as normal prior to their appointment and 59% wished to drink as normal. Only 19% of respondents reported that they did not wish to eat anything, with 8% preferring not to drink anything at all. Seventy-nine per cent of the patients reported that they had experienced at least one adverse symptom after fasting and 42% had experienced two or more such symptoms. In general, those patients with more experience of sedation found fasting less unpleasant than those attending for the first time (P < 0.05). In addition, one-quarter of all patients indicated that the fasting process had made them feel more nervous about their sedation appointment.

CONCLUSIONS: The extended fasting regimen prior to intravenous sedation appeared to affect patients' wellbeing, as the majority reported adverse symptoms.