188 results for Diagnosis, Surgical.
Abstract:
Background: A large animal model is required for the assessment of minimally invasive, tissue engineering based approaches to thoracic spine fusion, with relevance to deformity correction surgery for human adolescent idiopathic scoliosis. Here we develop a novel open mini-thoracotomy approach in an ovine model of thoracic interbody fusion which allows assessment of various fusion constructs, with a focus on novel, tissue engineering based interventions. Methods: The open mini-thoracotomy surgical approach was developed through a series of mock surgeries and then applied in a live sheep study. Customized scaffolds were manufactured to conform with the intervertebral disc space clearances required by the study. Twelve male Merino sheep aged 4 to 6 years and weighing 35 – 45 kg underwent the abovementioned procedure and were divided into two groups of six sheep at survival timelines of 6 and 12 months. Each sheep underwent a 3-level discectomy (T6/7, T8/9 and T10/11) with randomly allocated implantation of a different graft substitute at each of the three levels: (i) polycaprolactone (PCL) based scaffold plus 0.54 μg rhBMP-2, (ii) PCL-based scaffold alone, or (iii) autograft. The sheep were closely monitored post-operatively for signs of pain (i.e. gait abnormalities, teeth gnawing, social isolation). Fusion assessments were conducted post-sacrifice using computed tomography and hard-tissue histology. All scientific work was undertaken in accordance with a study protocol approved by the Institute's committee on animal research. Results: All twelve sheep were successfully operated on and reached the allotted survival timelines, thereby demonstrating the feasibility of the surgical procedure and post-operative care. There were no significant complications, and during the post-operative period the animals did not exhibit marked signs of distress according to the described assessment criteria.
Computed tomographic scanning demonstrated higher fusion grades in the rhBMP-2 plus PCL-based scaffold group than in either the PCL-based scaffold alone or the autograft group. These results were supported by histological evaluation of the respective groups. Conclusion: This novel open mini-thoracotomy surgical approach to the ovine thoracic spine represents a safe surgical method which can reproducibly form the platform for research into various spinal tissue engineered constructs (TEC) and their fusion-promoting properties.
Abstract:
A considerable amount of research has proposed optimization-based approaches employing various vibration parameters for structural damage diagnosis. Damage detection by these methods is in fact the result of updating the analytical structural model to match the current physical model. The feasibility of these approaches has been proven, but most of the verification has been done on simple structures such as beams or plates. When applied to a complex structure, such as a steel truss bridge, a traditional optimization process consumes massive computational resources and converges slowly. This study presents a multi-layer genetic algorithm (ML-GA) to overcome this problem. Unlike the tedious convergence of a conventional damage optimization process, in each layer the proposed algorithm divides the GA's population into groups with a smaller number of damage candidates; the converged population in each group then serves as an initial population for the next layer, where the groups merge into larger groups. Because parallel computation can be implemented in a damage detection process featuring ML-GA, both optimization performance and computational efficiency can be enhanced. To assess the proposed algorithm, the modal strain energy correlation (MSEC) was used as the objective function. Several damage scenarios of a complex steel truss bridge's finite element model were employed to evaluate the effectiveness and performance of ML-GA against a conventional GA. In both single- and multiple-damage scenarios, the analytical and experimental study shows that the MSEC index achieved excellent damage indication and efficiency using the proposed ML-GA, whereas the conventional GA converged only to a local solution.
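The layering scheme described above (evolve isolated sub-populations, then merge the converged groups into larger ones until a single population remains) can be sketched in Python. This is a toy illustration, not the authors' implementation: the `fitness` placeholder stands in for the MSEC objective, the grouping here partitions population members (one possible reading of the abstract), and all parameter values are hypothetical.

```python
import random

def fitness(candidate):
    # Placeholder objective: in the paper this would be the modal strain
    # energy correlation (MSEC) between model and measurement; here we use
    # a toy function whose optimum is the all-zero damage vector.
    return -sum(x * x for x in candidate)

def evolve(population, generations=50, mutation=0.1):
    """Run a plain GA on one group and return the evolved population."""
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: max(2, len(scored) // 2)]   # truncation selection
        children = []
        while len(children) < len(population):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:
                i = random.randrange(len(child))
                child[i] += random.gauss(0, 0.1)       # small Gaussian mutation
            children.append(child)
        population = children
    return population

def ml_ga(population, group_count):
    """Multi-layer GA: split the population into groups, evolve each
    independently (this loop is the parallelizable part), then merge the
    converged groups pairwise into larger groups and repeat."""
    groups = [population[i::group_count] for i in range(group_count)]
    while len(groups) > 1:
        groups = [evolve(g) for g in groups]           # could run in parallel
        merged = [groups[i] + groups[i + 1] for i in range(0, len(groups) - 1, 2)]
        if len(groups) % 2:                            # carry an odd group forward
            merged.append(groups[-1])
        groups = merged
    return max(evolve(groups[0]), key=fitness)         # best candidate overall
```

A usage sketch: seed a random population of 4-element damage vectors, then call `ml_ga(population, 4)` to obtain the fittest candidate after the layers have merged.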
Abstract:
BACKGROUND: Numerous strategies are available to prevent surgical site infections in hip arthroplasty, but there is no consensus on which might be the best. This study examined infection prevention strategies currently recommended for patients undergoing hip arthroplasty. METHODS: Four clinical guidelines on infection prevention/orthopedics were reviewed. Infection control practitioners, infectious disease physicians, and orthopedic surgeons were consulted through structured interviews and an online survey. Strategies were classified as "highly important" if they were recommended by at least one guideline and ranked as significantly or critically important by ≥75% of the experts. RESULTS: The guideline review yielded 28 infection prevention measures, with 7 identified by experts as being highly important in this context: antibiotic prophylaxis, antiseptic skin preparation of patients, hand/forearm antisepsis by surgical staff, sterile gowns/surgical attire, ultraclean/laminar air operating theatres, antibiotic-impregnated cement, and surveillance. Controversial measures included antibiotic-impregnated cement and, considering recent literature, laminar air operating theatres. CONCLUSIONS: Some of these measures may already be accepted as routine clinical practice, whereas others are controversial. Whether these practices should be continued for this patient group will be informed by modeling the cost-effectiveness of infection prevention strategies. This will allow predictions of long-term health and cost outcomes and thus inform decisions on how to best use scarce health care resources for infection control.
Abstract:
The findings presented in this paper are part of a research project designed to provide a preliminary indication of the support needs of postdiagnosis women with breast cancer in remote and isolated areas of Queensland. This discussion presents data focusing on the women's expressed personal concerns. For participants in this research, a diagnosis of breast cancer involves a confrontation with their own mortality and the possibility of a reduced life span. This is a definite life crisis, creating shock and requiring considerable adjustment. Along with these generic issues, the participants also articulated significant issues relating to their experience as women in a rural setting. These concerns centred on worries about how their partners and families would cope during their absences for treatment, the additional burden on the family of running the property or farm during the participant's absence or illness, added financial strain brought about by the cost of travel for treatment, maintenance of properties during absences, and problems created by time off from properties or self-employment. These findings accord with other reports on health and welfare services for rural Australians and with the generic psycho-oncology literature on breast cancer.
Abstract:
Recent years have seen increased attention given to examining the phenomenon of hope in patients with metastatic cancer. One of the results of this activity has been a greater appreciation of the significance of hope for the dying patient. However, there are many questions about the experience of hope and its impact on the lives of patients with cancer which remain to be answered. This paper discusses how hope is currently conceptualized in the nursing literature, and considers the implications that this conceptualization has for how we care for cancer patients. Some alternative ways of looking at the experience and the impact of hope are also discussed.
Abstract:
Background: Patients with Crohn's disease (CD) often require surgery at some stage of the disease course. Prediction of CD outcome is influenced by clinical, environmental, serological, and genetic factors (eg, NOD2). Being able to identify CD patients at high risk of surgical intervention should assist clinicians in deciding whether or not to prescribe early aggressive treatment with immunomodulators. Methods: We performed a retrospective analysis of selected clinical (age at diagnosis, perianal disease, active smoking) and genetic (NOD2 genotype) data obtained for a population-based CD cohort from the Canterbury Inflammatory Bowel Disease study. Logistic regression was used to identify predictors of complicated outcome in these CD patients (ie, need for inflammatory bowel disease-related surgery). Results: Perianal disease and the NOD2 genotype were the only independent factors associated with the need for surgery in this patient group (odds ratio=2.84 and 1.60, respectively). By combining the associated NOD2 genotype with perianal disease, we generated a single "clinicogenetic" variable. This was strongly associated with increased risk of surgery (odds ratio=3.84, P=0.00, 95% confidence interval 2.28-6.46) and offered moderate predictive accuracy (positive predictive value=0.62). Approximately one-third of surgical outcomes in this population are attributable to the NOD2+PA variable (attributable risk=0.32). Conclusions: Knowledge of perianal disease and NOD2 genotype in patients presenting with CD may offer clinicians some decision-making utility for early diagnosis of complicated CD progression and for initiating intensive treatment to avoid surgical intervention. Future studies should investigate combination effects of other genetic, clinical, and environmental factors when attempting to identify predictors of complicated CD outcomes.
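The measures reported above (odds ratio, positive predictive value, population attributable risk) all derive from a 2x2 exposure-by-outcome table. A minimal sketch of that arithmetic, using entirely hypothetical counts rather than the study's data:

```python
def two_by_two_stats(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio, positive predictive value and population attributable
    risk from a 2x2 table. In the study's terms, 'exposed' would be the
    combined NOD2 + perianal disease variable and 'case' would be need
    for IBD-related surgery; the counts fed in here are hypothetical."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    odds_ratio = (a * d) / (b * c)
    ppv = a / (a + b)                                  # P(case | marker positive)
    overall_risk = (a + c) / (a + b + c + d)
    risk_unexposed = c / (c + d)
    # Population attributable risk: fraction of overall risk removed if
    # everyone had the unexposed group's risk.
    attributable_risk = (overall_risk - risk_unexposed) / overall_risk
    return odds_ratio, ppv, attributable_risk

# Hypothetical counts for illustration only (not the Canterbury cohort):
# 30 marker-positive surgical, 20 marker-positive non-surgical,
# 25 marker-negative surgical, 75 marker-negative non-surgical.
or_, ppv, ar = two_by_two_stats(30, 20, 25, 75)
```

With these made-up counts the odds ratio works out to 4.5 and the PPV to 0.6, which shows the mechanics rather than reproducing the paper's estimates.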
Abstract:
The acetylcholine receptor (AchR) antibody assay has a key role in the diagnosis of myasthenia gravis. In this article, the role of AchR antibody assay in the diagnosis of ocular and generalized myasthenia gravis is reviewed, and compared to standard means of diagnosing the disease by clinical and electrophysiological methods.
Abstract:
The relationship of acetylcholine receptor (AchR) antibodies to disease activity in myasthenia gravis (MG) is controversial. Some authors claim a direct correlation with disease activity and treatment, in particular plasmapheresis therapy, whereas others have commented on the poor overall correlation of antibody levels with clinical state. Antibody levels were examined in a population of MG patients and correlated with disease activity and response to treatment. Antibodies to skeletal muscle AchR were found in most patients with generalised MG (24/25), in about half of the patients with purely ocular MG (6/10), and in neither of two patients with congenital MG. There was scant correlation with disease activity or response to treatment. It is concluded that the assay is more useful for diagnosis than for management of MG.
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status; anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken; and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between the well-nourished and the malnourished were no appetite, constipation, diarrhoea, problems swallowing, and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in the community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use, but none has been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for under 65 years, ≤23.5 kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA is recommended primarily as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by a lack of social support.

Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only; the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg) increased in both groups, although the increase in protein intake was significant only in the SC group. The changes in intake did not differ between the groups.
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life improved over the 12 weeks as well, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
Abstract:
Background Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with the composite of their scores used to determine risk of malnutrition. The aim of the study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between two nurses conducting the 3-MinNS tool was 78.3%. Conclusion 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
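The cutoff selection described above, scanning candidate cutoffs and reading sensitivity and specificity off the ROC curve, can be sketched as follows. The scores and labels are made up for illustration, not the study's data, and Youden's index is assumed here as the "best cutoff" criterion (the abstract does not state which criterion was used).

```python
def sensitivity_specificity(scores, malnourished, cutoff):
    """Sn/Sp of a screening score at a given cutoff. A patient is flagged
    'at risk' when score >= cutoff; `malnourished` holds gold-standard
    labels (True = malnourished on the reference assessment)."""
    tp = sum(s >= cutoff and m for s, m in zip(scores, malnourished))
    fn = sum(s < cutoff and m for s, m in zip(scores, malnourished))
    tn = sum(s < cutoff and not m for s, m in zip(scores, malnourished))
    fp = sum(s >= cutoff and not m for s, m in zip(scores, malnourished))
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, malnourished):
    """Pick the cutoff maximising Youden's index (Sn + Sp - 1), one common
    way of reading the 'best' operating point off an ROC curve."""
    return max(sorted(set(scores)),
               key=lambda c: sum(sensitivity_specificity(scores, malnourished, c)))

# Hypothetical screening scores and gold-standard labels:
scores = [1, 2, 2, 3, 4, 4, 5, 6]
labels = [False, False, False, False, True, True, True, True]
cutoff = best_cutoff(scores, labels)
```

In a real validation study the same scan would be run over the nurses' 3-MinNS scores against the dietitian's SGA classifications.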
Abstract:
Diagnosis threat is a psychosocial factor that has been proposed to contribute to poor outcomes following mild traumatic brain injury (mTBI). This threat is thought to impair the cognitive test performance of individuals with mTBI because of negative injury stereotypes. University students (N=45, 62.2% female) with a history of mTBI were randomly allocated to a diagnosis threat (DT, n=15), reduced threat (DT-reduced, n=15) or neutral (n=15) group. The reduced threat condition invoked a positive stereotype (i.e., that people with mTBI can perform well on cognitive tests). All participants were given neutral instructions before they completed baseline tests of (a) objective cognitive function across a number of domains; (b) psychological symptoms; and (c) post-concussion syndrome (PCS) symptoms, including self-reported cognitive and emotional difficulties. Participants then received either neutral, DT or DT-reduced instructions before repeating the tests. Results were analyzed using separate mixed-model ANOVAs, one for each dependent measure. The only significant result was for the 2 × 3 ANOVA on an objective test of attention/working memory, Digit Span, p<.05, such that the DT-reduced group performed better than the other groups, which were not different from each other. Although not consistent with predictions or earlier DT studies, the absence of group differences on most tests fits with several recent DT findings. The results of this study suggest that it is timely to reconsider the role of DT as a unique contributor to poor mTBI outcome.
Abstract:
Background A reliable standardized diagnosis of pneumonia in children has long been difficult to achieve. Clinical and radiological criteria have been developed by the World Health Organization (WHO); however, their generalizability to different populations is uncertain. We evaluated WHO-defined chest radiograph (CXR)-confirmed alveolar pneumonia in the clinical context in Central Australian Aboriginal children, a high-risk population, hospitalized with acute lower respiratory illness (ALRI). Methods CXRs in children (aged 1-60 months) hospitalized and treated with intravenous antibiotics for ALRI and enrolled in a randomized controlled trial (RCT) of Vitamin A/Zinc supplementation were matched with data collected during a population-based study of WHO-defined primary endpoint pneumonia (WHO-EPC). These CXRs were reread by a pediatric pulmonologist (PP) and classified as pneumonia-PP when alveolar changes were present. Sensitivities, specificities, and positive and negative predictive values (PPV, NPV) for clinical presentations were compared between WHO-EPC and pneumonia-PP. Results Of the 147 episodes of hospitalized ALRI, WHO-EPC was diagnosed significantly less commonly (40 episodes, 27.2%) than pneumonia-PP (difference 20.4%, 95% CI 9.6-31.2, P < 0.001). Clinical signs on admission were poor predictors of both pneumonia-PP and WHO-EPC; the sensitivities of clinical signs ranged from a high of 45% for tachypnea to 5% for fever + tachypnea + chest-indrawing, with PPVs of 40% and 20%, respectively. Higher PPVs were observed against the pediatric pulmonologist's diagnosis than against WHO-EPC. Conclusions WHO-EPC underestimates alveolar consolidation in a clinical context. Its use in clinical practice, or in research designed to inform clinical management in this population, should be avoided. Pediatr Pulmonol. 2012; 47:386-392. © 2011 Wiley Periodicals, Inc.
Abstract:
Background Oropharyngeal aspiration (OPA) can lead to recurrent respiratory illnesses and chronic lung disease in children. Current clinical feeding evaluations performed by speech pathologists have poor reliability in detecting OPA when compared to radiological procedures such as the modified barium swallow (MBS). Improved ability to diagnose OPA accurately via clinical evaluation potentially reduces reliance on expensive, less readily available radiological procedures. Our study investigates the utility of adding cervical auscultation (CA), a technique of listening to swallowing sounds, in improving the diagnostic accuracy of a clinical evaluation for the detection of OPA. Methods We plan an open, unblinded, randomised controlled trial at a paediatric tertiary teaching hospital. Two hundred and sixteen children fulfilling the inclusion criteria will be randomised to one of the two clinical assessment techniques for the clinical detection of OPA: (1) clinical feeding evaluation only (CFE) group or (2) clinical feeding evaluation with cervical auscultation (CFE + CA) group. All children will then undergo an MBS to determine radiologically assessed OPA. The primary outcome is the presence or absence of OPA, as determined on MBS using the Penetration-Aspiration Scale. Our main objective is to determine the sensitivity, specificity, negative and positive predictive values of ‘CFE + CA’ versus ‘CFE’ only compared to MBS-identified OPA. Discussion Early detection and appropriate management of OPA is important to prevent chronic pulmonary disease and poor growth in children. As the reliability of CFE to detect OPA is low, a technique that can improve the diagnostic accuracy of the CFE will help minimise consequences to the paediatric respiratory system. Cervical auscultation is a technique that has previously been documented as a clinical adjunct to the CFE; however, no published RCTs addressing the reliability of this technique in children exist. 
Our study will be the first to establish the utility of CA in assessing and diagnosing OPA risk in young children.
Abstract:
This project was an observational study of outpatients following lower limb surgical procedures for removal of skin cancers. The findings highlight a previously unreported high surgical site failure rate. The results also identified four potential risk factors (increasing age, presence of leg pain, split skin graft and haematoma) which negatively impact surgical site healing in this population.