985 results for Patient simulation


Relevance: 40.00%

Abstract:

The planning of refractive surgical interventions is a challenging task. Numerical modeling has been proposed as a solution to support surgical intervention and predict visual acuity, but validation on patient-specific interventions is missing. The purpose of this study was to validate the numerical predictions of the post-operative corneal topography induced by the incisions required for cataract surgery. The corneal topography of 13 patients was assessed preoperatively and postoperatively (1-day and 30-day follow-up) with a Pentacam tomography device. The preoperatively acquired corneal geometry (anterior surface, posterior surface and pachymetry data) was used to build patient-specific finite element models. For each patient, the effects of the cataract incisions were simulated numerically and the resulting corneal surfaces were compared to the clinical postoperative measurements at the 1-day and 30-day follow-ups. Results showed that the model was able to reproduce the experimental measurements with an error on the surgically induced sphere of 0.38 D one day postoperatively and 0.19 D 30 days postoperatively. The standard deviation of the surgically induced cylinder was 0.54 D at the first postoperative day and 0.38 D 30 days postoperatively. The prediction errors in surface elevation and curvature were below the topography device's measurement accuracy of ±5 μm and ±0.25 D at the 30-day follow-up. These results show that finite element simulations of corneal biomechanics can predict the post-cataract-surgery corneal topography within the accuracy of the topography measurement device. We conclude that numerical simulation can become a valuable tool for planning corneal incisions in cataract surgery and other ophthalmosurgical procedures in order to optimize patients' refractive outcome and visual function.
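
As a side note on the refractive quantities compared above, the sketch below (not the authors' pipeline; all radii are illustrative assumptions) shows how sphere and cylinder in diopters can be derived from corneal radii of curvature with the conventional keratometric index, and how a surgically induced change would then be computed.

```python
# Minimal sketch: simulated keratometry to sphere/cylinder, and the change
# induced by surgery. Radii are illustrative values, not patient data.

KERATOMETRIC_INDEX = 1.3375  # conventional keratometric refractive index

def corneal_power(radius_mm):
    """Convert a corneal radius of curvature (mm) to dioptric power."""
    return (KERATOMETRIC_INDEX - 1.0) * 1000.0 / radius_mm

def sphere_cylinder(r_flat_mm, r_steep_mm):
    """Return (sphere, cylinder) in diopters from flat/steep meridian radii."""
    p_flat = corneal_power(r_flat_mm)
    p_steep = corneal_power(r_steep_mm)
    return p_flat, p_steep - p_flat   # sphere on the flat meridian, plus-cylinder form

# Pre- and post-operative simulated keratometry (illustrative values only)
pre_sphere, pre_cyl = sphere_cylinder(7.90, 7.70)
post_sphere, post_cyl = sphere_cylinder(7.88, 7.72)

print(f"Surgically induced sphere change:   {post_sphere - pre_sphere:+.2f} D")
print(f"Surgically induced cylinder change: {post_cyl - pre_cyl:+.2f} D")
```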

Relevance: 40.00%

Abstract:

Myocardial fibrosis detected via delayed-enhancement magnetic resonance imaging (MRI) has been shown to be a strong indicator of ventricular tachycardia (VT) inducibility. However, little is known about how inducibility is affected by the details of fibrosis extent, morphology, and border-zone configuration. The objective of this article is to systematically study the arrhythmogenic effects of fibrosis geometry and extent, specifically on VT inducibility and maintenance. We present a set of methods for constructing patient-specific computational models of human ventricles using in vivo MRI data from patients suffering from hypertension, hypercholesterolemia, and chronic myocardial infarction. Additional synthesized models with morphologically varied extents of fibrosis and gray zone (GZ) distribution were derived to study the alterations in arrhythmia induction and reentry patterns. Detailed electrophysiological simulations demonstrated that (1) VT morphology was highly dependent on the extent of fibrosis, which acts as a structural substrate, (2) reentry tended to be anchored to the fibrosis edges and showed transmural conduction of activations through narrow channels formed within the fibrosis, and (3) increasing the extent of GZ within fibrosis tended to destabilize the structural reentry sites and aggravate the VT compared to fibrotic regions of the same size and shape but with less or no GZ. The approach and findings represent a significant step toward patient-specific cardiac modeling as a reliable tool for VT prediction and patient management. The findings also point to the sensitivity of the results to how image-based reconstruction techniques approximate the structural pathology.
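
The channel behaviour described in findings (2) and (3) can be illustrated with a deliberately simplified activation-time toy (an assumption-laden sketch, not the paper's detailed electrophysiology model): dense fibrosis blocks conduction, gray zone conducts slowly, and a wavefront started at one edge of the tissue is forced through the narrow channel inside the scar.

```python
# Eikonal-style activation-time map on a 2D tissue grid (illustrative only).
import heapq
import numpy as np

NORMAL, GZ, FIBROSIS = 0, 1, 2
SPEED = {NORMAL: 1.0, GZ: 0.3, FIBROSIS: 0.0}   # assumed relative conduction velocities

def activation_times(tissue, stimulus):
    """Dijkstra front propagation; fibrosis (speed 0) is never activated."""
    times = np.full(tissue.shape, np.inf)
    times[stimulus] = 0.0
    queue = [(0.0, stimulus)]
    while queue:
        t, (i, j) = heapq.heappop(queue)
        if t > times[i, j]:
            continue
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < tissue.shape[0] and 0 <= nj < tissue.shape[1]:
                v = SPEED[int(tissue[ni, nj])]
                if v == 0.0:
                    continue                      # scar core: no conduction
                nt = t + 1.0 / v
                if nt < times[ni, nj]:
                    times[ni, nj] = nt
                    heapq.heappush(queue, (nt, (ni, nj)))
    return times

# Small synthetic slab: a fibrotic block crossed by a narrow slow-conducting GZ channel
tissue = np.zeros((40, 40), dtype=int)
tissue[10:30, 15:25] = FIBROSIS
tissue[19:21, 15:25] = GZ
t = activation_times(tissue, (20, 0))
print("latest activation (arbitrary units):", np.max(t[np.isfinite(t)]))
```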

Relevance: 30.00%

Abstract:

Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which several ribs and the sternum grow abnormally. Nowadays, surgical correction is carried out in children and adults through the Nuss technique. This technique has been shown to be safe, with cosmesis and the prevention of psychological problems and social stress as its major drivers. However, no application is known that predicts the cosmetic outcome of pectus excavatum surgical correction. Such a tool could help the surgeon and the patient when deciding whether surgical correction is needed. This work is a first step towards predicting the postsurgical outcome of pectus excavatum correction. To this end, a point cloud of the skin surface along the thoracic wall was first determined using Computed Tomography (before surgical correction) and the Polhemus FastSCAN (after surgical correction). A surface mesh was then reconstructed from the two point clouds using a Radial Basis Function algorithm, followed by affine registration between the meshes. After registration, the surgical correction influence area (SCIA) of the thoracic wall was studied. This SCIA was used to train, test and validate artificial neural networks (ANNs) in order to predict the surgical outcome of pectus excavatum correction and to determine the degree of convergence of the SCIA across different patients. The ANNs often did not converge to a satisfactory solution (each patient has distinct deformity characteristics), which prevented the creation of a mathematical model capable of estimating the postsurgical outcome with satisfactory results.
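
For the surface reconstruction step, a minimal sketch is given below of fitting a smooth height field to a scattered skin point cloud with a radial basis function interpolant. It uses a synthetic point cloud and scipy's RBFInterpolator, so the data and parameters are illustrative assumptions rather than the work's actual processing.

```python
# RBF reconstruction of a skin-like surface z = f(x, y) from scattered points.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Synthetic "anterior chest wall" point cloud with a central depression
xy = rng.uniform(-1.0, 1.0, size=(500, 2))
z = 0.2 * np.cos(np.pi * xy[:, 0]) - 0.5 * np.exp(-8.0 * (xy ** 2).sum(axis=1))

# Thin-plate-spline RBF fit of the surface
surface = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-3)

# Evaluate on a regular grid to obtain a mesh-ready height map
gx, gy = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_grid = surface(grid).reshape(gx.shape)
print("reconstructed grid:", z_grid.shape, "deepest point:", z_grid.min().round(3))
```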

Relevance: 30.00%

Abstract:

Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which an abnormal formation of the rib cage gives the chest a caved-in or sunken appearance. Today, the surgical correction of this deformity is carried out in children and adults through the Nuss technique, which consists of the placement of a prosthetic bar under the sternum and over the ribs. Although this technique has been shown to be safe and reliable, not all patients achieve an adequate cosmetic outcome, which often leads to psychological problems and social stress, before and after the surgical correction. This paper targets this particular problem by presenting a method to predict the patient's surgical outcome based on pre-surgical imaging information and dynamic modulation of the chest skin. The proposed approach uses the patient's pre-surgical thoracic CT scan and anatomical-surgical references to perform a 3D segmentation of the left ribs, right ribs, sternum and skin. The technique encompasses three steps: a) approximation of the cartilages between the ribs and the sternum through b-spline interpolation; b) a volumetric mass-spring model that connects two layers, an inner skin layer based on the outer pleural contour and the outer skin surface; and c) displacement of the sternum according to the prosthetic bar position. A dynamic model of the skin around the chest wall region was generated, capable of simulating the effect of the movement of the prosthetic bar along the sternum. The results were compared and validated against the patient's postsurgical skin surface acquired with the Polhemus FastSCAN system.
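
To make step b) concrete, here is a drastically simplified one-dimensional mass-spring sketch (all stiffnesses, step sizes and geometry are assumptions of this sketch, not the paper's volumetric model): outer-skin nodes are coupled to their neighbours and to a fixed inner layer, the prosthetic bar imposes a displacement at the sternum node, and damped explicit integration settles the surrounding skin into a new shape.

```python
# Damped mass-spring relaxation of a 1D "skin" chain pushed at its centre.
import numpy as np

n = 41                          # skin nodes across the chest
k_neigh, k_inner = 40.0, 4.0    # assumed spring stiffnesses
damping, dt, steps = 0.8, 0.01, 4000

rest = np.zeros(n)              # inner-layer (pleura-based) reference positions
x = rest.copy()                 # outer-skin displacement field
v = np.zeros(n)
bar_lift = 1.0                  # imposed displacement at the bar/sternum node

for _ in range(steps):
    f = np.zeros(n)
    f[1:] += k_neigh * (x[:-1] - x[1:])     # spring to left neighbour
    f[:-1] += k_neigh * (x[1:] - x[:-1])    # spring to right neighbour
    f += k_inner * (rest - x)               # spring to the fixed inner layer
    v = damping * (v + dt * f)              # damped explicit Euler step
    x += dt * v
    x[n // 2] = bar_lift                    # bar holds the sternum node in place
    v[n // 2] = 0.0

print("lift at sternum:", x[n // 2], "lift 5 nodes away:", round(x[n // 2 + 5], 3))
```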

Relevance: 30.00%

Abstract:

The success of a dental implant-supported prosthesis is directly linked to the accuracy obtained during estimation of the implant's pose (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. To this end, an image-based framework is proposed to estimate the patient-specific implant pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region of interest is extracted from the CBCT data using 2 operator-defined points along the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align the patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed across 12 three-dimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67±34 μm and 108 μm, and angular misfits of 0.15±0.08º and 1.4º, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implant pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.
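
Step (3), the voxel-based rigid registration, could be prototyped along the following lines with SimpleITK. This is a hedged sketch: the file names, metric and optimizer settings are placeholders, not the framework's actual implementation.

```python
# Rigid (Euler 3D) registration of a simulated implant CBCT onto a patient CBCT.
import SimpleITK as sitk

fixed = sitk.ReadImage("patient_cbct.mha", sitk.sitkFloat32)            # hypothetical names
moving = sitk.ReadImage("simulated_implant_cbct.mha", sitk.sitkFloat32)

# Coarse initialization (the framework aligns to the operator-defined axis instead)
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)

final_transform = reg.Execute(fixed, moving)
print(final_transform)   # rigid pose (rotation + translation) of the implant
```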

Relevance: 30.00%

Abstract:

Objectives: We attempted to show how the implementation of the key elements of the World Health Organization Patient Safety Curriculum Guide Multi-professional Edition in an undergraduate curriculum affected the knowledge, skills, and attitudes towards patient safety in a graduate-entry Portuguese medical school. Methods: After receiving formal recognition by the WHO as a Complementary Test Site and approval of the organizational ethics committee, the validated pre-course questionnaires measuring knowledge, skills, and attitudes towards patient safety were administered to the 2nd and 3rd year students of a four-year course (N = 46). The key modules of the curriculum were implemented over the academic year using a variety of learning strategies, including expert lectures, small-group problem-based teaching sessions, and simulation laboratory sessions. The identical questionnaires were then administered and the impact was measured. The Curriculum Guide was evaluated as a health education tool in this context. Results: A significant number of the respondents, 47% (n = 22), reported having received some form of prior patient safety training. The effect on patient safety knowledge was assessed by using the percentages of correct pre- and post-course answers to construct 2 × 2 contingency tables and applying Fisher's exact test (two-tailed). No significant differences were detected at the p < 0.05 level. To assess the effect of the intervention on patient safety skills and attitudes, the mean and standard deviation were calculated for the pre- and post-course responses, and the independent samples were compared with the Mann-Whitney test. The attitudinal survey indicated a very high baseline incidence of desirable attitudes and skills towards patient safety. Significant changes were detected (p < 0.05) regarding what should happen if an error is made (p = 0.016), the role of healthcare organizations in error reporting (p = 0.006), and the extent of medical error (p = 0.005). Conclusions: The implementation of selected modules of the WHO Patient Safety Curriculum was associated with a number of positive changes regarding patient safety skills and attitudes, with a baseline incidence of highly desirable patient safety attitudes, but no measurable change in patient safety knowledge, at the University of Algarve Medical School. The significance of these results is discussed along with implications and suggestions for future research.
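
The two tests named in the Methods can be reproduced in a few lines; the sketch below uses made-up counts and scores purely to illustrate the analysis, not the study's data.

```python
# Fisher's exact test on a 2x2 pre/post knowledge table and a Mann-Whitney U
# test on independent pre/post attitude scores (illustrative numbers only).
from scipy.stats import fisher_exact, mannwhitneyu

# Correct vs. incorrect answers to one knowledge item, pre- and post-course
table = [[30, 16],   # pre-course:  correct, incorrect
         [35, 11]]   # post-course: correct, incorrect
odds_ratio, p_knowledge = fisher_exact(table, alternative="two-sided")

# Likert-style attitude scores (1-5) from independent pre and post samples
pre_scores = [4, 4, 5, 3, 4, 5, 4, 3, 4, 5]
post_scores = [5, 5, 4, 5, 5, 4, 5, 5, 4, 5]
u_stat, p_attitude = mannwhitneyu(pre_scores, post_scores, alternative="two-sided")

print(f"knowledge item: OR={odds_ratio:.2f}, p={p_knowledge:.3f}")
print(f"attitude item:  U={u_stat:.1f}, p={p_attitude:.3f}")
```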

Relevance: 30.00%

Abstract:

Many health care systems today are large, complex and highly dynamic environments; this is especially true of Emergency Departments (EDs), which are open and working 24 hours per day throughout the year with limited resources and are frequently overcrowded. Simulating EDs is therefore essential to improve their performance both qualitatively and quantitatively. This improvement can be achieved by modelling and simulating EDs with an agent-based model (ABM) and optimising many different staff scenarios. This work optimises the staff configuration of an ED. In order to perform the optimisation, objective functions to minimise or maximise have to be set; one such objective is to find the best, or optimum, staff configuration that minimises patient waiting time. The staff configuration comprises the number and type of doctors, triage nurses, and admission staff. Finding the best staff configuration is a combinatorial problem that can take a long time to solve. HPC was used to run the experiments, and encouraging results were obtained. However, even with the basic ED used in this work the search space is very large; as the problem size increases, more processing resources will therefore be needed to obtain results in an acceptable time.
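
The combinatorial search itself can be sketched independently of the agent-based simulator. Below, a crude queueing proxy stands in for the ABM so that an exhaustive sweep over staff configurations can be shown end to end; the arrival rate, service rates, ranges and the small cost term are all assumptions of this sketch.

```python
# Exhaustive sweep over (doctors, triage nurses, admission staff) configurations.
from itertools import product

ARRIVALS_PER_HOUR = 20.0                                             # assumed arrival rate
SERVICE_RATE = {"doctor": 4.0, "triage": 12.0, "admission": 15.0}    # patients/hour/person
COST_PER_STAFF = 0.3                                                 # sketch-only penalty

def proxy_waiting_time(doctors, triage, admissions):
    """Stand-in for the ABM: sum of single-stage congestion terms rho/(1-rho)."""
    total = 0.0
    for staff, rate in ((doctors, SERVICE_RATE["doctor"]),
                        (triage, SERVICE_RATE["triage"]),
                        (admissions, SERVICE_RATE["admission"])):
        capacity = staff * rate
        if capacity <= ARRIVALS_PER_HOUR:
            return float("inf")           # unstable stage: queue grows without bound
        rho = ARRIVALS_PER_HOUR / capacity
        total += rho / (1.0 - rho)        # congestion rises sharply as rho approaches 1
    return total

def objective(cfg):
    # The cost term keeps the sweep from trivially maxing out every role.
    return proxy_waiting_time(*cfg) + COST_PER_STAFF * sum(cfg)

best = min(product(range(1, 11), range(1, 6), range(1, 6)), key=objective)
print("best (doctors, triage nurses, admission staff):", best,
      "proxy waiting:", round(proxy_waiting_time(*best), 2))
```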

Relevance: 30.00%

Abstract:

Objectives: We are interested in the numerical simulation of the anastomotic region between the outflow cannula of an LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise from the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations. Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow the patient geometry to be reconstructed. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method was tested on five patients with left ventricular assist devices who underwent a CT-scan exam. Results: The method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution. Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that were generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
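
An analogous preprocessing chain can be expressed with SimpleITK's procedural interface, as sketched below. The original work used the ITK/VMTK toolchain directly, so the file name, parameter values and the label chosen for the aorta are placeholders, not the authors' settings.

```python
# Contrast enhancement -> anisotropic diffusion -> watershed -> binary cleanup.
import SimpleITK as sitk

image = sitk.ReadImage("lvad_ct.mha", sitk.sitkFloat32)          # placeholder file name

# 1) Contrast enhancement with adaptive histogram equalization
equalized = sitk.AdaptiveHistogramEqualization(image, alpha=0.3, beta=0.3)

# 2) Edge-preserving denoising with gradient anisotropic diffusion
smoothed = sitk.GradientAnisotropicDiffusion(
    equalized, timeStep=0.0625, conductanceParameter=2.0, numberOfIterations=5)

# 3) Watershed segmentation on the gradient magnitude, then a morphology cleanup
gradient = sitk.GradientMagnitude(smoothed)
labels = sitk.MorphologicalWatershed(gradient, level=2.0, markWatershedLine=False)
aorta = sitk.BinaryThreshold(labels, lowerThreshold=1, upperThreshold=1,
                             insideValue=1, outsideValue=0)      # assumed label of interest
aorta = sitk.BinaryFillhole(aorta)

sitk.WriteImage(aorta, "aorta_mask.mha")                         # mask for meshing (VMTK/gmsh)
```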

Relevance: 30.00%

Abstract:

PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: a findings-based algorithm following a linear or branched pathway, a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56-, 1.44-, and 1.17-times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not be as good in other disease settings.
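
The sensitivity/specificity trade-off between serial and parallel workups can be illustrated with a small Monte Carlo sketch; the prevalence and test characteristics below are assumptions of this toy, unrelated to the study's disease-specific model.

```python
# Monte Carlo estimate of sensitivity/specificity for serial vs. parallel testing.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
PREVALENCE = 0.10
SENS_A, SPEC_A = 0.85, 0.90     # assumed characteristics of test A
SENS_B, SPEC_B = 0.80, 0.95     # assumed characteristics of test B

disease = rng.random(N) < PREVALENCE
test_a = np.where(disease, rng.random(N) < SENS_A, rng.random(N) > SPEC_A)
test_b = np.where(disease, rng.random(N) < SENS_B, rng.random(N) > SPEC_B)

for name, positive in (("serial (A and B)", test_a & test_b),
                       ("parallel (A or B)", test_a | test_b)):
    sens = positive[disease].mean()
    spec = (~positive[~disease]).mean()
    print(f"{name}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

As expected, the parallel workup trades a few points of specificity for a substantially higher sensitivity, which is the behaviour the study quantifies in terms of expected harm.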

Relevance: 30.00%

Abstract:

For some drugs, pharmacokinetic variability in drug levels is a major determinant of treatment success, since concentrations outside the therapeutic range might lead to toxic reactions, treatment discontinuation or inefficacy. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target patient population and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. Quantifying and identifying the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
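
The kind of model-based dosing simulation mentioned above can be sketched with a one-compartment, first-order-absorption model and dose superposition; the parameter values below are illustrative placeholders, not the published nevirapine estimates.

```python
# One-compartment oral PK model, dosed every 12 h, for two clearance values
# (e.g., with and without a reduced-clearance genotype).
import numpy as np

def concentration(t_h, dose_mg, cl_l_h, v_l, ka_h, tau_h, n_doses):
    """Superpose single oral doses: C(t) = D*ka/(V*(ka-ke)) * (e^-ke*t - e^-ka*t)."""
    ke = cl_l_h / v_l
    c = np.zeros_like(t_h, dtype=float)
    for i in range(n_doses):
        dt = t_h - i * tau_h
        mask = dt > 0
        c[mask] += (dose_mg * ka_h / (v_l * (ka_h - ke))) * (
            np.exp(-ke * dt[mask]) - np.exp(-ka_h * dt[mask]))
    return c

t = np.linspace(0, 96, 961)                      # four days of twice-daily dosing
typical = concentration(t, dose_mg=200, cl_l_h=3.0, v_l=100.0, ka_h=1.0, tau_h=12, n_doses=8)
slow = concentration(t, dose_mg=200, cl_l_h=2.0, v_l=100.0, ka_h=1.0, tau_h=12, n_doses=8)

print(f"trough at 96 h: typical CL {typical[-1]:.2f} mg/L, reduced CL {slow[-1]:.2f} mg/L")
```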

Relevance: 30.00%

Abstract:

BACKGROUND: The use of virtual reality (VR) has gained increasing interest as a way to acquire laparoscopic skills outside the operating theatre and thus increase patient safety. The aim of this study was to evaluate trainees' acceptance of VR for assessment and training during a skills course and at their institution. METHODS: All 735 surgical trainees of the International Gastrointestinal Surgery Workshop 2006-2008, held in Davos, Switzerland, were given a minimum of 45 minutes of VR training during the course. Participants' opinion of VR was analyzed with a standardized questionnaire. RESULTS: Five hundred and twenty-seven participants (72%) from 28 countries attended the VR sessions and answered the questionnaires. The possibility of using VR at the course was rated as excellent or good by 68%, useful by 21%, reasonable by 9% and unsuitable or useless by 2%. If such VR simulators were available at their institution, 46% of the course participants would train at least one hour per week and 42% two or more hours per week; only 12% would not use VR. Similarly, 63% of the participants would agree to operate on patients only after VR training, and 55% would accept VR as part of their assessment. CONCLUSION: Residents accept and appreciate VR simulation for surgical assessment and training. The majority of trainees are motivated to spend time regularly on VR training if it is accessible.

Relevance: 30.00%

Abstract:

Simulation is a useful tool in cardiac SPECT to assess quantification algorithms. However, simple equation-based models are limited in their ability to simulate realistic heart motion and perfusion. We present a numerical dynamic model of the left ventricle, which allows us to simulate normal and anomalous cardiac cycles, as well as perfusion defects. Bicubic splines were fitted to a number of control points to represent the endocardial and epicardial surfaces of the left ventricle. A transformation from each point on the surface to a template of activity was made to represent the myocardial perfusion. Geometry-based and patient-based simulations were performed to illustrate this model. Geometry-based simulations modeled (1) a normal patient, (2) a well-perfused patient with abnormal regional function, (3) an ischaemic patient with abnormal regional function, and (4) a patient study including tracer kinetics. The patient-based simulation consisted of a left ventricle with a realistic shape and motion obtained from a magnetic resonance study. We conclude that this model has the potential to study the influence of several physical parameters and of left-ventricular contraction in myocardial perfusion SPECT and gated-SPECT studies.
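
The surface representation described above can be prototyped with a bicubic spline over a coarse control grid; the sketch below parameterizes an idealized endocardial surface by circumferential angle and long-axis position (shape and dimensions are illustrative assumptions, not the published model).

```python
# Bicubic-spline surface of LV radius over (circumferential angle, long-axis position).
import numpy as np
from scipy.interpolate import RectBivariateSpline

theta_ctrl = np.linspace(0, 2 * np.pi, 9)          # circumferential control angles
z_ctrl = np.linspace(0.0, 1.0, 7)                  # apex (0) to base (1)

# Control radii: a truncated-ellipsoid-like cavity, slightly non-circular
R = 25.0 * np.sqrt(np.clip(1.0 - (1.0 - z_ctrl) ** 2, 0.0, None))     # mm, per z level
radius_ctrl = np.outer(1.0 + 0.05 * np.cos(2 * theta_ctrl), R)        # (theta, z) grid

surface = RectBivariateSpline(theta_ctrl, z_ctrl, radius_ctrl, kx=3, ky=3)

# Evaluate a dense surface and convert to Cartesian points for rendering/voxelization
theta = np.linspace(0, 2 * np.pi, 90)
z = np.linspace(0.0, 1.0, 60)
r = surface(theta, z)                               # (90, 60) radii in mm
x = r * np.cos(theta)[:, None]
y = r * np.sin(theta)[:, None]
print("surface grid:", r.shape, "mean basal radius (mm):", r[:, -1].mean().round(1))
```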

Relevance: 30.00%

Abstract:

Hypothesis: The quality of care for chronic patients depends on the collaborative skills of the healthcare providers.1,2 The literature lacks reports of the use of simulation to teach collaborative skills in non-acute care settings. We posit that simulation offers benefits for supporting the development of collaborative practice in non-acute settings. We explored the benefits and challenges of using an Interprofessional Team Objective Structured Clinical Examination (IT-OSCE) as a formative assessment tool. An IT-OSCE is an intervention in which an interprofessional team of trainees interacts with a simulated patient (SP), enabling them to practice collaborative skills in non-acute care settings.5 Simulated patients are people trained to portray patients in a simulated scenario for educational purposes.6,7 Since interprofessional education (IPE) ultimately aims to provide collaborative patient-centered care,8,9 we sought to promote patient-centeredness in the learning process. Methods: The IT-OSCE was conducted with four trios of students from different professions. The debriefing was co-facilitated by the SP together with a faculty member. The participants were final-year students in nursing, physiotherapy and medicine. Our research question focused on the introduction of co-facilitated (SP and faculty) debriefing after an IT-OSCE: what are the benefits and challenges of involving the SP during the debriefing? To evaluate the IT-OSCE, an exploratory case study was used to provide fine-grained data.10,11 Focus groups were conducted: two with students (n=6; n=5), one with SPs (n=3) and one with faculty (n=4). Audiotapes were transcribed for thematic analysis performed by three researchers, who reached a consensus on the final set of themes. Results: The thematic analysis showed little differentiation between the SP, student and faculty perspectives. In particular, the analysis of the transcripts revealed that co-facilitation by the SP during the debriefing of an IT-OSCE proved to be feasible. It was appreciated by all the participants and appeared to value and promote patient-centeredness in the learning process. The main challenge concerned the SPs' feedback, in particular how they could report accurate observations to a group of students rather than to individual students. Conclusion: SP methodology using an IT-OSCE seems to be a useful and promising way to train collaborative skills, aligning IPE, simulation-based team training in a non-acute care setting, and patient-centeredness. We acknowledge the limitations of the study, especially the small sample, and consider the exploration of SP-based IPE in non-acute care settings a strength. Future studies could consider the preparation of SPs and faculty as co-facilitators. References: 1. Borrill CS, Carletta J, Carter AJ, et al. The effectiveness of health care teams in the National Health Service. Aston Centre for Health Service Organisational Research; 2001. 2. Reeves S, Lewin S, Espin S, Zwarenstein M. Interprofessional teamwork for health and social care. Oxford: Wiley-Blackwell; 2010. 3. Issenberg S, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Medical Teacher. 2005;27(1):10-28. 4. McGaghie W, Petrusa ER, Gordon DL, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Medical Education. 2010;44(1):50-63. 5. Simmons B, Egan-Lee E, Wagner SJ, Esdaile M, Baker L, Reeves S. Assessment of interprofessional learning: the design of an interprofessional objective structured clinical examination (iOSCE) approach. Journal of Interprofessional Care. 2011;25(1):73-74. 6. Nestel D, Layat Burn C, Pritchard SA, Glastonbury R, Tabak D. The use of simulated patients in medical education: Guide Supplement 42.1 - Viewpoint. Medical Teacher. 2011;33(12):1027-1029. Disclosures: None.

Relevance: 30.00%

Abstract:

The main purpose of this study was to describe and evaluate nursing students' learning of an empowering discourse in patient education. In Phase 1, the purpose was to describe an empowering discourse between a nurse and a patient. In Phase 2, the purpose was first to create a computer simulation program of an empowering discourse based on that description, and second to evaluate nursing students' learning of how to conduct an empowering discourse using the computer simulation program. The ultimate goal was to strengthen the knowledge base on empowering discourse and to develop nursing students' knowledge of how to conduct an empowering discourse for the development of patient education. In Phase 1, empowering discourse was described using a systematic literature review with a metasummary technique (n=15). Data were collected covering the period from January 1995 to October 2005. In Phase 2, the computer simulation program of empowering discourse was created in 2006-2007, based on the description. A descriptive comparative design was used to evaluate the students' (n=69) process of learning empowering discourse using the computer simulation program, and a pretest-posttest design without a control group was used to evaluate the students' (n=43) learning outcomes. Data were collected in 2007. Empowering discourse was found to be a structured process, and it could be simulated and learned with the computer simulation program. According to the students' own knowledge, however, empowering discourse was an unstructured process. The process of learning empowering discourse using the computer simulation program was controlled by the students, and it changed their knowledge. The learning outcomes appeared as changes in the students' knowledge, which became more holistic and better organized, or only more holistic or only better organized. The study strengthened the knowledge base on empowering discourse and made the students more knowledgeable about it.

Relevance: 30.00%

Abstract:

In radiotherapy, computed tomography (CT) provides the patient's anatomical information used for dose calculation during treatment planning. To account for the heterogeneous composition of tissues, calculation techniques such as the Monte Carlo method are required to compute the dose accurately. Importing CT images into such a calculation requires that each voxel, expressed in Hounsfield units (HU), be converted into a physical quantity such as electron density (ED). This conversion is usually performed with an HU-ED calibration curve. An anomaly or artifact appearing in a CT image before calibration is likely to assign the wrong tissue to a voxel. Such errors can cause a critical loss of reliability in the dose calculation. This work aims to assign accurate values to the voxels of CT images in order to ensure the reliability of dose calculations during radiotherapy treatment planning. To this end, a study is carried out on artifacts reproduced by Monte Carlo simulation. To reduce computation time, the simulations are parallelized and run on a supercomputer. A sensitivity study of the HU numbers in the presence of artifacts is then performed through a statistical analysis of the histograms. Beam hardening, which is at the origin of many artifacts, is studied further. A review of the state of the art in beam-hardening correction is presented, followed by an explicit demonstration of an empirical correction.
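
The HU-to-electron-density conversion described above is typically a piecewise-linear calibration. A minimal sketch follows, with an entirely illustrative calibration table (not a scanner-specific curve), to show how an artifact that corrupts HU values propagates into wrong densities.

```python
# Piecewise-linear HU -> relative electron density conversion.
import numpy as np

# (HU, relative electron density) calibration points: air, lung, water, soft tissue, bone
HU_POINTS = np.array([-1000.0, -700.0, 0.0, 50.0, 1000.0, 3000.0])
ED_POINTS = np.array([0.001, 0.30, 1.00, 1.04, 1.70, 2.80])

def hu_to_ed(hu_volume):
    """Convert a CT volume in Hounsfield units to relative electron density."""
    return np.interp(hu_volume.ravel(), HU_POINTS, ED_POINTS).reshape(hu_volume.shape)

# Example: a tiny synthetic slab with an artifact-like streak of wrong HU values
ct = np.full((4, 4), 40.0)           # soft tissue
ct[2, :] = -300.0                    # dark streak (e.g., beam-hardening artifact)
print(hu_to_ed(ct).round(3))         # the streak row is mis-assigned a lung-like density
```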