987 results for Biomedical model
Abstract:
The optical quality of the human eye mainly depends on the refractive performance of the cornea. The shape of the cornea is a mechanical balance between intraocular pressure and the intrinsic stiffness of the tissue. Several surgical procedures in ophthalmology alter the biomechanics of the cornea to provoke local or global curvature changes for vision correction. Motivated by the large number of surgical interventions performed every day, there is a rising demand for a deeper understanding of corneal biomechanics to improve the safety of procedures and medical devices. The aim of our work is to propose a numerical model of corneal biomechanics based on the stromal microstructure. Our novel anisotropic constitutive material law features a probabilistic weighting approach to model the collagen fiber distribution as observed in the human cornea by X-ray scattering analysis (Aghamohammadzadeh et al., Structure, February 2004). Furthermore, collagen cross-linking was explicitly included in the strain energy function. Results showed that the proposed model is able to successfully reproduce both inflation and extensiometry experimental data (Elsheikh et al., Curr Eye Res, 2007; Elsheikh et al., Exp Eye Res, May 2008). In addition, the mechanical properties calculated for patients of different age groups (Group A: 65-79 years; Group B: 80-95 years) demonstrate increased collagen cross-linking and decreased collagen fiber elasticity from younger to older specimens. These findings correspond to what is known about maturing fibrous biological tissue. Since the presented model can handle different loading situations and includes the anisotropic distribution of collagen fibers, it has the potential to simulate clinical procedures involving nonsymmetrical tissue interventions. In the future, such a mechanical model could be used to improve surgical planning and the design of next-generation ophthalmic devices.
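The abstract describes an anisotropic strain energy function with a probabilistic (orientation-distributed) weighting of collagen fibers and an explicit cross-linking contribution. As a hedged illustration only, a generic fiber-distributed strain energy of this kind can be written as follows; the matrix term, the exponential fiber term, the orientation density rho(theta), and the cross-link term are illustrative stand-ins, not the authors' actual constitutive law.

```latex
% Generic form of a fiber-distributed anisotropic strain energy (illustrative only):
% an isotropic matrix term, a collagen fiber term weighted by an orientation
% density rho(theta), and an additional contribution attributed to cross-links.
\begin{equation*}
  \Psi(\mathbf{C}) \;=\;
  \underbrace{\frac{\mu}{2}\,(\bar{I}_1 - 3)}_{\text{matrix}}
  \;+\;
  \underbrace{\int_{0}^{\pi} \rho(\theta)\,
    \frac{k_1}{2k_2}\Bigl(e^{\,k_2 (\bar{I}_4(\theta)-1)^2}-1\Bigr)\, d\theta}_{\text{distributed collagen fibers}}
  \;+\;
  \underbrace{\Psi_{\mathrm{xl}}(\mathbf{C})}_{\text{cross-links}}
\end{equation*}
```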
Abstract:
This paper aims at the development and evaluation of a personalized insulin infusion advisory system (IIAS), able to provide real-time estimates of the appropriate insulin infusion rate for type 1 diabetes mellitus (T1DM) patients using continuous glucose monitors and insulin pumps. The system is based on a nonlinear model-predictive controller (NMPC) that uses a personalized glucose-insulin metabolism model consisting of two compartmental models and a recurrent neural network. The model takes as input patient information on meal intake, glucose measurements, and insulin infusion rates, and provides glucose predictions. These predictions are fed to the NMPC, which estimates the optimal insulin infusion rates. An algorithm based on fuzzy logic has been developed for the on-line adaptation of the NMPC control parameters. The IIAS has been evaluated in silico using an appropriate simulation environment (UVa T1DM simulator). The IIAS was able to handle various meal profiles, fasting conditions, interpatient variability, intraday variation in physiological parameters, and errors in meal amount estimations.
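As a minimal sketch of the receding-horizon control loop described above: at each step a model predicts glucose over a horizon and an optimizer selects insulin rates that minimize the deviation from a target. The predictor here is a toy stand-in for the paper's compartmental/recurrent-neural-network model, and all names, constants, and bounds are illustrative assumptions.

```python
# Minimal receding-horizon NMPC sketch (illustrative; the paper's actual model
# combines compartmental models with a recurrent neural network -- here a
# placeholder predictor `predict_glucose` stands in for it).
import numpy as np
from scipy.optimize import minimize

TARGET = 110.0          # target glucose (mg/dL), assumed value
HORIZON = 6             # prediction horizon (number of control steps)

def predict_glucose(state, insulin_rates, meals):
    """Hypothetical stand-in for the personalized glucose-insulin model."""
    g = state
    preds = []
    for u, m in zip(insulin_rates, meals):
        g = g + 0.5 * m - 2.0 * u      # toy dynamics, not physiological
        preds.append(g)
    return np.array(preds)

def nmpc_step(state, meals, u_prev):
    """Choose the next insulin infusion rate by minimizing predicted deviation
    from the target plus a penalty on aggressive insulin changes."""
    def cost(u):
        g = predict_glucose(state, u, meals)
        return np.sum((g - TARGET) ** 2) + 0.1 * np.sum(np.diff(np.r_[u_prev, u]) ** 2)

    u0 = np.full(HORIZON, u_prev)
    res = minimize(cost, u0, bounds=[(0.0, 5.0)] * HORIZON)
    return res.x[0]        # apply only the first move, then re-plan at the next step

# Example: one control step from a glucose reading of 160 mg/dL
print(nmpc_step(160.0, meals=np.zeros(HORIZON), u_prev=1.0))
```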
Abstract:
We present an automatic method to segment brain tissues from volumetric MRI brain tumor images. The method is based on non-rigid registration of an average atlas in combination with a biomechanically justified tumor growth model to simulate the soft-tissue deformations caused by the tumor mass effect. The tumor growth model, which is formulated as a mesh-free Markov Random Field energy minimization problem, ensures correspondence between the atlas and the patient image prior to the registration step. The method is non-parametric, simple, and fast compared to other approaches while maintaining similar accuracy. It has been evaluated qualitatively and quantitatively, with promising results, on eight datasets comprising simulated images and real patient data.
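For readers unfamiliar with Markov Random Field energy minimization in general, the toy sketch below minimizes a discrete unary-plus-pairwise energy with iterated conditional modes (ICM) on a label grid. This is only a generic illustration of the technique named in the abstract; the paper's formulation is mesh-free and models tumor-induced soft-tissue displacement rather than pixel labels, and every name and parameter below is an assumption.

```python
# Generic Markov Random Field energy minimization via iterated conditional
# modes (ICM) on a small label grid -- illustrative only.
import numpy as np

def icm(unary, beta=1.0, n_iter=10):
    """unary: (H, W, L) array of per-site label costs; beta weighs a Potts
    smoothness term penalizing neighboring sites that take different labels."""
    H, W, L = unary.shape
    labels = unary.argmin(axis=2)                 # initialize with best unary label
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                costs = unary[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        costs += beta * (np.arange(L) != labels[ni, nj])
                labels[i, j] = costs.argmin()     # greedy local update
    return labels

# Toy usage: 2 labels on a 4x4 grid with random unary costs
print(icm(np.random.rand(4, 4, 2)))
```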
Abstract:
This paper presents a kernel density correlation-based nonrigid point set matching method and shows its application in statistical model-based 2D/3D reconstruction of a scaled, patient-specific model from an uncalibrated X-ray radiograph. In this method, both the reference point set and the floating point set are first represented using kernel density estimates. A correlation measure between these two kernel density estimates is then optimized to find a displacement field such that the floating point set is moved to the reference point set. Regularizations based on the overall deformation energy and the motion smoothness energy are used to constrain the displacement field for robust point set matching. Incorporating this nonrigid point set matching method into a statistical model-based 2D/3D reconstruction framework, we can reconstruct a scaled, patient-specific model from noisy edge points that are extracted directly from the X-ray radiograph by an edge detector. Our experiments, conducted on datasets from two patients and six cadavers, demonstrate a mean reconstruction error of 1.9 mm.
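The sketch below illustrates a Gaussian kernel correlation measure between two point sets and a simple optimization over a displacement applied to the floating set. The paper optimizes a full non-rigid displacement field with deformation-energy and smoothness regularization; here only a global shift is estimated to keep the example short, and all names and parameters are illustrative assumptions.

```python
# Gaussian kernel correlation between a reference and a floating point set,
# optimized over a global translation (toy version of non-rigid matching).
import numpy as np
from scipy.optimize import minimize

def kernel_correlation(ref, flt, sigma=1.0):
    """Sum of Gaussian kernels over all pairs of reference/floating points."""
    d2 = ((ref[:, None, :] - flt[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)).sum()

def register_translation(ref, flt, sigma=1.0):
    """Find the translation of the floating set that maximizes the correlation."""
    def neg_corr(t):
        return -kernel_correlation(ref, flt + t, sigma)
    return minimize(neg_corr, np.zeros(ref.shape[1])).x

# Toy usage: recover a known 2D shift of (0.3, 0.1)
ref = np.random.rand(50, 2)
print(register_translation(ref, ref - np.array([0.3, 0.1])))
```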
Abstract:
Understanding how nanoparticles may affect immune responses is an essential prerequisite to developing novel clinical applications. To investigate nanoparticle-dependent outcomes on immune responses, dendritic cells (DCs) were treated with model biomedical poly(vinyl alcohol)-coated superparamagnetic iron oxide nanoparticles (PVA-SPIONs). PVA-SPION uptake by human monocyte-derived DCs (MDDCs) was analyzed by flow cytometry (FACS) and advanced imaging techniques. Viability, activation, function, and stimulatory capacity of MDDCs were assessed by FACS and an in vitro CD4+ T cell assay. PVA-SPION uptake was dose-dependent, was decreased by lipopolysaccharide (LPS)-induced MDDC maturation at higher particle concentrations, and was inhibited by cytochalasin D pre-treatment. PVA-SPIONs did not alter surface marker expression (CD80, CD83, CD86, myeloid/plasmacytoid DC markers) or antigen uptake, but decreased the capacity of MDDCs to process antigen, stimulate CD4+ T cells, and induce cytokines. The decreased antigen processing and CD4+ T cell stimulation capability of MDDCs following PVA-SPION treatment suggests that MDDCs may revert to a more functionally immature state following particle exposure.
Abstract:
Clinical application of injectable ceramic cement in comminuted fractures has revealed penetration of the viscous paste into the joint space. Little is known about the fate of this cement and its influence on articular tissues. The purpose of this experimental study was to assess these unknown alterations of joint tissues after intra-articular injection of cement in a rabbit knee. Observation periods ranged from 1 week to 24 months, with three rabbits per group. Norian SRS cement was injected into one knee joint, with the contralateral side receiving the same volume of Ringer's solution. Light microscopic evaluation of histologic sections was performed, investigating the appearance of the cement, inflammatory reactions, and degenerative changes of the articular surface. No signs of pronounced acute or chronic inflammation were visible. The injected cement was mainly found as a single particle, anterior to the cruciate ligaments. It became surrounded by synovial tissue within 4 weeks and showed signs of superficial resorption. In some specimens, bone formation was seen around the cement. Degeneration of the articular surface showed no differences between the experimental and control sides, and no changes over time became apparent. No major degenerative changes were induced by the injected cement. Nevertheless, the prolonged presence of the cement still seems to make it advisable to remove radiologically visible amounts from the joint space.
Abstract:
Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider the "Cox Model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g., Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
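As a concrete illustration of the parameter partition gamma = (theta, eta) described above, consider the proportional hazards model the abstract credits to DRC: the regression coefficients are the parameters of interest, while the baseline hazard is an infinite-dimensional nuisance parameter eliminated by the partial likelihood. This is a standard textbook formulation added here for clarity, not an equation taken from the paper.

```latex
% Cox proportional hazards model: beta is the parameter of interest (theta),
% the baseline hazard lambda_0(t) is the nuisance parameter (eta), and the
% partial likelihood (product over observed event times) is free of lambda_0.
\begin{align*}
  \lambda(t \mid x) &= \lambda_0(t)\,\exp(x^\top \beta),
  \qquad \theta = \beta, \quad \eta = \lambda_0(\cdot), \\
  L_{\mathrm{partial}}(\beta) &= \prod_{i:\,\delta_i = 1}
    \frac{\exp(x_i^\top \beta)}{\sum_{j \in R(t_i)} \exp(x_j^\top \beta)}.
\end{align*}
```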
Abstract:
A time series is a sequence of observations made over time. Examples in public health include daily ozone concentrations, weekly admissions to an emergency department, or annual expenditures on health care in the United States. Time series models are used to describe the dependence of the response at each time on predictor variables, including covariates and possibly previous values in the series. Time series methods are necessary to account for the correlation among repeated responses over time. This paper gives an overview of time series ideas and methods used in public health research.
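As a hedged illustration of the kind of model described above (a generic form, not one taken from the paper), the response at time t can be regressed on covariates and on its own past values:

```latex
% Generic regression with autoregressive terms: y_t depends on covariates x_t
% and on the p previous values of the series.
\begin{equation*}
  y_t = \beta_0 + \beta_1 x_t + \sum_{k=1}^{p} \phi_k\, y_{t-k} + \varepsilon_t,
  \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2).
\end{equation*}
```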
Abstract:
Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Outliers are handled by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis testing procedure to estimate it automatically. We present our validation using four experiments: (1) a leave-one-out experiment; (2) an experiment evaluating the present approach for handling pathology; (3) an experiment evaluating the present approach for handling outliers; and (4) an experiment reconstructing surface models of seven dry cadaver femurs using clinically relevant data, both without and with added noise. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95th-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
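The outlier handling above rests on least trimmed squares with a given outlier rate. The sketch below shows the LTS idea on a plain linear least-squares problem: repeatedly fit, keep only the best-fitting fraction of equations, and refit. The fitting problem, function names, and constants are illustrative assumptions, not the paper's actual three-stage estimators.

```python
# Minimal least trimmed squares (LTS) sketch for robust fitting in the
# presence of outliers; ordinary linear least squares stands in for the
# per-stage estimation problem (illustrative only).
import numpy as np

def lts_fit(A, b, outlier_rate=0.2, n_iter=20):
    """Iteratively fit x to the h residually-smallest equations of A x = b."""
    n = len(b)
    h = int(np.ceil((1.0 - outlier_rate) * n))     # number of inliers kept
    keep = np.arange(n)                            # start with all equations
    for _ in range(n_iter):
        x, *_ = np.linalg.lstsq(A[keep], b[keep], rcond=None)
        residuals = np.abs(A @ x - b)
        keep = np.argsort(residuals)[:h]           # retain the h best-fitting rows
    return x

# Toy usage: a line fit with 20% gross outliers injected
t = np.linspace(0, 1, 50)
b = 2.0 * t + 1.0
b[:10] += 5.0                                      # corrupt the first 10 samples
print(lts_fit(np.c_[t, np.ones_like(t)], b))       # ~ [2.0, 1.0] despite outliers
```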
Abstract:
Background: Statistical shape models are widely used in biomedical research. They are routinely implemented for automatic image segmentation or object identification in medical images. In these fields, however, the acquisition of the large training datasets required to develop these models is usually a time-consuming process. Even after this effort, the collections of datasets are often lost or mishandled, resulting in duplicated work. Objective: To solve these problems, the Virtual Skeleton Database (VSD) is proposed as a centralized storage system where the data necessary to build statistical shape models can be stored and shared. Methods: The VSD provides an online repository system tailored to the needs of the medical research community. The processing of the most common image file types, a statistical shape model framework, and an ontology-based search provide the generic tools to store, exchange, and retrieve digital medical datasets. The hosted data are accessible to the community, which catalyzes collaborative research and productivity. Results: To illustrate the need for an online repository for medical research, three exemplary VSD projects are presented: (1) an international collaboration to improve cochlear surgery and implant optimization, (2) a population-based analysis of femoral fracture risk between genders, and (3) an online application developed for the evaluation and comparison of brain tumor segmentations. Conclusions: The VSD is a novel system for scientific collaboration within the medical image community, with a data-centric concept and a semantically driven search option for anatomical structures. The repository has proven to be a useful tool for collaborative model building, as a resource for biomechanical population studies, and for enhancing segmentation algorithms.