76 results for Analytical Anisotropic Algorithm


Relevance:

20.00%

Publisher:

Abstract:

The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by variographic analysis of the raw data and of the residuals. Maps of probabilities of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on mapping of radioactively contaminated territories.
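
For illustration, a minimal Python/NumPy sketch of a GRNN predictor (Nadaraya-Watson kernel regression) whose kernel widths are tuned by k-fold cross-validation; equal widths give the isotropic model, distinct per-dimension widths the anisotropic one. The function names, grid and error measure are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from itertools import product

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN (Nadaraya-Watson) prediction with Gaussian kernels.

    sigma: per-dimension kernel widths; equal values give an isotropic model,
    distinct values an anisotropic one."""
    sigma = np.asarray(sigma, dtype=float)
    # squared (an)isotropic distances between query and training points
    d2 = (((X_query[:, None, :] - X_train[None, :, :]) / sigma) ** 2).sum(axis=2)
    w = np.exp(-0.5 * d2)                          # kernel weights
    w_sum = np.maximum(w.sum(axis=1), np.finfo(float).tiny)
    return (w * y_train[None, :]).sum(axis=1) / w_sum

def tune_sigma(X, y, grid, k=5, seed=0):
    """Select kernel widths by k-fold cross-validation (mean squared error)."""
    folds = np.random.default_rng(seed).permutation(len(X)) % k
    best, best_mse = None, np.inf
    for sigma in product(*grid):                   # grid over per-dimension widths
        mse = sum(np.mean((grnn_predict(X[folds != f], y[folds != f],
                                        X[folds == f], sigma) - y[folds == f]) ** 2)
                  for f in range(k))
        if mse < best_mse:
            best, best_mse = sigma, mse
    return np.asarray(best)

# example: anisotropic tuning over a log-spaced grid per input dimension
# sigma = tune_sigma(X, y, grid=[np.logspace(-1, 1, 5)] * X.shape[1])
```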

Relevance:

20.00%

Publisher:

Abstract:

The decision-making process regarding drug dose, regularly used in everyday medical practice, is critical to patients' health and recovery. It is a challenging process, especially for drugs with narrow therapeutic ranges, in which a medical doctor decides the quantity (dose amount) and frequency (dose interval) on the basis of a set of available patient features and the doctor's clinical experience (a priori adaptation). Computer support in drug dose administration makes the prescription procedure faster, more accurate, more objective and less expensive, with a tendency to reduce the number of invasive procedures. This paper presents an advanced integrated Drug Administration Decision Support System (DADSS) to help clinicians/patients with dose computation. Based on a support vector machine (SVM) algorithm enhanced with the random sample consensus (RANSAC) technique, the system is able to predict drug concentration values and to compute the ideal dose amount and dose interval for a new patient. With an extension that combines the SVM method with an explicit analytical model, the advanced integrated DADSS is able to compute drug concentration-versus-time curves for a patient under different conditions. A feedback loop updates the curve with newly measured concentration values to make it more personalized (a posteriori adaptation).
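
A minimal sketch of the core prediction idea, assuming a recent scikit-learn (the RANSACRegressor keyword is `estimator`; older releases use `base_estimator`): an RBF support vector regressor wrapped in a random sample consensus loop. The synthetic features, targets and hyperparameters are placeholders, not the DADSS model or its data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.linear_model import RANSACRegressor

# placeholder data: patient features (e.g. age, weight, dose, time since dose)
# and measured drug concentrations; a real system would use clinical records
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 2] - 0.5 * X[:, 3] + rng.normal(scale=0.3, size=200)
X_std = StandardScaler().fit_transform(X)

# RANSAC repeatedly fits the SVR on random subsets and keeps the consensus set,
# damping the influence of outlying concentration measurements
model = RANSACRegressor(
    estimator=SVR(kernel="rbf", C=10.0, epsilon=0.1),  # `base_estimator` on older scikit-learn
    min_samples=50,           # size of each random subset
    residual_threshold=0.5,   # max residual for a sample to count as an inlier
    random_state=0,
)
model.fit(X_std, y)
predicted_concentration = model.predict(X_std[:5])     # predictions for new patients
```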

Relevance:

20.00%

Publisher:

Abstract:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a numerical forward model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction for one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and using it in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation by 4-17 mm for conductivity changes deep in the brain and those due to epilepsy, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
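
As a sketch of the reconstruction step only, a one-step Tikhonov-regularized linear reconstruction from a sensitivity (Jacobian) matrix; the matrix sizes, regularization weight and random toy data are assumptions for illustration and do not reproduce the study's head model.

```python
import numpy as np

def linear_eit_reconstruction(J, dv, lam=1e-2):
    """One-step linear EIT reconstruction with a sensitivity (Jacobian) matrix.

    J  : (n_measurements, n_elements) sensitivity matrix from the forward model
         (isotropic or anisotropic FEM); dv : boundary-voltage changes.
    Returns the conductivity change per mesh element (zeroth-order Tikhonov)."""
    JtJ = J.T @ J
    reg = lam * np.trace(JtJ) / JtJ.shape[0] * np.eye(JtJ.shape[0])
    return np.linalg.solve(JtJ + reg, J.T @ dv)

# toy example: 64 boundary measurements, 500 mesh elements
rng = np.random.default_rng(1)
J = rng.normal(size=(64, 500))
true_dsigma = np.zeros(500)
true_dsigma[120:130] = -0.1                      # local 10% conductivity decrease
dv = J @ true_dsigma + rng.normal(scale=1e-3, size=64)
dsigma = linear_eit_reconstruction(J, dv)
```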

Relevance:

20.00%

Publisher:

Abstract:

Five selective serotonin reuptake inhibitors (SSRIs) have been introduced recently: citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline. Although, in contrast to tricyclic antidepressants, no therapeutic window has been defined for SSRIs, analytical methods for therapeutic drug monitoring of SSRIs are useful in several instances. SSRIs differ widely in their chemical structure and in their metabolism. The fact that some of them have N-demethylated metabolites, which are also SSRIs, requires methods that allow therapeutic drug monitoring of the parent compounds and of these active metabolites. Most procedures are based on prepurification of the SSRIs by liquid-liquid extraction before they are submitted to separation by chromatographic procedures (high-performance liquid chromatography, gas chromatography, thin-layer chromatography) and detection by various detectors (UV, fluorescence, electrochemical detection, nitrogen-phosphorus detection, mass spectrometry). This literature review shows that most methods allow quantitative determination of SSRIs in plasma in the lower ng/ml range and that they are therefore suitable for therapeutic drug monitoring of this category of drugs.

Relevance:

20.00%

Publisher:

Abstract:

Trabecular bone score (TBS) is a recently developed analytical tool that performs novel grey-level texture measurements on lumbar spine dual X-ray absorptiometry (DXA) images and thereby captures information relating to trabecular microarchitecture. For TBS to usefully add to bone mineral density (BMD) and clinical risk factors in osteoporosis risk stratification, it must be independently associated with fracture risk, readily obtainable and, ideally, reflect a risk that is amenable to osteoporosis treatment. This paper summarizes a review of the scientific literature performed by a Working Group of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis. Low TBS is consistently associated with an increase in both prevalent and incident fractures that is partly independent of both clinical risk factors and areal BMD (aBMD) at the lumbar spine and proximal femur. More recently, TBS has been shown to have predictive value for fracture independent of fracture probabilities derived using the FRAX® algorithm. Although TBS changes with osteoporosis treatment, the magnitude of change is less than that of aBMD of the spine, and it is not clear how change in TBS relates to fracture risk reduction. TBS may also have a role in the assessment of fracture risk in some causes of secondary osteoporosis (e.g., diabetes, hyperparathyroidism and glucocorticoid-induced osteoporosis). In conclusion, there is a role for TBS in fracture risk assessment in combination with both aBMD and FRAX.

Relevance:

20.00%

Publisher:

Abstract:

Many three-dimensional (3-D) structures in rock, which formed during the deformation of the Earth's crust and lithosphere, are controlled by a difference in mechanical strength between rock units and are often the result of a geometrical instability. Such structures are, for example, folds, pinch-and-swell structures (due to necking) or cuspate-lobate structures (mullions). These structures occur from the centimeter to the kilometer scale, and the related deformation processes control the formation of, for example, fold-and-thrust belts and extensional sedimentary basins, or the deformation of the basement-cover interface. The 2-D deformation processes causing these structures are relatively well studied; however, several processes during large-strain 3-D deformation are still incompletely understood. One of these 3-D processes is the lateral propagation of such structures, for example fold and cusp propagation in a direction orthogonal to the shortening direction, or neck propagation in a direction orthogonal to the extension direction. In particular, we are interested in fold nappes, which are recumbent folds with amplitudes usually exceeding 10 km that have presumably been formed by ductile shearing. They often exhibit a constant sense of shearing and a non-linear increase of shear strain towards their overturned limb. The fold axes of the Morcles fold nappe in western Switzerland plunge to the ENE, whereas the fold axes of the more eastern Doldenhorn nappe plunge to the WSW. These opposite plunge directions characterize the Rawil depression (Wildstrubel depression). The Morcles nappe is mainly the result of layer-parallel contraction and shearing. During the compression the massive limestones were more competent than the surrounding marls and shales, which led to the buckling characteristics of the Morcles nappe, especially in the north-dipping normal limb. The Doldenhorn nappe exhibits only a minor overturned fold limb. There are still no 3-D numerical studies which investigate the fundamental dynamics of the formation of the large-scale 3-D structure comprising the Morcles and Doldenhorn nappes and the related Rawil depression. We study the 3-D evolution of geometrical instabilities and fold nappe formation with numerical simulations based on the finite element method (FEM). Simulating geometrical instabilities caused by sharp variations of mechanical strength between rock units requires a numerical algorithm that can accurately resolve material interfaces for large differences in material properties (e.g. between limestone and shale) and for large deformations. Therefore, our FE algorithm combines a numerical contour-line technique and a deformable Lagrangian mesh with re-meshing. With this combined method it is possible to accurately follow the initial material contours with the FE mesh and to accurately resolve the geometrical instabilities. The algorithm can simulate 3-D deformation for a visco-elastic rheology, with the viscous rheology described by a power-law flow law. The code is used to study 3-D fold nappe formation, the lateral propagation of folding, and the lateral propagation of cusps due to an initial half-graben geometry. Thereby, the small initial geometrical perturbations for folding and necking are exactly followed by the FE mesh, whereas the initial large perturbation describing a half graben is defined by a contour line intersecting the finite elements. Further, the 3-D algorithm is applied to 3-D viscous necking during slab detachment.
The results from various simulations are compared with 2-D results and a 1-D analytical solution. -- Many three-dimensional (3-D) structures found in rocks originate from deformation of the Earth's lithosphere. These structures are, for example, folds, pinch-and-swell structures (boudins) or cuspate-lobate structures (mullions), and they occur from the centimetre to the kilometre scale. Mechanically, these structures can be explained by a difference in strength between the different rock units and are generally the result of a geometrical instability. These mechanical differences between the units control not only the types of structures encountered but also the style of deformation (thick-skinned, thin-skinned) and the tectonic setting (foreland basin, foreland fold belt). The two-dimensional (2-D) deformation processes forming these structures are relatively well understood. However, once the third dimension is added, several processes remain incompletely understood during large-scale deformation. One of these processes is the lateral propagation of structures, for example the propagation of folds or mullions in the direction perpendicular to the shortening axis, or the propagation of the necking zones of boudins perpendicular to the extension direction. We are particularly interested in fold nappes, which are thrust sheets in the form of recumbent folds with amplitudes of several kilometres, presumably formed by ductile shearing. Most of the time they display a constant sense of shear and a non-linear increase of strain towards the base of the overturned limb. A well-known example of fold nappes is the Helvetic domain in the western Alps. One of these nappes is the Morcles nappe, whose fold axis plunges E-NE, while on the other side of the Rawil depression (or Wildstrubel depression) the Doldenhorn nappe (the equivalent of the Morcles nappe) has a fold axis plunging W-SW. The particular shape of these nappes is due to the alternation of mechanically strong limestone layers and mechanically weak layers of schists and marls. These mechanical differences between the layers explain the internal folding of the nappe, particularly in the overturned limb of the Morcles nappe. It should also be noted that the development of the overturned limb is not the same on both sides of the Rawil depression: the Morcles nappe has a large overturned limb, whereas the Doldenhorn nappe has almost none. To date, no 3-D numerical study has been carried out to understand the fundamental dynamics of the formation of the Morcles and Doldenhorn nappes and of the Rawil depression. This work presents the first analysis of the 3-D evolution of geometrical instabilities and of fold nappe formation using numerical simulations. Our model is based on the finite element method (FEM), which makes it possible to resolve accurately the interfaces between two materials with very different mechanical properties (for example between limestone and marl layers). In addition, we use a deformable Lagrangian mesh with a re-meshing function (generation of a new mesh). Thanks to this combined method, we can accurately follow the material interfaces and accurately resolve the geometrical instabilities during the deformation of visco-elastic materials described by a non-linear rheology (n > 1). We use this algorithm to understand the formation of fold nappes, the lateral propagation of folding, and the lateral propagation of mullion-type structures caused by a lateral variation in geometry (e.g. a half graben). The algorithm is also used to understand the 3-D dynamics of viscous necking and detachment of the subducting plate in a subduction zone. The results are compared with 2-D models and with the 1-D analytical solution.
 -- Many three-dimensional (3-D) structures found in rocks, formed by deformation of the Earth's crust and lithosphere, are controlled by the contrasting mechanical properties of the rock units and are often the result of geometrical instabilities. Such structures include, for example, folds, pinch-and-swell structures and so-called cuspate-lobate structures (mullions). They occur over a range of scales, from a few centimetres to several kilometres. The processes associated with their formation control the development of mountain belts and sedimentary basins as well as the deformation of the basement-cover contact. The two-dimensional (2-D) deformation processes leading to these structures are already well studied, whereas several processes during strong 3-D deformation remain incompletely understood. One of these 3-D processes is the lateral propagation of the structures described above, such as the lateral propagation of folds and cuspate-lobate structures perpendicular to the shortening direction and the lateral propagation of pinch-and-swell structures orthogonal to the stretching direction. We are particularly interested in fold nappes, recumbent folds with amplitudes of more than 10 km, which presumably form by ductile shearing. They often show a constant sense of shear and a non-linear increase of shear strain in the overturned limb. The fold axes of the Morcles nappe in western Switzerland plunge towards the ENE, while the fold axes of the Doldenhorn nappe further east plunge towards the WSW. These opposite plunge directions characterize the Rawil depression (Wildstrubel depression). The Morcles nappe is mainly the result of shortening and shearing parallel to the sedimentary layers. During shortening the massive limestone behaved more competently than the surrounding marl and schist, which led to the folding of the Morcles nappe, especially in the north-dipping overturned limb. The Doldenhorn nappe, by contrast, exhibits a much smaller overturned limb and a stronger localization of deformation. To date there are no 3-D numerical studies that investigate the fundamental dynamics of the formation of large, strongly deformed 3-D structures such as the Morcles and Doldenhorn nappes and the associated Rawil depression. We investigate the 3-D evolution of geometrical instabilities and the formation of fold nappes using numerical simulations based on the finite element method (FEM). Simulating geometrical instabilities that arise from changes in material properties between different rock units requires a numerical algorithm that can accurately resolve material boundaries with strong contrasts in material properties (for example between limestone units and marl) under strong deformation. To this end, our FE algorithm combines a numerical contour-line technique with a deformable Lagrangian mesh and re-meshing. With this combined method it is possible to follow the initial material boundaries accurately with the FE mesh and to resolve the geometrical instabilities sufficiently. The algorithm can compute visco-elastic 3-D deformation, with the viscous rheology described by a power-law flow law. With this numerical algorithm we investigate the formation of 3-D fold nappes, the lateral propagation of folding, and the lateral propagation of cuspate-lobate structures that form by the shortening of a sediment-filled half graben. The initial geometrical instabilities of the folding are resolved exactly by the FE mesh, whereas the material boundaries of the half graben cut through the finite elements. Furthermore, the 3-D algorithm is applied to necking during 3-D viscous slab detachment and subduction. The 3-D results are compared with 2-D results and a 1-D analytical solution.
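
As a small numerical illustration of the power-law flow law mentioned in the abstract above, a sketch of an effective-viscosity calculation; the reference viscosities, strain rates and exponents are generic illustrative values, not parameters used in the thesis.

```python
import numpy as np

def effective_viscosity(strain_rate_II, eta_ref, n, rate_ref=1e-15):
    """Effective viscosity of a power-law viscous flow law.

    strain_rate_II : second invariant of the strain-rate tensor [1/s]
    eta_ref        : viscosity at the reference strain rate rate_ref [Pa s]
    n              : power-law exponent (n = 1 recovers linear viscosity)"""
    return eta_ref * (strain_rate_II / rate_ref) ** (1.0 / n - 1.0)

# a competent limestone-like layer (n = 3) in a weaker shale-like matrix (n = 1):
# the strain-rate-dependent viscosity contrast drives folding and necking
rates = np.logspace(-16, -13, 4)
layer_eta = effective_viscosity(rates, eta_ref=1e22, n=3.0)
matrix_eta = effective_viscosity(rates, eta_ref=1e20, n=1.0)
```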

Relevance:

20.00%

Publisher:

Abstract:

Although fetal anatomy can be adequately viewed in new multi-slice MR images, many critical limitations remain for quantitative data analysis. To this end, several research groups have recently developed advanced image processing methods, often denoted super-resolution (SR) techniques, to reconstruct a high-resolution (HR) motion-free volume from a set of clinical low-resolution (LR) images. The problem is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. Total Variation energies have attracted much attention in the literature because of their edge-preserving ability, but only standard explicit steepest-gradient techniques have been applied for optimization. In a preliminary work, it was shown that novel fast convex optimization techniques could be successfully applied to design an efficient Total Variation optimization algorithm for the super-resolution problem. In this work, two major contributions are presented. Firstly, we briefly review the Bayesian and variational dual formulations of current state-of-the-art methods dedicated to fetal MRI reconstruction. Secondly, we present an extensive quantitative evaluation of our previously introduced SR algorithm on both simulated fetal and real clinical data (with both normal and pathological subjects). Specifically, we study the robustness of the regularization terms to residual registration errors, and we also present a novel strategy for automatically selecting the weight of the regularization relative to the data fidelity term. Our results show that our TV implementation is highly robust to motion artifacts and that it offers the best trade-off between speed and accuracy for fetal MRI recovery in comparison with state-of-the-art methods.
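
As an illustration of the kind of energy involved, a minimal sketch of a TV-regularized super-resolution objective; the callable-operator interface, the 2-D TV discretization and all names are simplifying assumptions rather than the authors' 3-D implementation.

```python
import numpy as np

def total_variation(x):
    """Isotropic total variation of a 2-D image (the 3-D extension is analogous)."""
    gx = np.diff(x, axis=0, append=x[-1:, :])
    gy = np.diff(x, axis=1, append=x[:, -1:])
    return np.sum(np.sqrt(gx ** 2 + gy ** 2))

def sr_energy(x, lr_images, operators, lam):
    """TV-regularized super-resolution energy.

    Each callable A_k in `operators` stands for the motion, blur and
    down-sampling that map the high-resolution volume x to the k-th
    low-resolution acquisition y_k in `lr_images`."""
    fidelity = sum(0.5 * np.sum((A(x) - y) ** 2)
                   for A, y in zip(operators, lr_images))
    return fidelity + lam * total_variation(x)

# toy usage: two "acquisitions" obtained by 2x down-sampling along each axis
x = np.random.default_rng(0).random((64, 64))
ops = [lambda v: v[::2, :], lambda v: v[:, ::2]]
ys = [A(x) for A in ops]
energy = sr_energy(x, ys, ops, lam=0.1)
```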

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: The decline of malaria and the scale-up of rapid diagnostic tests call for a revision of the Integrated Management of Childhood Illness (IMCI). A new algorithm (ALMANACH) running on mobile technology was developed based on the latest evidence. The objective was to ensure that ALMANACH was safe while keeping a low rate of antibiotic prescription. METHODS: Consecutive children aged 2-59 months with acute illness were managed using ALMANACH (2 intervention facilities) or standard practice (2 control facilities) in Tanzania. Primary outcomes were the proportions of children cured at day 7 and of children who received antibiotics on day 0. RESULTS: 130/842 (15.4%) in the ALMANACH arm and 241/623 (38.7%) in the control arm were diagnosed with an infection in need of antibiotics, while 3.8% and 9.6% had malaria. 815/838 (97.3%; 96.1-98.4%) were cured at D7 using ALMANACH versus 573/623 (92.0%; 89.8-94.1%) using standard practice (p<0.001). Of the 23 children not cured at D7 using ALMANACH, 44% had skin problems, 30% pneumonia, 26% upper respiratory infection and 13% likely viral infection at D0. Secondary hospitalization occurred for one child using ALMANACH and for one child, who eventually died, using standard practice. At D0, antibiotics were prescribed to 15.4% (12.9-17.9%) using ALMANACH versus 84.3% (81.4-87.1%) using standard practice (p<0.001). 2.3% (1.3-3.3%) versus 3.2% (1.8-4.6%) received an antibiotic secondarily. CONCLUSION: Management of children using ALMANACH improved clinical outcome and reduced antibiotic prescription by 80%. This was achieved through more accurate diagnoses and hence better identification of the children who do and do not need antibiotic treatment. Building the algorithm on mobile technology allows easy access and rapid updating of the decision chart. TRIAL REGISTRATION: Pan African Clinical Trials Registry PACTR201011000262218.
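
The reported day-0 prescription rates, their confidence intervals and the roughly 80% relative reduction can be checked with a few lines of arithmetic; the integer numerators below are inferred from the quoted percentages and denominators, so they are approximations rather than trial data.

```python
from math import sqrt

def proportion_ci(k, n, z=1.96):
    """Point estimate and normal-approximation 95% confidence interval."""
    p = k / n
    half = z * sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# day-0 antibiotic prescriptions; numerators inferred from the quoted percentages
p_alm = proportion_ci(130, 842)   # ALMANACH arm: ~15.4% (12.9-17.9%)
p_std = proportion_ci(525, 623)   # control arm:  ~84.3% (81.4-87.1%)
relative_reduction = 1 - p_alm[0] / p_std[0]   # roughly 0.8, i.e. the ~80% reduction
```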

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To review the available knowledge on the epidemiology and diagnosis of acute infections in children aged 2 to 59 months in the primary care setting, and to develop an electronic algorithm for the Integrated Management of Childhood Illness (IMCI) to reach optimal clinical outcome and rational use of medicines. METHODS: A structured literature review in Medline, Embase and the Cochrane Database of Systematic Reviews (CDSR) looked for available estimates of disease prevalence in outpatients aged 2-59 months, and for available evidence on i) the accuracy of clinical predictors and ii) the performance of point-of-care tests for the targeted diseases. A new algorithm for the management of childhood illness (ALMANACH) was designed based on the evidence retrieved and on the results of a study on the etiologies of fever in Tanzanian children attending outpatient clinics. FINDINGS: The major changes in ALMANACH compared to IMCI (2008 version) are the following: i) assessment of 10 danger signs; ii) classification of non-severe children into febrile and non-febrile illness, the latter receiving no antibiotics; iii) classification of pneumonia based on a respiratory rate threshold of 50, assessed twice, for febrile children aged 12-59 months; iv) malaria rapid diagnostic test performed for all febrile children. In the absence of an identified source of fever at the end of the assessment: v) urine dipstick performed for febrile children <2 years to consider urinary tract infection; vi) classification of 'possible typhoid' for febrile children >2 years with abdominal tenderness; and lastly vii) classification of 'likely viral infection' in the case of negative results. CONCLUSION: This smartphone-run algorithm based on new evidence and two point-of-care tests should improve the quality of care of children under 5 years and lead to more rational use of antimicrobials.
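
A deliberately simplified sketch of the decision flow summarized in points i)-vii); the field names, the ordering of the checks and the return labels are illustrative assumptions and do not reproduce the published ALMANACH decision chart.

```python
def classify_illness(child):
    """Simplified sketch of the decision flow; `child` is a dict of findings
    with illustrative keys, not the algorithm's actual data model."""
    if child["danger_signs"]:                       # any of the 10 danger signs
        return "severe illness - refer"
    if not child["fever"]:
        return "non-febrile illness - no antibiotic"
    if child["malaria_rdt_positive"]:               # RDT done for every febrile child
        return "malaria"
    if (12 <= child["age_months"] <= 59
            and child["respiratory_rate_1"] >= 50   # threshold assessed twice
            and child["respiratory_rate_2"] >= 50):
        return "pneumonia"
    # no source of fever identified at the end of the assessment:
    if child["age_months"] < 24 and child["urine_dipstick_positive"]:
        return "urinary tract infection"
    if child["age_months"] >= 24 and child["abdominal_tenderness"]:
        return "possible typhoid"
    return "likely viral infection - no antibiotic"
```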

Relevance:

20.00%

Publisher:

Abstract:

The fight against doping in sports has been governed since 1999 by the World Anti-Doping Agency (WADA), an independent institution behind the implementation of the World Anti-Doping Code (Code). The intent of the Code is to protect clean athletes through the harmonization of anti-doping programs at the international level, with special attention to the detection, deterrence and prevention of doping. A new version of the Code came into force on January 1st, 2015, introducing, among other improvements, longer periods of sanctioning for athletes (up to four years) and measures to strengthen the role of anti-doping investigations and intelligence. To ensure optimal harmonization, five International Standards covering different technical aspects of the Code are also currently in force: the List of Prohibited Substances and Methods (List), Testing and Investigations, Laboratories, Therapeutic Use Exemptions (TUE), and Protection of Privacy and Personal Information. Adherence to these standards is mandatory for all anti-doping stakeholders to be compliant with the Code. Among these documents, the eighth version of the International Standard for Laboratories (ISL), which also came into effect on January 1st, 2015, includes regulations for WADA and ISO/IEC 17025 accreditations and their application to urine and blood sample analysis by anti-doping laboratories. Specific requirements are also described in several Technical Documents or Guidelines in which various topics are highlighted, such as the identification criteria for gas chromatography (GC) and liquid chromatography (LC) coupled to mass spectrometry (MS) techniques (IDCR), measurements and reporting of endogenous androgenic anabolic agents (EAAS), and analytical requirements for the Athlete Biological Passport (ABP).

Relevance:

20.00%

Publisher:

Abstract:

This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles obtained by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS, which measures several steroids simultaneously, was historically considered the standard method for analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content of a sample, has been implemented in several fields, including doping analysis, clinical studies, and in vivo or in vitro toxicology assays. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented, with a focus on their ability to obtain relevant information on the steroid pattern. The future technical requirements for improving steroid analysis will also be presented.

Relevance:

20.00%

Publisher:

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy [1], Total Variation (TV)-based energies [2,3] and, more recently, non-local means [4]. Although TV energies are quite attractive because of their edge-preserving ability, only standard explicit steepest-gradient techniques have been applied to optimize fetal-reconstruction TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
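
A minimal sketch of an accelerated proximal-gradient (FISTA-type) iteration, which attains the O(1/n²) rate quoted above, demonstrated on an L1-regularized toy problem; replacing the proximal step with a TV proximal operator gives the general shape of a fast TV solver, but this is not the authors' algorithm.

```python
import numpy as np

def fista(grad_f, prox_g, x0, step, n_iter=200):
    """Accelerated proximal-gradient (FISTA) iteration with the O(1/n^2) rate.

    grad_f : gradient of the smooth data-fidelity term f
    prox_g : proximal operator of the (possibly non-smooth) regularizer g
    step   : 1 / L, where L is a Lipschitz constant of grad_f"""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox_g(y - step * grad_f(y), step)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + (t - 1.0) / t_new * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# toy usage: least squares plus an L1 penalty (soft-thresholding prox); a TV
# proximal step would take the place of prox_g for the reconstruction problem
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100)
x_true[::20] = 1.0
b = A @ x_true + 0.01 * rng.normal(size=40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_hat = fista(grad_f, prox_g, np.zeros(100), step=1.0 / np.linalg.norm(A, 2) ** 2)
```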