946 results for méthode level-set
Abstract:
We present a controlled image smoothing and enhancement method based on a curvature flow interpretation of the geometric heat equation. Compared to existing techniques, the model has several distinct advantages. (i) It contains just one enhancement parameter. (ii) The scheme naturally inherits a stopping criterion from the image; continued application of the scheme produces no further change. (iii) The method is one of the fastest possible schemes based on a curvature-controlled approach.
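The curvature flow underlying this kind of scheme can be sketched with a plain explicit finite-difference step for the geometric heat equation I_t = κ|∇I|, which diffuses the image only along its level lines. This is a minimal illustration of the basic flow, not the authors' full scheme with its enhancement parameter and stopping criterion; the function name and step size are ours.

```python
import numpy as np

def curvature_flow_step(I, dt=0.1, eps=1e-8):
    """One explicit step of curvature flow I_t = kappa * |grad I| (periodic grid)."""
    Ix = (np.roll(I, -1, 1) - np.roll(I, 1, 1)) / 2.0
    Iy = (np.roll(I, -1, 0) - np.roll(I, 1, 0)) / 2.0
    Ixx = np.roll(I, -1, 1) - 2 * I + np.roll(I, 1, 1)
    Iyy = np.roll(I, -1, 0) - 2 * I + np.roll(I, 1, 0)
    Ixy = (np.roll(np.roll(I, -1, 1), -1, 0) - np.roll(np.roll(I, 1, 1), -1, 0)
           - np.roll(np.roll(I, -1, 1), 1, 0) + np.roll(np.roll(I, 1, 1), 1, 0)) / 4.0
    # kappa * |grad I| = (Ixx*Iy^2 - 2*Ix*Iy*Ixy + Iyy*Ix^2) / (Ix^2 + Iy^2):
    # the second derivative of I in the direction tangent to its level lines.
    num = Ixx * Iy**2 - 2 * Ix * Iy * Ixy + Iyy * Ix**2
    den = Ix**2 + Iy**2 + eps
    return I + dt * num / den
```

Repeated application smooths along edges without diffusing across them, which is why such flows preserve contours better than linear (Gaussian) smoothing.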
Abstract:
Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtaining a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits the field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language called MapScript and gives example program scripts for applications in ecology and hydrology.
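The front-tracking idea can be illustrated with a minimal level set update on a regular grid, evolving φ_t + F|∇φ| = 0 with first-order Godunov upwinding for an expanding front. This is a generic sketch of the method, not MapScript or the paper's GIS operators; all names and parameters are ours.

```python
import numpy as np

def propagate_front(phi, F, dt, steps):
    """Evolve phi_t + F*|grad phi| = 0 with first-order upwinding (F >= 0).

    The moving interface (e.g. a water or dispersal front) is the zero
    contour of phi; grid spacing is taken as 1, boundaries are periodic.
    """
    for _ in range(steps):
        # One-sided differences in each grid direction
        dm0 = phi - np.roll(phi, 1, 0)   # backward difference, axis 0
        dp0 = np.roll(phi, -1, 0) - phi  # forward difference, axis 0
        dm1 = phi - np.roll(phi, 1, 1)
        dp1 = np.roll(phi, -1, 1) - phi
        # Godunov upwind gradient magnitude for an outward-moving front
        grad = np.sqrt(np.maximum(dm0, 0)**2 + np.minimum(dp0, 0)**2
                       + np.maximum(dm1, 0)**2 + np.minimum(dp1, 0)**2)
        phi = phi - dt * F * grad
    return phi
```

Initialising phi as the signed distance to a seed region and taking F from a landscape attribute layer (e.g. conductivity or habitat suitability) gives the kind of map-algebra front propagation the paper describes.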
Abstract:
Water removal in paper manufacturing is an energy-intensive process. The dewatering process generally consists of four stages, of which the first three involve mechanical water removal through gravity filtration, vacuum dewatering and wet pressing. In the fourth stage, water is removed thermally, which is the most expensive stage in terms of energy use. To analyse water removal during a vacuum dewatering process, a numerical model was created using a level-set method. Various 2D structures of the paper model were created in MATLAB code with randomly positioned circular fibres of identical orientation. The model considers the influence of the forming fabric, which supports the paper sheet during dewatering, by using volume forces to represent flow resistance in the momentum equation. The models were used to estimate the dry content of the porous structure for various dwell times. The relation between dry content and dwell time was compared to laboratory data for paper sheets with basis weights of 20 and 50 g/m2 exposed to vacuum levels between 20 kPa and 60 kPa. The comparison showed reasonable agreement for dewatering and air flow rates. The random positioning of the fibres influences the dewatering rate only slightly. To achieve more accurate comparisons, the random orientation of the fibres needs to be considered, as well as the deformation and displacement of the fibres during dewatering.
Abstract:
The analysis of fluid behavior in multiphase flow is very relevant to guaranteeing system safety. The use of equipment to characterise such behavior is subject to factors such as high levels of investment and specialized labor. Applying image processing techniques to flow analysis can be a good alternative; however, little research has been done on this subject. This study therefore aims at developing a new approach to image segmentation, based on the level set method, that connects active contours and prior knowledge. To do so, a shape model of the target object is trained and defined through a point distribution model, and this model is then inserted as one of the extension velocity functions for the evolution of the zero level curve of the level set method. The proposed approach creates a framework consisting of three energy terms and an extension velocity function, λLg(φ) + νAg(φ) + μP(φ) + θf. The first three terms of the equation are the same ones introduced in (Li; Xu; Fox, 2005), and the last term, θf, is based on the object shape representation proposed in this work. Two variations of the method are used: one restricted (Restricted Level Set, RLS) and one unrestricted (Free Level Set, FLS). The first is used for segmenting images whose targets show little variation in shape and pose. The second is used to correctly identify the shape of the bubbles in liquid-gas two-phase flows. The efficiency and robustness of the RLS and FLS approaches are demonstrated on images of liquid-gas two-phase flows and on the HTZ image dataset (Ferrari et al., 2009). The results confirm the good performance of the proposed algorithms (RLS and FLS) and indicate that the approach may be used as an efficient method to validate and/or calibrate the various existing meters of two-phase flow properties, as well as in other image segmentation problems.
Abstract:
This work presents a forest fire simulation technique that uses the level-set method. A partial differential equation is used to deform a surface in which the flame front is embedded. The mathematical foundations of the level-set method are presented. We then explain a reinitialization method that allows real data to be handled robustly and reduces computation time. We then study the effect of obstacles in the fire propagation domain. Finally, the problem of finding the ignition point of a fire is addressed.
Abstract:
This thesis is divided into three chapters. The first explains how to use the level-set method rigorously to simulate forest fires, using Richards' ellipse model as the physical model for propagation. The second presents a new semi-implicit scheme, with a proof of convergence, for solving an anisotropic Hamilton-Jacobi equation. The main advantage of this method is that it allows solutions of "nearby" problems to be reused to speed up the computation. Another application of this scheme is homogenization. The third chapter shows how to use the numerical methods of the first two chapters, together with homogenization theory, to study the influence of small-scale variations in wind speed on the propagation of a forest fire.
Abstract:
IN MANY FACTORIES, the feed chute of the first mill is operated with a high chute level in order to maximise the cane rate through the mill. There is a trend towards controlling chute level within a small control range near the top of the chute, which can result in rapid changes in cane feeding rate to maintain the chute level set point. This paper reviews the theory that predicts higher cane rate with higher chute level and discusses its main weakness: it does not consider the beneficial effect on capacity of cane falling from the top of the chute onto the top surface of the cane mat. An extension to the chute theory model is described that predicts higher capacity with lower chute level because of the effect of the falling cane. The original model and this extended model are believed to be the upper and lower limits of the true effect. The paper reports an experiment that measured the real effect of chute level on capacity, finding that increasing chute level does lead to higher capacity, but that the trend is only about one-third as strong as the original theory predicted. The paper questions whether the benefits of slightly greater capacity outweigh the costs of operating within the small control range near the top of the chute.
Abstract:
The long-term goal of our work is to enable rapid prototyping design optimization to take place on geometries of arbitrary size, in the spirit of a real-time computer game. In recent papers we have reported the integration of a Level Set based geometry kernel with an octree-based cut-Cartesian mesh generator, RANS flow solver and post-processing, all within a single piece of software, and all implemented in parallel with commodity PC clusters as the target. This work has shown that it is possible to eliminate all serial bottlenecks from the CFD process. This paper reports further progress towards our goal; in particular we report on the generation of viscous layer meshes to bridge the body to the flow across the cut cells. The Level Set formulation, which underpins the geometry representation, is used as a natural mechanism to allow rapid construction of conformal layer meshes. The guiding principle is to construct the mesh which most closely approximates the body but remains solvable. This apparently novel approach is described and examples are given.
Abstract:
The background to this review paper is research we have performed over recent years aimed at developing a simulation system capable of handling large-scale, real-world applications implemented in an end-to-end parallel, scalable manner. The particular focus of this paper is the use of a Level Set solid modeling geometry kernel within this parallel framework to enable automated design optimization without topological restrictions and on geometries of arbitrary complexity. Also described is another interesting application of Level Sets: their use in guiding the export of a body-conformal mesh from our basic cut-Cartesian background octree mesh; this permits third-party flow solvers to be deployed. As practical demonstrations, meshes of guaranteed quality are generated and flow-solved for a B747 in full landing configuration, and an automated optimization is performed on a cooled turbine tip geometry. Copyright © 2009 by W.N. Dawes.
Abstract:
The projection method and Sasaki's variational approach are two techniques for obtaining a divergence-free vector field from an arbitrary initial field. Starting from a high-altitude wind velocity, a velocity field is generated on a staggered grid above a topography given by an analytic function. The Cartesian approach known as the Embedded Boundary Method is used to solve a Poisson equation, arising from the projection, on an irregular domain with mixed boundary conditions. The resulting solution corrects the initial field so that it satisfies mass conservation while also accounting for the effects of the terrain geometry. The velocity field thus generated can then be used to propagate a forest fire over the topography with the level set method. The algorithm is described in two and three dimensions, and convergence tests are carried out.
Abstract:
INTRODUCTION: The temporomandibular joint (TMJ) is an extremely complex joint system. The etiology of temporomandibular disorders (TMD) is still uncertain, and whether orthodontic treatment is a causal risk factor has long been debated. This prospective clinical study aims to evaluate the long-term effects of continuous wear of Invisalign® aligners on the TMJ and the muscles of the facial complex. MATERIALS AND METHODS: The study included 43 adolescents and adults aged 13 to 51 (25 women and 18 men). Two of them were excluded because poor cooperation led to discontinuation of the orthodontic treatment. The effects of the aligners over time on the TMJ and facial muscles were evaluated using the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) examination. The number of muscle contractions during sleep was measured objectively with electromyographic (EMG) recordings, and the frequency of tooth grinding and clenching while awake was self-reported by patients using questionnaires. Repeated measures were taken at the following times: before the start of treatment for control data (T1), and two weeks (T2) and six months (T3) after the start of treatment. Numerical data were analyzed with repeated-measures analysis of variance (ANOVA) and the Brunner-Langer method, while nominal data were evaluated with the Cochran-Mantel-Haenszel test. Results were considered significant at p < 0.05. RESULTS AND DISCUSSION: The number of muscle contractions per hour (index) during sleep and their mean duration did not differ significantly across the three nights of EMG recording (Brunner-Langer, p > 0.005).
However, 67% of participants reported grinding or clenching their teeth at night at T2 and 64% at T3, compared with 39% at T1, a significant increase (Cochran-Mantel-Haenszel, p = 0.0112). Forty-four percent of patients reported daytime grinding or clenching at T1, whereas a markedly higher percentage reported it at T2 (66%) and T3 (61%) (Cochran-Mantel-Haenszel, p = 0.0294). At T1, 12% of subjects reported waking with muscle pain, compared with 29% at T2, a significant increase (Cochran-Mantel-Haenszel, p = 0.0347). At T2, there was a significant reduction in maximal mandibular movements in all directions (repeated-measures ANOVA, p < 0.05). In addition, there was a significant increase in the number of painful sites and in pain intensity on palpation of the TMJ and facial muscles in the RDC/TMD examination at T2 compared with T1 and T3 (Brunner-Langer, p < 0.05). CONCLUSION: The present study found no effect of the aligners over time on orofacial activity during sleep as measured objectively with EMG recordings, but a significant increase in the frequency of grinding and clenching self-reported by patients in questionnaires at T2 and T3. At T2, there was a significant increase in symptoms of the TMJ and the muscles of the orofacial complex, but these symptoms returned to baseline over time.
Abstract:
Several methods based on Kriging have recently been proposed for calculating a probability of failure involving costly-to-evaluate functions. A closely related problem is to estimate the set of inputs leading to a response exceeding a given threshold. Now, estimating such a level set—and not solely its volume—and quantifying uncertainties on it are not straightforward. Here we use notions from random set theory to obtain an estimate of the level set, together with a quantification of estimation uncertainty. We give explicit formulae in the Gaussian process set-up and provide a consistency result. We then illustrate how space-filling versus adaptive design strategies may sequentially reduce level set estimation uncertainty.
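A plug-in version of the idea can be sketched with ordinary GP regression: estimate the level set as the set of points whose posterior probability of exceeding the threshold is at least 1/2 (roughly the Vorob'ev median). This is a minimal illustration with a squared-exponential kernel and pointwise probabilities, not the paper's random set machinery or its uncertainty quantification; all names and hyperparameters are ours.

```python
import numpy as np
from math import erf, sqrt

def gp_posterior(X, y, Xs, ell=0.3, sf=1.0, noise=1e-5):
    """1D GP regression with a squared-exponential kernel.
    Returns the posterior mean and standard deviation at test points Xs."""
    def k(A, B):
        return sf**2 * np.exp(-0.5 * ((A[:, None] - B[None, :]) / ell)**2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(Xs, X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = sf**2 - np.sum(v**2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def excursion_probability(mu, sd, T):
    """Pointwise posterior probability that the latent function exceeds T."""
    z = (mu - T) / sd
    return np.array([0.5 * (1 + erf(zi / sqrt(2))) for zi in z])

# Estimated level set: {x : excursion_probability(x) >= 0.5}.
# Sequential designs would add evaluations where this probability is near 0.5,
# i.e. where membership in the level set is most uncertain.
```

Thresholding the excursion probability at other levels gives nested conservative or permissive set estimates, which is the spirit of the uncertainty quantification discussed in the abstract.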
Abstract:
Objective. The aim of this paper is to report the clinical practice changes resulting from strategies to standardise diabetic foot clinical management in three diverse ambulatory service sites in Queensland, Australia. Methods. Multifaceted strategies were implemented in 2008, including: multidisciplinary teams, clinical pathways, clinical training, clinical indicators, and telehealth support. Prior to the intervention, none of the aforementioned strategies were used, except that one site had a basic multidisciplinary team. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical activity (n = 101). A clinical pathway teleform was implemented as a clinical activity analyser in 2008 (n = 327) and followed up in 2009 (n = 406). Pre- and post-implementation data were analysed using Chi-square tests with a significance level set at P < 0.05. Results. There was an improvement in surveillance of the high-risk population of 34% in 2008 and 19% in 2009, and in treating according to risk of 15% in 2009 (P < 0.05). The documentation of all best-practice clinical activities performed improved by 13–66% (P < 0.03). Conclusion. These findings support the use of multifaceted strategies to standardise practice and improve the management of diabetic foot complications in diverse ambulatory services.