982 results for semi-recursive method


Relevance: 80.00%

Abstract:

Driven by the growth of the internet and the semantic web, together with improvements in communication speed and the rapid growth of storage capacity, the volume of data and information rises considerably every day. Because of this, recent years have seen a growing interest in structures for formal knowledge representation with suitable characteristics, such as the ability to organize data and information and to reuse their contents for the generation of new knowledge. Controlled vocabularies, and ontologies in particular, stand out as representation structures with high potential: they allow not only the representation of data but also its reuse for knowledge extraction, coupled with subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies need maintenance. Ontology Learning is the area that studies the update and maintenance of ontologies. The relevant literature already presents first results on automatic ontology maintenance, but the field is still at a very early stage: updating and maintaining an ontology remains a largely human-driven, and therefore cumbersome, task. The generation of new knowledge for ontology growth can be based on Data Mining techniques, an area that studies data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. To verify the applicability of the proposed method, a proof of concept was developed and its results, applied to the building and construction sector, are presented.

Relevance: 80.00%

Abstract:

We present a real data set of claim amounts in which costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods, and we propose Bayesian ways to select the bandwidth and transformation parameters in the univariate case. We indicate how to compare the results of alternative methods, both by looking at the shape of the overall density over its domain and by exploring the density estimates in the right tail.
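The transformation kernel idea can be sketched in a few lines: estimate the density of the transformed data with a Gaussian kernel and map it back with the change-of-variables Jacobian. The paper's real data set is not reproduced here (synthetic lognormal "costs" stand in for it), the log transform is only one choice, and the bandwidth below comes from SciPy's Scott's-rule default rather than the Bayesian selection the paper proposes:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic heavy-tailed "claim costs" standing in for the real data.
claims = rng.lognormal(mean=8.0, sigma=1.2, size=500)

# Transformation KDE: fit a Gaussian kernel density to y = log(x), then
# map back to the cost scale with the change-of-variables Jacobian 1/x.
log_kde = gaussian_kde(np.log(claims))

def claim_density(x):
    x = np.asarray(x, dtype=float)
    return log_kde(np.log(x)) / x

# Sanity check: the estimate carries ~all its mass over the observed log-range.
y = np.linspace(np.log(claims.min()), np.log(claims.max()), 400)
fy = log_kde(y)
mass = float(np.sum(0.5 * (fy[1:] + fy[:-1]) * np.diff(y)))
print(f"probability mass over the data range: {mass:.3f}")
```

The transformation step is what makes the kernel estimate behave well in the right tail, where an untransformed fixed-bandwidth kernel badly oversmooths heavy-tailed cost data.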

Relevance: 80.00%

Abstract:

We have studied human movement and looked for ways to create these movements in real time in digital environments, so that the work to be carried out by artists and animators is reduced. We surveyed the character-animation techniques currently found in the entertainment industry, as well as the main lines of research, studying in detail the most widely used technique, motion capture. Motion capture records the movements of a person by means of optical sensors, magnetic sensors and video cameras. This information is stored in files that can later be played back by a character in real time in a digital application. Every recorded movement must be associated with a character; this is the rigging process. One of the points we worked on was the creation of a system that associates the skeleton with the character's mesh semi-automatically, reducing the animator's work in this process. In real-time applications such as virtual reality, the environment the characters inhabit is increasingly simulated using Newton's laws, so that any change in the movement of a body results from a force applied to it. Motion capture does not scale well in these environments, because it cannot create new realistic animations, derived from the recorded one, that depend on interaction with the environment. The final objective of our work has been to create animations from forces in real time, just as happens in reality. To this end, we introduced a muscle model and a balance system on the character, so that it can respond realistically to interactions with the environment simulated through Newton's laws.
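The semi-automatic skeleton-to-mesh association can be sketched with the simplest possible binding heuristic: inverse-distance weights to the nearest bones. This is a crude stand-in for the thesis's actual method (real rigging systems typically use heat diffusion or geodesic distance rather than Euclidean distance), but it shows the shape of the output, a per-vertex weight matrix for linear blend skinning:

```python
import numpy as np

def skin_weights(vertices, bone_points, k=2, eps=1e-8):
    """Inverse-distance weights of each vertex to its k nearest bones.

    Crude stand-in for a semi-automatic rig binding: production systems
    use heat diffusion or geodesic distance, not Euclidean distance.
    """
    # Pairwise vertex-to-bone Euclidean distances, shape (n_verts, n_bones).
    d = np.linalg.norm(vertices[:, None, :] - bone_points[None, :, :], axis=2)
    w = np.zeros_like(d)
    nearest = np.argsort(d, axis=1)[:, :k]
    rows = np.arange(len(vertices))[:, None]
    inv = 1.0 / (d[rows, nearest] + eps)
    w[rows, nearest] = inv / inv.sum(axis=1, keepdims=True)
    return w  # each row sums to 1: a valid linear-blend-skinning weight row

verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 1.0, 0.0]])
bones = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
w = skin_weights(verts, bones)
print(w)
```

A vertex sitting on a bone gets weight ~1 for that bone; the vertex equidistant from both bones gets an even 0.5/0.5 split, which is exactly the blending a deformation system needs at a joint.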

Relevance: 80.00%

Abstract:

Introduction: Vertebral fracture is one of the major osteoporotic fractures, and it unfortunately very often goes undetected. In addition, it is well known that a prevalent vertebral fracture dramatically increases the risk of future fractures. Instant Vertebral Assessment (IVA) was introduced in DXA devices a few years ago to ease the detection of such fractures when routine DXA is performed. To make correct use of this tool, the ISCD has provided clinical recommendations on when and how to use it. The aim of our study was to evaluate the ISCD guidelines in clinical routine patients and to see how often IVA may change patient management. Methods: During two months (March and April 2010), a medical questionnaire was systematically given to our clinical routine patients to check the validity of the ISCD IVA recommendations in our population. In addition, all women had BMD measurements at the AP spine, femur and 1/3 radius using a Discovery A system (Hologic, Waltham, USA). When appropriate, IVA measurement was performed on the same DXA system and centrally evaluated for fracture status by two trained physicians according to the semi-quantitative method of Genant. The reading was performed, when possible, between L5 and T4. Results: Of 210 women seen in the consultation, 109 (52%) (mean age 68.2 ± 11.5 years) fulfilled the criteria for an IVA measurement. Of these 109 women, 43 (39.4%) had osteoporosis at one of the three skeletal sites and 31 (28.4%) had at least one vertebral fracture. 14.7% of the women had both osteoporosis and at least one vertebral fracture, classifying them as having "severe osteoporosis", while 46.8% had neither osteoporosis nor a vertebral fracture. 24.8% of the women had osteoporosis but no vertebral fracture, while 13.8% had a vertebral fracture without densitometric osteoporosis ("clinical osteoporosis"). Conclusion: In 52% of our patients, IVA was needed according to the ISCD criteria. In half of them the IVA result influenced patient management, either by changing the type of treatment or simply by classifying the patient as having "clinical osteoporosis". IVA appears to be an important tool in clinical routine, but unfortunately it is not yet widely used in most centers.
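The four subgroup percentages reported above are mutually consistent, which a quick arithmetic check confirms: starting from the 109 women, the 43 with osteoporosis, the 31 with fracture, and the 14.7% (16 women) with both, every cell of the osteoporosis-by-fracture table falls out, and the remaining 13.8% are therefore the women with a fracture but no densitometric osteoporosis:

```python
n = 109
osteoporosis = 43          # reported as 39.4% of 109
fracture = 31              # reported as 28.4% of 109
both = round(0.147 * n)    # 14.7% -> 16 women ("severe osteoporosis")

osteo_only = osteoporosis - both                 # 27 women -> 24.8%
fracture_only = fracture - both                  # 15 women -> 13.8%
neither = n - osteo_only - fracture_only - both  # 51 women -> 46.8%

for label, count in [("both", both), ("osteoporosis only", osteo_only),
                     ("fracture only", fracture_only), ("neither", neither)]:
    print(f"{label}: {count} ({100 * count / n:.1f}%)")
```

The recomputed percentages (14.7%, 24.8%, 13.8%, 46.8%) match the abstract exactly.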


Relevance: 80.00%

Abstract:

Background: MLPA is a potentially useful semi-quantitative method to detect copy-number alterations in targeted regions. In this paper we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random-error variability observed in each test sample. Results: Through simulation studies we show that our proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We illustrate the method using a controlled MLPA assay in which targeted regions vary in copy number in individuals suffering from different disorders such as Prader-Willi syndrome, DiGeorge syndrome or autism, where it shows the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds for deciding whether a region is altered. These thresholds are specific to each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data reveal.
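The core idea, a per-sample band outside which a probe is called altered, can be sketched without the mixed-model machinery. The simplified stand-in below uses mean ± k·sd of a sample's own control probes instead of the paper's model-based tolerance intervals, but it shows the key property: the threshold adapts to each sample's own random-error variability. The ratio values and control indices are invented for illustration:

```python
import numpy as np

def altered_probes(test_ratios, reference_idx, k=3.0):
    """Flag probes whose normalized ratio falls outside a per-sample band.

    Simplified stand-in for mixed-model tolerance intervals: the band is
    mean +/- k*sd of this sample's control probes, so the threshold
    widens or narrows with the sample's own measurement noise.
    """
    ref = test_ratios[reference_idx]
    mu, sd = ref.mean(), ref.std(ddof=1)
    lower, upper = mu - k * sd, mu + k * sd
    return [i for i, r in enumerate(test_ratios) if not lower <= r <= upper]

# Hypothetical normalized probe ratios: ~1.0 is copy-neutral,
# ~0.5 suggests a deletion, ~1.5 suggests a duplication.
ratios = np.array([1.02, 0.98, 1.01, 0.99, 1.03, 0.52, 1.49])
controls = [0, 1, 2, 3, 4]   # probes assumed copy-neutral in this sample
flagged = altered_probes(ratios, controls)
print(flagged)
```

A noisy sample with scattered control ratios gets a wider band and fewer spurious calls, which is precisely the sensitivity/specificity trade-off the per-sample thresholds are meant to improve.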

Relevance: 80.00%

Abstract:

This paper presents a theoretical study of the interaction of small molecules with ZnO surfaces. The AM1 semi-empirical method was used to optimize the geometric parameters of the adsorbed molecules. The optimized AM1 structures were then used in ab initio RHF calculations with the 3-21G* basis set. The interactions of CO, CO2 and NH3 molecules with (ZnO)22 and (ZnO)60 cluster models were studied. We analyzed the interaction energy, SCF orbital energies, Mulliken charges and the density of states (DOS).

Relevance: 80.00%

Abstract:

This work presents electrochemical and quantum chemical studies of the oxidation of the tricyclic antidepressant amitriptyline (AM) employing a carbon-polyurethane composite electrode (GPU) in 0.1 mol L-1 BR buffer. The electrochemical results showed that the oxidation of AM occurs irreversibly at potentials close to 830 mV, with the loss of one electron and one proton, and is controlled by adsorption of reagent and product. According to the PM3 results, atom C16 is the most probable site of oxidation of AM, since it shows the largest charge variation.

Relevance: 80.00%

Abstract:

The synthesis and physico-chemical properties of new 6-acetylamino- or 6-benzoylamino-2-benzylidene-4-methyl-4H-benzo[1,4]thiazin-3-ones and 6-benzoylamino- or 6-nitro-2-benzylidene-4H-benzo[1,4]thiazin-3-ones are described. These benzylidene benzothiazine compounds were prepared by Knoevenagel condensation with benzaldehydes. The configurations and conformations of the benzylidene benzothiazine derivatives were optimised using the AM1 semi-empirical method.

Relevance: 80.00%

Abstract:

Molecular modeling enables students to visualize the abstract relationships underlying the theoretical concepts that explain experimental data at the molecular and atomic levels. With this aim we used the free software "Arguslab 4.0.1" (semi-empirical method) to study the reaction of 1-chloropropane with ethoxide in solution, known to lead to methyl propyl ether through the SN2 mechanism and to propene through the E2 mechanism. This tool allows users to calculate some properties (e.g. heat of formation or electric charges) and to produce 3D images (molecular geometry, electrostatic potential surface, etc.) that make the factors underlying the reaction's progress, which are related to the structure of the reagents, and the process kinetics clearer and easier for students to understand.

Relevance: 80.00%

Abstract:

The main objective of this work is to analyze the importance of gas-solid interface transfer of turbulent kinetic energy for the accuracy of fluid-dynamic predictions of Circulating Fluidized Bed (CFB) reactors. CFB reactors are used in a variety of industrial applications related to combustion, incineration and catalytic cracking. In this work a two-dimensional fluid-dynamic model for gas-particle flow is used to compute the porosity, pressure and velocity fields of both phases in 2-D axisymmetric cylindrical coordinates. The model is based on the two-fluid approach, in which both phases are considered continuous and fully interpenetrating. CFB processes are essentially turbulent. The effective stress on each phase is modeled as that of a Newtonian fluid, where the effective gas viscosity is calculated from the standard k-epsilon turbulence model and the transport coefficients of the particulate phase are calculated from the kinetic theory of granular flow (KTGF). This work shows that turbulence transfer between the phases is very important for a better representation of the fluid dynamics of CFB reactors, especially for systems with internal recirculation and high gradients of particle concentration. Two systems with different characteristics were analyzed and the results compared with experimental data available in the literature. The results were obtained with a computer code developed by the authors. The numerical solution uses the finite volume method with a collocated grid, the hybrid interpolation scheme, the false time step strategy and the SIMPLEC (Semi-Implicit Method for Pressure-Linked Equations - Consistent) algorithm.
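The hybrid interpolation scheme mentioned in the numerical solution blends central differencing and upwinding according to the cell Peclet number. A one-line textbook sketch (not the authors' code) for the neighbor coefficient of a control-volume face:

```python
def hybrid_coefficient(D, F):
    """Neighbor coefficient a_W of the hybrid scheme for a west face:
    central differencing when the cell Peclet number |F/D| < 2,
    pure upwind otherwise.  D: diffusive conductance of the face,
    F: convective mass flux through the face."""
    return max(F, D + F / 2.0, 0.0)

# Low Peclet number (diffusion-dominated): central-difference value D + F/2.
print(hybrid_coefficient(1.0, 0.5))   # 1.25
# High Peclet number (convection-dominated): pure upwind value F.
print(hybrid_coefficient(1.0, 4.0))   # 4.0
# Strong flow away from the neighbor: the coefficient drops to zero.
print(hybrid_coefficient(1.0, -4.0))  # 0.0
```

The max() form is what makes the scheme robust at high Peclet numbers, where pure central differencing would produce negative coefficients and unbounded solutions.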

Relevance: 80.00%

Abstract:

Schwann cell disturbance followed by segmental demyelination in the peripheral nervous system occurs in diabetic patients. Since Schwann cell and oligodendrocyte remyelination in the central nervous system is a well-known event in the ethidium bromide (EB) demyelinating model, the aim of this investigation was to determine the behavior of both cell types after local EB injection into the brainstem of streptozotocin-diabetic rats. Adult male Wistar rats received a single intravenous injection of streptozotocin (50 mg/kg) and were submitted 10 days later to a single injection of 10 µL of 0.1% (w/v) EB or 0.9% saline solution into the cisterna pontis. Ten microliters of 0.1% EB was also injected into non-diabetic rats. The animals were anesthetized and perfused through the heart 7 to 31 days after EB or saline injection, and brainstem sections were collected and processed for light and transmission electron microscopy. The final balance of myelin repair in diabetic and non-diabetic rats at 31 days was compared using a semi-quantitative method. Diabetic rats presented delayed macrophage activity and less remyelination than non-diabetic rats. Although oligodendrocytes were the major remyelinating cells in the brainstem, Schwann cells invaded EB-induced lesions, first appearing at 11 days in non-diabetic rats and by 15 days in diabetic rats. The results indicate that short-term streptozotocin-induced diabetes hindered both oligodendrocyte and Schwann cell remyelination (mean remyelination scores of 2.57 ± 0.77 for oligodendrocytes and 0.67 ± 0.5 for Schwann cells) compared to non-diabetic rats (3.27 ± 0.85 and 1.38 ± 0.81, respectively).

Relevance: 80.00%

Abstract:

Introduction: The chronicity of rhinosinusitis, its resistance to antibiotics, and its acute exacerbations suggest that biofilms are involved in chronic rhinosinusitis. Objectives: We evaluated the capacity of Pseudomonas aeruginosa, coagulase-negative staphylococci and Staphylococcus aureus to form biofilms in an in vitro assay, and whether this capacity is related to the course of the disease. We also evaluated in vitro the effect of moxifloxacin, an antibiotic used in the treatment of chronic rhinosinusitis, on mature Staphylococcus aureus biofilms. Methods: Thirty-one bacterial strains were isolated from 19 patients with chronic rhinosinusitis who had undergone at least one endoscopic sinus surgery. The course of the disease was rated as "good" or "poor" according to the clinician's assessment. Biofilm production was assessed by crystal violet staining. We evaluated biofilm viability after treatment with moxifloxacin; these results were confirmed by confocal laser scanning microscopy with LIVE/DEAD BacLight staining. Results and Conclusion: Twenty-two of the 31 strains produced a biofilm. Greater biofilm production by Pseudomonas aeruginosa and Staphylococcus aureus was associated with a poor outcome, suggesting a role for biofilms in the pathogenesis of chronic rhinosinusitis. Treatment with moxifloxacin at a concentration of 1000 times the minimum inhibitory concentration reduced the number of viable bacteria by 2 to 2.5 log. These concentrations (100 µg/ml - 200 µg/ml) are easy to reach in topical solutions.
The results of our study suggest that the topical use of concentrations above the minimum inhibitory concentration may open avenues of research into new treatments that could benefit patients with severe forms of chronic rhinosinusitis, especially after endoscopic sinus surgery.

Relevance: 80.00%

Abstract:

This thesis addresses the reconstruction of a 3D model from several images. The 3D model is built with a hierarchical voxel representation in the form of an octree. A cube enclosing the 3D model is computed from the camera positions; this cube contains the voxels and defines the position of virtual cameras. The 3D model is initialized by a convex hull based on the uniform background color of the images, which allows the periphery of the 3D model to be carved away. A weighted cost is then computed to evaluate how well each voxel belongs to the object's surface; this cost takes into account the similarity of the pixels coming from each image associated with the virtual camera. Finally, for each virtual camera, a surface is computed from the cost using the SGM method. SGM takes the neighborhood into account when computing depth, and this thesis presents a variation of the method that accounts for voxels previously excluded from the model by the initialization step or by carving with another surface. The computed surfaces are then used to carve and finalize the 3D model. This thesis presents an innovative combination of steps for creating a 3D model from an existing set of images, or from a series of images captured in sequence, which can lead to 3D model creation in real time.
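The silhouette-based initialization step can be sketched in miniature: a voxel survives only if it projects inside the foreground mask in every view. The sketch below is a deliberate simplification of the thesis's pipeline: it uses orthographic projections along coordinate axes instead of real calibrated cameras, invented 4x4 silhouette masks, and no photo-consistency cost or SGM refinement afterwards:

```python
import numpy as np

def carve(occupancy_shape, silhouettes):
    """Keep a voxel only if it projects inside the silhouette in every view.

    silhouettes maps a projection axis (0, 1 or 2) to a 2-D boolean mask
    over the two remaining axes; real systems project through calibrated
    camera matrices instead of axis-aligned orthographic views.
    """
    model = np.ones(occupancy_shape, dtype=bool)
    for axis, sil in silhouettes.items():
        # A voxel survives a view if its projection column hits the mask.
        mask = np.expand_dims(sil, axis=axis)
        model &= np.broadcast_to(mask, occupancy_shape)
    return model

# A 4x4x4 grid seen from two axis-aligned views (hypothetical masks).
sil_top = np.zeros((4, 4), dtype=bool); sil_top[1:3, 1:3] = True    # along axis 0
sil_front = np.zeros((4, 4), dtype=bool); sil_front[1:3, :] = True  # along axis 1
model = carve((4, 4, 4), {0: sil_top, 1: sil_front})
print(int(model.sum()))
```

Each added view can only remove voxels, never add them, which is why the carved volume is a conservative outer bound that the subsequent cost-based surfaces then refine.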

Relevance: 80.00%

Abstract:

This study focuses on the onset of the southwest monsoon over Kerala. The India Meteorological Department (IMD) has been using a semi-objective method to define the monsoon onset. The main objectives of the study are to understand the monsoon onset processes; to simulate the monsoon onset in a GCM using as input the atmospheric conditions and sea surface temperature 10 days before the onset; to develop a method for medium-range prediction of the date of onset of the southwest monsoon over Kerala; and to examine the possibility of objectively defining the date of Monsoon Onset over Kerala (MOK). The study gives a broad description of regional monsoon systems and monsoon onsets over Asia and Australia. The Asian monsoon includes two separate subsystems, the Indian monsoon and the East Asian monsoon. It is seen from this study that the durations of the different phases of the onset process depend on the period of the ISO. Based on the study of the monsoon onset process, modeling studies can be done for a better understanding of ocean-atmosphere interaction, especially that associated with the warm pool in the Bay of Bengal and the Arabian Sea.