73 results for Tooth extraction
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-base matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by deriving an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. The new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, since it is computationally preferable to decompose a complex model into a few submodels rather than into a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
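As a rough illustration of the kind of computation described above (a sketch, not the paper's implementation): each candidate rule's weighted regression matrix is formed from its membership values over the training data, and rules are ranked by an A-optimality-style identifiability score. The Gaussian membership functions, the eps regularizer and all names below are assumptions made for the example.

```python
import numpy as np

def gaussian_membership(x, centre, width):
    """Membership of sample x in one fuzzy rule (Gaussian form chosen for illustration)."""
    return np.exp(-np.sum((x - centre) ** 2) / (2.0 * width ** 2))

def rule_weighted_regressor(X, centre, width):
    """M_i = W_i @ X, where W_i = diag of the rule's memberships over the training data."""
    mu = np.array([gaussian_membership(x, centre, width) for x in X])
    return mu[:, None] * X  # same as np.diag(mu) @ X, without forming the diagonal matrix

def a_optimality_score(M, eps=1e-8):
    """A-optimality-style identifiability measure: trace((M^T M + eps*I)^-1).
    Smaller values indicate a better-conditioned, more identifiable rule subspace."""
    G = M.T @ M + eps * np.eye(M.shape[1])
    return np.trace(np.linalg.inv(G))

# Rank candidate rules (centre/width pairs) by identifiability to seed an initial rule-base.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                       # toy input regression matrix
candidates = [(rng.normal(size=3), 1.0) for _ in range(10)]
scores = [a_optimality_score(rule_weighted_regressor(X, c, w)) for c, w in candidates]
print(np.argsort(scores)[:4])                       # indices of the most identifiable rules
```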
Abstract:
A new robust neurofuzzy model construction algorithm is introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximized model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method is introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule base energy level. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, is extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
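A minimal sketch of the flavour of such a selection step, assuming a simplified combination of locally regularized orthogonal least squares with a D-optimality-style term in the selection score; the exact cost function, stopping rule and names below are illustrative, not taken from the paper.

```python
import numpy as np

def regularized_ols_doptimality(P, y, beta=1e-2, lam=1e-3, n_select=5):
    """Forward orthogonal least squares with local regularization and a
    D-optimality-style term (a simplified sketch, not the paper's exact criterion).

    P : (N, M) candidate regressor matrix, y : (N,) target.
    At each step the candidate maximizing
        (regularized error reduction ratio) + beta * log(w^T w)
    is orthogonalized against the already selected set (classical Gram-Schmidt).
    """
    N, M = P.shape
    selected, W = [], []
    yTy = float(y @ y)
    for _ in range(n_select):
        best, best_score, best_w = None, -np.inf, None
        for j in range(M):
            if j in selected:
                continue
            w = P[:, j].astype(float)
            for wk in W:                       # remove components along selected directions
                w = w - (wk @ P[:, j]) / (wk @ wk) * wk
            wTw = float(w @ w)
            if wTw < 1e-12:
                continue
            g = (w @ y) / (wTw + lam)          # locally regularized coefficient
            score = (g * g * wTw) / yTy + beta * np.log(wTw)
            if score > best_score:
                best, best_score, best_w = j, score, w
        if best is None or best_score <= 0.0:  # the weighting beta stops the procedure
            break
        selected.append(best)
        W.append(best_w)
    return selected
```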
Abstract:
This work presents a new method for activity extraction and reporting from video based on the aggregation of fuzzy relations. Trajectory clustering is first employed, mainly to discover the points of entry and exit of mobile objects appearing in the scene. In a second step, proximity relations between the resulting clusters of detected mobile objects and contextual elements of the scene are modeled using fuzzy relations. These can then be aggregated using standard soft-computing algebra. A clustering algorithm based on the transitive closure of the fuzzy relations allows the structure of the scene to be built and characterises the different ongoing activities within it. Discovered activity zones can be reported as activity maps at different granularities through analysis of the transitive closure matrix. Taking advantage of the soft relation properties, activity zones and related activities can be labeled in a more human-like language. We present results obtained on real videos corresponding to apron monitoring at Toulouse airport in France.
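The transitive closure and granularity steps can be illustrated with a small self-contained sketch using max-min composition on a toy proximity matrix; the example relation and threshold are invented, not taken from the paper.

```python
import numpy as np

def max_min_composition(R, S):
    """Max-min composition (R o S) of two fuzzy relations given as matrices."""
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

def transitive_closure(R, max_iter=100):
    """Max-min transitive closure: repeat T = max(T, T o T) until it stabilizes."""
    T = R.copy()
    for _ in range(max_iter):
        T_next = np.maximum(T, max_min_composition(T, T))
        if np.allclose(T_next, T):
            break
        T = T_next
    return T

def alpha_cut_clusters(T, alpha):
    """Group elements whose closure similarity exceeds alpha (one granularity level)."""
    n = T.shape[0]
    labels = -np.ones(n, dtype=int)
    cluster = 0
    for i in range(n):
        if labels[i] == -1:
            labels[T[i] >= alpha] = cluster
            cluster += 1
    return labels

# Example: a symmetric, reflexive proximity relation between four scene elements.
R = np.array([[1.0, 0.8, 0.2, 0.1],
              [0.8, 1.0, 0.3, 0.1],
              [0.2, 0.3, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
T = transitive_closure(R)
print(alpha_cut_clusters(T, alpha=0.7))   # coarser or finer activity maps via alpha
```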
Abstract:
Generalizing the notion of an eigenvector, invariant subspaces are frequently used in the context of linear eigenvalue problems, leading to conceptually elegant and numerically stable formulations in applications that require the computation of several eigenvalues and/or eigenvectors. Similar benefits can be expected for polynomial eigenvalue problems, for which the concept of an invariant subspace needs to be replaced by the concept of an invariant pair. Little has been known so far about numerical aspects of such invariant pairs. The aim of this paper is to fill this gap. The behavior of invariant pairs under perturbations of the matrix polynomial is studied and a first-order perturbation expansion is given. From a computational point of view, we investigate how to best extract invariant pairs from a linearization of the matrix polynomial. Moreover, we describe efficient refinement procedures directly based on the polynomial formulation. Numerical experiments with matrix polynomials from a number of applications demonstrate the effectiveness of our extraction and refinement procedures.
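As a simplified illustration of extracting an invariant pair from a linearization (not the paper's extraction or refinement procedure), the first companion pencil of a matrix polynomial can be built, a few eigenpairs selected, and the defining residual sum_i A_i X S^i checked:

```python
import numpy as np
from scipy.linalg import eig

def companion_pencil(coeffs):
    """First companion linearization (C0, C1) of P(lam) = sum_i coeffs[i] * lam**i,
    so that P's eigenvalues are the generalized eigenvalues of C0 v = lam * C1 v.
    coeffs = [A0, A1, ..., Ad] with square (n, n) blocks and Ad nonsingular."""
    d = len(coeffs) - 1
    n = coeffs[0].shape[0]
    C0 = np.zeros((d * n, d * n), dtype=complex)
    C1 = np.eye(d * n, dtype=complex)
    C0[:n, :] = -np.hstack([coeffs[i] for i in range(d - 1, -1, -1)])
    C0[n:, :-n] = np.eye((d - 1) * n)
    C1[:n, :n] = coeffs[d]
    return C0, C1

def extract_invariant_pair(coeffs, k):
    """Take k eigenpairs of the linearization; the bottom n rows of the eigenvectors
    give X and S = diag(selected eigenvalues)."""
    n = coeffs[0].shape[0]
    C0, C1 = companion_pencil(coeffs)
    lam, V = eig(C0, C1)
    idx = np.argsort(np.abs(lam))[:k]          # e.g. keep the k smallest eigenvalues
    return V[-n:, idx], np.diag(lam[idx])

def residual(coeffs, X, S):
    """Residual sum_i A_i @ X @ S**i, numerically zero for an invariant pair."""
    R = np.zeros_like(X, dtype=complex)
    Spow = np.eye(S.shape[0], dtype=complex)
    for Ai in coeffs:
        R += Ai @ X @ Spow
        Spow = Spow @ S
    return R

# Quadratic example P(lam) = A2 lam^2 + A1 lam + A0 with random coefficients.
rng = np.random.default_rng(1)
coeffs = [rng.normal(size=(4, 4)) for _ in range(3)]
X, S = extract_invariant_pair(coeffs, k=2)
print(np.linalg.norm(residual(coeffs, X, S)))   # close to machine precision
```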
Abstract:
The extraction of design data for the lowpass dielectric multilayer according to Tschebysheff performance is described. The extraction proceeds initially by analogy with electric-circuit design and is then given numerical refinement, which is also described. Agreement with the Tschebysheff desideratum is satisfactory. The multilayers extracted by this procedure are of fractional thickness, symmetric with regard to their central layers.
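The paper's extraction procedure is not reproduced here; purely as an illustration of the electric-circuit analogue it starts from, an equal-ripple (Tschebysheff/Chebyshev) lowpass prototype can be generated with scipy. The order, ripple and cutoff below are arbitrary example values, not taken from the paper.

```python
import numpy as np
from scipy import signal

# Chebyshev Type I (equal-ripple) analog lowpass prototype: the electric-circuit
# design the multilayer extraction proceeds by analogy with. Example values only.
order, ripple_db, cutoff = 5, 0.5, 1.0
b, a = signal.cheby1(order, ripple_db, cutoff, btype='low', analog=True)
w, h = signal.freqs(b, a, worN=np.logspace(-1, 1, 200))
stop = np.argmin(np.abs(w - 2 * cutoff))
print(f"passband ripple ~{ripple_db} dB, "
      f"attenuation at 2x cutoff: {-20 * np.log10(abs(h[stop])):.1f} dB")
```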
Abstract:
Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms depend not only on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the freshly added, more readily available dPAHs and the historic, more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by the chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods for predicting bioavailability.
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes the paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary method considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
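A minimal sketch of the primary method's general idea, assuming a toy synonym table in place of a full lexical resource: n-grams are mapped to synonym-group representatives so that phrases sharing an underlying theme are counted together. The SYNONYMS dictionary and the frequency-based scoring below are placeholders, not the paper's resources or ranking.

```python
from collections import Counter

# Tiny placeholder synonym table; a real system would use a full lexical resource.
SYNONYMS = {
    "data": "information", "information": "information",
    "mining": "discovery", "discovery": "discovery",
    "databases": "data store", "repository": "data store",
}

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def canonical(phrase):
    """Map each word of a phrase to its synonym-group representative."""
    return tuple(SYNONYMS.get(w, w) for w in phrase)

def keyphrases(text, max_n=3, top_k=5):
    """Count n-grams by their canonical (synonym-grouped) form and return the most
    frequent groups as candidate keyphrases, represented by one surface form each."""
    tokens = text.lower().split()
    groups, surface = Counter(), {}
    for n in range(1, max_n + 1):
        for gram in ngrams(tokens, n):
            key = canonical(gram)
            groups[key] += 1
            surface.setdefault(key, " ".join(gram))
    return [(surface[k], c) for k, c in groups.most_common(top_k)]

print(keyphrases("knowledge discovery in databases relies on data mining of databases"))
```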
Abstract:
Assessment of the risk to human health posed by contaminated land may be seriously overestimated if reliant on total pollutant concentration. In vitro extraction tests, such as the physiologically based extraction test (PBET), imitate the physicochemical conditions of the human gastro-intestinal tract and offer a more practicable alternative for routine testing purposes. However, even though passage through the colon accounts for approximately 80% of the transit time through the human digestive tract, and the typical contents of the colon in vivo are a carbohydrate-rich aqueous medium with the potential to promote desorption of organic pollutants, PBET comprises stomach and small intestine compartments only. Through the addition of an eight-hour colon compartment to PBET and the use of a carbohydrate-rich fed-state medium, we demonstrated that colon-extended PBET (CE-PBET) increased assessments of soil-bound PAH bioaccessibility by up to 50% in laboratory soils and a factor of 4 in field soils. We attribute this increased bioaccessibility to a combination of the additional extraction time and the presence of carbohydrates in the colon compartment, both of which favor PAH desorption from soil. We propose that future assessments of the bioaccessibility of organic pollutants in soils using physiologically based extraction tests should include a colon compartment as in CE-PBET.
Abstract:
Meadowsweet was extracted in water at a range of temperatures (60–100 °C), and the total phenols, tannins, quercetin, salicylic acid content and colour were analysed. The extraction of total phenols followed pseudo first-order kinetics; the rate constant (k) increased from 0.09 ± 0.02 min⁻¹ to 0.44 ± 0.09 min⁻¹ as the temperature increased from 60 to 100 °C. An increase in temperature from 60 to 100 °C increased the concentration of total phenols extracted from 39 ± 2 to 63 ± 3 mg g⁻¹ gallic acid equivalents, although it did not significantly affect the proportion of tannin and non-tannin fractions. The extraction of quercetin and salicylic acid from meadowsweet also followed pseudo first-order kinetics, with the rate constant of both compounds increasing with temperature up to 90 °C. Therefore, aqueous extraction of meadowsweet at temperatures at or above 90 °C for 15 min yields extracts high in phenols, which may be added to beverages.
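The pseudo first-order model and rate-constant estimation can be illustrated with a short fit on synthetic data; the sampling times and noise below are invented, and only the model form C(t) = C_eq(1 − e^(−kt)) and the approximate magnitudes follow the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, c_eq, k):
    """Pseudo first-order extraction: C(t) = C_eq * (1 - exp(-k * t))."""
    return c_eq * (1.0 - np.exp(-k * t))

# Synthetic extraction curve roughly in the reported range at 100 °C
# (C_eq ~ 63 mg/g GAE, k ~ 0.44 min^-1); times and noise are illustrative only.
t = np.array([0, 2, 5, 10, 15, 20, 30], dtype=float)            # minutes
c = pseudo_first_order(t, 63.0, 0.44) + np.random.default_rng(2).normal(0, 1.0, t.size)

(c_eq_hat, k_hat), _ = curve_fit(pseudo_first_order, t, c, p0=(50.0, 0.1))
print(f"fitted C_eq = {c_eq_hat:.1f} mg/g GAE, k = {k_hat:.2f} min^-1")
```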
Abstract:
The total phenols, apigenin 7-glucoside, turbidity and colour of extracts from dried chamomile flowers were studied with a view to developing chamomile extracts with potential anti-inflammatory properties for incorporation into beverages. The extraction of all constituents followed pseudo first-order kinetics. In general, the rate constant (k) increased as the temperature increased from 57 to 100 °C. The turbidity only increased significantly between 90 and 100 °C. Therefore, aqueous chamomile extracts had maximum total phenol concentration and minimum turbidity when extracted at 90 °C for 20 min. The effect of drying conditions on chamomile extracted under these conditions was then determined. A significant reduction in phenol concentration, from 19.7 ± 0.5 mg/g GAE in fresh chamomile to 13 ± 1 mg/g GAE, was found only in the plant material oven-dried at 80 °C (p ⩽ 0.05). The biggest colour change was between fresh chamomile and material oven-dried at 80 °C, followed by air-dried samples. There was no significant difference in colour between freeze-dried material and material oven-dried at 40 °C.