996 results for «arbre de régression et de classification» (regression and classification trees)


Relevance:

20.00%

Publisher:

Abstract:

New data on the stratigraphy of the Dogger and Lower «Lusitanian» of the Serra de Candeeiros are presented. The Lower Aalenian is recognized for the first time. The Bathonian (more than 50 metres thick) is dated by brachiopods and foraminifera; it corresponds to a series of massive micritic, biodetrital, coral-reef, chaetetid, bryozoan and oolitic limestones. The Callovian (120 m) begins with whitish or yellowish limestones containing ammonites and brachiopods of the Gracilis Zone, followed by regressive limestone sequences ending with thick oncolitic layers. The base of the «Lusitanian» is formed by greyish brackish-lagoon limestones; it lies unconformably on the Dogger, with or without an angular and/or cartographic unconformity. This radical facies change is related to the tectonic deformation of several blocks between the Nazaré and Tagus faults during Oxfordian times.

In Portugal, the Carixian is generally represented by alternating layers of marly limestones characterized by nodular and lumpy levels. These layers show preferential development in areas of passage to a sedimentary basin, particularly along the slope of tilted blocks between the Meseta and the Berlenga horst. This facies belongs to the range of the «nodular limestones» and of the «ammonitico rosso». The limestones are radiolarian micrites with fragments of pelagic organisms (ammonoids, thin-shelled gastropods). These layers can be affected by intensive bioturbation (Brenha), which is responsible for dismantlement, especially where the initial thickness does not exceed a few centimetres. This process can lead to the isolation of residual nodules (Brenha, São Pedro de Muel, Peniche), which can be mobilised by massive sliding (Peniche). The isolated elements, shell fragments or residual nodules, can also be encrusted, thus developing oncolitic cryptalgal structures. At Brenha, the lump structure developed progressively into a sequence overlapping the normal sedimentary one (thick limestone beds alternating with bituminous shales). Cryptalgal structures correspond to rather unstable environmental conditions on mobile margins. These structures are known in deep pelagic sediments corresponding to well-defined events of the geodynamic evolution (end of the initial rifting). Cryptalgal accretions disappear towards the sedimentary basin, and the nodular levels become less important. In the areas of articulation with the Tomar platform, small mounds and cupules (Alcabideque) developed within the alternating marly-limestone levels. They represent the so-called «mud mounds» of metric dimensions. The upper part of these «mud mounds» is hardened, showing track remains and supporting some brachiopods and pectinids. Hence the lumpy facies of Portugal is included in the range of sedimentary environments and can be used as a «geodynamic tracer».

The synthetic study of the uppermost Cretaceous of the Beira Litoral (fauna, flora) confirms its upper Campanian-Maastrichtian age. It shows the presence of a tropical to subtropical climate in an area consisting of a low coastal plain only occasionally linked to the sea, saturated with fresh water and possessing, accordingly, a predominantly freshwater fauna (Viso, Aveiro); this plain changed towards the interior into a drier, more forested zone with a more abundant terrestrial fauna, which includes mammals (Taveiro). A thorough study of the chelonian Rosasia, abundant on the coastal plain, was made possible thanks to the discovery of a skull: it demonstrates that the genus belongs to the family Bothremydidae, revalidated here. The composition of this family is presented, its phylogenetic and paleobiogeographic relations with the other pleurodires are analyzed, and its diagnosis is established. The family consists of three groups; Rosasia belongs to one of these, the Bothremys group.

This Thesis describes the application of automatic learning methods to a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and to their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, and the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set). Counter-Propagation Neural Networks (CPNNs) gave similar results.
Supervised learning techniques improved these results: correct assignments rose to 77% with an ensemble of ten FFNNs and to 80% with Random Forests. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms. Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified.
In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e., a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number levels, respectively. Experiments on the classification of reactions from the main reactants and products were performed with RFs: EC numbers were assigned at the class, subclass and sub-subclass levels with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways.
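The reaction-descriptor-plus-Random-Forest pipeline described above can be sketched as follows. This is a toy illustration, not the Thesis code: bond types are drawn at random instead of being derived from a SOM trained on physico-chemical bond properties, and the two reaction classes are synthetic.

```python
# Toy sketch of the MOLMAP idea: a reaction is encoded as the difference
# between product and reactant bond-type histograms, then classified with
# a Random Forest.  Bond typing here is a stand-in (random integer types);
# the Thesis derives bond types from a Kohonen SOM.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_BOND_TYPES = 25                 # size of the (toy) SOM grid, flattened

def molmap(bond_types):
    """Histogram of the bond types present in a molecule (toy MOLMAP)."""
    h = np.zeros(N_BOND_TYPES)
    for t in bond_types:
        h[t] += 1
    return h

def reaction_descriptor(reactant_bonds, product_bonds):
    # bonds broken / made show up as negative / positive entries
    return molmap(product_bonds) - molmap(reactant_bonds)

# synthetic data: two reaction "classes" that transform different bond types
X, y = [], []
for _ in range(200):
    r = rng.integers(0, N_BOND_TYPES, size=8)   # reactant bond types
    p = r.copy()                                # products: one bond changes
    label = int(rng.integers(0, 2))
    p[0] = 0 if label == 0 else N_BOND_TYPES - 1
    X.append(reaction_descriptor(r, p))
    y.append(label)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:150], y[:150])
acc = clf.score(X[150:], y[150:])
print(f"held-out accuracy: {acc:.2f}")
```

The signed histogram difference is what lets the classifier see which bond types were consumed and which were created, without any atom-to-atom mapping.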
Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway, i.e. a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-based PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES to be used in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs, and a remarkable ability of the NN models to interpolate between distant curves and accurately reproduce potentials for molecular simulations was shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems, the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task.
The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to adequately cover the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform as well as, or even better than, analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
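As a minimal illustration of the first series of experiments (not the actual Thesis setup, which used dedicated EnsFFNN/ASNN software), a small ensemble of feed-forward networks can be trained on a one-dimensional Lennard-Jones curve and its averaged prediction compared against the analytical potential:

```python
# Sketch: train an ensemble of feed-forward regressors on LJ energies and
# average their predictions, mimicking the EnsFFNN idea described above.
import numpy as np
from sklearn.neural_network import MLPRegressor

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    return 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

rng = np.random.default_rng(1)
r_train = rng.uniform(0.95, 3.0, size=400).reshape(-1, 1)   # sampled distances
E_train = lennard_jones(r_train).ravel()                    # target energies

# five networks differing only in their random initialisation
ensemble = [
    MLPRegressor(hidden_layer_sizes=(40, 40), max_iter=5000, random_state=s)
    .fit(r_train, E_train)
    for s in range(5)
]

r_test = np.linspace(1.0, 2.5, 50).reshape(-1, 1)
E_pred = np.mean([net.predict(r_test) for net in ensemble], axis=0)
rmse = np.sqrt(np.mean((E_pred - lennard_jones(r_test).ravel()) ** 2))
print(f"ensemble RMSE on the test grid: {rmse:.3f}")
```

Averaging over networks trained from different initialisations smooths out the individual fitting errors, which is one reason ensembles outperform single FFNNs in this setting.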

Alzheimer's Disease (AD) is characterized by progressive cognitive decline and dementia. Earlier diagnosis and the classification of different stages of the disease are currently the main challenges and can be addressed by neuroimaging. With this work we aim to evaluate the quality of brain regions and neuroimaging metrics as biomarkers of AD. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox functionalities were used to study AD by T1-weighted imaging, Diffusion Tensor Imaging, and 18F-AV45 PET, with data obtained from the AD Neuroimaging Initiative database, specifically 12 healthy controls (CTRL) and 33 patients with early mild cognitive impairment (EMCI), late MCI (LMCI), and AD (11 patients/group). The metrics evaluated were gray-matter volume (GMV), cortical thickness (CThk), mean diffusivity (MD), fractional anisotropy (FA), fiber count (FiberConn), node degree (Deg), cluster coefficient (ClusC), and relative standard-uptake-values (rSUV). Receiver Operating Characteristic (ROC) curves were used to evaluate and compare the diagnostic accuracy of the most significant metrics and brain regions, expressed as the area under the curve (AUC). Comparisons were performed between groups. The RH-Accumbens/Deg showed the highest AUC when differentiating CTRL-EMCI (82%), whereas rSUV reached the highest AUC in several brain regions when distinguishing CTRL-LMCI (99%). Regarding CTRL-AD, the highest AUCs were found with LH-STG/FiberConn and RH-FP/FiberConn (~100%). A larger number of neuroimaging metrics related to cortical atrophy with AUC>70% was found in CTRL-AD in both hemispheres, while in earlier stages cortical metrics appeared in more confined areas of the temporal region, mainly in the LH, indicating the increasing spread of cortical atrophy that is characteristic of disease progression.
In CTRL-EMCI, several brain regions and neuroimaging metrics presented AUC>70%, with worse results in later stages, suggesting these indicators as biomarkers for an earlier stage of MCI, although further research is necessary.
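The ROC/AUC evaluation used above can be sketched in a few lines. The values below are synthetic stand-ins for a neuroimaging metric (e.g. node degree) measured in the CTRL and EMCI groups; the group sizes match the study, but nothing else is taken from it.

```python
# Minimal ROC/AUC sketch: score how well a single scalar metric separates
# two diagnostic groups, using synthetic group data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
ctrl = rng.normal(1.0, 0.3, size=12)    # 12 healthy controls (higher metric)
emci = rng.normal(0.6, 0.3, size=11)    # 11 EMCI patients (lower metric)

labels = np.concatenate([np.zeros(12), np.ones(11)])   # 1 = patient
scores = np.concatenate([ctrl, emci])

# the metric decreases with disease, so rank patients by the negated value
auc = roc_auc_score(labels, -scores)
print(f"AUC = {auc:.2f}")
```

An AUC of 0.5 corresponds to chance-level separation and 1.0 to perfect separation, which is why the study reports metric/region pairs by their AUC.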

The potential of the electrocardiographic (ECG) signal as a biometric trait has been ascertained in the literature over the past decade. The inherent characteristics of the ECG make it an interesting biometric modality, given its universality, intrinsic aliveness detection, continuous availability, and inbuilt hidden nature. These properties enable the development of novel applications, where non-intrusive and continuous authentication are critical factors. Examples include, among others, electronic trading platforms, the gaming industry, and the auto industry, in particular for car sharing programs and fleet management solutions. However, there are still some challenges to overcome in order to make the ECG a widely accepted biometric. In particular, the questions of uniqueness (inter-subject variability) and permanence over time (intra-subject variability) are still largely unanswered. In this paper we focus on the uniqueness question, presenting a preliminary study of our biometric recognition system, testing it on a database encompassing 618 subjects. We also performed tests with subsets of this population. The results reinforce that the ECG is a viable trait for biometrics, having obtained an Equal Error Rate of 9.01% and an Error of Identification of 15.64% for the entire test population.
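The Equal Error Rate reported above is the operating point where false acceptances and false rejections balance. A hedged sketch of its computation, on synthetic matching scores rather than the paper's ECG templates:

```python
# Sketch of EER computation: sweep a decision threshold over genuine and
# impostor matching scores and find where the false acceptance rate (FAR)
# equals the false rejection rate (FRR).  Scores are synthetic.
import numpy as np

rng = np.random.default_rng(7)
genuine = rng.normal(0.8, 0.1, size=1000)    # same-subject comparison scores
impostor = rng.normal(0.5, 0.1, size=1000)   # different-subject scores

thresholds = np.linspace(0.0, 1.0, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects

i = np.argmin(np.abs(far - frr))             # closest FAR/FRR crossing
eer = (far[i] + frr[i]) / 2
print(f"EER = {eer:.2%} at threshold {thresholds[i]:.2f}")
```

Reporting the EER summarises the whole FAR/FRR trade-off in a single number, which is what makes figures such as 9.01% comparable across biometric systems.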

Optimization problems arise in science, engineering, economy, etc., and the best solution must be found for each particular setting. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the available algorithms for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are not known or are very difficult to calculate, suitable methods are scarcer. Such functions are frequently called black-box functions. To solve problems of this kind without constraints (unconstrained optimization), we can use direct search methods, which do not require derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved with the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow the solution of optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as a first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjust the penalty parameter dynamically.
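The scheme described above can be sketched with a classical quadratic penalty: the constrained problem becomes a sequence of unconstrained problems with a growing penalty weight, each solved by a derivative-free method (Nelder-Mead here), so the objective may indeed be a black box. The specific objective and constraint are illustrative choices, not from the chapter.

```python
# Quadratic-penalty sketch: minimize f(x) subject to c(x) = 0 by solving a
# sequence of unconstrained problems  f(x) + mu * c(x)^2  with increasing mu,
# each with a derivative-free solver.
import numpy as np
from scipy.optimize import minimize

def objective(x):                 # treated as a black box (no derivatives used)
    return (x[0] - 2) ** 2 + (x[1] - 1) ** 2

def constraint(x):                # equality constraint c(x) = 0
    return x[0] + x[1] - 1

x = np.array([0.0, 0.0])          # starting point
for mu in [1, 10, 100, 1000]:     # increasing penalty parameter
    penalized = lambda x, mu=mu: objective(x) + mu * constraint(x) ** 2
    # warm-start each unconstrained solve from the previous solution
    x = minimize(penalized, x, method="Nelder-Mead").x

print(x)   # approaches the constrained minimum at (1, 0)
```

Warm-starting each solve from the previous solution is what keeps the increasingly ill-conditioned penalized problems tractable for a direct search method.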

In research on Silent Speech Interfaces (SSI), different sources of information (modalities) have been combined, aiming at obtaining better performance than with the individual modalities. However, when combining these modalities, the dimensionality of the feature space rapidly increases, yielding the well-known "curse of dimensionality". As a consequence, in order to extract useful information from this data, one has to resort to feature selection (FS) techniques to lower the dimensionality of the learning space. In this paper, we assess the impact of FS techniques for silent speech data, in a dataset with 4 non-invasive and promising modalities, namely: video, depth, ultrasonic Doppler sensing, and surface electromyography. We consider two supervised (mutual information and Fisher's ratio) and two unsupervised (mean-median and arithmetic mean-geometric mean) FS filters. The evaluation was made by assessing the classification accuracy (word recognition error) of three well-known classifiers (k-nearest neighbors, support vector machines, and dynamic time warping). The key results of this study show that both unsupervised and supervised FS techniques improve the classification accuracy on both individual and combined modalities. For instance, on the video component, we attain relative performance gains of 36.2% in error rates. FS is also useful as pre-processing for feature fusion. Copyright © 2014 ISCA.
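A supervised filter of the kind used above (mutual information) can be sketched as follows. The synthetic data stands in for the high-dimensional multimodal features; the dataset, classifiers, and exact filters of the paper are not reproduced.

```python
# Sketch of filter-based feature selection with a mutual-information
# criterion: rank features by MI with the class label, keep the top k,
# and compare a k-NN classifier before and after selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# stand-in for high-dimensional multimodal features: 10 informative
# dimensions buried among noise
X, y = make_classification(n_samples=300, n_features=100, n_informative=10,
                           random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)
acc_all = cross_val_score(knn, X, y, cv=5).mean()       # all 100 features

X_sel = SelectKBest(mutual_info_classif, k=10).fit_transform(X, y)
acc_sel = cross_val_score(knn, X_sel, y, cv=5).mean()   # 10 selected features

print(f"all features: {acc_all:.2f}; selected: {acc_sel:.2f}")
```

Distance-based classifiers such as k-NN are especially sensitive to irrelevant dimensions, which is why a simple filter can already yield the kind of gains the paper reports.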

In video communication systems, the video signals are typically compressed and sent to the decoder through an error-prone transmission channel that may corrupt the compressed signal, causing the degradation of the final decoded video quality. In this context, it is possible to enhance the error resilience of typical predictive video coding schemes by drawing on principles and tools from an alternative video coding approach, the so-called Distributed Video Coding (DVC), based on the Distributed Source Coding (DSC) theory. Further improvements in the decoded video quality after error-prone transmission may also be obtained by considering the perceptual relevance of the video content, as distortions occurring in different regions of a picture have a different impact on the user's final experience. In this context, this paper proposes a Perceptually Driven Error Protection (PDEP) video coding solution that enhances the error resilience of a state-of-the-art H.264/AVC predictive video codec using DSC principles and perceptual considerations. To increase the H.264/AVC error resilience performance, the main technical novelties brought by the proposed video coding solution are: (i) design of an improved compressed domain perceptual classification mechanism; (ii) design of an improved transcoding tool for the DSC-based protection mechanism; and (iii) integration of a perceptual classification mechanism in an H.264/AVC compliant codec with a DSC-based error protection mechanism. The performance results obtained show that the proposed PDEP video codec provides a better performing alternative to traditional error protection video coding schemes, notably Forward Error Correction (FEC)-based schemes. (C) 2013 Elsevier B.V. All rights reserved.

The study of abundant new coral collections and a systematic revision of the historical collections allow us to significantly extend the data on the Upper Oligocene and Miocene Scleractinia of the French Atlantic basins. The SW and W-NW France faunas have been considered, and complete lists of the different defined taxa are presented. The general lines of the evolution of this group are specified and linked to paleoclimatic and paleobiogeographic changes.

This chapter analyzes the signals captured during impacts and vibrations of a mechanical manipulator. Eighteen signals are captured, and several metrics, such as correlation, mutual information, and entropy, are calculated between them. A sensor classification scheme based on the multidimensional scaling technique is presented.

The disturbing emergence of multidrug-resistant strains of Mycobacterium tuberculosis (Mtb) has been driving the scientific community to urgently search for new and efficient antitubercular drugs. Despite the various drugs currently under evaluation, isoniazid is still the key and most effective component in all multi-therapeutic regimens recommended by the WHO. This paper describes the QSAR-oriented design, synthesis and in vitro antitubercular activity of several potent isoniazid derivatives (isonicotinoyl hydrazones and isonicotinoyl hydrazides) against H37Rv and two resistant Mtb strains. QSAR studies entailed RF and ASNN classification models, as well as MLR models. Strict validation procedures were used to guarantee the models' robustness and predictive ability. Lipophilicity was shown not to be relevant to explain the activity of these derivatives, whereas shorter N-N distances and lengthy substituents lead to more active compounds. Compounds 1, 2, 4, 5 and 6 showed measured activities against H37Rv higher than INH (i.e., MIC ≤ 0.28 μM), while compound 9 exhibited a six-fold decrease in MIC against the katG (S315T) mutated strain, by comparison with INH (i.e., 6.9 vs. 43.8 μM). All compounds were ineffective against H37Rv(INH) (ΔkatG), a strain with a full deletion of the katG gene, thus corroborating the importance of KatG in the activation of INH-based compounds. The most potent compounds were also shown not to be cytotoxic up to a concentration 500 times higher than the MIC. (C) 2014 Elsevier Masson SAS. All rights reserved.

This paper analyzes the signals captured during impacts and vibrations of a mechanical manipulator. To test the impacts, a flexible beam is clamped to the end-effector of a manipulator that is programmed so that the rod moves against a rigid surface. Eighteen signals are captured and their correlations are calculated. A sensor classification scheme based on the multidimensional scaling technique is presented.
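The correlation-plus-MDS scheme can be sketched as follows. The eighteen "sensor" signals below are synthetic sine waves in two families; the paper's actual impact and vibration recordings are not reproduced.

```python
# Sketch of the sensor-classification idea: build a matrix of pairwise
# correlations between captured signals, convert it to a dissimilarity
# matrix, and embed it in 2-D with multidimensional scaling (MDS), so
# sensors with similar behaviour land close together.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 500)

# 18 synthetic "sensor" signals: two frequency families plus noise
signals = np.array([
    np.sin(2 * np.pi * (5 if i < 9 else 11) * t) + 0.2 * rng.normal(size=t.size)
    for i in range(18)
])

corr = np.corrcoef(signals)        # 18 x 18 correlation matrix
dist = 1.0 - np.abs(corr)          # dissimilarity derived from correlation

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
print(coords.shape)                # one 2-D point per sensor
```

Plotting `coords` would show the two signal families as two separate clusters, which is the visual basis of the classification scheme.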

In cluster analysis, it can be useful to interpret the partition built from the data in the light of external categorical variables that are not directly involved in clustering the data. An approach is proposed in the model-based clustering context to select a number of clusters which both fits the data well and takes advantage of the potential illustrative ability of the external variables. This approach makes use of the integrated joint likelihood of the data and the partitions at hand, namely the model-based partition and the partitions associated with the external variables. It is noteworthy that each mixture model is fitted by the maximum likelihood methodology to the data, excluding the external variables, which are used only to select a relevant mixture model. Numerical experiments illustrate the promising behaviour of the derived criterion. © 2014 Springer-Verlag Berlin Heidelberg.
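A heavily simplified stand-in for this idea (not the paper's integrated-likelihood criterion) can be sketched with scikit-learn: Gaussian mixtures are fitted to the data alone for several numbers of clusters, and the external categorical variable is then used only to assess each candidate partition, here via BIC alongside the adjusted Rand index.

```python
# Simplified sketch: fit GMMs with k = 2..4 components to the data only,
# then report, per k, the BIC (data fit) and the agreement of the induced
# partition with an external categorical variable (adjusted Rand index).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(4)
# three well-separated 2-D groups; the external variable matches them
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in (0.0, 3.0, 6.0)])
external = np.repeat([0, 1, 2], 50)

for k in (2, 3, 4):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)  # ML fit, data only
    labels = gmm.predict(X)
    print(k, round(gmm.bic(X), 1), round(adjusted_rand_score(external, labels), 2))
```

The key point mirrored from the paper is the division of labour: the external variable never enters the mixture fitting, it only informs the choice among already-fitted models.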

An aneurysm is a localized blood-filled dilatation of an artery whose consequences can be deadly. One of its current treatments is endovascular aneurysm repair, a minimally invasive procedure in which an endoprosthesis, called a stent-graft, is placed transluminally to prevent wall rupture. Early stent-grafts were custom designed for the patient through the assembling of off-the-shelf components by the operating surgeon. However, nowadays, stent-grafts have become a commercial product. The existing endoprostheses differ in several aspects, such as shape design and materials, but they have in common a metallic scaffold with a polymeric covering membrane. This article aims to gather relevant information for those who wish to understand the principles of stent-grafts and even to develop new devices. Hence, a stent-graft classification based on different characteristics is presented and the desired features for an ideal device are pointed out. Additionally, the materials currently in use to fabricate this type of endoprosthesis are reviewed and new materials are suggested.