951 results for Computational tools
Abstract:
Dissertation submitted for the degree of Doctor in Chemical and Biochemical Engineering
Abstract:
Artificial neural networks (ANNs) are relatively new computational tools that have found extensive use in solving many complex real-world problems. This paper describes how an ANN can be used to identify the spectral lines of elements. The spectral lines of cadmium (Cd), calcium (Ca), iron (Fe), lithium (Li), mercury (Hg), potassium (K) and strontium (Sr) in the visible range are chosen for the investigation. One of the unique features of this technique is that it uses the whole spectrum in the visible range instead of individual spectral lines. The spectrum of a sample taken with a spectrometer contains both genuine peaks and spurious peaks, and identifying these peaks to determine the elements present in the sample is a tedious task. The ANN's capability to retrieve the original data from a noisy spectrum is also explored, and the importance of sufficient training data for obtaining accurate results is emphasized. Two networks are examined: one trained on all spectral lines and the other on the persistent lines only. The network trained on all spectral lines is found to be superior in analyzing the spectrum, even in a noisy environment.
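As a hedged illustration of the whole-spectrum idea, the sketch below trains a small multilayer perceptron to name the element from a synthetic, noisy visible-range spectrum. The line positions, network size and scikit-learn tooling are all assumptions for illustration, not the paper's setup.

```python
# A minimal sketch of the whole-spectrum approach: synthetic visible-range
# spectra are built from Gaussian peaks at each element's line positions
# (the positions below are illustrative placeholders, not the paper's data),
# noise is added, and an MLP learns to identify the element.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 700, 300)          # visible range, 1 nm bins

# Illustrative line positions (nm) per element -- placeholders only.
lines = {"Cd": [467, 480, 509], "Ca": [422, 616], "Fe": [438, 527],
         "Li": [610, 671], "Hg": [436, 546], "K": [404, 691],
         "Sr": [461, 496]}
elements = sorted(lines)

def spectrum(element, noise=0.05):
    """One noisy spectrum: Gaussian peaks at the element's lines plus noise."""
    s = sum(np.exp(-0.5 * ((wavelengths - c) / 1.5) ** 2) for c in lines[element])
    return s + rng.normal(0, noise, wavelengths.size)

X = np.array([spectrum(e) for e in elements for _ in range(200)])
y = np.array([e for e in elements for _ in range(200)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy on noisy spectra:", clf.score(X_te, y_te))
```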
Abstract:
This thesis discusses the design and synthesis of polymers suitable for nonlinear optics. Most of the molecules studied showed good nonlinear optical activity. The second-order nonlinear optical activity of the polymers was measured experimentally by the Kurtz and Perry powder technique. The thesis comprises eight chapters. The theory of the NLO phenomenon and a review of various nonlinear optical polymers are presented in Chapter 1. The review surveys NLO-active polymeric materials, with a general introduction covering the principles and origin of nonlinear optics, and gives emphasis to polymeric materials for nonlinear optics, including guest-host systems, side-chain polymers, main-chain polymers, crosslinked polymers, chiral polymers, etc. Chapter 2 discusses the stability of the metal-incorporated tetrapyrrole molecules porphyrin, chlorin and bacteriochlorin. Chapter 3 examines the NLO properties of certain organic molecules using computational tools; the chapter is divided into four parts, the first of which describes the nonlinear optical properties of chromophore (D-π-A) and bichromophore (D-π-A-A-π-D) systems separated by a methylene spacer, using DFT and semiempirical calculations. Chapter 4: a series of polyurethanes was prepared from cardanol, a renewable resource and a waste product of the cashew industry, based on bifunctional and multifunctional polymers previously designed using a quantum-theoretical approach. Chapter 5: a series of chiral polyurethanes with main-chain bis-azo diol groups in the polymer backbone was designed, and their NLO activity was predicted by ZINDO/CV methods. In Chapter 7, polyurethanes were first designed by computational methods and their NLO properties were predicted by the correction vector method; the designed bifunctional and multifunctional polyurethanes were then synthesized by varying the chiral-achiral diol compositions.
Abstract:
The Kineticist's Workbench is a program that simulates chemical reaction mechanisms by predicting, generating, and interpreting numerical data. Prior to simulation, it analyzes a given mechanism to predict that mechanism's behavior; it then simulates the mechanism numerically; and afterward, it interprets and summarizes the data it has generated. In performing these tasks, the Workbench uses a variety of techniques: graph-theoretic algorithms (for analyzing mechanisms), traditional numerical simulation methods, and algorithms that examine simulation results and reinterpret them in qualitative terms. The Workbench thus serves as a prototype for a new class of scientific computational tools: tools that provide symbiotic collaborations between qualitative and quantitative methods.
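The simulate-then-summarize loop can be illustrated on a toy mechanism. The sketch below assumes SciPy and an invented two-step mechanism A → B → C, rather than anything from the Workbench itself: it integrates the rate equations numerically, then restates the result in qualitative terms.

```python
# A minimal sketch of the simulate-then-interpret idea on a toy mechanism
# A -> B -> C (illustrative; not the Workbench's own algorithms): integrate
# the rate equations numerically, then summarize the result qualitatively.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.0, 0.4                      # illustrative rate constants

def rates(t, y):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

sol = solve_ivp(rates, (0, 20), [1.0, 0.0, 0.0],
                t_eval=np.linspace(0, 20, 400))
a, b, c = sol.y

# Qualitative reinterpretation: find the transient maximum of intermediate B.
i = b.argmax()
print(f"B rises, peaks at t = {sol.t[i]:.2f} (B = {b[i]:.3f}), then decays;")
print(f"A decays monotonically; C rises toward {c[-1]:.3f}.")
```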
Abstract:
This study investigates and analyzes the usefulness of strategic community relations and marketing in the crude oil and natural gas extraction sector. Marketing plays an important role in selling any type of product or service; depending on the characteristics of each sector, some marketing strategies may be more or less useful and/or effective than others. For this reason, this research analyzes the usefulness of community marketing strategies and marketing in the crude oil and natural gas extraction sector. The answer can be reached along three paths: the first is to identify and analyze the characteristics of the sector; the second is to identify the strategies being used by the companies operating in this field; and the last is to analyze the usefulness of integrating the concept of strategic community relations and marketing into the crude oil and natural gas extraction sector, in terms of the benefits that community marketing would bring to the sector of interest. The crude oil and natural gas sector is one of the most important in the global economy, and its customers are numerous and highly heterogeneous. Marketing becomes very complex for this industry because, with such different types of customers, it is difficult to meet each one's expectations with a single strategy, and the way a product is sold is, of course, more complex. These factors are generating purely transactional, short-term relationships between companies and customers, and this is where the inclusion of relationship marketing and relationship marketing strategies is believed to be useful for the sector, generating development and mutual benefits for both customers and companies.
Abstract:
We are developing computational tools supporting the detailed analysis of the dependence of neural electrophysiological responses on dendritic morphology. We approach this problem by combining simulations of faithful neuron models (experimentally measured morphological data with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all the analyzed traces and thus amounts to a reduction of noise in the data. The method allows automatic extraction of the physiological parameters needed for further statistical analysis. To illustrate the usefulness of this procedure for analyzing voltage traces, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of structure-activity relationships in single neurons.
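Since the abstract does not spell out the recognition algorithm itself, the following is only a minimal stand-in: threshold-crossing spike detection with a refractory period, in NumPy, run on a synthetic trace.

```python
# A minimal sketch of automatic spike detection in a voltage trace by upward
# threshold crossings with a refractory period (a simple stand-in; the
# paper's own recognition method is more elaborate).
import numpy as np

def detect_spikes(v, dt, threshold=-20.0, refractory=2.0):
    """Return spike times (ms): upward crossings of `threshold`,
    ignoring crossings within `refractory` ms of the previous spike."""
    crossings = np.flatnonzero((v[:-1] < threshold) & (v[1:] >= threshold))
    spikes, last = [], -np.inf
    for i in crossings:
        t = i * dt
        if t - last >= refractory:
            spikes.append(t)
            last = t
    return np.array(spikes)

# Synthetic trace: resting potential with three brief depolarizations.
dt = 0.1                                   # ms per sample
v = np.full(10000, -65.0)
for t0 in (1000, 4000, 7000):              # sample indices of the "spikes"
    v[t0:t0 + 20] = 10.0
print(detect_spikes(v, dt))                # -> [ 99.9 399.9 699.9]
```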
Abstract:
An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insights into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms mainly based on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
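As a rough illustration of how local constraints alone can generate trees with measurable emergent statistics, the sketch below grows virtual dendrites with an invented diameter-driven stochastic rule; the rule and its parameters are hypothetical, not those of L-Neuron or ArborVitae.

```python
# A minimal sketch of a local, diameter-driven stochastic growth rule in the
# spirit of the local-constraint algorithms (parameters are illustrative,
# not measured statistics): each branch either bifurcates, with daughter
# diameters set by a taper factor, or terminates once it gets too thin.
import random
random.seed(1)

def grow(diameter, p_bif=0.7, taper=0.8, d_min=0.3):
    """Recursively grow a dendritic subtree; return (total_length, n_bifs)."""
    length = random.uniform(5.0, 20.0) * diameter   # thicker -> longer segment
    if diameter < d_min or random.random() > p_bif:
        return length, 0                            # termination
    l1, b1 = grow(diameter * taper)                 # two daughter branches
    l2, b2 = grow(diameter * taper)
    return length + l1 + l2, 1 + b1 + b2

# Emergent statistics over a population of virtual dendrites.
lengths, bifs = zip(*[grow(2.0) for _ in range(100)])
print("mean total length:", sum(lengths) / len(lengths))
print("mean bifurcations:", sum(bifs) / len(bifs))
```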
Abstract:
It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This 'Cartesian' description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g., maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e., it is not complete enough to provide a precise 'blueprint' of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of 'fundamental', measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the 'computational neuroanatomy' strategy for neuroscience databases.
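The 'Cartesian' cylinder description corresponds closely to SWC-style tracing records (id, type, x, y, z, radius, parent). The sketch below, using a made-up five-cylinder tree, shows how classical summary statistics such as total length and maximum branch order can be computed from such a file.

```python
# A minimal sketch of moving between the two description levels: from a
# 'Cartesian' cylinder file (here, SWC-style records: id, type, x, y, z,
# radius, parent) to classical summary statistics. The tiny tree below is
# made up for illustration.
import math

swc = """1 1 0 0 0 5.0 -1
2 3 0 10 0 1.0 1
3 3 5 15 0 0.8 2
4 3 -5 15 0 0.8 2
5 3 -5 20 0 0.6 4"""

nodes = {}
for record in swc.splitlines():
    i, t, x, y, z, r, parent = record.split()
    nodes[int(i)] = (float(x), float(y), float(z), int(parent))

total, children = 0.0, {}
for i, (x, y, z, p) in nodes.items():
    if p != -1:
        px, py, pz, _ = nodes[p]
        total += math.dist((x, y, z), (px, py, pz))
        children.setdefault(p, []).append(i)

def order(i):  # branch order: bifurcations passed on the path from the root
    p = nodes[i][3]
    return 0 if p == -1 else order(p) + (len(children.get(p, [])) > 1)

print("total length:", round(total, 2))
print("max branch order:", max(order(i) for i in nodes))
```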
Abstract:
The role and function of a given protein depend on its structure. In recent years, however, numerous studies have highlighted the importance of unstructured, or disordered, regions in governing a protein's function. Disordered proteins have been found to play important roles in pivotal cellular functions, such as DNA binding and signalling cascades. Studying proteins with extended disordered regions is often problematic, as they can be challenging to express, purify and crystallise. This means that interpretable experimental data on protein disorder are hard to generate. As a result, predictive computational tools have been developed with the aim of predicting the level and location of disorder within a protein. Currently, over 60 prediction servers exist, utilizing different methods for classifying disorder and different training sets. Here we review several well-performing, publicly available prediction methods, comparing their application and discussing how disorder prediction servers can be used to aid the experimental solution of protein structure. The use of disorder prediction methods allows us to adopt a more targeted approach to experimental studies by accurately identifying the boundaries of ordered protein domains so that they may be investigated separately, thereby increasing the likelihood of their successful experimental solution.
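A hedged sketch of the construct-design step described in the last sentence: given per-residue disorder scores from any of the prediction servers, report contiguous ordered regions long enough to be expressed separately (the scores, cutoff and minimum length below are hypothetical).

```python
# A minimal sketch of turning per-residue disorder scores (from any
# predictor) into candidate ordered-domain boundaries; all numbers here
# are hypothetical illustration values.
def ordered_domains(scores, cutoff=0.5, min_len=30):
    """Return 1-based (start, end) spans where score < cutoff for >= min_len."""
    domains, start = [], None
    for i, s in enumerate(scores):
        if s < cutoff and start is None:
            start = i
        elif s >= cutoff and start is not None:
            if i - start >= min_len:
                domains.append((start + 1, i))
            start = None
    if start is not None and len(scores) - start >= min_len:
        domains.append((start + 1, len(scores)))
    return domains

# Hypothetical profile: disordered tails flanking a 60-residue ordered core.
scores = [0.9] * 20 + [0.2] * 60 + [0.8] * 20
print(ordered_domains(scores))   # -> [(21, 80)]
```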
Abstract:
LEÃO, Adriano de Castro; DÓRIA NETO, Adrião Duarte; SOUSA, Maria Bernardete Cordeiro de. New developmental stages for common marmosets (Callithrix jacchus) using mass and age variables obtained by K-means algorithm and self-organizing maps (SOM). Computers in Biology and Medicine, v. 39, p. 853-859, 2009.
Abstract:
In this paper, artificial neural networks (ANNs) based on supervised and unsupervised algorithms were investigated for use in the study of the rheological parameters of solid pharmaceutical excipients, in order to develop computational tools for manufacturing solid dosage forms. Among the four supervised neural networks investigated, the best learning performance was achieved by a feedforward multilayer perceptron whose architecture comprised eight neurons in the input layer, sixteen neurons in the hidden layer and one neuron in the output layer. Learning and predictive performance for the angle of repose was poor, whereas the Carr index and Hausner ratio (CI and HR, respectively) showed very good fitting and learning capacity; HR and CI were therefore considered suitable descriptors for the next stage of development of supervised ANNs. Clustering capacity was evaluated for five unsupervised strategies. Networks based on purely competitive unsupervised strategies, the classic "Winner-Take-All", "Frequency-Sensitive Competitive Learning" and "Rival-Penalized Competitive Learning" (WTA, FSCL and RPCL, respectively), were able to cluster the database, but the classification was very poor, with severe errors such as grouping data with conflicting properties into the same cluster or even the same neuron; moreover, the criteria these networks adopted for clustering could not be established. Self-Organizing Maps (SOM) and Neural Gas (NG) networks showed better clustering capacity. Both recognized the two major groupings of data, corresponding to lactose (LAC) and cellulose (CEL). However, SOM made some errors in classifying data from the minority excipients: magnesium stearate (EMG), talc (TLC) and attapulgite (ATP). The NG network, in turn, performed a very consistent classification and resolved the SOM misclassifications, making it the most appropriate network for classifying the data in this study. The use of the NG network in pharmaceutical technology had not previously been reported. NG therefore has great potential for use in the development of software for automated classification systems of pharmaceutical powders and as a new tool for mining and clustering data in drug development.
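Only the 8-16-1 architecture below is taken from the abstract; the data, target and scikit-learn tooling are illustrative stand-ins for the excipient descriptors and the Hausner ratio.

```python
# A minimal sketch of the best-performing supervised network described above:
# a feedforward MLP with 8 input, 16 hidden and 1 output neurons, here fit to
# synthetic data standing in for excipient descriptors vs. Hausner ratio
# (the data are made up; only the 8-16-1 architecture comes from the text).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 8))                 # 8 rheological descriptors
y = 1.0 + 0.4 * X[:, 0] - 0.2 * X[:, 3] + rng.normal(0, 0.02, 500)  # "HR"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```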
Abstract:
Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior with regard to sun exposure. In Brazil, non-melanoma skin cancer is the most frequent in the majority of regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological skin diseases. The use of computational tools to support or accompany the medical diagnosis of dermatological lesions is a very recent field. Several methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape and texture features, using the Wavelet Packet Transform (WPT) and the learning technique known as the Support Vector Machine (SVM). The WPT is applied for the extraction of texture features from the images; it consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. In addition, color features of the lesion are computed, which depend on the visual context and are influenced by the colors in its surroundings, and shape attributes are obtained through Fourier descriptors. The SVM, which is based on the structural risk minimization principle from statistical learning theory, is used for the classification task. The SVM constructs optimal hyperplanes that represent the separation between classes; the generated hyperplane is determined by a subset of the training samples, called support vectors. For the database used in this work, the results revealed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. Together, the extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
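A minimal sketch of the texture stage under stated assumptions: wavelet-packet subband energies (via PyWavelets) used as features for an SVM, on synthetic patches; the wavelet, decomposition depth and SVM settings are illustrative choices, not the paper's.

```python
# A minimal sketch of the texture step: wavelet-packet subband energies as
# features, fed to an SVM. Images here are synthetic texture patches; the
# wavelet, depth and C are illustrative, not the paper's settings.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def wpt_energies(img, wavelet="db2", level=2):
    """Energy of every level-2 wavelet-packet subband of a 2-D image."""
    wp = pywt.WaveletPacket2D(img, wavelet=wavelet, maxlevel=level)
    return np.array([np.mean(node.data ** 2) for node in wp.get_level(level)])

def patch(textured):
    """Synthetic 32x32 patch: smooth gradient vs. noisy texture."""
    base = np.add.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
    return base + (rng.normal(0, 0.5, (32, 32)) if textured else 0.0)

X = np.array([wpt_energies(patch(t)) for t in ([True] * 80 + [False] * 80)])
y = np.array([1] * 80 + [0] * 80)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```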
Abstract:
Nowadays, classifying proteins into structural classes, which concerns the inference of patterns in their 3D conformation, is one of the most important open problems in Molecular Biology. The main reason is that the function of a protein is intrinsically related to its spatial conformation. However, such conformations are very difficult to obtain experimentally in the laboratory, so this problem has drawn the attention of many researchers in Bioinformatics. Considering the great difference between the number of known protein sequences and the number of experimentally determined three-dimensional structures, the demand for automated techniques for the structural classification of proteins is very high. In this context, computational tools, especially Machine Learning (ML) techniques, have become essential to deal with this problem. In this work, ML techniques are used in the recognition of protein structural classes: Decision Trees, k-Nearest Neighbor, Naive Bayes, Support Vector Machine and Neural Networks. These methods were chosen because they represent different learning paradigms and have been widely used in the Bioinformatics literature. Aiming to improve the performance of these individual classifiers, homogeneous (Bagging and Boosting) and heterogeneous (Voting, Stacking and StackingC) multi-classification systems are used. Moreover, since the protein database used in this work presents the problem of imbalanced classes, techniques for artificial class balancing (Random Undersampling, Tomek Links, CNN, NCL and OSS) are used to minimize this problem. To evaluate the ML methods, a cross-validation procedure is applied, in which the accuracy of the classifiers is measured as the mean classification error rate on independent test sets. These means are compared, two by two, using hypothesis tests to evaluate whether there is a statistically significant difference between them. Among the individual classifiers, the Support Vector Machine presented the best accuracy. The multi-classification systems (homogeneous and heterogeneous) showed, in general, performance superior or similar to that achieved by the individual classifiers, especially Boosting with Decision Trees and StackingC with Linear Regression as the meta-classifier. The Voting method, despite its simplicity, proved adequate for solving the problem presented in this work. The class-balancing techniques, on the other hand, did not produce a significant improvement in the global classification error; nevertheless, their use did improve the classification error for the minority class. In this context, the NCL technique proved the most appropriate.
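The comparison of an individual classifier against homogeneous and heterogeneous ensembles by cross-validated error can be sketched with scikit-learn on a synthetic imbalanced dataset; the estimators and settings below are illustrative, not the thesis's exact configuration.

```python
# A minimal sketch comparing an individual classifier with homogeneous
# (bagging, boosting) and heterogeneous (voting) ensembles by cross-validated
# error, on a synthetic imbalanced dataset standing in for the protein data.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              VotingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           weights=[0.8, 0.2], random_state=0)  # imbalanced

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "bagging (tree)": BaggingClassifier(DecisionTreeClassifier(),
                                        n_estimators=50, random_state=0),
    "boosting (stumps)": AdaBoostClassifier(n_estimators=50, random_state=0),
    "voting (NB+SVM+tree)": VotingClassifier(
        [("nb", GaussianNB()), ("svm", SVC(probability=True)),
         ("tree", DecisionTreeClassifier(random_state=0))], voting="soft"),
}
for name, model in models.items():
    err = 1 - cross_val_score(model, X, y, cv=10).mean()
    print(f"{name}: mean error {err:.3f}")
```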
Abstract:
Background: The tectum is a structure localized in the roof of the midbrain in vertebrates and is taken to be highly conserved in evolution. The present article assessed three hypotheses concerning the evolution of lamination and cytoarchitecture in the tectum of non-tetrapod animals: 1) there is a significant degree of phylogenetic inertia in both traits studied (number of cellular layers and number of cell classes in the tectum); 2) both traits are positively correlated across evolution after correction for phylogeny; and 3) different developmental pathways should generate different patterns of lamination and cytoarchitecture. Methodology/Principal Findings: The hypotheses were tested using analytical-computational tools for phylogenetic hypothesis testing. Both traits presented a considerably large phylogenetic signal and were positively associated. However, no difference was found between the two clades classified according to the general developmental pathways of their brains. Conclusions/Significance: The evidence amassed points to more variation in the tectum than would be expected from phylogeny in three species of the taxa analysed; this variation is not better explained by differences in the main course of development, as would be predicted by the developmental clade hypothesis. These findings shed new light on the evolution of a functionally important structure in non-tetrapods, the most basal radiations of vertebrates.
Abstract:
The use of non-human primates in scientific research has contributed significantly to the biomedical area and, in the case of Callithrix jacchus, has provided important evidence on physiological mechanisms that help explain its biology, making the species a valuable experimental model for different pathologies. However, raising non-human primates in captivity for long periods is accompanied by behavioral disorders and chronic diseases, as well as progressive weight loss in most of the animals. The Primatology Center of the Universidade Federal do Rio Grande do Norte (UFRN) has housed a colony of C. jacchus for nearly 30 years, and during this period the animals have been weighed systematically to detect possible alterations in their clinical condition. This procedure has generated a large volume of data on the weight of animals at different age ranges, which is of great value for studying this variable from different perspectives. Accordingly, this thesis presents three studies using weight data collected over 15 years (1985-2000) as a way of verifying the health status and development of the animals. The first study produced the first article, which describes the histopathological findings in animals with a probable diagnosis of wasting marmoset syndrome (WMS). All of these animals carried trematode parasites (Platynosomum spp.) and had obstructions of the hepatobiliary system, suggesting that this agent is one of the etiological factors of the syndrome. In the second article, the analysis focused on comparing the behavioral profile and cortisol levels of animals with a normal weight curve against those with WMS. We observed a marked decrease in locomotion, increased use of the lower cage strata, and hypocortisolemia; the latter is likely associated with an adaptation of the mechanisms that make up the hypothalamus-pituitary-adrenal axis, as observed in other mammals under conditions of chronic malnutrition. Finally, in the third study, the animals with weight alterations were excluded from the sample and, using computational tools (K-means and SOM) in an unsupervised way, we propose new ontogenetic developmental classes for C. jacchus. The classes were expanded from five to eight: infant I, infant II, infant III, juvenile I, juvenile II, sub-adult, young adult and elderly adult, providing a more suitable classification for detailed studies requiring better control over animal development.
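A minimal sketch of the unsupervised step, assuming synthetic (age, mass) records in place of the colony's longitudinal data: K-means with k = 8 clusters, matching the eight proposed classes, ordered by mean age so that the clusters read as developmental stages.

```python
# A minimal sketch of the unsupervised step: K-means over (age, mass) pairs
# with k = 8 clusters. The data below are synthetic stand-ins for the
# colony's longitudinal weight records; the growth curve is invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
age = rng.uniform(0, 120, 1000)                       # age in months
mass = 400 / (1 + np.exp(-(age - 12) / 4)) + rng.normal(0, 15, 1000)  # grams

X = StandardScaler().fit_transform(np.column_stack([age, mass]))
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)

# Order clusters by mean age so they read as developmental stages.
for label in np.argsort([age[km.labels_ == k].mean() for k in range(8)]):
    sel = km.labels_ == label
    print(f"class {label}: age {age[sel].mean():6.1f} mo, "
          f"mass {mass[sel].mean():6.1f} g, n={sel.sum()}")
```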