887 results for cashew nut kernel
Abstract:
In our study we use a kernel-based machine learning technique, support vector machine regression, to predict the melting point of drug-like compounds in terms of topological descriptors, topological charge indices, connectivity indices and 2D autocorrelations. The model was designed, trained and tested on a dataset of 100 compounds, and it was found that an SVMReg model with an RBF kernel could predict the melting point with a mean absolute error of 15.5854 and a root mean squared error of 19.7576.
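As a rough illustration of the kind of model described above, the following Python sketch fits a support vector regressor with an RBF kernel to a descriptor matrix and reports the mean absolute error and root mean squared error; the randomly generated descriptors, target values and hyperparameters are placeholders, not the 100-compound dataset or the tuned model of the study.

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))                                          # placeholder descriptor matrix
y = 120.0 + X @ rng.normal(size=12) + rng.normal(scale=5.0, size=100)   # placeholder melting points

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)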
Abstract:
The aim of this paper is to extend the method of approximate approximations to boundary value problems. This method was introduced by V. Maz'ya in 1991 and has been used until now for the approximation of smooth functions defined on the whole space and for the approximation of volume potentials. In the present paper we develop an approximation procedure for the solution of the interior Dirichlet problem for the Laplace equation in two dimensions using approximate approximations. The procedure is based on potential theoretical considerations in connection with a boundary integral equations method and consists of three approximation steps as follows. In a first step the unknown source density in the potential representation of the solution is replaced by approximate approximations. In a second step the decay behavior of the generating functions is used to gain a suitable approximation for the potential kernel, and in a third step Nyström's method leads to a linear algebraic system for the approximate source density. For every step a convergence analysis is established and corresponding error estimates are given.
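For readers unfamiliar with the method, the basic Gaussian quasi-interpolation formula of approximate approximations is commonly written as follows; the notation here is generic and may differ from that of the paper.

\[
(\mathcal{M}_{h,\mathcal{D}}\,u)(x) \;=\; \mathcal{D}^{-n/2} \sum_{m\in\mathbb{Z}^{n}} u(hm)\,
\eta\!\left(\frac{x-hm}{h\sqrt{\mathcal{D}}}\right),
\qquad
\eta(x) \;=\; \pi^{-n/2}\,e^{-|x|^{2}}.
\]

The approximation error consists of a term of order $\mathcal{O}\big((h\sqrt{\mathcal{D}})^{2}\big)$ (of higher order for other generating functions $\eta$) plus a saturation error that does not vanish as $h \to 0$ but can be made arbitrarily small by choosing the parameter $\mathcal{D}$ sufficiently large.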
Abstract:
Software Defined Radio (SDR) hardware platforms use parallel architectures. Current approaches to developing applications (such as WLAN) for these platforms are complex, because developers must describe an application in terms of hardware specifics relevant to parallelism, such as mapping and scheduling. To reduce this complexity, we have developed a new programming approach for SDR applications, called Virtual Radio Engine (VRE). VRE defines a language for describing applications and a tool chain, consisting of a compiler kernel and other tools (such as a code generator), that generates executables. The thesis presents this concept and describes the language and the compiler kernel developed by the author. The language is hardware-independent, i.e., developers describe tasks and the dependencies between them. The compiler kernel performs automatic parallelization, i.e., it transforms a hardware-independent program into a hardware-specific one by resolving hardware specifics, in particular mapping, scheduling and synchronization. Thus, VRE simplifies programming, as developers do not have to resolve hardware specifics manually.
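As a purely hypothetical illustration of the step that the compiler kernel automates, the following Python sketch performs greedy list scheduling of a small task graph onto two processing elements; the task names, costs, platform and algorithm are invented for illustration and do not reflect VRE's actual language or its mapping and scheduling strategy.

# Hypothetical task graph: name -> (cost, dependencies).
tasks = {
    "fft":    (4, []),
    "demod":  (3, ["fft"]),
    "decode": (5, ["demod"]),
    "sync":   (2, []),
    "output": (1, ["decode", "sync"]),
}

def list_schedule(tasks, num_pes=2):
    """Greedy list scheduling: repeatedly start a ready task on the earliest-free PE."""
    finish, pe_free, schedule = {}, [0] * num_pes, []
    remaining = dict(tasks)
    while remaining:
        # Tasks whose dependencies have all finished, longest first.
        ready = sorted((t for t, (_, deps) in remaining.items()
                        if all(d in finish for d in deps)),
                       key=lambda t: -remaining[t][0])
        name = ready[0]
        cost, deps = remaining.pop(name)
        pe = min(range(num_pes), key=lambda p: pe_free[p])
        start = max([pe_free[pe]] + [finish[d] for d in deps])
        finish[name] = pe_free[pe] = start + cost
        schedule.append((name, pe, start, finish[name]))
    return schedule

for name, pe, start, end in list_schedule(tasks):
    print(f"{name:7s} on PE{pe}: {start} -> {end}")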
Abstract:
The aim of this paper is the numerical treatment of a boundary value problem for the system of Stokes' equations. For this we extend the method of approximate approximations to boundary value problems. This method was introduced by V. Maz'ya in 1991 and has been used until now for the approximation of smooth functions defined on the whole space and for the approximation of volume potentials. In the present paper we develop an approximation procedure for the solution of the interior Dirichlet problem for the system of Stokes' equations in two dimensions. The procedure is based on potential theoretical considerations in connection with a boundary integral equations method and consists of three approximation steps as follows. In a first step the unknown source density in the potential representation of the solution is replaced by approximate approximations. In a second step the decay behavior of the generating functions is used to gain a suitable approximation for the potential kernel, and in a third step Nyström's method leads to a linear algebraic system for the approximate source density. For every step a convergence analysis is established and corresponding error estimates are given.
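The third approximation step, in which Nyström's method reduces the integral equation to a linear algebraic system, can be illustrated on a generic second-kind Fredholm equation. The Python sketch below is only a schematic stand-in: the trapezoidal rule, the Gaussian-shaped kernel and the manufactured solution are assumptions made for illustration, not the potential kernel or quadrature of the paper.

import numpy as np

# Nystrom's method for phi(s) - \int_0^1 k(s,t) phi(t) dt = g(s):
# a quadrature rule turns the integral equation into the linear system
# (I - K W) phi = g at the quadrature nodes.

def kernel(s, t):
    return np.exp(-(s - t) ** 2)          # smooth stand-in kernel (assumption)

def trap_weights(t):
    w = np.full(t.size, t[1] - t[0])
    w[0] *= 0.5
    w[-1] *= 0.5
    return w

def solve_nystrom(n, g):
    t = np.linspace(0.0, 1.0, n)
    w = trap_weights(t)
    A = np.eye(n) - kernel(t[:, None], t[None, :]) * w[None, :]
    return t, np.linalg.solve(A, g(t))

# Manufactured solution: pick phi, build the matching right-hand side with a fine
# quadrature, then check that the coarse Nystrom solution reproduces phi.
phi_exact = lambda t: np.cos(np.pi * t)
t_fine = np.linspace(0.0, 1.0, 2001)
w_fine = trap_weights(t_fine)

def g(s):
    s = np.atleast_1d(s)
    integral = (kernel(s[:, None], t_fine[None, :]) * phi_exact(t_fine)[None, :]) @ w_fine
    return phi_exact(s) - integral

t, phi = solve_nystrom(41, g)
print("max error:", np.abs(phi - phi_exact(t)).max())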
Abstract:
The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights and threshold so as to minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering and the weights are found using error backpropagation. We consider three machines, namely a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the US Postal Service database of handwritten digits, the SV machine achieves the highest test accuracy, followed by the hybrid approach. The SV approach is thus not only theoretically well-founded, but also superior in a practical application.
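A rough analogue of this comparison can be set up in a few lines of Python: an RBF-kernel SVM versus a simple RBF network whose centres come from k-means and whose output weights are fitted by logistic regression (standing in here for error backpropagation). The scikit-learn digits data, the number of centres and the kernel width are placeholders rather than the US Postal Service setup of the paper.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# SV machine with Gaussian kernel: centres (support vectors) chosen by the SV algorithm.
svm = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X_tr, y_tr)
print("RBF-kernel SVM accuracy     :", svm.score(X_te, y_te))

# Classical RBF machine: centres from k-means, linear output layer on the RBF activations.
centres = KMeans(n_clusters=60, n_init=10, random_state=0).fit(X_tr).cluster_centers_
Phi_tr = rbf_kernel(X_tr, centres, gamma=0.001)
Phi_te = rbf_kernel(X_te, centres, gamma=0.001)
rbf_net = LogisticRegression(max_iter=2000).fit(Phi_tr, y_tr)
print("k-means RBF network accuracy:", rbf_net.score(Phi_te, y_te))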
Abstract:
Impressive claims have been made for the performance of the SNoW algorithm on face detection tasks by Yang et al. [7]. In particular, by looking at both their results and those of Heisele et al. [3], one could infer that the SNoW system performed substantially better than an SVM-based system, even when the SVM used a polynomial kernel and the SNoW system used a particularly simplistic 'primitive' linear representation. We evaluated the two approaches in a controlled experiment, looking directly at performance on a simple, fixed-size test set and isolating out 'infrastructure' issues related to detecting faces at various scales in large images. We found that SNoW performed about as well as linear SVMs, and substantially worse than polynomial SVMs.
Abstract:
We derive a new representation of a function as a linear combination of local correlation kernels at optimal sparse locations and discuss its relation to PCA, regularization, sparsity principles and Support Vector Machines. We first review previous results for the approximation of a function from discrete data (Girosi, 1998) in the context of Vapnik's feature space and dual representation (Vapnik, 1995). We apply them to show 1) that a standard regularization functional with a stabilizer defined in terms of the correlation function induces a regression function in the span of the feature space of classical Principal Components and 2) that there exists a dual representation of the regression function in terms of a regularization network with a kernel equal to a generalized correlation function. We then describe the main observation of the paper: the dual representation in terms of the correlation function can be sparsified using the Support Vector Machines (Vapnik, 1982) technique, and this operation is equivalent to sparsifying a large dictionary of basis functions adapted to the task, using a variation of Basis Pursuit De-Noising (Chen, Donoho and Saunders, 1995; see also related work by Donahue and Geiger, 1994; Olshausen and Field, 1995; Lewicki and Sejnowski, 1998). In addition to extending the close relations between regularization, Support Vector Machines and sparsity, our work also illuminates and formalizes the LFA concept of Penev and Atick (1996). We discuss the relation between our results, which concern regression, and the different problem of pattern classification.
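The sparsification described here, replacing a dense kernel expansion with one supported on a few locations, can be illustrated by contrasting kernel ridge regression (one coefficient per training point) with ε-insensitive support vector regression (coefficients only at the support vectors). The 1-D toy data and the Gaussian kernel in this Python sketch are placeholders for the correlation kernels discussed in the paper.

import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(120, 1)), axis=0)
y = np.sinc(X).ravel() + rng.normal(scale=0.05, size=120)

# Dense expansion: kernel ridge regression assigns a coefficient to every training point.
krr = KernelRidge(kernel="rbf", gamma=1.0, alpha=1e-2).fit(X, y)

# Sparse expansion: epsilon-insensitive SVR keeps coefficients only at the support vectors.
svr = SVR(kernel="rbf", gamma=1.0, C=10.0, epsilon=0.1).fit(X, y)

print("kernel ridge coefficients:", krr.dual_coef_.size)             # one per training point
print("SVR support vectors      :", svr.support_vectors_.shape[0])   # typically far fewer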
Abstract:
In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined by Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
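In standard textbook notation (which may not match the paper's), the two problems being related can be written as

\[
\min_{f \in \mathcal{H}_{K}} \; \tfrac{1}{2}\,\| f \|_{K}^{2} \;+\; C \sum_{i=1}^{\ell} \bigl| y_i - f(x_i) \bigr|_{\varepsilon}
\qquad \text{(Support Vector Machine regression)},
\]
\[
\min_{a} \; \tfrac{1}{2}\, \Bigl\| y - \sum_{j} a_j \varphi_j \Bigr\|^{2} \;+\; \lambda \,\| a \|_{1}
\qquad \text{(Basis Pursuit De-Noising)},
\]

where $|\cdot|_{\varepsilon}$ is Vapnik's $\varepsilon$-insensitive loss, $\|\cdot\|_{K}$ the norm in the reproducing kernel Hilbert space, and $\{\varphi_j\}$ a dictionary of basis functions. The paper's claim is that, for a dictionary built from the kernel and suitably matched $\varepsilon$, $C$ and $\lambda$, the two problems lead to the same quadratic program and the same solution.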
Abstract:
A problem in the archaeometric classification of Catalan Renaissance pottery is the fact that the clay supply of the pottery workshops was centrally organized by guilds, and therefore all potters of a single production centre usually produced chemically similar ceramics. However, when the glazes of the ware are analysed, a large number of inclusions is usually found in the glaze, which reveal technological differences between individual workshops. These inclusions were used by the potters to opacify the transparent glaze and to achieve a white background for further decoration. In order to distinguish the different technological preparation procedures of the individual workshops, the chemical composition of these inclusions, as well as their size in the two-dimensional cut, is recorded with a scanning electron microscope. Based on the latter, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion. Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different kinds of kernel density estimation and Monte Carlo tests of the methodology. Finally, it is tested to what extent the obtained frequency distributions can be used to classify the pottery.
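The kernel density estimation step, turning measured apparent inclusion diameters from a cut into a smooth frequency distribution, might look like the following Python sketch; the simulated diameters stand in for the SEM measurements, and the Wicksell back-transformation to true three-dimensional diameters is not included.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Placeholder: apparent (2-D) inclusion diameters, in micrometres, measured on the cut.
apparent_d = rng.gamma(shape=4.0, scale=1.5, size=300)

# Gaussian kernel density estimate of the apparent-diameter distribution
# (bandwidth chosen by Scott's rule by default).
kde = gaussian_kde(apparent_d)
grid = np.linspace(0.0, apparent_d.max(), 200)
density = kde(grid)

print("approximate mode of apparent diameters:", grid[density.argmax()])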
Abstract:
In this paper a colour texture segmentation method which unifies region and boundary information is proposed. The algorithm uses a coarse detection of the perceptual (colour and texture) edges of the image to adequately place and initialise a set of active regions. The colour texture of a region is modelled by the conjunction of non-parametric kernel density estimation (which captures the colour behaviour) and classical co-occurrence-matrix-based texture features. Region information is thus defined, and accurate boundary information can be extracted to guide the segmentation process. Regions concurrently compete for the image pixels in order to segment the whole image taking both information sources into account. Furthermore, experimental results are shown which demonstrate the performance of the proposed method.
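The two information sources combined for each region, a non-parametric colour density and co-occurrence texture features, could be computed roughly as in the Python sketch below; the random patch, the choice of a Gaussian KDE over RGB values and the particular co-occurrence properties are illustrative assumptions, not the exact features of the paper.

import numpy as np
from scipy.stats import gaussian_kde
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
patch_rgb = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)   # placeholder region
patch_gray = patch_rgb.mean(axis=2).astype(np.uint8)

# Colour model of the region: kernel density estimate over its RGB values.
colour_kde = gaussian_kde(patch_rgb.reshape(-1, 3).T.astype(float))
print("colour density at mid-grey:", colour_kde([[128], [128], [128]])[0])

# Texture model of the region: grey-level co-occurrence matrix features.
glcm = graycomatrix(patch_gray, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
print("contrast   :", graycoprops(glcm, "contrast").mean())
print("homogeneity:", graycoprops(glcm, "homogeneity").mean())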
Abstract:
The purpose of this study is to evaluate the sensitivity, specificity and predictive values of the Cuestionario Anamnésico de Síntomas de Miembro Superior y Columna (CASMSC; an anamnestic questionnaire of upper-limb and spine symptoms) developed by the Unidad de Investigación de Ergonomía de Postura y Movimiento (EPM). A descriptive, correlational study was carried out through secondary analysis of a database containing 2013 records of food-industry workers (n=401) who had completed the CASMSC and undergone a clinical physiotherapy assessment focused on the same body segments, the latter being used as the gold standard. One-way analysis of variance was applied to test for statistical differences by age, length of service and gender. The sensitivity, specificity and predictive values of the CASMSC are reported with their respective 95% confidence intervals. The prevalence of a positive threshold for suspected musculoskeletal disorder (MSD), for both the upper limb and the spine, was well above the national average for the sector. The sensitivity of the CASMSC for the upper limb ranged from 80% to 94.57%, while for the cervical and lumbar spine it was 36.4% and 43.4%, respectively. For the dorsal region it was almost twice that of the other two regions (85.7%). The CASMSC is recommended for its upper-limb section, given its high level of sensitivity.
Abstract:
Introduction: Treatment with tumour necrosis factor alpha antagonists (anti-TNF) has had a positive impact on the prognosis and quality of life of patients with rheumatoid arthritis (RA); however, an increased risk of developing melanoma has been suggested. Objective: To determine the association between the use of anti-TNF agents and the development of malignant melanoma in patients with RA. Methods: A systematic search of MEDLINE, EMBASE, COCHRANE LIBRARY and LILACS was carried out for clinical trials, observational studies, reviews and meta-analyses in adult patients diagnosed with RA and treated with anti-TNF agents (certolizumab pegol, adalimumab, etanercept, infliximab and golimumab). Results: 37 clinical studies met the inclusion criteria for the meta-analysis, with a population of 16,567 patients. The heterogeneity analysis was not significant (p=1), and no difference in risk was found between the compared groups, RD -0.00 (95% CI -0.001; -0.001). An additional analysis of the studies reporting at least one case of melanoma (4,222 patients) also showed no difference in risk, RD -0.00 (95% CI -0.004; -0.003). Conclusion: In the evidence available to date, we found no significant association between anti-TNF treatment in patients diagnosed with RA and the development of cutaneous melanoma.
Abstract:
We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the incidence of crime, as well as to controlling for state and year fixed effects. The Curve is also revealed in nonparametric specifications. The Crime Kuznets Curve exists for property crime and for some categories of violent crime.
Abstract:
Not published
Abstract:
This guide was produced within the framework of the Collaboration Agreement to promote health education in schools, signed between the Ministries of the Interior, of Education and Culture, and of Health and Consumer Affairs. Title of vol. II: Propuestas de actividades prácticas (proposals for practical activities)