Abstract:
Support Vector Machines Regression (SVMR) is a regression technique which has been recently introduced by V. Vapnik and his collaborators (Vapnik, 1995; Vapnik, Golowich and Smola, 1996). In SVMR the goodness of fit is measured not by the usual quadratic loss function (the mean square error), but by a different loss function called Vapnik's $\epsilon$-insensitive loss function, which is similar to the "robust" loss functions introduced by Huber (Huber, 1981). The quadratic loss function is well justified under the assumption of Gaussian additive noise. However, the noise model underlying the choice of Vapnik's loss function is less clear. In this paper the use of Vapnik's loss function is shown to be equivalent to a model of additive and Gaussian noise, where the variance and mean of the Gaussian are random variables. The probability distributions for the variance and mean will be stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach. It applies not only to Vapnik's loss function, but to a much broader class of loss functions.
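For readers comparing the two losses, a minimal sketch of the $\epsilon$-insensitive loss next to the quadratic loss; the function name and the value of eps below are illustrative choices, not taken from the paper:

```python
import numpy as np

def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
    """Vapnik's epsilon-insensitive loss: residuals inside the eps-tube
    cost nothing; outside it the cost grows only linearly, unlike the
    quadratic loss (y_true - y_pred)**2."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

# Residuals of 0.05 and 0.5 with eps = 0.1 cost 0.0 and 0.4 respectively.
print(epsilon_insensitive_loss(np.array([0.05, 0.5]), np.zeros(2)))
```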
Abstract:
In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined by Chen, Donoho and Saunders (1995) and by Olshausen and Field (1996). Then we focus on two specific (approximate) implementations of SRM and Sparse Approximation, which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
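As a rough illustration of the Sparse Approximation side (a sketch only, not code from the paper; scikit-learn's Lasso minimizes the same L1-penalized least-squares objective as Basis Pursuit De-Noising, and the dictionary, signal and alpha below are hypothetical):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical dictionary Phi (100 samples x 200 atoms) and a signal y
# synthesized from three atoms plus a little noise.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((100, 200))
a_true = np.zeros(200)
a_true[[3, 50, 120]] = [1.0, -2.0, 0.5]
y = Phi @ a_true + 0.01 * rng.standard_normal(100)

# min_a  (1/2n) ||y - Phi a||^2 + alpha ||a||_1   (the BPDN objective)
model = Lasso(alpha=0.01, fit_intercept=False).fit(Phi, y)
print(np.flatnonzero(np.abs(model.coef_) > 1e-2))  # indices of the atoms kept
```

The solution should keep only a handful of atoms; this sparsity is what the paper relates to the support vectors of an SVM.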
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand the problem is very challenging, because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Therefore, training problems arising in some real applications with large data sets are impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to generate improved iterates and to establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm, using a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
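To make the memory issue concrete, here is a toy sketch of the dense dual QP (made-up data and a generic SLSQP solver; this is not the paper's decomposition algorithm, whose point is precisely to avoid forming the full matrix Q):

```python
import numpy as np
from scipy.optimize import minimize

# Four toy points in two classes, labels in {-1, +1}.
X = np.array([[1.0, 1.0], [2.0, 2.5], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
C = 10.0

# Dense n x n quadratic form: its O(n^2) storage is what makes large
# training problems impossible to load into memory.
Q = (y[:, None] * y[None, :]) * (X @ X.T)

def neg_dual(a):  # negative SVM dual objective (we minimize it)
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(neg_dual, np.zeros(len(y)),
               bounds=[(0.0, C)] * len(y),                          # box constraints
               constraints={'type': 'eq', 'fun': lambda a: a @ y})  # linear constraint
print(res.x)  # nonzero alphas mark the support vectors
```

A decomposition method instead optimizes over a small working set of variables at a time, freezing the rest, so only a few columns of Q are ever needed in memory.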
Abstract:
In this paper we consider the problem of approximating a function belonging to some function space Φ by a linear combination of n translates of a given function G. Using a lemma by Jones (1990) and Barron (1991), we show that it is possible to define function spaces and functions G for which the rate of convergence to zero of the error is $O(1/n)$ in any number of dimensions. The apparent avoidance of the "curse of dimensionality" is due to the fact that these function spaces become more and more constrained as the dimension increases. Examples include spaces of the Sobolev type, in which the number of weak derivatives is required to be larger than the number of dimensions. We give results both for approximation in the $L_2$ norm and in the $L_\infty$ norm. The interesting feature of these results is that, thanks to the constructive nature of Jones' and Barron's lemma, an iterative procedure is defined that can achieve this rate.
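A possible illustration of the constructive iteration (a sketch assuming the target lies in the convex hull of the translates; the Gaussian G, the grid and the greedy selection rule are choices of this example, not of the paper):

```python
import numpy as np

# Dictionary of translates of a Gaussian G on a 1-D grid.
x = np.linspace(-5, 5, 400)
centers = np.linspace(-5, 5, 60)
G = np.exp(-(x[:, None] - centers[None, :]) ** 2)

# Target f chosen inside the convex hull of the translates, as the lemma requires.
f = G[:, [5, 20, 40]].mean(axis=1)

fn = np.zeros_like(x)
for n in range(1, 41):
    step = 1.0 / n
    # Greedily pick the translate g minimizing the error of (1-step)*fn + step*g.
    errs = np.linalg.norm((1 - step) * fn[:, None] + step * G - f[:, None], axis=0)
    fn = (1 - step) * fn + step * G[:, int(np.argmin(errs))]

print(np.linalg.norm(fn - f) ** 2)  # the squared L2 error decays like O(1/n)
```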
Abstract:
Its contents can be consulted on the Internet at www.infoconsumo.es/eecred/
Abstract:
As universities offer tuition through online learning environments, “onsite students” in higher education are increasingly becoming “online learners”. Since the medium for learning (and teaching) online is a digital environment, and at a distance, the role taken by students and teaching staff differs from the one they are used to in traditional onsite settings. The Role of the Online Learner, presented in this paper, is therefore key for onsite students who are to become online learners. This role consists of five competences: Operational, Cognitive, Collaborative, Self-directing and Course-specific. These five competences integrate the various skills, strategies, attitudes and awareness that make up the role of the online learner, and that learners use to perform efficiently online. They also form the basis of a tutorial for would-be online learners, which goes over the Role of the Online Learner by means of concepts, examples and reflective activities. This tutorial, available to students on the author’s website, is also helpful to teaching and counselling staff in guiding their students to become online learners.
Abstract:
Abstract taken from the publication.
Abstract:
The drawings are by Federico Herrera Cuesta, 'Fiquín'. This book is complemented by the teaching guide of the same title in the Riola collection, number 3.
Abstract:
The bibliography is presented in two columns. The design, illustrations and layout were the work of Federico Herrera, 'Fiquín', and María del Carmen García, 'Badiqui'.
Abstract:
CEIP Manuel Pérez in Bollullos Par del Condado (Huelva) has received the First Prize for Educational Web Pages from the Junta de Andalucía, and has been awarded the Gold Medal for Educational Merit.
Abstract:
Abstract taken from the publication. Abstract also available in English.
Abstract:
In Colombia, colon and rectal cancer ranks as the fifth leading cause of cancer-attributable mortality. We analyze colorectal cancer mortality trends between 1985 and 2004 using an age-period-cohort statistical model. Death certificates recording this diagnosis as the underlying cause of death were identified in the DANE death databases for 1985 to 2004. Deaths were grouped by age group and by five-year period of death, and mortality trends were analyzed using an Age-Period-Cohort (APC) statistical model. Results: the AGE + COHORT models fitted the behaviour of colon and rectal cancer mortality in men both nationally (deviance 24.44; p 0.169; AIC 8.72) and in the city of Bogotá, and in women nationally; among women in Bogotá, colorectal cancer mortality is explained by the effect of age alone (deviance 36.1; p 0.2; AIC 6.92). Discussion: the most clearly observed effect is the cohort effect, evident in the cohorts of the mid-twentieth century, which could correspond to changes in lifestyle. Analytical studies are recommended to better explain the cohort effect and to clarify why this effect does not appear among women in Bogotá.
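For readers unfamiliar with age-period-cohort modelling, a minimal sketch of an AGE + COHORT Poisson model of the kind whose deviance and AIC the abstract reports; the counts below are fabricated toy data, not the DANE records:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Fabricated deaths and person-years by 5-year age group and period.
rng = np.random.default_rng(1)
df = pd.DataFrame({"age": np.repeat(np.arange(40, 85, 5), 4),
                   "period": np.tile([1987, 1992, 1997, 2002], 9)})
df["cohort"] = df["period"] - df["age"]   # birth cohort = period - age
df["pyears"] = 1e5
df["deaths"] = rng.poisson(5 + 0.1 * (df["age"] - 40))

# AGE + COHORT Poisson regression with a person-years offset (dropping one
# of the three APC terms sidesteps their exact linear dependence).
m = smf.glm("deaths ~ C(age) + C(cohort)", data=df,
            family=sm.families.Poisson(),
            offset=np.log(df["pyears"])).fit()
print(m.deviance, m.aic)
```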
Abstract:
The quality of arterial blood gas interpretation by emergency medicine residents in Colombia is currently unknown. Arterial blood gases are among the most quickly obtained and most frequently used diagnostic aids in the emergency department, being indispensable in the assessment of highly prevalent conditions such as respiratory diseases and sepsis. Their misinterpretation can lead to mismanagement of critically ill patients, so it is essential that residents receive good training in interpreting them. For this reason, this analytical, cross-sectional concordance study with prospective data collection was conducted to determine the degree of agreement in arterial blood gas interpretation between residents of the Emergency Medicine program of the Universidad del Rosario and a critical care specialist, as well as agreement among the residents by level of training, and to describe their interpretation findings. Sixty arterial blood gas samples from patients hospitalized in the intensive care unit of the Fundación Santa Fe de Bogotá were collected, and agreement between the readings of the Emergency Medicine residents and an intensivist was assessed. Moderate agreement was found (r 0.445 and 0.442) in the responses of second- and third-year residents (p: 0.000 and 0.01). (MeSH: Blood Gas Analysis, Emergency Medical Services, Education, Medical, Graduate)
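As an illustration of the concordance computation (hypothetical readings and Cohen's kappa, a common chance-corrected agreement statistic; the abstract's r coefficients may come from a different measure):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical categorical interpretations of the same 60 blood gases
# (e.g. 0 = normal, 1 = respiratory acidosis, 2 = metabolic acidosis, 3 = other).
rng = np.random.default_rng(2)
intensivist = rng.integers(0, 4, size=60)
agrees = rng.random(60) < 0.7  # resident agrees about 70% of the time
resident = np.where(agrees, intensivist, rng.integers(0, 4, size=60))

# Chance-corrected agreement between the two raters.
print(cohen_kappa_score(resident, intensivist))
```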
Abstract:
In this document we ask about the possible compatibility between a particular statistical tool, the Bayesian theory of individual decision-making under uncertainty, and long-term strategic planning processes in any type of organization. We look for this compatibility in the claims of two authors in particular: Herbert Simon and Henry Mintzberg.
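As a minimal sketch of the statistical tool in question (a made-up decision table; Bayesian decision theory picks the action with the highest expected utility under a posterior over states):

```python
import numpy as np

posterior = np.array([0.2, 0.5, 0.3])   # hypothetical P(state | evidence)
utility = np.array([[10.0, -5.0, 0.0],  # rows: candidate plans
                    [ 2.0,  4.0, 3.0],  # cols: states of the world
                    [ 0.0,  1.0, 8.0]])

expected = utility @ posterior          # expected utility of each plan
print("best plan:", int(np.argmax(expected)), expected)
```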