10 results for variance analysis
at Universidad Politécnica de Madrid
Abstract:
In the last decade, Object-Based Image Analysis (OBIA) has been accepted as an effective method for processing high-spatial-resolution multiband images. The approach starts with segmentation of the image, i.e. a procedure that partitions the image into homogeneous groups of pixels (segments). In practice, segmentation quality is often assessed by visual interpretation, so the analysis relies on the experience of the analyst. To address this issue, in this study we evaluate several seed-selection strategies for an automatic image segmentation methodology based on a seeded region growing-and-merging approach. To evaluate segmentation quality, the segments were subjected to spatial autocorrelation analysis using Moran's I index and to intra-segment variance analysis. We apply the algorithm to the segmentation of an aerial multiband image.
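As a rough illustration of the two quality measures named above, the following minimal NumPy sketch (not the study's implementation) computes a global Moran's I over per-segment mean values, given a hypothetical segment adjacency matrix, together with an area-weighted intra-segment variance; lower between-segment autocorrelation combined with lower intra-segment variance is usually read as better segmentation.

```python
import numpy as np

def morans_i(seg_means, adjacency):
    """Global Moran's I over per-segment mean values (one band).

    seg_means : (n,) per-segment mean reflectance
    adjacency : (n, n) binary matrix, 1 where two segments share a border
    """
    x = np.asarray(seg_means, dtype=float)
    w = np.asarray(adjacency, dtype=float)
    n = x.size
    dev = x - x.mean()
    s0 = w.sum()                                   # sum of all spatial weights
    num = n * (w * np.outer(dev, dev)).sum()
    den = s0 * (dev ** 2).sum()
    return num / den

def weighted_intra_variance(seg_vars, seg_areas):
    """Area-weighted average of within-segment variances (lower = more homogeneous)."""
    v = np.asarray(seg_vars, dtype=float)
    a = np.asarray(seg_areas, dtype=float)
    return (a * v).sum() / a.sum()
```

Both functions assume that per-segment statistics (mean, variance, area) and the adjacency matrix have already been extracted from the segmentation output.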
Abstract:
In this paper, a new method is presented to ensure automatic synchronization of intracardiac ECG data, yielding a three-stage algorithm. We first compute a robust estimate of the derivative of the data to remove low-frequency perturbations. Then we provide a grouped-sparse representation of the data, by means of the Group LASSO, to ensure that all the electrical spikes are simultaneously detected. Finally, a post-processing step, based on a variance analysis, is performed to discard false alarms. Preliminary results on real data for sinus rhythm and atrial fibrillation show the potential of this approach.
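The grouped-sparsity stage can be pictured with the standard block soft-thresholding operator, i.e. the proximal map of the Group LASSO penalty. The sketch below is a generic illustration of that building block only (treating each time sample across all leads as one group), not the authors' algorithm; `lam` is a hypothetical regularisation weight.

```python
import numpy as np

def group_soft_threshold(D, lam):
    """Block soft-thresholding: proximal operator of the group-lasso penalty.

    D   : (T, L) matrix of robust derivative estimates, one column per
          intracardiac lead; each row (one time sample across leads) is a group.
    lam : regularisation weight; larger values keep fewer spike candidates.
    Rows whose joint energy across leads falls below lam are zeroed, so a spike
    is either kept on all leads simultaneously or discarded everywhere.
    """
    norms = np.linalg.norm(D, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return D * scale

def spike_candidates(D, lam):
    """Indices of time samples left non-zero after group thresholding."""
    return np.flatnonzero(np.linalg.norm(group_soft_threshold(D, lam), axis=1) > 0)
```

The surviving candidate indices would then be passed to a variance-based post-processing step to discard false alarms, as described above.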
Abstract:
The main goal of this PhD thesis is to demonstrate that a multidisciplinary analysis is needed during the implementation phase of heritage preservation programmes. Such an analysis allows the identification of significant results, which in turn can serve as the foundation for rectifying the conceptual bases of the public policy at hand. Corrections can thus be made both during the programme and afterwards, by introducing new strategies to be applied in future intervention programmes. The project also asks whether cities participating in the same national heritage preservation programme, governed by common rules and goals, can achieve distinct results. To meet these objectives, the project chose Brazil as its focus and the Monumenta Programme as its object of study. This Programme was part of the Ministry of Culture's public cultural policy and was developed in cooperation with the Inter-American Development Bank. Implemented at the national level from 1999, the Programme aimed to promote a process of sustainable urban renewal and the preservation of 26 urban historic sites or urban monumental ensembles protected by the National Historic and Artistic Heritage Institute (Iphan). UNESCO and Iphan supported the Programme, and participants included municipal and state administrations, private businesses, and civil society. The hypotheses established in this doctoral thesis were: (a) the analysis of results during the implementation phase of a heritage preservation programme is imperative, because it enables preliminary conclusions to be drawn and reforms to be oriented; (b) the short-term objectives set out in the Monumenta Programme were achieved to different degrees in the benefitted cities; (c) despite these differences, the Monumenta Programme showed significant positive preliminary results in the preservation of Brazil's urban historic heritage. The methodological procedures centred on quantitative, qualitative, and comparative analyses of the results achieved in three benefitted cities, selected according to population size: Pelotas, Porto Alegre (Rio Grande do Sul State) and São Francisco do Sul (Santa Catarina State). These procedures were applied to the following indicators: use of cultural facilities, characteristics of the population and of residential housing, variation in economic activities, financing granted to the private sector for the recovery of real estate, and the promotion of urban safety. The thesis draws on discussions and concepts from urban sociology, urban geography, history, economics, and statistics, in order to examine the object of study from an interdisciplinary perspective and to connect preservation theory and practice. Analysis of variance, linear regression, and factor analysis were the statistical techniques applied to the data, with the goal of establishing the significance of the results and the correspondence between some of the variables. The thesis also contributes an analytical methodology for calculating the surface area occupied by economic activities, based on John Wilder Tukey's box-and-whisker plot. The conclusions corroborate the established hypotheses and are meant to contribute to the design of new public policies for the preservation of urban historic sites, emphasizing the need for deeper evaluation of results during the implementation phase.
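For the box-and-whisker based calculation, a minimal sketch of Tukey's five-number summary and 1.5×IQR fences is given below; how the thesis applies it to the floor-area figures is an assumption on my part, and the code only shows the generic method.

```python
import numpy as np

def tukey_summary(areas):
    """Tukey box-and-whisker summary of a sample, e.g. floor areas (m2)
    occupied by economic activities in a historic site.

    Returns the five-number summary plus the 1.5*IQR fences Tukey's method
    uses to flag outlying observations before aggregation.
    """
    a = np.asarray(areas, dtype=float)
    q1, med, q3 = np.percentile(a, [25, 50, 75])
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = a[(a < lo_fence) | (a > hi_fence)]
    return {"min": a.min(), "q1": q1, "median": med, "q3": q3, "max": a.max(),
            "fences": (lo_fence, hi_fence), "outliers": outliers}
```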
Abstract:
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best in each case. Aim: Compile a set of rules that software engineering researchers can use to ascertain which aggregation method is best for the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
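For orientation, the sketch below shows two of the parametric ingredients in their textbook form: fixed-effect inverse-variance pooling (as used by WMD-style aggregation) and the per-experiment log response ratio with its usual sampling variance. These are standard formulas, not necessarily the exact estimators simulated in the paper.

```python
import numpy as np

def pooled_effect(effects, variances):
    """Fixed-effect, inverse-variance pooling of per-experiment effects."""
    w = 1.0 / np.asarray(variances, dtype=float)
    e = np.asarray(effects, dtype=float)
    est = (w * e).sum() / w.sum()
    se = np.sqrt(1.0 / w.sum())       # standard error of the pooled estimate
    return est, se

def log_response_ratio(mean_exp, mean_ctrl, var_exp, var_ctrl, n_exp, n_ctrl):
    """Log response ratio and its usual sampling variance (needs reported variances)."""
    lnrr = np.log(mean_exp / mean_ctrl)
    v = var_exp / (n_exp * mean_exp ** 2) + var_ctrl / (n_ctrl * mean_ctrl ** 2)
    return lnrr, v
```

The non-parametric response ratio and vote counting, by contrast, do not need the per-experiment variances, which is the strength noted in the Background.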
Abstract:
Non-failure analysis aims at inferring that predicate calls in a program will never fail. This type of information has many applications in functional/logic programming. It is essential for determining lower bounds on the computational cost of calls, useful in the context of program parallelization, instrumental in partial evaluation and other program transformations, and has also been used in query optimization. In this paper, we recast the non-failure analysis proposed by Debray et al. as an abstract interpretation, which not only allows it to be studied within a standard and well-understood theoretical framework, but also has several practical advantages. It allows us to incorporate non-failure analysis into a standard, generic abstract interpretation engine. The analysis thus benefits from the fixpoint propagation algorithm, which leads to improved information propagation. The analysis also takes advantage of the multi-variance of the generic engine, so that it is now able to infer separate non-failure information for different call patterns. Moreover, the implementation is simpler and allows non-failure and covering analyses to be performed alongside other analyses, such as those for modes and types, in the same framework. Finally, besides the precision improvements and the additional simplicity, our implementation (in the Ciao/CiaoPP multiparadigm programming system) also shows better efficiency.
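A heavily simplified picture of such a multi-variant fixpoint engine is sketched below. `transfer`, `join`, and `bottom` are placeholders for the abstract domain's operations (not CiaoPP's API); the point is only that keeping one table entry per call pattern is what lets the same predicate receive different non-failure answers for different call patterns.

```python
def analyze(entry, call_pattern, transfer, join, bottom):
    """Minimal monotone fixpoint over (predicate, call-pattern) pairs.

    transfer(pred, cp, table) -> abstract answer, reading callees from `table`
                                 (and registering new callees with `bottom`)
    join(a, b)                -> least upper bound in the abstract domain
    """
    table = {(entry, call_pattern): bottom}
    changed = True
    while changed:                       # iterate until no entry changes
        changed = False
        for pred, cp in list(table):
            new = join(table[(pred, cp)], transfer(pred, cp, table))
            if new != table[(pred, cp)]:
                table[(pred, cp)] = new
                changed = True
    return table
```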
Abstract:
Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also used to improve the bias-variance tradeoff of an estimator. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular form, it is just one out of many. In this dissertation we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification, and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. The supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
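Since L1-regularization is the common thread, a minimal coordinate-descent lasso is sketched below as a generic illustration of how the penalty produces exact zeros via per-coefficient soft-thresholding; it is not any of the dissertation's specific methods.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm: the basic ingredient of lasso solvers."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for min_b 0.5*||y - Xb||^2 + lam*||b||_1.

    Coefficients whose partial correlation with the residual stays below lam
    are driven exactly to zero, which is where the sparsity comes from.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ b
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:          # skip degenerate (all-zero) columns
                continue
            r += X[:, j] * b[j]           # remove j-th contribution from residual
            rho = X[:, j] @ r
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]           # add the updated contribution back
    return b
```

Group regularization, as used in the brain-computer interface work, replaces the per-coefficient threshold with a threshold on the norm of each coefficient group, so whole groups of inputs are kept or discarded together.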
Abstract:
Nitrous oxide emissions from a network of agricultural experiments in Europe were used to explore the relative importance of site and management controls of emissions. At each site, a set of management interventions was compared within replicated, plot-based experimental designs. Arable experiments were conducted at Beano in Italy, El Encin in Spain, Foulum in Denmark, Logarden in Sweden, Maulde in Belgium, Paulinenaue in Germany, and Tulloch in the UK. Grassland experiments were conducted at Crichton, Nafferton and Peaknaze in the UK, Godollo in Hungary, Rzecin in Poland, Zarnekow in Germany, and Theix in France. Nitrous oxide emissions were measured at each site over a period of at least two years using static chambers. Emissions varied widely between sites and as a result of the manipulation treatments. Average site emissions (throughout the study period) varied between 0.04 and 21.21 kg N2O-N ha−1 yr−1, with the largest fluxes and variability associated with the grassland sites. Total nitrogen addition was found to be the single most important determinant of emissions, accounting for 15 % of the variance (using linear regression) in the data from the arable sites (p<0.0001), and 77 % in the grassland sites. The annual emissions from arable sites were significantly greater than those predicted by IPCC default emission factors. Variability of N2O emissions within sites as a result of the manipulation treatments was greater than that resulting from site-to-site and year-to-year variation, highlighting the importance of management interventions in contributing to greenhouse gas mitigation.
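The "share of variance accounted for" figures correspond to the R² of a simple least-squares fit of annual emissions on total nitrogen addition; a minimal sketch of that computation (assuming per-plot annual values are already available) is shown below.

```python
import numpy as np

def variance_explained(n_applied, n2o_flux):
    """R^2 of a simple linear regression of annual N2O flux on total N addition.

    n_applied : total nitrogen added per plot (e.g. kg N ha-1 yr-1)
    n2o_flux  : measured annual N2O-N emission per plot (kg N2O-N ha-1 yr-1)
    """
    x = np.asarray(n_applied, dtype=float)
    y = np.asarray(n2o_flux, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)       # ordinary least squares line
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()
```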
Abstract:
This thesis analyzes the morphological evolution of assemblies of living neurons as they self-organize from collections of separated cells into elaborate, clustered networks. In particular, it contributes the design and implementation of a graph-based unsupervised segmentation algorithm with a very low computational cost. The processing automatically retrieves the whole network structure from large-scale phase-contrast images taken at high resolution throughout the entire life of a cultured neuronal network. The network structure is represented by a mathematical object (a matrix) in which nodes are identified neurons or neuron clusters, and links are the reconstructed connections between them. The algorithm is also able to extract other relevant morphological measures characterizing neurons and neurites. More importantly, and at variance with other segmentation methods that require fluorescence imaging from immunocytochemistry techniques, our measures are non-invasive and allow us to carry out a fully longitudinal analysis during the maturation of a single culture. In turn, a systematic statistical analysis of a group of topological observables makes it possible to quantify and track the progression of the main network characteristics during the self-organization process of the culture. Our results point to the existence of a particular state corresponding to a small-world network configuration, in which several relevant micro- and meso-scale graph properties emerge. Finally, we identify the main physical processes taking place during the culture's morphological transformations and embed them into a simplified growth model that quantitatively reproduces the overall set of experimental observations.
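The small-world check described above can be illustrated with standard graph metrics computed from the reconstructed adjacency matrix. The sketch below uses networkx, which is an assumption about tooling rather than the thesis pipeline; it compares clustering and path length against a density-matched random graph, the usual C >> C_rand with L ≈ L_rand signature.

```python
import networkx as nx
import numpy as np

def small_world_indicators(adj):
    """Clustering and average path length of the reconstructed culture network,
    compared against a density-matched Erdos-Renyi graph."""
    G = nx.from_numpy_array(np.asarray(adj))
    G = G.subgraph(max(nx.connected_components(G), key=len)).copy()  # largest component
    n, m = G.number_of_nodes(), G.number_of_edges()
    p = 2.0 * m / (n * (n - 1))                                      # matched edge density
    R = nx.gnp_random_graph(n, p, seed=0)
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
    return {
        "C": nx.average_clustering(G),
        "L": nx.average_shortest_path_length(G),
        "C_rand": nx.average_clustering(R),
        "L_rand": nx.average_shortest_path_length(R),
    }
```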
Abstract:
This paper discusses a model based on agency theory to analyze the optimal transfer of construction risk in public works contracts. The base assumption is that of a contract between a principal (public authority) and an agent (firm), where the payment mechanism is linear and contains an incentive mechanism to enhance the agent's effort to reduce construction costs. A theoretical model is proposed, starting from a cost function with a random component and assuming that both the public authority and the firm are risk-averse. The main outcome of the paper is that the optimal transfer of construction risk will be lower when the variance of errors in cost forecasting, the risk aversion of the firm, and the marginal cost of public funds are larger, while the optimal transfer of construction risk will grow when the variance of errors in cost monitoring and the risk aversion of the public authority are larger.
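The abstract does not give the paper's closed-form sharing rule, so the sketch below falls back on the textbook Pareto-optimal split of a risk between two CARA parties (shares proportional to risk tolerance). It ignores the forecast/monitoring error variances and the marginal cost of public funds, but it reproduces the two risk-aversion comparative statics stated above.

```python
def firm_risk_share(r_firm, r_authority):
    """Textbook risk sharing between two CARA parties: each bears a share
    proportional to its risk tolerance (1/r). NOT the paper's model -- it
    omits the error variances and the marginal cost of public funds -- but
    the share transferred to the firm falls as r_firm grows and rises as
    r_authority grows, matching the qualitative result in the abstract."""
    return r_authority / (r_firm + r_authority)

# e.g. firm_risk_share(2.0, 1.0) == 1/3, while firm_risk_share(1.0, 2.0) == 2/3
```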