17 results for Local classification method
in Cambridge University Engineering Department Publications Database
Abstract:
We provide a comprehensive overview of many recent algorithms for approximate inference in Gaussian process models for probabilistic binary classification. The relationships between several approaches are elucidated theoretically, and the properties of the different algorithms are corroborated by experimental results. We examine both 1) the quality of the predictive distributions and 2) the suitability of the different marginal likelihood approximations for model selection (selecting hyperparameters), and compare them to a gold standard based on MCMC. Interestingly, some methods produce good predictive distributions although their marginal likelihood approximations are poor. Strong conclusions are drawn about the methods: the Expectation Propagation algorithm is almost always the method of choice unless the computational budget is very tight. We also extend existing methods in various ways and provide unifying code implementing all approaches.
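One of the approximate inference schemes this kind of comparison covers, the Laplace approximation, can be sketched in a minimal form for GP binary classification. This is an illustrative toy implementation, not the paper's unifying code: the RBF kernel, lengthscale, data, and iteration count are all assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_gp_classify(X, y, X_star, lengthscale=1.0, n_iter=50):
    """Laplace approximation for GP binary classification (y in {-1,+1}).

    Runs the standard Newton iteration on the latent function f, then
    computes the approximate predictive class probability at test inputs.
    """
    n = len(X)
    K = rbf_kernel(X, X, lengthscale) + 1e-8 * np.eye(n)
    t = (y + 1) / 2.0                      # targets mapped to {0, 1}
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        W = pi * (1 - pi)                  # negative Hessian of log-likelihood
        grad = t - pi                      # gradient of log-likelihood
        # Stable Newton step via B = I + sqrt(W) K sqrt(W)
        B = np.eye(n) + np.sqrt(W)[:, None] * K * np.sqrt(W)[None, :]
        b = W * f + grad
        a = b - np.sqrt(W) * np.linalg.solve(B, np.sqrt(W) * (K @ b))
        f = K @ a
    # Predictive latent mean at test points, squashed through the sigmoid.
    k_star = rbf_kernel(X_star, X, lengthscale)
    return sigmoid(k_star @ (t - sigmoid(f)))

# Toy 1-D problem: negatives on the left, positives on the right.
X = np.array([-2.0, -1.5, -1.0, 1.0, 1.5, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1])
p = laplace_gp_classify(X, y, np.array([-1.8, 1.8]))
```

A test point near the negative cluster should get a class-1 probability below 0.5, and one near the positive cluster above 0.5.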
Abstract:
With recent developments in carbon-based electronics, it is imperative to understand the interplay between the morphology and electronic structure in graphene and graphite. We demonstrate controlled and repeatable vertical displacement of the top graphene layer from the substrate mediated by the scanning tunneling microscopy (STM) tip-sample interaction, manifested at the atomic level as well as over superlattices spanning several tens of nanometers. Besides the full displacement, we observed for the first time a half-displacement of the surface graphene layer, confirming that reduced coupling, rather than a change in lateral layer stacking, is responsible for the triangular/honeycomb atomic lattice transition phenomenon, resolving the controversy surrounding it. Furthermore, an atomic scale mechanical stress at a grain boundary in graphite, resulting in the localization of states near the Fermi energy, is revealed through voltage-dependent imaging. A method of producing graphene nanoribbons based on the manipulation capabilities of the STM is also implemented.
Abstract:
Holistic representations of natural scenes are an effective and powerful source of information for semantic classification and analysis of arbitrary images. Recently, the frequency domain has been successfully exploited to holistically encode the content of natural scenes in order to obtain a robust representation for scene classification. In this paper, we present a new approach to naturalness classification of scenes using the frequency domain. The proposed method is based on the ordering of the Discrete Fourier Power Spectra. Features extracted from this ordering are shown to be sufficient to build a robust holistic representation for Natural vs. Artificial scene classification. Experiments show that the proposed frequency domain method matches the accuracy of other state-of-the-art solutions. © 2008 Springer Berlin Heidelberg.
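The general idea of building a holistic descriptor from an ordering of the Discrete Fourier Power Spectrum can be sketched as follows. This is a rough illustration, not the paper's exact method: the function name, feature length, and normalisation are assumptions.

```python
import numpy as np

def power_spectrum_feature(image, n_features=32):
    """Hypothetical sketch: order the DFT power-spectrum coefficients of a
    grayscale image and keep the strongest ones as a holistic descriptor."""
    F = np.fft.fft2(image)
    power = np.abs(np.fft.fftshift(F)) ** 2
    # Sort all spectral power values in decreasing order and keep the
    # strongest components, normalised so the feature is contrast-invariant.
    ordered = np.sort(power.ravel())[::-1][:n_features]
    return ordered / (ordered.sum() + 1e-12)

rng = np.random.default_rng(0)
# A smooth, low-frequency-dominated field standing in for a "natural" image.
natural_like = rng.standard_normal((64, 64)).cumsum(axis=0).cumsum(axis=1)
feat = power_spectrum_feature(natural_like)
```

The descriptor is a fixed-length, non-increasing, normalised vector, so it can be fed directly to any standard classifier for the Natural vs. Artificial decision.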
Fourier analysis and Gabor filtering for texture analysis and local reconstruction of general shapes
Abstract:
Since the pioneering work of Gibson in 1950, Shape-From-Texture has been considered by researchers as a hard problem, mainly due to restrictive assumptions which often limit its applicability. We assume a very general stochastic homogeneity and perspective camera model, for both deterministic and stochastic textures. A multi-scale distortion is efficiently estimated with a previously presented method based on Fourier analysis and Gabor filters. The novel 3D reconstruction method that we propose applies to general shapes, and includes non-developable and extensive surfaces. Our algorithm is accurate, robust and compares favorably to the present state of the art of Shape-From-Texture. Results show its application to non-invasively study shape changes with laid-on textures, while rendering and retexturing of cloth is suggested for future work. © 2009 IEEE.
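A Gabor filter, the building block of the distortion estimation mentioned above, is simply a sinusoid windowed by a Gaussian envelope; its response measures local spectral energy at a chosen frequency and orientation. The sketch below uses the standard textbook form with illustrative parameters, not the paper's filter bank.

```python
import numpy as np

def gabor_kernel(size=31, wavelength=8.0, theta=0.0, sigma=4.0):
    """Real-valued Gabor kernel: a sinusoid at orientation `theta`
    windowed by an isotropic Gaussian envelope (textbook form)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_theta = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * x_theta / wavelength)

def local_frequency_response(patch, kernel):
    """Energy of a same-size patch under one Gabor channel."""
    return float(np.abs(np.sum(patch * kernel)))

k = gabor_kernel()
yy, xx = np.mgrid[0:31, 0:31]
# A texture matching the kernel's wavelength responds strongly...
matched = np.cos(2 * np.pi * (xx - 15) / 8.0)
# ...while a texture at double the frequency responds weakly.
mismatched = np.cos(2 * np.pi * (xx - 15) / 4.0)
r1 = local_frequency_response(matched, k)
r2 = local_frequency_response(mismatched, k)
```

Comparing such responses across image positions gives the local frequency shifts from which texture distortion, and hence surface shape, can be estimated.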
Abstract:
In Immersed Boundary Methods (IBM) the effect of complex geometries is introduced through the forces added in the Navier-Stokes solver at the grid points in the vicinity of the immersed boundaries. Most of the methods in the literature have been used with Cartesian grids. Moreover, many of the methods developed in the literature do not satisfy some basic conservation properties (the conservation of torque, for instance) on non-uniform meshes. In this paper we will follow the RKPM method originated by Liu et al. [1] to build locally regularized functions that verify a number of integral conditions. These local approximants will be used both for interpolating the velocity field and for spreading the singular force field in the framework of a pressure correction scheme for the incompressible Navier-Stokes equations. We will also demonstrate the robustness and effectiveness of the scheme through various examples. Copyright © 2010 by ASME.
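The interpolation/spreading role of such regularized functions can be illustrated with Peskin's classical 4-point regularized delta function, a simpler relative of the RKPM approximants described above (the grid spacing and boundary location below are illustrative assumptions):

```python
import numpy as np

def peskin_delta(r):
    """Peskin's 4-point regularized delta function (1-D, in grid units).
    Supported on |r| < 2; used to interpolate the velocity field to a
    Lagrangian point and to spread singular forces back to the grid."""
    r = np.abs(r)
    out = np.zeros_like(r)
    m1 = r < 1
    m2 = (r >= 1) & (r < 2)
    out[m1] = (3 - 2 * r[m1] + np.sqrt(1 + 4 * r[m1] - 4 * r[m1] ** 2)) / 8
    out[m2] = (5 - 2 * r[m2] - np.sqrt(-7 + 12 * r[m2] - 4 * r[m2] ** 2)) / 8
    return out

h = 0.1                               # uniform grid spacing (illustrative)
grid = np.arange(0.0, 4.0, h)
xb = 1.87                             # immersed-boundary point, off-grid
w = peskin_delta((grid - xb) / h)     # interpolation weights

# Zeroth-moment (integral) condition: the weights sum to one on a uniform
# grid, so a constant field is interpolated exactly at the boundary point.
u = np.full_like(grid, 3.5)
u_at_boundary = np.dot(w, u)
```

Verifying such moment conditions is exactly the kind of integral constraint the locally regularized functions in the paper are constructed to satisfy, including on non-uniform meshes.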
Abstract:
Information theoretic active learning has been widely studied for probabilistic models. For simple regression an optimal myopic policy is easily tractable. However, for other tasks and with more complex models, such as classification with nonparametric models, the optimal solution is harder to compute. Current approaches make approximations to achieve tractability. We propose an approach that expresses information gain in terms of predictive entropies, and apply this method to the Gaussian Process Classifier (GPC). Our approach makes minimal approximations to the full information theoretic objective. Our experimental performance compares favourably to many popular active learning algorithms, and has equal or lower computational complexity. We also compare well to decision theoretic approaches, which are privy to more information but require much more computation time. Second, by further developing a reformulation of binary preference learning as a classification problem, we extend our algorithm to Gaussian Process preference learning.
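The predictive-entropy form of information gain can be sketched concretely for a binary classifier: the gain at a candidate point is the entropy of the averaged predictive probability minus the average entropy under posterior samples. The sample probabilities below are invented for illustration.

```python
import numpy as np

def binary_entropy(p):
    """Entropy of a Bernoulli(p) in nats, numerically safe near 0 and 1."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def predictive_information_gain(prob_samples):
    """Information gain about the model from labelling one point,
    expressed via predictive entropies:
        H[ mean_s p_s ] - mean_s H[ p_s ]
    where the p_s are predictive class probabilities under samples from
    the posterior. Points where the samples disagree score highest."""
    prob_samples = np.asarray(prob_samples, dtype=float)
    return binary_entropy(prob_samples.mean()) - binary_entropy(prob_samples).mean()

confident = predictive_information_gain([0.90, 0.88, 0.91])  # samples agree
uncertain = predictive_information_gain([0.10, 0.50, 0.90])  # samples disagree
```

Because entropy is concave, the gain is always non-negative (Jensen's inequality), and an active learner would query the point with the largest gain.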
Abstract:
A computational impact analysis methodology has been developed, based on modal analysis and a local contact force-deflection model. The contact law is based on Hertz contact theory while the contact stresses remain elastic; it defines a modified contact theory to account for local permanent indentation, and considers elastic recovery during unloading. The model was validated experimentally through impact testing of glass-carbon hybrid braided composite panels. Specimens were mounted in a support frame and the contact force was inferred from the deceleration of the impactor, measured by high-speed photography. A Finite Element analysis of the panel and support frame assembly was performed to compute the modal responses. The new contact model performed well in predicting the peak forces and impact durations for moderate energy impacts (15 J), where contact stresses locally exceed the linear elastic limit and damage may be deemed to have occurred. C-scan measurements revealed substantial damage for impact energies in the range of 30-50 J. For this regime the new model predictions might be improved by characterisation of the contact law hysteresis during the unloading phase, and a modification of the elastic vibration response in line with damage levels acquired during the impact. © 2011 Elsevier Ltd. All rights reserved.
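The Hertzian loading law and an unloading branch shifted by a permanent indentation can be sketched as follows. The stiffness and indentation values are illustrative assumptions, and the unloading form is a deliberate simplification, not the paper's exact modified contact law.

```python
def hertz_force(delta, k=2.0e8):
    """Hertzian contact law F = k * delta^(3/2) for elastic loading
    (k is an illustrative contact stiffness, N/m^1.5)."""
    return k * max(delta, 0.0) ** 1.5

def modified_unloading_force(delta, delta_perm, k=2.0e8):
    """Simplified unloading branch with residual permanent indentation
    delta_perm: the force follows a Hertz-like curve shifted by delta_perm,
    so it falls to zero before the full indentation is recovered."""
    return k * max(delta - delta_perm, 0.0) ** 1.5

d_max = 1.0e-3    # peak indentation during impact (m), illustrative
d_perm = 2.0e-4   # permanent indentation after local yield (m), illustrative
f_peak = hertz_force(d_max)
f_unload_start = modified_unloading_force(d_max, d_perm)
f_unload_end = modified_unloading_force(d_perm, d_perm)
```

The gap between the loading and unloading curves represents the energy dissipated in local permanent indentation, which is the hysteresis the abstract suggests characterising for higher-energy impacts.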
Abstract:
We present a new haplotype-based approach for inferring local genetic ancestry of individuals in an admixed population. Most existing approaches for local ancestry estimation ignore the latent genetic relatedness between ancestral populations and treat them as independent. In this article, we exploit such information by building an inheritance model that describes both the ancestral populations and the admixed population jointly in a unified framework. Based on an assumption that the common hypothetical founder haplotypes give rise to both the ancestral and the admixed population haplotypes, we employ an infinite hidden Markov model to characterize each ancestral population and further extend it to generate the admixed population. Through an effective utilization of the population structural information under a principled nonparametric Bayesian framework, the resulting model is significantly less sensitive to the choice and the amount of training data for ancestral populations than state-of-the-art algorithms. We also improve the robustness under deviation from common modeling assumptions by incorporating population-specific scale parameters that allow variable recombination rates in different populations. Our method is applicable to an admixed population from an arbitrary number of ancestral populations and also performs competitively in terms of spurious ancestry proportions under a general multiway admixture assumption. We validate the proposed method by simulation under various admixing scenarios and present empirical analysis results from a worldwide-distributed dataset from the Human Genome Diversity Project.
Abstract:
In this paper a method to incorporate linguistic information regarding single-word and compound verbs is proposed, as a first step towards an SMT model based on linguistically-classified phrases. By substituting these verb structures by the base form of the head verb, we achieve better statistical word alignment performance, and are able to better estimate the translation model and generalize to unseen verb forms during translation. Preliminary experiments for the English-Spanish language pair are performed, and future research lines are detailed. © 2005 Association for Computational Linguistics.
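The substitution step might look like the following sketch. The lemma table and sentence are toy assumptions; an actual SMT pipeline would use a morphological analyser to map inflected and compound verb forms to the head verb's base form.

```python
import re

# Hypothetical toy lemma table mapping verb structures to the head verb's
# base form (illustrative only).
VERB_GROUPS = {
    "is going to visit": "visit",
    "has visited": "visit",
    "visited": "visit",
    "will visit": "visit",
}

def normalise_verbs(sentence, table=VERB_GROUPS):
    """Replace single-word and compound verb structures with the base form
    of the head verb, as a preprocessing step before word alignment."""
    # Longest patterns first so compound forms win over their sub-spans.
    for pattern in sorted(table, key=len, reverse=True):
        sentence = re.sub(r"\b" + re.escape(pattern) + r"\b",
                          table[pattern], sentence)
    return sentence

out = normalise_verbs("she has visited Madrid and is going to visit Sevilla")
```

Collapsing all forms onto one token increases the counts available to the alignment model and lets unseen inflections be handled through the shared base form.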
Abstract:
A hybrid method for the incompressible Navier-Stokes equations is presented. The method inherits the attractive stabilizing mechanism of upwinded discontinuous Galerkin methods when momentum advection becomes significant, equal-order interpolations can be used for the velocity and pressure fields, and mass can be conserved locally. Using continuous Lagrange multiplier spaces to enforce flux continuity across cell facets, the number of global degrees of freedom is the same as for a continuous Galerkin method on the same mesh. Different from our earlier investigations on the approach for the Navier-Stokes equations, the pressure field in this work is discontinuous across cell boundaries. It is shown that this leads to very good local mass conservation and, for an appropriate choice of finite element spaces, momentum conservation. Also, a new form of the momentum transport terms for the method is constructed such that global energy stability is guaranteed, even in the absence of a pointwise solenoidal velocity field. Mass conservation, momentum conservation, and global energy stability are proved for the time-continuous case and for a fully discrete scheme. The presented analysis results are supported by a range of numerical simulations. © 2012 Society for Industrial and Applied Mathematics.
Abstract:
The amount of original imaging information produced yearly during the last decade has experienced tremendous growth in all industries due to the technological breakthroughs in digital imaging and electronic storage capabilities. This trend is affecting the construction industry as well, where digital cameras and image databases are gradually replacing traditional photography. Owners demand complete site photograph logs and engineers store thousands of images for each project to use in a number of construction management tasks like monitoring an activity's progress and keeping evidence of the "as built" in case any disputes arise. So far, retrieval is done manually, with the user being responsible for image classification according to specific rules that serve a limited number of construction management tasks. New methods that, with the guidance of the user, can automatically classify and retrieve construction site images are being developed and promise to remove the heavy burden of manually indexing images. In this paper, both the existing methods and a novel image retrieval method developed by the authors for the classification and retrieval of construction site images are described and compared. Specifically, a number of examples are deployed in order to present their advantages and limitations. The results from this comparison demonstrate that the content-based image retrieval method developed by the authors can reduce the overall time spent on the classification and retrieval of construction images while providing the user with the flexibility to retrieve images according to different classification schemes.
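A minimal sketch of content-based retrieval, ranking database images by the distance between simple content descriptors, is shown below. The grayscale histogram descriptor and L1 distance are chosen purely for illustration and are not the authors' method.

```python
import numpy as np

def grey_histogram(image, bins=8):
    """Normalised grayscale histogram as a simple content descriptor."""
    h, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def retrieve(query, database):
    """Return the index of the database image whose histogram is closest
    to the query's, under the L1 distance."""
    q = grey_histogram(query)
    dists = [np.abs(q - grey_histogram(img)).sum() for img in database]
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
dark_site = rng.uniform(0.0, 0.4, (32, 32))    # e.g. excavation photos
bright_site = rng.uniform(0.6, 1.0, (32, 32))  # e.g. concrete-pour photos
query = rng.uniform(0.0, 0.4, (32, 32))
best = retrieve(query, [dark_site, bright_site])
```

The appeal of such content-based schemes is exactly what the abstract notes: the same stored descriptors can serve different classification schemes without re-indexing the image log by hand.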
Abstract:
Data quality (DQ) assessment can be significantly enhanced with the use of the right DQ assessment methods, which provide automated solutions to assess DQ. The range of DQ assessment methods is very broad: from data profiling and semantic profiling to data matching and data validation. This paper gives an overview of current methods for DQ assessment and classifies the DQ assessment methods into an existing taxonomy of DQ problems. Specific examples of the placement of each DQ method in the taxonomy are provided and illustrate why the method is relevant to the particular taxonomy position. The gaps in the taxonomy, where no current DQ methods exist, show where new methods are required and can guide future research and DQ tool development.
Abstract:
Surface temperature measurements from two discs of a gas turbine compressor rig are used as boundary conditions for the transient conduction solution (inverse heat transfer analysis). The disc geometry is complex, and so the finite element method is used. There are often large radial temperature gradients on the discs, and the equations are therefore solved taking into account the dependence of thermal conductivity on temperature. The solution technique also makes use of a multigrid algorithm to reduce the solution time. This is particularly important since a large amount of data must be analyzed to obtain correlations of the heat transfer. The finite element grid is also used for a network analysis to calculate the radiant heat transfer in the cavity formed between the two compressor discs. The work discussed here proved particularly challenging as the disc temperatures were only measured at four different radial locations. Four methods of surface temperature interpolation are examined, together with their effect on the local heat fluxes. It is found that the choice of interpolation method depends on the available number of data points. Bessel interpolation gives the best results for four data points, whereas cubic splines are preferred when there are considerably more data points. The results from the analysis of the compressor rig data show that the heat transfer near the disc inner radius appears to be influenced by the central throughflow. However, for larger radii, the heat transfer from the discs and peripheral shroud is found to be consistent with that of a buoyancy-induced flow.
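With only four radial measurements, one simple interpolation choice is an exact low-order polynomial fit, from which the local conductive flux follows by differentiation. This is a stand-in illustration with invented values; the Bessel and cubic-spline interpolation the paper compares are not reproduced here.

```python
import numpy as np

# Four measured disc temperatures (K) at radial positions (m) -- invented.
r_meas = np.array([0.05, 0.10, 0.15, 0.20])
T_meas = np.array([350.0, 372.0, 401.0, 445.0])

# A degree-3 polynomial passes exactly through 4 points, giving a smooth
# radial profile with a continuous derivative -- and hence a continuous
# local heat flux, which is what the inverse analysis needs.
coeffs = np.polyfit(r_meas, T_meas, deg=3)
T_interp = np.polyval(coeffs, r_meas)

# Local conductive flux q = -k dT/dr from the fitted profile; the thermal
# conductivity here is an illustrative constant (the paper lets k vary
# with temperature).
k_thermal = 20.0                                  # W/(m K), illustrative
dT_dr = np.polyval(np.polyder(coeffs), 0.12)      # gradient at r = 0.12 m
q_local = -k_thermal * dT_dr
```

The sensitivity of `dT_dr` to the interpolation scheme is precisely why the choice of method matters when so few radial data points are available.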
Abstract:
This paper proposes a method for analysing the operational complexity in supply chains by using an entropic measure based on information theory. The proposed approach estimates the operational complexity at each stage of the supply chain and analyses the changes between stages. In this paper a stage is identified by the exchange of data and/or material. Through analysis the method identifies the stages where the operational complexity is both generated and propagated (exported, imported, generated or absorbed). Central to the method is the identification of a reference point within the supply chain. This is where the operational complexity is at a local minimum along the data transfer stages. Such a point can be thought of as a 'sink' for turbulence generated in the supply chain. Where it exists, it has the merit of stabilising the supply chain by attenuating uncertainty. However, the location of the reference point is also a matter of choice. If the preferred location is other than the current one, this is a trigger for management action. The analysis can help decide appropriate remedial action. More generally, the approach can assist logistics management by highlighting problem areas. An industrial application is presented to demonstrate the applicability of the method. © 2013 Operational Research Society Ltd. All rights reserved.
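The per-stage entropic measure and the identification of the reference point can be sketched as follows. The state categories and probabilities are invented for illustration.

```python
import numpy as np

def stage_entropy(state_probs):
    """Shannon entropy (bits) of the observed states at one supply-chain
    stage -- the entropic operational-complexity measure."""
    p = np.asarray(state_probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Illustrative stages: probability of each delivery state
# (on time / late / very late) observed at three data-transfer stages.
stages = {
    "supplier":    [0.90, 0.08, 0.02],
    "warehouse":   [0.70, 0.20, 0.10],
    "distributor": [0.95, 0.04, 0.01],
}
H = {name: stage_entropy(p) for name, p in stages.items()}

# The reference point is the stage where operational complexity is at a
# local minimum along the data-transfer stages ('sink' for turbulence).
reference_point = min(H, key=H.get)

# Complexity exported or absorbed between consecutive stages is simply
# the difference of their entropies (positive: generated; negative: absorbed).
delta = H["warehouse"] - H["supplier"]
```

A positive difference between consecutive stages flags where complexity is generated, which is the trigger for management action the abstract describes.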