41 results for winogradsly columns
Abstract:
The US National Academy of Engineering recently identified restoring and improving urban infrastructure as one of the grand challenges of engineering. Part of this challenge stems from the lack of viable methods to map and label existing infrastructure. For computer vision, this challenge becomes "How can we automate the process of extracting geometric, object-oriented models of infrastructure from visual data?" Object recognition and reconstruction methods have been successfully devised and/or adapted to answer this question for small or linear objects (e.g., columns). However, many infrastructure objects are large and/or planar, without significant and distinctive features: walls, floor slabs, and bridge decks. How can we recognize them and reconstruct them in a 3D model? This paper presents strategies for infrastructure object recognition and reconstruction, setting the stage for the question above and for future research in the recognition and modeling of featureless, large or planar objects.
Abstract:
Distributions over exchangeable matrices with infinitely many columns, such as the Indian buffet process, are useful in constructing nonparametric latent variable models. However, the distribution implied by such models over the number of features exhibited by each data point may be poorly suited for many modeling tasks. In this paper, we propose a class of exchangeable nonparametric priors obtained by restricting the domain of existing models. Such models allow us to specify the distribution over the number of features per data point, and can achieve better performance on data sets where the number of features is not well-modeled by the original distribution.
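The generative "restaurant" scheme behind the unrestricted Indian buffet process can be sketched in a few lines: customer i takes each existing dish k with probability m_k/i and then tries Poisson(α/i) new dishes, so each row marginally exhibits a Poisson(α) number of features — exactly the per-row distribution the restricted priors in this paper aim to replace. A minimal sketch (function name and parameterization are illustrative, not taken from the paper):

```python
import numpy as np

def sample_ibp(n, alpha, seed=0):
    """Draw a binary feature matrix Z (customers x dishes) from the IBP prior."""
    rng = np.random.default_rng(seed)
    counts = []          # m_k: how many customers have taken dish k so far
    rows = []
    for i in range(1, n + 1):
        # take each existing dish k with probability m_k / i
        row = [rng.random() < m / i for m in counts]
        # then sample Poisson(alpha / i) brand-new dishes
        k_new = rng.poisson(alpha / i)
        row += [True] * k_new
        counts = [m + int(t) for m, t in zip(counts, row)] + [1] * k_new
        rows.append(row)
    Z = np.zeros((n, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z
```

Each row sum is marginally Poisson(α), which is the coupling between α and the per-point feature count that domain restriction lets one break.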
Abstract:
The current procedures in post-earthquake safety and structural assessment are performed manually by a skilled triage team of structural engineers/certified inspectors. These procedures, and particularly the physical measurement of the damage properties, are time-consuming and qualitative in nature. This paper proposes a novel method that automatically detects spalled regions on the surface of reinforced concrete columns and measures their properties in image data. Spalling has been accepted as an important indicator of significant damage to structural elements during an earthquake. According to this method, the region of spalling is first isolated by way of a local entropy-based thresholding algorithm. Following this, the exposure of longitudinal reinforcement (depth of spalling into the column) and the length of spalling along the column are measured using a novel global adaptive thresholding algorithm in conjunction with template matching and morphological image-processing operations. The method was tested on a database of damaged RC column images collected after the 2010 Haiti earthquake, and comparison of the results with manual measurements indicates the validity of the method.
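The abstract does not specify the entropy-based thresholding in detail; as a hedged illustration of the general idea, the classic Kapur maximum-entropy method selects the gray level that maximizes the summed entropies of the two resulting classes. A minimal sketch (function name, bin count, and range are assumptions):

```python
import numpy as np

def kapur_entropy_threshold(gray):
    """Pick a gray-level threshold by maximizing the sum of the
    entropies of the below- and above-threshold classes (Kapur's method)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    P = np.cumsum(p)                  # cumulative probability up to bin t
    eps = 1e-12
    best_t, best_H = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = P[t], 1.0 - P[t]
        if p0 < eps or p1 < eps:
            continue
        w0 = p[:t + 1] / p0           # class-conditional distributions
        w1 = p[t + 1:] / p1
        H = -np.sum(w0 * np.log(w0 + eps)) - np.sum(w1 * np.log(w1 + eps))
        if H > best_H:
            best_H, best_t = H, t
    return best_t
```

On a bimodal image (dark intact concrete vs. bright exposed region, or vice versa) the maximizing threshold lands between the two modes, isolating the candidate region for the later measurement steps.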
Abstract:
Reinforced concrete buildings in low-to-moderate seismic zones are often designed only for gravity loads, in accordance with non-seismic detailing provisions. Deficient detailing of columns and beam-column joints can lead to unpredictable brittle failures even under moderate earthquakes. Therefore, a reliable estimate of structural response is required for the seismic evaluation of these structures. For this purpose, analytical models for both interior and exterior slab-beam-column subassemblages and for a 1/3-scale model frame were implemented in the nonlinear finite element platform OpenSees. Comparison between the analytical results and experimental data available in the literature is carried out using nonlinear pushover analyses and nonlinear time history analysis for the subassemblages and the model frame, respectively. Furthermore, the seismic fragility assessment of reinforced concrete buildings is performed on a set of non-ductile frames using nonlinear time history analyses. The fragility curves, developed for various damage states defined in terms of the maximum interstory drift ratio, are characterized in terms of peak ground acceleration and spectral acceleration using a suite of ground motions representative of the seismic hazard in the region.
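Fragility curves of the kind described are conventionally parameterized as lognormal CDFs in the chosen intensity measure (PGA or spectral acceleration): P(DS ≥ ds | IM = x) = Φ(ln(x/θ)/β), with median capacity θ and lognormal dispersion β. A minimal sketch of that standard parameterization (the numeric values in the usage note are illustrative, not results from the paper):

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility: probability that a damage state is reached
    or exceeded given intensity measure `im` (e.g., PGA in g).
    theta: median capacity (same units as im); beta: lognormal dispersion."""
    z = math.log(im / theta) / beta
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

By construction the curve passes through 0.5 at im = theta, e.g. `fragility(0.3, 0.3, 0.6)` returns 0.5, and it increases monotonically in the intensity measure.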
Abstract:
A fundamental problem in the analysis of structured relational data like graphs, networks, databases, and matrices is to extract a summary of the common structure underlying relations between individual entities. Relational data are typically encoded in the form of arrays; invariance to the ordering of rows and columns corresponds to exchangeable arrays. Results in probability theory due to Aldous, Hoover and Kallenberg show that exchangeable arrays can be represented in terms of a random measurable function which constitutes the natural model parameter in a Bayesian model. We obtain a flexible yet simple Bayesian nonparametric model by placing a Gaussian process prior on the parameter function. Efficient inference utilises elliptical slice sampling combined with a random sparse approximation to the Gaussian process. We demonstrate applications of the model to network data and clarify its relation to models in the literature, several of which emerge as special cases.
Abstract:
The finite element method (FEM) is growing in popularity over the pressure diagram/hand calculation method for the analysis of excavation systems in general and deep soil mixing excavations in particular. In this paper, a finite element analysis is used to study the behavior of a deep mixed excavation. Through the use of Plaxis (an FEM software program), the construction sequence is simulated by following the various construction phases, allowing deflections due to strut or anchor installation to be predicted. The numerical model used in this study represents the soil cement columns as a continuous wall matching the bending stiffness of the actual wall. Input parameters based on laboratory tests and modeling assumptions are discussed. An example of the approach is illustrated using the Islais Creek Transport/Storage Project in San Francisco, California. Copyright ASCE 2006.
Abstract:
We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization X = YYᵀ, where the number of columns of Y fixes an upper bound on the rank of the positive semidefinite matrix X. It is thus very effective for solving problems that have a low-rank solution. The factorization X = YYᵀ leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method with guaranteed quadratic convergence. It furthermore provides some conditions on the rank of the factorization to ensure equivalence with the original problem. In contrast to existing methods, the proposed algorithm converges monotonically to the sought solution. Its numerical efficiency is evaluated on two applications: the maximal cut of a graph and the problem of sparse principal component analysis. © 2010 Society for Industrial and Applied Mathematics.
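The core effect of the X = YYᵀ factorization — the number of columns of Y capping the rank of X, and unconstrained optimization over Y replacing an explicit semidefinite constraint — can be illustrated with plain gradient descent on a toy least-squares objective. This is a hedged sketch of the factorization idea only, not the quotient-manifold second-order method of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2

# Build a rank-2 PSD target A with known eigenvalues 3 and 1.
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
A = Q @ np.diag([3.0, 1.0]) @ Q.T

# Optimize over Y: p columns cap rank(YY^T) at p, and YY^T is PSD for free,
# so no semidefinite constraint is ever imposed explicitly.
Y = 0.1 * rng.standard_normal((n, p))
lr = 0.02
for _ in range(5000):
    R = Y @ Y.T - A
    Y -= lr * (4.0 * R @ Y)   # gradient of ||YY^T - A||_F^2 w.r.t. Y

X = Y @ Y.T                   # PSD, rank <= p, and here recovers A
```

Since the target itself has rank p, the global minimum of the factored problem matches the original one; the paper's rank conditions address exactly when such equivalence holds in general.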
Abstract:
In this paper we develop a new approach to sparse principal component analysis (sparse PCA). We propose two single-unit and two block optimization formulations of the sparse PCA problem, aimed at extracting a single sparse dominant principal component of a data matrix, or more components at once, respectively. While the initial formulations involve nonconvex functions, and are therefore computationally intractable, we rewrite them into the form of an optimization program involving maximization of a convex function on a compact set. The dimension of the search space is decreased enormously if the data matrix has many more columns (variables) than rows. We then propose and analyze a simple gradient method suited for the task. Our algorithm has the best convergence properties when either the objective function or the feasible set is strongly convex, which is the case with our single-unit formulations and can be enforced in the block case. Finally, we demonstrate numerically on a set of random and gene expression test problems that our approach outperforms existing algorithms both in quality of the obtained solution and in computational speed. © 2010 Michel Journée, Yurii Nesterov, Peter Richtárik and Rodolphe Sepulchre.
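As a rough illustration of the single-unit setting — extracting one sparse dominant component of a data matrix — a power iteration with soft-thresholding conveys the flavor. This is a hedged sketch, not the authors' algorithm; the relative thresholding rule and all names below are assumptions for illustration:

```python
import numpy as np

def sparse_pc(A, gamma=0.1, iters=200, seed=0):
    """Toy single-unit sparse PCA: power iteration on A^T A with a
    soft-threshold (at gamma times the largest entry) after each step.
    Illustrative only; not the GPower method of the paper."""
    rng = np.random.default_rng(seed)
    S = A.T @ A                       # columns of A are the variables
    x = rng.standard_normal(S.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = S @ x
        # soft-threshold: small loadings are driven exactly to zero
        y = np.sign(y) * np.maximum(np.abs(y) - gamma * np.abs(y).max(), 0.0)
        x = y / np.linalg.norm(y)
    return x
```

When one variable carries most of the variance, the returned unit vector concentrates its loading there and zeros out the rest, which is the qualitative behavior sparse PCA formulations are designed to produce.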
Abstract:
Avalanches, debris flows, and landslides are geophysical hazards which involve rapid mass movement of granular solids, water, and air as a single-phase system. The dynamics of a granular flow involve at least three distinct scales: the micro-scale, the meso-scale, and the macro-scale. This study aims to understand the ability of continuum models to capture the micro-mechanics of dry granular collapse. The Material Point Method (MPM), a hybrid Lagrangian and Eulerian approach, with a Mohr-Coulomb failure criterion is used to describe the continuum behaviour of granular column collapse, while the micromechanics is captured using the Discrete Element Method (DEM) with a tangential contact force model. The run-out profile predicted by the continuum simulations matches the DEM simulations for columns with small aspect ratios (h/r < 2); however, MPM predicts larger run-out distances for columns with higher aspect ratios (h/r > 2). Energy evolution studies in DEM simulations reveal higher collisional dissipation in the initial free-fall regime for tall columns. The lack of a collisional energy dissipation mechanism in MPM simulations results in larger run-out distances. Micro-structural effects, such as shear band formation, were observed in both DEM and MPM simulations. A sliding flow regime is observed above the distinct passive zone at the core of the column. Velocity profiles obtained at both scales are compared to understand the reason for the slower run-out mobilization in MPM simulations. © 2013 AIP Publishing LLC.
Abstract:
The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of the duality gap numerically tractable. The search space is nonlinear but is equipped with a Riemannian structure that leads to efficient computations. We present a second-order trust-region algorithm with a guaranteed quadratic rate of convergence. Overall, the proposed optimization scheme converges superlinearly to the global solution while maintaining complexity that is linear in the number of rows and columns of the matrix. To compute a set of solutions efficiently for a grid of regularization parameters, we propose a predictor-corrector approach that outperforms the naive warm-restart approach on the fixed-rank quotient manifold. The performance of the proposed algorithm is illustrated on problems of low-rank matrix completion and multivariate linear regression. © 2013 Society for Industrial and Applied Mathematics.
Abstract:
Large concrete structures need to be inspected in order to assess their current physical and functional state, to predict future conditions, to support investment planning and decision making, and to allocate limited maintenance and rehabilitation resources. Current procedures in the condition and safety assessment of large concrete structures are performed manually, leading to subjective and unreliable results, costly and time-consuming data collection, and safety issues. To address these limitations, automated machine vision-based inspection procedures have increasingly been proposed by the research community. This paper presents current achievements and open challenges in vision-based inspection of large concrete structures. First, the general concept of Building Information Modeling is introduced. Then, vision-based 3D reconstruction and as-built spatial modeling of concrete civil infrastructure are presented. The focus then turns to structural member recognition as well as concrete damage detection and assessment, exemplified for concrete columns. Although some challenges are still under investigation, it can be concluded that vision-based inspection methods have significantly improved over the last 10 years; as-built spatial modeling as well as damage detection and assessment of large concrete structures now have the potential to be fully automated.