100 results for semi-deciduous forest
Abstract:
The ground movements induced by the construction of supported excavation systems are generally predicted by empirical/semi-empirical methods in the design stage. However, these methods cannot account for the site-specific conditions and for information that becomes available as an excavation proceeds. A Bayesian updating methodology is proposed to update the predictions of ground movements in the later stages of excavation based on recorded deformation measurements. As an application, the proposed framework is used to predict the three-dimensional deformation shapes at four incremental excavation stages of an actual supported excavation project. © 2011 Taylor & Francis Group, London.
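As a rough illustration of the kind of Bayesian updating described above, the sketch below performs a conjugate normal-normal update of a predicted maximum wall deflection given deformation measurements recorded at an earlier excavation stage. It is not the paper's formulation; the numbers, variances and variable names are hypothetical placeholders.

```python
# Minimal sketch (not the paper's formulation): conjugate normal-normal update
# of a predicted maximum wall deflection given recorded measurements.
import numpy as np

# Prior from an empirical/semi-empirical design prediction (mm)
prior_mean, prior_var = 25.0, 8.0 ** 2

# Deflections recorded at the current stage (mm), with measurement variance
measurements = np.array([31.2, 29.8, 30.5])
meas_var = 3.0 ** 2

# Conjugate update: posterior precision is the sum of the precisions
n = len(measurements)
post_var = 1.0 / (1.0 / prior_var + n / meas_var)
post_mean = post_var * (prior_mean / prior_var + measurements.sum() / meas_var)

print(f"updated prediction: {post_mean:.1f} mm (sd {np.sqrt(post_var):.1f} mm)")
```

The point of the update is visible in the numbers: as more stage measurements accumulate, the posterior is pulled away from the design-stage prior and its variance shrinks.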
Abstract:
The design and construction of deep excavations in urban environment is often governed by serviceability limit state related to the risk of damage to adjacent buildings. In current practice, the assessment of excavation-induced building damage has focused on a deterministic approach. This paper presents a component/system reliability analysis framework to assess the probability that specified threshold design criteria for multiple serviceability limit states are exceeded. A recently developed Bayesian probabilistic framework is used to update the predictions of ground movements in the later stages of excavation based on the recorded deformation measurements. An example is presented to show how the serviceability performance for excavation problems can be assessed based on the component/system reliability analysis. © 2011 ASCE.
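For intuition only, the following Monte Carlo sketch estimates component and series-system probabilities of exceeding two hypothetical serviceability thresholds (maximum settlement and angular distortion). The distributions, thresholds and sample size are assumptions for illustration, not the paper's reliability framework.

```python
# Illustrative Monte Carlo sketch: probability that either of two serviceability
# limit states is exceeded, treating the two components as a series system.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uncertain building responses (hypothetical distributions)
settlement = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n)   # mm
distortion = rng.lognormal(mean=np.log(1 / 800), sigma=0.4, size=n)  # rad

limit_settlement = 30.0      # mm threshold
limit_distortion = 1 / 500   # rad threshold

fail_1 = settlement > limit_settlement
fail_2 = distortion > limit_distortion

print("P(component 1 exceeds):", fail_1.mean())
print("P(component 2 exceeds):", fail_2.mean())
print("P(system exceeds, series):", (fail_1 | fail_2).mean())
```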
Abstract:
The ground movements induced by the construction of supported excavation systems are generally predicted in the design stage by empirical/semi-empirical methods. However, these methods cannot account for the site-specific conditions and for information that becomes available as an excavation proceeds. A Bayesian updating methodology is proposed to update the predictions of ground movements in the later stages of excavation based on recorded deformation measurements. As an application, the proposed framework is used to predict the three-dimensional deformation shapes at four incremental excavation stages of an actual supported excavation project. Copyright © ASCE 2011.
Abstract:
The generalization of the geometric mean of positive scalars to positive definite matrices has attracted considerable attention since the seminal work of Ando. The paper generalizes this framework of matrix means by proposing the definition of a rank-preserving mean for two or an arbitrary number of positive semi-definite matrices of fixed rank. The proposed mean is shown to be geometric in that it satisfies all the expected properties of a rank-preserving geometric mean. The work is motivated by operations on low-rank approximations of positive definite matrices in high-dimensional spaces. © 2012 Elsevier Inc. All rights reserved.
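As background for the generalization discussed above, the sketch below computes the classical geometric mean A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2} of two full-rank symmetric positive definite matrices. The paper's rank-preserving mean for fixed-rank positive semi-definite matrices generalizes this construction; the sketch does not implement that extension.

```python
# Classical geometric mean of two SPD matrices (full-rank case only).
import numpy as np

def spd_sqrt(M):
    """Matrix square root of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(w)) @ V.T

def geometric_mean(A, B):
    """A # B = A^(1/2) (A^(-1/2) B A^(-1/2))^(1/2) A^(1/2)."""
    A_half = spd_sqrt(A)
    A_half_inv = np.linalg.inv(A_half)
    G = A_half @ spd_sqrt(A_half_inv @ B @ A_half_inv) @ A_half
    return (G + G.T) / 2  # symmetrize against round-off

rng = np.random.default_rng(1)
X, Y = rng.standard_normal((2, 4, 4))
A = X @ X.T + np.eye(4)
B = Y @ Y.T + np.eye(4)
G = geometric_mean(A, B)

# Characteristic property of the geometric mean: det(A # B)^2 = det(A) * det(B)
print(np.isclose(np.linalg.det(G) ** 2, np.linalg.det(A) * np.linalg.det(B)))
```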
Abstract:
Nanotube forests behave as a highly absorbing material when they are randomly placed at sub-wavelength scales. Furthermore, it is possible to create diffractive structures when these bulks are patterned on a substrate. Here, we introduce an alternative way to fabricate intensity holograms by patterning fringes of nanotube forest on a substrate. The result is an efficient intensity hologram that is not restricted to sub-wavelength patterning. Both theoretical and experimental analyses were performed, with good agreement. The produced holograms show uniform behaviour throughout the visible spectrum. © 2013 AIP Publishing LLC.
Abstract:
Semi-supervised clustering is the task of clustering data points into clusters where only a fraction of the points are labelled. The true number of clusters in the data is often unknown and most models require this parameter as an input. Dirichlet process mixture models are appealing as they can infer the number of clusters from the data. However, these models do not deal well with high-dimensional data and can encounter difficulties in inference. We present a novel nonparametric Bayesian kernel-based method to cluster data points without the need to prespecify the number of clusters or to model complicated densities from which data points are assumed to be generated. The key insight is to use determinants of submatrices of a kernel matrix as a measure of how close together a set of points is. We explore some theoretical properties of the model and derive a natural Gibbs-based algorithm with MCMC hyperparameter learning. The model is implemented on a variety of synthetic and real-world data sets.
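The determinant-based closeness measure mentioned above can be illustrated in a few lines: with an RBF kernel (an assumed choice, as are the bandwidth and toy data), the kernel submatrix of a tightly grouped set of points is nearly singular, so its determinant is close to zero, whereas well-separated points give a determinant near one. This sketch shows only the key quantity, not the Gibbs sampler itself.

```python
# Determinant of a kernel submatrix as a measure of how close a set of points is.
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * lengthscale ** 2))

def subset_determinant(K, idx):
    """Determinant of the kernel submatrix indexed by a candidate cluster."""
    return np.linalg.det(K[np.ix_(idx, idx)])

rng = np.random.default_rng(0)
tight = rng.normal(0.0, 0.05, size=(5, 2))   # points close together
spread = rng.normal(0.0, 3.0, size=(5, 2))   # points far apart
X = np.vstack([tight, spread])
K = rbf_kernel(X)

print("tight cluster det:", subset_determinant(K, list(range(5))))    # near 0
print("spread points det:", subset_determinant(K, list(range(5, 10))))  # near 1
```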
Abstract:
This work addresses the challenging problem of unconstrained 3D human pose estimation (HPE) from a novel perspective. Existing approaches struggle to operate in realistic applications, mainly due to their scene-dependent priors, such as background segmentation and multi-camera networks, which restrict their use in unconstrained environments. We therefore present a framework which applies action detection and 2D pose estimation techniques to infer 3D poses in an unconstrained video. Action detection offers spatiotemporal priors for 3D human pose estimation by both recognising and localising actions in space-time. Instead of holistic features, e.g. silhouettes, we leverage the flexibility of deformable part models to detect 2D body parts as features for estimating 3D poses. A new unconstrained pose dataset has been collected to demonstrate the feasibility of our method, which achieves promising results, significantly outperforming relevant state-of-the-art methods. © 2013 IEEE.
Abstract:
We present Random Partition Kernels, a new class of kernels derived by demonstrating a natural connection between random partitions of objects and kernels between those objects. We show how the construction can be used to create kernels from methods that would not normally be viewed as random partitions, such as Random Forest. To demonstrate the potential of this method, we propose two new kernels, the Random Forest Kernel and the Fast Cluster Kernel, and show that these kernels consistently outperform standard kernels on problems involving real-world datasets. Finally, we show how the form of these kernels lends itself to a natural approximation that is appropriate for certain big data problems, allowing $O(N)$ inference in methods such as Gaussian Processes, Support Vector Machines and Kernel PCA.
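One way to read the Random Forest Kernel construction is as leaf co-occurrence across the trees of a fitted forest: each tree partitions the data by its leaves, and the kernel value for two points is the fraction of trees that place them in the same leaf. The sketch below follows that reading with a supervised scikit-learn forest on toy data, which is an assumption for illustration rather than the paper's exact procedure.

```python
# Leaf co-occurrence kernel from a random forest: K[i, j] is the fraction of
# trees in which points i and j fall in the same leaf.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier

X, y = make_blobs(n_samples=200, centers=3, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# leaves[i, t] = index of the leaf that tree t assigns to point i
leaves = forest.apply(X)

# Compare leaf indices pairwise and average over trees
K = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=-1)

print(K.shape, K.diagonal().min())  # (200, 200); diagonal entries equal 1.0
```

Because each tree only contributes a same-leaf/different-leaf indicator, the kernel is an average of partition indicators, which is what makes the construction amenable to the kind of approximation mentioned in the abstract.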