114 results for Redistricting problems


Relevance: 20.00%

Publisher:

Abstract:

Underground space is commonly exploited both to maximise the utility of costly land in urban development and to reduce the vertical load acting on the ground. Deep excavations are carried out to construct various types of underground infrastructure such as deep basements, subways and service tunnels. Although the soil response to excavation is known in principle, designers lack practical calculation methods for predicting both short- and long-term ground movements. As the understanding of how soil behaves around an excavation in both the short and long term is insufficient and usually empirical, the judgements used in design are also empirical and serious accidents are common. To gain a better understanding of the mechanisms involved in soil excavation, a new apparatus for the centrifuge model testing of deep excavations in soft clay has been developed. This apparatus simulates the field construction sequence of a multi-propped retaining wall during centrifuge flight. A comparison is given between the new technique and the previously used method of draining heavy fluid to simulate excavation in a centrifuge model. The new system has the benefit of giving the correct initial ground conditions before excavation and the proper earth pressure distribution on the retaining structures during excavation, whereas heavy fluid only gives an earth pressure coefficient of unity and is unable to capture any changes in the earth pressure coefficient of soil inside the zone of excavation, for example owing to wall movements. Settlements of the ground surface, changes in pore water pressure, variations in earth pressure, prop forces and bending moments in the retaining wall are all monitored during excavation. Furthermore, digital images taken of a cross-section during the test are analysed using particle image velocimetry to illustrate ground deformation and soil–structure interaction mechanisms. The significance of these observations is discussed.
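The limitation of the heavy-fluid technique noted above can be made explicit from the definition of the lateral earth pressure coefficient. In standard geotechnical notation (a sketch for clarity, not drawn from the paper itself): a fluid exerts equal pressure in all directions, so

```latex
K = \frac{\sigma_h}{\sigma_v}, \qquad
\text{heavy fluid: } \sigma_h = \sigma_v = \gamma_f z
\;\Rightarrow\; K = 1 \text{ at every depth,}
```

whereas in soil $K$ evolves with wall movement between the active and passive limits $K_a$ and $K_p$, which is exactly the variation the heavy-fluid simulation cannot capture.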

Relevance: 20.00%

Publisher:

Abstract:

Variational methods are a key component of the approximate inference and learning toolbox. These methods fill an important middle ground: unlike maximum a posteriori (MAP) methods, they retain distributional information about uncertainty in latent variables, yet they generally require less computational time than Markov chain Monte Carlo (MCMC) methods. In particular, the variational expectation maximisation (vEM) and variational Bayes algorithms, both involving variational optimisation of a free energy, are widely used in time-series modelling. Here, we investigate the success of vEM in simple probabilistic time-series models. First we consider the inference step of vEM, and show that a consequence of the well-known compactness property of variational inference is a failure to propagate uncertainty in time, thus limiting the usefulness of the retained distributional information. In particular, the uncertainty may appear to be smallest precisely when the approximation is poorest. Second, we consider parameter learning and analytically reveal systematic biases in the parameters found by vEM. Surprisingly, simpler variational approximations (such as mean-field) can lead to less bias than more complicated structured approximations.
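The compactness property described above can be illustrated with a standard textbook result (this toy example is mine, not from the paper): for a correlated Gaussian posterior, the optimal mean-field factors have precision equal to the diagonal of the posterior precision matrix, so the approximate variances can never exceed, and typically undershoot, the true marginal variances.

```python
import numpy as np

# Two correlated latent variables (think: adjacent time steps).
rho = 0.9                                    # correlation strength
Sigma = np.array([[1.0, rho], [rho, 1.0]])   # true posterior covariance
Lambda = np.linalg.inv(Sigma)                # posterior precision

# Standard result: the optimal mean-field Gaussian q(z) = q(z1) q(z2)
# has variance 1 / Lambda_ii in each factor.
mf_var = 1.0 / np.diag(Lambda)               # collapses to 1 - rho**2 = 0.19
true_var = np.diag(Sigma)                    # true marginals have variance 1

print(true_var, mf_var)
```

Note that as `rho` grows the mean-field variance shrinks further, mirroring the paper's observation that the reported uncertainty can be smallest exactly where the approximation is worst.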

Relevance: 20.00%

Publisher:

Abstract:

Coupled hydrology and water quality models are an important tool in the understanding and management of surface water and watershed areas. Such problems are generally subject to substantial uncertainty in parameters, process understanding, and data. Component models, drawing on different data, concepts, and structures, are affected differently by each of these uncertain elements. This paper proposes a framework wherein the response of component models to their respective uncertain elements can be quantified and assessed, using a hydrological model and a water quality model as two exemplars. The resulting assessments can be used to identify model coupling strategies that permit more appropriate use and calibration of individual models, and a better overall coupled model response. One key finding was that an approximate balance of water quality and hydrological model responses can be obtained using both the QUAL2E and Mike11 water quality models. The balance point, however, does not support a particularly narrow response surface (or stringent calibration criteria) with respect to the water quality calibration data, at least in the case examined here. Additionally, it is clear from the results presented that the structural source of uncertainty is at least as significant as parameter-based uncertainties in areal models. © 2012 John Wiley & Sons, Ltd.
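One simple way to realise the kind of assessment described above is Monte Carlo sampling of each component model's uncertain parameters and inspection of the resulting error distribution: a flat distribution means a broad response surface and weak constraint from the calibration data. The models, parameters, and observations below are invented toy stand-ins, not the paper's QUAL2E or Mike11 setups.

```python
import numpy as np

rng = np.random.default_rng(0)

obs_q = 2.0   # "observed" flow (hydrology component)
obs_c = 5.0   # "observed" concentration (water quality component)

def hydrology(k):        # toy hydrological model, parameter k
    return 2.0 * k

def water_quality(d):    # toy water quality model, parameter d
    return 5.0 + 3.0 * (d - 1.0)

# Sample each component's uncertain parameter over its plausible range
# and score the misfit against the calibration observation.
k = rng.uniform(0.5, 1.5, 5000)
d = rng.uniform(0.5, 1.5, 5000)
err_q = np.abs(hydrology(k) - obs_q)
err_c = np.abs(water_quality(d) - obs_c)

# Comparing the two misfit distributions shows which component's
# calibration data constrains its parameters more sharply.
print(np.percentile(err_q, 50), np.percentile(err_c, 50))
```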

Relevance: 20.00%

Publisher:

Abstract:

This paper describes a new formulation of the material point method (MPM) for solving coupled hydromechanical problems of fluid-saturated soil subjected to large deformation. A soil-pore fluid coupled MPM algorithm based on Biot's mixture theory is proposed for solving hydromechanical interaction problems that include changes in water table location over time. The accuracy of the proposed method is examined by comparing the results of the simulation of a one-dimensional consolidation test with the corresponding analytical solution. A sensitivity analysis of the MPM parameters used in the proposed method is carried out to examine the effect of the number of particles per mesh and of mesh size on solution accuracy. To demonstrate the capability of the proposed method, a physical model experiment of a large-scale levee failure caused by seepage is simulated. The behavior of the levee model under time-dependent changes in water table matches the experimental observations well. The mechanisms of seepage-induced failure are discussed by examining the pore-water pressures, as well as the effective stresses, computed from the simulations. © 2013 American Society of Civil Engineers.
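The one-dimensional consolidation benchmark mentioned above is conventionally checked against Terzaghi's analytical solution. As a sketch (assuming the classic single-drainage-path formulation with time factor Tv = cv·t/H²; the paper's exact benchmark configuration is not specified here), the average degree of consolidation is given by the well-known series:

```python
import numpy as np

def degree_of_consolidation(Tv, n_terms=100):
    """Terzaghi series: U(Tv) = 1 - sum 2/M^2 * exp(-M^2 Tv),
    with M = pi*(2m+1)/2, m = 0, 1, 2, ..."""
    m = np.arange(n_terms)
    M = np.pi * (2 * m + 1) / 2
    return 1.0 - np.sum((2.0 / M**2) * np.exp(-(M**2) * Tv))

# Textbook checkpoints: U = 0.50 at Tv = 0.197 and U = 0.90 at Tv = 0.848.
for Tv in (0.05, 0.197, 0.848):
    print(Tv, degree_of_consolidation(Tv))
```

Comparing an MPM run's excess pore pressure decay against this closed-form curve is the standard way such a formulation is validated.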

Relevance: 20.00%

Publisher:

Abstract:

Data in an organisation often contains business secrets that organisations do not want to release. However, there are occasions when it is necessary for an organisation to release its data, such as when outsourcing work or using the cloud for Data Quality (DQ) related tasks like data cleansing. Currently, there is no mechanism that allows organisations to release their data for DQ tasks while ensuring that it is suitably protected from releasing business-related secrets. The aim of this paper is therefore to present our current progress on determining which methods are able to modify secret data while retaining DQ problems. So far we have identified ways in which data swapping and SHA-2 hash-based alteration methods can be used to preserve the missing-data, incorrectly-formatted-value, and domain-violation DQ problems while minimising the risk of disclosing secrets. © 2012 by the AIS/ICIS Administrative Office. All rights reserved.
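The hash-based idea above can be sketched as follows (an illustrative example of the general technique, not the paper's implementation): replace each secret value with its SHA-256 digest, but pass missing values through unchanged, so the real values are hidden while the missing-data DQ problem remains visible in the released data set.

```python
import hashlib

def mask(value):
    """Hide a secret value behind a SHA-256 digest, but preserve
    missing values so the missing-data DQ problem survives."""
    if value is None or value == "":
        return value  # leave the gap exactly as it was
    return hashlib.sha256(str(value).encode("utf-8")).hexdigest()

records = ["ACME Ltd", "", None, "Widget Co"]
released = [mask(v) for v in records]
print(released)  # two opaque 64-character digests, with the blanks still blank
```

Note that a plain digest destroys the original formatting, so preserving incorrectly formatted values or domain violations would instead call for a format-preserving transformation such as the data swapping the paper also considers.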

Relevance: 20.00%

Publisher:

Abstract:

© Springer International Publishing Switzerland 2015. Making sound asset management decisions, such as whether to replace or maintain an ageing underground water pipe, is critical to ensure that organisations maximise the performance of their assets. These decisions are only as good as the data that supports them, and hence many asset management organisations are in desperate need of improving the quality of their data. This chapter reviews the key academic research on data quality (DQ) and Information Quality (IQ) (used interchangeably in this chapter) in asset management, combines this with the current DQ problems faced by asset management organisations in various business sectors, and presents a classification of the most important DQ problems that need to be tackled by asset management organisations. In this research, eleven semi-structured interviews were carried out with asset management professionals in a range of business sectors in the UK. The problems described in the academic literature were cross-checked against the problems found in industry. In order to support asset management professionals in solving these problems, we categorised them into seven different DQ dimensions used in the academic literature, so that it is clear how these problems fit within the standard frameworks for assessing and improving data quality. Asset management professionals can therefore now use these frameworks to underpin their DQ improvement initiatives while focussing on the most critical DQ problems.