29 results for Compositional data analysis-roots in geosciences
Abstract:
Data in an organisation often contains business secrets that the organisation does not want to release. However, there are occasions when an organisation must release its data, such as when outsourcing work or using the cloud for Data Quality (DQ) tasks like data cleansing. Currently, there is no mechanism that allows organisations to release their data for DQ tasks while ensuring that business-related secrets remain suitably protected. The aim of this paper is therefore to present our current progress on determining which methods can modify secret data while retaining DQ problems. So far we have identified how the data swapping and SHA-2 hash function alteration methods can be used to preserve the missing-data, incorrectly-formatted-value, and domain-violation DQ problems while minimising the risk of disclosing secrets. © (2012) by the AIS/ICIS Administrative Office. All rights reserved.
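As a rough illustration of the kind of masking the abstract describes, the sketch below hashes secret values with SHA-256 while passing missing values through unchanged, so the missing-data DQ problem survives release. This is a hypothetical sketch, not the authors' implementation; the `mask_value` function and the sample records are invented for illustration.

```python
import hashlib

def mask_value(value):
    """Replace a secret value with its SHA-256 digest.

    Missing values (None or the empty string) are passed through
    unchanged, so the 'missing data' DQ problem is preserved in the
    released data while the secret content itself is masked.
    """
    if value is None or value == "":
        return value  # preserve the missing-data DQ problem
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Invented sample column containing secrets and missing values
records = ["Acme Ltd", "", None, "Globex"]
masked = [mask_value(v) for v in records]
```

Note that hashing is deterministic, so duplicate-detection DQ checks also still work on the masked column, while the original values cannot be read back directly.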
Abstract:
© Springer International Publishing Switzerland 2015. Making sound asset management decisions, such as whether to replace or maintain an ageing underground water pipe, is critical to ensuring that organisations maximise the performance of their assets. These decisions are only as good as the data that supports them, and hence many asset management organisations urgently need to improve the quality of their data. This chapter reviews the key academic research on Data Quality (DQ) and Information Quality (IQ) (used interchangeably in this chapter) in asset management, combines this with the current DQ problems faced by asset management organisations in various business sectors, and presents a classification of the most important DQ problems that asset management organisations need to tackle. In this research, eleven semi-structured interviews were carried out with asset management professionals across a range of business sectors in the UK. The problems described in the academic literature were cross-checked against the problems found in industry. To support asset management professionals in solving these problems, we categorised them into seven DQ dimensions used in the academic literature, so that it is clear how these problems fit within the standard frameworks for assessing and improving data quality. Asset management professionals can therefore now use these frameworks to underpin their DQ improvement initiatives while focussing on the most critical DQ problems.
Abstract:
Interest in hydrogel materials is growing rapidly, owing to their potential use in tissue engineering and drug delivery applications and as coatings on medical devices. However, a key limitation of hydrogel materials in many applications is their relatively poor mechanical properties compared with those of (less biocompatible) solid polymers. In this review, the basic chemistry, microstructure and processing routes for common natural and synthetic hydrogel materials are explored first, and the underlying structure-property relationships for hydrogels are considered. A series of mechanical testing modalities suitable for hydrogel characterisation is then considered, including emerging modalities such as nanoindentation and atomic force microscopy (AFM) indentation. As the data analysis depends in part on the material's constitutive behaviour, a series of increasingly complex constitutive models is examined, including elastic and viscoelastic models as well as theories that explicitly treat the multiphasic poroelastic nature of hydrogel materials. Results from the existing literature on agar and polyacrylamide mechanical properties are compiled and compared, highlighting the challenges and uncertainties inherent in gel mechanical characterisation. © 2014 Institute of Materials, Minerals and Mining and ASM International.
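For a concrete flavour of the simpler constitutive models the review covers, the toy sketch below evaluates the stress-relaxation response of a standard linear solid, a basic viscoelastic model, under a step strain. This is a generic textbook model offered purely for illustration; the parameter values are invented and are not taken from the review.

```python
import math

def sls_stress(t, E_inf=1.0, E_1=2.0, tau=5.0, eps0=0.1):
    """Stress at time t for a standard linear solid under step strain eps0.

    sigma(t) = eps0 * (E_inf + E_1 * exp(-t / tau)):
    the stress relaxes from eps0*(E_inf + E_1) at t = 0 toward the
    long-time equilibrium value eps0*E_inf as the dashpot relaxes.
    """
    return eps0 * (E_inf + E_1 * math.exp(-t / tau))
```

Fitting `E_inf`, `E_1` and `tau` to a measured relaxation curve is one simple way the constitutive models above connect to the indentation and relaxation test data discussed in the review.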
Abstract:
In this paper we discuss key implementation challenges of a systems approach that combines System Dynamics, Scenario Planning and Qualitative Data Analysis methods to tackle a complex problem. We present the methods and the underlying framework. We then detail the main difficulties encountered in designing and planning the Scenario Planning workshop and how they were overcome, such as finding and involving the stakeholders and customising the process to fit within timing constraints. After presenting the results of this application, we argue that consultants or systems analysts need to engage with stakeholders as process facilitators rather than as system experts in order to gain commitment and trust and to improve information sharing. They also need to be ready to adapt their tools and processes, as well as their own thinking, for more effective complex problem solving.
Abstract:
The soil-pipeline interactions under lateral and upward pipe movements in sand are investigated using DEM analysis. The simulations are performed for both medium and dense sand conditions at embedment ratios of up to 60. A comparison of peak dimensionless forces from the DEM and earlier FEM analyses shows that, for medium sand, both methods yield similar peak dimensionless forces. For dense sand, the DEM analysis gives a more gradual transition from shallow to deep failure mechanisms than the FEM analysis, and the peak dimensionless forces at very deep embedment are higher in the DEM analysis than in the FEM analysis. Comparison of the deformation mechanisms suggests that this is due to differences in soil movement around the pipe associated with the soil's particulate nature. The DEM analysis thus provides supplementary data on soil-pipeline interaction in sand at deep embedment conditions.
Abstract:
Cluster analysis of ranking data, which occurs in consumer questionnaires, voting forms or other inquiries of preferences, attempts to identify typical groups of rank choices. Empirically measured rankings are often incomplete, i.e. different numbers of filled rank positions cause heterogeneity in the data. We propose a mixture approach for clustering of heterogeneous rank data. Rankings of different lengths can be described and compared by means of a single probabilistic model. A maximum entropy approach avoids hidden assumptions about missing rank positions. Parameter estimators and an efficient EM algorithm for unsupervised inference are derived for the ranking mixture model. Experiments on both synthetic data and real-world data demonstrate significantly improved parameter estimates on heterogeneous data when the incomplete rankings are included in the inference process.
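To make the mixture-clustering idea concrete, the sketch below runs a toy EM loop for a two-component mixture of Mallows models over complete rankings of three items. This is a generic illustration of mixture-model clustering of rankings, not the authors' maximum-entropy model for incomplete rankings; the fixed dispersion, item count, and all names are invented for the example.

```python
import math
from itertools import permutations

ITEMS = 3   # toy problem size; real ranking data would be larger
THETA = 1.0 # fixed Mallows dispersion (assumed, not estimated here)

def kendall(r1, r2):
    """Kendall tau distance: number of discordant item pairs."""
    d = 0
    for i in range(ITEMS):
        for j in range(i + 1, ITEMS):
            a, b = r1[i], r1[j]  # a precedes b in r1
            if r2.index(a) > r2.index(b):
                d += 1
    return d

def mallows_prob(r, center):
    """P(r | center) under a Mallows model with dispersion THETA."""
    z = sum(math.exp(-THETA * kendall(p, center))
            for p in permutations(range(ITEMS)))
    return math.exp(-THETA * kendall(r, center)) / z

def em(data, centers, n_iter=20):
    """EM for a two-component Mallows mixture over complete rankings."""
    weights = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component per ranking
        resp = []
        for r in data:
            p = [weights[k] * mallows_prob(r, centers[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: update mixing weights and pick each central ranking as
        # the permutation minimising the responsibility-weighted distance
        weights = [sum(rk[k] for rk in resp) / len(data) for k in range(2)]
        centers = [
            min(permutations(range(ITEMS)),
                key=lambda c: sum(resp[i][k] * kendall(data[i], c)
                                  for i in range(len(data))))
            for k in range(2)
        ]
    return weights, centers
```

On data drawn from two well-separated preference groups, the loop recovers the two central rankings and roughly equal mixing weights; the paper's contribution is in extending this kind of model to rankings of different lengths without hidden assumptions about the unfilled positions.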
Abstract:
Reducing energy consumption is a major challenge for "energy-intensive" industries such as papermaking. A commercially viable energy-saving solution is to employ data-based optimization techniques to obtain a set of "optimized" operational settings that satisfy certain performance indices. The difficulties are threefold: 1) problems of this type are inherently multicriteria, in the sense that improving one performance index might compromise other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections, which make the modeling task difficult; and 3) as the models are acquired from existing historical data, they are valid only locally, and extrapolation carries the risk of increased process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. A multiobjective gradient descent algorithm is then used to solve the problem in line with the user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, the validity of the local models must be checked before proceeding. The method is implemented in a MATLAB-based interactive tool, DataExplorer, supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills, where the aim was to reduce steam consumption and increase productivity while maintaining product quality by optimizing vacuum pressures in the forming and press sections. The experimental results demonstrate the effectiveness of the method. © 2006 IEEE.
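The multiobjective gradient step can be illustrated generically. The sketch below shows a hypothetical MGDA-style step for two objectives, not the DataExplorer implementation: it finds the minimum-norm convex combination of the two objective gradients and descends along it, which decreases both objectives simultaneously until a Pareto-stationary point is reached. All function names and the toy objectives are invented.

```python
def mgda_direction(g1, g2):
    """Minimum-norm point in the convex hull of two gradient vectors.

    For two objectives the optimal mixing weight alpha has a closed
    form; at a Pareto-stationary point the result is the zero vector.
    """
    dot11 = sum(a * a for a in g1)
    dot22 = sum(b * b for b in g2)
    dot12 = sum(a * b for a, b in zip(g1, g2))
    denom = dot11 - 2 * dot12 + dot22
    if denom == 0:  # gradients coincide: any convex combination works
        alpha = 0.5
    else:
        alpha = max(0.0, min(1.0, (dot22 - dot12) / denom))
    return [alpha * a + (1 - alpha) * b for a, b in zip(g1, g2)]

def step(x, grads, lr=0.1):
    """One descent step along the common descent direction."""
    d = mgda_direction(*[g(x) for g in grads])
    return [xi - lr * di for xi, di in zip(x, d)]

# Toy conflicting objectives over one variable: f1 = (x-1)^2, f2 = (x+1)^2
f1_grad = lambda x: [2 * (x[0] - 1)]
f2_grad = lambda x: [2 * (x[0] + 1)]
x = [2.0]
for _ in range(50):
    x = step(x, [f1_grad, f2_grad])
```

Starting from x = 2, the iterate descends while both gradients agree and stops once it enters the Pareto set [-1, 1], where the two gradients conflict and the common descent direction vanishes; this mirrors the abstract's point that improving one index beyond that region would compromise the other.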