14 results for FCP and FCC mapping

in Aston University Research Archive


Relevance: 100.00%

Abstract:

In recent years, issues of childhood obesity, unsafe toys, and child labor have raised the question of corporate responsibilities to children. However, business impacts on children are complex, multi-faceted, and frequently overlooked by senior managers. This article reports on a systematic analysis of the reputational landscape constructed by the media, corporations, and non-government organizations around business responsibilities to children. A content analysis methodology is applied to a sample of more than 350 relevant accounts during a 5-year period. We identify seven core responsibilities that are then used to provide a framework for enabling businesses to map their range of impacts on children. We set out guidelines for how to identify and manage the firm’s strategic responsibilities in this arena, and identify the constraints that corporations face in meeting such responsibilities.

Relevance: 100.00%

Abstract:

Substantial behavioural and neuropsychological evidence has been amassed to support the dual-route model of morphological processing, which distinguishes between a rule-based system for regular items (walk–walked, call–called) and an associative system for the irregular items (go–went). Some neural-network models attempt to explain the neuropsychological and brain-mapping dissociations in terms of single-system associative processing. We show that there are problems in the accounts of homogeneous networks in the light of recent brain-mapping evidence of systematic double-dissociation. We also examine the superior capabilities of more internally differentiated connectionist models, which, under certain conditions, display systematic double-dissociations. It appears that the more internal differentiation the models show, the more easily they account for the dissociation patterns, yet without implementing symbolic computations.

Relevance: 100.00%

Abstract:

The design and implementation of databases involve, firstly, the formulation of a conceptual data model by systematic analysis of the structure and information requirements of the organisation for which the system is being designed; secondly, the logical mapping of this conceptual model onto the data structure of the target database management system (DBMS); and thirdly, the physical mapping of this structured model into storage structures of the target DBMS. The accuracy of both the logical and physical mapping determines the performance of the resulting systems. This thesis describes research which develops software tools to facilitate the implementation of databases. A conceptual model describing the information structure of a hospital is derived using the Entity-Relationship (E-R) approach, and this model forms the basis for mapping onto the logical model. Rules are derived for automatically mapping the conceptual model onto relational and CODASYL types of data structures. Further algorithms are developed for partly automating the implementation of these models onto the INGRES, MIMER and VAX-11 DBMS.
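
As a hedged illustration of the kind of mapping rule the thesis automates, the Python sketch below applies two textbook E-R-to-relational rules (each entity becomes a table keyed by its identifier; each many-to-many relationship becomes a junction table of foreign keys). The hospital entity names are hypothetical and the generated SQL is generic rather than INGRES-, MIMER- or VAX-11-specific.

```python
# Minimal sketch of two classic E-R -> relational mapping rules:
# each entity becomes a table keyed by its identifier, and each
# many-to-many relationship becomes a junction table of foreign keys.
# Entity and attribute names are illustrative, not taken from the thesis.

def entity_to_table(name, key, attributes):
    cols = [f"{key} INTEGER PRIMARY KEY"] + [f"{a} TEXT" for a in attributes]
    return f"CREATE TABLE {name} ({', '.join(cols)});"

def m2m_to_table(rel, left, left_key, right, right_key):
    return (
        f"CREATE TABLE {rel} ("
        f"{left_key} INTEGER REFERENCES {left}({left_key}), "
        f"{right_key} INTEGER REFERENCES {right}({right_key}), "
        f"PRIMARY KEY ({left_key}, {right_key}));"
    )

if __name__ == "__main__":
    print(entity_to_table("patient", "patient_id", ["name", "date_of_birth"]))
    print(entity_to_table("ward", "ward_id", ["ward_name"]))
    # 'admitted_to' is a hypothetical many-to-many relationship.
    print(m2m_to_table("admitted_to", "patient", "patient_id", "ward", "ward_id"))
```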

Relevance: 100.00%

Abstract:

The Semantic Web relies on carefully structured, well defined, data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular, we adopt a Bayesian perspective and focus on the case where the inputs and outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics, such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
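
The sketch below is only an indicative illustration of how a Gaussian uncertainty might be serialised as a soft-typed XML element carrying a dictionary-definition URI; the element and attribute names, and the URI, are assumptions for illustration and are not taken from the published UncertML schema.

```python
# Illustrative sketch only: builds an UncertML-style element for a Gaussian
# distribution, with a soft-typed "definition" URI pointing at a dictionary
# entry. Element and attribute names are assumptions for illustration and
# are not guaranteed to match the published UncertML schema.
import xml.etree.ElementTree as ET

def gaussian_uncertainty(mean, variance,
                         definition="http://dictionary.example.org/distributions/gaussian"):
    dist = ET.Element("Distribution", attrib={"definition": definition})
    ET.SubElement(dist, "parameter", attrib={"name": "mean"}).text = str(mean)
    ET.SubElement(dist, "parameter", attrib={"name": "variance"}).text = str(variance)
    return dist

if __name__ == "__main__":
    elem = gaussian_uncertainty(12.3, 0.8)
    print(ET.tostring(elem, encoding="unicode"))
```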

Relevance: 100.00%

Abstract:

Rural electrification projects and programmes in many countries have suffered from design, planning, implementation and operational flaws as a result of ineffective project planning and a lack of systematic project risk analysis. This paper presents a hierarchical risk-management framework for effectively managing large-scale development projects. The proposed framework first identifies, with the involvement of stakeholders, the risk factors for a rural electrification programme at three different levels (national, state and site). Subsequently it develops a qualitative risk prioritising scheme through probability and severity mapping and provides mitigating measures for the most vulnerable risks. The study concludes that the hierarchical risk-management approach provides an effective framework for managing large-scale rural electrification programmes.
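
A minimal sketch of the probability-severity prioritisation idea follows; the risks, levels and 1-5 scales are illustrative assumptions, not the paper's actual data.

```python
# Minimal sketch of qualitative risk prioritisation by probability-severity
# mapping, in the spirit of the framework described above. Risk names,
# levels, scores and the 1-5 scales are illustrative assumptions.

risks = [
    # (risk, level, probability 1-5, severity 1-5)
    ("Policy change delays subsidy", "national", 3, 5),
    ("Grid-extension cost overrun",  "state",    4, 4),
    ("Local technician shortage",    "site",     5, 2),
    ("Equipment theft or vandalism", "site",     2, 3),
]

def priority(prob, sev):
    return prob * sev  # simple risk-matrix score

ranked = sorted(risks, key=lambda r: priority(r[2], r[3]), reverse=True)
for name, level, p, s in ranked:
    print(f"{priority(p, s):2d}  [{level:8s}] {name}")
```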

Relevance: 100.00%

Abstract:

A structured approach to process improvement is described in the context of the human resources division of a UK police force. The approach combines a number of established techniques of process improvement, such as the balanced scorecard and process mapping, with a scoring system developed to prioritise processes for improvement. The methodology described presents one way of ensuring the correct processes are identified and redesigned at an operational level in such a way as to support the organisation's strategic aims. In addition, a performance measurement system is utilised to help ensure that the changes implemented do actually achieve the desired effect over time. The case demonstrates the need to choose, and in some cases develop, in-house tools and techniques appropriate to the context of the process improvement effort.
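
The prioritisation idea can be sketched as a simple weighted sum; the criteria, weights and scores below are assumptions for illustration, not the in-house scheme developed in the case.

```python
# Illustrative sketch of a weighted scoring system for prioritising
# processes for improvement. The criteria, weights and scores are
# assumptions, not the scheme developed for the police force case.

criteria_weights = {
    "strategic_alignment": 0.40,      # links scores to balanced-scorecard aims
    "current_performance_gap": 0.35,
    "ease_of_redesign": 0.25,
}

processes = {
    "Recruitment vetting": {"strategic_alignment": 5, "current_performance_gap": 4, "ease_of_redesign": 2},
    "Shift allocation":    {"strategic_alignment": 3, "current_performance_gap": 5, "ease_of_redesign": 4},
    "Training requests":   {"strategic_alignment": 2, "current_performance_gap": 3, "ease_of_redesign": 5},
}

def weighted_score(scores):
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(processes.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{weighted_score(scores):.2f}  {name}")
```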

Relevance: 100.00%

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.
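
As a rough illustration of the kriging mean and variance outputs such a service would return, the sketch below implements textbook simple kriging with an exponential covariance model; the covariance parameters and observations are arbitrary assumptions, and this is not the INTAMAP interpolation service itself.

```python
# Minimal simple-kriging sketch with an exponential covariance model.
# Illustrates the kind of mean/variance output discussed above; the data
# and parameters are arbitrary assumptions.
import numpy as np

def exp_cov(h, sill=1.0, rng=10.0):
    return sill * np.exp(-h / rng)

def simple_krige(xy, z, xy0, mean=0.0, sill=1.0, rng=10.0):
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = exp_cov(d, sill, rng)                                    # data-data covariance
    c0 = exp_cov(np.linalg.norm(xy - xy0, axis=1), sill, rng)    # data-target covariance
    w = np.linalg.solve(C, c0)                                   # kriging weights
    pred = mean + w @ (z - mean)                                 # kriging mean
    var = sill - w @ c0                                          # kriging variance
    return pred, var

if __name__ == "__main__":
    xy = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [6.0, 6.0]])
    z = np.array([1.2, 0.8, 1.5, 0.9])
    print(simple_krige(xy, z, np.array([2.0, 2.0]), mean=1.0))
```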

Relevance: 100.00%

Abstract:

Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources and Web services to manipulate them are becoming widely available via the Internet. Standards from the OGC enable such geospatial mashups to be seamless and user driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and correlation of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context, because of a lack of control on or knowledge about background populations. For public health and epidemiological mapping, this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control on underlying populations, and develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced but full implementation is left for further research.

[Figure caption: Spatial entropy index HSu for the ScankOO analysis of the hypothetical dataset, using a vicinity fixed by the number of points without distinction between their labels; label size is proportional to the inverse of the index.]
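
A heavily simplified sketch of the exploratory idea follows, scanning k-nearest-neighbour vicinities for the joint local prevalence of two binary labels; it is an illustration under assumed data, not the scan statistic developed in the article.

```python
# Simplified sketch of an exploratory neighbourhood scan for association
# between two binary outcomes (e.g. MRSA-positive and a risk factor),
# using k nearest neighbours as the scanning vicinity. Illustration only;
# the data are simulated and this is not the article's scan statistic.
import numpy as np

def local_association(xy, a, b, k=10):
    """For each point, joint prevalence of a and b among its k neighbours."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    scores = np.empty(len(xy))
    for i in range(len(xy)):
        nbrs = np.argsort(d[i])[1:k + 1]        # skip the point itself
        scores[i] = np.mean(a[nbrs] & b[nbrs])  # joint local prevalence
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(200, 2))
    a = rng.integers(0, 2, 200)                 # outcome label
    b = rng.integers(0, 2, 200)                 # risk-factor label
    print(local_association(xy, a, b, k=15)[:5])
```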

Relevance: 100.00%

Abstract:

Digital image processing is exploited in many diverse applications but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to speed up substantially the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
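
One concrete example of the space-filling-curve scanning described above is the Hilbert curve; the minimal sketch below re-orders the pixels of an n × n image along it (assuming n is a power of two). It illustrates the 2-D to 1-D mapping only and does not reproduce the thesis's comparison of candidate curves.

```python
# Minimal sketch: re-order the pixels of an n x n image (n a power of two)
# along a Hilbert curve, one well-known space-filling curve. Illustrates
# the 2-D -> 1-D mapping discussed above.

def xy2d(n, x, y):
    """Index of pixel (x, y) along the Hilbert curve of an n x n grid."""
    d, s = 0, n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

def hilbert_scan(image):
    n = len(image)                       # assumes an n x n image, n a power of two
    order = sorted(((x, y) for y in range(n) for x in range(n)),
                   key=lambda p: xy2d(n, p[0], p[1]))
    return [image[y][x] for x, y in order]

if __name__ == "__main__":
    img = [[y * 4 + x for x in range(4)] for y in range(4)]
    print(hilbert_scan(img))
```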

Relevance: 100.00%

Abstract:

The development of cobalt molybdenum and cobalt tungsten brush plating electrolytes is described. Their optimum compositions and operating conditions for commercial applications have been determined. The effects of composition, pH, applied voltage, stylus speed and pressure upon deposit composition and efficiency have been investigated. Transmission and Scanning Electron Microscopy have been employed to study the cobalt alloy deposits produced. Evaluation of the wear resistant properties of the cobalt alloys developed in this work was carried out in the laboratory using a pin and disc technique and a simulated hot forging test, and by industrial trials involving the "on site" plating of hot forging dies and cold pressing tools. It was concluded that the electrolytes developed in this work enabled cobalt alloys containing 6% Mo or 8% W to be deposited at 17-20 V. Brush plated cobalt deposits possessed a mixed CPH and FCC crystallographic structure at room temperature. The application of 13 µm of either of the cobalt alloys resulted in improved wear performance in both pin and disc and simulated hot forging tests. The results of the industrial trials indicated that by the use of these alloys, the life of hot forging dies may be increased by 20-100%. A commercial forging organisation is using electrolytes developed in this work to coat dies prior to forging nimonic alloys. Reductions in forging temperature and improved forging qualities have been reported. Cold pressing tools coated with the alloys showed a reduced tendency to "pick-up" and scoring of the pressed panels. Reports of a reduced need for lubrication of panels before pressing have also been received.

Relevance: 100.00%

Abstract:

Nanostructured Cu/304 stainless steel (SS) multilayers were prepared by magnetron sputtering. 304SS has a face-centered-cubic (fcc) structure in bulk. However, in the Cu/304SS multilayers, the 304SS layers exhibit the fcc structure for layer thicknesses of ≤5 nm in epitaxy with the neighboring fcc Cu. For 304SS layer thickness larger than 5 nm, body-centered-cubic (bcc) 304SS grains grow on top of the initial 5 nm fcc SS with the Kurdjumov-Sachs orientation relationship between bcc and fcc SS grains. The maximum hardness of Cu/304SS multilayers is about 5.5 GPa (a factor of two enhancement compared to rule-of-mixtures hardness) at a layer thickness of 5 nm. Below 5 nm, hardness decreases with decreasing layer thickness. The peak hardness of the fcc/fcc Cu/304SS multilayer is greater than that of Cu/Ni, even though the lattice-parameter mismatch between Cu and Ni is five times greater than that between Cu and 304SS. This result may primarily be attributed to the higher interface barrier stress for single-dislocation transmission across the {111} twinned interfaces in Cu/304SS as compared to the {100} interfaces in Cu/Ni.

Relevance: 100.00%

Abstract:

The main aim of this research is to demonstrate strategic supplier performance evaluation of a UK-based manufacturing organisation using an integrated analytical framework. Developing long-term relationships with strategic suppliers is common in today's industry. However, monitoring suppliers' performance throughout the contractual period is important in order to ensure overall supply chain performance. Therefore, client organisations need to measure suppliers' performance dynamically and inform them of improvement measures. Although there are many studies introducing innovative supplier performance evaluation frameworks and empirical research on identifying criteria for supplier evaluation, little has been reported on the detailed application of strategic supplier performance evaluation and its implications for the overall performance of the organisation. Additionally, the majority of prior studies emphasise lagging factors (quality, delivery schedule and value/cost) for supplier selection and evaluation. This research proposes both leading (organisational practices, risk management, environmental and social practices) and lagging factors for supplier evaluation and demonstrates a systematic method for identifying those factors with the involvement of relevant stakeholders and process mapping. The contribution of this article is real-life, case-based action research utilising an integrated analytical model that combines quality function deployment and the analytic hierarchy process method for supplier performance evaluation. The effectiveness of the method has been demonstrated through a number of validations (e.g. focus groups, business results, and statistical analysis). Additionally, the study reveals that enhanced supplier performance results in a positive impact on the operational and business performance of the client organisation.
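
As a hedged sketch of the analytic hierarchy process component of such an integrated model, the code below derives criteria weights from a pairwise comparison matrix via its principal eigenvector and reports a consistency ratio; the criteria and the comparison judgements are illustrative assumptions only, not the case organisation's data.

```python
# Sketch of the AHP step in such an integrated model: criteria weights are
# derived from a pairwise comparison matrix via its principal eigenvector.
# The criteria and judgements below are illustrative assumptions only.
import numpy as np

criteria = ["quality", "delivery", "cost", "risk management", "environmental/social"]

# A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale).
A = np.array([
    [1,   3,   2,   5, 4],
    [1/3, 1,   1,   3, 2],
    [1/2, 1,   1,   3, 3],
    [1/5, 1/3, 1/3, 1, 1],
    [1/4, 1/2, 1/3, 1, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency index and ratio (random index for a 5x5 matrix is 1.12).
ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)
print(dict(zip(criteria, weights.round(3))), "CR =", round(ci / 1.12, 3))
```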