38 results for Information Ethics and its Applications

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively for the last 20 years. Recently, the focus has shifted from the stand-alone AHP to applications of integrated AHPs. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analysed to answer three questions: (i) which type of integrated AHP has received the most attention? (ii) in which areas have the integrated AHPs been most prevalently applied? (iii) are there any inadequacies in the approaches? Where inadequacies are found, improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs are better than the stand-alone AHP, but also helps researchers and decision makers apply the integrated AHPs effectively.
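As a rough point of reference for the priority-calculation step that the integrated variants build on, the following minimal sketch (invented comparison matrix; not code from any of the reviewed papers) computes AHP weights by the principal-eigenvector method together with a consistency ratio.

    # Minimal illustrative sketch of the stand-alone AHP priority step
    # (the comparison matrix and criteria are invented for illustration).
    import numpy as np

    # Pairwise comparison matrix for three hypothetical criteria
    # (cost, quality, delivery), using Saaty's 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Principal eigenvector gives the priority weights.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()

    # Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
    lam_max = eigvals.real[k]
    CI = (lam_max - len(A)) / (len(A) - 1)
    CR = CI / 0.58

    print("weights:", np.round(w, 3))
    print("consistency ratio:", round(CR, 3))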

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes an investigation into methods for controlling the mode distribution in multimode optical fibres. The major contributions presented in this thesis are summarised below. Emerging standards for Gigabit Ethernet transmission over multimode optical fibre have led to a resurgence of interest in the precise control, and specification, of modal launch conditions. In particular, commercial LED and OTDR test equipment does not, in general, comply with these standards. There is therefore a need for mode control devices which can ensure compliance with the standards. A novel device consisting of a point-load mode-scrambler in tandem with a mode-filter is described in this thesis. The device, which has been patented, may be tuned to achieve a wide range of mode distributions and has been implemented in a ruggedised package for field use. Various other techniques for mode control are described in this work, including the use of Long Period Gratings and air-gap mode-filters. Some of the methods have been applied to other applications, such as speckle suppression and sensor technology. A novel, self-referencing sensor comprising two modal groups in the Mode Power Distribution has been designed and tested. The feasibility of a two-channel Mode Group Diversity Multiplexed system has been demonstrated over 985 m. A test apparatus for measuring mode distribution has been designed and constructed. The apparatus consists of a purpose-built video microscope and comprehensive control and analysis software written in Visual Basic. The system may be fitted with a silicon camera or an InGaAs camera, for measurement in the 850 nm and 1300 nm transmission windows respectively. A limitation of the measurement method, when applied to well-filled fibres, has been identified, and an improvement to the method has been proposed, based on modelled Laguerre-Gauss field solutions.
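The final improvement mentioned above rests on Laguerre-Gauss field solutions. The minimal sketch below evaluates the standard textbook Laguerre-Gauss intensity profile; the mode indices and beam parameters are assumed for illustration and do not reproduce the thesis's fibre model.

    # Illustrative Laguerre-Gauss LG(p, ell) intensity profile on a radial grid;
    # textbook form with assumed beam parameters, not the thesis's fibre model.
    import numpy as np
    from scipy.special import genlaguerre

    def lg_intensity(r, p=1, ell=2, w0=10e-6):
        """Transverse intensity of an LG_{p,ell} mode at the beam waist (arb. units)."""
        x = 2 * r**2 / w0**2
        radial = (x ** (abs(ell) / 2)) * genlaguerre(p, abs(ell))(x) * np.exp(-x / 2)
        return radial**2  # intensity is |field|^2; the azimuthal phase drops out

    r = np.linspace(0, 30e-6, 200)
    I = lg_intensity(r)
    print("peak radius (um):", r[np.argmax(I)] * 1e6)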

Relevance:

100.00%

Publisher:

Abstract:

This chapter provides the theoretical foundation and background on the Data Envelopment Analysis (DEA) method, some variants of the basic DEA models, and applications to various sectors. Some illustrative examples and helpful resources on DEA, including a DEA software package, are also presented in this chapter. DEA is useful for measuring relative efficiency for a variety of institutions and has its own merits and limitations. This chapter concludes that DEA results should be interpreted with much caution to avoid giving wrong signals and providing inappropriate recommendations.
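For readers new to DEA, the sketch below illustrates the basic input-oriented CCR model solved as a linear program; the data are invented, and this is not the software package referred to in the chapter.

    # Minimal sketch of the input-oriented CCR DEA model as a linear program
    # (invented data for 4 decision-making units; not the chapter's DEA package).
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0], [4.0], [3.0], [5.0]])   # inputs, one column per input
    Y = np.array([[1.0], [2.0], [3.0], [2.0]])   # outputs, one column per output
    n, m = X.shape            # number of DMUs, number of inputs
    s = Y.shape[1]            # number of outputs

    def ccr_efficiency(k):
        """Efficiency of DMU k: minimise theta over [theta, lambda_1..lambda_n]."""
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
        A_in = np.c_[-X[k].reshape(m, 1), X.T]
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_rk
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        b_out = -Y[k]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    for k in range(n):
        print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")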

Relevance:

100.00%

Publisher:

Abstract:

Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies, forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ te/n, and by k(1-p)/(n-te) when p > te/n, where te is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-te)/(n-te) for te/n < p ≤ 1. In particular, if the code used is maximum distance separable, the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of the different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of a scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Comparisons between the two erasure control schemes exhibit their advantages as well as disadvantages in the role of delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study is provided to demonstrate how erasure control coding could be used to maximize the performance of practical systems. © 2010 IEEE.
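The bounds quoted in the abstract can be evaluated directly. The short sketch below does so for an (n, k) code with erasure control capability te; the parameter values are illustrative only.

    # Evaluate the throughput cap and residual-loss lower bound stated above
    # for an (n, k) forward erasure control code; example values only.
    def throughput_cap(n, k, te, p):
        """Throughput upper bound as a function of packet loss rate p."""
        r = k / n
        return r if p <= te / n else k * (1 - p) / (n - te)

    def residual_loss_bound(n, te, p):
        """Lower bound on the residual loss rate for te/n < p <= 1."""
        return max(0.0, (n * p - te) / (n - te))

    n, k = 255, 223          # illustrative Reed-Solomon-like parameters
    te = n - k               # an MDS code recovers up to n - k erasures
    for p in (0.05, 0.15, 0.30):
        print(f"p={p:.2f}  throughput <= {throughput_cap(n, k, te, p):.3f}  "
              f"residual loss >= {residual_loss_bound(n, te, p):.3f}")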

Relevance:

100.00%

Publisher:

Abstract:

We propose a new mathematical model for efficiency analysis, which combines DEA methodology with an old idea, Ratio Analysis. Our model, called DEA-R, treats all possible "output/input" ratios as outputs within the standard DEA model. Although DEA and DEA-R generate different summary measures for efficiency, the two measures are comparable. Our mathematical and empirical comparisons establish the validity of the DEA-R model in its own right. The key advantage of DEA-R over DEA is that it allows effective integration of the model with experts' opinions via flexible restrictive conditions on individual "output/input" pairs. © 2007 Springer Science+Business Media, LLC.
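A minimal sketch of the data-preparation step implied by the DEA-R idea is shown below: every "output/input" ratio is formed and treated as an output against a unit input, after which any standard DEA solver (such as the CCR sketch given earlier on this page) could be applied. The data are invented, and the formulation is a simplified reading of the model, not the authors' code.

    # Minimal sketch of the DEA-R preparation step: build all output/input
    # ratios as pseudo-outputs against a unit input (invented data).
    import numpy as np

    X = np.array([[2.0, 1.0],      # inputs,  one row per DMU
                  [4.0, 2.0],
                  [3.0, 1.5]])
    Y = np.array([[1.0],           # outputs, one row per DMU
                  [2.0],
                  [3.0]])

    # Each column of R is one "output/input" ratio; in DEA-R these ratios act
    # as the outputs of a standard DEA model whose single input equals 1.
    R = np.hstack([Y[:, [r]] / X[:, [i]]
                   for r in range(Y.shape[1]) for i in range(X.shape[1])])
    unit_input = np.ones((X.shape[0], 1))
    print(R)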

Relevance:

100.00%

Publisher:

Abstract:

The present investigation is based on a linguistic analysis of the 'Housing Act 1980' and attempts to examine the role of qualifications in the structuring of the legislative statement. The introductory chapter isolates legislative writing as a "sub-variety" of legal language and provides an overview of the controversies surrounding the way it is written and the problems it poses to its readers. Chapter two emphasizes the limitations of the available work on the description of language-varieties for the analysis of legislative writing and outlines the approach adopted for the present analysis. This chapter also gives some idea of the information-structuring of legislative provisions and establishes qualification as a key element in their textualisation. The next three chapters offer a detailed account of the ten major qualification-types identified in the corpus, concentrating on the surface form they take, the features of legislative statements they textualize and the syntactic positions to which they are generally assigned in the statement of legislative provisions. The emerging hypotheses in these chapters have often been verified through a specialist reaction from a Parliamentary Counsel, largely responsible for the writing of the 'Housing Act 1980'. The findings suggest useful correlations between a number of qualificational initiators and the various aspects of the legislative statement. They also reveal that many of these qualifications typically occur in those clause-medial syntactic positions which are sparingly used in other specialist discourse, thus creating syntactic discontinuity in the legislative sentence. Such syntactic discontinuities, on the evidence from psycholinguistic experiments reported in chapter six, create special problems in the processing and comprehension of legislative statements. The final chapter converts the main linguistic findings into a series of pedagogical generalizations, offers indications of how this may be applied in EALP situations and concludes with other considerations of possible applications.

Relevance:

100.00%

Publisher:

Abstract:

This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positive paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required; this can be managed through the use of consistent documentation techniques, clarity in the definition of process responsibilities, and management attention to global metrics. The centralisation of the management of the process model is critical to its success. 2. The role of the information management process within the context of a process-oriented enterprise is to provide flexible and cost-effective application, technology and process support to the business. This is best achieved through a centralisation of the management of information management and of the process model. A business-led approach combined with the consolidation of the applications, information, process and data architectures is central to providing effective business and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused on supporting and creating the process model. The two models are mutually creating: one cannot exist without the other. There is a duality concept of process and information management.

Relevance:

100.00%

Publisher:

Abstract:

Radio Frequency Identification (RFID) has been identified as a crucial technology for the modern 21st-century knowledge-based economy. Some businesses have realised the benefits of RFID adoption through improvements in operational efficiency, additional cost savings, and opportunities for higher revenues. RFID research in warehousing operations has been less prominent than in other application domains. To investigate how RFID technology has had an impact in warehousing, a comprehensive analysis of research findings available from articles in leading scientific article databases has been conducted. Articles from 1995 to 2010 have been reviewed and analysed with respect to warehouse operations, RFID application domains, benefits achieved and obstacles encountered. Four discussion topics are presented covering RFID in warehousing, focusing on its applications, perceived benefits, obstacles to its adoption and future trends. This is aimed at elucidating the current state of RFID in the warehouse and providing insights for researchers to establish new research agendas and for practitioners to consider and assess the adoption of RFID in warehousing functions. © 2013 Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

Due to vigorous globalisation and product proliferation in recent years, more waste has been produced by soaring manufacturing activities. This has contributed to a significant need for an efficient waste management system to ensure that waste is properly treated for recycling or properly disposed of. This paper presents a Decision Support System (DSS) framework, based on Constraint Logic Programming (CLP), for the collection management of industrial waste of all kinds and discusses the potential employment of Radio Frequency Identification (RFID) technology to improve several critical procedures involved in managing waste collection. This paper also demonstrates how a widely distributed and semi-structured network of waste-producing enterprises (e.g. manufacturers) and waste-processing enterprises (i.e. waste recycling/treatment stations) can improve their operations planning by means of the proposed DSS. The potential RFID applications for updating and validating information in a continuous manner, bringing value-added benefits to the waste collection business, are also presented. © 2012 Inderscience Enterprises Ltd.
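The paper's DSS is built on Constraint Logic Programming. As a loose, language-shifted illustration only (plain brute-force Python rather than CLP), the sketch below assigns waste-producing enterprises to processing stations subject to capacity constraints; all names and quantities are invented.

    # Rough illustration (plain Python, not CLP as in the paper) of the kind of
    # constrained assignment a waste-collection DSS resolves; data are invented.
    from itertools import product

    producers = {"plant_A": 4, "plant_B": 6, "plant_C": 3}   # tonnes of waste
    stations  = {"recycler_1": 8, "treatment_2": 7}          # capacity in tonnes

    best = None
    for choice in product(stations, repeat=len(producers)):
        assignment = dict(zip(producers, choice))
        load = {s: 0 for s in stations}
        for p, s in assignment.items():
            load[s] += producers[p]
        # Constraint: no station may exceed its capacity.
        if any(load[s] > stations[s] for s in stations):
            continue
        # Simple objective: balance the collected load across stations.
        spread = max(load.values()) - min(load.values())
        if best is None or spread < best[0]:
            best = (spread, assignment)

    print(best)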

Relevance:

100.00%

Publisher:

Abstract:

This review summarises the functions of the enzyme tissue transglutaminase (TG2) in the extracellular matrix (ECM), both as a matrix stabiliser through its protein cross-linking activity and as an important cell adhesion protein involved in cell survival. The contribution of extracellular TG2 to the pathology of important diseases such as cancer and fibrosis is discussed with a view to the potential importance of TG2 as a therapeutic target. The medical applications of TG2 are further expanded by detailing the use of transglutaminase cross-linking in the development of novel biocompatible biomaterials for use in soft and hard tissue repair.

Relevance:

100.00%

Publisher:

Abstract:

This thesis seeks to describe the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern recognition discriminant function. The technique also produces a distances frequency distribution that may be useful in detecting clustering in the data or for estimating parameters useful in discriminating between the different populations in the data. The technique can also be used in feature selection. The technique is essentially for the discovery of data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first contribution is the introduction of a transformation function into the technique of nonlinear mapping. The second contribution is the use of the distances frequency distribution instead of the distances time-sequence in nonlinear mapping. The third contribution is the formulation of a new generalised and normalised error function together with its optimal step size formula for gradient method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as an origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping, namely the introduction of the transformation function. The fourth chapter describes the second development step of the nonlinear mapping technique, the use of the distances frequency distribution instead of the distances time-sequence. The chapter also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all developments and proposes a new program for cluster analysis and pattern recognition integrating all the new features.
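The thesis's specific developments are not reproduced here, but the underlying idea of nonlinear mapping by gradient minimisation of an error (stress) function can be illustrated with the classical formulation, as in the sketch below; the data are random, and the transformation function, distance-frequency variant and generalised error function are omitted.

    # Minimal sketch of nonlinear mapping by gradient descent on a simple
    # stress function (classical form, random data); the thesis's refinements
    # are not reproduced.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 5))                              # high-dimensional data
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)      # target distances

    Y = X[:, :2].copy()                                       # initial 2-D configuration
    step = 0.01
    for _ in range(500):
        diff = Y[:, None] - Y[None, :]                        # pairwise differences (n, n, 2)
        d = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(d, 1.0)                              # avoid division by zero
        w = (d - D) / d
        np.fill_diagonal(w, 0.0)
        grad = 2.0 * (w[:, :, None] * diff).sum(axis=1)       # gradient of the stress
        Y -= step * grad

    d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    stress = ((d - D)[np.triu_indices(20, 1)] ** 2).sum()
    print("final stress:", round(float(stress), 3))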

Relevance:

100.00%

Publisher:

Abstract:

Exporting is one of the main ways in which organizations internationalize. With a more turbulent, heterogeneous, sophisticated and less familiar export environment, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building from an export information processing framework, this research focuses particularly on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been identified: instrumental, conceptual, legitimizing and manipulating. Results from the qualitative study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn is positively related to export performance. Furthermore, the results show that only one aspect of export memory use significantly affects export performance: the extent of export memory use. This finding could mean that no particular type of export memory use is favoured, since the choice of the type of use is situation specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.

Relevance:

100.00%

Publisher:

Abstract:

The recent history of small shop and independent retailing has been one of decline. The most desirable form of assistance is the provision of information which will increase efficiency. The aim of this study is to develop a model of marketing mix effectiveness which may be applied in small-scale retailing. A further aim is to enhance theoretical development in the marketing field. Recent changes in retailing have affected location, product range, pricing and promotion practices. Although a large number of variables representing aspects of the marketing mix may be identified, it is not possible, on the basis of currently available information, to quantify or rank them according to their effect on sales performance. In designing a suitable study, a major issue is that of access to a suitable representative sample of small retailers. The public nature of the retail activities involved facilitates the use of a novel observation approach to data collection. A cross-sectional survey research design was used, focussing on a clustered random sample of greengrocers and gent's fashion outfitters in the West Midlands. Linear multiple regression was the main analytical technique. Powerful regression models were evolved for both types of retailing. For greengrocers the major influences on trade are pedestrian traffic and shelf display space. For gent's outfitters they are centrality to other shopping, advertising and shelf display space. The models may be utilised by retailers to determine the relative strength of marketing mix variables. The level of precision is not sufficient to permit cost-benefit analysis. Comparison of the findings for the two distinct kinds of business studied suggests that an overall model of marketing mix effectiveness might be based on frequency of purchase, homogeneity of the shopping environment, elasticity of demand and the bulk characteristics of the goods sold by a shop.
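The kind of linear multiple regression model described above can be illustrated as follows; the observations are invented rather than drawn from the West Midlands survey, and only the two greengrocer variables named in the abstract (pedestrian traffic and shelf display space) are used.

    # Minimal sketch of the kind of linear multiple regression used in the study
    # (ordinary least squares via numpy); the observations are invented, not the
    # West Midlands survey data.
    import numpy as np

    rng = np.random.default_rng(1)
    pedestrian_traffic = rng.uniform(100, 2000, size=30)   # passers-by per hour
    shelf_display_space = rng.uniform(5, 50, size=30)      # metres of display
    sales = 0.4 * pedestrian_traffic + 20.0 * shelf_display_space \
            + rng.normal(0, 100, size=30)                  # weekly sales (invented)

    X = np.column_stack([np.ones(30), pedestrian_traffic, shelf_display_space])
    coef, residuals, *_ = np.linalg.lstsq(X, sales, rcond=None)
    print("intercept, traffic, display-space coefficients:", np.round(coef, 2))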

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the design, fabrication and testing of novel grating-based Optical Fibre Sensor (OFS) systems interrogated using "off the shelf" interrogation systems, with the eventual development of marketable commercial systems at the forefront of the research. In both the industrial weighing and aerospace industries, there has been a drive to investigate the feasibility of deploying optical fibre sensors where traditionally their electrical or mechanical counterparts would have been used. Already, in the industrial weighing industry, commercial operators are deploying OFS-based Weigh-In-Motion (WIM) systems. Likewise, in the aerospace industry, OFS have been deployed to monitor such parameters as load history, impact detection, structural damage, overload detection, centre of gravity and the determination of blade shape. Based on the intrinsic properties of fibre Bragg gratings (FBGs) and Long Period Fibre Gratings (LPFGs), a number of novel OFS-based systems have been realised. Experimental work has shown that, in the case of static industrial weighing, FBGs can be integrated with current commercial products and used to detect applied loads. The work has also shown that embedding FBGs in e-glass to form a sensing patch allows such patches to be bonded to rail track, forming the basis of an FBG-based WIM system. The results obtained have been sufficiently encouraging to the industrial partner that this work will be progressed beyond the scope of the work presented in this thesis. Likewise, and to the best of the author's knowledge, a novel Bragg-grating-based system for aircraft fuel parameter sensing has been presented. FBG-based pressure sensors have been shown to demonstrate good sensitivity, linearity and repeatability, whilst LPFG-based systems have demonstrated far greater sensitivity when compared to FBGs, as well as the advantage of being potentially able to detect causes of fuel adulteration owing to their sensitivity to refractive index (RI). In the case of the LPFG-based system, considerable work remains to be done on mechanical strengthening to improve its survivability in a live aircraft fuel tank environment. The FBG system has already been developed to an aerospace-compliant prototype and is due to be tested at the fuel testing facility at Airbus, Filton, UK. It is envisaged by the author that in both application areas continued research will lead to the eventual development of marketable commercial products.
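FBG sensing of load and pressure rests on the Bragg condition and the approximately linear shift of the Bragg wavelength with strain. The sketch below converts a measured wavelength shift to strain using typical textbook constants for silica fibre; these are assumed values, not the thesis's calibration.

    # Minimal sketch of FBG strain read-out: Bragg condition and the usual
    # linear strain response with a typical photo-elastic coefficient for
    # silica (assumed constants, not the thesis's calibration).
    def bragg_wavelength(n_eff, period_nm):
        """Bragg condition: lambda_B = 2 * n_eff * grating period."""
        return 2.0 * n_eff * period_nm

    def strain_from_shift(delta_lambda_nm, lambda_b_nm, p_e=0.22):
        """Invert delta_lambda / lambda_B = (1 - p_e) * strain."""
        return delta_lambda_nm / (lambda_b_nm * (1.0 - p_e))

    lam = bragg_wavelength(n_eff=1.468, period_nm=528.0)     # ~1550 nm grating
    eps = strain_from_shift(delta_lambda_nm=0.12, lambda_b_nm=lam)
    print(f"Bragg wavelength: {lam:.1f} nm, strain: {eps * 1e6:.0f} microstrain")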