32 results for Analysis Tools

in Aston University Research Archive


Relevance:

70.00%

Publisher:

Abstract:

The major role of information and communication technology (ICT) in the new economy is well documented: countries worldwide are pouring resources into their ICT infrastructure despite the widely acknowledged “productivity paradox”. Evaluating the contribution of ICT investments has become an elusive but important goal of IS researchers and economists. This area of research is fraught with complexity, and we use Solow's Residual together with time-series analysis tools to overcome some methodological inadequacies of previous studies. Using this approach, we conduct a study of 20 countries to determine whether there is empirical evidence to support claims that ICT investments are worthwhile. The results show that ICT contributes to economic growth in many developed countries and newly industrialized economies (NIEs), but not in developing countries. Finally, we suggest ICT-complementary factors in an attempt to rectify possible flaws in ICT policies, as a contribution towards improving global productivity.
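
The growth-accounting idea behind Solow's Residual can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration rather than the study's actual estimation: the capital-share value and the toy growth rates are assumptions, and the residual is simply the part of output growth not explained by growth in capital and labour.

```python
# Minimal sketch of Solow-residual growth accounting (illustrative only;
# the 0.3 capital share and the toy growth rates are assumptions).
def solow_residual(g_output, g_capital, g_labour, capital_share=0.3):
    """Return total factor productivity growth: output growth left over after
    subtracting the weighted contributions of capital and labour growth."""
    return g_output - capital_share * g_capital - (1 - capital_share) * g_labour

# Toy example: 4% output growth, 5% capital growth, 1% labour growth.
print(solow_residual(0.04, 0.05, 0.01))  # ~0.018, i.e. 1.8% TFP growth
```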

Relevance:

60.00%

Publisher:

Abstract:

Construction projects are risky, and a build-operate-transfer (BOT) project is recognised as one of the riskiest project schemes. The scheme has been employed frequently in the past few decades, in both developed and developing countries; however, because of its risky nature, there have been failures as well as successes. Appropriate risk analysis is therefore desirable when implementing BOT projects. Various tools and techniques are applicable to risk analysis, and the application of these risk analysis tools and techniques (RATTs) to BOT projects depends on an understanding of the contents and contexts of BOT projects, together with a thorough understanding of the RATTs themselves. This paper studies the key points in their application through a review of the relevant literature and discusses the application of RATTs to BOT projects from the viewpoints of the major project participants, i.e. government, lenders and project companies. Political risks, which are particularly important in BOT projects, are also discussed. A flow chart is introduced to select an appropriate tool for risk management in BOT projects. This study contributes to the establishment of a framework for systematic risk management in BOT projects.
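
The paper's selection flow chart is not reproduced in the abstract; purely as a hypothetical illustration of how such selection logic might be encoded, the sketch below maps a few yes/no criteria to candidate tools. The criteria and tool names are assumptions, not the authors' actual chart.

```python
# Hypothetical illustration of encoding a tool-selection flow chart.
# The criteria and tool names below are assumptions, not the paper's chart.
def select_risk_tool(quantitative_data_available: bool,
                     dependencies_between_risks: bool,
                     expert_judgement_only: bool) -> str:
    if expert_judgement_only:
        return "Delphi / risk checklist"
    if not quantitative_data_available:
        return "Qualitative risk matrix"
    if dependencies_between_risks:
        return "Monte Carlo simulation with correlated inputs"
    return "Sensitivity analysis"

print(select_risk_tool(True, True, False))
```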

Relevance:

60.00%

Publisher:

Abstract:

The continuing threat of infectious disease and future pandemics, coupled with the continuous increase of drug-resistant pathogens, makes the discovery of new and better vaccines imperative. For effective vaccine development, antigen discovery and validation is a prerequisite. The compilation of information concerning pathogens, virulence factors and antigenic epitopes has resulted in many useful databases. However, most such immunological databases focus almost exclusively on antigens whose epitopes are known and ignore those for which epitope information is unavailable. We have compiled more than 500 antigens into the AntigenDB database, drawing on the literature and other immunological resources. These antigens come from 44 important pathogenic species. In AntigenDB, a database entry contains information regarding the sequence, structure, origin, etc. of an antigen, with additional information such as B- and T-cell epitopes, MHC binding, function, gene expression and post-translational modifications, where available. AntigenDB also provides links to major internal and external databases. We shall update AntigenDB on a rolling basis, regularly adding antigens from other organisms and extra data analysis tools. AntigenDB is available freely at http://www.imtech.res.in/raghava/antigendb and its mirror site http://www.bic.uams.edu/raghava/antigendb.
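
A database entry of the kind described might be represented, very roughly, as in the sketch below. The field names are assumptions based on the abstract, not AntigenDB's actual schema.

```python
# Rough sketch of the kind of record the abstract describes.
# Field names are assumptions, not AntigenDB's actual schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AntigenEntry:
    antigen_id: str
    species: str                      # one of the 44 pathogenic species
    sequence: str                     # protein sequence
    structure_pdb_id: Optional[str] = None
    b_cell_epitopes: List[str] = field(default_factory=list)
    t_cell_epitopes: List[str] = field(default_factory=list)
    mhc_binders: List[str] = field(default_factory=list)
    function: Optional[str] = None
    ptms: List[str] = field(default_factory=list)  # post-translational modifications

entry = AntigenEntry("AG0001", "Mycobacterium tuberculosis", "MTEQLILNK")
print(entry.species)
```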

Relevance:

60.00%

Publisher:

Abstract:

Product design decisions can have a significant impact on the financial and operational performance of manufacturing companies. Therefore, good analysis of the financial impact of design decisions is required if the profitability of the business is to be maximised. The product design process can be viewed as a chain of decisions which links decisions about the concept to decisions about the detail. The idea of decision chains can be extended to include the design and operation of the 'downstream' business processes which manufacture and support the product. These chains of decisions are not independent but are interrelated in a complex manner. Dealing with the interdependencies requires a modelling approach which represents all the chains of decisions, to a level of detail not normally considered in the analysis of product design. The operational, control and financial elements of a manufacturing business constitute a dynamic system: these elements interact with each other and with external elements (i.e. customers and suppliers). Analysing the chain of decisions for such an environment requires the application of simulation techniques, not just to any one area of interest but to the whole business, i.e. an enterprise simulation. To investigate the capability and viability of enterprise simulation, an experimental 'Whole Business Simulation' system has been developed. This system combines specialist simulation elements and standard operational applications software packages to create a model that incorporates all the key elements of a manufacturing business, including its customers and suppliers. By means of a series of experiments, the performance of this system was compared with a range of existing analysis tools (i.e. DFX, capacity calculation, a shop floor simulator, and a business planner driven by a shop floor simulator).
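
As a purely illustrative sketch, not the thesis's 'Whole Business Simulation' system, a minimal discrete-event loop linking customer orders, production and cash flow might look like the following; all names and figures are invented assumptions.

```python
# Minimal, hypothetical sketch of a whole-business discrete-event loop linking
# customer orders, production and cash flow.  Not the thesis's actual system.
import heapq
import random

def run(sim_days=30, unit_price=100.0, unit_cost=60.0):
    random.seed(0)
    events = [(0.0, "order")]          # (time in days, event kind)
    cash, backlog = 0.0, 0
    while events:
        t, kind = heapq.heappop(events)
        if t >= sim_days:
            break
        if kind == "order":            # a customer order arrives
            backlog += 1
            heapq.heappush(events, (t + random.uniform(0.5, 2.0), "order"))
            heapq.heappush(events, (t + 1.0, "ship"))   # one-day production lead time
        elif kind == "ship" and backlog:                # manufacture and ship one unit
            backlog -= 1
            cash += unit_price - unit_cost
    return cash, backlog

print(run())
```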

Relevance:

60.00%

Publisher:

Abstract:

We report statistical time-series analysis tools providing improvements in the rapid, precise extraction of discrete-state dynamics from time traces of experimental observations of molecular machines. By building physical knowledge and statistical innovations into the analysis tools, we provide techniques for estimating discrete state transitions buried in highly correlated molecular noise. We demonstrate the effectiveness of our approach on simulated and real examples of step-like rotation of the bacterial flagellar motor and the F1-ATPase enzyme. We show that our method can clearly identify molecular steps, periodicities and cascaded processes that are too weak for existing algorithms to detect, and can do so much faster than existing algorithms. Our techniques represent a step toward automated analysis of high-sample-rate molecular-machine dynamics. Modular, open-source software that implements these techniques is provided.
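
The abstract does not specify the estimator, so the snippet below is only a generic illustration of locating a single step buried in noise, using a simple least-squares changepoint scan; it is not the authors' statistical method.

```python
# Generic illustration of locating a single step in a noisy trace with a
# least-squares changepoint scan.  Not the paper's statistical method.
import numpy as np

def find_step(trace):
    """Return the index that best splits the trace into two constant levels."""
    best_i, best_cost = None, np.inf
    for i in range(2, len(trace) - 2):
        left, right = trace[:i], trace[i:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

rng = np.random.default_rng(0)
trace = np.concatenate([np.zeros(200), np.ones(200)]) + 0.5 * rng.standard_normal(400)
print(find_step(trace))  # close to 200
```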

Relevance:

60.00%

Publisher:

Abstract:

In an Arab oil-producing country in the Middle East such as Kuwait, the oil industry is considered the country's main and most important industry. Its importance stems from the significant role it plays in both the national economy and the global economy, and its criticality comes from its interconnection with national security and power in the Middle East region. Conducting this research in such a crucial industry therefore adds value for companies in the sector, as it thoroughly investigates the main components of the TQM implementation process and identifies which components significantly affect TQM implementation and the business results gained from it. In addition, the oil sector is large and known for its workforce drawn from many national cultures and backgrounds; this culturally heterogeneous industry is therefore an appropriate environment in which to address a need in the literature to investigate the effects of national culture values on the TQM implementation process. Furthermore, this research develops a new conceptual model of the TQM implementation process in the Kuwaiti oil industry that applies generally to operations and production organisations in the Kuwaiti business environment and specifically to organisations in the oil industry; it also serves as a theoretical model for improving the operations and production performance of the oil industry in other developing and developed countries. These findings narrow a gap in the literature, namely the limited amount of empirical research on TQM implementation in well-developed industries in Arab developing countries, and specifically in Kuwait, where no coherent national model existed for universal TQM implementation in the Kuwaiti oil industry in particular or in the Kuwaiti business environment in general. Finally, the newly developed research framework, which emerged from the literature review, was validated using rigorous quantitative analysis tools, including SPSS and structural equation modelling; the quantitative findings from the collected questionnaires were supported by the qualitative findings from the interviews conducted.
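
The abstract mentions validation with SPSS and structural equation modelling but gives no detail. As a small, purely illustrative example of the kind of quantitative check used when validating questionnaire constructs, the sketch below computes Cronbach's alpha for one construct; the response data are invented and this is not the thesis's actual analysis.

```python
# Illustrative only: Cronbach's alpha for one questionnaire construct.
# The response matrix is invented; this is not the thesis's actual analysis.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of Likert-type scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

responses = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(round(cronbach_alpha(responses), 2))
```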

Relevance:

30.00%

Publisher:

Abstract:

In view of the need to provide tools that facilitate the re-use of existing knowledge structures such as ontologies, we present in this paper a system, AKTiveRank, for ranking ontologies. AKTiveRank takes as input the search terms provided by a knowledge engineer and, using the output of an ontology search engine, ranks the ontologies. We apply a number of metrics in an attempt to investigate their appropriateness for ranking ontologies, and compare the results with a questionnaire-based human study. Our results show that AKTiveRank has considerable potential utility, although there is room for improvement.
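
The specific metrics are not named in the abstract, so the sketch below only illustrates the general idea of combining per-ontology metric scores into a ranking and comparing it with a human ranking via rank correlation; the metric names, weights and scores are assumptions.

```python
# Illustration of combining metric scores into an ontology ranking and
# comparing it with a human ranking.  Metrics, weights and scores are invented.
from scipy.stats import spearmanr

ontologies = {
    "onto_a": {"coverage": 0.9, "centrality": 0.4},
    "onto_b": {"coverage": 0.6, "centrality": 0.8},
    "onto_c": {"coverage": 0.3, "centrality": 0.5},
}
weights = {"coverage": 0.6, "centrality": 0.4}

def score(metrics):
    return sum(weights[m] * v for m, v in metrics.items())

ranked = sorted(ontologies, key=lambda o: score(ontologies[o]), reverse=True)
human_ranked = ["onto_a", "onto_b", "onto_c"]  # e.g. from a questionnaire

# Agreement between the tool's ranking and the human ranking.
rho, _ = spearmanr([ranked.index(o) for o in ontologies],
                   [human_ranked.index(o) for o in ontologies])
print(ranked, rho)
```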

Relevance:

30.00%

Publisher:

Abstract:

This paper surveys the context of feature extraction by neural network approaches, and compares and contrasts their behaviour as prospective data visualisation tools in a real-world problem. We also introduce and discuss a hybrid approach which allows us to control the degree of discriminatory and topographic information in the extracted feature space.
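
The abstract does not describe the hybrid method itself. As a loose illustration of the trade-off it mentions, the sketch below projects toy data to two dimensions with an unsupervised method (PCA) and a discriminatory, class-aware one (LDA); this is a stand-in comparison, not the paper's neural-network approach.

```python
# Loose illustration of discriminatory vs unsupervised 2-D feature extraction.
# Not the paper's neural-network hybrid; just a toy comparison on iris data.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
X_pca = PCA(n_components=2).fit_transform(X)                             # unsupervised
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)   # class-aware
print(X_pca.shape, X_lda.shape)
```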

Relevance:

30.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps: problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA applied to three service processes in one hospital; this very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance, and it may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws, and there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.

Relevance:

30.00%

Publisher:

Abstract:

This review is structured in three sections and provides a conceptual framework for the empirical analysis of strategy tools as they are used in practice; examples of strategy tools include SWOT analysis and Porter's Five Forces. Section one reviews empirical research into the use of strategy tools, classifying them according to variations in their use. Section two explains the concept of boundary objects as the basis for our argument that strategy tools may be understood as boundary objects: artefacts that are meaningfully and usefully incorporated to enable the sharing of information and transfer of knowledge across intra-organizational boundaries, such as laterally across different strategic business units or vertically across hierarchical levels. Section three draws the two bodies of literature together, conceptualizing strategy tools in practice as boundary objects. This review contributes to knowledge on using strategy tools in practice.

Relevance:

30.00%

Publisher:

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s and have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, others more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular its use to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements, and to provide the first stages of a tool to aid the design of VDU displays for process control.
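
Hierarchical Task Analysis decomposes a goal into subtasks governed by a plan. As a small, hypothetical illustration (the example task and fields are invented, not taken from the thesis's case studies), such a decomposition could be recorded as a simple tree from which the information items an operator needs can be listed.

```python
# Hypothetical sketch of a Hierarchical Task Analysis record: a goal broken
# into subtasks with a plan, plus the information each subtask requires.
# The example task and field names are invented, not from the thesis.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    plan: str = ""                           # how/when subtasks are executed
    info_required: List[str] = field(default_factory=list)
    subtasks: List["Task"] = field(default_factory=list)

startup = Task(
    "Start up reactor",
    plan="Do 1, then 2; repeat 2 until temperature is stable",
    subtasks=[
        Task("1. Open feed valve", info_required=["valve status", "feed flow rate"]),
        Task("2. Monitor temperature", info_required=["reactor temperature", "setpoint"]),
    ],
)

def information_requirements(task):
    """Flatten the tree into the display items the operator needs."""
    items = list(task.info_required)
    for sub in task.subtasks:
        items += information_requirements(sub)
    return items

print(information_requirements(startup))
```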

Relevance:

30.00%

Publisher:

Abstract:

The main objective of the work presented in this thesis is to investigate the two sides of the flute of a twist drill: the face and the heel. The flute face was designed to yield straight diametral lips which could be extended to eliminate the chisel edge, so that a single cutting edge is obtained. Since drill rigidity and space for chip conveyance have to be a compromise, a theoretical expression is deduced which enables optimum chip disposal capacity to be described in terms of drill parameters; this expression is used to describe the flute heel side. Another main objective is to study the effect on drill performance of changing the conventional drill flute. Drills were manufactured according to the new flute design, and tests were run in order to compare the performance of a conventional-flute drill with the non-conventional design put forward. The results showed that a 50% reduction in thrust force and approximately an 18% reduction in torque were attained for the new design. The flank wear, measured at the outer corner, was found to be less for the new-design drill than for the conventional one in the majority of cases. Hole quality (roundness, size and roughness) was also considered as a further aspect of drill performance, and improvement in hole quality is shown to arise under certain cutting conditions. Accordingly, it might be possible to use a hole produced in one pass of the new drill where previously a drilled and reamed hole would have been required. A subsidiary objective is to design the form milling cutter that should be employed for milling the foregoing special flute from the drill blank, allowing for the interference effect; a mathematical analysis in conjunction with computing techniques is used. To control the grinding parameters, a prototype drill grinder was designed and built upon the framework of an existing Cincinnati cutter grinder. The design and build of the new grinder is based on a computer-aided drill point geometry analysis. In addition to the conical grinding concept, the new grinder is also used to produce a spherical point, utilizing a computer-aided drill point geometry analysis.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents the results from an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear, despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way, as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to demonstrate that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings; however, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high costs and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has, hitherto, been restricted to large institutions able to afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
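
One standard entry point for the dynamical-systems view of a single-channel recording is time-delay embedding of the measured trace. The snippet below is a generic illustration of that step only; the embedding dimension and delay are arbitrary assumptions, and it does not reproduce the specific analyses developed in the thesis.

```python
# Generic illustration of time-delay embedding of a single-channel signal,
# a common first step in nonlinear time-series analysis.  The embedding
# dimension and delay below are arbitrary, not values from the thesis.
import numpy as np

def delay_embed(x, dim=3, delay=10):
    """Stack delayed copies of x into points in a dim-dimensional state space."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

t = np.arange(5000)
signal = np.sin(2 * np.pi * t / 100) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
states = delay_embed(signal)
print(states.shape)   # (4980, 3)
```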

Relevance:

30.00%

Publisher:

Abstract:

Visual perception is dependent on both light transmission through the eye and neuronal conduction through the visual pathway. Advances in clinical diagnostics and treatment modalities over recent years have increased the opportunities to improve the optical path and retinal image quality. Higher-order aberrations and retinal straylight are two major factors that influence light transmission through the eye and, ultimately, visual outcome. Recent technological advancements have brought these important factors into the clinical domain; however, the potential applications of these tools and the considerations regarding interpretation of the data remain much underestimated. The purpose of this thesis was to validate and optimise wavefront analysers and a new clinical tool for the objective evaluation of intraocular scatter. The application of these methods in a clinical setting involving a range of conditions was also explored. The work was divided into two principal sections.

1. Wavefront aberrometry: optimisation, validation and clinical application. The main findings of this work were:
• Observer manipulation of the aberrometer increases variability by a factor of 3.
• Ocular misalignment can profoundly affect reliability, notably for off-axis aberrations.
• Aberrations measured with wavefront analysers using different principles are not interchangeable, with poor relationships and significant differences between values.
• Instrument myopia of around 0.30 D is induced when performing wavefront analysis in non-cyclopleged eyes; values can be as high as 3 D, being higher as the baseline level of myopia decreases. Associated accommodation changes may result in relevant changes to the aberration profile, particularly with respect to spherical aberration.
• Young adult healthy Caucasian eyes have significantly more spherical aberration than Asian eyes when matched for age, gender, axial length and refractive error. Axial length is significantly correlated with most components of the aberration profile.

2. Intraocular light scatter: evaluation of subjective measures, and validation and application of a new objective method utilising clinically derived wavefront patterns. The main findings of this work were:
• Subjective measures of clinical straylight are highly repeatable; three measurements are suggested as the optimum number for increased reliability.
• Significant differences in straylight values were found for contact lenses designed for contrast enhancement compared with clear lenses of the same design and material specifications. Specifically, grey/green tints induced significantly higher values of retinal straylight.
• Wavefront patterns from a commercial Hartmann-Shack device can be used to obtain objective measures of scatter and are well correlated with subjective straylight values.
• Perceived retinal straylight was similar in groups of patients implanted with monofocal and multifocal intraocular lenses. Correlation between objective and subjective measurements of scatter is poor, possibly due to different illumination conditions between the testing procedures, or a neural component which may alter with age.

Careful acquisition results in highly reproducible in vivo measures of higher-order aberrations; however, data from different devices are not interchangeable, which brings the accuracy of measurement into question. Objective measures of intraocular straylight can be derived from clinical aberrometry and may be of great diagnostic and management importance in the future.
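
Higher-order aberrations from a wavefront analyser are commonly summarised as the root-mean-square of the Zernike coefficients above second radial order. The snippet below illustrates only that summary step, with invented coefficients; it is not the thesis's analysis pipeline.

```python
# Illustration of summarising higher-order aberrations as an RMS value over
# Zernike coefficients of radial order >= 3.  Coefficients are invented.
import numpy as np

# (radial order n, coefficient in micrometres) for a hypothetical eye
zernike = [(2, 0.45), (2, -0.30),          # defocus/astigmatism (lower order)
           (3, 0.12), (3, -0.05),          # coma terms
           (4, 0.08)]                      # spherical aberration

hoa = [c for n, c in zernike if n >= 3]
hoa_rms = np.sqrt(np.sum(np.square(hoa)))
print(round(hoa_rms, 3))                   # total higher-order RMS in micrometres
```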

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the discursive patterns of interactions between police interviewers and women reporting rape in significant witness interviews. Data in the form of video recorded interviews were obtained from a UK police force for the purposes of this study. The data are analysed using a multi-method approach, incorporating tools from micro-sociology, Conversation Analysis and Discursive Psychology, to reveal patterns of interactional control, negotiation, and interpretation. The study adopts a critical approach, which is to say that as well as describing discursive patterns, it explains them in light of the discourse processes involved in the production and consumption of police interview talk, and comments on the relationship between these discourse processes and the social context in which they occur. A central focus of the study is how interviewers draw on particular interactional resources to shape interviewees' accounts in particular ways, and this is discussed in relation to the institutional role of the significant witness interview. The discussion is also extended to the ways in which mainstream rape ideology is both reflected in, and maintained by, the discursive choices of participants. The findings of this study indicate that there are a number of issues to be addressed in terms of the training currently offered to officers at Level 2 of the Professionalising Investigation Programme (PIP) (NPIA, 2009) who intend to conduct significant witness interviews. Furthermore, a need is identified to bring the linguistic and discursive processes of negotiation and transformation identified by the study to the attention of the justice system as a whole. This is a particularly pressing need in light of judicial reluctance to replace written witness statements, the current 'end product' of significant witness interviews, with the video recorded interview in place of direct examination in cases of rape.