910 results for Integrated Expert Systems


Relevance:

30.00%

Publisher:

Abstract:

Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings or priorities of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, the goal programming (GP) model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
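The AHP weighting step described in this abstract is not spelled out there, but it can be sketched briefly: priorities are obtained as the normalised principal eigenvector of a pairwise comparison matrix. The matrix below is purely illustrative (the paper's actual warehouse comparisons are not given); the computation is the standard Saaty procedure, not code from the paper.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three candidate warehouses
# (Saaty scale: A[i, j] = how strongly warehouse i is preferred to j).
# These judgements are invented for the sketch.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priorities: the principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency check: CR below ~0.1 means the judgements are acceptably
# consistent (0.58 is Saaty's random index for a 3x3 matrix).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58

print(weights)  # priority of each warehouse
print(cr)
```

The consistency ratio guards against contradictory judgements; values above roughly 0.1 usually mean the pairwise comparisons should be revisited before the priorities are fed into the GP model.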

Relevance:

30.00%

Publisher:

Abstract:

Resource allocation is one of the major decision problems arising in higher education. Resources must be allocated optimally in such a way that the performance of universities can be improved. This paper applies an integrated multiple criteria decision making approach to the resource allocation problem. In the approach, the Analytic Hierarchy Process (AHP) is first used to determine the priority or relative importance of proposed projects with respect to the goals of the universities. Then, the Goal Programming (GP) model incorporating the constraints of AHP priority, system, and resource is formulated for selecting the best set of projects without exceeding the limited available resources. The projects include 'hardware' (tangible university infrastructure) and 'software' (intangible effects that can benefit the university, its members, and its students). In this paper, two commercial packages are used: Expert Choice for determining the AHP priority ranking of the projects, and LINDO for solving the GP model. Copyright © 2007 Inderscience Enterprises Ltd.
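The selection step in this abstract, choosing the project set that maximises total AHP priority without exceeding the resource budget, is at heart a 0-1 programme. The paper solves its GP model with LINDO; the sketch below instead uses SciPy's MILP solver, with invented priorities and costs purely for illustration.

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# Invented data for five candidate projects: AHP priorities and the
# resource cost of each, with a single overall budget.
priority = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
cost = np.array([40.0, 35.0, 25.0, 20.0, 15.0])
budget = 85.0

# 0-1 selection: maximise total priority (milp minimises, so negate)
# subject to the resource constraint cost @ x <= budget.
res = milp(
    c=-priority,
    constraints=LinearConstraint(cost, ub=budget),
    integrality=np.ones(len(priority)),  # each x_i is binary
    bounds=Bounds(0, 1),
)

selected = np.flatnonzero(res.x > 0.5)
print(selected, -res.fun)  # chosen projects and their total priority
```

A full goal programme would also carry deviation variables for the softer goals; the budget-constrained binary selection shown here is only the core of such a model.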

Relevance:

30.00%

Publisher:

Abstract:

The development of strategy remains a debate for academics and a concern for practitioners. Published research has focused on producing models for strategy development and on studying how strategy is developed in organisations. The Operational Research literature has highlighted the importance of considering complexity within strategic decision making; but little has been done to link strategy development with complexity theories, despite organisations and organisational environments becoming increasingly more complex. We review the dominant streams of strategy development and complexity theories. Our theoretical investigation results in the first conceptual framework which links an established Strategic Operational Research model, the Strategy Development Process model, with complexity via Complex Adaptive Systems theory. We present preliminary findings from the use of this conceptual framework applied to a longitudinal, in-depth case study, to demonstrate the advantages of using this integrated conceptual model. Our research shows that the conceptual model proposed provides rich data and allows for a more holistic examination of the strategy development process. © 2012 Operational Research Society Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to use observations, or occasionally physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
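A SHELF-style elicitation of a continuous quantity typically ends by fitting a parametric distribution to the expert's stated quantiles. A minimal sketch of that final step, assuming a normal distribution and invented quartile judgements (the tools described in this abstract support richer choices):

```python
from scipy.stats import norm

# Invented expert judgements: median and quartiles of an uncertain
# model input, as stated in a SHELF-style elicitation exercise.
q25, q50, q75 = 12.0, 15.0, 18.0

# Fit a normal distribution: the median gives the mean, and the
# interquartile range fixes the standard deviation.
mu = q50
sigma = (q75 - q25) / (2 * norm.ppf(0.75))

fitted = norm(mu, sigma)

# The fitted distribution reproduces the elicited quartiles exactly.
print(fitted.ppf([0.25, 0.50, 0.75]))
```

In practice the fitted family is chosen by comparing several candidates against the elicited quantiles, and the result is shown back to the expert for feedback before being accepted.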

Relevance:

30.00%

Publisher:

Abstract:

Reliable, high-throughput, in vitro preliminary screening batteries have the potential to greatly accelerate the rate at which regulatory neurotoxicity data are generated. This study evaluated the importance of astrocytes when predicting acute toxic potential using a neuronal screening battery of pure neuronal (NT2.N), pure astrocytic (NT2.A) and integrated neuronal/astrocytic (NT2.N/A) cell systems derived from the human NT2.D1 cell line, using biochemical endpoints (mitochondrial membrane potential (MMP) depolarisation and ATP and GSH depletion). Following exposure for 72 h, the known acute human neurotoxicants trimethyltin chloride, chloroquine and 6-hydroxydopamine were frequently capable of disrupting biochemical processes in all of the cell systems at non-cytotoxic concentrations. Astrocytes provide key metabolic and protective support to neurons during toxic challenge in vivo, and generally the astrocyte-containing cell systems showed increased tolerance to toxicant insult compared with the NT2.N mono-culture in vitro. Whilst there was no consistent relationship between MMP, ATP and GSH log IC50 values for the NT2.N/A and NT2.A cell systems, these data did provide preliminary evidence of modulation of the acute neuronal toxic response by astrocytes. In conclusion, the suitability of NT2 neurons and astrocytes as cell systems for acute toxicity screening deserves further investigation.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates how existing software engineering techniques can be employed, adapted and integrated for the development of systems of systems. Starting from existing system-of-systems (SoS) studies, we identify computing paradigms and techniques that have the potential to help address the challenges associated with SoS development, and propose an SoS development framework that combines these techniques in a novel way. This framework addresses the development of a class of IT systems of systems characterised by high variability in the types of interactions between their component systems, and by relatively small numbers of such interactions. We describe how the framework supports the dynamic, automated generation of the system interfaces required to achieve these interactions, and present a case study illustrating the development of a data-centre SoS using the new framework.

Relevance:

30.00%

Publisher:

Abstract:

Plasma or "dry" etching is an essential process for the production of modern microelectronic circuits. However, despite intensive research, many aspects of the etch process are not fully understood. The results of studies of the plasma etching of Si and SiO2 in fluorine-containing discharges, and of the complementary technique of plasma polymerisation, are presented in this thesis. Optical emission spectroscopy with argon actinometry was used as the principal plasma diagnostic. Statistical experimental design was used to model and compare Si and SiO2 etch rates in CF4 and SF6 discharges as a function of flow, pressure and power. Etch mechanisms in both systems, including the potential reduction of Si etch rates in CF4 due to fluorocarbon polymer formation, are discussed. Si etch rates in CF4/SF6 mixtures were successfully accounted for by the models produced. Si etch rates in CF4/C2F6 and CHF3 as a function of the addition of oxygen-containing additives (O2, N2O and CO2) are shown to be consistent with a simple competition between F, O and CFx species for Si surface sites. For the range of conditions studied, SiO2 etch rates were not dependent on F-atom concentration, but the presence of fluorine was essential in order to achieve significant etch rates. The influence of a wide range of electrode materials on the etch rate of Si and SiO2 in CF4 and CF4/O2 plasmas was studied. It was found that the Si etch rate in a CF4 plasma was considerably enhanced, relative to an anodised aluminium electrode, in the presence of soda glass or sodium- or potassium-"doped" quartz. The effect was even more pronounced in a CF4/O2 discharge. In the latter system lead and copper electrodes also enhanced the Si etch rate. These results could not be accounted for by a corresponding rise in atomic fluorine concentration. Three possible etch enhancement mechanisms are discussed.
Fluorocarbon polymer deposition was studied, both because of its relevance to etch mechanisms and for its intrinsic interest, as a function of fluorocarbon source gas (CF4, C2F6, C3F8 and CHF3), process time, RF power and percentage hydrogen addition. Gas-phase concentrations of F, H and CF2 were measured by optical emission spectroscopy, and the resultant polymer structure was determined by X-ray photoelectron spectroscopy and infrared spectroscopy. Thermal and electrical properties were also measured. Hydrogen additions are shown to have a dominant role in determining deposition rate and polymer composition. A qualitative description of the polymer growth mechanism is presented which accounts for both changes in growth rate and structure, and leads to an empirical deposition rate model.

Relevance:

30.00%

Publisher:

Abstract:

This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means of human-computer interaction for specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. It is proposed that using ASR will require operators to adapt a commonly used skill to cater for a novel use of speech. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to better performance than instructions. Thus, a relatively cheap and very efficient form of operator training can be supplied by demonstration by experienced ASR operators. From a series of studies into speech-based interaction with computers, it is concluded that the interaction should be designed to capitalise upon the tendency of operators to use short, succinct, task-specific styles of speech. From studies comparing different types of feedback, it is concluded that operators should be given screen-based feedback, rather than auditory feedback, for control room operation. Feedback will take two forms: the use of the ASR device will require recognition feedback, which is best supplied as text; the performance of a process control task will require task feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command repetition practices, rather than by error-handling dialogues.
This method of error correction is held to be non-intrusive to primary command and control operations. This thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for its reduction.

Relevance:

30.00%

Publisher:

Abstract:

The recent explosive growth in advanced manufacturing technology (AMT) and continued development of sophisticated information technologies (IT) is expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved. It is shown that the complexity of the issues resulting from such an approach required the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. This combined a first principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the `low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to the market demands. 
This dissertation discussed the application of the methodology to an industrial case study and the subsequent design of operational policies. Consequently, a novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.

Relevance:

30.00%

Publisher:

Abstract:

Computer-integrated monitoring is a very large area of engineering in which on-line, real-time data acquisition with the aid of sensors is the solution to many problems in the manufacturing industry, as opposed to the old method of data logging followed by graphical analysis. The raw data collected this way is, however, useless in the absence of a proper computerised management system. The transfer of data between the management and the shop-floor processes has been impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems because they become governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers. MAP is still at an early stage of development and is also currently very expensive. This research programme shows how a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated to form a single system, achieving data transfer communications using a cheaper but superior alternative to MAP. Standard communication character sets and hardware such as ASCII and UARTs have been used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously and yet receive data from each other and process it in their own CPUs with no human intervention.
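The thesis abstract does not publish its protocol, but ASCII-over-UART exchanges of the kind it describes usually rely on simple framed, checksummed messages that any machine with a serial port can parse. The framing scheme below is a hypothetical illustration of the idea, not the author's actual technique:

```python
# A hypothetical ASCII framing scheme of the kind such links rely on:
# printable payload between STX/ETX control characters, so any machine
# with a UART and an ASCII character set can parse it.
STX, ETX = "\x02", "\x03"

def encode(field_values):
    """Frame a list of string fields as STX payload '|' checksum ETX."""
    payload = ",".join(field_values)
    checksum = sum(payload.encode("ascii")) % 256
    return f"{STX}{payload}|{checksum:02X}{ETX}"

def decode(frame):
    """Unframe a message, verify its checksum, and return the fields."""
    if not (frame.startswith(STX) and frame.endswith(ETX)):
        raise ValueError("bad framing")
    payload, checksum = frame[1:-1].rsplit("|", 1)
    if int(checksum, 16) != sum(payload.encode("ascii")) % 256:
        raise ValueError("checksum mismatch")
    return payload.split(",")

frame = encode(["SENSOR07", "23.5", "OK"])
fields = decode(frame)
print(fields)
```

Because both framing and checksum use only printable ASCII plus two control characters, the receiving program can be written in any language on any CPU, which is the property the thesis exploits.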

Relevance:

30.00%

Publisher:

Abstract:

Many manufacturing companies have long endured the problems associated with the presence of `islands of automation'. Due to rapid computerisation, `islands' such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM), Flexible Manufacturing Systems (FMS) and Material Requirement Planning (MRP) have emerged and, with a lack of co-ordination, often lead to inefficient performance of the overall system. The main objective of Computer-Integrated Manufacturing (CIM) technology is to form a cohesive network between these islands. Unfortunately, a commonly used approach, the centralised system approach, has imposed major technical constraints and design complications on development strategies. As a consequence, small companies have experienced difficulties in participating in CIM technology. The research described in this thesis has aimed to examine alternative approaches to CIM system design. Through research and experimentation, the cellular system approach, which has existed in the form of manufacturing layouts, has been found to simplify the complexity of an integrated manufacturing system, leading to better control and far higher system flexibility. Based on the cellular principle, some central management functions have also been distributed to smaller cells within the system. This concept is known, specifically, as distributed planning and control. Through the development of an embryo cellular CIM system, the influence of both the cellular principle and the distribution methodology has been evaluated. Based on the evidence obtained, it has been concluded that distributed planning and control methodology can greatly enhance cellular features within an integrated system. Both the cellular system approach and the distributed control concept will therefore make significant contributions to the design of future CIM systems, particularly systems designed with respect to small-company requirements.

Relevance:

30.00%

Publisher:

Abstract:

In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer-integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "Islands of Automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has resulted in an environment with focussed areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation and develop a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in merely linked islands of automation with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, which is based on the integration of various business elements. One particular business element, namely control, has been shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various CIM system design aspects, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business has been evaluated through various industrial case study applications.
Based on the evidence obtained, it has been concluded that this holistic, control based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.

Relevance:

30.00%

Publisher:

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.

Relevance:

30.00%

Publisher:

Abstract:

Initially the study focussed on the factors affecting the ability of the police to solve crimes. An analysis of over twenty thousand police deployments revealed the proportion of time spent investigating crime, contrasted with its perceived importance and with the time spent on other activities. The fictional portrayal of skills believed important in successful crime investigation was identified and compared to the professional training and 'taught skills' given to police and detectives. Police practitioners and middle management provided views on the skills needed to solve crimes. The relative importance of the forensic science role, fingerprint examination and interrogation skills was contrasted with changes in police methods resulting from the Police and Criminal Evidence Act and its effect on confessions. The study revealed that existing police systems for investigating crime, excluding specifically cases of murder and other serious offences, were unsystematic, uncoordinated, unsupervised and unproductive in using police resources. The study examined relevant and contemporary research in the United States and United Kingdom and, with organisational support, introduced an experimental system of data capture and initial investigation with features of case screening and management. Preliminary results indicated increases in the collection of essential information and more effective use of investigative resources. Within the managerial framework in which this study was conducted, research has been undertaken in the knowledge elicitation area as a basis for an expert system of crime investigation, and into the potential organisational benefits of utilising the laptop computer in the first stages of data gathering and investigation. The conclusions demonstrate the need for a totally integrated system of criminal investigation with emphasis on an organisational rather than individual response.
In some areas the evidence produced is sufficient to warrant replication, in others additional research is needed to further explore other concepts and proposed systems pioneered by this study.

Relevance:

30.00%

Publisher:

Abstract:

This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at the U.K. multi-storey dwelling stock in general, and that under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham and rearranged in a new database, using a suite of PC software called `PROXIMA', for clarity and analysis. One hundred blocks of this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies, depending mainly on the height and age of the block. A new integrated appraisal technique has been created for the LPS dwelling blocks, which takes into account the main physical and social factors affecting the condition and acceptability of these blocks. This appraisal technique is built up in a hierarchical form moving from the general approach to particular elements (a tree model). It comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, from which the condition of the block is analysed. A quality score system has been developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social appraisal approach, the residents' satisfaction and attitude toward their multi-storey dwelling block were analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case study blocks.
Data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r-correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority for the multi-storey dwelling stock, in addition to many other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. The traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged which is particularly suited to structurally sound units. The solution of `re-cycling' might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
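The hierarchical quality-score system this abstract describes (the appraisal "tree model") amounts to rolling weighted condition scores up from sub-elements to a block-level score. A minimal sketch of that idea, with invented weights and scores rather than the study's actual values:

```python
# Illustrative appraisal tree: each node maps a name to a (weight,
# child) pair, where the child is either a sub-tree or a leaf score
# on a 0-10 condition scale. All weights and scores are invented.
tree = {
    "physical": (0.6, {
        "structure": (0.5, 7.0),
        "services":  (0.3, 4.0),   # e.g. heating system, lifts
        "envelope":  (0.2, 6.0),
    }),
    "social": (0.4, {
        "satisfaction": (0.6, 5.0),
        "security":     (0.4, 3.0),
    }),
}

def score(node):
    """Recursively roll leaf scores up the tree as weighted sums."""
    if isinstance(node, dict):
        return sum(w * score(child) for w, child in node.values())
    return node  # leaf: a plain condition score

block_score = score(tree)
print(block_score)  # overall block quality on the same 0-10 scale
```

Scoring every block this way yields a single comparable number per block, which is what permits the ranked priority ordering of the stock that the abstract mentions.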