926 results for TOTAL ANALYSIS SYSTEMS
Abstract:
The initial aim of this research was to investigate the application of Expert Systems, or Knowledge Base Systems, technology to the automated synthesis of Hazard and Operability Studies. Due to the generic nature of Fault Analysis problems and the way in which Knowledge Base Systems work, this goal has evolved into a consideration of automated support for Fault Analysis in general, covering HAZOP, Fault Tree Analysis, FMEA and Fault Diagnosis in the Process Industries. This thesis describes a proposed architecture for such an Expert System. The purpose of the System is to produce a descriptive model of faults and fault propagation from a description of the physical structure of the plant. From these descriptive models, the desired Fault Analysis may be produced. The way in which this is done reflects the complexity of the problem which, in principle, encompasses the whole of the discipline of Process Engineering. An attempt is made to incorporate the perceived method that an expert uses to solve the problem; keywords, heuristics and guidelines from techniques such as HAZOP and Fault Tree Synthesis are used. In a true Expert System, the performance of the system is strongly dependent on the quality of the knowledge that is incorporated. This expert knowledge takes the form of heuristics or rules of thumb which are used in problem solving. This research has shown that, for the application of fault analysis heuristics, it is necessary to have a representation of the details of fault propagation within a process. This helps to ensure the robustness of the system - a gradual rather than abrupt degradation at the boundaries of the domain knowledge.
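As a purely illustrative aside, the kind of fault-propagation representation referred to above can be sketched as a directed graph in which a deviation in one plant item induces deviations in downstream items. The minimal Python sketch below uses hypothetical unit names and propagation rules; it is not the architecture proposed in the thesis.

```python
# Illustrative sketch only: a minimal directed-graph model of fault propagation
# between plant units. Unit names, deviations and rules are hypothetical.

from collections import defaultdict

class PlantModel:
    def __init__(self):
        # maps (unit, deviation) -> list of (downstream_unit, resulting_deviation)
        self.propagation = defaultdict(list)

    def add_rule(self, unit, deviation, downstream_unit, effect):
        self.propagation[(unit, deviation)].append((downstream_unit, effect))

    def propagate(self, unit, deviation):
        """Return all (unit, deviation) pairs reachable from an initial fault."""
        seen, frontier = set(), [(unit, deviation)]
        while frontier:
            node = frontier.pop()
            if node in seen:
                continue
            seen.add(node)
            frontier.extend(self.propagation[node])
        return seen

plant = PlantModel()
plant.add_rule("pump_P1", "no_flow", "reactor_R1", "low_level")
plant.add_rule("reactor_R1", "low_level", "reactor_R1", "high_temperature")
print(plant.propagate("pump_P1", "no_flow"))
```

From such a reachable set of (unit, deviation) pairs, HAZOP-style deviation lists or fault trees could in principle be assembled.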
Abstract:
This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model as developed by a team from the IDOM Research Unit at Aston University as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.
Abstract:
This research explores the links between the strategies adopted by companies and the mechanisms used to control the organisation. This is not seen as a one-way process with the control system following from the strategy, but rather as an interactive process between the control systems, the environment and the business strategy. The main proposition of the research, derived from a review of the relevant literature, is that the dimensions of Business Pro-Activity and Environmental Change provide a plausible explanation of the reasons why companies need to adopt different strategies in order to be successful in different markets. A model is proposed which links these dimensions with the business strategy, organisational structure, strategic planning system and management control systems. The model is used as a framework for analysing four companies in order to further our understanding of these interactions and the mechanisms which act to both promote and resist change. Whilst it is not suggested that the model in its present form is a perfect instrument, it has, during the course of this research, proved to be an appropriate framework for analysing the various mechanisms used by four companies to formulate and implement their strategies. The research reveals that these should not be viewed independently but as a balanced system.
A comparison of U.S. and Japanese management systems and their transferability to Singapore industry
Abstract:
This research compares U.S. and Japanese management systems and evaluates their transferability to the Singaporean manufacturing industry. The objectives were: a) to determine the effectiveness of U.S. and Japanese management systems when applied to Singapore; b) to determine the extent of transferability of U.S. and Japanese management systems to Singapore; c) to survey general problems encountered in the application of U.S. and Japanese management systems to Singapore industry. The study, using a questionnaire survey and interviews, covered a total of eighty companies from four groups of firms in four industrial sectors, comprising U.S. and Japanese subsidiaries based in Singapore and their respective parent companies. Data from the questionnaires and interviews were used to investigate environmental conditions, management philosophy, management functions/practices, management effectiveness, and firm productivity. Two-way analysis of variance was used to analyse the questionnaire data. The analysis of the perceptual data from the questionnaire survey and interviews suggested that both U.S. and Japanese parent companies performed better in almost all the management variables studied when compared to their subsidiaries in Singapore. U.S. subsidiaries had less difficulty in adjusting to the Singapore environmental conditions and obtained better results than the Japanese subsidiaries in management functions/practices and management philosophy. In addition, the firm productivity (in terms of labour and capital productivity) of U.S. subsidiaries in Singapore was found to be higher than that of the Japanese subsidiaries. The Japanese parent companies returned the highest scores among the four groups of firms in all four industrial sectors for all four management variables (i.e. environmental conditions, management philosophy, management functions/practices, and management effectiveness) surveyed using questionnaires. In contrast, the average score for Japanese subsidiaries in Singapore was generally the lowest among the four groups of firms. The results of this study therefore suggest that the transfer of the U.S. management system into Singapore industry is more successful than that of the Japanese management system. The problems encountered in the application of U.S. and Japanese management in Singapore were identified and discussed, and general recommendations for the Singaporean manufacturing industry were then made based on the findings of the questionnaire survey and interview analysis.
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying data format. The modelling is carried out using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems and find that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We use tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate the Bit Error Rate. In the final section, we consider the estimation of the Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that the Eye Closure Penalty can be estimated simply using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
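For readers unfamiliar with the numerical method mentioned above, the split-step Fourier method alternates a linear (dispersion and loss) step applied in the frequency domain with a nonlinear (Kerr) step applied in the time domain. The sketch below propagates a pulse through the basic nonlinear Schrödinger equation; the fibre parameters, pulse shape and step sizes are illustrative assumptions, not the values used in the 20 Gb/s RZ-DPSK study.

```python
# Minimal split-step Fourier sketch for the basic nonlinear Schrödinger equation
# dA/dz = -(alpha/2) A + i (beta2/2) d^2A/dt^2 ... (sign conventions vary);
# all parameter values below are illustrative assumptions.

import numpy as np

def split_step(A, dz, steps, dt, beta2, gamma, alpha=0.0):
    """Propagate the complex field envelope A over `steps` segments of length dz."""
    n = A.size
    omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)                  # angular-frequency grid
    lin = np.exp((1j * beta2 / 2 * omega**2 - alpha / 2) * dz)   # dispersion + loss operator
    for _ in range(steps):
        A = np.fft.ifft(np.fft.fft(A) * lin)                     # linear step (frequency domain)
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)           # nonlinear Kerr step (time domain)
    return A

# Example: a 10 ps Gaussian pulse over 100 km of fibre in 1 km steps
t = np.linspace(-200e-12, 200e-12, 2**12, endpoint=False)
A0 = np.sqrt(1e-3) * np.exp(-t**2 / (2 * (10e-12)**2))           # ~1 mW peak power
A_out = split_step(A0, dz=1e3, steps=100, dt=t[1] - t[0],
                   beta2=-21.7e-27, gamma=1.3e-3)
```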
Abstract:
The IEEE 802.11 standard has achieved huge success in the past decade and is still under development to provide higher physical data rates and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most of the models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. Performance measures such as throughput, packet delivery delay and dropping probability can then be obtained. Extensive simulations show that the analytical model is highly accurate. The analytical model shows that, for practical buffer configurations (e.g. buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC layer service delay, as well as its standard deviation, is also much lower than in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
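As context for the Markov-chain modelling approach mentioned above, the classical saturated-traffic fixed point of Bianchi's DCF model, a special case that unsaturated finite-buffer models of this kind generalise, can be computed in a few lines. The contention-window parameters below are illustrative defaults, not values taken from the paper.

```python
# Hedged sketch: saturated Bianchi fixed point for 802.11 DCF.
# The unsaturated M/G/1/K extension described in the paper is not reproduced here.

def bianchi_fixed_point(n, W=32, m=5, iters=10_000, tol=1e-12):
    """Solve tau = f(p), p = 1 - (1 - tau)^(n-1) by damped fixed-point iteration.

    n: number of contending stations, W: minimum contention window,
    m: maximum backoff stage.
    """
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)                 # conditional collision probability
        tau_new = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
        )
        tau_damped = 0.5 * (tau + tau_new)               # damping aids convergence
        if abs(tau_damped - tau) < tol:
            return tau_damped, p
        tau = tau_damped
    return tau, p

tau, p = bianchi_fixed_point(n=10)
print(f"transmission probability tau = {tau:.4f}, collision probability p = {p:.4f}")
```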
Abstract:
This research aims to examine the effectiveness of Soft Systems Methodology (SSM) in enabling systemic change within local government and local NHS environments, and to examine the role of the facilitator within this process. Checkland's Mode 2 variant of Soft Systems Methodology was applied on an experimental basis in two environments, Herefordshire Health Authority and Sandwell Health Authority. The Herefordshire application used SSM in the design of an Integrated Care Pathway for stroke patients. In Sandwell, SSM was deployed to assist in the design of an Information Management and Technology (IM&T) Strategy for the boundary-spanning Sandwell Partnership. Both of these environments were experiencing significant organisational change as the experiments unfolded. The explicit objectives of the research were: to examine the evolution and development of SSM and to contribute to its further development; to apply Soft Systems Methodology to change processes within the NHS; to evaluate the potential role of SSM in this wider process of change; to assess the role of the researcher as a facilitator within this process; and to develop a critical framework through which the impact of SSM on change might be understood and assessed. In developing these objectives, it became apparent that there was a gap in knowledge relating to SSM, concerning the evaluation of the role of the approach in the change process. The case studies highlighted issues in stakeholder selection and management; the communicative assumptions in SSM; the ambiguous role of the facilitator; and the impact of highly politicised problem environments on the effectiveness of the methodology in the process of change. An augmented variant of SSM that integrates an appropriate (social constructivist) evaluation method is outlined, together with a series of hypotheses about the operationalisation of this proposed method.
Abstract:
Transition P Systems are a parallel and distributed computational model based on the notion of the cellular membrane structure. Each membrane determines a region that encloses a multiset of objects and a set of evolution rules. Transition P Systems evolve through transitions between two consecutive configurations, determined by the membrane structure and the multisets present inside the membranes. Transitions between two consecutive configurations are obtained by an exhaustive, non-deterministic and parallel application of evolution rules. However, to establish the rules to be applied, the prior calculation of useful, applicable and active rules is required. Hence, the computation of useful evolution rules is critical for the efficiency of the whole evolution process, because it is performed in parallel inside each membrane in every evolution step. This work defines usefulness states through an exhaustive analysis of the P system, for every membrane and for every possible configuration of the membrane structure that can occur during the computation. Moreover, this analysis can be done statically; therefore, during execution membranes only have to check their usefulness states to obtain their set of useful rules.
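To illustrate what rule applicability amounts to, a rule u → v is applicable in a membrane when its left-hand multiset u is contained in the membrane's current multiset. The Python sketch below checks only this containment condition; it is a hypothetical simplification, and usefulness additionally requires that the membranes targeted by a rule exist in the current membrane structure, which is precisely what the usefulness states described above precompute.

```python
# Illustrative sketch: computing the applicable evolution rules of one membrane
# in a Transition P system, given its current multiset of objects.
# The rule format is a hypothetical simplification.

from collections import Counter

def applicable_rules(multiset, rules):
    """Return the rules whose left-hand side is contained in the multiset.

    multiset: Counter of objects in the membrane, e.g. Counter({'a': 3, 'b': 1})
    rules: list of (lhs, rhs) pairs, each side a Counter of objects.
    """
    return [
        (lhs, rhs)
        for lhs, rhs in rules
        if all(multiset[obj] >= count for obj, count in lhs.items())
    ]

rules = [
    (Counter({'a': 2}), Counter({'b': 1})),           # a a -> b
    (Counter({'a': 1, 'b': 1}), Counter({'c': 2})),   # a b -> c c
    (Counter({'c': 1}), Counter({'a': 1})),           # c -> a
]
print(applicable_rules(Counter({'a': 3, 'b': 1}), rules))
```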
Abstract:
G-protein coupled receptors (GPCRs) constitute the largest class of membrane proteins and are a major drug target. A serious obstacle to studying GPCR structure/function characteristics is the requirement to extract the receptors from their native environment in the plasma membrane, coupled with the inherent instability of GPCRs in the detergents required for their solubilization. In the present study, we report the first solubilization and purification of a functional GPCR [human adenosine A
Abstract:
A new, original method and CASE tool for system analysis and modelling are presented. For the first time, they are consistent with the requirements of object-oriented technology for information systems design. They substantially facilitate the construction of organisational system models and improve the quality of organisational design and of the basic technological processes of object application development.
Abstract:
Systems analysis (SA) is widely used in solving complex and vague problems. The initial stages of SA involve the analysis of problems and purposes to obtain problems/purposes of lower complexity and vagueness, which are combined into hierarchical structures of problems (SP) and purposes (PS). Managers have to be sure that the PS, and the purpose-realizing system (PRS) that can achieve the PS purposes, are adequate to the problem to be solved. However, SP/PS are usually not substantiated well enough, because their development is based on collective expertise in which the logic of natural language and expert estimation methods are used. That is why the scientific foundations of SA cannot be regarded as completely formed. The structure-and-purpose approach to SA, based on a logic-and-linguistic simulation of the problems/purposes analysis, is a step towards formalizing the initial stages of SA so as to improve the adequacy of their results, and also towards increasing the quality of SA as a whole. Managers of industrial organizing systems using the approach can eliminate logical errors in SP/PS at early stages of planning and so are able to find better solutions to complex and vague problems.
Abstract:
Membrane computing is a recent area within natural computing. This field works on computational models based on nature's behavior in processing information. Recently, numerous models have been developed and implemented with this purpose. P-systems are the structures which have been defined, developed and implemented to simulate the behavior and evolution of the membrane systems found in nature. In this paper we show an application capable of simulating P-systems based on multiagent systems (MAS) technology. The main goal is to take advantage of the inherent qualities of multiagent systems; in this way we can analyse the proper functioning of any given P-system. Observed from this perspective, a P-system can be regarded as a particular case of a multiagent system. This opens the possibility, in the future, of evaluating P-systems in terms of multiagent systems technology.
Abstract:
Workflows are sets of activities that implement and realise business goals. Modern business goals add extra requirements on workflow systems and their management. Workflows may cross many organisations and utilise services on a variety of devices and/or supported by different platforms. Current workflows are therefore inherently context-aware. Each context is governed and constrained by its own policies and rules, to prevent unauthorised participants from executing sensitive tasks and also to prevent tasks from accessing unauthorised services and/or data. We present a sound and multi-layered design language for the design and analysis of secure and context-aware workflow systems.