Abstract:
This research concerns information systems and information systems development. The thesis describes an approach to information systems development called Multiview, a methodology which seeks to combine the strengths of a number of different existing approaches in a coherent manner. Many of these approaches are radically different in terms of concepts, philosophy, assumptions, methods, techniques and tools. Three case studies are described presenting Multiview 'in action'. The first is used mainly to expose the strengths and weaknesses of an early version of the approach discussed in the thesis. Tools and techniques which aim to strengthen the approach are then described. Two further case studies are presented to illustrate the use of this second version of Multiview. It is not put forward as an 'ideal methodology', and the case studies expose some of the difficulties and practical problems of information systems work and of the use of the methodology. A more contingency-based approach to information systems development is advocated, using Multiview as a framework rather than a prescriptive tool. Each information systems project, and each use of the framework, is unique, contingent on the particular problem situation. The skills of different analysts, the backgrounds of users and the situations in which they are constrained to work always have to be taken into account in any project. The realities of the situation will cause departures from the 'ideal methodology' in order to allow for the exigencies of the real world. Multiview can therefore be said to be an approach used to explore the application area in order to develop an information system.
Abstract:
Alginate is widely used as a viscosity enhancer in many different pharmaceutical formulations. The aim of this thesis is to quantitatively describe the functions of this polyelectrolyte in pharmaceutical systems. To do this, the techniques used were viscometry, light scattering, continuous and oscillatory shear rheometry, numerical analysis and diffusion measurements. Molecular characterisation of the alginate was carried out using viscometry and light scattering to determine the molecular weight, the radius of gyration, the second virial coefficient and the Kuhn statistical segment length. The results showed good agreement with similar parameters obtained in previous studies. By blending alginate with other polyelectrolytes, xanthan gum and 'Carbopol', in various proportions and with various methods of low- and high-shear preparation, a very wide range of dynamic rheological properties was found. Using oscillatory testing, the parameters often varied over several decades of magnitude. It was shown that the determination of the viscous and elastic components is particularly useful in describing the rheological 'profiles' of suspending-agent blends and provides a step towards the non-empirical formulation of pharmaceutical disperse systems. Using numerical analysis of equations describing planar diffusion, it was shown that the analysis of drug release profiles alone does not provide unambiguous information about the mechanism of rate control. These principles were applied to the diffusion of ibuprofen in calcium alginate gels. For diffusion in such non-Newtonian systems, emphasis was placed on the use of the elastic as well as the viscous component of viscoelasticity. It was found that the diffusion coefficients were relatively unaffected by increases in polymer concentration up to 5 per cent, yet the elasticities measured by oscillatory shear rheometry increased. This was interpreted in the light of several theories of diffusion in gels.
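As an aside on the planar-diffusion point, the sketch below (illustrative only, with an assumed diffusion coefficient, slab thickness and grid, not the thesis's own analysis) solves Fick's second law for a drug diffusing out of a gel slab using an explicit finite-difference scheme:

```python
import numpy as np

# Minimal sketch: explicit finite-difference solution of Fick's second law
# for one-dimensional diffusion of a drug out of a gel slab with one sealed
# face and a perfect-sink release surface. All parameter values are assumed.

D = 1e-10                     # diffusion coefficient (m^2/s), assumed
L = 1e-3                      # slab thickness (m), assumed
nx, nt = 50, 1200             # grid points in space, number of time steps
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # within the stability limit dt <= dx^2 / (2D)

c = np.ones(nx)               # uniform initial (normalised) drug loading
c[-1] = 0.0                   # perfect sink at the release surface
m0 = c.sum() * dx             # initial drug content per unit area

for _ in range(nt):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[0] = c[1]               # zero-flux boundary at the sealed face
    c[-1] = 0.0

released = 1.0 - c.sum() * dx / m0
print(f"fraction released after {nt * dt:.0f} s: {released:.2f}")
```

Varying the assumed transport parameters in a model like this makes the thesis's point concrete: quite different rate-control mechanisms can produce near-identical release profiles.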
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
Abstract:
Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that, since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that, to properly represent the kind of delays that give rise to support department constraints, a model should portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation should be used to model design decisions, but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that overcomes the limitations of existing software and so enables decision-makers to conduct a more holistic evaluation of design decisions. It is argued that the application of object-oriented techniques offers a potentially better way of fulfilling both the functional and ease-of-use requirements of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented which extends to modelling a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models, created using a working version of WBS/Office, that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
Abstract:
This thesis deals with the problems associated with the planning and control of production, with particular reference to a small aluminium die casting company. The main problem areas were identified as: (a) a need to be able to forecast the customers' demands upon the company's facilities; (b) a need to produce a manufacturing programme in which the output of the foundry (or die casting section) was balanced with the available capacity in the machine shop; and (c) the need to ensure that the resultant system gave the company's operating budget a reasonable chance of being achieved. At the commencement of the research work the major customers were members of the automobile industry and had their own systems of forecasting, from which they issued manufacturing schedules to their component suppliers. The errors in these forecasts were analysed and their distributions noted. Using these distributions, the customer's forecast could be modified so that the final demand could be met with a known degree of confidence. Before a manufacturing programme could be developed, the actual manufacturing system had to be reviewed, and it was found that, as with many small companies, there was a remarkable lack of formal control and written data. Relevant data on the components and the manufacturing process therefore had to be collected and analysed. The foundry process was fixed, but the secondary machining operations were analysed by a technique similar to Component Flow Analysis and, as a result, the machines were arranged in a series of flow lines. A system of manual production control was proposed and, for comparison, a local computer bureau was approached and a computer-based system proposed, incorporating the production of additional management information. These systems are compared, their relative merits discussed, and a proposal for implementation made.
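To make the forecast-modification idea concrete, here is a minimal sketch under assumed numbers (the error history, confidence level and schedule quantity are all invented for illustration): the customer's schedule is inflated by an empirical quantile of past forecast errors so that final demand is covered with a chosen confidence.

```python
import numpy as np

# Hypothetical history of schedule errors (actual demand minus scheduled
# quantity, in units); positive values mean the customer under-forecast.
errors = np.array([-120, -40, 15, 60, 80, 95, 130, 150, 210, 260])

def adjusted_forecast(schedule: float, confidence: float = 0.95) -> float:
    """Inflate the customer's schedule by the error quantile needed to
    cover the final demand with the given probability."""
    margin = np.quantile(errors, confidence)
    return schedule + max(margin, 0.0)

print(adjusted_forecast(5000.0))   # schedule of 5000 units -> 5237.5
```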
Abstract:
This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at the U.K. multi-storey dwelling stock in general, and the stock under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham and rearranged in a new database, using a suite of PC software called 'PROXIMA', for clarity and analysis. One hundred blocks in this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies, selected mainly on the height and age of the blocks. A new integrated appraisal technique was created for the LPS dwelling blocks, which takes into account the most important physical and social factors affecting the condition and acceptability of these blocks. This appraisal technique is built up in a hierarchical form, moving from the general approach to particular elements (a tree model). It comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, from which the condition of the block is analysed. A quality score system was developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social approach, the residents' satisfaction with, and attitude towards, their multi-storey dwelling block was analysed in relation to: (a) biographical and housing-related characteristics; and (b) social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case study blocks. The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude were, in descending order: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority order for the multi-storey dwelling stock, among many other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. The traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged which is particularly suited to structurally sound units. This 're-cycling' solution might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
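The hierarchical quality-score idea can be sketched as a simple tree roll-up; the element names, weights and scores below are invented for illustration and are not the study's actual appraisal categories.

```python
# Sketch of a hierarchical (tree) quality-score roll-up: each element's
# score is the weighted mean of its sub-elements' scores.

def roll_up(node):
    """Return the weighted score of a node in the appraisal tree."""
    if "score" in node:                    # leaf: surveyed condition score
        return node["score"]
    total_w = sum(w for w, _ in node["children"])
    return sum(w * roll_up(child) for w, child in node["children"]) / total_w

block = {"children": [
    (0.6, {"children": [                   # physical appraisal branch
        (0.5, {"score": 3.2}),             # structure / panels (invented)
        (0.3, {"score": 2.5}),             # services: heating, lifts
        (0.2, {"score": 4.0}),             # external environment
    ]}),
    (0.4, {"children": [                   # social appraisal branch
        (0.7, {"score": 3.5}),             # resident satisfaction
        (0.3, {"score": 2.8}),             # security / vandalism
    ]}),
]}

print(f"overall block score: {roll_up(block):.2f}")   # used to rank blocks
```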
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/fan-out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
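As a rough illustration of what "re-defining the metric counts for Prolog" can mean (the counting rules below are invented for the sketch and are not the thesis's definitions), lines of code and a crude McCabe-style complexity can be computed by treating each extra clause of a predicate and each disjunction as a decision point:

```python
import re
from collections import Counter

# Sketch only: non-comment lines of code, plus a crude McCabe-style count
# in which every extra clause of a predicate and every ';' disjunction
# adds one decision point.

def prolog_metrics(source: str):
    lines = [l for l in source.splitlines()
             if l.strip() and not l.strip().startswith('%')]
    loc = len(lines)
    heads = Counter()
    for l in lines:
        m = re.match(r'[a-z]\w*', l.strip())
        if m:                              # clause head at start of line
            heads[m.group(0)] += 1
    extra_clauses = sum(n - 1 for n in heads.values())
    complexity = 1 + extra_clauses + source.count(';')
    return loc, complexity

example = """
% naive list membership
member_(X, [X|_]).
member_(X, [_|T]) :- member_(X, T).
"""
print(prolog_metrics(example))   # -> (2, 2)
```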
Abstract:
Manufacturing planning and control systems are fundamental to the successful operation of a manufacturing organisation. In order to improve their business performance, companies invest significantly in planning and control systems; however, not all companies realise the benefits sought. Many companies continue to suffer from high levels of inventory, shortages, obsolete parts, poor resource utilisation and poor delivery performance. This thesis argues that the fit between the planning and control system and the manufacturing organisation is a crucial element of success. The design of appropriate control systems is therefore important. The different approaches to the design of manufacturing planning and control systems are investigated. It is concluded that there is no provision within these design methodologies to properly assess the impact of a proposed design on the manufacturing facility. Consequently, an understanding of how a new (or modified) planning and control system will perform in the context of the complete manufacturing system is unlikely to be gained until after the system has been implemented and is running. There are many modelling techniques available; however, discrete-event simulation is unique in its ability to model the complex dynamics inherent in manufacturing systems, of which the planning and control system is an integral component. The existing application of simulation to manufacturing control system issues is limited: although operational issues are addressed, application to the more fundamental design of control systems is rarely, if at all, considered. The lack of a suitable simulation-based modelling tool does not help matters. The requirements of a simulation tool capable of modelling a host of different planning and control systems are presented. It is argued that only through the application of object-oriented principles can these extensive requirements be met. This thesis reports on the development of an extensible class library called WBS/Control, which is based on object-oriented principles and discrete-event simulation. The functionality, both current and future, offered by WBS/Control means that different planning and control systems can be modelled: not only the more standard implementations but also hybrid systems and new designs. The flexibility implicit in the development of WBS/Control supports its application to both design and operational issues. WBS/Control integrates wholly with an existing manufacturing simulator to provide a more complete modelling environment.
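For readers unfamiliar with the underlying mechanism, the sketch below shows a generic discrete-event simulation core of the kind such a class library builds on; it illustrates the technique only, is not WBS/Control's actual design, and all names in it are invented.

```python
import heapq

class Simulator:
    """Generic discrete-event engine: a time-ordered event queue in which
    each event, when executed, may schedule further events."""

    def __init__(self):
        self.now = 0.0
        self._queue = []                  # heap of (time, seq, action)
        self._seq = 0                     # tie-breaker for equal times

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

# Toy 'control system' entity: release a works order every 8 time units.
state = {"orders": 0}

def release_order(sim):
    state["orders"] += 1
    print(f"t={sim.now:5.1f}: order {state['orders']} released to the shop floor")
    sim.schedule(8.0, release_order)

sim = Simulator()
sim.schedule(0.0, release_order)
sim.run(until=24.0)                       # prints orders at t=0, 8, 16, 24
```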
Abstract:
Research in safety management has been inhibited by lack of consensus as to the definitions of the terms with which it is concerned and, in general, the lack of an agreed theoretical framework within which to collate and contrast empirical findings. This thesis sets out definitions of key terms (hazard, risk, accident, incident and safety) and provides a theoretical framework. This framework has been informed by many sources but especially the Management Oversight and Risk Tree (MORT), cybernetics and the Viable System Model (VSM). Fieldwork designs are proposed for the empirical development of an analytical framework and its use to assist study of the development of safety management in organisations.
Abstract:
To ensure state synchronization of signaling operations, many signaling protocol designs choose to establish "soft" state that expires if it is not refreshed. Approaches to refreshing state in multi-hop signaling systems can be classified as either end-to-end (E2E) or hop-by-hop (HbH). Although both state refresh approaches have been widely used in practical signaling protocols, the design tradeoffs between state synchronization and signaling cost have not yet been fully investigated. In this paper, we investigate this issue from the perspectives of state refresh and state removal. We propose simple but effective Markov chain models for both approaches and obtain closed-form solutions which describe the state refresh performance in terms of state consistency and refresh message rate, as well as the state removal performance in terms of state removal delay. Simulations verify the analytical models. It is observed that the HbH approach yields much better state synchronization than the E2E approach, but at the cost of higher signaling overhead. While the state refresh performance can be improved by increasing the values of the state refresh and timeout timers, the state removal delay increases significantly for both the E2E and HbH approaches. The analysis here sheds light on the design of signaling protocols and the configuration of timers to adapt to changing network conditions.
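The soft-state mechanism itself is easy to demonstrate in miniature. The Monte Carlo sketch below (a toy model with assumed refresh interval, timeout and loss probability, ignoring the E2E/HbH distinction) estimates state consistency as the fraction of refresh intervals in which the state has not timed out:

```python
import random

def soft_state_consistency(refresh=1.0, timeout=3.0, loss=0.2, n=100_000):
    """Fraction of refresh intervals in which the soft state is still valid:
    the state expires once no refresh message arrives within the timeout."""
    consistent = 0
    misses = 0                        # consecutive lost refresh messages
    for _ in range(n):
        misses = misses + 1 if random.random() < loss else 0
        if misses * refresh < timeout:
            consistent += 1
    return consistent / n

random.seed(1)
for t in (2.0, 3.0, 5.0):
    print(f"timeout={t}: consistency ~ {soft_state_consistency(timeout=t):.4f}")
```

Raising the timeout improves consistency in this toy model, but, as the paper notes, a longer timeout also delays state removal.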
Abstract:
Evaluation and benchmarking in content-based image retrieval has always been a somewhat neglected research area, making it difficult to judge the efficacy of many presented approaches. In this paper we investigate the issue of benchmarking for colour-based image retrieval systems, which enable users to retrieve images from a database based on low-level colour content alone. We argue that current image retrieval evaluation methods are not suited to benchmarking colour-based image retrieval systems, due mainly to their not allowing users to reflect upon the suitability of retrieved images within the context of a creative project, and to their reliance on highly subjective ground truths. As a solution to these issues, the research presented here introduces the Mosaic Test for evaluating colour-based image retrieval systems, in which test users are asked to create an image mosaic of a predetermined target image using the colour-based image retrieval system that is being evaluated. We report on findings from a user study which suggest that the Mosaic Test overcomes the major drawbacks associated with existing image retrieval evaluation methods, by enabling users to reflect upon image selections and by automatically measuring image relevance in a way that correlates with the perception of many human assessors. We therefore propose that the Mosaic Test be adopted as a standardised benchmark for evaluating and comparing colour-based image retrieval systems.
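For context, here is a minimal sketch of the kind of low-level colour matching such systems perform (a generic illustration, not the paper's system): images are reduced to normalised RGB histograms and ranked by histogram intersection.

```python
import numpy as np

def colour_histogram(image, bins=8):
    """image: H x W x 3 uint8 array -> flattened, normalised RGB histogram."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist.ravel() / hist.sum()

def rank_by_colour(query, database):
    """Return database indices sorted by histogram intersection with query."""
    q = colour_histogram(query)
    scores = [np.minimum(q, colour_histogram(img)).sum() for img in database]
    return np.argsort(scores)[::-1]

# Random stand-in 'images' for demonstration purposes only.
rng = np.random.default_rng(0)
db = [rng.integers(0, 256, (32, 32, 3), dtype=np.uint8) for _ in range(5)]
print(rank_by_colour(db[2], db))   # db[2] itself should rank first
```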
Abstract:
Fibre-optic communications systems have traditionally carried data using binary (on-off) encoding of the light amplitude. However, next-generation systems will use both the amplitude and phase of the optical carrier to achieve higher spectral efficiencies and thus higher overall data capacities [1,2]. Although this approach requires highly complex transmitters and receivers, the increased capacity and many further practical benefits that accrue from a full knowledge of the amplitude and phase of the optical field [3] more than outweigh this additional hardware complexity, and can greatly simplify optical network design. However, use of the complex optical field gives rise to a new dominant limitation on system performance: nonlinear phase noise [4,5]. Developing a device to remove this noise is therefore of great technical importance. Here, we report the development of the first practical ('black-box') all-optical regenerator capable of removing both phase and amplitude noise from binary phase-encoded optical communications signals.
Abstract:
Summary writing is an important part of many English Language Examinations. As grading students' summary writings is a very time-consuming task, computer-assisted assessment will help teachers carry out the grading more effectively. Several techniques such as latent semantic analysis (LSA), n-gram co-occurrence and BLEU have been proposed to support automatic evaluation of summaries. However, their performance is not satisfactory for assessing summary writings. To improve the performance, this paper proposes an ensemble approach that integrates LSA and n-gram co-occurrence. As a result, the proposed ensemble approach is able to achieve high accuracy and improve the performance quite substantially compared with current techniques. A summary assessment system based on the proposed approach has also been developed.
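To illustrate the kind of ensemble the paper proposes (the corpus, weights and bigram order in this sketch are invented assumptions, and the scoring rules are not the paper's): an LSA similarity and an n-gram co-occurrence score are computed separately and combined as a weighted sum.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus for fitting the LSA space; a real system would use a much
# larger collection of reference texts.
corpus = [
    "the cat sat on the mat and watched the birds",
    "a dog chased the cat around the garden",
    "birds were watched by the cat from the mat",
]
reference = "the cat sat on the mat watching birds"
student = "a cat watched birds while sitting on the mat"

vec = CountVectorizer().fit(corpus + [reference, student])
lsa = TruncatedSVD(n_components=2, random_state=0).fit(vec.transform(corpus))

def lsa_score(a, b):
    za, zb = lsa.transform(vec.transform([a, b]))
    return float(cosine_similarity([za], [zb])[0, 0])

def ngram_score(a, b, n=2):
    grams = lambda s: {tuple(s.split()[i:i + n])
                       for i in range(len(s.split()) - n + 1)}
    return len(grams(a) & grams(b)) / max(len(grams(b)), 1)

# Weighted ensemble of the two similarity measures (weights assumed).
score = 0.6 * lsa_score(reference, student) + 0.4 * ngram_score(reference, student)
print(f"ensemble score: {score:.3f}")
```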
Abstract:
Many natural, technological and social systems are inherently not in equilibrium. We show, by detailed analysis of exemplar models, the emergence of equilibriumlike behavior in localized or nonlocalized domains within nonequilibrium Ising spin systems. Equilibrium domains are shown to emerge either abruptly or gradually depending on the system parameters and disappear, becoming indistinguishable from the remainder of the system for other parameter values. © 2013 American Physical Society.
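A flavour of what a nonequilibrium Ising spin system looks like can be given by a toy two-temperature model (illustrative only; it is not one of the paper's exemplar models): spins evolve under Metropolis dynamics, but the two halves of a ring are attached to baths at different temperatures, so no global Gibbs equilibrium exists.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
spins = rng.choice([-1, 1], size=N)
# Two temperature zones: a 'cold' half and a 'hot' half (J = 1, k_B = 1).
T = np.where(np.arange(N) < N // 2, 1.5, 3.5)

def sweep(spins, T):
    """One Monte Carlo sweep of single-spin Metropolis updates."""
    for _ in range(N):
        i = rng.integers(N)
        dE = 2 * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])
        if dE <= 0 or rng.random() < np.exp(-dE / T[i]):
            spins[i] = -spins[i]

for _ in range(2000):
    sweep(spins, T)

# Local order differs between the zones: nearest-neighbour correlations.
corr = spins * np.roll(spins, -1)
print(f"nn correlation: cold zone {corr[:N // 2].mean():+.2f}, "
      f"hot zone {corr[N // 2:].mean():+.2f}")
```

Deep inside each zone the chain looks roughly like an equilibrium chain at that zone's own temperature (nearest-neighbour correlation near tanh(1/T)), a crude analogue of the locally equilibrium-like domains the paper analyses rigorously.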
Abstract:
Bioenergy schemes are multi-faceted and complex by nature, with many available raw material supplies and technical options and a diverse set of stakeholders holding a raft of conflicting opinions. To develop and operate a successful scheme, there are many requirements that should be considered and satisfied. This paper provides a review of academic works attempting to deal with problems arising within the bioenergy sector using multi-criteria decision-making (MCDM) methods. These methods are particularly suitable for bioenergy, given its multi-faceted nature, but could be equally relevant to other energy conversion technologies. Related articles appearing in international journals from 2000 to 2010 are gathered and analysed so that the following two questions can be answered: (i) which methods are the most popular? (ii) which problems attract the most attention? The review finds that optimisation methods are the most popular, with methods choosing between few alternatives used in 44% of the reviewed papers and methods choosing between many alternatives used in 28%. The most popular application area was technology selection, with 27% of the reviewed papers, followed by policy decisions, with 18%. © 2012 Elsevier Ltd.