904 results for: Number systems; Arithmetic teaching; Ancient number systems


Relevance: 60.00%

Abstract:

The requirement for systems to continue to operate satisfactorily in the presence of faults has led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the 'a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam. The proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system which involves inter-process communications.
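The reachability construction at the heart of this approach can be sketched in a few lines. The net below, a hypothetical two-process send/receive example rather than one from the thesis, is explored exhaustively from its initial marking; markings are tuples of token counts and each transition is a (consume, produce) pair of vectors.

```python
def fire(marking, consume, produce):
    """Fire a transition if enabled; return the new marking, or None."""
    if all(m >= c for m, c in zip(marking, consume)):
        return tuple(m - c + p for m, c, p in zip(marking, consume, produce))
    return None

def reachable(initial, transitions):
    """Build the reachability set (the basis of the state-change table)."""
    seen = {initial}
    frontier = [initial]
    while frontier:
        m = frontier.pop()
        for consume, produce in transitions:
            nxt = fire(m, consume, produce)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Two processes synchronising on one message: P1 sends (t1), P2 receives (t2).
# Places: [p1_ready, p1_done, channel, p2_ready, p2_done]
transitions = [
    ((1, 0, 0, 0, 0), (0, 1, 1, 0, 0)),  # t1: P1 sends, token into channel
    ((0, 0, 1, 1, 0), (0, 0, 0, 0, 1)),  # t2: P2 receives from channel
]
states = reachable((1, 0, 0, 1, 0), transitions)
```

For this toy net only three markings are reachable; a conversation boundary would then be drawn around a branch of the table derived from such a set.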

Relevance: 60.00%

Abstract:

Mathematics is highly structured and also underpins most of science and engineering. For this reason, it has proved a very suitable domain for Intelligent Tutoring System (ITS) research, with the result that probably more tutoring systems have been constructed for the domain than any other. However, the literature reveals that there still exists no consensus on a credible approach or approaches for the design of such systems, despite numerous documented efforts. Current approaches to the construction of ITSs leave much to be desired. Consequently, existing ITSs in the domain suffer from a considerable number of shortcomings which render them 'unintelligent'. The thesis examines some of the reasons why this is the case. Following a critical review of existing ITSs in the domain, and some pilot studies, an alternative approach to their construction is proposed (the 'iterative-style' approach); this supports an iterative style, and also improves on at least some of the shortcomings of existing approaches. The thesis also presents an ITS for fractions which has been developed using this approach, and which has been evaluated in various ways. It has, demonstrably, improved on many of the limitations of existing ITSs; furthermore, it has been shown to be largely 'intelligent', at least more so than current tutors for the domain. Perhaps more significantly, the tutor has also been evaluated against real students with, so far, very encouraging results. The thesis thus concludes that the novel iterative-style approach is a more credible approach to the construction of ITSs in mathematics than existing techniques.
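As an illustration of the kind of diagnostic knowledge a fractions tutor of this sort needs, the sketch below checks a student's answer against the correct sum and against one well-known buggy rule. The function and the rule set are hypothetical examples, not taken from the thesis.

```python
from fractions import Fraction

def diagnose(a, b, c, d, num, den):
    """Compare a student's answer num/den for a/b + c/d against the
    correct result and one common buggy rule (adding tops and bottoms)."""
    correct = Fraction(a, b) + Fraction(c, d)
    if Fraction(num, den) == correct:
        return "correct"
    if (num, den) == (a + c, b + d):
        return "added numerators and denominators"
    return "unclassified error"
```

For example, for 1/2 + 1/3 the answer 5/6 is classified as correct, while 2/5 triggers the add-tops-and-bottoms diagnosis, which a tutor could use to select remedial feedback.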

Relevance: 60.00%

Abstract:

This thesis describes a project which has investigated the evaluation of information systems. The work took place in, and is related to, a specific organisational context, that of the National Health Service (NHS). It aims to increase understanding of the evaluation which takes place in the service and the way in which this is affected by the NHS environment. It also investigates the issues which surround some important types of evaluation and their use in this context. The first stage of the project was a postal survey in which respondents were asked to describe the evaluation which took place in their authorities and to give their opinions about it. This was used to give an overview of the practice of IS evaluation in the NHS and to identify its uses and the problems experienced. Three important types of evaluation were then examined in more detail by means of action research studies. One of these dealt with the selection and purchase of a large hospital information system. The study took the form of an evaluation of the procurement process, and examined the methods used and the influence of organisational factors. The other studies are concerned with post-implementation evaluation, and examine the choice of an evaluation approach as well as its application. One was an evaluation of a community health system which had been operational for some time but was of doubtful value, and suffered from a number of problems. The situation was explored by means of a study of the costs and benefits of the system. The remaining study was the initial review of a system which was used in the administration of a Breast Screening Service. The service itself was also newly operational and the relationship between the service and the system was of interest.

Relevance: 60.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
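A minimal sketch of the data-driven idea described above: runtime behaviour is configured entirely from a content description, with code supplying only the component implementations. JSON and the `Rotator` component here are illustrative stand-ins and are not part of the Fluid project itself.

```python
import json

# Hypothetical content description; empowered data formats such as XML are
# used in practice, but JSON keeps this sketch short.
content = json.loads("""
{
  "component": "rotator",
  "parameters": { "speed": 2.0, "axis": "y" }
}
""")

class Rotator:
    """A component whose behaviour is configured from content data."""
    def __init__(self, speed, axis):
        self.speed, self.axis, self.angle = speed, axis, 0.0

    def update(self, dt):
        # Advance the rotation by speed (rad/s) over dt seconds.
        self.angle += self.speed * dt

# The registry maps content names to implementations; data selects and
# parameterises the component at runtime.
registry = {"rotator": Rotator}
component = registry[content["component"]](**content["parameters"])
component.update(0.5)
```

Swapping the content file changes what the application does without touching the compiled components, which is the essential property exploited by game development technologies.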

Relevance: 60.00%

Abstract:

This thesis looks to two traditions in research into language teaching, teacher beliefs and classroom interaction, in order to investigate the question: Do teachers of ESOL have an identifiable and coherent system of beliefs about teaching and learning that may account for different approaches to teaching? A qualitative approach to research is taken, following a case study tradition, in order to carry out an in-depth study into the beliefs of six ESOL teachers. Five teachers participated in an initial pilot study and two subsequently became the main case studies for the research. The beliefs of a sixth teacher were then investigated to verify the findings. Semi-structured interviews and classroom observations were carried out with all the teachers. The teachers in the study were found to have personal belief systems that cohere around two orientations to teaching and learning - a person orientation and a process orientation. Moreover, the findings suggest that underlying the orientations is the perception that teachers have of their teacher identity, in terms of whether this is seen as a separate identity or as part of their personality. It is suggested that the two orientations may offer a powerful tool for teacher education, as it is increasingly recognised that, in order to be effective, teacher educators must take into account the beliefs that teachers bring with them to training and development programmes. An initial investigation into the teachers’ classroom behaviour suggests that while their methodological approaches may be very similar, there are fundamental differences in their interaction patterns, and these differences may be a result of their own orientation. However, while teachers’ personal belief systems undoubtedly underlie their approach to teaching, further research is needed to establish the extent and the nature of the relationship between orientation and classroom interaction.

Relevance: 60.00%

Abstract:

This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means for human computer interaction in specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. It is proposed that using ASR will require operators to adapt a commonly used skill to cater for a novel use of speech. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to better performance than instructions alone. Thus, a relatively cheap and very efficient form of operator training can be supplied by demonstration by experienced ASR operators. From a series of studies into speech-based interaction with computers, it is concluded that the interaction should be designed to capitalise upon the tendency of operators to use short, succinct, task-specific styles of speech. From studies comparing different types of feedback, it is concluded that operators should be given screen-based feedback, rather than auditory feedback, for control room operation. Feedback will take two forms: the use of the ASR device will require recognition feedback, which is best supplied using text; the performance of a process control task will require task feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command repetition practices, rather than through error-handling dialogues. This method of error correction is held to be non-intrusive to primary command and control operations. This thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for its reduction.

Relevance: 60.00%

Abstract:

The recent explosive growth in advanced manufacturing technology (AMT) and continued development of sophisticated information technologies (IT) is expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved. It is shown that the complexity of the issues resulting from such an approach required the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. This combined a first principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the `low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to the market demands. 
This dissertation discussed the application of the methodology to an industrial case study and the subsequent design of operational policies. Consequently, a novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.
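For context, the MRP processing that such policy design works around amounts to period-by-period netting of gross requirements against projected stock. The textbook sketch below uses invented figures and a simple fixed lot size; it illustrates the mechanism only, not the thesis's specific policy set.

```python
def mrp_net(gross_requirements, on_hand, scheduled_receipts, lot_size):
    """Classic MRP netting: the net requirement in each period is the
    gross demand not covered by projected stock; planned orders are
    rounded up to the lot size."""
    planned_orders, stock = [], on_hand
    for period, gross in enumerate(gross_requirements):
        stock += scheduled_receipts.get(period, 0)
        net = max(0, gross - stock)
        order = -(-net // lot_size) * lot_size if net else 0  # ceil to lot
        planned_orders.append(order)
        stock += order - gross
    return planned_orders
```

With gross requirements of 10, 40 and 10 units, 15 on hand and a lot size of 25, the logic plans a single order of 50 units in the second period.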

Relevance: 60.00%

Abstract:

This research concerns information systems and information systems development. The thesis describes an approach to information systems development called Multiview. This is a methodology which seeks to combine the strengths of a number of different, existing approaches in a coherent manner. Many of these approaches are radically different in terms of concepts, philosophy, assumptions, methods, techniques and tools. Three case studies are described presenting Multiview 'in action'. The first is used mainly to expose the strengths and weaknesses of an early version of the approach discussed in the thesis. Tools and techniques are described in the thesis which aim to strengthen the approach. Two further case studies are presented to illustrate the use of this second version of Multiview. This is not put forward as an 'ideal methodology' and the case studies expose some of the difficulties and practical problems of information systems work and the use of the methodology. A more contingency based approach to information systems development is advocated using Multiview as a framework rather than a prescriptive tool. Each information systems project and the use of the framework is unique, contingent on the particular problem situation. The skills of different analysts, the backgrounds of users and the situations in which they are constrained to work have always to be taken into account in any project. The realities of the situation will cause departure from the 'ideal methodology' in order to allow for the exigencies of the real world. Multiview can therefore be said to be an approach used to explore the application area in order to develop an information system.

Relevance: 60.00%

Abstract:

The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. The architecture for a support environment for SSD is proposed, based on the integration of KBS and non-KBS tools for individual design tasks within SSD - the Intellipse system. The Intellipse system has two modes of operation - Advisor and Designer. The design, implementation and user-evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers. Some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made. On the basis of this analysis, a structured development method for KBSs is proposed - the POLITE model. Some initial results of applying this method to KBS development are discussed.
Several areas for further research and development are identified.

Relevance: 60.00%

Abstract:

The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development; and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement on a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s).
In response to this identified problem, a set of methods was developed that was aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible, and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level, and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.

Relevance: 60.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the `classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. 
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
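The idea of re-defining size metrics for a KBS language can be illustrated crudely. The counting rules below are deliberately simplistic stand-ins for the thesis's actual re-definitions: non-comment lines of code, plus rough Halstead-style operator and operand vocabularies for a small Prolog fragment.

```python
import re

def prolog_metrics(source):
    """Crude size metrics for a Prolog fragment: non-comment lines of
    code and Halstead-style operator/operand counts. The counting rules
    here are illustrative only, not the thesis's definitions."""
    lines = [l for l in source.splitlines()
             if l.strip() and not l.strip().startswith('%')]
    # Treat a handful of punctuation tokens as operators, identifiers,
    # variables and numbers as operands; real Prolog lexing is richer.
    operators = re.findall(r':-|,|\.|\(|\)', source)
    operands = re.findall(r'[a-z]\w*|[A-Z_]\w*|\d+', source)
    return {
        "loc": len(lines),
        "n1": len(set(operators)), "N1": len(operators),   # distinct/total operators
        "n2": len(set(operands)), "N2": len(operands),     # distinct/total operands
    }

clause = """% length of a list
len([], 0).
len([_|T], N) :- len(T, M), N is M + 1.
"""
metrics = prolog_metrics(clause)
```

Even counts this rough separate trivial clauses from complex ones; the thesis's contribution lies in defining the counts carefully enough for size and error-proneness estimation.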

Relevance: 60.00%

Abstract:

Random number generation is a central component of modern information technology, with crucial applications in ensuring communications and information security. The development of new physical mechanisms suitable for directly generating random bit sequences is thus a subject of intense current research, with particular interest in all-optical techniques suitable for the generation of data sequences with high bit rate. One promising technique that has received much recent attention is the chaotic semiconductor laser system, which produces high-quality random output as a result of the intrinsic nonlinear dynamics of its architecture [1]. Here we propose a novel, complementary all-optical technique that might dramatically increase the generation rate of random bits by simultaneously using multiple spectral channels with uncorrelated signals, somewhat similar to the use of wavelength-division multiplexing in communications. We propose to exploit the intrinsic nonlinear dynamics of extreme spectral broadening and supercontinuum (SC) generation in optical fibre, a process known to be often associated with non-deterministic fluctuations [2]. In this paper, we report proof-of-concept results indicating that the fluctuations in highly nonlinear fibre SC generation can potentially be used for random number generation.
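One standard way to turn such non-deterministic intensity fluctuations into unbiased bits, offered here only as an illustration and not necessarily the post-processing used in the paper, is thresholding followed by von Neumann debiasing:

```python
def bits_from_samples(samples, threshold):
    """Threshold noisy intensity samples into raw bits, then apply von
    Neumann debiasing: for each non-overlapping pair, 01 -> 0, 10 -> 1,
    and 00/11 are discarded, removing any fixed threshold bias."""
    raw = [1 if s > threshold else 0 for s in samples]
    out = []
    for a, b in zip(raw[::2], raw[1::2]):
        if a != b:          # keep only unequal pairs
            out.append(a)   # 10 -> 1, 01 -> 0
    return out
```

Debiasing discards at least half of the raw bits, which is one reason multiplying the number of uncorrelated spectral channels is attractive for raising the net bit rate.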

Relevance: 60.00%

Abstract:

The specific objective of the research was to evaluate proprietary audit systems. Proprietary audit systems comprise question sets containing approximately 500 questions dealing with selected aspects of health and safety management. Each question is allotted a number of points and an organisation seeks to judge its health and safety performance by the overall score achieved in the audit. Initially it was considered that the evaluation method might involve comparing the proprietary audit scores with other methods of measuring safety performance. However, what appeared to be missing in the first instance was information that organisations could use to compare and contrast question set content against their own needs. A technique was developed using the computer database FileMaker Pro. This enables questions in an audit to be sorted into categories using a process of searching for key words. Questions that are not categorised by word searching can be identified and sorted manually. The process can be completed in 2-3 hours, which is considerably faster than manual categorisation of questions, which typically takes about 10 days. The technique was used to compare and contrast three proprietary audits: ISRS, CHASE and QSA. Differences and similarities between these audits were successfully identified. It was concluded that, in general, proprietary audits need to focus to a greater extent on identifying strengths and weaknesses in occupational health and safety management systems. To do this requires the inclusion of more probing questions which consider whether risk control measures are likely to be successful.
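The keyword-sorting technique translates directly into a few lines of code. The categories and keywords below are hypothetical examples, not the actual ISRS/CHASE/QSA categories; questions matching no keyword are collected for manual sorting, mirroring the FileMaker Pro process described above.

```python
def categorise(questions, categories):
    """Sort audit questions into categories by keyword search; questions
    matching no keyword are returned separately for manual sorting."""
    sorted_qs = {name: [] for name in categories}
    manual = []
    for q in questions:
        text = q.lower()
        hits = [name for name, words in categories.items()
                if any(w in text for w in words)]
        if hits:
            for name in hits:
                sorted_qs[name].append(q)
        else:
            manual.append(q)
    return sorted_qs, manual

questions = [
    "Is a permit-to-work system in place for hot work?",
    "Are fire extinguishers inspected monthly?",
    "Does the board review safety performance?",
]
categories = {
    "risk control": ["permit", "extinguisher"],
    "management": ["board", "policy", "review"],
}
by_cat, manual = categorise(questions, categories)
```

Run over a 500-question set, this kind of automated pass is what reduces the categorisation effort from days to hours, leaving only the unmatched residue for manual handling.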

Relevance: 60.00%

Abstract:

This thesis presents experimental investigation of different effects and techniques that can be used to upgrade legacy WDM communication systems. The main issue in upgrading legacy systems is that the fundamental setup, including component settings such as EDFA gains, must not be altered; the improvement must therefore be carried out at the network terminal. A general introduction to optical fibre communications is given at the beginning, including optical communication components and system impairments. Experimental techniques for performing laboratory optical transmission experiments are presented before the experimental work of this thesis. These techniques include optical transmitter and receiver designs as well as the design and operation of the recirculating loop. The main experimental work includes three different studies. The first study involves the development of line monitoring equipment that can be reliably used to monitor the performance of optically amplified long-haul undersea systems. This equipment can instantly locate faults along the legacy communication link, which in turn enables rapid repair to be performed, hence upgrading the legacy system. The second study investigates the effect of changing the number of transmitted 1s and 0s on the performance of a WDM system. This effect can, in reality, be seen in some coding systems, e.g. the forward-error correction (FEC) technique, where the proportion of 1s and 0s is changed at the transmitter by adding extra bits to the original bit sequence. The final study presents transmission results after all-optical format conversion from NRZ to CSRZ and from RZ to CSRZ using a semiconductor optical amplifier in a nonlinear optical loop mirror (SOA-NOLM). This study is mainly based on the fact that the use of all-optical processing, including format conversion, has become attractive for future data networks that are proposed to be all-optical.
The feasibility of the SOA-NOLM device for converting single and WDM signals is described. The optical conversion bandwidth and its limitations for WDM conversion are also investigated. All studies of this thesis employ 10Gbit/s single or WDM signals being transmitted over dispersion managed fibre span in the recirculating loop. The fibre span is composed of single-mode fibres (SMF) whose losses and dispersion are compensated using erbium-doped fibre amplifiers (EDFAs) and dispersion compensating fibres (DCFs), respectively. Different configurations of the fibre span are presented in different parts.

Relevance: 60.00%

Abstract:

WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection oriented and it can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and flexible QoS supported by WiMAX, the uplink access in WiMAX networks is very challenging since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM for the performance evaluation is that MEAM can efficiently model a large-scale system in which the number of stations or connections is generally very high, while the traditional simulation and analytical (e.g., Markov model) approaches cannot perform well due to the high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
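The flavour of a closed-form blocking result can be conveyed with the classic single-class Erlang-B formula, computed below via its numerically stable recurrence. This is only a sketch: the paper's ME-based multiclass queueing network model is considerably more general than this single-class loss system.

```python
def erlang_b(servers, offered_load):
    """Erlang-B blocking probability for a single-class loss system,
    via the stable recurrence B(0) = 1, B(n) = aB(n-1) / (n + aB(n-1)),
    where a is the offered load in Erlangs."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b
```

For one channel offered one Erlang of traffic, half of all connection attempts are blocked; adding a second channel drops the blocking probability to 0.2. Multiclass models like the one in the paper generalise exactly this kind of state/blocking computation across service classes.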