41 results for Policy-based management systems
Abstract:
The CancerGrid consortium is developing open-standards cancer informatics to address the challenges posed by modern cancer clinical trials. This paper presents the service-oriented software paradigm implemented in CancerGrid to derive clinical trial information management systems for collaborative cancer research across multiple institutions. Our proposal is founded on a combination of a clinical trial (meta)model and WSRF (Web Services Resource Framework), and is currently being evaluated for use in early phase trials. Although primarily targeted at cancer research, our approach is readily applicable to other areas for which a similar information model is available.
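To make the (meta)model-driven idea above concrete, here is a minimal, hypothetical Python sketch of a clinical-trial metamodel from which per-trial service operations could be derived. The class and field names (Trial, CaseReportForm, DataElement) are illustrative assumptions, not CancerGrid's actual schema or its WSRF bindings.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:
    name: str        # e.g. "tumour_stage"
    datatype: str    # e.g. "string", "integer", "date"
    required: bool = True

@dataclass
class CaseReportForm:
    title: str
    elements: List[DataElement] = field(default_factory=list)

@dataclass
class Trial:
    identifier: str
    phase: str       # e.g. "I", "II"
    forms: List[CaseReportForm] = field(default_factory=list)

def derive_service_operations(trial: Trial) -> List[str]:
    """One data-capture operation per case report form, standing in
    for the generation of a per-trial web service interface."""
    return [f"submit_{f.title.lower().replace(' ', '_')}" for f in trial.forms]

trial = Trial("TRIAL-01", "II",
              [CaseReportForm("Baseline Assessment",
                              [DataElement("tumour_stage", "string")])])
print(derive_service_operations(trial))  # ['submit_baseline_assessment']
```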
Abstract:
This research investigates technology transfer (TT) to developing countries, with specific reference to South Africa. Particular attention is paid to physical asset management, which includes the maintenance of plant, equipment and facilities. The research is case-based, comprising a main case study (the South African electricity utility, Eskom) and four mini-cases. A five-level framework adapted from Salami and Reavill (1997) is used as the methodological basis for the formulation of the research questions. This framework deals with technology selection and with management issues, including implementation, maintenance, evaluation and modification. The findings suggest that the Salami and Reavill (1997) framework is a useful guide for TT. The case organisations did not introduce technology for strategic advantage, but to achieve operational efficiencies through cost reduction, higher quality and the ability to meet customer demand. Acquirers favour standardised technologies with which they are familiar. Cost-benefit evaluations have limited use in technology acquisition decisions. Users rely on supplier expertise to compensate for poor education and technical training in South Africa. The impact of political and economic factors is more evident in Eskom than in the mini-cases. Physical asset management follows traditional preventive maintenance practices, with limited use of new maintenance management thinking. Few modifications of the technology or R&D innovations take place. Little use is made of explicit knowledge from computerised maintenance management systems. Low operating and maintenance skills are not conducive to the transfer of high-technology equipment. South African organisations acquire technology as items of plant, equipment and systems, but limited transfer of technology takes place. This suggests that operators and maintainers frequently do not understand the underlying technology and, like workers elsewhere, are not always inclined towards adopting technology in the workplace.
Abstract:
This research is concerned with the application of operational research techniques in the development of a long-term waste management policy by an English waste disposal authority. The main aspects which have been considered are the estimation of future waste production and the assessment of the effects of proposed systems. Only household and commercial wastes have been dealt with in detail, though suggestions are made for the extension of the effect assessment to cover industrial and other wastes. Similarly, the only effects considered in detail have been costs, but possible extensions are discussed. An important feature of the study is that it was conducted in close collaboration with a waste disposal authority, and so pays more attention to the actual needs of the authority than is usual in such research. A critical examination of previous waste forecasting work leads to the use of simple trend extrapolation methods, with some consideration of seasonal effects. The possibility of relating waste production to other social and economic indicators is discussed. It is concluded that, at present, large uncertainties in predictions are inevitable; waste management systems must therefore be designed to cope with this uncertainty. Linear programming is used to assess the overall costs of proposals. Two alternative linear programming formulations of this problem are used and discussed. The first is a straightforward approach, which has been implemented as an interactive computer program. The second is more sophisticated and represents the behaviour of incineration plants more realistically. Careful attention is paid to the choice of appropriate data and the interpretation of the results. Recommendations are made on methods for immediate use, on the choice of data to be collected for future plans, and on the most useful lines for further research and development.
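As an illustration of the "straightforward" linear programming formulation described above, the following sketch minimises the total cost of allocating waste from collection districts to disposal facilities subject to facility capacities. The districts, facilities and cost figures are invented for illustration, not data from the study.

```python
from scipy.optimize import linprog

# Waste sources (tonnes/week) and disposal facilities with capacities.
supply = [300.0, 450.0]                 # districts: north, south
capacity = [500.0, 400.0]               # facilities: landfill, incinerator
cost = [[12.0, 20.0],                   # cost[i][j]: cost/tonne, district i -> facility j
        [18.0, 9.0]]

# Decision variables x[i][j], flattened row-major: [x00, x01, x10, x11].
c = [cost[i][j] for i in range(2) for j in range(2)]
A_eq = [[1, 1, 0, 0],                   # each district's waste fully disposed of
        [0, 0, 1, 1]]
A_ub = [[1, 0, 1, 0],                   # facility capacity limits
        [0, 1, 0, 1]]

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply,
              bounds=[(0, None)] * 4)
print(res.x.reshape(2, 2))              # optimal allocation (tonnes/week)
print(res.fun)                          # total weekly cost
```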
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the "classic" metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially-developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
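As a flavour of what redefining the metric counts for Prolog might involve, here is a deliberately crude sketch of a Halstead-style operator/operand count over Prolog source text. The tokenisation and operator set are simplified assumptions, not the thesis's actual counting rules.

```python
import re
from collections import Counter

# Simplified operator set; real Prolog has many more.
PROLOG_OPERATORS = {":-", "->", ",", ";", "=", "is", "!", ">=", "=<", "<", ">"}

def halstead_counts(source: str) -> dict:
    tokens = re.findall(r":-|->|>=|=<|[A-Za-z_]\w*|[,;=!<>]", source)
    ops = Counter(t for t in tokens if t in PROLOG_OPERATORS)
    operands = Counter(t for t in tokens if t not in PROLOG_OPERATORS)
    n1, n2 = len(ops), len(operands)                    # distinct counts
    N1, N2 = sum(ops.values()), sum(operands.values())  # total occurrences
    return {"vocabulary": n1 + n2, "length": N1 + N2,
            "distinct_operators": n1, "distinct_operands": n2}

clauses = "max(X, Y, X) :- X >= Y. max(X, Y, Y) :- X < Y."
print(halstead_counts(clauses))
```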
Abstract:
While much of a company's knowledge can be found in text repositories, current content management systems have limited capabilities for structuring and interpreting documents. In the emerging Semantic Web, search, interpretation and aggregation can be addressed by ontology-based semantic mark-up. In this paper, we examine semantic annotation, identify a number of requirements, and review the current generation of semantic annotation systems. This analysis shows that, while there is still some way to go before semantic annotation tools can fully address all of these knowledge management needs, research in the area is active and making good progress.
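The following is a minimal sketch of ontology-based semantic mark-up of the kind discussed above, expressed as RDF statements using the rdflib library. The ontology namespace, document URI and term names are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

KM = Namespace("http://example.org/knowledge#")    # hypothetical ontology
doc = URIRef("http://example.org/docs/report-42")  # hypothetical document

g = Graph()
g.bind("km", KM)
g.add((doc, RDF.type, KM.TechnicalReport))
g.add((doc, KM.mentions, KM.ContentManagement))
g.add((doc, RDFS.label, Literal("Q3 content-migration report")))

# Search and aggregation become graph queries rather than free-text search.
for subject in g.subjects(KM.mentions, KM.ContentManagement):
    print(subject)  # -> http://example.org/docs/report-42
```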
Abstract:
The Product Service Systems (PSS), servitization, and Service Science literature continues to grow as organisations seek to protect and improve their competitive position. The potential of technology applications to deliver service delivery systems, facilitated by the ability to make real-time decisions based upon 'in the field' performance, is also significant. Research identifies four key questions to be addressed. Namely: how far along the servitization continuum should the organisation go in a single strategic step? Does the organisation have the structure and infrastructure to support this transition? What level of condition monitoring should it employ? Is the product positioned correctly in the value chain to adopt condition monitoring technology? Strategy consists of three dimensions, namely content, context, and process. The literature relating to PSS, servitization, and strategy discusses these concepts in terms of content and context, but none of it offers a process for delivering an aligned strategy for a service delivery system enabled by condition-based management. This paper presents a tested iterative strategy formulation methodology which is the result of a structured development programme.
Abstract:
The growing use of a variety of information systems in crisis management, both by non-governmental organizations (NGOs) and by emergency management agencies, makes the challenges of information sharing and interoperability increasingly important. Semantic web technologies are a growing area and a technology stack specifically suited to these challenges. This paper presents a review of ontologies, vocabularies and taxonomies that are useful in crisis management systems. We identify the different subject areas relevant to crisis management based on a review of the literature. The different ontologies and vocabularies available are analysed in terms of their coverage, design and usability. We also consider the use cases for which they were designed and the degree to which they follow a variety of standards. While providing comprehensive ontologies for the crisis domain is neither feasible nor desirable, there is considerable scope to develop ontologies for the subject areas not currently covered and for the purposes of interoperability.
Abstract:
Electrocardiography (ECG) has recently been proposed as a biometric trait for identification purposes. Intra-individual variations of the ECG might affect identification performance. These variations are mainly due to Heart Rate Variability (HRV). In particular, HRV causes changes in the QT intervals along the ECG waveforms. This work is aimed at analysing the influence of seven QT interval correction methods (based on population models) on the performance of ECG-fiducial-based identification systems. In addition, we have also considered the influence of training set size, classifier, classifier ensemble, and the number of consecutive heartbeats in a majority voting scheme. The ECG signals used in this study were collected from thirty-nine subjects within the Physionet open access database. Public domain software was used for fiducial point detection. Results suggested that QT correction is indeed required to improve performance. However, there is no clear choice among the seven explored approaches for QT correction (identification rates between 0.97 and 0.99). MultiLayer Perceptron and Support Vector Machine classifiers seemed to have better generalization capabilities, in terms of classification performance, than Decision Tree-based classifiers. No comparably strong influence of training-set size or of the number of consecutive heartbeats in the majority voting scheme was observed.
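For readers unfamiliar with population-model QT correction, the sketch below implements three well-known corrections (Bazett, Fridericia and Framingham; the paper evaluates seven methods, not necessarily these) together with a simple majority vote over per-heartbeat predictions. All values and subject labels are illustrative.

```python
from collections import Counter

def qtc_bazett(qt_s: float, rr_s: float) -> float:
    """Bazett: QTc = QT / sqrt(RR), intervals in seconds."""
    return qt_s / rr_s ** 0.5

def qtc_fridericia(qt_s: float, rr_s: float) -> float:
    """Fridericia: QTc = QT / RR^(1/3)."""
    return qt_s / rr_s ** (1.0 / 3.0)

def qtc_framingham(qt_s: float, rr_s: float) -> float:
    """Framingham: QTc = QT + 0.154 * (1 - RR)."""
    return qt_s + 0.154 * (1.0 - rr_s)

def majority_vote(beat_labels):
    """Subject identity decided by the most frequent per-beat label."""
    return Counter(beat_labels).most_common(1)[0][0]

qt, rr = 0.38, 0.80   # seconds; RR = 0.8 s is roughly 75 beats/min
print(qtc_bazett(qt, rr), qtc_fridericia(qt, rr), qtc_framingham(qt, rr))
print(majority_vote(["subj_07", "subj_07", "subj_21"]))  # -> 'subj_07'
```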
Abstract:
The manufacturing industry faces many challenges, such as reducing time-to-market and cutting costs. In order to meet these increasing demands, effective methods are needed to support the early product development stages by bridging the gap between communicating early design ideas and evaluating manufacturing performance. This paper introduces methods of linking the design and manufacturing domains using disparate technologies. The combined technologies include knowledge management support for product lifecycle management systems, Enterprise Resource Planning (ERP) systems, aggregate process planning systems, workflow management and data exchange formats. A case study has been used to demonstrate the use of these technologies, illustrated by adding manufacturing knowledge to generate alternative early process plans, which are in turn used by an ERP system to obtain and optimise a rough-cut capacity plan. Copyright © 2010 Inderscience Enterprises Ltd.
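As a hedged illustration of the rough-cut capacity planning step mentioned above, the sketch below checks the work-centre load implied by two alternative early process plans against available capacity. Work centres, hours and demand are invented figures, not data from the case study.

```python
available_hours = {"milling": 160.0, "assembly": 120.0}   # per period

# Two alternative early process plans: work-centre hours per unit produced.
plans = {
    "plan_A": {"milling": 0.50, "assembly": 0.20},
    "plan_B": {"milling": 0.30, "assembly": 0.45},
}

def capacity_load(plan: dict, demand: int) -> dict:
    """Required hours per work centre for a given demand quantity."""
    return {wc: hours_per_unit * demand for wc, hours_per_unit in plan.items()}

demand = 300
for name, plan in plans.items():
    load = capacity_load(plan, demand)
    feasible = all(load[wc] <= available_hours[wc] for wc in load)
    print(name, load, "feasible" if feasible else "over capacity")
# plan_A fits within capacity; plan_B overloads assembly (135 h > 120 h).
```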
Abstract:
The shifting of global economic power from mature, established markets to emerging markets (EMs) is a fundamental feature of the new realities in the global political economy. Due to a combination of reasons (such as the scarcity of reliable information on the management systems of EMs and the growing contribution of human resource management (HRM) towards organisational performance, amongst others), understanding the dynamics of HRM in the EMs context, and the need for proactive efforts by key stakeholders (e.g., multinational and local firms, policy makers and institutions such as trade unions) to develop appropriate HRM practice and policy for EMs, have now become more critical than ever. This is all the more so given the phenomenal significance of the EMs predicted for the future of the global economy. For example, Antoine van Agtmael predicts that: in about 25 years the combined gross national product (GNP) of emergent markets will overtake that of currently mature economies causing a major shift in the centre of gravity of the global economy away from the developed to emerging economies. (van Agtmael 2007: 10–11) Despite the present (late 2013 and early 2014) slowdown in the contribution of EMs towards global industrial growth (e.g., Das, 2013; Reuters, 2014), EMs are predicted to produce 70 per cent of world GDP growth, and a further ten years later their equity market capitalisation is expected to reach US$80 trillion, 1.2 times that of the developed world (see Goldman Sachs, 2010).
Abstract:
This paper uses empirical evidence to examine the operational dynamics and paradoxical nature of risk management systems in the banking sector. It demonstrates how a core paradox of market versus regulatory demands, and an accompanying variety of performance, learning and belonging paradoxes, underlie evident tensions in the interaction between front and back office staff in banks. Organisational responses to such paradoxes are found to range from passive to proactive, reflecting differing organisational, departmental and individual risk culture(s) and performance management systems. Nonetheless, a common feature of regulatory initiatives designed to secure a more structurally independent risk management function is that they have failed to rectify a critical imbalance of power, with the back office control functions continuing to be dominated by front office trading and investment functions. Ultimately, viewing the 'core' of risk management systems as a series of connected paradoxes rather than a set of assured, robust practices requires a fundamental switch in emphasis away from a normative, standards-based approach to risk management to one which gives greater recognition to its behavioural dimensions.