16 results for Project reporting tools
in Aston University Research Archive
Abstract:
This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such models can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
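As a rough sketch of the kind of structure such argument maps capture (the relation names, fields and query below are illustrative assumptions, not the actual ClaiMaker schema), a claim can be treated as a typed, directed link between concepts asserted in one or more papers:

    from dataclasses import dataclass

    @dataclass
    class Claim:
        """One scholarly claim: a typed, directed link between two concepts."""
        source: str    # concept or contribution asserted in a paper
        relation: str  # interpretative relation, e.g. "supports", "uses", "refutes"
        target: str    # concept or contribution in the same or another paper
        paper: str     # identifier of the paper making the claim

    # Hypothetical fragment of an argument map spanning two papers
    claims = [
        Claim("ontology-based indexing", "uses", "incremental formalisation", "paper-A"),
        Claim("incremental formalisation", "supports", "scholarly sensemaking", "paper-B"),
    ]

    # A minimal cross-paper query: which claims mention a given concept?
    def claims_about(concept, claim_store):
        return [c for c in claim_store if concept in (c.source, c.target)]

    print(claims_about("incremental formalisation", claims))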
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially-developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
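For reference, the 'classic' definitions that the thesis re-defines for Prolog are usually stated as follows; these are the standard textbook formulas, not the adapted Prolog counts developed in the thesis:

    % Halstead: n1, n2 = distinct operators/operands; N1, N2 = total occurrences
    n = n_1 + n_2, \qquad N = N_1 + N_2, \qquad V = N \log_2 n \quad \text{(volume)}

    % McCabe cyclomatic complexity of a control-flow graph with E edges, N nodes, P connected components
    V(G) = E - N + 2P

    % Henry & Kafura information-flow complexity of a procedure
    C = \text{length} \times (\text{fan-in} \times \text{fan-out})^2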
Abstract:
The methods used by the UK Police to investigate complaints of rape have unsurprisingly come under much scrutiny in recent times, with a 2007 joint report on behalf of HM Crown Prosecution Service Inspectorate and HM Inspectorate of Constabulary concluding that there were many areas where improvements should be made. The research reported here forms part of a larger project which draws on various discourse analytical tools to identify the processes at work during police interviews with women reporting rape. Drawing on a corpus of video-recorded police interviews with women reporting rape, this study applies a two-pronged analysis to reveal the ideologies present in these interactions. Firstly, an analysis of the discourse markers ‘well’ and ‘so’ demonstrates the control exerted on the interaction by interviewing officers, as they attach importance to certain facts while omitting much of the information provided by the victim. Secondly, the interpretative repertoires relied upon by officers to ‘make sense’ of victims’ accounts are subjected to scrutiny. As well as providing micro-level analyses which demonstrate processes of interactional control at the local level, the findings of these analyses can be shown to relate to a wider context: specifically, prevailing ideologies about sexual violence in society as a whole.
Abstract:
Effective management of projects is becoming increasingly important for any type of organization to remain competitive in today’s dynamic business environment, due to the pressures of globalization. The use of benchmarking is widening as a technique for supporting project management. Benchmarking can be described as the search for best practices that lead to superior performance of an organization. However, the effectiveness of benchmarking depends on the use of tools for collecting and analyzing information and deriving subsequent improvement projects. This study demonstrates how the analytic hierarchy process (AHP), a multiple attribute decision-making technique, can be used for benchmarking project management practices. The entire methodology has been applied to benchmark the project management practices of Caribbean public sector organizations against organizations in the Indian petroleum sector and organizations in the infrastructure sectors of Thailand and the UK. This study demonstrates the effectiveness of the proposed AHP-based benchmarking model, determines problems and issues of Caribbean public sector project management, and suggests improvement measures for effective project management.
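As a minimal sketch of the AHP calculation such a benchmarking exercise rests on (the pairwise-comparison values below are invented for illustration and are not taken from the study), criterion weights are derived from a pairwise-comparison matrix and checked for consistency:

    import numpy as np

    # Hypothetical pairwise comparisons of three project-management criteria on
    # Saaty's 1-9 scale; entry [i, j] is the importance of criterion i relative to j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Approximate priority vector: normalise each column, then average across rows.
    weights = (A / A.sum(axis=0)).mean(axis=1)

    # Consistency check: principal eigenvalue estimate, consistency index and ratio
    # (0.58 is Saaty's random index for a 3x3 matrix; CR below ~0.1 is acceptable).
    lambda_max = (A @ weights / weights).mean()
    CR = ((lambda_max - 3) / (3 - 1)) / 0.58

    print("criterion weights:", np.round(weights, 3))
    print("consistency ratio:", round(CR, 3))

In the benchmarking model itself, the same computation would be repeated at each level of the criteria hierarchy and the resulting weights combined into overall scores for the organizations being compared.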
Abstract:
Construction projects are risky. However, the characteristics of the risk depend heavily on the type of procurement adopted for managing the project. A build-operate-transfer (BOT) project is recognized as one of the most risky project schemes, and there are instances of project failure where a BOT scheme was employed. Risks are increasingly being managed using various risk management tools and techniques. However, the application of those tools depends on the nature of the project, the organization's policy, the project management strategy, the risk attitude of the project team members, and the availability of resources. Understanding of the contents and contexts of BOT projects, together with a thorough understanding of risk management tools and techniques, helps in selecting risk management processes for effective project implementation in a BOT scheme. This paper studies the application of risk management tools and techniques in BOT projects through a review of the relevant literature and develops a model for selecting a risk management process for BOT projects. The application to BOT projects is considered from the viewpoints of the major project participants. Political risks are also discussed. This study would contribute to the establishment of a framework for systematic risk management in BOT projects.
Abstract:
Healthcare professionals routinely deploy various quality management tools and techniques in order to improve the performance of healthcare delivery. However, these efforts are characterised by a fragmented approach, i.e. they are not linked with the strategic intent of the organisation. This study introduces a holistic quality improvement method which integrates all quality improvement projects with the strategic intent of the healthcare organisation. It first identifies a healthcare system and its environment. The Strengths, Weaknesses, Opportunities and Threats (SWOT) of the system are then derived with the involvement of the concerned stakeholders. This leads to the development of strategies to satisfy customers in line with the organisation's competitive position. These strategies help identify a few projects, the implementation of which ensures achievement of the desired quality. The projects are then prioritised with the involvement of the concerned stakeholders and implemented in order to improve system performance. The effectiveness of the method has been demonstrated using a case study of an intensive care unit at the Eric Williams Medical Sciences Complex Hospital in Trinidad. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
This study investigates the discursive patterns of interactions between police interviewers and women reporting rape in significant witness interviews. Data in the form of video recorded interviews were obtained from a UK police force for the purposes of this study. The data are analysed using a multi-method approach, incorporating tools from micro-sociology, Conversation Analysis and Discursive Psychology, to reveal patterns of interactional control, negotiation, and interpretation. The study adopts a critical approach, which is to say that as well as describing discursive patterns, it explains them in light of the discourse processes involved in the production and consumption of police interview talk, and comments on the relationship between these discourse processes and the social context in which they occur. A central focus of the study is how interviewers draw on particular interactional resources to shape interviewees' accounts in particular ways, and this is discussed in relation to the institutional role of the significant witness interview. The discussion is also extended to the ways in which mainstream rape ideology is both reflected in, and maintained by, the discursive choices of participants. The findings of this study indicate that there are a number of issues to be addressed in terms of the training currently offered to officers at Level 2 of the Professionalising Investigation Programme (PIP) (NPIA, 2009) who intend to conduct significant witness interviews. Furthermore, a need is identified to bring the linguistic and discursive processes of negotiation and transformation identified by the study to the attention of the justice system as a whole. This is a particularly pressing need in light of judicial reluctance to replace written witness statements, the current 'end product' of significant witness interviews, with the video recorded interview in place of direct examination in cases of rape.
Abstract:
Protein crystallization has gained a new strategic and commercial relevance in the postgenomic era due to its pivotal role in structural genomics. Producing high quality crystals has always been a bottleneck to efficient structure determination, and this problem is becoming increasingly acute. This is especially true for challenging, therapeutically important proteins that typically do not form suitable crystals. The OptiCryst consortium has focused on relieving this bottleneck by making a concerted effort to improve the crystallization techniques usually employed, designing new crystallization tools, and applying such developments to the optimization of target protein crystals. In particular, the focus has been on the novel application of dual polarization interferometry (DPI) to detect suitable nucleation; the application of in situ dynamic light scattering (DLS) to monitor and analyze the process of crystallization; the use of UV-fluorescence to differentiate protein crystals from salt; the design of novel nucleants and seeding technologies; and the development of kits for capillary counterdiffusion and crystal growth in gels. The consortium collectively handled 60 new target proteins that had not been crystallized previously. From these, we generated 39 crystals with improved diffraction properties. Fourteen of these 39 were only obtainable using OptiCryst methods. For the remaining 25, OptiCryst methods were used in combination with standard crystallization techniques. Eighteen structures have already been solved (30% success rate), with several more in the pipeline.
Abstract:
We argue that, for certain constrained domains, elaborate model transformation technologies (implemented from scratch in general-purpose programming languages) are unnecessary for model-driven engineering; instead, lightweight configuration of commercial off-the-shelf productivity tools suffices. In particular, in the CancerGrid project, we have been developing model-driven techniques for the generation of software tools to support clinical trials. A domain metamodel captures the community's best practice in trial design. A scientist authors a trial protocol, modelling their trial by instantiating the metamodel; customized software artifacts to support trial execution are generated automatically from the scientist's model. The metamodel is expressed as an XML Schema, in such a way that it can be instantiated by completing a form to generate a conformant XML document. The same process works at a second level for trial execution: among the artifacts generated from the protocol are models of the data to be collected, and the clinician conducting the trial instantiates such models in reporting observations, again by completing a form to create a conformant XML document, representing the data gathered during that observation. Simple standard form management tools are all that is needed. Our approach is applicable to a wide variety of information-modelling domains: not just clinical trials, but also electronic public sector computing, customer relationship management, document workflow, and so on. © 2012 Springer-Verlag.
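As a rough illustration of the schema-and-conformant-document idea (the element names here are invented and are not the CancerGrid metamodel; the sketch uses the lxml library for validation), an observation entered through a generated form can be checked against a schema fragment:

    from lxml import etree

    # Hypothetical schema fragment for one data item collected during a trial.
    schema_doc = etree.XML(b"""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="observation">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="patientId" type="xs:string"/>
            <xs:element name="systolicBP" type="xs:integer"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>
    """)
    schema = etree.XMLSchema(schema_doc)

    # Document produced by "completing the form" for a single observation.
    instance = etree.XML(b"<observation><patientId>P-001</patientId>"
                         b"<systolicBP>120</systolicBP></observation>")

    print(schema.validate(instance))  # True: the observation conforms to the schema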
Abstract:
The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October 2007 the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both following and prior to the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005) to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p.16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p.17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group: the non-professional investor.
Abstract:
The aim of this study is to address the main deficiencies in the prevailing project cost and time control practices for construction projects in the UK. A questionnaire survey was carried out with 250 top companies, followed by in-depth interviews with 15 experienced practitioners from these companies in order to gain further insights into the identified problems and into their experience of good practice in how these problems can be tackled. On the basis of these interviews and a synthesis with the literature, a list of 65 good practice recommendations has been developed for the key project control tasks: planning, monitoring, reporting and analysing. The Delphi method was then used, with the participation of a panel of 8 practitioner experts, to evaluate these improvement recommendations and to establish their degree of relevance. After two rounds of Delphi, these recommendations are put forward as "critical", "important" or "helpful" measures for improving project control practice.
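A minimal sketch of how final-round Delphi ratings might be mapped onto those three bands (the recommendation names, panel scores and thresholds below are illustrative assumptions, not the study's data or cut-offs):

    from statistics import median

    # Hypothetical final-round ratings from the 8-member panel
    # (1 = low relevance ... 5 = high relevance) for three recommendations.
    ratings = {
        "Integrate cost and schedule baselines": [5, 5, 4, 5, 4, 5, 4, 5],
        "Report progress against earned value":  [4, 4, 3, 4, 5, 4, 4, 3],
        "Hold weekly informal progress reviews": [3, 2, 3, 3, 4, 3, 2, 3],
    }

    def classify(panel_scores, critical_at=4.5, important_at=3.5):
        """Map the panel's median rating onto the three relevance bands."""
        m = median(panel_scores)
        if m >= critical_at:
            return "critical"
        return "important" if m >= important_at else "helpful"

    for recommendation, scores in ratings.items():
        print(recommendation, "->", classify(scores))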
Abstract:
This book provides a practical guide for accountants working in practice or in business faced with the complexity of moving to adopt IFRS-based financial reporting. The book offers not only an overview of the regulatory framework and the requirements to produce IFRS-compliant financial statements but also guidance on developing an implementation strategy including project management, identifying and responding to challenges, dealing with change management and communication with external stakeholders.
Abstract:
The goal of FOCUS, which stands for Frailty Management Optimization through EIPAHA Commitments and Utilization of Stakeholders' Input, is to reduce the burden of frailty in Europe. The partners are working on advancing knowledge of frailty detection, assessment, and management, including biological, clinical, cognitive and psychosocial markers, in order to change the paradigm of frailty care from acute intervention to prevention. FOCUS partners are working on ways to integrate the best available evidence from frailty-related screening tools and from epidemiological and interventional studies into the care of frail people and their quality of life. Frail citizens in Italy, Poland and the UK, and their caregivers, are being called upon to express their views on, and experiences with, treatments and interventions aimed at improving quality of life. The FOCUS Consortium is developing pathways to leverage the knowledge available and to put it at the service of frail citizens. In order to reach out to the broadest audience possible, the FOCUS Platform for Knowledge Exchange and the platform for Scaling Up are being developed in collaboration with stakeholders. The FOCUS project is a development of the work being done by the European Innovation Partnership on Active and Healthy Ageing (EIPAHA), which aims to increase the average healthy lifespan in Europe by two years by 2020 while fostering the sustainability of health and social care systems and innovation in Europe. The knowledge and tools developed by the FOCUS project, with input from stakeholders, will be deployed to all EIPAHA participants dealing with frail older citizens to support their activities and optimize performance.