31 results for Communication of technical information


Relevance: 100.00%

Abstract:

Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where ‘meaning’ is derived from the spatial relationships between concepts’ mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits of employing spatial-semantic mapping (documents, represented as objects in two- and three-dimensional virtual environments, are mapped proximally according to the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed; improved retrieval performance and browsing behaviour were witnessed when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information about the semantic structure of the environment that performance is increased in comparison with using two dimensions for mapping. A model describing the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems; these implications are discussed in the conclusions of this work.
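The abstract does not specify how semantic similarity is computed, but a common basis for such spatial-semantic mappings is a vector-space measure. The sketch below is a hypothetical illustration, not the thesis's method: it scores two documents by the cosine similarity of their term-frequency vectors, and in an SDMS higher-scoring pairs would be placed closer together.

```python
from collections import Counter
import math

def cosine_similarity(doc_a: str, doc_b: str) -> float:
    """Cosine similarity between two documents' term-frequency vectors."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Semantically similar documents score higher, so a spatial-semantic mapping
# would position them nearer to each other in the virtual environment.
print(cosine_similarity("drug information sources", "drug information adoption"))
print(cosine_similarity("drug information sources", "geographic climate model"))
```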

Relevance: 100.00%

Abstract:

Using prescription analyses and questionnaires, the way drug information was used by general medical practitioners during the drug adoption process was studied. Three new drugs were considered: an innovation and two 'me-too' products. The innovation was accepted by general practitioners via a contagion process, information passing among doctors. The 'me-too' preparations were accepted more slowly and by a process which did not include the contagion effect. 'Industrial' information such as direct mail was used more at the 'awareness' stage of the adoption process, while 'professional' sources of information such as articles in medical journals were used more to evaluate a new product. It was shown that 'industrial' information was preferred by older, single-practice doctors who did not specialise, had a first degree only and did not dispense their own prescriptions. Doctors were divided into early- and late-prescribers using the date on which they first prescribed the innovatory drug. Their approach to drug information sources was further studied, and it was shown that the early-prescriber issued slightly more prescriptions per month, had a larger list size, read fewer journals and generally rated industrial sources of information more highly than late-prescribers. The prescribing habits of three consultant rheumatologists were analysed and compared with those of the general practitioners in the community which they served. Very little association was noted, and the influence of the consultant on the prescribing habits of general practitioners was concluded to be low. The consultant's influence was suggested to have two components, active and passive, the active component being the more influential. Journal advertising and advertisement placement were studied for one of the 'me-too' drugs. It was concluded that advertisement placement should be based on the reading patterns of general practitioners and not on ad-hoc data gathered by representatives, as was the present practice. A model was proposed relating the 'time to prescribe' a new drug to the variables suggested throughout this work. Four of these variables were shown to be significant: the list size, the medical age of the prescriber, the number of new preparations prescribed in a given time, and the number of partners in the practice.

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

This research investigates the contribution that Geographic Information Systems (GIS) can make to the land suitability process used to determine the effects of a climate change scenario. The research is intended to redress the severe under-representation of developing countries within the literature examining the impacts of climatic change upon crop productivity. The methodology adopts some of the Intergovernmental Panel on Climate Change (IPCC) estimates for regional climate variations, based upon General Circulation Model (GCM) predictions, and applies them to a baseline climate for Bangladesh. Utilising the United Nations Food and Agriculture Organization's Agro-Ecological Zones land suitability methodology and crop yield model, the effects of the scenario upon agricultural productivity for 14 crops are determined. A Geographic Information System (IDRISI) is adopted in order to facilitate the methodology, in conjunction with a specially designed spreadsheet used to determine the yield and suitability rating for each crop. A simple optimisation routine using the GIS is incorporated to provide an indication of the 'maximum theoretical' yield available to the country, should the most calorifically significant crops be cultivated on each land unit, both before and after the climate change scenario. This routine provides an estimate of the theoretical population-supporting capacity of the country, both now and in the future, to assist with planning strategies and research. The research evaluates the utility of this alternative GIS-based methodology for the land evaluation process and determines the relative changes in crop yields that may result from changes in temperature, photosynthesis and flooding hazard frequency. In summary, the combination of a GIS and a spreadsheet was successful; the yield prediction model indicates that the application of the climate change scenario will have a deleterious effect upon the yields of the study crops. Any yield reductions will have severe implications for agricultural practices. The optimisation routine suggests that the 'theoretical maximum' population-supporting capacity is well in excess of current and future population figures. If this agricultural potential could be realised, however, it may provide some amelioration of the effects of climate change.
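The 'maximum theoretical' yield optimisation described in the abstract can be illustrated with a minimal sketch: for each land unit, select the crop that delivers the most calories per hectare, and compare the totals before and after the scenario. All crop names, yields and calorie densities below are hypothetical stand-ins, not values from the thesis.

```python
# Hypothetical calorie densities (calories per tonne) and per-land-unit yields (t/ha).
calories_per_tonne = {"rice": 1.3e6, "wheat": 1.2e6, "potato": 0.8e6}
baseline = [{"rice": 4.0, "wheat": 3.0, "potato": 9.0},
            {"rice": 2.5, "wheat": 3.5, "potato": 6.0}]
scenario = [{"rice": 3.4, "wheat": 2.6, "potato": 8.1},  # yields reduced under the
            {"rice": 2.1, "wheat": 3.0, "potato": 5.2}]  # climate change scenario

def max_calorific_yield(units):
    """Pick, for each land unit, the crop giving the most calories per hectare."""
    return sum(max(y * calories_per_tonne[c] for c, y in unit.items())
               for unit in units)

# The theoretical maximum calorific output falls once the scenario is applied.
print(max_calorific_yield(baseline), max_calorific_yield(scenario))
```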

Relevance: 100.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data flow fan-in/fan-out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
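The thesis's redefined metric counts for Prolog are not reproduced in the abstract, but the flavour of a size metric over Prolog source can be sketched as follows. The counting rules here (non-comment lines, clause terminators, ':-' rules) are deliberately naive illustrations, not the study's actual definitions.

```python
def prolog_size_metrics(source: str) -> dict:
    """Crude size counts for a Prolog source listing (illustrative only)."""
    lines = [ln.strip() for ln in source.splitlines()]
    code = [ln for ln in lines if ln and not ln.startswith('%')]  # drop blanks/comments
    clauses = sum(ln.count('.') for ln in code)  # naive end-of-clause count
    rules = sum(1 for ln in code if ':-' in ln)  # clauses with a body
    return {"loc": len(code), "clauses": clauses, "rules": rules}

sample = """
% ancestor relation
ancestor(X, Y) :- parent(X, Y).
ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
parent(tom, bob).
"""
print(prolog_size_metrics(sample))
```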

Relevance: 100.00%

Abstract:

We develop and study the concept of dataflow process networks, as used for example by Kahn, to suit exact computation over data types related to real numbers, such as continuous functions and geometrical solids. Furthermore, we consider communicating these exact objects among processes using protocols of a query-answer nature, as introduced in our earlier work. This enables processes to provide valid approximations with certain accuracy, focusing on a certain locality, as demanded by the receiving processes through queries. We define domain-theoretical denotational semantics of our networks in two ways: (1) directly, i.e. by viewing the whole network as a composite process and applying the process semantics introduced in our earlier work; and (2) compositionally, i.e. by a fixed-point construction, similar to that used by Kahn, from the denotational semantics of individual processes in the network. The direct semantics closely corresponds to the operational semantics of the network (i.e. it is correct) but is very difficult to study for concrete networks. The compositional semantics enables compositional analysis of concrete networks, assuming it is correct. We prove that the compositional semantics is a safe approximation of the direct semantics. We also provide a method that can be used in many cases to establish that the two semantics fully coincide, i.e. that safety is not achieved through inactivity or meaningless answers. The results are extended to cover recursively defined infinite networks as well as nested finite networks. A robust prototype implementation of our model is available.
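The query-answer protocol can be illustrated informally: a receiving process queries an upstream process for an approximation at a requested accuracy, and the upstream process answers with a rational value within that error bound. The sketch below is not the authors' formal model; it approximates sqrt(2) by bisection over exact rationals and composes it with a downstream process that forwards the accuracy query.

```python
from fractions import Fraction

def sqrt2_process(k: int) -> Fraction:
    """Answer a query for sqrt(2) to within 10**-k, via bisection on rationals."""
    lo, hi = Fraction(1), Fraction(2)
    eps = Fraction(1, 10 ** k)
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo  # lo <= sqrt(2) <= hi and hi - lo <= eps

def plus_one_process(upstream, k: int) -> Fraction:
    """Downstream process: forwards the accuracy query, then shifts the answer."""
    return upstream(k) + 1

approx = plus_one_process(sqrt2_process, 6)
print(float(approx))  # close to 1 + sqrt(2)
```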

Relevance: 100.00%

Abstract:

The problem of separating structured information representing phenomena of differing natures is considered. A structure is assumed to be independent of the others if it can be represented in a complementary subspace. When the concomitant subspaces are well separated, the problem is readily solvable by a linear technique. Otherwise, the linear approach fails to correctly discriminate the required information; hence, a non-extensive approach is proposed. The resulting nonlinear technique is shown to be suitable for dealing with cases that cannot be tackled by the linear one.
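The linear technique for well-separated subspaces can be sketched as a decomposition over concatenated bases: if the observation is a sum of components from two subspaces, solving one linear system over both bases at once recovers each component. The bases and observation below are synthetic examples, not data from the paper.

```python
# Hypothetical bases: structure 1 lives in span{a1, a2}, structure 2 in span{b1},
# all subspaces of R^3. The observation mixes both: x = 2*a1 - 1*a2 + 3*b1.
a1, a2, b1 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 1.0]
x = [5.0, 2.0, 3.0]

def solve3(cols, rhs):
    """Solve the 3x3 linear system whose columns are `cols`, by Gaussian elimination."""
    m = [[cols[j][i] for j in range(3)] + [rhs[i]] for i in range(3)]  # augmented matrix
    for p in range(3):  # forward elimination with partial pivoting
        piv = max(range(p, 3), key=lambda r: abs(m[r][p]))
        m[p], m[piv] = m[piv], m[p]
        for r in range(p + 1, 3):
            f = m[r][p] / m[p][p]
            m[r] = [m[r][c] - f * m[p][c] for c in range(4)]
    sol = [0.0] * 3
    for p in reversed(range(3)):  # back substitution
        sol[p] = (m[p][3] - sum(m[p][c] * sol[c] for c in range(p + 1, 3))) / m[p][p]
    return sol

ca1, ca2, cb1 = solve3([a1, a2, b1], x)
part_a = [ca1 * u + ca2 * v for u, v in zip(a1, a2)]  # structure 1's component
part_b = [cb1 * w for w in b1]                        # structure 2's component
print(part_a, part_b)  # the two components sum back to the observation x
```

When the subspaces are nearly aligned the system becomes ill-conditioned, which is exactly the regime where the paper's nonlinear technique is needed.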

Relevance: 100.00%

Abstract:

Background: Currently, no review has been completed regarding the information-gathering process for the provision of medicines for self-medication in community pharmacies in developing countries. Objective: To review the rate of information gathering and the types of information gathered when patients present for self-medication requests. Methods: Six databases were searched for studies that described the rate of information gathering and/or the types of information gathered in the provision of medicines for self-medication in community pharmacies in developing countries. The types of information reported were classified as: signs and symptoms, patient identity, action taken, medications, medical history, and others. Results: Twenty-two studies met the inclusion criteria. Variations in the study populations, types of scenarios, research methods, and data reporting were observed. The reported rate of information gathering varied from 18% to 97%, depending on the research methods used. Information on signs and symptoms and patient identity was more frequently reported to be gathered compared with information on action taken, medications, and medical history. Conclusion: Evidence showed that the information-gathering process for the provision of medicines for self-medication via community pharmacies in developing countries is inconsistent. There is a need to determine the barriers to appropriate information-gathering practice as well as to develop strategies to implement effective information-gathering processes. It is also recommended that international and national pharmacy organizations, including pharmacy academics and pharmacy researchers, develop a consensus on the types of information that should be reported in the original studies. This will facilitate comparison across studies so that areas that need improvement can be identified. © 2013 Elsevier Inc.

Relevance: 100.00%

Abstract:

Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal ‘acceptable use policy’ (AUP). Whilst the AUP has attracted some academic interest, the literature has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies – taken from the higher education sector – rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours, and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.

Relevance: 100.00%

Abstract:

In efficiency studies using the stochastic frontier approach, the main focus is to explain inefficiency in terms of some exogenous variables and to compute the marginal effects of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
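The bootstrap step of such an analysis can be sketched generically: resample the data with replacement, recompute the statistic on each resample, and read the confidence limits from the empirical quantiles (a percentile bootstrap). The 'marginal effects' sample below is hypothetical; the paper's actual computation derives each effect from the estimated frontier model rather than taking it as given.

```python
import random
import statistics

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for a statistic of the data."""
    rng = random.Random(seed)
    reps = sorted(
        statistic([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    lo = reps[int(n_boot * alpha / 2)]           # lower empirical quantile
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]  # upper empirical quantile
    return lo, hi

# Hypothetical marginal-effect estimates standing in for the Jondrow-based values.
effects = [0.12, 0.08, 0.15, 0.10, 0.09, 0.14, 0.11, 0.13, 0.07, 0.12]
lo, hi = bootstrap_ci(effects, statistics.mean)
print(lo, hi)
```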

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that were considered important by users when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise availability and allow interrogation of these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool.
The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates ‘at a glance’ dataset intercomparison and fitness for purpose-based dataset selection.
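The GEO label's core function, summarising the availability of the eight informational aspects, can be sketched as follows. The metadata keys and example dataset are hypothetical, and the real service renders a visual label from catalogue metadata rather than returning a mapping.

```python
# The eight informational aspects identified in the studies.
ASPECTS = ["producer information", "producer comments", "lineage information",
           "compliance with standards", "quantitative quality information",
           "user feedback", "expert reviews", "citations information"]

def geo_label_summary(dataset_metadata: dict) -> dict:
    """Flag which of the eight aspects a dataset's metadata makes available."""
    return {aspect: aspect in dataset_metadata for aspect in ASPECTS}

# Hypothetical dataset record with only two aspects documented.
example = {"producer information": "National mapping agency",
           "lineage information": "derived from 2010 survey data"}
summary = geo_label_summary(example)
print(sum(summary.values()), "of", len(ASPECTS), "aspects available")
```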

Relevance: 100.00%

Abstract:

The involvement of parents in their child’s hospital care has been strongly advocated in paediatric healthcare policy and practice. However, incorporating parental worries about their child’s condition into clinical care can be difficult for both parents and healthcare professionals. Through our “Listening To You” quality improvement project we developed and piloted an innovative approach to listening, incorporating and responding to parental concerns regarding their child’s condition when in hospital. Here we describe the phases of work undertaken to develop our “Listening To You” communications bundle, including a survey, literature review and consultation with parents and staff, before findings from the project evaluation are presented and discussed.