39 results for Democratization of information


Relevance:

100.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

Information and Communications Technology (ICT) is widely regarded as a key integration enabler in contemporary supply chain configurations. Furthermore, recent years have seen the vertical disintegration of supply chains, as increasing numbers of manufacturers and retailers outsource significant parts of their supply chain functionality. In this environment, Third Party Logistics (3PL) providers - the majority of which are small companies - play a pivotal role. This raises important questions about the usage of ICT in this sector. However, there is a paucity of research on small 3PLs, with little empirical investigation into their usage of ICT. This paper presents the results of a survey on ICT systems usage in a sample of small Italian 3PLs. The results provide a technological profile of the surveyed companies, as well as an analysis of the role of ICT in customising services and of the factors influencing technology adoption.

Relevance:

100.00%

Publisher:

Abstract:

This paper reports on an action research project based in the UK rail industry; it used a novel type of Soft Systems Methodology (known as PrOH Modelling) to facilitate change in a major Train Operating Company (TOC). The project examined a number of different disruptive incidents to compare and contrast practice via the Mitigate, Prevent, React and Recover (MPRR) Framework. One incident is detailed in depth. The paper also looks at the general process of conducting action research. This work will be of interest to researchers in the rail sector and to those conducting operations action research projects.

Relevance:

100.00%

Publisher:

Abstract:

Neural networks can be regarded as statistical models and analysed in a Bayesian framework. Generalisation is measured by performance on independent test data drawn from the same distribution as the training data. Such performance can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least mean squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the δ dual affine geometry of statistical manifolds and the geometry of the dual pair of Banach spaces L_δ and L_δ′. It therefore offers a conceptual simplification of information geometry. The general conclusion on evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D_δ, specifying the requirement of the task; and the model Q, specifying the available computing resources.
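
As a worked illustration of result (1) (not spelled out in the abstract itself), take the divergence to be the Kullback-Leibler divergence, one admissible member of the D_δ family. The posterior-expected divergence from the true distribution p to an estimate q is

\[
\big\langle D_{\mathrm{KL}}(p \,\|\, q) \big\rangle_{P(p \mid \mathcal{D})}
= -\sum_x \big\langle p(x) \big\rangle_{P(p \mid \mathcal{D})} \log q(x) + \text{const},
\]

which, over normalised q, is minimised by the posterior mean \( q^*(x) = \langle p(x) \rangle_{P(p \mid \mathcal{D})} \): the ideal optimal estimate is the average over the posterior.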

Relevance:

100.00%

Publisher:

Abstract:

To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, it is necessary for companies to equip themselves with intelligent tools, thereby enabling managers to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, such as managers and engineers, in making the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the empirical determinants of the quality of information disclosed about directors' share options in a sample of large companies in 1994 and 1995. Policy recommendations, consolidated in the recommendations of the Greenbury report, argue for full and complete disclosure of director option information. In this paper two modest contributions to the UK empirical literature are made. First, the current degree of option information disclosure in the FTSE 350 companies is documented. Second, option information disclosure is modelled as a function of variables that are thought to influence corporate costs of disclosure. The results have implications for corporate governance. Specifically, support is offered for the monitoring function of non-executive directors. In addition, non-disclosure is found to be related to variables which proxy the proprietary costs of revealing information (such as company size).

Relevance:

100.00%

Publisher:

Abstract:

Sensory cells usually transmit information to afferent neurons via chemical synapses, in which the level of noise depends on the applied stimulus. Taking this dependence into account, we model a sensory system as an array of leaky integrate-and-fire (LIF) neurons with a common signal. We show that information transmission is enhanced by a nonzero level of noise. Moreover, we demonstrate a phenomenon similar to suprathreshold stochastic resonance with additive noise. We remark that many properties of information transmission found for the LIF neurons were predicted by us earlier with simple binary units [Phys. Rev. E 75, 021121 (2007)]. This confirmation of our predictions allows us to point out the identical roots of the phenomena found in simple threshold systems and in the more complex LIF neurons.
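
A minimal simulation sketch of the setup described, assuming a standard LIF model with a common subthreshold signal and independent additive Gaussian noise; all parameter values are illustrative, and the correlation used below is only a crude stand-in for the paper's information-theoretic measures:

import numpy as np

# Sketch: an array of leaky integrate-and-fire (LIF) neurons driven by a
# common signal plus independent Gaussian noise. Parameters are illustrative,
# not taken from the paper.
rng = np.random.default_rng(0)

def lif_array_spike_counts(signal, noise_std, n_neurons=100, dt=1e-3,
                           tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Simulate n_neurons LIF units sharing `signal`; return population spike counts per step."""
    v = np.zeros(n_neurons)
    counts = np.zeros(len(signal))
    for t, s in enumerate(signal):
        noise = noise_std * rng.standard_normal(n_neurons)
        v += dt * (-v / tau + s) + np.sqrt(dt) * noise   # leaky integration
        fired = v >= v_thresh
        counts[t] = fired.sum()
        v[fired] = v_reset                               # reset after a spike
    return counts

# Subthreshold sinusoidal input: with these parameters, no neuron fires at zero noise.
t = np.arange(0.0, 2.0, 1e-3)
signal = 30.0 * (1.0 + 0.5 * np.sin(2 * np.pi * 5 * t))

for noise_std in (0.0, 2.0, 8.0, 32.0):
    counts = lif_array_spike_counts(signal, noise_std)
    # Correlation between input and population rate as a crude proxy for
    # information transmission.
    r = np.corrcoef(signal, counts)[0, 1] if counts.std() > 0 else 0.0
    print(f"noise_std={noise_std:5.1f}  corr(signal, rate)={r:.3f}")

Because no neuron fires at zero noise, any signal tracking that appears at moderate noise levels illustrates noise-enhanced transmission, the qualitative effect the abstract reports.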

Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with the problem of Information Systems design for Corporate Management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on cybernetic principles and theories. It looks at management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed, and a design to enhance survival capacity is developed. Taking Ashby's theory of adaptation and developments on ultra-stability as a theoretical framework, and considering the conditions for learning and foresight, it deduces that a design should include three basic components: a dynamic model of the organization-environment relationship; a method to spot significant changes in the values of the essential variables and in a certain set of parameters; and a controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, together with the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. Using the ultra-stable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change: it starts by distinguishing between change within behaviour and change of behaviour, completing the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of Information Systems design with that of organizational learning.
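
A toy sketch of the Ashby-style ultrastability the design builds on (illustrative only; none of this code comes from the thesis): a first-order system whose behaviour is set by a step-parameter, with the controller re-drawing that parameter at random whenever the essential variable leaves its viable band.

import random

# Illustrative Ashby-style ultrastable system: x is the essential variable,
# gain is the behaviour-determining step-parameter. When x leaves the viable
# band, the parameter is re-drawn at random until behaviour stabilises.
random.seed(1)

def run_ultrastable(steps=200, band=(-1.0, 1.0)):
    x = 0.0                       # essential variable
    gain = random.uniform(-2, 2)  # step-parameter setting current behaviour
    changes = 0
    for _ in range(steps):
        disturbance = random.gauss(0, 0.1)
        x = gain * x + disturbance             # environment + current behaviour
        if not (band[0] <= x <= band[1]):      # essential variable out of bounds:
            gain = random.uniform(-2, 2)       # second-order change of behaviour
            x = max(min(x, band[1]), band[0])  # clamp back into the band
            changes += 1
    return gain, changes

gain, changes = run_ultrastable()
print(f"settled gain={gain:.2f} after {changes} step-changes")

Runs settle on a stabilising gain (|gain| < 1 here) after a few step-changes, which corresponds to the taxonomy's 'change of behaviour'; ordinary fluctuations of x within the band correspond to 'change within behaviour'.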

Relevance:

100.00%

Publisher:

Abstract:

Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where 'meaning' is derived from the spatial relationships between concepts' mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits of employing spatial-semantic mapping (documents represented as objects in two- and three-dimensional virtual environments are mapped proximally according to the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed: improved retrieval performance and browsing behaviour were witnessed when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information about the semantic structure of the environment that performance is increased in comparison with two-dimensional mapping. A model describing the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems. These implications are discussed in the conclusions of this work.
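
A minimal sketch of one way to build such a spatial-semantic mapping, assuming a standard TF-IDF plus multidimensional-scaling pipeline (the thesis's own mapping method is not specified in the abstract):

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import MDS
from sklearn.metrics.pairwise import cosine_distances

# Spatial-semantic mapping sketch: embed documents so that pairwise distances
# in the virtual environment approximate the semantic dissimilarity of their
# content. The documents below are placeholder examples.
docs = [
    "bayesian inference for neural network models",
    "information retrieval and document browsing",
    "virtual environments for data visualisation",
    "posterior distributions and model selection",
]

tfidf = TfidfVectorizer().fit_transform(docs)   # bag-of-words representation
dissim = cosine_distances(tfidf)                # semantic dissimilarity matrix

for dims in (2, 3):                             # 2-D vs 3-D VE layouts
    mds = MDS(n_components=dims, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissim)          # document positions in the VE
    print(f"{dims}-D layout, stress={mds.stress_:.3f}")
    print(np.round(coords, 2))

The stress value gives a rough sense of how faithfully each layout preserves the semantic dissimilarities; the extra degree of freedom generally lets the 3-D layout achieve lower stress, in line with the performance advantage reported above.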

Relevance:

100.00%

Publisher:

Abstract:

Using prescription analyses and questionnaires, the way drug information was used by general medical practitioners during the drug adoption process was studied. Three new drugs were considered: an innovation and two 'me-too' products. The innovation was accepted by general practitioners via a contagion process, with information passing among doctors. The 'me-too' preparations were accepted more slowly and by a process which did not include the contagion effect. 'Industrial' information, such as direct mail, was used more at the 'awareness' stage of the adoption process, while 'professional' sources of information, such as articles in medical journals, were used more to evaluate a new product. It was shown that 'industrial' information was preferred by older, single-practice doctors who did not specialise, had a first degree only and did not dispense their own prescriptions. Doctors were divided into early- and late-prescribers by the date on which they first prescribed the innovatory drug. Their approach to drug information sources was studied further, and it was shown that early-prescribers issued slightly more prescriptions per month, had larger list sizes, read fewer journals and generally rated industrial sources of information more highly than late-prescribers. The prescribing habits of three consultant rheumatologists were analysed and compared with those of the general practitioners in the community which they served. Very little association was noted, and the influence of the consultants on the prescribing habits of general practitioners was concluded to be low. The consultants' influence was suggested to have two components, active and passive, the active component being the more influential. Journal advertising and advertisement placement were studied for one of the 'me-too' drugs. It was concluded that advertisement placement should be based on the reading patterns of general practitioners and not on ad-hoc data gathered by representatives, as was the practice at the time. A model was proposed relating the 'time to prescribe' a new drug to the variables suggested throughout this work; four of these variables were shown to be significant: the list size, the medical age of the prescriber, the number of new preparations prescribed in a given time, and the number of partners in the practice.
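
The contagion process described here is conventionally captured by a mixed-influence diffusion model; the form below is a standard illustration, not the model proposed in this work:

\[
\frac{dN(t)}{dt} = \Big( a + b\,\frac{N(t)}{M} \Big) \big( M - N(t) \big),
\]

where N(t) is the number of prescribers who have adopted by time t, M is the total prescriber population, a captures external ('industrial') influence such as direct mail, and b captures doctor-to-doctor contagion; the slower, non-contagious uptake of the 'me-too' preparations corresponds to b ≈ 0.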

Relevance:

100.00%

Publisher:

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
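
A small sketch of what redefined metric counts for Prolog might look like; the counting rules below are assumptions for illustration, since the thesis's exact redefinitions are not given in the abstract:

import re

# Sketch of 'classic' size/structure metrics re-targeted at Prolog, under
# assumed definitions: LOC = non-blank, non-comment lines; a McCabe-style
# complexity counts each extra clause of a predicate, each disjunction ';'
# and each if-then '->' as a decision point.
def prolog_metrics(source: str):
    lines = [ln for ln in source.splitlines()
             if ln.strip() and not ln.strip().startswith('%')]
    loc = len(lines)
    code = "\n".join(lines)
    # Predicate heads at the start of a line: name followed by '(' , ':-' or '.'
    heads = re.findall(r'^([a-z]\w*)\s*(?:\(|:-|\.)', code, flags=re.M)
    clauses = {}
    for h in heads:
        clauses[h] = clauses.get(h, 0) + 1
    extra_clauses = sum(n - 1 for n in clauses.values())
    decisions = code.count(';') + code.count('->')
    return {"loc": loc, "cyclomatic": 1 + extra_clauses + decisions}

example = """
% naive list membership
member(X, [X|_]).
member(X, [_|T]) :- member(X, T).

sign(X, pos) :- X > 0.
sign(X, nonpos) :- X =< 0.
"""
print(prolog_metrics(example))   # {'loc': 4, 'cyclomatic': 3}

Treating alternative clauses as branch points is the natural analogue of McCabe's decision count for a language whose control flow is clause selection plus backtracking.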

Relevance:

100.00%

Publisher:

Abstract:

Background: Currently, no review has been completed regarding the information-gathering process for the provision of medicines for self-medication in community pharmacies in developing countries. Objective: To review the rate of information gathering and the types of information gathered when patients present for self-medication requests. Methods: Six databases were searched for studies that described the rate of information gathering and/or the types of information gathered in the provision of medicines for self-medication in community pharmacies in developing countries. The types of information reported were classified as: signs and symptoms, patient identity, action taken, medications, medical history, and others. Results: Twenty-two studies met the inclusion criteria. Variations in the study populations, types of scenarios, research methods, and data reporting were observed. The reported rate of information gathering varied from 18% to 97%, depending on the research methods used. Information on signs and symptoms and patient identity was more frequently reported to be gathered compared with information on action taken, medications, and medical history. Conclusion: Evidence showed that the information-gathering process for the provision of medicines for self-medication via community pharmacies in developing countries is inconsistent. There is a need to determine the barriers to appropriate information-gathering practice as well as to develop strategies to implement effective information-gathering processes. It is also recommended that international and national pharmacy organizations, including pharmacy academics and pharmacy researchers, develop a consensus on the types of information that should be reported in the original studies. This will facilitate comparison across studies so that areas that need improvement can be identified. © 2013 Elsevier Inc.