827 results for Ethics of information
Levinasian ethics and the representation of the other in international and cross-cultural management
Abstract:
In this paper, we seek to further the discussion, problematization and critique of west/east identity relations in ICM studies by considering the ethics of the relationship – an issue never far beneath the surface in discussions of Orientalism. In particular, we seek both to examine and to question the ethics of representation in relation to a critique of what has come to be known as international and cross-cultural management (ICM). To pursue such a discussion, we draw specifically on the ethical elaborations of Emmanuel Levinas as well as his chief interlocutors Jacques Derrida and Zygmunt Bauman. The value of this discussion, we propose, is that Levinas offers a philosophy whose central concept is the relationship between the self and the Other as the primary ethical and pre-ontological relation. Levinas' philosophy provides a means of extending the post-colonial critique of ICM, and ICM provides a context in which Levinasian ethics can be brought to bear on a significant issue in contemporary business and management.
Abstract:
Social networks constitute a major channel for the diffusion of information and the formation of attitudes in a society. Introducing a dynamic model of social learning, the first part of this thesis studies the emergence of socially influential individuals and groups, and identifies the characteristics that make them influential. The second part uses a Bayesian network game to analyse the role of social interaction and conformism in the making of decisions whose returns or costs are ex ante uncertain.
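The abstract above does not reproduce the model itself; as a hedged illustration of how influential individuals can be identified in a social-learning network, here is a minimal DeGroot-style sketch (a standard model in this literature, not necessarily the thesis' own), in which each agent's long-run influence is given by the left unit-eigenvector of the trust matrix:

```python
import numpy as np

# Row-stochastic trust matrix: T[i, j] is the weight agent i places on
# agent j's opinion. Values are hypothetical, for illustration only.
T = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

# DeGroot updating: beliefs(t+1) = T @ beliefs(t). The long-run influence
# of each agent is the left eigenvector of T for eigenvalue 1 (the
# stationary distribution), normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(T.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
influence = stationary / stationary.sum()

print("long-run influence weights:", influence)
```

Agents receiving high trust from other well-trusted agents end up with the largest weights, which is one formal sense in which "socially influential individuals" emerge.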
Abstract:
Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. Generalisation is measured by the performance on independent test data drawn from the same distribution as the training data. Such performance can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least mean squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods for dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the δ-dual affine geometry of statistical manifolds and the geometry of the corresponding dual pair of Banach spaces. It therefore offers a conceptual simplification of information geometry. The general conclusion on the issue of evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D_δ, specifying the requirement of the task; and the model Q, specifying the available computing resources.
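As a hedged restatement of the two main results in standard notation (the symbols below are assumptions, not necessarily the thesis' own): writing P(p | D) for the posterior over true distributions and D(p || q) for the information divergence,

```latex
% Generalisation error of an estimate q: the posterior-averaged divergence
\[
  E[q] \;=\; \int D(p \,\|\, q)\, P(p \mid \mathcal{D})\, \mathrm{d}p .
\]
% (1) The ideal optimal estimate is the average over the posterior:
\[
  \hat{p} \;=\; \arg\min_{q} E[q] \;=\; \langle p \rangle_{P(p \mid \mathcal{D})} .
\]
% (2) Within a computational model Q, the optimal estimate is the
%     divergence projection of the ideal estimate onto Q:
\[
  \hat{q}_{Q} \;=\; \arg\min_{q \in Q} D(\hat{p} \,\|\, q) .
\]
```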
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, companies need to equip themselves with intelligent tools that enable managers to make better tactical decisions. However, implementing an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, to make the right decisions for continual process improvement.
Abstract:
This paper considers the empirical determinants of the quality of information disclosed about directors' share options in a sample of large companies in 1994 and 1995. Policy recommendations, consolidated in the recommendations of the Greenbury report, argue for full and complete disclosure of director option information. In this paper two modest contributions to the UK empirical literature are made. First, the current degree of option information disclosure in the FTSE 350 companies is documented. Second, option information disclosure is modelled as a function of variables that are thought to influence corporate costs of disclosure. The results have implications for corporate governance. Specifically, support is offered for the monitoring function of non-executive directors. In addition, nondisclosure is found to be related to variables which proxy the proprietary costs of revealing information (such as company size).
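As a hedged sketch of the modelling step (variable names, data and the exact specification are hypothetical), disclosure can be regressed on governance and proprietary-cost proxies with a logit model:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 350  # hypothetical firm-year sample

# Hypothetical proxies: firm size, proportion of non-executive
# directors (monitoring), and a proprietary-cost proxy.
log_size = rng.normal(6.0, 1.0, n)
nonexec_prop = rng.uniform(0.2, 0.8, n)
prop_cost = rng.normal(0.0, 1.0, n)

# Simulated outcome (1 = full option disclosure): monitoring raises the
# odds; proprietary-cost proxies, including size, lower them (consistent
# with the direction of the paper's reported findings).
index = 1.0 - 0.3 * log_size + 2.0 * nonexec_prop - 0.8 * prop_cost
disclose = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-index))).astype(int)

X = sm.add_constant(np.column_stack([log_size, nonexec_prop, prop_cost]))
fit = sm.Logit(disclose, X).fit(disp=False)
print(fit.summary(xname=["const", "log_size", "nonexec_prop", "prop_cost"]))
```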
Abstract:
Sensory cells usually transmit information to afferent neurons via chemical synapses, in which the level of noise depends on the applied stimulus. Taking this dependence into account, we model a sensory system as an array of leaky integrate-and-fire (LIF) neurons with a common signal. We show that information transmission is enhanced by a nonzero level of noise. Moreover, we demonstrate a phenomenon similar to suprathreshold stochastic resonance with additive noise. We remark that many properties of information transmission found for the LIF neurons were predicted by us earlier with simple binary units [Phys. Rev. E 75, 021121 (2007)]. This confirmation of our predictions allows us to point to the common roots of the phenomena found in simple threshold systems and in the more complex LIF neurons.
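A minimal simulation sketch of the setup described, an array of leaky integrate-and-fire neurons driven by a common subthreshold signal plus independent noise (all parameter values are assumptions, not those of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_population(signal, noise_sd, n_neurons=100, dt=1e-4,
                   tau=0.02, v_th=1.0, v_reset=0.0):
    """Crude Euler simulation of an LIF array; returns the population
    spike count at each time step."""
    v = np.zeros(n_neurons)
    counts = np.zeros(len(signal))
    for t, s in enumerate(signal):
        noise = rng.normal(0.0, noise_sd, n_neurons)  # independent per neuron
        v += (-v + s + noise) * dt / tau              # leaky integration
        fired = v >= v_th
        counts[t] = fired.sum()
        v[fired] = v_reset
    return counts

# Common subthreshold signal: without noise it never reaches threshold.
t = np.arange(0.0, 0.5, 1e-4)
signal = 0.8 + 0.1 * np.sin(2 * np.pi * 10 * t)  # peak input < v_th

for noise_sd in [0.0, 2.0, 5.0, 20.0]:
    counts = lif_population(signal, noise_sd)
    # Signal/response correlation as a crude stand-in for information
    # transmission; it is zero without noise and typically peaks at some
    # intermediate noise level (the stochastic-resonance-like effect).
    r = np.corrcoef(signal, counts)[0, 1] if counts.std() > 0 else 0.0
    print(f"noise_sd={noise_sd:5.1f}  signal/response correlation: {r:.3f}")
```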
Abstract:
This thesis deals with the problem of Information Systems design for Corporate Management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on Cybernetic principles and theories. It looks at Management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed and a design to enhance survival capacity is developed. Taking Ashby's theory of adaptation and developments on ultrastability as a theoretical framework, and considering the conditions for learning and foresight, it deduces that a design should include three basic components: a dynamic model of the organization-environment relationships; a method to spot significant changes in the value of the essential variables and in a certain set of parameters; and a controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, and of the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. The use of the ultrastable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change: it starts by distinguishing between change within behaviour and change of behaviour, and completes the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of Information Systems design with that of organizational learning.
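As a hedged toy illustration of the ultrastable arrangement described (Ashby-style, with invented dynamics and parameter values): a first-order loop regulates an essential variable, and a second-order mechanism makes step changes to the regulator's own parameters whenever the variable leaves its viable bounds:

```python
import random

random.seed(4)

def environment(state, action, disturbance):
    # Hypothetical environment dynamics for the essential variable.
    return 0.9 * state + action + disturbance

state, gain = 5.0, 0.1   # essential variable and regulator parameter
bounds = (-2.0, 2.0)     # viability limits for the essential variable

for t in range(50):
    action = -gain * state                      # first-order regulation
    state = environment(state, action, random.gauss(0, 0.2))
    if not bounds[0] <= state <= bounds[1]:
        # Second-order (ultrastable) adaptation: a step change to the
        # regulator's parameter, retried until stability is regained.
        gain = random.uniform(0.0, 1.5)
        print(f"t={t:2d}  out of bounds (state={state:+.2f}); new gain={gain:.2f}")

print(f"final state: {state:+.2f}  final gain: {gain:.2f}")
```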
Abstract:
Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where ‘meaning’ is derived from the spatial relationships between concepts’ mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits of employing spatial-semantic mapping (documents, represented as objects in two- and three-dimensional virtual environments, are mapped proximally according to the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed: retrieval performance and browsing behaviour improved when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information about the semantic structure of the environment that performance increases in comparison with two-dimensional mapping. A model describing the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems. These implications are discussed in the conclusions of this work.
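As a hedged illustration of spatial-semantic mapping (a generic pipeline, not the thesis' own), documents can be placed in a low-dimensional space so that proximity tracks semantic similarity, for example TF-IDF vectors reduced by multidimensional scaling:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import MDS
from sklearn.metrics.pairwise import cosine_distances

docs = [
    "bayesian inference for neural networks",
    "neural network generalisation and information divergence",
    "community pharmacy self medication practice",
    "information gathering in community pharmacies",
]

# Semantic similarity proxy: cosine distance between TF-IDF vectors.
tfidf = TfidfVectorizer().fit_transform(docs)
dist = cosine_distances(tfidf)

# Map documents into 3D so that proximal objects have similar content
# (the thesis compares 2D against 3D virtual-environment layouts).
coords = MDS(n_components=3, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
for doc, (x, y, z) in zip(docs, coords):
    print(f"({x:+.2f}, {y:+.2f}, {z:+.2f})  {doc}")
```

Semantically related documents land near one another, which is the property the thesis exploits for browsing and retrieval.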
Abstract:
Prescription analyses and questionnaires were used to study how general medical practitioners used drug information during the drug adoption process. Three new drugs were considered: an innovation and two 'me-too' products. The innovation was accepted by general practitioners via a contagion process, information passing among doctors. The 'me-too' preparations were accepted more slowly and by a process which did not include the contagion effect. 'Industrial' information such as direct mail was used more at the 'awareness' stage of the adoption process, while 'professional' sources of information such as articles in medical journals were used more to evaluate a new product. It was shown that 'industrial' information was preferred by older single-practice doctors who did not specialise, had a first degree only and did not dispense their own prescriptions. Doctors were divided into early- and late-prescribers by the date they first prescribed the innovatory drug. Their approach to drug information sources was studied further, and it was shown that early-prescribers issued slightly more prescriptions per month, had a larger list size, read fewer journals and generally rated industrial sources of information more highly than late-prescribers. The prescribing habits of three consultant rheumatologists were analysed and compared with those of the general practitioners in the community they served. Very little association was noted, and the influence of the consultants on the prescribing habits of general practitioners was concluded to be low. The consultants' influence was suggested to have two components, active and passive, the active component being the more influential. Journal advertising and advertisement placement were studied for one of the 'me-too' drugs. It was concluded that advertisement placement should be based on the reading patterns of general practitioners and not on the ad-hoc data gathered by representatives, as was the practice at the time. A model was proposed relating the 'time to prescribe' a new drug to the variables suggested throughout this work. Four of these variables were shown to be significant: the list size, the medical age of the prescriber, the number of new preparations prescribed in a given time and the number of partners in the practice.
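As a hedged sketch of the final model (hypothetical variable names and simulated data, not the study's), 'time to prescribe' can be regressed on the four variables reported as significant:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200  # hypothetical panel of general practitioners

# The four variables the study reports as significant (simulated here).
list_size = rng.normal(2500, 600, n)     # patients on the doctor's list
medical_age = rng.normal(20, 8, n)       # years since qualification
new_preps = rng.poisson(5, n)            # new preparations prescribed
n_partners = rng.integers(1, 7, n)       # partners in the practice

# Simulated response: days until the new drug is first prescribed.
time_to_prescribe = (300 - 0.03 * list_size + 3.0 * medical_age
                     - 8.0 * new_preps - 5.0 * n_partners
                     + rng.normal(0, 30, n))

X = sm.add_constant(np.column_stack(
    [list_size, medical_age, new_preps, n_partners]))
fit = sm.OLS(time_to_prescribe, X).fit()
print(fit.summary(xname=["const", "list_size", "medical_age",
                         "new_preps", "n_partners"]))
```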
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
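To make the extended metrics concrete, here is a rough sketch, not the thesis' actual counting rules, of computing lines of code and Halstead-style measures for a Prolog source string:

```python
import math
import re

prolog_src = """
ancestor(X, Y) :- parent(X, Y).
ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
"""

# Lines of code: non-blank lines.
loc = sum(1 for line in prolog_src.splitlines() if line.strip())

# Crude Halstead-style counts for Prolog: treat :- , . ( ) as operators
# and atoms/variables as operands (a real study needs proper parsing).
operators = re.findall(r":-|[,.()]", prolog_src)
operands = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", prolog_src)

n1, n2 = len(set(operators)), len(set(operands))  # distinct counts
N1, N2 = len(operators), len(operands)            # total counts

vocabulary = n1 + n2                  # Halstead vocabulary n
length = N1 + N2                      # Halstead length N
volume = length * math.log2(vocabulary)  # Halstead volume V = N log2(n)

print(f"LOC={loc}  vocabulary={vocabulary}  length={length}  "
      f"volume={volume:.1f}")
```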
Abstract:
Background: Currently, no review has been completed regarding the information-gathering process for the provision of medicines for self-medication in community pharmacies in developing countries. Objective: To review the rate of information gathering and the types of information gathered when patients present for self-medication requests. Methods: Six databases were searched for studies that described the rate of information gathering and/or the types of information gathered in the provision of medicines for self-medication in community pharmacies in developing countries. The types of information reported were classified as: signs and symptoms, patient identity, action taken, medications, medical history, and others. Results: Twenty-two studies met the inclusion criteria. Variations in the study populations, types of scenarios, research methods, and data reporting were observed. The reported rate of information gathering varied from 18% to 97%, depending on the research methods used. Information on signs and symptoms and patient identity was more frequently reported to be gathered compared with information on action taken, medications, and medical history. Conclusion: Evidence showed that the information-gathering process for the provision of medicines for self-medication via community pharmacies in developing countries is inconsistent. There is a need to determine the barriers to appropriate information-gathering practice as well as to develop strategies to implement effective information-gathering processes. It is also recommended that international and national pharmacy organizations, including pharmacy academics and pharmacy researchers, develop a consensus on the types of information that should be reported in the original studies. This will facilitate comparison across studies so that areas that need improvement can be identified.
Abstract:
Provision of information and behavioural instruction has been demonstrated to improve recovery after surgery. However, patients draw on a range of information sources, and it is important to establish which sources patients use and how this influences perceptions and behaviour as they progress along the surgical pathway. In this qualitative, exploratory and longitudinal study, the use of information and instruction was explored from the perspective of people undergoing inguinal hernia repair surgery. Seven participants undergoing inguinal hernia repair surgery were interviewed using semi-structured interviews 2 weeks before surgery and 2 weeks and 4 months post-surgery. Nineteen interviews were conducted in total. Topic guides covered sources of knowledge, reasons for help-seeking and opting for surgery, and factors influencing return to activity. Data were analysed thematically according to Interpretative Phenomenological Analysis. Participants sought information from a range of sources, focusing on informal information sources before surgery and using information and instruction from health-care professionals post-surgery. This information influenced behaviours including deciding to undergo surgery, use of pain medication and returning to usual activity. Anxiety and help-seeking resulted when unexpected post-surgical events occurred, such as extensive bruising. Findings were consistent with psychological and sociological theories. Overall, participants were positive about the information and instruction they received but expressed a desire for more timely information on post-operative adverse events.
Abstract:
Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal 'acceptable use policy' (AUP). Whilst the AUP has attracted some academic interest, the literature has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies, taken from the higher education sector, rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours; and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.
Abstract:
Ensuring the security of corporate information, which is increasingly stored, processed and disseminated using information and communications technologies (ICTs), has become an extremely complex and challenging activity. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of security breaches, and in so doing protecting corporate information, is the formulation and application of a formal information security policy (InSPy). Whilst a great deal has now been written about the importance and role of the information security policy, and about approaches to its formulation and dissemination, there is relatively little empirical material that explicitly addresses the structure or content of security policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and content of authentic information security policies, rather than simply making general prescriptions about what they ought to contain. Having established the structure and key features of the reviewed policies, the paper critically explores the underlying conceptualisation of information security embedded in the policies. There are two important conclusions to be drawn from this study: (1) the wide diversity of disparate policies and standards in use is unlikely to foster a coherent approach to security management; and (2) the range of specific issues explicitly covered in university policies is surprisingly low, and reflects a highly techno-centric view of information security management.
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, of a novel user interaction framework, uInteract, for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings not only show how effective and easy to use the uInteract framework is, but also illustrate the value of Information Foraging Theory for interpreting user interaction with CBIR.
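The four-factor model itself is not given in the abstract; as a hedged illustration of the interactive loop at the heart of CBIR systems of this kind, here is a generic Rocchio-style relevance-feedback sketch over image feature vectors (not uInteract's actual algorithm):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical image collection: one feature vector per image
# (e.g. colour/texture descriptors in a real CBIR system).
features = rng.normal(size=(500, 64))
features /= np.linalg.norm(features, axis=1, keepdims=True)

def search(query, k=10):
    """Rank images by cosine similarity to the query vector."""
    scores = features @ query / np.linalg.norm(query)
    return np.argsort(scores)[::-1][:k]

def refine(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.25):
    """Rocchio update: move the query towards images the user marked
    relevant and away from those marked non-relevant."""
    q = alpha * query
    if len(relevant):
        q += beta * features[relevant].mean(axis=0)
    if len(nonrelevant):
        q -= gamma * features[nonrelevant].mean(axis=0)
    return q

query = features[0] + 0.1 * rng.normal(size=64)
top = search(query)
print("initial top-10:", top)

# Suppose the user marks the first three results relevant, the rest not.
query = refine(query, relevant=top[:3], nonrelevant=top[3:])
print("refined top-10:", search(query))
```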