93 results for Human Information Processing
in Aston University Research Archive
Abstract:
This paper complements earlier work by the author showing that the pattern of information arrivals into the UK stock market may explain the behaviour of returns. It is argued that delays or other systematic behaviour in the processing of this information could compound the impact of information arrival patterns. It is found, however, that this does not happen, and so it is the arrival, not the processing, of news that matters most. © 2004 Taylor & Francis Ltd.
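To make the paper's central test concrete, the sketch below regresses absolute daily returns on daily news-arrival counts; a positive slope is the kind of evidence that arrival patterns help explain return behaviour. The data and model here are synthetic stand-ins, not the paper's actual dataset or specification.

```python
# Illustrative only: regress absolute daily returns on daily news-arrival
# counts to test whether arrival patterns track return behaviour.
# The data below are synthetic; the paper's dataset and model differ.
import numpy as np

rng = np.random.default_rng(0)
n_days = 250
news_counts = rng.poisson(lam=5, size=n_days)                 # arrivals per day
returns = 0.002 * news_counts * rng.standard_normal(n_days)   # synthetic returns

# OLS of |return| on arrival count: a positive slope suggests arrivals matter.
X = np.column_stack([np.ones(n_days), news_counts])
beta, *_ = np.linalg.lstsq(X, np.abs(returns), rcond=None)
print(f"intercept={beta[0]:.5f}, slope on arrivals={beta[1]:.5f}")
```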
Abstract:
With the extensive use of pulse modulation methods in telecommunications, much work has been done in the search for a better utilisation of the transmission channel. The present research is an extension of these investigations. A new modulation method, 'Variable Time-Scale Information Processing' (VTSIP), is proposed. The basic principles of this system have been established, and the main advantages and disadvantages investigated. With the proposed system, comparison circuits detect the instants at which the input signal voltage crosses predetermined amplitude levels. The time intervals between these occurrences are measured digitally and the results are temporarily stored, before being transmitted. After reception, an inverse process enables the original signal to be reconstituted. The advantage of this system is that the irregularities in the rate of information contained in the input signal are smoothed out before transmission, allowing the use of a smaller transmission bandwidth. A disadvantage of the system is the time delay necessarily introduced by the storage process. Another disadvantage is a type of distortion caused by the finite store capacity. A simulation of the system has been made using a standard speech signal, to make some assessment of this distortion. It is concluded that the new system should be an improvement on existing pulse transmission systems, allowing the use of a smaller transmission bandwidth, but introducing a time delay.
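The level-crossing encoding at the heart of VTSIP can be sketched in a few lines. The code below is a minimal illustration under my own assumptions (the function names, level set, and linear-interpolation reconstruction are mine, not the thesis's): crossings of fixed amplitude levels are detected and timestamped, and the signal is reconstituted by interpolating between the stored (time, level) events.

```python
# Minimal sketch of level-crossing encoding/decoding (illustrative, not VTSIP
# as specified in the thesis): store (time, level) pairs at each crossing of
# a fixed amplitude level, then rebuild the waveform by interpolation.
import numpy as np

def encode_crossings(signal, t, levels):
    """Return sorted (time, level) pairs where the signal crosses each level."""
    events = []
    for lo, hi, t0, t1 in zip(signal[:-1], signal[1:], t[:-1], t[1:]):
        for lev in levels:
            if (lo - lev) * (hi - lev) < 0:       # sign change => crossing
                frac = (lev - lo) / (hi - lo)     # linear interpolation in time
                events.append((t0 + frac * (t1 - t0), lev))
    return sorted(events)

def decode_crossings(events, t):
    """Reconstruct the signal by interpolating between crossing events."""
    times, levs = zip(*events)
    return np.interp(t, times, levs)

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 5 * t)
events = encode_crossings(x, t, levels=np.linspace(-0.9, 0.9, 13))
x_hat = decode_crossings(events, t)
print(f"{len(events)} events, RMS error {np.sqrt(np.mean((x - x_hat)**2)):.3f}")
```

The smoothing-out of information-rate irregularities described in the abstract would correspond to buffering these events and clocking them out at a steady rate, at the cost of the stated delay and finite-store distortion.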
Abstract:
This thesis examines children's consumer choice behaviour using an information processing perspective, with the fundamental goal of applying academic research to practical marketing and commercial problems. Following a preface, which describes the academic and commercial terms of reference within which this interdisciplinary study is couched, the thesis comprises four discernible parts. Initially, the rationale inherent in adopting an information processing perspective is justified and the diverse array of topics which have a bearing on children's consumer processing and behaviour are aggregated. The second part uses this perspective as a springboard to appraise the little-explored role of memory, and especially memory structure, as a central cognitive component in children's consumer choice processing. The main research theme explores the ease with which 10- and 11-year-olds retrieve contemporary consumer information from subjectively defined memory organisations. Adopting a sort-recall paradigm, hierarchical retrieval processing is stimulated, and it is contended that when two items known to be stored proximally in the memory organisation are not recalled adjacently, this discrepancy is indicative of the ease of retrieval processing. Results illustrate the marked influence of task conditions and of the orientation of memory structure on retrieval; these conclusions are accounted for in terms of input and integration failure. The third section develops the foregoing interpretations in the marketing context. A straightforward methodology for structuring marketing situations is postulated, a basis for segmenting children's markets using processing characteristics is adopted, and criteria for communicating brand support information to children are discussed. A taxonomy of market-induced processing conditions is developed. Finally, a case study with topical commercial significance is described. The development, launch and marketing of a new product in the confectionery market are outlined, the aetiology of its subsequent demise is identified and expounded, and prescriptive guidelines are put forward to help avert future repetition of such marketing misjudgements.
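The sort-recall adjacency measure can be made concrete with a small scoring function. The sketch below is hypothetical (the item names and the exact scoring rule are mine, not the thesis's): given the pairs a child grouped together at sorting, it computes the proportion of those pairs recalled in adjacent output positions.

```python
# Hypothetical scoring sketch for the sort-recall paradigm: pairs grouped
# together at sorting are checked for adjacency in the recall sequence.
def adjacency_score(sorted_pairs, recall_order):
    position = {item: i for i, item in enumerate(recall_order)}
    recalled = [p for p in sorted_pairs
                if p[0] in position and p[1] in position]
    if not recalled:
        return 0.0
    adjacent = sum(abs(position[a] - position[b]) == 1 for a, b in recalled)
    return adjacent / len(recalled)

pairs = [("crisps", "chocolate"), ("cola", "lemonade")]
recall = ["crisps", "chocolate", "toys", "cola", "comics", "lemonade"]
print(adjacency_score(pairs, recall))   # 0.5: one pair adjacent, one split
```

A low score on proximally stored pairs is the discrepancy the thesis treats as diagnostic of retrieval processing.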
Abstract:
For over 30 years, information-processing approaches to leadership, and more specifically research on Implicit Leadership Theories (ILTs), have contributed a significant body of knowledge on leadership processes in applied settings. A new line of research on Implicit Followership Theories (IFTs) has re-ignited interest in information-processing and socio-cognitive approaches to leadership and followership. In this review, we focus on organizational research on ILTs and IFTs and highlight their practical utility for the exercise of leadership and followership in applied settings. We clarify common misperceptions regarding the implicit nature of ILTs and IFTs, review both direct and indirect measures, synthesize current and ongoing research on ILTs and IFTs in organizational settings, address issues related to different levels of analysis in the context of leadership and follower schemas and, finally, propose future avenues for organizational research. © 2013 Elsevier Inc.
Abstract:
Three studies tested the impact of properties of behavioral intention on intention-behavior consistency, information processing, and resistance. Principal components analysis showed that properties of intention formed distinct factors. Study 1 demonstrated that temporal stability, but not the other intention attributes, moderated intention-behavior consistency. Study 2 found that greater stability of intention was associated with improved memory performance. In Study 3, participants were confronted with a rating scale manipulation designed to alter their intention scores. Findings showed that stable intentions were able to withstand attack. Overall, the present research findings suggest that different properties of intention are not simply manifestations of a single underlying construct ("intention strength"), and that temporal stability exhibits superior resistance and impact compared to other intention attributes. © 2013 Wiley Periodicals, Inc.
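The factor-structure claim lends itself to a short illustration. The sketch below runs a principal components analysis on synthetic intention-attribute ratings; when the attributes are genuinely distinct, more than one component carries substantial variance, which is the pattern the studies report. Variable names and data are illustrative only, not the studies' measures.

```python
# Illustrative PCA sketch on synthetic ratings: if intention properties were
# one "intention strength" construct, a single component would dominate;
# distinct factors imply several components carry meaningful variance.
import numpy as np

rng = np.random.default_rng(1)
n = 200
stability = rng.standard_normal(n)
certainty = rng.standard_normal(n)            # independent by construction
ratings = np.column_stack([
    stability, stability + 0.1 * rng.standard_normal(n),
    certainty, certainty + 0.1 * rng.standard_normal(n),
])

centred = ratings - ratings.mean(axis=0)
eigvals = np.linalg.svd(centred, compute_uv=False) ** 2 / (n - 1)
print("variance explained:", np.round(eigvals / eigvals.sum(), 2))
# Two large eigenvalues => two distinct factors, not one construct.
```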
Abstract:
Modern managers are under tremendous pressure in attempting to fulfil a profoundly complex managerial task: that of handling information resources. Information management, an intricate process requiring a high measure of human cognition and discernment, involves matching a manager's limited information-processing capacity against his information needs, given the voluminous information at his disposal. The task inevitably becomes more complex in a large organisation, and the management of large-scale organisations is therefore an exceedingly challenging prospect for any manager. A system that supports executive information needs will help reduce managerial and informational mismatches. In the context of the Malaysian public sector, the task of overall management lies with the Prime Minister and the Cabinet. The Prime Minister's Office presently supports the Prime Minister's information and managerial needs, although not without various shortcomings. The rigid, formalised structure predominant in the Malaysian public sector, ill-suited to the dynamic treatment of the problematic issues that sector faces, further escalates the managerial and organisational problem of coping with complexity. The principal features of the research are twofold: the development of a methodology for diagnosing the 'problem organisation', and the design of an office system. The methodological development is done in the context of the Malaysian public sector, and aims at understanding the complexity of its communication and control situation; the outcome is a viable model of the public sector. 'Design', on the other hand, means developing a syntax or language for office systems which provides an alternative to current views on office systems. The design is done with reference to, rather than for, the Prime Minister's Office. The desired outcome is an office model called the Office Communication and Information System (OCIS).
Abstract:
Research on diversity in teams and organizations has revealed ambiguous results regarding the effects of group composition on workgroup performance. The categorization-elaboration model (van Knippenberg et al., 2004) accounts for this variety and proposes two different underlying processes. On the one hand, diversity may bring about intergroup bias, which leads to less group identification, which in turn is followed by more conflict and decreased workgroup performance. On the other hand, the information-processing approach proposes positive effects of diversity because of the more elaborate processing of information brought about by the wider pool and variety of perspectives in more diverse groups. We propose that the former process is contingent on individual team members' beliefs that diversity is good or bad for achieving the team's aims. We predict that the relationship between subjective diversity and identification is more positive in ethnically diverse project teams when group members hold pro-diversity beliefs. Results of two longitudinal studies involving postgraduate students working in project teams confirm this hypothesis. Analyses further reveal that group identification is positively related to students' desire to stay in their groups and to their information elaboration. Finally, we found evidence for the expected moderated mediation model, with indirect effects of subjective diversity on elaboration and the desire to stay, mediated through group identification and moderated by diversity beliefs.
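The moderated mediation model can be illustrated with a toy regression-based sketch. Everything below (data, effect sizes, variable names) is synthetic and mine; it shows only the logic of an indirect effect of subjective diversity on elaboration through identification, whose strength varies with diversity beliefs.

```python
# Moderated mediation on synthetic data (not the studies' data): subjective
# diversity (X) and diversity beliefs (W) predict group identification (M)
# via an X*W interaction; M predicts elaboration (Y). The indirect effect
# of X on Y through M then depends on the level of W.
import numpy as np

rng = np.random.default_rng(2)
n = 300
X = rng.standard_normal(n)                            # subjective diversity
W = rng.standard_normal(n)                            # pro-diversity beliefs
M = 0.1 * X + 0.4 * X * W + rng.standard_normal(n)    # group identification
Y = 0.5 * M + rng.standard_normal(n)                  # information elaboration

def ols(y, *cols):
    A = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(A, y, rcond=None)[0]

a0, a1, a2, a3 = ols(M, X, W, X * W)                  # mediator model
b0, b1 = ols(Y, M)                                    # outcome model
for w in (-1.0, 0.0, 1.0):                            # conditional indirect effects
    print(f"beliefs={w:+.0f}: indirect effect = {(a1 + a3 * w) * b1:.2f}")
```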
Abstract:
We propose a novel all-optical signal processor for use at a return-to-zero receiver, utilising loop mirror intensity filtering and nonlinear pulse broadening in normal dispersion fibre. The device offers re-amplification and clean-up of the optical signals, together with phase margin improvement. The efficiency of the technique is demonstrated by application to 40 Gbit/s data transmission.
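The pulse-broadening stage can be sketched numerically with a split-step Fourier integration of the nonlinear Schrödinger equation in normal-dispersion fibre. The parameters, grid, and sign conventions below are illustrative choices of mine, not the paper's.

```python
# Split-step Fourier sketch of nonlinear pulse broadening in normal-dispersion
# fibre (illustrative parameters): SPM plus beta2 > 0 broadens and flattens
# the pulse ahead of the loop-mirror intensity filter.
import numpy as np

n, T = 1024, 100e-12                          # grid points, time window (s)
t = np.linspace(-T / 2, T / 2, n, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(n, T / n)      # angular frequency grid

beta2 = 50e-27                                # s^2/m, normal dispersion
gamma = 2e-3                                  # 1/(W*m), Kerr nonlinearity
dz, steps = 10.0, 100                         # 1 km of fibre in 10 m steps

A = np.sqrt(0.5) / np.cosh(t / 5e-12)         # 0.5 W peak sech pulse, T0 = 5 ps
half_D = np.exp(0.5j * beta2 * w**2 * dz / 2) # half dispersion step operator
for _ in range(steps):
    A = np.fft.ifft(half_D * np.fft.fft(A))   # dispersion, half step
    A *= np.exp(1j * gamma * np.abs(A)**2 * dz)  # nonlinearity, full step
    A = np.fft.ifft(half_D * np.fft.fft(A))   # dispersion, half step

width = np.sqrt(np.sum(t**2 * np.abs(A)**2) / np.sum(np.abs(A)**2))
print(f"RMS width after 1 km: {width * 1e12:.1f} ps")
```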
Abstract:
Safety enforcement practitioners within Europe, and marketers, designers or manufacturers of consumer products, need to determine compliance with the legal test of "reasonable safety" for consumer goods, so as to reduce the "risks" of injury to a minimum. To enable freedom of movement of products, a method for safety appraisal is required for use as an "expert" system of hazard analysis by non-experts in the safety testing of consumer goods, for consistent implementation throughout Europe. Safety testing approaches and the concepts of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contributions of risk assessment, hazard perception and information processing. The model develops a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through a series of systematic checklists and matrices, and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select the participating Trading Standards Departments, to explore the contribution of potential subjective influences, and to establish views regarding the usability and reliability of the model and any preferences for the risk assessment scoring system used. The outcome of the two-stage hazard analysis and risk assessment process is examined to determine the consistency of the hazard analysis results and of the final decisions regarding the safety of the sample products, and to establish any correlation between the decisions made using the model and those made with the alternative scoring methods of risk assessment. The research also identifies a number of opportunities for future work, and indicates a number of areas where further work has already begun.
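The numerical scoring idea can be illustrated with a toy risk matrix. The severity and likelihood scales, the max-based aggregation, and the example hazards below are all hypothetical; the thesis's checklists and matrices are considerably richer.

```python
# Illustrative (not the thesis's) numerical scoring scheme of the general
# kind the model uses: each identified hazard receives a severity and a
# likelihood rating, and a matrix-style product yields the risk score.
SEVERITY = {"minor": 1, "moderate": 2, "serious": 3, "critical": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_score(hazards):
    """Overall score: the worst hazard dominates, as in a max-based matrix."""
    return max(SEVERITY[s] * LIKELIHOOD[l] for s, l in hazards)

kettle_hazards = [("serious", "possible"),    # scald from spilled water
                  ("critical", "rare"),       # electric shock
                  ("minor", "likely")]        # finger pinch in lid
print(risk_score(kettle_hazards))             # 6 -> compare with a pass/fail threshold
```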
Abstract:
Improving bit error rates in optical communication systems is a difficult and important problem. The error correction must take place at high speed and be extremely accurate. We show the feasibility of using hardware-implementable machine learning techniques. This may enable some error correction at the speed required.
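A minimal sketch of the hardware-friendly idea: train a perceptron whose decision rule, one weighted sum plus a comparator, maps a few received samples per bit slot to a bit decision, improving on naive single-sample thresholding. The channel model and all parameters below are synthetic stand-ins, not the paper's setup.

```python
# Perceptron bit decision on synthetic samples: a weighted sum and a
# threshold is trivially implementable in hardware as one
# multiply-accumulate stage feeding a comparator.
import numpy as np

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, 5000)
# Three samples per bit slot: a centre sample with inter-symbol
# interference leaking from the neighbouring bits, plus noise.
centre = bits + 0.3 * np.roll(bits, 1) + 0.3 * np.roll(bits, -1)
X = np.column_stack([np.roll(centre, 1), centre, np.roll(centre, -1)])
X += 0.4 * rng.standard_normal(X.shape)

w, b = np.zeros(3), 0.0
for _ in range(20):                            # perceptron training epochs
    for x, y in zip(X, bits):
        pred = int(w @ x + b > 0)
        w += 0.01 * (y - pred) * x
        b += 0.01 * (y - pred)

ber = np.mean((X @ w + b > 0).astype(int) != bits)
print(f"bit error rate after training: {ber:.3f}")
```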