34 results for Social Information Processing Theory


Relevance: 100.00%

Abstract:

Huge advertising budgets are invested by firms to reach and convince potential consumers to buy their products. To optimize these investments, it is fundamental not only to ensure that the appropriate consumers will be reached, but also that they will be in appropriate reception conditions. Marketing research has focused on the way consumers react to advertising, as well as on individual and contextual factors that could mediate or moderate the impact of ads on consumers (e.g. motivation and ability to process information, or attitudes toward advertising). Nevertheless, one factor that potentially influences consumers' reactions to advertising has not yet been studied in marketing research: fatigue. Yet fatigue can affect key variables of advertising processing, such as the availability of cognitive resources (Lieury 2004). Fatigue is felt when the body signals that an activity (or inactivity) should be interrupted for rest, allowing the individual to compensate for its effects. Dittner et al. (2004) define it as "the state of weariness following a period of exertion, mental or physical, characterized by a decreased capacity for work and reduced efficiency to respond to stimuli." It signals that resources will run short if the ongoing activity continues. According to Schmidtke (1969), fatigue leads to difficulties in information reception, perception, coordination, attention, concentration and thinking. In addition, for Markle (1984), fatigue reduces memory and communication ability, while increasing reaction time and the number of errors. Thus, fatigue may have large effects on advertising processing. We suggest that fatigue determines the level of available resources. Research on consumer responses to advertising claims that complexity is a fundamental element to take into consideration: complexity determines the cognitive effort the consumer must expend to understand the message (Putrevu et al. 2004). We therefore suggest that complexity determines the level of required resources. To study this question of the demand for and supply of cognitive resources, we draw upon Resource Matching Theory. Anand and Sternthal (1989, 1990) were the first to state the resource matching principle: an ad is most persuasive when the resources required to process it match the resources the viewer is willing and able to provide. They show that when the required resources exceed those available, the message is not fully processed by the consumer, and when the available resources far exceed those required, the viewer elaborates critical or unrelated thoughts. According to Resource Matching Theory, the level of resources demanded by an ad can be high or low and is mostly determined by the ad's layout (Peracchio and Meyers-Levy, 1997). We manipulate the level of required resources using three levels of ad complexity (low, high, extremely high). On the other hand, the resource availability of an ad viewer is determined by many contextual and individual variables; we manipulate the level of available resources using two levels of fatigue (low, high). Tired viewers want to limit processing effort to minimal resource requirements by relying on heuristics and forming an overall impression at first glance. It will be easier for them to decode the message when ads are very simple. In contrast, the most effective ads for viewers who are not tired are complex enough to draw their attention and fully use their resources.
They will use more analytical strategies, looking at the details of the ad. However, if ads are too complex, they will be too difficult to understand; the viewer will be discouraged from processing the information and will overlook the ad. The objective of our research is to study fatigue as a moderating variable of advertising information processing. We run two experimental studies to assess the effect of fatigue on visual strategies, comprehension, persuasion and memorization. In study 1, thirty-five undergraduate students enrolled in a marketing research course participated in the experiment. The experimental design is 2 (tiredness level: between subjects) x 3 (ad complexity level: within subjects). Participants were randomly assigned a time slot (morning: 8-10 am or evening: 10-12 pm) for the experiment. We chose to test subjects at different times of day to obtain maximum variance in their fatigue levels. We use participants' morningness/eveningness tendency (Horne & Ostberg, 1976) as a control variable. We assess fatigue level using subjective measures (questionnaires with fatigue scales) and objective measures (reaction time and number of errors). Regarding complexity levels, we designed our own ads in order to keep aspects other than complexity equal. We ran a pretest using the Resource Demands scale (Keller and Bloch 1997) and by rating the ads on complexity following Morrison and Dainoff (1972) to check our complexity manipulation; three significantly different levels were found. After completing the fatigue scales, participants are asked to view the ads on a screen while their eye movements are recorded by an eye-tracker. Eye tracking allows us to identify patterns of visual attention (Pieters and Warlop 1999), so we can infer respondents' visual strategies according to their level of fatigue. Comprehension is assessed with a comprehension test; we collect measures of attitude change for persuasion, and measures of recall and recognition at various points in time for memorization. Once the effect of fatigue has been determined across the student population, it is interesting to account for individual differences in fatigue severity and perception. We therefore run study 2, which is similar to the first except for the design: time of day is now within-subjects and complexity becomes between-subjects.
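
The resource matching prediction at the heart of the abstract (persuasion peaks when the resources an ad demands match the resources the viewer can supply) can be illustrated with a toy model; the inverted-U function and the numeric scales below are invented for illustration and are not the authors' operationalisation of complexity or fatigue.

```python
# Toy illustration of the Resource Matching prediction outlined above: persuasion is
# modelled as peaking when the resources an ad demands match what the viewer can supply.
# The inverted-U function and the numeric scales are invented for illustration only.
def predicted_persuasion(required, available):
    """Simple inverted-U response to the demand/supply mismatch."""
    return max(0.0, 1.0 - abs(required - available))

complexity = {"low": 0.3, "high": 0.6, "extremely high": 0.9}   # resources the ad demands
viewer = {"tired": 0.35, "rested": 0.7}                          # resources the viewer supplies

for state, supply in viewer.items():
    ranking = sorted(complexity, key=lambda c: -predicted_persuasion(complexity[c], supply))
    print(state, "->", ranking)
# tired  -> ['low', 'high', 'extremely high']   (simple ads predicted to work best)
# rested -> ['high', 'extremely high', 'low']   (moderately complex ads predicted to work best)
```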

Relevance: 100.00%

Abstract:

We employ the methods presented in the previous chapter to decode corrupted codewords encoded using sparse parity-check error-correcting codes. We show the similarity between the equations derived from the TAP approach and those obtained from belief propagation, and examine their performance as practical decoding methods.
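
The sum-product (belief propagation) side of the comparison can be sketched generically. The following minimal decoder works in the log-likelihood-ratio domain on a toy parity-check matrix over a binary symmetric channel; the Hamming-code example and channel parameters are assumptions for illustration, not the sparse codes studied in the chapter.

```python
import numpy as np

def bp_decode(H, received, p_flip, max_iter=50):
    """Sum-product (belief propagation) decoding of a binary linear code over a
    binary symmetric channel, using log-likelihood-ratio (LLR) messages."""
    m, n = H.shape
    # Channel LLRs: log P(bit=0)/P(bit=1) given the received hard bits.
    llr_ch = np.log((1 - p_flip) / p_flip) * (1 - 2 * received.astype(float))
    msg_vc = H * llr_ch             # variable-to-check messages, initialised to channel LLRs
    msg_cv = np.zeros_like(msg_vc)  # check-to-variable messages
    for _ in range(max_iter):
        # Check-node update: tanh rule over all neighbours except the target variable.
        t = np.tanh(msg_vc / 2.0)
        for c in range(m):
            idx = np.where(H[c] == 1)[0]
            for v in idx:
                prod = np.prod(t[c, idx[idx != v]])
                msg_cv[c, v] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # Posterior LLRs and tentative hard decision.
        llr_post = llr_ch + msg_cv.sum(axis=0)
        x_hat = (llr_post < 0).astype(int)
        if not np.any(H @ x_hat % 2):           # all parity checks satisfied
            return x_hat, True
        # Variable-node update: total belief minus the incoming message being replied to.
        for c in range(m):
            for v in np.where(H[c] == 1)[0]:
                msg_vc[c, v] = llr_post[v] - msg_cv[c, v]
    return x_hat, False

# Toy example: parity-check matrix of the (7,4) Hamming code, one flipped bit.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
codeword = np.zeros(7, dtype=int)               # the all-zero codeword is always valid
received = codeword.copy(); received[2] ^= 1    # channel flips bit 2
decoded, ok = bp_decode(H, received, p_flip=0.1)
print(decoded, ok)                              # expected: all zeros, True
```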

Relevance: 100.00%

Abstract:

The Thouless-Anderson-Palmer (TAP) approach was originally developed for analysing the Sherrington-Kirkpatrick spin-glass model and has since been employed mainly in the context of extensively connected systems, in which each dynamical variable interacts weakly with the others. Recently, we extended this method to handle general intensively connected systems, where each variable has only O(1) connections characterised by strong couplings. However, the new formulation looks quite different from existing analyses, and it is only natural to ask whether it actually reproduces known results for systems of extensive connectivity. In this chapter, we apply our formulation of the TAP approach to an extensively connected system, the Hopfield associative memory model, showing that it produces results identical to those obtained by the conventional formulation.
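
For readers unfamiliar with the model, the Hopfield network referred to here stores patterns in a Hebbian coupling matrix and retrieves them by repeatedly aligning each spin with its local field. A minimal sketch of that storage and zero-temperature retrieval (not of the TAP analysis itself) follows; the pattern count and system size are arbitrary.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: W_ij is proportional to the sum over patterns of xi_i * xi_j."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)   # no self-coupling
    return W

def recall(W, state, sweeps=10):
    """Zero-temperature asynchronous dynamics: flip each spin to align with its local field."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))     # 3 random +/-1 patterns, N = 100 spins
W = train_hopfield(patterns.astype(float))
probe = patterns[0].copy()
probe[rng.choice(100, size=10, replace=False)] *= -1   # corrupt 10% of the spins
retrieved = recall(W, probe)
print(np.mean(retrieved == patterns[0]))          # overlap with the stored pattern, typically 1.0
```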

Relevance: 100.00%

Abstract:

The nature of Discrete-Event Simulation (DES) and the use of DES in organisations are changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented which shows that, despite these developments, usage of DES remains relatively low owing to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to greater realisation and appreciation of the full benefits of DES and thus to greater usage. Firstly, in relation to using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered as an information-processing tool within the organisation. This allows outcomes to be considered under the three modes of organisational information use (sense making, knowledge creating and decision making), which relate to the theoretical areas of knowledge management, organisational learning and decision making respectively. The association of DES with these popular techniques could further increase its usage in business.
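
As a generic illustration of the mechanism underlying DES, an event list processed in time order, a single-server queue can be simulated in a few lines; the queue model and rates below are arbitrary and are not drawn from the survey.

```python
import heapq, random

def mm1_simulation(arrival_rate=0.8, service_rate=1.0, horizon=10_000):
    """Minimal discrete-event simulation of a single-server (M/M/1) queue.
    Events are (time, kind) pairs kept in a priority queue ordered by time."""
    events = [(random.expovariate(arrival_rate), "arrival")]
    queue, busy, waits = [], False, []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if busy:
                queue.append(t)                    # wait for the server
            else:
                busy = True
                waits.append(0.0)                  # served immediately
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
        else:  # departure
            if queue:
                waits.append(t - queue.pop(0))     # FIFO: next customer starts service
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                busy = False
    return sum(waits) / len(waits)

print(f"mean wait: {mm1_simulation():.2f}")  # M/M/1 theory: rho/(mu - lambda) = 4.0 here
```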

Relevance: 100.00%

Abstract:

Different forms of strategic flexibility allow for reactive adaptation to different changing environments and for the proactive driving of change. It is therefore becoming increasingly important for decision makers to possess not only marketing capabilities but also the capabilities for strategic flexibility in its various forms. However, our knowledge of the relationships between decision makers' different ways of thinking and their capabilities for strategic flexibility is limited, and this limitation is constraining research and understanding. In this article we develop a theoretical cognitive content framework that postulates relationships between different ways of thinking about strategy and different information-processing demands. We then outline how the contrasting beliefs of decision makers may influence their capabilities to generate different hybrid forms of strategic flexibility at the cognitive level. Theoretically, the framework is embedded in resource-based theory, personal construct theory and schema theory. The implications for research and theory are discussed.

Relevance: 100.00%

Abstract:

We propose a novel all-optical signal processor for use at a return-to-zero receiver, utilising loop mirror intensity filtering and nonlinear pulse broadening in normal dispersion fibre. The device offers reamplification and clean-up of the optical signals, and improves the phase margin. The efficiency of the technique is demonstrated by application to 40 Gbit/s data transmission.
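
The nonlinear pulse broadening exploited by the device can be illustrated with a standard split-step Fourier integration of the nonlinear Schrödinger equation in normally dispersive fibre. The sketch below is generic rather than a model of the proposed processor, and all fibre and pulse parameters are assumed values.

```python
import numpy as np

# Generic split-step Fourier integration of the nonlinear Schrodinger equation in
# normally dispersive fibre, illustrating combined dispersive/Kerr pulse broadening.
# All parameter values are assumptions chosen for illustration, not the 40 Gbit/s system.
beta2 = 20e-27           # s^2/m (normal dispersion, beta2 > 0; ~20 ps^2/km)
gamma = 2e-3             # 1/(W m), Kerr coefficient
length, n_steps = 1.0e3, 1000
dz = length / n_steps

n, dt = 2**13, 25e-15    # 8192-point time grid, 25 fs spacing
t = (np.arange(n) - n // 2) * dt
omega = 2 * np.pi * np.fft.fftfreq(n, dt)

t0, peak_power = 2.5e-12, 1.0                        # 2.5 ps Gaussian pulse, 1 W peak
A0 = np.sqrt(peak_power) * np.exp(-t**2 / (2 * t0**2))

half_disp = np.exp(1j * (beta2 / 2) * omega**2 * (dz / 2))
A = A0.copy()
for _ in range(n_steps):                             # symmetric split step: D/2, N, D/2
    A = np.fft.ifft(half_disp * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)   # self-phase modulation
    A = np.fft.ifft(half_disp * np.fft.fft(A))

def rms_width(field):
    p = np.abs(field)**2
    mean = np.sum(t * p) / np.sum(p)
    return np.sqrt(np.sum((t - mean)**2 * p) / np.sum(p))

print(f"rms width: {rms_width(A0)*1e12:.2f} ps -> {rms_width(A)*1e12:.2f} ps")
```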

Relevance: 100.00%

Abstract:

Modern managers are under tremendous pressure in attempting to fulfil a profoundly complex managerial task: that of handling information resources. Information management, an intricate process requiring a high measure of human cognition and discernment, involves matching a manager's limited information-processing capacity against his information needs, with voluminous information at his disposal. The task inevitably becomes more complex in a large organisation, and the management of large-scale organisations is therefore an exceedingly challenging prospect for any manager. A system that supports executive information needs will help reduce managerial and informational mismatches. In the context of the Malaysian public sector, the task of overall management lies with the Prime Minister and the Cabinet. The Prime Minister's Office presently supports the Prime Minister's information and managerial needs, although not without various shortcomings. The rigid, formalised structure predominant in the Malaysian public sector, ill-suited to the dynamic treatment of the problematic issues that sector faces, further escalates the managerial and organisational problem of coping with complexity. The principal features of the research are twofold: the development of a methodology for diagnosing the 'problem organisation', and the design of an office system. The methodological development is carried out in the context of the Malaysian public sector and aims at understanding the complexity of its communication and control situation; the outcome is a viable model of the public sector. 'Design', on the other hand, involves developing a syntax, or language, for office systems that provides an alternative to current views on office systems. The design is done with reference to, rather than for, the Prime Minister's Office. The intended outcome is an office model called the Office Communication and Information System (OCIS).

Relevance: 100.00%

Abstract:

Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature. It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement.
Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
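
The decision-theory indices referred to throughout, sensitivity and response criterion, are straightforward to compute from hit and false-alarm counts under the equal-variance Gaussian model. The counts below are hypothetical and are chosen only to show the pattern reported in the thesis: detections and false alarms both fall over the watch period, which the model attributes to a stricter criterion rather than to reduced sensitivity.

```python
from scipy.stats import norm

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response criterion (c) from a monitoring session,
    using the standard equal-variance Gaussian signal detection model."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_f             # perceptual efficiency, independent of response bias
    criterion = -(z_h + z_f) / 2    # positive values = stricter (more conservative) responding
    return d_prime, criterion

# Hypothetical counts from early vs. late in a watch period: detections fall but so do
# false alarms, which the model reads as a stricter criterion, not lower sensitivity.
print(sdt_indices(hits=40, misses=10, false_alarms=10, correct_rejections=90))
print(sdt_indices(hits=30, misses=20, false_alarms=4, correct_rejections=96))
```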

Relevance: 100.00%

Abstract:

This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined suffered from severe constraints on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for the Thailand data: estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image processing work on Mozambique showed that satellite remote sensing was a useful tool in stratifying vegetation cover at provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.
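
The declarative, rule-based style of land evaluation described above can be suggested with a small sketch; the land-use types, land qualities and thresholds below are invented for illustration (the study itself used Prolog and expert-system shells rather than Python).

```python
# Hypothetical sketch of the kind of declarative suitability rule a land-evaluation
# knowledge base might hold. The rules and thresholds are invented for illustration,
# not taken from the study.
RULES = [
    ("rainfed rice", lambda lu: lu["rainfall_mm"] >= 1000 and lu["drainage"] == "poor"
                                and lu["slope_pct"] <= 2),
    ("upland maize", lambda lu: lu["rainfall_mm"] >= 600 and lu["drainage"] == "good"
                                and lu["slope_pct"] <= 8),
    ("grazing",      lambda lu: lu["slope_pct"] <= 30),
]

def evaluate(land_unit):
    """Return the land-use types whose requirements the mapping unit satisfies."""
    return [use for use, rule in RULES if rule(land_unit)]

unit = {"rainfall_mm": 1400, "drainage": "poor", "slope_pct": 1}
print(evaluate(unit))   # ['rainfed rice', 'grazing']
```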

Relevance: 100.00%

Abstract:

Improving bit error rates in optical communication systems is a difficult and important problem. The error correction must take place at high speed and be extremely accurate. We show the feasibility of using hardware-implementable machine learning techniques. This may enable some error correction at the speed required.
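
The abstract does not say which learning technique is used, so the following is only a hypothetical illustration of the general idea: a classifier whose run-time operation is a weighted sum and a threshold (and is therefore cheap to implement in hardware) is trained to decode a rate-1/3 repetition code from noisy soft values. The channel model, code and parameters are all invented.

```python
import numpy as np

# Purely hypothetical illustration: a logistic classifier (a weighted sum plus threshold
# at run time, hence hardware-friendly) learns to decode a rate-1/3 repetition code from
# noisy soft samples. Channel model, code and parameters are invented for illustration.
rng = np.random.default_rng(0)

def noisy_codewords(n_bits, sigma=0.6):
    bits = rng.integers(0, 2, n_bits)
    tx = np.repeat(bits, 3).reshape(n_bits, 3).astype(float)   # repetition-3 encoding
    return tx + rng.normal(0.0, sigma, (n_bits, 3)), bits       # soft received values

def train_logistic(X, y, epochs=200, lr=0.5):
    """Batch gradient descent on the logistic loss; inference is a thresholded weighted sum."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(bit = 1)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

X_train, y_train = noisy_codewords(20_000)
X_test, y_test = noisy_codewords(20_000)
w, b = train_logistic(X_train, y_train)

uncoded_ber = np.mean((X_test[:, 0] > 0.5).astype(int) != y_test)
decoded_ber = np.mean((X_test @ w + b > 0).astype(int) != y_test)
print(f"uncoded BER ~ {uncoded_ber:.3f}, learned decoder BER ~ {decoded_ber:.3f}")
```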

Relevance: 100.00%

Abstract:

Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, consisting of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption is true by proposing and testing two competing conceptualisations: the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and the congruence between subordinate-level ILTs (job-specific leader) and the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey in several organisations with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes the majority of limitations of traditional approaches. The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, with important implications, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.

Relevance: 100.00%

Abstract:

One of the key challenges that organizations face when trying to integrate knowledge across different functions is the need to overcome knowledge boundaries between team members. In cross-functional teams, these boundaries, associated with the different knowledge backgrounds of people from various disciplines, create communication problems, requiring team members to engage in complex cognitive processes when integrating knowledge toward a joint outcome. This research investigates the impact of syntactic, semantic, and pragmatic knowledge boundaries on a team's ability to develop a transactive memory system (TMS), a collective memory system for knowledge coordination in groups. Results from our survey show that syntactic and pragmatic knowledge boundaries negatively affect TMS development. These findings extend TMS theory beyond the information-processing view, which treats knowledge as an object that can be stored and retrieved, to the interpretive and practice-based views of knowledge, which recognize that knowledge (in particular specialized knowledge) is localized, situated, and embedded in practice.

Relevance: 100.00%

Abstract:

Addressing inconsistencies in relational demography research, we examine the relationship between cultural dissimilarity and individual performance through the lens of social self-regulation theory, which extends the social identity perspective in relational demography with the analysis of social self-regulation. We propose that social self-regulation in culturally diverse teams manifests itself as performance monitoring (i.e., individuals' actions to meet team performance standards and peer expectations). Contingent on the status associated with individuals' cultural background, performance monitoring is proposed to have a curvilinear relationship with individual performance and to mediate between cultural dissimilarity and performance. Multilevel moderated mediation analyses of time-lagged data from 316 members of 69 teams confirmed these hypotheses. Cultural dissimilarity had a negative relationship with performance monitoring for high cultural-status members, and a positive relationship for low cultural-status members. Performance monitoring had a curvilinear relationship with individual performance that became decreasingly positive. Cultural dissimilarity was thus increasingly negatively associated with performance for high cultural-status members, and decreasingly positively associated for low cultural-status members. These findings suggest that cultural dissimilarity to the team is not unconditionally negative for the individual but, in moderation, may in fact have positive motivational effects.

Relevance: 100.00%

Abstract:

This edited book is intended for use by students, academics and practitioners who take an interest in the outsourcing and offshoring of information technology and business services and processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for practitioners, academics and students. The range of topics covered is wide and diverse, and represents both client and supplier perspectives on the sourcing of global services. Various aspects of the decision-making process (e.g., asset transfer), learning mechanisms and organizational practices for managing outsourcing relationships are discussed in depth. Contemporary sourcing models, including cloud services, are examined. Client dependency on the outsourcing provider, and social aspects such as identity, are discussed in detail. Furthermore, resistance in outsourcing and outsourcing failures are investigated to derive lessons on how to avoid them and improve efficiency in outsourcing. The topics discussed combine theoretical and practical insights into the challenges that both clients and vendors face, and case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to the centrality of innovation in sourcing arrangements and how innovation can be realized in outsourcing. The book draws on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management and operations.
