284 results for Event marketing
Abstract:
In this study, the impact of message strategy on advertising performance will be examined in a business-to-business (B2B) context. From a theoretical standpoint, the study will explore differences in message type between symbolic and literal approaches in B2B advertisements. While there has been much discussion of the effect of symbolism (e.g., metaphors, abstract images and figurative language), an empirically tested scale that measures the degree of symbolism has not been developed. This research project focuses on the development of a methodological scale to accurately test differences in the direction of message appeals. Insights into the role of message strategy in the B2B adoption process are therefore anticipated, with contributions to future consumer and business advertising research.
Abstract:
If the current discourses of progress are to be believed, the new or social media promise a kaleidoscope of opportunity for connecting and informing citizens, allegedly revitalizing the fading legitimacy and practice of institutions and providing an agent for social interaction. However, as social media adoption has increased, it has revealed a wealth of contradictions, both of its own making and through reproduction of past action. This has created a crisis for traditional media as well as for public relations. For example, social media such as WikiLeaks have bypassed official channels in releasing government information. In other cases, social media such as Facebook and Twitter informed BBC coverage of the Rio Olympics. Although old media are unlikely to go away, social media have had an impact, with several large family-based media companies collapsing or being reintegrated into the new paradigm. To use Walter Lippmann’s analogy of the phantom public, the social media contradictorily serve both to dissipate the phantom in part and to reinforce it...
Abstract:
Effective risk management is crucial for any organisation. One of its key steps is risk identification, but few tools exist to support this process. Here we present a method for the automatic discovery of a particular type of process-related risk, the danger of deadline transgressions or overruns, based on the analysis of event logs. We define a set of time-related process risk indicators, i.e., patterns observable in event logs that highlight the likelihood of an overrun, and then show how instances of these patterns can be identified automatically using statistical principles. To demonstrate its feasibility, the approach has been implemented as a plug-in module to the process mining framework ProM and tested using an event log from a Dutch financial institution.
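The statistical idea behind such a time-related risk indicator can be sketched as follows. This is a minimal illustration, not the actual ProM plug-in: the log format (flat list of case id/timestamp pairs), the function names, and the threshold rule (mean plus k standard deviations of historical case durations) are all assumed for the example.

```python
from statistics import mean, stdev

def case_durations(log):
    """Map each case id to its elapsed time (last event minus first event)."""
    spans = {}
    for case, t in log:
        lo, hi = spans.get(case, (t, t))
        spans[case] = (min(lo, t), max(hi, t))
    return {c: hi - lo for c, (lo, hi) in spans.items()}

def overrun_risks(log, k=1.0):
    """Flag cases whose duration exceeds mean + k * stdev of all durations.
    The sensitivity factor k is an illustrative choice."""
    durs = case_durations(log)
    mu, sd = mean(durs.values()), stdev(durs.values())
    return {c: d for c, d in durs.items() if d > mu + k * sd}

# Toy event log: (case id, timestamp in hours)
log = [("A", 0), ("A", 5), ("B", 0), ("B", 6),
       ("C", 0), ("C", 5), ("D", 0), ("D", 40)]
print(overrun_risks(log))  # → {'D': 40}
```

A real indicator would, as the abstract notes, be mined per pattern from historical logs rather than from a single global duration statistic.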
Abstract:
A contentious issue in the field of destination marketing has been the recent tendency by some authors to refer to destination marketing organisations (DMOs) as destination management organisations. This nomenclature implies control over destination resources, a level of influence that in reality is held by few DMOs. This lack of control over the destination ‘amalgam’ is acknowledged by a number of the contributors, including the editors and, in the discussion on destination competitiveness, J.R. Brent Ritchie and Geoffrey Crouch, and is perhaps best summed up by Alan Fyall in the concluding chapter: “...unless all elements are owned by the same body, then the ability to control and influence the direction, quality and development of the destination pose very real challenges” (p. 343). The title of the text acknowledges both marketing and management, in relation to theories and applications. While there are insightful propositions about ideals of destination management, readers will find a lack of coverage of destination management in practice by DMOs. This represents fertile ground for future research.
Abstract:
During a major flood event, the inundation of urban environments leads to complicated flow motion, most often associated with significant sediment fluxes. In the present study, a series of field measurements was conducted in an inundated section of the City of Brisbane (Australia) around the peak of a major flood in January 2011. Experiments were performed to use ADV backscatter amplitude as a surrogate estimate of the suspended sediment concentration (SSC) during the flood event. The flood water deposit samples were predominantly silty material with a median particle size of about 25 μm, and they exhibited non-Newtonian behavior under rheological testing. In the inundated urban environment during the flood, estimates of suspended sediment concentration showed a general trend of increasing SSC with decreasing water depth. The suspended sediment flux data showed substantial flux amplitudes, consistent with the murky appearance of the floodwaters. Altogether the results highlight the large suspended sediment loads and fluctuations in the inundated urban setting, possibly associated with non-Newtonian behavior. During the receding flood, unusual long-period oscillations were observed (periods of about 18 min), although their cause remains unknown. The field deployment was conducted in challenging conditions, highlighting a number of practical issues that arise during a natural disaster.
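Using backscatter amplitude as an SSC surrogate is commonly done by calibrating a log-linear relation between amplitude and concentration against physical samples. The sketch below shows only that generic fitting step; the calibration numbers and function names are entirely hypothetical, not values from the study.

```python
import math

def fit_log_linear(amplitude, ssc):
    """Least-squares fit of log10(SSC) = a + b * amplitude,
    a common functional form for ADV backscatter calibration."""
    n = len(amplitude)
    y = [math.log10(c) for c in ssc]
    mx, my = sum(amplitude) / n, sum(y) / n
    b = (sum((x - mx) * (v - my) for x, v in zip(amplitude, y))
         / sum((x - mx) ** 2 for x in amplitude))
    return my - b * mx, b

def predict_ssc(a, b, amplitude):
    """Invert the fit to estimate SSC from a backscatter amplitude."""
    return 10 ** (a + b * amplitude)

# Hypothetical calibration pairs: (backscatter counts, SSC in g/L)
amp = [60, 80, 100, 120]
ssc = [0.01, 0.1, 1.0, 10.0]
a, b = fit_log_linear(amp, ssc)
print(round(predict_ssc(a, b, 90), 3))  # → 0.316
```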
Abstract:
This article investigates the role of information communication technologies (ICTs) in establishing a well-aligned, authentic learning environment for a diverse cohort of non-cognate and cognate students studying event management in a higher education context. Based on a case study which examined how ICTs assisted in accommodating diverse learning needs, styles and stages in an event management subject offered in the Creative Industries Faculty at Queensland University of Technology in Brisbane, Australia, the article uses an action research approach to generate grounded, empirical data on the effectiveness of the dynamic, individualised curriculum frameworks that the use of ICTs makes possible. The study provides insights into the way non-cognate and cognate students respond to different learning tools. It finds that whilst non-cognate and cognate students do respond to learning tools differently, owing to a differing degree of emphasis on technical, task or theoretical competencies, the use of ICTs allows all students to improve their performance by providing multiple points of entry into the content. In this respect, the article focuses on the way ICTs can be used to develop an authentic, well-aligned curriculum model that meets the needs of event management students in a higher education context, with findings relevant for event educators in Business, Hospitality, Tourism and the Creative Industries. The strategies outlined may also be useful for educators in other fields who face similar challenges when designing and developing curriculum for diverse cohorts.
Abstract:
This study examines disillusioned consumers. The theory proposes that this is a group learning to lower their expectations of firm integrity who, to avoid being let down, ignore marketing activity coming directly from the firm. This kind of exchange orientation develops as a response to consistent failures in perceptions of firm integrity. The research comprises six studies, involving over 600 adult consumers, which outline the development and validation of a measure of consumer disillusionment toward marketing activity. The process yields a valid and reliable four-item measure. In addition, the study assesses the nomological validity of the construct, using cue utilization theory to predict that disillusioned consumers favor advertising that provides evidence of verifiable integrity. The validation experiment uses print advertising containing high and low verifiable-integrity stimuli. Results confirm the theory, with disillusioned consumers focusing less on the firm as a source of information. Further, these consumers respond more favorably than non-disillusioned consumers to third-party endorsers who serve to verify the firm's attempts to show integrity.
Abstract:
The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. Over the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people. Observations with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information respectively. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
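The novelty-detection formulation can be illustrated with an ordinary one-dimensional discrete HMM, a deliberate simplification of the paper's Semi-2D model, with made-up parameters: score each observation sequence with the forward algorithm and flag it as abnormal when its average log-likelihood falls below a threshold.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm to avoid underflow."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    s = sum(alpha)
    loglik = math.log(s)
    alpha = [a / s for a in alpha]
    for o in obs[1:]:
        # Transition then emit, for every target state q.
        alpha = [sum(alpha[p] * A[p][q] for p in range(n)) * B[q][o]
                 for q in range(n)]
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]
    return loglik

def is_abnormal(obs, pi, A, B, threshold):
    """Flag a sequence whose per-observation log-likelihood is too low."""
    return forward_loglik(obs, pi, A, B) / len(obs) < threshold

# Toy 'normal' model: two sticky states, each preferring one symbol.
pi = [0.5, 0.5]
A = [[0.9, 0.1], [0.1, 0.9]]
B = [[0.9, 0.1], [0.1, 0.9]]

print(is_abnormal([0, 0, 0, 0], pi, A, B, threshold=-1.0))  # → False
print(is_abnormal([0, 1, 0, 1], pi, A, B, threshold=-1.0))  # → True
```

The rapid alternation is unlikely under the sticky transition matrix, so it scores below the (hand-picked) threshold, mirroring how low-likelihood crowd behaviour is flagged.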
Abstract:
Process mining encompasses the research area concerned with knowledge discovery from information system event logs. Within the process mining research area, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered highly valuable for the development and assessment of both process discovery and conformance checking techniques.
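The flavour of a conformance metric can be conveyed with a drastically simplified example: measure the fraction of log traces a model can replay, where the model is encoded as a set of allowed directly-follows pairs. This is far cruder than the token-based replay metrics in ProM, and the model encoding and all names are illustrative.

```python
def directly_follows(trace):
    """All consecutive activity pairs occurring in a trace."""
    return set(zip(trace, trace[1:]))

def trace_fitness(log, allowed, start, end):
    """Fraction of traces fully replayable by a model given as a set of
    allowed directly-follows pairs plus legal start/end activities."""
    def fits(trace):
        return (trace[0] in start and trace[-1] in end
                and directly_follows(trace) <= allowed)
    return sum(fits(t) for t in log) / len(log)

# Hypothetical model: a -> b -> d  or  a -> c -> d
allowed = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")}
log = [("a", "b", "d"), ("a", "c", "d"), ("a", "d")]
print(round(trace_fitness(log, allowed, start={"a"}, end={"d"}), 2))  # → 0.67
```

The third trace skips b/c and cannot be replayed, so fitness is 2/3; richer metrics also quantify precision and generalization rather than replayability alone.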
Abstract:
Free association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist cuing, primed free association, intralist cuing, and single-item recognition tasks. The findings also show that when a related word is presented to cue the recall of a studied word, the cue activates it within an array of related words that distract and reduce the probability of its selection. The activation of the semantic network thus produces priming benefits during encoding and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distractor strength and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. We evaluate spreading-activation and quantum-like entanglement explanations for the priming effect produced by neighborhood density.
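One of the network measures mentioned, neighborhood density, can be pictured as the proportion of possible links that actually exist among a word's associates. The sketch below is an illustrative simplification over an invented toy association graph, not the authors' normed measure.

```python
def neighborhood_density(graph, word):
    """Proportion of possible directed links that exist among a word's
    associates. `graph` maps each word to the set of words it evokes
    in free association."""
    neigh = graph.get(word, set())
    if len(neigh) < 2:
        return 0.0
    links = sum(1 for a in neigh
                for b in graph.get(a, set())
                if b in neigh and b != a)
    return links / (len(neigh) * (len(neigh) - 1))

# Invented toy free-association graph.
graph = {
    "doctor": {"nurse", "hospital", "sick"},
    "nurse": {"hospital", "doctor"},
    "hospital": {"sick", "doctor"},
    "sick": {"ill"},
}
print(round(neighborhood_density(graph, "doctor"), 3))  # → 0.333
```

Here 2 of the 6 possible links among doctor's three associates exist (nurse→hospital, hospital→sick), giving a density of 1/3; denser neighborhoods are the ones the abstract links to stronger encoding priming.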
Abstract:
In this paper, we propose an approach which attempts to solve the problem of surveillance event detection, assuming that the definition of the events is known. To facilitate the discussion, we first define two concepts: the event of interest refers to the event that the user requests the system to detect, and the background activities are any other events in the video corpus. This remains an unsolved problem due to many factors, listed below. 1) Occlusions and clustering: surveillance scenes of significant interest, at locations such as airports, railway stations and shopping centers, are often crowded, and occlusions and clustering of people are frequently encountered. This significantly affects the feature extraction step; for instance, trajectories generated by object tracking algorithms are usually not robust under such conditions. 2) The requirement for real-time detection: the system should process the video fast enough in both the feature extraction and detection steps to facilitate real-time operation. 3) Massive size of the training data set: suppose an event lasts for 1 minute in a video with a frame rate of 25 fps; the number of frames for this event is 60 × 25 = 1500. If we want a training data set with many positive instances of the event, the video is likely to be very large (i.e. hundreds of thousands of frames or more). Handling such a large data set is a problem frequently encountered in this application. 4) Difficulty in separating the event of interest from background activities: the events of interest often co-exist with a set of background activities. Temporal ground truth is typically very ambiguous, as it does not distinguish the event of interest from the wide range of co-existing background activities; however, it is not practical to annotate the locations of the events in large amounts of video data. This problem becomes more serious in the detection of multi-agent interactions, since the location of such events often cannot be constrained to a bounding box. 5) Challenges in determining the temporal boundaries of the events: an event can occur at any arbitrary time with an arbitrary duration. The temporal segmentation of events is difficult and ambiguous, and is also affected by other factors such as occlusions.
Abstract:
Customer relationship marketing (CRM) initiatives are increasingly being adopted by businesses in an attempt to enhance brand loyalty and stimulate repeat purchases. The purpose of this study was to examine the extent to which destination marketing organisations (DMOs) around the world have developed a visitor relationship marketing (VRM) orientation. The proposition underpinning the study is that maintaining meaningful dialogue with previous visitors in some markets would represent a more efficient use of resources than above-the-line advertising to attract new visitors. Importance-performance analysis was utilised to measure destination marketers' perceptions of the efficacy of CRM initiatives, and then to rate their own organisation's performance across the same range of initiatives. A key finding was that mean importance was higher than perceived performance for every item. While the small sample limits generalisability, there appears to be a general lack of strategic intent by DMOs to invest in VRM.
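Importance-performance analysis at its core compares mean importance against mean performance ratings item by item; the study's key finding corresponds to every gap being positive. A minimal sketch with hypothetical ratings (the initiative names and values are invented):

```python
def ipa_gaps(ratings):
    """Importance-performance gap (importance minus performance) per item.
    `ratings` maps initiative -> (mean importance, mean performance)."""
    return {item: round(imp - perf, 2) for item, (imp, perf) in ratings.items()}

# Hypothetical mean ratings on a 1-7 scale.
ratings = {
    "email newsletter": (5.8, 4.1),
    "loyalty programme": (4.9, 3.5),
}
gaps = ipa_gaps(ratings)
print(gaps)                          # → {'email newsletter': 1.7, 'loyalty programme': 1.4}
print(all(g > 0 for g in gaps.values()))  # → True (importance exceeds performance)
```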
Abstract:
The proposition underpinning this study is that engaging in meaningful dialogue with previous visitors represents an efficient and effective use of resources for a destination marketing organization (DMO), compared to above-the-line advertising in broadcast media. However, there has been a lack of attention in the tourism literature to destination switching, loyalty and customer relationship management (CRM) with which to test such a proposition. This paper reports an investigation of visitor relationship marketing (VRM) orientation among DMOs. A model of CRM orientation, developed from the wider marketing literature and a prior qualitative study, was used to construct a scale to operationalise DMO visitor relationship orientation. Due to the small sample, the Partial Least Squares (PLS) method of structural equation modelling was used to analyse the data. Although the sample limits the ability to generalise, the results indicate that the DMOs' visitor orientation is generally responsive and reactive rather than proactive.
Abstract:
News blog hot topics are important for information recommendation services and marketing. However, information overload and personalized management make information arrangement more difficult. Moreover, what influences the formation and development of blog hot topics has seldom received attention. In order to correctly detect news blog hot topics, this paper first analyzes the development of topics from a new perspective based on the W2T (Wisdom Web of Things) methodology: the characteristics of blog users, the context of topic propagation and information granularity are unified to analyze the related problems. Factors such as user behavior patterns, network opinion and opinion leaders are subsequently identified as important for the development of topics. A topic model based on the view of event reports is then constructed. Finally, hot topics are identified by their duration, topic novelty, degree of topic growth and degree of user attention. The experimental results show that the proposed method is feasible and effective.
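The final identification step combines the four indicators named in the abstract. A minimal sketch: the equal weights, the assumption that each indicator is pre-normalised to [0, 1], the threshold, and the sample topics are illustrative choices, not taken from the paper.

```python
def hot_score(topic, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted combination of the four hot-topic indicators.
    Each indicator is assumed pre-normalised to [0, 1]."""
    keys = ("duration", "novelty", "growth", "attention")
    return sum(w * topic[k] for w, k in zip(weights, keys))

def hot_topics(topics, threshold=0.6):
    """Names of topics whose combined score clears an illustrative threshold."""
    return [name for name, t in topics.items() if hot_score(t) >= threshold]

# Invented indicator values for two candidate topics.
topics = {
    "election": {"duration": 0.9, "novelty": 0.4, "growth": 0.8, "attention": 0.9},
    "recipe":   {"duration": 0.2, "novelty": 0.3, "growth": 0.1, "attention": 0.2},
}
print(hot_topics(topics))  # → ['election']
```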