932 results for methods and measurement


Relevance: 100.00%

Publisher:

Abstract:

Much debate has taken place recently over the potential for entertainment genres and unorthodox forms of news to provide legitimate – indeed democratized – inroads into the public sphere. Amidst these discussions, however, little attention has been paid to the audiences for programs of this sort, and (even when viewers are considered) the research can too easily treat audiences in homogeneous terms and thereby replicate the very dichotomies these television shows directly challenge. This paper is a critical reflection on an audience study of the Australian morning “newstainment” program Sunrise. After examining the show and exploring how it is ‘used’ as a news source, this paper promotes the use of ethnographic study to better conceptualize how citizens integrate and connect the increasingly fragmented and multifarious forms of postmodern political communication available in their everyday lives.


Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful revisiting of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001) – the B&C (2001) model – which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects how customers form perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating it. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:

• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension (each higher level of SQ classification) on the basis of the corresponding sub-dimension. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ, and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can pinpoint the areas that need improvement.
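The contrast between the two measurement approaches can be illustrated in a short sketch. This is not the candidate's instrument; it is a minimal, hypothetical example assuming 7-point Likert scores and the five SERVQUAL dimensions, with a perceptions-only overall score versus per-dimension disconfirmation gaps (perception minus expectation).

```python
# Illustrative sketch (hypothetical data): perceptions-only scoring
# versus disconfirmation (gap) scoring over the five SERVQUAL dimensions.

DIMENSIONS = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]

def perceptions_only_score(perceptions):
    """Overall SQ as the mean of perception scores."""
    return sum(perceptions[d] for d in DIMENSIONS) / len(DIMENSIONS)

def disconfirmation_gaps(perceptions, expectations):
    """Per-dimension gap (perception - expectation); negative = shortfall."""
    return {d: perceptions[d] - expectations[d] for d in DIMENSIONS}

# Hypothetical 1-7 Likert scores for one service encounter.
perceptions  = {"reliability": 5.0, "assurance": 6.0, "tangibles": 4.0,
                "empathy": 5.5, "responsiveness": 4.5}
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 4.0,
                "empathy": 5.0, "responsiveness": 6.0}

overall = perceptions_only_score(perceptions)          # one number, low overhead
gaps = disconfirmation_gaps(perceptions, expectations)
shortfalls = [d for d, g in gaps.items() if g < 0]     # areas needing improvement
```

The perceptions-only score is cheaper to administer (one battery of items), while the gap scores identify which dimensions fall short of expectations, matching the trade-off described above.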


Aims: To describe a local data linkage project to match hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases on all patients aged 18 years and over admitted to either of two intensive care units at a tertiary-referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from time of ICU admission, with a censoring date of 14 February 2007. Data for patients with multiple hospital admissions requiring intensive care were analysed only from the first admission. Summary and descriptive statistics were used for preliminary data analysis. Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed, with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient-years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%) and other medical (4%). The unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. The 1-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, the 1-, 5- and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are feasible for determining long-term outcomes of ICU patients.
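The Kaplan-Meier method named above can be sketched in a few lines. This is not the study's code or data; it is a minimal, dependency-free estimator over an invented five-patient cohort, where the survival curve drops at each observed death time by the factor (at risk − deaths) / at risk, and censored patients simply leave the risk set.

```python
# Minimal Kaplan-Meier estimator (hypothetical follow-up data).

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = death observed, 0 = censored.
    Returns [(time, survival_probability)] at each observed death time."""
    data = sorted(zip(times, events))          # process in time order
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for (tt, e) in data if tt == t)
        n_with_t = sum(1 for (tt, _) in data if tt == t)
        if deaths:                             # curve only steps at deaths
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= n_with_t                  # censored cases leave the risk set
        i += n_with_t
    return curve

# Hypothetical cohort: deaths at years 1 and 3, the rest censored.
times  = [1.0, 2.0, 3.0, 4.0, 5.0]
events = [1,   0,   1,   0,   0]
curve = kaplan_meier(times, events)
```

After the year-1 death among 5 patients at risk, survival is 4/5; after the year-3 death among the 3 still at risk, it is (4/5)·(2/3), illustrating how censored follow-up still contributes person-time without counting as an event.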


It is widely held that strong relationships exist between housing, economic status, and well-being. This is exemplified by widespread housing stock surpluses in many countries, which threaten to destabilise numerous aspects of individual and community life. However, the position of housing demand and supply is not consistent. The Australian position provides a distinct contrast, whereby seemingly inexorable housing demand generally remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong housing demand ensures that elements related to housing affordability continue to gain prominence. A significant but less visible factor impacting housing affordability – particularly new housing development – relates to holding costs. These costs are in many ways “hidden” and cannot always be easily identified. Although it is only one contributor, the nature and extent of its impact require elucidation. In its simplest form, the analysis commences with a calculation of the interest or opportunity cost of land holding. However, there is significantly more complexity for major new developments – particularly greenfield property development. Preliminary analysis conducted by the author suggests that even small shifts in the primary factors driving holding costs can appreciably affect housing affordability – and notably, to a greater extent than commonly held. Even so, their importance and perceived high-level impact can be gauged from the unprecedented level of attention policy makers have given them over recent years.
This may be evidenced by the embedding of specific strategies to address burgeoning holding costs (and particularly those cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan. However, several key issues require investigation. Firstly, the computation and methodology behind the calculation of holding costs vary widely. In fact, the calculation is not only variable but in some instances ignored completely. Secondly, some ambiguity exists in terms of the inclusion of various elements of holding costs, thereby affecting the assessment of their relative contribution. Perhaps this may in part be explained by their nature: such costs are not always immediately apparent. Some forms of holding costs are not as visible as the more tangible cost items associated with greenfield development, such as regulatory fees, government taxes, acquisition costs, selling fees, commissions and others. Holding costs are also more difficult to evaluate since, for the most part, they must ultimately be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates. By extending research in the general area of housing affordability, this thesis seeks to provide a more detailed investigation of those elements related to holding costs, and in so doing determine the size of their impact specifically on the end user. This will involve the development of soundly based economic and econometric models which seek to clarify the component impacts of holding costs. Ultimately, there are significant policy implications in relation to the framework used in Australian jurisdictions that promote, retain, or otherwise maximise the opportunities for affordable housing.


This paper discusses the choice to use two less conventional or “interesting” research methods, Q Methodology and the Experience Sampling Method, rather than the “status quo” research methods so common in the marketing discipline. It is argued that such methods have value for marketing academics because they widen the potential for discovery. The paper outlines these two research methods, providing examples of how they have been used from an experiential consumption perspective. Additionally, the paper identifies some of the challenges to be faced when trying to publish research that uses such less conventional methods, and offers suggestions to address them.


Digital production and distribution technologies may create new opportunities for filmmaking in Australia. A culture of new approaches to filmmaking is emerging, driven by ‘next generation filmmakers’ who are willing to consider new business models: from online web series to short films produced for mobile phones. At the same time, cultural representation itself is transforming within an interactive, social-media-driven environment. Yet there is very little research into next generation filmmaking. The aim of this paper is to scope and discuss three key aspects of next generation filmmaking, namely: digital trends in film distribution and marketing; processes and strategies of ‘next generation’ filmmakers; and case studies of viable next generation business models and filmmaking practices. We conclude with a brief examination of the implications for media and cultural policy, which suggests the future possibility of a rapprochement between creative industries discourse and cultural policy.


Electrostatic discharge is the sudden and brief electric current that flashes between two objects at different voltages. It is a serious issue in contexts ranging from solid-state electronics to spectacular and dangerous lightning strikes (arc flashes). The research herein presents work on the experimental simulation and measurement of the energy in an electrostatic discharge. The energy released in these discharges has been linked to ignitions and burning in a number of documented disasters and can be enormously hazardous in many other industrial scenarios. Simulations of electrostatic discharges were designed to the specifications of IEC standards. Energy measurement is typically based on the residual voltage/charge on the discharge capacitor, whereas this research examines the voltage and current in the actual spark in order to obtain a more precise comparative measurement of the energy dissipated.
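The two energy measures contrasted above can be sketched as follows. This is not the authors' instrumentation code; it is a hypothetical numerical example assuming a 150 pF discharge capacitor and an invented, coarsely sampled spark waveform, comparing the residual-charge estimate (capacitor energy before minus after, ½CV²) with direct trapezoidal integration of the spark power v(t)·i(t).

```python
# Residual-charge energy estimate vs. direct integration of spark power.
# Capacitor value and waveform samples are hypothetical.

def capacitor_energy(c, v):
    """Energy stored in a capacitor: E = 1/2 * C * V^2."""
    return 0.5 * c * v * v

def spark_energy(t, v, i):
    """Trapezoidal integration of instantaneous power v(t) * i(t)."""
    p = [vk * ik for vk, ik in zip(v, i)]
    return sum((t[k + 1] - t[k]) * (p[k] + p[k + 1]) / 2
               for k in range(len(t) - 1))

C = 150e-12                           # 150 pF discharge capacitor
V0, V1 = 8000.0, 0.0                  # charge voltage before/after discharge
residual_estimate = capacitor_energy(C, V0) - capacitor_energy(C, V1)

t = [0.0, 1e-9, 2e-9, 3e-9]           # sample times, s
v = [8000.0, 4000.0, 1000.0, 0.0]     # voltage across the spark gap, V
i = [0.0, 10.0, 4.0, 0.0]             # current through the spark, A
measured_spark_energy = spark_energy(t, v, i)
```

In this invented example the directly integrated spark energy is far below the residual-charge figure, illustrating why the residual method can only bound, rather than measure, the energy actually dissipated in the spark.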


Seven endemic governance problems are shown to be currently present in governments around the globe, and at any level of government (for example, municipal or federal). These problems have roots traceable through more than two thousand years of political, specifically ‘democratic’, history. The evidence shows that accountability, transparency, corruption, representation, campaigning methods, constitutionalism and long-term goals were problematic for the ancient Athenians as well as for modern international democratisation efforts encompassing every major global region. Why then, given the extended time period humans have had to deal with these problems, are they still present? At least part of the answer is that philosophers, academics and NGOs as well as MNOs have only approached these endemic problems in a piecemeal manner, with a skewed perspective on democracy. Their works have also been subject to the ebbs and flows of human history, which essentially started and stopped periods of thinking. In order to investigate endemic problems in relation to democracy (as the overall quest of this thesis was to generate prescriptive results for the improvement of democratic government), it was necessary to delineate what exactly is being written about when using the term ‘democracy’. It is common knowledge that democracy has no one specific definition or practice, even though scholars and philosophers have been attempting to create a definition for generations. What is currently evident is that scholars are not approaching democracy in an overly simplified manner (that is, as government for the people, by the people) but, rather, are seeking the commonalities that democracies share – in other words, those items which are common to all things democratic. Following that line of investigation, the major practiced and theoretical versions of democracy were thematically analysed.
After that, their themes were collapsed into larger categories, at which point the larger categories were comparatively analysed against the practiced and theoretical versions of democracy. Four democratic ‘particles’ (selecting officials, law, equality and communication) were seen to be present in all practiced and theoretical democratic styles. The democratic particles, fused with a unique investigative perspective and in-depth political study, created a solid conceptualisation of democracy. As such, it is argued that democracy is an ever-present element of any state government, ‘democratic’ or not, and the particles are the bodies which comprise the democratic element. Frequency- and proximity-based analyses showed that democratic particles are related to endemic problems in international democratisation discourse. The linkages between democratic particles and endemic problems were also evident during the thematic analysis as well as the historical review. This ultimately led to the viewpoint that mitigating endemic problems may improve democratic particles, which might strengthen the element of democracy in the governing apparatus of any state. Such mitigation may actively minimise or wholly displace inefficient forms of government, leading to a government specifically tailored to the population it orders. Once the theoretical and empirical goals were attained, this thesis provided some prescriptive measures which government, civil society, academics, professionals and/or active citizens can use to mitigate endemic problems (in any country and at any level of government) so as to improve the human condition via better democratic government.
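The frequency- and proximity-based analysis mentioned above can be illustrated with a toy sketch. This is not the thesis's corpus or tooling; it is a minimal, hypothetical example counting how often a 'particle' term and a 'problem' term occur within a fixed word window of each other in a snippet of invented text.

```python
# Toy proximity analysis (invented corpus): count co-occurrences of two
# terms within a fixed word window, a crude stand-in for the kind of
# discourse analysis described above.

def cooccurrences(text, term_a, term_b, window=5):
    """Count pairs of term_a/term_b positions at most `window` words apart."""
    words = text.lower().split()
    pos_a = [k for k, w in enumerate(words) if w == term_a]
    pos_b = [k for k, w in enumerate(words) if w == term_b]
    return sum(1 for a in pos_a for b in pos_b if abs(a - b) <= window)

corpus = ("equality before the law reduces corruption while "
          "corruption undermines equality and communication")
hits = cooccurrences(corpus, "equality", "corruption")
```

A real analysis would of course use a tokenizer, stemming and a large document set, but the window-based counting principle is the same.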