169 results for Masters theses


Relevance:

10.00%

Publisher:

Abstract:

This thesis is a documented energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised Anhydrous Milk Fat (ghee). Ghee is traditionally made by batch processing, a method less efficient than centrifugal separation. An in-depth, systematic investigation was conducted of each item of major equipment: the ammonia refrigeration plant, steam boiler, canning equipment, pumps, heat exchangers and compressed air system were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked; others had payback periods of less than a year. Energy consumption fell 16% for a similar output, from 6,582 GJ in 1994-95 to 5,552 GJ in 2003-04. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load: water usage fell 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other ghee manufacturers and to other industries with similar equipment.
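The reported reductions can be checked directly from the figures quoted in the abstract; a quick sketch using only those numbers:

```python
def percent_reduction(before, after):
    """Percentage fall from a baseline value."""
    return (before - after) / before * 100

# Figures quoted in the abstract
energy_drop = percent_reduction(6582, 5552)  # GJ, 1994-95 vs 2003-04
water_drop = percent_reduction(18.0, 5.78)   # ML, same comparison years
```

Both round to the reported 16% and 68% reductions.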

Abstract:

Although the sciences were taught in Australian schools well before the Second World War, the only evidence of research into this teaching is found in the report, published by ACER in 1932, of Roy Stanhope’s survey of the teaching of chemistry in New South Wales and of a standardized test he had developed. Roy Stanhope was a science teacher with a research master’s degree in chemistry. He won a scholarship to Stanford University for doctoral studies, but returned after one year when the scholarship was not extended. He went on to become a founder, in 1943, of the Australian Science Teachers Association (ASTA), which honours this remarkable pioneer through its annual Stanhope Oration. In his retirement Stanhope undertook a comparative study of science

Abstract:

Current regulatory requirements on data privacy make it increasingly important for enterprises to be able to verify and audit their compliance with their privacy policies. Traditionally, a privacy policy is written in a natural language, and such policies inherit the potential ambiguity, inconsistency and misinterpretation of natural text. Hence, formal languages are emerging to allow a precise specification of enforceable privacy policies that can be verified. The EP3P language is one such formal language. An EP3P privacy policy of an enterprise consists of many rules. Given the semantics of the language, there may exist rules in the ruleset which can never be used; these are referred to as redundant rules. Redundancies adversely affect privacy policies in several ways. Firstly, redundant rules reduce the efficiency of operations on privacy policies. Secondly, they may misdirect the policy auditor when determining the outcome of a policy. Therefore, in order to address these deficiencies, it is important to identify and resolve redundancies. This thesis introduces the concept of a minimal privacy policy: a policy that is free of redundancy. The essential component for maintaining the minimality of privacy policies is determining the effects of the rules on each other. Hence, redundancy detection and resolution frameworks are proposed. Pair-wise redundancy detection is the central concept in these frameworks: it suggests a pair-wise comparison of the rules in order to detect redundancies. In addition, the thesis introduces a policy management tool that assists policy auditors in performing several operations on an EP3P privacy policy while maintaining its minimality. Formal results comparing alternative notions of redundancy, and how these would affect the tool, are also presented.
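Pair-wise redundancy detection can be sketched as follows. The EP3P rule structure is not given in the abstract, so the `Rule` fields (user, action, data category, ruling) and the wildcard convention here are simplifying assumptions rather than actual EP3P semantics; only the pair-wise comparison idea is the point.

```python
from dataclasses import dataclass
from itertools import permutations

@dataclass(frozen=True)
class Rule:
    # Hypothetical simplification of an EP3P rule: who may perform
    # which action on which data category, and the ruling (allow/deny).
    user: str
    action: str
    data: str
    ruling: str

def covers(a, b):
    """True if rule `a` applies to every request rule `b` applies to,
    with the same ruling. '*' is treated as a wildcard."""
    def field_covers(x, y):
        return x == "*" or x == y
    return (field_covers(a.user, b.user)
            and field_covers(a.action, b.action)
            and field_covers(a.data, b.data)
            and a.ruling == b.ruling)

def redundant_rules(policy):
    """Pair-wise comparison: a rule is redundant if another rule in the
    policy covers it (same outcome on a superset of requests)."""
    out = set()
    for a, b in permutations(policy, 2):
        if a not in out and covers(a, b):
            out.add(b)
    return out
```

A minimal policy, in this sketch, is simply the ruleset with the detected redundant rules removed.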

Abstract:

Extended-spectrum β-lactamases (ESBLs), which are derived from non-ESBL precursors by point mutation of β-lactamase genes (bla), are spreading rapidly all over the world and have caused considerable problems in the treatment of infections caused by bacteria which harbour them. The mechanism of this resistance is not fully understood, and a better understanding might significantly affect the choice of diagnostic and treatment strategies. Previous work on the SHV β-lactamase gene, blaSHV, has shown that only Klebsiella pneumoniae strains carrying plasmid-borne blaSHV are able to mutate to phenotypically ESBL-positive strains, and there was also evidence of an increase in blaSHV copy number. It was therefore hypothesised that although a specific point mutation is necessary for acquisition of ESBL activity, it is not sufficient: blaSHV copy-number amplification is also required for an ESBL-positive phenotype, with homologous recombination being the likely mechanism of copy-number expansion. In this study, we investigated the rate of mutation of non-ESBL-expressing K. pneumoniae isolates to ESBL-positive status using the MSS maximum-likelihood method. Our data showed that the blaSHV mutation rate of a non-ESBL-expressing isolate is lower than the rate of other single-base changes on the chromosome, even with a plasmid-borne blaSHV gene. In contrast, the mutation rate from low-MIC ESBL-positive (≤ 8 µg/mL for cefotaxime) to high-MIC ESBL-positive (≥ 16 µg/mL for cefotaxime) is very high, because only a gene copy-number increase is needed, probably mediated by homologous recombination, which typically occurs at much higher frequency than point mutation. Using a subinhibitory concentration of novobiocin as a homologous recombination inhibitor confirmed that this is the case.
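Fluctuation analysis of this kind can be illustrated with the simpler Luria-Delbrück p0 estimator (not the MSS maximum-likelihood method used in the thesis, which fits the full distribution of mutant counts); the culture numbers below are invented for illustration.

```python
import math

def p0_mutation_rate(n_cultures, n_no_mutants, cells_per_culture):
    """Luria-Delbruck p0 method: estimate mutations per culture (m) from
    the fraction of parallel cultures containing no mutants, then convert
    to a per-cell-division mutation rate.
    Note: a simpler estimator than the MSS maximum-likelihood method
    used in the thesis; it only uses the zero class."""
    p0 = n_no_mutants / n_cultures
    m = -math.log(p0)             # expected mutations per culture
    return m / cells_per_culture  # mutation rate per cell division

# Hypothetical example: 40 of 100 cultures show no ESBL-positive mutants,
# each culture grown to 2e8 cells.
rate = p0_mutation_rate(100, 40, 2e8)
```

The MSS maximum-likelihood method is preferred in practice because the p0 method discards all the information in cultures that do contain mutants.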

Abstract:

This study focuses on trends in contemporary Australian playwrighting, discussing recent investigations into the playwrighting process. The study analyses the current state of this country’s playwrighting industry, with a particular focus on programming trends since 1998. It explores the implications of the current theatrical climate, in particular the types of work most commonly favoured for production. It argues that Australian plays are under-represented (compared to non-Australian plays) on ‘mainstream’ stages and that audiences might benefit from more challenging modes of writing than the popular three-act realist model. The thesis argues that ‘New Lyricism’ might fill this position by offering an innovative Australian playwrighting mode. New Lyricism is characterised by a set of common aesthetics, including non-linear narrative structure, poetic use of language and magic realism. Several Australian playwrights who have adopted this mode of writing are identified and their works examined. The author’s play Floodlands is presented as a case study, and the author’s creative process is examined in light of published critical discussions of experimental playwrighting.

Abstract:

Machine downtime, whether planned or unplanned, is intuitively costly to manufacturing organisations, but that cost is often very difficult to quantify. The available literature shows that costing processes are rarely undertaken within manufacturing organisations, and where cost analyses have been undertaken they have generally valued only a small proportion of the affected costs, leading to overly conservative estimates. This thesis aimed to develop a cost-of-downtime model, with particular emphasis on its application to Australia Post’s Flat Mail Optical Character Reader (FMOCR). The costing analysis determined a cost of downtime of $5,700,000 per annum, an average cost of $138 per operational hour. The second section of this work used the cost of downtime to objectively determine areas of opportunity for cost reduction on the FMOCR. This was the first time within Post that maintenance costs were considered alongside downtime in determining machine performance, and as a result the analysis revealed areas which had historically not been targeted for cost reduction. Further exploratory work was undertaken on the Flats Lift Module (FLM) and Auto Induction Station (AIS) deceleration belts by comparing the results against two additional FMOCR analysis programs. This research has demonstrated a methodical and quantifiable costing of downtime for the FMOCR. It is the first time that Post has endeavoured to examine the cost of downtime, and it is one of the very few methodologies for valuing downtime costs proposed in the literature. The work has also demonstrated how the cost of downtime can be incorporated into machine performance analysis, with specific application to identifying high-cost modules. The outcomes of this work are both a methodology for costing downtime and a list of areas for cost reduction. In doing so, this thesis has delivered the two key deliverables presented at the outset of the research.
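The per-hour figure follows from dividing total annual downtime cost by annual operational hours; the two headline figures in the abstract together imply roughly 41,300 operational hours per annum. The cost categories below are invented for illustration; only the structure (summing categories the literature often omits) mirrors the approach described.

```python
def downtime_cost_per_hour(annual_costs, operational_hours):
    """Total annual downtime cost divided by annual operational hours."""
    return sum(annual_costs.values()) / operational_hours

# Hypothetical category breakdown chosen to total the $5.7M reported;
# the individual category values are not from the thesis.
costs = {
    "idle_labour": 1_200_000,
    "lost_throughput": 3_000_000,
    "reactive_maintenance": 1_100_000,
    "rework_and_scrap": 400_000,
}

# ~41,300 operational hours is implied by $5.7M at $138/hour.
per_hour = downtime_cost_per_hour(costs, operational_hours=41_300)
```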

Abstract:

The aim of the dissertation is to discover the extent to which methodologies and conceptual frameworks used to understand popular culture may also be useful in the attempt to understand contemporary high culture. The dissertation addresses this question through the application of subculture theory to Brisbane’s contemporary chamber music scene, drawing on a detailed case study of the contemporary chamber ensemble Topology and its audiences. The dissertation begins by establishing the logic and necessity of applying cultural studies methodologies to contemporary high culture. This argument is supported by a discussion of the conceptual relationships between cultural studies, high culture, and popular culture, and the methodological consequences of these relationships. In Chapter 2, a brief overview of interdisciplinary approaches to music reveals the central importance of subculture theory, and a detailed survey of the history of cultural studies research into music subcultures follows. Five investigative themes are identified as being crucial to all forms of contemporary subculture theory: the symbolic; the spatial; the social; the temporal; the ideological and political. Chapters 3 and 4 present the findings of the case study as they relate to these five investigative themes of contemporary subculture theory. Chapter 5 synthesises the findings of the previous two chapters, and argues that while participation in contemporary chamber music is not as intense or pervasive as is the case with the most researched street-based youth subcultures, it is nevertheless possible to describe Brisbane’s contemporary chamber music scene as a subculture. The dissertation closes by reflecting on the ways in which the subcultural analysis of contemporary chamber music has yielded some insight into the lived practices of high culture in contemporary urban contexts.

Abstract:

This thesis examines the new theatrical form of cyberformance (live performance by remote players using internet technologies) and contextualises it within the broader fields of networked performance, digital performance and theatre. Poststructuralist theories that contest the binary distinction between reality and representation provide the analytical foundation for the thesis. A critical reflexive methodological approach is undertaken in order to highlight three themes. First, the essential qualities and criteria of cyberformance are identified, and illustrated with examples from the early 1990s to the present day. Second, two cyberformance groups – the Plaintext Players and Avatar Body Collision – and UpStage, a purpose-built application for cyberformance, are examined in more detailed case studies. Third, the specifics of the cyberformance audience are explored and commonalities are identified between theatre and online culture. In conclusion, this thesis suggests that theatre and the internet have much to offer each other in this current global state of transition, and that cyberformance offers one means by which to facilitate the incorporation of new technologies into our lives.

Abstract:

Retinal image properties such as contrast and spatial frequency play important roles in the development of normal vision. For example, visual environments consisting solely of low contrast and/or low spatial frequencies induce myopia. The visual image is processed by the retina, which then locally controls eye growth. In terms of the retinal neurotransmitters that link visual stimuli to eye growth, there is strong evidence for involvement of the retinal dopamine (DA) system. For example, effectively increasing retinal DA levels by using DA agonists can suppress the development of form-deprivation myopia (FDM). However, whether visual feedback controls eye growth by modulating retinal DA release, and/or some other factors, is still being elucidated. This thesis is chiefly concerned with the relationship between the dopaminergic system and retinal image properties in eye growth control. More specifically, it determined whether retinal DA release falls as the complexity of the image degrades; for example, we investigated whether the level of retinal DA release decreased as image contrast decreased. In addition, the effects of spatial frequency, spatial energy distribution slope, and spatial phase on retinal DA release and eye growth were examined. When chicks were 8 days old, a cone-lens imaging system was applied monocularly (+30 D, 3.3 cm cone). A short-term treatment period (6 hr) and a longer-term treatment period (4.5 days) were used. The short-term treatment tests for an acute reduction in DA release by the visual stimulus, as is seen with diffusers and lenses, whereas the 4.5-day point tests for a reduction in DA release after more prolonged exposure to the visual stimulus. In the contrast study, 1.35 cyc/deg square wave grating targets of 95%, 67%, 45%, 12% or 4.2% contrast were used. Blank (0% contrast) targets were included for comparison.
In the spatial frequency study, both sine and square wave grating targets with either 0.017 cyc/deg or 0.13 cyc/deg fundamental spatial frequencies and 95% contrast were used. In the spectral slope study, 30% root-mean-squared (RMS) contrast fractal noise targets with spectral fall-off of 1/f^0.5, 1/f and 1/f^2 were used. In the spatial alignment study, a structured Maltese cross (MX) target, a structured circular patterned (C) target and scrambled versions of these two targets (SMX and SC) were used. Each treatment group comprised 6 chicks for ocular biometry (refraction and ocular dimension measurement) and 4 for analysis of retinal DA release. Vitreal dihydroxyphenylacetic acid (DOPAC) was analysed by ion-paired reversed-phase high performance liquid chromatography with electrochemical detection (HPLC-ED), as a measure of retinal DA release. Comparing retinal DA release with eye growth, large reductions in retinal DA release, possibly due to the decreased light level inside the cone-lens imaging system, were observed across all treated eyes, while only those exposed to the low contrast, low spatial frequency sine wave grating, 1/f^2, C and SC targets showed myopic shifts in refraction. Amongst these treatment groups, no acute effect was observed, and longer-term effects were found only in the low contrast and 1/f^2 groups. These findings suggest that retinal DA release does not causally link visual stimulus properties to eye growth, and that these target-induced changes in refractive development are not dependent on the level of retinal DA release. Retinal dopaminergic cells might instead be affected indirectly via other retinal cells that respond immediately to changes in the contrast of the retinal image.
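A fractal-noise target of the kind described (amplitude spectrum falling off as 1/f^slope, fixed RMS contrast) can be generated as follows; this is a minimal sketch assuming NumPy, not the stimulus code used in the study.

```python
import numpy as np

def fractal_noise(size, slope, rms_contrast, seed=0):
    """Generate a noise image whose amplitude spectrum falls off as
    1/f**slope, scaled to a target RMS contrast (std / mean luminance)
    about a mid-grey background."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(size)
    f = np.hypot(*np.meshgrid(fx, fx))   # radial spatial frequency
    f[0, 0] = np.inf                     # zero out the DC term below
    amplitude = 1.0 / f**slope
    phase = rng.uniform(0, 2 * np.pi, (size, size))
    img = np.real(np.fft.ifft2(amplitude * np.exp(1j * phase)))
    img = img / img.std()                # unit standard deviation
    mean_lum = 0.5                       # mid-grey background
    return mean_lum + img * rms_contrast * mean_lum

target = fractal_noise(128, slope=1.0, rms_contrast=0.30)
```

Steeper slopes (e.g. 1/f^2) concentrate energy at low spatial frequencies, which is why that condition resembles a low-pass, contrast-degraded image.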

Abstract:

Overloaded truck traffic is an intractable problem around the world; its occurrence is often a sign of rapid economic development, as most developing countries emphasise economic growth while funding for supporting infrastructure remains limited. This research investigates the relationship between truck overloading and road damage, with the objective of determining the economic loss due to overloaded truck traffic. Axle loads are used to calculate the total equivalent single axle loads (ESALs) applied to the pavement. The study provides perspective on the relationship between the change in axle load due to overloading and the resulting service life of the pavement, and can be used to estimate pavement damage in other developing countries facing truck overloading. In conclusion, an economic loss was found, comprising reduced pavement life and increased maintenance and rehabilitation (M&R) costs; as a result, the net present value (NPV) of the pavement investment required under overloaded truck traffic is higher than under normal truck traffic.
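The damage calculation rests on converting axle loads to ESALs. The usual approximation is the AASHTO fourth-power law, sketched below with the standard 80 kN single axle; the thesis may use jurisdiction-specific load equivalency factors instead.

```python
def esal_per_pass(axle_load_kn, standard_axle_kn=80.0):
    """Fourth-power approximation of pavement damage: one pass of an
    axle counts as (load / standard load)**4 equivalent standard axle
    loads (ESALs). 80 kN is the usual standard single-axle load."""
    return (axle_load_kn / standard_axle_kn) ** 4

# A 50% overloaded axle (120 kN vs the 80 kN standard) does roughly
# five times the damage per pass.
overload_factor = esal_per_pass(120.0) / esal_per_pass(80.0)
```

The fourth power is what makes overloading so costly: a modest increase in axle load multiplies pavement consumption per pass.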

Abstract:

Franchising has been widely accepted as an effective way to conduct and expand businesses. However, a franchise system is not a guarantee of success in the market; a successful franchise system relies on a close and strong franchising relationship. Franchising is fundamentally a relationship-management business: franchising arrangements normally last for a number of years, so the franchisor and franchisee in the relationship are usually motivated to cooperate with each other, and highly loyal franchisees may be obtained through a successful long-term franchising relationship. Over the last few decades there has been a tremendous wave of interest in franchising relationships, yet little research has been conducted to determine the reasons for long-term franchising relationships. This study therefore focuses on the elements that might lead to a successful long-term franchising relationship. It empirically examines three essential constructs (relationship quality, cooperation and customer loyalty) that might lead to successful long-term franchising relationships between franchisees and franchisors among convenience stores in Taiwan. Mailed questionnaires were used to collect the research data: a total of 500 surveys were mailed randomly to managers/supervisors of franchisee convenience stores of the four main franchisors (7-ELEVEN, Family, Hi-Life and OK) in Taiwan. The final sample size was 120, yielding a response rate of 24 per cent. The results show that relationship quality positively influences the cooperative relationships between franchisors and franchisees. Relationship quality is also positively correlated with franchisees’ loyalty. Additionally, the results indicate that the cooperative relationships between franchisors and franchisees are significantly associated with franchisees’ loyalty.

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level within Global Navigation Satellite Systems (GNSS). While a network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states, and over 1,000 single-base RTK systems support precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS and Compass, have either been developed or will become fully operational in the next decade. A trend in future RTK development is to make use of the various isolated network and single-base RTK systems, and of multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning process shifting from the user end to the network centre, which must cope with hundreds of simultaneous user requests (reverse RTK)
These four challenges impose two major requirements on NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these challenges and requirements using a Grid Computing facility, in particular for large data-processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research: a layered framework consisting of 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The open-source Networked Transport of RTCM via Internet Protocol (Ntrip) software was adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computation. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, although some aspects of system performance remain to be improved in future work.
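Ntrip rides on HTTP, so the client side of the data-download step can be sketched as the request a client sends to a caster. The mountpoint, credentials and user-agent string below are placeholders, and this shows only Ntrip v1 request construction, not the full streaming client used in the demonstration.

```python
import base64

def ntrip_request(mountpoint, username, password,
                  agent="NTRIP SketchClient/1.0"):
    """Build an Ntrip v1 request for a caster mountpoint. Ntrip uses an
    HTTP-style GET with basic authorization; the caster replies with
    'ICY 200 OK' followed by the raw RTCM byte stream.
    Mountpoint and credentials here are placeholders."""
    auth = base64.b64encode(f"{username}:{password}".encode()).decode()
    return (f"GET /{mountpoint} HTTP/1.0\r\n"
            f"User-Agent: {agent}\r\n"
            f"Authorization: Basic {auth}\r\n"
            "\r\n")

req = ntrip_request("CORS_STATION_1", "user", "pass")
```

Sending this over a TCP socket to the caster’s host and port, then reading the response past the status line, yields the RTCM stream that the framework’s scheduler dispatches to Grid nodes.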

Abstract:

In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. For evaluating knowledge management and organisational performance, non-financial measurements are argued to be more suitable, given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance, such as the balanced scorecard or the concept of Intellectual Capital (IC), are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations due to the close link between KM and IC: KM is concerned with managing the processes of creating, storing, sharing and applying knowledge, and with the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures knowledge stocks at different ontological levels: the individual level (human capital), the group level (relational capital) and the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM.
As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on the existing literature, a theoretical model was developed to enable investigation of the relationship between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital and relational capital). As a result, five research questions (RQs), concerning the various factors of the KM infrastructure and the relationship between KM infrastructure and IC, were raised and included in the research model:
RQ 1: Do nonprofit organisations which have a Hierarchy culture have stronger IT support than nonprofit organisations which have an Adhocracy culture?
RQ 2: Do nonprofit organisations which have a centralised organisational structure have stronger IT support than nonprofit organisations which have a decentralised organisational structure?
RQ 3: Do nonprofit organisations which have stronger IT support have a higher value of Human Capital than nonprofit organisations which have weaker IT support?
RQ 4: Do nonprofit organisations which have stronger IT support have a higher value of Structural Capital than nonprofit organisations which have weaker IT support?
RQ 5: Do nonprofit organisations which have stronger IT support have a higher value of Relational Capital than nonprofit organisations which have weaker IT support?
In order to investigate the research questions, measurements for IC were developed which were linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach, focusing on two nonprofit organisations providing trade promotion services through local offices worldwide. Data were collected via qualitative as well as quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research; its purpose was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study, in turn, was carried out through an online survey amongst staff of the various local offices; its purpose was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes emerged from the study:
• Knowledge Management and Intellectual Capital complement each other, and this can be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing and applying knowledge) were performed.
• A high level of IT support was evident where participants reported higher levels of IC (human capital, structural capital and relational capital).
The study supported previous research in the field of KM and replicated the findings of other case studies in this area. It also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From a managerial perspective, the findings give clear indications that allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.

Abstract:

Transmissible diseases are re-emerging as a global problem, with sexually transmitted diseases (STDs) becoming endemic. Chlamydia trachomatis is the leading cause of bacterially acquired STD worldwide, with the Australian cost of infection estimated at $90 to $160 million annually. Studies using animal models of genital tract Chlamydia infection suggest that the hormonal status of the genital tract epithelium at the time of exposure may influence the outcome of infection; oral contraceptive use also increases the risk of contracting chlamydial infection compared with women not using contraception. Using the human endometrial cell line ECC-1, this study investigated the effects of C. trachomatis serovar D infection, in conjunction with the female sex hormones 17β-estradiol and progesterone, on chlamydial gene expression. While previous studies have examined the host response, this is the first study to examine C. trachomatis gene expression under different hormonal conditions. We outline a basic model of C. trachomatis gene regulation in the presence of steroid hormones, identifying 60 genes that were regulated by the addition of estradiol and/or progesterone. The third chapter of this thesis discusses and compares the significance of the current findings with data from other research groups, to improve our understanding of the molecular basis of chlamydial persistence under different hormonal conditions. In addition, this study analysed the effects of these female sex hormones and C. trachomatis serovar D infection on host susceptibility and bacterial growth. Our results clearly demonstrated that the addition of steroid hormones had a great impact not only on the level of infectivity of epithelial cells with C. trachomatis serovar D, but also on the morphology of chlamydial inclusions.

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although SQ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research explores the viability of a universal conception of SQ, primarily through a careful revisitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), hereafter the B&C (2001) model, which has the potential to succeed SERVQUAL in that it encompasses other global SQ models and addresses the 'what' questions that SERVQUAL did not. The B&C (2001) model conceives SQ as multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the original intent of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and that addresses the benefits and weaknesses of the various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating it.

The candidate's research has been conducted within, and seeks to contribute to, the 'IS-Impact' research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is "to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice." The 'IS-Impact' research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfil the track's vision. Results of this study will help future researchers in the 'IS-Impact' research track address questions such as:

• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate's research suggest that SQ dimensions can be classified at a higher level, encompassed by the B&C (2001) model's three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it may be viable to re-word the 'physical environment quality' primary dimension as 'environment quality', so as to better encompass both physical and virtual settings (e.g., websites). The candidate does not rule out the global feasibility of the B&C (2001) model's nine sub-dimensions, but acknowledges that more work is needed to define them better. The candidate observes that the 'expertise', 'design' and 'valence' sub-dimensions are supportive representations of the 'interaction', 'physical environment' and 'outcome' primary dimensions respectively; that is, customers evaluate each primary dimension (each higher level of SQ classification) on the basis of its corresponding sub-dimension. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory and a starting point for measuring SQ, including the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a 'service' and ultimately triangulating measures of IS SQ with the IS-Impact model; these further efforts are beyond the scope of the candidate's study. Results from the candidate's research also suggest that both the disconfirmation and perceptions-only approaches have merit, and that the choice of approach depends on the objective(s) of the study.
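The hierarchical structure described above can be pictured as a small data structure. The sketch below is illustrative only and is not from the thesis: the three primary dimensions and the nine sub-dimension names follow Brady and Cronin (2001), while the simple averaging used to roll sub-dimension scores up to an overall SQ score is an assumption made here for clarity, not the model's prescribed measurement method.

```python
# Illustrative sketch of the B&C (2001) hierarchical SQ model:
# three primary dimensions, each with three sub-dimensions.
# Rolling scores up by flat averaging is an assumption for illustration.

BC_MODEL = {
    "interaction": ["attitude", "behaviour", "expertise"],
    "physical_environment": ["ambient_conditions", "design", "social_factors"],
    "outcome": ["waiting_time", "tangibles", "valence"],
}

def overall_sq(sub_scores):
    """Average hypothetical sub-dimension scores into primary-dimension
    means, then average those into a single overall SQ score."""
    primary = {
        p: sum(sub_scores[s] for s in subs) / len(subs)
        for p, subs in BC_MODEL.items()
    }
    return primary, sum(primary.values()) / len(primary)

# Hypothetical uniform scores on a 7-point scale:
sample = {s: 5.0 for subs in BC_MODEL.values() for s in subs}
primary, sq = overall_sq(sample)  # every primary mean is 5.0; overall SQ is 5.0
```

The nesting makes the multi-level point concrete: a respondent's judgments attach to sub-dimensions, and primary-dimension and overall judgments are aggregates of those, rather than being rated directly as in a single-level model such as SERVQUAL.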
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead. Should the objective be to identify SQ gaps (shortfalls), however, the (measured) disconfirmation approach is more appropriate, as it can identify the specific areas that need improvement.
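The arithmetic behind the two approaches can be contrasted in a few lines. This is a minimal sketch, not from the thesis: the dimension names follow SERVQUAL, and the 7-point scores are invented for illustration.

```python
# Hypothetical 7-point ratings across the five SERVQUAL dimensions.
expectations = {"reliability": 6, "assurance": 6, "tangibles": 5,
                "empathy": 6, "responsiveness": 7}
perceptions  = {"reliability": 5, "assurance": 6, "tangibles": 6,
                "empathy": 4, "responsiveness": 5}

# Perceptions-only: overall SQ is simply the mean perception score.
perceptions_only_score = sum(perceptions.values()) / len(perceptions)  # 5.2

# Disconfirmation: per-dimension gap = perception - expectation;
# negative gaps flag the shortfalls that need improvement.
gaps = {dim: perceptions[dim] - expectations[dim] for dim in expectations}
shortfalls = {dim: g for dim, g in gaps.items() if g < 0}
# shortfalls -> {'reliability': -1, 'empathy': -2, 'responsiveness': -2}
```

The trade-off in the text is visible here: the perceptions-only figure needs half the data and yields one number, whereas the disconfirmation gaps cost a second questionnaire but point directly at the dimensions falling short of expectations.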