93 results for global nonhydrostatic model

in the Queensland University of Technology ePrints Archive


Relevance: 80.00%

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and beg the question of whether it is possible at some higher level to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001); the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multidimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ; and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such understanding of SQ seeks to transcend industries and service types with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as: • Is SQ an antecedent or consequence of the IS-Impact model or both? • Has SQ already been addressed by existing measures of the IS-Impact model? • Is SQ a separate, new dimension of the IS-Impact model? • Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point to measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits and the choice of approach would depend on the objective(s) of the study.
Should the objective(s) be an overall evaluation of SQ, the perceptions-only approach is more appropriate as this approach is more straightforward and reduces administrative overheads in the process. However, should the objective(s) be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate as this approach has the ability to identify areas that need improvement.
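The trade-off between the two measurement approaches described above can be illustrated with a small sketch. The dimension names follow SERVQUAL; the ratings are hypothetical, not data from the study:

```python
# Contrast of the two SQ scoring approaches on hypothetical 7-point ratings:
# disconfirmation scores SQ as perception minus expectation (SERVQUAL-style),
# while perceptions-only scores SQ from perception ratings alone.

def disconfirmation_scores(perceptions, expectations):
    """Per-dimension gap scores; negative values flag SQ shortfalls."""
    return {dim: perceptions[dim] - expectations[dim] for dim in perceptions}

def perceptions_only_score(perceptions):
    """Overall SQ as the mean of the perception ratings."""
    return sum(perceptions.values()) / len(perceptions)

# Hypothetical ratings for the five SERVQUAL dimensions.
expectations = {"reliability": 6, "assurance": 6, "tangibles": 5,
                "empathy": 5, "responsiveness": 6}
perceptions = {"reliability": 5, "assurance": 6, "tangibles": 6,
               "empathy": 4, "responsiveness": 5}

gaps = disconfirmation_scores(perceptions, expectations)   # pinpoints shortfalls
overall = perceptions_only_score(perceptions)              # single overall score
```

The gap scores locate where service falls short of expectations (here: reliability, empathy and responsiveness), whereas the perceptions-only score is cheaper to administer because no expectation ratings need to be collected.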

Relevance: 80.00%

Abstract:

This paper describes an application of decoupled probabilistic world modeling to achieve team planning. The research is based on the principle that the action selection mechanism of a member in a robot team can select an effective action if a global world model is available to all team members. In the real world, sensors are imprecise and individual to each robot, hence providing each robot with a partial and unique view of the environment. We address this problem by creating a probabilistic global view on each agent by combining the perceptual information from each robot. This probabilistic view forms the basis for selecting actions to achieve the team goal in a dynamic environment. Experiments have been carried out to investigate the effectiveness of this principle using custom-built robots for real-world performance, in addition to extensive simulation results. The results show an improvement in team effectiveness when using probabilistic world modeling based on perception sharing for team planning.
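One minimal way to realise such a shared probabilistic view is to fuse each robot's noisy estimate of a quantity in proportion to its confidence. The sketch below uses inverse-variance weighting of Gaussian estimates; the paper's actual fusion scheme may differ, and the numbers are illustrative:

```python
# Fuse independent Gaussian estimates (mean, variance) of the same quantity
# -- e.g. each robot's noisy estimate of the ball's x-position -- into one
# shared estimate by inverse-variance weighting. More confident observations
# (smaller variance) get proportionally more weight.

def fuse_gaussian_estimates(estimates):
    """estimates: list of (mean, variance) pairs, one per robot."""
    total_precision = sum(1.0 / var for _, var in estimates)
    fused_mean = sum(mean / var for mean, var in estimates) / total_precision
    return fused_mean, 1.0 / total_precision

# Two robots observe the same object; the second is more confident.
fused_mean, fused_var = fuse_gaussian_estimates([(2.0, 1.0), (4.0, 0.25)])
# The fused estimate lies nearer the confident robot (mean 3.6, variance 0.2),
# and its variance is smaller than either input's.
```

The fused variance is always below the smallest input variance, which is why sharing perceptions improves the world model each robot plans against.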

Relevance: 80.00%

Abstract:

The occurrence of extreme water levels along low-lying, highly populated and/or developed coastlines can lead to considerable loss of life and billions of dollars of damage to coastal infrastructure. Therefore it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood management, engineering and future land-use planning. This ensures the risk of catastrophic structural failures due to under-design, or expensive wastes due to over-design, is minimised. This paper estimates for the first time present day extreme water level exceedance probabilities around the whole coastline of Australia. A high-resolution depth-averaged hydrodynamic model has been configured for the Australian continental shelf region and has been forced with tidal levels from a global tidal model and meteorological fields from a global reanalysis to generate a 61-year hindcast of water levels. Output from this model has been successfully validated against measurements from 30 tide gauge sites. At each coastal model grid point, extreme value distributions have been fitted to the derived time series of annual maxima and the several largest water levels each year to estimate exceedance probabilities. This provides a reliable estimate of water level probabilities around southern Australia, a region mainly impacted by extra-tropical cyclones. However, as the meteorological forcing used only weakly includes the effects of tropical cyclones, extreme water level probabilities are underestimated around the western, northern and north-eastern Australian coastline. In a companion paper we build on the work presented here and more accurately include tropical cyclone-induced surges in the estimation of extreme water levels.
The multi-decadal hindcast generated here has been used primarily to estimate extreme water level exceedance probabilities but could be used more widely in the future for a variety of other research and practical applications.
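The extreme value step described above can be sketched as follows. The sketch fits a Gumbel distribution (the simplest member of the extreme value family) to annual maxima by the method of moments and converts it to return levels; the water levels below are hypothetical, not output from the hindcast:

```python
# Fit a Gumbel distribution to annual-maximum water levels and convert it
# to return levels. The study fitted extreme value distributions to annual
# maxima and the r-largest values per year; a method-of-moments Gumbel fit
# to annual maxima is the simplest instance of that approach.
import math

EULER_GAMMA = 0.57721566490153286

def fit_gumbel(annual_maxima):
    """Return (location mu, scale beta) via the method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, return_period_years):
    """Water level exceeded on average once per return period."""
    p = 1.0 / return_period_years          # annual exceedance probability
    return mu - beta * math.log(-math.log(1.0 - p))

# Hypothetical annual maxima (m above datum) at one coastal grid point.
maxima = [1.92, 2.05, 1.88, 2.31, 2.10, 1.97, 2.24, 2.02, 2.15, 1.90]
mu, beta = fit_gumbel(maxima)
level_100yr = return_level(mu, beta, 100)   # 1% annual exceedance probability
```

In practice a short record like this gives very wide confidence intervals, which is one motivation for the 61-year hindcast used in the study.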

Relevance: 80.00%

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. Therefore it is very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study has been undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) depth-averaged hydrodynamic model has been configured for the whole coastline of Australia using the Danish Hydraulics Institute’s Mike21 modelling suite of tools. The model has been forced with astronomical tidal levels, derived from the TPX07.2 global tidal model, and meteorological fields, from the US National Center for Environmental Prediction’s global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output has been validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast, and extreme value distributions were fitted to these series to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms.
However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones of the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present day extreme water level probabilities around the whole coastline of Australia.
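The final combination step admits a simple sketch: if the extra-tropical (stage 1) and tropical cyclone (stage 2) storm populations are treated as independent, a given water level is exceeded in a year unless neither population exceeds it. The probabilities below are hypothetical, not values from the study:

```python
# Combine annual exceedance probabilities of the same water level estimated
# from two independent storm populations. Under independence, the level is
# NOT exceeded only if neither population exceeds it.

def combined_exceedance(p_extratropical, p_tropical):
    """Combined annual exceedance probability of a given water level."""
    return 1.0 - (1.0 - p_extratropical) * (1.0 - p_tropical)

# E.g. a level with a 1-in-100 extra-tropical and 1-in-200 tropical
# exceedance probability per year:
p_total = combined_exceedance(0.01, 0.005)
```

The combined probability always exceeds either input, so ignoring one storm population (as in stage 1 alone) necessarily underestimates the hazard.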

Relevance: 80.00%

Abstract:

The study of inductively coupled Ar/CH4/H2 plasmas in the plasma enhanced chemical vapor deposition (PECVD) of self-assembled carbon nanostructures (CNs) is presented. A spatially averaged (global) discharge model was developed to study the densities and fluxes of the radical neutrals and charged species, the effective electron temperature, and methane conversion factors under various conditions. It was found that the deposited cation fluxes in the PECVD of CNs generally exceed those of the radical neutrals. Agreement of the numerical results with optical emission spectroscopy (OES) and quadrupole mass spectrometry (QMS) measurements was also demonstrated.

Relevance: 80.00%

Abstract:

A global electromagnetic model of an inductively coupled plasma sustained by an internal oscillating current sheet in a cylindrical metal vessel is developed. The electromagnetic field structure, profiles of the rf power transferred to the plasma electrons, electron/ion number density, and working points of the discharge are studied by invoking particle and power balance. It is revealed that the internal rf current with spatially invariable phase significantly improves the radial uniformity of the electromagnetic fields and the power density in the chamber as compared with conventional plasma sources with external flat spiral inductive coils. This configuration offers the possibility of controlling the rf power deposition in the azimuthal direction.
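The particle and power balance invoked above can be illustrated with a minimal zero-dimensional sketch for an argon discharge. The geometry, gas density and the Arrhenius-style ionization-rate fit below are placeholder values of the kind commonly used in global models, not the parameters of the model in the paper:

```python
# 0-D (global) balance sketch for an argon inductive discharge.
# Particle balance -- volume ionization equals Bohm-speed loss to the walls
# -- fixes the electron temperature Te; power balance then fixes the
# electron density for a given absorbed rf power.
import math

E = 1.602e-19        # elementary charge [C]
M_AR = 6.63e-26      # argon ion mass [kg]
NG = 3.3e19          # neutral gas density [m^-3] (~1 Pa at 300 K)
R, L = 0.10, 0.15    # chamber radius and length [m] (placeholder geometry)
VOLUME = math.pi * R**2 * L
AREA = 2 * math.pi * R**2 + 2 * math.pi * R * L   # end plates + side wall

def k_iz(te):
    """Ionization rate coefficient [m^3/s], Arrhenius-style fit (Te in eV)."""
    return 2.3e-14 * te**0.59 * math.exp(-17.44 / te)

def bohm_speed(te):
    """Ion sound (Bohm) speed [m/s] at electron temperature te [eV]."""
    return math.sqrt(E * te / M_AR)

def solve_te(lo=0.5, hi=10.0):
    """Bisect the particle balance k_iz(Te)*ng*V = u_B(Te)*A for Te [eV]."""
    f = lambda te: k_iz(te) * NG * VOLUME - bohm_speed(te) * AREA
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

def electron_density(p_abs, te, energy_per_ion_ev=50.0):
    """Power balance: P_abs = n_e * u_B * A * e * E_T, solved for n_e."""
    return p_abs / (bohm_speed(te) * AREA * E * energy_per_ion_ev)

te = solve_te()                       # electron temperature [eV]
ne = electron_density(300.0, te)      # density [m^-3] at 300 W absorbed power
```

Note that in this 0-D picture Te depends only on gas density and geometry, while the absorbed power sets the density; the paper's electromagnetic model adds the spatial field structure that a global balance alone cannot resolve.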

Relevance: 40.00%

Abstract:

With the increasing growth of cultural events both in Australia and internationally, there has also been an increase in event management studies, in both theory and practice. Although a series of related knowledge and skills required specifically by event managers has already been identified by many researchers (Perry et al., 1996; Getz, 2002; Silvers et al., 2006) and generic event management models proposed, including ‘project management’ strategies in an event context (Getz, 2007), knowledge gaps still exist in relation to identifying specific types of events, especially for not-for-profit arts events. For events of a largely voluntary nature, insufficient resources, including finance, human resources and infrastructure, are recognised as the most challenging constraint. Therefore, the concepts and principles adopted by large-scale commercial events may not be suitable for not-for-profit arts events aiming at providing professional network opportunities for artists. Building partnerships is identified as a key strategy in developing an effective event management model for this type of event. Using the 2008 World Dance Alliance Global Summit (WDAGS), held in Brisbane 13-18 July 2008, as a case study, the level, nature and relationship of key partners are investigated. Data is triangulated from interviews with organisers of the 2008 WDAGS, on-line and email surveys of delegates, participant observation and analysis of formal and informal documents, to produce a management model suited to this kind of event.

Relevance: 40.00%

Abstract:

The basic reproduction number of a pathogen, R0, determines whether a pathogen will spread (R0 > 1) when introduced into a fully susceptible population, or fade out (R0 < 1) because infected hosts do not, on average, replace themselves. In this paper we develop a simple mechanistic model of the basic reproduction number for a group of tick-borne pathogens that wholly, or almost wholly, depend on horizontal transmission to and from vertebrate hosts. This group includes the causative agent of Lyme disease, Borrelia burgdorferi, and the causative agent of human babesiosis, Babesia microti, for which transmission between co-feeding ticks and vertical transmission from adult female ticks are both negligible. The model has only 19 parameters, all of which have a clear biological interpretation and can be estimated from laboratory or field data. The model takes into account the transmission efficiency from the vertebrate host as a function of the days since infection, in part because of the potential for this dynamic to interact with tick phenology, which is also included in the model. This sets the model apart from previous, similar models for R0 for tick-borne pathogens. We then define parameter ranges for the 19 parameters using estimates from the literature, as well as laboratory and field data, and perform a global sensitivity analysis of the model. This enables us to rank the importance of the parameters in terms of their contribution to the observed variation in R0. We conclude that the transmission efficiency from the vertebrate host to Ixodes scapularis ticks, the survival rate of Ixodes scapularis from fed larva to feeding nymph, and the fraction of nymphs finding a competent host are the most influential factors for R0. This contrasts with other vector-borne pathogens, where it is usually the abundance of the vector or host, or the vector-to-host ratio, that determines conditions for emergence.
These results are a step towards a better understanding of the geographical expansion of currently emerging horizontally transmitted tick-borne pathogens such as Babesia microti, as well as providing a firmer scientific basis for targeted use of acaricide or the application of wildlife vaccines that are currently in development.
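As a toy illustration of why ranking parameters requires a global analysis over their plausible ranges, consider a deliberately simplified multiplicative R0. The parameter names and baseline values below are hypothetical and stand in for the paper's 19-parameter model:

```python
# A simplified multiplicative R0 for a horizontally transmitted tick-borne
# pathogen, screened with one-at-a-time (local) sensitivity analysis.
# All names and baseline values are illustrative placeholders.

def r0(params):
    """R0 as a chain of per-stage quantities along the transmission cycle."""
    return (params["larvae_per_host"]            # larvae fed on one infected host
            * params["host_to_tick_eff"]         # transmission host -> larva
            * params["larva_to_nymph_survival"]  # survival fed larva -> nymph
            * params["nymph_finds_competent_host"]
            * params["tick_to_host_eff"])        # transmission nymph -> host

BASELINE = {"larvae_per_host": 50.0, "host_to_tick_eff": 0.5,
            "larva_to_nymph_survival": 0.2, "nymph_finds_competent_host": 0.3,
            "tick_to_host_eff": 0.8}

def elasticity(name, delta=0.01):
    """Relative change in R0 per relative change in one parameter."""
    bumped = dict(BASELINE)
    bumped[name] *= 1.0 + delta
    return (r0(bumped) / r0(BASELINE) - 1.0) / delta
```

For a purely multiplicative R0 every local elasticity equals one, so one-at-a-time screening cannot separate the parameters; it is the width of each parameter's plausible range that drives its contribution to the variance of R0, which is what the global sensitivity analysis in the paper quantifies.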

Relevance: 40.00%

Abstract:

This special issue of Cultural Science Journal is devoted to the report of a groundbreaking experiment in re-coordinating global markets for specialist scholarly books and enabling the knowledge commons: the Knowledge Unlatched proof-of-concept pilot. The pilot took place between January 2012 and September 2014. It involved libraries, publishers, authors, readers and research funders in the process of developing and testing a global library consortium model for supporting Open Access books. The experiment established that authors, librarians, publishers and research funding agencies can work together in powerful new ways to enable open access; that doing so is cost effective; and that a global library consortium model has the potential dramatically to widen access to the knowledge and ideas contained in book-length scholarly works.

Relevance: 40.00%

Abstract:

Specialist scholarly books, including monographs, allow researchers to present their work, pose questions and test and extend areas of theory through long-form writing. In spite of the fact that research communities all over the world value monographs and depend heavily on them as a requirement of tenure and promotion in many disciplines, sales of this kind of book are in free fall, with some estimates suggesting declines of as much as 90% over twenty years (Willinsky 2006). Cash-strapped monograph publishers have found themselves caught in a negative cycle of increasing prices and falling sales, with few resources left to support experimentation, business model innovation or engagement with digital technology and Open Access (OA). This chapter considers an important attempt to tackle failing markets for scholarly monographs, and to enable the wider adoption of OA licenses for book-length works: the 2012–2014 Knowledge Unlatched pilot. Knowledge Unlatched is a bold attempt to reconfigure the market for specialist scholarly books: moving it beyond the sale of ‘content’ towards a model that supports the services valued by scholarly and wider communities in the context of digital possibility. Its success has powerful implications for the way we understand copyright’s role in the creative industries, and for the potential of established institutions and infrastructure to support the open and networked dynamics of a digital age.

Relevance: 30.00%

Abstract:

Improving urban ecosystems and the quality of life of citizens has become a central issue in the global effort of creating sustainable built environments. As human beings, our lives depend entirely on the sustainability of nature, and we need to protect and manage natural resources in a more sustainable way in order to sustain our existence. As a result of population growth and rapid urbanisation, the increasing demand for productivity depletes and degrades natural resources. However, increasing activities and rapid development require more resources, and therefore ecological planning becomes an essential vehicle in preserving scarce natural resources. This paper aims to identify the interaction between urban ecosystems and human activities in the context of urban sustainability, and explores the degrading environmental impacts of this interaction and the necessity and benefits of using sustainability indicators as a tool in sustainable urban environmental management. Additionally, the paper introduces an environmental sustainability indexing model (ASSURE) as an innovative approach to evaluate the environmental conditions of the built environment.