Abstract:
Network Jamming systems provide real-time collaborative performance experiences for novice or inexperienced users. In this paper we outline the interaction design considerations that have emerged through evolutionary development cycles of the jam2jam Network Jamming software, which employs generative techniques that require particular attention to the human-computer relationship. In particular, we describe the co-evolution of features and uses, explore the role of agile development methods in supporting this evolution, and show how the provision of a clear core capability can be matched with options for enhanced features to support multi-levelled user experience and skill development.
Abstract:
In Australia, airports have emerged as important sub-regional activity centres and now pose challenges for both airport operation and planning in the surrounding urban and regional environment. The changing nature of airports in their metropolitan context and the emergence of new pressures and problems require the introduction of a fresh conceptual framework to assist a better understanding of these complex roles and spatial interactions. The approach draws upon the meta-concept of interfaces of an ‘airport metropolis’ as an organising device consisting of four main domains: economic development, land use, infrastructure, and governance. The paper uses the framework to further discuss airport and regional interactions and highlights the use of sustainability criteria to operationalise the model. The approach aims to move research and practice beyond the traditionally compartmentalised analysis of airport issues and policy-making by highlighting interdependencies between airports and regions.
Abstract:
Are 'disappointment' and 'the teaching of disgust' the core of TV Studies? Or might teaching better be accomplished by inspiring positive civic action? Either way, doesn't reality TV do it better? John Hartley uses examples from reality TV to discuss this question.
Abstract:
John Hartley uses the TV show "Dead Like Me" to show how far TV has evolved from the broadcast era.
Abstract:
John Hartley uses the 1956 Olympic Games in Melbourne to discuss the notions of a history of TV and TV History and concludes that the internet offers entirely new possibilities for TV as History.
Abstract:
This paper focuses on the assessment of reflective practice, an issue that has not been fully explored within legal education literature. While the issue of how reflective practice should be taught is one that requires careful consideration, it is beyond the scope of this paper to consider both the teaching and the assessment of reflective practice. Part II of this paper conceptualises reflective practice, and Part III explores the benefits of reflective practice in legal education and the use of reflective writing to assess experiential learning in a legal context. Part IV considers the diverse issues that arise in assessing reflective practice and whether there is an objective method for assessing reflection. Part V of the paper examines the assessment of reflective practice in the context of an exemplar undergraduate law subject that uses a reflective report to assess students’ experiential learning during a court visit. Finally, Part VI offers a rubric to facilitate criterion-referenced assessment of reflective practice and thereby provides a framework for assessing reflection skills. It is suggested that the rubric is transferable not only to other law subjects but also to subjects in other disciplines.
Abstract:
In recent years culture has become one of the most studied topics in project management research. Some studies have investigated the influence of culture at different levels – such as national culture, industry culture, organisational culture and professional culture. As a project-based industry, the construction industry needs to have more insight concerning cultural issues at the project level and their influence on the performance of construction projects. Few studies, however, have focused on culture at the project level. This paper uses a questionnaire survey to determine the perceptions of Chinese contractors about the impact of project culture on the performance of local construction projects. This is augmented by a series of in-depth interviews with senior executive managers in the industry. The findings indicate that specific project culture does contribute significantly towards project outcomes. In particular, goal orientation and flexibility, as two dimensions of project culture, have a negative statistical correlation with perceived satisfaction of the process, commercial success, future business opportunities, lessons learnt from the project, satisfaction with the relationships, and overall performance. This paper also indicates that the affordability of developing an appropriate project culture is a major concern for industry practitioners.
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges with the problems of memory detection and modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range R/S analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
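The MF-DFA procedure described above — build a cumulative profile, detrend each segment, then take the q-th-order average of the segment variances — can be sketched as follows. This is a minimal first-order (linear-detrending) illustration of the general technique, not the implementation used in the thesis:

```python
def mfdfa_fluctuation(series, s, q):
    """q-th order fluctuation F_q(s) for one segment size s, using
    linear detrending (first-order MF-DFA).  q = 2 recovers standard
    DFA; q = 0 needs a separate logarithmic average and is not handled
    here."""
    mean = sum(series) / len(series)
    # Step 1: profile = cumulative sum of mean-subtracted values.
    profile, acc = [], 0.0
    for x in series:
        acc += x - mean
        profile.append(acc)
    # Step 2: split the profile into non-overlapping segments of size s
    # and compute the detrended variance within each segment.
    n_seg = len(profile) // s
    variances = []
    t = list(range(s))
    tb = sum(t) / s
    denom = sum((ti - tb) ** 2 for ti in t)
    for v in range(n_seg):
        seg = profile[v * s:(v + 1) * s]
        yb = sum(seg) / s
        slope = sum((ti - tb) * (yi - yb) for ti, yi in zip(t, seg)) / denom
        resid = [yi - (yb + slope * (ti - tb)) for ti, yi in zip(t, seg)]
        variances.append(sum(r * r for r in resid) / s)
    # Step 3: q-th order average; the scaling F_q(s) ~ s^h(q) over a
    # range of s gives the generalised Hurst exponent h(q).
    return (sum(f ** (q / 2) for f in variances) / n_seg) ** (1 / q)
```

In practice F_q(s) is computed over many segment sizes s and the exponent h(q) is read off a log-log regression; h(2) above 0.5 suggests long memory.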
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which have been established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second order, seem to underestimate the long memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
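A hedged, generic sketch of the model classes described above (the thesis's exact specification may differ) can be written in two steps. The continuous-time AR(∞)-type equation has a kernel given by the Laplace transform of a finite Borel measure $\mu$:

```latex
\frac{dX(t)}{dt} + \int_0^\infty X(t-s)\,k(s)\,ds = \frac{dW(t)}{dt},
\qquad
k(s) = \int_0^\infty e^{-\lambda s}\,d\mu(\lambda),
```

where $W$ is Brownian motion and conditions on $\mu$ determine whether the solution has short or long memory. Choosing the kernel to act as a Riemann-Liouville fractional derivative yields a fractional SDE of the schematic form

```latex
D_t^{\alpha} X(t) = \sigma\,\frac{dW(t)}{dt}, \qquad 0 < \alpha < 1,
```

with long memory carried by the fractional order $\alpha$; replacing $W$ by an $\alpha$-stable Lévy process, as in Part IV, additionally produces the heavy tails.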
Abstract:
Changes in load characteristics, deterioration with age, environmental influences and random actions may cause local or global damage in structures, especially in bridges, which are designed for long life spans. Continuous health monitoring of structures will enable the early identification of distress and allow appropriate retrofitting in order to avoid failure or collapse of the structures. In recent times, structural health monitoring (SHM) has attracted much attention in both research and development. Local and global methods of damage assessment using the monitored information are an integral part of SHM techniques. In the local case, the assessment of the state of a structure is done either by direct visual inspection or using experimental techniques such as acoustic emission, ultrasonic testing, magnetic particle inspection, radiography and eddy current testing. A characteristic of all these techniques is that their application requires a prior localization of the damaged zones. The limitations of the local methodologies can be overcome by using vibration-based methods, which give a global damage assessment. The vibration-based damage detection methods use measured changes in dynamic characteristics to evaluate changes in physical properties that may indicate structural damage or degradation. The basic idea is that modal parameters (notably frequencies, mode shapes, and modal damping) are functions of the physical properties of the structure (mass, damping, and stiffness). Changes in the physical properties will therefore cause changes in the modal properties. Any reduction in structural stiffness and increase in damping in the structure may indicate structural damage. This research uses the variations in vibration parameters to develop a multi-criteria method for damage assessment.
It incorporates the changes in natural frequencies, modal flexibility and modal strain energy to locate damage in the main load bearing elements in bridge structures such as beams, slabs and trusses and simple bridges involving these elements. Dynamic computer simulation techniques are used to develop and apply the multi-criteria procedure under different damage scenarios. The effectiveness of the procedure is demonstrated through numerical examples. Results show that the proposed method incorporating modal flexibility and modal strain energy changes is competent in damage assessment in the structures treated herein.
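The basic idea above — modal frequencies are functions of stiffness and mass, so a stiffness loss shows up as a frequency drop — can be illustrated for a single degree of freedom, where f = (1/2π)·√(k/m). The numbers below are purely illustrative and not taken from the thesis:

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency f = (1 / 2*pi) * sqrt(k / m) of a single
    degree-of-freedom spring-mass oscillator."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Hypothetical illustration: a 10% stiffness loss (simulated damage)
# lowers the natural frequency by a factor of sqrt(0.9), about 5%.
k, m = 2.0e6, 500.0                       # illustrative values
f_healthy = natural_frequency_hz(k, m)
f_damaged = natural_frequency_hz(0.9 * k, m)
```

Real bridge assessment works with many modes of a large finite-element model, which is why the thesis combines several indicators (frequency changes, modal flexibility, modal strain energy) rather than relying on frequency shifts alone.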
Abstract:
Objectives. To evaluate the performance of the dynamic-area high-speed videokeratoscopy technique in the assessment of tear film surface quality with and without the presence of soft contact lenses on eye. Methods. Retrospective data from a tear film study using basic high-speed videokeratoscopy, captured at 25 frames per second (Kopf et al., 2008, J Optom), were used. Eleven subjects had tear film analysis conducted in the morning, midday and evening on the first and seventh day of one week of no lens wear. Five of the eleven subjects then completed an extra week of hydrogel lens wear followed by a week of silicone hydrogel lens wear. Analysis was performed on a 6 second period of the inter-blink recording. The dynamic-area high-speed videokeratoscopy technique uses the maximum available area of Placido ring pattern reflected from the tear interface and eliminates regions of disturbance due to shadows from the eyelashes. A value of tear film surface quality was derived using image processing techniques, based on the quality of the reflected ring pattern orientation. Results. The group mean tear film surface quality and the standard deviations for each of the conditions (bare eye, hydrogel lens, and silicone hydrogel lens) showed a much lower coefficient of variation than previous methods (average reduction of about 92%). Bare eye measurements from the right and left eyes of eleven individuals showed high correlation values (Pearson’s correlation r = 0.73, p < 0.05). Repeated measures ANOVA across the 6 second period of measurement in the normal inter-blink period for the bare eye condition showed no statistically significant changes. However, across the 6 second inter-blink period with both contact lenses, statistically significant changes were observed (p < 0.001) for both types of contact lens material.
Overall, wearing hydrogel and silicone hydrogel lenses caused the tear film surface quality to worsen compared with the bare eye condition (repeated measures ANOVA, p < 0.0001 for both hydrogel and silicone hydrogel). Conclusions. The results suggest that the dynamic-area method of high-speed videokeratoscopy was able to distinguish and quantify the subtle, but systematic worsening of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions.
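The repeatability comparison above rests on the coefficient of variation (standard deviation divided by the mean), a dimensionless measure that lets estimators with different scales be compared. A minimal sketch with hypothetical measurement values (not data from the study):

```python
import statistics

def coefficient_of_variation(values):
    """Sample coefficient of variation, CV = stdev / mean: a
    dimensionless measure of relative variability used to compare the
    repeatability of different estimators."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical repeated measurements of surface quality (illustrative
# numbers only, chosen to show a noisier vs. a steadier estimator):
previous_method = [4.1, 5.9, 3.6, 6.4]
dynamic_area = [4.9, 5.1, 5.0, 5.0]
reduction = 1 - coefficient_of_variation(dynamic_area) / coefficient_of_variation(previous_method)
```

A `reduction` near 0.92 would correspond to the roughly 92% average improvement the abstract reports for the dynamic-area technique.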
Abstract:
The volume is a collection of papers that address issues associated with change in the delivery of VET programs in Queensland, foreshadowed by the release of The Queensland Skills Plan in 2006. Issues that relate to the implementation of the Actions identified in the Queensland Skills Plan are the focus of the collection. In particular, the incorporation of Information Communication Technologies (ICTs) and e-learning approaches in the delivery of training packages is a key topic, as is how such change can be managed in the delivery of training programs, along with broader professional development issues for VET practitioners. Change at an organisational level is the focus of two papers. Lyn Ambrose uses ideas from Diffusion of Innovations Theory to consider how the adoption of eLearning in a TAFE community can be addressed. The paper by Susan Todhunter also discusses the organisational challenges in change initiatives in TAFE Institutes. Specific issues related to the professional development of VET teachers are the focus of the papers by Mary Campbell, Sharon Altena, and Judy Gronold. Mary Campbell discusses the importance of building staff capabilities within the TAFE system and how this might be managed. Sharon Altena considers how professional development programs are currently delivered and how new approaches to professional development for TAFE teachers are needed to ensure changes can be sustained in teaching practice. The paper by Judy Gronold takes up a specific challenge for VET practitioners in the Queensland Skills Plan. She addresses issues related to embedding employability skills into training delivery in order to address industries’ need for flexible, multi-skilled productive workers. Mark Driver discusses the issues resulting from the increased number of mature-aged learners in VET programs and how this change in the demographic profile of students presents challenges to the VET system.
In the paper by David McKee, the implications of incorporating ICTs into trade training are discussed, along with the need for effective change management strategies to ensure a smooth transition to new ways of delivering trade training. Finally, in the paper by David Roberts, the potential of Problem-Based Learning (PBL) approaches in VET training and the role of ICTs within such approaches are discussed. David uses horticulture training as an example to discuss the issues in implementing PBL effectively in VET programs. These papers were completed by the authors as a part of their postgraduate studies at QUT. The views reported are those of the authors and should not be attributed to the Queensland Department of Education, Training and the Arts.
Abstract:
This is an experimental study into the permeability and compressibility properties of bagasse pulp pads. Three experimental rigs were custom-built for this project. The experimental work is complemented by modelling work. Both the steady-state and dynamic behaviour of pulp pads are evaluated in the experimental and modelling components of this project. Bagasse, the fibrous residue that remains after sugar is extracted from sugarcane, is normally burnt in Australia to generate steam and electricity for the sugar factory. A study into bagasse pulp was motivated by the possibility of making highly value-added pulp products from bagasse for the financial benefit of sugarcane millers and growers. The bagasse pulp and paper industry is a multibillion dollar industry (1). Bagasse pulp could replace eucalypt pulp which is more widely used in the local production of paper products. An opportunity exists for replacing the large quantity of mainly generic paper products imported to Australia. This includes 949,000 tonnes of generic photocopier papers (2). The use of bagasse pulp for paper manufacture is the main application area of interest for this study. Bagasse contains a large quantity of short parenchyma cells called ‘pith’. Around 30% of the shortest fibres are removed from bagasse prior to pulping. Despite the ‘depithing’ operations in conventional bagasse pulp mills, a large amount of pith remains in the pulp. Amongst Australian paper producers there is a perception that the high quantity of short fibres in bagasse pulp leads to poor filtration behaviour at the wet-end of a paper machine. Bagasse pulp’s poor filtration behaviour reduces paper production rates and consequently revenue when compared to paper production using locally made eucalypt pulp. Pulp filtration can be characterised by two interacting factors: permeability and compressibility.
Surprisingly, there has previously been very little rigorous investigation into either bagasse pulp permeability or compressibility. Only freeness testing of bagasse pulp has been published in the open literature. As a result, this study has focussed on a detailed investigation of the filtration properties of bagasse pulp pads. As part of this investigation, this study investigated three options for improving the permeability and compressibility properties of Australian bagasse pulp pads. Two options for further pre-treating depithed bagasse prior to pulping were considered. Firstly, bagasse was fractionated based on size. Two bagasse fractions were produced, ‘coarse’ and ‘medium’ bagasse fractions. Secondly, bagasse was collected after being processed on two types of juice extraction technology, i.e. from a sugar mill and from a sugar diffuser. Finally, one method of post-treating the bagasse pulp was investigated. The effects of chemical additives, which are known to improve freeness, were also assessed for their effect on pulp pad permeability and compressibility. Pre-treated Australian bagasse pulp samples were compared with several benchmark pulp samples. A sample of commonly used kraft Eucalyptus globulus pulp was obtained. A sample of depithed Argentinean bagasse, which is used for commercial paper production, was also obtained. A sample of Australian bagasse which was depithed as per typical factory operations was also produced for benchmarking purposes. The steady-state pulp pad permeability and compressibility parameters were determined experimentally using two purpose-built experimental rigs. In reality, steady-state conditions do not exist on a paper machine. The permeability changes as the sheet compresses over time. Hence, a dynamic model was developed which uses the experimentally determined steady-state permeability and compressibility parameters as inputs.
The filtration model was developed with a view to designing pulp processing equipment that is suitable specifically for bagasse pulp. The predicted results of the dynamic model were compared to experimental data. The effectiveness of polymeric and microparticle chemical additives for improving the retention of short fibres and increasing the drainage rate of a bagasse pulp slurry was determined in a third purpose-built rig; a modified Dynamic Drainage Jar (DDJ). These chemical additives were then used in the making of a pulp pad, and their effect on the steady-state and dynamic permeability and compressibility of bagasse pulp pads was determined. The most important finding from this investigation was that Australian bagasse pulp was produced with higher permeability than eucalypt pulp, despite a higher overall content of short fibres. It is thought this research outcome could enable Australian paper producers to switch from eucalypt pulp to bagasse pulp without sacrificing paper machine productivity. It is thought that two factors contributed to the high permeability of the bagasse pulp pad. Firstly, thicker cell walls of the bagasse pulp fibres resulted in high fibre stiffness. Secondly, the bagasse pulp had a large proportion of fibres longer than 1.3 mm. These attributes helped to reinforce the pulp pad matrix. The steady-state permeability and compressibility parameters for the eucalypt pulp were consistent with those found by previous workers. It was also found that Australian pulp derived from the ‘coarse’ bagasse fraction had higher steady-state permeability than the ‘medium’ fraction. However, there was no difference between bagasse pulp originating from a diffuser or a mill. The bagasse pre-treatment options investigated in this study were not found to affect the steady-state compressibility parameters of a pulp pad.
The dynamic filtration model was found to give predictions that were in good agreement with experimental data for pads made from samples of pretreated bagasse pulp, provided at least some pith was removed prior to pulping. Applying vacuum to a pulp slurry in the modified DDJ dramatically reduced the drainage time. At any level of vacuum, bagasse pulp benefitted from chemical additives as quantified by reduced drainage time and increased retention of short fibres. Using the modified DDJ, it was observed that under specific conditions, a benchmark depithed bagasse pulp drained more rapidly than the ‘coarse’ bagasse pulp. In steady-state permeability and compressibility experiments, the addition of chemical additives improved the pad permeability and compressibility of a benchmark bagasse pulp with a high quantity of short fibres. Importantly, this effect was not observed for the ‘coarse’ bagasse pulp. However, dynamic filtration experiments showed that there was also a small observable improvement in filtration for the ‘medium’ bagasse pulp. The mechanism of bagasse pulp pad consolidation appears to be by fibre realignment. Chemical additives assist in lubricating the consolidation process. This study was complemented by pulp physical and chemical property testing and a microscopy study. In addition to its high pulp pad permeability, ‘coarse’ bagasse pulp often (but not always) had physical properties superior to those of a benchmark depithed bagasse pulp.
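The dynamic filtration behaviour described above can be caricatured with a Darcy-law time-stepping sketch: flow through the pad follows q = K·ΔP/(μ·L), and the pad thickens as filtrate passes, so the drainage rate falls over time. This is a deliberately simplified model that omits the compressibility parameters the thesis actually estimates, and all parameter values below are illustrative:

```python
def filtrate_volume(perm_m2, press_pa, visc_pa_s, area_m2,
                    cake_per_vol, dt, steps):
    """Explicit Euler sketch of constant-pressure filtration through a
    growing pulp pad, using Darcy's law  q = K * dP / (mu * L).
    cake_per_vol: pad thickness deposited per unit filtrate volume per
    unit area.  Returns cumulative filtrate volume (m^3)."""
    volume = 0.0
    thickness = 1e-4              # small initial pad avoids division by zero
    for _ in range(steps):
        q = perm_m2 * press_pa / (visc_pa_s * thickness)  # superficial velocity, m/s
        volume += q * area_m2 * dt
        thickness = 1e-4 + cake_per_vol * volume / area_m2
    return volume
```

Even this crude sketch reproduces the qualitative point of the thesis finding: a pulp with higher pad permeability drains more filtrate in the same time, which is why the high permeability of the coarse bagasse pulp matters for paper machine productivity.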
Abstract:
With the phenomenal growth of electronic data and information, there are many demands for the development of efficient and effective systems (tools) to perform data mining tasks on multidimensional databases. Association rules describe associations between items in the same transactions (intra) or in different transactions (inter). Association mining attempts to find interesting or useful association rules in databases: this is the crucial issue for the application of data mining in the real world. Association mining can be used in many application areas, such as the discovery of associations between customers’ locations and shopping behaviours in market basket analysis. Association mining includes two phases. The first phase, called pattern mining, is the discovery of frequent patterns. The second phase, called rule generation, is the discovery of interesting and useful association rules in the discovered patterns. The first phase, however, often takes a long time to find all frequent patterns; these also include much noise. The second phase is also a time consuming activity that can generate many redundant rules. To improve the quality of association mining in databases, this thesis provides an alternative technique, granule-based association mining, for knowledge discovery in databases, where a granule refers to a predicate that describes common features of a group of transactions. The new technique first transfers transaction databases into basic decision tables, then uses multi-tier structures to integrate pattern mining and rule generation in one phase for both intra- and inter-transaction association rule mining. To evaluate the proposed new technique, this research defines the concept of meaningless rules by considering the co-relations between data dimensions for intra-transaction association rule mining. It also uses precision to evaluate the effectiveness of inter-transaction association rules.
The experimental results show that the proposed technique is promising.
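The notion of a granule above — a predicate capturing the common features of a group of transactions — can be sketched by grouping transactions on their shared attribute values; each distinct value tuple then acts as one granule, and its count replaces repeated scans of the raw transactions. A minimal illustration (attribute names are hypothetical, not from the thesis):

```python
from collections import defaultdict

def build_granules(transactions, attributes):
    """Group transactions that share the same values on the chosen
    attributes.  Each key (a tuple of values) acts as a granule; the
    count is the number of transactions that granule covers, which is
    exactly the support needed for rule generation."""
    granules = defaultdict(int)
    for t in transactions:
        key = tuple(t[a] for a in attributes)
        granules[key] += 1
    return dict(granules)

# Hypothetical market-basket rows: customer location and purchased item.
rows = [
    {"location": "north", "item": "milk"},
    {"location": "north", "item": "milk"},
    {"location": "north", "item": "bread"},
    {"location": "south", "item": "bread"},
]
granules = build_granules(rows, ["location", "item"])
```

From these counts, the confidence of a rule such as location=north ⇒ item=milk is the granule count divided by the total count of granules matching the condition (2/3 here), without rescanning the transaction table.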
Abstract:
To meet new challenges of Enterprise Systems that essentially go beyond the initial implementation, contemporary organizations seek business process experts with software skills. Despite a healthy demand from the industry for such expertise, recent studies reveal that most Information Systems (IS) graduates are ill-equipped to meet the challenges of modern organizations. This paper shares insights and experiences from a course that is designed to provide a business process centric view of a market leading Enterprise System. The course, designed for both undergraduate and graduate students, uses two common business processes in a case study that employs both sequential and explorative exercises. Student feedback gained through two longitudinal surveys across two phases of the course demonstrates promising support for the teaching approach.
Abstract:
This article examines the role of the recently introduced fair dealing exception for the purposes of parody and satire in Australian copyright law. Parody and satire, while central to Australian expression, pose a substantial challenge for copyright policy. The law is asked to strike a delicate balance between an author’s right to exploit their work, the interests of the public in stimulating free speech and critical discussion, the rights of artists who rely on existing material in creating their own expression, and the rights of all artists in their reputation and the integrity of their works. This article highlights the difficulty parodists and satirists have historically faced in Australia and examines the potential of the new fair dealing exceptions to relieve this difficulty. This article concludes that the new exceptions have the potential, if read broadly, not only to bridge the gap between humorous and non-humorous criticism, but also to allow for the use of copyright material to critique figures other than the copyright owner or author, extending to society generally. This article will argue that the new exceptions should be read broadly to further this important policy goal while also being limited in their application so as to prevent mere substitutable uses of copyright material. To achieve these twin goals, I suggest that the primary indication of fairness of an unlicensed parody should be whether or not it adds significant new expression so as not to be substitutable for the original work.