909 results for Large-volume Quartz Latites
Abstract:
Business Process Management (BPM) has increased in popularity and maturity in recent years. Large enterprises use process management approaches to model, manage and refine repositories of process models that detail the whole enterprise. These process models can number in the thousands and may contain large hierarchies of tasks and control structures that become cumbersome to maintain. Tools are therefore needed to traverse this process model space efficiently; otherwise the repositories remain hard to use and their effectiveness is diminished. In this paper we analyse a range of BPM tools for their effectiveness in handling large process models. We establish that the present set of commercial tools is lacking in key areas regarding visualisation of, and interaction with, large process models. We then propose six tool functionalities for the development of advanced business process visualisation and interaction, and present a design for a tool that will exploit the latest advances in 2D and 3D computer graphics to enable fast and efficient search, traversal and modification of process models.
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined on the basis of the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches also have to deal with low-frequency pattern issues. The measures used by data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of the threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims to solve the problem of information mismatch and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because a relatively small number of documents remains after the first stage, the computational cost is markedly reduced, and at the same time pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, including the traditional Rocchio IF model and state-of-the-art term-based models such as BM25, Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
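The two-stage architecture described above can be summarised in a short, hypothetical sketch: stage one applies a thresholded term-based profile score to discard likely irrelevant documents, and stage two re-ranks the survivors with a pattern-based score. The profile weights, patterns, threshold and scoring functions below are illustrative assumptions, not the thesis' actual model.

```python
# Minimal sketch of a two-stage filter in the spirit of the abstract.
# Stage 1: term-based topic filtering with a threshold (recall-oriented).
# Stage 2: pattern-based re-ranking of the surviving documents (precision-oriented).
# All weights, patterns and the threshold are hypothetical placeholders.
from typing import Dict, FrozenSet, List, Optional, Set

def topic_filter(doc_terms: Set[str], profile: Dict[str, float],
                 threshold: float) -> Optional[float]:
    """Stage 1: sum profile weights of matching terms; None means filtered out."""
    score = sum(weight for term, weight in profile.items() if term in doc_terms)
    return score if score >= threshold else None

def pattern_rank(doc_terms: Set[str],
                 patterns: Dict[FrozenSet[str], float]) -> float:
    """Stage 2: score a document by the weighted patterns it contains."""
    return sum(w for pattern, w in patterns.items() if pattern <= doc_terms)

def two_stage_filter(docs: List[Set[str]], profile, patterns, threshold):
    survivors = [(i, d) for i, d in enumerate(docs)
                 if topic_filter(d, profile, threshold) is not None]
    ranked = sorted(survivors, key=lambda x: pattern_rank(x[1], patterns),
                    reverse=True)
    return [i for i, _ in ranked]

# Toy usage with made-up data: returns document indices in ranked order.
profile = {"filter": 0.6, "stream": 0.5, "user": 0.4}
patterns = {frozenset({"filter", "stream"}): 1.0, frozenset({"user"}): 0.3}
docs = [{"filter", "stream", "noise"}, {"cooking", "recipe"}, {"user", "filter"}]
print(two_stage_filter(docs, profile, patterns, threshold=0.5))
```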
Abstract:
Review of 'The White Earth', La Boite Theatre Company, published in The Australian, 25 February 2009.
Abstract:
This report presents findings from the largest survey ever undertaken in Australia of aspiring creatives who work, or intend to work, in the digital content industries. Survey respondents included those with aspirations to work in the publicly supported, less commercial end of the Creative Industries spectrum as well as those with aspirations to work in the digital content industries. The survey gathered rich data on their characteristics, skills and attributes, barriers to employment, workforce mobility, career intentions, professional development, mentors and industry supports, and participation in communities of practice. The survey sought to determine whether aspiring creatives have the necessary skills and attributes to work effectively in the digital content industries. This task also involved finding out how they develop their skills and attributes, and what they need to develop them further.
Abstract:
Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. With very high resolution (VHR) imagery now available from digital airborne sensors, data acquisition, collection and updating will be greatly facilitated if road details can be extracted automatically from the aerial images. In this paper, we propose an effective approach to detect road lane information from aerial images using an object-oriented image analysis method. The proposed algorithm starts by constructing the DSM and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects with similar spectral and geometric attributes interfere with the detection, the extracted road lanes are filtered against the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values that range between 76% and 98% and correctness values that range between 82% and 97%.
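For readers unfamiliar with the completeness and correctness figures quoted above, they are typically computed from the lengths of matched reference and extracted road centrelines. The short sketch below shows only that calculation; the buffer-based matching step is assumed to have been done elsewhere, and all input lengths are hypothetical rather than values from the paper.

```python
# Illustrative sketch of the standard road-extraction quality measures.
# Lengths are in metres and are invented placeholders.
def completeness(matched_reference_len: float, total_reference_len: float) -> float:
    """Fraction of the reference road network that was found by the extraction."""
    return matched_reference_len / total_reference_len

def correctness(matched_extracted_len: float, total_extracted_len: float) -> float:
    """Fraction of the extracted road network that corresponds to real roads."""
    return matched_extracted_len / total_extracted_len

print(f"completeness = {completeness(9200.0, 10000.0):.0%}")  # e.g. 92%
print(f"correctness  = {correctness(8800.0, 9500.0):.0%}")    # e.g. 93%
```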
Abstract:
The automatic extraction of road features from remotely sensed images has been a topic of great interest within the photogrammetric and remote sensing communities for over three decades. Although various techniques have been reported in the literature, it remains challenging to extract road details efficiently as image resolution increases and the requirement for accurate and up-to-date road data grows. In this paper, we focus on the automatic detection of road lane markings, which are crucial for many applications, including lane-level navigation and lane departure warning. The approach consists of four steps: i) data preprocessing, ii) image segmentation and road surface detection, iii) road lane marking extraction based on the generated road surface, and iv) testing and system evaluation. The proposed approach utilizes the unsupervised ISODATA image segmentation algorithm, which segments the image into vegetation regions and road surface based only on the Cb component of the YCbCr color space. A shadow detection method based on the YCbCr color space is also employed to detect and recover the shadows cast on the road surface by vehicles and trees. Finally, the lane marking features are detected from the road surface using histogram clustering. Experiments applying the proposed method to an aerial imagery dataset of Gympie, Queensland, demonstrate the efficiency of the approach.
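As a rough illustration of steps ii) and iii), the sketch below clusters the Cb channel of a YCbCr image into two classes and then takes the brightest pixels within the road class as candidate lane markings. It is not the authors' implementation: k-means stands in for ISODATA, shadow recovery is omitted, and both the percentile cut-off and the assumption that the road surface has the higher mean Cb value are hypothetical.

```python
# Minimal sketch: segment road surface on the Cb channel of YCbCr, then pick
# bright pixels within the road surface as candidate lane markings.
# Assumes a BGR aerial image loaded with OpenCV; k-means (k=2) substitutes
# for the ISODATA clustering described in the abstract.
import cv2
import numpy as np

def extract_lane_markings(bgr_image):
    # OpenCV orders the channels Y, Cr, Cb; index 2 is the Cb channel.
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    cb = ycrcb[:, :, 2].astype(np.float32).reshape(-1, 1)

    # Two-class clustering of Cb values (vegetation vs. road surface).
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centres = cv2.kmeans(cb, 2, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    road_cluster = int(np.argmax(centres))  # assumption: road has the higher Cb mean
    road_mask = labels.reshape(ycrcb.shape[:2]) == road_cluster

    # Lane markings are the brightest pixels (high Y) within the road surface.
    y_channel = ycrcb[:, :, 0]
    threshold = np.percentile(y_channel[road_mask], 98)  # hypothetical cut-off
    return road_mask & (y_channel >= threshold)
```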
Abstract:
The realities of new technological and social conditions since the 1990s demand a new approach to literacy teaching. Looking onward from the original statement of aims of the multiliteracies movement in 1996, this volume brings together top-quality scholarship and research that has embraced the notion and features new contributions by many of the originators of this approach to literacy. Drawing on large research projects and empirical evidence, the authors explore practical and educational issues that relate to multiliteracies, such as assessment, pedagogy and curriculum. The viewpoint taken is that multiliteracies is a complementary socio-cultural approach to the new literacies that includes pedagogy and learning. The differences are addressed from a multiliteracies perspective – one that does not discount or undermine the new literacies, but shows new ways in which they are complementary. Computers and the Internet are transforming the way we work and communicate and the very notion of literacy itself. This volume offers frontline information and a vital update for those wishing to understand the evolution of multiliteracies and the current state of literacy theory in relation to it.
Abstract:
Motor vehicles are a major source of gaseous and particulate matter pollution in urban areas, particularly of ultrafine particles (diameters < 0.1 µm). Exposure to particulate matter has been found to be associated with serious health effects, including respiratory and cardiovascular disease and mortality. Particle emissions generated by motor vehicles span a very broad size range (approximately 0.003-10 µm) and are measured as different subsets of particle mass concentration or particle number count. However, there are scientific challenges in analysing and interpreting the large data sets on motor vehicle emission factors, and there is no established understanding of how different particle metrics can be applied as a basis for air quality regulation. To date, no comprehensive inventory covering the broad size range of particles emitted by motor vehicles, including particle number, exists anywhere in the world. This thesis covers research related to four important and interrelated aspects pertaining to particulate matter generated by motor vehicle fleets: the derivation of suitable particle emission factors for use in transport modelling and health impact assessments; the quantification of motor vehicle particle emission inventories; the investigation of modality within particle size distributions as a potential basis for developing air quality regulation; the review and synthesis of current knowledge on ultrafine particles as it relates to motor vehicles; and the application of these aspects to the quantification, control and management of motor vehicle particle emissions. In order to quantify emissions in terms of a comprehensive inventory covering the full size range of particles emitted by motor vehicle fleets, it was necessary to derive a suitable set of particle emission factors for different vehicle and road type combinations for particle number, particle volume, PM1, PM2.5 and PM10 (mass concentrations of particles with aerodynamic diameters < 1 µm, < 2.5 µm and < 10 µm respectively). The very large data set of emission factors analysed in this study was sourced from measurement studies conducted in developed countries, and hence the derived set of emission factors is suitable for preparing inventories in other urban regions of the developed world. These emission factors are particularly useful for regions that lack the measurement data needed to derive emission factors, or where experimental data are available but are of insufficient scope. The comprehensive particle emissions inventory presented in this thesis is the first published inventory of tailpipe particle emissions prepared for a motor vehicle fleet, and it quantified particle emissions covering the full size range of particles emitted by vehicles, based on measurement data. The inventory quantified particle emissions measured in terms of particle number and different particle mass size fractions. It was developed for the urban South-East Queensland fleet in Australia, and included testing the particle emission implications of future scenarios for different passenger and freight travel demand. The thesis also presents evidence of the usefulness of examining modality within particle size distributions as a basis for developing air quality regulations, and finds evidence to support the relevance of introducing a new PM1 mass ambient air quality standard for the majority of environments worldwide.
The study found that a combination of PM1 and PM10 standards is likely to be a more discerning and suitable set of ambient air quality standards for controlling particles emitted from combustion and mechanically generated sources, such as motor vehicles, than the current mass standards of PM2.5 and PM10. The study also reviewed and synthesized existing knowledge on ultrafine particles, with a specific focus on those originating from motor vehicles. It found that motor vehicles are significant contributors to both air pollution and ultrafine particles in urban areas, and that a standardized measurement procedure is not currently available for ultrafine particles. The review found that discrepancies exist between the outcomes of different instruments used to measure ultrafine particles; that little data is available on ultrafine particle chemistry and composition, long-term monitoring, and the characterization of their spatial and temporal distribution in urban areas; and that no particle number inventories are available for motor vehicle fleets. This knowledge is critical for epidemiological studies and exposure-response assessment. Conclusions from this review included the recommendation that ultrafine particles in populated urban areas be considered a likely target for future air quality regulation based on particle number, due to their potential impacts on the environment. The research in this PhD thesis successfully integrated the elements needed to quantify and manage motor vehicle fleet emissions, and its novelty lies in combining expertise from two distinct disciplines: aerosol science and transport modelling. The new knowledge and concepts developed in this PhD research provide previously unavailable data and methods which can be used to develop comprehensive, size-resolved inventories of motor vehicle particle emissions, and air quality regulations to control particle emissions to protect the health and well-being of current and future generations.
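At its core, a size-resolved fleet inventory of the kind described above aggregates emission factor × activity over vehicle and road type combinations. The sketch below illustrates only that arithmetic; the vehicle categories, emission factors and vehicle kilometres travelled (VKT) are invented placeholders, not values from the thesis.

```python
# Illustrative sketch of a fleet emissions inventory: sum of
# emission factor x vehicle kilometres travelled (VKT) over each
# vehicle/road-type combination and particle metric.
# Emission factors: particle number in particles/km, PM10 in mg/km (hypothetical).
emission_factors = {
    ("passenger_car", "urban_arterial"): {"particle_number": 2.0e14, "PM10": 30.0},
    ("heavy_diesel",  "freeway"):        {"particle_number": 9.0e14, "PM10": 200.0},
}
# Annual VKT for each combination (hypothetical).
vkt = {
    ("passenger_car", "urban_arterial"): 1.5e9,
    ("heavy_diesel",  "freeway"):        2.0e8,
}

def fleet_inventory(emission_factors, vkt):
    """Sum emission factor x VKT for every vehicle/road combination and metric."""
    totals = {}
    for key, factors in emission_factors.items():
        for metric, ef in factors.items():
            totals[metric] = totals.get(metric, 0.0) + ef * vkt[key]
    return totals

print(fleet_inventory(emission_factors, vkt))
```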
Abstract:
A major focus of research in nanotechnology is the development of novel, high throughput techniques for fabrication of arbitrarily shaped surface nanostructures at sub-100 nm to atomic scales. A related pursuit is the development of simple and efficient means for parallel manipulation and redistribution of adsorbed atoms, molecules and nanoparticles on surfaces – adparticle manipulation. These techniques will be used for the manufacture of nanoscale surface-supported functional devices in nanotechnologies such as quantum computing, molecular electronics and lab-on-a-chip, as well as for modifying surfaces to obtain novel optical, electronic, chemical, or mechanical properties. A favourable approach to formation of surface nanostructures is self-assembly. In self-assembly, nanostructures are grown by aggregation of individual adparticles that diffuse by thermally activated processes on the surface. The passive nature of this process means it is generally not suited to formation of arbitrarily shaped structures. The self-assembly of nanostructures at arbitrary positions has been demonstrated, though these demonstrations have typically required a pre-patterning treatment of the surface using sophisticated techniques such as electron beam lithography. On the other hand, a parallel adparticle manipulation technique would be suited to directing the self-assembly process to occur at arbitrary positions, without the need for pre-patterning the surface. There is at present a lack of techniques for parallel manipulation and redistribution of adparticles to arbitrary positions on the surface. This is an issue that needs to be addressed, since such techniques can play an important role in nanotechnology. In this thesis, we propose such a technique – thermal tweezers. In thermal tweezers, adparticles are redistributed by localised heating of the surface. This locally enhances surface diffusion of adparticles so that they rapidly diffuse away from the heated regions. Using this technique, the redistribution of adparticles to form a desired pattern is achieved by heating the surface at specific regions. In this project, we have focussed on the holographic implementation of this approach, where the surface is heated by holographic patterns of interfering pulsed laser beams. This implementation is suitable for the formation of arbitrarily shaped structures; the only condition is that the shape can be produced by holographic means. In the simplest case, the laser pulses are linearly polarised and intersect to form an interference pattern that is a modulation of intensity along a single direction. Strong optical absorption at the intensity maxima of the interference pattern results in an approximately sinusoidal variation of the surface temperature along one direction. The main aim of this research project is to investigate the feasibility of the holographic implementation of thermal tweezers as an adparticle manipulation technique. Firstly, we investigate theoretically the surface diffusion of adparticles in the presence of a sinusoidal modulation of the surface temperature. Very strong redistribution of adparticles is predicted when there is strong interaction between the adparticle and the surface, and the amplitude of the temperature modulation is ~100 K. We have proposed a thin metallic film deposited on a glass substrate and heated by interfering laser beams (optical wavelengths) as a means of generating a very large amplitude of surface temperature modulation.
Indeed, we predict theoretically, by numerical solution of the thermal conduction equation, that the amplitude of the temperature modulation on the metallic film can be much greater than 100 K when heated by nanosecond pulses with an energy of ~1 mJ. The formation of surface nanostructures of less than 100 nm in width is predicted at optical wavelengths in this implementation of thermal tweezers. Furthermore, we propose a simple extension to this technique in which a spatial phase shift of the temperature modulation effectively doubles or triples the resolution. Increased resolution is also predicted when the wavelength of the laser pulses is reduced. In addition, we present two distinctly different, computationally efficient numerical approaches for theoretical investigation of surface diffusion of interacting adparticles – the Monte Carlo Interaction Method (MCIM) and the random potential well method (RPWM). Using each of these approaches we have investigated thermal tweezers for redistribution of both strongly and weakly interacting adparticles. We have predicted that strong interactions between adparticles can increase the effectiveness of thermal tweezers, by demonstrating practically complete adparticle redistribution into the low temperature regions of the surface. This is promising from the point of view of thermal tweezers applied to directed self-assembly of nanostructures. Finally, we present a new and more efficient numerical approach to theoretical investigation of thermal tweezers for non-interacting adparticles. In this approach, the local diffusion coefficient is determined from solution of the Fokker-Planck equation. The diffusion equation is then solved numerically using the finite volume method (FVM) to directly obtain the probability density of adparticle position. We compare predictions of this approach to those of the Ermak algorithm solution of the Langevin equation, and relatively good agreement is shown at intermediate and high friction. In the low friction regime, we predict and investigate the phenomenon of ‘optimal’ friction and attribute its occurrence to very long jumps of adparticles as they diffuse from the hot regions of the surface. Future research directions, both theoretical and experimental, are also discussed.
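The central mechanism, thermally activated diffusion locally enhanced by a sinusoidal temperature modulation, can be illustrated with a short sketch. Assuming Arrhenius hopping with rates set by the local temperature of the departure site, the steady-state occupancy scales as 1/D(x), so adparticles accumulate in the cold regions. All parameter values (activation energy, prefactor, modulation period and amplitude) below are illustrative assumptions rather than values from the thesis.

```python
# Minimal sketch: Arrhenius surface diffusion under a sinusoidal surface
# temperature T(x) = T0 + dT*sin(2*pi*x/period). With hop rates set by the
# local temperature of the departure site, the zero-flux steady state has
# occupancy proportional to 1/D(x); all parameter values are hypothetical.
import numpy as np

KB = 8.617e-5  # Boltzmann constant, eV/K

def diffusion_profile(x, period, t_mean=300.0, t_amp=100.0,
                      d0=1.0e-7, e_act=0.8):
    """Local diffusion coefficient D(x) = D0 * exp(-Ea / (kB * T(x))), m^2/s."""
    temperature = t_mean + t_amp * np.sin(2.0 * np.pi * x / period)
    return d0 * np.exp(-e_act / (KB * temperature)), temperature

x = np.linspace(0.0, 532e-9, 1000)          # one optical-scale modulation period
d, temp = diffusion_profile(x, period=532e-9)

# Steady-state (zero-flux) occupancy, normalised: n(x) ~ 1/D(x).
density = (1.0 / d) / np.mean(1.0 / d)
print(f"T range: {temp.min():.0f}-{temp.max():.0f} K, "
      f"density contrast (cold/hot): {density.max() / density.min():.2e}")
```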
Abstract:
The volume is a collection of papers that address issues associated with change in the delivery of VET programs in Queensland, foreshadowed by the release of The Queensland Skills Plan in 2006. Issues that relate to the implementation of the Actions identified in the Queensland Skills Plan are the focus of the collection. Key topics include the incorporation of Information Communication Technologies (ICTs) and e-learning approaches in the delivery of training packages, how such change can be managed in the delivery of training programs, and broader professional development issues for VET practitioners. Change at an organisational level is the focus of two papers. Lyn Ambrose uses ideas from Diffusion of Innovations Theory to consider how the adoption of e-learning in a TAFE community can be addressed. The paper by Susan Todhunter also discusses the organisational challenges in change initiatives in TAFE Institutes. Specific issues related to the professional development of VET teachers are the focus of the papers by Mary Campbell, Sharon Altena, and Judy Gronold. Mary Campbell discusses the importance of building staff capabilities within the TAFE system and how this might be managed. Sharon Altena considers how professional development programs are currently delivered and how new approaches to professional development for TAFE teachers are needed to ensure that changes in teaching practice can be sustained. The paper by Judy Gronold takes up a specific challenge for VET practitioners in the Queensland Skills Plan: she addresses issues related to embedding employability skills into training delivery in order to address industries’ need for flexible, multi-skilled, productive workers. Mark Driver discusses the issues resulting from the increased number of mature-aged learners in VET programs and how this change in the demographic profile of students presents challenges to the VET system. In the paper by David McKee, the implications of incorporating ICTs into trade training are discussed, along with the need for effective change management strategies to ensure a smooth transition to new ways of delivering trade training. Finally, in the paper by David Roberts, the potential of Problem-Based Learning (PBL) approaches in VET training and the role of ICTs within such approaches are discussed. David uses horticulture training as an example to discuss the issues in implementing PBL effectively in VET programs. These papers were completed by the authors as part of their postgraduate studies at QUT. The views reported are those of the authors and should not be attributed to the Queensland Department of Education, Training and the Arts.
Abstract:
This volume is the second in a series that addresses change and development in the delivery of vocational education and training programs in Queensland. A similar volume was published in 2007. Considerable change was foreshadowed for TAFE Queensland by the release of The Queensland Skills Plan (QSP) in 2006. This volume addresses implementation issues for the Actions identified in the QSP. The chapters focus on a breadth of issues that relate to the changing landscape for teaching and learning in TAFE Institutes. The incorporation of Information Communication Technologies (ICTs) and e-learning approaches into the delivery of training packages remains a key focus for change, as was evident in the first volume of this series. The chapters also consider issues for some client groups in VET, as well as approaches to professional development to build the capabilities of staff for new teaching and learning environments. The chapter by Sandra Lawrence examines the professional development issues for staff across TAFE Institutes in the implementation of the Learning Management System. Suzanne Walsh discusses the issues of new “learning spaces” and “Mode 2” learning in the re-development at Southbank Institute. The chapter by Angela Simpson focuses on VET in schools and school-to-work transition programs. Josie Drew, in her chapter, takes up the issues of embedding employability skills into the delivery of training packages through flexible delivery. The chapter by Colleen Hodgins focuses on the organisational challenges for Lead Institutes in relation to professional development for TAFE educators in light of policy changes. Bradley Jones discusses the changing roles and importance of libraries in VET contexts. He examines the adequacy of the VOCED database and reflects on the current nature, role, and practices of VET libraries. Finally, Piero Dametto discusses the pragmatics for TAFE educators in understanding the use of digital objects and learning objects within the LMS and LCMS systems that were presaged in the QSP. These papers were completed by the authors as part of their postgraduate studies at QUT. The views reported are those of the authors and should not be attributed to the Queensland Department of Education, Training and the Arts. Donna Berthelsen, Faculty of Education, Queensland University of Technology.
Abstract:
The volume is the third in a series that addresses change and development in the delivery of VET programs in Queensland. The chapters address a breadth of issues that relate to the changing landscape for teaching and learning in VET programs through e-learning. Organisational change is a key focus of this volume. James Waterson examines business and pedagogical perspectives for SkillsTech Australia to create teaching and learning environments that will enrich the learning experience for staff and students. Kerry Emerson explores the ways in which printing teachers might change current practices in order to deliver their training to apprentices and trainees – off-the-job, on-the-job and online, through e-learning. The chapter by Erik Dodwell takes up issues of user-friendly RPL interviews and the challenge of developing a model that may be applied consistently across all industry areas using conversational interviews. The chapters by Linda Roberston, Nina Woodrow, and Anushka Weerackody discuss teaching and learning for specific groups of students. Linda Roberston proposes a vision for learning support teachers in which the learner is encouraged to be a self-directed and autonomous learner who is capable of utilising e-learning resources and is information literate. Nina Woodrow presents a case for the diminished sense of occupational identity among adult literacy teachers and the implications of this loss of expertise across the VET sector; constructive strategies for change are outlined. The chapter by Anushka Weerackody explores ideas to enhance the practices of ESL teachers in TAFE Queensland through an online community of practice. The issues of on-the-job training through e-learning for various trades are considered by Anne-Louise Johnston, along with the benefits and challenges for SkillsTech Australia in developing training partnerships with industry to deliver these changes. These papers were completed by the authors as part of their postgraduate studies at QUT. The views reported are those of the authors and should not be attributed to the Queensland Department of Education and Training. Donna Berthelsen and Lauren Vogel, Faculty of Education, Queensland University of Technology, December 2009.
Abstract:
Conventional clinical therapies are unable to resolve osteochondral defects adequately; hence, tissue engineering solutions are sought to address the challenge. A biphasic implant seeded with Mesenchymal Stem Cells (MSCs) and coupled with an electrospun membrane was evaluated as an alternative. This dual-phase construct comprised a Polycaprolactone (PCL) cartilage scaffold and a Polycaprolactone-Tricalcium Phosphate (PCL-TCP) osseous matrix. Autologous MSCs were seeded into the entire implant via fibrin, and the construct was inserted into critically sized osteochondral defects located at the medial condyle and patellar groove of pigs. The defect was resurfaced with a PCL-collagen electrospun mesh that served as a substitute for the periosteal flap in preventing cell leakage. Controls without either the implanted MSCs or the resurfacing membrane were included. After 6 months, cartilaginous repair was observed, with a low occurrence of fibrocartilage at the medial condyle. Osteochondral repair was promoted and host cartilage degeneration was arrested, as shown by the superior Glycosaminoglycan (GAG) maintenance. This positive morphological outcome was supported by a higher relative Young's modulus, which indicated functional cartilage restoration. Bone ingrowth and remodeling occurred in all groups, with a higher degree of mineralization in the experimental group. Tissue repair was compromised in the absence of the implanted cells or the resurfacing membrane. Moreover, healing was inferior at the patellar groove compared to the medial condyle, and this was attributed to the native biomechanical features.