Abstract:
Traditional ceramic separation membranes, which are fabricated by applying colloidal suspensions of metal hydroxides to porous supports, tend to suffer from pinholes and cracks that seriously affect their quality. Other intrinsic problems for these membranes include dramatic losses of flux when the pore sizes are reduced to enhance selectivity and dead-end pores that make no contribution to filtration. In this work, we propose a new strategy for addressing these problems by constructing a hierarchically structured separation layer on a porous substrate using large titanate nanofibers and smaller boehmite nanofibers. The nanofibers are able to divide large voids into smaller ones without forming dead-end pores and with the minimum reduction of the total void volume. The separation layer of nanofibers has a porosity of over 70% of its volume, whereas the separation layer in conventional ceramic membranes has a porosity below 36% and inevitably includes dead-end pores that make no contribution to the flux. This radical change in membrane texture greatly enhances membrane performance. The resulting membranes were able to filter out 95.3% of 60-nm particles from a 0.01 wt % latex while maintaining a relatively high flux of between 800 and 1000 L/m²·h under a low driving pressure (20 kPa). Such flow rates are orders of magnitude greater than those of conventional membranes with equal selectivity. Moreover, the flux was stable at approximately 800 L/m²·h with a selectivity of more than 95%, even after six repeated runs of filtration and calcination. Use of different supports, either porous glass or porous alumina, had no substantial effect on the performance of the membranes; thus, it is possible to construct the membranes from a variety of supports without compromising functionality. The Darcy equation satisfactorily describes the correlation between the filtration flux and the structural parameters of the new membranes. The assembly of nanofiber meshes to combine high flux with excellent selectivity is an exciting new direction in membrane fabrication.
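For reference, the Darcy relation invoked here takes its standard form for pressure-driven flow through a porous layer; the symbols below are the conventional definitions, not parameter values reported in the paper:

$$ J = \frac{\kappa}{\mu} \cdot \frac{\Delta P}{L} $$

where $J$ is the permeate flux, $\kappa$ the permeability of the separation layer (set by its porosity and pore structure), $\mu$ the viscosity of the feed, $\Delta P$ the applied pressure (20 kPa above) and $L$ the thickness of the layer. Higher porosity and the absence of dead-end pores raise $\kappa$, which is how the nanofiber texture sustains a high flux at low driving pressure.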
Abstract:
Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first-year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester, in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first-semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming was taught as a stand-alone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In fact, the teaching of the building blocks of computer applications has been compartmentalized, with each taught in isolation from the others. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), as is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996). The teaching of this introductory material has been done in much the same way over the past thirty years. During the period that introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester: some basics can be learnt, but mastery can take many years (Norvig, 2001). Faculty data have typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while another group struggles to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first-year students. This attrition does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
Abstract:
Critical skills such as identifying and appreciating issues that confront firms engaging in international business, and the ability to undertake creative decision-making, are considered fundamental to the study of International Business. It has been argued that using audio-visual case studies can help develop such skills. However, this is difficult due to a lack of Australian case studies. This paper reviews the literature outlining the advantages believed to result from the use of audio-visual case studies, describes a project implemented in a large cohort of students studying International Business, reports on a pilot evaluation of the project, and outlines the findings and conclusions of the survey.
Abstract:
The need for large-scale environmental monitoring to manage environmental change is well established. Ecologists have long used acoustics as a means of monitoring the environment in their field work, and so the value of an acoustic environmental observatory is evident. However, the volume of data generated by such an observatory would quickly overwhelm even the most fervent scientist using traditional methods. In this paper we present our steps towards realising a complete acoustic environmental observatory, i.e. a cohesive set of hardware sensors, management utilities, and analytical tools required for large-scale environmental monitoring. Concrete examples of these elements, which are in active use by ecological scientists, are also presented.
Abstract:
Introduction: 3.0 Tesla MRI offers the potential to quantify the volume fraction and structural texture of cancellous bone, along with quantification of marrow composition, in a single non-invasive examination. This study describes our preliminary investigations to identify parameters that describe cancellous bone structure, including the relationships between texture and volume fraction.
Abstract:
This paper reports on an exploration of the concept of 'supervision' as applied to allied health professionals within a large mental health service in one Australian State. A two-part methodology was used, with focus group interviews conducted with allied health professionals, and semi-structured telephone interviews with service managers. Fifty-eight allied health professionals participated in a series of seven focus groups. Semi-structured interviews were conducted with the Directors or Managers of mental health services in all 21 regions in the state. Allied health professionals and service managers both considered supervision to be an important mechanism for ensuring staff competence and best practice outcomes for consumers and carers. There was strong endorsement of the need for clarification and articulation of supervision policies within the organization, and the provision of appropriate resourcing to enable supervision to occur. Current practice in supervision was seen as ad hoc and of variable standard; the need for training in supervision was seen as critical. The supervision needs of newly graduated allied health professionals and those working in rural and regional areas were also seen as important. The need for a flexible and accessible model of supervision was clearly demonstrated.
Abstract:
Few studies have examined interactive effects between air pollution and temperature on health outcomes. This study examines whether temperature modified the effects of ozone on cardiovascular mortality in 95 large US cities. A nonparametric and a parametric regression model were separately used to explore the interactive effects of temperature and ozone on cardiovascular mortality during May to October, 1987-2000. A Bayesian meta-analysis was used to pool the city-specific estimates. Both models illustrate that temperature enhanced the ozone effects on mortality in the northern region, but not obviously in the southern region. A 10-ppb increment in ozone was associated with 0.41% (95% posterior interval (PI): -0.19%, 0.93%), 0.27% (95% PI: -0.44%, 0.87%) and 1.68% (95% PI: 0.07%, 3.26%) increases in daily cardiovascular mortality at low, moderate and high levels of temperature, respectively. We conclude that temperature modified the effects of ozone, particularly in the northern region.
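The pooling step named above can be sketched as follows. This is a generic normal-normal hierarchical model with a Gibbs sampler, a common form of Bayesian meta-analysis, not the authors' actual model; the city-level estimates and variances are invented placeholders, not values from the study.

```python
# Pool city-specific effect estimates (e.g., % change in mortality per
# 10-ppb ozone) under a normal-normal hierarchical model via Gibbs sampling.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical city-level estimates (beta) and their sampling variances (v).
beta = np.array([0.41, 0.27, 1.68, 0.55, -0.10])
v = np.array([0.08, 0.10, 0.60, 0.12, 0.15])

n_iter, mu, tau2 = 5000, 0.0, 1.0
mu_draws = np.empty(n_iter)
for t in range(n_iter):
    # City-specific "true" effects given the pooled mean mu and heterogeneity tau2.
    prec = 1.0 / v + 1.0 / tau2
    theta = rng.normal((beta / v + mu / tau2) / prec, np.sqrt(1.0 / prec))
    # Pooled mean given the theta draws (flat prior on mu).
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / len(theta)))
    # Heterogeneity variance: inverse-gamma draw with a weak IG(1, 0.1) prior.
    sse = np.sum((theta - mu) ** 2)
    tau2 = 1.0 / rng.gamma(0.5 * len(theta) + 1.0, 1.0 / (0.5 * sse + 0.1))
    mu_draws[t] = mu

post = mu_draws[1000:]  # discard burn-in
print(f"pooled effect: {post.mean():.2f} "
      f"(95% PI {np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f})")
```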
Abstract:
Business Process Management (BPM) has increased in popularity and maturity in recent years. Large enterprises use process management approaches to model, manage and refine repositories of process models that detail the whole enterprise. These repositories can run to thousands of process models, which may contain large hierarchies of tasks and control structures that become cumbersome to maintain. Tools are therefore needed to traverse this process model space in an efficient manner; otherwise the repositories remain hard to use and lose much of their effectiveness. In this paper we analyse a range of BPM tools for their effectiveness in handling large process models. We establish that the present set of commercial tools is lacking in key areas regarding visualisation of, and interaction with, large process models. We then present six tool functionalities for the development of advanced business process visualisation and interaction, presenting a design for a tool that will exploit the latest advances in 2D and 3D computer graphics to enable fast and efficient search, traversal and modification of process models.
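As a rough illustration of what "traversing the process model space" involves, here is a minimal sketch of a hierarchical process model with a keyword search over it; the node names and structure are invented, and the paper itself is concerned with tool design rather than this particular representation.

```python
# Toy hierarchical process model: sub-processes contain children,
# and search returns the full path to every matching task.
from dataclasses import dataclass, field

@dataclass
class ProcessNode:
    """A task or sub-process in a process model hierarchy."""
    name: str
    children: list["ProcessNode"] = field(default_factory=list)

def find_tasks(root: ProcessNode, keyword: str) -> list[str]:
    """Depth-first search returning the path to every matching node."""
    matches, stack = [], [(root, root.name)]
    while stack:
        node, path = stack.pop()
        if keyword.lower() in node.name.lower():
            matches.append(path)
        for child in node.children:
            stack.append((child, f"{path} / {child.name}"))
    return matches

order = ProcessNode("Order-to-Cash", [
    ProcessNode("Check Credit"),
    ProcessNode("Fulfilment", [ProcessNode("Pick Goods"), ProcessNode("Ship Goods")]),
])
print(find_tasks(order, "goods"))  # paths to 'Pick Goods' and 'Ship Goods'
```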
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness. Term-based approaches are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency pattern issues. The measures used by the data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering: they can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model for threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the problem of information mismatch. This stage is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy; the most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents are left after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system is improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
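The two-stage pipeline can be sketched as follows. The scoring functions are toy stand-ins: the thesis uses rough set decision theory for the stage-one threshold and pattern taxonomies for stage-two ranking, neither of which is reproduced here.

```python
# Stage 1 discards likely-irrelevant documents against a threshold;
# stage 2 re-ranks the survivors by how many whole patterns they contain.
def topic_score(doc: set[str], profile: dict[str, float]) -> float:
    """Stage 1: crude term-overlap score against a weighted user profile."""
    return sum(w for term, w in profile.items() if term in doc)

def pattern_score(doc: set[str], patterns: list[set[str]]) -> float:
    """Stage 2: reward documents that contain complete term patterns."""
    return sum(1.0 for p in patterns if p <= doc)

def filter_and_rank(docs, profile, patterns, threshold):
    survivors = [d for d in docs if topic_score(d, profile) >= threshold]
    return sorted(survivors, key=lambda d: pattern_score(d, patterns), reverse=True)

docs = [{"ozone", "health"}, {"football"}, {"ozone", "mortality", "health"}]
profile = {"ozone": 1.0, "health": 0.5}
patterns = [{"ozone", "mortality"}]
print(filter_and_rank(docs, profile, patterns, threshold=1.0))
```

Because stage one removes most of the stream before any pattern matching runs, the expensive stage-two scoring only touches a small candidate set, which mirrors the computational argument made in the abstract.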
Abstract:
Review of 'The White Earth', La Boite Theatre Company, published in The Australian, 25 February 2009.
Abstract:
This report presents findings from the largest survey ever undertaken in Australia of aspiring creatives who work, or intend to work, in the digital content industries. Survey respondents included those with aspirations to work in the publicly supported, less commercial end of the Creative Industries spectrum as well as those aspiring to work in the digital content industries. The survey gathered rich data on their characteristics, skills and attributes, barriers to employment, workforce mobility, career intentions, professional development, mentors and industry supports, and participation in communities of practice. The survey sought to determine whether aspiring creatives have the necessary skills and attributes to work effectively in the digital content industries. This task also involved finding out how they develop their skills and attributes, and what they need to develop them further.
Abstract:
Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. With very high resolution (VHR) imagery available from digital airborne sources, data acquisition, collection and updating will be greatly facilitated if the road details can be automatically extracted from the aerial images. In this paper, we propose an effective approach to detect road lane information from aerial images employing object-oriented image analysis. Our proposed algorithm starts with constructing the digital surface model (DSM) and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects can have similar spectral and geometrical attributes, the extracted road lanes are filtered against the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values that range between 76% and 98% and correctness values that range between 82% and 97%.
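The rule-based filtering step can be illustrated with a minimal sketch; the attributes and thresholds below are invented for illustration and are not the rules used in the paper.

```python
# Object-oriented, rule-based filtering: keep candidate objects only if
# both their spectral look and their geometry fit a lane marking
# (bright, small, and strongly elongated).
from dataclasses import dataclass

@dataclass
class ImageObject:
    mean_brightness: float  # 0-255
    area_m2: float          # object footprint in square metres
    elongation: float       # major axis / minor axis

def is_lane_marking(obj: ImageObject) -> bool:
    """Toy rule set: bright AND small AND strongly elongated."""
    return (obj.mean_brightness > 180
            and obj.area_m2 < 5.0
            and obj.elongation > 4.0)

candidates = [
    ImageObject(210, 1.2, 8.0),   # painted dash: kept
    ImageObject(200, 40.0, 1.1),  # bright rooftop: rejected on shape
    ImageObject(120, 1.0, 6.0),   # dark crack: rejected on brightness
]
print([is_lane_marking(o) for o in candidates])  # [True, False, False]
```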
Abstract:
The automatic extraction of road features from remotely sensed images has been a topic of great interest within the photogrammetric and remote sensing communities for over three decades. Although various techniques have been reported in the literature, it remains challenging to extract road details efficiently as image resolution increases and the requirement for accurate, up-to-date road data grows. In this paper, we focus on the automatic detection of road lane markings, which are crucial for many applications, including lane-level navigation and lane departure warning. The approach consists of four steps: i) data preprocessing; ii) image segmentation and road surface detection; iii) road lane marking extraction based on the generated road surface; and iv) testing and system evaluation. The proposed approach utilizes the unsupervised ISODATA image segmentation algorithm, which separates the image into vegetation regions and road surface based only on the Cb component of the YCbCr color space. A shadow detection method based on the YCbCr color space is also employed to detect and recover shadows cast on the road surface by vehicles and trees. Finally, the lane marking features are detected from the road surface using histogram clustering. Experiments applying the proposed method to the aerial imagery dataset of Gympie, Queensland demonstrate the efficiency of the approach.
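The color-space step lends itself to a short sketch. OpenCV does not ship ISODATA, so k-means stands in for it here, and the file name, cluster count, and thresholds are assumptions for illustration.

```python
# Convert to YCbCr, segment on the Cb channel, then pull bright
# lane-marking pixels out of the detected road surface.
import cv2
import numpy as np

img = cv2.imread("aerial_tile.png")             # hypothetical input tile
assert img is not None, "replace with a real tile path"
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)  # OpenCV orders Y, Cr, Cb
y, cb = ycrcb[:, :, 0], ycrcb[:, :, 2]

# Two-cluster segmentation of Cb: roughly vegetation vs. sealed road surface.
samples = cb.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(samples, 2, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)
road_cluster = int(np.argmax(centers))          # assumption: road has higher Cb
road_mask = labels.reshape(cb.shape) == road_cluster

# Lane markings: bright (high-luma) pixels lying on the road surface.
markings = road_mask & (y > 200)                # threshold is illustrative
cv2.imwrite("lane_markings.png", markings.astype(np.uint8) * 255)
```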
Abstract:
The realities of new technological and social conditions since the 1990s demand a new approach to literacy teaching. Looking onward from the original statement of aims of the multiliteracies movement in 1996, this volume brings together top-quality scholarship and research that has embraced the notion and features new contributions by many of the originators of this approach to literacy. Drawing on large research projects and empirical evidence, the authors explore practical and educational issues that relate to multiliteracies, such as assessment, pedagogy and curriculum. The viewpoint taken is that multiliteracies is a complementary socio-cultural approach to the new literacies that includes pedagogy and learning. The differences are addressed from a multiliteracies perspective – one that does not discount or undermine the new literacies, but shows new ways in which they are complementary. Computers and the Internet are transforming the way we work and communicate and the very notion of literacy itself. This volume offers frontline information and a vital update for those wishing to understand the evolution of multiliteracies and the current state of literacy theory in relation to it.
Abstract:
Motor vehicles are a major source of gaseous and particulate matter pollution in urban areas, particularly of ultrafine particles (diameters < 0.1 µm). Exposure to particulate matter has been found to be associated with serious health effects, including respiratory and cardiovascular disease, and mortality. Particle emissions generated by motor vehicles span a very broad size range (from around 0.003 to 10 µm) and are measured as different subsets of particle mass concentration or particle number count. However, there are scientific challenges in analysing and interpreting the large data sets on motor vehicle emission factors, and the application of different particle metrics as a basis for air quality regulation is not well understood. To date, a comprehensive inventory covering the broad size range of particles emitted by motor vehicles, including particle number, does not exist anywhere in the world. This thesis covers research related to four important and interrelated aspects of particulate matter generated by motor vehicle fleets: the derivation of suitable particle emission factors for use in transport modelling and health impact assessments; the quantification of motor vehicle particle emission inventories; the investigation of modality within particle size distributions as a potential basis for air quality regulation; and the review and synthesis of current knowledge on ultrafine particles as it relates to motor vehicles, together with the application of these aspects to the quantification, control and management of motor vehicle particle emissions. In order to quantify emissions in terms of a comprehensive inventory covering the full size range of particles emitted by motor vehicle fleets, it was necessary to derive a suitable set of particle emission factors for different vehicle and road type combinations for particle number, particle volume, PM1, PM2.5 and PM10 (mass concentrations of particles with aerodynamic diameters < 1 µm, < 2.5 µm and < 10 µm, respectively). The very large data set of emission factors analysed in this study was sourced from measurement studies conducted in developed countries, and hence the derived set of emission factors is suitable for preparing inventories in other urban regions of the developed world. These emission factors are particularly useful for regions that lack the measurement data needed to derive emission factors, or where experimental data are available but of insufficient scope. The comprehensive particle emissions inventory presented in this thesis is the first published inventory of tailpipe particle emissions prepared for a motor vehicle fleet that quantifies particle emissions across the full size range of particles emitted by vehicles, based on measurement data. The inventory quantified particle emissions measured in terms of particle number and different particle mass size fractions. It was developed for the urban South-East Queensland fleet in Australia, and included testing the particle emission implications of future scenarios for different passenger and freight travel demand. The thesis also presents evidence of the usefulness of examining modality within particle size distributions as a basis for developing air quality regulations, and finds evidence to support the relevance of introducing a new PM1 mass ambient air quality standard for the majority of environments worldwide.
The study found that a combination of PM1 and PM10 standards is likely to be a more discerning and suitable set of ambient air quality standards for controlling particles emitted from combustion and mechanically generated sources, such as motor vehicles, than the current mass standards of PM2.5 and PM10. The study also reviewed and synthesized existing knowledge on ultrafine particles, with a specific focus on those originating from motor vehicles. It found that motor vehicles are significant contributors to both air pollution and ultrafine particles in urban areas, and that a standardized measurement procedure is not currently available for ultrafine particles. The review found that discrepancies exist between the outcomes of instruments used to measure ultrafine particles; that few data are available on ultrafine particle chemistry and composition, long-term monitoring, or the characterization of their spatial and temporal distribution in urban areas; and that no particle number inventories are available for motor vehicle fleets. This knowledge is critical for epidemiological studies and exposure-response assessment. Conclusions from this review included the recommendation that ultrafine particles in populated urban areas be considered a likely target for future air quality regulation based on particle number, due to their potential impacts on the environment. The research in this PhD thesis successfully integrated the elements needed to quantify and manage motor vehicle fleet emissions, and its novelty lies in combining expertise from two distinct disciplines: aerosol science and transport modelling. The new knowledge and concepts developed in this PhD research provide previously unavailable data and methods that can be used to develop comprehensive, size-resolved inventories of motor vehicle particle emissions, and air quality regulations to control particle emissions to protect the health and well-being of current and future generations.
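The inventory arithmetic underlying the thesis reduces to emission factors multiplied by activity. A minimal sketch, with invented numbers and a hypothetical fleet breakdown:

```python
# Fleet emissions per metric = emission factor for each (vehicle type,
# road type) combination x vehicle kilometres travelled (VKT) on that
# combination, summed over the fleet. All values are illustrative.
emission_factors = {
    # (vehicle, road): {"number": particles/km, "pm2.5": kg/km}
    ("car", "freeway"):   {"number": 2.0e14, "pm2.5": 8.0e-6},
    ("car", "urban"):     {"number": 5.0e14, "pm2.5": 1.5e-5},
    ("truck", "freeway"): {"number": 1.5e15, "pm2.5": 2.0e-4},
}
annual_vkt = {
    ("car", "freeway"):   4.0e9,
    ("car", "urban"):     6.0e9,
    ("truck", "freeway"): 5.0e8,
}

inventory: dict[str, float] = {}
for key, vkt in annual_vkt.items():
    for metric, ef in emission_factors[key].items():
        inventory[metric] = inventory.get(metric, 0.0) + ef * vkt

print(f"particle number: {inventory['number']:.2e} particles/yr")
print(f"PM2.5 mass:      {inventory['pm2.5']:.2e} kg/yr")
```

Keeping separate factors per metric is what lets a single inventory report both particle number and the mass fractions, which is the size-resolved structure the thesis argues for.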