367 results for STL-Format
Abstract:
QUT’s new metadata repository (data registry), Research Data Finder, has been designed to promote the visibility and discoverability of QUT research datasets. Funded by the Australian National Data Service (ANDS), it will provide a qualitative snapshot of research data outputs created or collected by members of the QUT research community that are available via open or mediated access. As a fully integrated metadata repository, Research Data Finder aligns with institutional sources of truth, such as QUT’s research administrative system, ResearchMaster, as well as QUT’s Academic Profiles system, to provide high-quality data descriptions that increase awareness of, and access to, shareable research data. In addition, the repository and its workflows are designed to foster smoother data management practices, enhance opportunities for collaboration and research, promote cross-disciplinary research and maximise the reuse of existing research datasets. The metadata schema used in Research Data Finder is the Registry Interchange Format - Collections and Services (RIF-CS), developed by ANDS in 2009. This comprehensive schema is potentially complex for researchers; unlike metadata for publications, which are often made publicly available with the official publication, metadata for datasets are not typically available and need to be created. Research Data Finder uses a hybrid self-deposit and mediated deposit system. In addition to automated ingests from ResearchMaster (research project information) and the Academic Profiles system (researcher information), shareable data is identified at a number of key “trigger points” in the research cycle. These include: research grant proposals; ethics applications; Data Management Plans; Liaison Librarian data interviews; and thesis submissions. These ingested records can be supplemented with related metadata, including links to related publications, such as those in QUT ePrints. Records deposited in Research Data Finder are harvested by ANDS and made available to a national and international audience via Research Data Australia, ANDS’ discovery service for Australian research data. Researcher and research group metadata records are also harvested by the National Library of Australia (NLA) and published in Trove (the NLA’s digital information portal). By contributing records to this national infrastructure, QUT data will become more visible. Within Australia and internationally, many funding bodies have already mandated open access to publications produced from publicly funded research projects, such as those supported by the Australian Research Council (ARC) or the National Health and Medical Research Council (NHMRC). QUT will be well placed to respond to the rapidly evolving climate of research data management. This project is supported by the Australian National Data Service (ANDS). ANDS is supported by the Australian Government through the National Collaborative Research Infrastructure Strategy Program and the Education Investment Fund (EIF) Super Science Initiative.
Abstract:
Digital storytelling projects have proliferated in Australia since the early 2000s, and have been theorized as a means to disseminate the stories and voices of “ordinary” people. In this paper I examine, through the case study of a 2009 digital storytelling project between the Australasian Centre for Interactive Design and a group identifying as Forgotten Australians, whether digital storytelling in its predominant workshop-based format is able to meet the needs of profoundly marginalized and traumatized individuals and groups. For digital storytelling to be of use to marginalized groups as a means of communication or reflection, a significant re-examination of the current approaches to its format and function needs to be undertaken. This paper posits new ways of utilizing digital storytelling when dealing with trauma narratives.
Abstract:
excerpt: from soil and stone is a work consisting of fifty drawings on paper organised in a grid. Each drawing is small, only 19 by 14 centimetres, and set out in portrait format. They each reference, either explicitly or abstractly, natural phenomena. These include plant forms, pollens, seeds, pods and leaf shapes, and each is painted with a dizzying and liquid array of techniques and technical finesse. Colour is used sparingly but tellingly, and all are generously and wetly composed, with each seeming to flow into the necessary rightness of composition. Within the overall work the feel is sometimes of the archive, a personal kind where pressed flowers are stumbled upon within a book. At other times they seem to image the stellar, as one confronts the immensity of some planet suspended in the void. Again, as a recurring theme in Reynolds’ oeuvre, the notion of the taxonomy is sounded. The drawings are laid out to display difference and to reveal essence through contrast. It is a mapping that illuminates a generous plenitude.
Abstract:
Encompasses the whole BPM lifecycle, including process identification, modelling, analysis, redesign, automation and monitoring. Class-tested textbook complemented by additional teaching material on the accompanying website. Covers the relevant conceptual background, industrial standards and actionable skills. Business Process Management (BPM) is the art and science of how work should be performed in an organization in order to ensure consistent outputs and to take advantage of improvement opportunities, e.g. reducing costs, execution times or error rates. Importantly, BPM is not about improving the way individual activities are performed, but rather about managing entire chains of events, activities and decisions that ultimately produce added value for an organization and its customers. This textbook encompasses the entire BPM lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial engineering are blended into one comprehensive and interdisciplinary approach. The presentation is illustrated using the BPMN industry standard, defined by the Object Management Group and widely endorsed by practitioners and vendors worldwide. In addition to explaining the relevant conceptual background, the book provides dozens of examples, more than 100 hands-on exercises – many with solutions – as well as numerous suggestions for further reading. The textbook is the result of many years of combined teaching experience of the authors, both at the undergraduate and graduate levels as well as in the context of professional training. Students and professionals from both business management and computer science will benefit from the step-by-step style of the textbook and its focus on fundamental concepts and proven methods. Lecturers will appreciate the class-tested format and the additional teaching material available on the accompanying website, fundamentals-of-bpm.org.
Abstract:
Our daily lives become more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats for these devices become increasingly dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level. Therefore, third-party developers have the opportunity to develop kernel-based low-level security tools, which is not usual for smartphone platforms. Android quickly gained popularity among smartphone developers and even beyond, since it is based on Java on top of "open" Linux, in contrast to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, for example, which held the greatest market share among all smartphone OSs, closed critical APIs to common developers and introduced application certification, since this OS was the main target of smartphone malware in the past. In fact, more than 290 malware variants designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications that endanger critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones with a focus on the Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check their security functionality. We survey well-accepted security mechanisms and tools which can increase device security. We describe how to adopt these security tools on the Android kernel, and provide an overhead analysis in terms of resource usage. As open smartphones are released and may increase their market share, similar to Symbian, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature and intrusion detection methods in the Android environment. We focus on monitoring events in the kernel; that is, identifying critical kernel, log file, file system and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution involves initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android. We performed a statistical analysis of the function calls used by these applications. The results of the analysis can be compared to newly installed applications to detect significant differences. Additionally, certain function calls indicate malicious activity. Therefore, we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results present a first step towards detecting malicious applications on Android-based devices.
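The abstract describes, but does not publish, the static function-call analysis; the following is a minimal sketch of the underlying idea, assuming the pyelftools library and an entirely hypothetical "suspicious call" list: extract the imported function symbols from each ELF binary, build a baseline frequency profile over known-good binaries, and flag applications whose call profile deviates from it.

```python
from collections import Counter
from elftools.elf.elffile import ELFFile  # pip install pyelftools

def imported_functions(path):
    """Return the set of dynamic (imported) function symbols in an ELF binary."""
    with open(path, "rb") as f:
        elf = ELFFile(f)
        dynsym = elf.get_section_by_name(".dynsym")
        if dynsym is None:
            return set()
        return {s.name for s in dynsym.iter_symbols()
                if s.name and s["st_info"]["type"] == "STT_FUNC"}

def build_baseline(paths):
    """Count, across known-good binaries, how many of them use each function."""
    counts = Counter()
    for p in paths:
        counts.update(imported_functions(p))
    return counts

# Hypothetical examples only; a real rule set would come from the statistical analysis.
SUSPICIOUS = {"ptrace", "execve", "fork"}

def is_suspicious(path, baseline, rare_threshold=1, max_rare=10):
    """Toy decision rule: flag calls rarely (or never) seen in the baseline,
    or any call on the suspicious list. Thresholds are illustrative."""
    calls = imported_functions(path)
    rare = {c for c in calls if baseline[c] <= rare_threshold}
    return bool(rare & SUSPICIOUS) or len(rare) > max_rare
```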
Abstract:
This article explores the role of principal leadership in creating a thinking school. It contributes to the school leadership literature by exploring the intersection of two important areas of study in education - school leadership and education for thinking - a particularly apt pairing because effective school leadership is crucial if students are to learn to be critical and creative thinkers, yet this connection has not been widely investigated. We describe how one principal, Hinton, turned around an underperforming school by using critical and creative philosophical thinking as the focus for students, staff and parents. Then, drawing on the school leadership literature, the article describes seven attributes of school leadership, beginning with four articulated by Leithwood and colleagues (2006) (building vision and setting direction; redesigning the organisation; understanding and developing people; managing the teaching and learning program) and adding three others (influence; self-development; and responding to context). This framework is then used in a case study format, in a collaboration between practitioner and researchers, first to explore evidence from empirical studies and personal reflection about Hinton's leadership of Buranda State School, and second to illuminate how these general features of school leadership apply to creating a thinking school. Based on the case study and using the general characteristics of school leadership, a framework for leading a thinking school is described. Because the framework is based on a turnaround school, it has wide applicability: to schools that are doing well, as an indication of how to implement a contemporary approach to curriculum and pedagogy; and to schools that are underperforming and want a rigorous, high-expectation and contemporary way to improve student learning.
Abstract:
Unless sustained, coordinated action is generated in road safety, road traffic deaths are poised to rise from approximately 1.3 million to 1.9 million a year by 2020 (Krug, 2012). To generate this harmonised response, road safety management agencies are being urged to adopt multisectoral collaboration (WHO, 2009b), which is achievable through the principle of policy integration. Yet policy integration, in its current hierarchical format, is marred by a lack of universality in its interpretation, a failure to anticipate the complexities of coordinated effort, a dearth of information about its design and the absence of a normative perspective on sharing responsibility. This paper addresses this flawed conception of policy integration by reconceptualising it through a qualitative examination of 16 road safety stakeholders’ written submissions, lodged with the Australian Transport Council in 2011. The resulting new principle of policy integration, Participatory Deliberative Integration, provides a conceptual framework for the alignment of effort across stakeholders in transport, health, traffic law enforcement, relevant trades and the community. With the adoption of Participatory Deliberative Integration, road safety management agencies should secure the commitment of key stakeholders in the development and implementation of, amongst other policy measures, National Road Safety Strategies and Mix Mode Integrated Timetabling.
Abstract:
Purpose – The article aims to review a university course, offered to students in both Australia and Germany, to encourage them to learn about designing, implementing, marketing and evaluating information programs and services in order to build active and engaged communities. The concepts and processes of Web 2.0 technologies come together in the learning activities, with students establishing their own personal learning networks (PLNs).
Design/methodology/approach – The case study examines the principles of learning and teaching that underpin the course and presents the students' own experiences of the challenges they faced as they explored the interactive, participative and collaborative dimensions of the web.
Findings – The online format of the course and the philosophy of learning through play provided students with a safe and supportive environment in which to move outside of their comfort zones, to be creative, to experiment and to develop their professional personas. Reflection on learning was a key component that stressed the value of reflective practice in assisting library and information science (LIS) professionals to adapt confidently to the rapidly changing work environment.
Originality/value – This study provides insights into the opportunities for LIS courses to work across geographical boundaries, to allow students to critically appraise library practice in different contexts and to become active participants in wider professional networks.
Abstract:
Balcony acoustic treatments can be demonstrated to provide important benefits in reducing road traffic noise within the balcony space and, consequently, internally for any adjacent room. The actual effect on road traffic noise derives from a multitude of variables that can be broadly categorized as (a) acoustical and (b) geometrical, across two distinct propagation volumes: (i) the street space and (ii) the balcony space. A series of recent research activities in this area has incorporated the use of a combined image and diffuse source model, which can be used to predict the effect of balconies on road traffic noise for a large number of scenarios. This paper investigates and presents a method and capability to summarize predictive data into user-friendly guidelines aimed at acoustical professionals and architects, with possible implementation in building design policies for environmental noise. The paper concludes with a presentation of the likely format of a potential design guide.
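The combined image and diffuse source model itself is not given in the abstract; as a generic illustration of the image-source half of such a model, the sketch below (all geometry, source power and reflection loss are assumed for the example, not taken from the paper) sums a direct path and one facade-reflected image path energetically.

```python
import math

def path_level(lw_db, distance_m, reflection_loss_db=0.0):
    """Free-field SPL of one path from a point source:
    Lp = Lw - 20*log10(r) - 11, minus any loss at a reflection."""
    return lw_db - 20 * math.log10(distance_m) - 11 - reflection_loss_db

def combine(levels_db):
    """Incoherent (energetic) summation of individual path levels."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# Assumed example: source power 100 dB; direct path 10 m; one image source
# seen via a facade reflection, 14 m away with 1 dB reflection loss.
direct = path_level(100, 10.0)
image = path_level(100, 14.0, reflection_loss_db=1.0)
print(round(combine([direct, image]), 1))  # combined level at the receiver, dB
```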
Abstract:
Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction in particulate matter concentrations in the atmosphere. To achieve these objectives, data was obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends, and provided the rank ordering of the sites and years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked. The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site-types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distributions and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
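PMF, the receptor model named above, factorises a samples-by-species concentration matrix into non-negative source contributions and source profiles, weighting each entry by its measurement uncertainty. As an unweighted stand-in, here is a minimal sketch using scikit-learn's NMF on synthetic data (the thesis's real data and factor counts are not given in the abstract):

```python
import numpy as np
from sklearn.decomposition import NMF  # plain NMF; PMF additionally weights by uncertainties

rng = np.random.default_rng(0)
true_G = rng.random((200, 3))                        # source contributions per sample
true_F = rng.random((3, 15))                         # source profiles (species signatures)
X = true_G @ true_F + 0.01 * rng.random((200, 15))   # synthetic concentration matrix

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # estimated contributions (samples x sources)
F = model.components_        # estimated profiles (sources x species)
```

Pairing each row of G with the wind direction and speed recorded at the sampling time is the usual route from identifying what the sources are to locating where they lie.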
Abstract:
This paper presents the details of experimental studies on the shear behaviour and strength of lipped channel beams (LCBs). The LCB sections are commonly used as flexural members in residential, industrial and commercial buildings. To ensure safe and efficient designs of LCBs, many research studies have been undertaken on the flexural behaviour of LCBs. To date, however, limited research has been conducted into the strength of LCB sections subject to shear actions. Therefore a detailed experimental study involving 20 tests was undertaken to investigate the shear behaviour and strength of LCBs. This research has shown the presence of increased shear capacity of LCBs due to the additional fixity along the web to flange juncture, but the current design rules (AS/NZS 4600 and AISI) ignore this effect and were thus found to be conservative. Therefore they were modified by including a higher elastic shear buckling coefficient. Ultimate shear capacity results obtained from the shear tests were compared with the modified shear capacity design rules. It was found that they are still conservative as they ignore the presence of post-buckling strength. Hence the AS/NZS 4600 and AISI design rules were further modified to include the available post-buckling strength. Suitable design rules were also developed under the direct strength method (DSM) format. This paper presents the details of this study and the results including the modified shear design rules.
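The paper's modified buckling coefficients and post-buckling terms are not reproduced in the abstract; the classical elastic shear buckling stress that the design rules build on is standard, however, and a small sketch (with assumed material and section values) is:

```python
import math

def tau_cr(E_mpa, nu, k_v, d_mm, t_mm):
    """Elastic shear buckling stress of a flat web plate:
    tau_cr = k_v * pi^2 * E / (12 * (1 - nu^2) * (d/t)^2)."""
    return k_v * math.pi ** 2 * E_mpa / (12 * (1 - nu ** 2) * (d_mm / t_mm) ** 2)

# k_v = 5.34 is the classical value for a long, simply supported plate; the
# web-to-flange fixity described above corresponds to a higher coefficient
# (the 7.0 here is an assumed illustration, not the paper's value).
print(tau_cr(E_mpa=200e3, nu=0.3, k_v=5.34, d_mm=150, t_mm=1.5))  # ~96 MPa
print(tau_cr(E_mpa=200e3, nu=0.3, k_v=7.0, d_mm=150, t_mm=1.5))   # fixity raises tau_cr
```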
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. Complex computer models, or codes, are used in place of physical experiments, which leads to the study of computer experiments, used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, it studies the question of how many computer experiments should be run and how they should be augmented, and gives attention to the case where the response is a function over time.
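A common design for a computer experiment is the space-filling Latin hypercube; a minimal sketch with SciPy follows (the specific designs studied in the thesis are not given in the abstract, so the run counts and dimensions here are illustrative):

```python
from scipy.stats import qmc

# A 10-run design over 3 inputs: each column stratifies [0, 1) into 10 bins.
sampler = qmc.LatinHypercube(d=3, seed=42)
design = sampler.random(n=10)

# Drawing 5 further runs gives extra points, but the pooled 15 runs are no
# longer themselves a Latin hypercube -- which is precisely why augmenting
# an existing design is a research question in its own right.
extra = sampler.random(n=5)
```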
Abstract:
Ramp metering is an effective motorway control tool beneficial for mainline traffic, but the long on-ramp queues it creates interfere profoundly with surface traffic. This study deals with the conflict between mainline benefits and the costs to on-ramp and surface traffic. A novel local on-ramp queue management strategy with mainline speed recovery is proposed. Microscopic simulation is used to test the new strategy and compare it with other strategies. Simulation results reveal that ramp metering with the queue management strategy provides a good balance between mainline and on-ramp performance.
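The paper's specific strategy is not detailed in the abstract; as a generic illustration of a local feedback metering law with a queue override, in the spirit of ALINEA (all gains, limits and the cycle length below are assumed, not the paper's values):

```python
def metering_rate(r_prev, occ_target, occ_meas, queue, queue_max,
                  cycle_s=60.0, K_R=70.0, r_min=200.0, r_max=1800.0):
    """One step of an ALINEA-style local feedback metering law with a simple
    queue override. Occupancies are percentages; rates are veh/h."""
    # Mainline feedback: push measured downstream occupancy toward the target.
    r = r_prev + K_R * (occ_target - occ_meas)
    # Queue override: release at least the excess queue within one cycle so
    # the on-ramp does not spill back onto surface streets.
    if queue > queue_max:
        r = max(r, (queue - queue_max) * 3600.0 / cycle_s)
    return min(max(r, r_min), r_max)

# Example step: queue near its cap forces a higher release rate.
print(metering_rate(r_prev=900, occ_target=25, occ_meas=28, queue=45, queue_max=40))
```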
Abstract:
Welcome to the Quality assessment matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Quality assessment matrix is to provide a tool with which a group of academic staff at a university can collaboratively review the assessment within a course, major or unit annually. The annual review will ensure you are ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation.