898 results for REACTIVE APPROACH
Abstract:
A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency Condition Monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to large, unwieldy data sets. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The remaining two sets were computer-simulated bearing defect signals containing either a single bearing defect or multiple defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sampling ratio was used (i.e., 500 times down-sampled). In contrast, down-sampling with an existing conventional signal processing technique eliminated useful and critical information, such as bearing defect frequencies, when the same ratio was employed. The conventional technique also introduced noise and artificial frequency components, thus limiting its usefulness for machine condition monitoring applications.
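The abstract does not reproduce the algorithm itself, but a peak-hold downsampler can be sketched as follows. This is a minimal interpretation, not the authors' code: the function name and the block-maximum formulation are assumptions. The signal is split into non-overlapping blocks of `ratio` samples and only the largest-magnitude sample of each block is kept, so impulsive bearing-defect peaks survive a reduction that a conventional decimator would discard.

```python
import numpy as np

def peak_hold_downsample(signal, ratio):
    """Downsample by keeping the peak (largest-magnitude) sample of each block.

    Sketch of a peak-hold scheme: split the signal into non-overlapping
    blocks of `ratio` samples and retain the sample with the largest
    absolute value in each block, preserving impulsive peaks.
    """
    n_blocks = len(signal) // ratio
    blocks = np.asarray(signal[:n_blocks * ratio]).reshape(n_blocks, ratio)
    # index of the largest-magnitude sample within each block
    peak_idx = np.argmax(np.abs(blocks), axis=1)
    return blocks[np.arange(n_blocks), peak_idx]
```

With a ratio of 500, a signal of one million samples reduces to 2,000 samples while the defect impulses remain at (approximately) their original amplitudes.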
Abstract:
Young drivers are overrepresented in motor vehicle crash rates, and their risk increases when carrying similar-aged passengers. Graduated Driver Licensing strategies have demonstrated effectiveness in reducing fatalities among young drivers; however, complementary approaches may further reduce crash rates. Previous studies conducted by the researchers have shown that there is considerable potential for a passenger focus in youth road safety interventions, particularly involving the encouragement of young passengers to intervene in their peers' risky driving (Buckley, Chapman, Sheehan & Davidson, 2012). Additionally, this research has shown that technology-based applications may be a promising means of delivering passenger safety messages, particularly as young people are increasingly accessing web-based and mobile technologies. This research describes the participatory design process undertaken to develop a web-based road safety program, and involves feasibility testing of storyboards for a youth passenger safety application. Storyboards and framework web-based materials were initially developed for a passenger safety program, using the results of previous studies involving online and school-based surveys with young people. Focus groups were then conducted with 8 school staff and 30 senior school students at one public high school in the Australian Capital Territory. Young people were asked about the situations in which passengers may feel unsafe and about potential strategies for intervening in their peers' risky driving. Students were also shown the storyboards and framework web-based material and asked to comment on design and content issues. Teachers were shown the same material and asked about their perceptions of program design and feasibility. The focus group data will be used as part of the participatory design process in further developing the passenger safety program.
This research describes an evidence-based approach to the development of a web-based application for youth passenger safety. The findings of this research and resulting technology will have important implications for the road safety education of senior high school students.
Abstract:
The Posttraumatic Growth Inventory (PTGI; Tedeschi & Calhoun, 1996) is the most commonly used measure of the positive psychological change that can result from negotiating a traumatic experience. Whilst the PTGI has strong internal reliability, validity studies are still sparse. The present research details trauma survivors' understanding of the items comprising the PTGI in order to qualitatively assess content validity. Participants were 14 trauma survivors who completed the PTGI and took part in a semi-structured interview. Thematic Analysis was conducted on participants' transcribed interviews. One latent theme was identified, reflecting that questions were consistently understood. A relationship was found between the constituent themes identified and the five factors of the PTGI. Participants answered the PTGI statements in a way that is consistent with the purpose of the instrument, with only a small discrepancy found when some participants used the PTGI scale to indicate that a decrease in an element of the inventory had been experienced. Overall, the results supported the content validity of the PTGI.
Abstract:
Urban renewal is a significant issue in developed urban areas, and a particular problem for urban planners is the redevelopment of land to meet demand whilst ensuring compatibility with existing land uses. This paper presents a geographic information systems (GIS)-based decision support tool (called LUDS) to quantitatively assess land-use suitability for site redevelopment in urban renewal areas. The tool consists of a model for the suitability analysis and an affiliated land-information database for residential, commercial, industrial, G/I/C (government/institution/community) and open space land uses. Development was supported by interviews with industry experts, focus group meetings and an experimental trial, combined with several advanced techniques and tools, including GIS data processing and spatial analysis, multi-criteria analysis, and the AHP method for constructing the model and database. As demonstrated in the trial, LUDS assists planners in making land-use decisions and supports the planning process in assessing urban land-use suitability for site redevelopment. Moreover, it facilitates public consultation (participatory planning) by providing stakeholders with an explicit understanding of planners' views.
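The AHP step mentioned above can be illustrated with a short sketch. This is a generic AHP weight computation, not the LUDS implementation: criterion weights are taken as the normalised principal eigenvector of a reciprocal pairwise-comparison matrix supplied by the planner.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criterion weights from an AHP pairwise-comparison matrix.

    Generic AHP sketch: the weight vector is the principal eigenvector
    of the reciprocal comparison matrix, normalised to sum to 1.
    """
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)      # largest eigenvalue
    w = np.abs(eigvecs[:, principal].real)   # its eigenvector, made positive
    return w / w.sum()
```

For a consistent two-criterion matrix where the first criterion is judged three times as important, `ahp_weights([[1, 3], [1/3, 1]])` yields weights of 0.75 and 0.25.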
Abstract:
Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trivial studies. As genomic analyses have become more sophisticated and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors, grouped by their regulatory role, and the corresponding promoter strength.
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis: that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E. coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where the corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
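The two-sample comparison described above can be sketched as follows. The promoter-strength numbers below are invented for illustration and are not the thesis data; only the shape of the test (Welch's unequal-variance t statistic) is shown.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples.

    Generic sketch of the comparison of promoter-strength scores between
    activator-associated and repressor-associated promoter groups.
    """
    va, vb = variance(a), variance(b)
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / math.sqrt(va / na + vb / nb)

# Hypothetical strength scores (illustrative only): weak promoters
# with activator sites vs strong promoters with repressor sites.
activator_assoc = [0.21, 0.30, 0.25, 0.28, 0.19]
repressor_assoc = [0.55, 0.62, 0.58, 0.49, 0.60]
t = welch_t(activator_assoc, repressor_assoc)  # negative: first group weaker
```

The p-value would then be read from the t distribution with the Welch-Satterthwaite degrees of freedom; in the thesis this came out at 0.072 on the real data.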
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
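The spectrum kernel at the heart of the SVM study above can be sketched in a few lines. This is a generic k-mer implementation, not the thesis code: each sequence is mapped to its k-mer count vector and the kernel value is the inner product of those vectors, which an SVM can then consume (e.g. as a precomputed kernel matrix).

```python
from collections import Counter

def spectrum(seq, k):
    """k-mer count vector (the spectrum feature map) of a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s1, s2, k):
    """Spectrum kernel: inner product of the two k-mer count vectors."""
    c1, c2 = spectrum(s1, k), spectrum(s2, k)
    return sum(c1[kmer] * c2[kmer] for kmer in c1)
```

For example, "AAA" contains the 2-mer "AA" twice, so the kernel value of "AAA" with itself at k=2 is 2 × 2 = 4, while two sequences sharing no k-mers score 0.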
Abstract:
Students in the middle years encounter an increasing range of unfamiliar visuals. Visual literacy, the ability to encode and decode visuals and to think visually, is an expectation of all middle years curriculum areas and an expectation of NAPLAN literacy and numeracy tests. This article presents a multidisciplinary approach to teaching visual literacy that links the content of all learning areas and encourages students to transfer skills from familiar to unfamiliar contexts. It proposes a classification of visuals in six parts: one-dimensional; two-dimensional; map; shape; connection; and picture, based on the properties, rather than the purpose, of the visual. By placing a visual in one of these six categories, students learn to transfer the skills used to decode familiar visuals to unfamiliar cases in the same category. The article also discusses a range of other teaching strategies that can be used to complement this multi-disciplinary approach.
Abstract:
Diversity management is recognised as a major challenge for organisations throughout the world. There is broad acceptance that, when it comes to all aspects of workforce management, major differences exist among individuals in terms of age, gender, national origin, physical capability, sexuality, religion and other characteristics. This chapter discusses the concept, meaning and application of managing that difference, or 'diversity', through programs known as diversity management. It identifies and discusses the different contextual and theoretical approaches that frame diversity programs found in organisations today. A number of programs within different country contexts are examined. The discussion examines the challenges of diversity management, its programs and its outcomes, with a view to understanding the lessons learned and recommending future directions.
Mixed methods research approach to the development and review of competency standards for dietitians
Abstract:
Aim: Competency standards support a range of professional activities, including the accreditation of university courses. Reviewing these standards is essential to ensure universities continue to produce well-equipped graduates who can meet the challenge of changing workforce requirements. This paper has two aims: a) to provide an overview of the methodological approaches utilised for the compilation and review of the Competency Standards for Dietetics, and b) to evaluate the Dietitians Association of Australia's Competency Standards and capture emerging and contemporary dietetic practice. Methods: A literature review of the methods used to develop Competency Standards for dietitians in Australia, including entry-level, advanced-level and DAA Fellow competencies and other specific areas of competency such as public health nutrition and nutrition education, is outlined and compared to those of other allied health professions. The mixed methods methodology used in the most recent review is described in more detail. Results: The history of Dietetic Competency Standards development and review in Australia is compared to dietetic Competency Standards internationally and within other health professions in Australia. The political context in which these standards have been developed in Australia, and which has determined their format, is also discussed. The results of the most recent Competency Standards review are reported to highlight emerging practice in Australia. Conclusion: The mixed methods approach used in this review provides rich data about contemporary dietetic practice. Our view supports a planned review of all Competency Standards to ensure practice informs education and credentialling, and we recommend the Dietitians Association of Australia consider this in future.
A particle-based micromechanics approach to simulate structural changes of plant cells during drying
Abstract:
This paper is concerned with applying a particle-based approach to simulate the micro-level cellular structural changes of plant cells during drying. The objective of the investigation was to relate micro-level structural properties, such as cell area, diameter and perimeter, to the change in moisture content of the cell. The model assumes a simplified cell consisting of two basic components: the cell wall and the cell fluid. The cell fluid is assumed to be a Newtonian fluid with a higher viscosity than water, and the cell wall is assumed to be a visco-elastic solid boundary located around the cell fluid. The cell fluid is modelled with the Smoothed Particle Hydrodynamics (SPH) technique, and the cell wall with a Discrete Element Method (DEM). The developed model is two-dimensional but accounts for three-dimensional physical properties of real plant cells. Drying is simulated as fluid mass reduction, and the model is used to predict the above-mentioned structural properties as a function of cell fluid mass. Model predictions are found to be in fairly good agreement with experimental data in the literature, and the particle-based approach is demonstrated to be suitable for numerical studies of drying-related structural deformations. A sensitivity analysis is also included to demonstrate the influence of key model parameters on model predictions.
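One building block of the SPH technique mentioned above is the smoothing kernel, which weights the contribution of a neighbouring fluid particle at distance r for smoothing length h. A common choice, shown here as a generic sketch and not necessarily the kernel used in the paper, is the 2D cubic spline:

```python
import math

def cubic_spline_kernel_2d(r, h):
    """2D cubic-spline smoothing kernel commonly used in SPH.

    W(r, h) is maximal at r = 0 and falls smoothly to zero at r = 2h;
    sigma is the 2D normalisation constant 10 / (7 * pi * h^2).
    """
    q = r / h
    sigma = 10.0 / (7.0 * math.pi * h * h)
    if q <= 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q <= 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0
```

In a full SPH scheme, quantities such as density are estimated at each particle by summing neighbours' masses weighted by W, which is how the fluid side of a cell model like the one described would evolve as mass is removed.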
Abstract:
This paper addresses the development of an ingenious decision support system (iDSS) based on a survey-instrument methodology and the identification, using statistical analysis, of significant variables to be used in the iDSS. A survey was undertaken with pregnant women, and a factorial experimental design was chosen to determine the sample size. Variables with good reliability in any one of the statistical techniques used, such as the Chi-square test, Cronbach's α and Classification Tree analysis, were incorporated in the iDSS. The system was implemented with Visual Basic as the front end and Microsoft SQL Server as the backend. Outcomes of the ingenious decision support system include advice on symptoms, diet and exercise for pregnant women.
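Of the reliability checks listed, Cronbach's α is easy to sketch. The implementation below is generic and the item scores are hypothetical, not the study's survey data: α compares the sum of the individual item variances to the variance of the total score.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score)),
    where k is the number of items and the total score is the per-respondent
    sum across items.
    """
    k = len(items)
    total = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(total))
```

Perfectly correlated items yield α = 1; in practice, items scoring above a threshold such as 0.7 would be considered reliable enough to keep in the instrument.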
Abstract:
Cartilage defects heal imperfectly and osteoarthritic changes develop frequently as a result. Although the existence of specific behaviours of chondrocytes derived from various depth-related zones in vitro has been known for over 20 years, only a relatively small body of in vitro studies has been performed with zonal chondrocytes and current clinical treatment strategies do not reflect these native depth-dependent (zonal) differences. This is surprising since mimicking the zonal organization of articular cartilage in neo-tissue by the use of zonal chondrocyte subpopulations could enhance the functionality of the graft. Although some research groups including our own have made considerable progress in tailoring culture conditions using specific growth factors and biomechanical loading protocols, we conclude that an optimal regime has not yet been determined. Other unmet challenges include the lack of specific zonal cell sorting protocols and limited amounts of cells harvested per zone. As a result, the engineering of functional tissue has not yet been realized and no long-term in vivo studies using zonal chondrocytes have been described. This paper critically reviews the research performed to date and outlines our view of the potential future significance of zonal chondrocyte populations in regenerative approaches for the treatment of cartilage defects. Secondly, we briefly discuss the capabilities of additive manufacturing technologies that can not only create patient-specific grafts directly from medical imaging data sets but could also more accurately reproduce the complex 3D zonal extracellular matrix architecture using techniques such as hydrogel-based cell printing.
Abstract:
There is a growing gap between engineering practice and engineering education that may be contributing to fewer engineers practising in industry. A coaching approach to learning and teaching has proven to be an effective way to develop people in the workplace. A pilot coaching program is being offered to Engineering and Technology students at Queensland University of Technology to enable holistic growth, in order to better integrate them into the workforce and society at large. The results and findings of this program will be published once the program has been completed.
Abstract:
This article examines the philosophy and practice of open-source technology in the development of the jam2jam XO software for the One Laptop Per Child (OLPC) computer. It explores how open-source software principles, pragmatist philosophy, improvisation and constructionist epistemologies are operationalized in the design and development of music software, and how such reflection reveals both the strengths and weaknesses of the open-source software development paradigm. An overview of the jam2jam XO platform, its development processes and its music educational uses is provided, and the resulting reflections on the strengths and weaknesses of open-source development for music education are discussed. From an educational and software development perspective, the act of creating open-source software is shown to be a valuable enterprise; however, the fact that the source code, creative content and experience design are accessible and 'open' to be changed does not guarantee that educational practices in the use of that software will change. Research around the development and use of jam2jam XO suggests that open-source software development principles can have an impact beyond software development, extending to aspects of experience design and learning relationships.
A multivariate approach to the identification of surrogate parameters for heavy metals in stormwater
Abstract:
Stormwater is a potential and readily available alternative source of potable water in urban areas. However, its direct use is severely constrained by the presence of toxic pollutants such as heavy metals (HMs). The presence of HMs in stormwater is of concern because of their chronic toxicity and persistent nature. In addition to human health impacts, metals can contribute to adverse ecosystem health impacts in receiving waters. Therefore, the ability to predict the levels of HMs in stormwater is crucial for monitoring stormwater quality and for the design of effective treatment systems. Unfortunately, current laboratory methods for determining HM concentrations are resource-intensive and time-consuming. In this paper, applications of multivariate data analysis techniques are presented to identify potential surrogate parameters which can be used to determine HM concentrations in stormwater. Accordingly, partial least squares analysis was applied to identify a suite of physicochemical parameters which can serve as indicators of HMs. Datasets with varied characteristics, such as land use and particle size distribution of solids, were analyzed to validate the efficacy of the influencing parameters. Iron, manganese, total organic carbon and inorganic carbon were identified as the predominant parameters that correlate with HM concentrations. The practical extension of the study outcomes to urban stormwater management is also discussed.
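The partial least squares idea can be sketched as a first-component NIPALS step. The matrices below are made up for illustration and are not the study's water-quality data: the weight vector points along the covariance between the candidate surrogate parameters (columns of X) and the heavy-metal response y, so each parameter's weight indicates how strongly it tracks the metal concentration.

```python
import numpy as np

def pls1_first_component(X, y):
    """First PLS component for a single response (one NIPALS step).

    Generic sketch: the weight vector is the normalised covariance
    direction X' y of the centred data; t holds the latent scores and
    p the X loadings on that component.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)        # weight (covariance) direction
    t = Xc @ w                    # latent scores per sample
    p = Xc.T @ t / (t @ t)        # X loadings
    return w, t, p
```

A predictor column that varies exactly with the response receives (absolute) weight 1, while a constant column receives weight 0; subsequent components would be extracted from the deflated matrix X - t pᵀ.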