869 results for Reptrack Methodology
Abstract:
Forest health surveillance (FHS) of hardwood plantations commenced in Queensland in 1997 as plantations expanded following a state government planting initiative arising from the national 2020 forest policy vision. The estate was initially characterised by a large number of small plantations (10-50 ha), although this has changed more recently with the concentration of larger plantations in the central coast and South Burnett regions. Because of the disparate nature of the resource, drive-through and walk-through surveys of subsets of plantations have been undertaken in preference to aerial surveys. FHS has been effective in detecting a number of new hardwood pests in Queensland, including erinose mites (Rhombacus and Acalox spp.), the western white gum plate galler (Ophelimus sp.), the Creiis psyllid and the bronzing bug (Thaumastocoris sp.), in evaluating their potential impact, and in helping to focus future research efforts. Since 2003 there has been an increased emphasis on training operational staff to take a greater role in identifying and reporting on forest health issues. This has increased their awareness of forest health issues, but their limited time to specifically survey and report on pests and diseases, and high rates of staff turnover, necessitate frequent ongoing training. Consequently, common and widespread problems such as quambalaria shoot blight (Quambalaria pitereka), chrysomelid leaf beetles (mainly Paropsis atomaria) and erinose mites may be under-reported or not reported, and absence data may often not be recorded at all. Comment is made on the future directions that FHS may take in hardwood plantations in Queensland.
Abstract:
Eel-tailed catfish, silver perch and Murray cod are three key recreational fishing species that have declined in the Murray-Darling Basin region. This research project will be an important step towards developing methods to restore and enhance stocks of these fish.
Abstract:
Competency standards document the knowledge, skills, and attitudes required for competent performance. This study develops competency standards for dietitians in order to substantiate an approach to competency standard development. Focus groups explored the current and emerging purpose, role, and function of the profession, and the findings were used to draft competency standards. Consensus was then sought using two rounds of a Delphi survey. Seven focus groups were conducted with 28 participants (15 employers/practitioners, 5 academics, 8 new graduates). Eighty-two of 110 invited experts participated in round one and 67 experts completed round two. Four major functions of dietitians were identified: being a professional, influencing the health of individuals, groups, communities, and populations through evidence-based nutrition practice, and working collaboratively in teams. Overall there was a high level of consensus on the standards: 93% achieved agreement by participants in round one, and all revised standards achieved consensus in round two. The methodology provides a framework for other professions wishing to embark on competency standard review or development.
Abstract:
Background: Excessive speed is a primary contributing factor to young novice road trauma, including intentional and unintentional speeds above posted limits or too fast for conditions. The objective of this research was to conduct a systematic review of recent investigations into novice drivers’ speed selection, with particular attention to applications and limitations of theory and methodology. Method: Systematic searches of peer-reviewed and grey literature were conducted during September 2014. Abstract reviews identified 71 references potentially meeting the selection criteria: investigations since the year 2000 into factors that influence (directly or indirectly) the actual speed (i.e., behaviour or performance) of young (aged <25 years) and/or novice (recently licensed) drivers. Results: Full-paper reviews resulted in 30 final references: 15 focused on intentional speeding and 15 on broader speed selection investigations. Both sets identified a range of individual (e.g., beliefs, personality) and social (e.g., peer, adult) influences, were predominantly theory-driven, and applied cross-sectional designs. Intentional speeding investigations largely utilised self-reports, while other investigations more often included actual driving (simulated or ‘real world’). The latter also identified cognitive workload and external environment influences, as well as targeted interventions. Discussion and implications: Applications of theory have shifted the novice speed-related literature beyond a simplistic focus on intentional speeding as human error. The potential emerged to develop a ‘grand theory’ of intentional speeding and to fill gaps in understanding broader speed selection influences. This includes the need for future investigations of vehicle-related and physical environment-related influences, and for methodologies that move beyond cross-sectional designs and rely less on self-reports.
Abstract:
The article describes a new method for obtaining a holographic image of desired magnification, consistent with the stipulated criteria for its resolution and aberrations.
Abstract:
Modern Christian theology has long struggled with the schism between the Bible and theology, and between biblical studies and systematic theology. Brevard Springs Childs is one of the biblical scholars who attempt to dismantle this “iron curtain” separating the two disciplines. The present thesis aims at analyzing Childs’ concept of theological exegesis in the canonical context. In the present study I employ the method of systematic analysis. The thesis consists of seven chapters. The first chapter is the introduction. The second chapter attempts to identify the most important elements influencing Childs’ methodology of biblical theology by sketching his academic development over the course of his career. The third chapter deals with the crucial question of why and how the concept of the canon is so important for Childs’ methodology of biblical theology. In chapter four I analyze why and how Childs is dissatisfied with historical-critical scholarship and point out the differences and similarities between his canonical approach and historical criticism. The fifth chapter discusses Childs’ central concepts of theological exegesis by investigating whether a Christocentric approach is an appropriate way of creating a unified biblical theology. In the sixth chapter I present a critical evaluation and methodological reflection on Childs’ theological exegesis in the canonical context. The final chapter sums up the key points of Childs’ methodology of biblical theology. The basic results of this thesis are as follows: First, the fundamental elements of Childs’ theological thinking are rooted in the Reformed theological tradition and in modern theological neo-orthodoxy, above all in its most prominent theologian, Karl Barth. The American Biblical Theology Movement and the controversy between Protestant liberalism and conservatism in the modern American context cultivated his theological sensitivity and position. Second, Childs attempts to overcome the negative influences of the historical-critical method by establishing canon-based theological exegesis leading to a confessional biblical theology. Childs employs terminology such as canonical intentionality, the wholeness of the canon, the canon as the most appropriate context for doing biblical theology, and the continuity of the two Testaments in order to put his canonical program into effect. Childs demonstrates forcefully the inadequacies of the historical-critical method for creating biblical theology in biblical hermeneutics, doctrinal theology, and pastoral practice. His canonical approach endeavors to establish a post-critical Christian biblical theology and works within the traditional framework of faith seeking understanding. Third, Childs’ biblical theology has a double task, descriptive and constructive: the former connects biblical theology with exegesis, the latter with dogmatic theology. He attempts to use a comprehensive model which combines a thematic investigation of the essential theological contents of the Bible with a systematic analysis of the contents of the Christian faith. Childs also attempts to unite Old Testament theology and New Testament theology into one unified biblical theology. Fourth, some problematic points of Childs’ thinking need to be mentioned. For instance, his emphasis on the final form of the text of the biblical canon is highly controversial, yet Childs firmly believes in it and even regards it as the cornerstone of his biblical theology.
Furthermore, the relationship between the canon and the doctrine of biblical inspiration remains weak: Childs does not clearly define whether Scripture is God’s word or whether it only “witnesses” to it, and his concepts of “the word of God” and “divine revelation” remain unclear, their ontological status ambiguous. Childs’ theological exegesis in the canonical context is a new attempt in the modern history of Christian theology. It expresses his sincere effort to create a path for doing biblical theology; certainly, it was only the modest beginning of a long process.
Abstract:
Early detection of (pre-)signs of ulceration on a diabetic foot is valuable for clinical practice. Hyperspectral imaging is a promising technique for the detection and classification of such (pre-)signs. However, the number of spectral bands should be limited to avoid overfitting, which is critical for pixel classification with hyperspectral image data. The goal was to design a detector/classifier based on spectral imaging (SI) with a small number of optical bandpass filters; the performance and stability of the design were also investigated. The selection of the bandpass filters boils down to a feature selection problem. A dataset was built, containing reflectance spectra of 227 skin spots from 64 patients, measured with a spectrometer. Each skin spot was annotated manually by clinicians as "healthy" or as a specific (pre-)sign of ulceration. Statistical analysis of the dataset showed that the number of required filters is between 3 and 7, depending on additional constraints on the filter set. The stability analysis revealed that shot noise was the most critical factor affecting the classification performance, and indicated that this impact could be avoided in future SI systems by using a camera sensor whose saturation level is higher than 10^6, or by post-processing of the images.
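As a rough illustration of how the filter-selection problem described above can be cast as feature selection over spectral bands, the sketch below runs a greedy forward selection with a linear discriminant classifier on placeholder data; the array shapes, the classifier, and the choice of five bands are assumptions made for illustration only, not the study's actual pipeline.

```python
# Minimal sketch of band (feature) selection for pixel classification, in the
# spirit of the abstract above; NOT the authors' actual procedure.
# Assumptions: X holds reflectance spectra (n_spots x n_wavelengths),
# y holds clinician labels; classifier and selector settings are illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
X = rng.random((227, 100))        # placeholder spectra: 227 spots, 100 wavelengths
y = rng.integers(0, 2, size=227)  # placeholder labels: healthy vs. (pre-)sign

clf = LinearDiscriminantAnalysis()
# Greedy forward selection of a small number of "bands" (here 5, within the
# 3-7 range reported in the abstract), scored by cross-validated accuracy.
selector = SequentialFeatureSelector(clf, n_features_to_select=5,
                                     direction="forward", cv=5)
selector.fit(X, y)
selected_bands = np.flatnonzero(selector.get_support())
print("Selected band indices:", selected_bands)
```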
Abstract:
The paper presents an innovative approach to modelling the causal relationships of human errors in rail crack incidents (RCI) from a managerial perspective. A Bayesian belief network is developed to model RCI by considering the human errors of designers, manufactures, operators and maintainers (DMOM) and the causal relationships involved. A set of dependent variables whose combinations express the relevant functions performed by each DMOM participant is used to model the causal relationships. A total of 14 RCI on Hong Kong’s mass transit railway (MTR) from 2008 to 2011 are used to illustrate the application of the model. Bayesian inference is used to conduct an importance analysis to assess the impact of the participants’ errors. Sensitivity analysis is then employed to gauge the effect the increased probability of occurrence of human errors on RCI. Finally, strategies for human error identification and mitigation of RCI are proposed. The identification of ability of maintainer in the case study as the most important factor influencing the probability of RCI implies the priority need to strengthen the maintenance management of the MTR system and that improving the inspection ability of the maintainer is likely to be an effective strategy for RCI risk mitigation.
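To make the modelling idea concrete, the following is a hypothetical, self-contained sketch of a tiny belief-network-style calculation linking DMOM error probabilities to the probability of an RCI, followed by a crude sensitivity analysis; the structure, the noisy-OR combination, and all numbers are illustrative assumptions, not the paper's network or data.

```python
# Illustrative only: toy DMOM -> RCI calculation with assumed probabilities.
from itertools import product

# Assumed marginal probabilities that each participant commits an error.
p_error = {"designer": 0.05, "manufacturer": 0.05, "operator": 0.10, "maintainer": 0.20}
# Assumed probability that an error by each participant, alone, causes an RCI.
p_cause = {"designer": 0.3, "manufacturer": 0.3, "operator": 0.2, "maintainer": 0.5}

def p_rci_given(errors):
    """Noisy-OR combination: P(RCI | error pattern)."""
    p_no_rci = 1.0
    for who, erred in errors.items():
        if erred:
            p_no_rci *= 1.0 - p_cause[who]
    return 1.0 - p_no_rci

def p_rci():
    """Marginal P(RCI) by enumerating all error patterns."""
    total = 0.0
    for pattern in product([False, True], repeat=len(p_error)):
        errors = dict(zip(p_error, pattern))
        weight = 1.0
        for who, erred in errors.items():
            weight *= p_error[who] if erred else 1.0 - p_error[who]
        total += weight * p_rci_given(errors)
    return total

baseline = p_rci()
# Crude sensitivity analysis: raise each error probability by 50% in turn
# and observe the change in P(RCI).
for who in p_error:
    saved = p_error[who]
    p_error[who] = min(1.0, saved * 1.5)
    print(f"{who}: P(RCI) {baseline:.4f} -> {p_rci():.4f}")
    p_error[who] = saved
```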
Abstract:
Design-based research (DBR) is an appropriate method for small-scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program required by policy changes. It can be applied to educational contexts, such as when English teachers undertake higher degree research projects in their own or others’ sites, or when academics work collaboratively as researchers with teams of teachers. In the case described here, the paper shows that DBR is designed to make a difference in the real-world contexts in which it occurs.
Abstract:
Design research informs and supports practice by developing knowledge to improve the chances of producing successful products. Training in design research, however, has been poorly supported. Design research uses the human and natural/technical sciences, embracing all facets of design; its methods and tools are adapted from both these traditions. However, design researchers are rarely trained in methods from both traditions. Research in the traditional sciences focuses primarily on understanding phenomena related to human, natural, or technical systems. Design research focuses on supporting the improvement of such systems, using understanding as a necessary but not sufficient step, and it must embrace methods both for understanding reality and for developing support for its improvement. A one-semester, postgraduate-level, credited course entitled Methodology for Design Research, offered since 2002, is described; it teaches a methodology for carrying out research into design. Its steps are to clarify research success; to understand the relevant phenomena of design and how these influence success; to use this understanding to envision design improvement and develop proposals for supporting improvement; to evaluate the support for its influence on success; and, if the outcome is unacceptable, to modify the support or improve the understanding of success and its links to the phenomena of design. This paper highlights some major issues about the status of design research and describes how design research methodology addresses these. The teaching material, model of delivery, and evaluation of the course on methodology for design research are discussed.
Abstract:
Biomimetics involves transfer from one or more biological examples to a technical system. This study addresses four questions. What are the essential steps in a biomimetic process? What is transferred? How can the transferred knowledge be structured in a way useful for biologists and engineers? Which guidelines can be given to support transfer in biomimetic design processes? In order to identify the essential steps involved in carrying out biomimetics, several procedures found in the literature were summarized, and four essential steps that are common across these procedures were identified. To identify mechanisms for transfer, 20 biomimetic examples were collected and modeled according to a model of causality called the SAPPhIRE model. These examples were then analyzed to identify the underlying similarity between each biological system and its corresponding analogue technical system. Based on the SAPPhIRE model, four levels of abstraction at which transfer takes place were identified. Taking similarity into account, the biomimetic examples were assigned to the appropriate levels of abstraction of transfer. Based on the essential steps and the levels of transfer, guidelines for supporting transfer in biomimetic design were proposed and evaluated using design experiments. The 20 biological and analogue technical systems that were analyzed were similar in the physical effects used and at the most abstract levels of description of their functionality, but they were least similar at the lowest level of abstraction: the parts involved. Transfer was most often carried out at the physical-effect level of abstraction. Compared to a generic set of guidelines based on the literature, the proposed guidelines improved design performance by about 60%. Further, the SAPPhIRE model turned out to be a useful representation for modeling complex biological systems and their functionality. Databases of biological systems that are structured using the SAPPhIRE model have the potential to aid biomimetic concept generation.
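For readers unfamiliar with the SAPPhIRE constructs (State change, Action, Parts, Phenomenon, Inputs, oRgans, Effects), the sketch below shows one possible way to record a biological example and its technical analogue and to check at which constructs they coincide; the example entries are hypothetical and the comparison deliberately simplistic, not the analysis method used in the study.

```python
# Illustrative sketch only: recording a biological example and a technical
# analogue with the SAPPhIRE constructs and listing where they overlap.
from dataclasses import dataclass, asdict

@dataclass
class SapphireDescription:
    state_change: str
    action: str
    parts: str
    phenomenon: str
    inputs: str
    organs: str
    effects: str

def shared_constructs(bio: SapphireDescription, tech: SapphireDescription):
    """Return the constructs at which the two descriptions use the same entry."""
    b, t = asdict(bio), asdict(tech)
    return [k for k in b if b[k] == t[k]]

# Hypothetical gecko-adhesion style example (not from the paper's dataset).
bio = SapphireDescription("attachment to wall", "adhere", "setae on toe pads",
                          "intermolecular attraction", "contact force",
                          "compliant fibrillar surface", "van der Waals forces")
tech = SapphireDescription("attachment to wall", "adhere", "polymer micro-pillars",
                           "intermolecular attraction", "contact force",
                           "compliant fibrillar surface", "van der Waals forces")
print("Shared at:", shared_constructs(bio, tech))  # parts differ, effects shared
```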
Abstract:
A combination of benzyltriethylammonium borohydride and chlorotrimethylsilane (1:1) in dichloromethane (0-25°C) has been found to be a convenient reagent system for the selective reduction of carboxylic acids to alcohols.
Abstract:
Distributed computing systems can be modeled adequately by Petri nets. The computation of invariants of Petri nets becomes necessary for proving the properties of the modeled systems. This paper presents a two-phase, bottom-up approach for invariant computation and analysis of Petri nets. In the first phase, a newly defined subnet with an invariant, called the RP-subnet, is chosen. In the second phase, the selected RP-subnet is analyzed. Our methodology is illustrated with two examples, viz. the dining philosophers' problem and the connection-disconnection phase of a transport protocol. We believe that this new method, which is computationally no worse than existing techniques, would simplify the analysis of many practical distributed systems.
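As background to the notion of an invariant used above, the following sketch computes the place invariants of a toy Petri net from its incidence matrix; it illustrates only the basic definition (integer vectors x with x^T C = 0), not the paper's two-phase RP-subnet method.

```python
# Minimal sketch of computing place invariants (P-invariants) of a Petri net
# from its incidence matrix; toy example, not the paper's approach.
from math import lcm
from sympy import Matrix

# Incidence matrix C (rows = places, columns = transitions) of a toy net:
# transition t1 moves a token from p1 to p2, and t2 moves it back.
C = Matrix([[-1,  1],
            [ 1, -1]])

# P-invariants span the null space of C^T, since x^T C = 0  <=>  C^T x = 0.
invariants = []
for v in C.T.nullspace():
    scale = lcm(*(int(entry.q) for entry in v))      # clear fractions
    invariants.append([int(entry * scale) for entry in v])

print(invariants)  # [[1, 1]] -> the token count p1 + p2 is conserved
```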